The Video Genome Project is a prototype created at Orange Labs San Francisco.
The idea behind it is to derive implicit metadata that describes media content, so that we can provide highly personalized recommendations.
For example, we map concepts from Wikipedia onto videos as annotations. Another example is our use of emotions: by capturing emotion signals such as :-) or :-( in users' chat messages,
we produced a very interesting by-product: an emotion heat map showing the real-time emotional feedback of our audience.
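The emoticon-capture step can be sketched roughly as follows. This is a minimal illustration, not the actual Video Genome pipeline: the emoticon sets, the regular expression, and the function name are all assumptions made for this example.

```python
import re
from collections import Counter

# Hypothetical emoticon vocabulary -- the real system's list is not public.
POSITIVE = {":-)", ":)", ":-D"}
NEGATIVE = {":-(", ":("}

# Matches simple ASCII emoticons like :-) :) :-( :( :-D
EMOTICON_RE = re.compile(r"[:;]-?[()D]")

def emotion_signals(messages):
    """Count positive and negative emoticons across a batch of chat messages."""
    counts = Counter()
    for msg in messages:
        for emo in EMOTICON_RE.findall(msg):
            if emo in POSITIVE:
                counts["positive"] += 1
            elif emo in NEGATIVE:
                counts["negative"] += 1
    return counts

chat = ["great video :-)", "loved it :)", "boring :-("]
print(emotion_signals(chat))  # Counter({'positive': 2, 'negative': 1})
```

In a real deployment, each signal would also carry a timestamp and video position, so counts could be bucketed over the video timeline to produce the heat map.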
You may have noticed that when you mouse over a video, or when a video finishes playing, we ask you to tell us whether you liked or disliked it.
Yes, we need your feedback at least once, so that we know what kinds of videos to recommend the next time you come back.
In my case, for example, I was recommended Apple-related videos.
If you are interested in using the Video Genome API for media tagging or for searching videos by concept, please contact us at firstname.lastname@example.org.
Since this project is still a prototype, we would really appreciate your comments. Thanks!