SocialSensor is an FP7 project that will develop a new framework for enabling
real-time multimedia indexing and search in the Social Web. The project
will move beyond conventional text-based indexing and retrieval models
by mining and aggregating user inputs and content over multiple social
networking sites. Social Indexing will incorporate information about the
structure and activity of the users’ social network directly into the
multimedia analysis and search process.
Furthermore, it will enhance
the multimedia consumption experience by developing novel user-centric
media visualization and browsing paradigms. For example, SocialSensor
will analyse the dynamic and massive user contributions in order to
extract unbiased trending topics and events and will use social
connections for improved recommendations.
To achieve its objectives,
SocialSensor introduces the concept of Dynamic Social COntainers
(DySCOs), a new layer of online multimedia content organisation with
particular emphasis on the real-time, social and contextual nature of
content and information consumption. Through the proposed
DySCOs-centered media search, SocialSensor will integrate social content
mining, search and intelligent presentation in a personalized, context-
and network-aware way, based on the aggregation and indexing of both
user-generated content (UGC) and multimedia Web content.
The resulting multimedia search system will
be showcased and evaluated in two use cases: (a) news, involving
professional news editors, journalists and casual readers, who will
benefit from SocialSensor's improved capabilities for discovering
interesting new social content and integrating it into the news
creation and delivery lifecycle, and (b) infotainment, providing new
multimedia
search tools and unique media consumption experiences to attendees of
large events (e.g. festivals). Providing real-time social indexing
capabilities for both of these use cases is expected to have a
transformational impact on both sectors.