SAVA at MediaEval 2015: Search and Anchoring in Video Archives

Search and Anchoring in Video Archives (SAVA): Overview
Maria Eskevich, Robin Aly, David N. Racca, Roeland Ordelman, Shu Chen, Gareth J.F. Jones



Outline  

• Task definition
• Dataset (videos + user input)
• Ground truth creation
• Evaluation procedure
• Results

5/13/13   LIME workshop - WWW2013

Terminology

• Video (e.g., 2 hours)
• Search result (e.g., 10 min)
• Anchor: a segment for which a user requests a link (e.g., 1 min): "I want to know more about this"
• Hyperlink: a link from an anchor to a target
• Target: a relevant segment for a given anchor (e.g., 5 min)
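The terminology above maps naturally onto a small data model. The sketch below is illustrative only (the class and field names are not part of the task definition); it also shows the segment-overlap test that the evaluation sections later rely on.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Segment:
    """A time interval inside a video (times in seconds)."""
    video_id: str
    start: float
    end: float

    def overlaps(self, other: "Segment") -> bool:
        # Segments overlap when they lie in the same video and
        # their time intervals intersect.
        return (self.video_id == other.video_id
                and self.start < other.end
                and other.start < self.end)

# An anchor is the segment a user wants more information about;
# a target is a segment a hyperlink points to. Values are made up.
anchor = Segment("Video1", 60.0, 120.0)   # ~1 min anchor
target = Segment("Video1", 90.0, 390.0)   # ~5 min target
print(anchor.overlaps(target))  # True
```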

7/2/13   DGA workshop - July 2013, Paris

Use  Case  


[Use-case diagram: anchors in Video 1, Video 2, and Video 3, with hyperlinks from anchors to target segments in other videos; Result 1 highlighted among the search results]

Text query:
  Speech cue: "hunger around the globe"
  Visual cue: "hungry people slim bodies"

Search results:
  Video    Start  End    Jump-in
  Video1   13:30  15:00  13:30
  Video10  15:10  17:00  15:10
  Video12  29:50  31:00  29:50

Search Task Definition

User input:
  Text query:
    Speech cue: "hunger around the globe"
    Visual cue: "hungry people slim bodies"

Participant submission (search results):
  Video    Start  End    Jump-in
  Video1   13:30  15:00  13:30
  Video10  15:10  17:00  15:10
  Video12  29:50  31:00  29:50

Anchoring Task Definition

[Diagram: Video 1 with candidate anchor positions marked "Anchor?"]

Input: a video
Participant submission: anchor segments as (Video, Start, End) triples
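A run for either sub-task thus boils down to a list of (video, start, end) triples. A minimal sketch of parsing one submission line, assuming a whitespace-separated format with mm:ss timestamps as in the slides' examples (the exact run-file syntax is set by the task guidelines, not here):

```python
def parse_run_line(line: str):
    """Parse an illustrative run line such as 'Video1 13:30 15:00'
    into (video_id, start_seconds, end_seconds)."""
    video, start, end = line.split()

    def to_seconds(t: str) -> int:
        minutes, seconds = t.split(":")
        return int(minutes) * 60 + int(seconds)

    return video, to_seconds(start), to_seconds(end)

print(parse_run_line("Video1 13:30 15:00"))  # ('Video1', 810, 900)
```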

Task history

• ME 2011 Rich Speech Retrieval (predecessor)
• ME 2012 S&HL "brave new" task: Search & Linking (blip.tv)
• ME 2013 S&HL "regular" task: Search (known-item), Linking (BBC collection)
• ME 2014 S&HL "regular" task: Search (multi relevant), Linking (multi relevant)
• ME 2015 Search & Anchoring + Linking@TRECVid: Search (multi relevant), Anchoring ("brave new task")


Dataset: Video collection

• Search test collection:
  - Copyright-cleared broadcasts from the period 12.05.2008 - 31.07.2008
  - 2686 hours
  - ~200 videos excluded because they were rebroadcasts or their audio-visual signal was out of sync
• Anchoring test collection:
  - 33 videos, covering the anchors of the 2013 and 2014 editions


Dataset: Query Generation

• Users:
  - BBC employees
  - British Film Institute
  - Journalists and prospective students
• Instructions given:
  - In person
  - Via teleconference session
• Subjective impression: the task is difficult but doable.


Generation of Information Need

1. Formulate information need
2. Text search / visual search via the search interface (AXES)
3. Annotate clips
4. Fine-tune clip boundaries
5. Define anchors

Dataset: Search Queries

• 42 queries for the search task, e.g.:

<top>
    <itemId>item_35</itemId>
    <queryText>michael jackson quincy jones</queryText>
    <visualCues>singer, dancing, michael jackson</visualCues>
</top>
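The topic format above can be read with the Python standard library; a sketch, assuming the `<top>` elements appear exactly as shown:

```python
import xml.etree.ElementTree as ET

# The example topic from the slide, inlined for illustration.
topic_xml = """
<top>
    <itemId>item_35</itemId>
    <queryText>michael jackson quincy jones</queryText>
    <visualCues>singer, dancing, michael jackson</visualCues>
</top>
"""

top = ET.fromstring(topic_xml)
topic = {
    "id": top.findtext("itemId"),
    "query": top.findtext("queryText"),
    # Visual cues are a comma-separated list in the XML.
    "visual_cues": [c.strip() for c in top.findtext("visualCues").split(",")],
}
print(topic["id"])           # item_35
print(topic["visual_cues"])  # ['singer', 'dancing', 'michael jackson']
```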


Run statistics

• Search: 10 runs; EURECOM + IRISA
• Anchoring: 10 runs; EURECOM, IRISA, TUD

Ground truth creation

• Search sub-task:
  - MTurk judgments on the top-10 of all runs
  - Workers shown the description of the desired segments
  - Binary relevance judgment with explanation
• Anchoring sub-task:
  - Combine top-25 segments into connected parts
  - MTurk judgments for these segments
  - Binary relevance judgments
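The "combine top-25 segments into connected parts" step is essentially interval merging. A minimal sketch, assuming segments within one video given as (start, end) pairs in seconds:

```python
def merge_segments(segments):
    """Merge overlapping or touching (start, end) intervals into
    connected parts, so each part is judged only once."""
    merged = []
    for start, end in sorted(segments):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous part: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_segments([(0, 60), (30, 90), (120, 180)]))
# [(0, 90), (120, 180)]
```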


Search Assessment

[Assessment interface: a search result shown alongside the description of the information need; workers give a relevance judgment plus judgment details & verification]

Combining segments for judgment:

[Diagram: overlapping segments from Run 1 and Run 2 are combined before judgment]

Anchoring Assessment

[Assessment interface: the submitted anchor shown with its surrounding context; workers give a relevance judgment plus judgment details & verification]

Ground-Truth Statistics: Search

Evaluation: Search Task

• Scoring depends on the jump-in point
• Measures:
  - MAP and P@10, adapted as binary measures (how to treat overlap is an important decision)
  - MAiSP
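MAiSP itself is defined in the task's evaluation tooling; as a simpler sketch of the overlap-based binary idea mentioned above (a retrieved segment counts as relevant if it overlaps a ground-truth relevant segment), here is an illustrative precision-at-k over (video, start, end) triples. Names and data are assumptions, not the official scorer.

```python
def overlaps(a, b):
    """(video, start, end) triples overlap if they share the video
    and their time intervals intersect."""
    return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

def precision_at_k(results, relevant, k=10):
    """Fraction of the top-k results that overlap some relevant segment."""
    top = results[:k]
    hits = sum(any(overlaps(r, rel) for rel in relevant) for r in top)
    return hits / k

# Made-up example data (times in seconds).
results = [("Video1", 810, 900), ("Video10", 910, 1020)]
relevant = [("Video1", 850, 950)]
print(precision_at_k(results, relevant, k=2))  # 0.5
```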


MAiSP results

Evaluation: Anchoring Task

• Measures: P@10, Recall
• Submitted segments overlapping with relevant segments are considered relevant
• Recall: how many of the known-relevant segments were found
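Recall as described above can be sketched the same way: count how many known-relevant segments are overlapped by at least one submitted anchor. The (video, start, end) representation and example data are illustrative assumptions.

```python
def recall(submitted, relevant):
    """Fraction of known-relevant (video, start, end) segments that are
    overlapped by at least one submitted segment."""
    if not relevant:
        return 0.0

    def found(rel):
        return any(sub[0] == rel[0] and sub[1] < rel[2] and rel[1] < sub[2]
                   for sub in submitted)

    return sum(found(rel) for rel in relevant) / len(relevant)

# Made-up example: one submitted anchor, two known-relevant segments.
submitted = [("Video1", 0, 60)]
relevant = [("Video1", 30, 90), ("Video1", 120, 180)]
print(recall(submitted, relevant))  # 0.5
```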

RUN  ANALYSIS  

Results:  Search  Task  

Results:  Anchoring  Task  

Conclusions

• Task defined by users
• Search task: MAiSP measure
• First steps for the anchoring task
• Few runs prevent strong conclusions

The Search and Hyperlinking task was funded by … We are grateful to Jana Eggink and Andy O'Dwyer from the BBC for preparing the collection and hosting the user trials.