Community Sessions


  • Other Evaluation Initiatives
    The speakers for the Other Evaluation Initiatives session are:


    TREC at 20 years
    Donna Harman/Ellen Voorhees, NIST, USA

    Abstract:
    The short talk will review TREC's 20th meeting (2011), with comments on what has been learned (and not learned) about evaluation over those years.



    TAC in 2011
    Hoa Dang/Donna Harman, NIST, USA

    Abstract:
    The short talk will review the TAC 2011 tasks and look ahead to 2012 and 2013.



    What is happening in NTCIR
    Noriko Kando, Hideo Joho and Tetsuya Sakai - National Institute of Informatics (NII), Japan

    Abstract:
    This talk briefly introduces what happened in NTCIR-9 (September 2010 - December 2011) and what is happening in NTCIR-10. Starting with NTCIR-9, the structure was changed to have multiple general co-chairs, program co-chairs, and EVIA co-chairs in order to strengthen the organization. The repertory of tasks was also changed: NTCIR-9 ran CrossLink, GeoTime, Intent, 1Click, PatentMT, SpokenDoc, and VisEx. These changes appear to have been successful in various respects. For NTCIR-10, "Math" was added as a pilot task, while GeoTime and VisEx are taking an interval to prepare improved versions for NTCIR-11. Currently 48 researchers are serving as task organizers and 115 teams have registered, all working hard toward the NTCIR-10 Conference on 18-21 June 2013 in Tokyo, Japan.



    ROMIP: One Step Forward, One Step Aside
    Pavel Braslavski, Kontur labs, Russia

    Abstract:
    The presentation briefly surveys the history and present state of ROMIP, a Russian TREC-like information retrieval evaluation campaign started in 2002. In particular, I will report on the new ROMIP tracks: the sentiment analysis track introduced in 2011 and the machine translation evaluation track scheduled for next year (2013).



    FIRE: A community development exercise
    Prasenjit Majumder

    Abstract:
    The FIRE initiative in India is holding its 4th evaluation campaign this year. This talk will outline the impact of FIRE in terms of community development in the Indian subcontinent. Attracting participants, particularly from the IR community in India, is one of the biggest challenges facing FIRE. The talk will also very briefly describe the tasks being offered this year.



    MediaEval in 2012
    Gareth Jones

    Abstract:
    MediaEval is a benchmarking initiative dedicated to evaluating new algorithms for multimedia access and retrieval. It emphasizes the 'multi' in multimedia and focuses on human and social aspects of multimedia tasks. MediaEval attracts participants who are interested in multimodal approaches to multimedia involving, e.g., speech recognition, multimedia content analysis, user-contributed information (tags, tweets), viewer affective response, social networks, temporal and geo-coordinates. MediaEval 2012 features a range of evaluation tasks exploring all of these dimensions of multimedia search. This presentation will briefly review the tasks for MediaEval 2012 and the outlook for the development of the MediaEval benchmarking initiative.