MSR-Bing Image Retrieval Challenge (IRC)

Microsoft Research, in partnership with Bing, is happy to launch the second MSR-Bing Challenge on Image Retrieval. Do you have what it takes to build the best image retrieval system? Enter the MSR-Bing Image Retrieval Challenge at ACM Multimedia and/or ICME to develop an image-scoring system for search queries. Last Challenge: MSR-Bing IRC @ ACM Multimedia 2014. Next Challenge: MSR-Bing IRC @ ICME 2015.


The dataset that the challenge is based on can be found here.

Next Challenge: MSR-Bing IRC @ ICME 2015

(New!) The challenge will be held again at ICME 2015. More details to come soon.

Important dates (extended and firm):

  • April 21: Final evaluation set available for download (encrypted)
  • April 24: Evaluation starts (the password to decrypt the evaluation set will be delivered at 11:30 PM PDT on April 23)
  • April 25: Evaluation ends at 0:00 AM PDT (the very beginning of April 25); result submission due
  • April 28: Evaluation results announced
  • May 1, 2015: Paper submission (please follow the guidelines of the main conference)
  • May 10, 2015: Notification
  • May 15, 2015: Camera-ready papers due

Last Challenge: MSR-Bing IRC @ ACM MM 2014

For more details about the challenge, please visit:

1. The grand challenge page at ACM Multimedia 2014.
2. The IRC @ MM 14 page on this site

The latest announcements will be posted here.

Recent Updates:

  • July 5: Evaluation results announced.
  • June 26: Due to many requests, the MM14 grand challenge submission deadline has been extended by one week, so we have also extended the MSR-Bing challenge result submission deadline by one week. Please check the updated dates below.
  • June 25: The encrypted evaluation dataset is now available for download. Please follow the steps below to submit your prediction results:
    1. Register a "paper" entry at the CMT site. Make sure to finish this step as soon as possible (at the latest 30 minutes before the challenge starts). The password to decrypt the evaluation set will be sent through CMT.
    2. Download the encrypted evaluation dataset. Please note the downloaded file was zipped twice (once with a password and once without).
    3. Unzip the downloaded file (no password required) to verify that the file is not corrupted.
    4. Unzip the file you get from Step 3 with the password that will be sent to you through CMT. You will then get two files: one is a (key, image thumbnail) table, and the other is a (key, label) table. Please refer to this page for details on how to generate prediction results.
    5. Before the end of the challenge, submit your prediction results (up to 6 zipped files - see instructions below).
    6. Submit your grand challenge paper according to the guidelines on the ACM Multimedia 2014 website. Please note the CMT site is only for prediction result submission; your paper should be submitted through the EasyChair paper system. Make sure to include your evaluation results in the paper (they will be sent to you before the paper submission deadline).
  • June 25: The evaluation set will be available by EOD today, and CMT will be online at the same time. Instructions: you are requested to register an entry at the CMT site to receive the password to decrypt the evaluation set and to submit your prediction results. Please note that prediction results based on Clickture-Lite (1M images) are mandatory, while results on Clickture-Full (40M images) are optional. When submitting prediction results, please name the files so we know which are based on the 1M dataset (include "1M" in the file name), which are based on the 40M dataset (include "40M" in the file name), and which are master runs (include "master" in the file name). If you submit results based on both datasets, you may submit three runs for each dataset (including one master run per dataset). Please note that the final evaluation will be based on the master runs, though we will also return the scores for the other runs. (New!)
  • June 25: Evaluation start and end dates changed (one-day delay).
  • June 19: Trial set is available here:  (New!)
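The file-naming convention above can be sketched in a few lines. This is a minimal, hypothetical helper (the file names and the function itself are illustrative, not part of the official submission tooling); it only encodes the stated rule that "1M", "40M", and "master" must appear in the file name:

```python
def classify_run(filename: str) -> dict:
    """Infer dataset and run type from a submission file name,
    following the naming convention stated in the instructions."""
    name = filename.lower()
    # Check "40m" before "1m" so a 40M run is never misread.
    if "40m" in name:
        dataset = "Clickture-Full (40M)"
    elif "1m" in name:
        dataset = "Clickture-Lite (1M)"
    else:
        dataset = "unknown"  # ambiguous names risk being mis-scored
    return {
        "dataset": dataset,
        "master": "master" in name,  # final evaluation uses master runs
    }

if __name__ == "__main__":
    # Hypothetical example submission names
    for f in ["team42_1M_master.zip", "team42_40M_run2.zip"]:
        print(f, classify_run(f))
```

For example, a name like `team42_1M_master.zip` would be read as the mandatory Clickture-Lite master run, while a name with neither marker could not be scored against the right dataset.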

Schedule (updated on June 26):

  • Feb 15, 2014: Dataset available for download (Clickture-Lite) and hard-disk delivery (Clickture-Full).
  • June 18: Trial set available for download and test.
  • June 25: Final evaluation set available for download (encrypted)
  • July 3 (updated/firm): Evaluation starts (the password to decrypt the evaluation set will be delivered at 11:30 PM PDT on July 2)
  • July 4 (updated/firm): Evaluation ends at 0:00 AM PDT (the very beginning of July 4); result submission due
  • July 5: Evaluation results announced.
  • July 6, 2014: Paper submission (please follow the guidelines of the main conference)

Links to the Challenges at Different Conferences:

Related People
Xian-Sheng Hua

Jin Li

Related Projects