How the entries were assessed


With over 22 000 people from all over the world and from all walks of life entering the competition, assessing all the entries was no mean feat!


Peer review and public voting


The entries were assessed by two groups in parallel: a panel of scientific experts and the public.

In order to explore some of the issues surrounding the open science debate and to experiment with new strategies for the open access community, the Royal Society of Chemistry (RSC) and Hermes 2012 decided to use two different methods to select the winning entry.

Initially, the entries were digitally filtered. Because of the overwhelming number of entries, some simple filtering was applied to make sure that the judges only had to evaluate the entries that were likely to contain a winning answer. More detailed information on how we did this is available at "In detail: the digital filtering process".
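The actual criteria are described in "In detail: the digital filtering process", but a very simple keyword-based filter along the following lines illustrates the general idea of a first automated pass. This is a minimal sketch only: the keyword list, the Entry structure and the one-match threshold below are assumptions made for illustration, not the method the RSC actually used.

    # Illustrative sketch of a simple keyword filter for competition entries.
    # Keywords, Entry structure and threshold are assumptions; the real
    # criteria are described in "In detail: the digital filtering process".
    from dataclasses import dataclass

    # Hypothetical terms a promising scientific answer might mention.
    KEYWORDS = {"example term one", "example term two", "example term three"}

    @dataclass
    class Entry:
        entrant_id: int
        text: str

    def looks_promising(entry: Entry, min_hits: int = 1) -> bool:
        """Keep an entry if it mentions at least min_hits of the keywords."""
        text = entry.text.lower()
        return sum(keyword in text for keyword in KEYWORDS) >= min_hits

    def filter_entries(entries: list[Entry]) -> list[Entry]:
        """Return only the entries worth passing on to the expert judges."""
        return [entry for entry in entries if looks_promising(entry)]
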

A crack team of chemists who work at the Royal Society of Chemistry assessed the entries according to whether they were scientifically sound, imaginative and creative.

The team was made up of Publishing Editors, whose day-to-day job is to assess whether chemistry papers submitted to the RSC's journals are of the world-class standard the RSC prides itself on publishing, and Communications and Education Specialists, who find exciting and engaging ways to bring chemistry into the public eye and into the classroom.

All the entries that made it through the first round were sent to a panel of expert chemists and physicists, and these experts whittled the entries down to a few potential winners. From these few, a winner was chosen...

A public peer review system was also run in parallel: the public voted for their favourite competition entry.


Digitally filtering the entries

Details of how we filtered out the entries that did not go on to be judged by the panel of experts