We are a language technology research group working on improving the way news is presented to readers. One of the systems we want to build will summarise issues via quotes about them, so that you can see what different people are saying about a new policy or event. To build this system, we need example data: news articles that have been manually marked to identify the quotes, along with who said them and when. That's where you come in.
You will use a web interface we have developed for this task, and follow explicit guidelines that describe the annotation task. We will review your annotations and give you feedback on anything we need done differently. Accuracy is critical for us, as our system cannot learn from inconsistent annotations.
This task requires first-rate English fluency and the ability to follow instructions that sometimes rely on subtle linguistic distinctions. We particularly encourage bids from freelancers who have worked with us successfully on previous projects.
The project will be structured as follows. You will bid to complete a set of 100 documents, which we estimate at between 10 and 15 hours of work, depending on how quickly you annotate.