
Dataset Creation

Annotation

For annotation tools, see Software - Annotation Tools. Annotation can often be sped up considerably by building your own annotation tool with exactly the features you want for your application. This can be a worthwhile time investment, since a well-designed tool makes large-scale annotation much faster.
To annotate data manually (without using crowdsourcing), practitioners generally follow these steps:

  • Gather data: Decide the data source and gather some data to annotate. Be sure to consider any ethical issues. If you want to be able to release the data publicly, check for potential copyright or privacy violations.
  • Decide what to annotate: Look at a portion of the data and form a rough idea of what you want to annotate – that is, which phenomena you want to capture and at what granularity. Write a document describing the preliminary annotation scheme.
  • Pilot annotation: Try annotating some data by yourself or with some colleagues using the annotation scheme (annotate the same data). Compare annotations and decide how to handle edge cases (that is, decide what to do on difficult borderline cases). Decide whether to simplify or extend the annotation scheme.
  • Refine annotation scheme (iterative): Refine your annotation scheme until you're happy with it and it is easy to annotate. This may take several rounds of pilot annotation.
  • Compute inter-annotator agreement: Make sure to doubly annotate a subset of the data so you can compute inter-annotator agreement.
  • Full-scale annotation: Annotate the data yourself or with annotators you have trained (it usually helps to have these annotators involved in developing the annotation scheme during the pilot annotation). Depending on the complexity of the annotation task, you may need regular meetings during this period to resolve edge cases as they come up.
  • Release: Release it with a document or published paper describing the annotation scheme and a datasheet. Make sure there are no copyright or privacy violations when releasing the data. You can create a project page such as this one or this one.
  • Updates: You can release new versions of the dataset to add more annotated data or to fix errors. You can also provide a bug report form (like this one) for the dataset so that others can report errors.
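The agreement step above is commonly reported with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch for two annotators who labeled the same items (the label names are illustrative, not from any particular dataset):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators over the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items where both annotators agree.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: probability of agreeing if each annotator
    # labeled independently according to their own label distribution.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(counts_a[lab] * counts_b[lab] for lab in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["pos", "pos", "neg", "neg", "pos", "neg"]
b = ["pos", "neg", "neg", "neg", "pos", "neg"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

Values above roughly 0.6 to 0.8 are conventionally read as substantial agreement, though what counts as "good enough" depends heavily on the difficulty of the task.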

Annotation Agreement

Building Your Own Annotation Tool

  • For simple projects, annotation can be done in a spreadsheet
  • When building your own annotation tool, here are some things to consider
    • The purpose of the tool is to make the annotation faster. Think carefully about what interface will be fastest for trained annotators.
    • To speed up development, use whatever language and API you are familiar with or find easiest.
    • Think very carefully about ways to reduce unnecessary mouse clicks, typing, and reading. Every mouse click counts. Aggressively remove anything unnecessary, like pressing Escape or Enter to save; instead, save automatically when the annotator moves to the next example.
    • Plan on doing some iterations on the tool. You will need to try it, and change it based on your experience.
    • It doesn't need to be perfect, it just needs to be fast to use. It's OK for the tool to have bugs if it's not widely used and they don't slow down annotation.
    • Don't make it full-featured. You just need the features that make annotation fast.
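As a sketch of these principles, the loop below is a bare-bones terminal labeling tool: one keypress per label, saving happens automatically when moving to the next item, and the tool resumes where it left off across sessions. The file names and label set are placeholder assumptions, not part of any particular project:

```python
import json
import os

# Hypothetical project files; adjust for your own data.
ITEMS_FILE = "items.txt"    # one text item per line
OUT_FILE = "labels.jsonl"   # one JSON record per annotated item
LABEL_KEYS = {"1": "positive", "2": "negative", "3": "neutral"}

def load_done():
    """Resume support: return indices of items already annotated."""
    if not os.path.exists(OUT_FILE):
        return set()
    with open(OUT_FILE) as f:
        return {json.loads(line)["index"] for line in f}

def annotate():
    done = load_done()
    with open(ITEMS_FILE) as f:
        items = [line.strip() for line in f if line.strip()]
    with open(OUT_FILE, "a") as out:
        for i, text in enumerate(items):
            if i in done:
                continue
            print(f"\n[{i + 1}/{len(items)}] {text}")
            key = input("label (1/2/3, q to quit): ").strip()
            if key == "q":
                break
            if key not in LABEL_KEYS:
                continue  # unlabeled item will reappear next session
            # Save immediately: moving on IS the save action.
            out.write(json.dumps({"index": i, "label": LABEL_KEYS[key]}) + "\n")
            out.flush()

# Run annotate() to start a labeling session.
```

A throwaway script like this embodies the advice above: no save button, no configuration screens, nothing but the labeling loop itself. Appending to a JSONL file with a flush per item means a crash loses at most the item currently on screen.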

Dataset and Data Selection Issues

Data Validation

Crowdsourcing

Alternative Methods

Methods of Faster or Cheaper Annotation

Methods of Avoiding Dataset Bias or Improving Robustness

Reducing Bias

Documentation

nlp/dataset_creation.txt · Last modified: 2023/12/10 06:18 by jmflanig
