January 4, 2024

If You Want Data Annotation Done Well, Build The Right Workforce

Quality data annotation begins with building the right workforce. Building an effective, motivated workforce, however, is no easy feat - especially when annotators are often crowdsourced and not, technically, full-time employees.

The business case boils down to this: How do you cultivate a strong, invested team that consistently delivers good work, while still aligning to the on-demand business model inherent to most annotation companies?

The stakes feel abstract until you witness the real-world ripple effects of poorly trained models. While companies race to deploy machine learning across their products, few pause to seriously confront the workforce challenge that determines whether AI is done right or wrong.

Construct the wrong annotation team - with poorly defined standards, minimal training, nonexistent feedback loops, and low engagement - and you indirectly sanction biases creeping into models and products failing end-users. But invest in the right annotation talent, with the right support and infrastructure, and quality becomes very achievable.

In this article, we will walk you through SUPA’s journey in transforming crowdsourced data annotators into a community of high performers.

The Mindset of a Data Annotator

Data annotators are the human reviewers who carefully label and categorize datasets to train AI systems. Their work is essential, but often invisible. We wanted to truly understand—and support—the crowdsourced teams powering this critical behind-the-scenes work.

Brainstorming ways to identify annotator sentiment

Through regular interviews, surveys, and experiments, we identified three key motivations driving our annotator community:

  1. Annotators crave connection - More than transient gig workers, our teams want the responsive relationships and communication that full-time employees enjoy.
  2. Annotators want career growth - Many are excited by how AI/ML is shaping the future and want to build skills in this area long-term. Others chose this work for its flexibility, but still aim for stability in the role over time.
  3. Engagement boosts work quality - On projects with rich guidance, community, and support, annotator accuracy and consistency outperform more transactional projects, where annotators are simply told when and how to work with no further instruction. Annotators are also motivated by the sense of being part of something bigger and more meaningful, which we try to create in each project. In fact, we found that engagement and support have the greatest impact on complex projects, contributing to a 50% average increase in quality and productivity.

These insights showed us that to achieve AI done right, we must treat annotators as true partners in the process. With this lens in place, we evolved our workforce strategy to focus less on per-task incentives and more on enabling annotators' professional and personal growth.

SUPA’s Community Strategy for Quality Data Annotation

Given our findings, the workforce strategy was created around three pillars: Unite, Engage, and Upskill. This approach aims to foster a community of top annotator talent that delivers exceptional data quality over the long term.

Stage 1: Unite

We first focused on uniting annotators under a shared vision for excellence and career growth. After defining the caliber of annotator we hoped to attract, we began by outlining advancement paths and benefits unique to our community.

To maintain quality standards in line with our vision, we instituted accuracy and performance metrics aligned to industry best practices. Annotators receive regular feedback tied to these standards as they progress from entry level to expert roles. Career growth responds directly to measurable competency gains.

This paid dividends quickly: annotators found themselves viewed as more than just gig workers. We saw productivity rise by over 30% among our high performers, along with far deeper engagement.

Stage 2: Engage

Employee engagement reduces turnover, fuels productivity, and impacts profits - a truth that holds even for gig workers like our annotator teams.

We actively engage annotators through multiple platforms, including all-hands Town Halls for cross-community sharing and leadership Q&As. Annotators also provide regular feedback that shapes guideline improvements and new feature development.

We also organize weekly Office Hours, where our subject matter experts meet 1:1 with individual annotators to provide expert support on complex datasets. These connection opportunities lead annotators to value their contributions more highly and take pride in advancing AI done right.

Stage 3: Upskill

With unified purpose and stronger engagement established, we further our community’s growth through robust development resources.

Our training modules help annotators gain specialized skills in high-demand annotation areas like semantic segmentation and classification. We also offer more specialized modules that tackle common annotation mistakes in high-complexity projects, such as edge cases and areas of potential bias.

By encouraging professional advancement grounded in real-world AI proficiencies, we empower annotators to better themselves while continuously improving SUPA quality and capabilities. Our community thrives when each individual steps into their full potential.

The Bottom Line

The fruits of this Unite, Engage, Upskill approach become clear in SUPA’s accuracy benchmarks and in talent retention numbers that keep rising quarter over quarter.

But most importantly, we see it in the real human stories of annotators experiencing fulfillment in their careers—and AI models delivering standout results—backed by a standout workforce.

Closing Thoughts: Sustaining Data Annotation Quality By Investing in Community

“Too often, leaders attempt to change the way people act without changing the way they think (i.e., their beliefs). As a result, they get compliance, but not commitment; involvement, but not investment; and progress, but not lasting performance.” - Roger Connors

Great results do not spontaneously emerge in companies—nor in communities. Short-term tactics may sometimes deliver passable work, but only sound long-term workforce strategies can achieve enduring performance.

Amidst the AI boom, quality data annotation remains bottlenecked by transient gig workers and limited engagement. At SUPA, we took a different approach. We understood that only by fundamentally transforming our workforce assumptions could we achieve enduring gains.

Our solution? Don't just create jobs. Cultivate community.

We focused on people first, technology second. We listened, learned, and embedded engagement into every interaction, elevating individuals while building collective capability.

Despite AI/ML’s technical complexities, human insight and ethics remain at the core of quality outcomes. Great people ultimately build great technology. Our annotator community stands on the frontlines of upholding that standard for our clients.

The numbers validate our strategy, but community voices tell the deeper story:

  1. “SUPA always give equal opportunity to everyone - they encourage and push those who are performing well. Their continued encouragement, motivation and guidance made me understand the various projects and mold myself as an Annotator”
  2. “I appreciate the fact that I can keep developing my skills in SUPA. Knowing that my work plays a significant role in the development of AI and that I'm helping to enhance AI performance by producing high-quality annotations makes my job even more meaningful”

By focusing on connections—between people, purpose, and professional growth—SUPA built the model for a way of working that values annotators as irreplaceable partners in the AI age. The results speak volumes, but to us, the people always spoke louder.

---

SUPA has worked in the AI/ML industry for over 7 years. Learn more about how we can help you build better AI here.