November 29, 2022

Is SUPA BOLT for you?


A few weeks ago, Steve, our Head of Sales, ran a quick LinkedIn poll on preferred data labeling options. The results? 50% of voters chose in-house labeling, and another 29% preferred to work with an internal team using an external labeling tool. 

There’s definitely a case for keeping things internal: ease of communication and feedback within your own team, less back-and-forth with an external party, and of course, clear insight and visibility into labeling quality. But what if you could get all these benefits without having to hire, onboard, and manage your own labeling team?

That’s why we built BOLT – to make the data labeling experience a seamless one. Our clients get all the perks of working with an in-house team without actually using internal resources.

Here’s how and why BOLT might be the data labeling platform for you:

1. BOLT gives you the flexibility to scale up quickly. Or scale down. 

There’s no easy way to predict the labeling volume of a project. It’s also a big commitment to lock in a monthly spend on data labeling – especially when you’re not sure how much training data you need to build a robust model.

With BOLT, we offer a pay-as-you-go service with no minimum spend. We will label as much or as little data as you need, so you won’t need to worry about the costs of having a fixed headcount of labelers. 

If, however, you’ve got a consistent data labeling pipeline that would benefit from our volume-based rates, have a chat with our Salesguy Steve.

2. BOLT is connected to a trained workforce.

Our data annotation platform is connected to our team of curated and trained annotation experts – we’re really proud of them.

Our annotators go through a rigorous training and assessment process to ensure they have the skills and knowledge required to produce precise annotations. We also track their on-the-job Annotation Quality Score (AQS), which lets us match tasks to annotators with the right level of experience for the task’s complexity.

BOLT’s Enterprise clients also get a dedicated set of elite annotators who receive project-specific training from our experienced Annotation Managers.

3. BOLT is simple to set up. 

BOLT was designed to be simple. We ran many user interviews and tests to get this right (and we’re still doing interviews to get better!). Upload your data, set up labels and add instructions – then press Start Project. Not convinced? See how we stack up against Amazon SageMaker.

With BOLT, it takes under 10 minutes to set up a project. So you can validate a test batch or run a proof of concept and see results in about an hour*.

*Average turnaround for projects with fewer than 700 annotations – roughly what the $50 free credit covers at $0.07 per bounding box.

4. BOLT is fast. 

Many data labeling vendors boast about speed – we do too. 

“The annotators were really quick. I would upload and 5 minutes later - 10 images done. I checked 5 minutes later - 100 images done.”

Sparsh Shankar, Associate Machine Learning Engineer at Sprinklr

But most annotation providers don’t provide visibility on both progress AND quality of the labeling work. Oftentimes, vendors and clients use different platforms to view quality, resulting in a he-said-she-said situation. With Project Collaboration built into BOLT, you can collaborate with your team and use our QA tools to vet quality and reduce label noise. 

5. BOLT gives you insights to iterate.

We know that it takes up to three batches before you start getting quality right. BOLT was built for our users to quickly iterate and validate quality. Here’s what one of our clients had to say about iterating with BOLT:

“What surprised me is the amount of insight I could gather with smaller batches of data - not only did I discover more edge cases, I could also quickly change up my instructions and launch a revised batch within hours. I was also able to view the labels as they were being generated, which gave me quick feedback about the label quality, rather than waiting for the whole batch.”

Sravan Bhagavatula, Director of Computer Vision at Greyscale AI

We advise our users to start with a test batch to check that their instructions are clear. Our annotators are also trained to give you feedback, which you’ll see on your Dashboard under Tasks on Hold.

*

Still not convinced? Get in touch and send us a test batch of data. We’ll run the project and add you as a collaborator for you to view the results.