
Beyond the Hype: The Messy Reality of Training AI
Scour LinkedIn jobs and you’re sure to come across half a dozen listings like the following: “Content Reviewer: Review AI content for clarity. Set your own hours.”
There are variations of these roles, but the deluge on job boards means one thing: training AI models is a real business. One World Economic Forum survey shows the fastest-growing skill in the marketplace is “AI and big data.”
Despite my initial hesitation about AI (I’m a writer, so I’ve had concerns about AI replacing my role), I decided to get on board with data annotation and AI data training. I’ve spent the last few months hopping from company to company, and I’d like to share an insider’s view of my experience.
First, the Positive Aspects of Data Annotation Work
With writing work slow lately, I've had room in my schedule to pick up these data annotation projects. I like that I can set my own hours and work as little or as much as I want (with some caveats around task availability). All the agencies I've worked with have paid promptly each week. Some even offer bonuses.
Pay varies dramatically based on project needs, but lately I've seen better-paying opportunities for subject-matter experts, rather than the flood of $15/hour generalist jobs I saw a few months ago.
The Onboarding Process for AI Workers
Once I apply for a role I think I'm a good fit for, I'm usually given a link to an interview…with an AI recruiter! It's the strangest thing, talking to the camera without a person on the other end. The interview questions vary in quality: some companies ask great ones, while others ask questions that are overly technical for the job, in my opinion.
If I’m deemed worthy of the job, I get an email saying I’m in.
Onboarding happens in a flurry of emails granting access to Slack, a time tracker, and the task platform. I'm required to read onboarding documents and sometimes take a quiz to test my understanding. If I pass, I get access to tasks and can begin work.
Drawbacks to Data Annotation Work
As streamlined as the onboarding process can be, it’s the actual work that can get messy. Here are some drawbacks you should be aware of if you’re considering taking on data annotation work.
1. AI Training Is an Aggressive Market
Now that I have data annotation on my resume, I get messages on LinkedIn about roles almost every day. However, it's important to understand what's really happening. Companies like Mercor and Micro1 pay referral fees for new hires, sometimes several hundred dollars. So the professionals contacting me describe themselves as a "recruitment and referral partner," which just means they want me to click their referral link so they get paid. I often get multiple messages from different "referral partners" for the same job.
This isn't necessarily a bad thing from the worker's perspective, but it does mean you'll see multiple listings (worded slightly differently) for the same role, so you waste time in your job search clicking through to the same posting over and over.
2. AI Agencies Overhire
Every AI agency I’ve worked with has hired hundreds of people for a short-term project. Many times, I don’t even get a chance to work on a project because the piranhas have already consumed all of the work, and then the project closes.
The Slack channels are a mess. Hundreds of people ask the same questions without searching to see if the question has already been answered. They clog the space with unnecessary chitchat, which makes it difficult for someone looking for work-related information to find it.
Sometimes within days, the project is over. I often spend more time onboarding than actually doing paid work, which is a travesty.
3. Organization for AI Training Projects Is Nil
I have to commend any project lead who works in this space because I imagine it’s a nightmare of a job. They deal with demanding clients, few parameters for what is deemed quality work, and an incessant stream of chatter on Slack.
But what I have seen over time is that AI agencies are getting smarter. While a few months ago I’d be thrown into a project with just a short training document, more agencies are requiring workers to pass quizzes to get to the real work. It’s smart, but flawed. More than once, I’ve failed a quiz, been booted out, and then weeks later received an email saying they’d messed up the quizzes and I was back in. Only now there was no work!
4. Ghosting Is Common on AI Projects
Several times, I’ve been kicked off a project without an explanation why. I get blocked from Slack and have no recourse to ask what happened. A little common courtesy would go a long way here. This is such a new industry, and we’re all learning, so why not help us do better by explaining why we are no longer eligible to work on a project?
5. AI Projects End Without Warning
Project leads are always vague when workers ask how long a project will last. Usually it's only a matter of days or weeks before the work is completed and the lights are turned off. Sometimes leads say a project is just paused, but I've yet to see one come back online.
6. Project Instructions Change Frequently
Given how frenetic these projects are, it seems like the client and agency don’t take enough time to flesh out the requirements and instructions initially. That means people knee-deep in the project are suddenly given updated instructions to adhere to.
There’s Still a Lot of Room for Improvement in the AI Training Industry
Yes, this is a new frontier, and agencies and workers alike are still learning. I invite AI agencies to consider us workers instead of just cogs in the machine. Rather than ask people to work for a few hours and then sit on their hands waiting for more work that never comes, wouldn’t it be better to line up several projects and keep workers happy (and loyal), without having to train new hires every few weeks for new projects?
Data annotation jobs could develop into full-time, permanent opportunities if AI agencies reformulate how they hire and give work. That way, employees are more dedicated to the role and don’t, like me, hop from one opportunity to another.



