What if you could get paid for giving your data to companies using it to train AI models? Avi Patel, founder of AI startup Kled, is making that happen.
Avi Patel has a simple pitch for a complicated problem. If AI companies make money from training their models on human data, humans should get paid for providing it.
This week, Patel’s startup Kled AI announced a $5.5 million seed round for what it calls a “human data marketplace,” bringing its total financing to $9 million.
The investor list is unusually eclectic – spanning well-known names such as Sebastian Thrun, who founded the Google self-driving car project that became Waymo, and Aglaé Ventures.
The idea behind Kled’s human data marketplace
In a LinkedIn post last year, Patel traced his interest in “who gets paid” back to an earlier startup. After dropping out of college at 18, he built a Web3 company that later collapsed after what he described as “multiple false copyright claims.”
The experience left him convinced that digital licensing is broken, and that creators and users are too often cut out when their work is reused.
That frustration became Nitrility, the music licensing startup he built next. Now, Kled has grown out of that work as a consumer-facing offshoot aimed at paying people for their data to be used in AI training.
On paper, Kled’s model is easy to explain. People upload their files (video, text, images), Kled pays them, and then Kled licenses that data – packaged into training-ready datasets – to companies building AI models.
But the reason it’s catching attention is deeper than the business model. It’s a direct challenge to the way the modern internet works. While most products quietly extract value from users, Kled is trying to price it.
In other words, participants may be getting paid for data that helps train the very systems reshaping – and sometimes replacing – parts of human work.
Paying people for their data: how Kled’s model works
Kled says it has already found early traction through a consumer app. In recent updates, the startup has claimed users are uploading 3 to 4.5 million files per day, and that the app has hit #1 in the App Store’s Finance category in four countries.
Kled also claims its “top earner” has made $7,400 per month. Those numbers may be self-reported, but they help explain why investors might take the idea seriously.
The timing also matters. AI companies are racing to build better models, and better models usually need better data.
A lot of today’s training material has been scraped from the open web, which has triggered lawsuits, creator backlash, and a wave of broader questions that won’t go away. Who consented? Who got paid?
Kled’s bet is that “opt-in + payout” not only solves an ethical headache but also unlocks data that scraping can’t easily reach. That’s where the story gets interesting – and, well, kind of messy.
“Someone found a way to make humans fund their own replacement and the humans are calling it a side hustle,” someone wrote on X in response to Kled’s fundraising news.
“But honestly? At least Kled is telling you what’s happening to your face. OpenAI scraped your tweets and built a $300 billion company. Meta scraped your photos. Google scraped your code. They all got rich off your data and you got nothing.”
How much is human data worth?
If you’re paying people for their data, you’re not just building a marketplace – you’re essentially building a trust machine.
And that’s the part Patel knows can collapse fast. If you’ve already watched your own company fall apart over rights and claims by the age of 22, you start to treat “consent” and “ownership” like the foundation.
Kled says privacy and compliance with laws like GDPR and HIPAA are “top of mind.” It also describes processes meant to anonymize and remove sensitive information before data is packaged and labeled.
Outside the startup world, researchers and regulators are also grappling with what “anonymized” really means in practice, especially for sensitive records.
In other words, the hard part isn’t just asking everyday consumers to upload their data. It’s making sure the system stays safe, legal, and worthy of trust at scale.
For Patel, the seed round buys time to prove that trust can be earned – and that the incentives can hold.
If Kled works, it turns a blurry, uncomfortable reality into a visible transaction, with humans transitioning from silent inputs to paid suppliers. If it doesn’t, it will be for the oldest reason in data: people decided the price wasn’t worth it.
Either way, what Patel is building forces us to face a question the AI boom keeps trying to outrun.