AI-assisted fraud is no longer a distant risk for digital platforms; it is already reshaping how systems can be exploited. A North Carolina man's guilty plea in a multi-million-dollar music streaming scheme shows just how easily artificial intelligence can be used to manipulate royalties at scale.
Michael Smith, 54, admitted to running a fraud operation that used AI-generated songs and automated bots to game music streaming platforms. The scheme brought in over $8 million in royalties, diverting earnings away from legitimate artists.
How AI-assisted Fraud Worked
Music streaming platforms pay royalties based on the number of times a song is played. These payments come from a shared pool, meaning every artificial stream reduces what genuine artists earn.
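The pro-rata pool model described above can be sketched in a few lines. This is a simplified illustration only, not any platform's actual formula; the pool size, stream counts, and artist names are all hypothetical.

```python
# Simplified pro-rata royalty model: each rights holder's share of a
# fixed monthly pool is proportional to their share of total streams.
# All figures are hypothetical, for illustration only.

def payouts(pool: float, streams: dict[str, int]) -> dict[str, float]:
    total = sum(streams.values())
    return {artist: pool * count / total for artist, count in streams.items()}

pool = 1_000_000.0  # hypothetical monthly royalty pool

# Legitimate streams only.
honest = {"artist_a": 600_000, "artist_b": 400_000}

# The same legitimate activity plus bot-driven streams of AI tracks.
with_fraud = honest | {"bot_operator": 250_000}

print(payouts(pool, honest))      # artist_a earns 600,000.00
print(payouts(pool, with_fraud))  # artist_a's payout drops to 480,000.00
```

Because the pool is fixed, the bot operator's earnings come directly out of what legitimate artists would otherwise have received.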
Smith exploited this model in a calculated way.
He created thousands of fake user accounts and used software to stream his own songs repeatedly. But instead of pushing a few tracks to the top, he spread the activity across a large number of songs to avoid raising suspicion.
The key enabler was artificial intelligence.
To sustain the operation, Smith generated hundreds of thousands of songs using AI. This gave him a constant supply of content that could be streamed without limits. Combined with bots simulating listeners, the setup made the activity look close enough to normal user behaviour to slip past basic detection systems. This is what makes AI-assisted fraud different. It is not just automated — it is scalable in a way that traditional fraud never was.
“Michael Smith generated thousands of fake songs using artificial intelligence and then streamed those fake songs billions of times,” said U.S. Attorney Jay Clayton. “Although the songs and listeners were fake, the millions of dollars Smith stole was real. Millions of dollars in royalties that Smith diverted from real, deserving artists and rights holders. Smith’s brazen scheme is over, as he stands convicted of a federal crime for his AI-assisted fraud.”
The numbers are significant, but the impact is broader than one case. Streaming fraud directly affects payouts to artists who rely on these platforms for income. When fake engagement enters the system, it distorts how revenue is distributed.
A Larger Problem for Platforms
This case highlights a structural issue. Streaming platforms are designed to reward engagement: more plays mean more earnings. But when engagement can be manufactured at scale, the model becomes vulnerable.
What stands out is the level of planning. By spreading streams across thousands of tracks, the activity avoided obvious spikes that might trigger alerts. This suggests that fraud is becoming less visible, not more.
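The evasion tactic described above, the same volume of fake streams spread thinly instead of concentrated, can be illustrated with a toy per-track spike check. The threshold and stream counts here are invented for the example; real detection systems use far richer signals.

```python
# Toy illustration: a fixed number of fake streams, either concentrated
# on a few tracks or spread across thousands. A naive per-track spike
# alert catches the first pattern but misses the second entirely.
# The threshold and counts are hypothetical.

FAKE_STREAMS = 10_000_000
ALERT_THRESHOLD = 50_000  # hypothetical per-track play-count alert

def flagged_tracks(total_streams: int, num_tracks: int) -> int:
    """Number of tracks a naive per-track threshold would flag."""
    per_track = total_streams // num_tracks
    return num_tracks if per_track > ALERT_THRESHOLD else 0

print(flagged_tracks(FAKE_STREAMS, 10))       # concentrated: 10 flagged
print(flagged_tracks(FAKE_STREAMS, 100_000))  # spread thin: 0 flagged
```

With 100,000 tracks in play, each one averages only about a hundred streams, which is indistinguishable from ordinary catalogue activity to a count-based alert.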
For platforms, that creates a difficult balance. Tightening controls too much risks affecting genuine users, while weak detection leaves room for abuse.
Why AI-assisted Fraud Is Growing
The tools needed to carry out this kind of scheme are no longer out of reach. AI can generate content quickly and cheaply, while automation tools can replicate user behaviour at scale.
This combination lowers the barrier for fraud.
It also means similar tactics could appear in other digital ecosystems, anywhere engagement drives revenue. Whether it is streaming, advertising, or social media, the underlying risk is the same.
What Happens Next
Smith has pleaded guilty to conspiracy to commit wire fraud and faces a maximum sentence of five years in prison. He has also agreed to forfeit over $8 million. Sentencing is scheduled for July 29, 2026.
The case is being handled by federal prosecutors in New York, with support from the Federal Bureau of Investigation.
The bigger takeaway is hard to ignore. AI-assisted fraud is no longer theoretical; it is already affecting how digital platforms operate.
For the music industry, this case raises uncomfortable questions about how royalties are tracked and protected. For other platforms, it serves as a warning.
When fake content and fake users can generate real money, detection can no longer rely on surface-level signals. Systems will need to get better at identifying behaviour, not just numbers.
Because if they don’t, the cost will continue to fall on those who are playing by the rules.
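One direction behaviour-based detection could take, shown here purely as a sketch, is scoring accounts on how they listen rather than how much. Every feature name and threshold below is invented for illustration and is not any platform's actual detection logic.

```python
# Sketch of behaviour-based scoring: instead of counting plays, look at
# how an account listens. All feature names and thresholds are
# hypothetical, chosen only to illustrate the idea.
from dataclasses import dataclass

@dataclass
class Account:
    plays_per_day: float         # raw volume
    distinct_artists: int        # variety of listening
    avg_play_seconds: float      # bots often stop just past the payout mark
    active_hours_per_day: float  # humans sleep; simple scripts do not

def suspicion_score(a: Account) -> int:
    score = 0
    if a.plays_per_day > 500: score += 1
    if a.distinct_artists < 3: score += 1
    if a.avg_play_seconds < 35: score += 1
    if a.active_hours_per_day > 20: score += 1
    return score  # 0 = looks human, 4 = almost certainly automated

bot = Account(plays_per_day=2000, distinct_artists=1,
              avg_play_seconds=31, active_hours_per_day=24)
human = Account(plays_per_day=40, distinct_artists=25,
                avg_play_seconds=180, active_hours_per_day=3)
print(suspicion_score(bot), suspicion_score(human))  # 4 0
```

The point of the sketch is that no single signal gives the bot away, but the combination of behaviours does, which is exactly what a raw play count cannot capture.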
