REGULATION

North Carolina Man Pleads Guilty to $8M AI Music Streaming Fraud

By Priya Sharma · Mar 21, 2026 · Updated Apr 7, 2026 · 3 min read
Engine Score 7/10 — Important

This story details a significant legal precedent for AI misuse, impacting the music industry and broader AI ethics. It offers actionable insights into the legal risks and consequences of AI-generated content schemes.

Michael Smith, 54, of North Carolina, pleaded guilty on March 19 to one count of conspiracy to commit wire fraud for orchestrating a scheme that used AI-generated music and bot accounts to steal over $8 million in royalties from major streaming platforms, The Record reported. The case, prosecuted by the Southern District of New York, marks the first federal criminal prosecution in the United States centered on streaming fraud tied to AI-generated music.

Between 2017 and 2024, Smith created hundreds of thousands of AI-generated songs and deployed thousands of bot accounts — up to 10,000 active at once — to artificially inflate play counts on Amazon Music, Apple Music, Spotify, and YouTube Music. He used VPNs to make the bot traffic appear as though it came from legitimate listeners spread across different locations, generating billions of fake streams across thousands of tracks. The scheme yielded $8,091,843.64 in fraudulent royalty payments, which Smith has agreed to forfeit.

At its peak, the operation generated approximately 661,440 streams per day, translating to roughly $1.2 million in annual royalty income. Smith also made false statements to streaming services, rights organizations, and music distributors to maintain the fraud. He worked alongside a co-conspirator identified as Alex Mitchell, CEO of AI music company Boomy, who was listed as a co-writer on the tracks but has not been charged.
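Those two figures imply a payout rate of roughly half a cent per stream, in line with typical streaming royalties. A quick back-of-envelope check (a sketch using only the numbers reported above, not figures from the court filings):

```python
# Back-of-envelope check: implied per-stream payout from the reported figures.
streams_per_day = 661_440        # peak daily streams reported
annual_royalties = 1_200_000     # roughly $1.2M per year reported

annual_streams = streams_per_day * 365
per_stream = annual_royalties / annual_streams

print(f"{annual_streams:,} streams/yr -> ~${per_stream:.4f} per stream")
# -> 241,425,600 streams/yr -> ~$0.0050 per stream
```

At scale, even a fraction of a cent per stream compounds quickly, which is what made the scheme lucrative.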

U.S. Attorney Jay Clayton said in a statement: “Although the songs and listeners were fake, the millions of dollars Smith stole were real.” Smith faces up to five years in federal prison, with sentencing scheduled for July 29, 2026.

The prosecution arrives as the music industry grapples with a surge of AI-generated content flooding streaming platforms. According to data from streaming service Deezer, the platform received more than 60,000 fully AI-generated tracks daily as of January 2026, accounting for 39 percent of all new songs uploaded. Deezer also found that up to 85 percent of streams on AI-generated tracks were fraudulent in 2025, underscoring the scale of the problem facing the broader industry.

The financial impact extends beyond the platforms themselves. Streaming fraud diverts royalty payments away from human artists and songwriters who depend on per-stream revenue. As AI tools make it cheaper and faster to produce music at scale, the economics of streaming fraud become more attractive to bad actors. Platforms have responded by investing in detection systems and adjusting payout models, but the sheer volume of synthetic content continues to strain existing safeguards.

The case also raises questions about the liability of AI music generation companies whose tools are used in fraudulent schemes. While Mitchell was named as a co-conspirator in court documents, the decision not to charge him suggests prosecutors may be drawing a distinction between those who build AI music tools and those who deploy them for fraud. That distinction will likely be tested in future cases as enforcement actions in this area increase.

Smith’s sentencing on July 29 will set an early benchmark for how federal courts punish AI-enabled streaming fraud. Meanwhile, streaming platforms face mounting pressure to implement stronger verification systems that can distinguish genuine listener activity from bot-driven manipulation at scale.
