Meta AI, Gemini & Grok Prompt UK Users to Illegal Casinos

AI Chatbots & Illegal Gambling

While the hype around how helpful AI chatbots are shows no sign of subsiding, the bots themselves are playing a double game. Research by The Guardian and Investigate Europe found that major chatbots direct British users to illegal online casinos. The findings raised concerns about safeguards and tarnished the reputation of the tech companies behind them.

Five Chatbots – Five Failures

Online casinos that do not hold a license from the United Kingdom Gambling Commission (UKGC) may not promote their services to British players; the prohibition is enshrined in the Gambling Act 2005. Players who access offshore sites are not prosecuted, but they put their own safety and finances at risk.

The recently published investigation covered five major AI assistants:

  • Copilot, developed by Microsoft
  • Grok by X
  • ChatGPT by OpenAI
  • Meta AI by Meta
  • Gemini by Google

All five recommended unlicensed casinos. Investigators used UK IP addresses to mimic ordinary users, asking the chatbots to pick the best non-UK platforms and explain how to access sites not covered by GamStop. Some prompts even asked how to bypass "source of wealth" and other anti-money-laundering (AML) checks.

The results were disappointing. None of the bots refused the prompts. Instead, they listed illegal sites and offered tips for passing the checks. The investigation caused a stir and underlined the need for tech firms to build stronger controls into their products.

AI Recommendations: Red Flags in Every Answer

The key harm lay in recommending offshore operators holding weak licenses, such as those issued in Curacao and Gibraltar, neither of which is recognised as valid in the UK. Such jurisdictions are often associated with fraud and problem-gambling issues.

Beyond the recommendations themselves, the bots spoke negatively about the GamStop scheme and financial checks. Users were also advised to pay with cryptocurrency because it involves no third parties. The way the assistants described bonuses caused concern as well: the bots concluded that promotional deals on unlicensed websites are better than those offered by legal casino operators.

Microsoft Copilot and ChatGPT were the only participants to insert a gambling addiction warning before giving answers.

How Regulators and Tech Companies Reacted

UK officials responded to the issue instantly. Henrietta Bowden-Jones, the government adviser on gambling harms, said that no chatbot should recommend illegal sites or undermine British regulatory measures such as Anti-Money Laundering (AML) checks, Know-Your-Customer (KYC) verification, or the GamStop scheme.

Tech companies, in turn, pledged to strengthen their safeguards, improve prompt detection, and carry out additional training to prevent harmful recommendations in the future.

This is not the first incident involving AI assistants. Several suicides have been linked to bots that failed to intervene when users expressed suicidal thoughts.

New Challenges for UK Gambling Regulation

The chatbots' outputs contradict the UK’s strategy of cracking down on the black market, which takes an estimated £2.7bn in stakes annually.

On the one hand, players see bans on offshore brands sponsoring British sports teams and the BGC’s interactive quiz raising black-market awareness. On the other, major AI chatbots completely ignore the harms of offshore gambling and even teach users how to bypass the UKGC’s restrictions.

It remains to be seen whether UK officials will introduce new regulations in response to this research. Tech companies, for their part, could block harmful requests from UK IP addresses or exclude black-market domains from AI outputs.
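To make the second suggestion concrete, here is a minimal sketch of what excluding black-market domains from AI outputs could look like: a post-processing filter that redacts any domain found on a blocklist before the reply reaches the user. This is purely illustrative; the domain names and the blocklist itself are hypothetical, not taken from the investigation, and a production system would pull its list from a maintained source such as regulator advisories.

```python
import re

# Hypothetical blocklist of unlicensed casino domains (illustrative entries only).
UNLICENSED_DOMAINS = {"example-offshore-casino.com", "no-gamstop-site.net"}

# Rough pattern for anything that looks like a domain name in free text.
DOMAIN_RE = re.compile(r"\b([a-z0-9-]+(?:\.[a-z0-9-]+)+)\b", re.IGNORECASE)

def filter_response(text: str, blocklist: set = UNLICENSED_DOMAINS) -> str:
    """Redact any blocklisted domain that appears in a chatbot's draft reply."""
    def redact(match: re.Match) -> str:
        domain = match.group(1).lower()
        return "[redacted]" if domain in blocklist else match.group(1)
    return DOMAIN_RE.sub(redact, text)
```

In practice such a filter would sit between the model and the user, alongside prompt-level refusals, so that even a reply the model should not have generated never surfaces an offshore brand.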

Lead iGaming Expert at Cardmates