
Using AI chatbots to learn about financial risks in games

Parent guide


Around half of children feel online games are not fun unless they spend money. Many games come with financial risks, including gambling-like features and scams, as well as tactics that pressurise and incentivise spending. With so many games on the market, how can parents keep up?
 

AI chatbots as the solution

Our research has found that AI chatbot tools can explain financial game features and highlight potential risks, including the ways children are pressured to spend.

The quality of the information, however, depends on what you ask. 

For example, vague questions can lead to unclear or incomplete responses. By contrast, specific, well-structured prompts usually give more accurate results.

This guide shows you how to use AI chatbots to get clearer, more reliable answers – including some recommended ready-to-use prompts you can copy.

 

3 AI prompts: ready to try.


Games often include ways for kids to spend money, and it can be tricky to know which ones are risky. These prompts have been carefully tested to help you get clearer, more reliable answers from AI chatbots. 

Make sure you copy and paste them exactly, switching out [Game Name] appropriately and editing [Platform] to the one your child uses.

Copy and paste the text below. Remember to customise it to the game you are interested in. 

Create nothing but a table including all of the microtransactions and in-game purchases for the game [Game Name] on [Platform]. Include a score from 1 to 10 on whether the game is seen to be pay-to-win. Include a score from 1 to 10 on how likely the game is to encourage unhealthy spending. Include a suggested age rating based on potential financial risk. Respond with only the table and scores, the least and most expensive available microtransactions in the game, followed by a short summary of the data.

Copy and paste the text below. Remember to customise it to the game you are interested in. 

List the specific design tactics, such as time-limited offers, streak rewards, or peer pressure incentives, used in [Game Name] on [Platform] to encourage frequent purchases.

Copy and paste the text below. Remember to customise it to the game you are interested in. 

Does [Game Name] have parental controls that can block or monitor my child’s in-game spending on [Platform]?


 

5 tips for creating better AI prompts.


Be specific 

Instead of asking “Is [Game X] safe?”, ask bots to explain specific spending risks or tactics. For example, “Which in-game purchases could encourage my child to spend too much?”
 

Keep it neutral

Avoid emotional phrasing – like “Is my child in danger?” – and focus on facts. For example, what spending options exist, how they work, and how risky they might be.
 

Include your child’s age

AI gives more relevant answers when it knows your child’s age. Say “my 12-year-old” so the advice matches that stage.
 

Ask about specific design tactics

For clearer answers, ask the AI to identify the design features a game uses to encourage spending. For example, you can ask about tactics such as loot boxes, streak rewards, time-limited offers, or social pressure from other players.
 

Separate third-party risks

Some risks come from outside the main game, such as third-party sites and unofficial marketplaces. Ask the AI to list these separately. AI can find this tricky, so be persistent.
 

Note: While we tested chatbots around financial harms in games, you might also find AI chatbots useful for other kinds of parenting support.


But wait: can AI chatbots get it wrong?


We conducted research with youth platform VoiceBox to test five popular AI chatbots: Claude, Gemini, ChatGPT, Grok, and Meta AI.

The research found that the baseline quality of information these AI models provide to parents varies. Answers differed depending on how the question was asked and on the bot’s own “persona”.

In our testing, Claude scored highest for its advice, followed by Gemini. Note, however, that the best bot is usually the one you already have an established history with.

We also learned

  • Chatbots can confuse different versions of the same game, like mobile vs console, and treat third-party platforms or marketplaces as part of the main game.
  • Chatbots can give confident-sounding answers that are actually incorrect (these are termed “hallucinations”).
  • Chatbots can miss hidden spending risks if your question isn’t specific enough.
If you would like to read the full report, click here. [LINK TO COME]