Top video game Fortnite could be putting underage users at risk of being contacted by strangers, the UK’s leading charity for children, the NSPCC, has warned.
The National Society for the Prevention of Cruelty to Children (NSPCC) warned that some of Fortnite’s features may put children’s safety at risk by allowing users to communicate with each other through voice and text chat. The game lets players turn off voice chat in its settings, but not text conversations, leaving children exposed to being approached by strangers.
It comes as figures from one of the UK’s top mobile service providers, O2, and the NSPCC revealed that one in four children had been approached on popular games and apps by someone they did not know.
Laura Randall, the NSPCC’s associate head of child safety online, said: “Apps, sites, and games such as Fortnite: Battle Royale can be great opportunities for young people to play and engage online.
“However, in light of emerging concerns about the risks children could be exposed to, we are urging parents to be aware of Fortnite’s features. It’s vital parents have regular conversations with their children about the games they are playing, and how to stay safe online.”
Epic Games, a US video game software developer, said Fortnite had racked up 45 million players as of January.
The NSPCC has issued fresh advice to parents about the game following the child safety concerns. This includes having an open conversation with children about their online activities and how to stay safe. It also advised parents to make use of privacy and parental-control settings.