Age-verification protocols put in place by social media sites are ineffective and easily sidestepped by children, a study has found.
The age at which children are allowed on social media platforms varies depending on country, with the UK and US allowing 13-year-olds to have accounts. France, Ireland and Germany, however, have a minimum age of 16.
A report by Lero, an Irish software research centre, found tech-savvy children are able to circumvent these measures with relative ease.
‘This results in children being exposed to privacy and safety threats such as cyberbullying, online grooming, or exposure to content that may be inappropriate for their age,’ said lead author Dr Liliana Pasquale from University College Dublin.
The study looked at the current methods of age verification for ten popular social media platforms: Snapchat, Instagram, TikTok, HouseParty, Facebook, WhatsApp, Viber, Messenger, Skype and Discord.
Researchers studied the age verification process in April 2019 and again in April 2020.
The Children’s Online Privacy Protection Act (COPPA) came into force in the US in 2000 and, by restricting the collection of data from younger children, effectively makes 13 the minimum age at which a person can hold an account.
GDPR in Europe added a stipulation that children under 16 must have parental permission, although member states may lower this threshold to 13.
However, the researchers found that, for all ten apps, if a child simply says they are 16 or older when first trying to set up an account, there is no proof of age required.
This automatic approval means underage children who lie about their age receive an account, with no further checks.
TikTok makes under-16s’ profiles private by default
TikTok has introduced a raft of updates to improve the privacy and safeguarding of children on the platform.
Users under the age of 16 now have their accounts set to private by default, which means the only people who can view their content are approved followers.
Strangers are also prevented from commenting on videos made by under-16s, meaning minors have just two options – allowing comments from friends only, or turning off comments entirely.
Children’s charity NSPCC praised the package of updates, which are designed to protect young and vulnerable users on the app.
Dr Pasquale says the current processes which are intended to protect children are ineffective.
‘In reality, the application of substantial financial penalties was the main trigger for app providers to implement more effective age verification mechanisms,’ she said.
‘Based on our study and on our survey of biometrics-based age recognition techniques, we propose a number of recommendations to app providers and developers.’
The study also tested other forms of age verification, including biometric and voice analysis.
It found these too had limitations – pre-recorded voice clips, for example, could be used to bypass voice-based checks.
Recommendations from the experts include providing a much clearer explanation of age-restricted features, applying the strictest privacy settings by default to anyone who says they are under 18, and encouraging honesty.
The study authors also call for much more robust age verification methods. Currently, biometric security measures which use facial features are not reliable or accurate enough.
The age-verification method should also be an ongoing process, not a one-off at sign-up, the authors add.