Children face many online dangers – including pornography, sexual grooming and trolls – and the threat is growing as their digital behaviour evolves. They are spending more time online, tablet and smartphone ownership is on the rise, and kids’ digital capabilities are outpacing those of parents and carers, who often feel ill-equipped to advise.
Many digital content and service providers are now expected to help parents, charities and teachers deal with these issues, and brands and social platforms are also beginning to be held accountable. But the tension between parental control and empowering children to make informed choices has prompted these companies to take vastly different approaches.
Exposure to online video
Children’s internet use has reached record highs, according to Ofcom data published last month. Children aged five to 15 spend around 15 hours each week online, overtaking time spent watching a TV set for the first time. Pre-schoolers aged three to four spend eight hours and 18 minutes a week online, up an hour and a half since last year.
Ofcom’s report also shows that YouTube is one of the most popular online destinations, with 73% of those aged five to 15 using the video site and 37% of pre-schoolers regularly watching. The minimum age for YouTube is 13, though the platform has a downloadable kids’ app.
The popularity of the platform, and of vloggers, can be an advantage in reaching children with safety-related messages. For example, children’s charity the NSPCC and agency Livity created a series of vlogger-voiced YouTube animations called ‘Fight Against Porn Zombies’ (FAPZ) to remove the embarrassment and shame associated with discussing issues around porn, directing young people to Childline as a source of support.
Research by the NSPCC, the Office of the Children’s Commissioner and Middlesex University found over half of children aged 11 to 16 have been exposed to online pornography, with almost all (94%) having seen it by age 14. They are as likely to have been inadvertently exposed to pornography, for example via a pop-up ad, as to have actively searched for it.
“We know that YouTubers have a huge following with our audience so their involvement [in FAPZ] substantially increased the reach and drove more young people to view the content and supporting information,” says Julia Fossi, head of child online safety, public affairs and policy at the NSPCC.
“It has incomparable usage rates to other video platforms and we wanted this campaign to exist in the spaces where young people are. We also know that a lot of self-generated porn is shared on social platforms, so it’s logical to carry out this campaign in a social space where teens may be exposed to explicit imagery.”
Since FAPZ, the charity has developed a ‘Listen to Your Selfie’ campaign, which launched in September. It focuses on grooming and child sex exploitation, with surrounding content on unhealthy relationship behaviours.
The campaign features two online films, one focusing on an online gaming scenario where a 14-year-old boy is groomed by an older male. The film signposts to supporting advice on Childline and features promoted content on Snapchat, Instagram, YouTube and Facebook.
“Children no longer relate to the phrase ‘online lives’ – we say it but they don’t,” according to Children’s Commissioner for England, Anne Longfield. “That distinction doesn’t exist, as near-constant digital engagement has become their norm.”
This reality prompted the launch of 5Rights, an international initiative that lays out five principles stating children’s digital rights. They include the right to easily edit or delete all content they have created and the right to know who is holding or profiting from their information, what their information is being used for and whether it is being copied, sold or traded.
The initiative says children and young people should be confident that they will be protected from illegal practices and supported if confronted by troubling or upsetting scenarios online. It also states that they should be empowered to reach into creative places but with the capacity and support to easily disengage, and that they need to be taught the skills to use, create and critique digital technologies.
Longfield polled children about their digital rights and says 63% “had no idea what their digital rights were and 81% wanted more help understanding them or more knowledge of what they should expect as [internet] users”.
Solutions devised to help manage the risks that are faced in the online world must be done in partnership with children.
Julia Fossi, NSPCC
The research also showed that 88% did not know what information is outlined in terms and conditions and never read them. Longfield is therefore looking at introducing a “rights-friendly, easily understood model of terms and conditions to the industry – this framework has been built by expert lawyers and is not only easily understandable, but retains its legal functionality”. This is so young people know “what they sign up to every time they click that button, what rights they have and what rights they give away”.
Speaking at a Westminster Education Forum event on technology, child development, 5Rights and policy, Longfield said: “We want children to be confident creators and managers of their online time, this is one of the most pressing social issues of our time. Protecting children is a shared responsibility in which parents, carers, governments, policy makers, educators and, importantly, industry, all have a vital role to play.”
As children spend more of their time online, awareness of advertising and vlogger endorsements has also increased, according to the Ofcom study: more than half of internet users aged 12 to 15 (55%) are now aware that online advertising can be personalised – up 10 percentage points in the past year. The age group’s awareness of product endorsement from vloggers has also risen by 10 percentage points, to 57% in 2016.
However, many children still need help to identify advertising on Google’s search pages, with only a minority of eight- to 11-year-olds (24%) and 12- to 15-year-olds (38%) correctly recognising sponsored links.
Marc Goodchild, chief product officer at kids content creator Gingersnap and executive advisor at The Children’s Media Foundation, expressed concern at the Westminster Education Forum over a status quo “where there is no accountability and no transparency”.
He believes the online world is reaching a “turning point” where “technological oligarchs” have created “consolidation in the industry and there are fewer platforms that are controlling what children access, where and how they access it”.
Goodchild says: “Children even up to the age of 15 are unskilled at being able to spot an advert on Google, and when asked most of them don’t know the difference between a placed ad in search and an organic entry.”
He adds: “We have digital experts in their thousands coming up with very clever algorithms to improve the monetisation and stickiness of these platforms to make sure that anyone making stuff can turn data into money. If only a fraction of that money was used to make the internet more age-appropriate by design, you would have a much better device.”
Building online safety into the design of a service is the route Sky took with its digital safety strategy, breaking it down into three strands: filters, education and products.
Sky Broadband Shield is a network-level filter covering all devices using Sky’s services in the home. The company decided to turn it on by default: take-up in homes is 62%, compared with closer to 10% under an opt-in model.
Sky also backs Internet Matters, a not-for-profit organisation that aims to help keep children safe in the digital world, together with BT, TalkTalk and Virgin Media. Google and the BBC are soon to join.
Joining Goodchild at the Westminster Education Forum, director of policy at Sky, Adam Kinsley, told delegates that it is important to design products for children right from the start.
Kids want to discover, they are curious and don’t want to be told what they can watch.
Adam Kinsley, Sky
The Sky Kids app is a curated walled garden of age-appropriate content, built so children can enjoy it safely. It supports age profiling across up to 10 user profiles, lets children build an avatar, includes a bedtime mode and carries no ads.
Kinsley says children were involved in the design of the app. He explains: “The user interface was put in front of a panel of children and we were keen that this was an app that they could discover content in.” It also offers autoplay, serving up new content automatically once a video finishes.
He says: “As a content company, that is neat because you can get [kids] to look at content that you invest a lot of money in. Parents [had] mixed views about it, but when we put it in front of children they hated it. They want to discover, they are curious and don’t want to be told what they can watch.”
Fossi at the NSPCC agrees with this approach. She says: “Children and young people are experts in their experiences online – and as such, any solutions devised to help manage the risks that are faced in the online world must be done in partnership with young people themselves.”
Educating and supporting parents
Internet Matters released a study that shows 48% of parents believe their children know more about the internet than they do and 78% of children agree. It also shows that children spend significantly longer on the internet than their parents, and twice as long on social media.
In 2014, O2 began to notice that online child safety was becoming a big concern among parents and more widely among customers, as coverage of child sex abuse dominated the media.
Bill Eyres, head of sustainability at O2, says the brand “stepped back” and looked at what it could do to support customers and insight showed that “parents were struggling because kids are so far ahead of them in the digital world and they need more support”.
That insight saw O2 partner with the NSPCC. Eyres says: “The biggest gap in the market we saw was human-to-human advice. There’s lots of content online but what parents wanted was the ability to get tailored advice. That is why we developed the free helpline, which isn’t just for customers but available to anyone.”
It was important for O2 to integrate the approach into the core of the business, so office and retail staff are trained in online safety. The company also launched a kids’ tablet offering a walled garden of content, and featured the initiative in its ‘More for You’ brand campaign.
The Ofcom study backs O2’s approach. It shows that conversations are an important part of keeping children safe online. Nine in 10 children aged eight to 15 have had conversations with parents or teachers about being safe online, according to the research, and would tell someone if they saw something they found worrying or nasty.
Parents of older children are most likely to be having these types of conversations with their children, with 92% of parents of 12- to 15-year-olds saying they have spoken to their child about online safety, an increase of six percentage points since 2015.
Eyres says: “Our first year [in online safety] was driving awareness of the issues and the need to have conversations. Now we are naturally moving into behaviour change. The biggest thing is about behaviour change, [it] is about helping parents have regular high-quality conversations with their children about online safety.”
Results from O2’s brand campaign show that seven in 10 parents had a conversation with kids as a result of its advertising and consumers felt warmer towards O2 after seeing the campaign.
Nearly all parents (96%) of five- to 15-year-olds manage their children’s internet use in some way – through technical tools, talking to their child, supervising them, or setting rules about access to the internet and online behaviour. Two in five parents use all four approaches.
App creators are also involving parents in shaping brands for kids, and some apps are built with parental control in mind. Hopster, a UK TV and learning tablet app for children aged two to six, provides tools for parents to monitor usage.
Working with video platform Ooyala, it reinvents kids’ TV channels by combining shows with an ad-free educational curriculum and gaming that can be personalised, and uses technology to help reassure parents that their kids are in the right environment.
Nick Walters, founder and CEO at Hopster, says: “We have reports on a monthly basis on what they have learnt and discovered – and we provide tools. We have a feature that allows them to set a defined amount of time to spend in front of the TV and after that the TV shuts down.”
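Hopster has not published how its time-limit feature is built, but the behaviour Walters describes – a parent sets an allowance, and playback stops once it runs out – can be sketched in a few lines. All names here are hypothetical; this is an illustration of the idea, not Hopster’s implementation:

```python
import time


class ScreenTimeLimit:
    """Minimal sketch of a parental screen-time limit: once the allowed
    number of seconds has elapsed, playback is reported as blocked.
    (Hypothetical illustration; not Hopster's actual code.)"""

    def __init__(self, allowed_seconds, clock=time.monotonic):
        self.allowed = allowed_seconds
        self.clock = clock          # injectable clock, handy for testing
        self.started = clock()      # session begins when the limit is set

    def remaining(self):
        """Seconds of viewing time left, never below zero."""
        elapsed = self.clock() - self.started
        return max(0.0, self.allowed - elapsed)

    def may_play(self):
        """True while the child still has viewing time remaining."""
        return self.remaining() > 0
```

A real product would persist the elapsed time across restarts and let the parent reset or extend the allowance; the injectable clock is just a design choice that makes the countdown easy to test without waiting in real time.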
Walters adds: “One of the great things about a digital product is that you are in constant communication with your users. You get app reviews and emails and we also run a private social network for our most engaged users, who not only feed back on the product as it is today, but share ideas and thoughts in real time about where we should be taking the product.”
Social media accountability
A raft of apps is appearing to alleviate some of the online safety issues around social media sites, which many children use despite the minimum age of 13 set by Twitter, Facebook, Instagram, Snapchat and YouTube.
Online Them, a monitoring tool created by Hello Soda, enables parents – with their child’s consent – to use social media data to see who their children are talking to. The app uses advanced text analytics and artificial intelligence to identify high-risk language, indicators of cyberbullying and adult content, and alert parents when children start interacting with a new account.
“The tool builds trust between a parent and their child, as the parent requires consent from the child in order to access their data,” says James Blake, CEO at Hello Soda. “While steps are being taken to make the internet safer for children, the content that young people consume online is largely unregulated. It’s difficult for parents to know who exactly their children are talking to, or whether they’re being exposed to adult content or cyberbullying. These kind of tools will help to build a safer online community for all young people.”
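Hello Soda has not disclosed its models, but the core idea Blake describes – scanning message text for high-risk language and raising an alert – can be sketched as a simple pattern scorer. The categories and word lists below are invented for illustration; a real tool would use far richer text analytics than keyword matching:

```python
import re

# Hypothetical risk categories and patterns, loosely in the spirit of
# monitoring tools like Online Them. These lists are invented examples.
RISK_PATTERNS = {
    "bullying": [r"\bhate you\b", r"\bloser\b"],
    "grooming": [r"\bour secret\b", r"\bdon'?t tell\b"],
}


def flag_message(text):
    """Return the risk categories whose patterns match the message text."""
    lowered = text.lower()
    return [
        category
        for category, patterns in RISK_PATTERNS.items()
        if any(re.search(p, lowered) for p in patterns)
    ]


def alerts_for(messages):
    """Map each flagged message to its categories, skipping clean ones."""
    return {m: flag_message(m) for m in messages if flag_message(m)}
```

In practice the consent step Blake describes would gate access to the message data before any scanning happens, and alerts would go to the parent rather than being returned directly.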
Another example is recently launched app Oyoty, which is designed to make children more aware of what they are sharing online by singling out content that is personal or inappropriate as soon as it is posted on Facebook, Instagram or Twitter.
Founder, Deepak Tewari, says the age limit on social media is “the big elephant in the room”. He says: “According to a study done by the BBC last year, more than 70% of kids under the age of 12 have social media accounts. This is part of the reason why [the Oyoty app was developed]. There are a whole bunch of kids using social media either without authorisation or training.”
Using artificial intelligence, the app starts a conversation with the child and guides them through editing or removing content, as well as educating them about what they should or should not be posting.
Tewari adds: “We did a lot of work to make sure we are a friend of the child and non-judgemental. [Users] are welcome to post or disregard what the app is telling them but it just [makes a] suggestion in a fun way to make it light. [We also] worked with a child psychologist on the language to make sure conversations were not condescending, or big brother-like.”
The future of the app is in the technology. Tewari would like to expand to multiple languages and functions, for example adding in sensors for grooming or self-harming. Involving the child is also vital. Tewari says: “The technology is strong – but to reach out to the child is the differentiator. We use all our power to empower and equip the child.”
Goodchild at The Children’s Media Foundation believes that social media services operate in grey areas of legislation. He says: “Social media is creating new opportunity but also new pressure on kids [who] are obsessed by likes. Getting a like will negate their in-built preferences for trying to protect their own safety and security.”
He adds: “We are all complicit in it because we all do campaigns. As a kids content maker, I know I have to work with those guys because that is where the audience is, but the organisations themselves are not prepared to put their hands up and say it is fit for purpose. If you have a vast number of children using your services, you have to stand up and be accountable.”
Perhaps more brands, content creators, providers and curators will wake up to societal concerns around online safety for children, particularly when issues are raised by users themselves. A combination of educating parents and empowering children could prompt those businesses to launch more child-friendly products and services and to take account of the changing behaviour of kids online.
Source: Marketing Week
Growing up digital: How to protect kids online