
Mary Poppins and messy conversations: TikTok prepares for looming regulation


TikTok’s child safety chief insists the entertainment app is learning from the past mistakes of tech pioneers Facebook and YouTube, as new laws governing social media companies are drafted in the UK.

Alexandra Evans, TikTok’s head of child safety public policy in Europe, told the IAB’s Digital Trust Seminar that “safety first” is the Chinese-owned company’s approach, while admitting that “messy” conversations – in which people open up about difficult issues such as eating disorders – lie within grey areas of policy.

The former children’s rights campaigner and British Board of Film Classification policy director joined TikTok in March last year and works with the EMEA Trust & Safety Hub, based in Dublin, where staff numbers have ballooned from just 20 people at the start of 2020 to an estimated 1,100 in early 2021.

For context, that represented about half of TikTok’s 2,000 European staff as of the end of 2020.

The video app, owned by China’s ByteDance, announced last August that it will set up its first European data centre in Ireland, where tech giants Google, Facebook and Apple also have their European headquarters and benefit from a low corporate tax rate of 12.5%.

“What I love about TikTok is we are a second-generation platform, which means we have the advantage of seeing how others have made decisions and we can either replicate or we can choose to take a different path,” Evans said in conversation with James Chandler, the IAB UK’s chief marketing officer.

To help forge this path, Evans (pictured, above) explained she has learnt from her BBFC experience that any classifications or rules TikTok sets need to “resonate” with the people being impacted by them.

In practice, this means running focus groups with young people in the UK, Ireland, France, Germany and Italy in order to work through difficult issues, as well as creating “digital literacy” resources for parents and caregivers.

Evans compared differences in expectations by age groups with classic movie Mary Poppins, where Mr Banks and his children have radically different expectations about the ideal nanny’s attributes.

TikTok has had to move fast after exploding onto the online media scene in 2018 as a new short-form video app – then mostly used for making lip-syncing videos and comedy skits.

In 2019, it became mired in a series of brand safety and privacy incidents, including being investigated by the UK Information Commissioner’s Office over the way it was handling the personal data of young users.

In February that year, the US Federal Trade Commission fined TikTok a record $5.7 million for illegally collecting the names, email addresses, pictures and locations of children under age 13. A report at around that time from children’s charity Barnardo’s revealed that predators were targeting users as young as eight with sexually explicit messages.

Evans described a more proactive approach by TikTok of late, such as its decision in January of this year to make accounts private by default for young users aged between 13 and 15.

This is important because TikTok’s meteoric rise in popularity in the UK and US has been driven by teens and young adults. Its UK user base is set to grow to reach the tens of millions this year.

“The decision we made about the default setting, which we made as a two-year-old platform – before most platforms [which] were many years older, really speaks to our huge ambitions in this space,” she added.


It is a critical time for social media companies operating in the UK, with the Government having this month launched an Online Safety Bill.

The proposed law would give Ofcom the power to require social networks to remove and limit the spread of child sex abuse, terrorist material and suicide content, with “duty of care” fines of up to 10% of global annual turnover for those that fail to comply.

Without referring to the Online Safety Bill, Evans said: “We’re just very comfortable with the fact that regulation and policymakers have views about how we should operate. We are very welcoming of the fact that some of the parameters that we operate as a company will be defined by regulation.”

While Evans pointed out that TikTok is clear about not allowing content that normalises, glamourises or promotes harmful behaviour, such as eating disorders, she said it will be “tricky” to police content in which users describe their own experiences of difficult subjects.

She explained: “That conversation is a messy conversation. That’s a place where I do see the grey areas: we [want to] ensure TikTok is a place where you come and connect and feel like you are supported and to provide support for others, but in a way that fully accounts for other users’ rights to be protected from something that is harmful.”

Evans’ interview with Chandler was pre-recorded for broadcast at the IAB Digital Trust Forum on 26 May.
