We must apply the 'toy car principle' to the online world
Tuesday, August 18, 2020
When designing any product or service, isolated assumptions are risky - and when we’re talking about vulnerable users, potentially dangerous.
For example, a company designing a toy car for a young child, meant for rolling on the ground, will use user research to observe whether that is what a child actually does with it. If it becomes clear that a child of the proposed target age will in fact put the car in their mouth, bite off the wheels and potentially choke, then the company can do one of two things: either change the design to make it child-safe, or change the age range to make clear the toy is ‘not for children under the age of 3 due to small parts.’
Now, a small child is unlikely to appreciate how dangerous a toy car may inadvertently be. They’d probably be as happy to play with a model car with detachable components and hundreds of tiny pieces as with one designed for their specific age. But that doesn’t mean it’s OK for a company to sell toys that are potentially dangerous. Rules are in place to ensure products meet specific safety standards, and in the case of children’s toys, those rules exist to protect the child.
For technology companies and online platforms providing a service, the design process should be no less rigorous. It’s all too easy to design products and services that are attractive to users and allow them to access all sorts of information. But are the right questions being asked about the impact on service users (who span a range of ages, backgrounds and life experiences) and their safety?
A growing danger: online harms
At Catch22, we have just released the results of our National Online Harms Consultation. The findings confirm what many on the frontline will already be aware of:
- 32% of young users have experienced, or seen a friend experience, harm because of online behaviour, citing sexual exploitation and self-harm.
- 73% of young users have seen online content that concerned them, referring to extremely violent content, sexually explicit messages, and cyber bullying.
- Only 40% of young users would report harmful content to the platform they are using.
We asked frontline support staff, youth workers and teachers for their input too and their concerns included:
- Grooming, cyber bullying and the sending and receiving of explicit images.
- 38% do not feel sufficiently trained to deal with the issues young people face online, highlighting the constant changes to platforms and privacy settings.
- The lack of age verification, citing children as young as 10 being groomed online.
All groups of respondents also highlighted growing concern that mental health issues are being aggravated online, and called for online counselling support to match.
What young people want
Of the young people who responded in detail, many said they wanted more power over their privacy. They referred to the need for age verification, faster content moderation, and an urgent need to block accounts created by people who are clearly not who they say they are, citing explicit messages and images sent to them by strangers.
Young people know the change they want to see. While the technical details fall to those building platforms and approving legislation, the responsibility to listen to young people and to make a safer world lies with all of us. It’s for us to take in what they say and to respond: to conduct research with young people, not on them; to educate them about the world around them, including their online world; and to give them the confidence to embrace the opportunities they will see in this digital sphere.
While we are constantly asking how we can prevent harm, we must also understand what is driving young people to interact online in the way that they are. What is it about that interaction that exposes them to manipulation? Are young people going on these platforms because they want to meet new people? Or do they only want to talk to their friends? Maybe they just feel unheard in the offline world but find online discussions easier? Whatever the motivation, there’s a susceptibility to harmful behaviour that comes with it.
Embracing the online world
We don’t want young people to be scared of the digital world. It’s not going anywhere, and our reliance on it is set to stay. That should be exciting: no generation has had easier access to free education, an infinite amount of knowledge and discussion, and the ability to build a platform for their own culture and creativity, all with a global audience. That’s on top of the ability we now have to talk to anyone, anywhere, to do our entire job from the comfort of home, and to develop skills while a global pandemic is underway.
It is entirely natural that when such opportunities exist, young people want to explore. It’s a natural instinct to explore a product and push boundaries (as a toddler would with a toy car, whatever the stated age range on the packet!). And if the right restrictions and safety measures aren’t in place, the potential consequences are devastating.
Legislation – and education
We will be waiting a while yet for online harms legislation, and it won’t solve the problems on its own. What we need is proper consideration from those shaping our online platforms, so that young people and their families want to leap into this world: forewarned and forearmed, but comforted by knowing that the creators of these platforms have built a world where users’ motivations are understood.
Whether it’s a toy car or a social media platform, the principles of designing something that is fun, engaging, useful and easy to use – whilst ensuring user safety is paramount – remain the same.
We need a safe online world where every platform abides by this toy car principle – where the user-friendly route opens doors to opportunity and enough care is taken to quickly close the routes that do otherwise.
Naomi Hulston is chief operating officer at Catch22