A report submitted to the UK Parliament outlines the damaging effects on children of ‘persuasive design’ on the internet.
The Disrupted Childhood Report, co-authored by the 5Rights Foundation, the British charity advocating for children’s rights online, and the reputation and privacy consultancy Schillings, sets out how “persuasive design” practices employed on the internet deliberately keep children online so their data can be collected for commercial gain. (For some examples of persuasive design, see the explanation below.)
The report sets out how these strategies exploit human instincts, how they are deployed, why they are habit-forming and what emotional, physical and educational development impacts they are having on a generation of children.
Baroness Beeban Kidron, founder and Chair of 5Rights Foundation and report co-author, says: “The tech industry needs to look at their stratospheric share prices, then our children, and decide which is more important. Children need a new deal.”
The 24 recommendations contained in the report call on the tech sector to make seismic changes to the design of products and services in order to meet the needs of children. They also call on government to add “compulsive use” to its current list of harms in all policies, and to set up a centre of expertise for policy and research on instances where technology intersects with childhood.
Said Baroness Kidron, when introducing a Code of Conduct as part of Britain’s Data Protection Act 2018: “The draft Code represents the beginning of a new deal between children and the tech sector. For too long we have failed to recognise children’s rights and needs online, with tragic outcomes. I firmly believe in the power of technology to transform lives and be a force for good. But in order to fulfil that role it must consider the best interests of children, not simply its own commercial interests. That is what the Code will require online services to do.”
In addition to consolidating research from academics (including EU Kids Online, Oxford, Harvard, Stanford and LSE, American Psychological Association, and the Association of Teachers and Lecturers) and a roll call of dismayed tech insiders (Tim Berners-Lee, Tristan Harris, Jaron Lanier, Nir Eyal, Sean Parker), importantly the report also features the voices of children, who themselves are asking for fairer treatment.
Jenny Afia, partner at Schillings and co-author of the report, comments: “Struggles between parents and children over screens are the result of a far deeper conflict between a system designed to be compulsive, worth billions of pounds to shareholders, and the needs and legal rights of children.”
The report points out that designing services to be compulsive, and then asking kids to put their phones down, is not the answer. Access to digital services for entertainment, socialising, learning and citizenship is crucially important to children, young people and the future of society as a whole. What is required is access – but on terms that meet the needs and rights of children.
Some examples of what the Code aims to prohibit or control:
• Nudge techniques: The Code will prevent online service providers using “nudge techniques” to lead or encourage children to provide unnecessary personal data, weaken or turn off privacy protections, or as a tactic to extend their use. For example, it will stop the insidious practice of timed notifications being used as a method to punish a child’s absence online.
• Detrimental use of data: The Code will ensure that online services are not promoting behaviour that is detrimental to a child’s health or wellbeing. It points to established and evidence-based guidance to identify what might be considered “detrimental”, and makes clear that if there is any doubt, a precautionary approach should be taken.
• Profiling: A contributing factor to the tragic case of Molly Russell was the graphic content promoting self-harm and suicide that she had been accessing on Instagram. Importantly, rather than Molly having to seek this content out herself, it was being recommended to her based on data drawn from her viewing and browsing history (i.e. data that was used to profile her). The Code will require online services to switch off profiling by default, and profiling will be prohibited altogether if appropriate measures aren’t put in place “to protect children from harmful effects”.
• Location services: Under the Code, geolocation tracking must be off by default for children, and “options which make a child’s location visible to others must default back to off at the end of each session”. The ease with which the real-time and predicted location of a child can be tracked using data from the apps they use is alarming and open to abuse. The National Crime Agency and several police forces have warned that services collecting geolocation data could be used to groom, stalk, sexually exploit or abduct children. These services can even be used to steer children towards particular locations (e.g. the well-publicised deal between Pokémon Go and McDonald’s, which saw thousands of children driven to fast-food outlets by the game).
To download the “Disrupted Childhood” Report and view its recommendations for industry, government, parents and investors, visit: https://5rightsframework.com/
What is persuasive design?
Persuasive design practices manipulate innate human behaviour. The Disrupted Childhood Report highlights how digital services routinely deploy persuasive techniques with the specific intent to collect personal data for commercial use. A third of all users globally are under 18. Examples of persuasive design include:
• The rush: design features built around rewards and anticipations, such as likes, hearts and comments, which create expectations, elicit dopamine hits and fuel the need for the next response – all to extend the time spent on digital services.
• The popularity contest: design features which exploit the fear of not appearing popular, such as public counts of friends, followers, likes and retweets. Measuring friendship numerically creates an arms race for more interaction and more friends – and, as the research shows, erodes the quality of relationships.
• The summons: alerts that play into our innate response to movement, noise and light, such as the buzzes, pings, vibrations and notifications coloured red.
• Losing time: design features built to remove the need to make a conscious decision to keep using a digital service, such as auto play, auto suggestions, infinite scrolling and games with no save option. In the name of “personalisation” these techniques pull children into a bubble of suggestion and activity with no end.
• The social obligation: design features built to exploit the human need to be social, such as Snapchat streaks, which can trap young people in multiple relationships that they find time-consuming to maintain and hard to get out of.
And “online now” status, typing bubbles, read receipts… scores of tiny obligations that build into an overwhelming struggle for a child’s attention.
Copyright © 2019 www.noseweek.co.za