At a time when digital footprints are as easy to leave as physical ones, New York State is making a decisive move to shield its youngest residents from ‘addictive feeds’ on social media. When lawmakers in the state recently passed a bill aimed at restricting how children under 18 are exposed to these algorithmically curated feeds, the story spread quickly and generated a flurry of debate. If the legislation, drafted under the name of the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, becomes law, it could set a precedent for how digital content is delivered to minors, with consequences both for those minors and for titans of the web such as Google.
At a basic level, the SAFE for Kids Act targets the recommendation algorithms that serve minors content selected on the basis of their personal data, categorising such feeds as ‘addictive’ and limiting minors’ exposure to them without parental consent. In other words, New York’s state legislature was not mincing words about the consequences of serving children algorithmically curated content: it is, or at least has the potential to be, harmful to their mental health and their privacy.
Under the law, ‘non-addictive feeds’, in which posts are ordered chronologically rather than algorithmically, would still be permitted. The goal is to cultivate a digital environment in which content is not specifically designed to capture and retain adolescent users in potentially harmful ways.
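To make the distinction concrete, here is a minimal, purely illustrative sketch in Python of the two orderings the act contrasts. The `Post` fields and function names are assumptions made for illustration, not anything drawn from the bill’s text or from any platform’s actual code.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # hypothetical score from a personalisation model

def addictive_feed(posts: list[Post]) -> list[Post]:
    # Algorithmic ordering: rank posts by how likely this particular user
    # is to engage with them -- the pattern the act labels 'addictive'.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # 'Non-addictive' ordering: newest posts first, no personalisation.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```

The only difference is the sort key: one ordering is driven by data collected about the user, the other only by when the post was made.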
Coupled with the SAFE Act, New York’s Child Data Protection Act is another step toward protecting minors in their digital lives: it prohibits the collection or sale of information about users under 18 without their consent.
NetChoice, a trade association representing Google, Meta, Snap and other big tech companies, has come out hard against the SAFE Act, calling it both ‘dangerous’ and ‘unconstitutional’. The clash exposes the fault lines between government’s agenda to protect children and big tech’s fight to keep the internet open. The obligation to verify the ages of social media users and to suppress notifications during certain hours would impose a new and complicated regulatory-compliance regime on the handling of online communications.
Indeed, as a pillar of the digital environment, Google occupies one of the most interesting positions between innovation and responsibility. The dialogue surrounding the SAFE Act reflects a wider quandary about how to balance user engagement with ethical concerns. At the same time, Google’s response, voiced through NetChoice, illustrates the difficulty of curating content and governing access in an increasingly regulated digital environment.
NetChoice’s criticism of the act, deeply rooted in free-speech and open-internet principles, reflects the dilemma between preserving a freewheeling internet and building one in which adolescents are given extra protection. The charge that such a law could be abused to let the government track online activity adds to the larger debate over digital rights and responsibilities.
If enforced, this law would fundamentally change how social media platforms treat users under 18: age- and consent-verification processes, together with restrictions on notifications, would push them toward a more fortified digital experience for younger users. Non-compliance would carry penalties for the platforms.
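As a rough illustration of what such a notification restriction might look like in practice, here is a minimal sketch assuming a hypothetical overnight quiet window; the specific hours, function names and parameters are my assumptions for illustration, not the statute’s text or any platform’s implementation.

```python
from datetime import datetime, time

# Hypothetical overnight quiet window for minors' feed notifications;
# the exact hours here are an assumption for illustration only.
QUIET_START = time(0, 0)   # midnight
QUIET_END = time(6, 0)     # 6 a.m.

def may_send_notification(user_age: int, has_parental_consent: bool,
                          now: datetime) -> bool:
    """Decide whether a feed notification may be sent right now."""
    if user_age >= 18 or has_parental_consent:
        return True
    # Minors without parental consent: suppress during quiet hours.
    return not (QUIET_START <= now.time() < QUIET_END)

# Example: a 15-year-old without consent at 2:30 a.m. gets no notification.
print(may_send_notification(15, False, datetime(2024, 7, 1, 2, 30)))  # False
```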
For parents, the SAFE Act offers an opportunity to take a more proactive role in their children’s digital lives. By requiring parental opt-in for access to algorithmically curated feeds, it lets parents make informed choices about their children’s digital exposure. Its passage represents a first step toward the sort of protections we have come to expect for the digital playground in which children now spend an increasing share of their time.
Yet at the core of the debate over the SAFE Act and the related legislative agenda lies a question about Google’s role in the future of the internet: how should the company, as a media business, a gatekeeper of vast amounts of information and an arbiter of digital life, handle the ethics of content curation and privacy?
As this decisive step toward regulating the digital age plays out, calls for accountability should remain loud. The controversy surrounding the SAFE Act shines a light on the need for tech companies, legislators and the public to work together to build digital ecosystems that preserve safety, privacy and wellbeing, particularly for younger users.
The SAFE Act is a crucial step on the path to safer digital spaces. It’s also a test for the tech giants. As the dialogue around digital safety and privacy for children progresses, a partnership with technology companies such as Google could be a vital component in the shift towards an internet environment that’s more secure and ethical.