Rapid technological advance in the digital era has raised difficult questions at the crossroads of technology and privacy. It has become a commonplace observation of the digital age that children are the slowest to adapt to the online world around them. They are not completely helpless, but they remain among its most exposed participants, and that is where the latest chapter of AI's multiplying risks and ethical dilemmas begins. A recent report by Human Rights Watch (HRW) revealed that personal photos of Brazilian children, aged between five and 17, were used to train AI systems without their consent, opening a Pandora's box onto their private lives and showing how far AI can reach into them.
At the centre of this controversy is LAION-5B, a dataset of billions of images scraped from the public web by a German nonprofit. Buried within it are intimate glimpses of children's lives: a baby just born in a delivery room, a teenager at a high-school carnival with her friends. Many of these personal pictures had been tucked away on private blogs or low-view YouTube channels. They were never meant to enter the public domain, let alone to train cutting-edge image-generating AI systems.
The consequences of this non-consensual use are profound. Not only does it raise the likelihood that these children will be subjected to further non-consensual, AI-manipulated imagery; because many of the images' captions list the children's names and locations, the breach of privacy also offers a direct route to their identities.
LAION moved quickly to remove the offending links from the dataset once the HRW report was published, and it has been working with partners, including the Internet Watch Foundation and Human Rights Watch, to purge illegal and unethical content from its database. Yet this addresses only the tip of the iceberg: the images themselves remain on the web, available to be scraped anew into other datasets.
This illustrates a stark truth of our digital era: for young people, privacy is fragile, and the pressure on it will only intensify. Brazilian media have already reported on girls whose real likenesses were used to generate sexually explicit AI deepfakes, and such misuse casts a long, debilitating shadow over its targets in a world where digital content is effectively immortal.
The broader lesson is clear: governments, technology developers and communities must work together to ensure that children's rights and data privacy are protected, that there are safeguards against the misuse of AI, and that policies exist to shield personal data. Such policies are no longer merely important; they have become imperative.
What used to be a place of refuge, a site of identity and privacy – or even, as bell hooks put it in her book Homeplace (1999), ‘an ugly, funny, scary, and dear place you get used to’ – has evolved into something altogether different. We have come to occupy homes that are no longer truly our own, and living so transparently in public digital spaces has had profound consequences. The internet is, in many ways, a great democratising force. The problem is that, in extending the public square into our personal spaces, it has put our most intimate lives on display.
This requires us to reimagine what privacy and home mean today, and to set new norms for how we behave – to respect the privacy of others, even as we all operate in a shared digital space. As we move into an ever more digitised age, characterised by AI and other emerging technologies, we must make sure that progress does not erode the rights and dignity of the youngest among us.
Here is the crucial lesson of the digital age: even as it offers us new possibilities, it confronts us with fresh challenges. Safeguarding the very notion of ‘home’ – whether analogue or digital – is a necessity as we set about forging a future in which technology empowers people to protect and promote the rights and dignity of all, and in particular of the most vulnerable.