
Solving the children’s privacy ‘puzzle’


Organizations and lawmakers striving to protect children and teens online in today's vast digital environment find themselves working to solve a “puzzle” that requires balancing varying legal jurisdictions and cultural considerations.

“Globally there are many, many pieces to this puzzle when it comes to policies and laws governing children’s data,” Centre for Information Policy Leadership Director of Privacy and Data Policy Natascha Gerlach said during a session at the IAPP Europe Data Protection Congress 2023 in Brussels. “It becomes increasingly more difficult to coherently navigate and also comply when you are a platform, a service, an organization acting in this space.”

It was one in a slew of DPC sessions that brought together regulators and leaders in the children's privacy space to discuss the regulatory environment, its challenges and more. Panelists had much to discuss given the growing obligations around children's online safety in major jurisdictions.

The EU General Data Protection Regulation states children merit specific protection regarding their personal data and sets the default age of consent for data processing at 16, though member states may lower it to as young as 13. Additionally, EU platforms face increased requirements under the Digital Markets Act and Digital Services Act that include protections for minors.

The U.K. Age-Appropriate Design Code establishes standards for children’s data protection online while the Online Safety Act creates obligations around child safety on online platforms.

And in the U.S., states including Arkansas, California and Utah have passed laws to protect children online, while the Children’s Online Privacy Protection Act remains in force at the federal level.

How companies are responding 

TikTok Data Protection Officer Caroline Goulding, CIPP/E, CIPP/US, CIPM, FIP, said companies are not only facing a “multi-faceted, complex landscape externally,” but also have “quite a lot internally now to navigate.”

In addition to the Office of the Data Protection Officer, and separate from the privacy legal function, Goulding said TikTok has a compliance officer focused on the DSA, which governs obligations of large online platforms, and a dedicated content safety legal function.

“So, all of us have different responsibilities for the various aspects of safety and privacy, but we all need to work together. Everything is moving at pace, so we definitely need a lot more alignment and cross-communication,” she said.

In September, TikTok was fined 345 million euros by Ireland's Data Protection Commission over alleged children's data protection violations. Goulding said the company made progress in implementing additional children's protections prior to the DPC's investigation, and while TikTok is appealing the fine before the EU General Court, it is also “taking steps to address the remaining criticisms,” including a redesign of the account registration flow that preselects a private account for new 16- and 17-year-old users.
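That kind of age-gated defaulting is, at its core, a small piece of configuration logic. A minimal Python sketch of the pattern follows; the age threshold, field names and `RegistrationDefaults` structure are hypothetical stand-ins, not TikTok's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical threshold; platforms draw the minor/adult line differently
# and the relevant ages vary by jurisdiction.
ADULT_AGE = 18

@dataclass
class RegistrationDefaults:
    """Hypothetical bundle of settings pre-selected at account creation."""
    private_account: bool
    direct_messages_enabled: bool
    profile_discoverable: bool

def defaults_for(age: int) -> RegistrationDefaults:
    """Pre-select protective defaults for minors; they may opt out later."""
    if age < ADULT_AGE:
        return RegistrationDefaults(
            private_account=True,
            direct_messages_enabled=False,
            profile_discoverable=False,
        )
    return RegistrationDefaults(
        private_account=False,
        direct_messages_enabled=True,
        profile_discoverable=True,
    )
```

The design point is that protection is the starting state: a 16- or 17-year-old must take an explicit action to go public, rather than an explicit action to go private.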

Snap Associate General Counsel, Privacy James Brunger said the company takes a combined approach to the variety of requirements it must meet across different legislative frameworks.

“We see the same function with these areas, so I spend time working with the business on risk assessments, holistically looking at content, looking at DSA risks, looking at privacy and looking at them together. We see them as different sides of the same puzzle,” he said. “Discussions can get very polarized between privacy and safety, and we look at them together in trying to come up with a proportionate answer. We find this easier, and it helps us not get stuck on some of these issues.”

Global Data Protection Officer for X, formerly known as Twitter, Renato Leite Monteiro said the company has been working with the U.K. Information Commissioner's Office on compliance with the Age-Appropriate Design Code for the past year and has developed an internal code of conduct so teams know and understand what elements to take into account when developing products and features. The goal is to assess whether new product launches pose additional risks to children that need to be mitigated or trigger regulatory obligations.
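As a loose illustration of what such an internal checklist might encode, consider the sketch below; the questions and helper are assumptions for the sake of example, not X's actual code of conduct.

```python
# Illustrative review questions only, not drawn from any company's real checklist.
CHILD_RISK_CHECKS = (
    "Could the feature expose minors to contact from unknown adults?",
    "Does the feature weaken protective defaults, such as private accounts?",
    "Does the feature profile minors or target recommendations at them?",
    "Does the feature collect more children's data than it strictly needs?",
)

def open_risks(answers: dict) -> list:
    """Return the checklist questions answered 'yes', i.e., risks that must
    be mitigated (or escalated) before a feature launches."""
    return [question for question in CHILD_RISK_CHECKS if answers.get(question)]
```

A product team would answer each question for a proposed feature; any “yes” becomes a risk to mitigate or a regulatory obligation to document before launch.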

“Even with all of this engagement that we have and the lessons that we have already internalized, there is a lot of uncertainty,” he said, adding the company would welcome additional guidelines from regulators or industry associations to help assess and address risks to children.

Information Accountability Foundation Chief Strategy Officer Elizabeth Denham, who helped draft the U.K. Age-Appropriate Design Code while serving as the U.K. information commissioner, said she understands the “stress and pressure companies are facing in trying to implement privacy engineering standards.”

“But you have such resources and such ability to design products that are made and rolled out with the best interests of the children in mind, so I have confidence that the big platforms can do this work,” she said.

AI and kids

Digital technology is a part of children’s social lives, education, entertainment, and more, and in today’s digital world, that includes generative artificial intelligence.

“They are using it to do their homework, they are using it to help choose their outfits, they are making friends with virtual companions, they are using generative AI features on platforms for storytelling and also using it to confirm their creative instincts in art and music. So the list of opportunities that generative AI creates for children is virtually endless,” said Bird & Bird Partner Head of Privacy and Data Protection, Ireland, Anna Morgan.

There’s been a “policy push” in recent years, Morgan said, and not just from governments — like the proposed Artificial Intelligence Act in the EU and U.S. President Joe Biden’s executive order — but from various international organizations, including the World Economic Forum and UNICEF, which have published tools and guidelines around the use of AI for kids.

“It’s fair to say that this is a really fast-moving landscape at the moment,” U.K. ICO Group Manager, AI and Data Science, Sophia Ignatidou said.

The U.S. executive order does not include explicit references to children, Ignatidou said, while the EU AI Act would ban systems that take advantage of children’s vulnerability, which she called somewhat “abstract.” The U.K. government published a white paper detailing plans for a pro-innovation approach to AI regulation and recently held the U.K. AI Safety Summit.

“Enormous discussions,” Ignatidou said, “but again, children are not in the center. And to be honest, sometimes I’m not sure whether or not that’s a problem. The problem is that children are not aware where AI is involved.”

Ignatidou said the ICO’s Digital Regulation Cooperation Forum is “very much trying to align our digital positions and our positions in relation to AI, and in that context, children are a part of our thinking.”

The ICO is also actively working to address potential harms and enforce against regulatory violations involving generative AI.

In early October, the ICO issued a preliminary enforcement notice against Snap over potential failure to properly assess the privacy risks posed by its generative AI chatbot, “My AI.” Ignatidou said the ICO’s findings are provisional, “so no conclusions will be drawn at this stage that it will actually lead to an enforcement notice,” and the regulator is “expecting Snap’s representations” and will consider them before making a final decision.

Generative AI poses concerning risks to children, 5Rights Foundation Head of Accountability Duncan McCann said, from the potential use of children’s data to train systems to the potential for creation of child sexual abuse material.

“These images are as photorealistic as real ones, as with all the other image generation work that's being done. So it disrupts our options to actually address these perpetrators,” McCann said. “It also creates a huge problem for the networks and systems that we're deploying.”

EU works toward CSAM rules

Dutch Member of European Parliament Paul Tang said lawmakers have been “working hard to put the pieces of the puzzle together.” He particularly noted draft rules to fight child sexual abuse material, which have raised contention over the scanning of encrypted and unencrypted communications.

“What we tried to do is put the pieces of the puzzle together,” Tang said, noting lawmakers understood discussing the general monitoring and scanning of personal communications would “set the house on fire.”

“We didn’t want to set the house on fire, but we had to find another way forward,” he said, noting lawmakers are also exploring safety-by-design-or-default measures to better protect children, as well as mandatory age-verification measures for services used by children.

“I think what’s illegal offline should also be illegal online. … and age-verification is that. If an adult man approaches a schoolyard where children are playing, someone will come up to the man and ask ‘What are you doing?’ We don’t do that at this point. What we do in real life, we don’t do in the virtual world,” Tang said.
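A minimal sketch of the safety-by-default gating Tang describes might look like the following; the `AgeCheck` outcomes assume a hypothetical third-party age-verification provider rather than any mandated mechanism.

```python
from enum import Enum

class AgeCheck(Enum):
    """Hypothetical results returned by an age-verification provider."""
    VERIFIED_ADULT = "verified_adult"
    VERIFIED_MINOR = "verified_minor"
    UNVERIFIED = "unverified"

def allow_age_restricted_service(result: AgeCheck) -> bool:
    """Safety by default: anything short of a verified adult is refused,
    mirroring the offline norm of challenging unknown adults."""
    return result is AgeCheck.VERIFIED_ADULT
```

Treating “unverified” the same as “verified minor” is the default-deny posture: the service, like the bystander in Tang's schoolyard example, asks the question first.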

Moving forward

There’s undeniable “fragmentation in policy and laws around the world when it comes to children,” Denham said. But with the U.K.’s Age-Appropriate Design Code and Online Safety Act, legislative activity in the EU, and U.S. states enacting children’s regulations, Denham said, “it’s inevitable that there will be a design focus for children’s services and products online.”

She said companies should consider the “best interests of the child” in designing all products and services.

“And when there’s a conflict or a tension between the commercial interests of the company and the best interests of the child, the best interest of the child is going to win out,” she said.


