Around the globe, children's privacy, security emerge as key areas of focus for regulators
29 April 2019. By MLex Staff.
In developed countries, the average child spends several hours a day in front of a screen. But in a dramatic shift over the past decade, those screens are increasingly connected to the Internet via a smartphone or tablet, allowing companies to harvest and monetize personal data about those kids.
Around the world, regulators and lawmakers are taking notice, handing out record fines and proposing enhanced privacy protections that could carry further into the teenage years, even as enforcers grapple with the realization the Internet has transformed toys and other simple playthings into powerful data-collection devices.
In part because of a movement toward comprehensive data-protection laws such as the EU's General Data Protection Regulation, or GDPR, companies targeting children could face greater requirements to protect their data in Australia, Brazil, Europe, India, South Korea and the US.
In the US, the average amount of time children age 8 and under spent with mobile devices mushroomed from five minutes a day in 2011 to 48 minutes in 2017, according to Common Sense Media. Smartphones are similarly ubiquitous in other developed societies.
That is raising concerns not only about children’s privacy, but about their exposure to dangerous or sexual content. While platforms such as Instagram or YouTube aren't supposed to be used by children under 13, they are hugely popular with kids, setting off alarm bells in situations such as when, in February, YouTube’s algorithm helped pedophiles connect to and comment on sexually explicit videos of children.
As technology and the Internet increasingly permeate children’s lives, enforcers around the world are growing more aggressive. In the US, state and federal enforcers over the past five months have twice broken the record penalty under the oldest children’s Internet privacy law, the 21-year-old Children’s Online Privacy Protection Act, or Coppa. New probes are underway in the UK, and India has taken action against one of the same platforms, TikTok, punished by US enforcers.
In one new case that emerged today, the thorny question of how far companies must go to protect kids' privacy and security online has spawned an antitrust complaint in Brussels against Apple.
Complicating the picture for companies is that different societies have different views about the age at which a person becomes mature enough to make informed decisions about privacy without the involvement of parents, a span that can run from the ages of 12 to 18.
Around the world, enforcers are taking steps to ensure online platforms such as YouTube and TikTok and connected-toy makers comply with higher data-protection standards when capturing and analyzing children’s information. Children’s privacy advocates are increasingly working multinationally, reaching across the Atlantic and the Pacific to link up with allies in Europe and Asia.
“Knowing that we could not protect the privacy of children or adults in the US, we and our allies turned to Europe,” said Jeff Chester, executive director of the Center for Digital Democracy, one of the originators of COPPA. “The path to privacy for America is through the EU.”
In the EU, enhanced protections for children revolve around the GDPR, which took effect last May. The law recognizes that children need stronger default privacy settings when organizations are collecting and processing their personal data.
But the GDPR offers little guidance on what higher standards for children’s data mean in practice. That’s why some EU countries have started to design their own protections and enforcement of children’s data rights. Complicating the picture for companies, the age at which children can provide their own consent to have their personal data processed under GDPR ranges from 13 in the UK to 16 in Ireland and the Netherlands.
In the UK, newly published draft rules would slap tight controls on apps, connected toys, social-media platforms, online games, educational websites and streaming services that process children’s data. They would be forced to minimize data collection, switch location-tracking off by default, and reduce risks to children likely to access their services. Fines for violators could reach 17 million pounds ($22 million) or 4 percent of global turnover.
Online pornography providers will soon have to carry out "robust age-verification checks" to keep children out. And the deputy chief of the Information Commissioner’s Office, James Dipple-Johnstone, recently told lawmakers the regulator has launched several probes into “what kinds of children's data has been gathered, what uses it has been put to, and the degree to which personal data forms part of that.”
The Irish Data Protection Commission will also encourage industry to draw up codes of conduct to promote best practices. Such codes could address one of the trickiest questions in the EU linked to enforcing children’s online privacy, namely, how companies verify age appropriately and ensure that the parental consent they get isn’t falsified.
A whole new area of dispute over children’s privacy emerged today in Brussels, when Qustodio and Kidslox, which produce apps that allow parents to access and control their kids’ mobile devices, said they will file a complaint claiming “severe anticompetitive practices” with the European Commission's DG-Comp over Apple’s removal of their apps from the App Store.
Apple said it removed the apps because they "put users’ privacy and security at risk" and used "highly invasive technology," and that Apple is committed to a "competitive, innovative app ecosystem."
Children’s privacy appears to be one of the few areas of US politics where there is bipartisan resolve for stronger protections, from increased enforcement of existing privacy laws by state and federal enforcers to proposals for new legal protections at the national and state level.
The US Federal Trade Commission in February brought a record $5.7 million Coppa fine against the video platform app Musical.ly, now known as TikTok, topping the previous record, $4.95 million obtained two months earlier by the New York state attorney general against Oath, the corporate descendant of the merged Yahoo and AOL.
A proposal to update Coppa could extend enhanced online privacy protections to a teenager’s 16th birthday. That bill would also give the US its own first take on a European-style “right to be forgotten.” The Coppa overhaul would create an “Eraser Button” that would give parents or children “the opportunity at any time to delete personal information collected from the child.”
If Congress were to pass a national privacy law for the US, the enhanced Coppa protections could be included.
“Part of it is fueled by the explosion of inappropriate content for children that appears on YouTube and other platforms. That’s partly driving the debate,” Chester said of the push for legislation and enforcement.
Chester’s group is part of a coalition of 22 consumer and public health advocacy groups pressuring the FTC to get tougher on Google over Android apps that track children’s physical location.
In a complaint filed in December with the FTC, the groups claim the Google Play app store violates Coppa by improperly certifying apps for children when those apps are illegally collecting their information without parental consent.
Brazil’s data protection bill, which was approved by lawmakers last year and takes effect in August 2020, follows the trend of increased protection for children’s online privacy.
Companies must be clear about what data they intend to collect and for what purposes it will be used.
Data controllers should also ensure that they have made all reasonable efforts to obtain the express consent of the children’s parents, and they cannot condition the use of games or other applications by kids on the disclosure of personal information.
“The bill is a great advance in the protection and debate of children’s privacy,” said Pedro Hartung, program coordinator of the Instituto Alana, an association devoted to the protection of children’s rights. “It will require further regulation by the data protection authority, which hasn’t been created yet, but overall it provides significant protection to kids.”
In South Korea, which already requires parental consent when processing children’s data, the next step is verifying the consent.
Park Sun-sook, a lawmaker from the minor opposition Bareunmirae Party, proposed related revisions last year; they were approved and are set to take effect in June.
According to Park’s spokesperson, the rise in smartphone use has increased children’s exposure to privacy risks. However, children lack the ability to comprehend such risks and their implications, which places the responsibility to protect them on the government and on the companies.
Some online service providers were also found to be neglecting their legal duty, partly because the law lacked provisions for verifying consent.
According to the bill, companies must use “clear and easily understandable language” when communicating privacy policies to children.
Companies, including those tracking location information, also must check that consent came from a child's guardian and was not falsified, meaning parents must be notified of, and approve, the use of their children’s data.
Although Australia’s Privacy Act of 1988 makes no specific reference to children, the current government said that, if re-elected, it will introduce tougher penalties to protect the personal information that social-media companies collect about Australians, particularly children.
Minister for Communications and the Arts Mitch Fifield said Australians enjoy using social-media platforms, but he is concerned about how personal data is captured, analyzed and shared, particularly for children and members of other vulnerable community segments.
The Indian government has yet to enact a data protection law, but a draft published last year has child-specific data protection provisions.
The draft says data fiduciaries or any person collecting or processing the information of children — defined as under 18 years — must do so while protecting their rights and best interests. Data collectors would be barred from profiling, tracking or behaviorally monitoring children or directly targeting ads to children.
However, this bill hasn't moved forward in parliament, and it is not certain that it will be enacted without amendment.
Protecting children from harmful content is also a concern in India.
The high court in Madras this month directed the government to ban the downloading of TikTok — which has 119 million users in India, many of whom are children — over concerns it was being used to spread child pornography and was being exploited by pedophiles.
While the ban was later overturned following promises by TikTok parent ByteDance to improve security, Google and Apple were ordered to remove the app from their stores.
The court’s order said the app has proven to be addictive and “by becoming addicted to TikTok app, and similar apps or cyber games, the future of the youngsters and mindset of the children are spoiled.”
This story was reported and written by Vesela Gladicheva, Choi Hyung-jo, Rodrigo Russo, Phoebe Seers, and Mike Swift.