Google chief executive Sundar Pichai was in Dublin last week to announce a significant €1 million grant to children's charity Barnardos, which will be used to develop a children's internet safety programme.
According to a corporate blog post by Google’s senior regional manager EMEA, Stuart McLaughlin, “The grant will help Barnardos bring its Online Safety Programme workshops to more than 75,000 Irish children over the next four years.”
The programme is aimed at children, teachers and parents. Google also said its Be Internet Legends initiative would be available in Ireland, providing further online safety resources.
Making programmes and resources such as these available to Irish children and the adults who work with and care for them is a laudable initiative.
But for Google to be the provider of same is sheer hypocrisy.
It’s like having tobacco companies introduce a healthy lifestyle initiative focused on eating better and visiting the gym more often, while handing out free cigarettes to all participants.
Where to even begin? Only a month ago, Google was fined $170 million (€155 million) after the US Federal Trade Commission (FTC) found the company's YouTube division had knowingly and unlawfully gathered children's data and targeted them with ads.
Under the settlement, YouTube must ask children’s content creators to identify videos that are intended for children, so that YouTube no longer serves targeted ads on them, and must obtain permission from parents before gathering or sharing a child’s personal information.
Obligations
Yes, you would have thought that perhaps Google should have been doing that anyway, because, after all, the audience is … children. And the internet giant surely was aware of its obligations under the US federal Children’s Online Privacy Protection Act (COPPA), which applies to children under 13 and has been around for years.
According to an FTC statement: "Several channel owners told YouTube and Google that their channels' content was directed to children, and in other instances YouTube's own content rating system identified content as directed to children. In addition … YouTube manually reviewed children's content from its YouTube platform to feature in its YouTube Kids app. Despite this knowledge of channels directed to children on the YouTube platform, YouTube served targeted advertisements on these channels."
Yet this just scratches the surface of the problematic ways in which Google serves content to children, or gathers data without any ability to differentiate whether it belongs to children.
For example, children's advocates have long complained about the disturbing content woven through channels aimed at children on YouTube. Last year, Wired.com uncovered numerous videos that used child-oriented search terms but had disturbing violent or sexualised content, often from recommended content sidebar links.
Earlier this year, the Guardian noted the proliferation of such content, adding that even without the extreme content fears, YouTube Kids poses “a parenting minefield”.
Parental concerns
Critics also noted that $170 million was a paltry sum for a company the size of Google. Josh Golin, director of the lead complainant in the case, the Campaign for a Commercial-Free Childhood, noted the fine added up to just a few months' ad revenue for YouTube and that the settlement failed to address a "plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – [which] can all be traced to Google's business model of using data to maximise watch time and ad revenue".
Google also has a history of privacy violations and settlements. It is currently subject to a 20-year consent agreement with the FTC for data-mining violations connected to its former social network Buzz.
It's also the focus of a major recent privacy complaint to the Irish Data Protection Commission (DPC) by Dr Johnny Ryan, chief policy officer of browser company Brave, alleging unlawful processing of data by Google's market-dominating "DoubleClick/Authorized Buyers" advertising business. The DPC has opened a formal investigation.
Earlier this month, Ryan presented further technical evidence, which he says reveals “a mechanism by which Google appears to be circumventing its purported GDPR privacy protections”.
So, funding an initiative to improve children’s safety online, no matter how worthy in its own limited terms, is little more than a positive-optics exercise in which Google places the onus on the end-user – in this case, vulnerable children – to apply small plasters to a festering internet.
This is unacceptable. The overwhelmed, deliberately underinformed end-user is not responsible for sorting out a problem of such daunting depth and scale. These companies must address the core problems causing the distress – their secretive, lucrative, data-mining and ad-targeting business models, and laissez-faire, algorithm-driven approach to content “management”. If they won’t, then governments and regulators must act decisively, imposing meaningful structural reform rather than today’s petty, random fines and minor operational adjustments.