Redacted Facebook privacy assessment leaves information gaps on compliance
20 December 2021 22:41 by Mike Swift
Additional sections of a June report assessing Facebook’s compliance with the US Federal Trade Commission’s 2019 privacy settlement, obtained by MLex today, suggest the company changed its practices around facial recognition and other areas sufficiently to comply with elements of the order.
The redactions Facebook sought and the FTC granted after MLex requested a copy of the initial compliance assessment under the US Freedom of Information Act four months ago are so extensive, however, that it's impossible to judge whether the company is fully in compliance with the FTC order, which included an unprecedented $5 billion fine.
“Facebook maintains processes intended to provide users with clear, conspicuous, and accurate notice of their privacy rights and Facebook's policies and practices regarding the processing of Covered Information, including how that information will be collected, used, shared, retained and deleted,” says the assessment by consulting firm Protiviti.
Facebook has also “implemented processes to ensure that prior to the creation, use, or sharing of Facial Recognition Templates, users are eligible and have provided consent for facial recognition features,” the assessment concludes. And the company has created new protocols to make sure data provided by users, such as a telephone number used for multi-factor authentication, isn't repurposed by Facebook to boost its ability to target them with advertising.
MLex was the first to report in September, following delivery of the initial section of the assessment after its FOIA request, that there were significant “gaps and weaknesses” in Facebook’s response to the FTC order, despite Facebook’s demonstrated commitment to rebuilding its privacy compliance structure into an “appropriately comprehensive” privacy framework.
The independent assessor concluded in the executive summary of the 230-page report that “substantial” shortfalls remained that Facebook must address in developing its “Mandated Privacy Program.” A comprehensive explanation of those “gaps and weaknesses” was redacted from the report obtained by MLex, including the sections it obtained today.
The balance of the report obtained today is so heavily redacted that even a count of the number of documents Protiviti reviewed and the number of stakeholders it interviewed to assess Facebook’s compliance has been excised, making it difficult to judge even the quality of the assessment process, much less Facebook’s compliance with the FTC order. MLex filed its Freedom of Information Act request Aug. 12.
The following redacted paragraph on page 71 of the assessment shows how difficult it is to judge the quality and comprehensiveness of the assessment, much less Facebook’s compliance with it: “The Assessor evaluated the coverage, consistency and efficacy of the [redacted] process and the enforcement of [redacted],” the assessment said. “The scope of the Assessor's evaluation included the intake of changes into the [redacted] process, the execution of the [redacted] to identify risks and associated mitigations, and the enforcement of the implementation of the mitigations, along with the governance controls overseeing the end-to-end process. Our evaluation of the [redacted] included the review of over [redacted] documents, reports, and evidence of the [redacted] process outputs. Key documentation we reviewed included but was not limited to the following: [multiple redacted paragraphs].”
The opening 15 pages of the Protiviti report MLex obtained in September concluded “Facebook has made extensive investments in its privacy program since the effective date of the [FTC] order” in April 2020, “and meaningful progress has been made,” launching a program that “is logical and appropriately comprehensive.”
However, the assessor said, “the gaps and weaknesses noted within our review demonstrate that substantial additional work is required, and additional investments must be made, in order for the program to mature.”
With Facebook now renamed Meta Platforms, a spokesman for the company said today that the redactions were necessary to protect the company’s confidential commercial information.
"The redactions were determined by the FTC with input from Meta and the independent Assessor,” Meta said in the written statement. “They reflect the portions of the report containing confidential commercial information, which is protected from public release by law. This is standard practice for reports of this nature in order to protect the integrity of the assessment and the confidentiality of commercial information.”
The FTC didn't immediately respond to a request for comment today about the scale of the redactions.
Good for privacy?
If little can be gleaned from the heavily redacted assessment of Facebook’s privacy program, there is a sense that evaluators are more positive now than they were before.
Sandwiched between inked-out black boxes is the assertion that “Facebook, in effect, created a new Privacy Program, including a foundational redesign of both its Safeguard environment and compliance documentation.”
For Facebook users and the wider Internet, the question is what that change signifies. Even if the social media company has in effect created a new program, instigated a “foundational redesign” of its “Safeguard environment,” and produced first-rate compliance documentation, is Facebook a more private place now than it was before the 2019 settlement?
One criticism of the FTC settlement has been that it didn’t mandate privacy changes so much as impose new reporting requirements on the social media giant. The observation that Facebook’s “compliance documentation” underwent a foundational redesign suggests that to be the case.
Facebook also has taken certain privacy steps outside compliance with the FTC order, such as deleting its facial recognition templates and allowing users to adjust data-sharing permissions.
“They’ve done things to make the user experience of the platform more manageable,” said Alan Butler, president of the Electronic Privacy Information Center. The non-profit attempted to block the 2019 settlement with the FTC from gaining approval in federal court.
The fundamental privacy problem with Facebook remains in place, however.
“Facebook has not fundamentally changed what it is doing, which is collecting data about what people are saying and where they are going online, which is a fundamentally invasive part of their business model,” Butler said. Facebook is a data mining enterprise with a social media veneer, he added.
That the FTC should have wrung further concessions out of Facebook dealing with its business model of user data-fueled targeted advertising is a criticism that’s also dogged the FTC settlement from the start.
Facebook may never have acceded to a settlement that cut into its revenue, and the FTC at the time lacked the political will for a fight over what new Chair Lina Khan now calls “surveillance capitalism.”
“The fundamental flaw in all of this is that companies basically get to write their own rules in the terms of service,” Butler said. “We don’t have rules, Facebook gets to write its own rules.”
Changes
The initial assessment by Protiviti, covering the six-month period ending in April after the FTC’s privacy order against Facebook took effect, “confirms that the [FTC] Order constituted a watershed moment at Facebook,” the company’s chief privacy officer for product, Michel Protti, told the FTC in a letter July 1. The letter was also obtained by MLex through a FOIA request.
Facebook has made organizational changes, such as creating a dedicated privacy committee on its board of directors and requiring all employees to attend privacy training programs. Even Chief Executive Mark Zuckerberg now gets a quarterly report on users' privacy risks, and he will begin certifying each quarter that Meta is in compliance with the FTC order.
“Quarterly reports are delivered to the Principal Executive Officer and to the Assessor that provide a summary of the Privacy Review Statements generated during the prior fiscal quarter in accordance with Part VII.E.2.b of the [FTC] Order,” the section of the Protiviti assessment obtained today by MLex says. “The reports include a discussion of the material risks to the privacy, confidentiality, and integrity of the Covered Information, that were identified and how such risks were addressed.”
Facebook employees are now required to take privacy training, the assessor said. “Facebook developed the Internal Privacy Policy to explain employees' role in supporting the Mandated Privacy Program,” the report says. “Each employee agrees to abide by the Internal Privacy Policy through their completion of the Code of Conduct training.”
The privacy problem that triggered the FTC’s investigation and subsequent privacy order was the 2018 acknowledgment that a third-party personality quiz app on Facebook's platform, downloaded by several hundred users, had gained access to the personal data of more than 80 million Facebook users. That data was sold to political data-mining company Cambridge Analytica prior to the 2016 presidential election.
The section of the report obtained by MLex today suggests the consultant did evaluate whether Facebook has brought third-party apps under full control.
“Our operating effectiveness testing was structured to determine whether the third-party apps can only operate within the bounds described by technical documentation, engineering teams, and the Platform Terms,” the assessment said.
How Protiviti tested whether third-party apps can only do what their developers tell Facebook they can do was fully redacted in the report, however, making it impossible to evaluate the quality of that testing.