FTC, Commercial Surveillance, and Overlays
The U.S. Federal Trade Commission on August 11, 2022 announced it is exploring rules cracking down on commercial surveillance and lax data security practices. The sub-heading of that press release plainly states it is seeking public comment on harms from the business of collecting, analyzing, and monetizing information about people.
Major companies have regularly been in the news for gathering and selling user information, and in more than a few cases for data breaches of that information, never mind the unintended consequences of using it. Sometimes that data gathering is clearly covered by the Terms of Use that users must agree to in order to use the service. Sometimes not.
Where Accessibility Overlays Come In
An accessibility overlay refers to technologies meant to improve the accessibility of a web site. This is usually done by referencing code from another site, which the user’s browser retrieves and applies to the current page on each visit. In an ideal scenario, this makes the page more accessible and means a user does not need to ask for help (itself a privacy risk).
Accessibility overlays are in a unique position because they gather more non-HIPAA health data than the typical web site. They are generally marketed to site owners as a legal risk mitigation solution, not a vehicle for users to disclose their disabilities.
Asking users to self-identify is built into accessibility overlay interfaces. They prompt users to indicate whether they are using a screen reader, have dyslexia, or match pre-defined profiles such as ADHD, vision impaired, or prone to seizures. In some cases, the overlay makes a guess based on how the user interacts with the page.
Unlike a user of a social media service, who agrees to the platform’s Terms of Service before accessing it, a user who visits a web site with an accessibility overlay has no opportunity to decline the service before it starts to gather information about them. Many users may not even recognize that the overlay has Terms that are distinct from the site they are visiting, governed under laws from a different locality. Assuming the site has its own Terms.
[W]e find that 99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes. Our results suggest that even heavily sampled anonymized datasets are unlikely to satisfy the modern standards for anonymization…
— Estimating the success of re-identifications in incomplete datasets using generative models, 23 July 2019, Nature
As profiles or settings follow overlay users across sites, more information is gathered about their surfing habits. That information can be paired with the non-HIPAA health data the overlay vendor is capturing, with no clear way to opt out.
This means accessibility overlay vendors are stewards of a great deal of personal information from users. Users who may not understand they have given up that information. Information that can be valuable to data brokers. Even if the best actors in the space have a good privacy policy, that privacy policy is moot in the face of a data breach.
Anonymizing this non-HIPAA health data gathered from users is insufficient because de-anonymizing it is trivial: 99.98% of Americans can be re-identified with only 15 demographic attributes.
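To make that re-identification figure concrete, here is a toy sketch of my own (it is not the generative-model methodology of the Nature study, and the attributes and numbers are invented for illustration). The idea is that a few coarse attributes partition a population into so many buckets that most people land in a bucket of one, and a unique record is a re-identifiable record.

```python
import random
from collections import Counter

# Toy illustration only: a handful of hypothetical coarse attributes,
# each with a plausible number of possible values.
ATTRIBUTES = {
    "zip3": 900,          # first three digits of a U.S. ZIP code
    "birth_year": 90,
    "sex": 2,
    "screen_reader": 2,   # the kind of flag an overlay might record
}

POPULATION = 100_000
random.seed(0)

# Each synthetic person is a tuple of attribute values drawn at random.
people = [
    tuple(random.randrange(n) for n in ATTRIBUTES.values())
    for _ in range(POPULATION)
]

# Bucket people by their exact combination of attribute values, then
# measure what share of the population is alone in its bucket.
buckets = Counter(people)
unique_share = sum(1 for p in people if buckets[p] == 1) / POPULATION

print(f"{unique_share:.0%} of the synthetic population is unique "
      f"on just {len(ATTRIBUTES)} attributes")
```

Even with only four attributes and 100,000 synthetic people, roughly three quarters of the population is already unique; adding more attributes (the study used 15) pushes that toward the 99.98% figure quoted above.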
The risk, real or perceived, to users is compounded by the historical behavior of the current overlay vendors. Disabled users have repeatedly raised concerns about the validity of overlays, their relationship with the community, marketing promises, and overall effectiveness, to the point that the National Federation of the Blind disavowed them. One vendor never responded to a DPIA request under GDPR from a British user.
700+ digital accessibility practitioners, experts, end users, researchers, and more have signed an open letter imploring companies not to use overlays. There is a site dedicated to cataloguing the false marketing claims of overlay companies.
Many tech-savvy disabled users genuinely do not trust accessibility overlay vendors. Users who understand them are constantly wary they may visit a site where an overlay is installed, potentially matching up any information on file with browsing patterns, and making their disability status available to anyone willing to pay.
What to Do
Read through the Commercial Surveillance and Data Security Rulemaking, where fact sheets, a plain text version, and a Spanish language version are also available.
The FTC’s Advance Notice of Proposed Rulemaking (ANPR) is now open for public comment on the harms stemming from commercial surveillance and whether new rules are needed to protect people’s privacy and information.
If you want to share comments publicly, you have 60 days from the date of publication in the Federal Register, August 22, 2022: Trade Regulation Rule on Commercial Surveillance and Data Security
You can also share your input during a virtual public forum on September 8, 2022 at 2:00pm ET. Sign up by August 31, 2022 at 8:00pm ET.
The Commercial Surveillance and Data Security Rulemaking sections on “Automated Systems”, “Discrimination”, and “Consumer Consent” offer some sample questions users can answer.
Here is one from “Automated Systems”:
Does the weight that companies give to the outputs of automated decision-making systems overstate their reliability? If so, does that have the potential to lead to greater consumer harm when there are algorithmic errors?
Here is a question from “Discrimination” (disability is a protected category, though it is not listed in the sample question):
How prevalent is algorithmic discrimination based on protected categories such as race, sex, and age? Is such discrimination more pronounced in some sectors than others? If so, which ones?
And from “Consumer Consent”:
Should the Commission require different consent standards for different consumer groups (e.g., parents of teenagers (as opposed to parents of pre-teens), elderly individuals, individuals in crisis or otherwise especially vulnerable to deception)?
I encourage you to provide your own feedback however you see fit and based on your own experiences.
While there was some hope that the proposed bill S.3620 – Health Data Use and Privacy Commission Act might start to offer some protection for users, there has been no movement on it in months. This FTC effort is the next best bet. If you have been affected by overlays, now is a good opportunity to say so.
August 23, 2022: My Comment on the ANPR
FTC’s Advance Notice of Proposed Rulemaking (ANPR) went live yesterday. You can share your public comments at Trade Regulation Rule on Commercial Surveillance and Data Security until Friday, October 21, 2022.
My comment is now on the FTC site. It is in plain text and includes an image attachment. It mostly recaps what is in this post. The entire comment:
Broadly I support this effort.
One class of product in particular puts disabled users at more risk for data harvesting.
An accessibility overlay refers to technologies meant to improve the accessibility of a web site. This is usually done by referencing code from another site, which the user’s browser retrieves and applies to the current page on each visit. In an ideal scenario, this makes the page more accessible and means a user does not need to ask for help (itself a privacy risk).
Accessibility overlays are in a unique position because they gather more non-HIPAA health data (https://news.bloomberglaw.com/health-law-and-business/insight-the-top-five-health-care-privacy-issues-to-watch-in-2020#:~:text=Non%2DHIPAA%20Health%20Data) than the typical web site. They are generally marketed to site owners as a legal risk mitigation solution, not a vehicle for users to disclose their disabilities.
Asking users to self-identify is built into accessibility overlay interfaces. They prompt users if they are using a screen reader, have dyslexia, or match pre-defined profiles such as ADHD, vision impaired, prone to seizures, and so on. In some cases, the overlay makes a guess based on how the user interacts with the page. The attached image shows part of an overlay, with toggles for each of: Vision-impaired profile, ADHD friendly profile, and Cognitive disability profile.
Unlike a user of a social media service, who agrees to the platform’s Terms of Service before accessing it, a user who visits a web site with an accessibility overlay has no opportunity to decline the service before it starts to gather information about them. Many users may not even recognize that the overlay has Terms that are distinct from the site they are visiting, governed under laws from a different locality. Assuming the site has its own Terms.
As profiles or settings follow overlay users across sites, more information is gathered about their surfing habits. That information can be paired with the non-HIPAA health data the overlay vendor is capturing, with no clear way to opt out.
This means accessibility overlay vendors are stewards of a great deal of personal information from users. Users who may not understand they have given up that information. Information that can be valuable to data brokers. Even if the best actors in the space have a good privacy policy, that privacy policy is moot in the face of a data breach.
Anonymizing this non-HIPAA health data gathered from users is insufficient because de-anonymizing it is trivial — 99.98% of users can be identified with only 15 data points:
“[W]e find that 99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes. Our results suggest that even heavily sampled anonymized datasets are unlikely to satisfy the modern standards for anonymization…”
— Estimating the success of re-identifications in incomplete datasets using generative models, 23 July 2019, Nature, https://www.nature.com/articles/s41467-019-10933-3
The risk, real or perceived, to users is compounded by the historic behavior of the current overlay vendors. Disabled users have repeatedly raised concerns (https://www.vice.com/en/article/m7az74/people-with-disabilities-say-this-ai-tool-is-making-the-web-worse-for-them) about the validity of overlays (https://www.nbcnews.com/tech/innovation/blind-people-advocates-slam-company-claiming-make-websites-ada-compliant-n1266720), relationship with the community (https://www.numerama.com/politique/759804-faciliti-la-solution-pour-linclusion-en-ligne-qui-refuse-la-critique.html), marketing promises (https://www.wired.com/story/company-tapped-ai-website-landed-court/), and overall effectiveness (https://www.nytimes.com/2022/07/13/technology/ai-web-accessibility.html), to the point the National Federation of the Blind disavowed them (https://www.forbes.com/sites/gusalexiou/2021/06/26/largest-us-blind-advocacy-group-bans-web-accessibility-overlay-giant-accessibe/). One never responded to a DPIA request under GDPR (https://tink.uk/accessibe-and-data-protection/) from a British user.
700+ digital accessibility practitioners, experts, end users, researchers, and more have signed an open letter (https://overlayfactsheet.com/) imploring companies not to use overlays. There is a site dedicated to cataloguing the false marketing claims (https://overlayfalseclaims.com/) of overlay companies.
Many tech-savvy disabled users genuinely do not trust accessibility overlay vendors. Users who understand them are constantly wary they may visit a site where an overlay is installed, potentially matching up any information on file with browsing patterns, and making their disability status available to anyone willing to pay.
If this is at all helpful for your own comment, then borrow as you see fit.
Update: 9 September 2022
I find it interesting that IAAP has referred to the FTC effort as “accessibility in the news,” given that the FTC’s sample questions never mention disability even though it is a protected category.
Accessibility in the news
Federal Trade Commission (FTC) Request for Public Comment: Trade Regulation Rule on Commercial Surveillance & Data Security
Read more: regulations.gov/document/FTC-2022
I don’t mean to imply that it is referencing the issues I raise in this post, but I rather hope it is. Even if utterly obliquely.