SenseIT
9/12/2022
How Digital Accessibility Improves Data Privacy and Vice Versa
Concerns about data privacy in the era of big data are ubiquitous, but they become even more critical when the data in question pertains to the health and finances of an already marginalized subset of the population.
Data Privacy and Accessibility
In the age of in-home voice assistants, screen reader detectors, and accessibility overlays that collect user information, it’s becoming increasingly difficult to ensure that users with disabilities (and without) can feel secure in their ability to maintain relative anonymity online. Designing barrier-free digital products is the only way to ensure that every user’s right to privacy is protected. Data privacy and digital accessibility are inextricably linked, and both are worthwhile ventures.
Why Data Privacy is Particularly Important to People with Disabilities
Privacy in certain areas of life is considered a right by most people as well as by the law. Chief among those areas is health information. Disability status on its own is not considered protected health information under the Health Insurance Portability and Accountability Act (HIPAA), the law that makes most patient health information confidential. However, the Americans with Disabilities Act (ADA) does require that personnel records relating to disability be kept confidential[i]. That law doesn’t protect people with disabilities who are unemployed, and it only protects people’s information in the context of their workplace – it doesn’t apply to the broader public. But the combination of those two laws, along with the fact that most people don’t know the specifics of either, means that the cultural norm and expectation is that disability-related health information, like other health information, is private.
This might matter more to someone with a disability than to someone without. Many people with disabilities are (justifiably) apprehensive about their disability status being revealed and exploited. For people without disabilities, the common refrain is that if you haven’t done anything wrong, you shouldn’t be concerned about data privacy. But this is overly simplistic in the best of cases, and harmful in others. It assumes that the people with access to your data won’t find a way to harm you with it, and that they aren’t interested in doing so in any case – an assumption that has proven false for people in marginalized positions for as long as there has been data to exploit. People with disabilities have undergone horrors at the hands of people in power, and not enough has been done on a systemic level to foster trust that it won’t happen again. It also assumes that the societal perception of “doing something wrong” is static, when we know it isn’t; there have been times throughout history when simply existing as a person with disabilities would have been considered “doing something wrong.”
Even if you don’t think that reason for wanting privacy is legitimate, it shouldn’t matter. Why a person feels a certain way is beside the point – that they feel it is reason enough to address their concerns for what they are, and empathy is free. The fact that privacy regarding disability status is somewhat unclear under the law is disconcerting, but it is clear that discriminating against someone on the basis of their disability status is illegal, and there is very little reason for that information to be public. In theory, we all have the right to consent to which personal information is made public and which isn’t. The specific reasons behind our choices are irrelevant – we all deserve the right to choose.
Potential Threats to Data Privacy
Many breaches of data privacy come down to user error and to how easy a product makes those errors (putting sensitive information in the wrong input field, for example). A product designed according to WCAG’s success criteria for input assistance (Guideline 3.3)[ii] makes those errors less likely to happen, as sketched below. However, most products aren’t designed to WCAG specifications.
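To make that concrete, here is a minimal TypeScript sketch of the kind of input assistance WCAG calls for – identifying an error in text and suggesting a correction – for a hypothetical account-number field. The element IDs, the 10-digit format, and the messages are assumptions for illustration, not taken from any real product:

```typescript
// Minimal sketch of WCAG-style input assistance (Guideline 3.3) for a
// hypothetical "account number" field.
const field = document.getElementById("account-number") as HTMLInputElement;
const errorEl = document.getElementById("account-number-error") as HTMLElement;

function validateAccountNumber(): boolean {
  const ok = /^\d{10}$/.test(field.value); // assumed 10-digit format
  if (!ok) {
    // SC 3.3.1 (Error Identification) and 3.3.3 (Error Suggestion):
    // say what went wrong in text and suggest how to fix it.
    errorEl.textContent =
      "Account number must be exactly 10 digits, e.g. 0123456789.";
    // Tie the message to the field so screen readers announce it.
    field.setAttribute("aria-invalid", "true");
    field.setAttribute("aria-describedby", "account-number-error");
  } else {
    errorEl.textContent = "";
    field.removeAttribute("aria-invalid");
  }
  return ok;
}

// Validate on blur, so the error is reported in context rather than
// discovered only after a failed submit.
field.addEventListener("blur", validateAccountNumber);
```

A user who hears “must be exactly 10 digits” while still in the field is far less likely to paste, say, a social security number into the wrong box and submit it.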
Financial privacy is probably second in importance in most people’s minds (and first in some) after health information privacy, and there are several laws that deal with it in the United States[iii]. Some concern the government’s right (or lack thereof) to access citizens’ financial records. Others concern financial institutions disclosing customer information to third parties more generally. The specifics of each law differ, but the overarching theme is that financial information is treated as private, and in the exceptional circumstances where consumer information is made public or disclosed to third parties, that must be made clear to the consumer. In truly exceptional circumstances the consumer may not be notified, but none of those circumstances include simply being a person with disabilities. Inaccessible products make it more likely that the financial privacy of a person with disabilities will be violated, either because the product forces them to rely on another person’s assistance or because it makes a privacy-breaching user error more likely.
How Insufficient Accessibility Solutions Lead to Breaches in Data Privacy
Cookies are small pieces of text that websites store in the browser to track users’ activity, largely to enable better-targeted advertising; they’re also a data security concern, since many of them are run by third parties and are easily hacked. That means users’ personal information can be accessed by anyone with the knowledge and desire to do so. Mozilla Firefox and Safari no longer support third-party cookies, largely because of their inherent privacy risk, and Google plans to phase them out of Chrome in the near future. In many cases, the banner asking for the user’s consent to cookies is itself an accessibility barrier[iv], and a user who cannot access the cookie consent form cannot give informed consent to cookies. Cookies are a cause for concern for all of us, given the lack of transparency about how they’re used, but all the more so for people with disabilities: the cookies are further obscured by inaccessible consent flows, and the stakes attached to data privacy are higher.
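For contrast, a consent banner doesn’t have to be a barrier. Below is a minimal TypeScript sketch of a consent prompt that a keyboard or screen reader user can actually operate; the element IDs and wording are hypothetical:

```typescript
// Minimal sketch of an operable cookie consent banner. The point is that
// the banner is exposed to assistive technology and usable without a mouse.
const banner = document.getElementById("cookie-consent") as HTMLElement;

// Announce the banner as a labeled dialog rather than an anonymous <div>.
banner.setAttribute("role", "dialog");
banner.setAttribute("aria-modal", "true");
banner.setAttribute("aria-labelledby", "cookie-consent-title");

const acceptBtn = banner.querySelector<HTMLButtonElement>("#consent-accept")!;
const rejectBtn = banner.querySelector<HTMLButtonElement>("#consent-reject")!;

// Move focus into the banner so it can't be silently skipped.
rejectBtn.focus();

function close(choice: "accepted" | "rejected"): void {
  // Record only the choice itself, in a first-party cookie.
  document.cookie = `consent=${choice}; path=/; SameSite=Lax; max-age=15552000`;
  banner.remove();
}

// Real <button> elements are reachable by Tab and activable with
// Enter/Space, so there are no click-only controls.
acceptBtn.addEventListener("click", () => close("accepted"));
rejectBtn.addEventListener("click", () => close("rejected"));
```

Nothing here is exotic; the difference between this and an inaccessible banner is usually a dialog role, a focus move, and real buttons.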
Ironically, some of those exact cookies are meant to help with accessibility. In yet another strike against accessibility overlays as a solution, many of them operate via cookies: they try to detect whether a user is using a screen reader or other assistive technology and automatically apply what they consider the appropriate settings. Some even leave what’s called a zombie cookie – a cookie that is recreated even after the user deletes it – which follows the user beyond the original website and applies the same settings on other sites that use the same overlay. This is a blatant violation of privacy, especially considering that there is often no option to opt out of these cookies. This is the very data that users don’t want collected.
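To illustrate the pattern being criticized (not to endorse it), here is a simplified TypeScript sketch of how such detection can work. There is no browser API that reveals a screen reader, so overlays guess from behavior; the heuristic and the cookie name here are hypothetical:

```typescript
// Simplified sketch of the overlay detection pattern described above --
// an illustration of the privacy problem, not a recommendation.
let sawMouse = false;
document.addEventListener("mousemove", () => { sawMouse = true; });

document.addEventListener("keydown", (e) => {
  // Crude guess: sustained keyboard navigation with no mouse activity.
  if (e.key === "Tab" && !sawMouse) {
    // Persist the guess for a year. Real overlays sync this flag through
    // the vendor's own domain, so the "uses assistive technology" label
    // follows the user to every site running the same overlay.
    document.cookie = "at_detected=1; max-age=31536000; path=/";
  }
});
```

Note what never happens in that sketch: the user is never asked. The flag is written the moment the heuristic fires.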
Voice assistants have amazing potential as assistive technology, but mainstream users have voiced concerns about data collection (the cookies involved are usually first-party, which are generally considered safer where hacking is concerned, but they carry all the transparency issues that other cookies do). Since we have no idea what’s being done with the collected data beyond targeted advertising, it’s no surprise we feel apprehensive about it, especially since the companies behind the most popular assistants are huge and opaque to begin with. Users with disabilities are once again made more vulnerable by technology that should have helped them.
How Accessible Digital Products Protect Data Privacy
Accessible digital products might still have data privacy issues, as we see in the case of voice assistants, but they certainly have fewer than inaccessible products. Data privacy and accessibility are not perfectly overlapping circles, so the advantages of an accessible product aren’t simply the mirror image of the disadvantages of an inaccessible one. But the middle of the Venn diagram between data privacy and digital accessibility is large, and the less a product protects consumer data, the more likely it is to present accessibility barriers as well.
Moreover, accessible digital products can actively encourage data privacy. The biggest advantage a barrier-free web product offers a user with disabilities is that they can use it fully and independently. That means user error is minimal, they don’t need to rely on another person, and they know what data is or isn’t being collected and by whom. They won’t encounter screen reader detectors or other invasive technologies, and everyone can maintain their desired degree of anonymity. And because designing a digitally accessible product requires conscious thought about the end-user experience, data privacy issues are more likely to have been caught and remediated along the way.
Accessibility is a Win-Win
The only effective approach to digital accessibility – removing barriers – is also the best approach to data privacy, since the two go hand in hand. Digital product designers and developers have a duty to their end users: to ensure both that users can actually use the product and that using it doesn’t harm them. Caring for your product’s accessibility helps fulfill both.
[i] There are exceptions, of course; most of them are still on a need-to-know basis. If that interests you, you can read more here.
[ii] See WCAG’s input assistance guidelines (Guideline 3.3).
[iii] You can read about them here.
[iv] See here for further explanation.