Image: "Roe v wade overturned: Protest to defend US abortion rights (Melb)", Matt Hrkac via Flickr.

The security and privacy practices of technology companies such as Facebook have once again come under fire from organisations that denounce these companies’ failure to meet international standards for the protection of human rights.

In August 2022, a teenager and her mother in Nebraska, USA, were charged with a number of felonies and misdemeanours for terminating the girl’s pregnancy. According to the case’s court files, the information revealing the steps that led to the abortion was disclosed by Facebook/Meta, following a court order requesting that the platform release copies of private conversations between mother and daughter.

This case occurs in the context of the United States Supreme Court decision that overturned the historic Roe v. Wade ruling, which had guaranteed the right to abortion and marked a huge step forward in women’s rights. In this post-Roe era, we are starting to see the first few responses by tech companies and how these come into conflict with their users’ privacy.

How did we reach this point?

In order to understand what this case might signify, it is important to look at how tech companies work. Their main business model involves collecting data from their users, including potentially sensitive information regarding health and sexuality, among other issues. This data gathering is inherently not neutral. It disproportionately impacts women as well as persons with diverse sexual orientations and gender identities, affecting their privacy and security.

Civil society organisations, including the Association for Progressive Communications (APC), have been warning for years that these companies are failing to meet their responsibilities when it comes to rights such as security and privacy for women and LGBTIQ+ persons. It should not be this way. Facebook and other internet companies have responsibilities with respect to rights – responsibilities established under the United Nations Guiding Principles on Business and Human Rights, which is the framework that should govern these companies’ activities. In this particular case in Nebraska, as human rights defenders have highlighted, Facebook’s swift release of the requested information to the authorities does not appear to have come after careful consideration of how it would impact the privacy and health rights of the users.

What do these responsibilities entail?

Tech companies must limit their data collection and avoid any unnecessary access to and use of that information. Pursuant to the recommendations of the United Nations Special Rapporteur on the right to privacy, they must take into account the disproportionate impact of their operations on women and sexually and gender-diverse persons.

Companies must first identify any risks, impacts and possible violations that their activities may entail for human rights, and then take steps to reduce, avoid or mitigate them. Among other practices, this involves engaging individuals and groups whose rights may be affected by a company’s activities in the development of that company's policies and services, so as to better understand their needs.

The use of end-to-end encryption, as noted in several United Nations resolutions and reports, is another key measure for ensuring privacy and freedom of expression in digital environments. This system guarantees that only the sender and the receiver can read or listen to what is being sent, and that the information is not available to anyone else, including the company that enables that communication. WhatsApp implemented default end-to-end encryption some time ago. Instagram, however, is yet to do so.
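
To illustrate the principle only (this is a hypothetical sketch using the open source PyNaCl library, not the code of WhatsApp, Signal or any other platform), the core idea of end-to-end encryption is that the message is locked with keys held solely by the two people communicating, so the service that relays the message cannot read it:

```python
# Illustrative sketch of the end-to-end encryption principle using PyNaCl
# (a Python binding to libsodium). The keys and message are hypothetical.
from nacl.public import PrivateKey, Box

# Each person generates a key pair on their own device.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts with their private key and the receiver's public key.
sender_box = Box(sender_key, receiver_key.public_key)
ciphertext = sender_box.encrypt(b"a private conversation")

# The platform only ever relays `ciphertext`. Without one of the two
# private keys, which never leave the users' devices, it cannot decrypt it.

# The receiver decrypts with their private key and the sender's public key.
receiver_box = Box(receiver_key, sender_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"a private conversation"
```

The practical consequence is the point made above: a company that only handles ciphertext has nothing legible to hand over, whether to advertisers or in response to a data request.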

Encryption, together with the option to use communication services anonymously, is key because it guarantees the safety of women, particularly with regard to sensitive information such as that connected with sexual and reproductive health. Encryption and anonymity are essential for the enjoyment of rights by women and persons with diverse gender identities and sexual orientations, as well as by other groups that are particularly vulnerable to privacy violations, structural discrimination and online violence.

Transparency and accountability are also critical issues that must guide companies in their operations, including in the collection and use of data. Companies must properly communicate how their activities can affect the rights of their women users and disclose the mitigation measures they have implemented. They must apply clear policies regarding the informed consent of these users. Their transparency duty also means that companies must inform women users in a clear and accessible manner about the gathering, use, sharing and retention of data that may affect their privacy. If there are government requests for data, these must also be clearly communicated to the users involved.

How can we defend ourselves from these threats to privacy?

We must start by trying to regain as much control as possible over our digital activities and spaces. We need to ask ourselves what kind of data we are making available to the platforms. A good starting point is the Feminist Principles of the Internet that APC developed together with women’s rights activists and organisations nearly a decade ago. These principles are based on the idea that surveillance is a tool historically used to control and constrain the bodies, voices and activism of women and sexually and gender-diverse persons. This includes surveillance practices by the private sector.

Following those same principles, we support the right to privacy and to have control over our personal data and information online at every level, and we reject the practices of both states and private companies that use data to make a profit and to manipulate online behaviour.

As for more practical recommendations, it is important that we reconsider how we use platforms such as Facebook and ask ourselves if we really want to continue using them or if we prefer to switch to messaging services such as Signal that use end-to-end encryption by default. As the DIY Guide to Feminist Cybersecurity suggests, if we do not encrypt our activities on the internet, then they are not private and anyone (individual or company) can potentially access them. The guide developed by the Electronic Frontier Foundation (EFF) is another useful resource, especially for those looking for information on abortion in the post-Roe context. Among other practices, it suggests choosing a separate internet browser with hardened privacy settings, such as DuckDuckGo.

In short, human rights, and the right to privacy in particular, must be at the core of companies’ services and the policies that govern them. These services and policies must be designed with a gender perspective, recognising the disproportionate impact that privacy violations have on historically discriminated groups. At the same time, as women who use these services, we need to reflect on how we can protect ourselves from the current trend of surveillance and data mining, especially when it comes to sensitive information such as information related to our health. We need to take increasing control over our spaces, information, conversations and the relationships we have in them, prioritising projects, tools and platforms that by default treat the privacy of their users as their main concern.

Fragments of this article were published in an earlier interview with the journalist Camila Alfie, for Página 12.
