During the current COVID-19 pandemic, policy makers have begun to treat people's immunity as the cornerstone of public health policies aimed at protecting citizens and their communities, sometimes to the detriment of other goals and values (Lavazza and Farina 2020; Farina and Lavazza 2020). Biopolitics—in the sense of control exercised over bodies through the categories of science and medicine—has become central in the political sphere as society confronts an unknown threat (Foucault et al. 2008). It may hence be useful to reflect on how the debate surrounding the adoption of digital contact tracing apps (DCT henceforth) bears on our conception of privacy and freedom in relation to the objective of achieving immunity from contagion. As we start reflecting on this issue, an important question arises: is it right to sacrifice citizens' rights in favour of immunity?

Immunity is the body's ability to prevent the invasion of pathogens and typically refers to communicable diseases, which can vary in severity and to which one can be exposed with different degrees of risk. Naturally, not all pathologies and risks are the same, and the search for immunity cannot always justify a trade-off against other equally cherished values. In this sense, mandatory subscription to a DCT app that relies on data sharing (deemed one of the best tools for addressing the threat to immunity) is a measure that should be calibrated against the degree of urgency of the need for immunity in the ongoing emergency.

The task of a philosophical and ethical reflection is then to identify a correct balance between the degree of immunity that is thought to be necessary and the corresponding reduction of other inalienable values and rights. Biopolitical decisions that put immunity first, as a synonym of security, run the risk of sacrificing privacy and basic freedoms by using technical tools that promise highly effective control over the spread of the virus. This has become a particularly pressing concern given the recent deluge of DCT apps aimed at monitoring Covid-19 exposure (Floridi 2020; Ienca and Vayena 2020).

DCT apps have become a critical tool in governments' arsenals for fighting the pandemic and hence for achieving immunity (Stevens and Haines 2020). In truth, though, contact tracing is not an entirely new technique. Public health officials have long used it to break the chain of transmission of infectious diseases (think of the plague). However, given the technological and infrastructural connectedness of the world and of the societies in which we live, this practice—which once had logistical and technical limitations—has the potential to become pervasive very quickly, creating dangerous vulnerabilities in the rule of law and threatening many hard-won fundamental civil liberties under the justification that the search for immunity is a more important value.

At the most basic level, Covid-tracing apps are designed to automatically notify users of potential exposure to Covid-19 by tracking their phone's location or its proximity to other devices (typically via Bluetooth). Therefore—prima facie—they seem rather helpful tools, which a citizen—in performing her civic duties and in the interest of public health—should download and actively use. However, the landscape of these apps is so varied that, if they are not properly standardised, their effectiveness may be significantly reduced and the justification for their use challenged (see Morley et al. 2020; Whitelaw et al. 2020 for interesting analyses of these points).
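To make the underlying mechanism concrete, the following minimal Python sketch illustrates a simplified, decentralised variant of this idea, in which the phone broadcasts short-lived random pseudonyms and matches them locally against identifiers voluntarily published by users who test positive. All names are hypothetical, and the sketch abstracts away from any real framework (which would add key rotation schedules, signal-attenuation thresholds and epidemiological weighting).

```python
# Minimal, illustrative sketch of decentralised exposure notification.
# All names are hypothetical; real systems are considerably more sophisticated.
import secrets
from datetime import datetime


def new_rolling_identifier():
    """Generate a fresh random pseudonym to be broadcast over Bluetooth."""
    return secrets.token_hex(16)


class LocalExposureLog:
    """Runs entirely on the user's phone; nothing leaves the device."""

    def __init__(self):
        self.observed = []  # (identifier, timestamp) pairs heard nearby

    def record_encounter(self, identifier, when):
        self.observed.append((identifier, when))

    def check_exposure(self, published_positive_ids):
        """Match locally stored encounters against identifiers voluntarily
        uploaded by users who tested positive."""
        return any(ident in published_positive_ids for ident, _ in self.observed)


# Usage sketch
log = LocalExposureLog()
stranger = new_rolling_identifier()
log.record_encounter(stranger, datetime.now())
print(log.check_exposure({stranger}))  # True: the user is notified of potential exposure
```

On this decentralised model, matching happens on the device itself, which is precisely why the design choices behind each app matter for the privacy questions discussed below.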

Furthermore, there are a number of fundamental ethical questions underlying the usage of DCT apps that deserve much attention. For instance: who is producing a given app? How is its producer related to the national government? On which technology does the app rely? What level of privacy does it guarantee? What will happen to the collected data, and how will such information be treated in the future?

Besides these crucial social and political concerns, there are also important legal aspects of DCT apps' usage that need to be analysed more carefully when planning large-scale rollouts. For instance, oversight bodies should test the robustness of the adopted privacy-preserving measures and possibly produce legal definitions for the roles of all the actors involved in the development and implementation of DCT apps. In addition, ad hoc legislation may be required to specify rules that can safeguard citizens against possible misuses. Furthermore, sanctions for the unlawful handling of personal data ought to be implemented (Blasimme and Vayena 2020). And yet, while these are all important issues to consider, we believe that they do not address the fundamental problem at stake: the problem of mass surveillance.

Since 2013 we have become increasingly aware of governments' data breaches. Tech companies, often on behalf of governments, actively profile their customers through social media (Cambridge Analytica, for instance: Isaak and Hanna 2018). Moreover, government agencies routinely carry out mass surveillance by accessing our inboxes, listening to our conversations, keeping track of the videos we watch and the websites we open, or following the transactions we make online (Snowden 2019; Bernal 2016). Each year an increasing number of laptops are found with hidden keyloggers,Footnote 1 and motherboards are found with spy chips built into them. Some agencies have even developed programs (like Brutal KangarooFootnote 2) to infiltrate a closed network or air-gapped computer without requiring internet access.

This means that people's metadata is constantly under scrutiny and at risk of being exposed, even if some of us are still not fully aware of the pervasiveness of this sophisticated system of espionage.Footnote 3 In this sense, then, by using DCT apps—which, given their public nature, must necessarily be subject to independent scrutiny by free and rational moral agents and abide by specific national regulations—people will not be doing more harm to their privacy and constitutional rights than the harm they have already caused to themselves. Over the last two decades, people have given up their privacy, sometimes willingly (for instance, by accepting Facebook's, Google's or Instagram's terms and conditions of usage). They have also tolerated constant violations of their digital freedoms, often silently. Thus, during the pandemic, privacy could paradoxically be protected more than usual. However, the point to be emphasised here is the potential Trojan-horse role that the combination of a biopolitical justification (immunity) and mass surveillance (as the means to that goal) could play in the future. Societies need to be aware of this risk and think about possible countermeasures.

It is certain that, in response to the pandemic, a number of governments will overstep their constitutional boundaries and try to seize the opportunity to strengthen the system of comprehensive mass surveillance that has been created and implemented over the last two decades. For this reason, it might be argued that the answer to the problem of freedom after COVID-19 should not lie exclusively in the formulation of legal and ethical frameworks for the usage of DCT apps (even though such an attempt is praiseworthy); rather, if we are really serious about it, it ought to involve a paradigmatic change.

Such a change should encompass a cultural strategy to reduce the strength of the biopolitical appeal to immunity and make it proportional to the real danger (for example, not every threat to immunity needs to be addressed with drastic measures). It should also include the adoption of a number of technical solutions (such as end-to-end encryption and anonymization) as well as a radical departure from the current political landscape, focusing—for example—on the development of policies addressing software and hardware vulnerabilities and weaknesses of the Internet architecture. The former is certainly more feasible than the latter, at least in the short term (Schuster et al. 2017), on the condition that people start using advanced anonymized encryption technology on a large scale (Schneier 2007).
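As a purely illustrative sketch of what such technical solutions might look like in practice, the following Python fragment shows two of the measures mentioned above applied to a contact record before it leaves the device: pseudonymization of identifiers through a keyed hash, and encryption of the record itself. It assumes the third-party `cryptography` package; all function names and data fields are hypothetical, and a genuine end-to-end scheme would keep the decryption key exclusively on user devices.

```python
# Illustrative sketch: pseudonymization and client-side encryption of a contact record.
# Assumes the third-party `cryptography` package; names and fields are hypothetical.
import hashlib
import hmac
import json

from cryptography.fernet import Fernet


def pseudonymize(user_id, secret_salt):
    """Replace a direct identifier with a keyed hash, so that stored records
    cannot be linked back to a person without the secret salt."""
    return hmac.new(secret_salt, user_id.encode(), hashlib.sha256).hexdigest()


def encrypt_locally(record, key):
    """Encrypt a record on the device before it is shared or uploaded."""
    return Fernet(key).encrypt(json.dumps(record).encode())


# Usage sketch
salt = b"secret salt, never stored alongside the data"
key = Fernet.generate_key()
record = {"contact": pseudonymize("alice@example.org", salt), "duration_min": 12}
ciphertext = encrypt_locally(record, key)
print(Fernet(key).decrypt(ciphertext))  # only holders of the key can recover the record
```

The point of the sketch is simply that such measures are technically straightforward; whether they are adopted at scale is, as argued here, a political and cultural question rather than an engineering one.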

In the framework of a biopolitical quest for perfect immunity from contagion, our basic freedoms might be significantly eroded if we do not introduce a principle of proportionality among different values. Crucially, such a principle should help us attain the legitimate objectives of preserving citizens' and communities' immunity without exceeding the limits of what is appropriate and necessary to achieve those objectives (for example, without undermining basic civil liberties).

In order to regulate the proliferation and usage of DCT apps, oversight bodies should thus adapt technological design to socially perceived risks or expectations, take into account ethical and legal considerations, and introduce checks and balances to prevent abuses. However, this is unlikely to suffice to fully preserve our civil freedoms, as these ultimately depend on wider and more encompassing strategies that we ought to adopt collectively in order to defend our rights from the abuses that governments might commit with the aim (or the excuse) of defending immunity. In this vein, the experts' task should be to assess the real threats to immunity, so as to avoid the use of tools that can unjustifiably limit the rights of citizens, as might happen with DCT apps.