This document outlines how democracies can influence the norms that arise from technology. A norm is a rule that guides behavior among members of a society or group. Channeling the founding sociologist Émile Durkheim, Nicki Lisa Cole distinguishes between "norms," the rules that guide our behavior, and the "normative," what we think should be normal.
Technology indisputably creates the rules that guide behavior in fundamental ways, but to a great extent people have had little influence on how those rules are formed; that is, we have not had the opportunity to form normative statements around that technology. There are several ways to formulate such normative statements, of which law is generally thought of as the most prominent. This document argues that democracies should influence those rules much more directly by developing technologies with a democratic mandate.
Technologies create norms for everything from how election campaigns are run to how we organize children's parties. This has happened because of large imbalances in power that have developed through technological and economic centralization, a large knowledge gap between the private and public sectors, and similar forces. Over the last few decades, this has compromised democracies' very ability to regulate technology, despite the fact that technology is entirely a human artifact. I believe that to restore regulability, democracies must acknowledge the power that technology development has to create norms. While all technologies may create norms, most technology development has only minor societal relevance. Some technologies, however, may have a major societal impact, and in those cases democracies must be empowered to influence what that impact should be. These are what I call "Normative Technologies."
For an example of how different architectures encourage different behaviors, consider the option to add comments to news stories, which many news outlets have implemented. As comment sections have attracted hate speech and other undesired behavior, many outlets have since stopped offering them. Normative Technologies recognize that arenas where people can express opinions are important in democracies, but the need to curate comments raises concerns about censorship, places a large burden on news outlets, and yet provides questionable value. Technologies could instead allow people to comment in their own spaces, with news outlets choosing to promote quality comments through an editorial process. This would alleviate some concerns but may create others, as it may bolster echo chambers. Developing such technology, studying the effects of design choices in this area, and adjusting the technology to strengthen discourse would be a typical Normative Technology task.
Outlining Normative Technologies
As outlined above, Normative Technologies are technologies that have the potential to set societal norms. For such technologies to be in service of society, they must enhance regulability, and for regulability to be regained, democracies must have increased sovereignty, though in an open and non-hegemonic sense, as argued by Benjamin Pajot in "Barbed wire on the Internet prairie: against new enclosures, digital commons as drivers of sovereignty". He further notes that the hegemonic strategies so prevalent in today's technological landscape can be challenged by digital commons. Digital commons are characterized by being non-exclusive and non-rivalrous: access to them cannot or should not be prevented, and sharing them will not deplete them. Since every citizen must have equal access to technologies that form norms, Normative Technologies must be non-exclusive; that they are also non-rivalrous is an added benefit. Thus, Normative Technologies are understood to be digital commons. Free and Open Source Software (FOSS) generally exhibits these traits, is already prevalent, and could provide an existing basis.
To further promote regulability and citizens' influence on the direction of technology, Normative Technologies must be guided by legitimate governing bodies. These bodies must ensure that citizens have full access to knowledge about how the technologies are built, and that a broad range of trained individuals are able to create and change them. They are governing bodies whose members are able to make detailed technical decisions.
It is understood that legitimate governing bodies do not necessarily emerge in large digital commons projects; many sources of power can come into play, generally either to compromise the effective influence citizens can have or to create a situation of temporary or even permanent exclusivity.
However, legitimate governing bodies must derive their legitimacy from multiple sources, including those who implement the technology. Some representatives may, for example, be elected by users, and democracies should seek to appoint their own representatives to such governing bodies, but those representatives must have the know-how to participate constructively.
Normative Technologies are built with the acknowledgement that concrete implementations will exert strong power over the technologies citizens use in practice. Furthermore, this power is acknowledged to rest on a legitimate interest, as key challenges and decision points often do not emerge before technologies are implemented. It is therefore crucial that there is a tight feedback loop between implementers and other decision makers when norms are being crafted.
Implementations of Normative Technologies are built with the recognition that citizens can legitimately reject technologies by not using them. Normative Technologies should therefore be understood as propositions to citizens, not as final decisions on what technologies should be available to them. This leaves final power with citizens and allows them to reject technologies that do not serve them. When this happens, both development and oversight bodies must examine the situation to understand whether the rejection resulted from inappropriate power structures, from proposed norms that lack support in the population, from a flawed implementation, or from some other cause.