Technology sets norms; societies need to create the means to steer technology
What should the technology landscape look like if it were there to serve humanity?
That is a question that should not be answered by technology companies alone, but by societies around the world. Thus, this site is not about answering that question, but about discussing the measures that need to be taken so that societies can truly create answers to it and be empowered to put those answers into action.
We base this on the observation that technologies create rules that guide behavior, yet to a great extent people have had little influence on how those rules are made. Democracies can influence those rules if they directly engage in creating technologies with a democratic mandate rather than just a commercial one. The technologies thus created may become Normative Technologies.
To further explore this concept, please see the following:
A critical delineation between different schools of thought in AI regulation centers on open versus closed systems and models. This paper focuses on that issue, as some thinkers and governments have proposed that keeping foundational models and source code closed is important to ensure the safety of AI deployment. I argue that closed systems pose a greater risk, as they are subject to scrutiny by only a very small number of professionals and will cause a knowledge imbalance that research cannot overcome. I contend that risk control is a separate issue from the open-closed axis, and that open systems can be made safer by good governance systems.
Throughout history, institutions have been formed to help guide increasingly complex societies. Now, with technology companies becoming at least as powerful as nation states, new institutions are again needed to balance power. This article argues that new institutions are a form of innovation, and that they will help usher in an era of new innovations for society.