C. Dianne Martin is Emeritus Professor of Computer Science at George Washington University, and Adjunct Professor in the School of Information, University of North Carolina at Chapel Hill. She has been teaching Computers and Society since 1983.
In 1787 Jeremy Bentham, the British Utilitarian philosopher and penal reform theorist, wrote a series of letters in which he proposed a Panopticon (all-seeing), also called The Inspection-House. His letters put forward “the idea of a new principle of construction applicable to any sort of establishment, in which persons of any description are to be kept under inspection”. For the next 16 years he was obsessed with implementing his model prison design, which he believed would transform penal methods by drastically cutting costs through significant downsizing of the workforce needed to oversee prison populations. He also felt such prisons would have positive moral value for the prisoners.
Emerging—and existing—technologies are bringing us closer to the brink. And even if they turn out to be more benign, envisioning some technological advance as our salvation will waste precious time as the ecosystems upon which we rely move closer to collapse and the violent forces of authoritarianism gain power.
All technology, from hammers and Hummers to routers and killer robots, is intended to increase power: to do something cheaper, easier, faster, with more entertainment value, with stronger impact, at greater distances, in more places, or with greater stealth. Technological power, like economic, political, cultural, institutional, or physical power, is distributed unevenly. It tends to be accumulated by people and organizations who already have too much. Algorithmic power has accelerated those differences; the computer has helped create today’s staggering economic divide. Many of the world’s richest people gained their fortunes through such algorithms, and it is their ideologies, as well as the computer systems themselves, that are taking us in dangerous directions.
Brett Frischmann is the Charles Widger Endowed University Professor in Law, Business and Economics, Villanova University. The book most relevant to his thoughts below is Re-Engineering Humanity (Cambridge University Press, 2018).
Q: What do you do when you see a little button on a webpage or app screen that says “I agree”? A: Click the button.
The familiar and incredibly simple click-to-agree mechanism is ubiquitous. We encounter it throughout our digital lives. It is nothing less than the “legal backbone” of the Internet, app stores, e-commerce, and so much more. Yet electronic contracting and the illusion of consent-by-clicking are a sham!
Imagine a world governed by smart technologies engineered to achieve three distinct yet interrelated normative ends: optimized transactional efficiency, resource productivity and human happiness. We could have congestion-free roads—no stop and go, no road rage! Instantaneous, personalized entertainment—no need to search or browse! Successful social interactions—no misunderstanding or missed cues! No surprise ailments, no failures, no missed opportunities! Heck, no surprises of any kind! There are so many imperfections in our world that smart technology could fix.
We do not live in such a world, but the technologies required for it to exist are already being rapidly developed and deployed. Take, for example, the Internet of Things (IoT)—big data, sensors, algorithms, artificial intelligence and various other related technologies. Their promoters make seductive promises. Supposedly, smart phones, grids, cars, homes, clothing and so on will make our lives easier, better, happier.