A Call to Action

Digital technologies for learning, health, politics, and commerce have enriched the world. Digital heroes like Sir Tim Berners-Lee, Batya Friedman, Alan Kay, J.C.R. Licklider, and Joe Weizenbaum have blazed trails. Yet there is trouble. We depend upon software that nobody fully understands. We are vulnerable to cyberterrorism. Privacy is overrun by surveillance capitalism. Totalitarian control advances. A daily diet of internet news matching our beliefs makes it hard to tell truth from falsehood. People are addicted to their devices. Jobs disappear without social safety nets. Digital leviathans threaten to control all commerce. Our values are threatened.

There are risks of premature use of AI in domains such as health, criminal justice, senior care, and warfare. Much current AI is unreliable, without common sense, deceptive in hiding that it’s an algorithm, unable to explain decisions, unjust, neither accountable nor responsible, and untrustworthy.  

Our digital dreams are now digital nightmares.  

We cannot put the genie back in the bottle, but we can ensure human control. Our ethical duty is to do so. Digital technologists can advise thoughtful citizens and society on productive actions; there is much they can do themselves. The digital world can be made safer and more consistent with our values.

Citizens Can and Must Act 

I will start with actions for ordinary people.  I mean non-technical people who encounter technology daily — our parents and neighbors and friends.  

We must exercise self-control over digital technologies, limiting our use of cell phones and establishing periods of disconnection — even for days at a time. Parents can establish rules for appropriate digital immersion by their children.

Many people feel inadequate with respect to technology. They feel they don’t understand it, can’t do it, and that it’s their fault because they’re stupid. They are not stupid; rather, it is the technology that is badly designed. People should not put up with jargon — geekspeak. If someone uses terms such as concurrency, phishing, or platform, the speaker must be asked to explain. Tell people they can expect to have technology ideas and concepts explained clearly. GOFAI, intelligent tutors, machine learning, user experience — all can be explained in ways that leave people feeling less steamrolled by tech.

People can indicate their approval or disapproval of big tech firms through their purchasing power. If you disagree with a firm’s behaviour, don’t buy its products. Do not download its apps. Tell the company what you did and why.

Citizens must be included in critical decisions about technology and its use. They must be full participants in social, political, and ethical discussions. They must make their voices heard. They can lobby to express support for or disapproval of actions involving tech, such as whether their city should become a “smart city”. This happened recently where I live, in Toronto.

People invested in a company’s stock can make their voices heard as shareholders. Amazon experienced this with respect to climate change. Buy shares of a firm you feel is evil; speak up at the shareholders’ meeting. 

Society Can and Must Act 

Other actions may be taken by society, typically acting through governments.

Colleges were once committed to a liberal education. Many universities threw this out in the 1970s. This must change. Students studying computer science must not focus totally on computing and math, as is now often true. Most colleges are government-funded, so society can apply pressure. CS students should be required to take a “computers and society” or “computer ethics” course, something many universities do not now require.

My books provide a thorough approach. An imaginative and engaging method is to introduce issues through reading science fiction novels or viewing science fiction films. Harvard has developed a program called “Embedded Ethics”. Philosophers are embedded into courses to teach students how to think about ethical and social issues. Students study bias in machine learning, fake news in networks, and accessibility in HCI. Results have been positive. The program keeps ethics at the forefront throughout the curriculum. Students are engaged, “expressing eagerness for … more opportunities to develop skills in ethical reasoning”.

Medicine, law, and engineering license practitioners and accredit their qualifications and degree-granting institutions.  Physicians, lawyers, and engineers may be held legally responsible for their work.  Actions violating ethical norms are subject to sanctions, removal of qualifications to practice, even criminal prosecution. CS has not been subject to such standards, with no mechanism to ensure acceptable performance. This must change. 

CS departments must be accredited. General education accreditation requirements must be strengthened. The current ABET accreditation requires only half of a computers and society course, with no requirement for ethics. This is wrong. Computer scientists must be licensed, with requirements for continuing education, professional practice standards, and procedures for discipline — fines and even prison — for CS malpractice.
 
Regulation has been sparse. Legal action has been rare. This is changing because of social media’s hate speech, fake news, and privacy invasions.  Europe has been at the vanguard of legislation, as with the General Data Protection Regulation. It has been forceful in enforcing laws against Big Tech. 

Big Tech has created wonderful products. Yet the leviathans must be broken up to check their increasing monopoly power — their ability to squash innovative competitors, to spread unchecked into new markets, and to control prices.  

Digital Technology People Can and Must Act 

I have insisted that ordinary people not accept geekspeak, so you must do your part. Practice describing your technical work, or tech articles in the media, in plain English to relatives and friends.

Speak up when you encounter systems that are confusing and unusable, that have been inadequately tested, and that leave people unhappy and frustrated. Complain to managers if you are working for a firm shipping such monsters of poor design and careless implementation.  

Bloatware refers to systems cluttered with thousands of commands and features, most appealing only to a tiny minority of users. Much software is packed with more features than any normal human needs or can use, making systems unreliable and forbidding. Campaign vociferously against bloatware.

Tech R&D requires CS talent to realize human dreams rather than nightmares. AI agents helping people must be identified as algorithms. Decisions and actions of algorithms should be transparent and explainable. There are opportunities to ensure that algorithms make fair, just decisions.

Designing AI with goals such as openness, transparency, and fairness in mind is an example of value sensitive design (VSD), developed by University of Washington Prof. Batya Friedman and collaborators. Computing professionals designing systems can devise ways to ensure that functionality and UI reflect values such as safety, simplicity, clarity, honesty, and empathy.

VSD is an example of tech motivated by social good. CS students can look for applications that address pressing societal problems, such as the environment. They can look for aspects that speak to and respond to user needs, such as clarity of the user experience or availability of sensitive customer support.

Because of the pervasiveness of digital technologies, CS students should consider government careers or even running for public office. 

Students can consider a firm’s ethical track record in deciding whether to work there.  Facebook saw this after misuses of social media during the 2016 US election caused many new graduates to shun the company. 

Employees can speak up when they believe their firm’s actions are immoral. There are ways to escalate concerns that are relatively safe. Start with private conversations and email with fellow workers and with managers. If this has no effect, go public, first within the firm, then outside it. Google has been a case study in employee unrest due to ethical concerns. Unions are starting to form in tech.

Two actions go beyond protesting. One can become a conscientious objector, objecting to a specific task or to all work at a firm, of course at great personal risk to one’s career. The final action is whistleblowing. This occurs when employees are so certain of the immorality of confidential actions that they announce them to the world. The best recent example is Frances Haugen’s revelations about actions and failures to act by Facebook that have caused great harm. The US government has protections forbidding retaliation against whistleblowers. This is not true of companies; whistleblowing requires great courage.

Summing Up 

There is much we can do. We can assume responsibility so that our friends and neighbors understand enough tech to exercise their rights as citizens with respect to its use. We can ensure that the education of computer scientists and the governance of CS reflect our beliefs. We can pursue careers mindful of our values and speak up when a company’s actions violate them. We can work with politicians and activists to advance the public good.

This is a clarion call to explain CS clearly, to take ownership of the responsibilities of CS, to overcome lethargy and defeatism, to think hard about our values and about tech safety, and to step forward and act. 

This blog is drawn from my March CACM Ethics column and my 2020 ACM SIGCHI Social Impact Award speech. The ideas are expanded in my book, Digital Dreams Have Become Nightmares: What We Must Do.

FOR THINKING, WRITING AND DISCUSSING 

What do you plan to do to make the digital world a safer place? 
