Ben Shneiderman is an Emeritus Distinguished University Professor in the Department of Computer Science at the University of Maryland, and a much-honoured pioneer in the field of Human-Computer Interaction. His recent book, Human-Centered AI, is a valuable contribution to the literature discussing challenges for the appropriate use of artificial intelligence and proposing approaches and steps to achieve a safer and more humane future incorporating the likely increased use of AI.
Although there is much that I could discuss, I shall focus primarily on Part 3, Design Metaphors, and Part 4, Governance Structures.
Digital technologies for learning, health, politics, and commerce have enriched the world. Digital heroes like Sir Tim Berners-Lee, Batya Friedman, Alan Kay, JCR Licklider, and Joe Weizenbaum have blazed trails. Yet there is trouble. We depend upon software that nobody totally understands. We are vulnerable to cyberterrorism. Privacy is overrun by surveillance capitalism. Totalitarian control advances. Daily internet news matching our beliefs makes it hard to tell true from false. People are addicted to devices. Jobs disappear without social safety nets. Digital leviathans threaten to control all commerce. Our values are threatened.
There are risks of premature use of AI in domains such as health, criminal justice, senior care, and warfare. Much current AI is unreliable, without common sense, deceptive in hiding that it’s an algorithm, unable to explain decisions, unjust, neither accountable nor responsible, and untrustworthy.
The number of seniors is growing rapidly worldwide. The population of adults aged 60 years and over will grow from 901 million in 2015 to 1.4 billion in 2030 and 2.1 billion in 2050. The number of ‘oldest old’ — those aged 80 years and older — will grow from 125 million in 2015 to 434 million in 2050. Declining birth rates reduce the Caregiver Support Ratio, the ratio of available caregivers to those who need care, hence adequate care for older adults is often lacking.
In the USA, the number of potential family caregivers aged 45 to 64 divided by the number of oldest old is projected to decline from more than seven in 2010 to less than three by 2050. It is hard to find and train good paid caregivers — many are ‘imported’ from other countries such as the Philippines. Hence there are too few people to care for growing numbers of seniors. Many caregivers are also illegal immigrants; U.S. immigration policies have made the situation worse. The problem is more dire in some other countries. Japan’s population aged 65 and over is projected to grow from a current level of 25% to 40% by 2055. The country will need to add one million senior care workers and nurses by then.
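The Caregiver Support Ratio defined above is a simple division, but the trend it captures is stark. A minimal sketch, using illustrative placeholder population figures (chosen only to reproduce the “more than seven in 2010, less than three by 2050” trajectory quoted in the text, not official projections):

```python
def caregiver_support_ratio(caregivers_millions: float,
                            oldest_old_millions: float) -> float:
    """Potential caregivers (aged 45-64) per 'oldest old' person (80+)."""
    return caregivers_millions / oldest_old_millions

# Illustrative US-style figures, in millions; hypothetical, not census data.
ratio_2010 = caregiver_support_ratio(63.0, 8.5)    # roughly 7.4
ratio_2050 = caregiver_support_ratio(80.0, 30.0)   # roughly 2.7

print(f"2010-like ratio: {ratio_2010:.1f}")
print(f"2050-like ratio: {ratio_2050:.1f}")
```

Even though the caregiver pool grows in absolute terms in this sketch, the oldest-old population grows far faster, which is why the ratio collapses.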
In a recent blog, I discussed digital technology’s contribution to the environmental apocalypse, with massive amounts of energy being used in three ways: (1) to manufacture digital technologies; (2) to operate them; and (3) to dispose of and replace them with newer versions.
Electronic waste (e-waste) occurs when repair of electrical and electronic equipment (EEE) is impossible or undesirable and where devices are discarded thoughtlessly. A 2020 estimate of the amount of e-waste produced in the world was 54 million metric tons, which amounts to 7.3 kg for every person in the world. Who would have predicted that the figure would be so high? The amount is doubling every 16 years. Asia generates the greatest quantity, followed by the Americas and Europe, which also produces the most per person.
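A quick back-of-the-envelope check of these figures. The world-population value is an assumption added for the calculation (roughly 7.4 billion makes the quoted numbers consistent); everything else comes from the text:

```python
# Check: 54 million metric tons spread over the world population,
# and the annual growth rate implied by "doubling every 16 years".

total_ewaste_kg = 54e6 * 1000        # 54 million metric tons, in kg
world_population = 7.4e9             # assumed population; makes 7.3 kg/person work out

per_person_kg = total_ewaste_kg / world_population
print(f"e-waste per person: {per_person_kg:.1f} kg")      # about 7.3 kg

# Doubling every 16 years implies a compound annual growth rate of
# 2**(1/16) - 1, i.e. about 4.4% per year.
annual_growth = 2 ** (1 / 16) - 1
print(f"implied annual growth: {annual_growth:.1%}")
```

At that compound rate, the 54 Mt figure would reach roughly 108 Mt by 2036.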
AI, and in particular machine learning, has made great progress in the last decade. Yet I am deeply concerned about the hype associated with AI, and the risks to society stemming from premature use of the software. We are particularly vulnerable in domains such as medical diagnosis, criminal justice, senior care, driving, and warfare. Here AI applications have begun or are imminent. Yet much current AI is unreliable and inconsistent, without common sense; deceptive in hiding that it is an algorithm and not a person; mute and unable to explain decisions and actions; unfair and unjust; free from accountability and responsibility; and used but not trusted.
Recent increases in hurricanes, flooding, heat waves, fires, and drought are signs that the world is coming closer to irreversible damage. For example, scientists recently predicted that an Antarctic ice shelf holding up the huge Thwaites Glacier could collapse within 3 to 10 years, leading to the glacier sliding into the ocean and raising sea levels worldwide by more than 2 feet.
What is digital technology’s contribution to the environmental apocalypse? Energy is used in three ways: (1) to manufacture digital technologies; (2) to operate them; and (3) to dispose of and replace them with newer versions.
In a blog posted two days ago, I highlighted phrases and sentences from Mark Zuckerberg’s recent keynote speech sketching his vision of Meta’s intended metaverse. Here are thoughts triggered by his words:
1. “You’re going to be able to do almost anything you can imagine … This isn’t about spending more time on screens … [include] communities whose perspectives have often been overlooked … consider everyone …”
No, Mark, be honest. This is about getting more people into Meta, and about getting them to spend more time in the metaverse, because that’s the only way you can sustain the growth your shareholders expect, and the only way you can withstand the onslaught of firms like TikTok that now have greater appeal to the next generation of users.
In a recent blog, I suggested that we have finally lost patience with Facebook after new revelations by whistleblower Frances Haugen and the Wall Street Journal. Leaked documents show that FB knows that almost six million VIPs are given special dispensation to violate its content standards; criminals use FB to recruit women, incite violence against ethnic minorities, and support government action against political dissent; Instagram is toxic to many young girls, contributing to poor self-image, mental health, and suicidal thoughts; the firm relaxed its safeguards too soon after the U.S. election, contributing to the January 6 riot; and FB is incapable of suppressing election and vaccine misinformation.
In 2004, Mark Zuckerberg built an app to connect Harvard undergrads to one another. By 2006, it was available to anyone over the age of 13. Soon thereafter, his Facebook (FB) social media firm was animated by the concept that connectivity was a human right for the world’s billions. FB is now visited by almost 3 billion distinct users each month. The firm has become a monopoly, counting Instagram and WhatsApp among its divisions. (Further details appear in Chapters 11 and 17 of Digital Dreams Have Become Nightmares: What We Must Do.)
FB’s dominance has led to serious problems which are well known. Its news feed widely shares toxic material — misinformation, hate speech, and fake news. People post private information which FB exploits commercially through surveillance capitalism. Fake social media participants constructed by Russia in the 2016 US presidential election and other elections have skewed the results. Children’s addiction to social media harms their sense of self-worth and their physical and mental health and well-being.
Contact tracing is a process for identifying, informing, and monitoring people who might have come into contact with a person who has been diagnosed with an infectious disease such as COVID-19. It starts with a positive test. Public health officials then need to know who that person might have inadvertently infected. This requires tracking down anyone that person had contacted (was “close enough” for “long enough”) recently (14 days in the case of COVID). They can then be informed that they might have been infected and take measures to quarantine and monitor for symptoms. For example, restaurants initiate tracing by recording the name and phone number of one person in each party taking a table in the restaurant.
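The tracing process described above can be sketched in a few lines of code. This is a minimal illustration only: the names, the restaurant-log format, and the crude “same venue within two hours” overlap test are all hypothetical simplifications, and real contact-tracing systems are far more involved.

```python
from datetime import datetime, timedelta

# Restaurant log: (party contact, venue, visit time) — one contact per party,
# as in the example in the text. Entries are hypothetical.
visit_log = [
    ("alice", "Cafe A", datetime(2021, 3, 1, 19, 0)),
    ("bob",   "Cafe A", datetime(2021, 3, 1, 19, 30)),
    ("carol", "Cafe B", datetime(2021, 3, 2, 12, 0)),
    ("dave",  "Cafe A", datetime(2021, 3, 20, 18, 0)),
]

def possible_contacts(case, test_date, window_days=14):
    """People who visited the same venue as `case` at roughly the same time,
    within `window_days` before the positive test (14 days for COVID)."""
    window_start = test_date - timedelta(days=window_days)
    case_visits = [(venue, t) for who, venue, t in visit_log
                   if who == case and window_start <= t <= test_date]
    contacts = set()
    for venue, t in case_visits:
        for who, v, t2 in visit_log:
            # crude proxy for "close enough for long enough":
            # same venue, within two hours of each other
            if who != case and v == venue and abs(t2 - t) <= timedelta(hours=2):
                contacts.add(who)
    return contacts

print(possible_contacts("alice", datetime(2021, 3, 10)))  # {'bob'}
```

Bob overlapped with Alice at Cafe A inside the 14-day window, so he would be informed and asked to quarantine and monitor for symptoms; Dave visited the same venue but outside the window, and Carol was never at the same venue.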