Bad AI: The Dark Side of the Latest Tech Craze
By William Lama, Ph.D
Artificial Intelligence is Amazing
My last Palos Verdes Pulse article touted the amazingness of Artificial Intelligence and its conversational chatbots, such as ChatGPT. The Brave New World of Artificial Intelligence By William Lama, Ph.D — Palos Verdes Pulse
Artificial intelligence powers many tech applications, from medical diagnostics to driverless cars to fraud prevention. With virtual healthcare, patients can now message their physicians through an online portal. John Ayers, MD, decided to test AI on medical questions. He randomly selected exchanges posted on Reddit’s /r/AskDocs, where vetted doctors respond to publicly posed medical questions, and then posed the same queries to ChatGPT. Healthcare professionals evaluated both the physicians’ and the AI’s answers for quality and empathy on a 5-point scale. ChatGPT’s average quality rating was 4.1 versus 3.3 for the physicians. More surprising, the AI answers were rated ten times more “empathetic” than the doctors’ responses. ChatGPT May Have a Big Role in Healthcare | RealClearScience
Another amazing medical advance is the Neuralink brain implant for people who are paralyzed or suffering from neurological diseases. Founder Elon Musk calls Neuralink a "symbiosis with artificial intelligence." Elon Musk Scores a Big Win - TheStreet
So, is AI technology all positive? Are there no negatives? I decided to ask ChatGPT:
One of the biggest concerns is that AI could be used to automate harmful or unethical activities, such as cyberattacks, propaganda campaigns, and autonomous weapons. AI-powered algorithms may also perpetuate and amplify biases and discrimination present in the data they are trained on, leading to unfair and unjust outcomes.
Moreover, there is a risk that AI could be used to perpetuate authoritarianism and erode civil liberties. For example, AI could be used to monitor and control citizens' behavior, suppress dissenting opinions, or target specific groups of people for surveillance or persecution.
The Dark Side of AI
Any new technology is ultimately judged on its usefulness and its uses, which can be good or bad. The internal combustion engine got good grades for its use in horseless carriages that relieved the cities of horse excrement. Now it is being criticized for its carbon emissions.
The applications of AI will determine its acceptance. But first AI needs to be created. Here I will look at the negative effects of AI development.
When I wrote about the cryptocurrency craze, I noted some of its darker consequences, including the energy used in mining, the CO2 emissions, and the electronic waste.
The Dark Side of Cryptocurrency by William Lama, Ph.D. — Palos Verdes Pulse
Associated with the creation of AI are many “negative externalities” that I will focus on here.
The Physical Cost of AI
In her book Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, Kate Crawford reveals the dark side of AI. In her view, “artificial intelligence is both embodied and material, made from natural resources, fuel, human labor, infrastructures, logistics, histories, and classifications.” Amazon.com : Kate Crawford Atlas of AI
Crawford takes us on a journey to the largest U.S. lithium mine, located in Silver Peak, Nevada. At the mine, evaporation ponds separate lithium from brine pumped from underground; about 500,000 gallons of water are consumed to produce one ton of lithium carbonate. Developer Lithium Americas has said it plans to produce 88,000 tons of battery-quality lithium carbonate annually, which would consume 44 billion gallons of water per year. 'Silicon Valley of lithium': Nevada mine breaks ground - E&E News
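The 44-billion-gallon figure follows directly from the two numbers above. A quick back-of-the-envelope check (using the article's figures, not independent data):

```python
# Water consumed by planned lithium carbonate production, per the figures above.
WATER_GALLONS_PER_TON = 500_000   # gallons of water per ton of lithium carbonate
PLANNED_TONS_PER_YEAR = 88_000    # planned annual output, tons

annual_water = WATER_GALLONS_PER_TON * PLANNED_TONS_PER_YEAR
print(f"{annual_water / 1e9:.0f} billion gallons per year")  # prints "44 billion gallons per year"
```

So the yearly water cost really is the simple product of the per-ton cost and the planned output.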
We think of AI as something ethereal, up there in the “cloud.” Crawford reveals that metaphor as a myth promoted by the tech industries. The cloud refers to servers that are accessed over the Internet, and the software and databases that run on those servers. The cloud lives in huge data centers that consume enormous amounts of electricity and require Li-ion battery backup.
The servers (cloud computers) and hard drives also require rare-earth minerals such as dysprosium, neodymium, and yttrium. China supplies 95% of the world’s rare-earth minerals, and the mines are an ugly mess. Baotou, in China’s Inner Mongolia region, offers a grim view of the resulting rare-earth waste.
Over 99% of the earth removed in the mines is discarded as waste, creating pollutants such as ammonia. The minerals need to be refined in a process that produces 20,000 gallons of acidic water and a ton of radioactive waste for every ton of refined rare earth mineral.
The Human Cost of AI – Ghost Work
Faking AI is an exhausting job! The digital personal-assistant company x.ai* claimed that its chatty gal, Amy, could handle many mundane tasks. In the real world, human workers at x.ai put in long hours annotating emails to sustain the illusion that the service was automated and functioning 24/7.
* Not to be confused with Elon Musk’s new company, X.AI.
The Humans Hiding Behind the Chatbots
ImageNet is a database containing over a million images used for AI visual-recognition training. Taken from the web, these pictures need to be categorized and labelled. Mechanical Turk is the Amazon system for organizing thousands of human workers (known as “turkers”) to do small tasks, and it was the perfect way to assemble the image database. ImageNet employed thousands of turkers to label and sort the images, paying a tiny amount for each one.
For Web Images, Creating New Technology to Seek and Find - The New York Times
These examples of the dark side of AI do not touch on the darkest applications, which include cyberattacks, propaganda, smart weapons, discrimination, population control, surveillance, and persecution. Those last items resemble China’s hideous social credit system. China Social Credit System, Punishments, Rewards (businessinsider.com)
Every year the Awful AI site gives out an award for the most unethical use of AI. GitHub - daviddao/awful-ai: 😈Awful AI is a curated list to track current scary usages of AI - hoping to raise awareness
Congratulations to 2022 winner Lensa, which profited from AI models trained on artists’ work without their consent or compensation.
Net-Net
When evaluating any new technology, one must ask about its costs (some AI costs are described above) and its applications. What is it used for?
Nuclear fission can be used to generate enough electricity to power the AI industry and much more besides. Or it can be used to make bombs. Elon Musk even suggested we Nuke Mars! What was he thinking?
Blockchain technology can be used for smart contracts and cross-border securities transactions. Then Sam Bankman-Fried used it to defraud clients of $8 billion. Now the U.S. government is introducing a Central Bank Digital Currency. What could go wrong?
AI itself admits that it could be used to perpetuate authoritarianism and erode civil liberties.
And pity the poor English teacher who must decide whether take-home essays were written by Charlie or by ChatGPT.
Good people must stay in control of AI.
William Lama has a PhD in theoretical physics from the University of Rochester. He was a college physics professor and a scientist at Xerox Research Labs. He spent his last decade at Xerox managing software and electronics R&D. After retiring he served as Palos Verdes Library trustee for eight years, three as president of the Board. He may be reached at wlama2605@gmail.com.