Opinion: Open the pod bay doors, HAL


Like most Americans of my generation, my first exposure to Artificial Intelligence (AI) was through Stanley Kubrick’s 1968 film, 2001: A Space Odyssey. The effects are dated, but the movie is still worth watching. Kubrick co-wrote the screenplay with British author and futurist Sir Arthur C. Clarke. The movie boasts an IMDb score of 8.3/10 and a 92% rating on Rotten Tomatoes. The most famous scene occurs in space when Dave, an astronaut, needs HAL, the spaceship’s onboard AI, to open the pod bay doors and let him back in. Here is the dialogue:
Dave: “Open the pod bay doors, HAL.”

HAL: “I’m sorry, Dave. I’m afraid I can’t do that.”

HAL has already killed the rest of the crew and knows Dave wants to shut him down, so HAL tries to strand Dave in space. HAL’s murderous intentions, delivered in an emotionless, deliberate monotone, became the embodiment of AI for my generation. This 1960s vision of AI was dystopian and something to be feared.

By the way, since my name is Dave and I work in AI, all of my electronic devices are named some variant of HAL. My computer is HAL Mac, my iPad is HAL PAD, and my Android phone is just plain HAL. I have an app on my phone to remotely open my garage door, and since users may have multiple garages (or garage doors), the app allows you to name your doors. Of course, I named mine ‘Pod Bay Doors.’ So, I really can ask HAL to “open the pod bay doors!”

Alan Turing, a British mathematician instrumental in breaking the German Enigma code during World War II, wrote a landmark theoretical paper on AI in 1950. In it, he posed the profound question, “Can machines think?”

Turing devised a method (originally called the imitation game) to determine whether a computer could truly think. The idea was that if a human, holding a conversation via text, could not tell the machine from another human, the machine could be said to think. Today, this method is known universally as the Turing Test. Of course, AI has advanced well beyond the Turing Test.

In the 1980s, when AI was in its infancy, I spent a summer as an intern at a NASA Lab at Purdue University in Indiana. Our lab worked on creating digital maps from NASA’s Landsat program. The interns at our lab were all engineering or computer science students, and we were all men. As part of our work, we exchanged computer messages with interns at NASA’s Johnson Space Center in Houston, Texas. It just so happened that most of the interns in Texas were women. Over the summer, in addition to official business, there was a fair amount of flirting over our message system. Toward the end of the summer, Louis, one of the interns from our lab, was planning on spending the weekend in Nashville. Louis had arranged to meet Julie, one of the interns from Houston, in the Music City.

When Louis came back from Nashville, he steadfastly refused to say anything about Julie or his trip. To this day, I wonder whether Julie even existed or if perhaps ‘she’ was an AI and this was a Turing Test!

In college, I took a senior seminar on AI, where we built some simple expert systems. An expert system is a form of AI designed to mimic a human expert in a particular field. Simply put, expert systems use a set of facts and rules to make decisions. For my senior project, I created rules and facts to simulate the gear-shifting behavior of a 10-speed bicycle. The expert system I built was able to use the rules and facts to predict real behavior. For example, I asked the system what would happen if the shifter cable broke. The system figured out that without the cable, there was nothing to prevent the rear derailleur spring from moving the chain onto the smallest rear cog. Expert systems are still being used today in applications like diagnostics and troubleshooting.
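Rule-based reasoning of this kind can be sketched in a few lines of code. The sketch below is a minimal forward-chaining engine with made-up facts and rules loosely inspired by the bicycle example; it is an illustration, not the original senior project:

```python
# A minimal forward-chaining rule engine. Each rule says: if every
# condition is already a known fact, conclude a new fact. The facts and
# rule names here are illustrative stand-ins.

facts = {"shifter_cable_broken"}

rules = [
    ({"shifter_cable_broken"}, "no_cable_tension"),
    ({"no_cable_tension"}, "derailleur_spring_unopposed"),
    ({"derailleur_spring_unopposed"}, "chain_on_smallest_rear_cog"),
]

# Keep applying rules until no new facts can be derived.
changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print("chain_on_smallest_rear_cog" in facts)  # → True
```

Real expert systems chain hundreds or thousands of such rules, but the principle is the same: conclusions follow mechanically from facts, which is why a system like this could “figure out” where the chain ends up when the cable breaks.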

A few years ago, I spent some time experimenting with machine learning. Machine learning involves computers ingesting large amounts of data and recognizing patterns; the process of reading this data is called training. One of the classic examples is text recognition, where an application is trained by viewing millions of alphanumeric characters. Once it has seen a million variants of the letter A, it should recognize an ‘A’ in a new font by its similarity to the examples it has seen. I remember the 1990s, when voice and handwriting recognition were new technologies. Back then, they were unreliable; today, we take them for granted.
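Recognizing a new character by its similarity to stored examples can be sketched as a toy nearest-neighbor classifier. The tiny 5×5 “bitmaps” below are invented for illustration and stand in for the millions of training examples a real system would use:

```python
# Toy character recognition by similarity: classify a new bitmap by
# finding the closest training example. Bitmaps are invented 5x5 grids
# written as strings of 0s and 1s.

training = {
    "A": [
        "01110 10001 11111 10001 10001",
        "00100 01010 11111 10001 10001",
    ],
    "B": [
        "11110 10001 11110 10001 11110",
    ],
}

def distance(a: str, b: str) -> int:
    # Count positions where the two bitmaps differ.
    return sum(x != y for x, y in zip(a, b))

def classify(bitmap: str) -> str:
    # Nearest neighbor: return the label of the closest training example.
    examples = [(label, ex) for label, exs in training.items() for ex in exs]
    return min(examples, key=lambda pair: distance(bitmap, pair[1]))[0]

# A slightly altered 'A' (a "new font") still lands closest to the
# stored 'A' examples.
new_a = "01110 10001 11111 10001 10011"
print(classify(new_a))  # → A
```

Modern systems learn far richer notions of similarity than pixel-by-pixel comparison, but the core idea, generalizing from seen examples to unseen ones, is the same.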

Of course, today’s buzz is over Large Language Models (LLMs) and Generative AI (GenAI). LLMs make use of machine learning to understand and generate text. ChatGPT is probably the best-known LLM in the marketplace today. Generative AI takes LLMs beyond text and into multiple media types. Today’s LLMs and GenAIs can easily pass a Turing Test. I use LLMs and GenAI every day, and even though I know they are AIs, not people, I sometimes want to type ‘please’ or ‘thank you’ when I make requests.

Several issues with LLMs and GenAI still need to be resolved. First, it is important to know that these AIs may hallucinate, confidently giving false information. Students are using AI to generate essays and claim the work as their own. In a ‘chicken and egg’ scenario, schools are using AI to determine whether students are using AI, and in some cases, students are falsely accused of using AI by another AI! AIs are good at writing and troubleshooting computer code, but pasting proprietary code into a public AI service may expose it to the outside world. Deepfakes (fake video or audio) are being used to sway public opinion and extort money.

Even so, AI is not going away. I have heard it said that, “You will not be replaced by AI, but you will be replaced by someone who understands AI.” I would encourage you to explore AI and learn its capabilities and limitations. I use AI in several forms every day. I use AI to write scripts to run on my computer. I use AI-enabled search to find digital photos. This column is not written by an AI, but as a writer, I use AI to make grammar and spelling corrections and as a second opinion to help me review content. You may be surprised to find yourself using AI as I do.

David Chung is a Gazette editorial fellow. david.chung@thegazette.com

Opinion content represents the viewpoint of the author or The Gazette editorial board. You can join the conversation by submitting a letter to the editor or guest column or by suggesting a topic for an editorial to editorial@thegazette.com
