Artificial Intelligence 101
January 22, 2024
YOMI MAKANJUOLA, PhD
Yomi Makanjuola, who has joined Business a.m.’s board of elite contributors from the United Kingdom, earned a doctorate in Materials Engineering & Design and worked primarily at Accenture as an Associate Partner in Nigeria. Currently, he is a private management consultant and author in the UK. His most recent book is Nigeria Like A Rolling Stone, now available at https://amzn.eu/d/8SRPZ0n
Normally, “artificial” connotes either a synthetic object or insincere behaviour. For instance, the former evokes the perceptible difference between cotton and polyester yarn, while the latter describes an inauthentic or phoney character. In today’s digital world, human cognition faces the recurring challenge of separating truth from misinformation. As such, when most people hear the contemporary sound bite “artificial intelligence” (AI), they are justifiably baffled.
True, one does not have to understand how electrons interact with a light bulb to intuit that throwing a switch produces illumination. Likewise, among the hordes who own smartphones, the minutiae of digital technology are incomprehensible to most. If experts are to be believed, AI represents frontier, game-changing innovation that will radically alter how we think, live, work and play. Technically, AI will continue to mature while producing, at best, what may be construed as artificial emotional intelligence.
Spanning more than 200 years, the Industrial Revolution is currently in its fourth iteration. Steam-powered machinery gave way to the electrified factories and assembly-line production that epitomised the Second Industrial Revolution. Mass production resulted in significantly faster work processes at lower cost. By the second half of the last century, advances in computing had produced analogue machines, which were succeeded by the invention of semiconductors and solid-state electronics. The Third Industrial Revolution enhanced automation and raised productivity, following the introduction of industrial robots. It also heralded the personal computer revolution and global networks, driven by innovative information and communications technologies.
Until the late 1970s, computers were mainly categorised as minicomputers and mainframes. These were large general-purpose machines produced by giants like IBM and Digital Equipment Corporation, and were typically deployed by the military and large corporations. Thereafter, two start-up companies, Apple and Microsoft, famously pioneered mass-market computing and programming.
At the margins, artificial intelligence had a decades-long but chequered history, hampered by limited computing power and data volumes. Subsequently, the era of the Internet and smartphones generated ever-expanding datasets, far beyond the data warehouses built by multinational companies. Therefore, from the 2000s onwards, rekindled investor interest altered the landscape. So, in intelligible terms, what is AI and how does it actually work?
Although there is no consensus about what constitutes the Fourth Industrial Revolution, one of its key planks is AI, which similarly does not have a standard definition. Nonetheless, the term “artificial intelligence” was coined by a US researcher, John McCarthy, in 1956. AI could be described as an array of technologies working together to simulate human intelligence. In other words, AI enables machines to perform human-like tasks by learning from experience and adjusting to new inputs.
By leveraging advanced algorithms (computational procedures and rules for problem-solving), huge computing and storage capacity, and large data volumes, AI systems statistically analyse data for correlations and patterns. Through iterative learning and logical reasoning, AI can generate near-realistic images, text, audio, and other media. Associated technologies include machine learning, speech recognition, natural language processing, machine vision, and robotics. In this AI milieu, which is roiling business models, data is supreme. Beyond the well-advertised challenge of self-driving cars, there is increasing demand for AI skills in health care, financial services, manufacturing, retail, software development, and legal services, among others.
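For readers curious what "statistically analysing data for correlations and patterns" looks like in practice, the toy sketch below fits a straight line to a handful of invented data points by ordinary least squares, the simplest form of machine learning. The scenario (advertising spend versus sales) and the `fit_line` helper are illustrative assumptions, not anything from a real AI system:

```python
def fit_line(xs, ys):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance of (x, y) divided by variance of x.
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical "experience" the program learns from:
spend = [1.0, 2.0, 3.0, 4.0, 5.0]   # advertising spend
sales = [2.1, 3.9, 6.2, 8.1, 9.8]   # observed sales

a, b = fit_line(spend, sales)

# The learned pattern can now "adjust to new inputs":
prediction = a * 6.0 + b  # estimated sales at a spend of 6.0
```

The point is not the arithmetic but the principle: nobody wrote a rule saying how spend relates to sales; the program inferred the pattern from examples, which is the kernel of what far larger AI systems do with vastly more data and parameters.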
In late 2022, artificial intelligence slipped into public consciousness with the release of a generative AI natural language tool called ChatGPT. Using digital neural networks and a large language model, ChatGPT interacts with the user in human-like fashion to answer questions, and compose high-quality essays, electronic messages, and even computer code. Faster processing capabilities, expedited by unimaginably powerful quantum computers, may someday enable AI to find a cure for cancer and tackle similar human quandaries.
By most accounts, AI will be transformational and is poised to make life more abundant – definitely faster and cheaper. However, it is not guaranteed to make everything better because, unchecked, it could pose an existential threat to human civilisation, on a par with weapons of mass destruction. Complacency aside, there are more immediate concerns, not least the conjectured bias of search engines.
On ethical grounds, a lack of governance and global regulation could enable bad actors and states to inject AI into military, space, and biological weapons. In the medium to long term, some futurists believe that autonomous machines could emerge that not only pass the Turing Test (proposed in 1950 by the British scientist Alan Turing, whereby a machine's responses are indistinguishable from a human's), but also exhibit hyper-capable superintelligence. So far, however, the replication of human agency or consciousness remains outside the ambit of algorithms.
So how should a lagging country like Nigeria respond to this consequential turning of the technological wheel? If the impact of AI on productivity and economic growth proves as pervasive as electricity's footprint during the 20th century, then no country can afford to be left behind. However it plays out, the link between AI and job losses will transform the world of work. But, for now, much about AI is still a black box. Regardless, it may be worthwhile sneaking a peek inside the box to avoid being blindsided.
- business a.m. commits to publishing a diversity of views, opinions and comments. It, therefore, welcomes your reaction to this and any of our articles via email: comment@businessamlive.com