CHAPTER 2: IS AI REALLY UBIQUITOUS - DOES IT OR WILL IT PERMEATE EVERYTHING WE DO?
"Artificial intelligence is going to have a bigger impact on the world than some of the most ubiquitous innovations in history.
AI is one of the most important things humanity is working on. It is more profound than, I don't know, electricity or fire."
Sundar Pichai30
Today, we take AI in its many forms for granted. It is so embedded in our daily lives that we see it practically everywhere.
But is it really ubiquitous, and what does that mean, exactly?
Ubiquitous AI
First, let's look at the very definition of ubiquitous. According to Merriam-Webster's Dictionary, ubiquitous means "existing or being everywhere at the same time: constantly encountered; widespread."31
Following this definition, AI is perhaps not yet completely ubiquitous - or everywhere - but its use is certainly increasing rapidly. It has become so common that - even if we're unaware of its existence - it may be performing background tasks on the floor of a retail outlet, managing your Internet searches, making recommendations on Netflix, guiding surgeons in the operating room of a hospital, making your smartphone smart, and so much more.
One might look at AI today as one might have looked at electricity in the past. Once unknown, electricity has evolved into the commodity most of us now take for granted - standardized, affordable, and available everywhere, all the time. Kevin Kelly of Wired magazine describes AI as a kind of
"Cheap, reliable, industrial-grade digital smartness running behind everything, and almost invisible except when it blinks off. This common utility will serve you as much IQ as you want but no more than you need. Like all utilities, AI will be supremely boring, even as it transforms the Internet, the global economy, and civilization."32
What is allowing AI to become almost a "common utility"?
Twenty years ago, I had to have surgery on my right hand, and I decided to buy a program that would translate speech to text so I could continue working. It was great, except that I spent hours upon hours trying to train the program to recognize my specific speech patterns and type out the right words on the computer.
I'd say something such as, "In reference to your recent order ..." and it would come out like this: "Referring to your youngest daughter ..." Well, needless to say, it was somewhat frustrating.
But today, I can just say, "Alexa, play Adele," and within a few seconds, there comes "Rolling in the Deep" from my speakers. I can say, "Find my phone," and my Alexa will call the number, and I am able to track down my missing cell phone somewhere in the cupboard or refrigerator.
So, how did this big change come about in such a short space of time? Why are algorithms that were largely developed between the mid-1950s and the 1980s only now causing such a huge change in how we do business and relate to society?
Changes in technology
The answer is really quite simple: the technology has finally become powerful enough to enable the early promise of AI. Twenty years ago, computers were doing an "all right" job at processing, and supercomputers cost millions of dollars and took up massive amounts of space. Today, we have supercomputers in our smartphones.
A number of developments have enabled this massive change in processing capability, and thus the growth of AI.
First, chips today can run trillions of calculations per second. But more recently, chips have acquired other advanced abilities, aside from just raw power. Because many AI "math problems" have a similar structure, some chips are being optimized to carry out those calculations more quickly.
One example is the family of 3rd Generation Intel® Xeon® Scalable processors33 (code-named "Ice Lake"), which provides the foundation for Intel's data center platform, enabling organizations to capitalize on some of today's most significant transformations by leveraging the power of AI.
In addition, more memory is now being built into the processor itself, so there is less need to move data back and forth between the processor and memory chips, further speeding up the processes.
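Under the hood, most of those AI "math problems" boil down to multiplying large grids of numbers (matrices) - exactly the operation that AI-optimized chips and vectorized processor instructions are built to accelerate. The following is a minimal sketch in Python of that kind of workload; the array sizes and the single layer shown are purely illustrative:

    import numpy as np

    # One dense neural-network layer: a batch of 64 inputs with 1,024
    # features each, projected down to 256 outputs - roughly 16.8
    # million multiply-add operations in a single call.
    batch = np.random.rand(64, 1024)
    weights = np.random.rand(1024, 256)

    # Matrix multiply followed by a ReLU activation. Vectorized
    # hardware is designed to chew through exactly this pattern,
    # repeated millions of times during training and inference.
    activations = np.maximum(batch @ weights, 0.0)
    print(activations.shape)  # (64, 256)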
And not only is the hardware faster, sometimes augmented by specialized arrays of processors (e.g. GPUs), it is also accessible through cloud services. What once had to be run in specialized labs on supercomputers can now be set up in the cloud at far lower cost and with less effort. Cloud processing has made the hardware platforms needed to run AI much more widely available, enabling a proliferation of new efforts in AI development. And open-source technologies, such as Hadoop, allow speedier development of scaled AI technologies applied to large and distributed data sets.
Simply defined, Hadoop is an open-source software framework for storing data and running applications on clusters of commodity hardware. It provides massive storage for any kind of data, enormous processing power, and the ability to handle virtually limitless concurrent tasks or jobs.
Additionally, since Hadoop is open source, it is less costly than proprietary alternatives. It also has significant fault tolerance, so processing is protected against hardware failure: if one node fails, jobs are automatically redirected to other nodes so the distributed computation can continue. Multiple copies of all data are stored automatically.
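To make this concrete, here is a minimal sketch of the classic "word count" job written for Hadoop Streaming, which lets mappers and reducers be plain scripts that read standard input and write standard output. The file names are illustrative, and this is only one of several ways to write Hadoop jobs:

    # mapper.py - emits "word<TAB>1" for every word it sees.
    # Hadoop runs many copies of this script in parallel,
    # one for each split of the input data.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(word + "\t1")

    # reducer.py - sums the counts for each word.
    # Hadoop sorts mapper output by key before it reaches the
    # reducer, so identical words always arrive together.
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current_word:
            if current_word is not None:
                print(current_word + "\t" + str(count))
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(current_word + "\t" + str(count))

Because the work is split across many commodity machines, and because Hadoop simply re-runs a failed node's portion elsewhere, the same two small scripts can scale from megabytes to petabytes of input.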
Massive increase in data
The term "big data" has now been around for a while. But what is big data? Simply put, the term "big data" refers to the collection of massive amounts of data from a multiplicity of sources, and our ability to organize, analyze, and use it to our advantage.
The amount of data being generated is increasing at an exponential rate. On average, every two days or less, we now create as much data as was created from the dawn of civilization up until the early 2000s.34
Massive data-collection capability, combined with less expensive storage (often in the cloud) and the awareness among all kinds of industry players that collecting every little bit of data might someday come in handy, has created high demand for solutions that go beyond simple statistical analysis and promise new insights and intelligence through enhanced pattern recognition and analytical capability. AI is to big data what the automobile was to the horse and cart.
Faster communication speeds
The use of fiber-optic cables and 3G, 4G, and now 5G wireless networks has been critical in permitting very large quantities of data to move rapidly back and forth. In fact, none of the video streaming services we now enjoy would be possible without these rapid data-movement capabilities.
Fiber-optic transmission is so fast because it carries information as pulses of laser-generated light, which can move far more data than the electrical signals used in traditional copper wiring.
Deployment of 5G-enabled wireless is ever-increasing. AI and 5G are synergistic, working together in a hyperconnected world in which virtually everyone and everything are connected.
Improved algorithms
Simply defined, an algorithm is a set of instructions that are executed when triggered by a command or an event. At their foundation, algorithms are essentially mathematical instructions.
To make computers function, programmers created algorithms to tell the computer step-by-step how to do what the programmer wanted it to do. A program could be 20 or 20 million lines of very specific code. Once completed and triggered, the computer executes the program, following each step mechanically, to accomplish the end goal.
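As a purely illustrative sketch, here is what such a traditional algorithm looks like in Python - every step is spelled out in advance, and the computer simply follows them:

    # A classic hand-written algorithm: find the largest number
    # in a list. Every step is pre-determined by the programmer.
    def largest(numbers):
        biggest = numbers[0]          # start with the first value
        for n in numbers[1:]:         # examine each remaining value
            if n > biggest:           # found a bigger one?
                biggest = n           # remember it
        return biggest

    print(largest([3, 41, 7, 19]))    # prints 41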
Rather than having a computer follow a specific set of pre-determined instructions, algorithms for AI are designed to allow computers to learn on their own - this is the essence of ML. That is the key difference between traditional computer programming and the algorithms that allow AI to learn to perform a task.
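By contrast, an ML algorithm is never given the rule; it infers the rule from examples. Here is a minimal sketch using the scikit-learn library (assuming it is installed; the temperature-conversion task is purely illustrative):

    from sklearn.linear_model import LinearRegression

    # Instead of programming the rule F = C * 9/5 + 32, we show
    # the model examples and let it learn the relationship itself.
    celsius = [[0], [10], [20], [30], [40]]   # inputs
    fahrenheit = [32, 50, 68, 86, 104]        # known answers

    model = LinearRegression()
    model.fit(celsius, fahrenheit)            # the "learning" step

    # Predict a value the model was never explicitly given.
    print(model.predict([[25]]))              # approximately 77

Note that the program contains no conversion formula anywhere; the model derived it entirely from the five examples.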
What makes people refer to AI as ubiquitous?
Now that we've taken a short look at how improved technologies have allowed AI to take a quantum leap forward in just a couple of decades, let's go back to the original question. Is AI really ubiquitous?
Well, there is virtually no major industry that hasn't been affected to some degree by modern AI - more specifically, "narrow AI," which performs objective functions using data-trained models and often falls into the categories of DL or ML. This statement is especially true in the last few years, as data collection and analysis have ramped up considerably thanks to a robust Internet of Things (IoT), improved connectivity, the proliferation of connected devices, and ever-speedier computer processing.
Figure 2-1: AI is integrated into many areas
To further answer this question, we need to look at some, if not all, of the areas where the use of AI is proliferating. Looking at even this rather limited list, it quickly becomes clear that AI is indeed highly integrated into much of the world around us - even if there remain areas where AI has not yet made its presence felt. Note that the list below is in no particular order with regard to the level or degree of AI implementation, nor do the descriptions below claim to fully and deeply address the degree of AI implementation...