We think of AI as brand-new technology, but is it? If you dig into its history, you’ll find that artificial intelligence has existed for more than 60 years. In fact, the term artificial intelligence was coined in 1956 by computer scientist and professor of mathematics John McCarthy at Dartmouth College.
The concept of objects coming to life and exhibiting human characteristics and abilities can be traced back even further. Leonardo da Vinci wrote extensively about automatons, machines that imitate human movement, and even designed a mechanical knight. Ancient Greek mythology tells of Talos, a giant bronze automaton that protected the island of Crete from invaders and pirates.
If the history of AI dates back so far, why has it only now become the disruptive force everyone can’t stop talking about? The answer lies in the convergence of three forces that sparked the AI revolution and made it possible to use AI techniques to build real-life applications.
1. Computing Power
If you can think back to the early days of the Internet, you probably remember when there was only dial-up. Today, it is amazing to think how we ever managed with such painfully slow connections. Now, imagine trying to run an AI app on such antiquated technology. Even a simple search or casual browsing session could take hours. These days, we get impatient and abandon a web page that doesn’t load within three seconds.
To build high-functioning AI systems, you must have the right hardware and infrastructure. Early computers were physically enormous; since then they have steadily shrunk in size and price while growing in computing power, and that growth is key to AI’s progress. It is only in the past few decades that processing power has evolved far enough to support AI systems. Faster computers can process more data and, as a result, perform more complex computations.
Computing power is playing a critical role in AI’s continued advancement. In the tech world, companies are racing to develop computing architectures that can keep pace with the accelerating development of AI software. Google has rolled out the TPU (Tensor Processing Unit), a chip designed specifically for machine learning that the company claims is 15 times faster than a GPU (graphics processing unit).
IBM is working to develop a quantum computing system to power its Watson supercomputer. There is also cloud storage and computing power to consider: tapping into the cloud allows smart devices and apps to communicate and learn from one another while storing vast amounts of data.
2. Digital Data Boom
We are generating more data today than ever before. A single modern car has 100 or more sensors monitoring functions such as fuel level and tire pressure. Every day we create a massive amount of data, and that volume is expected to grow exponentially over the coming decade.
To put the rapid growth of data into sharper focus: back in 2013, IBM reported that 90% of the data in the world had been created in the previous two years alone. In Data Age 2025, market-intelligence firm International Data Corporation (IDC) forecasts that global data will grow to 163 zettabytes (a zettabyte is a trillion gigabytes) by 2025, 10 times the 16.1 zettabytes generated in 2016.
The bottom line: Data is increasing at an exponential rate.
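To make that concrete, here is a quick back-of-the-envelope calculation, a sketch based only on the two IDC figures above and assuming steady year-over-year growth, of the annual growth rate those numbers imply:

```python
# Back-of-the-envelope check of the IDC figures cited above: growing from
# 16.1 zettabytes in 2016 to 163 zettabytes in 2025 implies a compound
# annual growth rate of roughly 29%.
data_2016_zb = 16.1   # zettabytes generated in 2016 (IDC)
data_2025_zb = 163.0  # zettabytes forecast for 2025 (IDC)
years = 2025 - 2016

cagr = (data_2025_zb / data_2016_zb) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # about 29.3%
```

In other words, the forecast assumes the world’s data volume growing by nearly a third every single year.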
That’s big data. And with new applications and processes being created every day, we’ll have an increasing number of sensors, systems, and devices transmitting even more data. Much of this big data is unstructured, such as photographs, video, and free-form text, which makes it difficult to organize and analyze. Often, human expertise is required to clean and prepare unstructured datasets for machine learning.
3. Better Algorithms
This explosion of data has made it possible to refine algorithms and to build the more extensive datasets that algorithms consume for machine learning. Using these datasets as past experience, algorithms teach machines what to do.
Before, basic algorithms simply told computers what to do, step by step. Now, algorithms have become sophisticated enough to power machine learning, allowing computers to learn on their own. Until recently, there simply wasn’t enough data available to train a machine, let alone to build algorithms that let machines train themselves.
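As a minimal sketch of that shift (the spam-filter scenario and the tiny dataset are invented for illustration, using the open-source scikit-learn library), compare a hand-written rule with a model that infers its own rule from labeled examples:

```python
from sklearn.tree import DecisionTreeClassifier

# The old approach: a hand-written rule tells the computer exactly what to do.
def rule_based_spam_filter(num_links: int, has_attachment: bool) -> bool:
    return num_links > 3 and has_attachment

# The machine learning approach: the algorithm infers its own rule from
# labeled past examples. Each row is [number of links, has attachment].
X = [[0, 0], [1, 0], [4, 1], [6, 1], [2, 0], [5, 1]]
y = [0, 0, 1, 1, 0, 1]  # 0 = not spam, 1 = spam

model = DecisionTreeClassifier().fit(X, y)
print(model.predict([[7, 1]]))  # classify a new, unseen email
```

The more labeled examples the model sees, the better the rule it learns, which is exactly why the data boom matters.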
Autonomous vehicles, for example, rely on enriched visual data to navigate the real world. Each frame of video the vehicle’s cameras collect must be enriched with labels that identify the objects it contains, such as road signs.
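What that enrichment looks like varies by project, but as a hypothetical illustration (the field names and values below are made up, not a standard format), a single annotated frame might be represented like this:

```python
# Hypothetical annotation for one video frame: each object a human labeler
# identified is recorded with a class name and a bounding box
# (x, y, width, height, in pixels).
frame_annotation = {
    "frame_id": 1042,
    "objects": [
        {"label": "road_sign",  "bbox": [312, 88, 46, 46]},
        {"label": "pedestrian", "bbox": [120, 140, 38, 95]},
    ],
}

for obj in frame_annotation["objects"]:
    print(obj["label"], obj["bbox"])
```

Thousands of frames labeled this way become the training data a vehicle’s perception model learns from.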
“There have been some significant improvements in the algorithms. Some of them were first developed literally 30 years ago, but they’ve now been tweaked and improved, and by having faster computers and more data you can learn more rapidly what works and what doesn’t work,” said AI expert and MIT Sloan professor Erik Brynjolfsson.
In the future, the progress of AI will rely on advances in these three areas, and on people, who will remain the driving force behind the vision and application of AI. At CloudFactory, we work with innovative companies that are seeking to grow and need a tech-savvy workforce that can label, annotate, and enrich unstructured data for AI systems. To learn more, download our white paper, Humans in the AI Tech Stack.