1. The Artificial Intelligence 2.0 Revolution
Once upon a time, computers were as big as rooms. They were capable of complex mathematical calculations. They were not, though, meant to be operated by people like you and me. Nor were they designed for creating documents and presentations, playing games, or surfing the web. Early computers were powered by vacuum tubes, just like most other sophisticated electronic devices of the time, and were used in scientific research. Then the semiconductor revolution happened, and the transistor was born.
Note
A vacuum tube is an electronic device that was a common component in old radio and television sets, amplifiers, and even computers. The tube is a glass enclosure that houses an anode and a cathode inside a vacuum (no air or gas). It's based on the principle that electric current can move through a vacuum and does not need a solid material for the purpose. The first vacuum tube was a diode that, unlike the semiconductor diodes of today, was large and fragile.
Transistors gave birth to microprocessors, and microprocessors eventually brought computers into our homes and allowed them to do much more than just record scientific data, crunch numbers, or break codes. The first IBM personal computer was powered by an Intel 8088 chip, which ran at a blazing speed of 4.77 MHz. Processors soon went through a revolution of their own, one dictated by the famous Moore's Law: the processing power of computers roughly doubled every 18 months, allowing them to do tasks that could not be done efficiently on previous-generation processors.
What is the common pattern here? Each of these revolutions changed not just performance or software development methods but computing as a whole, in ways unimaginable before.
There have been other such historic revolutions, some parallel and some subsequent, that have changed computing forever. Take a recent phenomenon, for instance: the Cloud revolution. Back in 2010, when cloud was just a buzzword in newspapers and magazines, there was widespread confusion about its true meaning. Everyone talked about the disruptive potential and lasting benefits of Cloud, but only pockets of technology-savvy people actually understood it. A few years later, everyone had adopted it. Cloud has affected not only businesses, by offering entirely new business models to run their companies on, but also our personal lives. Today, we cannot begin to imagine a world without Cloud: a world without online storage, unlimited music and video streaming, photo sharing, collaborative document editing, and social networking at the speed of light. Businesses have saved millions of dollars by basing the better part of their infrastructure on Cloud services rather than bearing the steep costs of managing in-house networks of servers.
Cloud went on to become a key enabler of the Big Data revolution. Cloud computing gave us enough power to analyze billions of records, terabytes' worth of data, far more quickly and at considerably lower cost.
Let's try to understand the role of Cloud through an example. Consider an e-commerce website, say Amazon.com. At a high level, it stores two types of data about its users: transactional and non-transactional. Non-transactional data is information about customers (name, email, or address), items (name, price, discount, or seller), and so on. Transactional data, on the other hand, is information about a particular transaction on the website, e.g., buying an item, submitting a product review, or adding an item to a wishlist or cart. This type of data grows at a rapid pace on sites like Amazon. On Prime Day 2016, Amazon recorded sales of close to 600 items per second. That is about 2.1 million items in one hour alone!
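To make the distinction concrete, here is a minimal sketch of the two kinds of records in Python. The field names and types are illustrative assumptions, not Amazon's actual schema.

from dataclasses import dataclass
from datetime import datetime

# Non-transactional data: relatively static reference records.
@dataclass
class Customer:
    customer_id: str
    name: str
    email: str
    address: str

@dataclass
class Item:
    item_id: str
    name: str
    price: float
    discount: float
    seller: str

# Transactional data: one record per user action. This is the part that
# grows rapidly: roughly 600 purchases/second * 3,600 seconds is about
# 2.16 million records in a single hour on a day like Prime Day.
@dataclass
class Transaction:
    transaction_id: str
    customer_id: str
    item_id: str
    action: str          # e.g., "purchase", "review", "add_to_cart"
    timestamp: datetime

The non-transactional tables stay small and change slowly; the transaction log is the part that Cloud-scale storage makes affordable to keep.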
Storing such gigantic volumes of data was once prohibitively expensive, forcing companies to archive or remove transactional data after a set retention period (a few weeks to a couple of years). The potent combination of Cloud and Big Data technologies has not only enabled us to store (rather than throw away) huge amounts of historic data at dirt-cheap prices, but has also allowed us to perform complex data analytics tasks over years' worth of archived data to derive meaningful statistics and graphs about customer behavior and buying trends: What were the top hundred selling items in each category during peak hours on sale day? Which items were popular among users in terms of viewing but not in terms of buying?
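As a sketch of how the second question might be answered once historic data is retained, the following Python snippet uses pandas to find items that are viewed often but rarely bought. The file name and column names are assumptions made for illustration.

import pandas as pd

# Assumed event log with one row per user action.
# Expected columns: item_id, action, timestamp.
events = pd.read_parquet("transactions.parquet")

views = events[events["action"] == "view"].groupby("item_id").size()
buys = events[events["action"] == "purchase"].groupby("item_id").size()

# Align view and purchase counts per item; items with no purchases get 0.
stats = pd.DataFrame({"views": views, "buys": buys}).fillna(0)
stats["conversion"] = stats["buys"] / stats["views"].clip(lower=1)

# Heavily viewed items (top 10% by views) with the worst buy-to-view ratio.
popular_but_not_bought = (
    stats[stats["views"] > stats["views"].quantile(0.9)]
    .sort_values("conversion")
    .head(100)
)
print(popular_but_not_bought)

With cheap Cloud storage, the same query can run over years of retained events instead of the few weeks a retention policy once allowed.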
Almost simultaneously came the IoT revolution. As with Big Data, Cloud is a key enabler of this revolution. Powered by a variety of sensors, IoT devices generate so much data that Big Data technologies usually go hand-in-hand with IoT. Cloud provides both storage and computing power to the otherwise lightweight devices on an IoT network. After helping spark two big revolutions, Cloud did it again with Artificial Intelligence (AI).
AI has surfaced and resurfaced in several waves, but it's only recently that it has become commonplace. The widespread affordability and use of AI in software development has been seen as a revolution. The first AI revolution, the one we are currently witnessing, is about AI-as-a-Service. The book Artificial Intelligence for .NET: Speech, Language, and Search (Apress, 2017) gives an in-depth look at creating AI-enabled software applications. With advancements in IoT and the emergence of Blockchain, AI is on the brink of a second revolution, one that involves creating complete product offerings with intelligent software and custom hardware.
Note
The as-a-service model is generally associated with Cloud infrastructure, and that is true here as well. The world's top tech companies, Google, IBM, Microsoft, and Amazon among them, offer AI services in the form of SDKs and RESTful APIs that help developers add intelligence to their software applications. Let's understand this in a little more detail.
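As a rough illustration of what calling such a RESTful AI service typically looks like, here is a short Python sketch. The endpoint URL, payload shape, and authentication header are hypothetical placeholders, not any vendor's actual API; each provider defines its own.

import requests

# Hypothetical sentiment-analysis endpoint. Real providers (Microsoft,
# Google, IBM, Amazon) each have their own URLs, request schemas, and
# authentication mechanisms; consult their documentation for specifics.
API_URL = "https://api.example.com/v1/sentiment"  # placeholder, not a real service
API_KEY = "your-api-key-here"                     # placeholder credential

def analyze_sentiment(text: str) -> dict:
    # POST the text to the service and return its JSON response.
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

print(analyze_sentiment("The pizza arrived hot and on time!"))

The appeal of the model is exactly this: a few lines of HTTP replace what would otherwise be months of in-house machine learning work.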
Let's explore each component of the upcoming AI 2.0 revolution in detail.
Artificial Intelligence
The meaning of artificial intelligence (AI) has evolved over generations of research. The basic concepts of AI have not changed, but its applications have. How AI was perceived in the 1950s is very different from how it's actually being put to use today. And it's still evolving.
Artificial intelligence is a hot topic these days. It has come a long way from the pages of popular science fiction books to becoming a commodity. And, no, AI has nothing to do with superior robots taking over the world and enslaving us humans. At least, not yet. Anything intelligent enough, from your phone's virtual assistant (Siri and Cortana) to your trusty search engine (Google and Bing) to your favorite mobile app or video game, is powered by AI. Figure 1-1 shows an AI-powered intelligent chatbot.
Figure 1-1
An intelligent chatbot that can place pizza orders by understanding its users the way a human would
Interest in AI grew through the 2000s and peaked at the start of the 2010s. Huge investments were made in AI research by both academia and corporations, investments that have affected not only these institutions but also their affiliates and users. For software developers, this has been nothing short of a boon. Advances made by companies such as Microsoft, Google, Facebook, and Amazon in various fields of AI, and the subsequent open-sourcing and commercialization of their products, have enabled software developers to create human-like experiences in their apps with unprecedented ease. This has resulted in an explosion of smart, intelligent apps that can understand their users just as a normal human would.