What is Artificial Intelligence (AI)?

Version of the Creation painting by Michelangelo, with a human hand reaching out to touch a robot hand

Photo by Syda Productions/Shutterstock.com

Admit it. When you ask Siri a question and she gets the answer right, there’s a part of you that smiles, marvelling at the thought that you just had a conversation with a machine. And she—not it—understood you.

Even though Apple introduced their voice-activated virtual assistant in 2011, it still feels like a little taste of magic.

Our society has always been curious about artificial intelligence (AI), but that interest has exploded in the past few years. It’s become a permanent fixture in mainstream entertainment, as popular movies and TV shows like Blade Runner 2049, Her, Black Mirror and Westworld all provide their own vision of a future shared with machines. In the present world, AI is shaping more and more of our personal and professional activities—beyond Siri, Alexa, or Google Home. We increasingly rely on machines to help track, organize and make sense of our busy lives.

But when you hear people throw out terms like artificial intelligence, machine learning and deep learning, it can start to seem like you need a degree in computer science to keep track of all the jargon.

What’s AI, really? Do AI and machine learning mean the same thing? Are all of these just 21st century buzzwords for robots that are coming to steal jobs?


Photo by Ilya Pavlov on Unsplash

Providing clarity around what AI is and why people are talking about it can help you figure out how it can be useful for your company. So, here it is: the AI crash course you wish you had before you started down the Google rabbit hole.

What is AI?

In broad terms, Artificial Intelligence (AI) is a branch of computer science that works to build machines that imitate human intelligence. Applications of artificial intelligence are capable of performing tasks previously done by a person, often faster and at a larger scale.

Everyone seems to have a different concept of AI, probably because the definition keeps changing as technology evolves (Forbes has a full timeline of the history of AI). What would have been considered artificial intelligence 30 years ago—or even five years ago—is now just expected technology.

For example, when your Netflix recommends a list of rom-coms featuring strong female leads—that’s AI in action. When Facebook suggests which friends should be tagged in your photo—boom, AI. When WebMD takes your common cold symptoms and spits out a diagnosis of certain death—also AI, but arguably the wrong approach to medical care. A human doesn’t handpick your results; an AI program uses lots and lots of data to find the best match, based on what it knows about you.


Because these applications are so common, you don’t think about them as AI. Instead, the term might bring to mind more complex virtual assistants that can perform multiple different functions all on one platform, like Siri or Alexa. But the term AI is quite broad, encompassing machines that perform a very narrow set of tasks and those that have an ever-growing list.

We’ll touch more on these two types of AI later on.

Why is AI so popular right now?

AI has fallen in and out of popularity over the past 50 years or so, ever since the term "artificial intelligence" was first coined at Dartmouth in 1956. The field would generate interest (and funding) and then lose it when it didn't produce the practical applications people were dreaming of. C-3PO was, sadly, still out of reach. The theories were there, but the technology wasn't.


There are two main reasons why AI is back and here to stay:

More power

Human thought is essentially pattern recognition, carried out by billions of neurons in our brain linked through what are called biological neural networks. The idea behind AI is building a system inspired by those networks. But since the human brain has billions of neurons and trillions of interconnections, an artificial brain is not only complicated to build, it also requires a lot of power to run.
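To make the analogy concrete, here's a minimal sketch of a single artificial "neuron" in Python. It weighs its inputs, sums the signals, and fires if the total crosses a threshold. The numbers are hand-picked purely for illustration; real networks chain millions of these units together.

```python
def step(signal):
    """Fire (1) if the signal crosses the threshold, stay quiet (0) otherwise."""
    return 1 if signal >= 0 else 0

def neuron(inputs, weights, bias):
    """Weigh each input, sum the signals, and apply the activation."""
    return step(sum(i * w for i, w in zip(inputs, weights)) + bias)

# Hypothetical example: weights chosen by hand so the neuron acts like
# a logical AND gate -- it fires only when both inputs are on.
print(neuron([1, 1], [0.5, 0.5], -0.7))  # 1
print(neuron([1, 0], [0.5, 0.5], -0.7))  # 0
```

A single neuron like this can only capture very simple patterns; the power comes from connecting many of them, which is exactly where all that computing muscle gets spent.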

Graphics Processing Units (GPUs), originally developed to process all the amazing graphics in video games and more, have given us more computing power and made that power widely accessible and inexpensive. This means that the machines we use every day are capable of processing a lot of information in a short amount of time. Blink. It's done. Even cell phones are now capable of running the algorithms that power AI like Siri and Alexa—and the list is only growing.

More data

We’re talking about “big data” here. Through the development and growth of new internet platforms—like Twitter, Facebook, Google Analytics and our favourite apps—our global community is capturing, processing, creating and storing more data than ever before. The means to do so is everywhere, all the time. Cell phones, cars, fitness trackers—they all contribute the massive amounts of data that machines need in order to learn even remotely the way a human does. This big data feeds the powerful machines we’ve created and makes them even more useful.

The new era of AI is being driven by these increases in computing power and the supply of data, both of which have helped open up a new avenue in the field: machine learning.

AI vs. machine learning

Machine learning and AI are often used interchangeably. While the terms are related, they’re technically not the same thing. Machine learning is a type of AI. The idea is that instead of programming a machine with millions of lines of explicit instructions, you give it access to data so it can “train” itself, learning on its own.
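To see the difference, here's a toy Python sketch. The rental-pricing scenario and numbers are made up for illustration: rather than a programmer hard-coding the pricing rule, the program derives it from example data using a simple least-squares fit.

```python
# Example data (made up): apartment sizes and the rents people paid.
sizes  = [1, 2, 3, 4]          # size, in hundreds of square feet
prices = [100, 200, 300, 400]  # rent, in dollars

# Simple least-squares fit through the origin: find the single
# multiplier that best maps size -> price across all the examples.
learned_rate = sum(s * p for s, p in zip(sizes, prices)) / sum(s * s for s in sizes)

# The machine "learned" the rule (price = 100 * size) from examples,
# not from a programmer writing it down.
print(learned_rate)       # 100.0
print(learned_rate * 5)   # prediction for an unseen size: 500.0
```

Nobody typed "multiply by 100" into the code; the relationship came out of the data. Scale that idea up to millions of examples and far richer patterns, and you have modern machine learning.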

AI is the destination; machine learning is the current path we’re taking to get there.

Deep learning is a branch of machine learning that enables you to build models that capture more complex relationships in data. For example, deep learning algorithms power computer vision (e.g. image recognition): how an application sees and interprets images. It’s how a Facebook application can review all of your photos and know whether you’re a dog person or a cat person.


Image by all_is_magic's/Shutterstock.com

How should my business think about AI?

Many companies are eager to jump on board with AI. Its seemingly limitless potential is attractive but also presents a problem: how do you narrow it down and apply such a new technology in practical, productive ways?

We know getting on a train that hits every stop won’t be as effective as finding the express that goes only to your desired destination. So it makes sense that the most successful AI tools target and solve a specific problem within your business. That’s where you start. What’s the problem? Which AI tools offer targeted solutions?

One way to keep you on track is to think about AI as two separate things: general AI and applied AI.

General AI applications work to match and possibly surpass human intelligence. They are the generalists, the jack-of-all-trades tools. Theoretically, such an application could book a kick-off meeting at a trendy downtown restaurant, make predictions about stock market growth and carry on a stimulating conversation about whether the latest episode of Black Mirror is indeed plausible. Whatever you need, it can do.

The concept is great, but a reliable application of the technology is still being developed—and many predict it will take another 20, 30, even 50-plus years to be fully realized. This is why you don’t use Siri for much more than looking up the weather. She’s interesting in theory but frustrating in reality.

Images by scrapbook.ellentv.com/post/21792954925/ellen-talks-to-siri

Applied AI applications have a defined purpose or set of purposes. They are the specialists. Some of the most successful applications are trained to perform one core task, such as providing answers to customer questions. Though the technology may change to improve how the AI answers questions, the task itself stays the same. One problem, one solution.

Applied AI is used to power a range of applications or tools, including smart thermostats (like Nest), e-commerce recommendations (on sites like Amazon) and chatbots (like Coinbase’s Ada Bot).

Ada’s approach to AI

Our goal at Ada is to provide an AI application for customer support teams that is simple to use, quick to deploy and highly accurate. To hit each of these checkpoints, the artificial brain powering our platform has to be a specialist. Why? While generalists can do a lot of things well, specialists have mastered their craft. By offering this level of expertise behind the scenes, we give our clients the freedom to focus on their craft, interacting with customers in front of the curtain.

At Ada, we believe adding a specialist to your team is how you get the most value from AI.

While more generalized AI is a fascinating example that shows us how far we can push the boundaries of what technology can do, applied AI takes that power and makes it practical.

Our software takes the complexity of machine learning and transforms it into a powerful platform that any non-technical person can use. This targeted approach lets our clients do one thing really, really well: reduce phone, email and live chat customer inquiries.

Through machine learning, Ada has developed a user-friendly platform that lets you build and train a chatbot to become your virtual agent, reducing the strain on your human agents. Through an AI chatbot, your customers get fast, straightforward answers to simple inquiries and requests. This lets your customer support team focus on the more complex and sensitive matters that require a human touch.

One problem. One solution. And we handle all the AI magic in between.

Like what you read? Let's chat

We'll reach out to set up a personalized demonstration of how Ada can help your company automate your support.
