Cognitive Computing | What Is It & How Does It Work?

IBM has been working on the foundations of Cognitive Computing technology for decades, combining more than a dozen disciplines of advanced computer science with 100 years of business expertise. Gartner estimates that the world’s information will grow by 800 percent in the next five years, with 80 percent of that data being unstructured.

Unstructured data comes from texts, photos, videos, books, and manuals. It is data hidden in aromas, tastes, textures, and vibrations. It comes from our own activities and from a planet that is being pervasively instrumented, in a global economy and society where value increasingly comes from information, knowledge, and services.

Ultimately, this data represents the most abundant, valuable, and complex raw material in the world. And until now, we have not had the means to mine it effectively.

What Is Cognitive Computing?

Cognitive Computing refers to systems that learn at scale, reason with purpose, and interact with humans naturally, rather than being explicitly programmed. A cognitive system learns and reasons from its interactions with us, as well as from its experiences with its environment.

These systems are made possible by advances in a number of scientific fields over the past half-century, and they differ in important ways from the information systems that preceded them. Putting cognitive technology to work is easier than you might think, but how?

First, the cognitive era is an ongoing movement of sweeping technological transformation. The impetus of this movement is the emerging field of cognitive technology.

These are radically disruptive systems that understand unstructured data, reason to form hypotheses, learn from experience, and interact with humans naturally.

Success in the cognitive era will depend on the ability to derive intelligence from all forms of data with this technology.

How did Cognitive Computing evolve?

Cognitive computing is perhaps unique in that it upends the established IT doctrine that a technology’s value diminishes over time.

Because cognitive systems improve as they learn, they actually become more valuable. This quality, among others, makes cognitive technology highly desirable for businesses.

After all, many early adopters are leveraging the competitive advantage it affords.

Here is how Cognitive Computing evolved

The Tabulating Era (1900s–1940s)

The birth of computing consisted of single-purpose mechanical systems that counted, using punched cards to input and store data and, eventually, to instruct the machine what to do (albeit in a primitive way).

Tabulating machines were essentially calculators that supported the scaling of both business and society, helping to organize, understand, and manage everything from population growth to the advancement of a global economy.

The Programming Era (1950s onward)

The shift from mechanical tabulators to electronic systems began during World War II, driven by military and scientific needs. Following the war, digital “computers” evolved rapidly and moved into businesses and governments.

They performed if/then logical operations and loops, with instructions coded in software. Originally built around vacuum tubes, they were given a huge boost by the invention of the transistor and the microprocessor, which came to demonstrate “Moore’s Law.”

Capacity and speed doubled roughly every 18 months, sustained for six decades. Everything we now know as a computing device, from the mainframe to the personal computer to the smartphone and tablet, is a programmable computer.
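That sustained doubling compounds dramatically. A quick back-of-the-envelope calculation (using the 18-month figure cited above) shows the scale:

```python
# Doubling every 18 months for 60 years:
# number of doublings = 60 years * 12 months / 18 months per doubling
doublings = 60 * 12 // 18    # 40 doublings
growth = 2 ** doublings      # overall growth factor

print(doublings)             # 40
print(f"{growth:.2e}")       # about 1.10e+12 -- a trillion-fold increase
```

A trillion-fold improvement in six decades is why systems that were once room-sized mainframes now fit in a pocket.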

The Cognitive Era (since 2011)

The Cognitive Computing Era is the next step in the application of science to understand nature and improve the human condition.

Within the scientific community, as opposed to the media and popular entertainment, the verdict is in: there is broad agreement on the importance of pursuing a cognitive future, along with recognition of the need to develop the technology responsibly.

The potential for something beyond programmable systems was foreseen as far back as 1960, when computing pioneer J.C.R. Licklider wrote his seminal paper “Man-Computer Symbiosis.”

Why Is Cognitive Computing Important?

Cognitive systems are probabilistic, meaning that they are designed to adapt to and make sense of the complexity and unpredictability of unstructured information. They can “read” text, “see” images, and “hear” natural speech.

They interpret that information, organize it, and offer explanations of what it means, along with the rationale for their conclusions. They do not offer definitive answers.

In fact, they do not “know” the answer. Rather, they are designed to weigh information and ideas from multiple sources, to reason, and then to offer hypotheses for consideration.

A cognitive system assigns a confidence level to each potential insight or answer. We are now seeing firsthand the impact of cognitive computing and its ability to transform businesses, governments, and society.
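The idea of hypotheses ranked by confidence can be illustrated with a toy sketch; the candidate answers and scores below are invented for illustration, not output from any real cognitive system:

```python
# Toy example: rank candidate hypotheses by confidence and present options
# rather than a single definitive "answer". The names and scores are made up.
hypotheses = [
    ("Diagnosis A", 0.62),
    ("Diagnosis B", 0.27),
    ("Diagnosis C", 0.11),
]

# Sort by confidence, highest first.
ranked = sorted(hypotheses, key=lambda h: h[1], reverse=True)
for name, confidence in ranked:
    print(f"{name}: {confidence:.0%} confidence")
```

The key design point is that every candidate is surfaced with its score, so a human expert can weigh the alternatives instead of accepting one opaque result.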

The true potential of the cognitive era will be realized by combining the data analytics and statistical reasoning of machines with uniquely human qualities, such as self-directed goals, common sense, and ethical values. In fact, this is what Watson is doing now.

It is helping banks analyze customer requests and financial data, surface insights, and make investment recommendations. Companies in heavily regulated industries are querying the system to keep up with ever-changing legislation and standards of compliance.

In the health industry, oncologists are testing ways in which cognitive systems can help interpret cancer patients’ clinical information and identify individualized, evidence-based treatment options that leverage specialists’ experience and research.

How do I get started?

Cognitive initiatives come in all shapes and sizes, from transformational to tactical and everything in between. What the most successful projects have in common, no matter how ambitious, is that they begin with a clear view of what the technology can do.

Therefore, your first task is to gain a firm understanding of cognitive capabilities. The cognitive era is here not only because the technology has come of age, but also because the phenomenon of big data requires it.

Computing systems of the past can capture, move, and store unstructured data, but they cannot understand it. Cognitive systems can. This breakthrough is ideally suited to addressing numerous business challenges, such as scaling human expertise and augmenting human intelligence.

Becoming a cognitive business looks different for almost everyone. Although a common perception is that cognitive technology is complex and difficult, that is not necessarily true.

While some early adopters start with ambitions to transform their organization or industry, most start relatively small. Talk to many successful early adopters and you will hear some variation on the theme of “I want to improve one specific operational process.”

The point is, it is helpful to avoid assumptions regarding what adoption will look like for you. It is better to keep an open mind during this information-gathering phase.

The steps below will help give you a solid foundation:

Envision the possible and define your ideal outcomes

Judging by the success of early adopters, it’s no surprise more and more organizations are looking to adopt.

Many are grappling with how and when, but why is most important. No one starts down this path expressly to adopt cognitive technology; the whole point is to improve the organization.

Adopting cognitive technology above all else should align with business priorities. Successful early adopters identify a problem, then build a case for how solving that problem will support specific outcomes like saving money, gaining customers or increasing revenue.

Employ good planning for a specific and strategic use case

Usage patterns tend to fall into four major categories that play to the strengths of cognitive technology.

First, cognitive technology is often used to enable innovation and discovery by understanding new patterns, insights, and opportunities.

Second, it is often used to optimize operations to provide better awareness, continuous learning, better forecasting, and optimization.

Third, to augment and scale expertise by capturing and sharing the collective knowledge of the organization.

Finally, to create adaptive, personalized experiences, including individualized products and services, to better engage customers and meet their needs.

Don’t pursue cognitive technology for its own sake

One temptation, however, is to pursue cognitive technology for the technology’s sake.

“Most of the failures we’ve seen are when you start with the technology instead of the business case,” according to an IBM cognitive technology architect. “There are so many things you can do with cognitive technology, and people get really excited. But you need to focus on what impacts your bottom line.”

Conversely, overthinking can lead to inaction. According to a CEO who leverages cognitive technology, “a lot of companies are over-analyzing what they should be doing. They want a fully detailed design and guaranteed quality of output, but it doesn’t work that way.

It’s better to start small with a good idea and from there scale out and scale up. There is no universal template for success, but focus and persistence are a proven formula.”

Prevent the perfect from becoming the enemy of the good

In some cases, the best advice is to select a use case quickly to overcome the inertia created by a misguided desire for perfection. Adoption can mean something as basic as tapping a pre-built cognitive application.

Starting small does not prohibit future expansion, and strategy can evolve over time. “Often what’s difficult is the trade-off of fixing current pain points and doing something that aligns with a long-term vision,” according to an IBM cognitive strategy specialist. “This is where people can struggle.

The challenge is to marry fixing the current problem with making sure it is the right move for the long term. So prioritizing the right use case that balances these things is the big challenge, and it’s where we can help the client the most.”

As you develop your strategy, share ideas with other forward thinkers within your organization—their support is essential—or brainstorm with a member of the IBM team.

Choose the best implementation approach for you

As an example, consider how cognitive technology is applied to equipment maintenance to enable interactive services that assist machine technicians and operators.

By directly integrating edge devices, such as machines and robots, with the cloud using the Watson IoT Platform, manufacturers can develop personalized products and services, improve operations, reduce costs, and avoid the risk of downtime.

By accessing different Watson services, in addition to other APIs on IBM Bluemix, a technician or operator can take advantage of analytic functions, predictive maintenance, and visualization of information in a dashboard.

Once you gain a realistic understanding of what cognitive technology can do, and specifically how it will help your business, it’s time to choose your approach.

1. Deploy cognitive solutions and apps

Many early adopters know exactly where they want to install cognitive technology, so they embed readily available cognitive offerings into existing workflows.

This approach leverages a pre-built cognitive solution, such as Watson Virtual Agent or Watson Explorer. These products are already coded and require only installation and integration with data sources upfront.

2. Build your own cognitive apps

Developers (like jmexclusives), on the other hand, can build their own cognitive apps through Bluemix, IBM’s cloud platform. More than 40,000 developers are building with its APIs (Application Programming Interfaces).

The Watson Developer Cloud offers common language descriptions, demonstrations, case studies and starter kits for each API. “It’s good to let developers get in and play around,” said an IBM cognitive expert.

“Because the technology is so new, it’s almost impossible to explain everything upfront. You learn a lot by doing.” Don’t forget to watch a video on What is a cognitive API?
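To give a feel for what building against a cognitive API looks like, here is a minimal sketch of handling a typical JSON response. The field names (`classes`, `class`, `confidence`) and the sample payload are hypothetical placeholders, not the actual Watson Developer Cloud schema:

```python
import json

# A typical cognitive-API response carries candidate labels with confidence
# scores. This sample payload is invented for illustration.
sample_response = json.loads("""
{
  "classes": [
    {"class": "complaint", "confidence": 0.83},
    {"class": "inquiry",   "confidence": 0.12}
  ]
}
""")

def top_class(response, threshold=0.5):
    """Return the highest-confidence class, or None if nothing is confident."""
    best = max(response["classes"], key=lambda c: c["confidence"])
    return best["class"] if best["confidence"] >= threshold else None

print(top_class(sample_response))  # complaint
```

In a real app the payload would come back from an HTTP call to the service; the point here is simply that the caller receives scored hypotheses and decides how much confidence is enough to act on.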

3. Utilize visual inspection

Visual inspection is another use-case example. Cognitive Visual Inspection helps organizations detect defects through real-time production images captured by an ABB solution and then analyzed using IBM Watson IoT for Manufacturing.

Previously, these inspections were done semi-manually, which was often a slow and error-prone process.
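The core idea behind automated visual inspection can be sketched in a few lines. This is a deliberately simplified stand-in (pixel differencing against a known-good reference) for the trained vision models a product like Cognitive Visual Inspection actually uses:

```python
# Simplified defect check: compare a captured grayscale image (a grid of
# 0-255 pixel values) against a known-good reference and flag pixels whose
# deviation exceeds a tolerance. Real systems use trained vision models,
# not raw differencing; the images here are tiny made-up examples.
reference = [[10, 10, 10], [10, 10, 10]]
captured  = [[10, 10, 10], [10, 200, 10]]   # one anomalous pixel

def find_defects(ref, img, tolerance=30):
    return [
        (row, col)
        for row, (ref_row, img_row) in enumerate(zip(ref, img))
        for col, (r, p) in enumerate(zip(ref_row, img_row))
        if abs(r - p) > tolerance
    ]

print(find_defects(reference, captured))  # [(1, 1)]
```

Even this toy version shows why automating the check beats semi-manual inspection: the comparison is exhaustive, repeatable, and instant for every frame coming off the line.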

4. Collaborate to create cognitive solutions

If your strategy is ambitious and transformational, you will likely need to collaborate on unique and customized solutions.

For instance, IBM offers various advisory programs designed to support these types of initiatives, in which the adopter aims to change whole business functions or ways of working and competing.

These programs often deliver prototypes, or proofs-of-concept, that simulate your desired cognitive-enabled state using your own data.

To collaborate with cognitive technology adoption specialists, contact a member of the IBM Cognitive Solutions Team.


Improving operations and increasing competitive differentiation are top of mind among manufacturing organizations. By utilizing the power of cognitive capabilities, IoT for manufacturing can help harness and mine the influx of information, making shop floors more cognitive through effective processing, analysis, and operational optimization.

For example, Industry 4.0 offers organizations new ways to adapt or improve processes for better quality or to meet market demand. Industry 4.0 goes beyond automation into the instrumentation of the shop floor through IoT-aware devices such as sensors, beacons, and RFID.

I hope this guide on Cognitive Computing was helpful to you. If you have additional contributions, suggestions, or questions regarding this or other blog articles, Contact Us. You can also leave your thoughts in the comments box below. Here are more links related to other useful blog topics:

  1. What is a Website Mobile App?
  2. Free Online Web Tools for Beginners
  3. Top 10 Strategic Technology Trends for 2020
  4. Website Development: The Key Essential Skills
