Cognitive Computing

Cognitive computing refers to computing that simulates human thought processes. Cognitive computing systems use self-learning or deep-learning algorithms backed by natural language processing, artificial intelligence and extensive data resources (“Big Data”) to approach problem solving the way the human brain does.

Enterprises have committed substantial resources to cognitive computing largely because of its potential in fields such as healthcare, finance, law and education, where vast quantities of complex data must be processed and analyzed efficiently and effectively to solve complicated problems and improve human decision making.

When supplied with Big Data, artificial intelligence and self-learning algorithms, a cognitive computing platform can “learn” and increase its accuracy over time, developing an elaborate neural network that provides considerably more flexibility and adaptability than a traditional decision tree-based data modeling approach.
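The minimal Python sketch below illustrates this incremental “learning” behavior. It assumes scikit-learn and uses a synthetic dataset and a small neural network, none of which comes from any particular cognitive computing platform; it simply shows a model whose test accuracy can improve as it is fed successive batches of data:

```python
# Minimal sketch: incremental ("self-") learning with a small neural network.
# Assumes scikit-learn and NumPy are installed; the dataset is synthetic and
# the model choice is illustrative, not tied to any specific platform.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a stream of data arriving in batches.
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32,), random_state=0)
classes = np.unique(y_train)

# Feed the model one batch at a time; accuracy tends to improve as it sees
# more data. A scikit-learn decision tree, by contrast, has no partial_fit
# and is typically rebuilt from scratch when new data arrives.
for batch in np.array_split(np.arange(len(X_train)), 10):
    model.partial_fit(X_train[batch], y_train[batch], classes=classes)
    print(f"test accuracy so far: {model.score(X_test, y_test):.3f}")
```

Accuracy is not guaranteed to rise with every batch, but the pattern of updating an existing model rather than retraining it from scratch is the kind of adaptability described above.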

Cognitive Computing Gaining Steam and Attention

One of the most well-known examples of cognitive computing is IBM’s Watson platform, which Big Blue originally developed to answer questions on the quiz show Jeopardy! and pit the computer’s “brain” against human competitors.

Other cognitive computing platforms being developed or currently in operation include Microsoft Cognitive Services, Google DeepMind, HPE Haven OnDemand and Cisco Cognitive Threat Analytics.
