
cognitive computing

Cognitive computing refers to computing that simulates human thought processes. It combines self-learning or deep-learning algorithms with natural language processing, artificial intelligence and extensive data resources ("Big Data") to solve problems in a manner similar to the way the human brain thinks and works.

One of the key reasons enterprises have committed substantial resources to cognitive computing is its potential in fields like healthcare, finance, law and education. In these fields, vast quantities of complex data can be processed and analyzed efficiently and effectively to solve complicated problems and improve human decision making.

By supplying cognitive computing platforms with Big Data, artificial intelligence and self-learning algorithms, these systems are able to "learn" and increase their accuracy over time. They do so by developing an elaborate neural network that provides considerably more flexibility and adaptability than a traditional decision tree-based data modeling approach.
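The self-learning loop described above can be sketched in miniature. The toy perceptron below is purely an assumed illustration (it is not part of Watson or any platform named in this article): a single artificial neuron repeatedly adjusts its weights from example data, and its accuracy improves over successive passes.

```python
# Toy example of "learning from data": a single perceptron trained
# on the logical OR function. Accuracy improves as weights are
# adjusted over repeated passes (epochs).

# Training data: inputs and the correct OR output for each.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w = [0.0, 0.0]   # weights, one per input
b = 0.0          # bias term
lr = 0.1         # learning rate

def predict(x):
    """Fire (1) if the weighted sum of inputs exceeds zero."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def accuracy():
    """Fraction of training examples the perceptron gets right."""
    return sum(predict(x) == y for x, y in data) / len(data)

for epoch in range(10):
    for x, y in data:
        err = y - predict(x)       # 0 when the prediction is correct
        w[0] += lr * err * x[0]    # nudge weights toward the target
        w[1] += lr * err * x[1]
        b += lr * err

print(accuracy())  # converges to 1.0 on this toy problem
```

Real cognitive platforms use networks of many such units with far more sophisticated training, but the principle is the same: accuracy grows with exposure to data rather than from hand-coded rules.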

Cognitive Computing Gaining Steam and Attention

One of the most well-known examples of cognitive computing is IBM's Watson platform, which Big Blue originally developed to answer questions on the quiz show Jeopardy!, pitting the computer's "brain" against human competitors.

Other cognitive computing platforms being developed or currently in operation include Microsoft Cognitive Services, Google DeepMind, HPE Haven OnDemand and Cisco Cognitive Threat Analytics.
