By Seth Earley
According to technology publisher TechTarget[i], cognitive computing is “the simulation of human thought processes in a computerized model. Cognitive computing involves self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works.”
Other definitions refer to “computer systems modeled after the human brain.”[ii] IBM’s well-known foray into the space centered on Watson, which competed against human champions on Jeopardy! and won. Watson’s expected applications include medicine, finance and a range of consumer-facing applications.
The definition put forward by an industry consortium[iii] suggests that cognitive computing “addresses complex situations that are characterized by ambiguity and uncertainty,” that such systems learn from experience, and that they understand users’ context and intent.
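To make the “learns from experience” part of these definitions concrete, here is a minimal, hypothetical Python sketch (not Watson or any other vendor’s actual system): a toy text classifier whose guesses improve as it is shown more labeled examples. All names and example sentences below are illustrative assumptions, not drawn from any of the cited sources.

    from collections import Counter, defaultdict

    class TinyTextLearner:
        """Toy model: learns word-label associations from labeled examples."""
        def __init__(self):
            self.word_counts = defaultdict(Counter)  # label -> word frequencies
            self.labels = set()

        def learn(self, text, label):
            """Add one labeled example; more examples generally mean better guesses."""
            self.labels.add(label)
            self.word_counts[label].update(text.lower().split())

        def predict(self, text):
            """Pick the label whose observed vocabulary best overlaps the new text."""
            words = text.lower().split()
            def score(label):
                total = sum(self.word_counts[label].values()) or 1
                return sum(self.word_counts[label][w] / total for w in words)
            return max(self.labels, key=score)

    learner = TinyTextLearner()
    learner.learn("patient reports chest pain and shortness of breath", "medicine")
    learner.learn("quarterly earnings beat analyst expectations", "finance")
    print(learner.predict("sharp chest pain after exercise"))  # -> medicine

Real cognitive systems are vastly more sophisticated, of course, but the underlying pattern is the same: the system is trained on data rather than programmed with explicit rules, and its behavior changes as it accumulates experience.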
This Cognitive Computing stuff sounds pretty good. You may be asking yourself, “Where can I get me some of that?”
Read the original article at: A Primer on Cognitive Computing