Cognitive 4 – Cognitive Computing in Action

There is growing interest in, and imagination about, how cognitive computing can help organizations think in whole new ways. To see cognitive computing in action, here are some industry-specific examples of organizations benefiting from its power.

Retail

Cognitive systems are driving more personalized shopping experiences and helping reveal customer trends. Macy’s, a U.S. department store chain, is using Watson cognitive technology to create an in-store mobile companion that helps serve customer needs. Customers can ask natural language questions about each participating store’s unique product assortment, services and facilities, and receive a customized response to the inquiry.

Banking

Cognitive systems are improving the client experience and driving more efficient operations. Bradesco, the second largest bank in Brazil, used Watson as a virtual agent to answer banking questions, reducing call volumes; improved its client experience and delivery with the Watson Self Service Assistant for a new virtual channel; and kept its innovation leadership position by being the first to implement Watson in Portuguese.

Health

Cognitive systems are transforming global health. Medtronic, a U.S. and Ireland-based medical device company, has insulin pumps which, combined with Watson, will help predict dangerous spikes or drops in blood sugar hours in advance, then notify people living with diabetes so they can take action before it becomes a problem.

Energy and Utilities

Cognitive systems are acting as trusted advisors. Woodside, an Australian oil and gas producer, needed to access knowledge from decades’ worth of projects to save the company millions of dollars. With Watson Engagement Advisor, it created a service that culls through 30 years of documented expertise, recognizing patterns and continually learning from them. Watson acts as a trusted advisor, answering questions from engineers while learning from newly added knowledge.

In the next blog we will explore some of the Watson offerings (aka APIs) available for consumption. So stay tuned.

Cognitive 3 – What is Watson?

In my previous blog, we discussed how cognitive business understands, reasons, learns and interacts. Watson is IBM’s brand for cognitive capabilities.

Watson came into the limelight when it appeared as a contestant on the US game show Jeopardy!, where it handsomely beat two of the show’s best-ever contestants (its winning total was more than three times that of second-placed Ken Jennings). The show poses answers, and contestants must correctly identify the question being asked. For example, one clue Watson faced was “Jodie Foster took this home for her role in Silence of the Lambs”. Watson correctly inferred that in this context “took this home” meant winning an Oscar. In other contexts, “took this home” could refer to a cold, groceries, or any number of things. Watson’s cognitive system enabled it to behave with human-like characteristics and correctly understand the context.


How did Watson provide answers to those questions?

Watson did the following to provide the correct answer:
1. Question Analysis – In this step, Watson figures out what the question is asking for and what type the answer should be.
2. Hypothesis Generation – Here, Watson creates hundreds of possible candidate answers. Later, Watson will select one of these answers as correct.
3. Hypothesis and Evidence Scoring – Now, Watson weighs each answer, upgrading or downgrading it according to whether the evidence supports or contradicts the hypothesis.
4. Final Merging and Ranking – Finally, Watson ranks all the candidate answers and displays the top 3. It gives a confidence score for each candidate answer and announces the top-ranked one.
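The four steps above can be sketched in code. The toy pipeline below is only a loose illustration, not Watson’s actual implementation: the candidate answers, the two evidence scorers, and their weights are invented for the Jeopardy! clue discussed earlier, whereas the real system combined hundreds of scorers over a massive corpus.

```python
# A toy sketch of the four DeepQA-style steps. Everything here is
# illustrative; the scorers and weights are made up.

def answer_question(question, candidates, evidence_scorers):
    # 2. Hypothesis Generation: each candidate answer is a hypothesis.
    # 3. Hypothesis and Evidence Scoring: every scorer upgrades or
    #    downgrades each candidate based on supporting evidence.
    scored = []
    for cand in candidates:
        score = sum(scorer(question, cand) for scorer in evidence_scorers)
        scored.append((cand, score))
    # 4. Final Merging and Ranking: normalize scores into confidences
    #    and return the top 3 candidates.
    total = sum(s for _, s in scored) or 1.0
    ranked = sorted(((c, s / total) for c, s in scored),
                    key=lambda pair: pair[1], reverse=True)
    return ranked[:3]

# Hypothetical scorers for the Silence of the Lambs clue.
def award_context_scorer(question, candidate):
    # "took this home" plus a film role suggests an award.
    return 2.0 if "Oscar" in candidate and "role" in question else 0.1

def type_match_scorer(question, candidate):
    # The expected answer type here is "a thing you can win".
    return 1.0 if candidate in ("an Oscar", "a Golden Globe") else 0.2

question = "Jodie Foster took this home for her role in Silence of the Lambs"
candidates = ["an Oscar", "a Golden Globe", "a cold", "groceries"]
top3 = answer_question(question, candidates,
                       [award_context_scorer, type_match_scorer])
print(top3[0][0])  # the highest-confidence candidate
```

Step 1 (question analysis) is implicit here in the hand-written scorers; in Watson it was itself a substantial natural-language-processing stage.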

In 2011, Watson comprised what is now a single API—Q&A—built on five underlying technologies (Natural Language Processing, Machine Learning, Question Analysis, Feature Engineering, and Ontology Analysis). Since then, Watson has grown to a family of 28 APIs. By the end of 2016, there will be nearly 50 Watson APIs—with more added every year! Each API performs a different task, from recognizing bias in language to gathering information from news reports. In combination, these APIs can be adapted to solve any number of business problems or create deeply engaging experiences. And soon Watson will have the ability to interpret data that human senses cannot, such as infrared and sonar.
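From an application’s point of view, each of these APIs is consumed as an ordinary REST call. As a hedged sketch only, here is roughly what assembling such a call might look like; the URL, path, credentials, and parameter names below are placeholders I have invented for illustration, not an actual Watson endpoint.

```python
# Hypothetical sketch of preparing a REST call to a Watson-style
# language-analysis API. The endpoint and auth scheme are placeholders.
import json
import urllib.request

def build_analysis_request(api_url, api_key, text):
    """Assemble (but do not send) an HTTP POST to a hypothetical API."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        api_url,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,  # placeholder auth
        },
        method="POST",
    )

req = build_analysis_request(
    "https://example.com/watson/v1/analyze",  # hypothetical endpoint
    "YOUR_API_KEY",
    "IBM Watson can read unstructured information.",
)
print(req.get_method())  # POST
```

The real services each have their own documented endpoints and parameters; the point here is simply that composing several such calls is how the APIs get combined into an application.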

How is Watson different from traditional computer systems?

Traditional computer systems depend on a knowledge base of structured information. They are limited in the kinds of information they can use, and that information must be analyzed and structured for them before they can use it. In contrast, Watson can read unstructured information and figure out its contents, giving it access to a much larger body of information and allowing it to digest that information with much less pre-processing. Because Watson is trained on a corpus of knowledge rather than being programmed, it is more flexible and better understands what we are looking for. And its ranking of answers helps humans make better decisions.

I will explore some of Watson’s APIs / customer use cases in my future blogs. Stay tuned…

Cognitive 2 – The 3 eras of computing

In my last blog I mentioned that today we are at the dawn of the cognitive computing era. So what does that mean? There are three eras of computing, each different from the one preceding it.


Let us start with the first era, the tabulating era. It began in the late 1800s, around the turn of the century, and was about machines that did counting and tabulation: punch card readers and special-purpose machines. These helped with tasks such as controlling industrial looms or tallying voting and census data. They did this really well, but were ultimately limited to that single task.

Then came the programmable era. It started with the dawn of the digital computer around the 1950s. The big change here is general purpose computing systems that are programmable: they can be reprogrammed to perform different tasks and do a variety of things. People created software that used programming languages to give computers more complex tasks. Their performance, however, was limited by their adherence to established processes and decision trees, by their inability to find relationships within unstructured information, and by constraints in the way they interacted with humans.

But what we see today, and what has emerged over the last few years, is what we call the cognitive computing era. The major driver for this era is the sudden exponential increase in the amount of unstructured data out there. And so the challenge is: what computing technologies can really leverage all of this unstructured information in a much more natural way? We believe the only way we’re really going to survive this onslaught of data is to create what are being called cognitive computing systems.

Today’s cognitive systems understand, learn, and communicate with people in natural language rather than software code. They can extract meaning from large amounts of visual, verbal, and numerical unstructured information, and they can learn as they do so, helping people make complex decisions based on Big Data. This era is leading to the creation of automated IT systems capable of solving problems without requiring human assistance.

Let’s get a taste of cognitive systems through an example. Consider the following question: “Did Maya shoot an elephant in her pajamas?”
There are some ambiguities in the statement: Did Maya shoot a gun or a photo? Was she or the elephant wearing the pajamas? A human mind can resolve the question’s ambiguities based on context, but a conventional analytic software program cannot. A cognitive system can. First it creates multiple hypotheses about the meaning of question elements, such as what kind of shooting occurred and who was wearing the pajamas. Then it examines those elements against the context of its corpus, the set of documents that constitutes its essential knowledge, and so determines the most likely meaning of the question. Finally it can answer the question, providing a measure of its confidence in that answer.
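To make the idea concrete, here is a toy sketch of that hypothesize-then-weigh process for the pajamas question. All the corpus counts below are invented for illustration; a real cognitive system would derive evidence from millions of documents rather than four hand-coded numbers.

```python
# Toy hypothesis generation and evidence scoring for the ambiguous
# question "Did Maya shoot an elephant in her pajamas?".
# The corpus co-occurrence counts are entirely made up.

corpus_counts = {
    "shoot + photo": 120,        # "shoot" often means photographing
    "shoot + gun": 45,           # ...and sometimes firing a weapon
    "person wears pajamas": 300, # people commonly wear pajamas
    "elephant wears pajamas": 1, # elephants almost never do
}

# Each hypothesis is scored by multiplying the evidence for its parts.
hypotheses = {
    "Maya photographed an elephant while wearing her pajamas":
        corpus_counts["shoot + photo"] * corpus_counts["person wears pajamas"],
    "Maya fired a gun at an elephant while wearing her pajamas":
        corpus_counts["shoot + gun"] * corpus_counts["person wears pajamas"],
    "Maya photographed an elephant that was wearing her pajamas":
        corpus_counts["shoot + photo"] * corpus_counts["elephant wears pajamas"],
}

total = sum(hypotheses.values())
best = max(hypotheses, key=hypotheses.get)
confidence = hypotheses[best] / total
print(best, round(confidence, 2))
```

The winning interpretation is the mundane one, and the confidence value is exactly the “measure of its confidence” the paragraph above describes.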

Cognitive 1 – Need for Cognitive Systems

There is a lot of buzz about cognitive computing. I plan to write a series of blogs on it, as cognition is framing the future of the digital economy. In this blog, let’s explore the need for cognitive technology.

In today’s world we see business models converging across categories and industries. The Ubers and Airbnbs have shifted consumer behavior very rapidly, threatening the existence of very strong, established players in the market. This new dynamic is the result of their on-demand model, powered by the immediacy of technology. With the emergence of such new business models, organizations can no longer view their competitive set from within their own industry; they need to be structured to look beyond their traditional boundaries. As brands, businesses and organizations shift to become lifestyle centric, competition can come from anywhere. Piotr Ruszowski, chief marketing officer, Mondial Assistance, Poland, says, “The biggest threat is new competitors that aren’t yet classified as competitors.”

To win in this dynamic age, organizations need to become all-knowing. This means getting insights from all data, including the ‘dark’ data that sits outside the organization’s firewall. It includes unstructured information: books, emails, tweets, journals, blogs, images, sound and videos. The challenge is that this pool of dark matter is only going to get bigger. The statistic that “by the year 2020, about 1.7 MB of new information will be created every second, for every human being on the planet” conveys the magnitude of what we really mean when we say big.
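A quick back-of-the-envelope calculation shows just how big that statistic is. The world-population figure below is my own assumption (roughly 7.8 billion around 2020); the point is only the order of magnitude.

```python
# Back-of-the-envelope check on "1.7 MB per second per person".
mb_per_second_per_person = 1.7
world_population = 7.8e9          # assumed ~2020 population
seconds_per_day = 24 * 60 * 60

mb_per_day = mb_per_second_per_person * world_population * seconds_per_day
exabytes_per_day = mb_per_day / 1e12   # 1 EB = 1e12 MB (decimal units)
print(round(exabytes_per_day))         # over a thousand exabytes per day
```

That is more than a zettabyte of new data every day, which is why traditional approaches to extracting insight simply cannot keep up.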

As you go outside your firewall, the data you encounter is increasingly unstructured. Traditional systems are programmed, and so are not built to glean insight from dark data. Therefore organizations need to look to cognitive systems, which have the capability to make sense of it. How? We will explore that later, but the key difference is this: structured data can tell you that your sales are down, for instance, but it is unstructured data that can tell you why.

With cognition, business technologies that automate and detect can now also advise and enhance human expertise, empowering organizations to make richer, more data-driven decisions.

Summary:
Here are the benefits of a cognitive business:

  • Puts to work all forms of data, whether structured or unstructured.
  • Facilitates evidence-based, confidence-weighted decisions.
  • Discovers new insights and patterns in new kinds of data.
  • Learns and adapts with use, actions, outcomes and new data to stay current.
  • Navigates natural language to allow conversational-style interaction, enhancing adoption and use.

The success of cognitive computing will not be measured by Turing tests or a computer’s ability to mimic humans. It will be measured in more practical ways, like return on investment, new market opportunities, diseases cured and lives saved. How do cognitive systems work, and which industries are already benefiting from them? Stay tuned.


IA Thin Client – Your Entry Point into the Data Lake

In one of my previous blogs, I mentioned that a data lake is a set of one or more data repositories created to support data discovery, analytics, ad hoc investigations, and reporting. Some enterprises have invested money and created a data lake, but are not sure how to begin utilizing their data. The IA Thin Client gives business users and analysts a first grip on that data. Extending the capabilities of Information Analyzer to Hadoop through a user-friendly thin client, it helps enterprises get to know their data. Here are a few of its capabilities:
1.    Customers can see a listing of all the data sets in their HDFS file system, preview them, and select a handful of interesting ones.

2.    They can group these interesting data sets into workspaces, say Customer related, Employee related, Finance related and so on.

3.    The IA Thin Client gives them a dashboard where they can see the overall picture of the data in a particular workspace.

4.    From a workspace, they can drill into the details of one of these interesting structured or semi-structured data sets and run data analysis to learn more about it. This detailed analysis gives insight in an easily understandable way: What is the quality of the data? What is its format? Can the data be classified into one of several known data classifications? Users can also see detailed information for each column of the data (format, any observed data quality problems, data type, min-max values, classification, frequent values, a sampling of actual values and so on).

5.    Using the tool, users can suggest changes to the data’s metadata. For example, after reviewing the analysis they may feel that some data formats do not look correct, that the minimum value should be something else, or that an identified data quality problem can be ignored. Editing these also updates the overall data quality score.

6.    The tool allows users to add a note to a data set or link an interesting data set to the existing data governance catalog.

7.    The tool allows customers to apply existing data rules to the data and see how it performs against them.

8.    Moreover, all of this is done in a simple, intuitive, easy-to-use thin client so that a non-technical person can easily navigate through the data.
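To give a feel for the kind of column profiling step 4 describes, here is a minimal sketch in Python: inferred type, min and max values, frequent values, and a simple completeness score. This is purely illustrative and assumed by me; it is not the IA Thin Client’s actual implementation.

```python
# Minimal column-profiling sketch: the kind of per-column facts a data
# quality tool surfaces. Illustrative only, not Information Analyzer.
from collections import Counter

def profile_column(name, values):
    non_null = [v for v in values if v not in (None, "")]
    freq = Counter(non_null)
    return {
        "column": name,
        "completeness": len(non_null) / len(values),  # fraction non-missing
        "min": min(non_null),
        "max": max(non_null),
        "frequent_values": freq.most_common(3),
        "inferred_type": ("numeric"
                          if all(str(v).replace(".", "", 1).isdigit()
                                 for v in non_null) else "text"),
    }

ages = ["34", "29", "34", "", "41"]   # one missing value
report = profile_column("age", ages)
print(report["completeness"], report["inferred_type"])
```

A real profiler would add data-class detection, format masks, and rule evaluation on top of these basics, which is exactly what the capabilities above expose through the thin client.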

You can watch a 4-minute video to get first-hand experience of the tool, or join the InfoSphere Information Analyzer thin client Tech Talk on June 23, 2016 by clicking the following link (Password: Governance). This presentation will provide a comprehensive overview of the Information Analyzer thin client, presented by Dan Schallenkamp, Offering Manager for Data Quality products within Information Server.

Cloud – The Horsepower Behind IoT

As promised in my last blog, in this blog I will explore the foundational block that will make IoT a reality. IoT requires huge amounts of integration. It’s not enough to have the components of technology; these things all have to work together.


The cloud makes all this connection possible
For the IoT to be successful, there needs to be a common way for devices to connect to each other and for building useful applications that take advantage of data from different sources. Here’s where cloud computing enters the picture.

New cloud computing platforms like IBM Bluemix are a natural home for IoT-based applications and the data created by different devices. They’re being set up to handle the speed and volume of the data that’s being received and have the ability to ebb and flow according to demand, all the while remaining accessible anywhere from any device.

These platforms are making it easy for businesses to connect traditional enterprise-based information systems to both private and public IoT-enabled devices. This allows enterprises to quickly and economically build IoT-based sense-and-respond systems that can scale up or down based on changes in the environment and transaction level.
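The sense-and-respond pattern described above can be sketched as a simple publish/subscribe hub: devices publish readings to a shared topic, and applications subscribe to react to them. A real deployment would use a cloud message broker (for example MQTT on a platform like Bluemix); this in-memory hub is just my assumed stand-in to show the shape of the interaction.

```python
# In-memory publish/subscribe hub illustrating IoT sense-and-respond.
# Stands in for a cloud message broker; not a production design.

class Hub:
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers.get(topic, []):
            callback(message)

hub = Hub()
alerts = []

# Application side: respond when a temperature reading crosses a threshold.
def on_temperature(message):
    if message["celsius"] > 30:
        alerts.append(message)

hub.subscribe("sensors/temperature", on_temperature)

# Device side: two sensors publish readings through the hub.
hub.publish("sensors/temperature", {"device": "t-1", "celsius": 22})
hub.publish("sensors/temperature", {"device": "t-2", "celsius": 35})
print(len(alerts))  # only the out-of-range reading triggers a response
```

The cloud’s job is to run this hub at massive scale: absorbing bursts of readings, scaling up and down with demand, and staying reachable from any device.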

A one-size-fits-all public cloud doesn’t work for many companies, which are looking for ways to take advantage of cloud computing without giving up their legacy IT investments. A hybrid model enables them to take advantage of a cloud environment, yet still maintain a level of control consistent with their business needs—a practical approach that delivers the best of both worlds.


The Real Opportunity for IoT

In my last blog, I described IoT and ended with a thought on its “real opportunity”. Let’s explore this a little further; in one word, it is analytics.

Here is a recap on the Internet of Things (IoT): First, in the simplest terms, IoT deals with physical devices that generate data from sensors and send the streams of data via the Internet to some kind of “hub” for data collection, visualization, and analytics. Second, IoT deals with multiple types of sensors and data formats. Third, IoT solutions might deal with thousands or millions of connected devices and huge amounts of data.

Billions of Internet-connected ‘things’ will, by definition, generate massive amounts of data of varying complexity, formats and timeliness. This is just a swamp, especially if all you do is collect data and never act on it. For example, insurers pay more than $1 billion in claims in the United States for cars and trucks damaged by hail. Could The Weather Company’s weather data make it possible for insurers to send text-message alerts to policy holders, warning them of an imminent hailstorm and advising them of safe locations nearby? Note that IoT makes it possible to identify the exact location of these cars and trucks, and to identify the owners to whom the text messages should be sent!
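As a toy sketch of that hail-alert scenario: given a storm’s forecast center and radius, find the vehicles inside the zone and draft alerts to their owners. The coordinates, phone numbers, and flat-earth distance approximation below are all my own illustrative assumptions; a real system would use geodesic math and live weather feeds.

```python
# Toy geofence for the hail-alert example. Uses a rough flat-earth
# approximation (~111 km per degree); illustrative only.
import math

def in_storm_zone(vehicle, storm_center, radius_km):
    dx = (vehicle["lat"] - storm_center[0]) * 111  # km per degree latitude
    dy = (vehicle["lon"] - storm_center[1]) * 111  # rough, ignores latitude
    return math.hypot(dx, dy) <= radius_km

vehicles = [
    {"owner": "555-0101", "lat": 32.78, "lon": -96.80},  # near the storm
    {"owner": "555-0102", "lat": 29.76, "lon": -95.37},  # far away
]
storm_center, radius_km = (32.8, -96.8), 25

alerts = [v["owner"] for v in vehicles
          if in_storm_zone(v, storm_center, radius_km)]
print(alerts)  # only the vehicle inside the storm zone gets a text
```

Simple as it is, this is the analytics step that turns raw location and weather streams into an action worth paying for.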

Therefore, while many people focus on the devices themselves—how they function, how they perform and how they look—the real opportunity is in the data these devices consume and generate, and the value it provides for businesses and even entire connected cities. Retailers will piggyback on analytics and use IoT to pull consumers into one of their channels, where they will entice them with products contextualized and personalized for the customers’ gratification. There will be similar use cases for manufacturers, servicing organizations, public utilities, industrial and telecommunications companies, healthcare providers and more, allowing them to serve their customers in new, personalized ways. Predictive, prescriptive, cognitive and investigative analytics will make it possible for organizations to discover new relationships and correlations, bringing together broader and deeper insights that lead to smarter business decisions in terms of risks, costs, growth, customer service and more.

What will organizations need in order to harness the power of analytics, and what will the challenges be? Stay tuned.

Disclaimer: The postings on this site are my own and don’t necessarily represent IBM’s positions, strategies or opinions