Shyam's Slide Share Presentations

VIRTUAL LIBRARY "KNOWLEDGE - KORRIDOR"

This article/post is from a third-party website. The views expressed are those of the author. We at Capacity Building & Development may not necessarily subscribe to it completely. The relevance and applicability of the content is limited to certain geographic zones; it is not universal.


Sunday, February 8, 2015

Your Data Should Be Faster, Not Just Bigger 02-08

It’s universally acknowledged that Big Data is now a fact of life, but while large enterprises have spent heavily on managing large volumes and disparate varieties of data for analytical purposes, they have devoted far less to managing high-velocity data. That’s a problem, because high-velocity data provides the basis for real-time interaction and often serves as an early-warning system for potential problems and systemic malfunctions.
Moreover, data proliferation has been accelerating. EMC recently reported that data volumes can be expected to double every two years, with the greatest growth coming from the vast amounts of new data being produced by intelligent devices and sensors. Oracle president Mark Hurd has predicted that the number of devices connected to the Internet will grow from 9 billion to 50 billion by the end of this decade.
What makes device data, sensor data, and other forms of “fast data” distinctive is that, unlike historical data, it is live, interactive, automatically generated, and often self-correcting. Historical data is used to identify patterns that inform future decision-making, while fast data is designed for real-time decisions and real-time responses. Think of fast data as the continuous processing of events and data in order to gain instantaneous insight and take instantaneous action.
While fast data is not really new, it has been largely restricted to a couple of high-value uses: complex event processing (CEP) activities that operate on event streams, examples of which include algorithmic trading and fraud monitoring in financial services; and event correlation, which includes the systems that monitor and manage complex industrial components such as jet engines.
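The CEP pattern described above operates on a continuous stream of events rather than a stored dataset. A minimal sketch of the idea, in the spirit of the fraud-monitoring example, is below; the function name, window size, and threshold are illustrative assumptions, not taken from any particular CEP engine.

```python
from collections import deque

# Illustrative CEP sketch: flag any account that produces more than
# `threshold` events inside a sliding time window -- the kind of burst
# pattern a fraud monitor watches for on a live event stream.

def detect_bursts(events, window_seconds=60, threshold=3):
    """events: iterable of (timestamp, account_id) pairs in time order.
    Yields each account the first time its event count exceeds the
    threshold within the sliding window."""
    recent = {}    # account_id -> deque of timestamps still inside the window
    flagged = set()
    for ts, account in events:
        q = recent.setdefault(account, deque())
        q.append(ts)
        # Evict timestamps that have fallen out of the window.
        while q and ts - q[0] > window_seconds:
            q.popleft()
        if len(q) > threshold and account not in flagged:
            flagged.add(account)
            yield account

# Account "A" fires four events within ten seconds; "B" fires two, far apart.
stream = [(0, "A"), (2, "B"), (3, "A"), (5, "A"), (8, "A"), (70, "B")]
print(list(detect_bursts(stream)))  # -> ['A']
```

The key property, as the article notes, is that the decision happens as each event arrives, not in a later batch pass over stored history.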
So, what has changed? First, the explosion of fast data has driven the demand for instant action. Second, innovators in social media and services like Uber have shown that businesses can be differentiated based on their ability to act on data instantly. Uber knows where you are, where you’re going, and how you will pay to get there because it can capture, analyze, and act on data in real time. Third, the availability of lower-cost memory is making fast data accessible for a broader set of applications, including:
  • First responder systems that rely on integrating fast response data collection and analysis
  • Network usage systems that respond instantly based on traffic patterns
  • Customer-experience management systems that analyze vast amounts of behavioral data in real time to tailor interactions and support self-service
Organizations that know how to use fast data will be more nimble, adaptive, and competitive. What can companies do differently to prepare for the opportunities created by high-velocity data? Here are a couple of suggestions:
  1. Automate decision-making to increase customer engagement. Monitor customer activities to identify—and respond to—patterns, thresholds, and triggers. One major retailer is seeking to engage with customers in real time while they are online, but is hampered by traditional systems and batch processing environments. They are now creating an environment that marries customer and inventory data with streaming data so they can, for example, report to the customer whether a product is in stock and address any inventory gaps immediately.
  2. Integrate machine-generated data to personalize interactions. EMC projects that soon nearly two-thirds of all data will be generated by machines, not people. That presents a technical challenge: how can companies capture and analyze many flows of data concurrently? The good news is that next-generation data systems that prioritize real-time data are in the early stages of adoption. By integrating new sources of machine-generated data, and combining it with traditional data sources, firms can further personalize their customer interactions. Ericsson, the mobile broadband company, has developed real-time visibility into its system performance, using device and user data as it is generated. This enables Ericsson to identify and improve slow performance as needed and to serve customers with programming tailored to them.
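The first suggestion above, marrying a live event stream with inventory data so each customer interaction gets an immediate answer, can be sketched as follows. The retailer, SKUs, field names, and stock figures are all hypothetical, chosen only to show the shape of the pattern.

```python
# Hypothetical sketch: answer each customer page view instantly from an
# in-memory stock table, and record any inventory gap as it is discovered
# so it can be addressed immediately rather than in a nightly batch job.

inventory = {"sku-1": 5, "sku-2": 0}  # illustrative current stock levels

def respond(view_events):
    """For each (customer, sku) page-view event, report stock status
    in real time and collect SKUs that are out of stock."""
    responses, gaps = [], []
    for customer, sku in view_events:
        in_stock = inventory.get(sku, 0) > 0
        responses.append(
            (customer, sku, "in stock" if in_stock else "out of stock")
        )
        if not in_stock:
            gaps.append(sku)  # flag the gap for immediate follow-up
    return responses, gaps

responses, gaps = respond([("alice", "sku-1"), ("bob", "sku-2")])
print(responses)  # -> [('alice', 'sku-1', 'in stock'), ('bob', 'sku-2', 'out of stock')]
print(gaps)       # -> ['sku-2']
```

In a production system the in-memory table would be fed by a streaming pipeline rather than hard-coded, but the contrast with the batch environments the article criticizes is the same: the answer is produced while the customer is still online.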
Firms that take these steps now will be well positioned to improve their operations and better serve their customers using high-speed data.