Enterprises are eager to analyze the massive and ever-growing volumes of big data. Mining useful information from a variety of sources helps businesses make better-informed decisions.
The new SAS software supports real-time decision making by continuously analyzing data as it arrives. It analyzes high-volume “events in motion” using complex event processing (CEP), a technology preferred for critical data management and analytic applications.
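The CEP idea of continuously evaluating a stream of events, rather than querying data at rest, can be illustrated with a minimal sketch. This is not the SAS DataFlux API; the function, window size, and threshold below are hypothetical, chosen only to show a rolling-window check applied to events as they arrive.

```python
from collections import deque

def detect_spikes(events, window_size=3, threshold=1.5):
    """Flag any event whose value exceeds `threshold` times the
    rolling average of the preceding `window_size` events.
    A toy stand-in for a CEP rule over a stream of trade prices."""
    window = deque(maxlen=window_size)  # sliding window over recent events
    alerts = []
    for event in events:
        if len(window) == window_size:
            avg = sum(window) / window_size
            if event > threshold * avg:
                alerts.append(event)  # the rule fires on this event
        window.append(event)
    return alerts

# A small stream of trade prices; 300 spikes against the rolling average.
stream = [100, 102, 101, 300, 103, 104]
print(detect_spikes(stream))  # → [300]
```

A production CEP engine would apply many such rules concurrently to high-velocity streams, but the pattern is the same: state (the window) is updated per event, and decisions are made the moment a condition is met.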
Larry Tabb, chief executive officer of TABB Group, said in a statement, “Combining high-speed, low-latency streaming data with deep, predictive analytics for risk valuations, surveillance and new trading strategies can be a key differentiator for firms that need to monitor exposures, liquidity and capital across portfolios on-demand during the trading day.”
SAS’ efficient, scalable architecture supports hot fail-over. SAS DataFlux Event Stream Processing Engine, part of the SAS Information Management portfolio, also supports grid-based distributed processing for large-scale deployments.
“Firms that are wrestling with the challenges of risk, regulation and revenue growth need to be both faster and smarter in data management and analytics to turn information into reusable assets that better address client needs while improving efficiency and use of capital,” said Tabb, founder of TABB Group, a financial markets research and strategic advisory firm focused on capital markets.
Jim Davis, SAS senior vice president and chief marketing officer, said, “Competitive organizations must incorporate analytics into day-to-day decisions, modeling business processes they can enforce and monitor in real time. SAS DataFlux Event Stream Processing Engine changes that, analyzing high-velocity big data as it’s coming in, before it’s too late to act.”