When it comes to dealing with massive amounts of data, agility is key. codeFutures understands the complexities and challenges inherent in processing and formatting big data, and the company rolled out its AgilData platform at Cloud Expo in New York City last month as a tool for handling huge amounts of data in disparate formats.
According to codeFutures CEO and CTO Cory Isaacson, the company is one of the early innovators in the realm of database scalability technology. He spoke to TMC senior editor Peter Bernstein about the importance of having an agile infrastructure when performing real-time processing of data.
“Databases are typically looked at as very static infrastructures and they’re very, very inflexible in how you deal with them and it creates a huge amount of work for application developers,” said Isaacson. An agile infrastructure supports real-time processing of data so that it can be transformed into the exact format a specific application needs, an efficiency and convenience that traditional data processing techniques lack.
The AgilData platform can take data from virtually any source, from files to transactional processing systems, and transform it into aggregates for generating reports and other actionable information. The transformation is achieved through a streaming technology framework that lets users view data as a real-time stream rather than a static store.
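AgilData's own APIs are not shown in the article, but the "data as a real-time stream" idea it describes can be illustrated with a minimal, generic sketch in Python. All names here (the record format, the `aggregate_stream` function, the sample feed) are hypothetical stand-ins, not AgilData's interface: the point is simply that records are folded into a running aggregate as they arrive, instead of being loaded into a static repository first.

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def aggregate_stream(records: Iterable[Tuple[str, float]]) -> Dict[str, float]:
    """Fold a stream of (key, value) records into running totals,
    processing each record incrementally as it arrives."""
    totals: Dict[str, float] = defaultdict(float)
    for key, value in records:
        totals[key] += value  # update the aggregate per record
    return dict(totals)

# Simulated real-time feed of transaction records from any source
feed = [("east", 120.0), ("west", 75.5), ("east", 30.0)]
print(aggregate_stream(feed))  # {'east': 150.0, 'west': 75.5}
```

Because the aggregation is incremental, the same function works whether the records come from a file, a message queue, or a live transactional system, which is the flexibility the streaming view is meant to provide.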
“This is a fundamental shift from current approaches that view databases as a static repository – one that grows in complexity, is hard to manage and requires a continuous escalation of resources to maintain, let alone mine strategic value,” added Isaacson.
He is well versed in the topic of scaling big data, having just authored an e-book titled “Understanding Big Data Scalability.” The book is scheduled to be published later this month and covers topics including the scope and sources of big data and how applications and databases can be scaled for optimum performance and value.