The Internet of Things (IoT) is creating a world more interconnected than ever imagined. Cisco Systems estimates that by 2020, 50 billion devices will be connected. Companies like Nest provide appliances that learn users’ schedules, program themselves and can be controlled from the user’s phone. Fitbit and other wearable devices track personal data and wirelessly upload it to the cloud.
All of this data needs to be stored, putting a strain on current storage solutions. Service providers will need to accommodate ever-increasing demands for storage as the Internet of Things churns out unprecedented amounts of data. To remain competitive, many service providers are exploring new options in data center architecture that will permit greater flexibility and control over hardware costs.
Current storage solutions: slow and costly
Existing data center architecture consists mainly of appliances. In industry parlance, an appliance is server hardware that comes with proprietary, mandatory software. The software is designed for the hardware and vice versa, and the two come tightly wedded together as a package. The benefits of this configuration include convenience and ease of use.
Because hardware is bound to fail at some point, traditional appliances typically include redundant copies of expensive components to anticipate and prevent outages caused by reliance on a single point of failure. These redundant components bring with them higher hardware costs, greater energy usage and additional layers of complexity. When companies, in anticipation of growth events like the IoT, begin to consider how to scale out their data centers, costs for this traditional architecture skyrocket.
Another difficulty regarding traditional appliances is their vertical construction—requests come in via a single point of entry and are then re-routed. Think about a million users connected to that one entry point at the same time. That’s a recipe for a bottleneck, which prevents service providers from being able to scale to meet the capacity needed to support the Internet of Things.
A cost-reducing, scalable storage alternative
One such option is software-defined storage. By taking features typically found in hardware and moving them to the software layer, a software-defined approach to data center architecture eliminates the dependency on server “appliances” with software hard-wired into the system. This option provides the scalability and speed that the IoT demands.
Although the term “software-defined” may seem like a recent catchphrase, many everyday electronic devices have been “software-defined” for years. Take the PC, for example: software can be installed on any hardware platform, allowing the user to custom-tailor both the hardware and the software according to his or her needs. The average PC can use Linux as an operating system if the owner so chooses. This gives the user greater freedom to allocate his or her budget precisely as needed for the task at hand – whether towards a high-powered graphic design setup, for example, or a lightweight Web browser.
Software-defined storage provides a cost-reducing alternative to traditional appliances. Because it liberates the software from the hardware, administrators are free to choose inexpensive commodity servers. When coupled with lightweight, efficient software solutions, the use of commodity servers can result in substantial cost savings for online service providers seeking ways to accommodate their users’ growing demand for storage.
In terms of scalability, software-defined storage also provides advantages: not every data center is created equal. A telco servicing one particular area will have different storage needs than a major bank with branches in several countries, and a cloud services provider will have different needs still. While appliances might be good enough for most of these needs, fully uncoupling the software from the hardware can yield substantial gains in economies of scale.
Software-defined storage gives administrators the freedom to examine the needs of their business, and to handpick the specific components and software that best support their growth goals. While this approach does require more technically trained staff, the flexibility afforded by software-defined storage delivers a simpler, stronger and more tailored data center for the company’s needs.
Furthermore, a software-defined approach uses a horizontal architecture that streamlines and redistributes data, which eliminates the potential bottlenecking problems of vertical, single-entry-point models. Data is handled faster and more efficiently, and this non-hierarchical construction can be scaled out easily and cost-effectively.
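One common way such a horizontal, non-hierarchical layer distributes data is consistent hashing: every client can compute which storage node owns a given key, so no single entry point has to route each request. The sketch below is illustrative only; the node names and virtual-node count are hypothetical, not drawn from any specific product.

```python
import hashlib
from bisect import bisect

# Hypothetical storage nodes in a horizontally scaled cluster.
NODES = ["node-a", "node-b", "node-c", "node-d"]
VNODES = 100  # virtual nodes per server smooth out the key distribution

def _hash(key: str) -> int:
    """Stable hash so every client maps a key to the same ring position."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

# Build the ring: each physical node appears at many points on the circle.
ring = sorted((_hash(f"{name}#{i}"), name) for name in NODES for i in range(VNODES))
points = [p for p, _ in ring]

def node_for(key: str) -> str:
    """Route a request straight to its owning node, with no central broker."""
    i = bisect(points, _hash(key)) % len(ring)  # wrap past the last point
    return ring[i][1]
```

Because any client can run `node_for` locally, adding a server only remaps the keys adjacent to its ring positions, which is what makes this style of layer cheap to scale out.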
A distributed approach
The Internet of Things will likely benefit from a distributed approach to storage infrastructure as well. With millions of devices needing to access storage, the current storage model that uses a single point of entry cannot scale to meet the demand. To accommodate the ballooning ecosystem of storage-connected devices all over the world, service providers, enterprises and telcos need to be able to spread their storage layers over multiple data centers in different locations worldwide. It’s becoming increasingly clear that one data center is not enough to meet the storage needs of the Internet of Things: storage must instead be distributed in a way that lets it run across several data centers globally.
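A minimal sketch of that idea: keep each object's replicas in distinct data centers, so losing one site never loses the data. The site names, node names and replica count below are invented for illustration, not taken from any real deployment.

```python
import hashlib

# Hypothetical data centers, each with its own pool of storage nodes.
DATACENTERS = {
    "us-east": ["use-1", "use-2"],
    "eu-west": ["euw-1", "euw-2"],
    "ap-south": ["aps-1", "aps-2"],
}

def _h(s: str) -> int:
    """Stable hash so placement is deterministic across clients."""
    return int(hashlib.sha1(s.encode()).hexdigest(), 16)

def place_replicas(key: str, copies: int = 3) -> list:
    """Pick one node per data center, starting at a site chosen by the key."""
    sites = sorted(DATACENTERS)
    start = _h(key) % len(sites)
    chosen = []
    for i in range(copies):
        site = sites[(start + i) % len(sites)]  # rotate through distinct sites
        nodes = DATACENTERS[site]
        chosen.append(nodes[_h(key) % len(nodes)])
    return chosen
```

With three sites and three copies, every object lands in all three regions; a regional outage leaves two live replicas, which is the resilience property the distributed model is after.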
The future of storage is here
The Internet of Things, with its billions of connections and petabytes of data, is here. There are clear trends pointing to ever-increasing demands for cheap storage, and if companies continue to rely on expensive, inflexible appliances in their data centers, they will be forced to spend significant sums to develop the storage capacity they need to meet customer demand. Fortunately, the future of storage has also arrived. Software-defined solutions offer an attractive alternative to companies looking to “future proof” their data centers.