At its core, the cloud trend is really about distributed computing: making a pool of computing resources accessible on demand in an easily managed and convenient manner. Looked at this way, clouds can serve many functions beyond letting consumers and small businesses access applications over the internet.
For example, with the right technology, companies can create private clouds that leverage an entire network’s resources to greatly accelerate processes. This is exactly the niche where Xoreax brings value.
The company offers a simple agent that can be installed to create private clouds that harness a network’s computing power, without requiring any changes to source code.
Xoreax Grid Engine (XGE) technology is highly configurable: users can control how much of each computer’s CPU power is used and set associated limits (for example, if more than 20 percent of a particular machine’s processing power is in use locally, the network does not tap it).
According to Dori Exterman, CTO, Xoreax’s technology is used by more than 20 percent of Fortune 100 companies. XGE forms the backbone of IncrediBuild, a product that accelerates Visual Studio builds, making them up to 30 times faster.
The technology is also used in the financial, healthcare, medical research, gaming and energy industries for all kinds of time-consuming processes.
“It’s really useful for developers who sometimes need to wait 2-3 hours for a build to complete, and now only need to take a short coffee break,” Exterman said during a TMCnet video interview at Cloud Expo 2011.
Exterman acknowledged that there are a lot of players in the field once known as ‘grid computing’ and now more commonly referred to as ‘high performance computing’ or HPC for short.
“What differentiates us from competitors is the element that you don’t need to change your source code,” he emphasized during the video interview. “Other solutions require you to change your architecture, which takes a lot of time. The only thing our customers need to do is write a small XML file with names of the processes they want to distribute, and everything else is done for them.”
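Exterman did not show the file format in the interview, so the following is a purely hypothetical sketch of what such a process-list configuration might look like; the element and attribute names are invented for illustration and are not Xoreax’s actual schema.

```xml
<!-- Hypothetical configuration file: lists the executables whose
     invocations should be distributed across the network.
     Element and attribute names are illustrative only. -->
<distribute>
  <process name="compile.exe" />
  <process name="render.exe" />
  <process name="simulate.exe" />
</distribute>
```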
Depending on the application, some optimization of network speeds may also be in order.
“We are mainly focused on applications that require intensive CPU power but have few I/Os,” Exterman said. “For example, simulations that require processing many numbers. Some simulations take weeks to complete if you run them only on a local machine. When you can distribute them to hundreds of computers, it makes things much faster.”
For more discussion about distributed computing, including predictions about development of hybrid solutions for private and public clouds, watch the full video interview.