infoTECH Feature

September 19, 2017

Quality Over Quantity: The Art of Software Normalization

By Special Guest
Angus Vause, Marketing Director at Certero Software

Finding all of your raw software data overwhelming? Struggling to extract actionable information from it?

Software normalization (or recognition, as it’s also known) isn’t particularly new, but it’s becoming an increasingly hot topic from an ITSM perspective, driven by demand for normalized software data to populate the CMDB (Configuration Management Database).

When it comes to a normalization service, and the database of information which underpins it, there are certain nuances that need to be considered. So it’s timely to take a closer look and dispel some of the myths.

Why Normalization is Needed

Capturing an inventory of all the software installed across an IT estate returns a vast list of complex and, to the uninitiated, confusing data. Transforming this raw and ‘noisy’ data into meaningful information is a complex, resource-intensive task. Of the many thousands of software applications in a typical IT estate, usually only a few hundred carry any commercial exposure, and clarity on those is vital for organizations to ensure compliance. Being able to decipher this data to build a list of licensable software (including associated details such as publisher, product, version, edition, release date and upgrade/downgrade rights) is a major challenge.
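
To make that transformation concrete, here is a minimal sketch, in Python, of the before and after for a single inventory record. Every name, value and field below is hypothetical, invented purely to illustrate the idea rather than to show any particular tool’s output.

# Raw inventory data as it typically arrives from a discovery tool:
# inconsistent naming, with version and edition buried in free text.
raw_entry = {
    "display_name": "MS SQL Svr 2016 Ent x64",
    "publisher": "Microsoft Corp.",
}

# The normalized record a recognition service would resolve it to,
# carrying the licensing-relevant details mentioned above.
normalized = {
    "publisher": "Microsoft",
    "product": "SQL Server",
    "version": "2016",
    "edition": "Enterprise",
    "licensable": True,
}

print(normalized["publisher"], normalized["product"], normalized["edition"])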

In the majority of cases, it simply doesn’t make economic sense for organizations to undertake the work required for software normalization themselves using internal resources. This has led to many solution providers developing software normalization services underpinned by a comprehensive database, which is effectively used as a source of reference to decipher inventory data and identify licensable applications.

Database Size is Irrelevant

When it comes to promoting the effectiveness of their normalization capabilities, too many providers focus on the size of their software recognition database (SRDB), making a song and dance about the fact that they have hundreds of thousands of entries. However, much of this data isn’t really commercially relevant. While it may be useful from an operational or technical perspective, the value it offers in terms of licensing and compliance is limited. A more meaningful measure might be the number of licensable applications actually included within the database. But that measure on its own still isn’t sufficient. We have to consider quality.

Quality is What Really Counts

What if the process for identifying, categorizing and recording those licensable applications within the normalization database isn’t rigorous and consistent? What if unskilled resources are used instead of experts with the right levels of knowledge and experience? The end result will be inaccurate entries, and what use is a massive database if it’s full of rubbish?

In my experience, understanding how the software vendors themselves define their products is essential to developing an effective normalization database. Most vendors use SKUs (stock keeping units), which are their definitive identifiers for each software application. Using these SKUs is therefore the only way to ensure zero ambiguity and accurately populate the normalization database.
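
As a rough illustration of why this matters, a SKU-keyed normalization database can resolve an application with a single unambiguous lookup. The sketch below assumes a simple in-memory Python dictionary; the SKU and all record fields are made up for this example, not drawn from any real vendor catalogue.

# Normalization database keyed on vendor SKUs, so each entry has
# exactly one definitive identity. "XYZ-00123" is a fictitious SKU.
srdb = {
    "XYZ-00123": {
        "publisher": "ExampleSoft",
        "product": "ExampleDB",
        "version": "12",
        "edition": "Enterprise",
    },
}

def normalize(sku):
    # Return the definitive record for a SKU, or None if unrecognized.
    # An unrecognized SKU should be flagged for research, never guessed.
    return srdb.get(sku)

print(normalize("XYZ-00123"))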

Some providers choose not to use SKUs in their normalization database, instead opting to create their own definitions for different software. This way of working was established when software normalization was new to the market, and it offered a quick answer to demand. While it initially provided a good resource, this legacy approach, which isn’t necessarily accurate, has proved limited in meeting today’s requirements.

Obviously there are some vendors who don’t use SKUs. In these circumstances the normalization provider has to conduct research to establish the necessary information for the application in question and thereby accurately populate the database. Again, this highlights the need for skilled resources and a high-quality process.

UNSPSC (United Nations Standard Products and Services Code)

More recent developments have seen normalization services required to classify installed software in line with UNSPSC application categories. Again, doing this accurately requires a rigorous, consistent approach and a genuine understanding of the taxonomy. Shockingly, we have come across numerous instances of providers’ normalization databases containing glaring inaccuracies.
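
For illustration, classification amounts to attaching the right UNSPSC code to each normalized record, as in the Python sketch below. The record fields are hypothetical; 43230000 is the family-level “Software” code (segment 43, family 23), with the trailing zeros standing in for the class and commodity digits a rigorous service would actually resolve.

# A hypothetical normalized record awaiting classification.
record = {
    "publisher": "ExampleSoft",
    "product": "ExampleDB",
    "edition": "Enterprise",
}

# Attach the UNSPSC code. 43230000 is the family-level "Software"
# category; a real service resolves the final digits down to class
# and commodity level for each product.
record["unspsc"] = "43230000"

print(record)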

An Example

Let’s consider the case of a customer who, for example, is only interested in gaining a clear, normalized view of their Microsoft estate. All they need from their chosen solution provider is assurance that its database contains the information necessary to accurately identify and normalize every instance of their deployed Microsoft applications. The fact that one normalization provider has a database with millions of entries while another’s has only thousands (I exaggerate for effect) is of no relevance whatsoever. Either of them could be the one with the most accurate data, and the only way to be certain is to look at the quality of the data itself.

In Conclusion

So what does this all boil down to? Basically, the common practice of using database size as a measure of a normalization provider’s capability is flawed. If you want to make sense of the software installed across your estate and ensure licensing compliance and optimization, our advice is to ask your provider what processes they use to generate and maintain their normalization database. A rigorous, consistent approach, skilled resources, use of manufacturer SKUs, correct UNSPSC categorization and comprehensive coverage of commercially licensable applications are absolute essentials. Remember that accuracy far outweighs size and, ultimately, your objective should be to work with a provider who understands and supports the importance of quality over quantity.

About Certero

Certero is a leader in the development, delivery and enablement of enterprise-level solutions that help customers maximize the value they get from their IT assets, driving transformation and organizational advantage. Core expertise in all elements of IT hardware and software asset management is complemented by world-class, tailored services. Our mission is to provide the best solutions that work individually and holistically together, seamlessly and optimally, on a single platform unique to Certero.

Website: www.certero.com

About the Author: Angus Vause is the Marketing Director at Certero Software. Certero is a specialist vendor and supplier of Software Asset Management, Enterprise Mobile Management, PC Power Management, and Password Reset products and solutions, complemented by world-class services, including SAM Services.




Edited by Mandi Nowitz