Flexera and BDNA Merge – An Interesting Opportunity

In early September, Flexera announced that it was acquiring BDNA because it is “reimagining the way software is bought, sold, managed and secured.” This acquisition, or merger if you prefer, offers an interesting opportunity to organizations seeking to improve the quality of the data they rely on every day.

Flexera is a privately held software company that has been around since 1987 and has grown significantly in recent years, in large part due to its Hardware & Software Asset Management (HAM & SAM) FlexNet platform, to become a multinational company with over 70,000 customers. BDNA has been a leader in data quality and describes itself as “delivering the Industry’s Most Authoritative Enterprise Technology Data that Drives the Most Effective Business Decisions.”

The combination of these two companies has the potential to be a disruptive event in the service management and operations space. Service management and operations software vendors struggle to process and display accurate information because of poor data quality, which is largely a result of the volume and velocity of change in operational data. Organizations are generally left to source and integrate various solutions in an effort to improve the quality and completeness of the data they consume and share within their business. This is a massive challenge, and it is becoming more difficult as IoT devices dump ever more unverified data into IT environments.

It is important to understand that data quality and data accuracy are not one and the same.

  • Data accuracy is only one component of data quality
  • Data quality includes other factors such as completeness, legitimacy and reliability. Collectively, it must be measured with respect to how the data will be used

Having accurate data that is not timely or reliable still poses problems for organizations. This is particularly the case given the velocity of data changes in current business environments. Accurate data from two days ago is not good enough if you are making business decisions every hour of the day. As soon as users realize the data is not up to date, they will view it as unreliable and/or incomplete.
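To make the distinction concrete, here is a minimal sketch in Python (with invented field names and an illustrative 24-hour freshness threshold) of how a single asset record might be scored across several quality dimensions rather than on accuracy alone:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical quality dimensions for one asset record.
# The field names and the 24-hour freshness threshold are illustrative only.
REQUIRED_FIELDS = ["hostname", "vendor", "product", "version", "last_seen"]
MAX_AGE = timedelta(hours=24)

def assess_quality(record: dict) -> dict:
    """Return a per-dimension quality report for one record."""
    now = datetime.now(timezone.utc)
    completeness = all(record.get(f) not in (None, "") for f in REQUIRED_FIELDS)
    timeliness = bool(record.get("last_seen")) and (now - record["last_seen"]) <= MAX_AGE
    accuracy = record.get("verified", False)  # stand-in: "matches a verified source"
    return {
        "accuracy": accuracy,
        "completeness": completeness,
        "timeliness": timeliness,
        "usable": accuracy and completeness and timeliness,
    }

# An accurate but two-day-old record still fails the overall check.
stale = {
    "hostname": "srv-042", "vendor": "Acme", "product": "WidgetDB",
    "version": "4.2", "verified": True,
    "last_seen": datetime.now(timezone.utc) - timedelta(days=2),
}
print(assess_quality(stale))  # accuracy: True, timeliness: False, usable: False
```

The point of the sketch is simply that a record can pass an accuracy check and still be unusable once timeliness and completeness are taken into account.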

Some software vendors promote approaches to data consumption that frankly are not beneficial to organizations, but are, in my opinion, certainly self-serving to the vendor. For example, some major vendors promote the idea that organizations should presume the data provided by their discovery tools is accurate, without validation. Only after the data has been ingested would it be audited and verified. Of course, this concept is pushed primarily by vendors promoting their own discovery tools, or by those who rely on their accuracy. It should be obvious, but no organization should take this approach; as illustrated in the sketch below, validation belongs in front of the system of record, not behind it. Anyone who has dealt with poor data quality knows that ingesting first and auditing later will only result in massive failures.
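As a rough illustration of that alternative (hypothetical checks, not any vendor’s actual API), the following sketch gates discovered records through validation before they ever reach the system of record, quarantining anything questionable instead of presuming it accurate:

```python
def validate(record: dict) -> list[str]:
    """Collect validation failures for a discovered record (illustrative checks only)."""
    errors = []
    if not record.get("hostname"):
        errors.append("missing hostname")
    if not record.get("vendor") or not record.get("product"):
        errors.append("unidentified software")
    return errors

def ingest(discovered: list[dict], cmdb: list[dict], quarantine: list[dict]) -> None:
    """Validate first: only clean records reach the CMDB; the rest wait for review."""
    for record in discovered:
        if validate(record):
            quarantine.append(record)  # held back, never presumed accurate
        else:
            cmdb.append(record)

cmdb, quarantine = [], []
ingest([{"hostname": "srv-042", "vendor": "Acme", "product": "WidgetDB"},
        {"hostname": "", "vendor": "Acme", "product": "WidgetDB"}],
       cmdb, quarantine)
print(len(cmdb), len(quarantine))  # 1 1
```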

There is no shortage of new data being generated within organizations every day, adding to the poor-quality data they already have. That poor quality is, in large part, a result of data volume, but also of the velocity of changes to data. The volume and pace at which data is generated and updated are so great that they can no longer be handled by manual methods; automated methods are needed to address the data quality issue.

The velocity of change makes simple reconciliation against a preset repository only partially effective, because reconciliation only helps identify and link incoming data to predetermined records. Those records represent what we believe to be the correct data, but they too are subject to the constraints of the velocity issue, because reconciliation assumes a high level of time-based accuracy. Simply adding more tools to that process will generate further unverified data that adds to the existing challenge.
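The sketch below makes that limitation concrete (the catalog is a made-up stand-in; a real normalization catalog such as BDNA’s Technopedia is far richer): raw discovery output reconciles only when a predetermined entry already exists, so anything newer than the catalog falls straight through:

```python
# A tiny stand-in for a normalization catalog. Keys are raw strings
# as a discovery tool might report them; values are normalized records.
CATALOG = {
    "msft sql server 2016": ("Microsoft", "SQL Server", "2016"),
    "oracle db 12c": ("Oracle", "Database", "12c"),
}

def reconcile(raw_names: list[str]) -> tuple[list[tuple], list[str]]:
    """Link raw discovery output to predetermined catalog records."""
    matched, unmatched = [], []
    for name in raw_names:
        entry = CATALOG.get(name.strip().lower())
        (matched if entry else unmatched).append(entry or name)
    return matched, unmatched

matched, unmatched = reconcile(["MSFT SQL Server 2016", "Acme WidgetDB 5.0"])
print(matched)    # [('Microsoft', 'SQL Server', '2016')]
print(unmatched)  # ['Acme WidgetDB 5.0'] (not yet in the catalog, so it falls through)
```

Unless the catalog itself is refreshed at the pace of the environment, the unmatched pile only grows.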

What the industry needs is more cooperation between vendors to deliver positive business outcomes for organizations. The Flexera/BDNA combination has legitimate potential to deliver that, and organizations stand to benefit greatly from it. The respective strengths of each vendor are now merged and should deliver better data quality while alleviating organizations’ need to integrate independent solutions at additional cost. When a vendor that can improve backend data quality, as BDNA’s Technopedia claims to do, is combined with an open IT supply chain platform like Flexera’s, there is potential for some real advancement in data quality solutions.

All organizations need better data quality and governance. This is where tools from vendors like Flexera, BDNA and Blazent, and consumers of their data such as ServiceNow and BMC, come into play. These tools improve the accuracy, value and dissemination of existing data through various mechanisms that rationalize and/or cleanse it. Each vendor has its own techniques and approaches to data quality, but the end result is that these tools provide organizations with significantly improved data quality across all aspects of their operations.

I’m very hopeful that the newly combined Flexera and BDNA will not only succeed in its mission but will push other vendors to act on this serious need to address the problem of data volume and velocity of change. This will certainly be something to keep a very close eye on in the coming months and years.

Organizations need vendors to help them with smarter, simpler offerings that improve data quality. Vendors need to stop promoting approaches that benefit the vendor first and the organization second. Their goal must be enabling organizations’ positive business outcomes.

Improving data quality is the foundation of that goal.
