Data Governance Drives Effective Analytics

We recently came across Aberdeen’s report “Nimble IT and the Data Layer: The Lynchpin of Analytical Success,” which summarizes the results of a survey of data management professionals and analytics users. The report explores the relationship between IT environments and decision-making by business users.

Interestingly, Aberdeen found that a “nimble IT” environment empowers line-of-business users to make better decisions and improve company performance.

We all intuitively understand what Aberdeen means by “nimble” – an IT organization that is strategic, yet agile, and quick to anticipate and respond to organizational needs. But how do organizations get there?

According to Aberdeen, organizations that succeed in building this type of environment share a unifying trait: “The most important characteristic of companies with ‘nimble IT’ is their ability to improve data access and collaboration, but do it with a strong degree of governance and oversight.”

In high-functioning analytical organizations, there is a synergy between data access and data governance. “One can’t exist without the other,” says Aberdeen. “It is because of the clearly communicated and responsibly enforced policies around data access and usage that enables these companies to share information across functions with a high degree of confidence.”

If data is to be shared widely across the organization, data custodians must have effective data governance in place. This means managing data confidentiality, integrity, and accessibility.

Confidentiality means appropriate authentication, access controls, the ability to audit to your organization’s standards, and for some organizations, the ability to de-identify data. It’s easy to enforce confidentiality by locking up data and restricting access in a draconian way; it’s hard to both enforce confidentiality and provide comprehensive access. Modern data solutions have ways of providing granular access and reducing the administrative burden of managing access policies across a large number of users and protected assets.
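As a rough illustration of what granular access can look like in practice, here is a minimal sketch of a policy check that filters and de-identifies records based on a user’s role. The roles, field names, and masking rules below are hypothetical examples, not a reference to any particular product’s API.

```python
# Minimal sketch of role-based, column-level access control with de-identification.
# Roles, field names, and masking rules are hypothetical examples.
import hashlib

# Which columns each role may see, and which must be de-identified.
POLICY = {
    "analyst":   {"visible": {"patient_id", "age", "diagnosis"}, "masked": {"patient_id"}},
    "clinician": {"visible": {"patient_id", "age", "diagnosis", "notes"}, "masked": set()},
}

def mask(value: str) -> str:
    """Replace an identifier with a one-way hash so records stay linkable but not identifiable."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def apply_policy(role: str, record: dict) -> dict:
    """Return only the columns this role may see, de-identifying where the policy requires it."""
    policy = POLICY[role]
    result = {}
    for column, value in record.items():
        if column not in policy["visible"]:
            continue  # column is not visible to this role at all
        result[column] = mask(value) if column in policy["masked"] else value
    return result

record = {"patient_id": "MRN-0042", "age": "67", "diagnosis": "T2DM", "notes": "..."}
print(apply_policy("analyst", record))    # patient_id is hashed, notes are withheld
print(apply_policy("clinician", record))  # full record
```

The point of a policy table like this is that access decisions are declared once and enforced everywhere, rather than re-implemented by every consumer of the data.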

Data integrity is achieved when data meets quality standards for accuracy, consistency, and completeness. Integrity also includes the ability to track data flow and transformation lineage, keep a history of modified values, and roll back changes when needed.
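To make the lineage and rollback idea concrete, here is a small sketch of how a transformation step might be recorded alongside the values it changed, so earlier states can be audited and recovered. The structure is illustrative only; real lineage tooling captures far more.

```python
# Illustrative sketch: record each transformation with enough context to audit and roll back.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEntry:
    step: str              # e.g. "convert mg/dL to mmol/L"
    source: str            # upstream dataset or feed the step read from
    timestamp: datetime
    previous_values: dict  # values as they were before this step ran

@dataclass
class GovernedRow:
    current: dict
    history: list = field(default_factory=list)

    def transform(self, step: str, source: str, new_values: dict) -> None:
        """Apply a change and keep the prior state so the change can be audited or undone."""
        self.history.append(LineageEntry(step, source, datetime.now(timezone.utc), dict(self.current)))
        self.current.update(new_values)

    def rollback(self) -> None:
        """Restore the state recorded before the most recent transformation."""
        self.current = self.history.pop().previous_values

row = GovernedRow(current={"glucose": 110, "unit": "mg/dL"})
row.transform("convert mg/dL to mmol/L", "lab_feed_v2", {"glucose": 6.1, "unit": "mmol/L"})
print(row.current)   # transformed value
row.rollback()
print(row.current)   # original value restored
```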

Accessibility is the outcome of effective data confidentiality and integrity mechanisms. Accessibility is more than just making data available; it means providing enough context that users are able to find the data they need and understand its meaning. The relevant context may include units of measure, reference ranges, validation rules, how the data was collected, filtered, transformed, or aggregated, how accurate it is, and so on. Accessibility often includes the need to define data sharing agreements and data retention policies.
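One lightweight way to attach that kind of context is a catalog entry that travels with the dataset. The fields below (units, reference range, collection method, retention, sharing terms) are examples of the context described above, not a prescribed schema.

```python
# Sketch of a catalog record carrying the context users need to interpret a dataset.
# The schema and field names are illustrative, not a standard.
from dataclasses import dataclass

@dataclass
class DatasetCatalogEntry:
    name: str
    description: str
    unit: str                  # unit of measure for the primary value
    reference_range: tuple     # expected range, useful for validation and interpretation
    collection_method: str     # how the data was gathered, filtered, or aggregated
    retention_days: int        # retention policy agreed with the data owner
    sharing_agreement: str     # who the data may be shared with, and under what terms

entry = DatasetCatalogEntry(
    name="fasting_glucose",
    description="Fasting plasma glucose, aggregated weekly per clinic",
    unit="mmol/L",
    reference_range=(3.9, 5.5),
    collection_method="Lab feed; outliers beyond 3 standard deviations removed",
    retention_days=365 * 7,
    sharing_agreement="Research use only; de-identified extracts",
)
print(f"{entry.name} is reported in {entry.unit}; retain for {entry.retention_days} days")
```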

It’s only after governance is in place that the vision of self-service analytics can be realized. “Self-service analytics, a product of savvy planning and strong communication between stakeholders, only comes to fruition after diligent IT effort in the data layer.”

The punch lines of the Aberdeen report?

  • You must govern data to liberate insight
  • You need to focus on data management if you want to drive self-service analytics

This picture is especially relevant as organizations move to include big data as a component of their data architecture; until recently, data governance was a feature gap in big data solutions. But newer, innovative governance solutions are beginning to appear, offering more usable and flexible data infrastructures that meet the needs of organizations.

To read the full Aberdeen report, click here.

To connect with a PHEMI team member on how to integrate a fully governed big data solution into your architecture, click here.
