By Steve Totman, Chief Strategy Officer at Privitar

Data leaders tell me it takes weeks, often months, and sometimes longer, to get access to data containing commercially or personally sensitive information.

The reason isn’t unwillingness to share or any lack of technology to store, process and move data to end users or applications. What stands in the way is a mountain of individual questions, answers, decisions, and actions needed to approve and protect data access, along with the lack of tools to streamline and automate these processes.

Data has a last mile problem.

The “last mile” is the final leg of any supply chain, when products leave the central warehouse and make their way to unique destinations. This stage covers the shortest distance yet consumes the most time and cost, because each last mile is unique and has to be handled differently.

In data management, the technology is there to move huge volumes of data from operational systems to data lakes and warehouses, and on to analytics and business intelligence (BI) tools shared by hundreds of users. This seamless process hits the buffers in the last mile, because individual projects need approval and then governance, security, and privacy measures that fit their unique context.

For those of us familiar with Takeshi’s Castle, the last mile is like watching those contestants who throw themselves whole-heartedly into a challenge until an insurmountable obstacle leads to a humbling outcome.

Tell me if this scenario seems familiar: 

A business user wants to verify a key business decision with a certain collection of customer data they can see in their data catalog, so they ask their manager for access, who calls the data team that owns the data source, who tells them their department is not approved to access that source and directs them to the governance team, who asks a long series of questions about the project (with days, nights, weekends, and even holidays between each) before referring the case to the legal team, who determines that regulations apply to this use of data and sends the conditions that must be met back to the governance team, who translates those conditions into the specific row filtering and column masking requirements the data needs to meet and goes back to the business manager, who goes back to the data team with the requirements, who engages the organization’s data engineers, who write some custom code and appoint a data architect to the case, who builds the data flow that finally, finally provisions redacted data to our tired, confused, and world-weary business user.
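To make the outcome of that chain concrete, here is a minimal sketch of the row filtering and column masking a governance team might specify before data is released. The dataset, field names, and rules below are hypothetical illustrations, not Privitar's actual implementation.

```python
# Hypothetical customer records; fields and values are illustrative only.
customers = [
    {"customer_id": 101, "email": "a@example.com", "country": "DE", "lifetime_value": 1200},
    {"customer_id": 102, "email": "b@example.com", "country": "US", "lifetime_value": 450},
    {"customer_id": 103, "email": "c@example.com", "country": "DE", "lifetime_value": 980},
]

def provision(rows):
    """Apply the project's approval conditions before releasing data."""
    released = []
    for row in rows:
        # Row filtering: suppose this project is only approved for EU records.
        if row["country"] != "DE":
            continue
        # Column masking: redact direct identifiers before release.
        released.append({
            "email": "***@***",                       # masked identifier
            "country": row["country"],
            "lifetime_value": row["lifetime_value"],
            # customer_id is dropped entirely
        })
    return released

for row in provision(customers):
    print(row)
```

In the scenario above, rules like these are rediscovered, re-debated, and hand-coded for every project; the point of automating the last mile is that they are defined once and applied consistently.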

It’s a painfully slow and tedious process, only for the user to find out the dataset is useless and the convoluted process has to start again over email, phone, support tickets, Slack messages, and Zoom calls that are impossible to audit.

Many data-driven initiatives falter in the last mile.

I recently talked to the CDO of a Fortune 500 services company, an industry veteran who has seen two decades of progress in technology and processes but still sees the same problems around data access and getting value from data. We are dealing with the issues we faced at the start of our careers, when “data warehouse” was an exciting new concept.

Most organizations I talk to have made savvy and appropriate investments:

  • They’ve deployed the systems to collect, store, manage, aggregate, and analyze their data.
  • They’ve discovered, classified, curated, and cataloged data from every source.
  • They’ve designed self-service and data shopping user experiences for their business users and analysts.

Yet projects still stall, and some never reach completion.

The time it takes to get approved and protected data in front of users is putting data teams’ targets at risk and damaging their chance of success.

Good decisions should be highly prized, but the manual steps to reach them are crippling timely access to data. 

Approving access and applying protections case-by-case is where many organizations go off track:

  • Complex, manual, and inconsistent approval processes delay access to data
  • Legacy governance controls restrict access for many users
  • Modern governance solutions chart how data should be used, but don’t make sensitive assets safe for use

Regulatory oversight, compliance mandates, consumer attitudes, corporate policy, and risk appetite are all factors that affect how data will have to be handled in the last mile. Decision makers in groups like governance, legal, and compliance often seem at odds with each other.

For some new projects, it takes so long to access data that the results are palpably stale. Data teams are losing their appetite to ask in the first place, which only pushes them down riskier routes.

If the last mile is the toughest, turn that to your advantage. 

Successful business leaders are often those who solve common challenges while contenders fall by the wayside.

Organizations that want data to reach users at the right time should adopt capabilities that streamline and automate their governance measures:

  • A single solution where all stakeholders can collaborate. 
  • Consistent, repeatable approval processes managed in technology.
  • Policies that enforce automatically across the modern data stack.
  • A scalable policy solution that uses contextual attributes like location of the data, location of the user, use case and purpose, regulations, and more.
  • Integration with data catalogs and other metadata systems, leveraging your governance metadata to ensure controls are fit for every context.
  • A comprehensive range of controls to protect data, including access control, privacy-enhancing data transformations, digital watermarks and lineage features, and audit logs.
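Policies driven by contextual attributes can be pictured as a simple decision function. The sketch below is a hypothetical attribute-based check, not Privitar's API; the attribute names and rules are assumptions chosen to illustrate the idea of grant, transform, or deny decisions.

```python
from dataclasses import dataclass

@dataclass
class Context:
    data_location: str   # where the data resides
    user_location: str   # where the requester sits
    purpose: str         # declared use case for the project
    regulations: tuple   # regulations that apply to the asset

def decide(ctx: Context) -> str:
    """Return an illustrative policy decision for a data access request."""
    # Example rule: deny cross-border access to GDPR-regulated data.
    if "GDPR" in ctx.regulations and ctx.user_location != ctx.data_location:
        return "deny"
    # Example rule: marketing use of regulated data gets de-identified output.
    if ctx.regulations and ctx.purpose == "marketing":
        return "grant_with_masking"
    return "grant"

print(decide(Context("EU", "US", "analytics", ("GDPR",))))   # deny
print(decide(Context("EU", "EU", "marketing", ("GDPR",))))   # grant_with_masking
```

Because the decision is a pure function of context, the same policy evaluates identically wherever it is enforced, which is what makes the approval step repeatable rather than a fresh negotiation each time.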

At Privitar, we heard about last mile problems from our customers and built a platform to give them frictionless access to data without compromising on compliance or customer trust. 

We streamlined the workflows to approve and protect access to sensitive data, backed by enforceable policies that grant access to data or transform data based on conditions and contextual attributes. Policies created in one place can be enforced everywhere in an organization. 

Privitar’s Data Security Platform streamlines approval processes and automates data protection. We already solve the last mile problem in production for global enterprise customers.

The business user I introduced earlier (a data consumer) gains an intuitive, self-service data shopping interface where they can find the data they want, add it to a project describing their use case and purpose, and request access. The request is routed to a platform user in the governance team (a data guardian), who can quickly review the project and approve it once they know the data and conditions are covered by data protection policies. Because the organization’s lines of business (data owners) have already connected their datasets to the platform, the policies apply automatically in the data stack and the business user gets the data right away.
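The consumer/guardian/owner flow described above can be sketched as a small request-routing loop. Role names mirror the article; the functions and the policy check are hypothetical illustrations, not Privitar's actual interface.

```python
requests = []

def request_access(consumer, dataset, purpose):
    """A data consumer files an access request with its use case."""
    req = {"consumer": consumer, "dataset": dataset,
           "purpose": purpose, "status": "pending"}
    requests.append(req)
    return req

def policy_covers(req):
    # Assumption for illustration: a policy already exists
    # covering analytics use of connected datasets.
    return req["purpose"] == "analytics"

def review(req, guardian):
    """A data guardian approves when existing policies cover the conditions."""
    req["status"] = "approved" if policy_covers(req) else "needs_review"
    req["reviewed_by"] = guardian
    return req

req = request_access("business_user", "customer_orders", "analytics")
review(req, "governance_team")
print(req["status"])  # approved
```

The key design point is that the guardian reviews a project against policies rather than negotiating each request from scratch, so the happy path requires no engineering work at all.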

Privitar solves data’s last mile problem, making data that’s secure, approved, and appropriate for the use case available to the tools and systems where users turn data into value. 

I’m pleased to share two new resources that will help you explore the solution:

  • Our technical article looks under the hood to explain how the streamlined last mile process works.
  • Our strategic primer proposes how to maximize the value of your data catalog investment by connecting governance to enforcement across the modern data stack with the Data Security Platform.