Thursday, March 27, 2008

Mergers and Acquisitions: Data's Influence on Company Value

Caveat Emptor! Many large companies have a growth strategy that includes mergers and acquisitions, but many are missing a key negotiating strategy during the buying process.

If you’re a big company, buying other companies in your market brings new customers into your fold. So, rather than paying for a marketing and advertising campaign to win new customers, you can buy them as part of an acquisition. Because of this, most venture capitalists and business leaders know that two huge factors in determining a company’s value during an acquisition are the customer and prospect lists.

Having said that, it’s strange how little this is examined in the buy-out process. Before they buy, companies look at certain assets under a microscope: tangible assets like buildings and inventory are scrutinized. Human assets, like the management staff, are given a strong look. Cash flow is audited and examined with due diligence. But data assets are often given only a hasty, passing glance.

Data assets quickly dissolve when the company being acquired has data quality issues. It’s not uncommon for a company to have 20%, 40%, or even 50% customer duplication (or near duplication) in its database, for example. So, if you think you’re getting 100,000 new customers, you may actually be getting 50,000 after you cleanse. It’s also common for actual inventory levels in the physical warehouse to be misaligned with the inventory levels in the ERP systems. This too may be due to data quality issues, and it can lead to surprises after the acquisition.
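To make that concrete, here is a minimal sketch of how you might estimate a duplication rate in a customer list. The sample names and the similarity threshold are hypothetical, and a real matching engine would use far more sophisticated techniques (standardization, phonetic matching, address parsing):

```python
# A rough duplication estimate using simple string similarity.
# The sample data and the 0.7 threshold are illustrative only.
from difflib import SequenceMatcher

customers = [
    "Acme Corp", "ACME Corporation", "Globex Inc",
    "Globex, Inc.", "Initech", "Initech LLC",
]

def is_near_duplicate(a: str, b: str, threshold: float = 0.7) -> bool:
    """Flag two records as likely duplicates when their lowercased
    names are sufficiently similar."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Count records that match an earlier record in the list.
dupes = sum(
    any(is_near_duplicate(customers[i], customers[j]) for j in range(i))
    for i in range(len(customers))
)
print(f"~{dupes / len(customers):.0%} of records look like near duplicates")
```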

So what can you do as an acquiring company to mitigate these risks? The key is due diligence on data. Ask to profile the data of the company you’re going to buy. Bring in your team, or hire a third party to examine the data. Look at the customer data, the inventory data, the supply chain data, or whatever data is a valuable asset in the acquisition. If privacy and security are an issue, the results of the profiling can usually be rolled up into charts and graphs that’ll give you a picture of the state of the organization’s information.

In my work with Trillium Software, I have talked to customers who have saved millions in acquisition costs by evaluating the data prior to buying a company. Some have gone so far as to evaluate the overlap between their own customer base and the new customer base to determine value. Why pay for a customer when (s)he is already on your customer list?
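As a rough illustration of that overlap analysis, here is a minimal sketch that joins two customer lists on a normalized e-mail address. The sample data and the matching key are assumptions for illustration; a real overlap study would match on many more attributes:

```python
# Estimate how many "new" customers are already yours.
# Matching on e-mail alone is a simplification for illustration.
def normalize(email: str) -> str:
    return email.strip().lower()

our_customers = {normalize(e) for e in ["Pat@example.com", "lee@example.org"]}
their_customers = {normalize(e) for e in ["pat@example.com ", "kim@example.net"]}

overlap = our_customers & their_customers
print(f"{len(overlap)} of {len(their_customers)} acquired customers are "
      f"already ours ({len(overlap) / len(their_customers):.0%})")
```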

Profiling lets you set up business rules that are important to your company. Does each record have a valid tax ID number? What percentage of the database’s contact information is null? How many bogus e-mails appear? Does the data make sense, or are there a lot of near duplicates and misfielded data? In inventory data, how structured or unstructured is the data? All of these questions can quickly be answered with data profiling technology, and all of these technical issues can be translated into business value, and therefore negotiating value, for your company.
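Here is a minimal sketch of what a few of those rules might look like in code. The field names, the EIN-style tax ID pattern, and the sample records are hypothetical; a commercial profiling tool offers far richer, configurable checks:

```python
# Toy profiling rules over hypothetical customer records.
import re

records = [
    {"tax_id": "12-3456789", "email": "pat@example.com"},
    {"tax_id": "", "email": "asdf@asdf"},
    {"tax_id": "12-3456789", "email": None},
]

TAX_ID = re.compile(r"^\d{2}-\d{7}$")              # hypothetical U.S. EIN-style rule
EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # crude e-mail sanity check

total = len(records)
valid_tax = sum(bool(r["tax_id"]) and bool(TAX_ID.match(r["tax_id"])) for r in records)
null_email = sum(r["email"] is None for r in records)
bogus_email = sum(bool(r["email"]) and not EMAIL.match(r["email"]) for r in records)

print(f"Valid tax IDs: {valid_tax / total:.0%}")   # 67%
print(f"Null e-mails:  {null_email / total:.0%}")  # 33%
print(f"Bogus e-mails: {bogus_email / total:.0%}") # 33%
```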

The data governance teams I have met that have done this due diligence for their companies have become real superstars, and they are very much a strategic part of their corporations. It’s easy for a CEO to see the value you bring when you can prove that the company is paying the right price for an acquisition.

Sunday, March 16, 2008

Data Governance in a Recession

What effect will a recession have on your data governance projects? Some have predicted that the nation will fall into a recession in 2008; others disagree. In other words, our economic fate depends on whom you believe. Still, with even the hint of a pending recession, companies often move to cut costs, and these cuts tend to affect major IT initiatives like data governance.

For those of us in the IT and enterprise software business, CFO thinking runs counter to logic. During revenue-generating high times, IT tends to spend money to deliver automation that cuts costs and/or improves productivity, so the money spent delivers something back. However, during tougher economic times, or even when those times are presumed to be around the corner, cost cutting is at the forefront, preventing us from fixing the inefficiencies. When revenues are good, efficiencies can be improved through IT. When revenues are bad, efficiencies are thrown out the window.

Talk of a recession may slide your plans for big projects like master data management and data governance onto the back burner. Instead, you may be asked to be more tactical, solving problems at a project level rather than an enterprise level. Instead of setting strategy, you may be asked to migrate a database, cleanse a list for a customer mailing, and so on, without devoting resources to a clear corporate strategy.

The good news is that times will get better. If and when there is a recession, we most certainly DON’T want to have to rewire and re-do our efforts later on. If you are asked to become more tactical, there are some things to keep in mind that’ll save you strategic frustration:

  • Convince management that data governance will save money, despite the resources needed up-front. Any vendor worth their salt has case studies showing the return on investment and can help you make the case if you bring them into the process early.
  • If you have to stay tactical, make sure the tools and tactics you choose on the project-based initiatives have a life in the strategic initiative. In other words, don’t cut costs on technology that won’t scale. Don’t choose tools that have limitations like lack of global support, poor connectivity, or limited performance if you’ll need those things later. Choosing these tools may hurt you when you want to go enterprise-wide; they’ll get into your plumbing and will be hard to replace. They’ll also get into the minds of your people, potentially requiring training and retraining. Even in tough economic times, you’re setting the standard when you select tools. Don’t let it come back to haunt you when times are good.
  • Make sure you understand all the pieces you need to buy early in the process. Many enterprise vendors require a LOT of different packages to do real-time data quality, for example. Hidden costs can be particularly problematic.
  • Make sure you understand all of the work involved, both in your project and in an enterprise implementation. There are big differences in the effort needed to get things done. Take the services effort into account during scoping.
  • If cutbacks are severe but the need is still great, consider software leasing and SaaS (Software as a Service) to minimize costs. Many vendors now offer their enterprise software as a hosted service. If times are tough, work with the vendor on alternative ways to purchase.

On another note, I want to thank Beth from the ‘Confessions of a Database Geek’ blog for the mention of my blog this week. If you’re a blogger, you know that every mention by other bloggers gives you street cred, and I am most appreciative of that. It's great to be mentioned by one of the best. Thanks Beth!


Monday, March 10, 2008

Approaching IT Projects with Data Quality in Mind


I co-authored a white paper at the end of 2006 with a simple goal: to talk directly to project managers about the process they go through when putting together a data-intensive project. By “data-intensive” project, I mean projects dealing with mergers and acquisitions data, CRM, ERP consolidation, master data management, and any project where you have to move large volumes of data.

Project teams can be so focused on application features and functions that they sometimes miss the most important part. In the case of a merger, project teams must often deal with unknown data coming in from the merger, which may require profiling as part of the project plan. In the case of a CRM system, companies are trying to consolidate whatever ad hoc systems are in place, along with data from people who may care very little about data quality. In the case of master data management and data governance, the thought of sharing data across the enterprise brings to mind the need for a corporate data standard. Data-intensive projects may have different specific needs, but simply remembering to consider data in your project will get you far.

To achieve real success, companies need to plan a way to manage data as part of the project steps. If you don’t think about the data during project preparation, blueprinting, implementation, rollout preparation, go-live, and maintenance, your project is vulnerable to failure. Most commonly, delays and failures are due to a late-project realization that the data has problems. Knowing the data challenges you face early in the process is the key to success.

This white paper discusses the importance of involving business users in the project, and the best ways to do so, to ensure their needs are met. It covers ways to stay in scope on the project while considering the big picture and the ongoing concern of data quality within your organization. Finally, it covers how to incorporate technology throughout a project to expedite data quality initiatives. The white paper is still available for download. Click here and see "Data Quality Essentials: For Any Data-Intensive Project".


Saturday, March 1, 2008

Taking Data Governance from Theory to Practical Application

There is a lot of theoretical hype about data governance in the data management world, some valuable, and some not so valuable.

I personally have a hard time with any article that tries to cut data governance down to the “five most important things”. To me, it’s akin to saying: here are the five most important things to remember when disassembling and reassembling a Boeing 757. You just can’t distill it that far and expect anything useful. Usually, this is the type of white paper produced by a marketing department run amok, useful only as a preface in the book of data governance.

Instead of trying to distill it, we need to become students of data governance and then shape that knowledge for our own companies. It’s safe to say that the list of the five, ten, or twenty-five most important things to watch will differ across industries, across companies, across the globe, and across time. That’s why I have plenty to write about in this blog, and plenty to talk about in my webinars.

However, I wanted to share with you another chapter in the book of data governance that speaks to practical application. My colleague Len Dubois had a chance to interview Nigel Turner and Dave Evans from BT Design. I’ve mentioned Nigel and Dave before on my blog. They have a phenomenal story of starting small and building increasing ROI over time. They’ve calculated huge gains in efficiency, to the tune of $1 billion. I also credit them with the very clever Do Nothing Option.

If you’d like to hear this three-part podcast series, as told by these pioneers of data governance from Wales, please follow the link. The podcast covers the information quality challenges tackled, software selection, and lessons learned.

Disclaimer: The opinions expressed here are my own and don't necessarily reflect the opinion of my employer. The material written here is copyright (c) 2010 by Steve Sarsfield. To request permission to reuse, please e-mail me.