Thursday, June 25, 2009

Evil Dictators: You Can’t Rule the World without Data Governance

Buried in the lyrics of one of my favorite heavy metal songs are these beautiful words:

Now, what do you own the world? How do you own disorder, disorder? – System of a Down, Toxicity


System of a Down’s screamingly poetic lyrics remind us of an important lesson we can take into business. After all, it is the goal of many companies to “own their world”. If you’re Coke, you want to dominate Pepsi. If you’re McDonald’s, you want to crush Burger King. Yet to own competitive markets, you have to run your business with the utmost efficiency. Without data governance, or at least enterprise data quality initiatives, you won’t have that efficiency.

Your quest for world domination will be in jeopardy in many ways without data governance. If your evil world domination plan is to buy up companies, poor data quality and lack of continuity will prevent you from creating a unified environment after the merger. On the day of a merger, you may be asked to produce one list of products, one list of customers, one list of employees, and one accurate financial report. Where will that data come from if it isn’t clean across your company? How will the data get clean without data governance?

Data governance brings order to the business units. With order comes the ability to own the information of your business. The ownership brings the ability to make effective and timely decisions. In large companies, whose business units may be warring against each other for sales and control of the information, it’s impossible to own the chaos. It’s difficult to make good decisions and bring order to your people. If you want to own your market, you must have order.

Those companies succeeding in this data-centric world are treating their data assets just as they would treat cold, hard cash. With data governance, companies strive to protect their vast ecosystem of data like it is a monetary system. It can't be the data center's problem alone; it has to be everyone's responsibility throughout the entire company.

Data governance is the choice of CEOs and benevolent dictators, too. The choice about data governance is one about hearing the voices of your people. It’s only when you harmonize the voices of technologists, executives and business teams that you can produce a beautiful song; one that can bring your company teamwork, strategic direction and profit. When you choose data governance, you choose order, communication and hope for your world.

So, megalomaniacs, benevolent dictators and CEOs, pay heed. You can’t own the world without data governance.

Friday, June 19, 2009

Get your Submissions in for the June Blog Carnival for Information/Data Quality Bloggers

I’m pleased to be hosting the June edition of "El Festival del IDQ Bloggers – A Blog Carnival for Information/Data Quality Bloggers". If you are a data quality blogger, please feel free to submit your best blog entries today.

This blog carnival is simply a collection of posts from different data quality blogs. Anyone can submit a data quality blog post and get the benefits of extra traffic, networking with other bloggers and discovering interesting posts. The only requirement is that the submitted post has a data quality theme.

This will be the JUNE issue of the carnival, so your submissions must have been posted in June. To qualify, you should e-mail your submission to: blogcarnival@iaidq.org – your email should include:
• URL of the blog post being submitted
• Brief description of the blog (not the post, the blog)
• Brief description of the author
• Optional – URL of an author profile (e.g. LinkedIn, Twitter)

Not all entries will make it into the issue, but don’t be discouraged. Keep submitting to future issues and we’ll get you next month.
For more information, see the IAIDQ web page.

Friday, June 12, 2009

Interview on Data Quality Pro.com

From Data Quality Pro.com

If you are active within the data quality and data governance community then chances are you will have come across Steve Sarsfield and his Data Governance and Data Quality Insider blog.
Steve has also recently published an excellent book, aptly titled "The Data Governance Imperative" so we recently caught up with him to find out more about some of the topics in the book and to pose some of the many questions organisations face when launching data governance initiatives.


Read the interview>>


Plus, at the end of the interview we provide details of how to win a copy of "The Data Governance Imperative".


Tuesday, June 9, 2009

MIT's Information Quality Industry Symposium

This year, I am honored to be part of MIT's Information Quality Industry Symposium in Cambridge, MA. In past years I have attended this conference and have been pleased with the quality of the speakers and how informed the industry is getting about data quality. This year, my company is sponsoring the event and I will be co-presenting with my colleague Nelson Ruiz.

The speaker's list is impressive! Some of the featured speakers include very experienced practitioners like Larry English, Bill Inmon, Danette McGilvray and Gwen Thomas. Attendees will be sure to gain some insight on information quality with such a full line-up of experts.

In true MIT form, this forum has a lot of theoretical content in addition to the practical sessions. It is one of the more academic venues for researching data quality, and therefore less commercial. The presentations are interesting in that they often give you another perspective on the problem of data quality. Some of them are clearly cutting edge.

My session, entitled "Using Data Quality Scorecards to Sell IQ Value", will be more practical. When it comes to convincing your boss to invest in DQ, how can you create metrics that will ignite their imagination? How do you get the funding... and how do you take information quality enterprise-wide?
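To make the scorecard idea concrete (this sketch is my own illustration, not material from the session), a basic data quality scorecard often boils down to per-field metrics such as completeness and validity. Here is a minimal Python sketch, with hypothetical validation rules and sample customer records:

```python
# Hypothetical data quality scorecard: for each field, report
# completeness (% non-blank) and validity (% of filled values
# matching an expected pattern). Rules and records are made up.
import re

RULES = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "zip":   re.compile(r"^\d{5}$"),
}

def scorecard(records):
    """Return {field: (completeness %, validity %)} for a list of dicts."""
    fields = {f for r in records for f in r}
    scores = {}
    for f in sorted(fields):
        values = [r.get(f) or "" for r in records]
        filled = [v for v in values if v.strip()]
        completeness = 100.0 * len(filled) / len(values)
        rule = RULES.get(f)
        if rule and filled:
            valid = sum(1 for v in filled if rule.match(v))
            validity = 100.0 * valid / len(filled)
        else:
            validity = 100.0  # no rule defined; treat as valid
        scores[f] = (round(completeness, 1), round(validity, 1))
    return scores

customers = [
    {"email": "pat@example.com", "zip": "02139"},
    {"email": "bad-address",     "zip": "0213"},
    {"email": "",                "zip": "90210"},
]
print(scorecard(customers))  # → {'email': (66.7, 50.0), 'zip': (100.0, 66.7)}
```

Run monthly, numbers like these trend over time, which is exactly the kind of metric that can sell IQ value to an executive.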

If you have some travel budget open, please come to Boston this summer and check out this small and friendly event. As a reader of this blog, feel free to use the Harte-Hanks Trillium Software $100 discount pass when registering.

Wednesday, June 3, 2009

Informatica Acquires AddressDoctor

Global Data is Hard to Do

Yesterday, Informatica announced their intent to acquire AddressDoctor. This acquisition is all about being able to handle global data quality in today’s market, but it has a surprising potential twist. Data quality vendors have been striving for a better global solution because so many of the large data quality projects contain global data. If your solution doesn’t handle global data, it often just won’t make the cut.

The interesting twist here is that both IBM and Dataflux leverage AddressDoctor for their handling of global address data, as do several smaller vendors, including MelissaData, QAS, and Datanomic. Trillium Software technology is not impacted by this acquisition; they have been building in-house technology for years to support the parsing of global data and have leveraged their parent company’s acquisition of Global Address to beef up the geocoding capability of the Trillium Software System.

Informatica has dealt the competition a strong blow here. Where will these vendors go for their global data quality? In the months to come, there will be challenges to face. Informatica, still busy integrating the disparate parts of Evoke, Similarity and Identity Systems, will now have to integrate AddressDoctor as well. Other vendors like IBM, Dataflux, MelissaData, QAS and Datanomic may now have to figure out what to do for global data if Informatica decides not to renew partner agreements.

For more analysis on this topic, you can read Rob Karel's blog, where the Forrester analyst argues that the move is meant to limit the choices among MDM platforms.

To be on the safe side, I’d like to restate that the opinions in this blog are my own. Even though I work for Harte-Hanks Trillium Software, my comments are my independent thoughts and not necessarily those of my employer.

Thursday, May 21, 2009

Guiding Call Center Workers to Data Quality

Data Governance and data quality are often the domain of data quality vendors, but any technology that can help your quest to achieve better data is worth exploring. Rather than fixing up data after it has been corrupted, it’s a good idea to use preventative technologies to stop poor data quality in the first place.

I recently met with some folks from Panviva Software to talk about how the company’s technologies do just that. Panviva is considered the leader in Business Process Guidance, an emerging set of technologies that could help your company improve data quality and lower training costs in your call centers.

The technology is powerful, particularly where the call center environment is complex, with multiple environments mixed together. IT departments in the banking, insurance, telecommunications and high-tech industries have been particularly rattled by mergers and acquisitions. Call center workers at those companies must be trained on where to navigate and which application to use to accomplish a customer service process. On top of that, processes may change often due to changes in regulation, changes in corporate policy, or the next corporate merger.

To use a metaphor, business process guidance is a GPS for your complicated call center apps.

If you think about it, the way we drive our cars has really improved over the years because of the GPS. We no longer need to buy a current road map at Texaco and follow it as far as it’ll take us. Instead, GPS technology knows where we are and what construction and traffic issues we may face; we simply tell it where we want to go. Business Process Guidance provides that same paradigm improvement for enterprise applications. Rather than forcing unabridged training manuals on your Customer Service Representatives (CSRs), business process guidance provides a GPS-like function that sits on top of those systems, offering context-sensitive information on where you need to go. When a customer calls into the call center, the technology combines the context of the CSR’s screens with knowledge of the company’s business processes to guide the CSR to much faster call times and lower error rates.

A case study at BT shows Panviva technology reducing the error rate in BT's order entry system from 30% down to 6%, an amazing 80% reduction. That’s powerful technology on the front end of your data stream.
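As a quick sanity check on the arithmetic (my own illustration, not part of the case study), the 80% figure is the relative reduction in the error rate, not the difference in percentage points:

```python
# Relative reduction in error rate: (before - after) / before
before, after = 0.30, 0.06   # order entry error rates, before and after
reduction = (before - after) / before
print(f"{reduction:.0%}")    # → 80%
```

Going from 30% to 6% is a drop of 24 percentage points, and 24/30 of the original errors eliminated is the 80% cited.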

Sunday, May 10, 2009

Data Governance – the Movie


To really drive home the challenge of data governance in your company, you have to believe that it’s a movie, not a photo. A snapshot is taken once and done, but that’s not what happens when you embark on a data governance initiative.

In a movie, you start with a hero – that’s you the data governance champion. You have a good heart and want to fight for justice in the cruel data management world.

Next, there needs to be conflict, a dark cloud that overshadows our hero. In most cases, the conflict goes back to the beginning, when your company was just starting out. Back then, your first customers may have come from your local area, but slowly the circle began to grow: first locally, then regionally, then nationwide, then worldwide. As new offices opened and new systems were born, the silos formed. The hero warned the company that they needed a data management strategy, but no one listened. Almost no small or medium-sized company thinks about data management when it’s growing up, despite the best efforts of our heroes.

When it comes time to fix it all, you can’t think of it as taking a snapshot of the data and fixing it up with Photoshop. The hero must embark on a long journey of battle and self-sacrifice to defeat evil. Corporate change, like rapid growth, mergers, downsizing, and new laws governing the corporation, happens frequently in business. The battle for corporate data management requires small steps to mature the corporation into a better way of doing business. It’s Neo from The Matrix fighting Agent Smith and evolving into ‘the One’. It’s John McClane slowly taking out the bad guys in Nakatomi Plaza.

What’s missing in many people’s minds when it comes to data governance is that concept of time. It took a long time to mess up the data in your big corporation, and it takes time to reverse it. When you select your tools, your people and your processes for data governance, you always want to keep that enterprise vision in mind. The vision has a timeline, throughout which the data champion will have unexpected issues thrown at them. It’s not about the free data cleansing software that you get with your enterprise application; that stuff won’t hold up once you take it outside its native environment. It’s about making sure the process, the team, and the tools stand up over time, across projects, across business units and across data types. There are fewer and fewer vendors standing who can offer that kind of enterprise vision.

Disclaimer: The opinions expressed here are my own and don't necessarily reflect the opinion of my employer. The material written here is copyright (c) 2010 by Steve Sarsfield. To request permission to reuse, please e-mail me.