Tuesday, June 9, 2009
MIT's Information Quality Industry Symposium
This year, I am honored to be part of MIT's Information Quality Industry Symposium in Cambridge, MA. In past years, I have attended this conference and have been pleased with the quality of the speakers and with how informed the industry is getting about data quality. This year, my company is sponsoring the event, and I will be co-presenting with my colleague Nelson Ruiz.
The speaker list is impressive! Featured speakers include very experienced practitioners like Larry English, Bill Inmon, Danette McGilvray and Gwen Thomas. Attendees are sure to gain some insight into information quality with such a full line-up of experts.
In true MIT form, this forum has a lot of theoretical content in addition to the practical sessions. It is one of the more academic venues for researching data quality, and therefore one of the less commercial. The presentations are interesting in that they often give you another perspective on the problem of data quality. Some of them are clearly cutting edge.
My session, entitled Using Data Quality Scorecards to Sell IQ Value, will be more practical. When it comes to convincing your boss that you need to invest in DQ, how can you create metrics that will ignite their imagination? How do you get the funding, and how do you take information quality enterprise-wide?
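To give a taste of the kind of metric I mean, here is a minimal sketch in Python of how a simple scorecard figure – the percentage of records passing a handful of business rules, rolled up into an overall score – might be computed. The records, rules and field names are made up purely for illustration, not taken from any real project.

# Minimal scorecard sketch: percent of records passing each rule, then an overall score.
# The records and rules below are purely illustrative.
records = [
    {"email": "jane@example.com", "postal_code": "02139"},
    {"email": "not-an-email", "postal_code": ""},
]

rules = {
    "email contains @": lambda r: "@" in r.get("email", ""),
    "postal_code populated": lambda r: bool(r.get("postal_code")),
}

scores = {name: sum(rule(r) for r in records) / len(records) for name, rule in rules.items()}
for name, score in scores.items():
    print(f"{name}: {score:.0%} of records pass")
print(f"Overall score: {sum(scores.values()) / len(scores):.0%}")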
If you have some travel budget open, please come to Boston this summer and check out this small and friendly event. As a reader of this blog, feel free to use the Harte-Hanks Trillium Software $100 discount pass when registering.
Wednesday, June 3, 2009
Informatica Acquires AddressDoctor
Global Data is Hard to Do
Yesterday, Informatica announced their intent to acquire AddressDoctor. This acquisition is all about being able to handle global data quality in today’s market, but it has a surprising potential twist. Data quality vendors have been striving for a better global solution because so many of the large data quality projects contain global data. If your solution doesn’t handle global data, it often just won’t make the cut.
The interesting twist here is that both IBM and Dataflux leverage AddressDoctor for their handling of global address data, and several other smaller vendors do as well – MelissaData, QAS, and Datanomic. Trillium Software technology is not impacted by this acquisition. They have been building in-house technology for years to support the parsing of global data and have leveraged their parent company's acquisition of Global Address to beef up the geocoding capability of the Trillium Software System.
Informatica has dealt the competition a strong blow here. Where will these vendors go to get their global data quality? In the months to come, there will be challenges to face. Informatica, still busy integrating the disparate parts of Evoke, Similarity and Identity Systems, will now have to integrate AddressDoctor. Other vendors like IBM, Dataflux, MelissaData, QAS and Datanomic may now have to figure out what to do for global data if Informatica decides not to renew partner agreements.
For more analysis on this topic, you can read Rob Karel's blog, where the Forrester analyst explains why he thinks the move is aimed at limiting choices on MDM platforms.
To be on the safe side, I'd like to restate that the opinions in this blog are my own. Even though I work for Harte-Hanks Trillium Software, my comments are my independent thoughts and not necessarily those of my employer.
Thursday, May 21, 2009
Guiding Call Center Workers to Data Quality
Data governance and data quality are often seen as the domain of data quality vendors, but any technology that can help your quest for better data is worth exploring. Rather than fixing up data after it has been corrupted, it's a good idea to use preventative technologies that stop poor data quality from occurring in the first place.
I recently met with some folks from Panviva Software to talk about how the company's technologies do just that. Panviva is considered the leader in Business Process Guidance, an emerging set of technologies that could help your company improve data quality and lower training costs in your call centers.
The technology is powerful, particularly in situations where the call center environment is complex, with multiple systems mixed together. IT departments in the banking, insurance, telecommunications and high-tech industries have been particularly rattled by mergers and acquisitions. Call center workers at those companies must be trained on where to navigate and which application to use to get a customer service process accomplished. On top of that, processes may change often due to changes in regulation, corporate policy, or the next corporate merger.
To use a metaphor, business process guidance is a GPS for your complicated call center apps.
If you think about it, the way we drive our cars has really improved over the years because of the GPS. We no longer need to buy a current road map at Texaco and follow it as far as it'll take us. Instead, GPS technology knows where we are and what construction and traffic issues we may face – we simply need to tell it where we want to go. Business Process Guidance provides that same paradigm improvement for enterprise applications. Rather than forcing training on your Customer Service Representatives (CSRs), complete with unabridged training manuals for every system, business process guidance provides a GPS-like function that sits on top of those systems and offers context-sensitive information on where you need to go. When a customer calls into the call center, the technology combines the context of the CSR's screens with knowledge of the company's business processes to guide the CSR to much faster call times and lower error rates.
In a case study at BT, Panviva technology reduced the error rate in BT's order entry system from 30% down to 6%, an amazing 80% reduction. That's powerful technology on the front-end of your data stream.
Sunday, May 10, 2009
Data Governance – the Movie

To really drive home the challenge of data governance in your company, you have to believe that it’s a movie, not a photo. A snapshot is taken once and done, but that’s not what happens when you embark on a data governance initiative.
In a movie, you start with a hero – that’s you the data governance champion. You have a good heart and want to fight for justice in the cruel data management world.
Next, there needs to be conflict, a dark cloud that overshadows our hero. In most cases, the conflict goes back to the beginning, when your company was just starting out. Back then, your first customers may have come from your local area, but slowly the circle began to grow – first locally, then regionally, then nationwide, then worldwide. As new offices opened and new systems were born, the silos formed. The hero warned the company that it needed a data management strategy, but no one listened. Almost no small or medium-sized company thinks about data management when it's growing up, despite the best efforts of our heroes.
When it comes time to fix it all, you can't think of it as taking a snapshot of the data and fixing it up with Photoshop. The hero must embark on a long journey of battle and self-sacrifice to defeat evil. Corporate change – rapid growth, mergers, downsizing, and new laws governing the corporation – happens frequently in business. The battle for corporate data management requires small steps to mature the corporation into a better way of doing business. It's Neo from The Matrix fighting Agent Smith and evolving into "the One". It's John McClane slowly taking out the bad guys in Nakatomi Plaza.
What I see missing in many people's minds when it comes to data governance is that concept of time. It took a long time to mess up the data in your big corporation, and it takes time to reverse it. When you select your tools, your people and your processes for data governance, you always want to keep that enterprise vision in mind. The vision has a timeline, throughout which the data champion will have unexpected issues thrown at them. It's not about the free data cleansing software that you get with your enterprise application – that stuff won't hold up once you try to use it outside its native environment. It's about making sure the process, the team, and the tools stand up over time, across projects, across business units and across data types. There are fewer and fewer vendors standing who can offer that kind of enterprise vision.
Monday, May 4, 2009
Don’t Sweat the Small Stuff, Except in Data Quality
April was a busy month. I was the project manager on a new web application, nearly completed my first German web site (also as project manager) and released the book "The Data Governance Imperative". All this real work has taken me away from something I truly love – blogging.
I did want to share something that affected my project this month, however. Data issues can come in the smallest of places and can have a huge effect on your time line.
For the web project I completed this month, the goal was to replace a custom-coded application with a similar application built within a content management system. We had to migrate the login data of the application's users, all with various access levels, to the new system.
During go live, we were on a tight deadline to migrate the data, do final testing of the new application and seamlessly switch everyone over. That all had to happen on the weekend. No one would be the wiser come Monday morning. If you’ve ever done an enterprise application upgrade, you may have followed a similar plan.
We had done our profiling and knew that there were no data issues. However, when the migration actually took place, lo and behold – the old system allowed # as a character in the username and password while the new system didn't. It forced us to stop the migration and write a rule to handle the issue. Even with this simple issue, the time line came close to missing its Monday morning deadline.
Should we have spotted that issue? Yes, in hindsight we could have better understood the system restrictions on the username and password and set up a custom business rule in the data profiler to test it. We might have even forced the users to change the # before the switch while they were still using the old application.
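For what it's worth, a check like this is also easy to script outside the profiling tool. Below is a minimal sketch in Python, assuming the legacy users are available as a CSV export with username and password columns, and assuming # is the only character the target system rejects; the file name and column names are hypothetical, not from the actual project.

import csv

# Characters the (hypothetical) target system will not accept in credentials.
DISALLOWED = set("#")

def find_violations(path):
    """Return (username, field) pairs where a field contains a disallowed character."""
    violations = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for field in ("username", "password"):
                if DISALLOWED & set(row.get(field) or ""):
                    violations.append((row.get("username", ""), field))
    return violations

if __name__ == "__main__":
    for user, field in find_violations("legacy_users.csv"):
        print(f"{user}: disallowed character in {field}")

Running something like this against the source data before go live would have flagged the offending accounts while users could still change them in the old application.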
The experience reminds me that data quality is not just about making the data right; it's about making the data fit for the business purpose – fit for the target application. Data that is correct for one legacy application can be unfit for another. It also reminds me that you can plan and test all you want, but you have to be ready for hiccups during the go-live phase of the project. The tools, like profiling, are there to help you limit the damage. We were lucky in that this database was relatively small and a reload was relatively simple once we figured it all out. For bigger projects, more complete staging – a dry run before the go-live phase – would have been more effective.
Sunday, April 19, 2009
New Book - The Data Governance Imperative
My new book, entitled The Data Governance Imperative, is making its way to Amazon, Barnes and Noble, and other outlets this week. I'm very proud of this and happy to see it finally hit the streets. It took a lot of work and dedication to get it done.
I decided to write this book because of the recurring questions that arose during discussions about data governance. How do I get my boss to believe that data governance is important? How do I work with my colleagues to build better information and a better company? How do I break through the barriers to data governance maturity, like securing the money, resources and expertise to accomplish the task? When it comes to justifying the costs of data governance to their organization, building organizational processes, learning how to staff initiatives, understanding the role and importance of technologies, and dealing with corporate politics, there is little information available.
In my years working at Trillium Software, I have been exposed to many great projects in Fortune 1000 companies worldwide. Over the years, I've made note of the success factors that contribute to strong data governance. I've seen successful strategies for data governance and the common threads of success within and across industries.
I’ve written the Data Governance Imperative to help readers pioneer data governance initiatives, breaking through political barriers by shining a light on the benefits of corporate information quality. This book is designed to give data governance team members insight into the art of starting data governance. It could be helpful to:
- Data governance teams – those looking for direction/validation in starting a corporate data governance initiative.
- Business stakeholders – those working in marketing, sales, finance and other business roles who need to understand the goals and functions of a data governance team.
- C-level executives – those looking to learn about the benefits of data governance without having to read excessive technical jargon, or even those who need to be convinced that data governance is the right thing to do.
- IT executives – those who believe in the power of information quality but have faced challenges in convincing others in their corporation of its value.
Thursday, April 2, 2009
Next Week’s Can’t-Miss Webinars
Presenters can either make or break a webinar. Simply put, good webinars are given by people who are passionate and knowledgeable about their topic. For me to give up an hour of a busy day, I have to believe that it will impart some knowledge beyond product demos and brochure-ware. Looking ahead to next week, I see a couple of high points:
Data Governance: Strategies for Building Business Value
Date: Tuesday, April 14, 2009 at 11 a.m. Eastern
Trillium Software will host a Web seminar featuring guest speaker Rob Karel of Forrester Research presenting a discussion titled Data Governance: Strategies for Building Business Value. If you've never seen Rob Karel speak, I can tell you from experience that it's a real treat. I played emcee for a 2008 webinar with Rob on data governance. It was very well attended and very positively reviewed. At that time, the webinar concluded with a lot of great questions on selling the business case for data governance. In this session, Rob plans to tackle that topic a bit more, outlining the best practices and skills needed to obtain executive buy-in for data governance projects.
How to Boost Service, Cut Costs and Deliver Great Customer Experiences - Even in an Economic Downturn
Date: Thursday, April 16, 2009 at 11 a.m. Eastern
Teradata and the SmartData Collective will co-sponsor a webinar on dealing with a down economy. We've seen a couple of companies cover this topic, but the panel looks very strong. Judging from the panel and the description, this webinar looks to have a CRM focus – how technology can help you a) provide an experience that customers will love, and b) cut costs and help you differentiate your communications strategies from your competition. Curtis Rapp from Air2Web will be in on the discussion, so I'm guessing there will be some talk about Teradata Relationship Manager Mobile and using text messaging in your Teradata apps.
The panel of experts will include:
- Dave Schrader, Teradata - published author and long time Teradata employee
- Lisa Loftis, CRM and BI Expert - author on CRM topics
- Curtis Rapp, Air2Web – the partner responsible for some of Teradata’s mobile solution (CRM on your cell phone)
- Rebecca Bucnis, Teradata - another long-time and experienced Teradata employee

