Thursday, May 21, 2009

Guiding Call Center Workers to Data Quality

Data governance and data quality are often the domain of data quality vendors, but any technology that can help your quest for better data is worth exploring. Rather than fixing up data after it has been corrupted, it’s a good idea to use preventative technologies that stop poor data quality from entering your systems in the first place.

I recently met with some folks from Panviva Software to talk about how the company’s technologies do just that. Panviva is considered the leader in Business Process Guidance, an emerging set of technologies that could help your company improve data quality and lower training costs in your call centers.

The technology is powerful, particularly in situations where the call center environment is complex, with multiple applications and systems mixed together. IT departments in the banking, insurance, telecommunications and high-tech industries have been particularly rattled by mergers and acquisitions. Call center workers at those companies must be trained on where to navigate and which application to use to complete a customer service process. On top of that, processes may change often due to a change in regulation, a change in corporate policy, or the next corporate merger.

To use a metaphor, business process guidance is a GPS for your complicated call center apps.

If you think about it, the way we drive our cars has really improved over the years because of GPS. We no longer need to buy a current road map at Texaco and follow it as far as it’ll take us. Instead, GPS technology knows where we are and what construction and traffic issues we may face – we simply need to tell it where we want to go. Business Process Guidance provides that same paradigm improvement for enterprise applications. Rather than forcing your Customer Service Representatives (CSRs) through training with unabridged manuals, business process guidance provides a GPS-like layer that sits on top of those systems, offering context-sensitive information on where you need to go. When a customer calls into the call center, the technology combines the context of the CSR’s screens with knowledge of the company’s business processes to guide the CSR toward much faster call times and lower error rates.

A case study at BT shows Panviva technology reducing the error rate in BT's order entry system from 30% down to 6%, an amazing 80% reduction. That’s powerful technology on the front end of your data stream.

Sunday, May 10, 2009

Data Governance – the Movie


To really drive home the challenge of data governance in your company, you have to think of it as a movie, not a photo. A snapshot is taken once and done, but that’s not what happens when you embark on a data governance initiative.

In a movie, you start with a hero – that’s you, the data governance champion. You have a good heart and want to fight for justice in the cruel data management world.

Next, there needs to be conflict, a dark cloud that overshadows our hero. In most cases, the conflict goes back to the beginning, when your company was just starting out. Back then, your first customers may have come from your local area, but slowly the circle began to grow – first regionally, then nationwide, then worldwide. As new offices opened and new systems were born, the silos formed. The hero warned the company that it needed a data management strategy, but no one listened. Almost no small or medium-sized company thinks about data management while it’s growing up, despite the best efforts of our heroes.

When it comes time to fix it all, you can’t think of it as taking a snapshot of the data and fixing it up with Photoshop. The hero must embark on a long journey of battle and self-sacrifice to defeat evil. Corporate change, like rapid growth, mergers, downsizing, and new laws governing the corporation, happens frequently in business. The battle for corporate data management requires small steps to mature the corporation into a better way of doing business. It’s Neo from The Matrix fighting Agent Smith and evolving into “the One”. It’s John McClane slowly taking out the bad guys in Nakatomi Plaza.

What I see missing in many people’s minds when it comes to data governance is that concept of time. It took a long time to mess up the data in your big corporation, and it takes time to reverse it. When you select your tools, your people and your processes for data governance, you always want to keep that enterprise vision in mind. The vision has a timeline, throughout which the data champion will have unexpected issues thrown at them. It’s not about the free data cleansing software that you get with your enterprise application. That stuff won’t hold up once you try to use it outside its native environment. It’s about making sure the process, the team, and the tools stand up over time, across projects, across business units and across data types. There are fewer and fewer vendors standing who can offer that kind of enterprise vision.

Monday, May 4, 2009

Don’t Sweat the Small Stuff, Except in Data Quality

April was a busy month. I was the project manager on a new web application, nearly completed my first German web site (also as project manager) and released the book “The Data Governance Imperative”. All this real work has taken me away from something I truly love – blogging.

I did want to share something that affected my project this month, however. Data issues can crop up in the smallest of places and can have a huge effect on your timeline.

For the web project I completed this month, the goal was to replace a custom-coded application with a similar application built within a content management system. We had to migrate the application users’ login data, all with various access levels, to the new system.

During go live, we were on a tight deadline to migrate the data, do final testing of the new application and seamlessly switch everyone over. That all had to happen on the weekend. No one would be the wiser come Monday morning. If you’ve ever done an enterprise application upgrade, you may have followed a similar plan.

We had done our profiling and knew that there were no data issues. However, when the migration actually took place, lo and behold – the old system allowed # as a character in the username and password while the new system didn’t. It forced us to stop the migration and write a rule to handle the issue. Even with this simple issue, we came close to missing the Monday morning deadline.

Should we have spotted that issue? Yes, in hindsight we could have better understood the new system’s restrictions on the username and password and set up a custom business rule in the data profiler to test for them. We might even have forced the users to change the # before the switch, while they were still using the old application.
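In hindsight, the check could have been as simple as a small script run against an export of the legacy user table before the cutover. Here’s a minimal sketch of that kind of rule in Python; the file name, column names and the single disallowed character are assumptions for illustration, not details from the actual systems involved.

    # Sketch of a pre-migration profiling rule: flag credential values containing
    # characters the target application rejects. File and column names are hypothetical.
    import csv

    DISALLOWED = set("#")                     # characters the new system won't accept
    FIELDS_TO_CHECK = ("username", "password")

    def find_violations(path):
        """Return (user_id, field, bad_chars) for every value the new system would reject."""
        violations = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                for field in FIELDS_TO_CHECK:
                    bad = DISALLOWED.intersection(row.get(field, ""))
                    if bad:
                        violations.append((row.get("user_id"), field, "".join(sorted(bad))))
        return violations

    if __name__ == "__main__":
        for user_id, field, chars in find_violations("legacy_users.csv"):
            print(f"user {user_id}: character(s) {chars!r} in {field} not allowed by the new system")

Run against the legacy export during profiling, a report like this would have surfaced the # problem days before the weekend cutover instead of in the middle of it.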

The experience reminds me that data quality is not just about making the data right; it’s about making the data fit for business purpose – fit for the target application. Data that is correct for one legacy application can be unfit for another. It reminds me that you can plan and test all you want, but you have to be ready for hiccups during the go live phase of the project. The tools, like profiling, are there to help you limit the damage. We were lucky in that this database was relatively small and the reload was relatively simple once we figured it all out. For bigger projects, more complete staging, making a dry run before the go live phase, would have been more effective.

Disclaimer: The opinions expressed here are my own and don't necessarily reflect the opinion of my employer. The material written here is copyright (c) 2010 by Steve Sarsfield. To request permission to reuse, please e-mail me.