Thursday, February 21, 2008

SAP Data Management Success Stories

I’m preparing for a web seminar on SAP data management success, and I’m really starting to look forward to it.

Moen and Oki Data will be sharing their data quality success stories with our audience. These are two very successful implementations of the Trillium Software System in the SAP environment.

My Trillium Software colleague Laurie and I will take only about ten minutes to frame and then wrap up the presentation. But the bulk of the presentation belongs to Moen and Oki Data and the success they’ve been able to achieve in a) quickly starting a data management program in SAP R/3, SAP ERP, and SAP CRM; and b) carrying the process and technology from one project to the next.

If you want to join us, please click here. The webinar is on February 27th at 2 PM Eastern.

Tuesday, February 19, 2008

Data Governance – Does it take a platform?

I was reading through a major enterprise software vendor’s white paper and their recommendations on how to launch a data governance program. (I’m not going to provide a link - it wasn’t worth it.) Of course, much of the messaging was around buying software and the “platform” you need to do data governance… their platform.

Yet, I’m not sure it’s the wisest choice to start by buying a data governance platform. If your solution to data governance is to buy software, then you’re not really doing data governance. So much of data governance is about things like getting executives to recognize data as an asset, setting up processes, planning teams and resources, the politics of data ownership, understanding the goals of the organization and making decisions about data to support them, and so on.

Now I know it’s blasphemy for a guy who works for an enterprise software company to talk like that. In the past, I probably have been guilty of pushing the platform over process improvements. But, it’s a new day. I see real successes starting to emerge from companies who begin by taking a look at the strategy and process of data governance in the context of their business plan. Companies are beginning to soul-search a bit, before buying a platform, to know how ready they are for data governance and plan their maturation process.

Why not bring in some expertise on data governance first? Bring in the right mix of technology and business experience to build a plan, build a process and work through the politics of data governance first. There are some pretty good systems integrators out there who can help. We have partnerships with Accenture and Deloitte, for example, and they have helped set strategy on many projects.

Trillium Software also has a growing business around the business strategy of data governance. These programs are run by an arm of our professional services team called strategic services, and they are really starting to show promise as they work hand-in-hand with our customers to set up the processes and strategy of data governance and open up communications between IT and management. They include the following programs:

• Data Quality Workshop - a knowledge sharing exercise that incorporates interactive group dynamics, analytics, and presentations to learn about the customer’s business, understand and share key aspects of a total data quality solution, and determine how to best solve business problems through a comprehensive data quality program. We’ll come in for a couple of days and help you through some of the data governance strategy.

• Strategic Planning Services - a service offering that helps you to build a future vision for data quality that optimizes processes and improves data quality enterprise-wide. This service focuses on future data quality strategies such as dealing with complex enterprise data quality deployments, expansion of data quality initiatives and the effects of mergers and acquisitions on the business.

• Data Governance Planning – This service helps organizations with developing, refining, and supporting their data governance strategies and programs. It recognizes that data quality by itself does not define data governance. Rather, it also includes a focus on business processes and people to achieve success.

If this is something that your company needs, send me an e-mail and I’ll set it up for you, or find out more here. These workshops are particularly helpful if you have some key stakeholders dragging their feet on data governance. They can help you all get on the same page.

Does it take a platform to do data governance? Maybe, but data governance is a far-off dream for many companies. In this case, it takes a lot more than technology to fulfill a dream.

Sunday, February 10, 2008

Mainframe Computing and Information Quality

Looking for new ways to use the power of your mainframe? My friend Wally called me the other day and was talking about moving applications off the mainframe to the Unix platform and cleansing data during the migration. “Sure, we can help you with that,” I said. But he was surprised to hear that there is a version of the Trillium Software System that is optimized for the mainframe (z/OS). We’ve continually updated our mainframe data quality solution, and we have no plans to stop.

Mainframe computers still play a central role in the daily operations of many large companies. Mainframes are designed to allow many simultaneous users and applications access to the same data without interfering with one another. Security, scalability, and reliability are key factors in the mainframe’s power in mission-critical applications. These applications typically include customer order processing, financial transactions, production and inventory control, payroll, and others.

While others have abandoned the mainframe platform, the Trillium Software System supports the z/OS (formerly known as OS/390) environment. Batch data standardization executes on either a 64-bit or 31-bit system. It also supports CICS, IBM’s transaction processing system designed for real-time workloads. z/OS and CICS easily support thousands of transactions per second, making it a very powerful data quality platform. The Trillium Software System can power your mainframe with an outstanding data quality engine, whether your data is stored in DB2, text files, COBOL copybooks, or XML.

The Trillium Software System will standardize, cleanse and match data using our proprietary rules engine. You can remove duplicates, ensure that your name and address data will mail properly, CASS certify data and more. It’s a great way to get your data ready for SOA on the mainframe, too.
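Duplicate matching is worth a quick illustration. The Trillium Software System’s rules engine is far more sophisticated, but a toy version of key-based duplicate detection might look like the sketch below. Everything here (the record data, the normalization rules) is my own simplification, not how the product actually works:

```python
import re

# Toy duplicate detection via normalized match keys.
# A real engine (like the Trillium Software System's) adds parsing,
# standardization dictionaries, and probabilistic matching; this sketch
# only normalizes strings and compares exact keys.

def match_key(name: str, street: str) -> str:
    """Build a crude match key: lowercase, strip punctuation, squeeze spaces."""
    text = f"{name} {street}".lower()
    text = re.sub(r"[^a-z0-9 ]", "", text)
    return " ".join(text.split())

records = [
    ("John Smith", "100 Main St."),
    ("JOHN SMITH", "100 Main St"),
    ("Jane Doe",   "42 Elm Ave."),
]

seen, duplicates = {}, []
for rec in records:
    key = match_key(*rec)
    if key in seen:
        duplicates.append((seen[key], rec))  # pair up with the first-seen record
    else:
        seen[key] = rec

print(f"{len(duplicates)} duplicate pair(s) found")
```

Notice that this naive approach would miss “St.” versus “Street”; closing that gap is exactly what standardization dictionaries and fuzzy matching in a commercial engine are for.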

My hat’s off to Clara C. on our development team, who heads up the project for maintaining the mainframe version of the Trillium Software System. She’s well-known at Trillium Software for her mainframe acumen and for hosting the annual pot-luck lunch around the holidays. (She makes an excellent mini hot dog in Jack Daniels sauce.)

I’m not sure whether Wally will stick with his mainframe or migrate the whole thing to UNIX servers, but he was happy to know he has an option. With an open data quality platform, like the Trillium Software System, it’s not a huge job to move the whole process from the mainframe to UNIX by leveraging the business rules developed on one platform and copying them to the other.

Tuesday, February 5, 2008

Oracle Data Integration Suite - Trillium Software Inside

Finally! Finally, I can talk about the exciting news regarding Trillium Software’s partnership with Oracle. It’s a perfect decision for Oracle to begin working with Trillium in the data integration market, combining Sunopsis technology with Trillium Software technology to address the competitive challenge from IBM and the WebSphere platform.

Trillium Software has long been a supporter of the Oracle platform, first offering batch technology for cleansing Oracle databases. A few years ago, we began offering direct support for Oracle’s older data integrator, OWB. Now, this integration with ODI is going to serve Oracle customers with excellent data quality within a superb data integration platform.

Trillium Software prides itself on its connectivity into major enterprise applications. Here are a few of the most popular ones:

  • SAP - SAP R/3, SAP CRM, SAP ERP and SAP NetWeaver MDM.
  • Oracle - OWB, ODI, Siebel eBusiness, Siebel UCM, Oracle CDH, and Oracle eBusiness Suite.
  • Ab Initio
  • Siperian

In addition, we still have quite a few customers on the Informatica platform, and we continue to support those customers, even though Informatica has had a competing data quality solution since its acquisition of Similarity Systems. We even maintain our integration with IBM WebSphere, despite IBM’s acquisition of Ascential, which had earlier acquired data quality vendor Vality. We still have a significant number of users running DataStage with Trillium Software who don’t want to switch.

Why support all these integration points when other vendors don’t? It’s where the reality of the marketplace meets product development. Let’s face it, large companies most often don’t run a single application platform across their entire enterprise. Most have a mixture of IBM, Oracle, Siebel, and many other enterprise vendors. Sometimes, this makes perfect sense for the organization. The heterogeneous enterprise often occurs when no single application vendor can meet all the needs of the organization. So, for example, SAP ERP may meet the needs of manufacturing, while Siebel better meets the requirements of sales and marketing.

On the other hand, it makes sense to standardize the data platform of your company. If you can plug the same rules engine into any of these platforms, data quality is more easily a simple component of corporate governance. Now you don’t have to hire staff to operate and maintain multiple data quality tools. Now, you won’t have to try to tune one data quality tool to make it behave like another. It is much easier to achieve a company-wide gold customer master record with a single information quality platform like Trillium Software.

Tuesday, January 29, 2008

The “Do Nothing” Option

When it comes to writing proposals and setting scope for data-intensive projects like data integration, master data management, CRM, and data warehousing, project managers often find themselves struggling to justify the additional funding needed for information quality. But I recently picked up a neat trick when I was talking to the folks at BT. If you recall, BT has a huge implementation of enterprise-wide information quality and a great data governance story. If you want to read the details, they are written up in a 2006 Gartner report (see “Strategic Focus on Data Quality Yields Big Benefits for BT” on Gartner.com). You can also hear it in BT’s own words in a webinar on Trilliumsoftware.com.
Project managers, can I share a secret with you? Let’s keep this secret just between us, OK? Let’s not let upper management know you’re pulling this stuff on them.
The key to swaying the approval process, according to BT, is to always calculate what will happen if you don’t do anything about data quality. They call it the “Do Nothing Option”. Sounds simple, but I suspect it’s something that is missing from many proposals. In your scoping documents, make sure that you state in black and white both the ROI of improving data quality processes and the potential risks that you run when you ignore it. In short, if you invest resources in information quality, all will be right with the world. If you do nothing, anarchy and chaos will ensue.
For example, say you’re installing a new CRM application. The new CRM application will likely have ROI on customizing workflow, tracking customer interactions, efficiently contacting customers and prospects, and reporting, among many other benefits. However, if you do nothing with information quality, what are the risks and what are the costs associated with those risks? Be specific about what kinds of problems are likely to occur and the costs associated with them. Are there mailing costs? Are there process costs associated with chasing down duplicate accounts and customers? Are there billing inefficiencies that could be mitigated with information quality? Any inventory inefficiencies? Will reports show incorrect information, requiring more process and problem chasing? Will all forms of corporate compliance be met with the current data, or are there risks of fines? Will data management issues instead have to be handled by the call center, and if so, at what cost? What is the long-term impact on the corporation of not managing the data? In short, what’s the cost of doing nothing?
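You can make the “Do Nothing Option” concrete with a back-of-the-envelope calculation in your scoping document. Here’s a hedged sketch; every rate and cost below is a made-up placeholder (not a figure from BT’s case), so substitute your own numbers:

```python
# Hypothetical "Do Nothing Option" calculator.
# All rates and costs below are illustrative placeholders -- plug in your own.

records = 1_000_000          # customer records in the new CRM
duplicate_rate = 0.05        # assumed share of duplicate records
bad_address_rate = 0.08      # assumed share of undeliverable addresses
mail_cost = 0.60             # cost per direct-mail piece, in dollars
mailings_per_year = 4        # campaigns sent per year
dedup_handling_cost = 3.00   # staff cost to chase down one duplicate account

# Annual cost of doing nothing: postage wasted on bad/duplicate records,
# plus the manual effort of reconciling duplicate accounts.
wasted_mail = records * (duplicate_rate + bad_address_rate) \
              * mail_cost * mailings_per_year
duplicate_cleanup = records * duplicate_rate * dedup_handling_cost
do_nothing_cost = wasted_mail + duplicate_cleanup

print(f"Annual cost of the Do Nothing Option: ${do_nothing_cost:,.0f}")
```

Even with modest assumed rates, the number that falls out tends to dwarf the incremental cost of adding information quality to the project scope, which is exactly the point BT makes.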
If you’re still having trouble with management, even after explaining what will happen if you do nothing, feel free to contact me. I’m very interested in the process that project managers must endure in order to do the right thing and take on the responsibility for managing data.

Thursday, January 24, 2008

The Rise of the Business-focused Data Steward


In a December 2007 research note from Gartner entitled “Best Practices for Data Stewardship”, Gartner gives some very practical and accurate advice on starting and executing a data steward program. They reiterate this advice in a press release issued this month. The new advice is to have business people become your data stewards. So, in marketing, you have someone assigned as a data steward to work with IT. The business person knows the meaning of the data as well as where they want to go with it. They become responsible for the data, and owners of it.

It’s a great concept, and one that I expect will become more and more a reality this year. However, there is some growth that needs to happen in the software industry. There are very few tools that serve a business-focused data steward. Most tools on the market are additional features that have been tacked on to IT-focused tools. Sure, a data profiler can show some cool charts and graphs, but not many business users want to learn how to use them. Should a business user really have to learn about metadata, entities, and attributes in order to find out if the data meets the need of the organization?

Rather, a marketing person wants to know if (s)he can do an offer mailing without getting most of it back. A CIO wants to know if a customer database that came over in a merger has complete and current information. Accounting wants to know that they have valid tax ID numbers (social security numbers) for customers to whom they extend credit, and the compliance team wants to know that they are stopping those on the OFAC list from opening accounts. Metadata? They don’t care. They just need the metrics to track the business problem.
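Those business metrics boil down to plain-language rules checked against the data. Here’s an illustrative sketch of that idea; the record fields, the rules, and the pass-rate report are all my own invention for the example, not how any particular tool implements it:

```python
import re

# Hypothetical customer records; field names are invented for illustration.
customers = [
    {"name": "Acme Corp", "email": "ap@acme.example",  "tax_id": "123-45-6789"},
    {"name": "Bolt Ltd",  "email": "",                 "tax_id": "987-65-4321"},
    {"name": "Cogs Inc",  "email": "buy@cogs.example", "tax_id": "12-3456"},
]

# Business rules expressed as predicates -- the kind of plain-language
# checks a business-focused data steward actually cares about.
rules = {
    "has_email":    lambda c: bool(c["email"]),
    "valid_tax_id": lambda c: bool(re.fullmatch(r"\d{3}-\d{2}-\d{4}", c["tax_id"])),
}

# Report the pass rate per rule, as a percentage -- a business-friendly
# metric, with no metadata or entity modeling in sight.
for rule_name, check in rules.items():
    passed = sum(1 for c in customers if check(c))
    print(f"{rule_name}: {passed}/{len(customers)} pass")
```

The steward reads “2 of 3 customers have a valid tax ID” and acts on it; the metadata plumbing underneath never has to surface.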

This was really the concept that Trillium Software had when we designed TS Insight, our data quality reporting tool. The tool uses business rules and analysis from our profiler and presents them in a very friendly way – via a web browser. The more technical users can set up regular updates that display compliance with the business rules. The less technical users can open their web browsers to their customized page and metrics that are important to them. The business rules can track pretty much anything about the data without being too technical.

TS Insight is still in ramp-up for us. We came out with version 1.0 last year and we’re about to release version 2.5 this quarter. Still, we have a big head start on anyone else in the industry with this tool, serving the needs of the business-focused data steward. If this is something you’d like to see, please send me an e-mail and I’ll set up a demo.

Wednesday, January 16, 2008

Data Quality and Being Green

It’s clear that the green movement, specifically the general public’s desire to work with companies that are environmentally responsible, is here to stay. The general public is overwhelmingly in favor of your efforts to be green. For example, Wal-mart made headlines when it recently announced a program to reduce greenhouse gas emissions. Not only was this positive news for the world, but Wal-mart saved money through reduced energy costs.

For this and other reasons, marketers are relying less and less on direct mail as a core channel for sending targeted information to customers. Consumers’ desire to be green is causing marketers and finance teams alike to rethink paper-based channels, increasing their reliance on electronic communications (e.g., websites, e-mail, and e-statements).

The green movement is changing the world of data management, as follows:

  • De facto name and address standards – As we go into 2008, the general public simply won’t accept duplicates - bills, marketing offers and other mailings from you must be as clean as possible, or the customer is more apt to unsubscribe from your offers. In the past, if you got three catalogs from that computer retailer, it was a joke. They will laugh no more. Being green is a serious subject to many of your customers.
  • Importance of non-name and address data – Sure, customer name and address will still be important, but additional information such as e-mail address, customer contact preferences, and whether the customer is on the “Do Not Call” list is becoming fundamental. Build this type of data as you go forward with additional processes at the call center and sales level. In other words, if you want someone’s e-mail address, you should ask for it.
  • Electronic Billing – As a customer, it sure is easier to get your bills via web site. As a company, it sure costs less to bill your customers via e-mail and secure web site. As an eco-friendly company, it sure looks good to provide a way to stop all those perfume soaked papers from being delivered by the mailman. Electronic billing is the way to go.
  • Potential Higher Revenues – Let’s face it, the cost of direct mail is much higher than sending an e-mail blast. Within reason, you can afford to make more offers to your customers and increase upsell potential, as long as you’ve done your data management homework. A men’s clothing store e-mails me weekly about specials, and I’m happy to get the offers. As a customer, I have seen their evolution. In the past, I received barely one postcard per quarter. As a result of their switch to e-mail and the increased touches, I do buy more at the store.

Trillium Software can help you meet these data management challenges and become greener. How? Of course, the Trillium Software System helps remove the duplicates, but it can also help you understand and repair the quality of e-mail and contact preference data. In association with our parent company, Harte-Hanks, we can often do reverse look-ups on data: if you have an address, Harte-Hanks can often find a phone number or an e-mail address. We can help manage the “Do Not Call” list, too. By managing data more effectively, you can become a stronger, greener company.

More info on being green? Take a look at the DMA’s Green15 Toolkit.


Disclaimer: The opinions expressed here are my own and don't necessarily reflect the opinion of my employer. The material written here is copyright (c) 2010 by Steve Sarsfield. To request permission to reuse, please e-mail me.