Tuesday, October 21, 2008
Financial Service Companies Need to Prepare for New Regulation
We’re in the midst of a mortgage crisis. Call it a natural extension of capitalism, where greed can inspire unregulated “innovation”. That greed is now coming home to roost.
This problem has many moving pieces, and it's difficult to describe in an elevator pitch. Through the actions of our leaders and bankers, mortgage lenders were encouraged to write dubious mortgages, and the US population was encouraged to apply for them. At first, these unchecked mortgages led to more free cash, more spending, and a boom in the economy. Unfortunately, the boom was built on a foundation of quicksand, and it forced us to take drastic measures to bring balance back to the system. The $700 billion bill already passed is one example of those measures. Hopefully for our government, we won’t need too many more balancing measures.
So where do we go from here? The experts say that the best-case scenario would be for the world economy to do well: unemployment stays low, personal income keeps pace with inflation, and real estate prices find a bottom. I'm optimistic that we'll see that day soon. Many of the experts aren't so sure.
One thing history teaches us is that regulatory oversight is bound to get stiffer after this fiasco. We had similar “innovations” in capitalism with the savings and loan scandal, the artificial dot-com boom, Enron, Tyco, and WorldCom. Those scandals were followed by new regulations worldwide, like Sarbanes-Oxley, Bill 198 (Canada), JSOX (Japan), and the Deutscher Corporate Governance Kodex (Germany), to name just a few. These laws tightened oversight of the accounting industry and toughened corporate disclosure rules. They also moved to make the leaders of corporations more personally liable for reporting irregularities.
The same should be true after the mortgage crisis. The types of loans that brought us to this situation may exist only in tightly regulated form in the future. In the coming months, we should see a renewed emphasis on detecting fraud at every step of the process. For the financial services industry especially, it will be more important than ever to have good clean data, accurate business intelligence, and holistic data governance to comply with the regulations to come.
If you’re running a company that still can’t get a handle on its customers, has a hard time detecting fraud, has a lot of missing and outlier values in its data, or has many systems with many duplicated forms of data values, you’ll want to get started now on governing your data. Go now, run, since data governance and building intelligence can take years of hard work. The goal is to begin to mitigate the risk you face in meeting regulatory edicts. If you get going now, you’ll not only beat the rush to comply, but you'll also reap the real and immediate benefits of data governance.
Thursday, October 9, 2008
Teradata Partners User Group
Road trip! Next week, I’m heading to the Teradata Partners User Group and Conference in Las Vegas, and I’m looking forward to it. The event should be a fantastic opportunity to take a peek inside the Teradata world.
The event is a way for Trillium Software to celebrate its partnership with Teradata, a partnership that has always made a lot of sense to me. Teradata and Trillium Software have had similar game-plans throughout the years: focus on your core competency, be the best you can be at it, but maintain an open, connectible architecture that lets other high-end technologies plug in. There are many similarities in the philosophies of the two companies.
Both companies have architectures that work well in particularly large organizations with vast amounts of data. One key feature of Teradata, for example, is that you can linearly expand database capacity, without sacrificing response time, by adding more nodes to the existing database. Similarly, with Trillium Software, you can expand the number of records cleansed in real time by adding more nodes to the cleansing system. Trillium Software uses a load-balancing technology called the Director to manage cleansing and matching across multiple servers. In short, both technologies will scale to support very large volumes of complex, global data.
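To make the node-scaling idea concrete, here is a minimal sketch of round-robin dispatch across cleansing nodes. This is a toy illustration of the general pattern, not the actual Trillium Director; the class and node names are hypothetical.

```python
from itertools import cycle

class CleansingDirector:
    """Toy round-robin dispatcher (hypothetical; not the real Trillium Director).

    Throughput scales by adding nodes: each incoming record is routed to
    the next cleansing server in the ring."""

    def __init__(self, nodes):
        self.nodes = list(nodes)
        self._ring = cycle(self.nodes)

    def add_node(self, node):
        # Expanding capacity is just extending the pool and rebuilding the ring.
        self.nodes.append(node)
        self._ring = cycle(self.nodes)

    def dispatch(self, record):
        # Route the record to the next node in round-robin order.
        node = next(self._ring)
        return node, record

director = CleansingDirector(["node-1", "node-2"])
assignments = [director.dispatch(r)[0] for r in ["rec-a", "rec-b", "rec-c", "rec-d"]]
# assignments == ["node-1", "node-2", "node-1", "node-2"]
```

Real load balancers weigh node health and queue depth rather than blindly rotating, but the scaling principle is the same: more nodes, more records cleansed per second.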
About 4,000 Teradata enthusiasts are expected to show up and participate in the event. So, if you’re among them, please come by the Trillium Software exhibit and say hello.
Monday, October 6, 2008
Data Governance and Chicken Parmesan
With the tough economy and shrinking 401(k)s, some of my co-workers at Trillium are starting to cut back a bit on personal spending. They talk about how expensive everything is and speak with regret when they buy lunch at the Trillium cafeteria instead of bringing one from home. Until now, I’ve kept quiet on this topic and waited politely until the conversation turned to, say, fantasy football. But between you and me, I don’t agree that there are huge cost savings in making your own.
Restaurants can sell chicken parmesan for $15.99 and still make a profit because they have a system for making it that benefits from economies of scale. They buy ingredients cheaper, and because they use the sauce in other dishes, they have ‘reusability’ working for them, too. The same sauce goes into their eggplant parmesan, spaghetti with meatballs, and many other dishes, and that reuse is powerful. Most of the high-end technologies you choose for your company need the same reusability as the sauce to deliver maximum benefit. Using data quality technologies that only plug into SAP, for example, when your future data governance projects may lead you to Oracle, Tibco, and Siperian, just doesn’t make sense.
One other consideration: what if something goes wrong with my homemade chicken parmesan? I’d have little recourse if my home-cooked solution went up in flames, except to incur even more expense and order out. But if the restaurant chicken parmesan is bad, I can call them and they’ll make me another one at no charge. Likewise, you have contractual recourse when a vendor solution doesn’t do what they say it will.
If you’re thinking of cooking up your own technical solutions for data governance hoping to save a ton of money, think again. Your most economical solution might just be to order out.
Monday, September 29, 2008
The Data Intelligence Gap: Part Two
In part one, I wrote about the evolution of a corporation and how rapid growth leads to a data intelligence gap. It makes sense that people, process, and technology combine to close the gap, but just what kind of technology can help you cross the divide and connect the needs of the business with the data available in the corporation?
Of course, the technology needed depends on the company’s needs and how mature it is about managing its data. Many technologies exist to help close the gap, improve information quality, and meet the business needs of the organization. Let’s look at them:
| CATEGORY | TECHNOLOGIES | HOW IT CLOSES THE GAP |
| --- | --- | --- |
| Preventative | Type-Ahead Technology | This technology watches the user type and helps complete the data entry in real time. For example, products like Harte-Hanks Global Address help call center staff and others who enter address data into your system by speeding up the process and ensuring the data is correct. |
| Preventative | Data Quality Dashboard | Dashboards allow business users and IT users to keep an eye on data anomalies by constantly checking whether the data meets business specifications. Products like TS Insight even give you attractive charts and graphs on the status of data compliance and the trend of its conformity. Dashboards are also a great way to communicate the importance of closing the data intelligence gap. When your people get smarter about it, they will help you achieve cleaner, more useful information. |
| Diagnostic and Health | Data Profiling | Not sure about the health and suitability of a given data set? Profile it with products like TS Discovery, and you’ll begin to understand how much data is missing, which values are outliers, and many other anomalies. Only then will you be able to understand the scope of your data quality project. |
| Diagnostic and Health | Batch Data Quality | Once the anomalies are discovered, a batch cleansing process can solve many problems with name and address data, supply chain data, and more. Some solutions are batch-centric, while others can do both batch cleansing and scalable enterprise-class data quality (see below). |
| Infrastructure | Master Data Management (MDM) | Products from the mega-vendors like … |
| Infrastructure | Enterprise-Class Data Quality | Products like the Trillium Software System provide real-time data quality to any application in the enterprise, including the … |
| Infrastructure | Data Monitoring | You can often use the same technology to monitor data as you do to profile it. These tools keep track of the quality of the data over time. Unlike data quality dashboards, they let the IT staff dig into the nitty-gritty when necessary. |
| Enrichment | Services and Data Sources | Companies like Harte-Hanks offer data sources that can help fill the gaps when mission-critical data is missing. You can buy data and services to segment your database, check customer lists for changes of address, screen for customers on the do-not-call list, do reverse phone number lookups, and more. |
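As a homespun illustration of what profiling surfaces (a bare-bones sketch, nothing like a full product such as TS Discovery), a single-column profile that counts missing values and flags outliers might start like this:

```python
import statistics

def profile_column(values):
    """Toy data profiling for one column: count missing values and flag
    numeric outliers using the median absolute deviation (MAD), which
    stays robust in the presence of the very outliers we are hunting."""
    present = [v for v in values if v is not None]
    report = {"count": len(present), "missing": len(values) - len(present)}
    numeric = [v for v in present if isinstance(v, (int, float))]
    if len(numeric) >= 3:
        med = statistics.median(numeric)
        mad = statistics.median([abs(v - med) for v in numeric])
        if mad:
            # Flag values whose deviation from the median dwarfs the typical deviation.
            report["outliers"] = [v for v in numeric if abs(v - med) > 5 * mad]
    return report

report = profile_column([34, 36, 35, None, 33, 480, None])
# report == {"count": 5, "missing": 2, "outliers": [480]}
```

Even this crude profile tells you two things the business needs to know before a cleansing project starts: how much data is simply absent, and which values (like an age of 480) are almost certainly entry errors.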
These are just some of the technologies involved in closing the data intelligence gap. In my next installment of this series, I’ll look at people and process. Stay tuned.
Monday, September 22, 2008
Are There Business Advantages to Poor Data Management?
I have long held the belief, perhaps even the religion, that companies that do a good job governing and managing their data will be blessed with many advantages over those that don’t. This weekend, as I was walking through the garden, the serpent tempted me with an apple. Might there actually be some business advantage to poorly managing your data?
The experience started when I noticed a bubble on the sidewall of my tire. Just a small bubble, but since I was planning on a trip down and back on the lonely Massachusetts Turnpike (Mass Pike) on a Sunday night, I decided to get it checked out. No need to risk a blow-out.
I remembered that I had purchased one of those “road hazard replacement” policies. I called the nearest location of a chain of stores that covers New England. Good news. The manager assured me that I didn’t need my paperwork and that the record would be in the database.
Of course, when I arrived at the tire center, no record of my purchase or my policy could be found. Since I didn’t bring the printed receipt, the tire center manager gave me a few options: 1) drive down the Mass Pike on the bubbly tire and come back Monday, when they could “access the database in the main office”; 2) drive home, find the paperwork, and come back to the store (hmm... not sure where it was); or 3) buy a new tire at full price.
I opted to buy a new tire and attempt to claim a refund from the corporate office later when I found my receipts. The jury is still out on the success of that strategy.
However, this got me thinking. Could the inability of the stores to maintain more than 18 months of records actually be a business advantage? How many customers lose the paperwork, or even forget about their road hazard policies, and just pay the replacement price? How much additional revenue was this shortcoming actually generating each year? What additional revenue would be possible if the database only stored 12 months of transactions?
Finding fault with the one truth (data management is good) did hurt. However, I realized that the advantage of the tire chain's poor data infrastructure is very short-sighted. True, it may lower payouts on road hazard policies in the short term, but eventually, this poor customer database implementation has to catch up with them through decreased customer satisfaction and word-of-mouth badwill. With so many tire stores here competing for the same buck, the poor service will eventually cause most good customers to move on.
If you're buying tires soon in New England and want to know which tire chain it was, e-mail me and I'll tell you. But before I tell all, I'm going to hold out hope for justice... and hope that our foundational beliefs are still intact.
Saturday, September 20, 2008
New Data Governance Books
A couple of new, important books hit the streets this month. I’m adding these books to my recommended reading list.
Data Driven: Profiting from Your Most Important Business Asset is Tom Redman’s new book about making the most of your data to sharpen your company's competitive edge and enhance its profitability. I like how Tom uses real-life metaphors in this book to simplify the concepts of governing your data.
Master Data Management is David Loshin’s new book that provides help for both business and technology managers as they strive to improve data quality. Among the topics covered are strategic planning, managing organizational change and the integration of systems and business processes to achieve better data.
Both Tom and David have written several books on data quality and master data management, and I think their material gets stronger and stronger as they plug in new experiences and reference new strategies.
EDIT: In April of 2009, I also released my own book on data governance called "The Data Governance Imperative".
Check it out.
Monday, August 11, 2008
The Data Intelligence Gap: Part One
There is a huge chasm in many corporations today, one that hurts companies by keeping them from more revenue, more profit, and better operating efficiency. The gap, of course, lies in corporate information.
| What the Business Wants to Know | Data Needed | What’s Inhibiting Peak Efficiency |
| --- | --- | --- |
| Can I lower my inventory costs and purchase prices? Can I get discounts on high-volume items purchased? | Reliable inventory data. | Multiple ERP and … |
| Are my marketing programs effective? Am I giving customers and prospects every opportunity to love our company? | Customer attrition rates. Results of marketing programs. | Typos. Lack of standardization of names and addresses. Multiple … |
| Are any customers or prospects “bad guys”? Are we complying with all international laws? | Reliable customer data for comparison to “watch” lists. | Lack of standards. Inability to match names that may have slight variations against watch lists. Missing values. |
| Am I driving the company in the right direction? | Reliable business metrics. Financial trends. | Extra effort and time needed to compile sales and finance data and to cross-check results. |
| Is the company we’re buying worth it? | Fast comprehension of the reliability of the information provided by the seller. | Difficulty quickly checking the accuracy of the data, especially the customer lists, inventory levels, financial metrics, and the presence of “bad guys” in the data. |
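One of the inhibitors above, matching names that arrive with slight variations against watch lists, can be sketched with nothing but the standard library. This is a toy illustration with made-up names; real screening products use far more sophisticated matching and much larger lists.

```python
from difflib import SequenceMatcher

# Hypothetical watch list, for illustration only.
WATCH_LIST = ["Viktor Petrov", "Juan Carlos Mendez"]

def screen(name, threshold=0.85):
    """Return watch-list entries whose similarity to `name` meets the threshold."""
    hits = []
    for entry in WATCH_LIST:
        score = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

print(screen("Victor Petrov"))   # a one-letter variation still hits: [('Viktor Petrov', 0.92)]
print(screen("Mary O'Connell"))  # no hit: []
```

Exact string comparison would miss “Victor” versus “Viktor” entirely; even this crude similarity score catches it, which is why name standardization plus fuzzy matching shows up in every compliance project.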

