In the coming weeks the feds and the surviving financial services institutions will have the daunting task of unraveling all the securitized loans and other instruments that are hiding the toxic investments. But does the technology exist to do that? And if so, could it have been used to prevent the bad debt from hitting the fan in the first place?
The fact is that despite government regulations like Sarbanes-Oxley, current rules mandate little visibility into the origination of loans and how they are broken up, resold, and resold again.
To cite the classic example of how we got into this mess, consumers were given 100-percent-plus variable mortgages without any security. Not only could those mortgages be sold to other banks, but they could be divided into five, ten, or twenty tranches -- financialese for slices -- and resold to five to ten different organizations, making it difficult to track who was involved and who ended up holding the risk.
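To see why tracing risk through resold tranches gets hard so quickly, consider this toy model. The loan IDs, buyers, and amounts are all invented for illustration; real securitization involves far more complex structures.

```python
# Toy model: one mortgage fragments into tranches, each with its own
# chain of buyers. All names and numbers are hypothetical.

def make_tranches(loan_id, principal, n_tranches):
    """Split one loan's principal into equal slices (tranches)."""
    slice_value = principal / n_tranches
    return [{"loan": loan_id, "tranche": i, "value": slice_value,
             "holders": ["Originator"]}
            for i in range(n_tranches)]

def resell(tranche, buyer):
    """Each resale appends a holder; only the last one bears the risk now."""
    tranche["holders"].append(buyer)
    return tranche

tranches = make_tranches("MTG-001", 500_000, 5)
for t, buyer in zip(tranches, ["Bank A", "Bank B", "Fund C", "Bank D", "Fund E"]):
    resell(t, buyer)
resell(tranches[0], "Fund F")  # tranche 0 changes hands a second time

# Answering "who holds the risk for MTG-001?" now means walking five
# separate chains of custody -- for a single loan.
current_holders = {t["tranche"]: t["holders"][-1] for t in tranches}
```

Multiply this by millions of loans, each sliced and resold several times across institutions with no shared registry, and the unwinding problem the feds now face comes into focus.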
Theoretically, the financial service providers were clear on the risks of each type of loan and had a way to gauge whether they had enough liquidity -- cash and other easily sold assets -- available if the riskier loans went south. But a New York Times report indicates that in fact many financial institutions gamed their analytics to favor positive scenarios over negative ones in order to justify keeping less money in reserve should the risky loans blow up. "A large number of buyers of these kinds of instruments really didn't care about the value. They just wanted to flip it. A lot of people just didn't want to know," says Josh Greenbaum, principal at Enterprise Applications Consulting.
Analytics and CEP tools could have helped
Had these financial services companies and banks established business intelligence metrics tracking the ratio of each kind of debt they held to the cash reserves backing it, their analytics systems might have raised alerts much earlier in the process, says Michael Corcoran, a product manager at the BI provider Information Builders. But as anyone in business already knows, consolidating that kind of data is usually a slow process that ends up being done manually in an Excel spreadsheet well after the fact.
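A reserve-ratio metric of the kind Corcoran describes could be as simple as the sketch below. The asset classes, risk weights, and dollar figures are invented for illustration, not drawn from any real institution's books.

```python
# Hypothetical reserve-adequacy check a BI system could run automatically.
# Risk weights approximate expected loss per asset class; all values are
# illustrative assumptions.

RISK_WEIGHTS = {"prime": 0.02, "alt_a": 0.10, "subprime": 0.35}

def required_reserves(holdings):
    """Cushion needed if risky loans go south: sum of weight * exposure."""
    return sum(RISK_WEIGHTS[kind] * amount for kind, amount in holdings.items())

def reserve_alert(holdings, liquid_reserves):
    """Flag the position when liquid reserves fall below the required cushion."""
    needed = required_reserves(holdings)
    return {"needed": needed, "held": liquid_reserves,
            "alert": liquid_reserves < needed}

status = reserve_alert(
    {"prime": 800e6, "alt_a": 300e6, "subprime": 400e6},
    liquid_reserves=150e6,
)
# needed = 0.02*800e6 + 0.10*300e6 + 0.35*400e6 = 186e6, so this fires.
```

The arithmetic is trivial; the hard part, as Corcoran notes, is consolidating the holdings data in the first place rather than reconstructing it in a spreadsheet after the fact.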
Jeff Wooton, vice president of product strategy at Aleri, a complex event processing (CEP) company, agrees that most data consolidation takes far too long to give a complete picture. "It relies on overnight data consolidation runs, overnight reports, and manual processes like spreadsheets."
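The point of CEP is to replace those overnight runs with rules evaluated against the live event stream, so an alert fires the moment a threshold is crossed. The sketch below is a minimal illustration of that idea in plain Python; the event shapes, reserves figure, and 0.5 ratio are assumptions, and a real CEP engine such as Aleri's would express this as a continuous query over high-volume streams.

```python
# CEP-style rule sketch: watch trading events as they arrive and fire
# the instant risky exposure outruns a set fraction of reserves, rather
# than discovering it in an overnight report. All values are invented.

def monitor(events, reserves, max_ratio=0.5):
    """Yield an alert whenever cumulative risky exposure / reserves > max_ratio."""
    exposure = 0.0
    for event in events:
        if event["type"] == "buy_risky":
            exposure += event["amount"]
        elif event["type"] == "sell_risky":
            exposure -= event["amount"]
        if exposure / reserves > max_ratio:
            yield {"event": event, "exposure": exposure}

stream = [
    {"type": "buy_risky", "amount": 20e6},   # exposure 20e6 -- fine
    {"type": "buy_risky", "amount": 25e6},   # 45e6 -- still under 50e6
    {"type": "sell_risky", "amount": 5e6},   # 40e6
    {"type": "buy_risky", "amount": 15e6},   # 55e6 -- alert fires here
]
alerts = list(monitor(stream, reserves=100e6))
```

The contrast with the spreadsheet workflow is the timing: the fourth trade triggers the alert as it happens, not the next morning after a consolidation run.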