Last year, JPMorgan Chase made the startling admission that it lost approximately $5.8 billion due to bad trading decisions at a relatively small unit based in London. This event, which has come to be known as the London Whale Affair, was a black eye for the bank and its superstar CEO, Jamie Dimon.
Throughout the subprime crisis of the late 2000s, JPMorgan and Dimon acquired a reputation as sound risk managers. While other Wall Street giants such as Lehman Brothers, Bear Stearns, and Citigroup struggled mightily, JPMorgan appeared to be an island of relative stability. Dimon was credited with piloting an organization with sound controls and strong governance. The revelation that poor risk management, inadequate oversight, and operational deficiencies led to a multi-billion-dollar loss caused people to rethink the bank’s reputation.
To gain a better understanding of the events surrounding the trading losses, the firm put together a Management Task Force. In mid-January, after a lengthy investigation, the task force released a 132-page postmortem on the London Whale Affair. The report chronicles the event, describes in detail how it happened, and makes a number of recommendations to reduce the likelihood of a recurrence. It is an enlightening document that provides interesting reading for risk managers, auditors, compliance professionals, and anyone else focused on improving controls and corporate governance.
Much of the press coverage of the London Whale has focused on the rogue trader angle. As seen in prior situations, an individual or small group within a firm can make inappropriate trading “bets,” leading to dramatic losses. The Joseph Jett/Kidder Peabody affair in 1994 and the Nick Leeson/Barings Bank event in 1995 are two prominent examples. And the London Whale Affair, while different from these events, shares some underlying root causes.
But as a technology professional, I am drawn to a particular aspect of this event: the contributing role of spreadsheet errors, which the postmortem report notes. As it turns out, Microsoft Excel was a key tool used by the London-based traders to analyze risk, make investment decisions, and understand investment performance. The report points out the following issues with the use of these Excel spreadsheets:
- A critical model used to determine risk was based on a series of spreadsheets which were manually maintained by copying and pasting data between the sheets
- A spreadsheet error failed to reflect a $400 million loss
- A key spreadsheet used to determine risk used an incorrect statistical model for calculations. The spreadsheet had a cell that determined which model it would use and defaulted to the wrong model
- Another spreadsheet error produced an incorrect calculation by dividing by the sum of two numbers instead of dividing by their average
- Inadequate tracking of changes to spreadsheets prevented personnel from determining when errors were first introduced
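The divide-by-sum error above is easy to reproduce. In the following sketch the rates are made up for illustration (the report does not disclose the actual inputs), but the arithmetic shows why normalizing by a sum rather than an average understates a result by roughly a factor of two:

```python
# Illustrative numbers only -- the actual rates involved were not disclosed.
old_rate = 0.0150
new_rate = 0.0158

diff = new_rate - old_rate

# Intended calculation: normalize the change by the AVERAGE of the two rates
correct = diff / ((old_rate + new_rate) / 2)

# Erroneous calculation: normalize by the SUM of the two rates,
# which understates the result by a factor of two
wrong = diff / (old_rate + new_rate)

print(round(correct, 4))  # 0.0519
print(round(wrong, 4))    # 0.026
```

For a model that feeds a risk measure, systematically halving an input like this translates directly into understated risk.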
Seasoned professionals who have worked in the information technology or risk management fields are probably shaking their heads by now. The inappropriate use of end user tools for high-value, enterprise class functions is nothing new. But in this instance, the operational immaturity and the scale of the losses at a world-class bank are eye-opening. Let’s look at how we got here.
Once upon a time, in a world before personal computers, professionals had limited choices for information processing. Simple math could be performed with a calculator and a manual ledger pad. Anyone wanting more sophisticated and automated processing of data needed to visit their corporate IT department. That was all well and good, except that the IT folks were generally the only game in town and had finite resources. That meant that business units often didn’t get the turnaround time they desired for technology requests. In some cases it meant their request was simply denied.
In the early ’80s, the personal computer revolution ushered in an era of radical user empowerment. Suddenly, the IT organization was no longer a bottleneck, and individuals were able to solve their own data processing puzzles. The first significant PC business application was the spreadsheet. Initial products included VisiCalc, SuperCalc, and Lotus 1-2-3. Ultimately, Microsoft Excel, released in 1985, became the dominant product in this space. Since then, spreadsheets have grown to become an indispensable tool of the knowledge worker. They are used for a vast range of activities, from budgeting to project management to sophisticated financial analysis.
But the flexibility and empowerment that spreadsheets provide also have a dark side. They encourage end users to autonomously create sophisticated, complex models that are used for mission-critical business functions. While this allows rapid development and enhancement of analysis and reporting functions, it can also lead to significant control deficiencies, as the London Whale Affair highlights.
Spreadsheets, and other end user tools (e.g. Microsoft Access), typically lack the operational controls seen in more sophisticated enterprise class software. Here are some of those deficiencies:
- Enterprise Awareness – A spreadsheet can typically be created/modified/deleted by anyone in the organization, without the approval or awareness of others. That spreadsheet owner frequently takes control of the ongoing use of the spreadsheet. This leads to a host of potential control issues:
- Knowledge Continuity – When the spreadsheet owner is temporarily unavailable or leaves the firm, the process associated with the spreadsheet may “die”
- Backups/Archiving/Media Reliability – The spreadsheet may live on a drive that is not backed up (e.g. local laptop drive) or that isn’t designated for business critical documents
- Business Continuity – The spreadsheet may be outside the visibility of business continuity planning and not included in any resumption plans
- Testing/QA – The spreadsheet owner may take full control of testing, instead of having “another set of eyes” to ensure proper quality
- Change Management – Changes to the software are frequently made on the fly, by the owner
- Error Proneness – Compared to enterprise class software products, spreadsheets have a number of characteristics that make them more prone to human and technical errors
- Data Input – Whether by import, cut and paste, or manual entry, spreadsheets are vulnerable to errors. Bad data imports, keystroke errors, and inadvertent mouse clicks are common and frequently undetected
- Error Checking – The ability to ensure that the spreadsheet is performing its logic properly is more difficult to do and less typically applied than with enterprise software
- Auditing/Troubleshooting/Debugging – The toolset available to detect, isolate and remediate problems is typically less mature than enterprise class tools. Additionally, end users are typically less familiar with these practices than IT professionals
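To make the data-input point concrete, here is a minimal sketch (in Python, with made-up column names) of the kind of automated input validation that enterprise software routinely applies before using a data feed, and that manually maintained spreadsheets usually lack:

```python
import csv
import io

def validate_rows(csv_text, expected_columns):
    """Return a list of problems found in a CSV extract before it is used."""
    errors = []
    reader = csv.DictReader(io.StringIO(csv_text))
    # Reject the file outright if its layout has changed
    if reader.fieldnames != expected_columns:
        errors.append("unexpected columns: %s" % reader.fieldnames)
        return errors
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        for col in expected_columns[1:]:             # all but the ID are numeric
            try:
                float(row[col])
            except (TypeError, ValueError):
                errors.append("line %d: bad value %r in %s" % (line_no, row[col], col))
    return errors

# A pasted extract with a blank notional on the second data line
data = "trade_id,notional,rate\nT1,1000000,0.015\nT2,,0.016\n"
problems = validate_rows(data, ["trade_id", "notional", "rate"])
print(problems)  # flags the blank notional on line 3
```

A manual cut-and-paste step performs none of these checks, so a dropped cell or shifted column simply flows into the downstream calculations undetected.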
As highlighted in the Task Force report, many of these deficiencies were seen in the spreadsheets used by the analysts and traders involved in the London Whale Affair. There is little doubt that the use of these spreadsheets was operationally immature and wholly insufficient to handle the high-value trading activity in the London office. Exposing billions of dollars of firm capital to simple “cut and paste” errors is completely unacceptable.
But while the London Whale Affair provides a black and white view of inappropriate use of desktop tools, it raises a larger, cloudier issue. As I described earlier, desktop class tools arose to empower end users to rapidly solve their own problems. Spreadsheets have been a dramatic driver of knowledge worker productivity and agility. It is unthinkable to go back to a world where every analytical exercise required a full-blown project to be approved and implemented by a central technology team.
So what’s a firm to do? It’s like the old saying, “spreadsheets, can’t live with ’em, can’t live without ’em.” Forward thinking firms should have a set of guidelines for the use of spreadsheets. As with most operational maturity decisions, the exact nature of these guidelines will vary based on the profile of the firm. Is the firm highly regulated? Does it have a strong brand name, with a great concern around reputational risk? Is it conservative in nature, looking to manage risk carefully and move methodically? If the answer to one or all of these questions is yes, the firm would be more restrictive in its policies towards the use of autonomously managed spreadsheets.
There is one simple practice that can help determine whether spreadsheets are being used appropriately. All “critical” business functions should be documented for the purposes of operational oversight and business continuity. As part of this identification process, end users should have to disclose all tools used to perform their critical functions. Spreadsheets identified through this exercise should be put through an evaluation process to determine if they are operationally suitable for the type of process they are supporting.
In some cases, it will be clear that the control capabilities of spreadsheets are insufficient, and an enterprise class software product will need to be implemented. In other cases, the spreadsheet may represent a reasonable approach. Even where that is so, however, the firm must ensure that a set of compensating controls is in place. These controls would include the following:
- Full documentation, with a backup individual trained to understand and operate the spreadsheet
- Appropriate backup and archiving process
- Inclusion in business continuity plans
- Version Control
- Quality Assurance and Testing
- Change Management
- Production monitoring of operational processes associated with the spreadsheet
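Several of these controls can be partially automated. As a minimal sketch (the file names and baseline store here are hypothetical, not from the report), a firm could fingerprint each critical spreadsheet and flag any copy whose contents have changed since the last approved review:

```python
import hashlib
from pathlib import Path

def fingerprint(path):
    """SHA-256 hash of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def unapproved_changes(files, approved):
    """Return the files whose current hash differs from the approved baseline."""
    return [f for f in files if fingerprint(f) != approved.get(f)]
```

A nightly job comparing current hashes against the approved baseline would surface on-the-fly edits that bypassed change management, addressing the change-tracking gap the Task Force report identified.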
Spreadsheets and other desktop tools are a prime example of one of the fundamental “tradeoff” decisions in technology management: the balancing of controls and innovation. Taking an overly restrictive stance toward their use will result in a slow and unresponsive organization. Having insufficient restrictions sets the stage for London Whale-style meltdowns. The trick for firms is to put in place an appropriate controls framework that matches their unique corporate profile and culture. The following steps are necessary to make this happen:
- Define the policies, practices and standards associated with the use of desktop tools
- Educate associates about these policies, explaining the rationale and importance of a balanced approach
- Enforce the policies through monitoring and a periodic review process