Einstein Nobel futures contract
Last week, FiveBooks carried an interview with the well-known Einstein biographer Walter Isaacson, reviewing his five favourite books on Einstein. Isaacson discusses a very interesting futures contract related to Einstein’s Nobel Prize:
This is a great piece of writing and of research about Einstein’s relationship with his first wife who served as his sounding-board in the miracle year of 1905 when he discovers special relativity and lays the groundwork for quantum theory. Mileva Maric was a physics student at Zurich Polytechnic, and when she and Einstein met they fell madly in love.
...
When the passionate relationship exploded and Einstein wanted a divorce he couldn’t afford the money Maric wanted to raise their two boys. So Einstein says to her that one day he’ll win the Nobel Prize for his 1905 work and if she gives him a divorce he’ll give her the prize money when he wins. She takes a week to calculate the odds and consult other scientists, but she is a good scientist herself and she takes the bet. He didn’t win until 1921 but he did give her the money and she bought three apartment buildings in Zurich.
I was aware that Einstein gave the prize money to his first wife, but did not know that there was actually a futures contract (or, more precisely, a forward contract) involved. Wikipedia has more details about the transaction. Thinking about it carefully, this forward contract is actually quite hard to value:
- First of all, we need the risk-neutral probability (as of 1919) that Einstein’s work would win the Nobel Prize at all. This is a nontrivial task – those who think that this probability was close to one must remember that no Nobel Prize was ever awarded for relativity theory (Einstein’s prize was for his work on the photon).
- There is a significant information asymmetry in assessing this probability because Einstein obviously knew more about his own work than anybody else.
- To value the futures contract it is also necessary to estimate the expected time at which the prize would be awarded. Coase received the economics Nobel Prize six decades after his original work. Physicists do not usually have to wait that long, but in 1919, Einstein had already waited 14 years.
- Mileva Maric would also have to worry about mortality risk because the Nobel Prize is not given posthumously. Her own mortality is less important because the money would probably have gone to a trust for the benefit of her children.
- There is also the vexed issue of counterparty risk – during and after the First World War, it could not be taken for granted that the endowment of the Nobel Foundation would be preserved sufficiently well in real terms to ensure that the expected prize money could be paid out.
- Finally, since the futures contract was clearly non-tradable, a significant liquidity discount would also have to be applied while doing the valuation.
Despite all these difficulties, the fact is that Mileva Maric was able to arrive at a reasonable valuation and conclude the negotiations. It is interesting to read that the information asymmetry was resolved not only by her own competence as a physicist, but also by consulting other scientists. This is similar to the use of rating agencies in debt markets.
In the real world, instruments do end up getting valued despite the theoretical difficulties involved. One of Einstein’s famous quotes is probably relevant here with suitable modifications: “God does not care about mathematical difficulties, he integrates empirically.”
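In that spirit, here is a minimal back-of-the-envelope sketch (in Python) of how one might “integrate empirically” over the difficulties listed above. Every number in it – the probability of the prize, the distribution of the waiting time, the survival, counterparty and liquidity haircuts, and the discount rate – is a purely illustrative assumption, not an estimate of what Mileva Maric actually faced.

```python
# Toy valuation, as of 1919, of a claim on Einstein's future Nobel Prize money.
# Every input below is an illustrative assumption, not historical data.

DISCOUNT_RATE = 0.05         # assumed flat annual discount rate
P_PRIZE = 0.60               # assumed risk-neutral probability that the prize is ever won
ANNUAL_SURVIVAL = 0.99       # crude survival probability per year (the prize is not posthumous)
COUNTERPARTY_HAIRCUT = 0.85  # assumed recovery of the real prize value (Nobel endowment risk)
LIQUIDITY_DISCOUNT = 0.80    # assumed discount for a completely non-tradable claim

# Assumed distribution of the waiting time (in years) until the award, conditional on winning.
# (Einstein actually won in 1921, i.e. two years after 1919.)
award_lag_probs = {1: 0.10, 2: 0.20, 3: 0.20, 5: 0.25, 10: 0.15, 20: 0.10}

value = 0.0
for years, prob in award_lag_probs.items():
    survival = ANNUAL_SURVIVAL ** years          # the laureate must still be alive
    discount = (1.0 + DISCOUNT_RATE) ** -years   # present value factor
    value += prob * survival * discount          # in units of the prize money
value *= P_PRIZE * COUNTERPARTY_HAIRCUT * LIQUIDITY_DISCOUNT

print(f"Toy value of the claim: {value:.2f} times the prize money")
```

The point of the sketch is not the number it produces, but that each of the difficulties listed above enters the valuation as a separate and highly uncertain input.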
Posted at 6:02 pm IST on Mon, 27 Dec 2010 permanent link
Categories: derivatives
Self promotion
My blog has been listed among 50 Best Business Professor Blogs by bschool.com. The list includes 15 blogs on finance and accounting. If one excludes the accounting and personal finance blogs, the list is even shorter. Perhaps too few finance professors are blogging.
Posted at 10:40 am IST on Sat, 25 Dec 2010 permanent link
Categories: miscellaneous
Hard to act against systemic risk
Richard Bookstaber said in an interview yesterday that it is not difficult to detect systemic risk – the hard part is to take useful action against it:
But I don’t think systemic risk is hard; at least monitoring systemic risk is not difficult. Nobody can hide risk of that magnitude. It’s there to be seen. As I already mentioned, it is taking action that is difficult.
I entirely agree with this assessment. For example, in Indian banking (and many other parts of the financial sector), it is not at all difficult to see that infrastructure finance is a big systemic risk:
- There is a large amount of it.
- There are huge maturity mismatches.
- Infrastructure, as the ultimate non-tradeable, is a bet on the growth of the Indian economy.
- Many of the instruments and techniques are new and many lenders have little prior experience with the sector.
- The central bank has been willing to bend the rules to facilitate expansion of credit to this sector.
- Infrastructure in India is closely bound up with real estate which is implicated in most banking crises.
- No individual bank has any incentive to worry about it because it can count on a bailout – infrastructure is too large and strategically important to be allowed to fail.
In short, it is the classic tail risk mitigated by the high likelihood of a sovereign bailout. It is simply not in the interest of anybody (lender, borrower, regulator or government) to do anything about it.
Posted at 9:29 pm IST on Fri, 24 Dec 2010 permanent link
Categories: regulation
Financial history books
In my blog post more than a month ago on books related to the global financial crisis, I promised to post a list of books related to financial history. One of the lessons from the crisis was that practitioners, regulators and academics in the field of finance need to have a good understanding of financial history.
If you wish to read only one book on financial history, I would recommend A History of Interest Rates by Sidney Homer and Richard Sylla. This book covers the evolution of credit markets over 5,000 years and is packed with charts and tables about interest rates across space and time. It was from this book that I learned for example that credit predates money and probably predates barter.
Another important book is The Early History of Financial Economics, 1478-1776: From Commercial Arithmetic to Life Annuities and Joint Stocks by Geoffrey Poitras. I found it surprising that so much of financial economics including net present value and expected utility had been developed prior to Adam Smith and the Wealth of Nations. Poitras also covers the early development of the stock market in Amsterdam and has extensive extracts from Joseph de la Vega’s pioneering book Confusion de Confusiones of 1688.
I was fascinated by The Origins of Value: The Financial Innovations that Created Modern Capital Markets edited by William N. Goetzmann. This wonderfully illustrated book covers financial innovations from ancient Mesopotamia and China to the modern world. For example, it describes how Benjamin Franklin printed United States government bonds at home introducing technical innovations designed to make counterfeiting difficult.
Manias, Panics and Crashes: a History of Financial Crises by Charles Kindleberger would probably be the favourite financial history book for many academics. It is also one of my favourite books – I have been fond of saying that one must approach the study of finance with Ito’s lemma in one hand and Kindleberger in the other. In this list, I have ranked it lower for two reasons. First, it is more a theory of financial crises than a history. Second, it covers only the period since modern financial markets developed in Holland in the late 16th and early 17th centuries.
For an institutional perspective on financial history, I enjoyed The Origins and Development of Financial Markets and Institutions: From the Seventeenth Century to the Present edited by Jeremy Atack and Larry Neal. I also learned a lot from Meir Kohn’s forthcoming books on commerce and finance in preindustrial Europe (draft chapters are available on his website).
In my post on crisis related books, I mentioned This Time is Different: Eight Centuries of Financial Folly by Carmen Reinhart and Kenneth Rogoff but I must mention it again here. This book is more of a macroeconomist’s view of financial crises than a financial economist’s view but the wealth of data and analysis in this book make it indispensable.
Among the books that I read but did not enjoy as much as the above books are:
- Devil Take the Hindmost: A History of Financial Speculation by Edward Chancellor
- The Ascent of Money: A Financial History of the World by Niall Ferguson
- False Economy: A Surprising Economic History of the World by Alan Beattie
Turning to the financial history of the 19th and 20th centuries, there is a wealth of material that is available on each country and each period. Among the books that provide an interesting multicountry perspective, I would like to mention:
- Financial Development of India, Japan and the United States: A Trilateral Institutional, Statistical and Analytic Comparison by Raymond Goldsmith
- Evolving Financial Markets and International Capital Flows: Britain, the Americas, and Australia, 1865-1914 by Lance Davis and Robert Gallman.
I believe that the plural of biography is not history and have therefore not included the numerous books that have been written about individual banks and bankers. For example, Niall Ferguson is absolutely outstanding in his two-volume book on The House of Rothschild. There are also many good books on the House of Morgan and on the Medici, but I have not seen any on the Hopes of Rotterdam/Amsterdam.
Posted at 4:44 pm IST on Fri, 10 Dec 2010 permanent link
Categories: crisis, financial history, interesting books
Where is the rupee-dollar market?
In two blog posts three months ago, I put together data from the BIS and the RBI to suggest that half of the rupee-dollar market is outside India. Now we have the data from the BIS. Table D.6 gives the location-wise break-up of the rupee-dollar market: 50% is in India, 16% in Singapore, 12% in the UK, 11% in Hong Kong, and 9% in the US, with the balance scattered across four countries (Switzerland, Australia, Japan and Canada), each with less than 0.5%. The offshore market exceeds $20 billion a day.
One of the reasons most people underestimated the size of this market is that they were looking only at Singapore, while the market is spread across several locations.
Posted at 9:09 pm IST on Thu, 2 Dec 2010 permanent link
Categories: derivatives, international finance
Stock Exchange Ownership and Competition
I have a column in today’s Financial Express about a key committee report on ownership and competition in India’s stock exchanges.
The Jalan Committee appointed by Sebi has effectively recommended the back-door nationalisation of stock exchanges, clearing corporations and depositories (market infrastructure institutions or MIIs). If these recommendations are accepted, we will extinguish the essential spark of dynamism that has given India a world class equity market.
The Jalan report is permeated from beginning to end with the ideology of socialism and the command economy, but instead of overt nationalisation, it seeks to destroy private sector exchanges and other MIIs through a thousand small cuts. Let me list out just a few key elements of this strategy:
- The profit that an exchange (or other MII) can earn would be capped at a certain annual rate of return on the previous year’s net worth. Any excess over this would be transferred to the investor protection fund (IPF) or a similar fund. The economic result would be that the existing equity capital of all exchanges effectively becomes non-cumulative preference shares with a fixed rate of dividend, while the government (in the form of the IPF) owns the economic equity of the exchange (a numerical sketch of this effect appears after this list).
- In keeping with the back-door nationalisation of the exchanges, it is proposed that their key executives should also be remunerated like bureaucrats—fixed salary with fixed annual increments, no variable pay and no form of stock options.
- The users of the market infrastructure (the members of the exchanges and the clearing corporation) would be prohibited from sitting on the board.
- Only public financial institutions and banks would be allowed to become anchor investors in exchanges and other MIIs, attenuating any residual chance of shareholder control.
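To see why the profit cap converts equity into something like a preference share, consider a minimal numerical sketch with purely hypothetical figures: shareholders keep at most a fixed return in good years, the IPF takes all the upside, and shareholders still bear the full downside.

```python
# Illustrative (hypothetical) numbers showing why a profit cap, with the excess
# transferred to the IPF, makes exchange equity behave like a non-cumulative
# preference share with a fixed maximum dividend.

CAP_RATE = 0.12  # assumed cap: maximum return on the previous year's net worth

def split_profit(profit, prev_net_worth, cap_rate=CAP_RATE):
    allowed = cap_rate * prev_net_worth      # maximum profit shareholders may keep
    to_shareholders = min(profit, allowed)   # capped, like a fixed-rate dividend
    to_ipf = max(profit - allowed, 0.0)      # all upside is transferred away
    return to_shareholders, to_ipf

for profit in [50.0, 120.0, 300.0]:          # a bad, an average and a great year (Rs crore)
    kept, transferred = split_profit(profit, prev_net_worth=1000.0)
    print(f"profit {profit:6.1f}: shareholders keep {kept:6.1f}, IPF gets {transferred:6.1f}")
```

Capped upside with uncapped downside is exactly the payoff profile of a non-cumulative preference share, which is the point made above.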
The Jalan report also takes an essentially anti-competitive position in the field of exchanges and MIIs. The view taken is that there is sufficient competition already and that it would be undesirable to have more competition: “Sebi should have the discretion to limit the number of MIIs operating in the market, in the interest of the market and in public interest.”
There are two problems with this. First, the Indian MII industry (exchanges and depositories) is a near monopoly where the dominant players have been disciplined not by actual competition but by the threat of competition. In the jargon of the economists, it is a contestable market but not a competitive market. In this context, an attempt to limit new entrants will entrench the existing near-monopoly and remove the disciplining force of potential competition.
Historical experience tells us that we have got far better and cheaper telecommunications from competitive profit-seeking companies than from a monopoly state-run public utility. Competitive private players have given us cheaper and more convenient air travel than monopoly state players. It is the same story in industry after industry. The Jalan report is ignoring this overwhelming evidence from India and elsewhere, and propounding the belief that a cosy government-controlled monopoly or oligopoly would serve the market and the public interest better than a competitive industry structure.
The Jalan report seeks to limit competition by several means quite apart from the explicit limit on the number of exchanges and depositories. The regulated public utility model makes the business very unattractive for new entrants. Anybody seeking to challenge an entrenched incumbent faces a high chance of failure, and the only incentive for entry is the prospect of large profits in case of success. The ceiling on profitability rules this out. Moreover, a new entrant would not be able to attract talented managers because of the inability to offer performance-based compensation.
As if all this were not enough, the proposed ownership norms rule out most strong strategic investors with deep pockets who have an incentive to enter the business. If only public financial institutions and banks are allowed to become anchor investors, the current incumbents are unlikely to be seriously challenged.
India, therefore, runs the risk of losing the competitive dynamism of the Indian equity markets. What is more tragic is that this is happening at a time when the stock exchange business in Asia is entering a period of regional, if not global, competition. The abortive bid by the Singapore Stock Exchange to buy the Australian Stock Exchange marked the first warning shot in this process. Asia is going to be one of the fastest growing equity markets in the world, and India’s world-class exchanges and depositories have a wonderful opportunity to occupy a pole position in this space.
At this critical juncture, when each exchange in Asia is deciding whether to be predator or prey in the emerging pan-Asian competition, the Jalan Committee is pushing India in the wrong direction. The recommendations, if implemented, would ensure that Indian exchanges never become pan-Asian institutions. Worse, Indian exchanges could even become completely unviable, if the business moves to exchanges outside India that may offer better service at more competitive prices.
Posted at 10:59 am IST on Tue, 30 Nov 2010 permanent link
Categories: corporate governance, exchanges
Bailouts and thanksgiving
I wrote a column in Wednesday’s Financial Express arguing that bailout recipients seem to regard the bailout as an entitlement and are therefore unwilling to say ‘thank you’ let alone ‘sorry.’ This sense of entitlement aggravates the risk of moral hazard.
Warren Buffett’s open letter last week (NYT, November 17, 2010) thanking ‘Uncle Sam’ for bailing him out in 2008 serves to remind us that very few corporate leaders in the US or in India have been willing to say ‘thank you’, let alone ‘sorry’, after the global financial crisis and the ensuing bailout. Warren Buffett proudly says that “My own company, Berkshire Hathaway, might have been the last to fall,” but at least he admits that he too needed help.
By contrast, many corporate leaders around the world are busy creating a revisionist history of the crisis in which they can pretend that there was no serious problem with their companies at all. I do not think of the willingness to say ‘thank you’ or ‘sorry’ as a matter of politeness and courtesy. I view it instead as a measure of whether the recipient thinks of the bailout as an entitlement that he can look forward to in future or as a piece of good fortune that may or may not be repeated in future.
Warren Buffett’s letter describes the bailout as a correct policy but makes it clear that the bailout required a confluence of ideological disposition and technical competence in key decision makers. When recipients have this perspective, the moral hazard created by the bailout is attenuated.
While I disagreed with the pro-bailout ideology of the Indian government (including the RBI and other regulators) at the time, there is little doubt that the Indian crisis response was well thought out, coordinated and implemented smoothly. Key recipients of this government bailout should be thankful to the government for this but, unfortunately, they seem to see the bailout as their right. Let us look at some examples of how this is leading to moral hazard.
Many debt-oriented mutual funds in India would have had to suspend redemptions but for the support provided by the government through the banking system. More than the mutual funds themselves, the corporate sector that had parked cash surpluses with these funds owes a big thank you to the government. For years, the corporate sector used these mutual funds as a tax-sheltered vehicle to earn a high post-tax return on its cash surpluses.
Without the government bailout, these cash surpluses would have become illiquid and inaccessible at the time of the corporate sector’s greatest need. Moreover, the corporate sector would have taken large losses on their investments as the mutual funds tried to liquidate troubled assets into a risk-averse market.
This bailout encouraged moral hazard, and within months corporate investors were pouring money back into liquid mutual funds believing that they would be protected if things went wrong. It is very disappointing that the government did not take the opportunity provided by the crisis to end the tax-sheltered status of short-term mutual fund investments. Similarly, it is very unfortunate that the government did not impose a haircut on impatient investors trying to redeem out of these funds at the height of the liquidity crisis.
Many Indian banks benefited from foreign exchange swaps extended by the RBI during the crisis. When they were unable to roll over their foreign borrowing after the Lehman failure, and when their credit default swap spreads were trading at around 10% per annum, the RBI swaps were very attractively priced. Having seen that their foreign branches can count on such support, Indian banks have been in no hurry to subsidiarise these branches.
Non-bank financial companies (NBFCs) owe a large debt of gratitude to the government for the lender of last resort (LOLR) support that they received at the peak of the crisis. Despite the fig leaf of providing the LOLR support through the banking system, the dividing line between banks and NBFCs was largely obliterated in those months.
The entire real estate sector owes a big thank you to the government for encouraging the banking system to restructure real estate loans during the crisis. More importantly, the sector benefited from the bailout of mutual funds and NBFCs, which were important sources of finance for it. Again, the moral hazard engendered by this bailout is the main reason we have to worry about a real estate bubble so soon after the crisis.
The Indian corporate sector should also be grateful to the US Fed for unleashing a flood of dollar liquidity into the world after the Lehman failure. The resulting revival of capital flows into India in mid-2009 was instrumental in repairing damaged corporate balance sheets. There is no guarantee that conditions would be the same in the next crisis.
All these bailouts contributed to the relative mildness of the 2008 crisis in India, which has made the Indian private sector complacent. As a result, Indian managers are behaving less prudently than they ought to. That itself is a source of systemic risk. It would be far better if Indian corporate leaders emulated Buffett’s sense of humility and gratitude.
Posted at 9:40 pm IST on Fri, 26 Nov 2010 permanent link
Categories: crisis
Why do Indian mutual funds intermediate inter-bank lending?
The Reserve Bank of India’s Report on Trends and Progress in Banking in India 2009-10 contains an interesting discussion on the inter-linkages between banks and debt mutual funds (Box IV.1 on page 64). As of November 2009, banks had invested Rs 1.3 trillion in debt mutual funds, but had also borrowed Rs 2.8 trillion from these funds – banks were thus net borrowers to the extent of over Rs 1.5 trillion. It appears that debt mutual funds are intermediating two kinds of flows in the debt market.
- Debt mutual funds were intermediating Rs 1.5 trillion of debt flows to the banks largely from the corporate sector.
- More interestingly, debt mutual funds were also intermediating Rs 1.3 trillion of interbank lending.
The RBI report describes the intermediation of interbank lending as follows:
When banks were arranged in a descending order by the amount of their net borrowings from MFs, public sector banks figured prominently at the upper end as major borrowers, while the new private sector banks along with SBI could be seen as major lenders to MFs.
The interbank money market is one of the oldest and most liquid markets in India. The repo market for secured lending is also well established with a central counterparty for risk mitigation, and has worked smoothly even during the turbulent periods of 2008. Why would there be a need for mutual funds to intermediate this market? Two possible reasons come to mind.
- The intermediation may be happening largely due to the tax advantages of mutual funds.
- The mutual funds may be intermediating the interest rate risk involved in investing in certificates of deposit issued by banks with several months of residual maturity, while allowing their own investors to redeem at any time. In the ultimate analysis however, this is an illusion because a mutual fund does not absorb risks, it only passes these risks on to its investors. Since the primary non-bank investors in a debt mutual fund come from the corporate sector, the question is whether the corporate sector has superior ability or willingness to assume this risk. I think the answer is no; in times of stress the corporate sector has been desperate to bail out of mutual funds. In 2008, it was left to the government to bail out the mutual funds themselves.
It appears to me that the whole system of artificial tax breaks to mutual funds has created economically useless layers of intermediation while also adding to systemic risk and fragility.
Posted at 1:52 pm IST on Mon, 15 Nov 2010 permanent link
Categories: banks, mutual funds
Mutual funds supporting their parents
Nearly three years ago, Ajay Shah sent out an email to a few of us about how the regulatory framework of a “first world country” would deal with possible conflicts of interest between a mutual fund and its parent company. At that time, it was too early in the global financial crisis for me to give the flippant answer that there are no first world countries any more – we are all third world countries.
The example that I gave then was that of UBS allegedly stuffing its own shares into its mutual fund and into the portfolios of wealth management clients and then voting them to win a proxy war against Martin Ebner way back in 1994. Holders of class N shares voting in favour of the UBS share unification proposal in that meeting were effectively voting to destroy the value of their own shares as Loderer and Zgraggen explained in an interesting paper (“When Shareholders Choose Not to Maximize Value: The Union Bank of Switzerland’s 1994 Proxy Fight”, Journal of Applied Corporate Finance, Fall 1999). Ideally therefore, portfolio investors should have sold all their class N shares ahead of the meeting.
But now there is an academic paper showing that Spanish mutual funds buy shares in their parent banks to prop up the share price after a significant fall (Golez and Marin, “Price Support in the Stock Market”, SSRN, June 2010). The paper finds compelling evidence for such price support, with careful econometrics that rules out alternative explanations like portfolio rebalancing into the banking sector, contrarian trading, timing skills or information-driven trading.
The authors point out that “Strictly speaking, price support activities by mutual funds are illegal, as the trades are not necessarily placed in the interest of the fund investors.” However they also believe that Spain is a country in which such crimes are not closely monitored and are not severely prosecuted.
Posted at 2:45 pm IST on Tue, 9 Nov 2010 permanent link
Categories: banks, corporate governance, mutual funds
Goodhart's law and leverage ratios
I am a strong believer in Goodhart’s Law, which says that any measure begins to lose its usefulness when it is used as a regulatory target. The Basel Committee paper on “Calibrating regulatory minimum capital requirements and capital buffers: a top-down approach” released last week shows that this is quite true of the leverage ratio.
Table 2 on page 17 shows that if we exclude countries which had a leverage ratio requirement before the financial crisis, the leverage ratio does predict the financial distress of banks: there is a large and statistically significant difference between the 2006 leverage ratios of banks that were stressed in the crisis of 2007-08 and of those that were not. However, when these countries are included, the difference is smaller and is not statistically significant in most cases.
The Basel Committee, which is now wedded to the idea of a leverage ratio, does not draw the conclusion that Goodhart’s Law is in operation. It refuses even to provide the only data that is truly relevant – the data for the countries which had a leverage ratio requirement pre-crisis. It is very likely that for these countries the leverage ratio would have been seen to be practically useless.
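As an illustration of the kind of comparison behind Table 2, here is a minimal sketch using entirely made-up leverage ratios; the Basel paper’s actual data and methodology are, of course, more elaborate.

```python
# Hypothetical illustration of the test in Table 2: do 2006 leverage ratios
# (capital / total assets) differ between banks that were stressed in 2007-08
# and banks that were not? All numbers below are made up.
from scipy.stats import ttest_ind

stressed_2006_leverage   = [0.028, 0.031, 0.025, 0.034, 0.029, 0.027]
unstressed_2006_leverage = [0.045, 0.052, 0.041, 0.048, 0.039, 0.050]

t_stat, p_value = ttest_ind(stressed_2006_leverage, unstressed_2006_leverage,
                            equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# Under Goodhart's Law, repeating the exercise for the countries that already
# had a leverage ratio requirement would be expected to show a much weaker gap.
```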
Posted at 2:07 pm IST on Mon, 1 Nov 2010 permanent link
Categories: leverage, post crisis finance, regulation, risk management
Crisis related books
My favourite crisis-related book, Raghuram Rajan’s Fault Lines, won the Financial Times and Goldman Sachs Business Book of the Year award last week. That gives me an excuse to write about the various crisis-related books that I have read. There are clearly many important books on the crisis that I have not read, and I cannot comment on them; my list is therefore not intended to be comprehensive.
First come the books that provide a theoretical analysis of the crisis in its entirety and not just a few aspects of it.
- Of the books that I have read in this genre, the one that I liked the most was Raghuram Rajan’s Fault Lines. It provides a comprehensive analysis of the crisis covering both the domestic factors in the US and the global factors.
- A close second was Nouriel Roubini’s Crisis Economics. Roubini’s writings during the depths of the crisis were perhaps the most insightful discussions that one could get in real time. (I remember that Roubini was also the most insightful real-time commentator on the Argentine crisis of 2001, and next only to Krugman in real-time analysis of the Asian crisis of 1997-98.) Perhaps it was because I read Roubini’s book with such high expectations that I found it a little rushed; that pushed the book to second place in my ranking. Yet I think Crisis Economics is also a book that anybody who wants an analytical understanding of the whole crisis must read.
Then there are books that provide important but partial theoretical perspectives on the crisis.
- Simon Johnson’s 13 Bankers is a wonderful book. We know a lot about emerging market crises because there have been so many of them, and Simon Johnson applies that knowledge to the crisis in the United States. The idea that the US has behaved like a third world country during the crisis is an important one to which I am very sympathetic. I talked about Mahathirism in the US and UK in a blog post in April 2008. Yet I somehow found his May 2009 article The Quiet Coup in the Atlantic more powerful and satisfying than the book itself.
- Animal Spirits by George Akerlof and Robert Shiller is a book-length analysis of what Keynes discussed in a single paragraph in the General Theory. It appears to me that “animal spirits” needs more than a paragraph, but less than a book.
- Martin Wolf’s Fixing Global Finance is a book that I liked very much, despite its heavy focus on global imbalances.
- Robert Shiller’s The Subprime Solution is a masterly analysis of the housing bubble by an economist who did warn about the bubble long before it burst. Similarly, Andrew Smithers’s Wall Street Revalued comes from an economist who made his name by warning about the dot com bubble.
- Are the Golden Years of Central Banking Over? by Stefan Gerlach and others deals at length with the monetary policy and financial regulation implications of the crisis. Richard Koo’s The Holy Grail of Macroeconomics about balance sheet recessions and the lessons from Japan is also an interesting book.
- Among the books in this genre that I read but did not enjoy to the same extent were ECONned by Yves Smith, How Markets Fail by John Cassidy, The Origin of Financial Crises by George Cooper and POP: Why Bubbles Are Great for the Economy by Daniel Gross.
The third category is books that provide a detailed factual narrative of the entire crisis.
- By far the best book here is the magnum opus by Andrew Ross Sorkin – Too Big to Fail. He has been able to put together eyewitness accounts from so many sources that it appears as if Sorkin was a fly on the wall, watching and listening as the momentous events unfolded.
- Another must read book is Henry Paulson’s On the Brink. Despite being written so soon after Paulson stepped down as Treasury Secretary, the book is surprisingly frank and detailed. He does seek to justify what he did, but manages to do so without becoming jarring. I am tempted to say that Paulson has done a far better job as an author than he did as Treasury Secretary.
- I am waiting eagerly for Bernanke’s long promised book Before Asia Opens. If Bernanke serves as Fed Chairman for as long as Greenspan did, we might have forgotten about the crisis by the time we get that book.
Another important category of crisis books looks at specific actors or groups of actors that made or lost a fortune in the crisis.
- The Big Short by Michael Lewis and Confidence Game by Christine Richard are among the most enjoyable crisis-related books that I have read. (I have yet to read Gregory Zuckerman’s The Greatest Trade Ever.)
- William Cohan chronicles the demise of Bear Stearns in House of Cards, while Roger Lowenstein has a broader sweep covering Bear, Lehman, AIG and the rest of the infamous cast in his End of Wall Street. I believe, however, that no book on Lehman is likely to match the dull prose of the 2,200-page report produced by the court examiner Anton Valukas.
- Several great books about hedge funds, quants and nerds helped me understand the crisis much better, though they are only partly (or in some cases only peripherally) about the crisis. Sebastian Mallaby’s More Money than God, Scott Patterson’s The Quants, David Leinweber’s Nerds on Wall Street, and Richard Bookstaber’s A Demon of Our Own Design are all books from which I learned a lot. They are also well written and immensely enjoyable. Lecturing Birds on Flying by Pablo Triana gives an extreme anti-quant view, but I found the book unconvincing. Gillian Tett’s fabulous book on the origin and growth of credit default swaps – Fool’s Gold – is another must-read. Selling America Short by Richard Sauer is also a very good book; all regulators should definitely read it.
I now turn to official reports related to the crisis.
- On the crisis as a whole, the IMF’s Global Financial Stability Report, the BIS Quarterly Review and the BIS Annual Report were indispensable in making sense of the crisis as it unfolded. But if I ask myself which of these I would like to re-read years later, it would be the BIS Annual Report for 2007-08, particularly its first chapter, “The unsustainable has run its course”. The Financial Crisis Inquiry Commission in the US is due to submit its report at the end of the year, but expectations about it are rather muted.
- On individual firms, I have already mentioned the Valukas report on Lehman. The Shareholder Report on UBS’s write-downs and the subsequent Transparency Report are good sources of information on UBS. The report of the Special Investigation Commission in Iceland provides detailed information about the failure of the big Icelandic banks. Some information is available about AIG in the reports of the Congressional Oversight Panel, but we still know too little about that company.
I have not mentioned the books on financial history without which one cannot make sense of the crisis at all. That, however, is a subject for a separate blog post that I hope to compose in the next few days. So in this post I will confine myself to only one book in this genre – This Time is Different by Carmen Reinhart and Kenneth Rogoff. This is a book that simply cannot be dropped from any reading list on the crisis.
Postscript: I have cheated a little. Many of the books that I have mentioned have a long subtitle in addition to the main title given here. I was too lazy to type these subtitles; moreover, the title is usually much pithier without the subtitle. I have also consciously avoided giving links to Amazon or any other book site for any of these books, in the belief that any half-decent search engine will make up for this omission.
Posted at 4:44 pm IST on Sun, 31 Oct 2010 permanent link
Categories: crisis, financial history, interesting books
Some more thoughts on the flash crash
I have written a number of times about the flash crash during the last five months (here, here, here, here, here, and here). This post tries to put together the key issues as I see them.
- The flash crash happened towards the end of the day in the US, when Asia and Europe were both closed and therefore nothing happened in those markets. The rest of the world has thus been able to think of this as America’s problem. Had it happened when other markets were open, it would have been everybody else’s problem.
Quite apart from that, I think the flash crash is a serious issue for regulators in all countries. Large market orders interacting with a thin order book can cause a flash crash in any market anywhere in the world.
- The crash happened on a relatively calm and benign day. Yes, there were some protests in Greece, but that hardly counts as a crisis situation. Things would have been much worse if the crash had happened when a big bank was tottering or when a serious terrorist attack or a grave natural calamity was in progress.
In this sense, the flash crash episode can even be regarded as a good thing to have happened. Without causing much damage, it provided a wake-up call to regulators to analyze market designs and make them more robust. If we do not take corrective action, we will probably end up facing a much bigger market disturbance on a bad day.
- It is a mistake to think that the flash crash can happen only with automated high frequency trading. With a sufficiently thin order book, a flash crash can be triggered by a cascade of stop-loss orders (at one level, a stop-loss order is also an algorithm). Indeed, even a cascade of market orders is sufficient, as the sketch below illustrates.
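Here is that sketch: a toy simulation, with an entirely hypothetical and deliberately thin order book, in which a single large market sell order sets off a cascade of stop-loss orders and drives the price from $40 to a penny. No high frequency trading is involved.

```python
# Toy illustration: market sell orders walking down a thin, entirely hypothetical
# order book. Each fall in the traded price triggers further stop-loss selling,
# so the cascade feeds on itself even though nothing more sophisticated than a
# stop-loss order is involved.

# resting buy orders as (price, quantity), best bid first - deliberately thin
bids = [(40.00, 500), (39.50, 300), (38.00, 200), (35.00, 100),
        (30.00, 100), (20.00, 100), (10.00, 100), (0.01, 1000)]

# stop-loss sell orders: trigger price -> quantity released as a market order
stops = {39.00: 400, 36.00: 400, 25.00: 500}

def execute_market_sell(qty):
    """Fill a market sell order against the book; return the last traded price."""
    last_price = None
    while qty > 0 and bids:
        price, available = bids[0]
        traded = min(qty, available)
        qty -= traded
        last_price = price
        if traded == available:
            bids.pop(0)                      # this price level is wiped out
        else:
            bids[0] = (price, available - traded)
    return last_price

pending = [900]                              # the initial large market sell order
while pending:
    last = execute_market_sell(pending.pop(0))
    print(f"last trade at {last:.2f}")
    # any stop-loss orders triggered by the new, lower price join the cascade
    for trigger in [t for t in stops if last is not None and last <= t]:
        pending.append(stops.pop(trigger))
```

With these made-up numbers, the printed trade prices fall from $38 to $20 to $0.01 in three rounds of the cascade.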
- It is time to reconsider the whole issue of internalization that allows brokers to execute proprietary trades against customer order flow without routing everything to a transparent execution venue.
The SEC report discusses internalization at length, but its cognitive capture is so complete that it does not see anything wrong in what happened. What are the fiduciary responsibilities of a market maker who thinks that the market is too risky to trade in on its own account but merrily routes customer orders into that same market? Is it acceptable to argue that this is fine because the trades would be cancelled anyway?
The CFTC-SEC report has not given retail investors confidence in the integrity of the market structure. It has failed to provide a convincing and conclusive answer to whether any market abuses happened during the flash crash.
- There is a need to examine whether the SEC’s implementation of the National Market System is fundamentally flawed.
The technology underlying the dissemination of the National Best Bid and Offer (NBBO) is so antiquated compared to the rest of the modern market infrastructure that the professionals seem to be relying only on proprietary price feeds.
At the same time, as trading gets fragmented across multiple execution venues, the need for best execution is only becoming more and more pressing.
As trading has moved to ever shorter latencies, the surveillance capability of the regulators has fallen far behind. Regulators do not seem to have the capabilities to analyze high frequency data at all and I think this is a serious problem.
There is a major problem with the accuracy of the time stamps of major market infrastructure providers. Today, using GPS time synchronization and PTP instead of NTP, it is possible to achieve accuracy of a few microseconds, and regulators need to mandate this for systemically important entities.
- We should consider completely eliminating the ability of exchanges to cancel wrong trades.
The SEC has moved to make the cancellation process a little more objective and transparent than before. But today’s information technology makes it possible to get rid of trade cancellation completely. If an objective algorithm exists to determine the trades to be cancelled, the same algorithm can be used to prevent those trades from happening at all. Unlike in a manual system, the reaction time of a computer can be fast enough to “cancel” the trade in real time by not matching the trade at all.
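A minimal sketch of the idea, assuming a hypothetical price band of the kind used to break trades on May 6: the matching engine applies the same objective rule before matching, and simply refuses to print a trade that would otherwise have to be cancelled.

```python
# Sketch of 'prevent rather than cancel': the same objective rule that is used
# after the fact to break clearly erroneous trades (here, a hypothetical band of
# 60% around a reference price, echoing the rule applied on May 6) is applied
# by the matching engine before a trade is ever printed.

BAND = 0.60  # hypothetical threshold

def is_clearly_erroneous(trade_price, reference_price, band=BAND):
    return abs(trade_price - reference_price) > band * reference_price

def try_match(best_opposite_price, reference_price):
    """Return the trade price, or None if the trade would only be cancelled later."""
    if is_clearly_erroneous(best_opposite_price, reference_price):
        return None              # refuse to match instead of cancelling afterwards
    return best_opposite_price

# reference price $30 (roughly pre-crash Accenture), a stale bid resting at $0.01
print(try_match(best_opposite_price=0.01, reference_price=30.00))   # None: trade prevented
print(try_match(best_opposite_price=29.00, reference_price=30.00))  # 29.0: normal trade
```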
- The evidence seems to suggest that the market structure has become so fragile that it is an accident waiting to happen. I am reminded of the old nursery rhyme:
For want of a nail the shoe was lost.
For want of a shoe the horse was lost.
For want of a horse the rider was lost.
For want of a rider the battle was lost.
For want of a battle the kingdom was lost.
And all for the want of a horseshoe nail.
When this kind of thing happens, the solution is not to go in search of the blacksmith who lost the nail (or the mutual fund that did a large futures trade), but to build more buffers and make the system more robust so that a few nails and horses can be lost without catastrophic consequences. Just as we stress test individual banks, it is also necessary to stress test the entire market structure.
The best way to do that is to build simulated computer models of the entire market structure including the most popular trading algorithms and then stress test the whole edifice.
Posted at 8:23 pm IST on Wed, 20 Oct 2010 permanent link
Categories: exchanges, investigation
More on Flash Crash Report
I have a column in yesterday’s Financial Express with a more detailed discussion about the flash crash report that I blogged about (here / here) last week.
The Commodity Futures Trading Commission (CFTC) and the Securities and Exchange Commission (SEC), which regulate the US equity futures markets and equity markets respectively, have released a hundred-page report on the flash crash of May 6, 2010. On the afternoon of May 6, the broad market index in the US dropped by over 5% in the space of less than five minutes, only to bounce back in the next five minutes. Then, even as the broad index was recovering, several stocks crashed to near zero. For example, Accenture fell from $30 to $0.01 in the space of seven seconds and then snapped back to the old level within two minutes.
The worst sufferers were retail investors who found their orders executing at absurd prices. A retail sell order (possibly a stop-loss order) might have been triggered when Accenture was trading at $30, but the order might have ended up being executed at $0.01. Some, but not all, of the damage was undone when the exchanges cancelled all trades that were more than 60% away from the pre-crash prices.
The CFTC-SEC report claims that the crash was triggered when a mutual fund sold $4.1 billion worth of index futures contracts very rapidly. The mutual fund’s strategy was to sell one contract for every ten contracts being sold by other traders, so that it would account for about 9% of all trades during each minute until the entire order was executed. The large order allegedly confused the high frequency traders (HFTs) who, having bought from the mutual fund, found themselves holding a hot potato, and then tried to pass the potato around by trading rapidly with each other. The result was a sharp rise in the HFTs’ trading volume, and this higher volume fooled the mutual fund’s algorithm into selling even faster to maintain the desired 9% participation rate. This set up a vicious circle of sharp price declines.
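To see the feedback loop the report describes, here is a minimal sketch of a volume-participation execution algorithm. The mechanics and all the numbers are assumptions for illustration; the actual algorithm used on May 6 has not been published.

```python
# Toy volume-participation execution algorithm, illustrating the feedback the
# CFTC-SEC report describes: the algorithm targets ~9% of total volume, so when
# other traders' volume rises (for whatever reason), the algorithm sells faster.
# All numbers are hypothetical.

TOTAL_ORDER = 75_000        # contracts to sell (as in the report)
PARTICIPATION = 0.09        # target share of total volume

def run(other_volume_per_minute):
    remaining = TOTAL_ORDER
    minute = 0
    while remaining > 0:
        minute += 1
        other = other_volume_per_minute(minute)
        # sell 1 contract for every 10 traded by others: 1/(1+10) ~ 9% of the total
        own = min(remaining, round(other * PARTICIPATION / (1 - PARTICIPATION)))
        remaining -= own
    return minute

calm   = run(lambda m: 20_000)                        # steady background volume
frenzy = run(lambda m: 20_000 if m < 5 else 80_000)   # volume spikes after a few minutes
print(f"calm market: done in {calm} minutes; frenzied market: done in {frenzy} minutes")
```

Because the algorithm’s selling speed is tied to total volume, anything that inflates volume – including frantic trading by others – makes it sell faster, which is the vicious circle the report alleges.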
This story makes for a great movie plot, but in an official investigative report one expects to see evidence. Sadly, the report provides no econometric tests like vector autoregressions or Granger causality tests on tick-by-tick data to substantiate its story. Nor are there any computer simulations (using agent-based models) to show that the popular HFT algorithms would exhibit the alleged behaviour when confronted by a large price-insensitive seller.
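For instance, here is a minimal sketch of the kind of Granger causality test one might run on tick-by-tick (here, simulated) data to check whether the fund’s selling helps predict HFT volume. The data-generating process is entirely made up; the point is only to show that such tests are routine.

```python
# Sketch of a Granger causality test of the kind the report could have reported:
# does series x (e.g. the fund's selling per second) help predict series y
# (e.g. HFT volume per second)? The data here is simulated, purely for illustration.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 600                                   # e.g. 600 one-second intervals
x = rng.normal(size=n)                    # stand-in for the fund's selling
y = np.zeros(n)
for t in range(2, n):                     # y responds to lagged x, plus noise
    y[t] = 0.6 * x[t - 1] + 0.3 * x[t - 2] + rng.normal()

# grangercausalitytests expects a 2-column array and tests whether the SECOND
# column Granger-causes the FIRST.
data = np.column_stack([y, x])
results = grangercausalitytests(data, maxlag=3, verbose=False)
for lag, res in results.items():
    fstat, pvalue = res[0]["ssr_ftest"][:2]
    print(f"lag {lag}: F = {fstat:.1f}, p = {pvalue:.4f}")
```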
Moreover, the data in the report itself casts doubt on the story. More than half of the mutual fund’s $4.1 billion trade was executed after prices began to recover. And the report suggests that the hot potato trading was set off by the selling of a mere 3,300 contracts ($180 million notional value). In one of the most liquid futures markets in the world, $180 million is not an outlandishly large trade. Surely, there must have been such episodes in the past and if there is a hot potato effect, it must have been observed. The report is silent on this. Further, the one thing that HFTs are good at is analysing past high frequency data to improve their algorithms. Would they not then have observed the hot potato effect in the past data and modified their algorithms to cope with that? Finally, during the most intense period of the alleged hot potato trading, the HFTs were net buyers and not net sellers. This suggests that perhaps the potato was not so hot after all.
The analysis in the report is even more flawed when it comes to the issue that concerns retail investors most – the carnage in individual stocks that began two or three minutes after the index began its recovery. The report bases its conclusions on this issue almost entirely on extensive interviews with the big Wall Street firms – market makers, HFTs and other brokers.
Astonishingly, the regulator did not find it necessary to interview retail investors at all. This is like a policeman investigating a theft without talking to the victim. There is no discussion of whether retail investors were confused, misled or exploited. For example, the report dismisses concerns about delays in the public price dissemination because all the big firms subscribe to premium data services that did not suffer delays. If delayed data led to wrong decisions by retail traders, that apparently is of no concern to the regulators.
In the post-crisis, post-Madoff world, we expect two things from regulatory investigations. First, we expect regulators to have the capability to investigate complex situations using state-of-the-art analytical tools. Second, we expect them to carry out an unbiased investigation without giving high-profile regulated firms undue importance.
On both counts, the CFTC-SEC report is disappointing. After five months of effort, they do not seem to have come to grips with the terabytes of data that are available. The analysis does not seem to go beyond presenting an array of impressive graphs. Most importantly, the regulators appear to still be cognitively captured by the big securities firms and are, therefore, reluctant to question current market structures and practices.
Posted at 11:33 am IST on Sat, 16 Oct 2010 permanent link
Categories: exchanges, investigation
Flash crash report is superficial and disappointing
The joint report by the US CFTC and SEC on the flash crash of May 6, 2010 was released late last week. I found the report quite disappointing and superficial. Five months in the making, the report provides a lot of impressive graphs, but few convincing answers and explanations.
Consider the key finding:
- “One key lesson is that under stressed market conditions, the automated execution of a large sell order can trigger extreme price movements, especially if the automated execution algorithm does not take prices into account.” This is referring to a sell order by a mutual fund for 75,000 index futures contracts with a notional value of $4.1 billion that was executed with a target participation rate of 9%. Roughly speaking, this participation rate implies that in any time interval, the algorithm tries to execute a quantity equal to one-tenth of what other traders are executing in the aggregate. The discussion in the report is hopelessly vague about what happened, but it suggests that the crucial problems described in the next point below were triggered when the first 3,300 contracts had been sold. While the report explains at length that 75,000 was a truly large order, it is clear that 3,300 contracts ($180 million) is not an outlandishly large trade. Surely, there must have been episodes in the past of a few thousand contracts being sold very quickly and the report could have provided a comparison of what happened on May 6 with what happened on those dates. And if the problem was market stress, then understanding the nature of that stress is critical.
- “Moreover, the interaction between automated execution programs and algorithmic trading strategies can quickly erode liquidity and result in disorderly markets.” The claim is that the large order confused the high frequency traders (HFTs) who increased their trading volume, and this higher volume fooled the mutual fund’s algorithm into selling even faster. As far as I can see, this is pure speculation. I would have liked to see this phenomenon demonstrated using agent based models or some other sound methodology.
- “As the events of May 6 demonstrate, especially in times of significant volatility, high trading volume is not necessarily a reliable indicator of market liquidity.” This is referring to the fact that: “ Moreover, compared to the three days prior to May 6, there was an unusually high level of ‘hot potato’ trading volume – due to repeated buying and selling of contracts – among the HFTs, especially during the period between 2:41 p.m. and 2:45 p.m. Specifically, between 2:45:13 and 2:45:27, HFTs traded over 27,000 contracts, which accounted for about 49 percent of the total trading volume, while buying only about 200 additional contracts net.” However, no explanation at all is provided for this – if HFTs were desperately trying to pass around the hot potatoes that they had acquired minutes earlier, why were they buying 200 more hot potatoes?
Even if we were to accept the conclusion of the report that a mutual fund selling $4 billion of index futures in 20 minutes is an adequate explanation of the flash crash in the index markets, there is still the issue of what happened in specific stocks. The index began its recovery at 2:45:28 but the carnage in individual stocks happened a few minutes later at 2:48 or 2:49. The report attributes this to the withdrawal of liquidity by market makers at around 2:45. At this point, the report relies on extensive interviews with market makers and fails to substantiate key assertions with hard facts.
Some portions of the report look more like a journalist’s casual empiricism than the hard analysis that one expects in a fact-finding report. For example, on page 66, there is a discussion of data about a single market maker that concludes with a whole string of tainted phrases: “If this example is typical ... it seems that ... This suggests that ... From this example it does not seem that ... ” I could understand all this five weeks after the crash, but not five months later.
A few less important quibbles:
- I also found some of the graphs difficult to interpret. For example, the charts on the order book are colour coded relative to the mid-price of the stock. Since this price was falling dramatically, it is difficult to see what parts of the order book were actually being eliminated through order execution or order cancellation. That prices can fall to near zero only if the buy side of the order book is exhausted is a tautology. One wants to see how and when the buy orders were cancelled or got executed.
- Most of the reported data for individual stocks is at one minute intervals and some at fifteen minute intervals when other analysts have been looking at events on a millisecond time scale. At a one minute time scale, several things happen simultaneously – for example, prices fall and order books shrink – but it is difficult to see what happened first. Causality is hard, but is it too much to ask for at least the sequencing to be described accurately?
- The report suggests that the order books of ETFs had less depth far from the mid quote and this led to the disproportionate incidence of broken trades in them. However, in the first part of the report, it is shown that the ETF on the S&P 500 (known as SPY) performed better than the index future itself. This suggests that not all ETFs are similar in the fragility of their order book, and I would have liked to see some exploration of this issue.
In conclusion, the report leaves me disappointed on the three critical questions that I asked myself after reading it:
- How far does the report provide confidence to an investor that with the corrective action taken since May 6, 2010, market prices are reliable? I think the report is totally unconvincing on this score.
- Does the report provide evidence that the post-Madoff SEC (and CFTC) can analyze a complex situation and arrive at a top quality analysis? Despite the high calibre of resources that have been recruited into the SEC in the last year or two, the quality of the report leaves much to be desired.
- Does the report show that regulators have escaped cognitive capture by their regulatees? I am sorely disappointed on this score. The report draws on extensive interviews with traditional equity market makers, high-frequency traders, internalizers, and options market makers. Apparently, nobody thought it fit to interview retail investors to understand how market distortions and data feed delays affected their order placement strategies. For example, Nanex has claimed that the Dow Jones index was delayed by 80 seconds. Were retail investors who follow the Dow Jones thinking that the market index was still falling even when professionals could see that it was recovering? Did this cause panic selling? For a cognitively captured regulator, it is sufficient to report that “Most of the firms we interviewed ... subscribe directly to the proprietary feeds offered by the exchanges. These firms do not generally rely on the consolidated market data to make trading decisions and thus their trading decisions would not have been directly affected by the delay in data in this feed.”
Posted at 2:19 pm IST on Sun, 3 Oct 2010 permanent link
Categories: exchanges, investigation
BIS Confirms Huge Offshore Rupee Market
In my post early this month, I very tentatively argued that data from the BIS and the RBI could be put together to suggest that half the rupee-dollar market was outside India. Most people whom I talked to said that this was unlikely and that there was probably some error either in the data or in my analysis.
But now the BIS has published a paper on offshore foreign exchange markets which gives clearer data. According to Table 7 of this paper, 52% ($10.8 billion) of the total rupee forward and forex swap market ($20.8 billion) is offshore and only 48% ($10.0 billion) is onshore. We still need to wait for November for more detailed data on other segments of the rupee market, but $10.8 billion a day is much larger than most estimates that I have seen or heard of the offshore non deliverable forward market.
The same table provides data for 2007 as well – only 30% ($3.6 billion) of the rupee forward and forex swap market was offshore then. In just three years, the offshore market has tripled in size! A footnote to the table cautions that the mandatory reporting of trades in the rupee (and several other emerging market currencies), following the reclassification of these currencies as major currencies, would have increased the reported size of the offshore markets.
According to the BIS Paper, the offshore markets are even bigger for the Chinese renminbi (63% is offshore but much of that is in Hong Kong) and the Brazilian real (82% is offshore). The paper argues that offshore non deliverable markets in the Brazilian real, Chinese renminbi and Indian rupee are now so large that “adding an offshore deliverable money and bond market may not represent a large change.”
Suddenly, we are waking up to a much more internationalized currency than any of us were aware of. I have long argued that Indian capital controls are more “sand in the wheels” than effective barriers to capital flows. The data points in the same direction – policy makers must recognize that India has a de facto open capital account.
Posted at 2:51 pm IST on Sun, 26 Sep 2010 permanent link
Categories: international finance
Stock Exchange Regulation and Competition
This column of mine on stock exchange regulation and competition appeared in the Financial Express today. Coincidentally, yesterday evening the Securities and Exchange Board of India released its order rejecting the application of MCX-SX to start trading equities in India. My column was written early this week, well before SEBI passed its order. I am not yet masochistic enough to sit up all night to read a 68-page order and then write a column about it.
The ongoing dispute regarding the shareholding pattern of MCX-SX is an opportunity to rethink the current regulatory conception of the stock exchange as the extended regulatory arm of the state. Given this regulatory conception, the ownership structure (legal ownership, ownership of economic interests, and ownership of control rights) of the stock exchange becomes a matter of public policy. However, the requirement to have dispersed shareholding is likely to result in an unacceptably anti-competitive outcome.
There is a different way of looking at stock exchanges—not as frontline regulators, but as the equivalent of a shopping mall for securities. We do expect shopping malls to comply with the building safety code, but do not expect them to “regulate” the shop owners. If we treat stock exchanges the same way, we would expect them to comply with basic regulations regarding trading systems and infrastructure, but would not expect them to regulate either the listed companies or the stock brokers.
We could not have thought of stock exchanges like this a century ago because there were no securities regulators in those days, and the central banks too did not bother to regulate the markets. For example, in the US a hundred years ago, it was left to the New York Stock Exchange to demand that companies publish their annual accounts (and to delist even large companies like Procter & Gamble for refusing to do so). Similarly, in those days it was the London Stock Exchange that imposed free float requirements (67% free float!) because there was no other regulator to do so. Today, we expect the securities regulators, the company law departments and the accounting bodies to perform much of this regulatory role.
The time has come to ask whether the stock exchange should be a listing authority at all in an era of demutualised stock exchanges and alternate trading systems. There are stock exchanges elsewhere in the world that are listed on themselves; this appears to me to be as absurd as a snake swallowing its own tail. Of course, the alternative of a stock exchange listing on a rival exchange is only slightly less laughable. Some countries have shifted the listing function into an arm of the regulator itself and I think there is much to commend such a move.
Another problematic area is that of market surveillance. In an era of highly interconnected markets, the idea of each exchange performing surveillance on its own market is an anachronism. A stock exchange that sees only the trades happening on its own platform is no match for a market manipulator who trades in multiple cash and derivative exchanges (as well as the OTC markets) and shifts positions across these markets to avoid detection. The flash crash in the US markets on May 6, 2010, has highlighted the folly of relying on surveillance by the exchanges. Some of the best analyses of the events of that day have come not from the exchanges or the regulators but from data feed companies that specialise in processing high frequency data from multiple trading venues.
The final regulatory barrier to free competitive entry into the stock exchange industry comes from the extreme systemic importance of the clearing corporation of a major exchange. In the UK, the ability to outsource clearing to LCH.Clearnet has been very important in the emergence of alternate trading venues. Indian regulators should also explore such a solution.
Shorn of listing, surveillance and clearing, a stock exchange would be very much like a shopping mall and it would be possible to permit free entry without any significant regulatory barriers. The regulators should then be blithely unconcerned about who owns, controls or runs an exchange. By unleashing competition, this could help bring down costs and improve service levels.
In the interest of ensuring competitive outcomes, I think it would be useful to also dismantle the utterly dysfunctional “fit and proper” regime throughout the financial sector. The global financial crisis has shown that there is scarcely any bank or financial intermediary in the world that is “fit and proper” enough to be entrusted with any significant fiduciary responsibility without intrusive supervision and stringent regulation. The illusion of a “fit and proper” regime only serves to discourage private sector due diligence.
I believe that regulators worldwide should accept this reality and abandon the “fit and proper” requirement altogether. Resources devoted to screening applicants at the point of granting a licence are much better spent supervising those who are already licensed. Today, the position is the opposite. In India, banks and stock exchanges have retained their licences long after deteriorating to the point where they would not get a licence if they applied for one afresh. This is an intensely perverse, anti-competitive situation.
In short, the problems relating to shareholding pattern of stock exchanges highlighted by the MCX-SX episode should be solved not through legal hair splitting but through more robust regulatory frameworks.
Posted at 10:50 am IST on Fri, 24 Sep 2010 permanent link
Categories: exchanges, regulation
Teji-Mandi goes to Chicago
Options with one-day maturity (known as Teji and Mandi for call and put options) were popular in Indian equity markets during the 1970s and 1980s though they were prohibited under the Securities Contracts (Regulation) Act, 1956. With the equity market reforms of the 1990s, these contracts disappeared completely.
One-day options are now being proposed by the Chicago Board Options Exchange (hat tip FT Alphaville). In its regulatory filing, the CBOE says:
The Exchange believes that Daily Option Series will provide investors with a flexible and valuable tool to manage risk exposure, minimize capital outlays, and be more responsive to the timing of events affecting the securities that underlie option contracts. In particular, the Exchange seeks to introduce Daily Option Series to provide market participants with a tool to hedge overnight and weekend risk, as well as the risk of special events such as earnings announcements and economic reports, and pay a fraction of the premium of a standard or weekly option (due to the very small time value priced into the option premium). The Exchange believes that daily expirations would allow market participants to purchase an option based on a precise timeframe thereby allowing them to tailor their investment or hedging needs more effectively.
Regulatory fashions come and go – often, a financial innovation is only the reintroduction of something that existed decades or centuries ago. Much of what happens in modern equity markets (good or bad) can be traced back to early 17th Century Amsterdam. (See for example, Geoffrey Poitras, From Antwerp to Chicago: The History of Exchange Traded Derivative Security Contracts and Jose Luis Cardoso, “Confusion de confusiones: ethics and options on seventeenth-century stock exchange markets”, Financial History Review (2002), 9:109-123).
I have long believed that what was really wrong with things like Teji, Mandi or Badla in pre-reform Indian equity markets was not the instruments themselves but the absence of robust risk management and the lack of safeguards against market manipulation. There is nothing wrong with daily options – many high frequency trading strategies might effectively be replicating short maturity options. In fact, I wonder whether there is merit in options with even shorter maturities – hourly, if not shorter.
Posted at 5:03 pm IST on Sat, 11 Sep 2010 permanent link
Categories: derivatives
Is half the rupee-dollar market outside India?
I do not know whether I am reading the data wrong, but the preliminary results of the BIS Triennial Survey on foreign exchange turnover (April 2010) appear to suggest that nearly half the rupee-dollar market is outside India.
For the first time, the 2010 BIS survey includes the rupee as a “main currency” and provides data about the USD/INR turnover. According to Table 4, the average daily turnover in USD/INR was $36 billion. Table 5 tells us that the average daily foreign exchange turnover in India was only $27.4 billion. That might suggest that India accounts for 75% of the USD/INR market.
However, not all of the Indian market is USD/INR. According to the RBI data (Table 47 of the RBI bulletin of June 2010), in April 2010 INR versus all foreign currencies accounted for only 71% of the market in India; almost 30% was trading of various foreign currencies against each other. In this computation, I have taken both sides of the merchant trades and only one side of the inter-bank trades as the BIS data is on a net basis. Of course, the BIS also does cross-border netting which might change the numbers a bit, but I would think the percentages would not be impacted too much.
If we make the reasonable assumption that the entire INR versus foreign currency market in India is actually INR versus USD and take 71% of $27.4 billion as the USD/INR market in India, we get $19.5 billion which is only 54% of the $36 billion global USD/INR market. If these calculations are approximately correct, India is only a little more than half of the total global rupee-dollar market.
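For what it is worth, the whole back-of-the-envelope calculation can be laid out explicitly. The sketch below simply reproduces the arithmetic above using the figures quoted in the text:

```python
# Reproducing the back-of-the-envelope arithmetic above (average daily turnover
# in billions of US dollars, April 2010, as quoted in the text).
global_usd_inr = 36.0        # BIS Table 4: global USD/INR turnover
india_total_fx = 27.4        # BIS Table 5: all foreign exchange turnover in India
inr_share_in_india = 0.71    # RBI data: INR against foreign currencies, share of Indian market

# Naive comparison that ignores the currency composition of the Indian market
print(f"naive India share of USD/INR: {india_total_fx / global_usd_inr:.0%}")

# Assume the entire INR leg of the Indian market is against the USD
india_usd_inr = inr_share_in_india * india_total_fx
print(f"estimated onshore USD/INR turnover: {india_usd_inr:.1f}")          # about 19.5
print(f"India's share of global USD/INR:    {india_usd_inr / global_usd_inr:.0%}")  # about 54%
```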
One other relevant data point is that according to the BIS Table 5, India’s share of the global foreign exchange market has dropped from 0.9% in 2007 to 0.5% in 2010.
Unfortunately, while the BIS links to various central banks that publish their national results at the same time as the BIS, the RBI is not among them. So we do not have more data to verify these computations.
Posted at 3:18 pm IST on Fri, 3 Sep 2010 permanent link
Categories: international finance
How Fast Can Traders Add and Multiply?
I have written a paper entitled “When Index Dissemination Goes Wrong: How Fast Can Traders Add and Multiply?” It has also been uploaded at SSRN.
The abstract is as follows:
This paper studies an episode of dissemination of wrong stock index values in real time due to a software bug in the Indian Nifty index futures market on the morning of January 18, 2006.
The episode provides an opportunity to test various models of cognitive biases and bounded rationality highlighted in behavioural finance. The paper provides strong evidence against cognitive biases like “anchoring and adjustment” (Tversky and Kahneman, 1974) that one might expect under such situations even though the cognitive task involved is quite simple. The futures market tracked the true Nifty index which it could not see while completely ignoring the wrong Nifty index that it could see.
However, the paper demonstrates that market efficiency failed in more subtle ways. There is evidence of a partial breakdown of price discovery in the futures markets and a weakening of the bonds linking futures and cash markets.
This evidence is consistent with the centrality of “market devices” as argued in “actor network theory” in economic sociology (Muniesa, Millo and Callon, 2007 and Preda, 2006). Well functioning markets today depend critically on a whole set of information and communication technologies. Any failures in these material, socio-technical aspects of markets can make markets quite fragile even if behavioural biases are largely absent.
Posted at 9:58 am IST on Wed, 1 Sep 2010 permanent link
Categories: behavioural finance, benchmarks, exchanges
Loss absorbency of regulatory capital
The Basel committee on banking supervision has put out a proposal to ensure the loss absorbency of regulatory capital at the point of non-viability.
The proposal points out that all regulatory capital instruments are loss absorbent in insolvency and liquidation – “they will only receive any repayment in liquidation if all depositors and senior creditors are first repaid in full.” However, the financial crisis has revealed that many regulatory capital instruments do not always absorb losses in situations in which the public sector provides support to distressed banks that would otherwise have failed.
The solution that is proposed is as follows:
All non-common Tier 1 instruments and Tier 2 instruments at internationally active banks must have a clause in their terms and conditions that requires them to be written-off on the occurrence of ... the earlier of: (1) the decision to make a public sector injection of capital, or equivalent support, without which the firm would have become non-viable, as determined by the relevant authority; and (2) a decision that a write-off, without which the firm would become non-viable, is necessary, as determined by the relevant authority.
I am unable to understand how such tweaking of contractual terms will get around the fundamental problem that governments are unwilling to impose losses on banks and their stakeholders. In the United States, when the government injected TARP capital into the banks, it forced the healthy banks to take the capital as well, to eliminate any stigma associated with TARP capital. Even if the proposed clause had been present in the regulatory capital instruments issued by the insolvent banks of that time, it clearly would not have been triggered by an injection of TARP capital in this gentle form.
Posted at 3:51 pm IST on Fri, 27 Aug 2010 permanent link
Categories: banks, leverage, regulation
Two curve models
Updated: corrected the attribution of the second quote.
I am posting for comments an introductory note on two curve models. This note, which I wrote largely to improve my own understanding of the subject, represents my interpretation of the results presented in various papers listed in the bibliography and does not claim to contain any original content.
The following two quotations give a sense of what two curve models are all about:
LCH.Clearnet Ltd (LCH.Clearnet), which operates the world’s leading interest rate swap (IRS) clearing service, SwapClear, is to begin using the overnight index swap (OIS) rate curves to discount its $218 trillion IRS portfolio. Previously, in line with market practice, the portfolio was discounted using LIBOR. ... After extensive consultation with market participants, LCH.Clearnet has decided to move to OIS to ensure the most accurate valuation of its portfolio for risk management purposes. (LCH.Clearnet, June 17, 2010)
Ten years ago if you had suggested that a sophisticated investment bank did not know how to value a plain vanilla interest rate swap, people would have laughed at you. But that isn’t too far from the case today. (Deus Ex Machiatto, June 23, 2010)
The topic is quite complex even by the standards of this blog and it takes me twelve pages of occasionally dense mathematics to explain what I have understood of its basic ideas. To fully understand two curve models you would need to read a lot more of even denser mathematics. If you do not have the patience for that, you should just ignore this post.
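For readers who want just a taste of the idea, here is a minimal numerical sketch of my own (it is not taken from the note): a single floating coupon is projected off the LIBOR curve but discounted off the OIS curve, and the two discounting conventions give different present values. The flat rates and dates are purely illustrative.

```python
import math

# A minimal sketch (my own illustration, not taken from the note): one floating
# coupon, projected off a flat LIBOR curve and discounted off a flat OIS curve.
# All rates are continuously compounded and purely illustrative.

notional = 100.0
t1, t2 = 0.5, 1.0            # accrual period of the floating coupon, in years
tau = t2 - t1                # year fraction

libor_zero = 0.03            # forecasting (LIBOR) curve
ois_zero = 0.02              # discounting (OIS) curve

def df(rate: float, t: float) -> float:
    """Discount factor for a flat continuously compounded zero rate."""
    return math.exp(-rate * t)

# Simple forward LIBOR for [t1, t2] implied by the forecasting curve
fwd = (df(libor_zero, t1) / df(libor_zero, t2) - 1.0) / tau

# Pre-crisis single-curve practice: project and discount on the LIBOR curve
pv_single = notional * tau * fwd * df(libor_zero, t2)

# Two-curve practice: project on LIBOR, discount on OIS
pv_two_curve = notional * tau * fwd * df(ois_zero, t2)

print(f"forward LIBOR for the period:   {fwd:.4%}")
print(f"PV of the coupon, single curve: {pv_single:.4f}")
print(f"PV of the coupon, two curves:   {pv_two_curve:.4f}")
```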
What I would really love is for some of my readers to provide comments and suggestions to improve this note and correct any errors that might be there.
Posted at 3:49 pm IST on Fri, 20 Aug 2010 permanent link
Categories: bond markets, derivatives, post crisis finance, risk management
RBI on new bank licences in India
I have a column in today’s Financial Express about the RBI’s discussion paper on new bank licences.
The Reserve Bank of India (RBI) has begun a very open and transparent process of thinking about new bank licences with a discussion paper that outlines the key issues, presents the pros and cons of the alternatives and also highlights the international experience.
While discussing these important issues, it is necessary to keep two things in mind. First, we must learn the right lessons from the experience of the new bank licences given out in the post-reforms period. Second, the global financial crisis has changed the way we think about bank regulation and competition.
Of the 10 new banks licensed in the first phase, the majority were failures in the broadest sense, but a few became outstanding successes. The viewpoint in the RBI discussion paper is that we must identify the causes of the failures and avoid making the same mistakes when granting new licences.
I am of the completely opposite persuasion. The success of the 1993 experiment was that enough licences were granted to permit a few success stories to emerge, despite a low success rate. Capitalism to me is about liberal experimentation and ruthless selection. What is wonderful about the 1993 experiment is that the failures (although many) were on a small scale and were (with one exception) quite painless, while the successes were outstanding. This makes for an extremely favourable risk-reward ratio.
While handing out new licences, the goal should not be to avoid failures; it should be to maintain the same attractive risk-reward ratio. I believe that this again requires the same approach—granting many licences, allowing the market to weed out failures at an early stage, and giving enough freedom to allow the successes to bloom.
It is impossible to figure out right now what will or will not work in the emerging banking environment. It is very unlikely that a successful bank emerging from the new set of licences will be a clone of, say, HDFC Bank. HDFC Bank and its peers succeeded by identifying what the then existing Indian and foreign banks were not doing well or not doing at all, and then setting out to deliver that with high levels of efficiency. But thanks to their very success, that space has now become overcrowded and hypercompetitive. A new bank starting today will have to find a new space in which to make its mark.
No regulator can predict what that new business model will be. As Hayek once wrote: “Competition is valuable only because, and so far as, its results are unpredictable and on the whole different from those which anyone has, or could have, deliberately aimed at.”
What is important is to keep failures small and manageable, and the way to do that is to allow banks to start small. The RBI discussion paper veers towards allowing only large banks in the belief that this will keep out people who are not serious. This is a mistake that financial regulators around the world appear to make.
I still remember that at the height of the Asian Crisis, one of the few healthy and solvent Indonesian banks was one of their smallest banks (Bank NISP). But the Indonesian central bank’s response (probably encouraged by the IMF) was to impose one of the highest minimum capital requirements in the world. It would have been utterly hilarious were it not so tragic.
Equating money with seriousness is a misconception unique to the financial elite. The rest of the world does not think that a student who has got admission by paying a large donation or capitation fee is a more serious student than the one who came in on the merit list. But financial regulators in India and elsewhere have an abiding belief in the ennobling power of money. Sebi has also been talking about increasing capital requirements for its regulatees.
I believe, on the other hand, that the global financial crisis has indicated that in the world of finance, size is evil in itself. Simon Johnson’s brilliant new book 13 Bankers: The Wall Street Takeover and the Next Financial Meltdown argues for a size limit of “no more than 4% of GDP for all banks and 2% of GDP for investment banks.”
By this standard, which I consider quite reasonable, India has seven banks above the size limit, including one that is almost 20% of GDP (I have taken the bank asset data from the RBI’s Statistical Tables Relating to Banks in India, 2008-09). Of these seven banks, only one is in the private sector, but two other large private banks are growing fast enough to cross the size limit in the next few years.
I believe, therefore, that RBI would grant a large number of new bank licences that will rapidly bring down the concentration in the banking system. India needs a lot of banks that are small enough to fail and fewer that are too big to fail.
Somewhere in the long chain from the keyboard to the printed newspaper what should have been a “should” or a “must” in the beginning of the last paragraph became “would.” Who am I to predict what the RBI would or would not do? I do hope however that my unintended prediction turns out right!
Posted at 10:13 am IST on Fri, 13 Aug 2010 permanent link
Categories: banks, regulation
Has the greatest financial risk gone away?
I have long argued that the greatest global financial risk is not toxic derivatives or bad loans – it is the unnerving possibility that P=NP. P≠NP is a conjecture about an abstruse problem in mathematics, but too much of computer security depends on it. It is likely that if P=NP, then many financial assets that are recorded as electronic entries could suddenly evaporate because those entries could all be hacked. Since almost all financial assets today are in electronic form, that would be the end of finance as we know it.
During the last couple of days, a purported proof that P≠NP has been circulating on the web (hat tip Bruce Schneier). The hundred-page paper by Vinay Deolalikar of HP Research Labs, Palo Alto utilizes and expands “upon ideas from several fields spanning logic, statistics, graphical models, random ensembles, and statistical physics” to obtain the purported proof. We still do not know whether the proof is correct (see here and here).
It reminds me of the early days of Wiles’s proof of Fermat’s last theorem or Perelman’s proof of the Poincaré conjecture. Everybody agrees that it is a serious proof, but nobody yet knows whether it is right. But if Deolalikar is right, the biggest financial risk of all has gone away.
Posted at 5:38 pm IST on Tue, 10 Aug 2010 permanent link
Categories: mathematics, technology
RBI proposes CDS market by dealers and for dealers
The Reserve Bank of India has released the report of the Internal Group on Introduction of Credit Default Swaps for Corporate Bonds in India.
What is proposed is a market by dealers and for dealers. Users can only buy CDS protection, and they have to buy them from dealers (banks and other regulated entities) who are the only people allowed to sell CDS. But the most diabolical recommendation is the following:
The users can, however, unwind their bought protection by terminating the position with the original counterparty. ... Users are not permitted to unwind the protection by entering into an offsetting contract. [Paragraph 2.7.6(ii) on page 19]
This leaves the unwinding users at the complete mercy of the original dealer from whom they bought CDS protection – that dealer can fleece the users knowing full well that they cannot go elsewhere. Under these terms, it would be utterly imprudent for a company to use CDS at all. Well-designed corporate risk management policies should demand the availability of competitive quotes both at inception and at unwind, and should therefore completely prohibit the use of the proposed CDS market. Of course, India has a number of imprudent companies with poor risk management policies; perhaps the RBI’s proposed market is suitable only for them.
The other frightening part of the proposals is that at a time when the entire world is worried about the dangers of an opaque CDS market, the report envisages the creation of a CDS market without a trade repository, let alone a clearing mandate. It envisages a trade reporting platform at some unspecified future date, but the establishment of this platform is not a precondition for CDS trading to begin. As far as clearing is concerned, the report makes the right noises, but it is clear that the RBI is not very keen on it.
Even when the trade repository starts functioning, it is unclear what transparency it would bring. First of all, the recommendation uses the word “may” which deprives it of operational significance:
The reporting platform may collect and make available data to the regulators for surveillance and regulatory purposes and also publish, for market information, relevant price and volume data on CDS activities such as notional and gross market values for CDS reference entities broken down by maturity, ratings etc., gross and net market values of CDS contracts and concentration level for major counterparties. [Paragraph 4.2.1 page 40]
Second, the report provides a trade reporting format (Form I in Annex IV) and this format does not include any data on prices at all. This means that even when the reporting platform starts working, it would not provide price transparency even on a post trade basis. What more could the dealer wish for when it comes to fleecing the customer?
One relatively minor issue which I am not able to figure out is whether the RBI intends CDS to be used to hedge loans and not only bonds. The report clearly states that only bonds can be reference obligations for CDS, but it is silent on whether loans can be deliverable obligations. Some parts of the report appear to have been deliberately worded vaguely to allow loans to be hedged. For example, “The users can buy CDS for amounts not higher than the face value of credit risk held by them” (Paragraph 2.7.6(i) page 19). That would allow loans to be hedged, and what is deliverable would presumably be decided by the Determination Committee which can be counted on to go with the banks on this issue. Whether loans can be hedged is not terribly important, but if the intention is to permit it, why not say so explicitly?
Coming back to the important prudential issue, I believe that India needs a CDS market, but I am concerned that a CDS market as proposed by the RBI would create more systemic risks than it would eliminate. If these are the only terms on which a CDS market can be had, it would be better for the country that we do not create such a market at all.
Posted at 9:11 pm IST on Mon, 9 Aug 2010 permanent link
Categories: bond markets, derivatives
Criticism of monetary policy
There has been a lively debate in India on senior central bank officials criticizing monetary policy decisions in which they may have participated. This debate has tended to focus on the harm that such alleged indiscreetness can do, while I think the important question is how to design the conduct of monetary policy in a manner where open debate does not cause harm.
Consider this passage from a paper published last month by a member of the US FOMC, the body that decides monetary policy in that country:
The U.S. is closer to a Japanese-style outcome today than at any time in recent history. In part, this uncomfortably close circumstance is due to the interest rate policy being pursued by the FOMC.
Or this from a UK MPC member who last month provocatively titled his speech “How long should the song remain the same?”:
The normal monetary policy reaction to a sustained period of above target inflation would be to tighten policy, to create demand conditions which are more conducive to restraining price increases and bringing inflation back to target. But so far, the Committee has not supported that course of action – and is keeping monetary policy extremely loose. ...
Last month, however, I dissented from this approach and voted for a small rise in interest rates. And in today’s speech I want to set out the thinking behind my view of current economic prospects and the implications for UK monetary policy that led me to that decision. ...
The MPC has a clear remit, which is to keep inflation on target at 2% over the medium term. ... we need to adjust the policy settings we put in place to head off the downside risks to inflation identified in the immediate aftermath of the big financial shocks in late 2008 and early 2009.
I think such open debate and criticism strengthens the conduct of monetary policy by allowing divergent points of view to be heard and considered. Alternative analytical frameworks can thus be developed and are available to the policy makers if and when they choose to change their mind. My knowledge of either the theory or practice of monetary policy is very limited, but I like to believe that monetary policy is closer to a science that progresses through informed debate, rather than a dark art that derives its mystique and efficacy from a veil of secrecy.
Unfortunately, criticism of monetary policy decisions by senior officials themselves can be prevented from doing harm only in a culture of transparency, where the minutes of monetary policy deliberations are published openly so that dissenting voices are not misinterpreted by the markets.
Posted at 12:59 pm IST on Mon, 9 Aug 2010 permanent link
Categories: monetary policy
Time stamping by stock exchanges
I just finished reading an interesting study of the flash crash in the US on May 6, 2010 by Nanex, a data feed company that provides high frequency real time trade and quote data for all US equity, option, and futures exchanges (over one million updates per second). The study was published in mid June and linked by Abnormal Returns a week later, but I got around to reading it only today after several blogs talked about it two days ago.
The study claims:
Beginning at 14:42:46, bids from the NYSE started crossing above the National Best Ask prices in about 100 NYSE listed stocks, expanding to over 250 stocks within 2 minutes (See Part 1, Chart 1-b). Detailed inspection indicates NYSE quote prices started lagging quotes from other markets; their bid prices were not dropping fast enough to keep below the other exchange’s falling offer prices. The time stamp on NYSE quotes matched that of other exchange quotes, indicating they were valid and fresh.
With NYSE’s bid above the offer price at other exchanges, HFT systems would attempt to profit from this difference by sending buy orders to other exchanges and sell orders to the NYSE. Hence the NYSE would bear the brunt of the selling pressure for those stocks that were crossed.
Minutes later, trade executions from the NYSE started coming through in many stocks at prices slightly below the National Best Bid, setting new lows for the day. (See Part 1, Chart 2). This is unexpected, the execution prices from the NYSE should have been higher -- matching NYSE’s higher bid price, unless the time stamps are not reflecting when quotes and trades actually occurred.
If the quotes sent from the NYSE were stuck in a queue for transmission and time stamped ONLY when exiting the queue, then all data inconsistencies disappear and things make sense. In fact, this very situation occurred on 2 separate occasions at October 30, 2009, and again on January 28, 2010. (See Part 2, Previous Occurrences).
If this is really true, then instead of criticizing only high frequency traders, we must also direct some of the blame at the exchanges which behaved irresponsibly. Why can the exchange not provide time stamps for both the time that a quote entered the queue and the time that it exited the queue?
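Providing both timestamps is technically trivial. A minimal sketch of what such a quote record could look like (the field names are hypothetical; this is not any exchange’s actual message format):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical field names, not any exchange's actual message format: a quote
# that carries both the time it was generated (entered the queue) and the time
# it was actually disseminated (left the queue).
@dataclass
class Quote:
    symbol: str
    bid: float
    ask: float
    queued_at: datetime        # when the exchange generated the quote
    disseminated_at: datetime  # when the quote left the queue and hit the feed

    @property
    def queue_delay(self) -> timedelta:
        return self.disseminated_at - self.queued_at

q = Quote("XYZ", bid=40.10, ask=40.12,
          queued_at=datetime(2010, 5, 6, 14, 45, 10, 250_000),
          disseminated_at=datetime(2010, 5, 6, 14, 45, 34, 500_000))
print(f"quote stale by {q.queue_delay.total_seconds():.3f} seconds at dissemination")
```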
Organizations with none of the self-regulatory responsibilities of an exchange do this kind of thing routinely. One of the things that I read today was this study about the dissemination of press releases on public websites. Yahoo! Finance provides two timestamps when it reports a press release on its website – the first is the timestamp of the press release itself and the second is the timestamp of when it was published on Yahoo! Finance. By comparing these two timestamps, the study concludes that “the average delay was 83 seconds. The fastest was 24 seconds and the slowest was 237 seconds or almost 4 minutes. The median was 80 seconds.”
It did not require a regulator framing rules for a public website to provide two timestamps in its stories. But perhaps exchanges will do the right thing only if they receive a direction from the regulator. And perhaps the regulator will find itself easily persuaded that this simple thing is too costly, complicated or confusing to implement.
Posted at 9:21 pm IST on Wed, 4 Aug 2010 permanent link
Categories: exchanges, regulation, technology
Risk Management for Derivative Exchanges
I wrote a chapter on risk management lessons from the global financial crisis for derivative exchanges for a book edited by Robert W. Kolb on Lessons from the Financial Crisis: Causes, Consequences, and Our Economic Future.
During the global financial crisis, no major derivative clearinghouse in the world encountered distress, while many banks were pushed to the brink and beyond. This was despite the exchanges having to deal with more volatile assets—equities are about twice as volatile as real estate, and natural gas is about 10 times more volatile than real estate. Clearly, risk management at the world’s leading exchanges proved to be superior to that of the banks. The global financial crisis has shown that the quality of risk management models does matter.
Three important lessons have emerged from this experience:
- The quality of risk management models can be measured along two independent dimensions: crudeness versus sophistication and fragility versus robustness. The crisis of 2007-2009 has shown that of these two dimensions, the second dimension (robustness) is far more important than the first dimension (sophistication).
- An apparent structural change in the economy and the financial markets may only be a temporary change in the volatility regime. Risk models that ignore this can be disastrous.
- Risk models of the 1990s, based on normal distributions, linear correlations, and value at risk, are obsolete not only in theory but also in practice (a simple simulation after this list illustrates the point).
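To make the third lesson concrete, here is a small simulation of my own (it is not drawn from the chapter): value at risk computed under a normal assumption understates the risk of a fat-tailed position, and the understatement grows at higher confidence levels.

```python
import numpy as np

# My own simulation, not drawn from the chapter: daily returns with fat tails
# (Student-t with 3 degrees of freedom), scaled to roughly 1% daily volatility.
rng = np.random.default_rng(0)
nu = 3
returns = 0.01 * rng.standard_t(nu, size=200_000) / np.sqrt(nu / (nu - 2))

sigma = returns.std()

# Value at risk under a normal assumption (zero mean), versus the empirical
# quantile of the simulated fat-tailed returns.
for level, z in [(0.99, 2.326), (0.999, 3.090)]:
    var_normal = z * sigma
    var_empirical = -np.quantile(returns, 1 - level)
    print(f"{level:.1%} VaR   normal: {var_normal:.2%}   empirical: {var_empirical:.2%}")
```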
Most of the chapter deals with these lessons from the crisis of 2007-2009. In the final section, it argues that as derivative exchanges prepare to trade and clear ever more complex products, it is important that they refine and develop their risk models even further so that they can survive the next crisis.
The chapter is based largely on a paper that I wrote in February 2009.
Posted at 1:12 pm IST on Mon, 2 Aug 2010 permanent link
Categories: derivatives, exchanges, post crisis finance, risk management, statistics
SEC No Action Letter on Rating Non Disclosure
Updated July 27, 2010: Added link and corrected title of the Reform Act.
The US SEC issued a “No Action Letter” last week to negate a key provision of the Dodd-Frank Wall Street Reform and Consumer Protection Act on the day that it came into force. The “No Action Letter” is self-explanatory:
Items 1103(a)(9) and 1120 of Regulation AB require disclosure of whether an issuance or sale of any class of offered asset-backed securities is conditioned on the assignment of a rating by one or more rating agencies. If so conditioned, those items require disclosure about the minimum credit rating that must be assigned and the identity of each rating agency. Item 1120 also requires a description of any arrangements to have such ratings monitored while the asset-backed securities are outstanding.
Effective today, Section 939G of the Dodd-Frank Act provides that Rule 436(g) shall have no force or effect. As a result, disclosure of a rating in a registration statement requires inclusion of the consent by the rating agency to be named as an expert. We note that the NRSROs have indicated that they are not willing to provide their consent at this time. In order to facilitate a transition for asset-backed issuers, the Division will not recommend enforcement action to the Commission if an asset-backed issuer as defined in Item 1101 of Regulation AB omits the ratings disclosure required by Item 1103(a)(9) and 1120 of Regulation AB from a prospectus that is part of a registration statement relating to an offering of asset-backed securities.
This no-action position will expire with respect to any registered offerings of asset-backed securities commencing with an initial bona fide offer on or after January 24, 2011.
The relevant portion of Rule 436 is as follows:
(a) If any portion of the report or opinion of an expert or counsel is quoted or summarized as such in the registration statement or in a prospectus, the written consent of the expert or counsel shall be filed as an exhibit to the registration statement and shall expressly state that the expert or counsel consents to such quotation or summarization.
(g) Notwithstanding the provisions of paragraphs (a) and (b) of this section, the security rating assigned to a class of debt securities, a class of convertible debt securities, or a class of preferred stock by a nationally recognized statistical rating organization ... shall not be considered a part of the registration statement.
I can understand a “No Action Letter” saying that, as an interim measure, the identity of the rating agency need not be disclosed, but I am amazed to find that the SEC is allowing the entire “ratings disclosure” to be omitted. That an issue is conditioned on a minimum rating requirement is, I think, a material fact.
Also, I do not understand why the SEC cannot simply state that the rating agency shall not be regarded as an expert and require the registration statement to make the same statement. Rating reports should be regarded as being in the same category as press reports and editorials. Ultimately, the SEC should simply abolish the whole category of nationally recognized statistical rating organizations (NRSROs).
I have blogged about rating agency regulatory reform several times in the last three years:
- Non-use of ratings in SEC regulations
- Stocktaking on the use of credit ratings
- Report on Rating Agency Regulation in India
Posted at 3:03 pm IST on Tue, 27 Jul 2010 permanent link
Categories: credit rating, regulation
Time for a Financial Sector Appellate Tribunal
I wrote a column in the Financial Express today on why a Financial Sector Appellate Tribunal is superior to the bureaucratic solution created by last month’s ordinance to deal with turf battles between financial sector regulators.
The old saying that “your freedom to swing your fist ends where my nose begins” is as true of regulators as it is of individuals. In a regime of multiple regulators, the autonomy of each regulator is effectively limited by the autonomy of other regulators. What this means is that regulatory autonomy is a delusion and regulatory heteronomy is the reality.
The only real question is whether this heteronomy should be judicial or bureaucratic. I argued for the judicial option in these columns four months ago (‘Fill the gaps with apex regulator’, FE, March 19). Some degree of competition between regulators is a healthy regulatory dynamic, but ultimately any dispute between two regulators must be resolved in the courts.
My recommendation was based on the well-established proposition that the legislature frames laws, the judiciary interprets them and the executive implements the law as so interpreted. If there is a dispute about a law, the judiciary can step in and interpret the law or the legislature can step in and rewrite the law to eliminate the ambiguity. The executive has to await guidance from either of these two branches. I realise that this principle is perhaps totally old-fashioned in an environment where all three branches of the government are increasingly inclined to step on each others’ turf.
However, the judicial option at least had the advantage of being acceptable to the regulators. Three months ago, when the government suggested that the dispute between Sebi and Irda regarding the regulation of Ulips be resolved by the court, none of the regulators complained about loss of regulatory autonomy.
Last month, however, the President promulgated the Securities and Insurance Laws (Amendment and Validation) Ordinance, 2010, which not only settled the Ulips dispute in favour of Irda legislatively, but also provided a new bureaucratic arbitration mechanism for certain future disputes.
Most of the regulators are upset with this on the ground that it undermines their autonomy. This is not quite the correct way of looking at it because what it does is to replace judicial arbitration of disputes by bureaucratic arbitration. A better reason for scepticism is that, in general, bureaucratic arbitration is inferior in terms of process and in terms of outcomes.
The drafting of the ordinance itself is a good example of how bureaucratic processes tend to go wrong. The intention of the new section 45Y that has been inserted into the RBI Act is to ensure that future disputes can be resolved quickly. However, as one reads the section, one realises that this section is hopelessly inadequate.
First of all, section 45Y deals only with instruments. It essentially says that if any difference of opinion arises as to whether a certain instrument is a hybrid or composite instrument and falls under the jurisdiction of RBI, Sebi or Irda, then such difference of opinion shall be referred to a joint committee consisting of the finance minister, two top finance ministry officials and the key financial regulators.
Because the Ulips dispute was about a certain instrument, the government created a statute to deal with disputed instruments. What happens if the next dispute is about institutions and intermediaries? For example, RBI may want to regulate as an NBFC an entity that Sebi regulates as a capital market intermediary. Section 45Y is helpless to deal with this dispute because the dispute is not about instruments.
The second problem with the statute is that it says: “The Joint Committee shall follow such procedure as it may consider expedient and give, within a period of three months... its decisions thereon to the Central Government.” One would have liked to see an explicit provision of decision making by majority or qualified majority. The fundamental problem with the existing HLCC is its quasi-consensual and secretive procedure and its unwillingness to rely on transparent voting. The joint committee inherits this fatal weakness.
The third problem is that the ordinance provides that the decision of the joint committee shall be binding on the regulators—RBI, Sebi, Irda and PFRDA. It does not say that the decision is binding on anybody else. In particular, it is not binding on any of the regulated entities.
Suppose, for example, the joint committee decides that a particular product offered by a bank is actually a security that falls under the jurisdiction of Sebi. If Sebi then imposes a penalty on the bank, the latter could well go to court challenging the jurisdiction of Sebi. Neither the bank nor the court is bound by the decision of the joint committee. The decision is binding on RBI, but surely RBI cannot impose a penalty for violation of a Sebi regulation.
I remain convinced that when we have swinging regulatory fists and bleeding regulatory noses, a judicial solution is far more viable and sensible than section 45Y. The time for a Financial Sector Appellate Tribunal is now.
Posted at 10:34 am IST on Wed, 21 Jul 2010 permanent link
Categories: law, regulation
RBI on India and the Global Financial Crisis
The Report on Currency and Finance 2008-09 published by the Reserve Bank of India this month is on the “Global Financial Crisis and the Global Economy.” Well over two-thirds of this 380-page report is about the global crisis itself and does not contain anything new. But about a hundred pages are devoted to the impact of the global crisis on India and to the policy responses in India.
There are a number of interesting empirical analyses in these two chapters of the report. Well, there is an occasional piece of silly econometrics like the regression in levels between trending variables in footnote 9 on page 277; the absurdly high r-square of 0.9981 should have alerted the authors to the possibility (near certainty?) that this is a spurious regression. However, most of the empirical analyses do appear to be sound econometrically.
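The spurious regression point is a textbook one and is easy to demonstrate with simulated data. The sketch below (it is not a re-estimation of the RBI’s footnote 9 regression) regresses one trending series on another, unrelated, trending series: the levels regression produces a spectacular R-squared, which collapses once the series are differenced.

```python
import numpy as np

# A sketch of the textbook spurious regression phenomenon, not a re-estimation
# of the RBI's regression: two unrelated trending series regressed in levels.
rng = np.random.default_rng(42)
n = 200
x = np.cumsum(1.0 + rng.normal(size=n))   # trending series 1
y = np.cumsum(0.6 + rng.normal(size=n))   # trending series 2, unrelated to x

def r_squared(yvar: np.ndarray, xvar: np.ndarray) -> float:
    """R-squared of an OLS regression of yvar on a constant and xvar."""
    X = np.column_stack([np.ones(len(xvar)), xvar])
    beta, *_ = np.linalg.lstsq(X, yvar, rcond=None)
    resid = yvar - X @ beta
    return 1.0 - resid.var() / yvar.var()

print(f"R-squared in levels:            {r_squared(y, x):.3f}")                      # spuriously high
print(f"R-squared in first differences: {r_squared(np.diff(y), np.diff(x)):.3f}")    # near zero
```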
A few results that I found interesting:
- “An empirical analysis confirmed that the regional stock markets in Asia including India, Hong Kong, and Singapore and global markets such as the US, UK, and Japan shared a single long-run co-integrating relationship in terms of stock price indices measured in US dollars rather than the local currencies (Table 5.10). The Indian market held the key to this integration process. This was evident from the analysis that, excluding India, the other five stock markets did not show a co-integrating relationship. The coefficients of the long-run co-integration vector showed that the impact of the global markets on the Indian stock market was more pronounced than the impact of the regional markets.” (page 213-214)
- “a bivariate VAR model revealed that there was a significant Granger causal relationship from FII flows to the Indian stock market. The feedback causality from the BSE index to FII flows was also significant, albeit at a 10 per cent level of significance. ... FII investment and money supply do exert significant influence on the movements of stock markets in India; while the relationship is the other way round in the case of investments by mutual funds wherein stock markets cause their variations.” (page 216-217). I am aware of some earlier studies that indicated Granger causality ran the other way, so it would be useful to replicate this result over longer time periods. Unfortunately, here as in most of their other regressions, the RBI does not indicate the sample period used.
- The report asserts that the financial stress index (FSI) for India “exhibited synchronised movements with that of the US, Western Europe, Japan and aggregate advanced economies (Figures V.20 & V.21). This either shows that financial stress in India and other advanced countries is driven by a common factor or that transmission of stress to Indian markets from advanced markets is contemporaneous.” (page 231-232). Unfortunately, the report does not present any statistical analysis regarding this and relies on the visual evidence of a chart plotting the FSI in India and other countries. Looking carefully at this chart, however, it appears that the only synchronized movement was in October 2008 (after Lehman). Neither before nor after do I see a synchronized movement in the chart.
Posted at 2:40 pm IST on Fri, 9 Jul 2010 permanent link
Categories: crisis
Drunken trading and risk limits
The Financial Services Authority of the UK put out an order last month regarding a drunken broker (Steven Perkins) who bought $520 million of crude oil futures sitting at home at night with his laptop (hat tip Finance Professor).
I find it amazing that somebody sitting at home with a laptop between 1:00 am and 4:00 am can execute over 2,000 buy trades worth over half a billion dollars when his broking firm is essentially an execution-only oil brokerage, and Perkins himself (like almost all other brokers in the firm) was barred from doing proprietary trading.
What does it say about the risk management and control systems at the brokerage firm (PVM)? The FSA explicitly says that it “makes no criticism of PVM”:
Mr Perkins’ behaviour was contrary to PVM’s policies and procedures and the FSA makes no criticism of PVM in this notice. Mr Perkins was immediately suspended by PVM on 30 June 2009 and his employment later terminated.
I would however think that there are issues of risk management that cannot be wished away. The Telegraph reports that the transaction caused a loss to PVM of $9.8 million and resulted in the brokerage reporting a loss of $7.6 million for the year. The implication is that the normal profits of the firm were $2.2 million. A drunken broker supposed to execute only client transactions lost in a single night an amount which was more than four times the normal annual profits of the entire brokerage firm!
I think that financial firms need better risk management systems and need to think carefully about operational risk. When Perkins lied to his company that the trade was for a client, he did not claim that it was for a large oil company; he said that it was for a “local trader”, which is how independent oil traders are referred to. Some counterparty risk controls should have kicked in when a half-billion-dollar trade was made in the dead of night for a client who is a small-time local trader. Some alerts should have been triggered when the purchases by a single broker during the three-hour window were more than 17 times the average daily volume of the entire market for this period of the night.
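None of this requires sophisticated technology. A sketch of the kind of automated check that could have raised an alarm that night (the thresholds and figures are hypothetical, and this is not a description of PVM’s actual systems):

```python
# Hypothetical thresholds and figures; this is not a description of PVM's
# actual systems, only the kind of automated check that seems to be missing.

def overnight_alerts(client_type, order_notional, hour,
                     cumulative_notional, typical_overnight_market_volume):
    """Return alert messages for a proposed overnight client order."""
    alerts = []
    # A small "local trader" should not be placing orders of this size
    if client_type == "local" and order_notional > 5_000_000:
        alerts.append("order size far above normal for a local trader")
    # Overnight buying many times the whole market's typical volume in that
    # window should trigger a call-back before further orders are accepted
    if 0 <= hour < 6 and cumulative_notional > 3 * typical_overnight_market_volume:
        alerts.append("cumulative overnight buying exceeds 3x typical market volume")
    return alerts

# Roughly the situation described above: $500m bought between 1 am and 4 am in
# a window where the whole market typically trades a small fraction of that.
print(overnight_alerts("local", order_notional=10_000_000, hour=2,
                       cumulative_notional=500_000_000,
                       typical_overnight_market_volume=30_000_000))
```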
Interestingly, PVM is owned by senior brokers themselves and the poor risk management cannot be attributed to corporate governance issues. I think it reflects the failure of systems and processes at many financial firms to adjust to the modern reality of round-the-clock high-speed electronic trading.
Posted at 11:36 am IST on Fri, 2 Jul 2010 permanent link
Categories: risk management
Is UK imitating Ireland on Financial Regulation?
There is too little detail in the new UK plan (also here) to abolish the Financial Services Authority (FSA) by folding it into a subsidiary of the Bank of England. But the more I think about it, the more it looks like the pre-crisis Ireland model.
If imitation is the sincerest form of flattery, Ireland must be feeling quite flattered right now. It is a different matter that the Irish Central Bank itself in its post mortem of the crisis now thinks that their model had nothing to commend it:
Though few would now defend the institutional structure invented for the organisation in 2003, it would be hard to show that its complexity materially contributed to the major failures that occurred. (page 42)
The grass is always greener on the other side!
Posted at 9:09 pm IST on Fri, 18 Jun 2010 permanent link
Categories: regulation
Why market surveillance can no longer be left to the exchanges
I wrote a column in the Financial Express today arguing that the financial market regulators need to get directly involved in real time market surveillance.
Traditionally, securities regulators globally have regarded the exchanges as the front line regulators with primary responsibility for market surveillance. As a result, regulators have traditionally not invested in the computing resources and the human capital required to perform real time surveillance themselves. A number of developments are making this model unviable in the developed markets and the same factors are at work, a little more slowly, in India as well.
I think it is time for Indian regulators like Sebi, FMC and RBI to develop in-house real time market surveillance capabilities rather than rely on the capabilities that may currently exist at the exchanges or exchange-like entities that they supervise (NSE, BSE, MCX, NCDEX, NDS).
I believe there are two key factors that make this regulatory shift necessary. First is the dramatic change in the nature of exchanges themselves. In the past, exchanges were regarded as ‘utilities’ providing key financial infrastructure and regulatory services. In recent years, they have evolved into businesses just like any other financial services business. Many observers in India (including some of the exchanges themselves) have been concerned about this transformation, but this is a global phenomenon and it is delusional to deny this reality. Concomitantly, there has been a blurring of the line between exchanges and brokers. Globally, alternative trading systems and dark pools have gained market share in recent years, and the operators of these systems are half way between traditional exchanges and large broker dealers, in terms of their business models and regulatory incentives.
In India, too, we have seen the blurring of the line between exchanges and non-exchanges. Examples include the subsidiaries of regional stock exchanges that trade on national exchanges; the exchanges in the commodity space whose promoters had or have large trading arms; and RBI regulated entities that perform many functions of an exchange but are not legally classified as exchanges.
The second and even more important factor is the rise of algorithmic and high frequency trading that links different exchanges together at much shorter time scales than in the past. Each exchange looking only at the trading in its own system has only a very limited view of what is happening in the market as a whole. It becomes very much like the story of the six blind men and the elephant.
The best example of this is the flash crash in the US on May 6, 2010. The US SEC, which like other regulators had never dirtied its hands with real time surveillance, found itself struggling to figure out what happened in those few turbulent minutes on that day. In an interim report, the SEC stated: “To conduct this analysis, we are undertaking a detailed market reconstruction, so that cross-market patterns can be detected and the behaviour of stocks or traders can be analysed in detail. Reconstructing the market on May 6 from dozens of different sources and calibrating the time stamps from each source to ensure consistency across all the data is consuming a significant amount of SEC staff resources. The data are voluminous and include hundreds of millions of records comprising an estimated five to ten terabytes of information.”
This is what happens when a regulator leaves it to others to do its job, but is forced one day to do the job itself. Is it not scandalous that a systemically important institution like an exchange or a depository is not required to synchronise its clocks to a standard time (say GPS time) with an error of not more than a few microseconds at worst? Exchanges are willing to spend a fortune to bring down the latency of their trading engine to a millisecond or so to attract trading volume, but are unwilling to spend a modest amount to synchronise their clocks because nobody asked them to.
There is another important hidden message in this. Modern finance is increasingly high frequency finance and those who do not dirty their hands with it become increasingly out of touch with the reality of financial markets. Doctoral students in finance today, for example, have to learn the econometrics of high frequency data and grapple first hand with the challenges of handling this data.
Unless regulators collect this high frequency data and encourage their staff to explore it, they risk becoming progressively disconnected from the reality that they are supposed to regulate. Interestingly, the US derivatives regulator, the CFTC, is moving rapidly to develop this capability. They already collect all trade data on a T+1 basis and run their own surveillance software on that data. Over the next year, they hope to enhance this to receive the entire order book data from the exchanges that they regulate. All regulators worldwide need to move in that direction.
It is true that this will be difficult, expensive and time-consuming for Indian regulators. That is all the more reason to start immediately.
Posted at 12:19 pm IST on Wed, 16 Jun 2010 permanent link
Categories: equity markets, insider trading, manipulation
Why do we not have high quality investigative reports in India?
As the global financial crisis rolls on, I have read a bunch of outstanding investigative reports from around the world:
- Perhaps the best of these is the April 2010 report of the Special Investigative Commission set up by the parliament of Iceland (the Althingi) to investigate and analyse the processes leading to the collapse of the three main banks in Iceland. The Commission consisted of a Supreme Court judge, the Parliamentary Ombudsman and an academic. Only a small part of the report – the executive summary (18 pages), Chapter 18 (65 pages) and Chapter 21 (160 pages) – is available in English. But what is available is truly impressive in terms of the detailed factual presentation, the dispassionate analysis of what went wrong and the balanced indictment of those who were responsible.
- Much narrower in scope but equally impressive in terms of factual detail is the report of the Examiner appointed by the bankruptcy court for Lehman Brothers. At 2,200 pages with 8,000 footnotes, this document is a goldmine of authentic information about what happened at that one company.
- Another report that I liked was the external evaluation of the active management of Norway’s sovereign wealth fund commissioned by the government in response to the sharp fall in its market value during 2008. This study was carried out by three academics in the US and the UK. The response of Norway’s central bank (which manages the wealth fund) is also very interesting. Of course, Norway is no newcomer to transparency. During the crisis of 2008, everybody was reading the detailed reports that Norway had published about their banking crisis of the early 1990s.
Surprisingly, apart from Lehman, there are too few high quality official investigative reports about other US financial firms that have suffered badly in the crisis. We do not really know what happened at Merrill Lynch or Citi. We know a lot more about the problems at UBS for example thanks to the shareholders report produced at the insistence of the Swiss regulators. US regulators have been acting as if everything related to the crisis is a state secret. On AIG, we know a little more due to the efforts of the Congressional Oversight Panel, but even this picture is incomplete.
The US does, however, have an adversarial system of congressional testimony and all this testimony is available online. Apart from the Financial Crisis Inquiry Commission, there are many congressional committees that have held hearings and released valuable information in the process. The quality of the official reports that emerge at the end is not that important. Years ago, while studying what happened at Enron, I found the same thing – the testimony and transcripts of the hearings are far more useful than the reports themselves.
While the investigative reports of the Nordic countries have been exemplary, and those of the US have been very good, India has simply been unable to produce data-rich investigative reports of a quality useful to a researcher. A Joint Parliamentary Committee (JPC) was set up to investigate the securities scam of 1991, but the factual details in its report were not sufficient from a researcher’s point of view. Indian parliamentary committees tend to hold closed-door meetings and treat all testimony as confidential. The regulators have also not filled the void. In the early days of the scam, the Reserve Bank of India published the Janakiraman report, which was rich in factual detail, but this report had a narrow remit. I do not think that we yet have a comprehensive, authentic data source on what happened in this scam two decades ago.
The situation is not any better in the Ketan Parikh scam that took place a decade ago in technology stocks. Again there was a JPC on this scam, but again the report was not data rich. Turning to more recent times, the Satyam fraud took place a year and a half ago and we still have no authentic information at all on what happened.
I do not believe that developing countries are incapable of producing good investigative reports. I remember being impressed with the Nukul Commission Report in Thailand in the aftermath of the Asian Crisis of 1997, but the report is not available online and I have not been able to refresh my memory. Outside of finance, the Truth and Reconciliation Commission in South Africa produced a series of reports with an enormous amount of factual detail and careful analysis.
India’s failure to produce outstanding investigative reports into financial disasters is something that needs to be remedied. As Andrew Lo has been arguing for some time now, such detailed post mortems are very important. Lo recommends that even the US should go much further than it is doing currently:
The most pressing regulatory change with respect to the financial system is to provide the public with information regarding those institutions that have “blown up”, i.e., failed in one sense or another. This could be accomplished by establishing an independent investigatory agency or department patterned after the National Transportation Safety Board, e.g., a “Capital Markets Safety Board”, in which a dedicated and experienced team of forensic accountants, lawyers, and financial engineers sift through the wreckage of every failed financial institution and produces a publicly available report documenting the details of each failure and providing recommendations for avoiding such fates in the future.
Posted at 4:19 pm IST on Fri, 11 Jun 2010 permanent link
Categories: crisis, investigation
How do regulators cope with terabytes of data?
Traditionally, securities regulators have coped with the deluge of high frequency data by not asking for the data in the first place. The exchanges are supposed to be the front line regulators and leaving the dirty work to them allows the US SEC and its fellow regulators around the world to avoid drowning under terabytes of data.
But the flash crash seems to be changing that. The US SEC had to figure out what happened in those few minutes on May 6, 2010. When it attempted to reconstruct the market using data from different exchanges, it ended up with nearly 10 terabytes of data. The SEC says in its joint report with the CFTC on the preliminary findings about the flash crash:
To conduct this analysis, we are undertaking a detailed market reconstruction, so that cross-market patterns can be detected and the behavior of stocks or traders can be analyzed in detail. Reconstructing the market on May 6 from dozens of different sources and calibrating the time stamps from each source to ensure consistency across all the data is consuming a significant amount of SEC staff resources. The data are voluminous, and include hundreds of millions of records comprising an estimated five to ten terabytes of information. (page 72)
It turns out that the CFTC, which regulates the futures exchanges, is well ahead on the learning curve as far as terabytes of data are concerned:
The CFTC also collects trade data on a daily, transaction date + 1 (“T+1”), basis from all U.S. futures exchanges through “Trade Capture Reports.” Trade Capture Reports contain trade and related order information for every matched trade facilitated by an exchange, whether executed via open outcry or electronically, or non-competitively (e.g., block trades, exchange for physical, etc.). Among the data included in the Trade Capture Report are trade date, product, contract month, trade execution time, price, quantity, trade type (e.g., open outcry outright future, electronic outright option, give-up, spread, block, etc.), trader ID, order entry operator ID, clearing member, opposite broker and opposite clearing member, order entry date, order entry time, order number, customer type indicator, trading account numbers, and numerous other data points. Additional information is also required for options on futures, including put/call indicators and strike price, as well as for give-ups, spreads, and other special trade types.
All transactional data is received overnight, loaded in the CFTC’s databases, and processed by specialized software applications that detect patterns of potentially abusive trades or otherwise raise concern. Alerts are available to staff the following morning for more detailed and individualized analysis using additional tools and resources for data mining, research, and investigation.
Time and sales quotes for pit and electronic transactions are also received from the exchanges daily. CFTC staff is able to access the market quotes to validate alerts as well as reconstruct markets for the time periods in question. Currently, staff is working with exchanges to receive all order book information in addition to the executed order information already provided in the Trade Capture Report. This project is expected to be completed within the next year; at present such data remains available to staff through “special calls” (described below) requesting exchange data. (page B-15 in the Appendix)
However, the flash crash did not put the CFTC’s data handling abilities to the test because most of the action was in the cash equity market and the only action in the derivatives exchanges was in a handful of index futures and options contracts.
Finally, I am puzzled by the statement of the SEC quoted above that “calibrating the time stamps from each source to ensure consistency across all the data is consuming a significant amount of SEC staff resources.” Regulators should perhaps require that exchanges synchronize their computer clocks with GPS time to achieve accuracy of a few microseconds. With exchange latencies close to a millisecond these days, normal NTP (internet) accuracy of 10 milliseconds or so is grossly inadequate. I would not be surprised if some exchanges do not even have formal procedures to ensure the accuracy of their system clocks.
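As a rough illustration of the point about clock accuracy (this is my own sketch, not anything the SEC or any exchange actually runs), the snippet below queries a public NTP server and flags any offset larger than a millisecond; the server name and the tolerance are placeholders, and it requires the third-party ntplib package.

```python
# A minimal sketch (not any exchange's actual procedure): monitor the local
# system clock's offset against an NTP server and flag drift that would be
# material at millisecond trading latencies.
import ntplib

MAX_OFFSET_SECONDS = 0.001  # 1 millisecond tolerance, chosen purely for illustration

def check_clock_drift(server="pool.ntp.org"):
    client = ntplib.NTPClient()
    response = client.request(server, version=3)
    offset = response.offset  # local clock minus server clock, in seconds
    if abs(offset) > MAX_OFFSET_SECONDS:
        print(f"WARNING: clock offset of {offset * 1000:.2f} ms exceeds tolerance")
    else:
        print(f"Clock offset of {offset * 1000:.2f} ms is within tolerance")
    return offset

if __name__ == "__main__":
    check_clock_drift()
```

Even this check is only as good as NTP itself, which over the public internet gives roughly the 10 millisecond accuracy complained about above; sub-millisecond timestamps really do need a GPS or similar hardware time source at the exchange.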
All of which goes to show that the traditional securities regulator strategy of not dirtying its hands with high frequency data is a big mistake. This should be a wake-up call for regulators around the world.
Posted at 6:00 pm IST on Thu, 3 Jun 2010 permanent link
Categories: regulation, technology
FASB says IFRS is for less developed financial reporting systems
The FASB’s criticism is buried inside a couple of hundred pages of dense accounting proposals, but it is unusually direct and clear:
What may be considered an improvement in jurisdictions with less developed financial reporting systems applying International Financial Reporting Standards (IFRS) may not be considered an improvement in the United States.
This is part of the FASB’s explanation of why it is putting aside its convergence project with the IASB and pushing ahead on its own with a new accounting standard for financial instruments. The IASB’s description of the divergence is more muted:
However, the ... efforts [of the FASB and IASB] to achieve a common and improved financial instruments standard have been complicated by the establishment of different project timetables to respond to their respective stakeholder groups in the light of the financial crisis.
The stumbling block in improving the accounting for financial instruments is not technical but political. The key ideas for reform were put forth in a report ten years ago by a Joint Working Group consisting of representatives from the FASB and IASB as well as standard setters from twelve other countries (Joint Working Group of Standard Setters, “Recommendations on Accounting for Financial Instruments and Similar Items”, FASB, Financial Accounting Series, No. 215-A December 22, 2000).
From there we have progressed to the pot calling the kettle black. The global financial crisis has not been good for the reputation of either the FASB or the IASB. Both have appeared vulnerable to pressure from the politicians and lobbying by the big banks.
Posted at 8:10 pm IST on Wed, 2 Jun 2010 permanent link
Categories: accounting
When is a foreign exchange swap not really a foreign exchange swap?
The answer is when it is a swap between the US Federal Reserve and a foreign central bank under the famed swap lines. Last month, the New York Fed described the operational mechanics of these swap lines in considerable detail in its publication Current Issues in Economics and Finance:
The swaps involved two transactions. At initiation, when a foreign central bank drew on its swap line, it sold a specified quantity of its currency to the Fed in exchange for dollars at the prevailing market exchange rate. At the same time, the Fed and the foreign central bank entered into an agreement that obligated the foreign central bank to buy back its currency at a future date at the same exchange rate. ...
The foreign central bank lent the borrowed dollars to institutions in its jurisdiction ... And the foreign central bank remained obligated to return the dollars to the Fed and bore the credit risk for the loans it made.
At the conclusion of the swap, the foreign central bank paid the Fed an amount of interest on the dollars borrowed that was equal to the amount the central bank earned on its dollar lending operations. In contrast, the Fed did not pay interest on the foreign currency it acquired in the swap transaction, but committed to holding the currency at the foreign central bank instead of lending it or investing it. This arrangement avoided the reserve-management difficulties that might arise at foreign central banks if the Fed were to invest its foreign currency holdings in the market.
What this means is that the foreign currency (say, the euro) that the Fed purportedly receives under the swap is completely fictitious because (a) the Fed earns no interest on the euros and (b) the euros are not available to the Fed if it wishes to lend the euros to a US bank or for any other purpose. In fact, in April 2009, the Fed entered into a different swap agreement with the ECB and other central banks to obtain foreign currency liquidity. This would not have been needed at all if the original swap had been a genuine swap.
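To make the point concrete, here is a stylized sketch of the cash flows; the exchange rate, the size of the draw and the interest figure are invented for illustration and do not correspond to any actual Fed transaction.

```python
# Stylized cash flows of a central bank "swap" drawn for $10 billion at an
# assumed rate of 1.25 $/EUR. All numbers are invented for illustration.

rate = 1.25                 # exchange rate fixed at initiation and reused at maturity
dollars = 10e9              # assumed size of the draw on the swap line
euros = dollars / rate      # currency the Fed receives and simply holds, uninvested
interest = 25e6             # whatever the foreign central bank earned on-lending the dollars

fed_dollar_flows = [-dollars, dollars + interest]   # at initiation, at maturity
fed_euro_income = 0.0       # the Fed neither invests nor lends the euros

print(f"Fed net dollar flow: {sum(fed_dollar_flows):,.0f}")   # = the pass-through interest
print(f"Fed income on the euro leg: {fed_euro_income:,.0f}")
```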
The so-called swap is simply a loan of dollars to the foreign central bank. Why does the Fed want to call it a swap instead of a loan? I think the reason for this use (or rather abuse) of terminology is that a swap with a foreign central bank sounds politically more palatable than a loan to a foreign central bank. All the more so when the swap is for an unlimited amount!
This is another reminder that deceptive disclosure practices are not limited to companies like Enron or to sovereigns like Greece – they are all pervasive.
Posted at 9:10 pm IST on Tue, 1 Jun 2010 permanent link
Categories: derivatives, international finance, law
Am on vacation
I am on vacation for the rest of this month. There will be no posts during this period. I shall try to moderate comments during this period, but there are bound to be delays.
Posted at 10:18 pm IST on Tue, 11 May 2010 permanent link
Categories: miscellaneous
Grand daddy of algorithmic trading bites the dust?
The ten minutes of mayhem in the US stock market last Thursday may have involved the oldest form of algorithmic trading – the stop loss order. We do not often think of the stop loss order as algorithmic trading, but that is what it is. If its conditions are satisfied, it executes without seeking any confirmation from the person who placed the order.
These days, it is usually a computer which implements the stop loss order, but even in the old days when a human broker implemented it, the stop loss was an algorithm. Whether the hardware on which an algorithm runs is made of silicon or of carbon is totally immaterial.
When I blogged last week about the dangers of market orders, I did not realize that many of the market orders that executed at absurd prices on Thursday afternoon in the US might have been stop loss orders. When the stop loss limit is breached, the stop loss order becomes a market order and executes blindly against whatever bids are available. This can be a prescription for disaster in fast moving markets as an avalanche of stop losses can eat through the entire order book and execute at penny prices as well.
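A toy sketch of the mechanics (not any broker’s or exchange’s actual code) shows how a triggered stop loss becomes a market order that walks down whatever bids remain; all the numbers are made up.

```python
def execute_stop_loss(bids, quantity, stop_price, last_trade):
    """bids: list of (price, size) tuples, best (highest) price first."""
    if last_trade > stop_price:
        return []                       # stop not yet triggered
    fills, remaining = [], quantity     # triggered: becomes a plain market order
    for price, size in bids:
        take = min(remaining, size)
        fills.append((price, take))
        remaining -= take
        if remaining == 0:
            break
    return fills

# A thinned-out book after earlier stop losses have eaten the good bids,
# ending in a market maker's stub quote:
book = [(39.50, 200), (35.00, 100), (10.00, 100), (0.01, 100_000)]
print(execute_stop_loss(book, quantity=5_000, stop_price=40.00, last_trade=39.90))
# Most of the 5,000 shares fill at $0.01 -- the "penny prices" of May 6.
```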
Ironically, stop loss orders might not only have been a big victim of Thursday’s “flash crash,” but might also have been one of its major causes. Stop losses are inherently destabilizing because they reinforce the prevailing trend. By demanding liquidity when it is least available, they degrade market quality. And by making investors complacent about risk (the stop loss order is supposed to limit losses!), they encourage recklessness. All the more reason why these people ought not to have been bailed out by cancelling trades.
Posted at 10:13 pm IST on Tue, 11 May 2010 permanent link
Categories: equity markets, risk management
Market panics and bailout manias
People seem to be debating whether it was computers or humans that panicked and caused a temporary intraday drop of almost 10% in the US stock market yesterday. What we do know is that the bailout mania that followed was due entirely to humans. This is what the Nasdaq media release says:
The NASDAQ Stock Market had no technology or system issues associated with the trading that occurred between 2:00 and 3:00 p.m. ET today. The NASDAQ Stock Market operated continuously and its close process ran successfully.
In addition, there is no indication at this time that a NASDAQ market participant experienced a technological failure in connection with this event. NASDAQ has coordinated a process among U.S. Exchanges and therefore, pursuant to rule 11890(b), NASDAQ, on its own motion, will cancel all trades executed between 14:40:00 and 15:00:00 greater than or less than 60% away from the consolidated last print in that security at 14:40:00 or immediately prior. This decision cannot be appealed. NASDAQ has coordinated this decision with all other UTP Exchanges. NASDAQ will be canceling trades on the participant’s behalf.
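For concreteness, on a natural reading of the rule quoted above, a trade is broken if its price is more than 60% away from the 14:40 reference print. A hypothetical illustration (the reference price and trades are invented):

```python
def is_broken(trade_price, reference_price, band=0.60):
    """True if the trade is more than 60% away from the 14:40 reference print."""
    return abs(trade_price - reference_price) / reference_price > band

reference = 40.00                       # hypothetical consolidated last print at 14:40
for price in (0.01, 15.00, 17.00, 63.00):
    print(price, "cancelled" if is_broken(price, reference) else "stands")
# 0.01 and 15.00 are cancelled; 17.00 and 63.00 stand, however absurd they look.
```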
Make no mistake. This is a bailout as bad and sordid as all the bailouts that we saw in the financial sector in 2008. It is bad because it reduces incentives for the firms to discipline their traders (or redesign their computer algorithms) to reduce the risk of such problems.
Anybody who uses a large market sell order instead of a marketable limit order during a falling market (and that too without looking at the order book) is begging to receive a price of zero for the stock. The market is happy to oblige. These people deserve the price that they got. There is no need for the exchange to ride to their rescue by cancelling their trades.
I wrote a few posts about this kind of thing four years ago.
Why have things not changed? Because it is so easy to bail out everybody, and so much harder to change things.
Posted at 2:19 pm IST on Fri, 7 May 2010 permanent link
Categories: behavioural finance, crisis
Goldman Sachs and the economic function of investment banks
I wrote a column in the Financial Express on what the securities fraud case against Goldman Sachs tells us about the economic function of investment banks.
Regardless of its ultimate outcome, the SEC’s case against Goldman Sachs alleging securities fraud has already transformed the debate on financial sector reforms in the US. More importantly, I believe the case has raised disturbing questions about the economic function performed by investment banks in modern financial markets.
At the centre of the SEC case is the Abacus deal that Goldman brought to market in early 2007. The structure was created at the request of the hedge fund, Paulson & Co, which wanted to bet on the collapse of the US housing market by taking a short position on subprime securities. Goldman created synthetic subprime securities through the Abacus vehicle and sold these to the German bank, IKB. Paulson took the opposite (short) position on these securities.
The prospectus of Abacus highlighted the role of a reputed CDO manager, ACA Management, in selecting the portfolio of subprime assets underlying the Abacus deal and gave full details of this portfolio. But the 196-page prospectus did not mention the fact that Paulson had played a role in selecting the portfolio. This non-disclosure is a key element in the SEC case against Goldman.
In addition, the SEC alleges that Goldman misled ACA about Paulson’s intentions. Apparently, ACA believed that Paulson intended to buy the equity (first loss) piece of Abacus and was, therefore, motivated to exclude truly bad assets from the portfolio. In reality, Paulson planned to take a short position in Abacus and wished to stuff it with the worst possible assets.
It is difficult to predict the outcome of the SEC case because there are few precedents for invoking the anti-fraud provisions of US securities law in similar situations. The SEC might be in uncharted waters here, but it is pursuing a civil case where the standards of proof are lighter. Moreover, Goldman would certainly not relish having to defend its unsavoury conduct in a jury trial.
It is true that during the crisis the SEC acquired a reputation for incompetence (for example, Madoff and Stanford), which makes people sceptical about the Goldman case as well, but the new director of enforcement, Robert Khuzami, whom the SEC hired last year, has a formidable reputation from his days as a federal prosecutor.
Interestingly, Goldman in its defence thinks of itself more as a broker-dealer in complex derivatives and less as an issuer or underwriter of the Abacus securities. Broker-dealers have no obligation to disclose the identity or motivations of either counterparty to the other. It is true that Goldman could have achieved the same economic effect as Abacus by intermediating a credit default swap (CDS) between IKB and Paulson, but that is not what it chose to do. It chose to issue securities in which a CDS was embedded.
I am, however, less interested in whether the SEC wins this case or not. I am more concerned about the role of investment banks like Goldman in modern financial markets. In an ideal, perfectly efficient market, buyers and sellers would deal with each other directly through an electronic limit order book without any gatekeepers or intermediaries. In reality, intermediaries are needed to solve the problem of information asymmetry where one side knows a lot more about the transaction than the other.
It follows that the value added by an investment bank is measured by the extent to which it reduces information asymmetry. Otherwise, it is only exploiting oligopolistic rents or earning the rewards of excess leverage made possible by implicit ‘too big to fail’ guarantees.
From this perspective, the major banks of the 19th century or early 20th century like Rothschilds, Barings or JP Morgan did serve an economically useful function. Academic studies have shown that sovereign bonds underwritten by these major banks during that period had significantly lower default rates than other sovereign bonds (Flandreau, et al, 2009, The End of Gatekeeping: Underwriters and the Quality of Sovereign Bond Markets, 1815-2007, NBER Working Paper 15128).
Similarly, financial historians tell us that 19th century investment banks like JP Morgan played a critical role in bridging the information asymmetry between US railroads and their British investors. To their contemporaries, rich and powerful bankers like the Rothschilds and the Morgans were among the many ugly faces of capitalism. But the hard facts show that while they did not claim to be doing God’s work, they did do something useful.
The Abacus case makes one wonder whether modern investment banks do play any such useful role. The Goldman defence asserts that sophisticated investors like ACA and IKB were capable of looking after their own interests and did not need help from Goldman or anybody else. If there are no information asymmetries to be resolved or if modern investment banks have too little reputational capital to resolve them, then it is not at all clear what economic function they perform in today’s highly liquid and sophisticated markets.
Posted at 12:02 pm IST on Wed, 5 May 2010 permanent link
Categories: financial history, fraud, regulation
Principles versus rules: HSBC Mutual Fund in India
An order passed by the Securities and Exchange Board of India (SEBI) on Friday regarding a mutual fund run by HSBC in India provides a fascinating example of the advantages of principles based regulation.
Indian regulations require that before a mutual fund makes a “change in any fundamental attribute” of any of its schemes, it must not only inform all unit holders but also give them a costless exit option (Regulation 18(15A) of the Mutual Funds Regulations). The regulations do not define what a fundamental attribute is, so, absent any further “clarifications” by the regulator, we would have a very sensible principles based regulation.
In 2009, the HSBC Mutual Fund made the following changes in one of its mutual fund schemes, the “HSBC Gilts – Short Term Plan.”
- It changed the name of the scheme to “HSBC Gilt Fund”
- It changed the ceiling on the modified duration of the fund from 5 years to 15 years
- It changed the benchmark index from a sub index covering the 1-3 year maturity to a composite index covering all maturities.
Under a principles based regulation, there is no question that this would be a change in a fundamental attribute of the scheme. In fact, it changes the nature of the scheme so drastically that many investors in the original scheme might well not wish to remain invested in the changed scheme.
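A back-of-the-envelope calculation (using the standard first-order duration approximation, not anything in the SEBI order itself) shows how different the interest rate risk is under the two duration ceilings:

```python
def approx_price_change(modified_duration, yield_change):
    """First-order bond price approximation: dP/P ~ -D_mod * dy."""
    return -modified_duration * yield_change

for duration in (5, 15):
    move = approx_price_change(duration, 0.01)   # a 100 basis point rise in yields
    print(f"Modified duration {duration:>2} years: approx price change {move:+.1%}")
# Modified duration  5 years: approx price change -5.0%
# Modified duration 15 years: approx price change -15.0%
```

A fund that can lose roughly three times as much for the same move in yields is, for most investors, a different product.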
Unfortunately, the SEBI regulations were not truly principles based. Way back in 1998, it issued clarifications that replaced the nice principles based regulation with a set of bright red lines by giving a laundry list of things that are fundamental attributes. Neither the change of name, nor the change in the modified duration, nor the change in the benchmark index figured in this list.
The regulator was forced to accept that HSBC was technically correct in claiming that it had not changed any fundamental attribute of its scheme.
The moral of the story is that while principles based regulation is genuinely hard both for regulator and regulatees, rules based regulation is often a farce.
Posted at 7:41 pm IST on Mon, 26 Apr 2010 permanent link
Categories: bond markets, regulation
Lehman and the Derivative Exchanges
The unredacted Volume 5 of the Lehman examiner’s report released earlier this week provides details about how CME handled the Lehman default by auctioning the positions of Lehman to other large dealers. The table below summarizes the data given in the report.
| Asset Class | Negative Option Value | SPAN Risk Margin | Total Collateral | Price Paid by CME to Buyer | Loss to Exchange | Percentage Loss to Exchange | Loss to Lehman |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Energy Derivatives | 372 | 261 | 633 | 707 | 74 | 12% | 335 |
| FX Derivatives | -4 | 12 | 8 | 2 | -6 | -72% | 6 |
| Interest Rate Derivatives | 93 | 130 | 223 | 333 | 110 | 49% | 240 |
| Equity Derivatives | -5 | 737 | 732 | 445 | -287 | -39% | 450 |
| Agricultural Derivatives | -5 | 55 | 50 | 52 | 2 | 4% | 57 |
| Total auctioned by CME | 451 | 1195 | 1646 | 1539 | -107 | -6% | 1088 |
| Natural Gas Derivatives sold by Lehman itself | 482 | 129 | 611 | 622 | 11 | 2% | 140 |
| Grand Total | 933 | 1324 | 2257 | 2161 | -96 | -4% | 1228 |
The negative option value is as of the close of business before the Lehman bankruptcy, and the loss to Lehman is computed as the excess of the price paid by CME to the buyer over this negative option value. Futures positions are presumably assumed to have zero value after they have been marked to market. The CME, on the other hand, incurs a loss only if it pays a price in excess of the collateral provided by Lehman. For comparison purposes, the same computation is done for the positions sold by Lehman itself, though in this case the exchange does not make any profit or loss.
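As a quick check of the arithmetic (a sketch using the figures in the table, which appear to be in millions of dollars), the two computations just described can be verified for a few rows:

```python
rows = {
    # asset class: (negative option value, total collateral, price paid by CME)
    "Energy":        (372, 633, 707),
    "Interest Rate": ( 93, 223, 333),
    "Equity":        ( -5, 732, 445),
}

for name, (option_value, collateral, price_paid) in rows.items():
    loss_to_lehman = price_paid - option_value      # excess price over option value
    loss_to_exchange = price_paid - collateral      # excess price over collateral held
    print(f"{name:15s} loss to Lehman {loss_to_lehman:4d}, "
          f"loss to exchange {loss_to_exchange:4d} "
          f"({loss_to_exchange / collateral:+.0%} of collateral)")
# Energy: 335 and 74 (+12%); Interest Rate: 240 and 110 (+49%);
# Equity: 450 and -287 (-39%) -- matching the table.
```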
What I find puzzling here is that in the case of interest rate derivatives, the CME had to pay the winning dealer a price of about 1.5 times the collateral available. Had it not been for excess collateral in other asset classes, the CME might have had to take a large loss. Was the CME seriously undermargined, was volatility in the days after the Lehman default simply that high, or was this the result of a panic liquidation by the CME?
We do have an independent piece of information on this subject. LCH.Clearnet in London also had to liquidate Lehman’s swap positions amounting to $9 trillion of notional value. LCH has stated here and here that the Lehman “default was managed well within Lehman margin held and LCH.Clearnet will not be using the default fund in the management of the Lehman default.”
A number of questions arise in this context:
- Did LCH.Clearnet charge higher margins than CME? It is interesting in this context that only a few days ago, Risk Magazine quoted LCH as saying that one of its rivals (IDCH) charges too low a margin: “bordering on reckless.” But LCH did not make this claim about CME.
- During the week after Lehman’s default, was there a big difference in the price behaviour of the swaps cleared at LCH and the bond futures and eurodollar futures cleared at CME?
- Was Lehman arbitraging between swaps and eurodollar futures so that its positions in the two exchanges were in opposite directions? In this case, price movements might have produced a profit at LCH and a loss at CME.
- Were the proprietary positions of Lehman at CME more risky than its (customer?) positions at LCH which might have been more balanced?
- Did the “panic” liquidation by CME exacerbate the loss? LCH hedged the position over a period of about a week and then auctioned off a hedged book.
- Did dealers trade against the Lehman book after the CME disclosed the book to potential bidders a couple of days before the auction? Or did each dealer think that the others would trade against the book? This problem did not arise at LCH because only hedged books were auctioned and the unhedged book was not disclosed to others.
In the context of the ongoing debate about better counterparty risk management (including clearing) of OTC derivatives, I think the regulators should release much more detailed information about what happened. Unfortunately, in the aftermath of the crisis, it is only the courts that have been inclined to release information – regulators and governments like to regard all information as state secrets.
Posted at 7:56 pm IST on Sat, 17 Apr 2010 permanent link
Categories: bankruptcy, crisis, derivatives, exchanges, risk management
SEBI, IRDA and the courts
I wrote a column in the Financial Express today about letting the courts resolve disputes between two financial regulators.
When I wrote a month ago, “At a crunch, I do not see anything wrong in a dispute between two regulators... being resolved in the courts” (‘Fill the gaps with apex regulator’, FE, March 19), I did not imagine that my wish would be fulfilled so soon. The dispute between Sebi and Irda regarding Ulips seems to be headed to the courts for resolution. There is nothing unseemly or unfortunate about this development. On the contrary, I believe that this is the best possible outcome.
An independent regulator should be willing and able to carry out the mission laid down in its statute, without worrying about whether its actions would offend another regulator. Its primary loyalty should be to its regulatory mandate and not to any supposed comity of regulators. Equally, if a regulator intrudes on the mandate of another, the other regulator or its regulatees should have no compunctions in challenging it in a court of law.
In any case, the idea that regulators share cordial relationships with each other is a myth. Turf wars are the rule and not the exception. In the UK, for example, after Northern Rock, the Bank of England and the FSA began to talk to each other only through the press, it was obvious to all that the relationship was extremely bitter. In the US, during the crisis, severe strains were evident between the Fed and the FDIC. The relationship between the SEC and the CFTC has, of course, been strained for decades.
In the financial sector in particular, we want strong-willed regulators who act with the courage of their convictions. Since many of their regulatees probably have outsized egos, perhaps it does not hurt to have a regulator with an exaggerated sense of self-importance. We do not want regulators who are too nice to their regulatees. It follows that we cannot wish for regulators to be too nice to each other either.
What we need instead is a mechanism to deal with the problem of regulatory overreach—democracies thrive on checks and balances. Regulatory overreach is a problem, even when it does not involve another regulator at all. Instead of hoping that regulators will always exercise self-restraint, we need a process to deal with the consequences of regulators overstepping the line.
The best mechanism is a robust appellate process—appellate tribunals and beyond them, the regular judiciary. Regulators, too, must be accountable to the rule of law and an appellate process is the only way to ensure this. The judicial process is as capable of resolving disputes between two regulators as it is of resolving disputes between the regulator and its regulatees.
In the context of the dispute between Sebi and Irda, many people have argued that a bureaucratic process of resolving disputes is preferable to a judicial process. There is, however, little evidence for such a view from around the world. Bureaucratic processes are less likely to provide a lasting solution and more likely to produce unseemly compromises that paper over the problem.
The two-decade-long dispute in the US between the SEC and CFTC about equity futures provides an interesting case study to demonstrate this. In the early 1980s, the SEC and the CFTC came to an agreement (the Shad Johnson accord) dividing up the regulatory jurisdiction of stock index futures and index options, but they were unable to agree on the regulation of single stock futures. Futures on narrow indices were left somewhere in the middle, with the CFTC having regulatory jurisdiction but the SEC having a veto power on the introduction of the contract itself.
In the late 1990s, when the SEC barred the Chicago Board of Trade (CBOT) from trading futures on the Dow Jones Utilities and Transportation indices, CBOT took the SEC to court and won. The court sternly declared that, “SEC is not entitled to adopt a ‘my way or the highway’ view by using its approval power—as a lever.” With the Shad Johnson accord in tatters, the two regulators were finally forced to sort out the regulation of single stock futures—a matter that they had not been able to settle by bureaucratic processes for two decades.
It is evident that the resolution of the single stock futures dispute would not have happened without judicial intervention. For two decades, inter-regulatory coordination mechanisms in the US, like the President’s Working Group on Financial Markets were not able to resolve the matter—it was too convenient for both regulators to agree to disagree.
An important advantage of judicial resolution is that regulatory conflicts that have the most serious impact on the markets are more likely to be litigated than those that are less damaging. It is, therefore, more likely that the final outcome would be socially and economically efficient. There are no such incentives to guide a bureaucratic solution towards the social optimum.
Posted at 12:58 pm IST on Sat, 17 Apr 2010 permanent link
Categories: law, regulation
The SEC and the Python
Last week, the SEC put out a 667 page proposal regarding disclosures for asset backed securities. What I found exciting was this:
We are proposing to require that most ABS issuers file a computer program that gives effect to the flow of funds, or “waterfall,” provisions of the transaction. We are proposing that the computer program be filed on EDGAR in the form of downloadable source code in Python. ... (page 205)
Under the proposed requirement, the filed source code, when downloaded and run by an investor, must provide the user with the ability to programmatically input the user’s own assumptions regarding the future performance and cash flows from the pool assets, including but not limited to assumptions about future interest rates, default rates, prepayment speeds, loss-given-default rates, and any other necessary assumptions ... (page 210)
The waterfall computer program must also allow the use of the proposed asset-level data file that will be filed at the time of the offering and on a periodic basis thereafter. (page 211)
This is absolutely the right way to go, particularly when coupled with the other proposal that detailed asset-level data also be provided in machine readable (XML) format. For a securitization of residential mortgages, for example, the proposal requires disclosure of as many as 137 fields (page 135) on each of the possibly thousands of mortgages in the pool.
Waterfall provisions in modern securitizations and CDOs are horrendously complicated and even the trustees who are supposed to implement these provisions are known to make mistakes. A year ago, Expect[ed] Loss gave an example where approximately $4 million was paid to equity when that amount should have been used to pay down senior notes (hat tip Deus Ex Macchiato).
Even when the trustees do not make a mistake, the result is not always what investors had expected. A few months ago, FT Alphaville reported on two Abacus deals where the documentation allowed the issuer (Goldman Sachs) to use its “sole discretion” to redeem the notes without regard to seniority. People realized that this was possible only when Goldman Sachs actually paid off (at face value) some junior tranches of these CDOs at the expense of senior tranches.
When provisions become complex beyond a point, computer code is actually the simplest way to describe them, and requiring the entire waterfall to be implemented in open source software is a very good idea. The SEC does not say so, but it would be useful to add that if there is a conflict between the software and the textual description, the software should prevail.
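To give a flavour of what such a filing might look like, here is a toy sequential-pay waterfall in Python. It is a drastically simplified sketch of my own, not the SEC’s prescribed format; real deals add triggers, reserve accounts and the kind of discretionary clauses that tripped up the Abacus investors.

```python
# A toy sequential-pay waterfall: collections pay senior interest, then senior
# principal, then mezzanine interest and principal, with the residue to equity.

def run_waterfall(collections, senior, mezz, senior_rate, mezz_rate):
    """All amounts are for a single payment period."""
    cash = collections
    payments = {}

    due = senior * senior_rate                       # 1. senior interest
    payments["senior_interest"] = paid = min(cash, due)
    cash -= paid

    payments["senior_principal"] = paid = min(cash, senior)   # 2. senior principal
    senior -= paid
    cash -= paid

    due = mezz * mezz_rate                           # 3. mezzanine interest
    payments["mezz_interest"] = paid = min(cash, due)
    cash -= paid

    payments["mezz_principal"] = paid = min(cash, mezz)        # 4. mezzanine principal
    mezz -= paid
    cash -= paid

    payments["equity"] = cash                        # 5. residual to equity
    return payments, senior, mezz

if __name__ == "__main__":
    # Illustrative numbers only: $100m senior, $20m mezzanine, $12m of collections.
    print(run_waterfall(12e6, senior=100e6, mezz=20e6,
                        senior_rate=0.01, mezz_rate=0.02))
```

The point of the SEC proposal is that investors could plug their own prepayment and default assumptions into such a program to generate the collections that feed the waterfall, period by period.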
Now to the inevitable question — Why Python? The SEC actually asks for comments on whether they should mandate Perl, Java or something else instead. I use Perl quite extensively, but the idea that Perl is a suitable language for implementing a transparency requirement is laughable. Perl is a model of powerful but unreadable and cryptic code. As for Java and C-Sharp, there is little point in having open source code if the interpreter is not also open source. I do not use Python myself, but it appears to be a good choice for the task at hand.
It is gratifying that the SEC continues the one good thing that Cox initiated when he was Chairman – the use of technology as a key regulatory tool.
Posted at 6:25 pm IST on Fri, 16 Apr 2010 permanent link
Categories: bond markets, derivatives, regulation, technology
Icesave: What is in a name?
The Special Investigation Committee set up by the Icelandic parliament (Althingi) to investigate and analyse the processes leading to the collapse of the three main banks in Iceland submitted its report this week. A portion of the report is available in English.
One of the interesting stories in the report (Chapter 18, page 5) is about the choice of the brand name Icesave for the deposit accounts offered by the Icelandic bank Landsbanki in the UK and in the Netherlands. The SIC states:
... Arnason [CEO of Landsbanki] also described how the brand name Icesave was created. He claimed that Landsbanki representatives had initially thought it was negative for an Icelandic bank to market deposit accounts in the UK. An advertising agency employed by the bank pointed out that it would never be possible to conceal the origin of the bank and, therefore, it would be better to simply advertise it especially. As a result, the brand name “Icesave” was created.
... Research indicated that a simple and clear message together with a strong link to Iceland would prove beneficial.
I think this has some implications for the literature on the effect of geographical names on stock prices. For example, Kee-Hong Bae and Wei Wang show that during the China stock market boom in 2007, Chinese stocks listed in the US that had China or Chinese in their names significantly outperformed US listed Chinese stocks that did not. (“What’s in a ‘China’ Name? A Test of Investor Sentiment Hypothesis”, http://ssrn.com/abstract=1411788)
What the Icesave example shows is that the choice of the name is not independent of the advertising, pricing and other strategies of the company. Some of what appears to be the result of a name change might in fact be due to other changes in the company’s business and strategy.
This might be true even in the case of other studies on the impact of name changes on stock prices. For example, Cooper, Dimitrov, and Rau (“A Rose.com by Any Other Name”, Journal of Finance, 56 (2001), 2371–2387) found that stock prices rose 74% when firms changed their names to dot com names in 1999. Similarly, Rau, Patel, Osobov, Khorana and Cooper (Journal of Corporate Finance, 11 (2005), 319-335) showed that stock prices rose when firms removed dot.com from their names after the bubble burst.
It is possible that these name changes were also accompanied by changes in business strategies.
Posted at 4:58 pm IST on Wed, 14 Apr 2010 permanent link
Categories: crisis, investigation
IFRS in the Indian financial sector: Regulatory Capture?
A group constituted by the Ministry of Corporate Affairs with representation of all major financial sector regulators in India has approved a road map for the convergence to international accounting standards (IFRS) by insurance companies, banking companies and non-banking finance companies.
First of all, why should there be a different road map for the financial sector? Why not let financial entities be subject to the same road map as the rest of the corporate sector? The only plausible argument is that the most important change from Indian accounting standards to IFRS would be the treatment of financial instruments (IAS 39) and this impacts the financial sector more than any other sector.
But this argument is rather weak because there are other sectors which are disproportionately impacted by IFRS and there is no kid glove treatment for those sectors. The accounting treatment for agriculture for example changes quite substantially under IFRS. But agriculture does not have a powerful set of regulators protecting their regulatees while the financial sector does.
What I found even more interesting was the different treatment of insurance companies and banks within the financial sector itself. Insurance companies will adopt IFRS in 2012 while banks get an extra year. Is this because insurance companies do not stand to lose much from IFRS and might even stand to gain, while banks stand to lose a lot more?
If one looks only at the complexity of the transition to IFRS, it is not possible to argue that the transition is easier for insurance companies than for banks. Insurance companies too have large investment portfolios and they too will have to contend with all the complexities of IAS 39. In addition, there is an entire accounting standard (IFRS 4) for the insurance industry and IFRS 4 is by no means a model of simplicity. The insurance regulator (IRDA) has a 200 page report describing the implications of IFRS for Indian insurance companies.
Nor is it true that contemplated changes in IFRS will impact banks more and that therefore it makes sense for them to transition directly to the revised standards as and when they come out. IFRS 4 relating to insurance is explicitly described as Phase I of the IASB’s insurance project and Phase II promises drastic and fundamental changes in the accounting approach.
No, I do not see any strong argument why it is in the public interest for insurance companies to converge to IFRS a year ahead of banks. It is obvious however that it is in the interest of the banks themselves to postpone IFRS because of the stringent treatment of held to maturity investments. A cynic would say that regulators in every country and every sector are in danger of being captured by their regulatees.
I think this is a powerful reason for not mixing up regulatory capital and accounting capital. It would be nice if regulators could accept that accounting is for investors and agree to stay away from interfering in it. Regulators are free to collect whatever data they want and to define capital and profits in whatever way they want. They are free to ignore everything that the accountants put out. That would make it easier for accounting standards to provide what is most relevant and useful for investors.
Posted at 2:02 pm IST on Tue, 13 Apr 2010 permanent link
Categories: accounting, banks, regulation
Prudent at night but reckless during the day
I have been thinking a lot about what the court examiner’s report on Lehman tells us about other banks. Looking at many things mentioned in the report, my conclusion is that even the banks that are prudent at night become quite reckless during the day. Banks that are careful about their end of day (overnight) exposures seem to be happy to assume very large exposures during the day provided they believe that the position will be unwound before close of the day.
My first example of this phenomenon is a repo transaction undertaken by Barclays after it bought a major part of the Lehman broker dealer business (LBI) in the bankruptcy court. The examiner describes this transaction and its consequences in detail in his report:
The parties then began to implement ... a repo transaction between LBI and Barclays under which Barclays would send $45 billion in cash to JPMorgan for the benefit of LBI, and Lehman would pledge securities to Barclays. Barclays planned to wire the $45 billion cash to JPMorgan in $5 billion components, and Barclays (actually, Barclays’ triparty custodian bank, BNYM) would receive collateral to secure each $5 billion cash transfer. (Page 2165, Volume 5)
Shortly after noon on Thursday, Barclays wired the initial $5 billion of cash to JPMorgan for the benefit of LBI. (Page 2166, Volume 5)
... a senior executive from JPMorgan then contacted Diamond, and asked Barclays to send the $40 billion in cash all at once to expedite the process. According to Ricci, the JPMorgan executive provided Diamond with assurances that, if Barclays sent the $40 billion in cash, JPMorgan would follow up promptly in delivering the remaining collateral. Early Thursday evening, Barclays wired the remaining $40 billion in cash. Barclays did not receive $49.6 billion in securities that evening. Although both the FRBNY and DTCC kept their securities transfer facilities open long after their usual closing times, by 11:00 p.m. on Thursday evening, September 18, Barclays had received collateral with a marked value of only approximately $42 billion. (Page 2167, Volume 5)
To put matters in perspective, $40 billion was roughly equal to the total shareholders’ equity of Barclays at that time (according to the June 30, 2008 balance sheet, shareholders’ equity was £ 22.3 billion or $40.5 billion at the exchange rate of 1.82 $/£ on September 18, 2008). In other words, Barclays was willing to take an unsecured intraday exposure to another bank equivalent to roughly its entire worth. I am sure that an overnight unsecured exposure of this magnitude would be regarded as reckless and irresponsible, but an intraday position was acceptable.
My second example is the triparty clearing bank services provided by JPMorgan to Lehman and other broker-dealers. The examiner’s report provides a lucid explanation of the whole matter:
In a triparty repo, a triparty clearing bank such as JPMorgan acts as an agent, facilitating cash transactions from investors to broker- dealers, which, in turn, post securities as collateral. The broker-dealers and investors negotiate their own terms; JPMorgan acts only as an agent. Triparty repos typically mature overnight ... Each night collateral is allocated to investors ... The investors, in turn, provide overnight ... funding to the broker-dealer. The following morning, JPMorgan “unwinds” the triparty repos, returning cash to the triparty investors and retrieving the securities posted the night before by the broker-dealer. These securities then serve as collateral against the risk created by JPMorgan’s cash advance to investors. During the business day, broker-dealers arrange the funding that they will need at the close of business through new triparty-repo agreements. This new funding must repay the cash that JPMorgan advanced during the business day... (Page 1086-87, Volume 4)
The premise of a triparty repo is that it constitutes secured funding in which the lender (investor) has the opportunity to sell the collateral immediately upon a broker-dealer’s (borrower’s) failure to pay maturing principal. (Page 1092, Volume 4)
To guard against the possibility of the investor realizing less than the loan amount in a liquidation scenario, the borrower must pledge additional “margin” (i.e., additional collateral) to the lender – for example, $100 million of Treasury securities in exchange for $98 million in cash. (Page 1092, Volume 4)
As triparty-repo agent to broker-dealers, JPMorgan was effectively their intraday triparty lender. When JPMorgan paid cash to the triparty investors in the morning and received collateral into broker-dealer accounts (which secured its cash advance), it bore a similar risk for the duration of the business day that triparty lenders bore overnight. If a broker-dealer such as LBI defaulted during the day, JPMorgan would have to sell the securities it was holding as collateral to recoup its morning cash advance. (Page 1093, Volume 4)
Through February 2008, JPMorgan gave full value to the securities pledged by Lehman in the NFE calculation and did not require a haircut for its effective intraday triparty lending. Consequently, through February 2008, JPMorgan did not require that Lehman post the margin required by investors overnight to JPMorgan during the day. (Page 1094, Volume 4)
That last paragraph left me stunned. Why would the clearing bank not impose a haircut/margin on its intraday secured lending, while the repo lenders require such a haircut on their overnight lending? It makes no sense to me. First, the clearing bank takes a large concentrated exposure, whereas the overnight exposure is distributed over a large number of lenders. If anything, the intraday lender should be more worried and should be charging a higher margin. Second, most financial asset prices are more volatile when markets are open than when they are closed. Since prices are expected to change more during the day than during the night, the intraday lender actually needs a higher margin. Yet, the intraday lender did not ask for any margin at all till February 2008!
For a moment, I thought that the clearing bank was not charging margins because it was willing to take some unsecured exposure to the broker-dealer, and was dispensing with the margin on the assumption that the margin would be smaller than the unsecured exposure it was prepared to accept. But no, the examiner’s report clearly states that the margin free secured lending was over and above the maximum unsecured lending that JPMorgan was willing to provide:
JPMorgan used a measurement for triparty and all other clearing exposure known as Net Free Equity (“NFE”). In its simplest form, NFE was the market value of Lehman securities pledged to JPMorgan plus any unsecured credit line JPMorgan extended to Lehman minus cash advanced by JPMorgan to Lehman. An NFE value greater than zero indicated that Lehman had not depleted its available credit with JPMorgan. (Page 1093, Volume 4)
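A minimal sketch of the NFE measure as the examiner describes it (the figures below are invented) makes the asymmetry clear:

```python
# Net Free Equity as described in the examiner's report:
# NFE = market value of securities pledged + unsecured credit line - cash advanced.
# All figures are made up for illustration.

def net_free_equity(pledged_collateral, unsecured_line, cash_advanced):
    return pledged_collateral + unsecured_line - cash_advanced

# With full (haircut-free) value given to pledged securities, a dollar of
# collateral supports a dollar of intraday advance:
print(net_free_equity(pledged_collateral=20e9, unsecured_line=2e9,
                      cash_advanced=21e9))   # > 0, so credit not exhausted

# By contrast, an overnight repo lender on the same collateral would have
# advanced only, say, 98 cents on the dollar (a 2% haircut), while the
# intraday NFE test required no haircut at all until February 2008.
```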
Yet, on a reading of the entire examiner’s report, JPMorgan comes across as a bank with a very robust risk management culture. Again and again one sees that in the most turbulent of times, the bank is seen to be sensitive to various market risks and operational risks and appears to have taken corrective action very quickly. The only conclusion that I can come to is that even well run banks are complacent about intraday risks.
Why should banks be prudent at night but reckless during the day? Probably this has got to do with the fact that nobody prepares intraday balance sheets and so positions that are reversed before close of day do not appear in any external reports (and probably not in many internal reports either). Probably, it has to do with the primordial fear of darkness dating back to our evolutionary struggles in the African Savannah. As the biologists remind us, you can take the man out of the Savannah, but you can not take the Savannah out of the man.
Posted at 4:55 pm IST on Thu, 1 Apr 2010 permanent link
Categories: risk management
Indian Financial Stability and Development Council
I wrote a column in the Financial Express today about the proposal to create a Financial Stability and Development Council in India as a potential precursor to an apex regulatory body.
The announcement in the Budget speech this year about the setting up of a Financial Stability and Development Council (FSDC) has revived the long-standing debate about an apex regulatory body. Much of the debate on FSDC has focused on the politically important but economically trivial question of the chairmanship of the council. I care little about who heads FSDC—I care more about whether it has a permanent and independent secretariat. And I care far more about what the FSDC does.
The global financial crisis has highlighted weaknesses in the regulatory architecture around the world. Neither the unified regulator of the UK nor the highly fragmented regulators of the US came out with flying colours in dealing with the crisis. Everywhere, the crisis has brought to the fore the problems of regulatory overlap and underlap. In every country, there are areas where multiple regulators are fighting turf wars over one set of issues, while more pressing regulatory issues fall outside the mandate of any regulator. Regulation and supervision of systemically important financial conglomerates is an area seen as critical in the aftermath of the crisis. It is an area that has been highly problematic in India.
The most important failure (and bail-out) of a systemically important financial institution in India in recent times was the rescue of UTI, which did not completely fall under any regulator’s jurisdiction. The most systemically important financial institution in India today is probably the LIC, whose primary regulator has struggled to assert full regulatory jurisdiction over it. Even the remaining three or four systemically critical financial conglomerates in India are not subject to adequate consolidated financial supervision. The global crisis has shown that the concept of a lead regulator as a substitute for effective consolidated supervision is a cruel joke. The court examiner’s report in the Lehman bankruptcy released this month describes in detail how the ‘consolidated supervision’ by the US SEC of the non-broker-dealer activities of Lehman descended into a farce. Even before that we knew what happened when a thrift regulator supervised the derivative activities of AIG.
Consolidated supervision means a lot more than just taking a cursory look at the consolidated balance sheet of a financial conglomerate. An important lesson from the global crisis is that we must abandon the silly idea that effective supervision can be done without a good understanding of each of the key businesses of the conglomerate. High-level consolidated supervision of the top five or top ten financial conglomerates is, I think, the most important function that the FSDC should perform drawing on the resources of all the sectoral regulators as well as the staff of its own permanent secretariat.
Another important function is that of monitoring regulatory gaps and taking corrective action at an early stage. Unregulated or inadequately supervised segments of the financial sector are often the source of major problems. Globally, we have seen the important role played by under-regulated mortgage brokers in the sub-prime crisis.
In India, we have seen the same phenomenon in the case of cooperative banks, plantation companies and accounting/auditing deficiencies in the corporate sector. Cooperative banks were historically under-regulated because RBI believed that their primary regulator was the registrar of cooperative societies. The registrar, of course, did not bother about prudential regulation. Similarly, in the mid-1990s, plantation companies and other collective investment schemes were regulated neither as mutual funds nor as depository institutions. Only after thousands of investors had been defrauded was the regulatory jurisdiction clarified.
As far as accounting and auditing review is concerned, the regulatory vacuum has not been filled even after our experience with Satyam. Neither Sebi nor the registrar of companies undertakes the important task of reviewing published accounting statements for conformity with accounting standards. There is an urgent need for a body like FSDC that systematically identifies these regulatory gaps and develops legislative, administrative and technical solutions to these problems. By contrast, I believe that the role of ‘coordination’ between regulators emphasised in the current title of the high-level coordination committee is the least important role of an FSDC. Some degree of competition and even turf war between two regulators is a healthy regulatory dynamic.
At a crunch, I do not see anything wrong in a dispute between two regulators (or between one regulator and regulatees of another regulator) being resolved in the courts. After all, the Indian constitution gives the judiciary the power to resolve disputes even between two governments!
My favourite example from the US is the court battle between the SEC and the derivative exchanges (supported by their regulator, the CFTC) that led to the introduction of index futures in that country. A truly independent regulator should be able and willing to go to court against another arm of the government in order to perform its mission.
Posted at 10:19 am IST on Fri, 19 Mar 2010 permanent link
Categories: regulation
Lehman and its computer systems
Perhaps, I have a perverse interest in the computer systems of failed financial firms – I blogged about Madoff and his AS400 last year. Even while struggling to cope with the fantastic 2,200 page report of the court examiner on Lehman, I homed in on the discussion about Lehman’s computer systems:
At the time of its bankruptcy filing, Lehman maintained a patchwork of over 2,600 software systems and applications. ... Many of Lehman’s systems were arcane, outdated or non-standard. Becoming proficient enough to use the systems required training in some cases, study in others, and trial and error experimentation in others. ... Lehman’s systems were highly interdependent, but their relationships were difficult to decipher and not well documented. It took extraordinary effort to untangle these systems to obtain the necessary information.
My limited experience suggests that outdated and unusable software is a problem in most large organizations. I do hope that the ongoing consumerization of information technology will help reduce these problems by putting intense pressure on corporate IT to reform their ways. Perhaps, organizations should consider releasing the source code of most of their proprietary software on their own intranet to help manage the complexity and user unfriendliness of their systems. Consumerization plus crowd sourcing might just be able to tame the beast.
Posted at 6:20 pm IST on Thu, 18 Mar 2010 permanent link
Categories: bankruptcy, technology