Monday, November 23, 2009

Webinar: Challenges in Liquidity Risk Management & Reporting

Over the years, regulators have tried to manage liquidity risk by issuing guidelines and recommendations. These guidelines were qualitative in nature, and a lot was left to firms' discretion. As regulatory bodies prepare to revamp the financial regulatory structure, liquidity risk is one area gathering a great deal of attention. In the aftermath of the current crisis, the FSA has laid down a new set of rules for liquidity risk management.

While the Basel Committee on Banking Supervision has issued principles and guidelines for liquidity risk management, the FSA has completely overhauled its liquidity framework, which was finalized and presented on 9 October 2009. The far-reaching overhaul, designed to enhance firms' liquidity risk management practices, is based on the lessons learned since the start of the credit crisis in 2007.

Aleri, a leading software company with unique CEP technology, joins with Headstrong, the securities industry's leading IT services firm, to review the FSA's liquidity reporting requirements and the unique IT and business challenges they pose to the financial services industry.


For more information and to register for the event on December 2, 2009, click here.

Wednesday, November 4, 2009

Stress Tests and Credit Ratings: An Update




To avoid any confusion, the title refers to two largely unrelated aspects of financial regulation. I have written about both topics before, and there has been enough going on, or depending on your perspective perhaps not enough, to merit revisiting them.
As I am sure you recall, earlier this year, when the banking community was slogging its way out of a crisis, the regulators determined that the best way to decide who needed more capital among the nineteen companies that benefited from the Government's largesse was to run stress tests. This wasn't anything new. Financial institutions, at least in theory, had made stress testing a part of their risk management procedures. Unfortunately, the recent crisis proved that stress testing had, to put it kindly, been found lacking.

In fairness, the concept is excellent, but the execution can be challenging. Complex questions need to be addressed realistically before such tests can commence: What scenarios will be used? How will the various interrelationships among portfolios, and among clients, be addressed? How will changes in correlations among financial instruments be estimated? (An excellent paper was produced on the subject by the Basel Committee on Banking Supervision, "Principles for sound stress testing practices and supervision", May 2009.) In addition, the results of such tests have to be at the core of risk management procedures and of discussions with senior management and Boards of Directors as well as with regulators.

Stress testing may well have failed to prevent the crisis because the results were not actively incorporated into bank decision making. However, it is also likely that the failure is attributable to the construction of, and assumptions built into, the tests. Presumably regulators addressed these issues before basing capital decisions at a very critical time on the results. Moreover, these fixes will likely become part of standard testing procedure. If my assumptions are correct, then wouldn't it raise confidence in the financial system and the soundness of individual institutions to know how these tests are conducted? Regulators appear to believe that transparency is good for the efficient functioning of markets. A little more transparency on the regulatory side would be helpful as well.

Now let's consider credit ratings. Two recent news items provide some insight into how rating agencies are being held accountable by the market. One item involves the National Association of Insurance Commissioners, which is considering hiring a firm (or firms) to assess the risk of insurance company investments, with particular emphasis initially on mortgage-backed securities. To put it kindly, the Commissioners appear to have no faith in rating agencies, which is understandable. If this effort is successful, what other regulatory bodies will begin to look to new sources for credit rating information, and what asset classes will join MBS in the process?

The other news item reflects on the value of ratings from the issuer and investor perspective. What do Highland Capital Management LP, Heineken NV, Gruppo Campari, Credit Suisse Group AG and Dubai have in common? They have all recently and successfully issued either bonds or complex structured securities without credit ratings. That's correct: no rating agencies were involved, and no ratings were announced as part of these deals. Investors apparently did their own research or relied on sources other than the traditional rating agencies to make credit-based decisions.

It may be premature to conclude that the big three rating agencies have lost their lock on this vital market function, but unassisted they have managed to reduce the barriers to entry to their business. The market, as it should, is responding.

Ben Wolkowitz, Headstrong November 2, 2009

Wednesday, October 14, 2009

Regulating Derivatives: The Case for Incentives

Increased regulation of derivatives is coming. At least that was the consensus view of the participants at a recent breakfast discussion I attended, "Regulating Derivatives in the Wake of the Financial Crisis", hosted by FinReg21. The discussion was led by former SEC Commissioner Edward H. Fleischman, currently the Chair of the International Securities Regulation Committee of the International Law Association, and Stephen Figlewski, Professor of Finance at the NYU Stern School of Business and editor of the Journal of Derivatives. The participants included a representative cross-section of the legal and business communities involved in these instruments.

This isn't a surprising conclusion given that there are submitted bills and bills in formation in Congress, and a Treasury plan on the table. Although there is overlap among these competing plans, there are significant differences in several key provisions. In no particular order, here is my take on the key issues in this discussion.

  1. Derivatives should only be used for hedging purposes. Translated, that means derivative trading will end, at least in this country. If only hedging is permitted, then every hedger will be put in the difficult position of finding another hedger with the opposite exposure. Speculators serve a real economic purpose in providing liquidity and making markets more efficient. There are plenty of orderly, stable markets rife with speculators; take the cash market for U.S. Government bonds, or futures, for example, which gets me to point 2.
  2. Most if not all derivatives should be exchange traded. This is one way of addressing the transparency issue. Exchange-traded instruments are transparent and OTC derivatives for the most part are not; therefore, requiring OTC derivatives to be exchange traded will make them transparent. Logical but impractical. The popularity of OTC derivatives is at least in part due to the market's capacity to tailor instruments to reflect precise risk exposures. Tailored instruments do not succeed on exchanges because the number of participants interested in trading the same tailored exposure is likely to be quite small; certainly not enough to justify an exchange listing. Therefore some exclusion has to be made for tailored derivatives; however, implementing that exclusion is not so easy, as will be discussed below.
  3. Derivatives should be cleared by a centralized facility (or facilities). This is a solid idea. Besides promoting transparency, central clearing reduces counterparty risk, assuming of course that the clearing agents are sufficiently well capitalized to actually guarantee the trades they settle. Confidence in the financial cushion provided by centralized clearing would benefit from having more than one agent; how many is optimal is a discussion for another time.
  4. Derivatives should be standardized (at least as much as is practical). Standardization facilitates exchange trading, but since exchange trading is not really necessary for greater transparency and is not a realistic objective for the entire market, who cares? Standardization would also clearly facilitate central clearing, and about this I do care. Although a non-standard instrument could be centrally cleared, it does not follow that all centrally cleared instruments could be non-standard; that would likely place too great a burden on the clearing agents.
  5. Given that there is benefit to standardization, how would it be implemented or enforced? Would teams of attorneys lay out in precise detail the terms and conditions of each standardized derivative? And even if they did, how long would it take to re-engineer standardized derivatives and make them non-standard? Why, you might ask, would anyone favor non-standardized derivatives? See 6.
  6. The answer is money. Opaque markets are more profitable than transparent markets, and therefore dealers will have a bias against transparency. Many recent innovations in fixed income markets are in part explainable as an ongoing march to create the next successful opaque, and highly profitable, market. OTC derivatives are a relatively recent manifestation of that motivation. There are OTC derivatives currently traded that appear to generate sufficient volume to support exchange trading, but there has been no groundswell of support from dealers to list them. Although I am not in favor of exchange trading requirements, as discussed above, centralized clearing, which I do favor, will also benefit from standardization. The question then is how to promote centralized clearing.
  7. I prefer incentives to mandates. Capital requirements can be used to encourage market participants to clear their derivative trades in a centralized facility. Make it more expensive to keep trades away from such facilities by specifying greater required capital for OTC derivatives that are not centrally cleared. Dealers are then incented to make their trades conform to the standards required by a centralized clearing agent. Only when the cost of clearing is so great that it approaches the magnitude of the capital charge would a dealer be inclined to favor direct clearing over central clearing. That situation is likely to occur for those derivatives that must be tailored to work and are sufficiently different that clearing them is complex. Further, the cost of direct clearing would have to be significantly lower to compensate for the difference in capital requirements.
  8. Centralized clearing is not my idea. It has floated through a number of proposals, including the one authored by the Administration. Academics and analysts have also gotten behind the idea. The hard part is how to set the differential capital requirements. Set them too high and the idea is a non-starter; set them too low and there is no incentive to move to central clearing. Finding the Goldilocks levels for capital requirements is complex and possibly even intractable. Some cajoling by regulatory authorities, in addition to capital requirements, may be needed to ensure success. Regardless, the more we can rely on objective capital requirements that provide incentives to encourage centralized clearing for the vast majority of, if not all, derivatives, so much the better.
Ben Wolkowitz Headstrong September 27, 2009

Wednesday, September 9, 2009

Know Your Customer: Driving Value through Trusted Data - A Breakfast Seminar

In just one morning, get invaluable updates on strategies and technologies you can use to better understand and protect your most valuable assets - your data and your customers.
  • Achieve a 360-degree view of the customer by improving the quality of your master data
  • Reduce risk with complete, accurate, consistent, auditable and secure data
  • Establish new levels of customer trust by protecting data in your development and testing environments
  • Improve your fraud investigation efforts through accurate matching, even in the face of deliberate deception
The seminar is co-hosted by Headstrong, a global financial technology consultancy with cutting-edge expertise in the financial sector, and Informatica, a global leader in data integration.

About the speakers
Michael Destein, Director of Solutions Marketing for MDM at Informatica, has spent his career focusing on data access, data integration, and data management. In presales, product management, and product marketing roles at Borland, Active Software, webMethods, and Siperian, he has learned best practices and applies them to achieve business value.

Susan Palm, Headstrong, has a strong background in Risk Management and Compliance from her 25+ year career as Senior Vice President of the Technology Information Group at Wells Fargo and Company.

Click here for more information.
Click here to register for this free event.

Thursday, September 3, 2009

Client Identity Protection & Data Masking

Why Financial Firms need Data Masking, now more than ever!

Risk managers struggle to define what operational risk even is, let alone build models for Operational Risk Management (ORM), and rightly so, because operational risk is very broad. Some define operational risk simply as "all risk other than market and credit risk."

According to the Basel Committee: "Operational risk is defined as the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. This definition includes legal risk, but excludes strategic and reputational risk."

So trade failures, non-compliance with regulations, loss of data due to natural calamity, loss of key personnel, lawsuits, and the like are all examples of operational risk. Thus, while one is aware (most of the time) of one's exposure to market and credit risk, the same can't be said for operational risk. How does one define the probability and severity of a failed trade, or of data theft?

One specific area under operational risk management that is making headlines today is sensitive data protection, or to be precise, client identity protection. Although there have been regulations around this issue for some time, in the aftermath of the sub-prime crisis regulators are raising it with much more emphasis. One of the main reasons for their concern is the use of production data in non-production environments, such as the development and testing of IT applications. This article surveys various means of protecting such data and suggests guidelines for effective data protection policies.


The Regulations
  1. The Gramm-Leach-Bliley Act: Provides Privacy and Safeguards Rules to protect personal information held by U.S. financial institutions. The Privacy Rule speaks largely to information collection and sharing; with respect to data exposure, it mandates that certain information, such as account numbers, cannot be shared with third parties. The Safeguards Rule speaks more to protecting information.

  2. The Identity Theft Red Flags Rules: The final rules require each financial institution and creditor that holds any consumer account, or other account for which there is a reasonably foreseeable risk of identity theft, to develop and implement an Identity Theft Prevention Program (Program) for combating identity theft in connection with new and existing accounts. The Program must include reasonable policies and procedures for detecting, preventing, and mitigating identity theft, and enable a financial institution or creditor to 1) identify relevant patterns, practices, and specific forms of activity that are "red flags" signaling possible identity theft and incorporate those red flags into the Program; 2) detect red flags that have been incorporated into the Program; 3) respond appropriately to any red flags that are detected, to prevent and mitigate identity theft; and 4) ensure the Program is updated periodically to reflect changes in risks from identity theft.

  3. PCI DSS: The Payment Card Industry Data Security Standard is a set of requirements for securing payment account data. The PCI DSS affects all companies that handle payment card data, and those companies are myriad. The requirements are straightforward, and include "protect stored cardholder data" and "restrict access to cardholder data by business need-to-know".

  4. OCC BULLETIN 2008-16: This bulletin reminds national banks and their technology service providers that application security is an important component of their information security program. All applications, whether internally developed, vendor-acquired, or contracted for, should be subject to appropriate security risk assessment and mitigation processes. Vulnerabilities in applications increase operational and reputation risk as unplanned or unknown weaknesses may compromise the confidentiality, availability, and integrity of data.

Of these, the last two are dedicated to the financial services industry. One study shows that financial firms are responsible for protecting almost 85% of their entire data.

The Cost of Data Theft

Below are a few numbers on data loss and theft.

  1. Since 2005, over 250 million customer records containing sensitive information have been lost or stolen. (Privacy Rights Clearinghouse)

  2. "The 2008 breach report revealed 656 reported breaches at the end of 2008, reflecting an increase of 47% over last year's total of 446." (Identity Theft Resource Center, 2009)

  3. "62% [of respondents] use live data for testing of applications and 62% of respondents use live data for software development." (Ponemon Institute, December 2007)

  4. The average cost of a data breach has risen to $202 per customer record, which translates to roughly 20 million USD per 100,000 records. (Ponemon Institute, 2009)

  5. Recently, the credit card processor associated with the TJX data breach was fined $880,000 for failing to meet the PCI DSS standard. In the same incident, TJX paid a $40.9 million settlement to Visa.

Many firms have settled lawsuits for millions of dollars. Please note that although one can measure the cost of fines and lawsuits, the cost of lost reputation and customer trust is much harder to measure. Needless to say, data protection is a serious issue and is becoming a bigger concern with each passing year, as is evident from the Ponemon Institute studies.

As discussed above, one of the main concerns of regulators pertains to the use of production, or live, data in non-production environments. Firms often prepare test beds for testing various IT applications and products before deploying them for use. Since the testing of applications requires "real-like" data, almost 62% of firms use production data to test these applications. This poses serious risks of identity and data theft. Firms take great pains to ensure the safety of their live production data, but somehow the same standards are not applied when that data is copied to a non-production environment.

What is Data Masking

Data masking is the process of obscuring (masking) specific data elements within data stores. It ensures that sensitive data is replaced with realistic but not real data. The goal is that sensitive customer information is not available outside of the authorized environment. Data masking is typically done while provisioning non-production environments so that copies created to support test and development processes are not exposing sensitive information. Masking algorithms are designed to be repeatable so referential integrity is maintained.
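To make the repeatability point concrete, here is a minimal Python sketch of deterministic masking using a keyed hash; the key, function name, and table layout are our own illustrative assumptions, not part of any particular masking product:

```python
import hashlib
import hmac

# Illustrative secret; in practice the key is managed outside non-production.
MASKING_KEY = b"keep-this-key-out-of-test-environments"

def mask_customer_id(customer_id):
    """Deterministically mask an identifier with a keyed hash.

    The same input always maps to the same masked value, so joins between
    masked tables (referential integrity) still work, while the original
    value cannot be recovered without the key.
    """
    digest = hmac.new(MASKING_KEY, customer_id.encode(), hashlib.sha256)
    return "CUST-" + digest.hexdigest()[:12].upper()

# The same customer masks identically in both tables, so the join survives.
accounts = [{"cust_id": "C1001", "balance": 2500.00}]
trades = [{"cust_id": "C1001", "symbol": "XYZ", "qty": 100}]
for row in accounts + trades:
    row["cust_id"] = mask_customer_id(row["cust_id"])
assert accounts[0]["cust_id"] == trades[0]["cust_id"]
```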


Differences between encryption and masking

Encryption is appropriate when you want only people with the right "keys" to view the data. Encrypted data, however, loses all its properties and hence can't be used by developers and QA professionals who need "real-like" data for testing applications. In contrast, data masking prevents abuse while ensuring that the properties of the data remain as they are in the production environment.


Different Methods For Data Masking

The following is a brief discussion of various methods used for data masking, including when each method is best used.

Nulling

  • Deleting a column of data by replacing it with NULL values
  • Useful in role-based access, when you don't want to reveal the data at all
  • Can't be used in testing environments, as data properties are lost
  • NULLing on its own is not a full data masking technique, but it is used alongside other methods, e.g. credit card numbers masked as 4234-XXXX-XXXX-6565 (see the sketch below)
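As a simple illustration of that last point, here is a hedged Python sketch of partial masking; the function name and the four-digit grouping are our assumptions for the example:

```python
def mask_card_number(card_number, keep_prefix=4, keep_suffix=4):
    """Replace the middle digits of a card number with 'X' placeholders,
    keeping the prefix and suffix visible."""
    digits = card_number.replace("-", "").replace(" ", "")
    middle = "X" * (len(digits) - keep_prefix - keep_suffix)
    masked = digits[:keep_prefix] + middle + digits[-keep_suffix:]
    # Re-group into blocks of four for readability.
    return "-".join(masked[i:i + 4] for i in range(0, len(masked), 4))

print(mask_card_number("4234-5678-9012-6565"))  # -> 4234-XXXX-XXXX-6565
```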

Substitution

  • Randomly replacing the contents of a column of data with information that looks similar but is completely unrelated to the real details
  • Preserves the data properties
  • Since there is no logic or relationship involved (unlike ageing or reordering), one has to store a large amount of random substitute data
  • Finding the required random data to substitute, and developing the procedures to accomplish the substitution, can be a major effort
  • Generating large amounts of random "real-like" data is difficult in some cases, e.g. street addresses
  • Useful for generic data like names, addresses, and numerical data with no intrinsic properties (credit card prefixes and suffixes, etc.); a sketch follows below
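A minimal sketch of substitution, assuming a tiny pool of surrogate names (a real deployment would draw from a large dictionary of realistic names and addresses); deriving the choice from a hash of the input keeps the substitution repeatable across tables and runs:

```python
import hashlib

# Illustrative substitute pool; real pools are far larger.
SUBSTITUTE_NAMES = ["Alice Morgan", "Rahul Mehta", "Chen Wei", "Maria Lopez"]

def substitute_name(real_name):
    """Replace a real name with an unrelated but realistic-looking one.

    Hashing the input to pick the substitute means the same real name
    always maps to the same fake name, wherever it appears.
    """
    digest = hashlib.sha256(real_name.encode()).digest()
    index = int.from_bytes(digest[:4], "big") % len(SUBSTITUTE_NAMES)
    return SUBSTITUTE_NAMES[index]

print(substitute_name("John Q. Customer"))  # same output on every run
```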

Shuffling/Reorder

  • The data in a column is randomly moved between rows until there is no longer any reasonable correlation with the remaining information in the row
  • Since the data is only jumbled, the end user still has access to the entire set of data and can perform some meaningful queries on it
  • Shuffling algorithms fail if they are simple and can be easily decoded
  • It is useful only on large amounts of data
  • It should be used along with techniques such as ageing and variance, which shuffle and also increase/decrease values by some percentage (a sketch of plain shuffling follows below)
  • On the plus side, this is one of the easiest and fastest ways of masking data
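Here is a short sketch of plain column shuffling, under the assumption that the data lives in simple row dictionaries; a production tool would work against database tables instead:

```python
import random

def shuffle_column(rows, column, seed=1234):
    """Randomly redistribute one column's values across rows, in place.

    Every value stays in the data set (column-level aggregates are
    unchanged), but each value is detached from the rest of its row.
    """
    values = [row[column] for row in rows]
    random.Random(seed).shuffle(values)
    for row, value in zip(rows, values):
        row[column] = value

customers = [
    {"name": "A. Smith", "salary": 90000},
    {"name": "B. Jones", "salary": 120000},
    {"name": "C. Brown", "salary": 75000},
]
shuffle_column(customers, "salary")  # salaries reordered, names untouched
```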

Numeric Alteration

  • Increase/decrease numeric values by a percentage
  • The percentage can be fixed or random, but is selected so that the data stays within permissible or probable values
  • It is generally used in conjunction with other techniques (a sketch follows below)
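A hedged sketch of numeric alteration, with the bound and function name chosen for illustration:

```python
import random

def perturb_amount(value, max_pct=0.10, seed=None):
    """Scale a numeric value up or down by a bounded random percentage.

    max_pct caps the perturbation so masked values stay within a
    plausible range (+/-10% by default).
    """
    rng = random.Random(seed)
    factor = 1.0 + rng.uniform(-max_pct, max_pct)
    return round(value * factor, 2)

print(perturb_amount(50000.00, seed=7))  # a salary moved by at most 10%
```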

Gibberish Generation

  • Given any input, the computer generates output which is random but which has the same statistical distribution of characters or combinations of characters. (A character may be a letter, a digit, a space, a punctuation mark, etc.)
  • In level 1 gibberish, the output has the same distribution of single characters as the input. For example, the probability of seeing characters like "e" or "z" or "." will be approximately the same in the output as in the input. In level 2 gibberish, the output has the same distribution of character pairs as the input. For example, the probability of seeing a pair like "th" or "te" or "t." will be approximately the same in the output as in the input.
  • In general, in level n gibberish, the output has the same distribution of groups of n characters (n-tuples) as the input.
  • Just like encryption, it renders the data meaningless
  • Should be used with role-based access to data, i.e., when you want to selectively shield data based on the role/purpose of the individual (a level-1 sketch follows below)
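Here is a minimal sketch of level-1 gibberish generation as described above; the function name is ours. A level-2 version would sample each character conditional on its predecessor (a first-order Markov chain over character pairs), and level n generalizes to n-tuples:

```python
import random
from collections import Counter

def level1_gibberish(text, length=None, seed=0):
    """Generate level-1 gibberish: random text whose single-character
    frequencies approximate those of the input."""
    counts = Counter(text)
    chars = list(counts.keys())
    weights = list(counts.values())
    rng = random.Random(seed)
    n = len(text) if length is None else length
    return "".join(rng.choices(chars, weights=weights, k=n))

sample = "the quick brown fox jumps over the lazy dog."
print(level1_gibberish(sample))  # meaningless, but statistically similar
```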

Currently, there are two main approaches to data masking in non-production environments:

  1. EML (Extract, Mask, and Load): Data is extracted from the production database, masked, and then loaded onto the pre-production server. It is useful when loading large amounts of data.

  2. IPM (In-Place Masking): Data is loaded directly into the non-production database, where specific columns are masked before the data is released to QA and developers. Useful when the data volume is smaller and you have well-defined sensitive data to protect. (A sketch of the EML flow follows below.)
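To tie the pieces together, here is a hedged sketch of an EML-style flow that reuses the helper functions sketched earlier in this article; the column names and the in-memory "databases" are invented for illustration:

```python
def extract(production_rows):
    """Extract: copy rows out of the production source (stubbed here)."""
    return [dict(row) for row in production_rows]

def mask(rows):
    """Mask: apply column-specific masking before data leaves production."""
    for row in rows:
        row["cust_id"] = mask_customer_id(row["cust_id"])  # keyed hash
        row["name"] = substitute_name(row["name"])         # substitution
        row["card"] = mask_card_number(row["card"])        # partial masking
    return rows

def load(rows, target):
    """Load: write the masked rows into the pre-production target."""
    target.extend(rows)

production = [{"cust_id": "C1001", "name": "John Q. Customer",
               "card": "4234-5678-9012-6565"}]
pre_production = []
load(mask(extract(production)), pre_production)
```

Under IPM, the same mask() step would instead run inside the non-production database, before QA and developers are granted access.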

By masking data, one can ensure that it retains its properties and can still be used for analytical purposes. Also, given that almost 70% of all data thefts are internal, one needs to ensure that some form of data masking is employed within the organization to counter internal threats.

Guidelines for Effective Data Masking

Start with a few applications. As discussed above, financial firms need to protect almost 85% of their entire data. If a firm can protect 15% of its data each year, that is a good achievement.

  1. One needs to understand that, as discussed under the various methods above, one method is generally not sufficient for data protection, and each has its pros and cons.
  2. Ideally, data masking should be combined with role-based access to data to provide double protection.
  3. The IT side is just one aspect of data masking. The first thing firms should consider is defining a firm-wide data protection policy. Without defining key risk indicators (KRIs) and effective ways to measure them, one can't succeed with any operational risk initiative.
  4. To save cost, firms should consider third-party software vendors for IT products and solutions around data masking. The policies and procedures, however, should be maintained in-house.

Abhishek Dhall Headstrong August 2009

Wednesday, August 26, 2009

Regulatory Reform - An Update and Assessment

With Congress in summer recess it is an appropriate time to assess what has happened in this once in a lifetime opportunity to reform financial regulation. It is enticing to say not much and end this note now, but I can’t resist the opportunity to bore you a bit with my review of recent events and a cautionary note as we go forward.

In reality, not much has been accomplished in terms of actual reform, although there continue to be a substantial number of proposals on the table, including the Administration's proposal, which is comprehensive in approach. Congress has institutionalized bank compensation reviews and oversight, although it is unclear precisely how this will work and what in fact it will accomplish. Also, stockholders will now be given the right to vote on compensation packages for key executives, although such votes are non-binding. There must be some logic to that little piece of meaningless reform, but I am hard pressed to tell you what it is.

The more substantial proposals have gotten bottled up for several different reasons. My award for the most colorful moment in a not particularly colorful process was when the heads of the major regulatory agencies brought their turf war to a Congressional hearing. It was reported that the generally unflappable Secretary of the Treasury was less than subtle in voicing his dissatisfaction to the aforementioned group after that fiasco. Can’t say I blame him although it is a bit unusual for an Administration official to address the heads of supposedly independent regulatory agencies in such a way.

This lack of tangible accomplishments doesn't mean the game is over. When Congress returns, financial industry regulatory reform is likely to get attention, assuming health care insurance doesn't become completely overwhelming. What I find concerning at this stage is the likelihood of compromise resulting in costly regulation without attendant benefits adequate to justify the changes. We have had a hint of that already. The SEC seems serious about adding to the industry's data collection burden by requiring that information on activity in over-the-counter derivatives be provided periodically. This information will then be made public, but only in aggregated form and with a one-month lag. I suppose in about 10 years that may generate sufficient data points to support the empirical part of a Ph.D. dissertation, but I cannot understand what else will come of this additional collection burden. Will SEC staff be examining this data as it comes in, and if so, what will they be looking for? How will what they are looking for differ from what the examiners already have on their list? You get the point. Financial institutions will be adding to their costs with no apparent associated benefits.

The aspect of regulation that seems to get little to no attention is that compliance is not free; in fact, it can be very expensive. IT costs can escalate quickly when firms have to reengineer processes to comply effectively. Obviously there is no increase in revenue associated with these regulatory activities. That puts pressure on banks to find sources of additional revenue to counter the added costs. A particularly appealing source is fees. The high-visibility recent changes in credit card terms are at least in part a response to an increase in the cost of doing business. (The other part is the curtailment of profitable, albeit risky, activities, either because of regulation and/or management response to large losses.) There are other knock-on effects that are worthy of further discussion, perhaps in a future note. For now we can agree that when bank costs are increased by added regulation and compliance, at least some of those costs are likely to be passed on to the banks' customers. My cautionary note is that regulators should take these costs into consideration when reforming regulation, to ensure that what we are paying for is worth the price.

Ben Wolkowitz Headstrong August 25, 2009

Tuesday, August 25, 2009

Financial Regulatory Reform. Is this all there is?

The proposed changes to financial regulation announced by President Obama on June 17th are significant, but somewhat less aggressive than I had expected given that this is supposed to be that once in a lifetime opportunity to reform financial regulation. The following are the highlights of the announcement.

  1. Regulators will look at the financial system as a whole, not simply at individual institutions, to avoid a replay of the current situation. As expected, the Federal Reserve will be charged with regulating systemic risk. Moreover, to ensure that there are no gaps in regulation and to facilitate oversight of the entire financial system, a new organization, the Financial Services Oversight Council, will be established. Chaired by the Secretary of the Treasury, it will include the heads of all financial industry and market regulators.


  2. A new regulatory agency will be established to look after the consumers’ interests in the financial markets. The Consumer Financial Protection Agency will be akin to a Consumer Products Safety Commission for financial instruments including mortgages.


  3. Simplification of the regulatory agency structure falls short of resolving the current labyrinth of agencies and responsibilities. One positive development is that the Office of Thrift Supervision is no more. Its responsibilities will be merged with those of the Comptroller of the Currency into a new agency, the National Bank Supervisor. This reflects the end of thrift charters; all federally chartered depository institutions will be banks. However, the CFTC will continue to be an independent agency rather than being merged with the SEC as many observers had expected. The historical and arcane distinction between contracts, as in futures contracts, and securities will prevail for no particularly compelling reason that I can discern. To make it interesting, both agencies will be given enhanced authority to regulate derivatives.


  4. The Federal Reserve wins a little and loses a little. On the plus side it gets to oversee all large institutions whose failure could jeopardize the stability of the financial system. On the negative side some of the authority that will go to the Consumer Financial Protection Agency had belonged to the Fed.


  5. Hedge funds exceeding a to-be-specified size threshold will be required to register with the SEC, and with registration comes the requirement to open one's books to the regulator. Surprisingly, expanded registration will also be extended to private equity and venture capital firms.

There are also a substantial number of less dramatic but equally important provisions contained in the proposed regulatory reforms. One that I particularly like would give the SEC the authority to require a company to allow shareholders to vote on executive compensation packages. Opponents of the government's potential involvement in compensation guidelines have argued that this is the responsibility of shareholders, knowing full well that shareholders have no real authority in this area. Now they will (or, more accurately, might).

Another interesting provision will require the issuing institution to retain 5% of the loans in a pool underlying a securitized debt issue. This is less than the 20% that had been suggested by some European regulators, but probably still a sufficient amount to provide more discipline to the securitized debt market. A 20% requirement might well have been tantamount to ending that market.

Unfortunately, the rating agencies come in for little attention. Mention is made of addressing compensation arrangements to avoid conflicts of interest, but the brief treatment of this topic relative to other areas leaves the impression that it is not a major priority. Equally puzzling is the lack of recognition that the supervisory process is broken. Although there are loopholes in regulation, if supervisors had been doing their jobs it is unlikely that we would have gotten into all the difficulties we did. New regulations and regulatory structures are one thing; robust enforcement is another.

These proposals will have to be approved by Congress, which will no doubt be assaulted by the lobbying efforts of the financial industry, one of the most powerful and certainly one of the wealthiest lobbies in the country. I still expect the outcome will retain much of the overall reach of these proposals because, at least for now, there appears to be widespread support for regulatory reform. But as the saying goes, the devil is in the details. Certain trends reflected in these proposals seem unstoppable, including greater oversight of the entire financial system, not just its component parts, more consumer protection, and greater transparency and reporting.

Ben Wolkowitz Headstrong June 18, 2009

Briefing Note on Regulation: FASB

Early this week the Financial Accounting Standards Board (FASB) issued guidelines on the valuation of instruments for which there is no actively traded market. Mark to market has been the rule for the marking of financial assets, and when an actively traded market does not exist there are approved procedures for coming up with a proxy for the market. For example, in the case of aged corporate bonds, which commonly do not trade for days at a time, it is commonplace to identify cohorts, i.e. bonds with similar characteristics, and then discount the price of the actively traded cohort by a reasonable amount to reflect the illiquidity of the security that is not actively traded. In other cases, such as tailored derivatives, a sample of dealer estimates of a fair market price is often obtained and averaged. Similarly, private placements can be compared to market-traded securities and again discounted for their illiquidity. Without belaboring the point, illiquid securities are not a rarity and there are established procedures for assigning them a market price.

Why tinker with the system now? Banks are currently burdened with significantly more non-traded securities than they had anticipated. The proliferation of mortgage-backed securities, credit default swaps and other categories of ill-considered securities has created a burden on bank earnings and capital that in a couple of well-advertised cases has proven fatal. The market estimate for such securities is, not surprisingly, a fraction of par value; any reasonable market price would have to be quite low.

Banks argue, with some logic, that although these securities may have been overvalued in the past, they are far from worthless. They generate a revenue stream, and in time the vast majority of the underlying collateral will survive, having less of an impact on these securities than the market currently estimates by its pricing. FASB has been under pressure to allow banks to generate alternative valuations that would basically reflect the discounted stream of cash flows over some average life, properly adjusted for likely mortgage defaults. Although this may sound logical, it depends on the banks becoming more adept at valuing collateral than they have proven so far. The only thing we can be fairly certain of is that banks will value these securities higher than any market-driven estimate, and that's where the timing comes in.


Recently the Treasury announced a well-received plan to combine public and private capital to acquire these toxic assets from banks. The core issue in all such proposals is how the two parties to these trades will come together on price. Until now there was little doubt that there would be a wide spread between the bid and the offer. However, given the way this particular plan is structured, there was optimism in the market that the spread, at least on securities, would not be insurmountable, especially given the inducements for buyers in this plan. Clearly anything that contributes to a widening of the spread increases the likelihood that market-clearing prices will not be discovered. FASB's guidelines are likely to make that happen. While the sellers, the banks, are valuing securities on the basis of self-serving assumptions, the buyers will be looking for guidance from the market. This game isn't going to be pretty if each side is playing by different rules. I doubt that the Treasury has sent a thank-you note to FASB for its ill-timed contribution to continued financial instability.


Post Script: As of July 7th this program of making a market in toxic assets has yet to begin.
Ben Wolkowitz Headstrong April 15, 2009

Briefing Notes on Regulation: Credit Default Swaps
This week the credit default swaps market went through a momentous change that in time could dramatically affect that market. The International Swaps and Derivatives Association (ISDA) overhauled the $27 trillion market for credit default swaps on corporate and sovereign debt by standardizing the terms of many of these instruments and incorporating characteristics that will bring them more in line with the underlying bonds. Aptly named the Big Bang by ISDA, the new standards were agreed to by some 1,800 market participants including banks, hedge funds and money managers.
Standardizing swaps' terms makes them more transparent and facilitates trading. It is a direct and effective way to address the accusation that the market is too opaque. Standardizing the terms is also a significant step toward centralized clearing and, when necessary, settlement of these swaps. By comparison, achieving the standardization of the terms of interest rate swaps was a much longer and more difficult process, and far fewer than 1,800 market participants agreed to those protocols on day one; that contrast alone is a good reason for calling this the Big Bang.
As significant as these changes are, not all CDS market problems are solved. Notably, not all CDS are covered. Omitted are CDS based on mortgages and convoluted underlying instruments, just the type of CDS that AIG specialized in insuring. Also, concerns over counterparties to these swaps will be resolved only when there is a clearinghouse involved with sufficient financial backing to assure participants that counterparty defaults on their obligations will be covered by the clearinghouse.
Minimal resistance from banks is surprising, given that this will adversely affect their profit margins on these instruments. The history of fixed income markets demonstrates that once a sector becomes transparent, the spread (between the bid and the offer) shrinks dramatically. I suppose the banks were being realistic, recognizing that maintaining previous market practices would ensure a small to non-existent trading environment. If anything, transparency in markets generally results in an increase in their size. As the expression goes, you can make it up in volume.
Ben Wolkowitz Headstrong April 9, 2009

Wednesday, August 19, 2009

Briefing Notes on Regulation and Compliance: G-20

The communiqués that came out of the G-20 meetings identified at least three areas that may well affect our clients.

  1. The G-20 agreed to the creation of a global regulator, the Financial Stability Board (FSB), to work to prevent financial and economic crises, and to respond to them effectively when they do occur. This organization provides structure around global regulatory cooperation. Housed at the Bank for International Settlements (also home to the Basel Accords), it is likely to request information to assess the behavior and stability of financial markets. I expect that most data will be aggregated by country, with the exception of the major global financial institutions, which will be required to report individually.

    For our clients the FSB is yet another recipient of information that they will have to satisfy. What is not clear is how much will be provided by the regulators and how much will have to be provided by the large financial institutions directly. If the burden is on the regulators, then little or no additional work will be required of our clients. If, however, the requirement for information is directed at the company level, our already overtaxed IT clients may find themselves with yet more to do.

  2. The likelihood of hedge funds being regulated has increased. Hedge fund regulation was explicitly addressed in the communiqués. I would expect hedge funds to be subjected to less regulation and fewer disclosure requirements than bank holding companies; most likely they will be covered, at least in the U.S., by a watered-down version of the Investment Act of 1940. Briefly, I would expect disclosure of balance sheets on some periodic basis, and perhaps also client lists (see 3 below), to be required. If possible, hedge funds are likely to rely on outside agents to ensure compliance. Whether those agents will be prime brokers, fund administrators or someone else will depend on the nature of the requirements.

  3. Tax havens will no longer be tolerated. I found it surprising that this was a focus of the meetings, since tax havens have had little to nothing to do with our current situation. I can only assume that many nations are concerned about paying for their stimulus packages and associated activities, and are no longer willing to overlook a source of substantial revenue.

This suggests that financial institutions will have to disclose more information about their clients, in addition to a greater focus on anti-money laundering (AML). Although most financial institutions currently have to comply with AML regulations, I anticipate a re-examination of what is currently required and of how all categories of financial institutions comply. Some tightening of requirements may occur, and additional reporting is likely to be part of any new regulations designed to eliminate tax havens.

Amidst the G-20 display of global cooperation, it was clearly stated in the communiqués and by key participants, Treasury Secretary Geithner included, that financial industry regulation is a sovereign responsibility that cannot be relinquished to a global organization. Therefore I would anticipate that globally agreed-on principles and guidelines will be subject to interpretation when it comes to implementation. This is not unlike the Basel Accords, which are drafted and agreed on by a globally representative committee but put into action at the country level, where they often need to be modified to be consistent with each country's regulations. We will continue to monitor these developments.

Ben Wolkowitz
Headstrong


(This entry was published right after the G-20 meeting earlier this year; however we felt it was worth reviewing what came out of that meeting and how we felt about the outcome.)

Regulation and Compliance : The Headstrong Blog

Headstrong, an IT consulting firm with particular domain expertise in the financial services industry, is going public. Not in the usual sense; we will continue to be privately held, but rather in the sharing-of-observations sense. Our clients are focused on regulatory change and the implications for compliance, and so are we. Since the beginning of the year we have hosted a panel discussion on regulation with industry experts who have more expertise and experience than the usual talking heads, to tell us what to expect and how it will change the way the industry operates. In addition, we have initiated a series of what we call Briefing Notes on Regulation, in which we comment on what we see happening in and to the financial services industry. The feedback we have received has been favorable, and that has inspired us to begin a blog to make these commentaries more accessible. We do this with humility, knowing that there are more commentators out there than is optimal by any measure. For that reason we will post only when we have something to say that has not been said at all, or at least has not been said with the emphasis we believe it deserves.

To give you some idea as to what to expect we are beginning with a few older notes, which we believe still have something to add to the regulatory discussion. These will be distributed over the next few days. We have also included a link to the above-mentioned panel discussion.

To learn more about Headstrong go to http://www.headstrong.com/.

Ben Wolkowitz

Tuesday, March 31, 2009

Overview of Financial Regulation

Regulations are rules enforced by regulators with the primary objectives of ensuring orderly capital markets and protecting investors from unfair and illegal business practices. Regulations come from several sources. The major regulatory changes are most often the result of legislation coming from Congress; the regulators are involved in the deliberations, but they do not act alone. Regulators also have the authority to issue regulations on their own, as long as they are within the context of existing law. Regulatory compliance means conformance with relevant regulations. Breach of these regulations can carry fines, result in loss of reputation and the revocation of the right to be employed in the securities industry, and ultimately lead to the loss of business, civil penalties, etc.

All firms providing financial services are governed by regulations, but the number of regulations governing them and the extent of control, disclosure, etc. is dictated by the main business and activities of the firm. Regulation can be geography-specific (e.g., any investment bank operating in the European Union has to comply with MiFID), client-specific (e.g., the suitability of a client to invest in a particular asset class), or industry-specific (e.g., Basel II, which applies to all bank holding companies). The list of regulations is huge, and so is the list of regulatory fines and legal cases against firms and individuals for non-compliance.