Tuesday, January 5, 2010

Credit Rating Agencies --- Proposed Regulatory Reform

This is the year in which we are likely to see some of the proposed changes in regulation of the financial industry become law. In a recent white paper I wrote about the principles of reform that I hoped would guide the changes as they apply to rating agencies. They include changing the business model by having investors rather than issuers pay for ratings, reducing the barriers to entry to encourage more competition in the industry, and enhancing accountability so that those wronged by woefully inaccurate ratings would have recourse for compensation. Judging by the current bills (the Senate bill, Title IX, Investor Protection Act of 2009, and the House bill, HR 4173, Accountability and Transparency in Ratings Agency Act), this is not going to happen.

Both bills share the basic operating principle that the business model in the rating agency business is to remain unchanged. The primary reason the rating agencies failed so miserably in their central task is that they served the wrong master: issuers, not investors. Both bills substitute an internal compliance agent, additional reporting, and further disclosure as means of restoring confidence in the work of the rating agencies. No question these proposed reforms will make kowtowing to issuers more difficult, but will they end the practices we have recently seen? The financial motivation to bias ratings toward what issuers want will not change. What will change is that SEC examiners will be collecting data, reviewing codes of ethics and conflict of interest policies, meeting with compliance officers, and reviewing the methods used for assigning ratings. You could fairly characterize the potential situation as greed, represented by highly motivated and clever individuals working for rating agencies, versus examiners who have very recently demonstrated an inability to properly supervise and enforce longstanding, straightforward regulations. I am not optimistic.

Moreover, both bills do nothing to encourage more companies to enter this industry. The barriers to entry that currently exist will be raised a bit higher by additional and potentially costly reporting requirements. The bills also propose that the agencies make public the procedures they use to generate their ratings. Fundamental to the ratings process, when done correctly, are models that predict the financial performance of the company being rated or, in the case of securitized debt, of the underlying collateral. These models presumably represent an important component of a firm's intellectual capital. What potential new entrant would want to risk its IP becoming public because of a reporting requirement? Finally, what particular need would a new agency serve? I could see investors creating demand for agencies with particular expertise or unique, sophisticated methods of assigning ratings. However, with issuers paying for the ratings, I do not see a source of innovation in the credit rating business.

The bills provide for penalties as a deterrent against violating the proposed reforms. The SEC is given rulemaking authority to establish those penalties. Presumably they will be made sufficiently onerous to encourage voluntary compliance with the provisions of the bills. But that leaves unanswered the question of restitution for those victimized by inaccurate ratings. To be fair, that is a legal issue that should be addressed by the courts and not the legislatures. Therefore my evaluation of these bills is based on two of my three previously stated principles: correcting the business model and encouraging greater competition among agencies. On the first principle, I certainly hope that greater disclosure and transparency are sufficient to overcome the bias in the business model, but I remain skeptical. On the second principle I have no doubt. Not only do the bills not encourage additional competition, they discourage it by increasing the barriers to entry, financial and otherwise.

Ben Wolkowitz Headstrong

Monday, November 23, 2009

Webinar: Challenges in Liquidity Risk Management & Reporting

Over the years, regulators have tried to manage liquidity risk by issuing guidelines and recommendations. These guidelines were qualitative in nature, and a great deal was left to firms' discretion. As regulatory bodies prepare to revamp the financial regulatory structure, liquidity risk is one area that is gathering a lot of attention. In the aftermath of the current crisis, the FSA has laid down a new set of rules for liquidity risk management.

While the Basel Committee on Banking Supervision has issued principles and guidelines for liquidity risk management, the FSA has completely overhauled its liquidity framework, which was finalized and presented on 9 October 2009. The far-reaching overhaul, designed to enhance firms' liquidity risk management practices, is based on the lessons learned since the start of the credit crisis in 2007.

Aleri, a leading software company with unique CEP technology, joins with Headstrong, the securities industry's leading IT services firm, to review the FSA's liquidity reporting requirements and the unique IT and business challenges they pose to the financial services industry.


For more information and to register for the event on 2 December 2009, click here.

Wednesday, November 4, 2009

Stress Tests and Credit Ratings: An Update




To avoid any confusion, the title refers to two largely unrelated aspects of financial regulation. I have written about these topics before, and there has been enough going on, or depending on your perspective perhaps not enough, to merit revisiting them.

As I am sure you recall, earlier this year, when the banking community was slogging its way out of a crisis, the regulators determined that the best way to decide who needed more capital among the nineteen companies that benefited from the Government's largesse was to run stress tests. This wasn't anything new. Financial institutions, at least in theory, had made stress testing a part of their risk management procedures. Unfortunately, the recent crisis proved that stress testing had, to put it kindly, been found lacking.

In fairness, the concept is excellent, but the execution can be challenging. Complex questions need to be addressed realistically before such tests can commence: for example, what scenarios will be used, how will the various interrelationships among portfolios and among clients be addressed, and how will changes in correlations among financial instruments be estimated? (An excellent paper was produced on the subject by the Basel Committee on Banking Supervision, "Principles for sound stress testing practices and supervision", May 2009.) In addition, the results of such tests have to be at the core of risk management procedures and of discussions with senior management and Boards of Directors as well as with regulators.

Stress testing may well have failed to prevent the crisis because the results were not actively incorporated in bank decision making. However, it is also likely that the failure is attributable to the construction of the tests and the assumptions built into them. Presumably regulators addressed these issues before basing capital decisions, at a very critical time, on the results. Moreover, these fixes will likely become part of standard testing procedure. If my assumptions are correct, then wouldn't it be confidence raising for the financial system and the soundness of individual institutions to know how these tests are conducted? Regulators appear to believe that transparency is good for the efficient functioning of markets. A little more transparency on the regulatory side would be helpful as well.

Now let's consider credit ratings. Two recent news items provide some insight into how rating agencies are being held accountable by the market. One item involves the National Association of Insurance Commissioners, which is considering hiring a firm (or firms) to assess the risk of insurance company investments, with particular emphasis initially on mortgage-backed securities. To put it kindly, the Commissioners appear to have no faith in rating agencies, which is understandable. If this effort is successful, what other regulatory bodies will begin to look to new sources for credit rating information, and what asset classes will join MBS in the process?

The other news item reflects on the value of ratings from the issuer and investor perspectives. What do Highland Capital Management LP, Heineken NV, Gruppo Campari, Credit Suisse Group AG, and Dubai have in common? They have all recently and successfully issued either bonds or complex structured securities without credit ratings. That's correct: no rating agencies were involved, and no ratings were announced as part of these deals. Investors apparently did their own research or relied on sources other than the traditional rating agencies to make credit decisions.

It may be premature to conclude that the big three rating agencies have lost their lock on this vital market function, but unassisted they have managed to reduce the barriers to entry to their own business. The market, as it should, is responding.

Ben Wolkowitz, Headstrong November 2, 2009

Wednesday, October 14, 2009

Regulating Derivatives: The Case for Incentives

Increased regulation of derivatives is coming. At least that was the consensus view of the participants at a recent breakfast discussion I attended, "Regulating Derivatives in the Wake of the Financial Crisis", hosted by FinReg21. The discussion was led by former SEC Commissioner Edward H. Fleischman, currently Chair of the International Securities Regulation Committee of the International Law Association, and Stephen Figlewski, Professor of Finance at the NYU Stern School of Business and editor of the Journal of Derivatives. The participants included a representative cross section of the legal and business communities involved in these instruments.

This isn't a surprising conclusion given that there are submitted bills and bills in formation in Congress, as well as a Treasury plan on the table. Although these competing plans overlap, they differ significantly on several key provisions. In no particular order, here is my take on the key issues in this discussion.

  1. Derivatives should only be used for hedging purposes. Translated, that means derivative trading will end, at least in this country. If only hedging is permitted, then every hedger will be put in the difficult position of finding another hedger with the opposite exposure. Speculators serve a real economic purpose in providing liquidity and making markets more efficient. There are plenty of orderly, stable markets rife with speculators; take the cash market for U.S. Government bonds, or futures, for example, which gets me to point 2.
  2. Most if not all derivatives should be exchange traded. This is one way of addressing the transparency issue. Exchange-traded instruments are transparent, and OTC derivatives for the most part are not; therefore, requiring OTC derivatives to be exchange traded will make them transparent. Logical but impractical. The popularity of OTC derivatives is at least in part due to the market's capacity to tailor instruments to reflect precise risk exposures. Tailored instruments do not succeed on exchanges because the number of participants interested in trading the same tailored exposure is likely to be quite small, certainly not enough to justify an exchange listing. Therefore some exclusion has to be made for tailored derivatives; however, implementing that exclusion is not so easy, as will be discussed below.
  3. Derivatives should be cleared by a centralized facility (or facilities). This is a solid idea. Besides promoting transparency, central clearing reduces counterparty risk, assuming of course that the clearing agents are sufficiently well capitalized to actually guarantee the trades they settle. Confidence in the financial cushion provided by centralized clearing would benefit from having more than one agent; how many is optimal is a discussion for another time.
  4. Derivatives should be standardized (at least as much as is practical). Standardization facilitates exchange trading, but since exchange trading is not really necessary for greater transparency and is not a realistic objective for the entire market, who cares. Standardization would also clearly facilitate central clearing, and about this I do care. Although a non-standard instrument could be centrally cleared, it does not follow that all centrally cleared instruments could be non-standard; that would likely place too great a burden on the clearing agents.
  5. Given that there is benefit to standardization, how would it be implemented or enforced? Would teams of attorneys lay out in precise detail the terms and conditions of each standardized derivative? And even if they did, how long would it take to re-engineer standardized derivatives and make them non-standard? Why, you might ask, would anyone favor non-standardized derivatives? See point 6.
  6. The answer is money. Opaque markets are more profitable than transparent markets, and therefore dealers will have a bias against transparency. Many recent innovations in fixed income markets are in part explainable as an ongoing march to create the next successful opaque, and highly profitable, market. OTC derivatives are a relatively recent manifestation of that motivation. There are OTC derivatives currently traded that appear to generate sufficient volume to support exchange trading, but there has been no groundswell of support from dealers to list them. Although I am not in favor of exchange trading requirements, as discussed above, centralized clearing, which I do favor, will also benefit from standardization. The question then is how to promote centralized clearing.
  7. I prefer incentives to mandates. Capital requirements can be used to encourage market participants to clear their derivative trades in a centralized facility: make it more expensive to keep trades away from such facilities by specifying greater required capital for OTC derivatives that are not centrally cleared. Dealers are then incented to make their trades conform to the standards required by a centralized clearing agent. Only when the cost of clearing is so great that it approaches the magnitude of the capital charge would a dealer be inclined to favor direct rather than central clearing (a back-of-the-envelope sketch of this trade-off follows the list). That situation is likely to occur for derivatives that must be tailored to work and are sufficiently different that clearing them is complex. Further, the cost of direct clearing would have to be significantly lower to compensate for the difference in capital requirements.
  8. Centralized clearing is not my idea. It has floated through a number of proposals, including the one authored by the Administration, and academics and analysts have also gotten behind it. The hard part is how to set the differential capital requirements. Set them too high and the idea is a non-starter; set them too low and there is no incentive to move to central clearing. Finding the Goldilocks levels for capital requirements is complex and possibly even intractable. Some cajoling by regulatory authorities, in addition to capital requirements, may be needed to ensure success. Regardless, the more we can rely on objective capital requirements that provide incentives to encourage centralized clearing for the vast majority of, if not all, derivatives, so much the better.
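
To make the trade-off in point 7 concrete, here is a back-of-the-envelope sketch of how the incentive could work; every number in it (capital percentages, funding rate, clearing fees, notional) is a made-up assumption for illustration, not a proposal.

```python
def annual_cost(notional, capital_charge_pct, funding_rate, clearing_fee):
    """Rough annual cost of carrying a derivative trade one way or the other:
    the capital tied up by the charge has to be funded, plus clearing fees."""
    return notional * capital_charge_pct * funding_rate + clearing_fee

notional = 100_000_000   # $100 million notional (illustrative)
funding_rate = 0.10      # assumed cost of the equity capital that is tied up

# Hypothetical capital charges: higher for trades kept away from central clearing.
bilateral = annual_cost(notional, capital_charge_pct=0.04,
                        funding_rate=funding_rate, clearing_fee=25_000)
central = annual_cost(notional, capital_charge_pct=0.01,
                      funding_rate=funding_rate, clearing_fee=150_000)

print(f"bilateral: ${bilateral:,.0f}   centrally cleared: ${central:,.0f}")
# With these assumed numbers the centrally cleared route is cheaper, so the dealer
# standardizes the trade; only when clearing costs approach the size of the capital
# differential does keeping the trade bilateral win out.
```
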
Ben Wolkowitz Headstrong September 27, 2009

Wednesday, September 9, 2009

Know Your Customer: Driving Value through Trusted Data - A Breakfast Seminar

In just one morning, get invaluable updates on strategies and technologies you can use to better understand and protect your most valuable assets - your data and your customers.
  • Achieve a 360-degree view of the customer by improving the quality of your master data
  • Reduce risk with complete, accurate, consistent, auditable and secure data
  • Establish new levels of customer trust by protecting data in your development and testing environments
  • Improve your fraud investigation efforts through accurate matching, even in the face of deliberate deception
The seminar is co-hosted by Headstrong, a global financial technology consultancy with cutting-edge expertise in the financial sector, and Informatica, a global leader in data integration.

About the speakers
Michael Destein, Director of Solutions Marketing for MDM at Informatica, has spent his career focusing on data access, data integration, and data management. In presales, product management, and product marketing roles at Borland, Active Software, webMethods, and Siperian, he has learned best practices and applies them to achieve business value.

Susan Palm, Headstrong, has a strong background in Risk Management and Compliance from her 25+ year career as Senior Vice President of the Technology Information Group at Wells Fargo and Company.

Click here for more information.
Click here to register for this free event.

Thursday, September 3, 2009

Client Identity Protection & Data Masking

Why Financial Firms need Data Masking, now more than ever!

Risk managers struggle to define what operational risk is, let alone build models for Operational Risk Management (ORM), and rightly so, because operational risk is very broad. Some define operational risk simply as "all risk other than market and credit risk."

According to the Basel Committee, "Operational risk is defined as the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. This definition includes legal risk, but excludes strategic and reputational risk."

So trade failures, non-compliance with regulations, loss of data due to natural calamity, loss of key personnel, lawsuits, and so on are all examples of operational risk. Thus, while firms are aware (most of the time) of their exposure to market and credit risk, the same cannot be said for operational risk. How does one estimate the probability and severity of a failed trade, or of data theft?

One specific area of operational risk management that is making headlines today is sensitive data protection, or, to be precise, client identity protection. Although regulations around this issue already exist, in the aftermath of the sub-prime crisis regulators are raising it with much more emphasis. One of the main reasons for their concern is the use of production data in non-production environments such as the development or testing of IT applications. This article reviews the various means of protecting such data and suggests guidelines for effective data protection policies.


The Regulations
  1. The Gramm-Leach-Bliley Act: Offers Privacy and Safeguards Rules to protect personal information held by U.S. financial institutions. The Privacy Rule speaks largely to information collection and sharing – with respect to data exposure, this rule mandates that certain information, such as account numbers, cannot be shared with third parties. The Safeguards Rule speaks more to protecting information.

  2. The Identity Theft Red Flags Rules: The final rules require each financial institution and creditor that holds any consumer account, or other account for which there is a reasonably foreseeable risk of identity theft, to develop and implement an Identity Theft Prevention Program (Program) for combating identity theft in connection with new and existing accounts. The Program must include reasonable policies and procedures for detecting, preventing, and mitigating identity theft and enable a financial institution or creditor to 1) identify relevant patterns, practices, and specific forms of activity that are "red flags" signaling possible identity theft and incorporate those red flags into the Program; 2) detect red flags that have been incorporated into the Program; 3) respond appropriately to any red flags that are detected to prevent and mitigate identity theft; and 4) ensure the Program is updated periodically to reflect changes in risks from identity theft.

  3. PCI DSS: The Payment Card Industry Data Security Standard is a set of requirements for securing payment account data. The PCI DSS affects all the companies which handle payment card data, which are myriad. The requirements are straightforward, and include “protect stored cardholder data” and “restrict access to cardholder data by business need-to-know”.

  4. OCC BULLETIN 2008-16: This bulletin reminds national banks and their technology service providers that application security is an important component of their information security program. All applications, whether internally developed, vendor-acquired, or contracted for, should be subject to appropriate security risk assessment and mitigation processes. Vulnerabilities in applications increase operational and reputation risk as unplanned or unknown weaknesses may compromise the confidentiality, availability, and integrity of data.

Of these, the last two are specific to the financial services industry. One study suggests that these firms are responsible for protecting almost 85% of their entire data.

The Cost of Data Theft

Below are a few numbers on data loss and theft.

  1. Since 2005, over 250 million customer records containing sensitive information have been lost or stolen. (Privacy Rights Clearinghouse)

  2. "The 2008 breach report revealed 656 reported breaches at the end of 2008, reflecting an increase of 47% over last year's total of 446." (Identity Theft Resource Center, 2009)

  3. "62% [of respondents] use live data for testing of applications and 62% of respondents use live data for software development." (Ponemon Institute, December 2007)

  4. The average cost of a data breach has risen to $202 per customer record, which translates to roughly USD 20 million per 100,000 records. (Ponemon Institute, 2009)

  5. Recently, the credit card processor associated with the TJX data breach was fined $880,000 for failing to meet the PCI DSS standard. In the same incident, TJX paid a $40.9 million settlement to Visa.

Many firms have settled lawsuits for millions of dollars. Note that although one can measure the cost of fines and lawsuits, the cost of lost reputation and customer trust is much harder to measure. Needless to say, data protection is a serious issue and is becoming a bigger concern with each passing year, as the Ponemon Institute studies make evident.

As discussed above, one of the main concerns of regulators pertains to the use of production or live data in non-production environments. Firms often prepare test beds for testing various IT applications and products before deploying them. Since testing applications requires "real-like" data, almost 62% of firms use production data to test these applications. This poses serious risks of identity and data theft. Firms take great pains to ensure the safety of their live or production data, but somehow the same standards are not applied when that data is copied to non-production environments.

What is Data Masking?

Data masking is the process of obscuring (masking) specific data elements within data stores. It ensures that sensitive data is replaced with realistic but not real data. The goal is that sensitive customer information is not available outside the authorized environment. Data masking is typically done while provisioning non-production environments, so that copies created to support test and development processes do not expose sensitive information. Masking algorithms are designed to be repeatable so that referential integrity is maintained.
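
To make the repeatability point concrete, here is a minimal Python sketch (my own illustration, not from the article) of deterministic masking: the same real value always maps to the same masked value, so a customer ID masked in one table still joins to the same masked ID in another. The HMAC approach, the secret key, and the field names are assumptions for the example.

```python
import hmac
import hashlib

SECRET_KEY = b"masking-key-stored-outside-source-control"  # illustrative placeholder

def mask_customer_id(customer_id):
    """Deterministically map a real customer ID to a masked one.

    The same input always yields the same output, so joins between masked
    tables still line up (referential integrity), but the real ID cannot be
    read back from the masked value.
    """
    digest = hmac.new(SECRET_KEY, customer_id.encode("utf-8"), hashlib.sha256).hexdigest()
    return "CUST-" + digest[:10].upper()

accounts = [{"customer_id": "C100234", "balance": 1520.75}]
trades = [{"customer_id": "C100234", "symbol": "IBM", "qty": 100}]

masked_accounts = [dict(r, customer_id=mask_customer_id(r["customer_id"])) for r in accounts]
masked_trades = [dict(r, customer_id=mask_customer_id(r["customer_id"])) for r in trades]

# The same real ID masks to the same value wherever it appears.
assert masked_accounts[0]["customer_id"] == masked_trades[0]["customer_id"]
```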


Differences Between Encryption and Masking

Encryption is appropriate when you want only people with the right "keys" to view the data. But encrypted data loses all of its original properties, so it cannot be used by developers and QA professionals who need "real-like" data for testing applications. In contrast, data masking prevents abuse while ensuring that the properties of the data remain as they are in the production environment.
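
A small illustration of the difference, assuming the third-party cryptography package for the encryption half; the phone number and the masking rule are made up for the example.

```python
import random
from cryptography.fernet import Fernet  # third-party: pip install cryptography

phone = "212-555-0187"

# Encryption: only holders of the key can recover the value, but the ciphertext
# no longer looks like a phone number, so format checks and test screens break.
f = Fernet(Fernet.generate_key())
encrypted = f.encrypt(phone.encode()).decode()
print(encrypted)   # opaque token, wrong length and format

# Masking: the value is replaced with realistic but fake data of the same shape,
# so developers and QA can still exercise validation, sorting, and reports.
masked = "-".join("".join(random.choice("0123456789") for _ in range(n)) for n in (3, 3, 4))
print(masked)      # e.g. 483-219-7046 -- still looks like a phone number
```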


Different Methods For Data Masking

Following is a brief discussion of the various methods used for data masking, including when each method should be used.

Nulling

  • Deleting a column of data by replacing it with NULL values
  • Useful in role-based access, when you do not want to reveal the data at all
  • Cannot be used in testing environments, as the data properties are lost
  • NULLing on its own is not really a masking technique, but it is used alongside other methods, e.g. credit card numbers masked as 4234-XXXX-XXXX-6565
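
To illustrate, here is a minimal Python sketch (my own, with made-up column names and card numbers) of full NULLing and of the partial "XXXX" masking mentioned in the last bullet:

```python
def null_column(rows, column):
    """Replace every value in the given column with None (NULL)."""
    return [dict(row, **{column: None}) for row in rows]

def mask_card_number(card):
    """Keep the first and last groups of a card number and X out the middle,
    e.g. '4234-1111-2222-6565' -> '4234-XXXX-XXXX-6565'."""
    groups = card.split("-")
    return "-".join(g if i in (0, len(groups) - 1) else "X" * len(g)
                    for i, g in enumerate(groups))

rows = [{"name": "J. Q. Public", "card": "4234-1111-2222-6565"}]
print(null_column(rows, "name"))          # the name column is gone entirely
print(mask_card_number(rows[0]["card"]))  # 4234-XXXX-XXXX-6565
```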

Substitution

  • Randomly replacing the contents of a column of data with information that looks similar but is completely unrelated to the real details
  • Preserves the data properties
  • Since there is no logic or relationship involved (unlike ageing or reordering), a large volume of random substitute data has to be stored
  • Finding the required random data and developing the procedures to accomplish the substitution can be a major effort
  • Generating large volumes of random "real-like" data is difficult in some cases, e.g. street addresses
  • Useful for generic data such as names, addresses, and numeric fields with no intrinsic properties (credit card prefixes and suffixes, etc.)
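
A minimal substitution sketch (my own illustration; a real substitution set would be far larger than these few hard-coded values):

```python
import random

# Tiny illustrative lookup tables; real substitution tables would be much larger.
FIRST_NAMES = ["Alice", "Rahul", "Maria", "Chen", "Omar"]
LAST_NAMES = ["Smith", "Patel", "Garcia", "Lee", "Khan"]
STREETS = ["Maple St", "Oak Ave", "Pine Rd", "Cedar Ln"]

def substitute_row(row):
    """Replace identifying fields with realistic but unrelated values,
    leaving the non-sensitive fields (here, the balance) untouched."""
    return dict(
        row,
        name=random.choice(FIRST_NAMES) + " " + random.choice(LAST_NAMES),
        address=str(random.randint(1, 999)) + " " + random.choice(STREETS),
    )

customers = [{"name": "John Q. Public", "address": "12 Wall St", "balance": 8200.00}]
print([substitute_row(c) for c in customers])  # identities replaced, balances intact
```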

Shuffling/Reorder

  • The data in a column is randomly moved between rows until there is no longer any reasonable correlation with the remaining information in each row
  • Since the data is only jumbled, the end user still has access to the full set of values and can run some meaningful queries against it
  • Shuffling algorithms fail if they are simple and can be easily decoded
  • It is useful only on large amounts of data
  • It should be used along with ageing, variance, and similar techniques, which shuffle and also increase or decrease the data by some percentage
  • On the plus side, this is one of the easiest and fastest ways to mask data
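
A minimal shuffling sketch (my own illustration with a made-up salary column); note that on a data set this small the shuffle would be trivial to reverse, which is exactly the weakness flagged above:

```python
import random

def shuffle_column(rows, column, seed=None):
    """Randomly reassign one column's values across rows: the set of values
    (and any aggregate statistics on it) is preserved, but each value is no
    longer attached to its original row."""
    values = [row[column] for row in rows]
    random.Random(seed).shuffle(values)
    return [dict(row, **{column: v}) for row, v in zip(rows, values)]

rows = [
    {"customer": "A", "salary": 50_000},
    {"customer": "B", "salary": 90_000},
    {"customer": "C", "salary": 120_000},
]
print(shuffle_column(rows, "salary", seed=7))  # same salaries, different owners
```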

Numeric Alteration

  • Increase or decrease numeric values by a percentage
  • The percentage can be fixed or random, but it is chosen so that the data stays within permissible or probable values
  • It is generally used in combination with other techniques rather than in isolation
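
A minimal sketch of numeric alteration (my own; the 10% band and the balances are arbitrary assumptions):

```python
import random

def vary_numeric(value, max_pct=0.10, rng=random):
    """Shift a numeric value up or down by a random percentage (here up to 10%),
    keeping it within a plausible range while hiding the true figure."""
    factor = 1 + rng.uniform(-max_pct, max_pct)
    return round(value * factor, 2)

balances = [10_500.00, 250_000.00, 73.25]
print([vary_numeric(b) for b in balances])  # close to, but not equal to, the originals
```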

Gibberish Generation

  • Given any input, the generator produces output that is random but has the same statistical distribution of characters or combinations of characters (a character may be a letter, a digit, a space, a punctuation mark, etc.)
  • In level 1 gibberish, the output has the same distribution of single characters as the input. For example, the probability of seeing characters like "e" or "z" or "." will be approximately the same in the output as in the input. In level 2 gibberish, the output has the same distribution of character pairs as the input. For example, the probability of seeing a pair like "th" or "te" or "t." will be approximately the same in the output as in the input.
  • In general, in level n gibberish, the output has the same distribution of groups of n characters (n-tuples) as the input.
  • Just like encryption, it renders the data meaningless
  • It should be used with role-based access to data, i.e. when you want to selectively shield data based on the role or purpose of the individual
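
A minimal sketch of level 1 gibberish generation as described above (my own illustration); level 2 would be built the same way from counts of character pairs rather than single characters:

```python
import random
from collections import Counter

def level1_gibberish(text, length=None, seed=None):
    """Generate 'level 1' gibberish: random output whose single-character
    distribution matches the input's (so 'e', spaces, digits, etc. appear
    about as often as they do in the original)."""
    rng = random.Random(seed)
    counts = Counter(text)
    chars = list(counts)
    weights = [counts[c] for c in chars]
    size = length if length is not None else len(text)
    return "".join(rng.choices(chars, weights=weights, k=size))

memo = "Please wire the settlement amount to the client account by Friday."
print(level1_gibberish(memo, seed=1))  # unreadable, but statistically similar
```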

Currently there are two main approaches to data masking in non-production environments:

  1. EML (Extract, Mask, and Load): Data is extracted from the production database, masked, and then loaded into the pre-production server. This approach is useful when loading large amounts of data.

  2. IPM (In-Place Masking): Data is loaded directly into the non-production database, and specific columns are masked there before the data is released to QA and developers. This is useful when the data volume is smaller and the sensitive data to protect is well defined.

By masking data, one can ensure that it retains its properties and can still be used for analytical purposes. Also, given that almost 70% of all data thefts are internal, firms need to employ some form of data masking within the organization to guard against internal threats.
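
As an illustration of the EML approach in point 1 above, here is a minimal end-to-end sketch using in-memory SQLite databases as stand-ins for the production and pre-production servers; the table, columns, and masking rules are assumptions for the example:

```python
import sqlite3

def extract(conn):
    """Extract the rows to be provisioned from the production database."""
    return conn.execute("SELECT id, name, card FROM customers").fetchall()

def mask(rows):
    """Apply the masking rules before the data reaches pre-production."""
    return [(cid, "Customer " + str(cid), card[:4] + "-XXXX-XXXX-" + card[-4:])
            for cid, name, card in rows]

def load(conn, rows):
    """Load the already-masked rows into the pre-production database."""
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, card TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)

# In-memory stand-ins for the production and pre-production databases.
prod = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE customers (id INTEGER, name TEXT, card TEXT)")
prod.execute("INSERT INTO customers VALUES (1, 'John Q. Public', '4234-1111-2222-6565')")

preprod = sqlite3.connect(":memory:")
load(preprod, mask(extract(prod)))  # Extract -> Mask -> Load
print(preprod.execute("SELECT * FROM customers").fetchall())
# [(1, 'Customer 1', '4234-XXXX-XXXX-6565')]
```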

Guidelines for Effective Data Masking

Start with a small number of applications. As discussed above, financial firms need to protect almost 85% of their data; if a firm can protect an additional 15% of that data each year, it is a good achievement.

  1. Understand that, as discussed under the various methods above, one method alone is generally not sufficient for data protection; each has its pros and cons.
  2. Ideally, data masking should be combined with role-based access to data to provide a double layer of protection.
  3. The IT side is just one aspect of data masking. The first thing firms should do is define a firm-wide data protection policy; without defining KRIs and effective ways to measure them, no operational risk initiative can succeed.
  4. To save cost, firms should consider third-party software vendors for IT products and solutions around data masking, while keeping the policies and procedures in-house.

Abhishek Dhall, Headstrong, August 2009

Wednesday, August 26, 2009

Regulatory Reform - An Update and Assessment

With Congress in summer recess, it is an appropriate time to assess what has happened with this once-in-a-lifetime opportunity to reform financial regulation. It is enticing to say "not much" and end this note now, but I can't resist the opportunity to bore you a bit with my review of recent events and a cautionary note as we go forward.

In reality, not much has been accomplished in terms of actual reform, although a substantial number of proposals remain on the table, including the Administration's, which is comprehensive in approach. Congress has institutionalized bank compensation reviews and oversight, although it is unclear precisely how this will work and what in fact it will accomplish. Stockholders will also now be given the right to vote on compensation packages for key executives, although such votes are non-binding. There must be some logic to that little piece of meaningless reform, but I am hard pressed to tell you what it is.

The more substantial proposals have gotten bottled up for several different reasons. My award for the most colorful moment in a not particularly colorful process was when the heads of the major regulatory agencies brought their turf war to a Congressional hearing. It was reported that the generally unflappable Secretary of the Treasury was less than subtle in voicing his dissatisfaction to the aforementioned group after that fiasco. Can’t say I blame him although it is a bit unusual for an Administration official to address the heads of supposedly independent regulatory agencies in such a way.

This lack of tangible accomplishments doesn't mean the game is over. When Congress returns, financial industry regulatory reform is likely to get attention, assuming health care insurance doesn't become completely overwhelming. What I find concerning at this stage is the likelihood of compromise resulting in costly regulation without attendant benefits adequate to justify the changes. We have had a hint of that already. The SEC seems serious about adding to the industry's data collection burden by requiring that information on activity in over-the-counter derivatives be provided periodically. This information will then be made public, but only in aggregated form and with a one-month lag. I suppose in about ten years that may generate sufficient data points to support the empirical part of a Ph.D. dissertation, but I cannot see what else will come of this additional collection burden. Will SEC staff be examining the data as it comes in, and if so, what will they be looking for? How will that differ from what the examiners already have on their list? You get the point: financial institutions will be adding to their costs with no apparent associated benefits.

The aspect of regulation that seems to get little to no attention is that compliance is not free; in fact, it can be very expensive. IT costs can escalate quickly when firms have to re-engineer processes to comply effectively. Obviously there is no increase in revenue associated with these regulatory activities, which puts pressure on banks to find sources of additional revenue to counter the added costs. A particularly appealing source is fees. The highly visible recent changes in credit card terms are at least in part a response to an increase in the cost of doing business. (The other part is the curtailment of profitable, albeit risky, activities, either because of regulation and/or management response to large losses.) There are other knock-on effects worthy of further discussion, perhaps in a future note. For now we can agree that when bank costs are increased by added regulation and compliance, at least some of those costs are likely to be passed on to the banks' customers. My cautionary note is that regulators should take these costs into consideration when reforming regulation, to ensure that what we are paying for is worth the price.

Ben Wolkowitz Headstrong August 25, 2009