Monday, January 4, 2010

Fraud & GIPS


A couple of recent and related blog posts have addressed the Oppenheimer College Fund fraud investigation. The Money Game actually points to Mish's site, which is one I have listed as a blog I periodically visit.

As Mish pointed out, this fund is alleged to be a hedge fund masquerading as a mutual fund. It appears that the firm also claims compliance with the Global Investment Performance Standards (GIPS(R)). While it isn't yet clear whether or not they were also verified, we nevertheless must be concerned that yet another "GIPS compliant" firm may have committed fraud. The details provided so far don't allow us to determine the extent to which this infraction may have run afoul of any specific GIPS rules (of course, a failure to abide by applicable laws would clearly conflict with GIPS, which requires compliance with them).

What makes this case all the more interesting (and challenging) is that there are actually two firms that carry the name "Oppenheimer," and both are located in New York City. They were apparently related at one time but no longer are. In addition, there are two funds with the same name! Talk about confusing.

I had the opportunity to speak with Mish last week about this topic. He explained (and noted as an addendum to his blog piece) that he pulled some earlier verbiage once he discovered the existence of the second firm.

One might expect these most recent problems to prompt further discussion of a verifier's role in detecting fraud. As has been stated previously, verification isn't designed to detect fraud. Nor would we want that additional requirement placed on verifiers. BUT, we would hope that verifiers would be sensitive to things that just don't look right.

This is one reason we don't conduct remote verifications: we don't believe a verifier can do an appropriate and adequate job by verifying a client from the comfort of the verifier's own offices. Sorry. I've suggested before that the GIPS Verification Subcommittee should at least encourage verifiers to spend a large portion of their time doing the verification in the client's offices. Whether or not this will come to pass is, of course, open to speculation.

6 comments:

  1. Yikes! The use of the Oppenheimer name sounds like a nightmare for the Oppenheimer firm that is not accused of fraud.

  2. Stephen Campisi, CFA (January 5, 2010 at 4:41 AM)

    A very interesting and useful case study in both verification and, more importantly, in fund classification. I think the key issue here is not whether the fund acted as a hedge fund, but rather whether the fund managers took appropriate risks and disclosed those risks to investors. Mish's analysis points out that this was not the case. But the real culprit here is not manager discretion or the use of derivatives. It's not even leverage. The real problem is lack of liquidity. So, why is liquidity still the most ignored aspect of the investment process, even after all of the carnage it caused during 2008?

    I had written in earlier replies that liquidity risk is currently the most serious issue that continues to be left unaddressed by investment managers and performance analysts. Lack of liquidity can be seen as a form of systematic risk and as a driver of significant price decline. Yet, liquidity is not part of any performance models, other than to be seen as a potential driver of the so-called "selection effect." To performance analysts and verifiers, extreme lack of liquidity should be seen as a key risk that must be disclosed. And, when such extreme levels of illiquidity are present, the fund's classification should be changed. You should not be allowed to classify a leveraged, spread-overweight bond strategy using illiquid derivatives as a plain vanilla, long-only core plus bond fund. That's just misrepresentation. But apparently, this happens all the time and the current protections such as GIPS will not necessarily catch this.

    We have just come through a period where it was considered quite stylish to get "cheap beta" through derivatives and use your "active risk budget" to get "pure alpha." This made for a lot of interesting academic articles and conference talks. As a result, the so-called enlightened investors began piling on to all sorts of derivatives and moving from liquid futures (which are easily traded and have the oversight of the exchanges) to the extremely illiquid private arrangements such as swaps and CDOs. In doing so, they added counterparty risk to the illiquidity of these structures. They also added the problem that these risks are highly correlated to each other, so that the investor faced an additional level of credit risk along with the inability to trade what became distressed securities. As a result, investors thought they were getting a more efficient portfolio, but were actually taking on unintended and uncompensated risk. That's how you have such enormous losses in a simple bond portfolio, as in your example. What a mess!

    So the question is: how do we ensure that the true risks of the fund are identified and disclosed? And, how do we ensure that these funds are properly classified? It seems that GIPS provides a lot of guidance on the rather accounting-oriented details for calculating returns, but provides little if anything on the huge issues of risk and fund classification that you have identified in this example. As to the verifiers, what role should they play in all this? Should they be the "early warning device" that something is wrong? Or do we continue to let them simply opine on the credibility of the calculations made from data that the client states is credible and accurate?

  3. Dave, you arrive at a conclusion about on-site versus off-site, or automated, verifications that I don't think can be supported by the facts. The mere presence of a verifier does not elevate the quality of the verification. Verification is all about the data, and an objective, rules-based, automated process/system that can be managed remotely would certainly provide equal or superior results to verifiers combing through data at a client's site. In addition, systematic verifications using a system can handle vast amounts of data and do not have to rely on sampling, which, in itself, increases the validity of the process.

  4. Actually, verification isn't "all about data." It's about procedures and often a need to review records. Since our firm has taken over from firms that DO use automated systems to do their verifications and DO miss things, this only adds further credibility to my suggestion that being on site is a requirement. Does it ALL have to be on site? Perhaps not. But to not even bother to show up? And we wonder why firms are able to commit fraud?

  5. Funny, when I took the CIPM exam, I needed to be present at the exam site and show proper ID in order to verify that I'm truly the person permitted to take the test. I do agree with Dave's comments that a verifier needs to visit the company (just a note, I don't always agree; for example, I don't understand what he wrote in his presentations versus presentation blog post).

    Nancy,
    Data is aggregated by user(s). Rules are created by user(s). Processes are created by user(s). How would a verifier know if the user(s) involved in these activities actually work at the firm? If I were to agree with your comment, then GIPS should also think about moving the exam sites to candidates' homes as well. Hey, it saves money for test takers and test preparers (a win-win for both parties). I'm trustworthy (right?)!

  6. It's always nice to find someone agreeing with me! Good contrast with the CIPM program.

