Friday, September 30, 2011

Risk management gone amuck

By now you're aware of Washington's funding of Solyndra, the now bankrupt energy company that received something north of a half billion dollars in government loans. In Wednesday's WSJ, Deborah Solomon had an article ("Solyndra Said to Have Violated Terms of Loan") which included the following passage: "In a March 2011 report, [the Energy Department's inspector general, Gregory Friedman] said his office 'found that the Loan Guarantee Program could not always readily demonstrate, through systematically organized records, including contemporaneous notes, how it resolved or mitigated relevant risks prior to granting loan guarantees.'"

Interesting, isn't it, how the federal government, which readily criticized the investment industry's risk controls and management, was able to grant such a huge loan shortly before the recipient declared bankruptcy? And while it may do our hearts good to see that "the shoe is on the other foot," or to feel better because "misery loves company" (sorry for the inclusion of overused clichés), the reality is that the millions of dollars given away belong to the taxpayers, who lose in the end. Perhaps it's better to use this example as a way to reflect upon our own risk controls, to ensure that they're sufficiently tight. While we may not all be in the business of granting loans (although anyone who buys bonds is, of course, doing just that), we all face risks in our investments. Too many managers have no risk policies at all; surely something can be implemented to demonstrate an awareness of the risks being taken and the controls in place to minimize them.

Thursday, September 29, 2011

It's not what you say,...

Pollster and word guru Frank Luntz is known for his admonition, "it's not what you say; it's what people hear." This rang true for me this week.

I did a talk on Tuesday at the New York Society of Security Analysts (NYSSA) titled "Performance Measurement & Alternative Products: What's different?" As you might suspect, I addressed hedge funds, private equity, and real estate. I pointed out that for illiquid assets (e.g., private equity and real estate, though this can apply to certain hedge fund assets, too), valuations can be quite difficult, which calls into question the accuracy of the returns that depend on them.

One young woman mentioned something to the effect that private equity managers aren't "all cowboys" when it comes to pricing; that established rules are followed for pricing. I agreed, and cited the three-level hierarchy that the GIPS(R) (Global Investment Performance Standards) standards have for such pricing. But this doesn't mean that the valuations are solid.

Case in point (okay, a slight diversion, but hopefully you'll see the connection): in yesterday's WSJ, Joseph B. White and Nick Timiraos mentioned how Detroit is witnessing a jump in housing prices ("Housing Revs Up in Detroit.")  The very last paragraph referenced a realtor who cited a house that had just gone under contract for $80,000, despite having been appraised at $30,000! In my talk I mentioned how (a) prices are often subjective, (b) appraisers can use different methods (and thus could come up with different valuations), and (c) valuations are often done without the benefit of transactions. What better example? An asset undervalued by more than 50%!

I never intended (thus my Luntz quote) for anyone to take away that I was suggesting that pricing of private equity and real estate assets is anything but formal, standardized, and done with the greatest care. But, we all know how pricing can be suspect. I'll try harder not to offend or mislead.

Wednesday, September 28, 2011

Taking the CIPM exam? Want some help?

Our next webcast has been announced: John Simpson, CIPM will conduct a two-hour session to help test takers prepare for their upcoming CIPM (Certificate in Investment Performance Measurement) exam (whether you're taking the Principles or Expert level test).

It will be this coming Monday, October 3, from noon until 2 PM (ET). If (a) you attended one of John's prep classes, (b) your firm is a member of the Performance Measurement Forum, or (c) your firm is one of our verification clients, then it's free! For everyone else, our normal webinar fee has been reduced to make it easier for you to benefit from John's expertise. Not able to attend but still want to hear what's discussed? You can obtain a copy of the session!

Questions? Contact Patrick Fowler (732-873-5700).

John Simpson developed the class and has taught it numerous times. He also teaches our other courses, and is highly regarded for his teaching skills. We're sure you'll find this program of value. Stumped by something? Have some questions you'd like to pose? Or, just curious what others are thinking? You'll no doubt enjoy and benefit from this session.

More on composite returns

Brian Chapman (of KPMG, London) reminded me that the AIMR-PPS(R)'s view of composite return is that it's "a single value that reflects the overall performance (the 'central tendency') of the set. The objective in reporting the returns of composites is to use a method for reporting the composite return that will give the same value achieved if the composite were treated as one master portfolio. That is, the value being calculated is the same value that would result if all of the assets and transactions of the individual portfolios/classes were combined and the return were computed using the procedures discussed earlier." [page 27 of the '93 version] Actually, I hadn't so much forgotten this as been unable to locate a definition in the '93 or '97 editions of the AIMR-PPS (neither of their indexes provides easy passage to what Brian located, and I wasn't as diligent as he in trying to find it).

When I taught classes for the CFA Institute (and prior to that, AIMR) on the standards (AIMR-PPS and GIPS(R)) the explanations regarding the math more often than not fell to me, and I would explain that asset weighting is used so that the return looks like it's coming from a single portfolio. I guess I hadn't really given this explanation a whole lot of thought: I had first heard it in 1992, when a debate was occurring on this subject, with the ICAA and IMCA challenging the approach that AIMR was implementing. The arguments against asset weighting were that it would cause larger accounts to overly influence the results; however, with the aggregate method we don't actually see this, since we end up with a mix of all accounts' holdings tossed together, with no reference or link to their source.

Further research is in order to understand "why" AIMR (and then the CFA Institute, and arguably now the GIPS Executive Committee, by default) would favor this approach. Is the blending of assets from a variety of accounts truly what we want?

It's somewhat ironic, I think, that the only method AIMR came up with in their '93 edition fails at achieving the definition they laid out: it provides an asset-weighted average of returns (what IMCA and the ICAA objected to), not a result which truly represents the composite as if it were a single portfolio (though some would argue that it is an approximation of this).

If by now you're growing tired of this topic, I apologize. But I happen to find this somewhat fascinating.

Tuesday, September 27, 2011

Comment in a courteous manner, please...

This blog has been around for more than two years, and I thank those who have taken the time to follow it, post comments and critiques, and contribute to its success. I always appreciate when someone takes the time to comment on the blog. Perhaps there haven’t been as many as I would have hoped for or expected, but still a fair number have come in.

Many have commented anonymously, which until recently has been fine. But over the past few weeks I’ve been sent a couple comments which either bordered on rudeness or were patronizing: they didn’t get posted. While I appreciate all comments submitted, I will not publish rude or arrogant ones. The blog is a medium for information and education for the industry. Critiques and comments are welcome when presented in a professional and respectful manner. While I embrace the new social media, and the freedom of information and voice it provides us, I am also a firm believer in professionalism, courtesy and etiquette.

Have an opinion that differs from mine? Think that I made an error or that I’m wrong? I am more than happy to post it. Case in point: Brian Chapman sent me a comment correcting something I had written relating to what the GIPS return means. It got posted, and I appreciated him sharing this. In fact, it resulted in another blog post.

Be professional and it will be posted. Going forward, I will avoid publishing anonymous submissions: please identify yourself. If you can’t do this for some reason, then send me an email along with the post, and I’ll consider releasing it. Thanks!

Monday, September 26, 2011

When the stats don't add up

I am a very big fan of Michael Lewis, beginning with his best seller Liar's Poker. I was advised to read Moneyball, and as with his other books, found it very well written. And given my lifelong love of baseball, found it quite intriguing: the idea that the Oakland A's could assemble a team of low priced players and meet with great success. The book has been transformed into a movie, which opened this past Friday.

In last Thursday's Wall Street Journal Allen Barra wrote a piece titled "The 'Moneyball' Myth," which questions the conclusions drawn by Lewis and others, that by focusing on certain statistics rather than others, success could be realized; that teams didn't necessarily have to spend a bundle.

Thus, Barra was questioning Lewis' attribution analysis, was he not? Were the conclusions drawn, based upon the story told by the team's general manager, appropriate? Barra shares information that causes one to wonder.

Statistics are curious things, are they not? As the saying (credited to Britain's former Prime Minister Disraeli, and made famous by Mark Twain) goes, "there are lies, damned lies, and statistics." We can get statistics to prove or disprove just about anything. In the world of performance measurement, we have loads of measures, some of which compete with one another (money- versus time-weighting; equal- versus asset-weighting; geometric versus arithmetic), and which produce differing results.

One must be willing to step back and understand what the numbers are supposed to represent, to determine whether they're appropriate, whether they're complete, and whether additional information is needed.

Baseball is perhaps the best example of statistics gone wild. For example, ESPN can tell us what the batter's chances are of getting a hit, with two on, facing a right-handed pitcher, and with a count of a ball and a strike. In the end, one is tempted to ask "do we care?" But with performance measurement, and the ability to understand what is working and what isn't, we surely do care. Whether we have the necessary skills and tools, however, isn't always so evident.

A performance measurement professional friend and veteran of our industry visited our offices last week to chat. We talked about the world in which we find ourselves, and what we like about it. For me, the dynamics are perhaps the best part: we don't know all the answers, and are continuing to question and learn. One cannot ever think that "it's done," 'cause it isn't.

Saturday, September 24, 2011

The art of investing

Since almost the beginning of this blog, I have incorporated clipart, to add color, effect, humor, and sometimes wonder (as you wonder what the heck it's there for). The selection has always followed the subject, until today.

Clipart has the word "art" in it. And while I deplore the notion that there is anything artful about graffiti (i.e., that those who produce it are "graffiti artists"), clipart frequently exhibits art-like qualities. And for one who very much enjoys and appreciates art, being able to include pieces that somehow (though not always very clearly) tie into the post's theme is fun and sometimes challenging (I usually include clipart with my PowerPoint presentations, too). Speaking of art, I am one who believes that a bare wall borders on being sinful, and our home and office walls are replete with art and other pieces that perhaps aren't always very artful, but at least add something to the rooms.

Charles Bragg is a unique artist. Like many other painters (Vincent van Gogh comes to mind, for example), his style is easily recognizable. I recall the first time I saw his work: my wife and I attended an art auction at the Pearl Harbor Officers' Club (I was with the 25th Infantry Division a few miles north at Schofield Barracks; Hawaii was hardly a "hardship tour"), where they exhibited about five of his paintings. I did not care for his art at all. However, within a few weeks it grew on me, and I decided that I actually liked it (and regretted not having arrived at this realization earlier, when we could have acquired one or two of his pieces at an attractive price). Much of his work deals with the medical or legal professions, though he has produced a few paintings that tie into investing. I have wanted to acquire a piece for a long time, but they were always outside my price range, or simply more than I cared to spend, until this week.

I happened to locate the piece shown above on eBay, and was quite pleased by my good fortune. It will arrive shortly and appear in our offices (hopefully my colleagues won't mind). My wife knows of my enjoyment of Bragg's work, and is pleased by my acquisition.

Many folks have done pieces that tie into what we do, often with a serious tone: there is rarely anything very serious about a Bragg. If you have a piece that you like (it doesn't have to be a Bragg), please send me a photo so I can include it here.

Friday, September 23, 2011

What does it mean?

Earlier this month I posted a question regarding what the definition of "composite return" is. Given the importance of this value within the GIPS(R) standards (Global Investment Performance Standards), there should be an answer.

I promised to expand on this subject further, and have done so in this month's Spaulding Group newsletter. To me this is a fundamental issue. Please let me know if you disagree.

Thursday, September 22, 2011

"A remedy in search of a problem"

In last weekend's WSJ, Clint Bolick had a column dealing with Arizona's struggle with apparently unfair and unnecessary federal government interaction with much of its activities, especially regarding elections. In it he had a statement which struck me as being, at times, appropriate to the world of GIPS(R) (Global Investment Performance Standards): "a remedy in search of a problem." Mr. Bolick does not deserve (nor, I suspect, does he seek) credit for coining this phrase (a Google search identified its use elsewhere), but its appropriateness in many areas of our lives is fitting.

The recently circulated revised guidance statement for examinations (whose comment period recently ended) included suggested language which would have required verifiers to go directly to custodians and brokers for records to support the asset manager's claims. I believe this was a case of a remedy looking for a problem. The inconvenience and added cost should have been obvious from the start. We do not yet know what will make it into the final version, but hopefully these requirements won't.

I'm on record as opposing examinations, though my firm (The Spaulding Group) is quite willing to perform them for our verification clients (the vast majority of whom have decided (as we have) that they are an unnecessary expense). We believe our clients (and for that matter, all firms) should spend their money efficiently. In most cases, examinations are remedies in search of a problem.

I would include in this group of "remedies" the decision to abolish the use of carve-outs, except where cash is managed separately. There was no documented problem necessitating this move. And the change has caused headaches for many firms who had been using them since the early 1990s. 

One might expect that the GIPS Executive Committee will (if it hasn't begun already) consider what changes to make to GIPS; after all, the standards are to be reviewed every five years for potential changes. One would hope that any changes will be minimal (the last round was quite extensive). Adding remedies in search of problems should be avoided.

When I was mayor of the Township of North Brunswick, our resident gadfly suggested that for each new law (actually, ordinances in the case of municipalities) we introduced, two should be struck from the books. Not a bad idea. As Thoreau recommended in Walden: simplify!

Wednesday, September 21, 2011

Where education is lacking

Surely you're familiar with the George Santayana line about needing to recall the past to avoid repeating it. This speaks to history, yes?

In today's Wall Street Journal Norm Augustine discusses how history is a subject today's youth don't do too well with. As one who considered history as an undergrad pursuit (I chose math instead) and who still enjoys reading about history, I found the article of great interest. Augustine writes that studying history has benefits outside the knowledge it brings, as it "create[s] critical thinkers who can digest, analyze and synthesize information and articulate." Furthermore, and perhaps of greater importance, "These are skills needed across a broad range of subjects and disciplines." Can you, perchance, think of any area where such skills are necessary? Hopefully performance and risk analysis came to mind. He also points out the value of "the ability to think broadly and read and write clearly" as being key to advancement.

While I love American and World History, I am also a fan of Performance Measurement History; that is, to learn about what has transpired over the past 40 years or so in the world in which I've chosen to spend most of my time and build a career. Having been around "at the creation" of the FAF standards, and the AIMR-PPS(R), and its evolution into GIPS(R), I can relate how things were to how they are today, which often has value. The Spaulding Group's publication of Classics in Investment Performance Measurement was partly in response to my desire to capture some of the earlier works in our field.

I encourage you to study history in general, but also to be aware of our history. Who doesn't want their analytical skills improved and to advance further?

Monday, September 19, 2011

What are the risk statistics for IRR?

Someone recently asked me what risk statistics should be used with the internal rate of return (IRR), (which, as any reader of this blog knows, is my preferred return measure). Sadly, I didn't have an immediate reply.

The plethora of risk statistics that are available for time-weighted rates of return (TWRR) use the intra-period returns. For example:
  • standard deviation (we can continue to debate its appropriateness as a risk statistic)
  • beta
  • tracking error
  • downside deviation
as well as the multitude of risk-adjusted return measures that use these risk measures (such as Sharpe ratio).
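As a rough sketch of what I mean by statistics built on intra-period returns, here is how a few of them might be computed from a series of monthly returns (the return series and the zero minimum acceptable return below are made-up illustrative data, not from any real portfolio):

```python
# Minimal sketch: common TWRR-based risk statistics from monthly returns.
# The portfolio and benchmark series here are invented for illustration.
from statistics import pstdev
from math import sqrt

portfolio = [0.012, -0.008, 0.021, 0.004, -0.015, 0.017]   # monthly returns
benchmark = [0.010, -0.005, 0.018, 0.006, -0.012, 0.014]
min_acceptable = 0.0   # target used for downside deviation (an assumption)

# Standard deviation of the portfolio's periodic returns
std_dev = pstdev(portfolio)

# Tracking error: standard deviation of the excess (active) returns
excess = [p - b for p, b in zip(portfolio, benchmark)]
tracking_error = pstdev(excess)

# Downside deviation: deviation of only those returns below the target
shortfalls = [min(r - min_acceptable, 0.0) for r in portfolio]
downside_dev = sqrt(sum(s * s for s in shortfalls) / len(portfolio))

print(f"std dev:        {std_dev:.4%}")
print(f"tracking error: {tracking_error:.4%}")
print(f"downside dev:   {downside_dev:.4%}")
```

Every one of these needs a series of sub-period returns to work with, which is exactly what the IRR does not give us.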

Recall that the IRR measures the return for a single period; there is no linking. Comparing the portfolio's IRR with the benchmark's only serves the purpose of seeing how well the portfolio did. But how can we measure risk if we use the IRR?
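To make the single-period nature of the IRR concrete, here is a small sketch of a money-weighted return solver (the solver and all cash-flow figures are my own illustration, not from any standard): one rate is found for the entire period, with each flow weighted by how long it was invested, so there are no sub-period returns to feed a risk statistic.

```python
# Sketch of the IRR as a single-period, money-weighted return.
# Solves bmv*(1+r) + sum(cf*(1+r)**(1-w)) = emv by bisection,
# where each flow is (amount, w) and w is the fraction of the
# period already elapsed when the flow occurred.
def irr(bmv, emv, flows, lo=-0.99, hi=10.0, tol=1e-10):
    def f(r):
        return bmv * (1 + r) + sum(cf * (1 + r) ** (1 - w) for cf, w in flows) - emv

    # Bisection; assumes f changes sign exactly once on [lo, hi]
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Portfolio starts at 1,000,000, receives 200,000 halfway through
# the period, and ends at 1,285,000 (invented numbers).
r = irr(1_000_000, 1_285_000, [(200_000, 0.5)])
print(f"IRR for the period: {r:.4%}")
```

Note that the solver returns exactly one number for the whole period; there is nothing to take a standard deviation of.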

Reflect on this for a bit; I will return to this matter soon, with some concrete ideas.

Friday, September 16, 2011

Knowing the risks one is taking

By now you've probably heard about a rogue trader who's cost his employer, UBS, some $2 billion. The bank's risk management is being questioned, and I won't say any more about it at this time.

It so happens that another, perhaps more interesting, story has surfaced regarding a 20-year-old Irish national who attempted to smuggle some $200,000 worth of cocaine out of Brazil. Apparently this young man failed to consider all of the risks he was taking, which would include detection (which would lead to a trial, and likely imprisonment) and the chance that one or more of the bags might decide to open, which could lead to death.

In order to manage risk, one must know the risks they're taking, and decide if the action being considered is worth it. Too often this isn't the case, be it in investing or other pursuits. Why don't people consider the consequences of their actions? Not all risks can be properly evaluated, but some thought should go into what we do, yes?

p.s., as for the rather graphic photo, it’s a medical image that was released by the Brazilian Federal Police which shows cocaine-stuffed bags inside the gastrointestinal tract of the would-be smuggler. It's enough to cure me from taking up this profession.

Wednesday, September 14, 2011

Getting the basics right

I stumbled upon an asset manager's website this week, which served as a good example of some of the problems that exist within the world of firms claiming compliance with GIPS(R) (Global Investment Performance Standards).

This firm apparently has undergone a recent verification; granted, not by a firm that many have heard of, but nevertheless, one that holds itself out as being capable and qualified to conduct verifications. Very quickly I identified three problems:
  1. This was an SEC-registered firm, meaning that it must comply with the SEC's advertising rules. The SEC requires that net-of-fee returns be given equal or greater prominence relative to gross-of-fee returns. They only reported gross-of-fee results. (Strike one)
  2. Their website page announcing compliance failed to contain the GIPS required advertising disclosures. Granted, compliant firms aren't obligated to adhere to these guidelines unless they reference GIPS; but since this firm did, it was required to include them. It was missing virtually all of them. (Strike two)
  3. Their site includes a composite presentation. This presentation was missing at least one required disclosure, and had other items which looked suspect. (Strike three).
Of these three items, the verifier is technically only responsible to look into the third; their apparent failure to catch the missing disclosure(s) is a problem. The verifier should also provide their client guidance on advertising (from both a GIPS as well as SEC perspective); this was clearly not done here. The verifier should educate their client so as to avoid actions which might put them at risk or negate their compliance. Bottom line: this firm is not compliant and part of the blame belongs with the verifier.

Nearly 20 years ago I encountered similar problems, and requested a mechanism to verify the verifiers (I believe I coined the phrase). This has been an ongoing theme for nearly two decades. I suspect that nothing will ever be done.

We all make mistakes; as the saying goes, "no one is perfect." But the gravity of these errors is of such a magnitude that their existence reflects serious problems. I am confident that this asset manager had no intention of being out of compliance. They are most likely quite proud that they "achieved compliance." It appears to me that they picked an unqualified verifier.

The requirements to be a verifier today are the same as they were roughly 20 years ago: call yourself a verifier. Technically, we and the standards would expect a lot more, but since there is no test, no oversight group, no verifier of verifiers, anyone wishing to can become a verifier in a matter of minutes. I think there's room for improvement.

Monday, September 12, 2011

Last week to participate!

The Spaulding Group (TSG) has announced that this will be the final week to participate in its 4th survey on performance attribution. Joining in is quite simple, as the survey is online and easy to access; simply go to the firm's website and you'll be directed to the survey. You can also see details about the survey on the firm's site.

The survey covers all aspects of attribution. All participants will receive a complimentary copy of the results.

TSG has taken on six cosponsors (DST Global Solutions, First Rate, StatPro, Morningstar, BI-SAM, and VPD) for this research project. They will not receive details about the participants, as this information will remain confidential.

Please join in today!

Sunday, September 11, 2011

9/11: A day to remember

I don't read the comics, but my wife does. And she showed me this one. It fits my mood today quite well.

Thursday, September 8, 2011

What's it all about?

You may recall that I have opined in the past about the problems I've discovered with the aggregate method, both several times in this blog, as well as in our newsletter; I also wrote an article on this topic.

Some individuals who have read my materials argue that the aggregate method is, in fact, the superior method. They find the case, for example, where all of the composite's accounts have identical 4.00% returns but the composite has a different return, to be perfectly fine. In fact, they hold that it is the correct return, and that the BMV+weighted cash flow approach is the one in error (even though its result matches the accounts' returns).

I must confess that I found this all quite perplexing and, to some degree, shocking. However, while corresponding with some folks on this topic I had yet another epiphany. I believe there is something more foundational / fundamental going on here, which requires some thought. There is a point which is very important to consider: What is the composite return supposed to represent?  I suggest that you take a moment to reflect on this before continuing to read this post.

If you want to check the definition that is in the GIPS® (Global Investment Performance Standards) glossary, you'll quickly discover something:

it's not there! In spite of the vast importance of the composite return, there is no definition for it. Perhaps there was an expectation that "of course everyone knows what it means or represents," but after some reflection, I don't believe this is true. 

In reality, there are two different answers, depending upon which formula you use to calculate it:
  • if you use either the BMV or BMV+WCF approach, the composite return represents the asset-weighted average experience of the accounts that were present in the composite during the given time period. However,
  • if you use the aggregate method, the composite return represents the return of the composite, as if it was an account itself.
Therefore, if you believe that the return should treat the composite as a single account, then of course you would believe that the aggregate method is correct; but if, like me, you think it should represent the average experience of REAL accounts, then you would find its result to be a bit confusing at times.
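The gap between the two definitions can be sketched numerically. In the following illustration (the account values are invented, and the `link` helper is simply my shorthand for geometric linking), each account is revalued at the large cash flow, so every account's linked return is exactly 4.00%, and so is the asset-weighted average; the aggregate method, treating the combined assets as one "master portfolio," produces something else:

```python
# Sketch: two composite-return definitions diverging (made-up numbers).
# Two sub-periods; each entry is (begin value, end value) for a sub-period.
# Account A: 100 -> 100 (0%), then 100 -> 104 (+4%)        => links to 4.00%
# Account B: 100 -> 104 (+4%), then a 1,000 inflow,
#            so 1,104 -> 1,104 (0%)                         => links to 4.00%
a_sub = [(100.0, 100.0), (100.0, 104.0)]
b_sub = [(100.0, 104.0), (1104.0, 1104.0)]

def link(sub_periods):
    """Geometrically link sub-period returns."""
    growth = 1.0
    for bmv, emv in sub_periods:
        growth *= emv / bmv
    return growth - 1.0

r_a, r_b = link(a_sub), link(b_sub)

# Asset-weighted average (BMV weighting): the average account experience
bmv_a, bmv_b = 100.0, 100.0
r_weighted = (bmv_a * r_a + bmv_b * r_b) / (bmv_a + bmv_b)

# Aggregate method: combine the assets, then link the combined sub-periods
agg_sub = [(a[0] + b[0], a[1] + b[1]) for a, b in zip(a_sub, b_sub)]
r_aggregate = link(agg_sub)

print(f"account A: {r_a:.2%}, account B: {r_b:.2%}")   # 4.00% each
print(f"asset-weighted composite: {r_weighted:.2%}")   # 4.00%
print(f"aggregate composite:      {r_aggregate:.2%}")  # lower, due to the flow
```

The aggregate figure differs because the inflow shifts the weights between the linked sub-periods, something no individual account experienced.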

Space doesn't permit me to go much further on this subject here, but I will comment at greater length later this month in our newsletter, so please stay tuned. I am of the firm belief that this is a fundamental issue with the standards, which has been overlooked by the GIPS Executive Committee and its predecessor groups. I, too, hadn't really given it much thought until recently. But when someone firmly said, in response to my criticisms, that "no, the aggregate method is the most accurate method, and the BMV+WCF is actually inferior," I was forced to pause and reflect.

p.s., As you no doubt realize, the GIPS standards grew from the AIMR-PPS®. If you take a look at the first edition of the earlier standard you will only see one method to derive the composite return, and it is based solely on the beginning market value. The other two were added in the 1997 editions. I suspect that when this was done, no one on the AIMR-PPS Implementation Subcommittee realized that they were at that moment introducing a second definition or meaning for the composite's return. But they were.

Tuesday, September 6, 2011

The overwhelm effect

In this past weekend's WSJ, Jonah Lehrer had an article titled "Learning How to Focus on Focus." He quotes psychologist Herbert Simon: "A wealth of information creates a poverty of attention." Lehrer tells us that a key is to strengthen our "executive function," which is "a collection of cognitive skills that allow us to exert control over our thoughts and impulses."

I couldn't help but think about the massive amount of data that we sometimes give to our portfolio managers and external clients. When they see pages upon pages of information, is the result an inability to properly focus on the most important things?

One of our neighbors has a front yard that is overrun with plants and shrubs. At one point I'm sure that it looked quite nice, but now it's difficult just to walk past their house, as the sidewalk is being partially blocked by some of this vegetation. Some trimming is definitely in order. Might the same be true for some of the information we give our clients?

Can we properly focus when we're bombarded with pages and pages of return, risk, attribution, and other details? Our effort, at times, to impress others with what we can give them may result in a lack of good, solid information that they can really find of value. Something to think about.