Thursday, April 17, 2014

A proposed GIPS change you MUST pay attention to

The CFA Institute is proposing to MANDATE, REQUIRE, COMPEL firms that claim compliance with the GIPS standards to submit information about themselves to the Institute on an annual basis.

To say the least, I am seriously concerned by this idea.

You can learn about it by going here. It's a fairly non-intuitive process to (a) get to the details behind what they propose and (b) submit your thoughts.

I will offer a fairly detailed explanation of what this is, as well as my candid views on it in this month's newsletters. Suffice it to say, I STRONGLY disapprove. 

An interview

I was very flattered when BI-SAM recently asked to interview me for their online newsletter. I'm the one who normally conducts interviews (for The Journal of Performance Measurement®), so it was nice to be on "the other side."

You can read the interview here.

I thank BI-SAM for this honor.

Wednesday, April 16, 2014

A reason for performance measurement professionals to celebrate!

The Wall Street Journal posted the following yesterday:

Well, to me this is a reason for performance measurers to celebrate. 

Why, you might ask? Because:
 
we're mathematicians

Wikipedia defines a mathematician as "a person with an extensive knowledge of mathematics who uses this knowledge in their work, typically to solve mathematical problems. Mathematics is concerned with numbers, data collection, quantity, structure, space, models and change."

The reality also is that many of us in the field have degrees in math (or, as the British say, maths). My undergraduate degree, from Temple University, is in mathematics. My colleague John Simpson, CIPM, holds a BS in Applied Mathematics from UCLA (he couldn't get into USC ... it's a sore subject ... don't go there). And my colleague Jed Schneider, CIPM, received a BS in Applied Mathematics from the State University of New York at Stony Brook. (I guess because mine is in pure, or theoretical, math, I'm more like Sheldon Cooper, who decries those who work in the "applied" area, although I've adapted to the applied side (or, as Sheldon might call it, the "dark" side).)

I suspect that no one in performance measurement "hates" or "dislikes" math; many, like John, Jed, and me, like or even love math. Plus, I'm sure that many of us are here precisely because the work involves mathematics, equations, formulas, models, data, analysis, etc.

And so, to learn that it's seen in such a positive light is worth celebrating!

 For the full WSJ article, go here!

Tuesday, April 15, 2014

Buddy, can you spare ... two minutes? Our "mini" survey won't even take THAT long! And yet, your views are highly valued

We're wrapping up our "mini" GIPS® survey, but first want to hear from you! It's REALLY fast to do; we only ask four questions:

1. Your Name / Company / Title

2. Does your firm currently claim compliance with the GIPS® standards?

3. Do you want to see a guidance statement that will outline “sunset” provisions for various GIPS required disclosures, where “sunset” provisions refer to the minimum amount of time a disclosure would be required to be shown?

4. Do you favor new rules being introduced via Q&As? 

Oh, and we also give you a chance to comment further, if you wish:

5. Please enter any other comments, questions, or concerns you might have in the following box. 

That's it! Pretty simple, right? And so, please visit our survey site right now, and take the two minutes (or less) to answer these simple questions. You'll have a chance to win a $25 Amex gift card, too!

Thanks!

Thursday, April 10, 2014

What performance measurers can learn from golf

I took most of this week off so that a friend of mine and I could go to the Masters golf tournament in Augusta, GA. Our wives came along, though they stayed at our hotel (in Savannah, GA) while we headed to yesterday's final practice round and par-3 tournament. It occurred to me that there is quite a parallel between golf and performance measurement.

Statistics mean a great deal. For example, golfers count their strokes, and those counts are used to determine how well one does against the course, as well as how one ranks against other players. This is how we measure the performance of the players.

There are essentially two benchmarks at work in golf: par (for each hole and for the course) and a peer group. While the former is a good judge relative to the course, it's the latter that determines victory in a tournament (along with whether a golfer even "makes the cut," and if he does, how much money he's awarded). Investment managers are typically judged vis-a-vis at least one, if not multiple, benchmarks.

Golf, like investing, has risks, though they tend to be a bit more obvious (e.g., bunkers and water hazards).

Golf has rules ... lots of rules. Performance measurement does, too, though they're not nearly as refined or extensive. Many of these rules are not well understood and require interpretation, for both golf and performance measurement.

One of my favorite golf movies is The Legend of Bagger Vance. And one of the most memorable lines is spoken by Hardy Greaves, in explaining why golf is the greatest game there is: "It's fun. It's hard and you stand out there on that green, green grass, and it's just you and the ball and there ain't nobody to beat up on but yourself; just like Mister Newnan keeps hittin' himself with the golf club every time he gets angry. He's broken his toe three times on account of it. It's the only game I know that you can call a penalty on yourself, if you're honest, which most people are. There just ain't no other game like it." [emphasis added]

We're supposed to call penalties on ourselves, yes? That is, when we make a mistake, we are to determine the materiality of the error and what course of action to take. 

I think we can take pride in where we stand today, though we can look upon golf as a way to move forward; to develop our rules a bit more, and to be willing and comfortable at calling a penalty on ourselves.

P.S.: To avoid any confusion, we won't be at any more days of the tournament ... we purchased tickets for yesterday's event only, and the cost for the remaining days is a bit beyond our budgets!

Monday, April 7, 2014

Understanding the rules, the conventions

This morning, I had to phone in a prescription refill to our pharmacy (CVS). They have an automated system that allows you to enter a number associated with the prescription, which makes the process pretty simple; simple up to a point, at least for me.

My wife was going to pick it up, and said she'd do so around noon. The pharmacy's automated attendant asked what time, and I entered "1200." It then asked if this would be AM or PM. That's when I had to pause.

Recall that:
  • AM = from the Latin ante meridiem, meaning "before midday"
  • PM = post meridiem, "after midday"
[source: Wikipedia]

Since noon is THE meridiem, it is neither ante nor post; it's just "meridiem." Noon is 12:00 M. And so I turned to my wife, who is less picky about such things, for help, and she said "PM." This worked, so the prescription will be ready at the appointed time.

However, 12:00 noon is definitely NOT PM; 12 midnight is. But, we (everyone but me and others who take this too seriously, actually) think of 12:00 midnight as AM. However, it isn't AM until one second after midnight.

Wikipedia offers the following: It is not always clear what times "12:00 a.m." and "12:00 p.m." denote. From the Latin words meridies (midday), ante (before) and post (after), the term ante meridiem (a.m.) means before midday and post meridiem (p.m.) means after midday. Since strictly speaking "noon" (midday) is neither before nor after itself, the terms a.m. and p.m. do not apply. However, since 12:01 p.m. is after noon, it is common to extend this usage for 12:00 p.m. to denote noon. That leaves 12:00 a.m. to be used for midnight at the beginning of the day, correctly, continuing to 12:01 a.m. that same day.

The 24-hour clock, which I became intimately familiar with in the army, makes this much simpler. Noon is 1200; midnight is 2400. We don't worry about "AM" or "PM." Midnight, being 2400, represents the end of a 24-hour day. One minute after midnight is not 2401; it's 0001.
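For what it's worth, software generally bakes in the 24-hour convention, too. A quick Python sketch (the dates are arbitrary, and note that Python writes midnight as 0000 rather than the military 2400):

```python
from datetime import datetime

# Arbitrary illustrative times; the point is the formatting convention.
noon = datetime(2014, 4, 7, 12, 0)
one_past_midnight = datetime(2014, 4, 8, 0, 1)

# 24-hour clock: no AM/PM question ever arises.
print(noon.strftime("%H%M"))               # 1200
print(one_past_midnight.strftime("%H%M"))  # 0001

# 12-hour clock: the formatter must pick a side, and it picks "PM" for noon.
print(noon.strftime("%I:%M %p"))           # 12:00 PM in an English locale
```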

Sometimes our rules are confusing, sometimes they're just plain wrong, but we must conform, despite our objections, in order to "get along" (or to get our prescriptions filled properly).

In performance measurement we often deal with rules that make perfect sense, while at times we deal with ones that are not so clear. Sometimes, what appears confusing is only a reflection of our ignorance. This often happens when working with time-weighted returns, when there's a loss but we show a positive return: the typical and understandable response is "it doesn't make sense." And it doesn't, until we get a better grasp of what is occurring.
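A toy example, with numbers invented purely for illustration, shows how this can happen: the account does well while lightly funded, then poorly after a large deposit arrives.

```python
# Hypothetical two-month account (all numbers invented):
#   Month 1: start 100, end 200 (+100%); the client then deposits 1,000,000.
#   Month 2: start 1,000,200, end 900,180 (-10%).
subperiod_returns = [200 / 100 - 1, 900_180 / 1_000_200 - 1]

# Time-weighted return: geometrically link the subperiod returns,
# which deliberately neutralizes the impact of the client's cash flow.
twr = 1.0
for r in subperiod_returns:
    twr *= 1 + r
twr -= 1

# Dollar result: ending value minus everything the client put in.
dollar_gain = 900_180 - (100 + 1_000_000)

print(f"time-weighted return: {twr:+.1%}")  # +80.0%
print(f"dollar gain: {dollar_gain:,}")      # -99,920
```

The manager earned good percentage returns in each subperiod taken on its own; the dollar loss stems from the timing of the client's flow, which is exactly what time-weighting is designed to ignore.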

We can fight some of the rules, as I often attempt to do, or just give in.

I am not the only one who has challenged the "12:00 noon is PM" convention; occasionally articles are offered on this subject, though they appear to appeal only to folks like myself.

As a purist (I guess that's what I am), I also challenge calling a lectern a podium, or suggesting that "the lion's share" means "most" (it means "all"). But one can tire and frustrate others (such as my normally very patient wife) by being pedantic, so I will stop now. Hopefully my link to our profession is clear.

Thursday, April 3, 2014

Dealing with denial

Yesterday's WSJ had a review of the book The Unpersuadables, by Will Storr. I'm not sure I have much interest in the book itself, though the ideas summarized in the review are intriguing.

The reviewer, Michael Shermer, identifies some of the folks highlighted in Storr's book: unpersuadables, such as David Irving, who deny things that nearly everyone else accepts (in Irving's case, the Holocaust). Storr refers to these individuals as "enemies of science."

It occurred to me that our industry has such individuals, too: people who, despite overwhelming and objective evidence, refuse to give in. I'll touch very briefly on three examples.

The benefits of money-weighting 

There are some who simply don't see any role for money-weighting, save for its use with private equity managers. While they agree that we use time-weighting because the manager doesn't control cash flows, they fail to see the corollary: that we should use money-weighting in cases when managers do control cash flows.

The only conclusion one can draw when debating one of these unpersuadables is that, to them, the logic behind time-weighting doesn't matter: we should simply always use time-weighting! The problem arises, of course, with private equity: there, we are sometimes told, money-weighting is used because these are illiquid securities whose values are difficult to discern.

It is frustrating, I can assure you, when devotees of money-weighting, such as myself, attempt to persuade one who simply has no desire to be persuaded.
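To make the contrast concrete, here is a rough sketch of a money-weighted return computed as an internal rate of return via bisection (the helper functions and all numbers are invented for illustration): an account doubles on a small balance, receives a large deposit, then loses 10%.

```python
def npv(rate, flows):
    # flows: list of (time_in_periods, amount); money in is negative,
    # and the ending value comes back as a positive "terminal" flow
    return sum(amt / (1 + rate) ** t for t, amt in flows)

def irr(flows, lo=-0.99, hi=10.0, tol=1e-10):
    # Plain bisection; assumes a single sign change of NPV on [lo, hi]
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, flows) * npv(mid, flows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical account: invest 100; it doubles in a month; the client
# deposits 1,000,000; a month later the account is worth 900,180.
flows = [(0, -100), (1, -1_000_000), (2, 900_180)]
monthly_irr = irr(flows)
print(f"money-weighted (IRR) return: {monthly_irr:+.2%} per month")
# roughly -10% per month: the loss dominates because nearly all of
# the money was in the account only during the losing month
```

The money-weighted figure reflects the client's actual dollar experience, which is precisely why it is the right tool when the manager controls the flows.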

The fallacy of the aggregate method   

I have, on more than a few occasions, demonstrated how the aggregate method is severely flawed as a composite return method. It can produce totally nonsensical results, and it even violates the definition of the composite return in the GIPS® standards.

But those who refuse to see this dare to suggest that this method is actually the best approach! I think in this case the attitude is: "I don't care what proof you have; we'll stick with this position and yell louder to make it sound right!"
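For readers who haven't seen the issue, here is a minimal sketch (portfolios and numbers invented) of how the aggregate method can diverge from the composite return the standards define, namely the asset-weighted average of the individual portfolio returns:

```python
def modified_dietz(bmv, emv, flows):
    # flows: list of (weight, amount), where weight is the fraction of
    # the period the flow was available for investment
    net = sum(amt for _, amt in flows)
    weighted = sum(w * amt for w, amt in flows)
    return (emv - bmv - net) / (bmv + weighted)

# Two hypothetical portfolios over one month:
r_a = modified_dietz(100, 120, [])            # +20.00%, no flows
r_b = modified_dietz(100, 210, [(0.5, 100)])  # +6.67%: gain of 10 on average capital of 150

# Composite return per the GIPS definition: asset-weighted (by BMV) average
asset_weighted = (100 * r_a + 100 * r_b) / (100 + 100)

# Aggregate method: treat the composite as if it were one big portfolio
aggregate = modified_dietz(100 + 100, 120 + 210, [(0.5, 100)])

print(f"asset-weighted: {asset_weighted:.2%}")  # 13.33%
print(f"aggregate:      {aggregate:.2%}")       # 12.00%
```

Same portfolios, same period, two materially different "composite returns"; the aggregate figure need not match the asset-weighted average the standards define.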

The error in asset-weighted composite returns  

Staying with the GIPS (Global Investment Performance Standards) theme a bit more, let's consider the requirement to asset-weight composite returns. Many forget that this was a very hotly debated topic 20+ years ago, when the AIMR-PPS® was being unveiled. Two industry groups (the Investment Counsel Association of America (now the Investment Adviser Association) and the Investment Management Consultants Association) argued for equal-weighting, but the framers of these standards refused to budge. At the time, I didn't give it much attention, and so (like most) was quite comfortable with the decision.

Now that we've been at this for over 22 years, many of us have concluded that asset-weighting serves no purpose, and provides a less-than-ideal metric to judge a manager's performance: the return is, by design, skewed in the direction of the larger accounts' performance. This return doesn't report how a manager did "on average." How does one even interpret what it means? All we know is that it leans towards the bigger accounts.
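A quick sketch (account sizes and returns invented) of just how far the asset-weighted figure can drift from the "average" experience:

```python
# Ten hypothetical accounts: nine small ones of $1M each, up 10%,
# and one $100M account, down 5%.
assets  = [1.0] * 9 + [100.0]   # in $ millions
returns = [0.10] * 9 + [-0.05]

equal_weighted = sum(returns) / len(returns)
asset_weighted = sum(a * r for a, r in zip(assets, returns)) / sum(assets)

print(f"equal-weighted: {equal_weighted:+.2%}")  # +8.50%: what the typical account earned
print(f"asset-weighted: {asset_weighted:+.2%}")  # -3.76%: dominated by the one big account
```

Nine of the manager's ten accounts were up 10%, yet the asset-weighted composite return is negative; whatever that number measures, it isn't the typical client's experience.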

But try to get this requirement changed!

Making progress despite the presence of unpersuadables

We are not dealing with stupid people, despite the clever quote attributed to Mark Twain: "Never argue with stupid people, they will drag you down to their level and then beat you with experience."

Not at all. In the case of performance measurement, these unpersuadables are individuals of high intellect, who are simply unwilling or unable to be open to alternative views. 

Perhaps we could alter the quote a bit: "Never argue with unpersuadables." It's really quite a waste of effort, and it isn't much fun, either.

I hope and believe we can do better. Our little industry is still quite new. We've made loads of mistakes along the way: in some cases, we've righted them, while in others, we still have some work to do. But it's difficult to make progress when some (especially those in positions of authority) hold fast to positions that are, in reality, weak.

I am not about to declare these unpersuadables "enemies of performance measurement." However, their refusal to be even the least bit open to change doesn't help.

The Catholic Church took a very long time to agree that the earth revolved around the sun; hopefully, we will see the acknowledgement of better ways in a more expeditious fashion.

I will close by quoting the late U.S. Senator Daniel Patrick Moynihan, who once observed that "everyone is entitled to his own opinion, but not to his own facts."