Tuesday, January 31, 2012

Inflating performance

Excuse me for once again commenting on the subject of trust, but I just learned that Claremont McKenna College, a prestigious California institution, has admitted to inflating SAT scores to improve its ranking. There seems to be almost an epidemic of such shenanigans. It hasn't been that long since we learned of teachers in many public schools in the United States changing the answers on student exams to improve rankings. Cheating has somehow become acceptable, at least in some sectors.

Such actions don't occur without a degree of conspiring on the part of two or more individuals. How does this happen? Apparently, it isn't so difficult, at least judging from the rash of cases that have surfaced. Yes, we have our Bernie Madoff and others like him who were successful at getting individuals to work with them in order to do some pretty dishonest things. And yes, over the years we've seen some asset managers inflate their scores to improve their rankings. I find all of this quite disturbing.

And while we can feel good that these are "isolated incidents," when those we hold in the highest regard fail us, that surely impacts our comfort in trusting others, does it not? As a society, we should be concerned that such behavior seems to have almost no limits.

Monday, January 30, 2012

Dispersion relative to what, exactly?

I participated in a panel discussion last week for the New York Society of Security Analysts (NYSSA). Questions arose regarding the use of standard deviation with GIPS(R) (Global Investment Performance Standards). I used my standard graphic, which distinguishes between this statistic being used as a risk measure (a longitudinal or across-time view, looking at 36 months of composite returns) and as a measure of dispersion (for a single period, where we look at the returns of the accounts within the composite, to see how disparate they are).

One individual mentioned that as a dispersion measure, it measures the account returns relative to the composite's return. While this would be, I believe, the ideal, as one should want to know how returns vary relative to the composite, in reality, most firms measure dispersion relative to the average of the accounts that were present for the full period, and this can be quite a different number.

Consider this: We have a composite that begins with 30 accounts; during the year, 10 disappear and 10 more are added, meaning 20 are present for the full year. The composite's return is derived on a monthly basis from the accounts present each month; these returns are then linked to produce the composite's return for the year. If one runs standard deviation across the 20 accounts that were present all year, it won't consider the composite's return whatsoever; to bring that return into the mix, one must calculate standard deviation manually (i.e., employ a step-by-step approach), using the composite's return as the average against which each account's return is measured.
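
To make this concrete, here is a minimal sketch in Python (all of the returns are invented, and I've used equal weighting for simplicity, even though composite returns are asset-weighted) of why the composite's linked return and the average return of the full-period accounts can diverge:

# Hypothetical monthly returns; None means the account wasn't in the composite that month.
accounts = {
    "A": [0.010, 0.020, -0.005],   # present for the entire (three-month, for brevity) period
    "B": [0.015, 0.005,  0.010],   # present for the entire period
    "C": [0.040, None,   None],    # closed after month 1
    "D": [None,  None,   0.030],   # added in month 3
}

# Composite return each month, from whichever accounts are present that month.
monthly = []
for m in range(3):
    rets = [r[m] for r in accounts.values() if r[m] is not None]
    monthly.append(sum(rets) / len(rets))

# Link the monthly composite returns to get the composite's return for the period.
composite_return = 1.0
for r in monthly:
    composite_return *= (1 + r)
composite_return -= 1

# Average cumulative return of the accounts present for the full period (A and B only).
full_period = [r for r in accounts.values() if None not in r]
cumulative = [(1 + r[0]) * (1 + r[1]) * (1 + r[2]) - 1 for r in full_period]
avg_full_period = sum(cumulative) / len(cumulative)

print(f"Linked composite return:          {composite_return:.4%}")
print(f"Average of full-period accounts:  {avg_full_period:.4%}")

The two figures differ, so a dispersion statistic anchored to one is not the same as a statistic anchored to the other.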

My suspicion is that few firms employ this more accurate approach. Is there much of a difference? Probably not. However, I think it unfortunate that we weren't clearer as to how this measure is to be derived. Perhaps we will be in the future. I'll address this in greater detail in our February newsletter.

Friday, January 27, 2012

Is consistency overblown?

We, that is, The Spaulding Group, are (or should it be, is?) wrestling with a situation where a client may not have been consistent in their pre-2011 adoption of "stub periods" for their GIPS(R) (Global Investment Performance Standards) composites (recall that until 1 January 2011, showing stub period performance in your composite materials was an option; some might even argue it wasn't permitted!). And so this raises the question: "must they (be consistent, that is)?"

The standards expect consistency, but is this in everything a firm does? I would hope not. Surely asset managers should be granted some degree of flexibility, and not be castigated for an occasional, though intentional, lapse.

The Standards shouldn't be seen as constantly putting up challenges before firms that wish to comply. Surely, compliance can be challenging and demanding, but it shouldn't be so in a silly, nonsensical, unnecessary way.

And while I am the first to criticize those verifiers who "work with their clients" in such a way that they ignore clearly articulated and defined rules (and as a result, put their clients at risk), where the rules haven't been overly prescriptive, let not the verifier be the one to introduce new and unnecessary hurdles. Your thoughts?

Wednesday, January 25, 2012

Let's take risk reporting to the next level

The Global Investment Performance Standards (GIPS(R)) now require compliant firms to include the 3-year, annualized standard deviation for the composite and its benchmark. And while this was a somewhat controversial move, it's here, so we live with it. But, why stop there?

For example, while conducting a recent GIPS verification for Reams Asset Management, a division of Scout Investments, I found the following shown for their Unconstrained Fixed Income Composite:

What can we tell from this? Not much.

Okay, the composite had significant outperformance relative to the index (more than 200 bps); but look at that standard deviation; it looks like a lot of risk was taken! If one truly believes in the value of standard deviation, might it be a good idea to move to the next step? That is, to require a risk-adjusted measure, such as (what seems to be the logical choice in this case, given that the risk measure is standard deviation) the Sharpe ratio?

But also observe that we are showing a one-year return and a three-year standard deviation, meaning the matchup isn't perfect (and is arguably misleading), and so, let's report what isn't required (but perhaps should also be?): that is, the three-year annualized returns!

A lot more insightful, right?

In this particular case, the benchmark is an absolute index, so the differences are a bit more pronounced than they might otherwise be. But the point is, I believe, still valid: to compare one-year returns with three-year risk statistics is, as we like to say, mixing apples and oranges. And, showing returns and a risk measure doesn't quite do the job.
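
For those who like to see the arithmetic, here is a rough sketch in Python of the statistics I'm suggesting, using made-up monthly returns and a hypothetical risk-free rate (the variable names are mine; nothing here is prescribed by the Standards):

import math
import random

random.seed(1)
monthly_returns = [random.gauss(0.006, 0.02) for _ in range(36)]  # hypothetical composite returns
monthly_risk_free = 0.0025                                        # hypothetical risk-free rate

# 3-year annualized return: link the 36 monthly returns, then annualize.
cumulative = 1.0
for r in monthly_returns:
    cumulative *= (1 + r)
annualized_return = cumulative ** (12 / 36) - 1

# 3-year annualized standard deviation of the monthly returns (population form).
mean = sum(monthly_returns) / 36
monthly_std = math.sqrt(sum((r - mean) ** 2 for r in monthly_returns) / 36)
annualized_std = monthly_std * math.sqrt(12)

# Sharpe ratio over the same 36-month window.
annualized_rf = (1 + monthly_risk_free) ** 12 - 1
sharpe = (annualized_return - annualized_rf) / annualized_std

print(f"3-year annualized return:   {annualized_return:.2%}")
print(f"3-year annualized std dev:  {annualized_std:.2%}")
print(f"Sharpe ratio:               {sharpe:.2f}")

With the return, the risk, and the risk-adjusted result all covering the same three-year window, the apples-and-oranges problem goes away.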

And so, I encourage the GIPS Executive Committee to:
  1. Require, in addition to the 3-year annualized standard deviations, the corresponding 3-year annualized returns
  2. Require the Sharpe Ratio.
What do you think?

Tuesday, January 24, 2012

"Say again?"

It wasn't long after I joined the Field Artillery that I learned that one did not say "repeat" over the radio, especially when speaking to anyone in an artillery battery, as this expression means "fire again." Instead, one would simply speak the words "say again?" (You can often tell a former Army guy when they say this; you also know one when they can spell phonetically: alpha, bravo, charlie, etc.).

Well, sometimes one is tempted to ask the person one is speaking with to "say again?" upon hearing something that is confusing, ambiguous, or unclear. This happened to me recently, when speaking with a client who was wondering about the proper treatment of fees (meaning custodial and management fees) for "SMA accounts." SMA stands for "separately managed account," and my initial question was "are you able to break the fees out?" Since this firm is a GIPS(R) (Global Investment Performance Standards) verification client of ours, I was a bit confused, as I didn't recall that they had any wrap accounts.

Well, there's the rub. You see, they don't have wrap accounts. The person asking the question is somewhat new to this side of the investment business, and was using the term "SMA" to mean, well, a separately managed account. Sadly, since the wrap fee industry adopted the term "SMA" to represent wrap accounts, confusion often arises; this isn't much different from when someone says "alpha," which can mean (a) excess return, (b) Jensen's alpha, or (c) other things, too! And so, one is forced to qualify what the speaker or writer means.

As I understand it, E.F. Hutton (you recall them, right? "When E.F. Hutton speaks ...") invented "wrap fee" accounts in the early/mid 1980s (an advisor I worked for in the mid 1980s considered introducing wrap fee accounts, too). These accounts "wrap" all the fees (commissions and other trading expenses, advisory fees, custodial fees, broker fees) together into a single fee (e.g., 2.00%; 2.50%), which the client pays. This way, the client doesn't worry about the advisor churning and burning them with lots of trades, which can turn into high commission expenses. My guess is that some in the industry felt that "wrap" didn't have quite the pizazz they wanted, and so the use of "SMA" began. It's probably too bad that no one said "sorry, that term is already in use; pick something else!"

Consequently, when we hear someone say "SMA" or even "separately managed account," qualification is in order. We should try to avoid reusing words and expressions, as this practice often leads to confusion.

Friday, January 20, 2012

The value (and necessity) of trust

I recently listened to Stephen M. R. Covey (the son of the Stephen Covey of "7 Habits" fame) speak on trust. It truly resonated with me, and I'll share just a bit here and more in an upcoming newsletter.

Our industry has suffered from a loss of trust. A highly successful and revered leader, Bernie Madoff, turned out to be a charlatan and a crook. Former New Jersey Governor and Senator, and former Goldman Sachs CEO, Jon Corzine ran a company that appears to have misappropriated client segregated funds. If ever the need for trust was evident, it is today.

In our GIPS(R) (Global Investment Performance Standards) and non-GIPS verification work, we must have trust in our clients: if we encounter someone whom we don't trust, someone we think will try to deceive us, then we won't take them on as a client.

As Covey points out, there are two important aspects of trust: character and competence. To gain our full trust, one must have both. If a person has character but lacks competence, we know that the person will strive hard to do a good job, but won't fully know enough to be successful, and so will need our support, counsel, and guidance. If the person is highly competent but lacks character, then there is nothing we can do for them.

But in a relationship such as this, we, too, must win the trust of our clients, by demonstrating our competence and character. We want them to have confidence in our counsel, and see us as a highly trusted advisor. This is critical to success.

Yes, trust is extremely important. And again, more to follow.

Thursday, January 19, 2012

The many faces of standard deviation

Confusion abounds when it comes to standard deviation. Some of the issues include:
  • Equal-weighted or asset-weighted?
  • Divide by "n" or "n-1"?
  • Is it a measure of variability, volatility, or dispersion?
  • Is it a measure of risk?
  • What's the best way to measure relative to the composite's average return?
I'll be brief, but promise to expound further upon this subject in this month's newsletter.

Equal or asset-weighted?

If you've been reading my stuff for any length of time, chances are you know the answer: EQUAL! Okay, so you're allowed to do asset-weighted, but why would you? What does the number mean or represent? This was an idea that some folks thought made sense almost 20 years ago ("since returns are asset-weighted, shouldn't standard deviation be?"), but it didn't make sense then and doesn't now. But if you insist on doing asset-weighted, be my guest.
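
If you'd like to see what the choice does, here is a small sketch with hypothetical returns and assets; the asset-weighted version shown is just one common way of weighting the variance, not a prescribed formula:

import math

returns = [0.05, 0.08, 0.02, 0.06]                         # annual returns of four accounts
assets  = [10_000_000, 50_000_000, 5_000_000, 35_000_000]  # beginning assets of those accounts
weights = [a / sum(assets) for a in assets]

# Equal-weighted: every account counts the same.
mean_eq = sum(returns) / len(returns)
std_eq = math.sqrt(sum((r - mean_eq) ** 2 for r in returns) / len(returns))

# Asset-weighted: the large accounts dominate both the mean and the deviations.
mean_aw = sum(w * r for w, r in zip(weights, returns))
std_aw = math.sqrt(sum(w * (r - mean_aw) ** 2 for w, r in zip(weights, returns)))

print(f"Equal-weighted dispersion:  {std_eq:.4%}")
print(f"Asset-weighted dispersion:  {std_aw:.4%}")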

Divide by "n" or "n-1"?

By "n" we mean the number of accounts. I recall that the AIMR-PPS® flip flopped on this one (the first edition (1993) had one form, the second (1997) a different one [perhaps someone was planning to enter politics, and wanted practice]).

We're supposed to use "n" when we're measuring against the population, and "n-1" when against a sample. Dividing by "n" makes standard deviation a bit smaller. Most firms seem to use "n," so I say "why not join them?" We can debate which is appropriate, but why bother?
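
A quick illustration, with made-up returns, of how modest the difference usually is:

import math

returns = [0.05, 0.08, 0.02, 0.06, 0.07]   # hypothetical annual returns of five accounts
mean = sum(returns) / len(returns)
ss = sum((r - mean) ** 2 for r in returns)

std_n  = math.sqrt(ss / len(returns))        # divide by n (Excel's STDEVP); slightly smaller
std_n1 = math.sqrt(ss / (len(returns) - 1))  # divide by n-1 (Excel's STDEV)

print(f"Divide by n:    {std_n:.4%}")
print(f"Divide by n-1:  {std_n1:.4%}")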

Is it a measure of variability, volatility or dispersion?

The short answer: yes!

Bill Sharpe, in his 1966 paper, used the term "variability" to describe standard deviation (he referred to what we know as the "Sharpe Ratio" as the "reward to variability" (recall that it has standard deviation in the denominator) and to Jack Treynor's risk-adjusted measure as the "reward to volatility" (it has beta in the denominator)). However, in an email to me not long ago, he said using either the term "variability" or "volatility" is fine. Both of these are used in the context of standard deviation being a measure of risk; what some call "external dispersion."

As for "dispersion," I usually mean this in the same context as some do for "internal dispersion," meaning how the composite's returns compare / vary.

The GIPS® standards (Global Investment Performance Standards) now require both (a) a measure of dispersion (and standard deviation is just one way to accomplish this) and (b) the 36-month, annualized standard deviation for both the composite and benchmark. The former is for a single time period (the standard deviation of the accounts' annual returns for 2011, for example) and the latter is across time; a longitudinal measure, if you will (e.g., the 36-month standard deviation of the composite for the period ending 31 December 2011).

Is it a measure of risk?

It depends on whom you speak to. Since many consider risk to be either (a) the failure to meet the client's objective or (b) losing money, standard deviation wouldn't qualify, because it measures neither. However, Spaulding Group research has shown that it's the most common measure of risk. And the GIPS standards now require it (although they've shied away from calling it a "risk measure"). And so, regardless of its detractors, most folks do consider it a measure of risk.

What's the best way to measure relative to the composite's average return?

I saved the best for last. I am conducting a GIPS verification and was validating the client's measure of dispersion; in this case, equal-weighted standard deviation. Because I couldn't match what they had, I tried comparing it to the composite return; let me explain.

If you use Excel, for example, and run the STDEVP function against the returns of all accounts present for the full year, you're measuring standard deviation against the average of these returns, which in almost all cases will not be the same as the composite's return; that is, it's telling us how disparate the returns are around this average, not around the return reported in the presentation. I believe that, ideally, it should be run against the composite's return. However, this requires several more steps, and can't be done by simply invoking a function like STDEVP. Too bad.
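
Here is a small sketch of the two calculations, in Python rather than Excel, with hypothetical numbers:

import math

# Hypothetical annual returns of the accounts present the full year, and the
# composite's reported (linked, asset-weighted) annual return.
account_returns = [0.062, 0.058, 0.071, 0.049, 0.065]
composite_return = 0.060   # generally not equal to the accounts' simple average

# What STDEVP gives you: dispersion around the accounts' own average.
mean = sum(account_returns) / len(account_returns)
stdevp = math.sqrt(sum((r - mean) ** 2 for r in account_returns) / len(account_returns))

# The "manual" alternative: dispersion around the composite's return.
std_vs_composite = math.sqrt(
    sum((r - composite_return) ** 2 for r in account_returns) / len(account_returns)
)

print(f"Dispersion around the account average:   {stdevp:.4%}")
print(f"Dispersion around the composite return:  {std_vs_composite:.4%}")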

--------------------------------------------------

And so, standard deviation isn't really so simple, is it?

Wednesday, January 18, 2012

Lessons from a former CEO, U.S. Senator, and Governor, on how to avoid risk hurdles

In yesterday's WSJ we saw yet another article regarding MF Global Holdings and its former CEO, Jon Corzine ("MF Probe Targets Back-Office Unit"). This time we learn that just about everyone in the company has been interviewed by federal prosecutors, save for the Honorable former Governor of the great state of New Jersey and a few other former senior executives of the firm.

In the spirit of "full disclosure" I must confess that as a resident of the Garden State, I did not vote for Corzine when he ran for Senator or Governor, and my reasons aren't because he is a Democrat, as I have been known to vote for several Democrats, including the member of the House of Representatives (Rush Holt) who represents my district. It is one thing to persuade lots of people to make significant donations to your campaign, as Barack Obama and many others have done; it's another to spend millions of your own dollars to buy a Senate seat and Governor's position. Perhaps he was fired from Goldman Sachs for a reason. His performance as a U.S. Senator was one with no notable accomplishments, and as Governor, his performance was so bad that this predominantly Democratic state voted him out (something that rarely occurs).

It has been reported that Corzine, when confronted by MF Global's risk officer (who no doubt was paid a sizable amount to guard the firm against taking unnecessary risks) about his desire to invest so much of the company's funds in sovereign debt, said something to the effect of "if you won't let me do this, I'll quit!" At least somewhere in the WSJ I recall reading something to this effect, as incredible as it may sound.

I am blessed with two beautiful grandsons. But I have confidence that their father (my older son) and their mother (my lovely daughter-in-law), if talked to in a similar fashion, wouldn't budge. But no; not in this case. A CEO who throws the equivalent of a tantrum is told "okay, go ahead."

Risk managers are hired and risk controls are implemented for very good reasons; one would think that a CEO who has spent decades on Wall Street would know as much, respect them, and perhaps even serve as an example in honoring them (so much for the "honorable").

Risk remains a very difficult subject to get our arms around. Risk managers, risk officers, risk controls, risk management rules, etc. are necessary; they must be honored, respected, and adhered to.

Tuesday, January 17, 2012

Marrying Performance and Risk

I have been invited to join the CFA Institute's Jonathan Boersma and Neuberger Berman's Leah Modigliani to speak on the subject of risk, at an evening event at the NYSSA (New York Society of Security Analysts). The program takes place at 6 o'clock on January 26.

I am particularly looking forward to this, simply to hear Leah once again discuss the risk-adjusted measure she and her Nobel Prize-winning grandfather, the late Franco Modigliani, developed; to me, this alone is "worth the price of admission." As I understand it, Jonathan will discuss GIPS' (Global Investment Performance Standards) new risk reporting requirement; I will briefly provide an overview of a variety of measures, and then Leah will discuss M-squared (I guess we could say that she's "one of the M's" in this model!).

Hope you can join us; I guarantee you'll benefit!

Can I be accused of caviling? Hopefully not.

The editorial writers of The Wall Street Journal would make Bill Buckley proud, given the frequency with which they use words that are unfamiliar to many of us, thus providing the opportunity to strengthen our vocabulary. A weekend issue from last October was no exception.

In their "Killing Awlaki" piece, they mentioned how "The caviling over Awlaki's death began almost the moment the news was announced." This required me to turn to my trusty source for word meanings, dictionary.com, where I discovered that "cavil" means "to raise irritating and trivial objections; find fault with unnecessarily." Perhaps you were already familiar with this word; I was not (or if I was, I long ago forgot it; a "senior moment" perhaps).

Well, as one who has, on occasion, voiced opinions about such things as the Global Investment Performance Standards (GIPS(R)), I would hope that my comments have not been likened to caviling. And while some may be irritated by some of the points I have raised, I don't believe they have been trivial. Irritation, if there has been any, might stem from uninvited or unwelcome questioning, which I imagine can become tedious at times. But if we want our rules to be as good and valuable as possible, comments and challenges should be welcome, yes? I am confident that, for the most part, those who oversee the Standards are open to such comments. But if I ever do cavil, I hope someone will let me know!

Friday, January 13, 2012

Almost Everything We're Taught Is Wrong ... well, maybe not almost everything

Last year John Stossel wrote a piece titled "Almost Everything We're Taught Is Wrong." When it comes to performance measurement, there's some truth to this, too. Sorrowfully, many refuse to be open to the possibility that the way they've been doing or promoting something is fundamentally wrong. Is it pride, a refusal to be objective, impatience or frustration with those of us who challenge the "conventional wisdom," a resolute commitment to the traditional methods, or some other reason? 

I must confess I've been guilty of this, too. We learn something at an early age, and it gets reinforced along the way. Then, out of the blue, someone comes along and says "no, there's a different way," or "no, what you're doing is wrong," or "no, your understanding is incorrect." How do we react? Our natural reaction is usually a defensive one; that is often the case with me. I have to learn to pause, listen, reflect, consider. A lot to ask, but really what's necessary.

Take Modified Dietz, for example. I was taught that it was a time-weighted rate of return: FULL STOP! That's it. And then I'm told (by my friend Carl Bacon and a few others) "no, actually it's a money-weighted rate of return." "WHAT!" And so, I turn to the literature, and nowhere do I read that this is true; on the contrary, everything points to it being a time-weighted rate of return. Carl is clearly mistaken.

Somewhere along the way I realized that (dare I say it?) Carl was correct. BUT, do I confess (mea culpa)? And, do I tell others? After all, just about everyone knows that Mod Dietz is time-weighted. By coming out with this revelation, won't confusion be rampant? Can the performance measurement industry stand such a jolt? Well, after much soul searching, I realized that regardless of the impact, the truth must be told: Modified Dietz is money-weighted ... unless, of course, you link it, and then it's an approximation to time-weighting (right, Steve?). 

But there is so much more that we do that is simply wrong; for example:
  • using the aggregate method for GIPS(R) (Global Investment Performance Standards) compliance
  • relying so heavily on time-weighting, when money-weighting is a superior method
  • requiring asset-weighted composite returns rather than (or, at a minimum, along with) equal-weighted composite returns for GIPS compliance.
Opportunities for change still arise, however. And change will come, eventually.

Thursday, January 12, 2012

Making sense of negative Sharpe ratios

I'm teaching an in-house Fundamentals of Performance class this week in Canada, and, as usual, we touch upon the Sharpe ratio, and how negative Sharpe ratios can produce results which appear inconsistent with our expectations.

To help try to communicate what's going on, I constructed the following graphic:


What you're seeing are two different cases: one where we're dealing with a positive Sharpe ratio, and the other where we have a negative one. In both cases, the portfolio's risk exceeds that of the benchmark, and in both cases the portfolio's return equals that of the benchmark. On the positive side, given the higher risk, we would expect a higher return from the portfolio; because it failed to deliver that, we end up with a lower Sharpe ratio. On the negative side, we might expect the higher risk to penalize the portfolio as well; but because the same negative excess return is being divided by a larger standard deviation, we are instead rewarded with a higher (that is, less negative) Sharpe ratio.
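
Here is a tiny numerical illustration of the two cases (all of the numbers are invented):

risk_free = 0.03

def sharpe(ret, std):
    return (ret - risk_free) / std

# Positive case: equal returns above the risk-free rate; more risk, lower Sharpe ratio.
print(sharpe(0.10, 0.20), sharpe(0.10, 0.10))   # portfolio 0.35 vs. benchmark 0.70

# Negative case: equal returns below the risk-free rate; more risk, HIGHER Sharpe ratio.
print(sharpe(0.01, 0.20), sharpe(0.01, 0.10))   # portfolio -0.10 vs. benchmark -0.20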

This may not yet be entirely clear, and I will take the subject up later this month, in our monthly newsletter. So, consider this a "warm up"!

Wednesday, January 11, 2012

PMAR IX Transcripts Published

We're pleased to announce that the transcripts from last year's Performance Measurement, Attribution, and Risk (PMAR) conference have been published. Since The Spaulding Group began the conference ten years ago, we have invested the time and money to produce these reports because of the valuable information that's shared at the events.

As Patrick Fowler stated in the press release which was published last week, there are several reasons we go to this expense: "our speakers and topics are very important, and we want to capture what is shared, as it often makes a profound impact on the industry. Second, we know that it's easy to forget some of what is shared during a talk, and these books allow attendees to review sessions to recall information, and perhaps gain additional insights. Third, we know that many who attend would like to share the information with others in their firm, but even the most gifted listener cannot record all the pertinent details of a speech. Fourth, our vendors are given the opportunity to place advertisements in these books, which is yet another way for them to communicate with our clients. And finally, we believe that the transcripts bring back great memories of the event, as we include many photos that our attendees can peruse. In addition to the transcripts, attendees receive audio CDs, which allow them to listen to the conference presentations, and share with their colleagues." All attendees of the conference receive complimentary copies of these materials.

The transcripts for last year's PMAR Europe III will be published shortly. Attendees of last year's event will be sent complimentary copies, along with audio CDs of the program.

PMAR North America X will be held at the Ritz-Carlton, Philadelphia on May 23 and 24. PMAR Europe III will be held at the America Square Conference Centre in London on June 12 and 13. To learn more about these conferences, please contact Christopher Spaulding or Patrick Fowler or visit TSG's website.

Tuesday, January 10, 2012

Who benefits from quarterly verifications?

The Spaulding Group is sometimes asked if we do quarterly GIPS(R) (Global Investment Performance Standards) verifications. We would be happy to, but strongly recommend against it. And why is this?

Well, we firmly believe that the only one who benefits from quarterly verifications is the verifier, and this is for two reasons:
  1. They can charge more, because they increase the frequency of visits
  2. It keeps their staff busy all year round!
We oppose it because:
  1. It's disruptive to the client
  2. It costs more
  3. There are no added benefits from more frequent verifications
What benefit does it provide? Does the client really think that they are going to fall out of compliance within a quarter, or a few quarters? Recall that verification does two things: it "assesses whether (1) the firm has complied with all the composite construction requirements of the GIPS standards on a firm-wide basis and (2) the firm’s policies and procedures are designed to calculate and present performance in compliance with the GIPS standards." Even though the Standards encourage firms to reflect quarterly and/or monthly returns on their presentations (see ¶ I.5.B.2.c), we don't feel this means they need to immediately get those quarters verified.

As verifiers, we focus on the firm's policies and procedures, and on its composite construction. Chances are the P&P won't change very much during the year, so that leaves the composite construction. Why must we monitor clients quarterly or monthly? If we give interim reports rather than a single annual one, will it really help them?

During the year we often engage with our clients. Our clients frequently contact us with questions or seek advice. Our clients are invited to participate in our monthly webinars at no cost. And, we share information with them in other ways. And so, why bother them with quarterly visits?

Interestingly, of the numerous firms who have switched to The Spaulding Group from verifiers who required quarterly, none have continued at this frequency: all have been happy to move to annual.

Disagree? Think quarterly is a good idea? Let me know why! I'd love to hear your reasons. And the competing verifiers who read this blog are invited to chime in, too, but not anonymously; otherwise, their comments won't be posted.

Sunday, January 8, 2012

A year of epiphanies, perhaps?

Today is the Feast of the Epiphany. At Church our pastor explained the meaning of the term:
  • capitalized, it refers to the manifestation of Christ to the gentiles in the persons of the Magi (three Kings)
  • lower case, it is a sudden, intuitive perception of or insight into the reality or essential meaning of something, usually initiated by some simple, homely, or commonplace occurrence or experience.
For our purposes, I'm referring to the latter. Surely you've encountered situations that you could describe as "epiphanies." Moments when all of a sudden something makes sense. This happens to me on a regular basis. And I often share these in this blog and/or our newsletter.

Epiphanies should be sought out, as they are ways for us to grow. To strive to make this a "year of epiphanies" is, I believe, a worthy objective. For me, I'd like every year to be one.

p.s., the definitions above, while consistent with what our priest stated, are actually from www.Dictionary.com.

Saturday, January 7, 2012

How large is your stack of quotidian reports?


If you're like me, the word "quotidian" is probably one you're not terribly familiar with. Perhaps you don't even see it in print regularly, though interestingly it appeared twice on page A15 of the July 26, 2011 edition of The Wall Street Journal, in both an article on the nutcase Anders Breivik and a book review of Rules of Civility, by Amor Towles (a book I happened to read, and found quite good).

Bret Stephens, the author of the first piece, references the "quotidian details of [Breivik's] shooting," while Joanne Kaufman, who penned the book review, references the "quotidian pursuits like commerce."

So what does this word mean? My favorite source for word meanings offers the following:

adjective
1. daily: a quotidian report.
2. usual or customary; everyday: quotidian needs.
3. ordinary; commonplace: paintings of no more than quotidian artistry.
4. (of a fever, ague, etc.) characterized by paroxysms that recur daily.

noun
5. something recurring daily.
6. a quotidian fever or ague.

And thus the real reason for this post: your quotidian reports (or the ones you produce and give to various folks in your organization). When was the last time you did an inventory of them? Are they needed? Can they be improved? Might be a good new year's project!

Thursday, January 5, 2012

The "regulator's dilemma"

I often save clippings from newspapers and magazines to refer back to at a future date. I just discovered one from the August 12, 2011 issue of the WSJ: it's from their op-ed section, titled "S&P 500 and the 'Regulator's Dilemma.'"

The article discusses the U.S. Senate Banking Committee's examination of Standard & Poor's decision to downgrade the long-term U.S. debt. The writer points out something ironic: "S&P's judgements carry such weight because Washington [i.e., the federal government, or perhaps more correctly, the United States Congress] told the markets to pay attention to them. Federal regulators have embedded credit ratings into countless financial rules." Not surprisingly, as a result of Congress' habit of excluding various parties from its regulations (even itself, on occasion), Treasury debt was exempt.

The author points out that "a critical ingredient in the 2008 financial crisis was the encouragement that regulators gave banks to hold mortgage-backed securities rated by S&P, Moody's and Fitch – the government-created oligopoly of credit judges."

And while Dodd-Frank instructed bureaucrats to remove credit rating references from their rules, bank regulators resisted, because of their struggle to devise better standards to judge an asset's safety.

A key statement in this piece: "as counterintuitive as it may be to politicians, having no federal standard on risk is the best standard of all." [emphasis added]


Those interested in reviewing the article's points regarding banking regulations are welcome to do so. My reason for mentioning this piece is the reference to trying to "standardize risk." Risk is elusive, ambiguous, impossible to classify with any degree of agreement, and impossible to measure in a way that all would find acceptable.

In my recent post that highlighted the 10 things I like best about GIPS(R) (Global Investment Performance Standards), I (with some hesitation, I might add) applauded the introduction of the requirement for a three-year annualized standard deviation. Not because I think it's an ideal measure; I'm on record objecting to it. However, I also realize that there is no measure that all would agree on, and that this is a formula that is quite easy to calculate and interpret. Firms can include additional risk measures; the new requirement simply aims to ensure there is something that prospective clients can see.

In the exposure draft for the Standards' 2010 edition, the GIPS Executive Committee suggested mandating risk disclosures in composite presentations; this was objected to by most who took the time to comment, and the EC wisely withdrew it. Risk is SO difficult to get one's arms around. Trying to regulate it much beyond this simple requirement would probably be unwise. And it would create additional dilemmas we don't need.

Wednesday, January 4, 2012

Time and Money Weighting: making sense of the differences

When teaching our Fundamentals of Investment Performance course, when writing my books, and when simply having conversations with clients, I am often faced with the task of explaining, in as clear a manner as possible, the differences between time and money weighting. This topic is one of the most confusing in our industry. I've heard, on several occasions, performance measurement veterans misspeak when it comes to these matters.

At its core, it all boils down to cash flows: whether to include them in the process, or to eliminate (or at least reduce) their impact on the resulting return. And while a few folks suggest that the choice between the two has nothing to do with who controls the cash flows, the reality is that this is definitely the main consideration in deciding which to use (though there are times when we ignore it, in favor of the insights provided).

And it also boils down to linking. That is, the geometric linking of returns.

Time weighting comes in two forms: exact and approximate. Exact methods revalue the portfolio for all cash flows, and calculate returns between each of these revaluations. Approximation methods may revalue for large flows, but not all flows (or they'd be exact). And linking occurs at any point when the portfolio is revalued (either when large flows occur, or at month-ends).

We typically use either the Modified Dietz or Internal Rate of Return (IRR) formula in our approximation methods. Both of these formulas, by themselves, are actually money-weighted methods. We transform them into time-weighting when we employ geometric linking!
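
Here is a minimal sketch in Python of that point, with hypothetical values and a simplified flow weighting:

def modified_dietz(begin_value, end_value, flows):
    """flows: list of (amount, fraction_of_period_remaining) pairs."""
    net_flows = sum(amount for amount, _ in flows)
    weighted_flows = sum(amount * w for amount, w in flows)
    return (end_value - begin_value - net_flows) / (begin_value + weighted_flows)

# Two months, with a large contribution halfway through month 1.
month1 = modified_dietz(100_000, 118_000, [(10_000, 0.5)])
month2 = modified_dietz(118_000, 121_000, [])

# Money-weighted view: a single Modified Dietz across the whole two-month span
# (the flow occurred a quarter of the way in, so 0.75 of the period remained).
money_weighted = modified_dietz(100_000, 121_000, [(10_000, 0.75)])

# Approximate time-weighted view: geometrically link the sub-period returns.
time_weighted = (1 + month1) * (1 + month2) - 1

print(f"Money-weighted (one Modified Dietz):   {money_weighted:.4%}")
print(f"Linked (approximately time-weighted):  {time_weighted:.4%}")

The single-period calculation lets the size and timing of the flow influence the result; linking the sub-period returns largely removes that influence, which is the whole point of time-weighting.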

The following graphic contrasts money and time weighting:

As you can see, we are calculating returns in two ways: by time and money weighting. The essential difference is that with time-weighting, we value the portfolio multiple times during the period, and link the intermediate results, while for money weighting, we only value at the end points.

Can more be said on this topic? Yes! And more will be, so stay tuned.

Tuesday, January 3, 2012

Down with obfuscation!

As we begin a new year (and one of the "leap" variety, at that!), I wish to share with you further commentary from Henry Hitchings' The Language Wars. Recall that I mentioned this book in a recent post. The reality is, much of what he's written lends itself to the issues we deal with.

He points out that "It is no fun to have to read twice a sentence which, on the second reading, we find we didn't even want to read once. Skillful handling of language will tend to reduce the amount of cognitive effort one's audience has to expend in getting at one's meaning. If my expression is confused and ambiguous, I risk losing your attention."

Surely you can relate to cases where clarity is hard to find, and the information shared is unnecessarily obfuscated (but then again, is it ever necessary to introduce obfuscation?). At times, some speakers and authors seem to go out of their way to make something more complicated than it needs to be.

Hitchings cites Noam Chomsky, who stated that "language's main purposes [are] to transmit information, establish relationships, express our thoughts or clarify them, pursue knowledge and understanding, exercise our minds creatively, and play. In all but the last two of these, lucidity is vital. Precise and conventional use of language averts painful misunderstandings."

I have been told that one of my gifts is the ability to communicate in a very lucid manner; this may be due to my need to be lucid for myself, let alone the audience, to ensure that I understand what I'm communicating! I first became aware of this skill when I taught a business mathematics course at the University of Baltimore, while pursuing my MBA, more than 30 years ago. It was essentially a survey course, which touched on many areas of math, including algebra and basic calculus. I didn't feel the need to impress the students with my knowledge, but rather to convey the knowledge to them so that they could understand it. I succeeded, and thus realized that I could, in fact, share complex material in a, well, lucid fashion!

A few years ago, at a Performance Measurement Forum meeting, we had a speaker explain a particular risk measure. While I cannot speak for my fellow attendees, I found the presentation difficult to understand. And so, I slowed the speaker down and asked some very basic questions. As a result, I came away with a much better understanding than I would otherwise have obtained. Meaning, sometimes it's up to the listener to ask for clarity.

Of what value is it to overly complicate information? An oft-cited quote, attributed to Einstein, is to "make things as simple as possible, but not simpler." A good idea, I think!