Monday, October 31, 2011
This past Saturday, my wife and I were driving to a wedding in Hope, NJ, about 75 minutes from our home. As you may have heard, New Jersey, like much of the Northeast, received a rare and rather nasty snowstorm. We didn't know what we were in for.
After sitting in traffic on Route 206 for about 30 minutes, we learned that a tree had fallen up ahead and was blocking the entire roadway; okay, mystery solved. Our GPS detoured us to another route, which took us onto Schooley's Mountain Road, a road we were somewhat familiar with because it was how we got to the college our younger son, Douglas, attended as an undergrad (Centenary). We were about 15 minutes from our destination, and a short distance from where we were to turn, when we encountered a freshly fallen tree blocking our way (see first photo). So we had to turn around and find another way (what did we do before GPS?).
This time we were driving along Route 46, four-tenths of a mile from where we were to turn and, again, less than 15 minutes from our destination. We were stopped for about 20 minutes before learning that someone had driven into a utility pole and knocked it down, creating a roadblock that wouldn't clear for hours. And so, off again!
We were heading down a road when we found yet another tree blocking it; not a terribly large one, and one I thought we could maneuver around. But before we had a chance to try, a truck came the other way (second photo). To make a long story short, the truck helped clear our path, but also nearly hit us as it fishtailed its way up the hill, coming within inches of our car. We eventually made it to the wedding, though we missed the ceremony; the trip took us nearly four and a half hours.
Recognizing that the snow-covered roadways required extra care, I, like just about every driver we encountered, drove much slower than normal. There were still accidents and other mishaps, but we could control many of the risks we encountered.
Contrast this with the 84-year-old man from Pennsylvania who was napping in his recliner at home. A tree fell onto his home and killed him: a totally unanticipated event. We could think of the man's house as a AAA-rated structure, one that wasn't able to withstand the onslaught of a tree whose limbs were made heavier by snow falling on leaves that normally are long gone before the first snow arrives.
Can we see a parallel between these two situations: driving through a snowstorm, and resting at home during one? In the first, we can observe risks and take steps to reduce or avoid them. In the second, an event appears "out of left field," one that we couldn't have expected; perhaps not unlike the mortgage crisis that rippled into the stock market.
Bottom line: we can only do so much to avoid or control risk. We should try to identify the risks we face and take the necessary steps to minimize them. But we must also be aware that there are certain risks we cannot anticipate, and we must respond to those, as best we can, when they arrive.
Thursday, October 27, 2011
How to make your returns look better
We've discovered a way to enhance performance, at least in the short term: if you have a portfolio that begins late in the month, simply treat its return as if it were for the full month, and the monthly return will look much better!
What do we mean by this? Well, consider the Modified Dietz formula for a second:
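In its standard form (shown here with the start-of-day weighting convention, which is what produces the numbers in this example):

ROR = (EMV - BMV - CF) / (BMV + Σ w_i × CF_i), where w_i = (CD - D_i + 1) / CD

Here EMV and BMV are the ending and beginning market values, CF is the net cash flow, CD is the number of calendar days in the period, and D_i is the day on which flow CF_i occurs.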
Let's say that we begin with zero invested, so our beginning value is zero. A cash flow of $100 occurs on the 28th of a 30-day month, and we end with $105. It is evident from the numbers that we have a 5% return, yes? But let's see what happens with Modified Dietz if we calculate the weight over the 30-day period:
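w = (30 - 28 + 1) / 30 = 3 / 30 = 0.10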
Now, let us plug this value into our formula:
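ROR = (105 - 0 - 100) / (0 + 0.10 × 100) = 5 / 10 = 50%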
Instead of the 5% we know we earned, we will see a 50% return! Quite impressive, yes? And so, what happened?
The problem, as you hopefully can see, is that we are trying to measure a monthly return for an asset that was held for just three days. The culprit is our weighting formula. Instead of our "CD" (number of calendar days) being three, which we know it is, we're using 30. And therefore, instead of the flow being present for the full time (three days of a three-day period), it is shown as being present for a mere three days of a thirty-day period, or 10% of the time. This makes our denominator smaller than it should be, resulting in a return that is much higher than it really is.
Does this mean that Modified Dietz cannot be used in cases like this? Well, no, it can be; you just have to get your weight right.
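Using the true holding period, CD = 3, with the flow arriving on day one:

w = (3 - 1 + 1) / 3 = 1.00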
Now, when we use this weight in our formula, our return is:
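ROR = (105 - 0 - 100) / (0 + 1.00 × 100) = 5 / 100 = 5%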
Unfortunately, I've seen too many systems that do not test for the true period an asset is held, and so allow distorted returns to be reported. There is no legitimate excuse for allowing this to occur: the return is wrong. You cannot report a one-month return for an asset that is held for less than a month, nor can you report a one-year return for an asset held less than a year. We designed a system for a client a few years ago and made sure it was sensitive to cases like this, to avoid reporting misleading information. If you see very large returns, they may be attributable to this problem, not truly great performance.
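To make the point concrete, here is a minimal sketch of such a test (my own illustration, not any vendor's actual code); the fix is simply to measure over the days the money was actually at work:

```python
from datetime import date

def modified_dietz(bmv, emv, flows, start, end):
    """Modified Dietz return over [start, end].
    `flows` is a list of (date, amount) pairs; weights use the
    start-of-day convention, matching the example above."""
    cd = (end - start).days + 1                       # calendar days in period
    net_flow = sum(amount for _, amount in flows)
    weighted = sum(amount * (cd - (d - start).days) / cd for d, amount in flows)
    return (emv - bmv - net_flow) / (bmv + weighted)

flow = [(date(2011, 9, 28), 100.0)]

# Naive: weight the day-28 flow over the full 30-day month -> inflated 50%
print(modified_dietz(0.0, 105.0, flow, date(2011, 9, 1), date(2011, 9, 30)))

# Correct: measure over the three days the money was at work -> 5%
print(modified_dietz(0.0, 105.0, flow, date(2011, 9, 28), date(2011, 9, 30)))
```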
Of course, if you have a bad three days, your negative return will be exploded, too, so this practice serves no valid purpose, and should be abandoned.
Tuesday, October 25, 2011
A new GIPS rule has been introduced
The GIPS(R) (Global Investment Performance Standards) Q&A desk appears to have introduced a new rule:

Firms may not add new accounts within a month*
While I have no problem with this change, it would have been nice if it had been put forward for public comment, since it doesn't appear to be based on any language that is actually in the standards themselves; i.e., no paragraph is referenced to justify its introduction. This is arguably a change in the standards, which should (I believe) have been open to public comment. [Recall that GIPS-compliant firms must comply with Q&As and Guidance Statements, in addition to the standards themselves. Therefore, this Q&A applies to you.]
In this Q&A post we find the following: “When calculating composite returns for a specific period, only portfolios that are included in the composite for the entire performance measurement period are included in the calculation.” [emphasis added] We also find “When calculating a monthly composite return and the performance measurement period is defined as a month, a firm must not include in the composite calculation portfolios that were not managed for the full month and therefore do not have a full month of performance.” [again, emphasis added]
Consider the expression "performance measurement period." In the early days of the AIMR-PPS(R), we saw both this term and "performance reporting period" used. One might report quarterly but measure monthly.
While this new "rule" seems to prohibit intra-month additions (or removals) of accounts to (from) a composite, it appears to be based solely on this undefined term: "performance measurement period." If I calculate the return for January, but revalue the portfolio for large cash flows, aren't I measuring performance across multiple performance measurement periods in order to arrive at my January return? Firms that do daily performance might say that their performance measurement periods are days, might they not? Would this therefore allow them the opportunity to add an account within a month (since the month isn't the performance measurement period; the subperiods of the month are)?
Let's return to the first sentence from this rule: "When calculating composite returns for a specific period, only portfolios that are included in the composite for the entire performance measurement period are included in the calculation." I've now emphasized two terms: "specific period" and "performance measurement period." Are they interchangeable? It's a bit confusing, I think.
If the intent of this rule was simply to say "you can't add or remove accounts within a month," why doesn't it say that? I firmly believe that this is what is intended, but again, from the standards perspective, I fail to see the justification for it; sorry.
I will confess that I was aware that this change was "coming down," and I am perhaps responsible for it, given my challenge of the aggregate method (many, though not all, of my examples of flaws were based on cases where accounts were added within the month). A colleague told me that firms cannot add accounts within the month. It probably won't surprise you to learn that I asked, "What paragraph in the standards is this interpretation based on?" I got no response; I think this Q&A is the response.
Again, I am fine with this change, but feel that it is based on nothing in the standards. There is a process and protocol to introduce changes to the standards; this wasn't followed here (unless someone can point out the chapter and verse that the Q&A's response is based on). The Q&A desk's role is to interpret questions relative to existing rules; no rule was cited as the basis for their response. In my view this constitutes a new rule, not an interpretation of an existing rule.
In addition, given that there ARE managers who (a) claim compliance with the standards and (b) HAVE been doing this in some cases for many years, does this mean that they aren't compliant, or that they have to recalculate their performance? More is needed, I believe. In my view, this is a new rule that should have an effective date, so as to avoid confusion (and again, should have been open for public review). There are composite software packages that permit their customers to add or remove accounts from composites within a month; I suspect that they should consider restricting this going forward, so as to avoid putting their customers at risk of violating the standards. This issue will not apply to the vast majority of compliant firms, who add accounts at month-end; however, it's still important to consider the process and how the change was introduced. Can we expect other changes to be introduced in this manner?
* Note: I don't want to confuse anyone; this isn't a quote from the Q&A; it's my interpretation of what the new rule is.
Monday, October 24, 2011
The Five Ways You May Be Wasting Money in Performance Measurement (#5)
#5 Using the wrong approach to measure, analyze, and report performance and risk
The good news is that there are plenty of great software packages available today to assist firms with all aspects of their performance and risk measurement. The bad news is that there is no one "best" system, nor is there a system that meets every organization's needs. Consequently, investing money in a system that fails to do the right job is a waste, or at least a partial waste.
Too many firms use spreadsheets for "systems," and in general this is not a good idea: there are exceptions, but they are few.
Producing reports that fail to provide the information needed, or failing to take an inventory of what is being used and what isn't, is a waste of resources.
All firms should periodically assess where they stand, from a systems and operational perspective. Is the firm using the right measures? Do the reports contain the right information? Are the processes smooth, with the appropriate controls? Is there unnecessary redundancy, in processing or systems? Are there more efficient ways to operate?
Some firms have moved to or are considering outsourcing part or all of their performance and risk measurement process. One can debate the appropriateness of this, but the reality is that many firms find benefits in taking these steps. Even these, though, are worthy of occasional reviews to ensure they are meeting the firm's needs.
Friday, October 21, 2011
The Five Ways You May Be Wasting Money in Performance Measurement (#4)
#4 GIPS Performance Examinations
We have always been big supporters of GIPS(R) (Global Investment Performance Standards) verifications: given the Standards' complexity, it is very easy for firms to make mistakes.
At one time it was expected that verifications would become mandatory, but this idea was met with much opposition and was (fortunately) derailed. However, the importance of verification has been heightened as a result of the rewording and expansion of the "claim of compliance" statement in composite presentations, and the market has, in fact, made it a de facto requirement.
Examinations are a totally different matter. They are GIPS' version of the AIMR-PPS' Level II verification, just as today's GIPS verification descends from the old Level I verification.
Examinations test to see whether the firm is "cooking its books," and have little to do with the standards themselves.
While it's true that firms (a) move to compliance and (b) undergo verifications primarily for marketing purposes, the rationale behind examinations is less clear, especially since most RFPs fail to ask about them. We believe that for many firms they are simply a continuation of the Level II verifications that were done when many large verification firms wouldn't do Level I's. And while this is no longer the case (i.e., these same verifiers will do GIPS verifications), the practice has continued.
We discourage our verification clients from having them done, and very few do (even those who used to, have, for the most part, stopped). This has resulted in tens of thousands of dollars in savings each year for our clients.
If a firm IS going to have examinations done, they should be only for those composites in which prospective clients have expressed interest. We have seen cases where verifiers perform them for all of the firm's marketed AND non-marketed composites: the only one who benefits from this is the verifier.
If yours is like most firms, chances are you're spending a lot of money on examinations each year. Do yourself the favor of asking "why?"
Thursday, October 20, 2011
The Five Ways You May Be Wasting Money in Performance Measurement (#3)
#3 Not training your staff
Entrepreneur magazine cited lack of staff training as one of the top reasons companies fail.
As consultants to many types of firms (e.g., asset managers, custodians, software vendors, pension funds, endowments), we often encounter cases where improper decisions were made because of lack of training. Take GIPS(R) (Global Investment Performance Standards), for example. We strongly encourage our new GIPS verification clients to allow us to conduct a pre-verification in advance of the actual verification, to ensure that they are heading in the right direction. Part of this day is spent educating the client about the standards, to help them avoid making costly mistakes. Firms that fail to take advantage of training often make many errors as they move towards compliance, because of the Standards' complexity and many areas of confusion. We believe that moving towards GIPS compliance without proper training is usually a waste of time and money. [Reminder: The Spaulding Group is hosting a free GIPS webinar next Monday!]
Other aspects of performance also need to be addressed, such as performance attribution, rates of return, benchmarks, and risk. A lack of knowledge in these areas often leads to poor decisions. Having untrained individuals carry out performance and risk measurement tasks is an expense that often results in wasted money.
Investing in proper training is not only good for the firm, it's also very good for the team members. They recognize the organization's desire for them to grow and to enhance their ability to contribute. They appreciate their company's willingness to invest in them. It is a motivational factor that leads to more productive employees.
Training can take many forms, from formal classroom programs, to conferences, to reading industry publications like The Journal of Performance Measurement(R).
Poorly or insufficiently trained staff can often result in a waste of money.
Wednesday, October 19, 2011
The Five Ways You May Be Wasting Money in Performance Measurement (#2)
#2 Using spreadsheets to maintain your GIPS composites
The Spaulding Group began surveying the industry on the presentation standards (initially the AIMR-PPS(R); later GIPS(R) (Global Investment Performance Standards)) in 1994, and we've done it several times since then. We have always been struck by the large percentage of compliant firms who use spreadsheets to maintain their composites. And while we can understand why this was common in the early 1990s (given the dearth of software packages), this is no longer the case, and hasn't been for some time.
Years ago, when doing talks on the standards, I would tell participants that when they hear a voice in their head suggesting that they put their composites on a spreadsheet, that this was the voice of the devil. Spreadsheets are:
- time-consuming
- error prone
- cumbersome
- manually intensive
- and not a data base.
Okay, and so now I can hear you ask: "You're suggesting that we SPEND money; how is this 'wasting money'?"
Good question. Because you're wasting money on the manpower that today must maintain these spreadsheets. These folks are usually highly trained and educated individuals whose time could probably be better spent (invested) on more analytical work. In addition, the potential for errors arising from spreadsheet-based composites is huge.
So why bother? Make the investment of money in software, not people's time. Use these human resources in better ways that will likely give them more job satisfaction and provide additional benefits to the firm.
p.s., Are there exceptions? Yes, of course! Many smaller firms cannot justify the expense of a packaged solution, and so spreadsheets may be their only option. Or, if the firm has but a few composites and a limited number of accounts, then spreadsheets may be okay. But in general, firms should seek out a packaged solution.
p.p.s., In general, spreadsheets shouldn't be used for "systems." Especially when there are programmed systems available. This applies beyond the area of GIPS compliance.
p.p.p.s., Reminder: there's still time to sign up for next Monday's free webinar on the GIPS Standards. To register, please contact Jaime Puerschner or Patrick Fowler (001-732-873-5700).
Tuesday, October 18, 2011
The Five Ways You May Be Wasting Money in Performance Measurement
Let's start with a joke: As a salesman approaches a farmhouse door, he notices a three-legged pig hobbling about on the front porch. The salesman knocks on the door and is greeted by the farmer. Before making his pitch, he asks, "What happened to this pig's leg?"
The farmer responds "oh, this is Bessie. She's a marvelous pig. There was the time when a fire started in the house. Bessie saw it, came in to my room, and woke me up. I was able to put the fire out before it spread. If it hadn't been for her, the house would have burned down, and me and my family would have died!"
"Very interesting," says the salesman, "but what happened to her leg?"
The farmer continued: "then there was the time that our youngest daughter, Mary, fell into the pond. Bessie swam in and dragged her out. Mary would have drowned had it not been for Bessie."
"Wow, Bessie is quite an impressive pig. But I'm curious to know what happened to her leg" responded the salesman.
"Son, with a pig like Bessie, you can't eat it all at once!"
Okay, so much for the joke. How does it relate to this topic? Well, with a title that begins "The Five Ways," I think it best that we don't discuss it all at once, but rather over a five-day period.
#1 Not Being GIPS Compliant
You may be wondering what money you're wasting if you're not compliant with GIPS(R) (Global Investment Performance Standards), since it costs money to comply. Answer: much of the money you spend on marketing, sales, and responding to requests for proposal, because the failure to comply makes these efforts much more challenging, and much of your investment in selling is lost because of the inability to respond "yes" to the question "Are you compliant with the GIPS standards?" In the institutional space, lack of compliance places a firm at a marketing disadvantage, and therefore results in wasting money in pursuit of many prospects that are beyond your reach.
If your particular market isn't institutional, where compliance is a de facto standard, you're still wasting money, because you're competing on a level playing field with the other firms that aren't compliant. Why not make an investment in compliance, so that you obtain a marketing advantage?
p.s., Reminder: there's still time to sign up for next Monday's free webinar on the GIPS Standards. To register, contact Jaime Puerschner or Patrick Fowler.
Saturday, October 15, 2011
Oldies ... so, do you want us or not?
This weekend's WSJ has two stories, reflecting opposite views.
Michael M. Phillips' "The Old Soldier Who Didn't Fade Away" tells the story of a 59-year-old U.S. Army staff sergeant who is currently deployed in Afghanistan. He has my admiration and appreciation for his dedication, patriotism, bravery, and sacrifice.
In the same section of the paper we find Joe Queenan's "Revenge of the 60-Year-Old Has-Beens," in which he takes aim at Mitt Romney (who in Joe's eyes is "dull as dishwater" and "could be the first president that no one in this great country actually likes"), Bill Clinton (who "will not, will not shut his trap"), and a host of other sexagenarians.
As one who has made it to the 60-year-old level (coincidentally, eight days after Joe did), I prefer Phillips' story of someone who refuses to let age slow him down. And because the U.S. Army bars enlisted men from serving in combat once they hit 60, the sergeant is now looking to get a commission, since officers have a higher age limit. Bravo!
In a sense, these stories speak of performance; granted, not performance in investing, but human performance nonetheless. Can "has-beens" continue to perform? Obviously the answer is "yes," and often better than the young folks. Granted, my body is a bit creakier than it was a few decades ago: unlike SSG Don Nicholas, who can run two miles in under 12 minutes, I'm no longer permitted by my orthopedist to run at all (and I'm not sure I could ever have run two miles at such a pace). But I have no intention of slowing down. Plus, I'm having too much fun!
Looking at age from multiple perspectives isn't much different than looking at risk or returns from multiple angles. We can learn a lot, and perhaps draw some great insights and conclusions.
Friday, October 14, 2011
Dispersion around what exactly?
A verification client called me with the following question: they have historically calculated dispersion around the composite's return; however, their new GIPS(R) (Global Investment Performance Standards) system measures it around the average of the accounts that were present for the full year. Which is better?
To clarify: GIPS-compliant firms are required to include a measure of dispersion (e.g., standard deviation, range, high/low, quartile) for each year, provided there were six or more accounts present for the full year (with fewer than six, including dispersion is optional). The composite's annual return is based on the monthly returns, which are linked together. Each month can have a different mix of accounts because, for example, accounts were removed (they terminated, fell below the minimum, had a significant cash flow, had a change in strategy, or became non-discretionary) or added (they are new, rose above the minimum, returned after removal because of a significant flow, are no longer non-discretionary, and so on).
The only accounts used for dispersion measurement are those that were present for the full year. If we calculate the standard deviation of these accounts by themselves, without any reference to the composite's return, then dispersion will be measured around the average of these accounts, which may not (and probably will not) be the same as the composite's annual return. To measure standard deviation around the composite's return, one would have to step through the standard deviation formula manually, so to speak, inserting the composite's return into the equation rather than letting the formula (e.g., Excel's STDEVP) run by itself. This requires more effort, and, as you might expect, the two can produce different results. Here's a quick example:
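The six account returns and the composite return below are invented purely for illustration; the point is that the same numbers give two different dispersion figures depending on the center you measure around.

```python
import math

# Six hypothetical accounts present for the full year, plus an assumed
# asset-weighted composite return (which differs from the simple average)
accounts = [0.052, 0.061, 0.048, 0.057, 0.066, 0.044]
composite_return = 0.058

def stdev_around(returns, center):
    """Population standard deviation measured around a chosen center."""
    return math.sqrt(sum((r - center) ** 2 for r in returns) / len(returns))

mean = sum(accounts) / len(accounts)              # 0.0547: what STDEVP uses
print(stdev_around(accounts, mean))               # ~0.0075
print(stdev_around(accounts, composite_return))   # ~0.0082
```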
And so, is either approach okay? Is one method preferred?
The standards do not speak specifically to this question, and I would say that both approaches are acceptable. However, I believe dispersion is expected to be measured around the composite's return (we want to know how actual accounts varied relative to the reported return). But I suspect that most systems measure dispersion relative to the average of the account returns, not the composite's return. In the end, the differences are probably immaterial.
Wednesday, October 12, 2011
Risks in predicting the future
Despite the criticism directed at anyone who wishes to make predictions, there is no shortage of them. Who would want to see the end of weathermen (weather people?) telling us what tomorrow will look like, despite everyone knowing that "no one can [accurately] predict the weather"?
The accompanying chart shows (a) the Obama administration's prediction of what the unemployment rate would be if the stimulus package was implemented, (b) its prediction of the rate if it wasn't, and (c) what the rate actually has been. The abject failure of these predictions has resulted in strong criticism from the president's foes (and even some friends). But Obama isn't the first president to make predictions (or, for that matter, promises) that didn't hold (recall, for example, "read my lips: no new taxes!").
Qualifying predictions is important. I recall a few years ago when a couple of sports pundits didn't bother to offer their predictions for the first round of the baseball league playoffs, as it was "a given" which teams would prevail. Well, they were wrong. And yet their pomposity and arrogance seemed to give their predictions undeserved credibility.
I have voiced my concerns with ex ante risk measures, but recognize that clients and managers still want to see them, which is fine, as long as they recognize the underlying assumptions and the qualifications of these predictions. To make bold statements about what will happen will usually lead to errors of one sort or another.
Monday, October 10, 2011
Is net of fee performance recommended?
We received a question from a client recently, who cited the GIPS(R) (Global Investment Performance Standards) Fees Guidance Statement.
On page 5, we find the following sentence: "In these situations, it is most appropriate to present the return net of all fees (e.g., including administrative fees) since all investors must pay these fees." Is the "most appropriate" contrasting different ways to calculate net-of-fees returns (i.e., including all of the fees rather than only the fees the manager charges), or is it relative to showing gross-of-fee performance? I can see why one might be confused.
I do not interpret this to be "as opposed to gross-of-fee performance," since this entire guidance statement deals with fees; it is not positioning itself relative to gross-of-fee returns. This guidance statement (page 2) and the standards themselves (¶ I.5.B.1) recommend gross-of-fee returns. Therefore, my interpretation isn't that this is taking a new direction on the recommendations, but rather stating that if you're going to report net-of-fee returns, this is how you should do so in these cases. If I am mistaken, hopefully someone will correct me.
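To illustrate the distinction (with invented numbers): if a fund earns 10.00% gross and investors bear a 1.00% management fee plus a 0.25% administrative fee, then "net of all fees" means reporting roughly 8.75%, not the 9.00% you would get by deducting the management fee alone.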
Sunday, October 9, 2011
In and out of a bear market, or maybe not
In this weekend's WSJ, Jason Zweig (in "If It Looks Like a Bear ...") begins with a brief retrospective on what appeared to be the market's recent entry into bear market territory (and quick departure from), but then touches on what is meant by a "bear market." It seems that "there is no such thing as an 'official' bear market, [though] for convenience...The Wall Street Journal and others define it as a 20% decline from a closing high to a closing low on the Dow or the S&P 500." This is apparently a new definition. He goes on to explain the source of the term, "bear" ("to sell the bear's skin before one has caught the bear"). This presentation should be of interest to anyone involved in the world of investing.
I found it quite interesting to learn that there is no official bear market definition. Who would have thunk! Surely a term that is bandied about would have a common meaning and interpretation, but this isn't the case.
And so, the fact that many of the terms we use in performance lack official meanings isn't unique. But it's important for us to know this, I believe, just as with Jason's insights. I won't restate some of the points I've referenced of late in this blog, as my readers are no doubt familiar with them. Being aware of which terms are defined or agreed upon, versus those that aren't, is important, so one isn't misled or confused.
Thursday, October 6, 2011
When aggregate makes sense
Before you think "oh, here he goes again" please give me a second to explain. Okay, yes, it's true: I've been beating this to death (and will continue to, but not right now).
I was sent the following question from a client:
"How do you calculate performance for a composite that contains two or more funds? Do you combine multiple funds' cash flows and calculate an IRR [internal rate of return] using all the funds' contributions, distributions and residual values as if they were from a single fund?"
Some background: the writer is speaking about a client's account, where they are reporting to the client, and are using the concept of a "composite" to pool the client's accounts together.
In this case, unlike with time-weighting, we would aggregate the accounts (starting and ending market values, as well as cash flows), then calculate an IRR across this aggregate account. I would not encourage asset-weighting individual IRRs, though I confess I haven't played around with it; my "gut" is speaking here.
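As a rough sketch of the mechanics (my own illustration; the dates, amounts, and bisection solver are all assumptions for the example), pooling the funds' flows and solving for a single annualized rate might look like this:

```python
from datetime import date

def aggregate_irr(flows, end_date, residual_value):
    """Annualized IRR over the pooled cash flows of several funds.
    `flows` holds (date, amount) pairs from ALL funds combined:
    contributions negative, distributions positive. Solved by
    bisection, assuming a single sign change in the value profile."""
    def terminal_value(rate):
        fv = sum(amt * (1 + rate) ** ((end_date - d).days / 365.0)
                 for d, amt in flows)
        return fv + residual_value

    lo, hi = -0.99, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if terminal_value(mid) > 0:   # rate too low; the root lies above
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Two funds' flows treated as if they came from a single fund
flows = [(date(2010, 1, 4), -100.0),   # fund A contribution
         (date(2010, 7, 1),  -50.0),   # fund B contribution
         (date(2011, 1, 3),   20.0)]   # fund A distribution
print(aggregate_irr(flows, date(2011, 9, 30), residual_value=150.0))
```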
And so, there is a place for aggregation: for IRRs, when we are reporting to a client about their return!
Wednesday, October 5, 2011
Chief Ethics Officer named; Standards of Practice published
The Spaulding Group, Inc. is pleased to announce that we've created two new positions: Chief Ethics Officer and Assistant Ethics Officer. In addition, we've published a Standards of Practice handbook for our staff.
John D. Simpson, CIPM has been appointed our firm's Chief Ethics Officer, and Jed M. Schneider, CIPM, FRM, is the Assistant.
Details can be found in our press release.
Tuesday, October 4, 2011
A free webcast on the GIPS standards!
The Spaulding Group's official October webinar (yesterday's was actually for September; John Simpson had an extremely busy month, which included a trip to the Middle East to conduct training) will cover GIPS(R) (Global Investment Performance Standards) fundamentals, highlighting the "Five Key Components of GIPS." I will conduct this session, which will be on Monday, October 24, from 11 AM to 1 PM, EST.
And, it's free!
But if you want to participate, you'll have to sign up quickly, as space is limited: only the first 100 to register will be able to attend for free. If you plan to have more than one person listen in, you only need to register once!
For more information, please contact Jaime Puerschner.
Monday, October 3, 2011
Potter Stewart and risk measurement
When Herb Chain (of Deloitte), Matt Forstenhausler (of E&Y), and I used to regularly teach AIMR-PPS(R) and then GIPS(R) courses (first for AIMR, then for the CFA Institute), one thing we could be sure of: Herb would reference former U.S. Supreme Court Justice Potter Stewart's famous line about not having a definition for pornography, but knowing it when he saw it. Over the years I, like Herb, have used this line metaphorically, as it fits a variety of situations.
Well, when it comes to risk measurement his line doesn't fit, as there are several definitions available. Leslie Rahl (of Capital Market Risk Advisers) has become somewhat famous for her long list of risks, which she further qualifies as being incomplete. There is no commonly used definition of risk, and perhaps this is how it should be. Risk is usually defined in four ways:
- Volatility
- Possibility of a loss
- Possibility of not meeting the client's objective
- Uncertainty.
The possibility of a loss and the possibility of not meeting the objective can be measured using the same formula: the Sortino ratio. We just adjust the target in our equation, from zero (for a loss) to our goal, or minimum acceptable return (or, if you prefer, minimum funding ratio or liability-related benchmark).
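Here's a minimal sketch of the idea (the monthly returns are invented); the only thing that changes between the two uses is the target, or minimum acceptable return (MAR):

```python
import math

def sortino(returns, mar=0.0):
    """Sortino ratio: average excess return over the minimum acceptable
    return (MAR), divided by downside deviation (only shortfalls below
    the MAR contribute to the risk term)."""
    excess = sum(returns) / len(returns) - mar
    shortfalls = [min(0.0, r - mar) ** 2 for r in returns]
    downside_dev = math.sqrt(sum(shortfalls) / len(returns))
    return excess / downside_dev

monthly = [0.02, -0.01, 0.03, -0.02, 0.01, 0.015]
print(sortino(monthly, mar=0.0))    # risk defined as the possibility of a loss
print(sortino(monthly, mar=0.005))  # risk of missing a 0.5% monthly target
```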
Uncertainty is more difficult to measure. Scenario analysis can be used here, where we look at different possible future events to see how our portfolio would behave. Value at Risk and liquidity risk measures might help, too.
Regardless of your definition, we see risk as something that needs to be "ganged up on." That is, approached from multiple angles to get a sense of what is really there. The old television game show, Concentration, might be a good metaphor for risk. We want to get a good look at what risk is, but must take several views to really understand it. But unlike the game show, it never fully reveals itself.