Wednesday, September 4, 2013

It's not JUST about the numbers, but pretty darn close ...

Last week I offered a theory as to why retail investors may not be interested in rates of return, and it resulted in numerous comments from readers.

As I was preparing this post, I read the following in an "Op Ed" piece in today's WSJ, by David R. Henderson, titled "The Man Who Resisted 'Blackboard Economics'":

"Ronald Coase, who died on Labor Day at age 102, was one of the most unusual economists of the 20th century. He won the Nobel Prize in 1991 for his insights about how transaction costs affect real-world economies. In a 75-year career he wrote only about a dozen significant papers, and he used little or no math.
Yet his impact was profound."

The ability to convey economic, as well as financial and investment, ideas without the aid of mathematics is important, though we cannot lose sight of the importance of numbers. In the brief paragraph above, note the number of numbers that appear (102, 20, 1991, 75, 12)!

What actually served as the genesis of this piece were two events highlighted this past weekend.

The first was about Thomas Merrick, someone no doubt unfamiliar to most folks. He spent 65 years working for a railroad and just retired at age 91. He was a World War II veteran. This article impressed me because of this individual's desire to continue to work, long past the time when many retire (at present I have no plans to retire, and so I glory in such stories). Here we learn how long he worked (65 years) and his age at retirement (91). Numbers are the way we convey important facts; to say that this "old guy" finally retired after working for "a long time" lacks important details. Without numbers, we're unsure how old he was and how long he worked; plus, one might find the characterization a bit insulting.


The second was about Diana Nyad, the 64-year-old woman who swam from Cuba (my mother's birthplace, by the way) to Florida, through shark-infested waters, without a cage; she is the first person to do this. She covered a distance of roughly 110 miles in 53 hours, and this was her fifth attempt. Again, we see the use of numbers to provide key details about this woman's feat. Would the story be as good without them? Hardly.

Math matters ... a lot.

Someone posted the following puzzle on LinkedIn:
Well, I guess I am a bona fide genius, as I solved it in less than two minutes. But seriously, only one out of 100 folks can? I'm a bit suspicious about this claim. Granted, it does require some degree of analytical skill, but I'd be surprised if the actual number wasn't higher. And, to suggest that by solving it you're a genius is a bit hyperbolic, no doubt.

Although my fondness for mathematics has been part of my DNA forever, I also value the ability to communicate without and/or about the numbers in a meaningful way. I think for many performance measurement professionals, this can be a challenge. At next year's PMAR conferences, The Spaulding Group will have sessions on this subject: the ability to add insight to the performance measurement results.
I think this is a critical part of being a performance measurement professional; i.e., it's not just about the numbers. (Okay, it's mostly about the numbers ... or perhaps not! A debate topic, perhaps!)

p.s., Speaking of geniuses, I like this quote from Polish-American mathematician Mark Kac: "An ordinary genius is a fellow that you and I would be just as good as, if we were only many times better. There is no mystery as to how his mind works. Once we understand what he has done, we feel certain that we, too, could have done it. It is different with the magicians … the working of their minds is for all intents and purposes incomprehensible. Even after we understand what they have done, the process by which they have done it is completely dark." Our industry is blessed to have both "ordinary geniuses" as well as a few "magicians."  

24 comments:

  1. Very interesting article...thanks!

    This week, I quickly turned to CNBC when I found out they were presenting the following: "August sees record ETF outflow at $15 billion: Pro"

    As someone who has done extensive research on ETFs since the early 2000s, I found the news surprising...

    Not until someone who actually uses math asked what that translates to in percentage terms did the "Pro" reply: "it is only about 1 percent of total ETF assets. Outflows to total assets in June 2010 were larger on a relative basis, at 2.2 percent."

    So much for the record!
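
    A quick back-of-the-envelope check, as a minimal sketch: the roughly $1.5 trillion in total ETF assets below is simply backed out of the quoted "about 1 percent" figure, not an independently sourced number.

    ```python
    # Express the "record" $15 billion August ETF outflow in percentage terms.
    # The ~$1.5 trillion total is backed out of the quoted "about 1 percent"
    # figure; it is an assumption, not a sourced number.
    outflow = 15e9            # August 2013 outflow, in dollars
    total_assets = 1.5e12     # assumed total ETF assets, in dollars

    pct = outflow / total_assets * 100
    print(f"Outflow as a share of assets: {pct:.1f}%")   # ~1.0%
    ```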

  2. Stephen Campisi, Nowhere Near a Genius (September 4, 2013 at 10:33 PM)

    I agree with you. You're no genius and neither am I.
    The answer is 410. Not surprising this was posted on LinkedIn, the only site that could claim to be as self-congratulatory as Facebook, where everyone has thousands of "friends" and everyone is above average.

  3. Steve, LinkedIn isn't quite Lake Wobegon, and such a post is an extreme rarity; as for FB, yes, such posts appear regularly (I think many actually believe the statistics).

  4. Numbers in context.
    I agree numbers are very important, especially as a means to measure things. I think the issue in the previous column had to do with why retail investors lacked interest in rates of return. The discussion was more about reporting "You made $400 on your $10k investment" or "You were up 4% on your $10k investment" - saying the same thing, but perhaps a different visual for clients. Still using numbers, just in a different way.
    Do these numbers really matter anyway? Where is the context? What is the time frame? How much money would have been made if it had been invested in Treasury bills or some other investment vehicle?

  5. Thanks, Debi; I'm sure we could have quite a discussion on this, perhaps at PMAR.

  6. I suggest a game show setting where an expert in math will go against someone who is not good at math. The contestants will answer general questions to test who has the sharper mind, but not like the Jeopardy template used this year ... I suggest a version of the British game show "The Weakest Link." The contestant who gets eliminated is sentenced to the Walk of Shame ... and the host gets to say: You are the Weakest Link, Goodbye!!!

  7. Interesting idea ... perhaps something like "explain Fermat's Last Theorem, and discuss the solution" ... just kidding!

  8. I like SuperbMindset's idea. Just as there's a difference between knowledge and wisdom, there's also a difference between "smarts" and "street smarts." It recalls a story of an inventor who wanted to hire an engineer to work with him. He needed someone smart but also practical. So he asked each candidate to tell him how many marbles would be needed to fill the medium-sized jar on his desk. (He had a large box of marbles in the office.) The candidates spent hours developing elaborate calculations using all sorts of mathematical and engineering models; they were pretty smart fellows. He didn't hire any of them; he hired the guy who simply took the jar over to the box of marbles, filled it up and then counted them.

  9. If you really want to test your intelligence, try saying this 5 times fast:

    One smart fellow, he felt smart.
    Two smart fellows, they felt smart.
    Three smart fellows, they all felt smart.

    If you try, you might be smart, but you're not street smart...

  10. Dave: Your replies (only seconds apart) truly demonstrate the difference between knowledge and practical experience...
  11. Steve, I still wouldn't hire the guy who counted the marbles, simply because he used no math! I would hire the guy who weighed the empty jar, filled the jar with marbles and weighed it again, then weighed one marble and did the math!
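
    For what it's worth, the arithmetic behind that approach is a one-liner; here is a minimal sketch, with all of the weights invented purely for illustration:

    ```python
    # Estimate the marble count from three weighings.
    # All weights are hypothetical, purely for illustration.
    empty_jar_g = 350.0    # empty jar
    full_jar_g = 2850.0    # jar filled with marbles
    one_marble_g = 5.0     # a single marble

    count = (full_jar_g - empty_jar_g) / one_marble_g
    print(f"Estimated marbles in the jar: {count:.0f}")   # 500
    ```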

  12. Rightly is it written: "Where there is no vision, the people perish."

    The best answer is to hire the guy who weighed the jar, weighed one marble, did the math, and then wrote a generalized model from the relationship he perceived.

    Then again, perhaps we would also like to hire "the other guy" who weighed EVERY marble and created a confidence level around his estimate of the marble weight, using his generalized model... or the guy who made random draws from the marble population and then ...

    You see where all this is going, don't you? Perhaps not. Go back and read from the beginning.
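
    In the spirit of "the other guy," here is a minimal sketch of the sampled version: weigh a handful of marbles and let the spread of their weights put a rough range around the count. The sample weights and jar weights below are made up purely for illustration.

    ```python
    import math
    import statistics

    # Hypothetical weights (grams) of a handful of individually weighed marbles.
    sample_g = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0, 4.7, 5.1, 5.3, 4.9]

    mean_w = statistics.mean(sample_g)
    std_err = statistics.stdev(sample_g) / math.sqrt(len(sample_g))

    net_weight_g = 2850.0 - 350.0   # full jar minus empty jar (made-up numbers)

    # A rough ~95% range on the count, treating the mean marble weight as the
    # only meaningful source of uncertainty.
    point = net_weight_g / mean_w
    low = net_weight_g / (mean_w + 2 * std_err)
    high = net_weight_g / (mean_w - 2 * std_err)
    print(f"Count estimate: {point:.0f} (roughly {low:.0f} to {high:.0f})")
    ```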

  13. Funny: for a guy who professes some disdain for math-types, your answer takes much into consideration. The idea that all marbles weigh the same is an assumption that cannot be proven without some analysis; you've suggested a good approach. Yes, we could weigh each individually, or take a sample and use its average weight. There are clearly multiple ways to solve this problem. Now we need only think of the appropriate venue for a discourse on this subject and its applicability to our world.

  14. It is a perfectly valid observation that instantly catapults you into questions about the specifics (in this case the weight) of each individual marble you are working with. Interestingly, the asymmetry of your observation arises because your story did not indicate the size of each marble either. So if the assumption is that every time you filled up the jar and actually counted the marbles you would get a different outcome, then you go back and read the arbitrary story from the beginning ("how many marbles would be needed to fill the jar") and hire the guy who picks a random number. Now that's practical and smart!

  15. In any analysis, we must be sure to understand our assumptions, whether they are explicit or implicit. It's equally important to understand the question being answered by our analysis. And it's MOST important NOT to assume what the other guy is assuming (unless YOU are that guy, in which case I guess you're free to assume whatever you want...)

    That said, I didn't assume that marbles were all the same size, or that the difference in sizing would affect the answer in a material way. You see, if you wanted to know the number of marbles because you want to know if you will need the "25 pack" for your child, or the "250 pack" for your small day care center, then you don't need a very precise answer. In that case, you don't really need to test the possible deviation in marble size. Why waste good math?

    So, allow me to correct your apparent comfort with poking fun at me for my "disdain for math-types," since I believe even you would shudder at the rather pedantic approach to every question by people who apply the utmost rigor to achieve theoretical and often spurious levels of precision while simultaneously ignoring the credibility of their underlying assumptions. We have seen how this approach that "more math is always better" has resulted in a tremendous waste of resources that could be used for valuable analytical work that produces better decision making, instead of tedious and time-consuming approaches to the calculation of data (a necessary, but commodity, function).

    Let me give you a real-life example: When I worked as a bond portfolio manager, our staff with actuarial backgrounds would test our portfolio for the effect of interest rate changes. Their test involved a stochastic analysis that ran 15,000 random paths to illustrate potential changes in the Treasury rate. However, the portfolio contained mostly bonds with "spread," such as corporates, mortgages and high yield. We know that changes in spreads and Treasury rates tend to move in opposite directions, as they did in 1998 when a global credit crisis caused bond spreads to widen and Treasury rates to fall by an equal amount. The "yield" change was zero, but our crackerjack staff of "math geeks" were busy measuring with great precision an exaggerated price response that had no basis in reality, since the true change in yield was zero! THAT'S what I object to: a completely wrong analysis that results from simply firing the math bazooka at every problem without first understanding the question, the degree of precision required, the use of the resulting information and the validity of the underlying assumptions. That's fair, isn't it?
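
    To make the objection concrete, here is a minimal, hedged sketch using the standard first-order duration approximation; the duration figure and the offsetting 1998-style moves are illustrative assumptions, not the actual portfolio's numbers.

    ```python
    # Price response under the first-order duration approximation:
    #   percent change in price ~ -duration * change in yield
    # The duration and the offsetting moves are illustrative assumptions.
    duration = 6.0              # years (hypothetical portfolio duration)
    d_treasury = -0.50 / 100    # Treasury rates fall 50 bp
    d_spread = +0.50 / 100      # credit spreads widen 50 bp

    treasury_only = -duration * d_treasury         # what a Treasury-only model sees
    actual = -duration * (d_treasury + d_spread)   # the full yield change is zero

    print(f"Treasury-only model: {treasury_only:+.1%} price change")   # +3.0%
    print(f"With the spread move: {actual:+.1%} price change")         # +0.0%
    ```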

  16. Tsk, tsk ... I really didn't mean to "poke fun," but rather was "having fun" and taking this topic to an extreme, merely "for the fun of it." Your point is, of course, valid. We must be aware of the assumptions that we, or the person(s) we're doing the analysis for, are making. As with the boy who solved the problem of the truck being too high to make it under the bridge (just let some air out of the tires; i.e., lower the truck, don't raise the bridge), some things CAN be taken to an extreme. The TV show "The Big Bang Theory," which my wife and I have become huge fans of, occasionally does just this, as some scientists are apt to do. I guess "just because you have the tools," or, more broadly, "just because you can," doesn't always mean you should. There are times when we should show a return to three or four decimal places (i.e., to a 10th or 100th of a basis point), e.g., when the effects are so small that showing 0.00% would convey no contribution at all, and 0.003 or 0.000 may have some legitimacy. In general, though, when I see returns to 3 decimal places (1/10th of a basis point) I shudder. Those who show only one decimal place, in my view, lack the precision we'd normally want to see, but gain the advantage of not worrying about minuscule errors that may arise. Yes, we can become, at times, pedantic, and must guard against it (or at least be aware of it).

  17. Fair enough. The below response is not my own but I think it's a good fit to what I am trying to get at:

    "Officer: Do you know how fast you were going?

    Driver: I have no idea.

    Officer: 95 miles per hour.

    Driver: But I haven't been driving for an hour!

    We clearly don't need a "full hour" to measure your speed. We can take a before-and-after measurement (over 1 second, let's say) and get your instantaneous speed. If you moved 140 feet in one second, you're going ~95mph. Simple, right?

    Not exactly. Imagine a video camera pointed at Clark Kent (Superman's alter-ego). The camera records 24 pictures/sec (40ms per photo) and Clark seems still. On a second-by-second basis, he's not moving, and his speed is 0mph.

    Wrong again! Between each photo, within that 40ms, Clark changes to Superman, solves crimes, and returns to his chair for a nice photo. We measured 0mph but he's really moving -- he goes too fast for our instruments!

    Analogy: Like a camera watching Superman, the speed we measure depends on the instrument!"
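
    The 140-feet-in-one-second figure checks out; here is the one-line unit conversion, nothing more than arithmetic:

    ```python
    # Convert 140 feet covered in one second into miles per hour.
    feet_per_second = 140
    mph = feet_per_second * 3600 / 5280   # 3600 s per hour, 5280 ft per mile
    print(f"{mph:.1f} mph")               # 95.5 mph
    ```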

  18. There's a great article in this weekend's WSJ about "The Firm," a book about McKinsey. In it they discuss a hypothetical scenario about being asked by a client what time it is. It says that Booz Allen would say "What time do you want it to be?" while A.D. Little would tell you it's 9:45:20 Greenwich Mean Time. But at McKinsey they'd ask "Why do you want to know? What decisions are you trying to make for which knowing the time would be helpful?" Getting a broader sense of what's occurring can be quite helpful. BTW, I'll be doing something else next week on this topic!

