Rekenthaler Report

The Fund Winner Curse Is an Optical Illusion

Just because winners decline doesn't make them a bad choice.

Good Is Bad 
A month ago, The Wall Street Journal ran a story called "Mutual Funds' Five-Star Curse." Morningstar was featured (indirectly) in the headline, but the true subject was not the behavior of the star ratings. Rather, it was the inevitable disappointment that came from investing in the top-performing mutual funds. Whether measured by the Morningstar Rating for funds or any other returns-based system, today's winners do not sustain their success.

Per the WSJ's report (which used Morningstar data), of the funds that held the highest rating of 5 stars in July 2004, "37% had lost one star 10 years later. But 31% lost two stars, 14% dropped three, and 3% lost four [stars] … Only 58, or 14%, of the 403 funds that had five stars in July 2004 carried the same rating through July 2014, Morningstar says."

The reporters, and their editors, believed that this evidence demonstrated the fruitlessness of buying strong past performers. So, I suspect, did almost every reader. Those statistics arrive quickly, and they do look damning. A full 86% of all 5-star funds failed to retain their rating. And most of those fell by more than 1 star. Past performance, we are told, is a poor indicator of future performance. This study offers the evidence.

Except that it doesn't.

The 10-year time period used by the article was well and fairly chosen. As the Morningstar ratings are generated from the trailing three-, five-, and 10-year periods, the star ratings of July 2004 and July 2014 are fully independent. None of the gains that contributed to the funds' initial 5-star ratings is contained in the second, later rating.

It therefore becomes a straightforward task to assess how the group of 5-star funds fared. Compare the distribution of the new star ratings to the distribution of the overall fund universe. If the group's new ratings are lower, the signal of past performance (as measured by the star ratings) failed. If the new ratings are similar, the signal was neutral. If the new ratings are higher, the signal succeeded.
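
To make the mechanics concrete, here is a minimal Python sketch of that comparison. The former-5-star distribution uses the WSJ percentages quoted above; the universe distribution is the rating's designed bell curve (10% of funds at 5 stars, 22.5% at 4, 35% at 3, 22.5% at 2, and 10% at 1). The layout and function name are mine, purely for illustration.

```python
# Star-rating distributions expressed as shares of funds at each rating level.
# Overall universe: the Morningstar Rating's designed bell curve.
overall_universe = {5: 0.10, 4: 0.225, 3: 0.35, 2: 0.225, 1: 0.10}

# July 2014 ratings of the July 2004 5-star funds, per the WSJ figures quoted
# above: 14% kept 5 stars, 37% slipped to 4, 31% to 3, 14% to 2, 3% to 1.
# (Shares sum to 99% because of rounding.)
former_five_star = {5: 0.14, 4: 0.37, 3: 0.31, 2: 0.14, 1: 0.03}

def average_rating(distribution):
    """Share-weighted average star rating, normalized for rounding error."""
    total = sum(distribution.values())
    return sum(stars * share for stars, share in distribution.items()) / total

print(f"Overall fund universe: {average_rating(overall_universe):.2f} stars")
print(f"Former 5-star funds:   {average_rating(former_five_star):.2f} stars")
```

On those numbers, the former winners average roughly 3.4 stars a decade later, against 3.0 for the universe as a whole.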



[Chart omitted: distribution of July 2014 star ratings for the July 2004 5-star funds versus the overall fund universe]
There was a clear correlation between past and future results.

(A nuance: As Morningstar ratings include funds' sales charges, and sales charges rarely change, the chart is partially rigged. Regardless of the time period, funds with sales charges will tend to lag those without. However, that effect does not account for the entire pattern--and it's beside the main point, which is that the future star ratings were relatively good in reality but presented as being relatively bad.)

The study's framing creates an illusion--an illusion that turned the funds' victory into defeat. Evaluating the future performance of a pool of current winners means measuring decline. Whether the data sample consists of mutual funds or NCAA football rankings, the group that occupies the very top in one period will not occupy the very top in another. The direction is always down.

But while heading down feels like failing, it very often isn't. Yes, the final 2013 NCAA coaches' poll seems a poor guide for predicting 2014 NCAA football games. However, it's a much sounder method than the neutral approach of flipping a coin. As a general rule, the powerhouse teams from one season remain pretty good the following season (or the following decade, for that matter). In NCAA football, past performance does indicate future performance. Ditto for mutual funds, albeit with less consistency and more noise.

There's also a second aspect to this story. Almost all of these 5-star funds are actively managed. If index funds had shown a similar pattern, the tale would have been told differently--because everybody knows that while actively run funds wax and wane, index funds are consistent. Indeed, the article's first lesson is, "Don't buy in to the all-powerful manager." The reporters were initially beguiled by the illusion of the eroding numbers. They then put up less resistance to its allure because these were active managers. The problems were expected.

As it turns out, the handful of 5-star index funds that existed in 2004 also suffered a ratings downturn over the next decade. Compared with the general fund population, the index funds fared well, with three funds recording 4 stars for the subsequent period and two receiving 3 stars, for an average rating of 3.6 stars. Nonetheless, they all fell. This, too, could be negatively portrayed: "Over the decade, not a single 5-star index fund retained its rating, with 60% losing 1 star and 40% dropping a full 2 stars." It would still, however, be a positive result.
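
For the skeptical, here is a quick Python check of that arithmetic, using only the figures stated in the paragraph above (the variable names are mine):

```python
# Subsequent ratings of the five index funds that carried 5 stars in 2004:
# three earned 4 stars, two earned 3 stars.
new_ratings = [4, 4, 4, 3, 3]

average = sum(new_ratings) / len(new_ratings)
lost_one_star = sum(1 for r in new_ratings if r == 4) / len(new_ratings)
lost_two_stars = sum(1 for r in new_ratings if r == 3) / len(new_ratings)

print(f"Average new rating: {average:.1f} stars")  # 3.6
print(f"Lost 1 star: {lost_one_star:.0%}, lost 2 stars: {lost_two_stars:.0%}")  # 60%, 40%
```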



This common, casual, and subconscious discounting of the achievements of active management leads to a remarkable aside: "One encouraging finding: While the 10 largest five-star funds [see table] may not all have kept their five stars after 10 years--only one did as of July--all outperformed their peers."

Wait now. Of the 10 largest actively managed mutual funds in 2004, all 10 still exist, and all 10 outgained their average competitor over the next decade?

(Which is understating the matter. None of the funds finished outside the top third of its group. The average percentile ranking was 16.)

Credit to the reporters for conducting sensitivity testing and publishing the figures. The results, however, do not support the article's thesis about the perils of investing based on past results. Nor do they support the warning about "all-powerful" active managers. While all 10 of the largest actively run funds beat their rivals over the following decade, only eight of the 10 largest index funds accomplished the task.

As I've previously written, performance investing in general--and active management in particular--faces a very difficult sales climate. In this article's study, the typical winning fund from a decade ago remained above average, and each of the 10 largest outgained its average peer. What will readers remember? The headline of the "Five-Star Curse."

John Rekenthaler has been researching the fund industry since 1988. He is now a columnist for Morningstar.com and a member of Morningstar's investment research department. John is quick to point out that while Morningstar typically agrees with the views of the Rekenthaler Report, his views are his own.
