It’s a lot to keep up with, of course, but it’s a rich tapestry from which to construct a sense of where things are going, and which things need a push to get there.
There are, of course, things to be wary of. For example, surveys conducted on behalf of organizations supportive of a particular view, which conveniently suggest that most people agree with that view, are an obvious eyebrow-raiser. Studies based on samples limited in size or scope aren’t inherently flawed, but they should always be taken with a grain of salt (for example, a survey of large plans isn’t always illustrative or predictive of the behaviors of smaller programs). My personal favorite: “studies” by the purveyor of a particular good or service indicating that what people really want is more of that particular good or service.
But there’s another kind of survey that can sneak up on even the most discerning—the survey that confirms what you already believe.
There was an example of that just about a month ago, when Urban Institute researchers Mauricio Soto and Barbara A. Butrica, who conducted the study for the Center for Retirement Research at Boston College, reported that employers with automatic enrollment had match rates about 7% lower than those of their non-auto-enrolling counterparts (see “Auto Enrollment Could Lead to Reduced Match”). From that finding, they drew a not-unreasonable conclusion: that automatic enrollment could lead employers to reduce their matching contributions.
And so it might. Generally speaking, automatic enrollment leads to more participants; generally speaking, more participants lead to more matching dollars; and, particularly for cash-strapped employers, more matching dollars can be a problem, one that could certainly result in a reduced (or suspended) match. Moreover, while matching contributions have often served as valuable participation incentives, in an era of automatic enrollment those incentives might well play a different role, a role at a different level or, in the most extreme case, no role at all.(1)
Now, it really doesn’t require a leap of faith to accept the premise of the study. And, if you’re like most people in our business, you probably saw the headline, skimmed the results, and filed it under unintended plan-design consequences, or maybe even “bad things about auto-enrollment.” However, there’s a problem: Last week, the Employee Benefit Research Institute (EBRI) put out a report that claimed exactly the opposite: that, in fact, automatic enrollment has led to a HIGHER rate of match, at least among large plan sponsors (see “Study Finds Auto-Enrollment/Higher Match Link Among Large Plans”).(2)
In ordinary circumstances, we might be left to draw our own conclusions about two studies from reputable sources that reached such widely disparate findings. This time, however, the folks at EBRI not only acknowledged the disparity, they offered insights into its likely causes. According to EBRI, the CRR/Urban Institute data was based on match rates constructed by the researchers, not actual rates of match. That inferred rate was then matched (no pun intended) against a separate listing of plans to determine which had, at some point, adopted automatic enrollment, though when that adoption occurred relative to any match changes was not identified.
The bottom line: The conclusion drawn in the CRR/Urban Institute report(3) wasn’t illogical, but it was apparently based on such an oddly concocted methodology that, IMHO, it wasn’t worth the paper it was printed on. When all is said and done, it doesn’t really add much to our insight into matching contributions and employer decisions, but it surely reminds us that we must always be careful not to jump to conclusions that, however reasonable, aren’t supported by the facts.