Bravo for Jordan Rau, author of a Kaiser Health News piece that at last untangles the proliferating hospital ratings schemes, which may do more to confuse patients than enlighten them. Some of these schemes, notably the ones from Healthgrades and U.S. News & World Report, have been around for years and no doubt have helped Healthgrades’ bottom line and the circulation stats of U.S. News. As for patients getting to the best hospital…well, let’s just say the jury is out.
The piece also ran on NBC.com and PBS.org, and was localized by the Tampa Bay Times and The Philadelphia Inquirer. We’ve needed an exposé of this sort for a while. Readers do need to find the safest, most patient-centered hospitals. The trouble is that ratings systems as they are constructed today don’t get them there with any certainty. That’s what readers need to understand, and that’s what Rau’s piece tells them.
More specifically, he demonstrates the ambiguities in the ratings business, and he finds that how a hospital rates depends on who is doing the rating. Each rater has its own methodology based on what it deems important.
Most raters depend on dozens of pieces of data that Medicare publishes on its Hospital Compare website (which consumers can access on their own, by the way, and actually learn quite a lot). They include death rates and patient-satisfaction scores. Sometimes raters get data from other sources, too, such as state organizations and private surveys. But they all weight these factors differently in their formulas. These myriad methodologies, Rau reports, “often come to wildly divergent conclusions. Some hospitals rated as outstanding by one group are ignored or panned by another.”
In one telling example, Dr. Douglas Salvador, vice president of quality at Maine Medical Center in Portland, said “we’ve alternatively been labeled the least safe hospital in Maine and the safest hospital in Maine.” Now what are Maine consumers, aka patients, supposed to do with that? Not much!
Another example Rau provides: Many raters have given UCSF Medical Center in San Francisco high marks. But California regulators fined the hospital $425,000 for repeatedly endangering patients. Leapfrog awarded it a “B” rating, even though regulators had penalized it eight times for infractions since 2008, including leaving a sponge inside one patient and a plastic clip in the skull of another.
Rau exposes a byproduct of the hospital ratings craze: Hospitals may benefit more than patients. While the ratings may be fuzzy or of limited use to most patients, the hospitals that do well strike it rich.
Kaiser Health News crunched some numbers and found that about a third of the country’s hospitals—some 1,600—won at least one honor from a major hospital rating group last year. A hospital so honored may find itself on a list of America’s best hospitals, a hospital honor roll, a list of those giving the best cardiac care, and so on. Those ratings become great marketing tools, which hospitals use to lure new patients. So when an outfit like Healthgrades or Leapfrog comes along and charges a hospital a licensing fee to advertise its ranking, many hospitals fork over the dough.
Dr. Andrew Brotman, the chief clinical officer at NYU Langone Medical Center in Manhattan—the hospital whose back-up generator failed during Hurricane Sandy—told Rau that although the hospital did well on Healthgrades’ rankings, it wasn’t willing to pay $145,000 to use the ranking as a logo on its website. Brotman revealed it cost only $50,000 to use the U.S. News accolade, and $12,500 for Leapfrog’s award.
A section of Langone’s website called “accolades” shows the hospital is on the U.S. News Honor Roll of Best Hospitals and that Leapfrog awarded it an “A” for patient safety. The website is mum about the money NYU might have paid to use that information in marketing. Such lack of disclosure is common in the hospital ratings business. Patients have no idea there may be a less than arm’s-length relationship between the hospital and the rater.