Bravo to Jordan Rau, whose Kaiser Health News piece at last untangles the proliferating hospital ratings schemes that may do more to confuse patients than enlighten them. Some of these ratings schemes, notably ones from Healthgrades and U.S. News & World Report, have been around for years and no doubt have helped Healthgrades’ bottom line and the circulation stats of U.S. News. As for patients getting to the best hospital…well, let’s just say the jury is out.
The piece also ran on NBC.com and PBS.org, and was localized by the Tampa Bay Times and The Philadelphia Inquirer. We’ve needed an exposé of this sort for a while. Readers do need to find the safest, most patient-centered hospitals. The trouble is that ratings systems as they are constructed today don’t get them there with any certainty. That’s what readers need to understand, and that’s what Rau’s piece tells them.
More specifically, he demonstrates the ambiguities in the ratings business, and he finds that how a hospital rates depends on who is doing the rating. Each rater has its own methodology based on what it deems important.
Most raters depend on dozens of pieces of data that Medicare publishes on its Hospital Compare website (which consumers can access on their own, by the way, and actually learn quite a lot). They include death rates and patient-satisfaction scores. Sometimes raters get data from other sources, too, such as state organizations and private surveys. But they all weight these factors differently in their formulas. These myriad methodologies, Rau reports, “often come to wildly divergent conclusions. Some hospitals rated as outstanding by one group are ignored or panned by another.”
In one telling example, Dr. Douglas Salvador, vice president of quality at Maine Medical Center in Portland, said “we’ve alternatively been labeled the least safe hospital in Maine and the safest hospital in Maine.” Now what are Maine consumers, aka patients, supposed to do with that? Not much!
Another example Rau provides: Many raters have given UCSF Medical Center in San Francisco high marks. But California regulators fined the hospital $425,000 for repeatedly endangering patients. Leapfrog awarded the hospital a “B” rating, even though regulators had penalized it eight times for infractions since 2008, including leaving a sponge inside one patient and a plastic clip in the skull of another.
Rau exposes a byproduct of the hospital ratings craze: Hospitals may benefit more than patients. While the ratings may be fuzzy or of limited use to most patients, the hospitals that do well strike it rich.
Kaiser Health News crunched some numbers and found that about a third of all the country’s hospitals—some 1,600—last year won at least one honor from a major hospital rating group. A hospital so honored may find itself on a list of America’s best hospitals, a hospital honor roll, a list of those giving the best cardiac care, and so on. Those ratings become great marketing tools, which hospitals use to lure new patients. So when an outfit like Healthgrades or Leapfrog comes along and charges a hospital a licensing fee to advertise its ranking, many hospitals fork over the dough.
Dr. Andrew Brotman, the chief clinical officer at NYU Langone Medical Center in Manhattan—the hospital whose back-up generator failed during Hurricane Sandy—told Rau that although the hospital did well on Healthgrades’ rankings, it wasn’t willing to pay $145,000 to use the ranking as a logo on its website. Brotman revealed it cost only $50,000 to use the U.S. News accolade, and $12,500 for Leapfrog’s award.
A section of Langone’s website called “accolades” shows the hospital is on the U.S. News Honor Roll of Best Hospitals and that Leapfrog awarded it an “A” for patient safety. The website is mum about the money NYU might have paid to use that information in marketing. Lack of disclosure is common in the hospital ratings business. Patients have no idea there may be a less than arm’s-length relationship between the hospital and the rater.
Rau reported that hospitals covet the ratings so much they sometimes hire consultants to help boost their marks. Healthgrades and Truven Health Analytics, which publishes the 100 Top Hospitals, offer consulting services to the very hospitals they rate—a win-win for the raters. For the hospital, it’s a little like getting the test answers before the test.
The benefits of these ratings schemes to patients are much less clear—especially since most people tend to go to the hospital their physician or surgeon sends them to anyway.
Even when you do study the ratings, you might not learn much. In searching for a hospital for my own recent cataract surgery, I examined three ratings schemes—from U.S. News, Leapfrog, and the government’s Hospital Compare, which was the most useful. Examining dimensions of care measured by the government showed me what questions I needed to ask and what observations I should make in order to monitor my care. Having now been a patient in a hospital, I know where in the care process things can go wrong and what patients and their families can do to prevent at least some of the problems.
What patients also need for real guidance—and what reporters need for their stories—are the inspection reports made by surveyors for the Joint Commission, the private accrediting group that inspects hospitals. They also need inspection reports from the Centers for Medicare & Medicaid Services (CMS). The Joint Commission won’t release its reports, but the CMS inspection reports are now available on the Association of Health Care Journalists’ website. They may be the best tools around.
The Second Opinion, CJR’s healthcare desk, is part of our United States Project on the coverage of politics and policy. Follow @USProjectCJR for more posts from this author and the rest of the United States Project team. And follow @Trudy_Lieberman.