
Medicare’s new AI experiment sparks alarm among doctors, lawmakers

Older men play cards in a park in New York City's Chinatown in 2024. Medicare, the public health insurance for older Americans, is piloting a new prior authorization program powered by artificial intelligence that some physicians fear will result in more denials and delays in medical care for patients. (Photo by Spencer Platt/Getty Images)

A Medicare pilot program will allow private companies to use artificial intelligence to review older Americans’ requests for certain medical care — and will reward the companies when they deny it.

In January, the federal Centers for Medicare & Medicaid Services will launch the Wasteful and Inappropriate Services Reduction (WISeR) Model to test AI-powered prior authorizations on certain health services for Medicare patients in six states: Arizona, New Jersey, Ohio, Oklahoma, Texas and Washington. The program is scheduled to last through 2031.

The program effectively inserts one of private insurance’s most unpopular features — prior authorization — into traditional Medicare, the federal health insurance program for people 65 and older and those with certain disabilities. Prior authorization is the process by which patients and doctors must ask health insurers to approve medical procedures or drugs before proceeding.

Adults over 65 generally have two options for health insurance: traditional Medicare and Medicare Advantage. Both types of Medicare are funded with public dollars, but Medicare Advantage plans are contracted through private insurance companies. Medicare Advantage plans tend to cost less out of pocket, but patients enrolled in them often must seek prior authorization for care.

AI-powered prior authorization in Medicare Advantage and private insurance has attracted intense criticism, legislative action by state and federal lawmakers, federal investigations and class-action lawsuits. It’s been linked to bad health outcomes. Dozens of states have passed legislation in recent years to regulate the practice.

In June, the Trump administration even extracted a pledge from major health insurers to streamline and reduce prior authorization.

“Americans shouldn’t have to negotiate with their insurer to get the care they need,” U.S. Health and Human Services Secretary Robert F. Kennedy Jr. said in a June statement announcing the pledge. “Pitting patients and their doctors against massive companies was not good for anyone.”

Four days after the pledge was announced, the administration rolled out the new WISeR program, scheduled to take effect in January. It will require prior authorizations only for certain services and prescriptions that the Centers for Medicare & Medicaid Services has identified as “particularly vulnerable to fraud, waste, and abuse, or inappropriate use.” Those services include, among other things, knee arthroscopy for knee osteoarthritis, skin and tissue substitutes, certain nerve stimulation services and incontinence control devices.

The companies get paid based on how much money they save Medicare by denying approvals for “unnecessary or non-covered services,” CMS said in a statement unveiling the program.

The new program has alarmed many physicians and advocates in the affected states.

“In concept, it makes a lot of sense; you don’t want to pay for care that patients don’t need,” said Jeb Shepard, policy director for the Washington State Medical Association.

“But in practice, [prior authorization] has been hugely problematic because it essentially acts as a barrier. There are a lot of denials and lengthy appeals processes that pull physicians away from providing care to patients. They have to fight with insurance carriers to get their patients the care they believe is appropriate.”

CMS responded to Stateline’s questions by providing additional information about the program, but offered few details on what the agency would do to prevent delays or denials of care. It has said that final decisions on coverage denials will be made by “licensed clinicians, not machines.” In a bid to hold the companies accountable, CMS also incentivizes them for making determinations in a reasonable amount of time, and for making the right determinations according to Medicare rules, without needing appeals.

In the statement announcing the program, Abe Sutton, director of the CMS Innovation Center, said the “low-value services” targeted by WISeR “offer patients minimal benefit and, in some cases, can result in physical harm and psychological stress. They also increase patient costs while inflating health care spending.”

A vulnerable group

Dr. Bindu Nayak is an endocrinologist in Wenatchee, Washington, a city near the center of the state that bills itself as the “Apple Capital of the World.” She mainly treats patients with diabetes and estimates that 30% to 40% of her patients are on Medicare.

“Medicare recipients are a vulnerable group,” Nayak told Stateline. “The WISeR program puts more barriers up for them accessing care. And they may have to now deal with prior authorization when they never had to deal with it before.”

Nayak and other physicians worry the same problems with prior authorizations that they’ve seen with their Medicare Advantage patients will plague traditional Medicare patients. Nayak has employees on staff whose only role is to handle prior authorizations.

More than a quarter of physicians nationwide say prior authorization issues led to a serious problem for a patient in their care, including hospitalization or permanent damage, according to the most recent report from the American Medical Association.

And some patients are unfairly denied treatment. Private insurers have denied care for people with Medicare Advantage plans even though their prior authorization requests met Medicare’s requirements, according to an investigation from the U.S. Department of Health and Human Services published in 2022. Investigators found 13% of prior authorization denials were for requests that should have been granted.

But supporters of the new model say something must be done to reduce costs. Medicare is the largest single purchaser of health care in the nation, with spending expected to double in the next decade, according to the Medicare Payment Advisory Commission, an independent federal agency. Medicare spent as much as $5.8 billion in 2022 on services with little or no benefit to patients.

Congress pushes back

In November, congressional representatives from Ohio, Washington and other states introduced a bill to repeal the WISeR model. It’s currently in committee.

“The [Trump] administration has publicly admitted prior authorization is harmful, yet it is moving forward with this misguided effort that would make seniors navigate more red tape to get the care they’re entitled to,” U.S. Rep. Suzan DelBene, a Washington Democrat and a co-sponsor of the bill, said in a November statement.

Physician and hospital groups in many of the affected states have backed the bill, which would halt the program at least temporarily. Shepard, whose medical association supports the bill, said that would give CMS time to get more stakeholder input and give physicians more time to prepare for extra administrative requirements.

“Conventional wisdom would dictate a program of this magnitude that has elicited so much concern from so many corners would at least be delayed while we work through some things,” Shepard said, “but there’s no indication that they’re going to back off this.”

Adding more prior authorization requirements for a new subset of Medicare patients will tack on extra administrative burdens for physicians, especially those in orthopedics, urology and neurology, fields that have a higher share of services that fall under the new rules.

That increased administrative burden “will probably lead to a lot longer wait times for patients,” Nayak said. “It will be important for patients to realize that they may see more barriers in the form of denials, but they should continue to advocate for themselves.”

Dr. Jayesh Shah, president of the Texas Medical Association and a San Antonio-based wound care physician, said WISeR is a well-intentioned program, but that prior authorization hurts patients and physicians.

“Prior authorization delays care and sometimes also denies care to patients who need it, and it increases the hassle factor for all physicians,” he told Stateline.

Shah added that, on the flip side, he’s heard from a few physicians who welcome prior authorization. They’d rather get preapproval for a procedure than perform it and later have Medicare deny reimbursement if the procedure didn’t meet requirements, he said.

Prior authorization has been a bipartisan concern in Congress and statehouses around the country.

Last year, 10 states — Colorado, Illinois, Maine, Maryland, Minnesota, Mississippi, Oklahoma, Vermont, Virginia and Wyoming — passed laws regulating prior authorization, according to the American Medical Association. Legislatures in at least 18 states have addressed prior authorization so far this year, an analysis from health policy publication Health Affairs Forefront found. Bipartisan groups of lawmakers in more than a dozen states have passed laws regulating the use of AI in health care.

But the new effort in the U.S. House to repeal the WISeR program is sponsored by Democrats. Supporters worry it’s unlikely to gain much traction in the Republican-controlled Congress.

Prior authorization delays care and sometimes also denies care to patients who need it, and it increases the hassle factor for all physicians.

– Dr. Jayesh Shah, president of the Texas Medical Association

Shepard said his organization has talked with state and congressional representatives, met with the regional CMS office twice, and sent a letter to CMS Director Dr. Mehmet Oz.

“We’ve looked at all the levers and we’ve pulled most of them,” Shepard said. “We’re running out of levers to pull.”

Venture capital jumps in

CMS announced in November it has selected six private tech companies to pilot the AI programs.

Some of them are backed by venture capital funds that count larger insurance companies among their key investors.

For example, Oklahoma’s pilot will be run by Humata Health Inc., which is backed by investors that include Blue Venture Fund, the venture capital arm of Blue Cross Blue Shield companies, and Optum Ventures, a venture capital firm connected to UnitedHealth Group, the parent company of UnitedHealthcare. Innovaccer Inc., chosen to run Ohio’s program, counts health care giant Kaiser Permanente as an investor.

Nayak said she knows little about Virtix Health, the Arizona-based private company contracted by the feds to run Washington state’s pilot program.

“Virtix Health would have a financial incentive to deny claims,” Nayak said. “It begs the question, would there be any safeguards to prevent profit-driven denials of care?”

That financial incentive is a concern in Texas too.

“If, financially, the vendor is going to benefit by the denial, it could be a problem for our patients,” Shah said. He said that Oz, in a speech at a recent meeting of the American Medical Association, assured physicians that their satisfaction and turnaround times would be metrics that Medicare would factor into the tech companies’ payments.

Editor’s note: This story has been updated to correct a reference to Medicare Advantage and to CMS Director Dr. Mehmet Oz’s speech to the American Medical Association.

Stateline reporter Anna Claire Vollers can be reached at avollers@stateline.org.

This story was originally produced by Stateline, which is part of States Newsroom, a nonprofit news network which includes Wisconsin Examiner, and is supported by grants and a coalition of donors as a 501c(3) public charity.

AI vs. AI: Patients deploy bots to battle health insurers that deny care

As states continue to curb health insurers’ use of artificial intelligence, patients and doctors are arming themselves with AI tools to fight claims denials, prior authorizations and soaring medical bills. (Photo by Anna Claire Vollers/Stateline)

As states strive to curb health insurers’ use of artificial intelligence, patients and doctors are arming themselves with AI tools to fight claims denials, prior authorizations and soaring medical bills.

Several businesses and nonprofits have launched AI-powered tools to help patients get their insurance claims paid and navigate byzantine medical bills, creating a robotic tug-of-war over who gets care and who foots the bill for it.

Sheer Health, a three-year-old company that helps patients and providers navigate health insurance and billing, now has an app that allows consumers to connect their health insurance account, upload medical bills and claims, and ask questions about deductibles, copays and covered benefits.

“You would think there would be some sort of technology that could explain in real English why I’m getting a bill for $1,500,” said cofounder Jeff Witten. The program uses both AI and humans to provide the answers for free, he said. Patients who want extra support in challenging a denied claim or dealing with out-of-network reimbursements can pay Sheer Health to handle those for them.

In North Carolina, the nonprofit Counterforce Health designed an AI assistant to help patients appeal their denied health insurance claims and fight large medical bills. The free service uses AI models to analyze a patient’s denial letter, then look through the patient’s policy and outside medical research to draft a customized appeal letter.

Other consumer-focused services use AI to catch billing errors or parse medical jargon. Some patients are even turning to AI chatbots like Grok for help.

A quarter of adults under age 30 said they used an AI chatbot at least once a month for health information or advice, according to a poll the health care research nonprofit KFF published in August 2024. But most adults said they were not confident that the health information is accurate.

State legislators on both sides of the aisle, meanwhile, are scrambling to keep pace, passing new regulations that govern how insurers, physicians and others use AI in health care. Already this year, more than a dozen states have passed laws regulating AI in health care, according to Manatt, a consulting firm.

“It doesn’t feel like a satisfying outcome to just have two robots argue back and forth over whether a patient should access a particular type of care,” said Carmel Shachar, assistant clinical professor of law and the faculty director of the Health Law and Policy Clinic at Harvard Law School.

“We don’t want to get on an AI-enabled treadmill that just speeds up.”

A black box

Health care can feel like a black box. If your doctor says you need surgery, for example, the cost depends on a dizzying number of factors, including your health insurance provider, your specific health plan, its copayment requirements, your deductible, where you live, the facility where the surgery will be performed, whether that facility and your doctor are in-network and your specific diagnosis.

Some insurers may require prior authorization before a surgery is approved. That can entail extensive medical documentation. After a surgery, the resulting bill can be difficult to parse.

Witten, of Sheer Health, said his company has seen thousands of instances of patients whose doctors recommend a certain procedure, like surgery, and then a few days before the surgery the patient learns insurance didn’t approve it.

You would think there would be some sort of technology that could explain in real English why I’m getting a bill for $1,500.

– Sheer Health co-founder Jeff Witten

In recent years, as more health insurance companies have turned to AI to automate claims processing and prior authorizations, the share of denied claims has risen. This year, 41% of physicians and other providers said their claims are denied more than 10% of the time, up from 30% three years ago, according to a September report from credit reporting company Experian.

Insurers on Affordable Care Act marketplaces denied nearly 1 in 5 in-network claims in 2023, up from 17% in 2021, and more than a third of out-of-network claims, according to the most recently available data from KFF.

Insurance giant UnitedHealth Group has come under fire in the media and from federal lawmakers for using algorithms to systematically deny care to seniors, while Humana and other insurers face lawsuits and regulatory investigations that allege they’ve used sophisticated algorithms to block or deny coverage for medical procedures.

Insurers say AI tools can improve efficiency and reduce costs by automating tasks that can involve analyzing vast amounts of data. And companies say they’re monitoring their AI to identify potential problems. A UnitedHealth representative pointed Stateline to the company’s AI Review Board, a team of clinicians, scientists and other experts that reviews its AI models for accuracy and fairness.

“Health plans are committed to responsibly using artificial intelligence to create a more seamless, real-time customer experience and to make claims management faster and more effective for patients and providers,” a spokesperson for America’s Health Insurance Plans, the national trade group representing health insurers, told Stateline.

But states are stepping up oversight.

Arizona, Maryland, Nebraska and Texas, for example, have banned insurance companies from using AI as the sole decisionmaker in prior authorization or medical necessity denials.

Dr. Arvind Venkat is an emergency room physician in the Pittsburgh area. He’s also a Democratic Pennsylvania state representative and the lead sponsor of a bipartisan bill to regulate the use of AI in health care.

He’s seen new technologies reshape health care during his 25 years in medicine, but AI feels wholly different, he said. It’s an “active player” in people’s care in a way that other technologies haven’t been.

“If we’re able to harness this technology to improve the delivery and efficiency of clinical care, that is a huge win,” said Venkat. But he’s worried about AI use without guardrails.

His legislation would force insurers and health care providers in Pennsylvania to be more transparent about how they use AI; require a human to make the final decision any time AI is used; and mandate that they show evidence of minimizing bias in their use of AI.

“In health care, where it’s so personal and the stakes are so high, we need to make sure we’re mandating in every patient’s case that we’re applying artificial intelligence in a way that looks at the individual patient,” Venkat said.

Patient supervision

Historically, consumers rarely challenge denied claims: A KFF analysis found fewer than 1% of health coverage denials are appealed. And even when they are, patients lose more than half of those appeals.

New consumer-focused AI tools could shift that dynamic by making appeals easier to file and the process easier to understand. But there are limits; without human oversight, experts say, the AI is vulnerable to mistakes.

“It can be difficult for a layperson to understand when AI is doing good work and when it is hallucinating or giving something that isn’t quite accurate,” said Shachar, of Harvard Law School.

For example, an AI tool might draft an appeals letter that a patient thinks looks impressive. But because most patients aren’t medical experts, they may not recognize if the AI misstates medical information, derailing an appeal, she said.

“The challenge is, if the patient is the one driving the process, are they going to be able to properly supervise the AI?” she said.

Earlier this year, Mathew Evins learned just 48 hours before his scheduled back surgery that his insurer wouldn’t cover it. Evins, a 68-year-old public relations executive who lives in Florida, worked with his physician to appeal, but got nowhere. He used an AI chatbot to draft a letter to his insurer, but that failed, too.

On his son’s recommendation, Evins turned to Sheer Health. He said Sheer identified a coding error in his medical records and handled communications with his insurer. The surgery was approved about three weeks later.

“It’s unfortunate that the public health system is so broken that it needs a third party to intervene on the patient’s behalf,” Evins told Stateline. But he’s grateful the technology made it possible to get life-changing surgery.

“AI in and of itself isn’t an answer,” he said. “AI, when used by a professional that understands the issues and ramifications of a particular problem, that’s a different story. Then you’ve got an effective tool.”

Most experts and lawmakers agree a human is needed to keep the robots in check.

AI has made it possible for insurance companies to rapidly assess cases and make decisions about whether to authorize surgeries or cover certain medical care. But that ability to make lightning-fast determinations should be tempered with a human, Venkat said.

“It’s why we need government regulation and why we need to make sure we mandate an individualized assessment with a human decisionmaker.”

Witten said there are situations in which AI works well, such as when it sifts through an insurance policy — which is essentially a contract between the company and the consumer — and connects the dots between the policy’s coverage and a corresponding insurance claim.

But, he said, “there are complicated cases out there AI just can’t resolve.” That’s when a human is needed to review.

“I think there’s a huge opportunity for AI to improve the patient experience and overall provider experience,” Witten said. “Where I worry is when you have insurance companies or other players using AI to completely replace customer support and human interaction.”

Furthermore, a growing body of research has found AI can reinforce bias that’s found elsewhere in medicine, discriminating against women, ethnic and racial minorities, and those with public insurance.

“The conclusions from artificial intelligence can reinforce discriminatory patterns and violate privacy in ways that we have already legislated against,” Venkat said.

