
Data center tax breaks are on the chopping block in some states

Data centers operate in Oregon in 2024. Some states are scaling back their data center incentives as the facilities contribute to increasing electric bills and raise environmental concerns. (Photo by Rian Dundon/Oregon Capital Chronicle)

After years of states pushing legislation to accelerate the development of data centers and the electric grid to support them, some legislators want to limit or repeal state and local incentives that paved their way.

President Donald Trump also has changed his tone. Last year he issued an executive order and other federal initiatives meant to support accelerated data center development. Then last month, citing rising electricity bills, he said in a post on Truth Social that technology companies that build data centers must “pay their own way.”

As the momentum shifts, lawmakers in several states have introduced or passed legislation that aims to rein in data center development by repealing tax exemptions, adding conditions to certain incentives or placing moratoriums on data center projects. Virginia lawmakers, for example, are considering ending a data center tax break that costs the state about $1.6 billion a year.

“Who is actually benefiting from these massive data centers that, in many cases, are the size of one or two shopping malls combined?” asked Michigan Democratic state Rep. Erin Byrnes, who introduced a proposal to repeal the state’s data center tax exemptions. “They have a large footprint in terms of land and energy usage. And by and large, it’s not going to be the average resident who lives near a data center who’s going to benefit.”

Over the past few years, more data centers have been built in an effort to meet the demand for digital processing power, which has rapidly increased as more artificial intelligence systems come online. Data centers house thousands of servers that are responsible for storing and transmitting data required for internet services to work.

But as local communities voice growing outrage over rising electricity prices and environmental concerns brought by data centers, such as water and energy use, lawmakers in several states are hoping to slow data center development. By limiting incentives or placing moratoriums on new projects, state legislators are hoping to give themselves more time to determine whether the massive facilities are worth losing millions or more in tax revenue each year.

Some experts also say that developers and tech companies have exaggerated some of the benefits they bring to local communities. While the promise of new jobs sounds attractive, local leaders may face other concerns, such as the effects of diverting construction resources away from other purposes and higher energy costs caused by AI, said Michael Hicks, an economics professor at Ball State University in Indiana.

“A lot of households — and the people that are elected by households — and local governments are becoming more unnerved by the public pushback to data centers,” Hicks said.

Tech developers and data center operators are concerned, however, that the changes could hurt the rapidly growing industry. And most states and localities already require developers using incentives to follow certain requirements, said Dan Diorio, the vice president of state policy for the Data Center Coalition, a lobbying group for the data center industry.

State lawmakers have to consider how changes to incentive programs could upend years of construction, which has long-term business impacts, Diorio said.

“I think data centers are very much the backbone of the 21st-century economy,” he said. “We’re generating economic activity in states, contributing to state-level GDP, contributing significantly to labor income and state and local tax revenue, and creating significant amounts of jobs. I mean, we’re just jumping into something preemptively here.”

Incentives granted

At least 37 states offer incentives that are available to data centers, including sales tax exemptions and property tax abatements, according to the National Conference of State Legislatures. Sales tax exemptions, the most common incentive, allow data center developers to buy computers and other equipment at a much lower cost.

“I think these are one of many factors that the data centers are looking at, along with the cost of electricity, the cost of construction, land and things like that,” said Nicholas Miller, a policy associate at NCSL. “These incentives are one way that states are trying to pitch themselves as competitive to this industry.”

These aren’t the days of being able to build a data center, cut deals with NDAs, then start turning dirt before the constituents even know what’s happened.

– Oklahoma House Speaker Kyle Hilbert, a Republican

In 2020, Maryland implemented a program that exempts data centers from sales and use taxes if they provide at least five jobs within three years of applying to the program and invest at least $2 million in data center personal property. The first four years of the program cost the state $22 million — but $11 million of that came in 2024 alone, as the costs grew, Democratic state Del. Julie Palakovich Carr said.

Concerned about this and the impact of data centers on residents’ electricity bills, Palakovich Carr introduced legislation this year that would repeal the state’s sales and use tax exemptions for personal property used at data centers. The measure, which is under consideration in the House, would also bar localities in the state from eliminating or reducing assessments for personal property used in data centers, a provision that drew opposition from the Maryland Association of Counties.

The amount of money states are forfeiting to provide tax breaks for data centers is increasingly concerning, Palakovich Carr said.

“Unfortunately, that’s the turn we’re seeing across many other states,” she said. “The price starts out maybe in line with what we think it’s going to be. But over time it just costs more and more.”

Similar bills that would repeal or halt state incentives for data centers have been filed in Arizona and Georgia.

“When we look at potential subsidies for businesses, I’m really looking at it from a frame of incentivizing new behavior rather than just giving away money for things that the companies were going to already do anyways,” Palakovich Carr said. “I think it’s really important that once these things get put in place, we look at the data and see what’s happening on the ground.”

In 2024, Michigan enacted sales and use tax exemptions for certain data centers through at least 2050.

Now, with developers looking at more than a dozen sites for potential data centers, public sentiment has soured, said Byrnes, who had voted against the measure. Communities across the state began organizing in an effort to stop data centers from coming to their neighborhoods because of environmental concerns and energy costs, she said.

The outcry prompted Byrnes to co-sponsor a bipartisan package of three bills that would repeal the 2024 law.

“We’re taking a stand with this legislation to say that we don’t believe data centers should be offered these exemptions,” she said. “I believe it aligns with public sentiment.”

Lawmakers in a handful of states — including New York, Oklahoma and Vermont — have filed bills that would place a temporary moratorium on all data center projects and require studies of their impacts.

Georgia Democratic state Rep. Ruwa Romman introduced a measure this session that would put a moratorium on new data center projects until March 2027. The proposal would give the legislature time to study the impact of data centers on the state’s natural resources, environment and other areas.

“We have such a beautiful state and it would be a damn shame to completely and utterly wreck it and its landscape for short-term gain,” Romman said. “These data centers aren’t bringing jobs. They’re saying they’re bringing the revenue, but there’s a ton of fine print on the revenue that’s coming in. So, I’ve been urging my colleagues from every side of the political spectrum to just take a beat.”

In 2021, the Oklahoma legislature approved a measure from current Republican House Speaker Kyle Hilbert that excludes new data centers from qualifying for an exemption program that allows certain manufacturers not to pay property taxes for their first five years in business. Any data centers that qualified for the program in the five years prior to the law, however, can continue to apply for exemptions.

This year, as more project proposals were made, Hilbert introduced legislation to ensure no data centers could “slip through the cracks.”

“These aren’t the days of being able to build a data center, cut deals with NDAs, then start turning dirt before the constituents even know what’s happened,” Hilbert said. “Those days are over, and data centers need to be proactive in their messaging and talking to people about their concerns.”

Costs vs. benefits

Last year, Virginia, home to the most data centers in the country, gave up $1.6 billion in sales and use tax revenues from data centers, state data shows. That’s a 118% increase from the previous year, according to a report from Good Jobs First, a watchdog group that focuses on economic development incentives. Another report from the group said Georgia is expected to lose at least $2.5 billion to data center sales tax exemptions this year, 664% higher than the state’s previous estimate.

Virginia state lawmakers are considering legislation that would require data centers to achieve high energy efficiency standards and decrease their use of diesel backup generators in order to be eligible for the state’s sales and use tax exemption. The measure, which passed the House, is now moving through the Senate.

Before the end of his term, former Virginia Gov. Glenn Youngkin, a Republican, suggested a provision in his proposed state budget that would extend the data center tax incentive from 2035 to 2050. The Senate’s budget bill, however, would end the incentive altogether on Jan. 1, 2027. It’s not clear if state leaders, including current Democratic Gov. Abigail Spanberger, support the measure.

While states can put a specific number on the tax losses, it’s much more difficult to determine how much data centers contribute to local communities and the state, Miller said.

Virginia brings in a significant amount of revenue from the property taxes for each facility. Local construction firms, restaurants and other small businesses also benefit from ongoing projects, he said.

“This is the big question,” Miller said. “With all economic development projects, it’s generally a lot easier to measure the cost of the incentive directly versus the benefits.”

The changing incentive landscape may cause instability within the data center industry, said Diorio, of the Data Center Coalition. Data center projects are large-scale capital investments that play out for several years, but changing policies could upend that progress.

“When states look at these policies or consider abrupt ends to programs, that creates significant market uncertainty,” Diorio said. “It will have a significant long-term impact on the viability of that market for data center development. Industries are very responsive to market signals, and any kind of uncertainty will bring up a red flag because you’re looking to invest for the long haul.”

Stateline reporter Madyson Fitzgerald can be reached at mfitzgerald@stateline.org.

This story was originally produced by Stateline, which is part of States Newsroom, a nonprofit news network that includes Wisconsin Examiner and is supported by grants and a coalition of donors as a 501(c)(3) public charity.

As AI-generated fake content mars legal cases, states want guardrails

CoCounsel Legal is an artificial intelligence tool that acts as a virtual assistant for legal professionals. More people in the legal field are using AI to automate repetitive tasks and save time, but hallucinations have led to fake cases and false information in legal documents. (Photo by Madyson Fitzgerald/Stateline)

Last spring, Illinois county judge Jeffrey Goffinet noticed something startling: A legal brief filed in his courtroom cited a case that did not exist.

Goffinet, an associate judge in Williamson County, looked through two legal research systems and then headed to the courthouse library — a place he hadn’t visited in years — to consult the book that purportedly listed the case. The case wasn’t in it.

The fake case, generated by artificial intelligence, came across Goffinet’s desk just a few months after the Illinois Supreme Court’s policy on the use of AI in the courts took effect. Goffinet co-chaired a task force that informed that policy, which allows the use of AI as long as it complies with existing legal and ethical standards.

“People are going to use [AI], and the courts are not going to be able to be a dam across a river that’s already flowing at flood capacity,” Goffinet said. “We have to learn how to coexist with it.”

As more false quotes, fake court cases and incorrect information appear in legal documents generated by AI, state bar associations, state court systems and national law organizations are issuing guidance on its use in the legal field. A handful of states are considering or enacting legislation to address the issue, and many courts and professional associations are focused on education for attorneys.

From divorce cases to discrimination lawsuits, AI-generated fake content can cause evidence to be dismissed and motions to be denied.

While some states urge attorneys to lean on existing guidance about accuracy and transparency, the new policies address AI concerns related to confidentiality, competency and costs. Most policies and opinions encourage attorneys to educate themselves and to use proprietary AI tools that prevent sensitive data from being entered into open source systems. Since AI tools could also increase efficiency, several policies advise attorneys to charge less if they spend less time on cases.

Some states, such as Ohio, also ban the use of artificial intelligence for certain legal tasks. In Ohio, courts are prohibited from using AI to translate legal forms, court orders and similar content that may affect the outcome of a case.

Several states have also advised legal professionals to adhere to the American Bar Association’s formal opinion of ethical AI use in law.

Artificial intelligence can help attorneys and law firms by automating administrative tasks, analyzing contracts and organizing documents. Generative AI can also be used to draft legal documents, including court briefs. Experts say the use of AI productivity tools can save legal professionals time and reduce the risk of human error in everyday tasks.

But law professionals nationwide have faced fines and license suspensions, among other consequences, for submitting legal documents citing false quotes, cases or information.

Many legal professionals are likely not to notice instances in which an AI system is “hallucinating,” or confidently making statements that are not true, said Rabihah Butler, the manager of enterprise content for Risk, Fraud and Government at the Thomson Reuters Institute. The institute is a research subsidiary of the Thomson Reuters company, which sells an AI system meant to help lawyers.

AI has such confidence, and it can appear so polished, that if you’re not paying attention and doing your due diligence, the hallucination is being treated as a factual piece of information.

– Rabihah Butler, manager for enterprise content for Risk, Fraud and Government at the Thomson Reuters Institute

Courts and law organizations will need to consider education, sanctions and punitive actions to ensure law professionals are using AI appropriately, Butler said.

“AI has such confidence, and it can appear so polished, that if you’re not paying attention and doing your due diligence, the hallucination is being treated as a factual piece of information,” she said.

Since the beginning of 2025, there have been 518 documented cases in which generative AI produced hallucinated content used in U.S. courts, according to a database maintained by Damien Charlotin, a senior research fellow at the HEC Paris business school.

“So far, if we’re looking at the institutional response, there’s not a lot because people are not very sure how to handle this kind of issue,” Charlotin said. “Everyone is aware that some lawyers are using artificial intelligence in their day-to-day work. Most people are aware that the technology is not very mature. But it’s still hard to prevent a mistake.”

State guidance

As of Jan. 23, state bar associations or similar entities have issued formal guidance on the use of AI in at least 10 states and the District of Columbia, typically in the form of an ethics opinion. Those aren’t enforceable as law, but spell out proper conduct.

In February, for example, the Professional Ethics Committee for the State Bar of Texas issued an ethics opinion that outlines issues that may arise from law professionals using AI. Texas lawyers should have a basic understanding of generative AI tools and guardrails to protect client confidentiality, it said. They should also verify any content generated by AI and refrain from charging clients for the time saved by using AI tools.

Legal professionals must be aware of their own competency with AI tools, said Brad Johnson, the executive director of the Texas Center for Legal Ethics.

“A really important takeaway from the opinion is that if a lawyer is considering using a generative AI tool in the practice of law, the lawyer has to have a reasonable and current understanding of the technology because only then can a lawyer really evaluate the risks that are associated with it,” he said.

Court systems in at least 11 states — Arizona, Arkansas, California, Connecticut, Delaware, Illinois, New York, Ohio, South Carolina, Vermont and Virginia — have established policies or issued rules of conduct regarding AI use by law professionals.

Illinois, for instance, allows lawyers to use artificial intelligence and does not require disclosure. The policy also emphasizes that judges will ultimately be responsible for their decisions, regardless of “technological advancements.”

“The task force wanted to emphasize that as judges, what we bring to the table is our humanity,” said Goffinet, the associate judge. “And we cannot abdicate our humanity in favor of an AI-generated decision or opinion.”

Some state lawmakers have tried to address the issue through legislation. Last year, Louisiana Republican Gov. Jeff Landry signed a measure that requires attorneys to use “reasonable diligence” to verify the authenticity of evidence, including content generated by artificial intelligence. The law also allows parties in civil cases to raise concerns about the admissibility of evidence if they suspect it was generated or altered by artificial intelligence.

California Democratic state Sen. Tom Umberg also introduced legislation last year that would require attorneys to ensure confidential information is not entered into a public generative AI system. The measure, which was approved by the Senate Judiciary Committee last week, also would require attorneys to ensure that reasonable steps are taken to verify the accuracy of generative AI material.

Attorney education

It’s also important for state bar associations and law schools to provide education on artificial intelligence, said Michael Hensley, a counsel at FBT Gibbons and an advocate for the safe use of AI in California courts. AI has the ability to reduce research time just like online legal research systems, but it requires training, he said.

“I would hope the state bar would have training for this,” Hensley said. “And I think it’s absolutely imperative that law schools have a session on AI.”

In a Bloomberg Law survey conducted last spring, 51% of the more than 750 respondents said their law firms purchased or invested in generative artificial intelligence tools. Another 21% said they planned to purchase AI tools within the next year. Attorneys reported using generative AI for general legal research, drafting communications, summarizing legal narratives, reviewing legal documents and other work.

Of the law firms that were not using generative AI, attorneys cited incorrect or unreliable output, ethical issues, security risks and data privacy as the top reasons.

While attorneys and law firms have become more comfortable with AI tools, courts have been more apprehensive, said Diane Robinson, a principal court research associate at the National Center for State Courts. Robinson is also project director at the Thomson Reuters Institute/NCSC AI Policy Consortium for Law and Courts, an association of legal practitioners and researchers developing guidance and resources for the use of AI in courts.

AI has the potential to improve case processing and can allow people needing legal advice to find information by using AI chatbots, she said. But, she added, courts are still struggling with evidence altered by AI and briefs littered with hallucinations.

“Fake evidence is nothing new,” Robinson said. “People have been altering photographs as long as there were photographs. But with AI, the ability to create videos, audio and pictures has become very easy, and courts are really struggling with it.”

Charlotin, of HEC Paris, said most courts and professional associations will continue to focus on education right now.

“You cannot prevent a mistake just by telling people, ‘Don’t make a mistake,’” Charlotin said. “That doesn’t work. It’s more about setting up processes to make people aware of it, then they can set up processes to work on dealing with it.”


States will keep pushing AI laws despite Trump’s efforts to stop them

A billboard advertises an artificial intelligence company in San Francisco in September. California is among the states leading the way on AI regulations, but an executive order signed by President Donald Trump seeks to override state laws on the technology. (Photo by Justin Sullivan/Getty Images)

State lawmakers of both parties said they plan to keep passing laws regulating artificial intelligence despite President Donald Trump’s efforts to stop them.

Trump signed an executive order Thursday evening that aims to override state artificial intelligence laws. He said his administration must work with Congress to develop a national AI policy, but that in the meantime, it will crack down on state laws.

The order comes after several other Trump administration efforts to rein in state AI laws and loosen restrictions for developers and technology companies.

But despite those moves, state lawmakers are continuing to prefile legislation related to artificial intelligence in preparation for their 2026 legislative sessions. Opponents are also skeptical about — and likely to sue over — Trump’s proposed national framework and his ability to restrict states from passing legislation.

“I agree on not overregulating, but I don’t believe the federal government has the right to take away my right to protect my constituents if there’s an issue with AI,” said South Carolina Republican state Rep. Brandon Guffey, who penned a letter to Congress opposing legislation that would curtail state AI laws.

The letter, signed by 280 state lawmakers from across the country, shows that state legislators from both parties want to retain their ability to craft their own AI legislation, said South Dakota Democratic state Sen. Liz Larson, who co-wrote the letter.

Earlier this year, South Dakota Republican Gov. Larry Rhoden signed the state’s first artificial intelligence law, authored by Larson, prohibiting the use of a deepfake — a digitally altered photo or video that can make someone appear to be doing just about anything — to influence an election.

South Dakota and other states with more comprehensive AI laws, such as California and Colorado, would see their efforts overruled by Trump’s order, Larson said.

“To take away all of this work in a heartbeat and then prevent states from learning those lessons, without providing any alternative framework at the federal level, is just irresponsible,” she said. “It takes power away from the states.”

Trump’s efforts

Thursday’s executive order will establish an AI Litigation Task Force to bring court challenges against states with AI-related laws, with exceptions for a few issues such as child safety protections and data center infrastructure.

The order also directs the secretary of commerce to notify states that they could lose certain funds under the Broadband Equity, Access, and Deployment Program if their laws conflict with national AI policy priorities.

Trump said the order would help the United States beat China in dominating the burgeoning AI industry, adding that Chinese President Xi Jinping did not have similar restraints.

“This will not be successful unless they have one source of approval or disapproval,” he said. “It’s got to be one source. They can’t go to 50 different sources.”

In July, the Trump administration released the AI Action Plan, an initiative aimed at reducing regulatory barriers and accelerating the growth of AI infrastructure, including data centers. Trump also has revoked Biden-era AI safety and anti-discrimination policies.

The tech industry had lobbied for Trump’s order.

“This executive order is an important step towards ensuring that smart, unified federal policy — not bureaucratic red tape — secures America’s AI dominance for generations to come,” said Amy Bos, vice president of government affairs for NetChoice, a technology trade association, in a statement to Stateline.

As the administration looks to address increasing threats to national defense and cybersecurity, a centralized, national approach to AI policy is best, said Paul Lekas, the executive vice president for global public policy and government affairs at the Software & Information Industry Association.

“The White House is very motivated to ensure that there aren’t barriers to innovation and that we can continue to move forward,” he said. “And the White House is concerned that there is state legislation that may be purporting to regulate interstate commerce. We would be creating a patchwork that would be very hard for innovation.”

Congressional Republicans tried twice this year to pass moratoriums on state AI laws, but both efforts failed.

In the absence of a comprehensive federal artificial intelligence policy, state lawmakers have worked to regulate the rapid development of AI systems and protect consumers from potential harms.

Trump’s executive order could cause concern among lawmakers who fear possible blowback from the administration for their efforts, said Travis Hall, the director for state engagement at the Center for Democracy & Technology, a nonprofit that advocates for digital rights and freedom of expression.

“I can’t imagine that state legislators aren’t going to continue to try to engage with these technologies in order to help protect and respond to the concerns of their constituents,” Hall said. “However, there’s no doubt that the intent of this executive order is to chill any actual oversight, accountability or regulation.”

State rules

This year, 38 states adopted or enacted measures related to artificial intelligence, according to a National Conference of State Legislatures database. Numerous state lawmakers have also prefiled legislation for 2026.

But tensions have grown over the past few months as Trump has pushed for deregulation and states have continued to create guardrails.

It doesn’t hold any water and it doesn’t have any teeth because the president doesn’t have the authority to supersede state law.

– Colorado Democratic state Rep. Brianna Titone

In 2024, Colorado Democratic Gov. Jared Polis signed the nation’s first comprehensive artificial intelligence framework into law. Under the law, developers of AI systems will be required to protect consumers from potential algorithmic discrimination.

But implementation of the law was postponed a few months, until June 2026, after negotiations stalled during a special legislative session this summer aimed at ensuring the law did not hinder technological innovation. And a spokesperson for Polis told Bloomberg in May that the governor supported a U.S. House GOP proposal that would impose a moratorium on state AI laws.

Trump’s executive order, which mentions the Colorado law as an example of legislation the administration may challenge, has caused uncertainty among some state lawmakers focused on regulating AI. But Colorado state Rep. Brianna Titone and state Sen. Robert Rodriguez, Democratic sponsors of the law, said they will continue their work.

Unless Congress passes legislation to restrict states from passing AI laws, Trump’s executive order can easily be challenged and overturned in court, Titone said.

“This is just a bunch of hot air,” Titone said. “It doesn’t hold any water and it doesn’t have any teeth because the president doesn’t have the authority to supersede state law. We will continue to do what we need to do for the people in our state, just like we always have, unless there is an actual preemption in federal law.”

California and Illinois also have been at the forefront of artificial intelligence legislation over the past few years. In September, California Democratic Gov. Gavin Newsom signed the nation’s first law establishing a comprehensive legal framework for developers of the most advanced, large-scale artificial intelligence models, known as frontier artificial intelligence models. Those efforts are aimed at preventing AI models from causing catastrophic harm involving dozens of casualties or billion-dollar damages.

California officials have said they are considering a legal challenge over Trump’s order, and other states and groups are likely to sue as well.

Republican officials and GOP-led states, including some Trump allies, also are pushing forward with AI regulations. Efforts to protect consumers from AI harms are being proposed in Missouri, Ohio, Oklahoma, South Carolina, Texas and Utah.

Earlier this month, Florida Republican Gov. Ron DeSantis also unveiled a proposal for an AI Bill of Rights. The proposal aims to strengthen consumer protections related to AI and to address the growing impact data centers are having on local communities.

In South Carolina, Guffey said he plans to introduce a bill in January that would place rules on AI chatbots. Chatbots that use artificial intelligence are able to simulate conversations with users, but raise privacy and safety concerns.

Artificial intelligence is developing fast, Guffey noted. State lawmakers have been working on making sure the technology is safe to use — and they’ll keep doing that to protect their constituents, he said.

“The problem is that it’s not treated like a product — it’s treated like a service,” Guffey said. “If it was treated like a product, we have consumer protection laws where things could be recalled and adjusted and then put back out there once they’re safe. But that is not the case with any of this technology.”

