With electricity bills rising, some states consider new data center laws

An Amazon Web Services data center is shown situated near single-family homes in Stone Ridge, Va., in 2024. As Americans grow increasingly frustrated over their electricity bills, states are trying to keep the nation’s growing number of data centers from causing higher energy costs for consumers. (Photo by Nathan Howard/Getty Images)

As Americans grow increasingly frustrated over their electricity bills, states are trying to keep the nation’s growing number of data centers from causing higher energy costs for consumers.

For years, many states competed aggressively to land data centers, sprawling campuses full of the computer servers that store and transmit the data behind apps and websites. But many officials are now scrutinizing how those power-hungry projects might affect the electric bills of households, small businesses and other industries.

Oregon last year became one of the first states to enact a law requiring utilities to charge data centers different electric prices than other industries because of how they drive up the cost of energy production and transmission.

“We are now making data centers pay a higher rate commensurate with the amount of energy they’re sucking out of the system,” said Oregon state Rep. Tom Andersen, a Democrat.

Republican and Democratic leaders in at least a dozen states have targeted data centers with separate, higher electric rates to protect other customers. States also are requiring long-term commitments and financial guarantees through collateral before greenlighting infrastructure investments for new data center projects. But lawmakers acknowledge that numerous factors affect energy prices, so targeting data center-specific costs can be complicated.

An increasingly digital world and the rise of energy-intensive artificial intelligence have led to major expansion of data centers: Consultant McKinsey & Company expects companies to spend nearly $7 trillion worldwide on data centers by 2030. But the industry is facing growing scrutiny, from neighbors who don’t want to live near the massive server farms and from residents worried about how data centers will affect their own swelling utility bills.

Delaware legislation that would charge data centers higher rates advanced out of committee last week. On Tuesday, a Florida state Senate committee approved a bill that would create new rate structures for data centers.

In Oklahoma, a Republican state senator has proposed a moratorium on new data centers until late 2029, allowing the state to study how data centers affect utility rates, the environment and property values.

Separate legislation from state Rep. Brad Boles will seek to protect other ratepayers from the costs of data centers. Boles, the Republican chair of the state Energy and Natural Resources Oversight Committee, said his in-the-works measure would ensure data centers pay their fair share.

Boles told Stateline that his constituents are increasingly worried about data centers, with a dozen potential major ones proposed across the state.

“We’re trying to ensure that those data centers pay for their own infrastructure and we don’t shift that cost or burden to everyday Oklahomans,” he said.

In Oregon, Andersen’s legislation created a new rate structure for data centers with long-term contracts and required regulators to separate the costs of those facilities from other ratepayers.

But consumer advocates have already accused the state’s largest utility of trying to skirt the new law by making residential customers pay part of the long-term cost of supplying large data centers in a pending rate case.

Andersen, a member of the state House Committee on Climate, Energy and Environment, said the new rate structure is unlikely to immediately lower consumer bills. Rather, it aims to curb future increases as data centers require more power generation and transmission.

“We’re not going to change the rates that are being currently paid by the ratepayers and the users of the electricity,” he said. “It’s just going to stop future raises.”

The data center boom

Rising utility bills continue to outpace inflation, sparking anger from consumers and more scrutiny from state regulators, governors and lawmakers.

The boom of data centers is frequently cited as a prime reason for rising electricity prices, as their operation requires more power generation, transmission and distribution upgrades. A Bloomberg News analysis in September found that wholesale electricity cost as much as 267% more in a single month than it did five years ago in areas with significant data center activity.

Data center companies say they aren’t the only reason prices are rising.

“It’s inaccurate to draw a clear line between large load customers like data centers coming online and increases in prices. It’s just not that simple,” said Lucas Fykes, senior director of energy policy and regulatory counsel at the Data Center Coalition, a trade group representing data center owners and users, including Amazon, Meta and Visa.

He said many factors have contributed to higher electricity prices, including extreme weather events and the nation’s aging electric grid.

Fykes said his organization opposes rate structures that treat data centers differently from other large electric users such as industrial sites. The organization is working with regulators as states increasingly implement practices to ensure residents and small businesses aren’t on the hook for big energy investments if major projects, including data centers, don’t come to fruition.

Fykes said the country is likely just in the “beginning innings” of a longer ramp-up in technology and power needs.

“We are also in a global race to build out data centers, to support AI, to support cloud infrastructure,” he said. “It’s important to make sure that we maintain those assets here in the United States.”

That can pose competing interests for political leaders, including mayors, who have pushed hard to land investments from tech companies.

“We want to be leaders in AI, but we don’t want the infrastructure needed to support it,” said Rusty Paul, the mayor of Sandy Springs, Georgia, in the Atlanta metro area.

He was among several mayors addressing the issue of data centers at last month’s winter meeting of the United States Conference of Mayors in Washington, D.C. On a data center panel, Paul acknowledged the effect of Georgia’s tax incentives for data centers: “They’re just popping up everywhere,” he said.

But utilities and regulators are also making long overdue grid upgrades that aren’t tied to data centers, he said.

“The cost of electricity is going up for everybody — and it’s not all related to data centers,” he said.

A bipartisan push

The Georgia Public Service Commission last year created new rules that officials said would protect ratepayers from data center costs. In addition to covering costs of power consumed at their facilities, data centers would have to fund the costs incurred by upstream generation, transmission and distribution, the regulator said.

But lawmakers aren’t convinced those steps went far enough.

State Sen. Chuck Hufstetler, a Republican, is again pushing legislation that would solidify the regulator’s rules into law. His bill would prohibit utilities from passing along the fuel, generation or transmission costs of data centers to other customers.

He told Stateline that the regulator’s rules need to be codified into law so they can’t be weakened later.

Hufstetler said rising utility bills are among the biggest issues facing his constituents. High prices played a key role in November’s election, when Democrats flipped two seats on the state’s Public Service Commission — the first time Democrats had won a statewide constitutional office in nearly two decades.

“I saw people with MAGA hats going into the election polling places that were saying, ‘I’m not voting for those guys that raised my rates,’” Hufstetler said, referring to the Republican incumbents who lost.

Hufstetler said the bill, which passed out of committee last year, has already gained major bipartisan support in the Senate, where it is sponsored by multiple Republicans and Democrats.

“This is very bipartisan,” he said. “We have all heard from our people around the state of Georgia.”

The Georgia Public Service Commission agrees in principle with the legislation, said agency spokesperson Tom Krause. But he said the regulator worries about losing flexibility if its rules are written into law.

“Not just this bill, but whenever the legislature codifies a rule that we put in place, we get a little nervous because it can tie our hands in special circumstances,” he said.

A complex challenge

As part of implementing a law enacted last year, Maryland’s utility regulator is weighing a new rate structure for data centers and other large load users.

Proposed regulations would require certain preapproval analysis for heavy power users, a separate rate tariff for data centers and collateral to ensure other ratepayers don’t end up paying for major investments if projects do not come to fruition.

Maryland’s Office of People’s Counsel, an independent agency representing residential utility users, said the proposed changes meet statutory requirements but could do more to protect consumers.

In a news release last month, Maryland People’s Counsel David S. Lapp said residents are already facing higher costs from data centers outside the state.

“While we push for better federal rules to address those costs, Maryland has the power—and customers a clear need—to make sure data centers within Maryland take on every cost that they impose on residential customers,” Lapp said.

Democratic Gov. Wes Moore recently joined 12 other governors and the Trump administration in urging the regional grid operator, PJM Interconnection, to shield residents and businesses from the infrastructure costs imposed by data centers.

Maryland state Del. Lorig Charkoudian, a Democrat, said the grid operator has for years failed residents in the 13 states plus the District of Columbia that it serves. By delaying renewable energy projects, she said, PJM has kept older, more expensive power plants online, driving up prices as data centers increase demand.

PJM’s board last month rolled out a new data center plan that it said would improve demand forecasting, accelerate the addition of new generation projects and give states a larger role.

Charkoudian said states and utilities struggle to determine just how much power is needed. Data center users shop around for sites, which can produce wildly inaccurate forecasts of how much power a utility will actually need.

“It actually has a very concrete financial impact on ratepayers,” she told Stateline. “And so that’s why one of the things that really could make a difference for ratepayers is if we actually had an accurate count of how much we’re getting online.”

While some of those challenges lie outside the realm of state control, Charkoudian said there are things the state can do, including the new rate structure for larger users. She’s crafting a bill encouraging data centers to curtail their power usage during peak periods, such as hot days, when the electrical system is taxed by heavy usage of air conditioners, Maryland Matters reported.

Charkoudian said adding solar generation and storage is a low-cost way to respond quickly to demand. And states can avoid the need for more generation by doubling down on energy efficiency programs that lower both demand and consumer costs.

“The best time to fix this was five years ago,” she said. “The next best time is right this minute, because it’s only going to get worse.”

Stateline reporter Robbie Sequeira contributed to this story. Stateline reporter Kevin Hardy can be reached at khardy@stateline.org

This story was originally produced by Stateline, which is part of States Newsroom, a nonprofit news network that includes the Wisconsin Examiner and is supported by grants and a coalition of donors as a 501(c)(3) public charity.

Wisconsin lawmakers explore age verification requirements on companionship chatbots

As a proliferation of free and easy-to-use artificial intelligence tools transforms how people learn, work and socialize, Wisconsin lawmakers heard testimony Wednesday on a proposal that seeks to regulate kids’ use of human-esque chatbots.


Wisconsin debates how to pay for the power-hungry AI boom


How much should data centers pay for the massive amounts of new power infrastructure they require? Wisconsin’s largest utility, We Energies, has offered its answer to that question in what is the first major proposal before state regulators on the issue.

Under the proposal, currently open for public comment, data centers would pay most or all of the price to construct new power plants or renewables needed to serve them, and the utility says the benefits that other customers receive would outweigh any costs they shoulder for building and running this new generation.

But environmental and consumer advocates fear the utility’s plan will actually saddle customers with payments for generation, including polluting natural gas plants, that wouldn’t otherwise be needed.

States nationwide face similar dilemmas around data centers’ energy use. But who pays for the new power plants and transmission is an especially controversial question in Wisconsin and other ​“vertically integrated” energy markets, where utilities charge their customers for the investments they make in such infrastructure — with a profit, called ​“rate of return,” baked in. In states with competitive energy markets, like Illinois, by contrast, utilities buy power on the open market and don’t make a rate of return on building generation.
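
As a minimal sketch of how rate-of-return regulation turns a capital project into customer charges, consider the arithmetic below; every number is invented for illustration and comes from neither We Energies nor Wisconsin regulators.

```python
# Illustrative only: hypothetical numbers, not any utility's actual figures.
# In a vertically integrated market, customers repay the utility's capital
# spending plus an authorized profit ("rate of return") on the undepreciated
# balance, known as the rate base.

capital_cost = 1_000_000_000   # assumed cost of a new plant, $
life_years = 30                # assumed depreciation life
authorized_return = 0.10       # assumed rate of return on rate base

annual_depreciation = capital_cost / life_years
for year in (1, 2, 3):  # first three years as a sample
    rate_base = capital_cost - annual_depreciation * (year - 1)
    revenue_requirement = annual_depreciation + rate_base * authorized_return
    print(f"Year {year}: customers owe ${revenue_requirement:,.0f}")
```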

Although six big data center projects are underway in Wisconsin, the state has no laws governing how the computing facilities get their power.

Lawmakers in the Republican-controlled state Legislature are debating two bills this session. The Assembly passed the GOP-backed proposal on Jan. 20, which, even if it makes it through the Senate, is unlikely to get Democratic Gov. Tony Evers’ signature. According to the Milwaukee Journal Sentinel, a spokesperson for Evers said on Jan. 14 that ​“the one thing environmentalists, labor, utilities, and data center companies can all agree on right now is how bad Republican lawmakers’ data center bill is.” Until a measure is passed, individual decisions by the state Public Service Commission will determine how utilities supply energy to data centers.

The We Energies case is high stakes because two data centers proposed in the utility’s southeast Wisconsin territory promise to double its total demand. One of those facilities is a Microsoft complex that the tech giant says will be ​“the world’s most powerful AI datacenter.”

The utility’s proposal could also be precedent-setting as other Wisconsin utilities plan for data centers, said Bryan Rogers, environmental justice director for the Milwaukee community organization Walnut Way Conservation Corp.

“As goes We Energies,” Rogers said, ​“so goes the rest of the state.”

Building new power

We Energies’ proposal — first filed last spring — would let data centers choose between two options for paying for new generation infrastructure. Both options are meant to ensure the utility has enough capacity to meet grid operator requirements so that the added electricity demand doesn’t undermine reliability.

In both cases, the utility will acquire that capacity through ​“bespoke resources” built specifically for the data center. The computing facilities technically would not get their energy directly from these power plants or renewables but rather from We Energies at market prices.

Under the first option, called ​“full benefits,” data centers would pay the full price of constructing, maintaining and operating the new generation and would cover the profit guaranteed to We Energies. The data centers would also get revenue from the sale of the electricity on the market as well as from renewable energy credits for solar and wind arrays; renewable energy credits are basically certificates that can be sold to other entities looking to meet sustainability goals.

The second option, called ​“capacity only,” would have data centers paying 75% of the cost of building the generation. Other customers would pick up the tab for the remaining 25% of the construction and pay for fuel and other costs. In this case, both data centers and other customers would pay for the profit guaranteed to We Energies as part of the project, though the data centers would pay a different — and possibly lower — rate than other customers.
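
To make the split concrete, here is a hedged sketch of how the two options would allocate construction costs, using an invented dollar figure and omitting the maintenance, fuel and guaranteed-profit components described above:

```python
# Hypothetical sketch of the two cost-allocation options described above.
# The dollar amount is invented, and O&M, fuel, and the utility's guaranteed
# profit are omitted; the real terms are set out in We Energies' filing.

construction_cost = 500_000_000  # assumed cost of one "bespoke" plant, $

def full_benefits(cost):
    # Data center pays 100% of construction and keeps market and
    # renewable-energy-credit revenue.
    return {"data_center": cost, "other_customers": 0.0}

def capacity_only(cost):
    # Data center pays 75% of construction; other customers pick up the
    # remaining 25%, plus fuel and other operating costs.
    return {"data_center": 0.75 * cost, "other_customers": 0.25 * cost}

for name, split in (("full benefits", full_benefits(construction_cost)),
                    ("capacity only", capacity_only(construction_cost))):
    print(name, {k: f"${v:,.0f}" for k, v in split.items()})
```

Under these invented numbers, the capacity-only option would leave other customers funding $125 million of construction for a plant they did not request, which is the crux of the advocates’ objection.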

Developers of both data centers being built in We Energies’ territory support the utility’s proposal, saying in testimony that it will help them get online faster and sufficiently protect other customers from unfair costs.

Consumer and environmental advocacy groups, however, are pushing back on the capacity-only option, arguing that it is unfair to make regular customers pay a quarter of the price for building new generation that might not have been necessary without data centers in the picture.

“Nobody asked for this,” said Rogers of Walnut Way. The Sierra Club told regulators to scrap the capacity-only option. The advocacy group Clean Wisconsin similarly opposes that option, as noted in testimony to regulators.

But We Energies says everyone will benefit from building more power sources.

“These capacity-only plants will serve all of our customers, especially on the hottest and coldest days of the year,” We Energies spokesperson Brendan Conway wrote in an email. ​“We expect that customers will receive benefits from these plants that exceed the costs that are proposed to be allocated to them.”

We Energies has offered no proof of this promise, according to testimony filed by the Wisconsin Industrial Energy Group, which represents factories and other large operations. The trade association’s energy adviser, Jeffry Pollock, told regulators that the utility’s own modeling of the capacity-only approach showed scenarios in which the costs borne by customers outweigh the benefits to them.

Clean energy is another sticking point. Clean Wisconsin and the Environmental Law and Policy Center want the utility’s plan to more explicitly encourage data centers to meet capacity requirements in part through their own on-site renewables and to participate in demand-response programs. Customers enrolled in such programs agree to dial down energy use during moments of peak demand, reducing the need for as many new power plants.

“It’s really important to make sure that this tariff contemplates as much clean energy and avoids using as much energy as possible, so we can avoid that incremental fossil fuel build-out that would otherwise potentially be needed to meet this demand,” said Clean Wisconsin staff attorney Brett Korte.

And advocates want the utility to include smaller data centers in its proposal, which in its current form would apply only to data centers requiring 500 megawatts of power or more.

We Energies’ response to stakeholder testimony was due on Jan. 28, and the utility and regulators will also consider public comments that are being submitted. After that, the regulatory commission may hold hearings, and advocates can file additional briefs. Eventually, the utility will reach an agreement with commissioners on how to charge data centers.

Risky business

Looming large over this debate is the mounting concern that the artificial intelligence boom is a bubble. If that bubble pops, it could mean far less power demand from data centers than utilities currently expect.

In November, We Energies announced plans to build almost 3 gigawatts of natural gas plants, renewables and battery storage. Conway said much of this new construction will be paid for by data centers as their bespoke resources.

But some worry that utility customers could be left paying too much for these investments if data centers don’t materialize or don’t use as much energy as predicted. Wisconsin consumers are already on the hook for almost $1 billion for ​“stranded assets,” mostly expensive coal plants that closed earlier than originally planned, as Wisconsin Watch recently tabulated.

“The reason we bring up the worst-case scenario is it’s not just theoretical,” said Tom Content, executive director of the Citizens Utility Board of Wisconsin, the state’s primary consumer advocacy organization. ​“There’s been so many headlines about the AI bubble. Will business plans change? Will new AI chips require data centers to use a lot less energy?”

We Energies’ proposal has data centers paying promised costs even if they go out of business or otherwise prematurely curtail their demand. But developers do not have to put up collateral for this purpose if they have a positive credit rating. That means if such data center companies went bankrupt or otherwise couldn’t meet their financial obligations, utility customers may end up paying the bill.

In testimony to regulators, Steven Kihm, the Citizens Utility Board’s regulatory strategist and chief economist, gave examples of companies that had stellar credit until they didn’t. The company that made BlackBerry handheld devices saw its stock skyrocket in the mid-2000s, only to lose most of its value with the rise of smartphones, he noted. Energy company Enron, meanwhile, had a top credit rating until a month before its 2001 collapse, Kihm warned. He advised regulators that data center developers should have to put up adequate collateral regardless of their credit rating.

The Wisconsin Industrial Energy Group echoed concerns about risk if data centers struggle financially.

“The unprecedented growth in capital spending will subject (We Energies) to elevated financial and credit risks,” Pollock told regulators. ​“Customers will ultimately provide the financial backstop if (the utility) is unable to fully enforce the terms” of its tariff.

Jeremy Fisher, Sierra Club’s principal adviser on climate and energy, equated the risk to co-signing ​“a loan on a mansion next door, with just the vague assurance that the neighbors will almost certainly be able to cover their loan.”

A version of this article was first published by Canary Media.

Wisconsin debates how to pay for the power-hungry AI boom is a post from Wisconsin Watch, a non-profit investigative news site covering Wisconsin since 2009. Please consider making a contribution to support our journalism.

Driving American battery innovation forward

Advancements in battery innovation are transforming both mobility and energy systems alike, according to Kurt Kelty, vice president of battery, propulsion, and sustainability at General Motors (GM). At the MIT Energy Initiative (MITEI) Fall Colloquium, Kelty explored how GM is bringing next-generation battery technologies from lab to commercialization, driving American battery innovation forward. The colloquium is part of the ongoing MITEI Presents: Advancing the Energy Transition speaker series.

At GM, Kelty’s team is primarily focused on three things: first, improving affordability to get more electric vehicles (EVs) on the road. “How do you drive down the cost?” Kelty asked the audience. “It's the batteries. The batteries make up about 30 percent of the cost of the vehicle.” Second, his team strives to improve battery performance, including charging speed and energy density. Third, they are working on localizing the supply chain. “We've got to build up our resilience and our independence here in North America, so we're not relying on materials coming from China,” Kelty explained.

To aid these efforts, GM is pouring resources into virtualization, significantly cutting the time dedicated to research and development. Kelty’s team can now do modeling up front using artificial intelligence, reducing what previously would have taken months to a couple of days.

“If you want to modify … the nickel content ever so slightly, we can very quickly model: ‘OK, how’s that going to affect the energy density? The safety? How’s that going to affect the charge capability?’” said Kelty. “We can look at that at the cell level, then the pack level, then the vehicle level.”

Kelty revealed that they have found a solution that addresses affordability, accessibility, and commercialization: lithium manganese-rich (LMR) batteries. Previously, the industry looked to reduce costs by lowering the amount of cobalt in batteries and adding greater amounts of nickel. These high-nickel batteries power most EVs on the road in the United States due to their high range. LMR batteries, though, take things a step further by reducing the amount of nickel and adding more manganese, which drives the cost of batteries down even further while maintaining range.

Lithium-iron-phosphate (LFP) batteries are the chemistry of choice in China, known for low cost, high cycle life, and high safety. With LMR batteries, the cost is comparable to LFP with a range that is closer to high-nickel. “That’s what’s really a breakthrough,” said Kelty.

LMR batteries are not new, but there have been challenges to adopting them, according to Kelty. “People knew about it, but they didn’t know how to commercialize it. They didn’t know how to make it work in an EV,” he explained. Now that GM has figured out commercialization, they will be the first to market these batteries in their EVs in 2028.

Kelty also expressed excitement over the use of vehicle-to-grid technologies in the future. Using a bidirectional charger with a two-way flow of energy, EVs could charge, but also send power from their batteries back to the electrical grid. This would allow customers to charge “their vehicles at night when the electricity prices are really low, and they can discharge it during the day when electricity rates are really high,” he said.
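
A back-of-envelope sketch of that arbitrage, with invented prices and specifications rather than anything GM has published:

```python
# Back-of-envelope vehicle-to-grid arbitrage with invented prices and specs.
battery_export_kwh = 20        # energy the owner chooses to sell back, kWh
night_price = 0.10             # assumed off-peak price, $/kWh
day_price = 0.30               # assumed on-peak price, $/kWh
round_trip_efficiency = 0.90   # assumed charge/discharge losses

cost_to_charge = (battery_export_kwh / round_trip_efficiency) * night_price
revenue = battery_export_kwh * day_price
print(f"Net gain per cycle: ${revenue - cost_to_charge:.2f}")  # about $3.78
```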

In addition to working in the transportation sector, GM is exploring ways to extend their battery expertise into applications in grid-scale energy storage. “It’s a big market right now, but it’s growing very quickly because of the data center growth,” said Kelty.

When looking to the future of battery manufacturing and EVs in the United States, Kelty remains optimistic: “We’ve got the technology here to make it happen. We’ve always had the innovation here. Now, we’re getting more and more of the manufacturing. We’re getting that all together. We’ve got just tremendous opportunity here that I’m hopeful we’re going to be able to take advantage of and really build a massive battery industry here.”

This speaker series highlights energy experts and leaders at the forefront of the scientific, technological, and policy solutions needed to transform our energy systems. Visit MITEI’s Events page for more information on this and additional events.

© Photo: Gretchen Ertl

Kurt Kelty (right), vice president of battery, propulsion, and sustainability at General Motors, joined MITEI's William Green at the 2025 MIT Energy Initiative Fall Colloquium. Kelty explained how GM is developing and commercializing next-generation battery technologies.

How artificial intelligence can help achieve a clean energy future

There is growing attention on the links between artificial intelligence and increased energy demands. But while the power-hungry data centers being built to support AI could potentially stress electricity grids, increase customer prices and service interruptions, and generally slow the transition to clean energy, the use of artificial intelligence can also help the energy transition.

For example, use of AI is reducing energy consumption and associated emissions in buildings, transportation, and industrial processes. In addition, AI is helping to optimize the design and siting of new wind and solar installations and energy storage facilities.

On electric power grids, using AI algorithms to control operations is helping to increase efficiency and reduce costs, integrate the growing share of renewables, and even predict when key equipment needs servicing to prevent failure and possible blackouts. AI can help grid planners schedule investments in generation, energy storage, and other infrastructure that will be needed in the future. AI is also helping researchers discover or design novel materials for nuclear reactors, batteries, and electrolyzers.

Researchers at MIT and elsewhere are actively investigating aspects of those and other opportunities for AI to support the clean energy transition. At its 2025 research conference, MITEI announced the Data Center Power Forum, a targeted research effort for MITEI member companies interested in addressing the challenges of data center power demand.

Controlling real-time operations

Customers generally rely on receiving a continuous supply of electricity, and grid operators get help from AI to make that happen — while optimizing the storage and distribution of energy from renewable sources at the same time.

But with more installation of solar and wind farms — both of which provide power in smaller amounts, and intermittently — and the growing threat of weather events and cyberattacks, ensuring reliability is getting more complicated. “That’s exactly where AI can come into the picture,” explains Anuradha Annaswamy, a senior research scientist in MIT’s Department of Mechanical Engineering and director of MIT’s Active-Adaptive Control Laboratory. “Essentially, you need to introduce a whole information infrastructure to supplement and complement the physical infrastructure.”

The electricity grid is a complex system that requires meticulous control on time scales ranging from decades all the way down to microseconds. The challenge can be traced to the basic laws of power physics: electricity supply must equal electricity demand at every instant, or generation can be interrupted. In past decades, grid operators generally assumed that generation was fixed — they could count on how much electricity each large power plant would produce — while demand varied over time in a fairly predictable way. As a result, operators could commission specific power plants to run as needed to meet demand the next day. If some outages occurred, specially designated units would start up as needed to make up the shortfall.

Today and in the future, that matching of supply and demand must still happen, even as the number of small, intermittent sources of generation grows and weather disturbances and other threats to the grid increase. AI algorithms provide a means of achieving the complex management of information needed to forecast within just a few hours which plants should run while also ensuring that the frequency, voltage, and other characteristics of the incoming power are as required for the grid to operate properly.
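
As a toy illustration of the scheduling problem those algorithms address, the sketch below dispatches an invented fleet of plants against a demand forecast in merit order, cheapest first; real unit commitment involves far more constraints.

```python
# Toy merit-order dispatch: commit the cheapest plants first until the
# forecast demand is covered. Real unit commitment also handles ramp rates,
# minimum run times, reserve margins, and network constraints.

plants = [  # (name, capacity_mw, marginal_cost_$per_mwh), all invented
    ("wind", 400, 0),
    ("solar", 300, 0),
    ("nuclear", 1000, 10),
    ("gas_combined_cycle", 800, 35),
    ("gas_peaker", 300, 90),
]

def dispatch(forecast_demand_mw):
    """Return (plant, MW) assignments, cheapest plants first."""
    schedule, remaining = [], forecast_demand_mw
    for name, capacity, _cost in sorted(plants, key=lambda p: p[2]):
        if remaining <= 0:
            break
        output = min(capacity, remaining)
        schedule.append((name, output))
        remaining -= output
    return schedule

print(dispatch(2100))  # e.g., a forecast evening peak
```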

Moreover, AI can make possible new ways of increasing supply or decreasing demand at times when supplies on the grid run short. As Annaswamy points out, the battery in your electric vehicle (EV), as well as the one charged up by solar panels or wind turbines, can — when needed — serve as a source of extra power to be fed into the grid. And given real-time price signals, EV owners can choose to shift charging from a time when demand is peaking and prices are high to a time when demand and therefore prices are both lower. In addition, new smart thermostats can be set to allow the indoor temperature to drop or rise within a range defined by the customer when demand on the grid is peaking. And data centers themselves can be a source of demand flexibility: selected AI calculations could be delayed as needed to smooth out peaks in demand. Thus, AI can provide many opportunities to fine-tune both supply and demand as needed.
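
To illustrate the data center example in code, here is a minimal sketch that schedules deferrable batch jobs into off-peak hours; the hourly prices and the cutoff are invented.

```python
# Sketch of demand-side flexibility for a data center: schedule deferrable
# batch jobs into the cheapest hours and skip peak-priced ones. The hourly
# prices and the cutoff are invented.

hourly_price = [30, 28, 25, 24, 26, 35, 55, 80, 95, 70, 50, 40]  # $/MWh
PRICE_CUTOFF = 60  # defer flexible load whenever the price exceeds this

def schedule_batch_jobs(hours_needed):
    """Pick the cheapest non-peak hours for deferrable compute jobs."""
    off_peak = [h for h, p in enumerate(hourly_price) if p < PRICE_CUTOFF]
    return sorted(off_peak, key=lambda h: hourly_price[h])[:hours_needed]

print(schedule_batch_jobs(4))  # -> [3, 2, 4, 1]: the four cheapest hours
```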

In addition, AI makes possible “predictive maintenance.” Any downtime is costly for the company and threatens shortages for the customers served. AI algorithms can collect key performance data during normal operation and, when readings veer off from that normal, the system can alert operators that something might be going wrong, giving them a chance to intervene. That capability prevents equipment failures, reduces the need for routine inspections, increases worker productivity, and extends the lifetime of key equipment.
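
A minimal sketch of the underlying idea, assuming a simple rolling-baseline test rather than any production-grade method:

```python
# Minimal predictive-maintenance sketch: flag a reading that drifts more
# than three standard deviations from its recent baseline. Production
# systems use far richer models, but the principle is the same.

from statistics import mean, stdev

def check_latest(readings, window=20, threshold=3.0):
    """Compare the newest reading against the rolling baseline before it."""
    baseline, latest = readings[-window - 1:-1], readings[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma and abs(latest - mu) / sigma > threshold:
        print(f"ALERT: reading {latest} deviates from baseline {mu:.1f}")

# Invented transformer temperatures: stable operation, then a sudden spike.
temps = [61, 60, 62, 61, 60, 61, 62, 60, 61, 62,
         61, 60, 62, 61, 60, 61, 62, 60, 61, 62, 78]
check_latest(temps)  # -> ALERT: reading 78 deviates from baseline 61.0
```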

Annaswamy stresses that “figuring out how to architect this new power grid with these AI components will require many different experts to come together.” She notes that electrical engineers, computer scientists, and energy economists “will have to rub shoulders with enlightened regulators and policymakers to make sure that this is not just an academic exercise, but will actually get implemented. All the different stakeholders have to learn from each other. And you need guarantees that nothing is going to fail. You can’t have blackouts.”

Using AI to help plan investments in infrastructure for the future

Grid companies constantly need to plan for expanding generation, transmission, storage, and more, and getting all the necessary infrastructure built and operating may take many years, in some cases more than a decade. So, they need to predict what infrastructure they’ll need to ensure reliability in the future. “It’s complicated because you have to forecast over a decade ahead of time what to build and where to build it,” says Deepjyoti Deka, a research scientist in MITEI.

One challenge with anticipating what will be needed is predicting how the future system will operate. “That’s becoming increasingly difficult,” says Deka, because more renewables are coming online and displacing traditional generators. In the past, operators could rely on “spinning reserves,” that is, generating capacity that’s not currently in use but could come online in a matter of minutes to meet any shortfall on the system. The presence of so many intermittent generators — wind and solar — means there’s now less stability and inertia built into the grid. Adding to the complication is that those intermittent generators can be built by various vendors, and grid planners may not have access to the physics-based equations that govern the operation of each piece of equipment at sufficiently fine time scales. “So, you probably don’t know exactly how it’s going to run,” says Deka.

And then there’s the weather. Determining the reliability of a proposed future energy system requires knowing what it’ll be up against in terms of weather. The future grid has to be reliable not only in everyday weather, but also during low-probability but high-risk events such as hurricanes, floods, and wildfires, all of which are becoming more and more frequent, notes Deka. AI can help by predicting such events and even tracking changes in weather patterns due to climate change.

Deka points out another, less-obvious benefit of the speed of AI analysis. Any infrastructure development plan must be reviewed and approved, often by several regulatory and other bodies. Traditionally, an applicant would develop a plan, analyze its impacts, and submit the plan to one set of reviewers. After making any requested changes and repeating the analysis, the applicant would resubmit a revised version to the reviewers to see if the new version was acceptable. AI tools can speed up the required analysis so the process moves along more quickly. Planners can even reduce the number of times a proposal is rejected by using large language models to search regulatory publications and summarize what’s important for a proposed infrastructure installation.

Harnessing AI to discover and exploit advanced materials needed for the energy transition

“Use of AI for materials development is booming right now,” says Ju Li, MIT’s Carl Richard Soderberg Professor of Power Engineering. He notes two main directions.

First, AI makes possible faster physics-based simulations at the atomic scale. The result is a better atomic-level understanding of how composition, processing, structure, and chemical reactivity relate to the performance of materials. That understanding provides design rules to help guide the development and discovery of novel materials for energy generation, storage, and conversion needed for a sustainable future energy system.

And second, AI can help guide experiments in real time as they take place in the lab. Li explains: “AI assists us in choosing the best experiment to do based on our previous experiments and — based on literature searches — makes hypotheses and suggests new experiments.”

He describes what happens in his own lab. Human scientists interact with a large language model, which then makes suggestions about what specific experiments to do next. The human researcher accepts or modifies the suggestion, and a robotic arm responds by setting up and performing the next step in the experimental sequence, synthesizing the material, testing the performance, and taking images of samples when appropriate. Based on a mix of literature knowledge, human intuition, and previous experimental results, AI thus coordinates active learning that balances the goals of reducing uncertainty with improving performance. And, as Li points out, “AI has read many more books and papers than any human can, and is thus naturally more interdisciplinary.”
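
One common way to formalize that balance between reducing uncertainty and improving performance is an upper-confidence-bound score, sketched below with invented candidates and estimates; Li’s actual workflow is not described in this detail.

```python
# Sketch of the exploration/exploitation trade-off in AI-guided experiments:
# score each candidate by predicted performance plus an uncertainty bonus
# (an upper confidence bound) and run the top scorer next. All values are
# invented; a real system would use a trained surrogate model.

candidates = {  # name: (predicted_performance, model_uncertainty)
    "synthesis_A": (0.72, 0.02),
    "synthesis_B": (0.65, 0.15),  # the model has little data here
    "synthesis_C": (0.70, 0.05),
}

def ucb(prediction, uncertainty, kappa=2.0):
    """Reward both high predicted performance and high uncertainty."""
    return prediction + kappa * uncertainty

best = max(candidates, key=lambda name: ucb(*candidates[name]))
print("Next experiment:", best)  # -> synthesis_B (0.65 + 2 * 0.15 = 0.95)
```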

The outcome, says Li, is both better design of experiments and speeding up the “work flow.” Traditionally, the process of developing new materials has required synthesizing the precursors, making the material, testing its performance and characterizing the structure, making adjustments, and repeating the same series of steps. AI guidance speeds up that process, “helping us to design critical, cheap experiments that can give us the maximum amount of information feedback,” says Li.

“Having this capability certainly will accelerate material discovery, and this may be the thing that can really help us in the clean energy transition,” he concludes. “AI [has the potential to] lubricate the material-discovery and optimization process, perhaps shortening it from decades, as in the past, to just a few years.” 

MITEI’s contributions

At MIT, researchers are working on various aspects of the opportunities described above. In projects supported by MITEI, teams are using AI to better model and predict disruptions in plasma flows inside fusion reactors — a necessity in achieving practical fusion power generation. Other MITEI-supported teams are using AI-powered tools to interpret regulations, climate data, and infrastructure maps in order to achieve faster, more adaptive electric grid planning. AI-guided development of advanced materials continues, with one MITEI project using AI to optimize solar cells and thermoelectric materials.

Other MITEI researchers are developing robots that can learn maintenance tasks based on human feedback, including physical intervention and verbal instructions. The goal is to reduce costs, improve safety, and accelerate the deployment of the renewable energy infrastructure. And MITEI-funded work continues on ways to reduce the energy demand of data centers, from designing more efficient computer chips and computing algorithms to rethinking the architectural design of the buildings, for example, to increase airflow so as to reduce the need for air conditioning.

In addition to providing leadership and funding for many research projects, MITEI acts as a convenor, bringing together interested parties to consider common problems and potential solutions. In May 2025, MITEI’s annual spring symposium — titled “AI and energy: Peril and promise” — brought together AI and energy experts from across academia, industry, government, and nonprofit organizations to explore AI as both a problem and a potential solution for the clean energy transition. At the close of the symposium, William H. Green, director of MITEI and Hoyt C. Hottel Professor in the MIT Department of Chemical Engineering, noted, “The challenge of meeting data center energy demand and of unlocking the potential benefits of AI to the energy transition is now a research priority for MITEI.”

© Image: Igor Borisenko/iStock

Researchers at MIT and elsewhere are investigating how AI can be harnessed to support the clean energy transition.

ACT EXPO Registration Opens, Event Focus on AI and Autonomy

Registration is now open for the 2026 ACT Expo, which returns to Las Vegas, Nevada, in the spring.

The 16th ACT Expo, held May 4-7 at the Las Vegas Convention Center, will feature sessions on AI and autonomy as well as zero-emission vehicles. Originally called the Advanced Clean Transportation Expo, or ACT Expo for short, the event will now be known solely as ACT Expo, which event producer TRC Companies said reflects the “expanded scope across advanced, autonomous, connected, and clean transportation technologies.”

TRC noted that ACT Expo can no longer “be simply defined as the clean or advanced technology show — it has become so much more.”

ACT now stands for the following:

  • Advanced, Autonomous, Alternative, AI, Analytics, Adaptable, Assets
  • Clean, Commercial, Connected, Cost-Effective, Compliant, Charged, Carbon-free
  • Transportation, Technology, Transition, Trailers, Telematics, TCO, Tires

The event, which annually attracts over 12,000 attendees and 500 exhibitors, “offers end-users the most current insight into the key technology trends driving the market today and in the years ahead, practical lessons from peers, direct access to every major OEM and industry supplier in the market, strategies to boost competitiveness and accelerate the use of high-tech and clean vehicles and fuel, and the relationships that drive long-term success,” a press release on the event states.

The ACT Expo traditionally has hosted one school-bus-specific session each year and features school buses on the trade floor from various manufacturers. This year, however, TRC Companies said ACT Expo will place a greater emphasis on the digital frontier, reflecting industry investment in software-defined vehicles and in real-time data collection and analysis powered by AI and autonomy.


Related: (STN Podcast E257) The Paths Forward: AI, Clean Energy, Manufacturing Discussed at ACT Expo
Related: ACT Expo Heads Back to Anaheim, Agenda Released
Related: School Bus Wi-Fi Solution Now Available for Districts Left in E-Rate Cold
Related: WATCH: Michigan Association Releases Illegal Passing PSA for School Bus Safety Week


“Through end-user case studies, the event will highlight how these cutting-edge technologies are improving performance, safety, and ROI, while giving attendees a clear view of where and how they are scaling,” the release states.

In addition to these technologies, the conference will continue to highlight ultra-clean vehicles and low-carbon fuels, with a spotlight on supporting infrastructure.

“The pace of change and acceleration of advanced technologies in commercial transportation is phenomenal; it’s unlike anything we have seen before,” stated Erik Neandross, president of Clean Transportation Solutions at TRC. “From the boardroom to the show floor, ACT Expo is the one place where C-suite representatives from fleets, OEMs, and infrastructure partners engage directly to shape real-world progress and the future of their businesses. It’s where fleet leaders learn what’s actually working in the field, what’s just around the corner, and where they can better understand proven strategies that can deliver both economic and environmental results.”

School Transportation News is a media sponsor of the event.


Why AI in School Transportation Must Start with Empathy, Not Efficiency

As the school transportation industry wrestles with complex challenges—driver shortages, safety concerns and operational inefficiencies—artificial intelligence (AI) is often positioned as a silver bullet. Fleet management systems tout data optimization. Dash cams promise incident reduction. Digital platforms claim to centralize and simplify operations.

But in the rush to innovate, we risk forgetting what matters most: People. Specifically, the drivers, dispatchers and front-line staff who make student transportation possible every day. If AI is to truly move this industry forward, it must be rooted in empathy—not just algorithms.

Coaching, Not Surveillance
Take the growing adoption of AI-powered dash cameras. When framed solely as surveillance tools, these systems can alienate drivers. No one wants to feel like they’re being watched without context or support. However, when implemented with a focus on coaching rather than punishment, these same tools can become allies. Cameras that detect risky behaviors—such as distracted driving, hard-braking or rolling stops—can deliver real-time feedback and personalized training opportunities. This helps drivers improve their performance without feeling policed.

It’s a shift in mindset from compliance to confidence-building. Drivers begin to feel supported, not scrutinized. And fleets often see measurable improvements in safety outcomes and morale as a result.

Retention Through Respect
The transportation industry has a retention problem. Nationally, school bus operators report chronic shortages, with turnover rates frequently exceeding 50 percent. Recruitment incentives and signing bonuses help, but they rarely address the deeper issue: How drivers feel on the job.

This is where AI can play a powerful role, if used thoughtfully. Integrated platforms that offer real-time route data, reliable communication and automated scheduling aren’t just operational tools. They’re stress reducers. When school bus drivers know their route will be accurate, that help is one tap away, and that their feedback is acknowledged and acted upon, it builds trust. And trust builds tenure. In some operations, these changes have reduced driver turnover by double digits. Not because of gimmicks or grand gestures, but because the technology made drivers feel valued and protected.

The Quiet Power of Automation
AI’s most human impact may come behind the scenes. The administrative burdens on drivers and staff, from payroll questions to incident reporting, can erode time, focus and job satisfaction. Enter virtual assistants, workflow automations and smart self-service tools. When designed well, they give employees 24/7 access to the information they need, cut response times and free up staff to focus on meaningful, person-to-person support.

This isn’t just about operational efficiency; it’s about respect. Respect for employees’ time. Respect for their need to focus on their core responsibilities. Respect for their mental bandwidth. It’s tempting to think of automation as impersonal. But when deployed with the employee experience in mind, it can be one of the most empathetic forms of technology.

Start With the End User
Too often, transportation tech is built from the top down and optimized for operations managers, IT leaders, or compliance teams. But the most successful implementations flip that script. They ask: What do drivers actually need? What do dispatchers struggle with? Where do mechanics waste the most time? Empathy, in this sense, becomes a design principle. And when it is, adoption skyrockets. Engagement rises. Feedback loops get shorter. And frontline staff begin to see technology not as a burden—but as a partner.

The Bigger Opportunity
We’re at a crossroads. AI and automation are poised to reshape school transportation over the next decade. But the question isn’t whether we’ll adopt these tools. It’s how we’ll use them. Will we chase efficiency at the cost of human connection? Or will we use technology to elevate the people who make the system work? The path forward requires us to recognize a simple truth: Buses don’t move students—people do. And when we center those people in our digital transformation efforts, everyone wins: the organization, the employees and most importantly, the children we’re entrusted to transport safely every day.

Editor’s Note: As reprinted from the September 2025 issue of School Transportation News.


Gaurav Sharda is the chief technology officer for Beacon Mobility companies and in July won the School Transportation News Innovator of the Year Award for his direction of new human-focused AI solutions.



Related: Strides in Vehicle to Grid Technology Continue
Related: Feeling Super About Transportation Technology?
Related: New Technology Provides Data to School Bus Routing
Related: Bring the A-Game to Fleet Management


Celebrating an academic-industry collaboration to advance vehicle technology

On May 6, MIT AgeLab’s Advanced Vehicle Technology (AVT) Consortium, part of the MIT Center for Transportation and Logistics, celebrated 10 years of its global academic-industry collaboration. AVT was founded with the aim of developing new data that contribute to automotive manufacturers, suppliers, and insurers’ real-world understanding of how drivers use and respond to increasingly sophisticated vehicle technologies, such as assistive and automated driving, while accelerating the applied insight needed to advance design and development. The celebration event brought together stakeholders from across the industry for a set of keynote addresses and panel discussions on critical topics significant to the industry and its future, including artificial intelligence, automotive technology, collision repair, consumer behavior, sustainability, vehicle safety policy, and global competitiveness.

Bryan Reimer, founder and co-director of the AVT Consortium, opened the event by remarking that over the decade AVT has collected hundreds of terabytes of data, presented and discussed research with its over 25 member organizations, supported members’ strategic and policy initiatives, published select outcomes, and built AVT into a global influencer with tremendous impact in the automotive industry. He noted that current opportunities and challenges for the industry include distracted driving, a lack of consumer trust and concerns around transparency in assistive and automated driving features, and high consumer expectations for vehicle technology, safety, and affordability. How will industry respond? Major players in attendance weighed in.

In a powerful exchange on vehicle safety regulation, John Bozzella, president and CEO of the Alliance for Automotive Innovation, and Mark Rosekind, former chief safety innovation officer of Zoox, former administrator of the National Highway Traffic Safety Administration, and former member of the National Transportation Safety Board, challenged industry and government to adopt a more strategic, data-driven, and collaborative approach to safety. They asserted that regulation must evolve alongside innovation, not lag behind it by decades. Appealing to the automakers in attendance, Bozzella cited the success of voluntary commitments on automatic emergency braking as a model for future progress. “That’s a way to do something important and impactful ahead of regulation.” They advocated for shared data platforms, anonymous reporting, and a common regulatory vision that sets safety baselines while allowing room for experimentation. The 40,000 annual road fatalities demand urgency — what’s needed is a move away from tactical fixes and toward a systemic safety strategy. “Safety delayed is safety denied,” Rosekind stated. “Tell me how you’re going to improve safety. Let’s be explicit.”

Drawing inspiration from aviation’s exemplary safety record, Kathy Abbott, chief scientific and technical advisor for the Federal Aviation Administration, pointed to a culture of rigorous regulation, continuous improvement, and cross-sectoral data sharing. Aviation’s model, built on highly trained personnel and strict predictability standards, contrasts sharply with the fragmented approach in the automotive industry. The keynote emphasized that a foundation of safety culture — one that recognizes that technological ability alone isn’t justification for deployment — must guide the auto industry forward. Just as aviation doesn’t equate absence of failure with success, vehicle safety must be measured holistically and proactively.

With assistive and automated driving top of mind in the industry, Pete Bigelow of Automotive News offered a pragmatic diagnosis. With companies like Ford and Volkswagen stepping back from full autonomy projects like Argo AI, the industry is now focused on Level 2 and 3 technologies, which refer to assisted and automated driving, respectively. Tesla, GM, and Mercedes are experimenting with subscription models for driver assistance systems, yet consumer confusion remains high. JD Power reports that many drivers do not grasp the differences between L2 and L2+, or whether these technologies offer safety or convenience features. Safety benefits have yet to manifest in reduced traffic deaths, which have risen by 20 percent since 2020. The recurring challenge: L3 systems demand that human drivers take over during technical difficulties, despite driver disengagement being their primary benefit, potentially worsening outcomes. Bigelow cited a quote from Bryan Reimer as one of the best he’s received in his career: “Level 3 systems are an engineer’s dream and a plaintiff attorney’s next yacht,” highlighting the legal and design complexity of systems that demand handoffs between machine and human.

In terms of the impact of AI on the automotive industry, Mauricio Muñoz, senior research engineer at AI Sweden, underscored that despite AI’s transformative potential, the automotive industry cannot rely on general AI megatrends to solve domain-specific challenges. While landmark achievements like AlphaFold demonstrate AI’s prowess, automotive applications require domain expertise, data sovereignty, and targeted collaboration. Energy constraints, data firewalls, and the high costs of AI infrastructure all pose limitations, making it critical that companies fund purpose-driven research that can reduce costs and improve implementation fidelity. Muñoz warned that while excitement abounds — with some predicting artificial superintelligence by 2028 — real progress demands organizational alignment and a deep understanding of the automotive context, not just computational power.

Turning the focus to consumers, a collision repair panel featuring Richard Billyeald from Thatcham Research, Hami Ebrahimi from Caliber Collision, and Mike Nelson from Nelson Law explored the unintended consequences of vehicle technology advances: spiraling repair costs, labor shortages, and a lack of repairability standards. Panelists warned that even minor repairs for advanced vehicles now require costly and complex sensor recalibrations — compounded by inconsistent manufacturer guidance and no clear consumer alerts when systems are out of calibration. The panel called for greater standardization, consumer education, and repair-friendly design. As insurance premiums climb and more people forgo insurance claims, the lack of coordination between automakers, regulators, and service providers threatens consumer safety and undermines trust. The group warned that until Level 2 systems function reliably and affordably, moving toward Level 3 autonomy is premature and risky.

While the repair panel emphasized today’s urgent challenges, other speakers looked to the future. Honda’s Ryan Harty, for example, highlighted the company’s aggressive push toward sustainability and safety. Honda aims for zero environmental impact and zero traffic fatalities, with plans to be 100 percent electric by 2040 and to lead in energy storage and clean power integration. The company has developed tools to coach young drivers and is investing in charging infrastructure, grid-aware battery usage, and green hydrogen storage. “What consumers buy in the market dictates what the manufacturers make,” Harty noted, underscoring the importance of aligning product strategy with user demand and environmental responsibility. He stressed that manufacturers can only decarbonize as fast as the industry allows, and emphasized the need to shift from cost-based to life-cycle-based product strategies.

Finally, a panel involving Laura Chace of ITS America, Jon Demerly of Qualcomm, Brad Stertz of Audi/VW Group, and Anant Thaker of Aptiv covered the near-, mid-, and long-term future of vehicle technology. Panelists emphasized that consumer expectations, infrastructure investment, and regulatory modernization must evolve together. Despite record bicycle fatality rates and persistent distracted driving, features like school bus detection and stop sign alerts remain underutilized due to skepticism and cost. Panelists stressed that we must design systems for proactive safety rather than reactive response. The slow integration of digital infrastructure — sensors, edge computing, data analytics — stems not only from technical hurdles, but procurement and policy challenges as well. 

Reimer concluded the event by urging industry leaders to re-center the consumer in all conversations — from affordability to maintenance and repair. With the rising costs of ownership, growing gaps in trust in technology, and misalignment between innovation and consumer value, the future of mobility depends on rebuilding trust and reshaping industry economics. He called for global collaboration, greater standardization, and transparent innovation that consumers can understand and afford. He highlighted that global competitiveness and public safety both hang in the balance. As Reimer noted, “success will come through partnerships” — between industry, academia, and government — that work toward shared investment, cultural change, and a collective willingness to prioritize the public good.

© Photo: Kelly Davidson Studio

Bryan Reimer, founder and co-director of the AVT Consortium, gives the opening remarks.

The Ultimate Guide to AI in Cleantech

The cleantech world is experiencing a quiet revolution. Artificial intelligence is no longer knocking at the door; it’s quietly remodeling the entire house....


Want to design the car of the future? Here are 8,000 designs to get you started.

Car design is an iterative and proprietary process. Carmakers can spend several years on the design phase for a car, tweaking 3D forms in simulations before building out the most promising designs for physical testing. The details and specs of these tests, including the aerodynamics of a given car design, are typically not made public. Significant advances in performance, such as in fuel efficiency or electric vehicle range, can therefore be slow and siloed from company to company.

MIT engineers say that the search for better car designs can speed up exponentially with the use of generative artificial intelligence tools that can plow through huge amounts of data in seconds and find connections to generate a novel design. While such AI tools exist, the data they would need to learn from have not been available, at least in any sort of accessible, centralized form.

But now, the engineers have made just such a dataset available to the public for the first time. Dubbed DrivAerNet++, the dataset encompasses more than 8,000 car designs, which the engineers generated based on the most common types of cars in the world today. Each design is represented in 3D form and includes information on the car’s aerodynamics — the way air would flow around a given design, based on simulations of fluid dynamics that the group carried out for each design.

[Animation: side-by-side comparison of a rainbow-colored car and a car rendered with blue and green flow lines]

Each of the dataset’s 8,000 designs is available in several representations, such as mesh, point cloud, or a simple list of the design’s parameters and dimensions. As such, the dataset can be used by different AI models that are tuned to process data in a particular modality.
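
To make the multimodal point concrete, here is a minimal sketch of loading a single design in each representation. The file names, identifiers, and JSON keys below are illustrative assumptions, not the actual layout of the DrivAerNet++ release:

```python
# Illustrative only: file names and keys are assumptions, not the
# real structure of the published DrivAerNet++ dataset.
import json

import numpy as np

design_id = "fastback_0001"  # hypothetical design identifier

# Point-cloud modality: an (N, 3) array of points sampled on the car surface.
points = np.load(f"{design_id}_pointcloud.npy")

# Parametric modality: the design's geometric parameters as a flat vector.
with open(f"{design_id}_params.json") as f:
    params = json.load(f)
param_vector = np.array([params[k] for k in sorted(params)])

# Aerodynamic label from the CFD simulation (e.g., a drag coefficient).
with open(f"{design_id}_aero.json") as f:
    drag_coefficient = json.load(f)["cd"]  # the "cd" key is an assumption

# The mesh modality would typically be read with a mesh library instead.
print(points.shape, param_vector.shape, drag_coefficient)
```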

DrivAerNet++ is the largest open-source dataset for car aerodynamics that has been developed to date. The engineers envision it being used as an extensive library of realistic car designs, with detailed aerodynamics data that can be used to quickly train any AI model. These models can then just as quickly generate novel designs that could potentially lead to more fuel-efficient cars and electric vehicles with longer range, in a fraction of the time that it takes the automotive industry today.

“This dataset lays the foundation for the next generation of AI applications in engineering, promoting efficient design processes, cutting R&D costs, and driving advancements toward a more sustainable automotive future,” says Mohamed Elrefaie, a mechanical engineering graduate student at MIT.

Elrefaie and his colleagues will present a paper detailing the new dataset, and AI methods that could be applied to it, at the NeurIPS conference in December. His co-authors are Faez Ahmed, assistant professor of mechanical engineering at MIT, along with Angela Dai, associate professor of computer science at the Technical University of Munich, and Florin Marar of BETA CAE Systems.

Filling the data gap

Ahmed leads the Design Computation and Digital Engineering Lab (DeCoDE) at MIT, where his group explores ways in which AI and machine-learning tools can be used to enhance the design of complex engineering systems and products, including car technology.

“Often when designing a car, the forward process is so expensive that manufacturers can only tweak a car a little bit from one version to the next,” Ahmed says. “But if you have larger datasets where you know the performance of each design, now you can train machine-learning models to iterate fast so you are more likely to get a better design.”

And speed, especially for advancing car technology, is particularly pressing now.

“This is the best time for accelerating car innovations, as automobiles are one of the largest polluters in the world, and the faster we can shave off that contribution, the more we can help the climate,” Elrefaie says.

In looking at the process of new car design, the researchers found that, while AI models exist that could crank through many car designs to generate optimal ones, the car data actually available is limited. Some researchers had previously assembled small datasets of simulated car designs, and car manufacturers rarely release the specs of the actual designs they explore, test, and ultimately manufacture.

The team sought to fill the data gap, particularly with respect to a car’s aerodynamics, which plays a key role in setting both the range of an electric vehicle and the fuel efficiency of an internal combustion engine. The challenge, they realized, was in assembling a dataset of thousands of car designs, each of which is physically accurate in its function and form, without the benefit of physically testing and measuring their performance.

To build a dataset of car designs with physically accurate representations of their aerodynamics, the researchers started with several baseline 3D models that were provided by Audi and BMW in 2014. These models represent three major categories of passenger cars: fastback (sedans with a sloped back end), notchback (sedans or coupes with a slight dip in their rear profile) and estateback (such as station wagons with more blunt, flat backs). The baseline models are thought to bridge the gap between simple designs and more complicated proprietary designs, and have been used by other groups as a starting point for exploring new car designs.

Library of cars

In their new study, the team applied a morphing operation to each of the baseline car models. This operation systematically made a slight change to each of 26 parameters in a given car design, such as its length, underbody features, windshield slope, and wheel tread; each variation was labeled as a distinct car design and added to the growing dataset. Meanwhile, the team ran an optimization algorithm to ensure that each new design was indeed distinct, and not a copy of an already-generated design. They then translated each 3D design into different modalities, such that a given design can be represented as a mesh, a point cloud, or a list of dimensions and specs.
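
As a rough illustration of that morph-and-deduplicate loop, the sketch below perturbs a normalized 26-parameter baseline and keeps only candidates that clear a minimum distance from every accepted design. The parameter ranges, perturbation scale, and distance threshold are invented for illustration; the paper describes a more sophisticated optimization step, not this simple filter:

```python
# A minimal sketch of the morph-and-deduplicate idea; all numbers
# here are illustrative assumptions, not the paper's actual values.
import numpy as np

rng = np.random.default_rng(0)
NUM_PARAMS = 26        # e.g., length, windshield slope, wheel tread...
MIN_DISTANCE = 0.05    # minimum spacing between designs in parameter space

def morph(baseline, scale=0.1):
    """Perturb every parameter of a baseline design by a small amount."""
    return baseline + rng.uniform(-scale, scale, size=baseline.shape)

def is_distinct(candidate, accepted, tol=MIN_DISTANCE):
    """Reject candidates that sit too close to an already-accepted design."""
    return all(np.linalg.norm(candidate - d) >= tol for d in accepted)

baseline = rng.uniform(0.0, 1.0, size=NUM_PARAMS)  # a normalized baseline car
dataset = []
while len(dataset) < 100:
    candidate = morph(baseline)
    if is_distinct(candidate, dataset):
        dataset.append(candidate)

print(f"generated {len(dataset)} distinct parameter vectors")
```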

The researchers also ran complex, computational fluid dynamics simulations to calculate how air would flow around each generated car design. In the end, this effort produced more than 8,000 distinct, physically accurate 3D car forms, encompassing the most common types of passenger cars on the road today.

To produce this comprehensive dataset, the researchers spent over 3 million CPU hours using the MIT SuperCloud, and generated 39 terabytes of data. (For comparison, it’s estimated that the entire printed collection of the Library of Congress would amount to about 10 terabytes of data.)

The engineers say that researchers can now use the dataset to train a particular AI model. For instance, an AI model could be trained on a part of the dataset to learn car configurations that have certain desirable aerodynamics. Within seconds, the model could then generate a new car design with optimized aerodynamics, based on what it has learned from the dataset’s thousands of physically accurate designs.

The researchers say the dataset could also be used for the inverse goal. For instance, after training an AI model on the dataset, designers could feed the model a specific car design and have it quickly estimate the design’s aerodynamics, which can then be used to compute the car’s potential fuel efficiency or electric range — all without carrying out expensive building and testing of a physical car.
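
A toy sketch of both directions follows, using synthetic stand-in data: a fast surrogate is fit to (parameters, drag) pairs, then used both to estimate the aerodynamics of a new design instantly and to screen many candidates for low predicted drag. The data, model choice, and drag formula are all assumptions for illustration, not the authors’ methods:

```python
# Toy sketch: a surrogate model replacing the expensive CFD step.
# All values below are synthetic; none reflect real DrivAerNet++ data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in data: 2,000 designs x 26 parameters, with a made-up drag target.
X = rng.uniform(0.0, 1.0, size=(2000, 26))
y = 0.25 + 0.1 * X[:, 0] - 0.05 * X[:, 3] + 0.01 * rng.standard_normal(2000)

# Forward direction: learn a fast surrogate for the CFD simulation.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000,
                         random_state=0).fit(X, y)

# Instant aerodynamic estimate for a new design (no physical testing).
new_design = rng.uniform(0.0, 1.0, size=(1, 26))
print("predicted drag coefficient:", surrogate.predict(new_design)[0])

# Inverse direction (crudely): search many candidates for low predicted drag.
candidates = rng.uniform(0.0, 1.0, size=(10000, 26))
best = candidates[np.argmin(surrogate.predict(candidates))]
print("best candidate's first parameters:", best[:3])
```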

“What this dataset allows you to do is train generative AI models to do things in seconds rather than hours,” Ahmed says. “These models can help lower fuel consumption for internal combustion vehicles and increase the range of electric cars — ultimately paving the way for more sustainable, environmentally friendly vehicles.”

“The dataset is very comprehensive and consists of a diverse set of modalities that are valuable to understand both styling and performance,” says Yanxia Zhang, a senior machine learning research scientist at Toyota Research Institute, who was not involved in the study.

This work was supported, in part, by the German Academic Exchange Service and the Department of Mechanical Engineering at MIT.

© Credit: Courtesy of Mohamed Elrefaie

In a new dataset that includes more than 8,000 car designs, MIT engineers simulated the aerodynamics for a given car shape, which they represent in various modalities, including “surface fields.”