Part 2 | Lumada Business Award – Special Interview

Putting a “Leash of Physical Laws” on AI

How Hitachi’s Physics-Based AI and Infrastructure Avert a Power Grid Crisis

How can we prevent AI from “going rogue” in social infrastructure where absolute reliability is mandatory?
In Part 2, we dive into the core technology—physics-based AI—and the AI infrastructure that supports it, Hitachi iQ. What lessons does this U.S. case offer for Japan, whose looming power crisis is said to trail the United States by about five years?

In Part 1, we explored how the rapid increase in AI data centers has intensified the U.S. power crisis, exacerbating severe interconnection backlog issues. To address this challenge, the Hitachi Group formed a “One Hitachi” team, collaborating with Southwest Power Pool (SPP)*1 to dramatically shorten analysis time and unclog critical infrastructure bottlenecks. This initiative earned the Grand Prize at Hitachi’s internal Lumada Business Award, recognized for both its technological excellence and social impact.

*1 A regional transmission organization in the central United States.

But how exactly did this team achieve such a breakthrough?
In Part 2, we examine the mechanism behind physics-based AI (also referred to as physically driven AI), the AI infrastructure Hitachi iQ that enables it, and the implications this U.S. case holds for Japan’s future energy challenges.

What Is Physics-Based AI—the Breakthrough Technology?

At the heart of the SPP project lies physics-based AI. While generative AI dominates today’s headlines, Yoshimitsu Kaji, Lumada Innovation Hub Senior Principal, notes that such models are often criticized for producing “plausible but incorrect answers,” known as hallucinations.

“In the world of social infrastructure like power grids, even a single mistake is unacceptable,” Kaji said. “How did you ensure AI reliability in a domain where errors are simply not allowed?”

Responding to this question, Bo Yang, who led the development of physics-based AI at Hitachi America, explained how it differs from conventional AI.

“Typical data-driven AI learns from historical data and performs inference based on statistics alone. Our physics-based AI, by contrast, directly embeds scientific laws—such as mathematics and physics—into the algorithm itself.”

In electrical engineering, for example, Kirchhoff’s laws define how currents and voltages behave. Physics-based AI incorporates such physical principles as hard constraints within the algorithm. Rather than relying solely on probabilistic interpretation, as large language models do, it combines fact-based physical calculations with statistical inference—creating a hybrid approach.
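The hybrid idea described above can be sketched in a few lines of code. The following is an illustrative toy example only—not Hitachi's actual implementation—using a hypothetical three-node network and an arbitrary penalty weight. It shows one common way to combine a data-fit term with a Kirchhoff's-current-law (KCL) term, so that predictions violating physics are penalized even when they fit the data:

```python
import numpy as np

# Hypothetical node-branch incidence matrix for a tiny 3-node, 3-branch loop
# (rows = nodes, columns = branches; +1 = current leaves, -1 = current enters).
A = np.array([
    [ 1,  0, -1],
    [-1,  1,  0],
    [ 0, -1,  1],
], dtype=float)

def hybrid_loss(pred_currents, observed_currents, weight=10.0):
    """Data-fit loss plus a Kirchhoff-current-law penalty.

    KCL requires the branch currents at every node to sum to zero,
    i.e. A @ i = 0. The physics term pushes predictions toward
    physically valid solutions even where data is sparse.
    """
    data_term = np.mean((pred_currents - observed_currents) ** 2)
    kcl_residual = A @ pred_currents          # per-node current imbalance
    physics_term = np.mean(kcl_residual ** 2)
    return data_term + weight * physics_term

# A consistent loop current (1 A around the loop) satisfies KCL exactly,
i_valid = np.array([1.0, 1.0, 1.0])
# while an inconsistent guess is penalized by the physics term.
i_invalid = np.array([1.0, 0.2, 1.5])

loss_valid = hybrid_loss(i_valid, i_valid)      # 0.0: fits data and physics
loss_invalid = hybrid_loss(i_invalid, i_valid)  # dominated by the KCL penalty
```

A penalty term like this treats physics as a soft constraint during training; the article's "hard constraint" framing can also be realized by enforcing the laws exactly at inference time, as sketched in the next section.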

Bo Yang
(Vice President, Energy Solution Lab, R&D Division, Hitachi America)

Yang describes this mechanism as “putting a leash on AI.”

“Purely data-driven AI can fabricate statistically ‘likely’ answers when faced with unfamiliar or unseen scenarios. Physics-based AI, however, is bound by immutable physical laws. These laws act as a leash, preventing runaway behavior and ensuring that the AI produces physically valid solutions—even for situations not found in historical data.”
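The "leash" metaphor can be made concrete with a small sketch. Again this is an illustrative toy, not Hitachi's method: whatever a purely statistical model outputs, an orthogonal projection onto the set of Kirchhoff-feasible currents (the null space of a hypothetical incidence matrix `A`) snaps the answer back to a physically valid solution—even for inputs the model has never seen:

```python
import numpy as np

# Same hypothetical toy network: A @ i = 0 expresses Kirchhoff's current law.
A = np.array([
    [ 1,  0, -1],
    [-1,  1,  0],
    [ 0, -1,  1],
], dtype=float)

def leash(pred_currents):
    """Project a raw model prediction onto the KCL-feasible set.

    The feasible set is the null space of A; the orthogonal projector
    I - pinv(A) @ A maps any prediction to the nearest point that
    satisfies the physical law exactly.
    """
    projector = np.eye(A.shape[1]) - np.linalg.pinv(A) @ A
    return projector @ pred_currents

raw = np.array([1.0, 0.2, 1.5])   # statistically "likely" but physically invalid
safe = leash(raw)                 # nearest KCL-consistent currents: A @ safe ≈ 0
```

However unfamiliar the scenario, the projected output can never violate the embedded law—which is precisely what "preventing runaway behavior" means here.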

One clarification is important here. Recently, the term “physical AI” has gained traction, particularly in robotics. However, physics-based AI is fundamentally different.

Kaji notes, “In Japan, ‘physical AI’ is becoming something of a buzzword, but you are quite strict about how it differs from ‘physics-based AI.’ Is it fair to say that what is commonly referred to as physical AI is not the same as physics-based AI?” Shawn Monroe of Hitachi Vantara nods and responds as follows.

“They are not the same. ‘Physical AI’ typically refers to AI that learns from real-world data and executes inference tied to physical actions, such as robot control. Physics-based AI, on the other hand, focuses on constraining the AI’s reasoning process using physical laws. It does not necessarily involve physical motion, but it is grounded in mathematical and physical facts.”

By introducing physics-based AI, SPP is improving both the accuracy and speed of grid interconnection studies—where countless patterns must be evaluated through advanced simulations (see Part 1 for details). This outcome represents the culmination of Yang’s leadership and Hitachi’s accumulated expertise.

Hitachi iQ: Accelerating AI Through Proprietary Infrastructure

Pre-trained generative AI models grow larger and more resource-intensive as their capabilities increase. Ideally, all model data would reside in high-speed DRAM, but capacity constraints make this impractical, forcing AI workloads to rely on large-capacity storage, which offers lower processing speeds.

Physics-based AI places even heavier demands on infrastructure. It requires extremely complex computations and rapid access to massive volumes of simulation data. This is where Hitachi iQ comes into play. Kaji jokingly asks Monroe whether Hitachi iQ is like a “baby” to him, and Monroe responds with a smile, “Yes.”

In conventional systems, data reads and writes pass through the CPU and operating system kernel, creating significant overhead and performance bottlenecks.

Shawn Monroe
(Principal Strategist for AI in Energy, Hitachi Vantara)

Hitachi iQ is architected around a reference design developed in collaboration with NVIDIA. By bypassing the OS kernel and transferring data directly from storage to GPU, it eliminates CPU wait times. Monroe explained its technical advantage with enthusiasm:

“Traditional communication protocols are essentially serialized per I/O, limiting throughput to around 1.6 Gbps due to CPU constraints. Hitachi iQ aggregates multiple ultra-high-speed 800 Gbps connections and streams data directly into GPUs. Combined with Hitachi’s long-standing expertise in large-scale data lake technology, the architecture is designed around a single principle: never let the GPU sit idle.”

This GPU–storage–integrated infrastructure is what enables physics-based AI to run at scale.
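The performance idea—cutting intermediate copies out of the data path—can be illustrated with an everyday analogy. The sketch below is not Hitachi iQ or GPU-direct storage itself; it merely contrasts a conventional buffered read (the kernel copies data into a user-space buffer) with a memory-mapped read (the file's pages are accessed in place). GPU-direct designs apply the same principle far more aggressively, moving data straight from NVMe storage into GPU memory without the OS-kernel detour:

```python
import mmap
import os
import tempfile

# Stand-in for a large simulation dataset.
payload = b"grid-simulation-data" * 1000

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name

# Path 1: conventional read -- the kernel copies the data
# into an intermediate user-space buffer.
with open(path, "rb") as f:
    buffered = f.read()

# Path 2: memory-mapped read -- the file's pages are mapped
# into our address space, skipping the extra buffer copy.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    mapped = bytes(mm)
    mm.close()

os.remove(path)
# Same bytes arrive either way; only the number of copies differs.
```

Fewer copies per read is exactly the "never let the GPU sit idle" principle scaled down to a single file.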

While Hitachi is well known for its storage technology, hardware alone was not the deciding factor. By optimizing software end to end to match GPU characteristics, the team achieved dramatically higher performance—delivering faster processing with less than half the resources typically required.

Applying These Insights to Japan’s Future

Yoshimitsu Kaji
(Senior Principal, Lumada Innovation Hub, Hitachi, Ltd.)

In its mid-term management plan Inspire 2027, announced in April 2025, the Hitachi Group set forth the vision of “Lumada 3.0.” This initiative focuses on transforming social infrastructure by combining Hitachi’s deep domain knowledge—built through decades of on-site experience—with AI, and is being advanced through customer co-creation at the Lumada Innovation Hub Tokyo, where Kaji serves as a Senior Principal.

The Lumada Business Award is held annually to recognize initiatives aligned with this vision. As noted earlier, the SPP project received the Grand Prize for addressing a pressing global challenge—the power crisis—by combining energy expertise with advanced AI in a socially meaningful way.

When Kaji reiterates, “The core philosophy of Lumada is to solve societal challenges through data. How do you see that philosophy reflected in this project?” Monroe responds, “That is exactly the philosophy we strongly resonated with as we moved forward,” adding, “Having the results of that work recognized is a tremendous honor.”

Project leaders Monroe and Yang emphasized that the achievement went beyond deploying technology. By integrating the knowledge and capabilities of five group companies and one R&D division, they delivered a true One Hitachi solution. Monroe commented, “I am extremely proud of what we’re accomplishing together.”

The project will support SPP’s historic USD 7.7 billion transmission grid reinforcement plan, which includes an effort to cut analysis time by 80%, an outcome enabled by this collaborative framework.

At the Lumada Business Award ceremony, Monroe remarked, “I’m proud that we’re achieving something as One Hitachi that no single company could have done alone—and that our efforts were recognized with this award.” Yang added, “As an R&D organization, we are delighted that our work is delivering tangible business value and was acknowledged in this way.”


As a leading example from the U.S., where AI data centers are heavily concentrated, the SPP project has strong global visibility. Hitachi plans to expand this solution across energy markets worldwide, including Japan.

At the end of the interview, Kaji states, “In my view, Japan is about five years behind the United States. When I was involved in the early stages of EV initiatives, I came to realize how much of a time lag there is in infrastructure development. A similar surge in energy demand will inevitably come to Japan as well. What we are seeing in the United States is not something we can afford to view as a distant problem.”

As Yang noted, while Japan’s grid has unique characteristics—such as the coexistence of 50 Hz and 60 Hz systems—the underlying technology is universally applicable. The lessons learned from SPP should translate well to Japan’s specific conditions.

What Hitachi demonstrated through its One Hitachi, end-to-end approach was not merely system deployment, but the importance of holistic optimization—from transmission grids to server racks. This perspective may prove essential in addressing Japan’s future power shortages. There is much Japan can learn from the SPP case as it prepares for the challenges ahead.