
Haiqu’s Agentic Research Workflow Helps Reconstruct Quantum Materials’ Fingerprints


Two early demonstrations show how agentic quantum workflows can turn advanced material design problems into executable experiments on today’s quantum hardware.


Advanced quantum simulations should not require a months-long custom engineering effort each time. That is the central idea behind Haiqu’s Agentic Quantum OS. Frontier quantum R&D can become a structured workflow: from research problem formulation to baselines to optimized quantum execution to interpretable scientific results.

As we built some of the most advanced tools to execute quantum algorithms, we learned that the R&D process matters. The hard part of quantum computing is often not sending a quantum algorithm to a quantum processor, but rather deciding what to compute, how to build and validate a quantum application, how to make the circuit survive hardware noise, and how to interpret the output once the hardware returns data.

From a research or business question to a hardware-backed quantum result.

In chemistry and materials science applications, that challenge is especially sharp. A serious calculation needs a physical model, numerical and theoretical baselines, initial state preparation, time evolution or sampling, hardware-aware optimization, error handling, and scientific interpretation. Combining all of these in an orchestrated workflow is not trivial.


In this post, we put Haiqu’s Agentic Quantum OS to the test on two advanced quantum workflows for materials physics and chemistry, demonstrating that we are entering a new era in building advanced quantum applications. In one example, we searched for the low-energy structure of a strongly correlated electron model. In the other, we reconstructed a magnetic excitation spectrum. Both of these fingerprints are used by scientists to understand the properties of real materials.
 

Why are quantum materials a strong test?

Materials are quantum systems. Their useful properties — conductivity, magnetism, superconductivity, optical response, transport — come from how quantum particles organize and behave collectively.

That is why materials simulation is one of the natural long-term targets for quantum computing.

But the useful object of study is rarely a single algorithmic circuit. It is a physical quantity of the target system, and circuit executions are only one step in computing it. The full computation requires setting up a model, making approximation choices, defining a validation path, executing on hardware, and finally interpreting the result physically.

In the following demonstrations, Haiqu’s system did not merely produce quantum programs. It helped assemble the chain of decisions needed to make the result testable and executable on real quantum hardware.
 

Case 1: Low-energy physics of correlated electrons

The first workflow targeted the single-impurity Anderson model. This is a minimal model with nontrivial many-body physics. It describes one interacting electronic site — the impurity — coupled to a surrounding bath of electrons. Electrons can move through the bath and hop onto the impurity. When two electrons occupy the impurity, they interact strongly.
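For readers who want the standard notation (conventions vary slightly across the literature), the single-impurity Anderson Hamiltonian is usually written as:

```latex
H = \epsilon_d \sum_{\sigma} d^{\dagger}_{\sigma} d_{\sigma}
  + U\, n_{d\uparrow} n_{d\downarrow}
  + \sum_{k\sigma} \epsilon_k\, c^{\dagger}_{k\sigma} c_{k\sigma}
  + \sum_{k\sigma} \left( V_k\, c^{\dagger}_{k\sigma} d_{\sigma} + \mathrm{h.c.} \right)
```

Here the first term is the impurity level, the U term is the strong repulsion felt when two electrons occupy the impurity, the third term is the electron bath, and the last term is the impurity–bath hopping.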


That local interaction is enough to produce correlated-electron behavior that is difficult to simulate.


The model is important because it captures a basic motif in quantum matter: how a local interacting degree of freedom behaves inside a larger electronic environment. This connects to magnetic impurities, correlated conductors, superconducting systems, and quantum devices.


To simulate the low-energy physics of the model, our goal was to build a workflow around sample-based Krylov quantum diagonalization, or SKQD, and use Haiqu’s OS to extend and improve its performance.

[Figure: SKQD scheme]

At a high level, SKQD uses the quantum processor to sample a sequence of related quantum states. A classical solver then operates within that smaller, sample-selected space to identify the low-energy states, rather than attacking the full, exponentially large Hilbert space. 
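To make the idea concrete, here is a minimal classical sketch of the projection-and-diagonalization step. This is not Haiqu’s pipeline: a small Heisenberg chain stands in for the SIAM, and sampling bitstrings from the exact ground state stands in for QPU shots; only NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(7)

def heisenberg_chain(n):
    """Dense spin-1/2 Heisenberg Hamiltonian in the computational (S^z) basis."""
    dim = 2 ** n
    H = np.zeros((dim, dim))
    for s in range(dim):
        for i in range(n - 1):
            if ((s >> i) & 1) == ((s >> (i + 1)) & 1):
                H[s, s] += 0.25                             # aligned bond: Sz.Sz
            else:
                H[s, s] -= 0.25                             # anti-aligned bond
                H[s ^ (1 << i) ^ (1 << (i + 1)), s] += 0.5  # flip-flop term
    return H

n = 8
H = heisenberg_chain(n)
evals, evecs = np.linalg.eigh(H)
e_exact, psi = evals[0], evecs[:, 0]

# Stand-in for the quantum step: draw bitstrings from |psi|^2, as a QPU would.
shots = rng.choice(2 ** n, size=200, p=psi ** 2)
subspace = np.unique(shots)

# Classical step: diagonalize H restricted to the sampled configurations only.
H_sub = H[np.ix_(subspace, subspace)]
e_sub = np.linalg.eigvalsh(H_sub)[0]

print(f"exact ground energy: {e_exact:.6f}")
print(f"subspace estimate ({len(subspace)}/{2 ** n} configs): {e_sub:.6f}")
```

Because the restriction is variational, the subspace estimate is an upper bound that tightens as the samples cover more of the ground state’s support; the point of SKQD is that samples of Krylov states concentrate on exactly the right configurations.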
 

The workflow had to assemble the full calculation: model construction, basis choice, initial state preparation, time-evolution circuits, Krylov sampling, noisy bitstring recovery, projected diagonalization, and comparison against classical exact diagonalization and DMRG references at small scales.


The final computation ran a 40-qubit hardware experiment to find the ground state of a strongly correlated 20-site single-impurity Anderson model (SIAM), and to build a direct comparison with an independent application of the SKQD approach to the SIAM.

Haiqu compression helped preserve SKQD convergence on hardware while reducing the classical post-processing burden.

Hardware experiments gave insights into how noise affects the performance of the method, and how compression can alleviate the problem. In principle, adding more Krylov states should improve the ground-state estimate, but they require deeper circuits, which are more affected by noise on real hardware, slowing the method’s convergence.

 


With Haiqu’s compression stack applied to the Krylov states, the expected convergence behavior was recovered. The compressed workflow also reduced the dimension of the subspace used in the classical post-processing by roughly 2–4× at matched accuracy.


That is not just a nicer plot or faster convergence. In SKQD, the classical post-processing becomes a scaling bottleneck because it involves large-scale classical diagonalization, limiting the method’s ability to attack larger system sizes. Shrinking the subspace dimension while maintaining comparable accuracy directly improves the method’s practicality. It also makes the method less reliant on advanced HPC backends: in this demonstration, the workflow ran locally on an ordinary laptop, and the quantum execution required only a few seconds of QPU time.
 

Case 2: Reconstructing a magnetic fingerprint

The second workflow examined a different kind of material signature: how a magnetic system responds to perturbations, which is how most physical properties are probed. 

In experiments, one of the main tools for this is inelastic neutron scattering. Scientists send neutrons into a crystal and measure how energy and momentum are exchanged with the material. The result is a map of magnetic excitations, called the excitation spectrum.

For a theorist, reproducing the excitation spectrum from a microscopic model is a serious test: it means the model captures the structure and physics of the material. The quantity behind that map is the dynamical structure factor. It connects a Hamiltonian to the magnetic-scattering signal that an experiment should observe.
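For reference, one common convention for the longitudinal component of the dynamical structure factor of a spin chain is (normalization and sign conventions vary):

```latex
S^{zz}(q, \omega) = \frac{1}{2\pi N} \sum_{j,l} e^{-i q (r_j - r_l)}
  \int_{-\infty}^{\infty} \mathrm{d}t\; e^{i \omega t}\,
  \langle S^{z}_{j}(t)\, S^{z}_{l}(0) \rangle
```

Its intensity at a point (q, ω) tells you how strongly the material absorbs a given momentum and energy transfer, which is exactly what a neutron-scattering experiment measures.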

We used Haiqu’s Agentic Quantum OS to build and execute a workflow to reconstruct the dynamical structure factor for real quasi-one-dimensional magnetic materials.

CuDCl (2Dioxane·2H₂O·CuCl₂) is a real quasi-one-dimensional magnetic material. Its copper ions form spin chains that can be modeled as a spin-1/2 Heisenberg antiferromagnet. We reproduce the two-spinon continuum spectrum characteristic of the compound on the IBM Boston quantum computer.

For this workflow, we focused on 2Dioxane·2H₂O·CuCl₂, often abbreviated as CuDCl, as a reference material. It is a metal-organic copper chloride material whose copper ions form nearly one-dimensional magnetic chains.

This quasi-linear structure allows the compound to be modeled by a spin-1/2 Heisenberg chain, a foundational model of quantum magnetism. In this system, the excitations are not simple classical spin waves, but a broad two-spinon continuum — a characteristic signature of the material’s magnetic structure.
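Concretely, the model is the antiferromagnetic spin-1/2 Heisenberg chain:

```latex
H = J \sum_{i} \mathbf{S}_{i} \cdot \mathbf{S}_{i+1}, \qquad J > 0
```

There is essentially one parameter, the exchange coupling J, which sets the energy scale of the magnetic excitations.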

The agentic flow first helped to set up a stack of analytical and numerical benchmarks and validation tests. With their help, it then proceeded to build the complete computation, including the required quantum circuits and their hardware execution.

This computation is more challenging than the previous SKQD example, which only involved ground state, or zero-temperature, properties. At low temperature, the system sits close to its ground state, and the relevant physics is governed by how that state responds to small disturbances. The ground state is only the starting point. To reproduce a neutron-scattering-like observable, we need to disturb the system and follow how it evolves.

The complete quantum pipeline constructed by the Agentic OS involves ground-state preparation, its perturbation and time evolution, circuit optimizations and depth reduction, hardware execution, error shielding, and spectrum reconstruction.
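As a toy illustration of the last steps (perturb, evolve, reconstruct), the following sketch computes a spin-spin correlator for a small Heisenberg chain classically, using the eigenbasis instead of circuits, and reads excitation energies off the Fourier-transformed signal. Everything here (the 6-site chain, the time grid, the S^z probe) is an illustrative assumption, not the hardware workflow.

```python
import numpy as np

def heisenberg_chain(n):
    """Dense spin-1/2 Heisenberg Hamiltonian in the computational (S^z) basis."""
    dim = 2 ** n
    H = np.zeros((dim, dim))
    for s in range(dim):
        for i in range(n - 1):
            if ((s >> i) & 1) == ((s >> (i + 1)) & 1):
                H[s, s] += 0.25
            else:
                H[s, s] -= 0.25
                H[s ^ (1 << i) ^ (1 << (i + 1)), s] += 0.5
    return H

n = 6
E, V = np.linalg.eigh(heisenberg_chain(n))
psi0 = V[:, 0]                                   # ground state

# Perturb: apply S^z on the middle site (a local "kick").
sz_mid = 0.5 - np.array([(s >> (n // 2)) & 1 for s in range(2 ** n)])
phi = sz_mid * psi0

# Evolve: C(t) = <psi0| S^z(t) S^z(0) |psi0>, which in the eigenbasis is
# sum_m |<m|S^z|psi0>|^2 * exp(-i (E_m - E_0) t).
amps = V.T @ phi
dt = 0.1
ts = np.arange(0, 200, dt)
C = (amps ** 2) @ np.exp(-1j * np.outer(E - E[0], ts))

# Reconstruct: Fourier-transform the correlator; spectral peaks sit at the
# excitation energies E_m - E_0 that carry weight.
ws = np.linspace(0.05, 4.0, 800)
spectrum = ((np.exp(1j * np.outer(ws, ts)) * C).sum(axis=1) * dt).real
w_peak = ws[np.argmax(spectrum)]
print(f"dominant spectral peak at omega ~ {w_peak:.2f}")
```

In the real workflow, the correlator comes from hardware executions of time-evolution circuits, and a momentum-resolved sum of such correlators yields the full S(q, ω) map.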

Hardware experiments on a real quantum processor recovered the key physical feature in the relevant spectral window: spectral weight near the antiferromagnetic zone boundary, consistent with the expected two-spinon continuum. 

This demonstration is particularly important because it uses quantum hardware to compute a physically meaningful observable. The output is not only a bitstring distribution or an abstract benchmark. It is a neutron-scattering-like spectrum — the type of object materials scientists use to investigate the properties of real compounds.

The resource footprint was modest: the classical work ran on an ordinary MacBook Air M4 laptop, and the quantum execution used less than 10 minutes of QPU (IBM Boston) time to produce a full energy spectrum from a batch of 160 circuits (some containing up to 2380 two-qubit operations and 14261 operations in total).

This makes a similar kind of simulation readily accessible to ordinary academic groups rather than only national-lab-scale teams.

What made this agentic, and why does this matter?
 


In both cases, Haiqu’s AI did not replace the physicist, but it helped to reduce the hidden engineering burden around quantum research while the human researcher supplied scientific supervision.


The agentic workflow helped structure the work, prepare implementation plans, assemble modules, track baselines, connect execution, and organize the result into something that could be checked.


Quantum R&D often fails at transitions: between theory and code, between an algorithm and its circuit implementation, between a simulator and hardware, and between raw measurements and physical interpretation. Haiqu’s Agentic Quantum OS is designed to make those transitions explicit, inspectable, executable and reproducible.


With these demonstrations, we aim to show something practical for the near term: with the right agentic workflow and software stack, today’s quantum hardware can already participate in meaningful scientific simulation workflows.


That is the role of Haiqu’s Agentic Quantum OS.