Daily Papers

by AK and the research community

Mar 12

Ground State Preparation via Dynamical Cooling

Quantum algorithms for probing ground-state properties of quantum systems require good initial states. Projection-based methods such as eigenvalue filtering rely on inputs that have a significant overlap with the low-energy subspace, which can be challenging for large, strongly-correlated systems. This issue has motivated the study of physically-inspired dynamical approaches such as thermodynamic cooling. In this work, we introduce a ground-state preparation algorithm based on the simulation of quantum dynamics. Our main insight is to transform the Hamiltonian by a shifted sign function via quantum signal processing, effectively mapping eigenvalues into positive and negative subspaces separated by a large gap. This automatically ensures that all states within each subspace conserve energy with respect to the transformed Hamiltonian. Subsequent time evolution with a perturbed Hamiltonian induces transitions to lower-energy states while preventing unwanted jumps to higher-energy states. The approach does not rely on a priori knowledge of energy gaps and requires no additional qubits to model a bath. Furthermore, it makes O(d^{3/2}/ε) queries to the time-evolution operator of the system and O(d^{3/2}) queries to a block-encoding of the perturbation, for d cooling steps and an ε-accurate energy resolution. Our results provide a framework for combining quantum signal processing and Hamiltonian simulation to design heuristic quantum algorithms for ground-state preparation.

  • 4 authors
·
Apr 8, 2024
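
To make the spectral picture above concrete, here is a minimal NumPy sketch that applies a shifted sign function to the eigenvalues of a toy Hermitian matrix and checks the resulting gap. The 16-dimensional random Hamiltonian and the median shift are illustrative choices only; the paper implements this map via quantum signal processing on a block-encoded Hamiltonian, not by exact diagonalisation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy Hermitian stand-in for the system Hamiltonian.
    n = 16
    A = rng.normal(size=(n, n))
    H = (A + A.T) / 2

    evals, evecs = np.linalg.eigh(H)

    # Shifted sign function: eigenvalues below the shift mu map to -1,
    # those above map to +1, so the two subspaces are separated by a
    # gap of 2 regardless of the original level spacings.
    mu = np.median(evals)                     # illustrative shift choice
    fH = evecs @ np.diag(np.sign(evals - mu)) @ evecs.T

    print("original spectral range:", evals.min(), evals.max())
    print("transformed eigenvalues:", np.unique(np.round(np.linalg.eigvalsh(fH), 10)))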

Information Theory and Statistical Mechanics Revisited

The statistical mechanics of Gibbs is a juxtaposition of subjective, probabilistic ideas on the one hand and objective, mechanical ideas on the other. In this paper, we follow the path set out by Jaynes, including elements added subsequently to that original work, to explore the consequences of the purely statistical point of view. We show how standard methods in the equilibrium theory could have been derived simply from a description of the available problem information. In addition, our presentation leads to novel insights into questions associated with symmetry and non-equilibrium statistical mechanics. Two surprising consequences to be explored in further work are that (in)distinguishability factors are automatically predicted from the problem formulation and that a quantity related to the thermodynamic entropy production is found by considering information loss in non-equilibrium processes. Using the problem of ion channel thermodynamics as an example, we illustrate the idea of building up complexity by successively adding information to create progressively more complex descriptions of a physical system. Our result is that such statistical mechanical descriptions can be used to create transparent, computable, experimentally-relevant models that may be informed by more detailed atomistic simulations. We also derive a theory for the kinetic behavior of this system, identifying the nonequilibrium 'process' free energy functional. The Gibbs relation for this functional is a fluctuation-dissipation theorem applicable arbitrarily far from equilibrium that captures the effect of non-local and time-dependent behavior from transient driving forces. Based on this work, it is clear that statistical mechanics is a general tool for constructing the relationships between constraints on system information.

  • 3 authors
·
May 27, 2011
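
The core of the Jaynes programme revisited above can be shown in a few lines: given only a set of energy levels and a mean-energy constraint, maximising the information entropy yields the Gibbs distribution, with the Lagrange multiplier β fixed numerically by the constraint. The three-level system and the target mean energy below are hypothetical.

    import numpy as np
    from scipy.optimize import brentq

    E = np.array([0.0, 1.0, 2.0])   # hypothetical energy levels
    E_target = 0.6                  # hypothetical mean-energy constraint

    def mean_energy(beta):
        w = np.exp(-beta * E)
        return float(E @ w / w.sum())

    # Entropy maximisation subject to <E> = E_target gives p_i ∝ exp(-beta * E_i);
    # beta is the Lagrange multiplier that enforces the constraint.
    beta = brentq(lambda b: mean_energy(b) - E_target, -50.0, 50.0)
    p = np.exp(-beta * E)
    p /= p.sum()
    S = -np.sum(p * np.log(p))      # information entropy of the inferred distribution

    print(f"beta = {beta:.4f}, p = {np.round(p, 4)}, S = {S:.4f}")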

The X-ray Integral Field Unit at the end of the Athena reformulation phase

The Athena mission entered a redefinition phase in July 2022, driven by the imperative to reduce the mission cost at completion for the European Space Agency below an acceptable target, while maintaining the flagship nature of its science return. This notably called for a complete redesign of the X-ray Integral Field Unit (X-IFU) cryogenic architecture towards a simpler active cooling chain. Passive cooling via successive radiative panels at spacecraft level is now used to provide a 50 K thermal environment to an X-IFU-owned cryostat. 4.5 K cooling is achieved via a single remote active cryocooler unit, while a multi-stage Adiabatic Demagnetization Refrigerator ensures heat lift down to the 50 mK required by the detectors. Amidst these changes, the core concept of the readout chain remains robust, employing Transition Edge Sensor microcalorimeters and a SQUID-based Time-Division Multiplexing scheme. Noteworthy is the introduction of a slower pixel. This enables an increase in the multiplexing factor (from 34 to 48) without compromising the instrument energy resolution, hence keeping significant system margins to the new 4 eV resolution requirement. This allows reducing the number of channels by more than a factor of two, and thus the resource demands on the system, while keeping a 4' field of view (compared to 5' before). In this article, we will give an overview of this new architecture, before detailing its anticipated performance. Finally, we will present the new X-IFU schedule, with its short term focus on demonstration activities towards a mission adoption in early 2027.

  • 282 authors
·
Feb 15, 2025
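
The "more than a factor of two" channel reduction quoted above can be sanity-checked with ratios alone, assuming the number of readout channels scales as pixel count divided by multiplexing factor and the pixel count scales with field-of-view area. Absolute pixel counts are not given in the abstract, so this is only a consistency check.

    # N_channels ∝ N_pixels / mux_factor, with N_pixels ∝ (field of view)^2.
    fov_area_ratio = (5.0 / 4.0) ** 2    # field of view reduced from 5' to 4'
    mux_ratio = 48.0 / 34.0              # multiplexing factor raised from 34 to 48

    channel_reduction = fov_area_ratio * mux_ratio
    print(f"channel reduction ~ {channel_reduction:.2f}x")   # ~2.2, i.e. more than a factor of two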

Observational signatures of mixing-induced cooling in the Kelvin-Helmholtz instability

Cool (≈10^4 K), dense material permeates the hot (≈10^6 K), tenuous solar corona in the form of coronal condensations, for example prominences and coronal rain. As the solar atmosphere evolves, turbulence can drive mixing between the condensations and the surrounding corona, with the mixing layer exhibiting an enhancement in emission from intermediate-temperature (≈10^5 K) spectral lines, which is often attributed to turbulent heating within the mixing layer. However, radiative cooling is highly efficient at intermediate temperatures, and numerical simulations have shown that radiative cooling can far exceed turbulent heating in prominence-corona mixing scenarios. As such, the mixing layer can have a net loss of thermal energy, i.e., the mixing layer is cooling rather than heating. Here, we investigate the observational signatures of cooling processes in Kelvin-Helmholtz mixing between a prominence thread and the surrounding solar corona through 2D numerical simulations. Optically thin emission is synthesised for Si IV, along with optically thick emission for Hα, Ca II K and Mg II h using Lightweaver. The Mg II h line probes the turbulent mixing layer, whereas Hα and Ca II K form within the thread and along its boundary, respectively. As the mixing evolves, intermediate temperatures form, leading to an increase in Si IV emission, which coincides with increased radiative losses. The simulation is dominated by cooling in the mixing layer, rather than turbulent heating, and yet enhanced emission in warm lines is produced. As such, an observational signature of decreased emission in cooler lines and increased emission in hotter lines may be a signature of mixing, rather than an implication of heating.

  • 3 authors
·
Jan 20, 2025
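
The key observational point above, that a net-cooling mixing layer can still brighten in warm lines, follows from optically thin emission being weighted by a contribution function peaked near the line's formation temperature (~10^5 K for Si IV). The sketch below uses a Gaussian contribution function and a tanh temperature profile as illustrative stand-ins; the paper's synthesis uses Lightweaver and real atomic data, which are not reproduced here.

    import numpy as np

    # 1D cut across a prominence-corona boundary (log10 temperatures in K).
    x = np.linspace(-1.0, 1.0, 401)
    dx = x[1] - x[0]
    logT_cool, logT_hot = 4.0, 6.0

    # Sharp interface (no mixing) vs. tanh-smoothed mixing layer.
    logT_sharp = np.where(x < 0, logT_cool, logT_hot)
    logT_mixed = logT_cool + (logT_hot - logT_cool) * 0.5 * (1 + np.tanh(x / 0.2))

    # Illustrative Gaussian contribution function peaked at log T ~ 4.9 (Si IV-like);
    # densities are omitted to isolate the temperature dependence.
    def G(logT, logT0=4.9, width=0.15):
        return np.exp(-((logT - logT0) / width) ** 2)

    I_sharp = G(logT_sharp).sum() * dx
    I_mixed = G(logT_mixed).sum() * dx
    print(f"mixing boosts intermediate-T emission by ~{I_mixed / max(I_sharp, 1e-300):.1e}x")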

Single-shot thermometry of simulated Bose–Einstein condensates using artificial intelligence

Precise determination of thermodynamic parameters in ultracold Bose gases remains challenging due to the destructive nature of conventional measurement techniques and inherent experimental uncertainties. We demonstrate an artificial intelligence approach for rapid, non-destructive estimation of the chemical potential and temperature from single-shot, in situ imaged density profiles of finite-temperature Bose gases. Our convolutional neural network is trained exclusively on quasi-2D 'pancake' condensates in harmonic trap configurations, and achieves parameter extraction within fractions of a second. The model also demonstrates zero-shot generalisation across both trap geometry and thermalisation dynamics, successfully estimating thermodynamic parameters for toroidally trapped condensates with errors of only a few nanokelvin despite no prior exposure to such geometries during training, and maintaining predictive accuracy during dynamic thermalisation processes after a relatively brief evolution without explicit training on non-equilibrium states. These results suggest that supervised learning can overcome traditional limitations in ultracold atom thermometry, with extension to broader geometric configurations, temperature ranges, and additional parameters potentially enabling comprehensive real-time analysis of quantum gas experiments. Such capabilities could significantly streamline experimental workflows whilst improving measurement precision across a range of quantum fluid systems.

  • 3 authors
·
Jun 20, 2025
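
A minimal sketch of the regression setup described above, assuming a PyTorch-style pipeline: a small convolutional network maps a single in-situ density image to two scalars, chemical potential and temperature. The architecture, image size, and training targets below are hypothetical; the abstract does not specify the network used.

    import torch
    import torch.nn as nn

    class ThermometryCNN(nn.Module):
        """Map a single-shot density image to (chemical potential, temperature)."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, 2)   # outputs (mu, T) in normalised units

        def forward(self, x):
            return self.head(self.features(x).flatten(1))

    model = ThermometryCNN()
    density = torch.rand(8, 1, 64, 64)     # batch of hypothetical in-situ images
    pred = model(density)                  # shape (8, 2)

    # Supervised regression against labels from finite-temperature simulations.
    loss = nn.MSELoss()(pred, torch.rand(8, 2))
    loss.backward()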

Superpositions of thermalisations in relativistic quantum field theory

Recent results in relativistic quantum information and quantum thermodynamics have independently shown that in the quantum regime, a system may fail to thermalise when subject to quantum-controlled application of the same, single thermalisation channel. For example, an accelerating system with fixed proper acceleration is known to thermalise to an acceleration-dependent temperature, known as the Unruh temperature. However, the same system in a superposition of spatially translated trajectories that share the same proper acceleration fails to thermalise. Here, we provide an explanation of these results using the framework of quantum field theory in relativistic noninertial reference frames. We show how a probe that accelerates in a superposition of spatial translations interacts with incommensurate sets of field modes. In special cases where the modes are orthogonal (for example, when the Rindler wedges are translated in a direction orthogonal to the plane of motion), thermalisation does indeed result, corroborating the explanation provided here. We then discuss how this description relates to an information-theoretic approach aimed at studying quantum aspects of temperature through quantum-controlled thermalisations. The present work draws a connection between research in quantum information, relativistic physics, and quantum thermodynamics, in particular showing that relativistic quantum effects can provide a natural realisation of quantum thermodynamical scenarios.

  • 2 authors
·
Jul 5, 2023
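
For scale, the Unruh temperature referenced above is T_U = ħa / (2πc k_B) for proper acceleration a. The sketch below simply evaluates the formula, showing why the effect is far below laboratory reach at everyday accelerations.

    import math

    hbar = 1.054571817e-34   # J s
    c = 2.99792458e8         # m/s
    k_B = 1.380649e-23       # J/K

    def unruh_temperature(a):
        """T_U = hbar * a / (2 * pi * c * k_B), for proper acceleration a in m/s^2."""
        return hbar * a / (2 * math.pi * c * k_B)

    print(f"a = 9.81 m/s^2  ->  T_U ~ {unruh_temperature(9.81):.2e} K")        # ~4e-20 K
    print(f"T_U = 1 K requires a ~ {1.0 / unruh_temperature(1.0):.2e} m/s^2")  # ~2.5e20 m/s^2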

Time evolution of the Boltzmann entropy for a nonequilibrium dilute gas

We investigate the time evolution of the Boltzmann entropy of a dilute gas of N particles, N ≫ 1, as it undergoes a free expansion doubling its volume. The microstate of the system, a point in the 4N-dimensional phase space, changes in time via Hamiltonian dynamics. Its entropy, at any time t, is given by the logarithm of the phase-space volume of all the microstates giving rise to its macrostate at time t. The macrostates that we consider are defined by coarse-graining the one-particle phase space into cells Δ_α. The initial and final macrostates of the system are equilibrium states in volumes V and 2V, with the same energy E and particle number N. Their entropy per particle is given, for sufficiently large systems, by the thermodynamic entropy as a function of the particle and energy density, whose leading term is independent of the size of the Δ_α. The intermediate (non-equilibrium) entropy does, however, depend on the size of the cells Δ_α. Its change with time is due to (i) dispersal in physical space from free motion and (ii) the collisions between particles, which change their velocities. The former depends strongly on the size of the velocity coarse-graining Δv: it produces entropy at a rate proportional to Δv. This dependence is investigated numerically and analytically for a dilute two-dimensional gas of hard discs. It becomes significant when the mean free path between collisions is of the same order as, or larger than, the length scale of the initial spatial inhomogeneity. In the opposite limit, the rate of entropy production is essentially independent of Δv and is given by the Boltzmann equation in the limit Δv → 0. We show that when both processes are active, the time dependence of the entropy has a scaling form involving the ratio of the rates of its production by the two processes.

  • 4 authors
·
Mar 12, 2024
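
One common discretisation of the coarse-grained Boltzmann entropy described above is S/N = -Σ_α f_α ln(f_α/|Δ_α|), with f_α the fraction of particles in phase-space cell Δ_α. The sketch below evaluates it for a random snapshot at several velocity bin widths Δv; the particle data and cell sizes are illustrative, whereas the paper evolves hard discs under Hamiltonian dynamics.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    # Hypothetical snapshot: positions in the unit square, Gaussian velocities.
    pos = rng.uniform(0.0, 1.0, size=(N, 2))
    vel = rng.normal(0.0, 1.0, size=(N, 2))

    def entropy_per_particle(pos, vel, nx=10, dv=0.5, vmax=5.0):
        """S/N = -sum_a f_a ln(f_a / |Delta_a|) over cells of size (1/nx)^2 * dv^2."""
        nv = int(round(2 * vmax / dv))
        edges = [np.linspace(0, 1, nx + 1)] * 2 + [np.linspace(-vmax, vmax, nv + 1)] * 2
        counts, _ = np.histogramdd(np.hstack([pos, vel]), bins=edges)
        f = counts / counts.sum()
        cell_vol = (1.0 / nx) ** 2 * dv ** 2
        nz = f > 0
        return float(-np.sum(f[nz] * np.log(f[nz] / cell_vol)))

    # Out of equilibrium the entropy depends on the velocity coarse-graining Δv;
    # even this equilibrium-like sample shows the sensitivity to the bin width.
    for dv in (0.25, 0.5, 1.0):
        print(f"Δv = {dv}: S/N ≈ {entropy_per_particle(pos, vel, dv=dv):.3f}")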

TOMATOES: Topology and Material Optimization for Latent Heat Thermal Energy Storage Devices

Latent heat thermal energy storage (LHTES) systems are compelling candidates for energy storage, primarily owing to their high storage density. Improving their performance is crucial for developing the next generation of efficient and cost-effective devices. Topology optimization (TO) has emerged as a powerful computational tool to design LHTES systems by optimally distributing a high-conductivity material (HCM) and a phase change material (PCM). However, conventional TO is typically limited to optimizing the geometry for a fixed, pre-selected material. This approach does not leverage the large and expanding databases of novel materials. Consequently, the co-design of material and geometry for LHTES remains a challenging and unexplored problem. To address this limitation, we present an automated design framework for the concurrent optimization of material choice and topology. A key challenge is the discrete nature of material selection, which is incompatible with the gradient-based methods used for TO. We overcome this by using a data-driven variational autoencoder (VAE) to project discrete material databases for both the HCM and PCM onto continuous and differentiable latent spaces. These continuous material representations are integrated into an end-to-end differentiable, transient nonlinear finite-element solver that accounts for phase change. We demonstrate this framework on a problem aimed at maximizing the discharged energy within a specified time, subject to cost constraints. The effectiveness of the proposed method is validated through several illustrative examples.

  • 3 authors
·
Oct 8, 2025
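
The central trick above, making a discrete material choice differentiable, can be sketched compactly: a small VAE embeds a table of material properties into a continuous latent space, and the decoder then supplies differentiable properties to the downstream objective. The four-entry property table, network sizes, and toy objective below are hypothetical; the paper couples such latents to a transient nonlinear finite-element phase-change solver, which is not reproduced here.

    import torch
    import torch.nn as nn

    # Hypothetical HCM database: normalised (thermal conductivity, cost) per material.
    materials = torch.tensor([[1.00, 0.90],
                              [0.55, 0.30],
                              [0.20, 0.10],
                              [0.08, 0.05]])

    class MaterialVAE(nn.Module):
        def __init__(self, n_props=2, latent=1):
            super().__init__()
            self.enc = nn.Sequential(nn.Linear(n_props, 8), nn.Tanh(), nn.Linear(8, 2 * latent))
            self.dec = nn.Sequential(nn.Linear(latent, 8), nn.Tanh(), nn.Linear(8, n_props))

        def forward(self, x):
            mu, logvar = self.enc(x).chunk(2, dim=-1)
            z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterisation trick
            return self.dec(z), mu, logvar

    vae = MaterialVAE()
    opt = torch.optim.Adam(vae.parameters(), lr=1e-2)
    for _ in range(3000):   # fit a continuous embedding of the discrete table
        recon, mu, logvar = vae(materials)
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        loss = nn.functional.mse_loss(recon, materials) + 1e-3 * kl
        opt.zero_grad(); loss.backward(); opt.step()

    # Downstream, the design variable is the latent coordinate z: decoded
    # properties would feed the differentiable thermal solver, and gradients
    # flow back through vae.dec to update both geometry and material choice.
    z = torch.zeros(1, 1, requires_grad=True)
    props = vae.dec(z)
    objective = -props[0, 0] + 0.5 * props[0, 1]   # toy: conductivity vs. cost trade-off
    objective.backward()
    print("dJ/dz =", z.grad)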