Transforming Lindblad Equations Into Systems of Real-Valued Linear Equations: Performance Optimization and Parallelization of an Algorithm
Meyerov, I.; Kozinov, E.; Liniov, A.; Volokitin, V.; Yusipov, I.; Ivanchenko, M.; Denysov, S.
Journal article, Peer reviewed
Published version
Date
2020-10-06
Original version
Meyerov I, Kozinov E, Liniov A, Volokitin V, Yusipov I, Ivanchenko M, Denysov S. Transforming Lindblad Equations Into Systems of Real-Valued Linear Equations: Performance Optimization and Parallelization of an Algorithm. Entropy. 2020;22. https://doi.org/10.3390/e22101133

Abstract
With their constantly increasing peak performance and memory capacity, modern supercomputers offer new perspectives on numerical studies of open many-body quantum systems. These systems are often modeled using Markovian quantum master equations describing the evolution of the system density operators. In this paper, we address master equations of the Lindblad form, which are popular theoretical tools in quantum optics, cavity quantum electrodynamics, and optomechanics. By using the generalized Gell–Mann matrices as a basis, any Lindblad equation can be transformed into a system of ordinary differential equations with real coefficients. Recently, we presented an implementation of the transformation with computational complexity scaling as O(N^5 log N) for dense Lindbladians and O(N^3 log N) for sparse ones. However, prohibitive memory costs remain a serious obstacle on the way to large models. Here, we present a parallel cluster-based implementation of the algorithm and demonstrate that it allows us to integrate a sparse Lindbladian model of dimension N = 2000 and a dense random Lindbladian model of dimension N = 200 using 25 nodes with 64 GB RAM per node.
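
To illustrate the transformation the abstract refers to, the following is a minimal, brute-force sketch, not the authors' optimized or parallelized implementation: the density matrix is expanded in an orthonormal basis of generalized Gell-Mann matrices, and the Lindblad generator is projected onto that basis, yielding a real-valued linear system dv/dt = A v + b for the coefficient vector v. All function names and the toy qubit parameters below are illustrative assumptions; the direct projection used here scales much worse than the algorithm reported in the paper.

"""Sketch: Lindblad equation -> real linear system via generalized Gell-Mann basis.
Brute-force projection for small N; illustrative only, not the paper's algorithm."""
import numpy as np


def gell_mann_basis(n):
    """Return the n^2 - 1 generalized Gell-Mann matrices, normalized so that
    Tr(F_i F_j) = delta_ij (orthonormal under the Hilbert-Schmidt product)."""
    basis = []
    # Symmetric and antisymmetric off-diagonal matrices.
    for j in range(n):
        for k in range(j + 1, n):
            s = np.zeros((n, n), dtype=complex)
            s[j, k] = s[k, j] = 1.0 / np.sqrt(2.0)
            basis.append(s)
            a = np.zeros((n, n), dtype=complex)
            a[j, k] = -1j / np.sqrt(2.0)
            a[k, j] = 1j / np.sqrt(2.0)
            basis.append(a)
    # Diagonal traceless matrices.
    for l in range(1, n):
        d = np.zeros((n, n), dtype=complex)
        d[:l, :l] = np.eye(l)
        d[l, l] = -l
        basis.append(d / np.sqrt(l * (l + 1)))
    return basis


def lindbladian(rho, H, jump_ops):
    """Action of the Lindblad generator on a matrix rho."""
    out = -1j * (H @ rho - rho @ H)
    for L in jump_ops:
        out += L @ rho @ L.conj().T - 0.5 * (L.conj().T @ L @ rho + rho @ L.conj().T @ L)
    return out


def real_linear_system(H, jump_ops):
    """Build real A, b such that the coefficient vector v(t) of
    rho = I/n + sum_i v_i F_i obeys dv/dt = A v + b."""
    n = H.shape[0]
    F = gell_mann_basis(n)
    m = len(F)                      # m = n^2 - 1
    A = np.empty((m, m))
    b = np.empty(m)
    L_id = lindbladian(np.eye(n, dtype=complex) / n, H, jump_ops)
    for i in range(m):
        b[i] = np.trace(F[i] @ L_id).real
    for j in range(m):
        LFj = lindbladian(F[j], H, jump_ops)
        for i in range(m):
            # Real because L preserves Hermiticity and the F_i are Hermitian.
            A[i, j] = np.trace(F[i] @ LFj).real
    return A, b


if __name__ == "__main__":
    # Toy check: a driven, damped qubit (n = 2), parameters chosen arbitrarily.
    H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)
    gamma = 0.1
    sigma_minus = np.array([[0.0, 1.0], [0.0, 0.0]], dtype=complex)
    A, b = real_linear_system(H, [np.sqrt(gamma) * sigma_minus])
    print("A shape:", A.shape)                       # (3, 3) for a qubit
    print("steady-state vector:", np.linalg.solve(A, -b))

For a qubit the resulting system has only 3 real variables, but for N = 2000 it has N^2 - 1 (roughly 4 million) of them, which is what motivates the memory-aware, cluster-parallel implementation reported in the paper.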