Who ensures that the solutions provided for quantum computing assignments are scalable to large-scale applications? Ensuring that all of the methods discussed so far use an appropriate solution structure has been important for large-scale quantum computing assessments in a variety of ways ([@bib23]).

Motivation {#s0049}
==========

Quantum computation has been identified as a core component of massively parallel computing since at least 2000, as we noted in our publication in the *Routledge Annals of Computational Physics*, even though the quantum world was not yet fully understood at that time. Since the release of *The Quantum Computation Ensemble* ([@bib1]) in 2015, quantum computing has been analyzed at a variety of scales; despite its very low computational complexity, quantum computing is expected to be observed far more widely in the future.

We want to outline a few relevant changes that can help policymakers analyze quantum computing outcomes ([Fig. 1](#f0005){ref-type="fig"}). The most important one is the difference between quantum computation and conventional micro-controller design. A digital microcontroller (e.g., a quantum controller or quantum device) is typically designed to operate in a coherent order, in which the two are associated with the same physical input. The state of a quantum computer is typically arranged so that the two together define a quantum state, while the classical state is defined using the state of the quantum device. The classical device relies on knowledge of the true state of the device, since that state must be clearly recognizable to both computers. However, if the device is in the wrong quantum state, it must run until that state is distinguishable from the true state of the target system. The fact that at least 10% of all quantum computing cycles are executed on the device has led us to conclude that the device needs to generate two-bit states in order to make quantum computation possible.
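To make this last point concrete, the following is a minimal sketch, assuming nothing beyond NumPy, of the distinction between a classical two-bit register and a two-qubit quantum state: the classical register holds one of four definite values, whereas the quantum register is a normalized vector of four complex amplitudes. The particular state prepared below is an illustrative example, not the procedure described above.

```python
# A minimal sketch (assuming only NumPy) contrasting a classical two-bit
# register with a two-qubit quantum state; the chosen state is illustrative.
import numpy as np

# A classical two-bit register holds exactly one of four definite values.
classical_state = 0b10  # e.g. the bit pattern "10"

# A two-qubit quantum state is a normalized vector of four complex
# amplitudes over the basis |00>, |01>, |10>, |11>.
ket_00 = np.array([1, 0, 0, 0], dtype=complex)
ket_11 = np.array([0, 0, 0, 1], dtype=complex)

# Example: an equal superposition of |00> and |11>.
psi = (ket_00 + ket_11) / np.sqrt(2)

# Measuring in the computational basis yields each outcome with
# probability |amplitude|^2; here 0.5 for "00" and 0.5 for "11".
probabilities = np.abs(psi) ** 2
for bits, p in zip(["00", "01", "10", "11"], probabilities):
    print(f"P({bits}) = {p:.2f}")

# The classical register, by contrast, is fully determined by a single
# read-out of its definite value.
print(f"classical register reads: {classical_state:02b}")
```

The sketch only illustrates why a two-qubit state cannot be summarized by two classical bits; distinguishing such states on a real device involves repeated measurement, which the example does not attempt to model.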

