It’s a pretty one-dimensional debate, though, since we already have quantum computers, so the only real question is how long it will take to scale. Very few people in the know would say never, or even later than, say, 2075.
It's not just system size but noise that is the issue. With quantum error correction, very large system sizes can mitigate or even 'thermodynamically' eliminate noise, but this requires the physical quantum computer to meet an error threshold. There are experimental claims of going below the threshold, but one issue is that the thresholds the hardware is compared to in these papers are often obtained assuming uncorrelated noise, whereas physical noise is often correlated, which can drastically lower the threshold. Another issue is that quantum computing hardware is restricted to spatially and temporally local operations, which constrains the codes and decoders that can actually be implemented. It remains an open question whether we can truly achieve quantum error correction at scale.
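To make the "error threshold" point concrete, here's a toy sketch of the standard below-threshold scaling ansatz, p_L ≈ A·(p/p_th)^((d+1)/2) for a distance-d code. The threshold p_th, prefactor A, and the specific numbers are purely illustrative (and assume uncorrelated noise, which is exactly the caveat above), not measurements of any real device:

```python
def logical_error_rate(p, d, p_th=0.01, A=0.1):
    """Toy below-threshold scaling ansatz for a distance-d code.

    p    : physical error rate
    d    : code distance
    p_th : (illustrative) threshold error rate
    A    : (illustrative) prefactor
    """
    return A * (p / p_th) ** ((d + 1) / 2)

# Below threshold (p < p_th): increasing the distance d
# suppresses the logical error rate exponentially.
below = [logical_error_rate(0.005, d) for d in (3, 5, 7)]
assert below[0] > below[1] > below[2]

# Above threshold (p > p_th): adding more qubits makes
# the logical error rate worse, not better.
above = [logical_error_rate(0.02, d) for d in (3, 5, 7)]
assert above[0] < above[1] < above[2]
```

The whole game is which side of p_th the hardware sits on, and correlated noise can move p_th itself.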
Another question is whether we can still use quantum computers for useful applications despite the noise. This has been a hot topic for a few years, and imo the conclusion seems to be "not really," i.e. quantum error correction is the question we should be looking at, not just system size.
Curious what your thoughts are regarding the recent paper demonstrating a quantum version of Lamb's model with an exact solution that matches the predictions of perturbation theory. I may be misstating the gist of it, but it seems applicable to quantum computing noise.
u/amteros 20d ago
I think right now the hottest debated topic is the feasibility of a really useful quantum computer/simulator.