What are we going to do with quantum computers? (technologyreview.com)
140 points by hunglee2 on Feb 25, 2018 | 132 comments


I wonder whether we'll first use quantum computers to perform cryptanalysis against archived unencrypted traffic.

We've conducted ourselves for decades under the assumption that the cryptographic invariants securing internet communications would hold. Looking to the future, we can switch to quantum-resistant ciphers and go on with our lives.

But the past? Mass decryption of archived communication would lead to learning things we never wanted to know. It would cause economic ruin, social turmoil, and worse on a huge scale. And it may now be inevitable. Everyone has secrets.

We should start moving toward quantum-resistant cryptography now so that by the time these machines become practical, sensitive information will have fallen out of most archives.


> We should start moving toward quantum-resistant cryptography now so that by the time these machines become practical, sensitive information will have fallen out of most archives.

I think we actually are. NIST already gathered submissions for post-quantum crypto[0] last year.

And people like djb are also clearly interested in it[1][2].

[0] https://csrc.nist.gov/projects/post-quantum-cryptography/rou...

[1] https://blog.cr.yp.to/20170719-pqbench.html

[2] https://sphincs.org/


I know very little about the topic. Will post-quantum crypto require new hardware? The few techniques I have heard about rely on special networks capable of transmitting quantum particles, not just bits.

If new network infrastructure is required, I can't imagine how we would ever get quantum computer resistant cryptography before it's way too late.


>Will post-quantum crypto require new hardware?

No. We just need to switch from existing public-key cryptographic algorithms to quantum-resistant public-key cryptographic algorithms. There are already a variety of such algorithms developed (see https://en.wikipedia.org/wiki/Post-quantum_cryptography). All that remains is to study them further and agree on a standard.

Symmetric ciphers such as AES are already quantum-resistant for 256-bit keys. (Grover's algorithm is the best known quantum attack against such algorithms. Cracking an N-bit key on a quantum computer with Grover's algorithm is approximately as hard as cracking an (N/2)-bit key classically.)
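To make the halving concrete, here's the back-of-the-envelope arithmetic. This is a toy sketch that counts idealized key tests and ignores all real-world constants and overheads:

```python
# Idealized step counts for brute-force key search.
def classical_steps(key_bits):
    return 2 ** key_bits            # exhaustive search over 2^N keys

def grover_steps(key_bits):
    # Grover's algorithm needs on the order of sqrt(2^N) = 2^(N/2) iterations.
    return 2 ** (key_bits // 2)

# AES-256 attacked with Grover still costs ~2^128 steps, i.e. the same
# ballpark as classically brute-forcing a 128-bit key.
print(grover_steps(256) == classical_steps(128))  # True
```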


> It would cause economic ruin, social turmoil, and worse on a huge scale.

I don't think this is true at all. No one has the capacity to store everything, and governments would be the ones that come closest. Why would they cause all those bad things? It does not make sense.

I am not saying they won't use it, but it will not have anywhere near the effects you list.


> No one has the capacity to store everything

Don't be so sure about that: https://en.wikipedia.org/wiki/Utah_Data_Center


1. Even if it is of infinite capacity, one datacenter will not be able to store everything. The data needs to be moved there as well.

2. My point was that there's no way NSA will use the data stored there in ways that will give the effects mentioned in the grandparent post.


The person storing it and the person accessing it don't necessarily have to work for the same nation-state.


> I wonder whether we'll first use quantum computers to perform cryptanalysis against archived unencrypted traffic.

To break contemporary public-key crypto you need quantum computers orders of magnitude larger than anything built so far. To break something like Curve25519 you need 1600 or so logical qubits.


Are there any credible opinions on whether quantum computers will experience Moore’s Law-like exponential growth or not?


Serious question: Why would it? Isn’t “Moore’s Law” just an empirical observation about late 20th century technology that is trotted out in front of undergraduates, confusing them about what a “law” even is? There’s no theoretical reason to expect it to apply to any other information systems, is there?


Moore’s law confused the hell out of me as a kid. It went against everything I understood about what a law meant, and it made 8-9-year-old me think that there was some mystical physical force that essentially required processing speed to increase.

I couldn’t understand how that worked and for the longest time thought either I was really stupid, or the whole world had lost its mind promoting something that obviously could not be true.

It took a couple of years for me to realize that it simply wasn’t a “law” in the first place and to recognize the difference between academic speak and popular media.


If I had left out the phrase "Moore's Law-like" would the question have been less controversial?

It's not that uncommon for new technologies to experience exponential improvement for some period of time. It doesn't seem like an unreasonable question to ask.


Moore's Law was based on past experiences, understanding of the processes and deductions about what future issues are likely to be based on current trends.

My gut feeling is that this question right now is like asking for Moore's Law in 1950.


I think it's closer to the 90s. The tooling to make very small features consistently already exists at good enough performance and they can fabricate much of this with the same systems used for traditional chips. Not the same materials being deposited, but they don't have to invent x-ray lithography and CVD/FIB/E-Beam Lithography/ALD/etc.

The hard part is figuring out the best general engineering approach, i.e. what to make the qubits out of, how to entangle them, what materials to use. Once you figure that out and do small demonstration units I'd fully expect the right design to be scalable quickly. But the actual production of the right design should be less of a barrier.

Still a barrier, but nowhere near 50s levels.


There are credible opinions about everything, including credible opinions from experts that the laws of physics won't allow quantum computers to ever operate at the scale we're talking about here.


Do you have a reference?



Thank you,


Sensitive information won't fall out of NSA's Utah facility. They will also be among the first to get quantum code breakers. There is going to be a rapture of sorts. A lot of people will disappear.


What do you mean? Are you saying that people that have aided terrorists or whatnot will be held to account? Is that a bad thing?


Without due process: yes, it is a bad thing.


Google has experimented with post-quantum algorithms too: https://security.googleblog.com/2016/07/experimenting-with-p...


Do not worry about this. The powers that control this ability also happen to control the information we receive. Thus nothing that catastrophic will happen (I hope).

The only thing THEY will gain is putting together the missing pieces of some puzzles (or discovering new pieces/puzzles).


> cryptanalysis against archived UNencrypted traffic

What?


Gah. I meant encrypted. That's the only way the sentence makes sense. :-)


> Serious quantum computers ARE FINALLY HERE

That would imply both the hardware and the software are ready for production use.

So is it just me, or are MIT Technology Review writers misleading common people into thinking quantum computing will be usable in a couple of years?


Since people are disputing it, we took that bit out of the title above.


Yes. From my layman's view the fact that there are serious scientists in the field who still doubt that practical quantum computation is even realistically possible surely indicates we're not on the verge of being able to order one for delivery.


On the other hand I'm sure you could find lots of other achievements where some scientists were denying it was possible right up to the point it was achieved.


Could you?

Sure, we can all think of cases where someone said something was impossible and then eventually it was achieved.

But are there any cases where the expert consensus on the physical possibility of something wasn't established until it was achieved in practice? I can't think of one off the top of my head.

Nuclear energy? No, the physical possibility of this was understood in the thirties; the objection was that, as one scientist put it, you would have to turn a whole country into a uranium refinery. Which wasn't exactly wrong; the Manhattan project took engineering resources in the ballpark of a small country.

Supersonic flight? No, artificial objects had been going supersonic for centuries.

Spaceflight? No, the physics of this was well understood long before it was achieved.

Any that I'm missing?


Right, people had consensus on the physical possibility, but not on whether the engineering problems were surmountable.

But I think there is a consensus on the physical possibility of quantum computing in principle. The [threshold theorem](https://en.wikipedia.org/wiki/Quantum_threshold_theorem) is very convincing.


Be careful: there's an important distinction. There is a consensus that quantum computers can, for certain problems, perform computation exponential in the number of qubits. But the often unspoken assumption is that the difficulty of getting the computation to stay coherent, is polynomial in the number of qubits. That's currently the big unknown. If it's not, then that's what would be meant by practical quantum computers being physically impossible.


Is that not the problem that the threshold theorem solves?


As I understand it (disclaimer: not a physicist), that's the open question: does there exist a stable configuration of atoms that will create conditions such that the threshold theory applies in full, errors are adequately corrected, and an N-qubit quantum computer can be built and operated for cost polynomial rather than exponential in N? Or does that line of thinking rely on assumptions that can't actually be met?


Quantum Physics.

"I, at any rate, am convinced that He [God] does not throw dice."


Errr, Einstein fully accepted (as did most others, fairly quickly) that quantum mechanics was real -- in fact the wave-particle duality of photons was something he developed (as part of his theory explaining the photoelectric effect).

The "does not play dice" remark was mostly aimed at Heisenberg's uncertainty principle; Einstein was determined to find a way around the predicted lack of predictability of the future.


"If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong."

https://www.brainyquote.com/quotes/arthur_c_clarke_100793

> On the other hand I'm sure you could find lots of other achievements where some scientists were denying it was possible right up to the point it was achieved.

True and I'm very aware that I'm creeping towards the age where I'm likely to be on the wrong side of these debates. (Not that I'm distinguished or even a scientist)

However - the distinguished, elderly scientists are sometimes right.


That only seems true with hindsight bias, as distinguished elderly scientists are most memorable when they are wrong. Being right 99.9% of the time would not be enough if there are millions of you and people only care about a few quotes.

Granted, you can't be proved correct if you say some action is impossible as someone can always wait to see if you're wrong.


Or, you know, it's a BS quote, extrapolating from too few historical examples (and even those not well understood), and not some scientific law.


That's missing the point. Until it's actually achieved it isn't "finally here".


It's reverse psychology. Surely, if they are in the field, they want it to be true. I frequently find myself assuming a contrarian position in debates just to provoke more arguments, effectively contradicting myself over the course of different debates. In that sense, I just want to know if that has a name in psychology as a normal or pathological condition. I mean, it might be purely rhetorical.

Also, if it's not "crypto season", what is it actually, nano, cryo, opto, ..?


Any good scientist should be the biggest skeptic of their own work. Though I usually poke holes in my own ideas in private ;-)


Technology Review decided to go full New Scientist 20 years ago. It's entertaining sometimes, but not a reliable source for the state of a research field.


When did New Scientist itself go full New Scientist? I've stumbled across New Scientist articles from the 1970s when performing searches on Google Books and it reads like a completely different publication.


I don't know. I didn't start to pay attention until the late 80s or so.


It will be "ready", when you can rent qubits by the minute on AWS.


Only if those common people only read the headline. One of the first sentences of the article is:

"Yet only now, after decades of gradual progress, are researchers finally close to building quantum computers powerful enough to do things that conventional computers cannot."

And there are more parts of the article that make clear that it's definitely not possible to buy a quantum computer any time soon.

Granted, it's not the best headline ever, but it's a decent article.


"close to building" means 20 years away, which means https://xkcd.com/678/

For most people, "close to building" means 1 or 2 years.


I personally envision quantum computers and machine learning working together to "solve" biology. QC would solve difficult molecular dynamics problems (like protein foldings, enzymatic reactions...) and ML would learn those results to simulate large aggregates of such molecules (like a whole cell or even a whole organism).


From how QC is done, that seems unlikely. It seems to me a QC would best model quantum processes, but anything larger would require astronomical numbers of qubits.


Is it possible to massively speed up ray tracing by representing geometry in a quantum program and then using superposition to evaluate many ray paths simultaneously? That would be exciting in <N> years for real time accurate ray tracing.

This is just a wild guess without much knowledge of quantum computing at all though.


In short, no.

Quantum computers don't really compute many things in parallel. Or rather, they do, but the problem is getting the results out.

To stick with your ray tracing question, you could probably indeed trace an exponential (in the size of the computer) number of rays, but at the end you have to go back to the classical world to actually read out data, and you'd essentially only be able to read out the result of one ray picked at random.

Quantum computers are only known to provide exponential speedups in cases where there is some algebraic structure to exploit when condensing the exponentially many computations down to a single result. That's why you can get an exponential speedup for factoring or discrete logarithm, but not for symmetric cryptography or really any other useful problem (except for the kind of circular problem of simulating quantum systems).

That's what makes me personally hope that quantum computers will never scale. The likely outcome of quantum computing is the worst of all worlds: all known practically useful cryptography will be broken, and practically nothing will really benefit.


Quantum computers do not break "all known practically useful cryptography".

For symmetric crypto, some schemes will have to be dropped, and others will require larger keys. But we can (for example) just keep right on using AES 256.

For asymmetric crypto, we'll have to switch to some new schemes, but those exist (though I don't know of any current software using them). For example:

https://en.m.wikipedia.org/wiki/Lattice-based_cryptography

The bigger worry I think is what about all of the existing traffic that has been archived?


It's true that symmetric crypto isn't entirely broken by quantum computers, although in many cases the key sizes need to be doubled. However, I wrote my original statement with that fully in mind, because symmetric crypto by itself is not really all that useful. All useful applications need at least some asymmetric crypto, even if it's just to distribute keys for more efficient symmetric crypto.

As for the alternatives to asymmetric crypto: I like lattices, I've studied them a lot myself, though not in a crypto context. But have you looked at the required key sizes of those alternative schemes? They're pretty horrible and not at all comparable to what we have today.

Obviously people will continue research on this, and perhaps something close to elliptic curves in terms of efficiency will be found. I certainly hope so, but I'm not holding my breath.

I agree that archived traffic is an issue as well, but personally I'm more worried about the future than the past.


I hadn't previously looked at lattice crypto in much detail, and was not aware the keys were as big as they are; thanks for pointing that out. That said, a big performance hit isn't necessarily the same thing as being practically unusable. It certainly explains why nobody's using it now. I'd be curious to read some concrete performance analyses.

A sibling comment points out one use case for symmetric-only crypto.

I'm also curious what people would actually do if forced to deal with a symmetric-only world.

I think it is more likely that we'd see cumbersome ways of dealing with key distribution than that people would just stop using crypto in all the places we rely on public key schemes today. Think symmetric keys printed on your bank statement (maybe with a qr code?) We'd definitely see it used a lot less than now.

That said, we'd probably have about the same number of people doing end to end email encryption; key distribution with an asymmetric scheme is no picnic either :P.


>All useful applications need at least some asymmetric crypto, even if it's just to distribute keys for more efficient symmetric crypto.

File encryption, whether it be my entire hard drive, or just my password database, doesn't need asymmetric crypto.


>The likely outcome of quantum computing is the worst of all worlds: all known practically useful cryptography will be broken, and practically nothing will really benefit.

I agree with you on the technical points, but I think this is a tad pessimistic.

Firstly, by the time we have quantum computing we will probably also have quantum cryptography. Quantum cryptography is secure against quantum computers and is also vastly simpler than existing crypto (no complicated algorithms to mess up). So I expect cryptography to improve.

Secondly, while quantum computers won't be useful for everything, they will have amazing applications. The "simulating quantum systems" thing will be hugely useful for studying chemistry. And Grover's algorithm will provide a significant speedup for a whole host of interesting problems, especially in machine learning.


> Firstly, by the time we have quantum computing we will probably also have quantum cryptography.

I disagree for two reasons.

First, because there is a huge difference between having a single quantum computer and quantum computers in every household.

Second, because we don't need quantum cryptography, we need post-quantum cryptography. Quantum computers break only asymmetric cryptosystems, but despite common belief, quantum cryptography doesn't solve this! Post-quantum cryptography does, and there is already good promise in this field.


>but despite common belief, quantum cryptography doesn't solve this!

? You might not think that quantum cryptography is likely to be practical soon. But it is definitely resistant to quantum computers.


My point was that quantum cryptography doesn't solve anything that classic cryptography doesn't. (And yes there is classic cryptography that isn't broken by quantum computers.) I'll defer to two of the world's leading cryptographers to back me up:

- https://www.schneier.com/essays/archives/2008/10/quantum_cry...

- https://cryptome.org/2012/09/bernstein-qke.htm


> Quantum cryptography is secure against quantum computers and is also vastly more simple than existing crypto (no complicated algorithms to mess up).

Can you explain this statement? I understand that algorithms such as RSA might require particular padding or what have you to be secure in practice, but is quantum-resistant crypto much different?


The phrase "quantum-resistant crypto" usually means classical algorithms (e.g. those involving elliptic curves) that are resistant to quantum computers.

What I'm talking about is "quantum cryptography", in which qubits are actually used in the protocol. Quantum cryptography is also known to be secure against quantum computers (indeed it's secure against arbitrary amounts of computing resources, unless our theories of physics are wrong).

It's also simpler than RSA or elliptic curves, so I hope that (after the kinks are worked out) it will also be less susceptible to bad implementations and side-channel attacks.


Apologies as I misread "quantum" as "quantum-resistant" above. However, I would still contest that quantum key exchange is "simpler". I mean, you need a quantum channel and to accurately exchange qubits without disturbance. Not to mention a small initial secret. It doesn't really work too well with our current infrastructure.


Agree with most of your points. But I suspect building a quantum infrastructure will be easier (and therefore occur earlier) than building quantum computers.

>Not to mention a small initial secret

This is interesting, what are you referring to here?


Doesn't quantum transmission infrastructure require direct connections from A to B? I.e. you could use it for a single uninterrupted fiber cable, or for the channel between your antenna and a satellite, but not with our common fiber infrastructure that relies on repeaters / re-transmitters.

As soon as there's any device between you and the recipient that breaks the entanglement, all the guarantees of quantum encryption go out of the window, and it's possible to attack the comms at that retransmission point.


>Doesn't quantum transmission infrastructure require direct connections from A to B?

No, the retransmitters can preserve the entanglement (and A and B can verify that this has been done).

In fact (providing qubits can be stored) the transmission can be done indirectly and in advance.

The telecom company produces lots of entangled qubits and gives a bunch to each customer, keeping half of each pair for itself. Then when Alice wants to communicate securely with Bob they ask the telecom company to take the corresponding qubits and perform a joint measurement. This entangles Alice's qubits with Bob's (like making a connection at a telephone exchange). Then Alice and Bob can measure their qubits (in various bases) to create a one-time-pad.

The clever thing is that Alice and Bob can (by checking that on a portion of the qubits their results always matched when they used the same basis) verify that they did indeed have maximally entangled qubits and therefore no one was listening-in.
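The eavesdropping-detection principle can be illustrated in a few lines. This toy simulates the simpler prepare-and-measure BB84 protocol rather than the entanglement-based scheme described above, but the fingerprint is the same: an interceptor measuring in the wrong basis randomizes results, so comparing a sample of matching-basis rounds reveals whether anyone listened in:

```python
# Toy BB84 simulation: a qubit is modeled as (bit, basis); measuring in
# the preparation basis is faithful, any other basis gives a coin flip.
import random

def measure(bit, prep_basis, meas_basis):
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def run_bb84(n, eve_present):
    errors = sifted = 0
    for _ in range(n):
        a_bit = random.randint(0, 1)
        a_basis = random.choice("+x")
        bit, basis = a_bit, a_basis
        if eve_present:
            # Intercept-resend attack: Eve measures in a random basis
            # and forwards her (possibly disturbed) result.
            e_basis = random.choice("+x")
            bit, basis = measure(bit, basis, e_basis), e_basis
        b_basis = random.choice("+x")
        b_bit = measure(bit, basis, b_basis)
        if a_basis == b_basis:          # keep only matching-basis rounds
            sifted += 1
            errors += b_bit != a_bit
    return errors / sifted

# Without Eve the sifted error rate is 0; with Eve it is ~25%.
print(run_bb84(4000, False), run_bb84(4000, True))
```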


Encryption has become pervasive in modern communication, yet it is clear that we still need more to properly protect the things we are doing. Does the quantum infrastructure that you are proposing not imply an almost complete replacement of the current global communications network with quantum channels, or at least to shadow it with a co-extensive one for key distribution? Is that close to being feasible for mobile and other radio-connected devices?


It's completely different. The underlying mechanism is the fact that observing the state on one member of an entangled pair changes the state of the other member.


More specifically, the important fact is that it's impossible for three particles to be (maximally) entangled simultaneously. So you can verify that no one eavesdropped when you agreed your one-time-pad.


Have you seen this recent result http://news.mit.edu/2018/physicists-create-new-form-light-02....

Does it change your thoughts on the utility of quantum cryptography?


They get three photons to be entangled, but not "maximally entangled" in the (impossible) way that would be needed for someone to listen to a communication without being detected.


There were some researchers working on noise saying that entangling more qubits has exponential difficulty; it's easy to create a quantum computer with a couple of qubits, as we're seeing now, but it will be impossible to use more. If that's the case, we're safe.


You're not wrong, but that is a fairly extreme fringe view. There is a strong consensus, based on very solid physics experiment and theory, that that is not the case.


A neural network can have many inputs but only one output. That could be useful?


SIGGRAPH 2016 had a paper about "quantum supersampling" which does something like that: https://vimeo.com/180284417


I've enjoyed the talk up until the liquid helium part.


No. That's a common misconception about how quantum computation works. Read this: https://www.smbc-comics.com/comic/the-talk-3


Ray tracing is typically the kind of thing you want to do as fast as possible, ideally in real time. Quantum computers will never work fast.

Also, if I'm not mistaken, ray tracing scales well on conventional computers: the quality of your output is proportional to the number of rays you simulate. So it's not an appropriate use case for quantum computers.


But it's not linear with the number of reflections and transmissions, I guess, if you want physically correct materials. Or if you want aberrations and, well, "quantum effects" like in the slit experiment.


I've always assumed that, wherever quantum computers are in the labs you can read articles about, they're further along at the NSA. Is there any external evidence that the NSA is already using them?


None actually. If the NSA or anyone had managed to find a use for the sort of quantum computers that are floating around, the world would be very different.

However, currently these computers are constrained by classical memory, and read/write operations are still classical.

I'm willing to bet it'll be a century more at this rate before quantum computing comes of age.

Since the 40s people have been hyping them without any sort of clear understanding.


> If the NSA or anyone had managed to find a use for the sort of Quantum computers that are floating around the world would be very different.

Remember how much effort the British put into making it hard for the Axis to realise that Enigma had been cracked. To the extent that they allowed ships to be sunk and sailors to die. If a 3 letter agency does have anything then they are probably hoping to keep that edge for as long as possible.

(This doesn't mean that I believe they do - merely that I don't think it would be immediately obvious if they did)


Information regularly leaks out of the NSA. It doesn't seem like they'd be able to keep this a secret.

It could be that they have some super secret projects which are protected by much better security, but I doubt it. If they could be that competent for one project, why not use the same protocols everywhere?


> If they could be that competent for one project, why not use the same protocols everywhere?

For exactly the reason set out in the post you are replying to, I would think, at least for a while.

Though that approach to security does not always last. The USA almost gave away that it had broken the Japanese navy's most important cyphers through its interception of Admiral Yamamoto.


That's what they want you to think.


> Since the 40s people have been hyping them without any sort of clear understanding.

Groan: it's going to be Cloud/Big Data/ML/Blockchain hype all over again except possibly worse. I don't mind the technology, but the tsunami of vapid content (and conversation) that comes with it wears me out.


>40s

This is a typo, surely? Quantum computers were first proposed in the 80s.


Yup, sorry, typed that on an alphanumeric.

But tech-hype has always existed, side by side with those claiming tech has maxed out... It's disappointing to see the media even try TBH.


I think you’re right: “In 1982, the Nobel prize-winning physicist Richard Feynman thought up the idea of a 'quantum computer', a computer that uses the effects of quantum mechanics to its advantage”

https://www.doc.ic.ac.uk/~nd/surprise_97/journal/vol4/spb3/


How would the world be different? Presumably they’d keep it a secret.


How long does a computation step in a quantum computer (which I guess is technically a measurement) actually take? I'm (at a superficial level) aware of Grover's and Shor's algorithms, but only in the sense that they provide asymptotic speedups in the number of steps required for the computation. In classical computers it's easy to envision some model where a step is just some constant number of cycles or whatever, but how long do the measurements take in quantum computers?

I guess it depends on the specific experimental setup of your quantum computer somehow, but I'm just curious about the real-world speeds. A hypothetical computer that does few steps but takes a minute for each measurement wouldn't be so useful.


For superconducting qubits, without error correction, each operation takes tens of nanoseconds.

With error correction... it depends how many spare qubits you have. All the non-trivial operations are done by "magic state distillation", where you produce special states in one part of the computer then consume them to perform the operation elsewhere. The more qubits you have, the more space you can allocate to magic state factories.

For example, a 32-bit addition can be done with 124 |T> states [1]. If you have N qubits allocated to factories then you can expect to get about 100N magic states per second. So if you have a tiny computer with 20 qubits dedicated to a factory, then a single addition will take over half a second. But if you can spare 2000 qubits for factories (about the same amount of qubits you'd need to store the problem in the first place) then you'd be producing T-states at 200KHz and 32-bit-adding at 1600Hz.

Shor's algorithm is, effectively, a single modular exponentiation. Performing an N-bit modular exponentiation requires something like 4N^3 |T> states. So 1024^3 * 4 / 200KHz = 6 hours to factor a 1024-bit key, assuming 2000 qubits on factories and that the other rough numbers are close.

1: https://arxiv.org/abs/1709.06648
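For what it's worth, that last estimate can be checked mechanically. This sketch just redoes the arithmetic using the comment's rough assumptions (4N^3 |T> states per N-bit modular exponentiation, ~100 T states per factory qubit per second), not measured hardware figures:

```python
# Back-of-the-envelope Shor runtime from the T-state budget above.
def shor_runtime_hours(key_bits, factory_qubits, t_rate_per_qubit=100):
    t_states_needed = 4 * key_bits ** 3              # ~4N^3 |T> states
    t_states_per_second = factory_qubits * t_rate_per_qubit
    return t_states_needed / t_states_per_second / 3600

print(round(shor_runtime_hours(1024, 2000), 1))  # ~6.0 hours
```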


It depends on the implementation. Ion traps are slower than superconducting qubits, for example, although they are speeding up. This paper, from 2014, gives some parameters:

  "Logic gates at the surface code threshold: Superconducting qubits poised for fault-tolerant quantum computing"  
  Barends et al.  
  Nature, 2014  
https://arxiv.org/abs/1402.4848

From page 9:

Initialization ~50ns

Measurement ~200ns

One-qubit gates ~20ns

Two-qubit gates ~40ns


A single measurement collapses to some fixed state. You need to reset the whole thing and do the measurement again. Repeat it many times and you'll start seeing the distribution you're interested in. Your solution/algorithm will amplify correct answers.

Single measurement is like revealing single pixel of a picture. If you do it enough times, you know what's on the picture.


Yeah, so the question is about the order of magnitude it takes for each such simple measurement; how much time does it take to reset the whole thing and do the measurement again, i.e. how many times per second/hour/whatever can we hope to do that?


And how many times do you need to do it to get the precision you want?


The importance of quantum computers is not the absolute clock speed, but rather the ability to reduce previously exponential problems to sub-exponential complexity.

For example, take the classic attacks on RSA and DLP -- instead of essentially brute-forcing the keys, we can create a program (Shor's algorithm, etc.) that treats the problem of factoring as the task of determining the base frequencies (the period) of a function built from the public component, treating it like a wave (at least as I understand it).

On an ideal quantum computer that task should be of polynomial complexity. There's still some (hopefully small) probability of error, but you just repeat the run a few times and use a classical computer to verify each candidate solution.
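
As a concrete sketch of that reduction (purely classical and exponentially slow; all names here are mine), factoring via period finding works for tiny numbers:

```python
from math import gcd

def find_period(a, N):
    """Brute-force the period r of a^x mod N -- the step that Shor's
    algorithm replaces with polynomial-time quantum period finding."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Derive factors of N from the period of a. Works when the period is
    even and a^(r/2) != -1 (mod N); otherwise retry with another a."""
    if gcd(a, N) != 1:
        return gcd(a, N), N // gcd(a, N)   # lucky guess, no period needed
    r = find_period(a, N)
    if r % 2:
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))  # -> (3, 5)
```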


I never got very far explaining to myself what a quantum computer is. From what I understand, this is what we can do with a single qubit:

    class Qbit
    {
        public $p; // probability of observing 1

        function randomize($p)
        {
            $this->p = $p;
        }

        function observe()
        {
            // Collapse: return 1 with probability $p, otherwise 0,
            // and stay in that state afterwards.
            if ($this->p > mt_rand() / mt_getrandmax()) $this->p = 1;
            else                                        $this->p = 0;
            return $this->p;
        }
    }
The interesting things probably become possible when you have multiple qbits: not just a normal array of qbits, but qbits that have some kind of interaction (entanglement?).

Would it be possible to simulate 'multiqbits' in a piece of code as small as the one above?


On a mobile device, so it's hard to write out proper code, but basically, as I understand it, for an n-qubit system you could imitate this with a class containing:

- A length-2^n array of complex numbers, such that the sum of the squared magnitudes of those numbers is always 1. Each element represents a possible length-n bit string.

- An "observe" method, which returns a given length n bit string with probability equal to the norm of the corresponding complex number in the array.

- Various "transform" methods. These are all linear unitary operators (matrices with special properties) on the array, but I think you can't pick and choose them arbitrarily; see https://en.m.wikipedia.org/wiki/Quantum_gate

All the usual bit operations are valid, but there are operations which don't have classical analogs, like this one in a 2-qbit system: "Maintain the probabilities of 00 or 11, but average the respective probabilities of 01 and 10".

The fact that you're working with complex numbers and not just probabilities is mathematically how entanglement can play a role, but I don't have a good minimal example of how this is helpful.
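
A minimal sketch of such a 'multiqbit' class in Python, along the lines the parent describes (the names and the specific unitary are my own choices, and the 2^n-sized array is exactly why this doesn't scale):

```python
import random

class MultiQbit:
    """Toy n-qubit state: 2^n complex amplitudes whose squared magnitudes sum to 1."""
    def __init__(self, n):
        self.n = n
        self.amps = [0j] * (2 ** n)
        self.amps[0] = 1 + 0j              # start in |00...0>

    def apply(self, matrix):
        """Apply a 2^n x 2^n unitary, given as a list of rows."""
        self.amps = [sum(matrix[i][j] * self.amps[j]
                         for j in range(len(self.amps)))
                     for i in range(len(self.amps))]

    def observe(self):
        """Collapse to one basis state with probability |amplitude|^2."""
        r = random.random()
        total = 0.0
        for i, a in enumerate(self.amps):
            total += abs(a) ** 2
            if r < total:
                self.amps = [0j] * len(self.amps)
                self.amps[i] = 1 + 0j
                return format(i, f"0{self.n}b")
        return format(len(self.amps) - 1, f"0{self.n}b")

# Entangle two qubits: a unitary sending |00> to the Bell state (|00>+|11>)/sqrt(2).
h = 2 ** -0.5
bell = [[h, 0, 0, h],
        [0, h, h, 0],
        [0, h, -h, 0],
        [h, 0, 0, -h]]
s = MultiQbit(2)
s.apply(bell)
print(s.observe())  # "00" or "11", never "01" or "10"
```

Measuring the two qubits always gives correlated results, which is the kind of interaction no independent array of single qbits can reproduce.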



You're missing several important things:

* The amplitude of a qubit is effectively a complex probability instead of a real one.

* Don't think of an n-qubit machine as n independent instances of a qubit class. The representation you want is essentially:

    struct quantum_state {
      uint64_t bits;        /* basis state, as a bit pattern */
      complex_t amplitude;  /* complex amplitude of that basis state */
    };
    struct quantum_register {
      int num_bits;
      size_t num_states;    /* entries with nonzero amplitude */
      struct quantum_state *states;
    };
(quantum registers are usually emulated as sparse vectors for space efficiency).

From a simulation perspective, quantum computers are just linear algebra operations, albeit over the complex field instead of the reals. The difficulty is that the equivalent vectors and matrices are of exponential size (2^n for an n-qubit computer), and the interesting operations don't exhibit regular patterns that permit aggressive optimization.
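
To make the sparse-vector remark concrete, here is a toy Python equivalent of that struct (a dict from basis bit pattern to amplitude), with one illustrative gate; the function names are mine:

```python
from math import sqrt

# Sparse quantum register: basis bit-pattern -> complex amplitude,
# mirroring the struct above. Only nonzero amplitudes are stored.
def hadamard(state, qubit):
    """Apply a Hadamard gate to one qubit of a sparse state dict."""
    h = 1 / sqrt(2)
    out = {}
    for bits, amp in state.items():
        b = (bits >> qubit) & 1
        lo, hi = bits & ~(1 << qubit), bits | (1 << qubit)
        out[lo] = out.get(lo, 0) + h * amp                    # |0> component
        out[hi] = out.get(hi, 0) + (-h if b else h) * amp     # |1> component
    return {k: a for k, a in out.items() if abs(a) > 1e-12}   # drop cancellations

state = {0b000: 1 + 0j}        # |000>
state = hadamard(state, 0)     # (|000> + |001>)/sqrt(2): still only 2 entries
```

Applying the same Hadamard twice cancels the |001> amplitude and recovers |000>, which is where storing signed complex amplitudes (rather than probabilities) actually matters.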


    The amplitude of a qubit is effectively a
    complex probability instead of a real one
Does this have any implications as long as there is only one qbit?



Thank you. But I don't think in formulas. I think in code.



You can poke around https://quantumexperience.ng.bluemix.net/qx/experience and see if there's anything helpful


Ignore the equations and read the words. From a quick look, they have some customized notation but also heuristically valid descriptions.


What’s the difference?


To store the state of one quantum bit, you need both the amplitude for state 0 and the amplitude for state 1. You need to store 2 numbers.

For a memory of N qubits, you need to store the amplitude for each bit configuration, so 2^N numbers. This is totally doable, and people have written quantum computer simulation software.
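
To see how quickly "2^N numbers" gets out of hand, assuming 16 bytes per amplitude (two 64-bit floats):

```python
# Memory needed to store a full N-qubit state vector,
# at 16 bytes per complex amplitude.
for n in (10, 20, 30, 40):
    size = 16 * 2 ** n
    print(f"{n} qubits: {size / 2**30:.6g} GiB")
```

Around 30 qubits you need ~16 GiB, and every additional qubit doubles it, which is why classical simulation tops out in the high-40s of qubits even on large clusters.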


I thought "serious" meant at least one million error-corrected qubits or so, and it seems to me that we're at least two decades away from this. The article is talking about the 50 qubits milestone without proper error correction.


Why do you think so many qubits would be needed in order for it to be serious? The article quotes quantum computing researchers as saying 50-100 qubits could perform computations that are intractable on any classical computer. I'm inclined to believe them: the state space grows exponentially as you entangle more particles.


To encode 100 logical error-corrected qubits, you need on the order of 100K to 1M physical qubits. People mix up the two types all the time.


It's true that at 50-100 physical (not logical) qubits, it's no longer possible to simulate classically. That does not mean that they are useful.

As other people pointed out, you may need a factor of 100-1000 more physical qubits to get as many logical qubits.


>> Not just any computer, but one on the verge of passing what may, perhaps, go down as one of the most important milestones in the history of the field.

Well, I don't know enough about quantum computers to tell what's going on here, but "on the verge" and "finally here" sound a bit contradictory. Are we expecting to have useful quantum computers soon, as the article's opening paragraph suggests, or do we already have them, as the title claims?


There must be a way to explain the principle behind quantum computers to a layman who understands both computers and quantum mechanics. Where is that explanation?


This is a tall order, considering physicists haven't really come up with a layman's explanation for the latter. The best way to understand quantum mechanics is to do the math, taking the Copenhagen interpretation to be true; sadly, that is also the only way to gain an intuition for it.


As was mentioned in the article, applications in chemical and physical simulation seem very promising. I would think that error correction would be less of an issue there: you would use the QC to identify candidate solutions, which could then be verified by traditional computers.


Who is the market leader now? Is it D-Wave or IBM? Someone else?!


D-Wave computers are quantum annealers, not machines capable of running, for example, Grover’s or Shor’s algorithm.

And even these algorithms can help asymptotic complexity, but not solve an NP-complete problem in polynomial time.

(Even AES-256 is pretty much safe against a quantum attack, since Grover's algorithm only square-roots the brute-force time complexity (2^256 -> 2^128), and that just leaves key-exchange protocols, of which several quantum-resistant variants exist.)

Personally, I think the hype far outweighs the value, even compared to Big Data/IoT.


> And even these algorithms can help asymptotic complexity

Is this a theorem or conjecture?


No quantum algorithm has yet been found that reduces an NP-complete problem to polynomial time. (See [0] for a pretty comprehensive list of known quantum algorithms.)

I.e., I was talking about the quantum algorithms we have, none of which do so, not stating the impossibility of such an algorithm.

I am not aware of a proof either way on whether such an algorithm is impossible.

[0]: https://math.nist.gov/quantum/zoo/


The Chinese, perhaps.


Anyone know how you'd debug a quantum algorithm? It doesn't look like it would be possible to step through the code in a debugger.


You can simulate it on a regular computer (for small numbers of qubits) and debug there.


The government is going to use them to finally start decrypting all of the VPN traffic they've been storing all these years.


Finally, a viable platform to mine Bitcoin and other cryptocurrencies


You cannot mine Bitcoin with a quantum computer, as there is no quantum algorithm to calculate SHA hashes.


All hashcash-based PoW, including Bitcoin's, is subject to quantum speedup via Grover's search algorithm, which lets you find a hash output with 80 leading zero bits in roughly 2^40 steps rather than the ~2^80 steps needed classically.
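
Grover's iteration is easy to simulate classically for toy-sized search spaces (this is just a dense state-vector simulation to illustrate the ~sqrt(N) behavior, nothing near Bitcoin scale):

```python
from math import pi, sqrt

def grover(n_bits, marked):
    """Simulate Grover search over 2^n_bits items for one marked index."""
    N = 2 ** n_bits
    amps = [1 / sqrt(N)] * N               # uniform superposition
    iterations = int(pi / 4 * sqrt(N))     # ~sqrt(N) steps instead of N
    for _ in range(iterations):
        amps[marked] *= -1                 # oracle: flip the marked amplitude
        mean = sum(amps) / N
        amps = [2 * mean - a for a in amps]  # inversion about the mean
    return amps

amps = grover(8, marked=42)
print(abs(amps[42]) ** 2)   # probability of measuring the marked item: ~1
```

After ~12 iterations over 256 items, nearly all of the probability mass sits on the marked index; classically you'd expect to examine ~128 items on average.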


what a clickbait title


Bitcoin mining mostly




