Archive for June, 2017

Cryptography in the Quantum Computing Age

June 16, 2017

There is a problem with much of modern internet communication at the moment: its security relies on factoring being hard.  And this is indeed the case for a conventional computing machine.  However, for a quantum computer, such as may become available to corporations and governments within the next 5 to 10 years, factoring will be easy.

This necessitates a ‘refresh’ of the algorithms maintaining internet security.  So, how can we do this?

How RSA works

Well, recall the following about how RSA works (from the Wikipedia entry); a toy numeric sketch in Python follows the list:

  • Take two very large primes (100s of digits long) and multiply them together.   (n = pq, p and q prime).
  • Compute h = lcm(p – 1, q – 1)  (this is easy via the Euclidean algorithm, as lcm(a, b) = ab/gcd(a, b) ).
  • Choose e such that e < h and gcd(e, h) = 1 (again, easy via the Euclidean algorithm).
  • Determine d such that ed = 1 (mod h)  [ so that d is the modular multiplicative inverse of e mod h ]
  • e is then the public key.  n is also released as public.
  • d is the private key.
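
To make these steps concrete, here is a minimal toy sketch in Python.  The primes, exponent and variable names are purely illustrative (real keys use primes hundreds of digits long and a vetted cryptographic library):

    # Toy RSA key generation -- tiny primes for illustration only.
    from math import gcd

    def lcm(a, b):
        return a * b // gcd(a, b)

    p, q = 61, 53              # two (very small) primes
    n = p * q                  # public modulus, n = pq
    h = lcm(p - 1, q - 1)      # h = lcm(p - 1, q - 1)
    e = 17                     # public exponent: e < h, gcd(e, h) = 1
    assert gcd(e, h) == 1
    d = pow(e, -1, h)          # private exponent: inverse of e mod h (Python 3.8+)
    print(n, e, d)             # (n, e) is public; d stays private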

Encryption process:

  • Bob obtains Alice’s public key e.  With a message M, he converts it into an integer m.  Then he computes c = m^e (mod n).  (A toy round trip appears after the decryption list below.)

Decryption process:

  • Alice obtains Bob’s message c, and decrypts it as c^d = (m^e)^d = m (mod n)
  • Then m -> M is easy if the padding scheme is known.
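
Continuing the toy example above (n = 3233, e = 17, d = 413), the encryption and decryption steps round-trip as follows; padding is ignored for simplicity:

    # Toy RSA round trip using the keys from the sketch above.
    n, e, d = 3233, 17, 413
    m = 65                     # message M already encoded as an integer m < n
    c = pow(m, e, n)           # Bob: c = m^e (mod n)
    assert pow(c, d, n) == m   # Alice: c^d = m (mod n)
    print(c, pow(c, d, n))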

The Achilles heel exposed by a quantum computer

The main point about the above that makes RSA hard to break is that factorisation is hard.  But, if factorisation is easy, Jill can intercept Bob’s ciphertext c and, given her knowledge of n and e (which are public), factor n into p and q, compute h, and then identify d as the multiplicative inverse of e mod h.

Consequent to this, she can then decrypt Bob’s message to Alice as above.
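
A sketch of that attack, with naive trial division standing in for the quantum factoring step (the toy values are the ones from the example above):

    # Jill's attack: factor the public modulus, then rebuild the private key.
    from math import gcd

    def lcm(a, b):
        return a * b // gcd(a, b)

    def factor(n):
        f = 2
        while n % f:
            f += 1
        return f, n // f       # trial division stands in for the quantum step

    n, e, c = 3233, 17, 2790   # everything Jill sees: the public key and the ciphertext
    p, q = factor(n)           # the step a quantum computer makes easy
    h = lcm(p - 1, q - 1)
    d = pow(e, -1, h)          # private key recovered
    print(pow(c, d, n))        # prints 65, the plaintext m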

So, if factorisation is easy, RSA is in a bit of a pickle.  But is there a better way to encrypt?

RSA over an integer polynomial ring

Process (a rough illustrative sketch of the key-generation step follows the list):

  • Take two very large irreducible polynomials over the integer polynomial ring Z[x] (hundreds of terms long) and multiply them together.   nb. for extra security, you may want to ensure that the coefficients of these polynomials are each at least 100 digits long. (n[x] = p[x]q[x], with p[x] and q[x] irreducible, i.e. ‘prime’).
  • Compute h[x] = lcm(p[x] – 1, q[x] – 1)  (this is easy via the Euclidean algorithm for integer-valued polynomials, as lcm(a, b) = ab/gcd(a, b)).
  • Choose e[x] such that e[x] < h[x] and gcd(e[x], h[x]) = 1 (again, easy via the Euclidean algorithm).
  • Determine d[x] such that e[x]d[x] = 1 (mod h[x])  [ so that d[x] is the modular multiplicative inverse of e[x] ]
    • Note: without knowledge of h[x], this is hard, unless factorisation is easy.
  • e[x] is then the public key.  n[x] is also released as public.
  • d[x] is the private key.
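
The scheme above is only a sketch, but as a rough illustration of the key-generation side, one can form n[x] as the product of two irreducible polynomials over Z[x] using sympy.  The particular polynomials below are tiny placeholders chosen purely for illustration, not parameters with any security value:

    # Forming n[x] = p[x] q[x] from two irreducible polynomials over Z[x].
    import sympy as sp

    x = sp.symbols('x')
    p_x = sp.Poly(x**3 + 2*x + 7, x, domain='ZZ')
    q_x = sp.Poly(x**4 + 3*x**2 + 5, x, domain='ZZ')

    # Sanity check: each chosen polynomial is irreducible over Z[x]
    # (its factorisation contains a single factor of multiplicity 1).
    for f in (p_x, q_x):
        _, factors = sp.factor_list(f.as_expr(), x)
        assert len(factors) == 1 and factors[0][1] == 1

    n_x = p_x * q_x            # the public 'semi-prime' polynomial n[x]
    print(n_x.as_expr())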

Encryption process:

  • Bob obtains Alice’s public key e.  With a message M, he converts it into an integer m.  Then he computes c = m^e (mod n)

Decryption process:

  • Alice obtains Bob’s message c, and decrypts it as c^d = (m^e)^d = m (mod n)
  • Then m -> M is easy if the padding scheme is known.

Is this vulnerable to attack by a Quantum Computer eavesdropping on a channel carrying an encrypted message?

I claim not, as the set of irreducible (‘prime’) polynomials over the integer polynomial ring is much bigger than the set of primes.  The cardinality might be the same, but the complexity seems, intuitively, much greater to me.  Factorising a semi-prime polynomial, whose factors are irreducible polynomials with possibly 100 or 200 terms, each with coefficients up to 150 digits in length, is, I suspect, much, much more difficult than factorising a semi-prime integer.

If one had a tetra-computer (“hyper-computer”), the story might be different, but I suspect we will not have sufficiently powerful variants of those until 2050.


Towards zero latency – improved communications, coming soon to you

June 16, 2017

This recent press release got me thinking about how a quantum internet might work.  In particular, one question an observer might ask is: why do we need entanglement to transmit information?

The answer is that the ‘speed of entanglement’ is fast: at least 10,000 times faster than the speed of light, and possibly even the square of the speed of light.  But, you might say, can’t I already communicate quite rapidly from one side of the world to the other?  Yes, but there is latency.

The bane of latency

Latency is a problem in internet communications today.  It is felt most acutely in high-bandwidth communications that require near-instantaneous feedback.  And a key medium (and one of the first to adopt the new technologies that consume all the bandwidth, like VR) is gaming.

Gaming where the data centre is located in another country can be a very frustrating experience.  That is where entangled communications would excel.  Additionally, entangled communications promise to increase bandwidth drastically, from megabits per second to gigabits, or even terabits, per second.

Entanglement is useful

By entangling a communication medium, such as photons, and then separating the entangled pairs, it is possible to transmit data very quickly.  However, there are a couple of obvious problems:

  • How does one ‘recharge’ one’s entanglement capacity at each network node to service the required / anticipated bandwidth?, and
  • How does one ensure one has enough bandwidth for these communications?

These are interesting questions.  Certainly, it seems intuitively reasonable that with a first-pass quantum internet, the rate of recharge of entanglement would at best be limited by:

  • The speed of light for transportation of entangled pairs from a source; and
  • The amount of energy consumed in creating entangled pairs.

Resolution of the problems

However, it does not seem unreasonable that the amount of energy required to create an entangled pair could be relatively low and fairly cheap economically.  Which leads to the matter of transporting this ‘data energy’ to where it needs to be consumed.  This would necessitate distributing data energy not just to a single point, but to a 2-tuple of points.  That should be possible, e.g. with photons: one could use optical waveguides to transmit them to storage at particular centres, provisioning, say, 2 to 3 times the anticipated capacity in time bucket N between nodes K1 and K2.  Redundancy would be a function of the amount of funding the market between K1 and K2 is willing to pay, and reliability – that is, the duration of the entanglement – a function of the same.
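
As a back-of-the-envelope illustration of that provisioning idea, the sizing arithmetic for one time bucket might look something like the following; every figure here (demand, bucket length, bits per pair, redundancy factor) is an invented placeholder:

    # Toy sizing of the entangled-pair reservoir between nodes K1 and K2
    # for one time bucket; all figures are invented placeholders.
    anticipated_gbps = 10        # expected demand between K1 and K2
    bucket_seconds = 3600        # length of time bucket N
    bits_per_pair = 1            # assume one bit serviced per entangled pair
    redundancy = 2.5             # provision 2 to 3 times anticipated capacity

    pairs_needed = anticipated_gbps * 1e9 * bucket_seconds * redundancy / bits_per_pair
    print(f"reservoir for bucket N: {pairs_needed:.3e} entangled pairs")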

Ensuring duration of reservoir of entangled pairs between nodes K1 and K2

Assuming that one has transmitted entangled pairs to nodes K1 and K2 respectively, it should be possible to have error correction operating at both ends to ensure the integrity of the entangled state of the ‘data energy’.  This will require an input of first-order energy at both nodes K1 and K2.  As to the details of implementation, I am not sure, but the integrity of entangled information is likely something that has already been subject to exhaustive study.

Concluding thoughts

This theorycraft sketch of a quantum internet still requires that ‘data energy’, its backbone, be transmitted no faster than the speed of light.  But this is not such a bad thing, as long as one can anticipate demand in advance.  For example, in the energy markets one often has to plan in advance how much energy people in a particular location will want to use at a particular time.  If there is not enough, there will be blackouts.  The equivalent in the quantum internet could be a fallback to the regular internet.  But, hey.  At least things would still work.  Sort of.