Algorithm Agility?
2021-04-25 · www.tbray.org

What happened was, I was fooling around with zero-knowledge proof ideas and needed to post public keys on the Internet in textual form. I picked ed25519 keys (elliptic-curve, also known as EdDSA) so I asked the Internet “How do you turn ed25519 keys into short text strings?” The answer took quite a bit of work to find and, after I posted it, provoked a discussion about whether I was doing the right thing. So today’s question is: Should these things be encoded with the traditional PKIX/PEM serialization, or should developers just blast the key-bits into base64 and ship that?


Old-school key wrapping · Traditionally, as described in the blog piece linked above, the public key, which might be a nontrivial data structure, is serialized into a byte blob that includes not just the key bits but metadata concerning which algorithm applies, bit lengths, and hash functions.

When I say “old-school” I mean really old, because the technologies involved in the process (ASN.1, PKIX, PEM) date back to the Eighties. They’re complicated, crufty, hard to understand, and not otherwise used in any modern applications I’ve ever heard of.

Having said all that, with a couple of days of digging and then help from YCombinator commentators, the Go and Java code linked above is short and reasonably straightforward and pretty fast, judging from my unit testing, which round-trips a thousand keys to text and back in a tiny fraction of a second.
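
As an illustration, here's a minimal sketch of that PKIX/PEM round trip in Go, using only the standard library; it's a sketch in the same spirit as the code linked above, not the exact code:

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
)

func main() {
	pub, _, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		log.Fatal(err)
	}

	// Wrap the raw key bits in the PKIX structure (algorithm OID plus
	// the bits), then armor the DER bytes as PEM text.
	der, err := x509.MarshalPKIXPublicKey(pub)
	if err != nil {
		log.Fatal(err)
	}
	pemText := pem.EncodeToMemory(&pem.Block{Type: "PUBLIC KEY", Bytes: der})
	fmt.Printf("%s", pemText)

	// Round-trip: parse the PEM back and recover the key.
	block, _ := pem.Decode(pemText)
	parsed, err := x509.ParsePKIXPublicKey(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	if !pub.Equal(parsed.(ed25519.PublicKey)) {
		log.Fatal("round trip failed")
	}
}
```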

Since the key serialization includes metadata, this buys you “Algorithm Agility”, meaning that if the flavor of key you’re using (or its supporting hash or whatever) becomes compromised and untrustworthy, you can change flavors and the code will still work. Which sounds like a valuable thing.

There is, after all, the prospect of quantum computing which, assuming anyone can ever get the hardware to do anything useful, could crack lots of modern crypto, notably including ed25519. I know very smart people who are betting on quantum being right around the corner, and others, equally smart, who think it’ll never work. Or that if it does, it won’t scale.

The simpler way · Multiple commentators pointed out that ed25519 keys and signatures aren’t data structures, just byte arrays. Further, that there are no options concerning bit length or hash algorithm or anything else. Thus, arguably, all the apparatus in the section just above adds no value. In fact, by introducing all the PKIX-related libraries, you increase the attack surface and arguably damage your security profile.
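
For contrast, a minimal sketch of the simpler approach in Go: an ed25519.PublicKey is already just a 32-byte slice, so the whole serialization is a single base64 call. (The choice of RawURLEncoding is mine; any base64 variant would do.)

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"log"
)

// encodeKey turns the raw 32 key bytes into text; no metadata, no ASN.1.
func encodeKey(pub ed25519.PublicKey) string {
	return base64.RawURLEncoding.EncodeToString(pub)
}

// decodeKey reverses it, checking only that the length is right.
func decodeKey(s string) (ed25519.PublicKey, error) {
	b, err := base64.RawURLEncoding.DecodeString(s)
	if err != nil {
		return nil, err
	}
	if len(b) != ed25519.PublicKeySize {
		return nil, fmt.Errorf("want %d key bytes, got %d", ed25519.PublicKeySize, len(b))
	}
	return ed25519.PublicKey(b), nil
}

func main() {
	pub, _, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		log.Fatal(err)
	}
	s := encodeKey(pub)
	fmt.Println(s) // 43 characters of base64

	back, err := decodeKey(s)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(pub.Equal(back)) // true
}
```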

Furthermore, they argue, ed25519 is not likely to fail fast; if the attacks start creeping up on it, there’ll be plenty of time to upgrade the software. I can testify that I learned of multiple in-flight projects that are going EdDSA-and-nothing-else. And, to muddy the waters, another that’s invented its own serialization with “a 2-3 byte prefix to future proof things.”
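
I don’t know the details of that project’s prefix scheme, but a hypothetical version of the idea might look like this in Go; the two-byte 0x00 0x01 tag and the helper names are invented purely for illustration:

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"encoding/base64"
	"errors"
	"fmt"
	"log"
)

// Two bytes naming the algorithm, then the raw key bytes, then base64
// over the whole thing. The 0x00 0x01 value is made up for this sketch.
const prefixEd25519 = "\x00\x01"

func encode(pub ed25519.PublicKey) string {
	return base64.RawURLEncoding.EncodeToString(append([]byte(prefixEd25519), pub...))
}

func decode(s string) (ed25519.PublicKey, error) {
	b, err := base64.RawURLEncoding.DecodeString(s)
	if err != nil {
		return nil, err
	}
	if len(b) != 2+ed25519.PublicKeySize || string(b[:2]) != prefixEd25519 {
		return nil, errors.New("unknown key type or bad length")
	}
	return ed25519.PublicKey(b[2:]), nil
}

func main() {
	pub, _, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		log.Fatal(err)
	}
	round, err := decode(encode(pub))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(pub.Equal(round)) // true
}
```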

Existence proof · I’m working on a zero-knowledge proof where there are two or more public posts, each with a different nonce, the same public key, and a signature over the nonce. The private key is discarded after the nonces are signed and the posts are generated, and keypairs aren’t allowed to be re-used. In this particular case it’s really hard to imagine a scenario where I’d feel a need to switch algorithms.
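
In Go terms the shape is roughly this; a hedged sketch, with nonce sizes and the two-post structure chosen for illustration rather than taken from my actual protocol:

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
	"log"
)

func main() {
	// One-shot keypair; the private key never leaves this function.
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		log.Fatal(err)
	}

	// Two posts, two random nonces, each signed with the same key.
	nonces := make([][]byte, 2)
	sigs := make([][]byte, 2)
	for i := range nonces {
		nonces[i] = make([]byte, 16)
		if _, err := rand.Read(nonces[i]); err != nil {
			log.Fatal(err)
		}
		sigs[i] = ed25519.Sign(priv, nonces[i])
	}
	priv = nil // discard; the keypair is never re-used

	// Anyone holding the posts can check both signatures against
	// the one public key.
	for i := range nonces {
		fmt.Println(ed25519.Verify(pub, nonces[i], sigs[i]))
	}
}
```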

Conclusions? · The question mark is because none of these are all that conclusive.

  1. Algorithm agility is known to work; it happens every time anyone sets up an HTTPS connection. It solves real problems.

  2. Whether or not we think it’s reasonable for people to build non-agile software that’s hardwired to a particular algorithm in general or EdDSA in particular, people are doing it.

  3. I think it might be beneficial for someone to write a very short three-page RFC saying: for those people, just do the simplest-possible Base64-ification of the bytes. It’d be a basis for interoperability. This would have the potential to spiral into a multi-year IETF bikeshed nightmare, though.

  4. There might be a case for building a somewhat less convoluted and crufty agility architecture for current and future public-key-based applications. This might be based on COSE? (See the sketch after this list.) This would definitely be a multi-year IETF slog, but I dunno, it does seem wrong that to get agility we have to import forty-year-old technologies that few understand and fewer like.
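
Here’s that sketch: a guess at what an ed25519 public key might look like as a COSE_Key, using the integer labels from RFC 8152 and serialized with the third-party fxamacker/cbor library. This is an assumption about what such an architecture could build on, not a worked-out design:

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
	"log"

	"github.com/fxamacker/cbor/v2"
)

func main() {
	pub, _, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		log.Fatal(err)
	}

	// COSE_Key labels per RFC 8152: kty=1, alg=3, crv=-1, x=-2.
	coseKey := map[int]interface{}{
		1:  1,           // kty: OKP
		3:  -8,          // alg: EdDSA
		-1: 6,           // crv: Ed25519
		-2: []byte(pub), // x: the raw 32 key bytes
	}
	enc, err := cbor.Marshal(coseKey)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%d bytes of CBOR\n", len(enc)) // a few label bytes plus the key
}
```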

The current implementation · It’s sort of the worst of both worlds. Since it uses the PKIX voodoo, it has algorithm agility in principle, but in practice the code refuses to process any key that’s not ed25519. There’s an argument that, to be consistent, I should either go to brute-force base64 or wire in real algorithm agility.
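
Concretely, that refusal is just a type assertion after the PKIX parse. A sketch in Go, assuming the key arrives as PEM text:

```go
package keys

import (
	"crypto/ed25519"
	"crypto/x509"
	"encoding/pem"
	"errors"
)

// ParseEd25519PEM accepts PKIX/PEM text but refuses any key that
// isn't ed25519: agile on the wire, hardwired in the code.
func ParseEd25519PEM(pemText []byte) (ed25519.PublicKey, error) {
	block, _ := pem.Decode(pemText)
	if block == nil || block.Type != "PUBLIC KEY" {
		return nil, errors.New("not a PEM public key")
	}
	parsed, err := x509.ParsePKIXPublicKey(block.Bytes)
	if err != nil {
		return nil, err
	}
	pub, ok := parsed.(ed25519.PublicKey)
	if !ok {
		return nil, errors.New("not an ed25519 key")
	}
	return pub, nil
}
```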

Having said that, if you do need to do the PKIX dance with ed25519, those code snippets are probably useful because they’re simple and (I think) minimal.

And another thing. If I’m going to post something on the Internet with the purpose of having someone else consume it, I think it should be in a format that is described by an IETF RFC or W3C Rec or other stable open specification. I really believe that pretty strongly. So for now I’ll leave it the way it is.


