
You write as if you believe the difference between small-footprint embedded cryptography and server farm cryptography was the byte size of the resulting .o/.a. That can't possibly be what you believe; if I described you that way, you'd justifiably object that I was attacking a straw man. Perhaps you can clarify.


Indeed, I do not believe binary footprint is the only difference. It's actually the least important one. The most important is availability.

As I said above, Libsodium's support of embedded targets is limited, to the point that it wasn't even considered in some benchmarks while Monocypher was. https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=8725488 (Note that since that benchmark, I have reduced the stack usage of signature verification by half, at no noticeable loss of performance.)

Monocypher's portability comes from 3 factors beyond sticking to standard C99: the size of its binary (least important), the size of its stack (more important), and its lack of dependencies (most important). Monocypher makes no system calls and does not depend on libc. That makes it usable on targets without a kernel.
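
To make "no kernel, no libc" concrete, here is a minimal bare-metal sketch, assuming the crypto_lock() interface of Monocypher 2.x/3.x; hw_random_bytes() and seal_message() are hypothetical names standing in for whatever entropy source and wrapper the board provides:

    /* Illustrative freestanding sketch, not from Monocypher's docs.
       Only freestanding headers are used; no syscalls, no libc.  The
       single external dependency is the board's own RNG. */
    #include <stdint.h>
    #include <stddef.h>
    #include "monocypher.h"

    extern void hw_random_bytes(uint8_t *buf, size_t size);  /* board-specific */

    void seal_message(uint8_t *cipher, uint8_t mac[16], uint8_t nonce[24],
                      const uint8_t key[32],
                      const uint8_t *plain, size_t size)
    {
        hw_random_bytes(nonce, 24);                        /* fresh nonce */
        crypto_lock(mac, cipher, key, nonce, plain, size); /* authenticated encryption */
    }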

The second difference is simplicity. When I wrote about Libsodium being 10 times bigger, I wasn't referring to the size of the binary. I was referring to the size of the source code, and to a lesser extent to the number of exposed symbols. Such a massive difference in degree is close to a difference in kind. Audits take 10 times more time and money (where Monocypher cost $7K and 7 days, Libsodium would cost $70K and 14 weeks). Building, porting, and deploying take 10 times more effort, and sometimes aren't possible at all when the environment is constrained enough.

Simplicity matters, even on server farms. DJB wrote TweetNaCl for this very purpose if I recall correctly. He wasn't even targeting embedded devices.

---

One important thing to clarify: small-footprint embedded cryptography and server farm cryptography can be the same in the IoT world: that connected object has to connect somewhere, generally to the vendor's servers. While embedded devices are often very constrained, the servers can be a bottleneck (especially if there's no subscription to pay for them). You might want to optimise for the server side, to help scaling, even if it puts more strain on the embedded side.

That, and there's an advantage to using the most popular primitives: less room for new cryptanalysis, more compatibility. That's how you get Ed25519 signature verification even on tiny 16-bit embedded processors. You'd like to use an extension field to make a faster, more lightweight curve, but the literature on prime fields (and therefore Edwards25519) is more stable, making Ed25519 the safer choice.
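
For what it's worth, here's how little the embedded verifier side needs: a sketch assuming Monocypher's 2.x/3.x crypto_check() interface, with firmware_is_genuine() and vendor_pk as hypothetical names:

    /* Illustrative sketch: verifying a vendor signature on a firmware
       image needs only the 32-byte public key, the 64-byte signature,
       and the message itself. */
    #include <stdint.h>
    #include <stddef.h>
    #include "monocypher.h"

    int firmware_is_genuine(const uint8_t sig[64],
                            const uint8_t vendor_pk[32],
                            const uint8_t *image, size_t image_size)
    {
        return crypto_check(sig, vendor_pk, image, image_size) == 0;
    }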

I also acknowledge that Monocypher is not ideal for tiny embedded devices: Blake2b instead of Blake2s, 64-bit multiplication in Poly1305 and Curve25519… 64-bit multiplication is particularly problematic on 8/16-bit processors: it inflates the binary size to prohibitive proportions. For those tiny targets, C25519 is a much better fit. I once redirected a user to it, though they ended up using a slightly bigger processor and kept Monocypher because of its speed.
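
To illustrate where the bloat comes from (a sketch, not Monocypher's actual code): the field arithmetic leans on widening multiplies like the one below. A 32-bit core does this in a couple of instructions; an 8-bit AVR has to synthesise it from a long chain of 8x8->16 multiplies and adds.

    #include <stdint.h>

    /* Illustration only: the kind of widening multiply that Poly1305
       and Curve25519 limb arithmetic relies on.  Cheap on 32-bit
       cores, expensive in both cycles and code size when emulated on
       8/16-bit parts. */
    uint64_t mul32x32(uint32_t a, uint32_t b)
    {
        return (uint64_t)a * b;   /* 32x32 -> 64 product */
    }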

On small 32-bit processors, however, Monocypher is king. As far as I know, only custom assembly beats it.


Bernstein and Schwabe wrote TweetNaCl for verifiability. You did not write Monocypher for verifiability, as the track record shows. Libsodium has been repeatedly audited; it is one of the most heavily targeted libraries in the industry. You shipped an EdDSA that accepted all-zeroes input as valid; if you want to snipe at libsodium, let me ask: what's the comparably catastrophic vulnerability there?


Okay, now you're conflating "verifiability" and "has been verified". One can perfectly well build a crypto library with verifiability in mind, then fail to verify some crucial property.

Breaking news: TweetNaCl has not been fully verified. Two instances of undefined behaviour (negative left shifts), at lines 281 and 685, remain to this day. They're easily found with UBSan (yay for verification!), but for some reason DJB has yet to correct them.
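
For readers following along, the offending pattern looks roughly like this (a paraphrase, not TweetNaCl's exact code):

    #include <stdint.h>

    /* Paraphrased carry-propagation step with signed 64-bit limbs.
       When a limb is negative, c is negative, and "o[i] -= c << 16;"
       is undefined behaviour in C99 (left shift of a negative value).
       Multiplying computes the same thing with defined behaviour.
       (The right shift on a negative value is merely
       implementation-defined, not undefined.) */
    void carry16(int64_t o[16], int i)
    {
        int64_t c = o[i] >> 16;
        o[i]     -= c * 65536;    /* well-defined replacement for c << 16 */
    }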

The original paper verified 2 specific memory safety properties (no out-of-bounds accesses, no uninitialised memory accesses). Monocypher's test suite has done the same (and more) on a systematic basis since before version 1.0. I use Valgrind, all the sanitizers, and the TIS interpreter. The test suite covers all code and data paths, thanks in large part to the code being constant time.
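
To see why constant-time code makes "all code and data paths" tractable, here is an illustrative constant-time comparison in the style of crypto_verify16() (a sketch, not Monocypher's exact code): there is no secret-dependent branch or index, so a single execution exercises every instruction, whatever the inputs.

    #include <stdint.h>
    #include <stddef.h>

    /* Constant-time 16-byte comparison: accumulate differences,
       never branch on the data.  Returns 0 if equal, -1 otherwise. */
    int verify16(const uint8_t a[16], const uint8_t b[16])
    {
        uint32_t diff = 0;
        for (size_t i = 0; i < 16; i++) {
            diff |= a[i] ^ b[i];
        }
        return (int)(1 & ((diff - 1) >> 8)) - 1;
    }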

So not only was Monocypher built with verifiability in mind, it has also been pretty thoroughly verified. You would know that if the time you spent discrediting Monocypher had been spent looking at it instead. It's all there in tests/test.sh, referenced in the README.

---

About that vulnerability 2 years ago: as shocking as it may be, I learned from it. The looming threat of something similar happening again tends to do that. I've paid my dues since, and learned a ton. The audit gives no cause to fear another error of that kind: the test suite was deemed adequate, and the auditors found no bugs, however minor.

That old bug is irrelevant now. Give me a break.


You're the one who drew the comparison between the security of your library and libsodium. I simply completed the comparison for you.



