
> efficient complete formulas for Weierstrass curves were found only after Curve25519 was well established

Assuming you are talking about the Renes–Costello–Batina formulas, they're complete, but not necessarily efficient. According to [1], optimized short Weierstrass with the complete formulas is still 1.5 to 3 times slower than Curve25519. I imagine the comparison against Edwards25519 wouldn't look much better, either. There's definitely still a lot of room for better complete addition formulas on Weierstrass curves.

> Ristretto is nice but so terribly complex

Ristretto is nice, terribly complex, and you don't actually need to care about the conceptual complexity. As an implementer, your only job is to execute the explicit formulas in section 5 of the Ristretto website. You do not have to be able to follow the hard math (just as you do not have to follow the hard math that went into deriving those explicit formulas). Plus, the entire thing can trivially be made constant-time, given a constant-time selection primitive and constant-time field arithmetic. It's not that much more difficult than doing regular point compression on your own.
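
For illustration, a branch-free selection primitive over field elements stored as bigints could look like the sketch below. The names (P, cselect) are made up for this example, and JavaScript bigint arithmetic isn't actually guaranteed to run in constant time; the point is only to show the branch-free selection pattern:

    // Field prime for Curve25519 / Ristretto255.
    const P = 2n ** 255n - 19n;

    // Branch-free selection: returns a when flag === 1n, b when flag === 0n.
    // There is no data-dependent branch on `flag`.
    function cselect(flag: bigint, a: bigint, b: bigint): bigint {
      const mask = -flag & ((1n << 256n) - 1n); // all ones if flag = 1, zero if flag = 0
      return b ^ ((a ^ b) & mask);
    }

    const picked = cselect(1n, 5n, 7n); // -> 5n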

[1] Peter Schwabe, Daan Sprenkels. The complete cost of cofactor h = 1. INDOCRYPT 2019. https://eprint.iacr.org/2019/1166.pdf



>Ristretto is nice, terribly complex, and you don't actually need to care about the conceptual complexity. As an implementer, your only job is to execute the explicit formulas in section 5 of the Ristretto website. You do not have to be able to follow the hard math (just as you do not have to follow the hard math that went into deriving those explicit formulas).

I don't think one should blindly follow instructions without understanding why in any field, let alone in crypto, where a small, subtle difference can make or break it. Also, understanding crypto requires less math than inventing (and attacking) crypto; it takes some effort, but it's doable even for hobbyists. If the math makes one uncomfortable, maybe one shouldn't try to roll their own crypto for production use in the first place.

Case in point: the author of this article that we're commenting on made a deadly mistake because they did not understand the math behind point conversion between Ed25519 and Curve25519 [1].
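
For context, the conversion itself is mathematically a one-liner: the standard birational map sends an Ed25519 point's y-coordinate to the Curve25519 u-coordinate

    u = (1 + y) / (1 - y)   (mod p, with p = 2^255 - 19)

and back via y = (u - 1) / (u + 1). The formula is the easy part; the subtle part is everything around it (edge cases like y = 1, low-order points, recovering the sign of x), which is why pattern-matching the formula without the underlying math is dangerous.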

Below I also point out a mistake in their claim about malleability in EdDSA.

[1] https://www.reddit.com/r/crypto/comments/8toywt/critical_vul...


> Case in point: the author of this article that we're commenting on made a deadly mistake because they did not understand the math behind point conversion between Ed25519 and Curve25519

By the way, since you seem to work on Wycheproof: would you take a look at my pull request? The EdDSA test vectors would have saved me (and my users), but the front page didn't mention them, so I didn't know they even existed for over a year. I initially believed you were concentrating on other, even more error-prone, primitives.

Others might be in my position: letting bugs through because they think Wycheproof is not relevant to their project. A pity, considering how amazing Wycheproof is (I'm now systematically integrating any new test vector I learn about).


That's a good example of how a "SafeCurve" caused a vulnerability that wouldn't exist with a Weierstrass curve.

But many smart people have made many such mistakes in the past. If we gatekeep it too much, then we won't have anyone left to implement crypto.


I didn't say anything about gatekeeping. It's okay to make mistakes; that's one of the best ways to learn.

I said if one isn't comfortable with the math, maybe don't try to roll one's own crypto and advertise or use it as production-grade crypto.


It's difficult to assess one's "comfort" with the math. I've been working with crypto for more than 10 years and I wouldn't say that I'm perfectly "comfortable" with it (e.g. the Ristretto stuff). Should I stop working with it?


Then you should not implement Ristretto, and should continue implementing the stuff that you're comfortable with.

Crypto is deep. You can get involved at the levels you feel comfortable with.


Maybe we should gatekeep it that much, though. As long as there exist at least two people capable of implementing it per programming language (one to implement, another to audit), there should only ever be one, single, canonical implementation, and there's no way around it. It is not and should not be an inherent right to be allowed to implement cryptography (that is put into production or made publicly available). The gatekeeping is there for a reason, and it's important that we uphold it. Fewer implementations means that more people can focus on writing and checking less code overall. Patents could be used to help with this by only permitting one upstream implementation to exist, but that's not how they end up being used in practice, and that's ignoring the fact that patent terms are impractically short (especially compared to copyright terms).


Gatekeeping is a double-edged, and somewhat blunt, sword.

First, some maverick is going to ignore what everyone says and implement crypto for serious applications. Like yours truly.

Second, I've seen it go a bit too far when I implemented Argon2i: there was a discrepancy between the specs and the reference implementation, and the authors haven't corrected the specs. I figured this was because not enough independent implementers bugged them about it. (Now, 3 years later, the specs still aren't fixed, so maybe the authors are really, really busy. At least the issue is still open: https://github.com/P-H-C/phc-winner-argon2/issues/183 )


That simply does not work in the real world. Also, why does this only apply to crypto? An RCE vuln can have a much larger impact than mishandling cofactors. Should we have canonical implementations of every piece of software imaginable?


we should leave the task of left-padding a string to a popular, no doubt well-tested library


Exactly. Yes, it's still not as efficient... but it feels to me that if they had been found earlier and "marketed" the way Curve25519 was, we would be using them instead.

There have always been more efficient options (binary curves, extension fields), but they never caught on, so efficiency isn't everything.

From a cursory reading, shouldn't that paper compare timings with a Ristretto implementation? The overhead may be small but must be measured for a fair comparison.

It's good to know that implementing Ristretto is much easier than understanding it - that website is very intimidating ;) I need to study it more.


It's not that complex once you have formulas for computing square roots. I've recently implemented it in TypeScript using bigints for browsers & nodejs. Quite readable & performant. See index.ts file here: https://github.com/paulmillr/noble-ed25519
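
For the curious, the square root part really is small: since p = 2^255 - 19 ≡ 5 (mod 8), a square root takes one exponentiation plus an optional fix-up by sqrt(-1). A rough sketch in the same bigint style (helper names here are made up for the example, not the actual functions in noble-ed25519):

    const P = 2n ** 255n - 19n;

    const mod = (a: bigint): bigint => ((a % P) + P) % P;

    // Square-and-multiply exponentiation mod P.
    function pow(base: bigint, exp: bigint): bigint {
      let result = 1n;
      base = mod(base);
      while (exp > 0n) {
        if ((exp & 1n) === 1n) result = mod(result * base);
        base = mod(base * base);
        exp >>= 1n;
      }
      return result;
    }

    // sqrt(-1) mod p; exists because p ≡ 5 (mod 8) and 2 is a non-residue.
    const SQRT_M1 = pow(2n, (P - 1n) / 4n);

    // Returns a square root of a mod p, or null if a is not a square.
    function sqrt(a: bigint): bigint | null {
      let r = pow(a, (P + 3n) / 8n);       // candidate root
      if (mod(r * r) === mod(a)) return r;
      r = mod(r * SQRT_M1);                // fix up with sqrt(-1)
      if (mod(r * r) === mod(a)) return r;
      return null;                         // a is a non-residue
    }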

Wish the Ristretto folks added the library to their website, though.



