This article criticizes the use of the RSA encryption and signature system:
Basically, the argument here is that RSA is too simple and straightforward, so there is a temptation to implement it yourself and make bonehead errors along the way. The claim is that other, more complicated schemes are better because a potential implementer would become discouraged and seek out a library instead. Presumably the implementer would then take the time to learn how to use the library properly.
I don't think the author of this article has met many programmers. Complexity is like catnip to the sort of people that otherwise love the sort of intense detail associated with making computers do useful things.
Since many PGP implementations use RSA by default, some comments are in order.
Moreover, p and q must be chosen independently. If p and q share approximately half of their upper bits, then N can be factored using Fermat's method.
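A minimal sketch of why this matters: Fermat's method factors N = pq quickly whenever p and q are close together, because N = a² − b² for a small search over a. The tiny primes below are chosen purely for illustration:

```python
import math

def fermat_factor(n):
    """Fermat's method: find a, b with n = a*a - b*b = (a - b)(a + b).
    Fast exactly when the two factors are close together."""
    a = math.isqrt(n)
    if a * a < n:
        a += 1
    while True:
        b2 = a * a - n
        b = math.isqrt(b2)
        if b * b == b2:
            return a - b, a + b
        a += 1

# Two primes sharing most of their upper bits factor almost instantly.
print(fermat_factor(10007 * 10009))  # -> (10007, 10009)
```

Real keys use 1024-bit-plus primes, but the search cost depends only on how far apart p and q are, not on their size, which is why independent selection matters.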
From a recent (2022) study 12) that involved searching for instances of this weakness:
I applied the algorithm to a dump of the SKS PGP key servers. I found four vulnerable keys. However all these keys had a user ID that did imply they were created for testing.
It is plausible that these keys were not generated by vulnerable implementations, but were manually crafted, possibly by people aware of this attack creating test data.
The study ended up showing that this particular bonehead error was rare. It was found in a single very obscure library.
In fact, even the choice of primality testing algorithm can have security implications.
Perhaps, but the link is to a study with no obvious applicability to RSA. So it is not clear exactly what the claimed issue is here.
It’s important to recognize that in none of these cases is it intuitively obvious that generating primes in such a way leads to complete system failure.
Normally for RSA you pick two random numbers and then find a prime close to each. All of these cases involved generating RSA keys in ways quite different from the normal approach. So sure, your super clever method might have weaknesses that are not obvious, but what rational person would not consider the possible existence of such weaknesses when doing something entirely different from what everyone else is doing? Why is RSA singled out here? Doing weird stuff will usually produce strange results in any context.
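The normal procedure described above can be sketched as follows. This is an illustrative outline, not any particular library's implementation; the Miller-Rabin round count and bit size are assumptions:

```python
import secrets

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for sp in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % sp == 0:
            return n == sp
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = 2 + secrets.randbelow(n - 3)   # random base in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                   # definitely composite
    return True

def random_prime(bits):
    """Pick a random odd number with the top bit set, then walk
    upward to the next probable prime."""
    n = secrets.randbits(bits) | (1 << (bits - 1)) | 1
    while not is_probable_prime(n):
        n += 2
    return n

p = random_prime(512)
q = random_prime(512)   # independent draw: shared upper bits are astronomically unlikely
```

Because p and q come from independent random starting points, the Fermat-style closeness problem described earlier simply does not arise.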
Instead, developers are encouraged to choose a large d such that Chinese remainder theorem techniques can be used to speed up decryption. However, this approach’s complexity increases the probability of subtle implementation errors, which can lead to key recovery.
The linked article doesn't describe any sort of implementation error. Instead it describes a completely theoretical hardware attack.
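For reference, the CRT speedup the quoted passage alludes to is textbook material: split the decryption into two half-size exponentiations and recombine. A sketch with toy parameters (far too small for real use):

```python
def crt_decrypt(c, p, q, d):
    """RSA decryption via the Chinese remainder theorem (Garner's formula):
    two half-size modular exponentiations instead of one full-size one."""
    dp, dq = d % (p - 1), d % (q - 1)
    qinv = pow(q, -1, p)
    mp = pow(c, dp, p)
    mq = pow(c, dq, q)
    h = (qinv * (mp - mq)) % p
    return mq + h * q

# Toy parameters -- never use sizes like this in practice.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))
m = 42
c = pow(m, e, n)
assert crt_decrypt(c, p, q, d) == pow(c, d, n) == m
```

The recombination step is where the "subtle implementation errors" worry lives: a fault in just one of the two half computations (the Bellcore-style fault attack) can leak a factor of n, which is a hardware-fault concern rather than a software bug in the usual sense.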
Despite cryptographers recommending the use of 65537, developers often choose e = 3 which introduces many vulnerabilities into the RSA cryptosystem.
A quick check of an RSA key generated by GnuPG revealed the use of 65537 for the public exponent.
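For context, the best-known e = 3 pitfall is the low-exponent attack on unpadded messages: if m³ < N, the modular reduction never happens and the plaintext falls out as a plain integer cube root. A toy sketch (the modulus below is a stand-in, not a real key):

```python
def integer_cube_root(c):
    """Smallest integer x with x**3 >= c, by binary search
    (Python's math module has no exact integer cube root)."""
    lo, hi = 0, 1 << (c.bit_length() // 3 + 2)
    while lo < hi:
        mid = (lo + hi) // 2
        if mid ** 3 < c:
            lo = mid + 1
        else:
            hi = mid
    return lo

# A short message encrypted with e = 3 and no padding.
m = int.from_bytes(b"hi", "big")   # small plaintext
n = (1 << 2048) - 159              # stand-in for an RSA modulus
c = pow(m, 3, n)                   # equals m**3, since m**3 < n
assert integer_cube_root(c) == m   # plaintext recovered without the key
```

Proper padding (OAEP for encryption) makes m close to N in size, which is why the padding, not the exponent alone, is what actually stops this.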
Padding oracle attacks everywhere
Padding oracle attacks are not applicable to PGP simply because the encryption is only done once and there is no reverse channel.
So what should you use instead?
First of all, a common misconception is that ECC is super dangerous because choosing a bad curve can totally sink you.
Alternatively, it is a completely accurate conception that failing to validate your elliptic curve parameters properly can lead to bad outcomes. In some cases, failure to properly validate gets an attacker the secret key material.
Note all the conditional bits covered by the linked article in the previous paragraph. Different curves have different properties and different issues. There are a bunch of different curves in common use, while RSA pretty much always uses 65537 for the one and only implementer-controlled parameter (the public exponent).
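One concrete instance of the validation discussed above is checking that a received point actually satisfies the curve equation before using it, which blocks the simplest invalid-curve attacks. A sketch using the published NIST P-256 domain parameters:

```python
# NIST P-256 domain parameters (from FIPS 186-4 / SEC 2).
P256_P = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
P256_A = P256_P - 3
P256_B = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b
P256_GX = 0x6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296
P256_GY = 0x4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5

def on_curve(x, y, p=P256_P, a=P256_A, b=P256_B):
    """True iff (x, y) satisfies y^2 = x^3 + a*x + b over GF(p)."""
    return (y * y - (x * x * x + a * x + b)) % p == 0

assert on_curve(P256_GX, P256_GY)          # the standard base point checks out
assert not on_curve(P256_GX, P256_GY + 1)  # a tampered point is rejected
```

This is only the first of the conditional checks the linked article discusses (full validation also covers the point at infinity, subgroup order, and twist security, depending on the curve), which underlines the point that ECC shifts complexity into parameter handling rather than eliminating it.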