By now, everyone is aware of the story published in the New York Times earlier this week by John Markoff. A team of researchers led by Arjen Lenstra scanned 7.1 million public-facing 1024-bit RSA keys and concluded that an estimated 0.2 percent of all RSA keys in the wild are duplicates, and that many more may share a common prime factor. Lenstra's research paper stated the following:
“We performed a sanity check of public keys collected on the web. Our main goal was to test the validity of the assumption that different random choices are made each time keys are generated. We found that the vast majority of public keys work as intended. A more disconcerting finding is that two out of every one thousand RSA moduli that we collected offer no security.”
This does not mean that RSA cryptography is broken, by any means.
In fact, another research team, led by Nadia Heninger at Princeton University, proposes that the issue lies in faulty implementations of the RSA algorithm that produce predictable keys, owing to weak entropy in key-generation software. Heninger agrees with Lenstra's conclusions, except that her team believes the problem is isolated to keys embedded in devices such as routers, firewalls, and VPN appliances. Lenstra's team, on the other hand, suspects that the issue is much more widespread.
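To see why a shared prime factor is so damaging, here is a minimal sketch in Python using toy-sized numbers (real RSA primes are hundreds of digits long; the primes and variable names below are purely illustrative). If two devices with weak entropy happen to pick the same prime, anyone holding both public moduli can factor them with a single gcd, with no heavy number crunching at all:

```python
import math

# Hypothetical toy primes; p is the prime both devices picked
# because their random number generators had too little entropy.
p = 1000003                 # shared prime
q1, q2 = 1000033, 1000037   # distinct second primes

n1 = p * q1   # public modulus from device 1
n2 = p * q2   # public modulus from device 2

# One gcd over the two public moduli recovers the shared factor.
shared = math.gcd(n1, n2)
print(shared == p)            # True

# With the shared prime known, both moduli factor by simple division,
# which is enough to reconstruct both private keys.
print(n1 // shared == q1, n2 // shared == q2)   # True True
```

This is why "offer no security" in the quote above is not an exaggeration: no factoring breakthrough is needed, only two unlucky moduli and a gcd.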
Both research teams brought two core issues to light:
1. Duplicate keys have been found in the wild
2. Weak keys are being generated by weak entropy
Duplicate keys may not rise to the level of scandal, but they are certainly bad practice at the administrator level. Weak keys, however, are a broader issue, as they stem from poor-quality random number generation.
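The two issues above can be checked for in different ways, and a short sketch makes the distinction concrete. The toy corpus and the quadratic pairwise loop below are purely illustrative (the researchers used a much faster batched-gcd technique to handle millions of moduli):

```python
from math import gcd

# Hypothetical mini-corpus of collected public moduli.
moduli = [15, 21, 35, 15, 77]   # 15 appears twice; several share primes

# Issue 1: duplicate moduli -- two parties hold literally the same key.
seen, duplicates = set(), set()
for n in moduli:
    if n in seen:
        duplicates.add(n)
    seen.add(n)
print(duplicates)   # {15}

# Issue 2: distinct moduli that share a prime factor, exposed by gcds.
weak = set()
for i, a in enumerate(moduli):
    for b in moduli[i + 1:]:
        if a != b and gcd(a, b) > 1:
            weak.update((a, b))
print(sorted(weak))   # [15, 21, 35, 77]
```

A duplicate modulus means someone else can already read your traffic; a shared prime means anyone who notices can.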
So, what are we going to do about it?
The Electronic Frontier Foundation (EFF), which provided the data used by the researchers, announced to the CA/Browser Forum yesterday that the RSA issue has no effect on certificates issued by trusted certificate authorities, and other CAs are announcing a clean bill of health based on Heninger's research.
We could go that route, but Symantec takes data security very seriously. We want to be abundantly sure that we have exhausted every possible threat vector, whether it's duplicate keys in the wild or weak key generation on devices. SSL keys on web servers may well not be affected at all. However, as the world's largest certificate authority, it's our duty to leave nothing to chance. We may not be first to press with an "all clear" message, but it's important to note that a) our customer database is much larger than any other CA's, b) it would be irresponsible of us to run a few quick database scrubs and call it a day, and c) we are leveraging our deep bench of resources to perform exhaustive analysis and rule out unexposed risk.
Symantec is taking proactive measures to verify the level of risk these issues pose to the digital world. We have dedicated a team of our top cryptography experts, who are conducting due diligence on our customers' keys as a precautionary best practice.
Additionally, we are working with both research teams, employing their independent methodologies to cross-check our own lists of customer-generated keys.
Symantec is also proactively implementing additional internal security measures to protect against future random-number-generation weaknesses in third-party software or devices.
I’ve said this internally, and it’s worth repeating here: our keys, and the keys that protect our customers and ultimately consumers worldwide, are our number-one priority. We’re taking this matter very seriously, going above and beyond what can be considered best-practice due diligence.
Nadia Heninger is correct: there is no reason to panic. But as the world’s largest data security company, it’s our job to double- and triple-check, leaving no stone unturned.