Abstract
In the paper “Why Cryptosystems Fail”, Ross Anderson ponders why cryptosystems really fail. Obviously, there are weak crypto-algorithms, too-short key lengths, and flawed crypto-protocols. However, these were not the main reasons why cryptosystems failed. Anderson discovered that the problem had more to do with misplaced trust and misconceptions about the threats the systems faced. Now, more than 25 years later, it seems prudent to revisit the question of why cryptosystems fail. We investigate the original paper and evaluate to what extent the situation is similar today.
References
Anderson, R. (1993). Why cryptosystems fail. In Proceedings of the 1st ACM conference on computer and communications security (pp. 215–227). ACM.
Zimmermann, P. (1998). An introduction to cryptography. Documentation for Pretty Good Privacy. Santa Clara: Network Associates.
Schneier, B. (1998). Memo to the amateur cipher designer. Crypto-Gram Newsletter.
Heilman, E., Narula, N., Dryja, T., & Virza, M. (2017). IOTA vulnerability report: Cryptanalysis of the Curl hash function enabling practical signature forgery attacks on the IOTA cryptocurrency. Technical report, MIT Media Lab.
Schneier, B. (2016). Cryptography is harder than it looks. IEEE Security & Privacy, 14(1), 87–88.
Walker, J., et al. (2000). Unsafe at any key size; An analysis of the WEP encapsulation. IEEE Document, 802(00), 362.
Carvalho, M., DeMott, J., Ford, R., & Wheeler, D. A. (2014). Heartbleed 101. IEEE Security & Privacy, 12(4), 63–67.
Barkan, E., Biham, E., & Keller, N. (2003). Instant ciphertext-only cryptanalysis of GSM encrypted communication. In Annual international cryptology conference (pp. 600–616). Springer.
ICAO. (2006). Convention on International Civil Aviation (9th ed., Doc 7300/9). Montreal: ICAO.
Taleb, N. N. (2018). Skin in the game: Hidden asymmetries in daily life. New York: Random House.
Abadi, M., & Needham, R. (1996). Prudent engineering practice for cryptographic protocols. IEEE Transactions on Software Engineering, 1, 6–15.
Checkoway, S., Fredrikson, M., Niederhagen, R. F., Everspaugh, A., Green, M., Lange, T., Ristenpart, T., Bernstein, D. J., & Shacham, H., et al. (2014). On the practical exploitability of Dual EC in TLS implementations. In 23rd USENIX security symposium (USENIX Security 14). USENIX Association.
Checkoway, S., Maskiewicz, J., Garman, C., Fried, J., Cohney, S., Green, M., et al. (2018). Where did I leave my keys? Lessons from the Juniper Dual EC incident. Communications of the ACM, 61(11), 148–155.
Higginbotham, S. (2018). 6 ways IoT is vulnerable. IEEE Spectrum, 55(7), 21.
Thompson, K. (1984). Reflections on trusting trust. Communications of the ACM, 27(8), 761–763.
Malmedal, B. & Røislien, H. E. (2016). The Norwegian cybersecurity culture. NorSIS: Report.
Seacord, R. C. (2008). The CERT C secure coding standard. London: Pearson Education.
Shor, P. W. (1994). Algorithms for quantum computation: Discrete logarithms and factoring. In 35th Annual symposium on foundations of computer science, 1994 proceedings (pp. 124–134). IEEE.
ETSI. (2011). Implementation security of quantum cryptography; Introduction, challenges, solutions. ETSI White Paper 27, ETSI, Sophia Antipolis, France.
ETSI. (2015). Quantum safe cryptography and security: An introduction, benefits, enablers and challenges. ETSI White Paper 8, ETSI, Sophia Antipolis, France.
Smart, N. P. (ed). (2014). Algorithms, key sizes and parameters report 2014. Technical report, ENISA.
Microsoft Corporation. Deprecation of SHA-1 for SSL/TLS certificates in Microsoft Edge and Internet Explorer 11. Technical report, Microsoft.
Peters, T. (2004). PEP 20—The Zen of Python.
Industrial Control Systems Cyber Emergency Response Team (ICS-CERT). Recommended practice: Improving industrial control system cybersecurity with defense-in-depth strategies. Department of Homeland Security.
NeSmith, B. (2018). The cybersecurity talent gap is an industry crisis. Forbes (online), 2018-08-09.
Anderson, R. (2008). Security engineering. Hoboken: Wiley.
Feynman, R. P. (1985). Cargo cult science. In Surely you’re joking, Mr. Feynman! (1st ed.). New York: W. W. Norton. Originally a 1974 Caltech commencement address.
Schneier, B. (2003). Beyond fear: Thinking sensibly about security in an uncertain world. New York: Copernicus Book.
Greenberg, A. (2018). The untold story of NotPetya, the most devastating cyberattack in history. Wired (online), 2018-08-22.
Gollmann, D. (2003). Analysing security protocols. In Formal aspects of security: First international conference, FASec 2002, London, UK, December 16–18, 2002, Revised papers (vol. 1, p. 71). Springer.
Knuth, D. (1977). Notes on the van Emde Boas construction of priority deques: An instructive use of recursion. Memo/Letter.
ENISA. ENISA threat landscape report 2017. Technical report, ENISA.
Symantec Corporation. Internet security threat report. (2018). Report. Mountain View: Symantec Corporation.
ETSI Technical Committee Cyber Security. CYBER; methods and protocols; part 1: Method and pro forma for threat, vulnerability, risk analysis (TVRA). Technical Specification 102 165-1 V5.2.3, ETSI (2017).
Shostack, A. (2014). Threat modeling: Designing for security (1st ed.). Hoboken: Wiley.
Kalenderi, M., Pnevmatikatos, D., Papaefstathiou, I., & Manifavas, C. (2012). Breaking the GSM A5/1 cryptography algorithm with rainbow tables and high-end FPGAs. In 22nd International conference on field programmable logic and applications (FPL), 2012 (pp. 747–753). IEEE.
Nohl, K. (2010). Attacking phone privacy. Black Hat USA, pp. 1–6.
Dunkelman, O., Keller, N., & Shamir, A. (2010). A practical-time related-key attack on the KASUMI cryptosystem used in GSM and 3G telephony. In Annual cryptology conference (pp. 393–410). Springer.
Florêncio, D., & Herley, C. (2013). Where do all the attacks go? In Economics of information security and privacy III (pp. 13–33). Springer.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121.
Plous, S. (1993). The psychology of judgment and decision making. McGraw-Hill series in social psychology. New York: McGraw-Hill.
Schneier, B. Drawing the wrong lessons from horrific events. CNN.com.
Taleb, N. N. (2007). The black swan: The impact of the highly improbable. New York: Random House Publishing Group.
Cavoukian, A. (2009). Privacy by design: The 7 foundational principles. Ontario: Information and Privacy Commissioner of Ontario.
EU. (2016). Regulation (EU) 2016/679 (General Data Protection Regulation).
NSM. (2016). S-01 Fire effektive tiltak mot dataangrep [Four effective measures against cyber attacks].
NSM. (2016). S-02 Ti viktige tiltak mot dataangrep [Ten important measures against cyber attacks].
Annex: The list of Lessons
1. Understanding why cryptosystems fail is important.
2. Security costs should be internalized (if at all possible).
3. Security controls to provide accountability are important.
4. Systems must have detection and response capabilities.
5. The principle of least privilege should be enforced.
6. Outsourcing must be done in a responsible way.
7. Security responsibility cannot be outsourced.
8. Minimizing the attack surface is vitally important.
9. Strong quality assurance is a prerequisite for security.
10. Misguided advice can be harmful (do not trust all advice).
11. Passwords and PIN codes alone cannot provide strong security.
12. Requirements for unpredictability must be explicit.
13. Pseudo-random number generators must be verified.
14. Timely handling of security updates is absolutely essential.
15. Old equipment that cannot be updated is a serious security risk.
16. Crypto products must not be blindly trusted.
17. Hardware support is needed for strong security.
18. Trust must be warranted and continually evaluated.
19. Cryptographic backdoors must be considered harmful.
20. Physical exposure and inadequate protection are a problem.
21. Systems need a complete and comprehensive security architecture.
22. Systems need complete and comprehensive security policies.
23. Security is a process. All parts have a best-before date.
24. Human factors must be taken into account.
25. Special care must be taken when handling security credentials.
26. Cryptosystems must be designed to allow algorithm changes.
27. Cryptosystems must be designed to allow extending the key lengths.
28. Every effort should be made to keep complexity to a minimum.
29. Defense-in-depth is a recommended strategy.
30. Security specialization studies need increased capacity.
31. Security must be taught as part of all ICT education.
32. Security and risk must be taught as part of all higher education.
33. Assumptions considered harmful. Explicitness is a virtue.
34. Threat modeling should be done frequently.
35. One must take measures to ensure that one uses “good” security.
36. Company politics may affect security and may obscure the goals.
37. Both proactive and reactive security are needed.
38. Scalability issues affect both attacks and defenses.
39. Cost–benefit considerations affect both attacks and defenses.
40. End-users cannot be relied upon concerning security decisions.
41. Experts may sometimes overestimate threats/risks.
42. Beware of the “exceptional versus mundane” problem.
43. Risk awareness of exceptional events is necessary.
44. Security cultures can be part of the problem, but also the solution.
45. Security cultures change, human nature remains.
46. Systems evolve. Everything may need to be replaced.
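Some of these lessons lend themselves to a concrete illustration. As a minimal sketch of Lessons 12 and 13 (requirements for unpredictability must be explicit, and pseudo-random number generators must be verified), the Python snippet below contrasts a statistically good but predictable PRNG with the operating system's CSPRNG. The function names are ours, chosen for illustration, and do not appear in the paper.

```python
# Hypothetical illustration of Lessons 12-13: the random source must match
# an explicit unpredictability requirement.
import random
import secrets

def weak_token(nbytes: int = 16) -> str:
    """Token from random.Random (Mersenne Twister): fine for simulation,
    but fully predictable once enough consecutive outputs are observed.
    Must never be used for keys, session tokens, or nonces."""
    return bytes(random.randrange(256) for _ in range(nbytes)).hex()

def strong_token(nbytes: int = 16) -> str:
    """Token from the secrets module, which draws from the OS CSPRNG
    (os.urandom): the appropriate source when unpredictability is an
    explicit security requirement."""
    return secrets.token_hex(nbytes)

if __name__ == "__main__":
    print("weak  :", weak_token())
    print("strong:", strong_token())
```

The two functions produce output of identical appearance, which is precisely why the requirement for unpredictability must be stated explicitly and the generator verified, rather than judged by inspecting its output.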
Køien, G.M. Why Cryptosystems Fail Revisited. Wireless Pers Commun 106, 85–117 (2019). https://doi.org/10.1007/s11277-019-06265-6