Machine learning innovations for enhancing quantum-resistant cryptographic protocols in secure communication

Peter Adeyemo Adepoju 1, *, Blessing Austin-Gabriel 2, Adebimpe Bolatito Ige 3, Nurudeen Yemi Hussain 4, Olukunle Oladipupo Amoo 5 and Adeoye Idowu Afolabi 6

1 Independent Researcher, Lagos, Nigeria.
2 Babcock University, Ilishan-Remo, Ogun State, Nigeria.
3 Independent Researcher, Canada.
4 M&M Technical Services, Nigeria.
5 Amstek Nigeria Limited, Nigeria.
6 Independent Researcher, Nigeria.
 
Review
Open Access Research Journal of Multidisciplinary Studies, 2022, 04(01), 131-139.
Article DOI: 10.53022/oarjms.2022.4.1.0075
Publication history: 
Received on 03 June 2022; revised on 16 July 2022; accepted on 19 July 2022
 
Abstract: 
Quantum computing has introduced unprecedented challenges to traditional cryptographic systems, rendering many current protocols vulnerable to quantum attacks. Quantum-resistant cryptography has emerged as a crucial field, employing innovative algorithms such as lattice-based and hash-based schemes to counter these threats. Concurrently, machine learning (ML) is reshaping cryptography by enhancing protocol optimization, key generation, and threat detection. This paper explores the integration of ML with quantum-resistant cryptographic frameworks, highlighting its potential to address efficiency and scalability challenges while improving security. Strategies for combining the two domains are discussed, emphasizing hybrid models, dynamic adaptation to threats, and lightweight solutions. The study also considers the potential risks of ML integration, such as adversarial vulnerabilities and resource demands, and recommends collaborative efforts among researchers, policymakers, and practitioners. Ultimately, this interdisciplinary approach promises robust, scalable, and future-ready cryptographic systems for secure communication in the quantum era.
 
Keywords: 
Quantum-resistant cryptography; Machine learning in cryptography; Quantum computing threats; Secure communication; Cryptographic protocol optimization
 