A Zero Trust Framework for Realization and Defense Against Generative AI Attacks in Power Grid
Abstract
Understanding the potential of generative AI (GenAI)-based attacks on the power grid is a fundamental challenge that must be addressed in order to protect the grid: new attack vectors must first be realized and their risk validated.
In this paper, a novel zero trust framework for a power grid supply chain (PGSC) is proposed.
This framework facilitates early detection of potential GenAI-driven attack vectors (e.g., replay and protocol-type attacks), assessment of tail risk-based stability measures, and mitigation of such threats.
- First, a new zero trust system model of the PGSC is designed and formulated as a zero-trust problem that seeks to guarantee a stable PGSC by realizing and defending against GenAI-driven cyber attacks.
- Second, a domain-specific generative adversarial network (GAN)-based attack generation mechanism is developed to create a new vulnerability cyberspace for further understanding of such threats.
- Third, tail-based risk realization metrics are developed and implemented for quantifying the extreme risk of a potential attack while leveraging a trust measurement approach for continuous validation.
- Fourth, an ensemble learning-based bootstrap aggregation scheme is devised to detect attacks that generate synthetic identities with convincing user and distributed energy resource (DER) device profiles.
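The tail-based risk realization in the third step resembles a Conditional Value-at-Risk (CVaR) style measure: the expected loss in the worst (1 − α) tail of the stability-loss distribution. A minimal sketch under that assumption (the loss samples and the CVaR-at-95% formulation here are illustrative, not the paper's exact metric):

```python
import numpy as np

def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk: mean loss among the worst (1 - alpha) tail."""
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)   # Value-at-Risk threshold at level alpha
    tail = losses[losses >= var]       # worst-case tail samples
    return tail.mean()

# Hypothetical grid-stability deviation samples (exponential, heavy right tail).
rng = np.random.default_rng(0)
stability_losses = rng.exponential(scale=0.03, size=10_000)
print(f"CVaR@95%: {cvar(stability_losses, alpha=0.95):.4f}")
```

A measure like this captures extreme-event risk that a plain mean or variance would understate, which is why a tail-based metric suits rare but severe attack-induced instabilities.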
Experimental results show the efficacy of the proposed zero trust framework, which achieves an accuracy of 95.7% on attack vector generation, a risk measure of 9.61% for a 95% stable PGSC, and 99% confidence in defense against GenAI-driven attacks.
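The bootstrap-aggregation (bagging) detector in the fourth step can be sketched with scikit-learn's `BaggingClassifier`; the synthetic-identity features below are hypothetical stand-ins for the paper's user/DER device profiles, not its actual dataset:

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical 8-dimensional profile features: legitimate vs. GAN-forged identities.
legit = rng.normal(loc=0.0, scale=1.0, size=(500, 8))
forged = rng.normal(loc=0.8, scale=1.2, size=(500, 8))
X = np.vstack([legit, forged])
y = np.array([0] * 500 + [1] * 500)  # 1 = synthetic identity

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# Bagging: many base learners, each fit on a bootstrap resample, votes aggregated.
clf = BaggingClassifier(n_estimators=25, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

Bagging reduces the variance of individual base learners, which helps when forged profiles are deliberately crafted to sit close to the legitimate distribution.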
Authors
- Md Shirajum Munir, Old Dominion University
- Sravanthi Proddatoori, Old Dominion University
- Manjushree Muralidhara, Old Dominion University
- Walid Saad, Virginia Tech
- Zhu Han, University of Houston
- Sachin Shetty, Old Dominion University
- Publication/Conference: 2024 IEEE International Conference on Communications.
- Date: June 9-13, 2024.