Secure and Privacy-Preserving Decentralized AI through Model Refinement and Fully Homomorphic Encryption
Researchers from Old Dominion University and Virginia Commonwealth University
Researchers aim to exploit potential system vulnerabilities in the decentralized learning framework, develop attack and defense mechanisms, and theoretically analyze the system's resilience.
They will also enable a privacy-preserving decentralized learning process to protect sensitive data, and theoretically analyze and prove its privacy guarantees.
Funded by the CCI Coastal Virginia Node
Project Investigators
Rationale
The Internet of Things (IoT) has boosted the distribution of data, prompting the study of distributed learning algorithms.
However, current distributed learning systems face challenges in areas such as security, privacy, compatibility, and efficiency. Primary considerations include:
- The success of such learning systems heavily depends on the integrity of both the central server and data holders.
- IoT and Cyber-Physical Systems (CPS) usually hold heterogeneous models and data, which require highly compatible learning systems to utilize these resources effectively.
- Security and privacy become major concerns as more users contribute to the learning process in pursuit of reliable learning performance.
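To make the privacy goal concrete, the sketch below shows how an aggregator could sum encrypted model updates from multiple clients without ever seeing the plaintext values. It uses the Paillier cryptosystem, an additively homomorphic scheme that is simpler than the fully homomorphic encryption named in the project title; the prime sizes, function names, and integer-quantized "updates" are all illustrative assumptions, not the project's actual protocol.

```python
import math
import random

# Toy Paillier additively homomorphic cryptosystem.
# Illustrative only: real deployments need >= 2048-bit primes
# and a vetted cryptographic library.
p, q = 1000003, 1000033           # small demo primes
n = p * q
n_sq = n * n
g = n + 1                         # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)      # Carmichael function lambda(n)
mu = pow(lam, -1, n)              # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2, with random r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """D(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

# Each client encrypts its local update (an integer-quantized gradient
# in this sketch); the aggregator multiplies ciphertexts, which
# corresponds to adding the underlying plaintexts.
client_updates = [12, 7, 30]
ciphertexts = [encrypt(u) for u in client_updates]

aggregate = 1
for c in ciphertexts:
    aggregate = (aggregate * c) % n_sq   # homomorphic addition

print(decrypt(aggregate))  # 49 == 12 + 7 + 30
```

The key property is that only the holder of the decryption key learns the sum; individual contributions stay hidden from the aggregator, which is the kind of guarantee the project aims to establish formally.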
Projected Outcomes
This research will help generate innovative and secure decentralized deep-learning techniques for numerous applications such as smart cities, smart homes, and mobile health.
It will also transform existing centralized or distributed smart applications into fully decentralized ones with security and performance guarantees, laying the groundwork for systems research in dense IoT applications supported by decentralized learning.