
dc.contributor.advisor: Trafalis, Theodore
dc.contributor.author: Jafarigol, Elaheh
dc.date.accessioned: 2023-12-01T15:51:39Z
dc.date.available: 2023-12-01T15:51:39Z
dc.date.issued: 2023-12-15
dc.identifier.uri: https://hdl.handle.net/11244/339990
dc.description.abstract: Federated learning is a groundbreaking distributed machine learning paradigm that allows for the collaborative training of models across various entities without directly sharing sensitive data, ensuring privacy and robustness. This Ph.D. dissertation delves into the intricacies of federated learning, investigating the algorithmic and data-driven challenges of deep learning models in the presence of additive noise in this framework. The main objective is to provide strategies to measure the generalization, stability, and privacy-preserving capabilities of these models and to further improve them. To this end, five noise infusion mechanisms at varying noise levels are explored in both centralized and federated learning settings. As model complexity is a key component of the generalization and stability of deep learning models during training and evaluation, a comparative analysis of three Convolutional Neural Network (CNN) architectures is provided. A key contribution of this study is the introduction of specific metrics for training with noise. The Signal-to-Noise Ratio (SNR) is introduced as a quantitative measure of the trade-off between privacy and training accuracy of noise-infused models, with the aim of finding the noise level that yields optimal privacy and accuracy. Moreover, the Price of Stability and the Price of Anarchy are defined in the context of privacy-preserving deep learning, contributing to a systematic investigation of noise infusion mechanisms that enhance privacy without compromising performance. This research sheds light on the delicate balance between these critical factors, fostering a deeper understanding of the implications of noise-based regularization in machine learning. The study also explores a real-world application of federated learning to weather prediction tasks that suffer from imbalanced datasets. Utilizing data from multiple sources combined with advanced data augmentation techniques improves the accuracy and generalization of weather prediction models, even when dealing with imbalanced data. Overall, federated learning is pivotal in harnessing decentralized datasets for real-world applications while safeguarding privacy. By leveraging noise as a tool for regularization and privacy enhancement, this study aims to contribute to the development of robust, privacy-aware algorithms, ensuring that AI-driven solutions prioritize both utility and privacy.
dc.language: en_US
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Machine learning
dc.subject: Artificial intelligence
dc.subject: Privacy
dc.subject: Federated learning
dc.title: Uncovering the Potential of Federated Learning: Addressing Algorithmic and Data-driven Challenges under Privacy Restrictions
dc.contributor.committeeMember: Razzaghi, Talayeh
dc.contributor.committeeMember: Diochnos, Dimitrios
dc.contributor.committeeMember: Siddique, Zahed
dc.date.manuscript: 2023-11-15
dc.thesis.degree: Ph.D.
ou.group: Gallogly College of Engineering
shareok.orcid: 0000-0002-0294-3454
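
To make the abstract's noise infusion and Signal-to-Noise Ratio (SNR) ideas concrete, the following is a minimal illustrative sketch in Python. It assumes additive Gaussian noise applied to a flat weight vector and a decibel formulation of SNR; the function names (add_gaussian_noise, signal_to_noise_ratio) and the noise-level sweep are hypothetical and are not the dissertation's actual implementation.

    import numpy as np

    def add_gaussian_noise(weights, sigma, rng=None):
        # Additive Gaussian noise infusion on a flat weight vector
        # (one of several possible mechanisms).
        rng = np.random.default_rng() if rng is None else rng
        noise = rng.normal(0.0, sigma, size=weights.shape)
        return weights + noise, noise

    def signal_to_noise_ratio(weights, noise):
        # SNR in decibels: mean power of the clean weights over
        # mean power of the added noise.
        signal_power = np.mean(weights ** 2)
        noise_power = np.mean(noise ** 2)
        return 10.0 * np.log10(signal_power / noise_power)

    # Sweeping the noise level traces a privacy/accuracy trade-off curve:
    # a larger sigma lowers the SNR (more obfuscation, less fidelity).
    w = np.random.default_rng(0).normal(size=1_000)
    for sigma in (0.01, 0.1, 1.0):
        _, n = add_gaussian_noise(w, sigma)
        print(f"sigma={sigma}: SNR = {signal_to_noise_ratio(w, n):.1f} dB")

In this toy sweep, increasing sigma drives the SNR down, which is the direction of the privacy-versus-accuracy trade-off the abstract describes.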



