Learning what works in accommodations: A federated machine learning framework with differential privacy guarantees
1 Department of Business Administration (Business Analytics Major), Wilmington University, New Castle, DE 19720, USA.
2 Department of Information Technology, Washington University of Science and Technology, Alexandria, VA 22314, USA.
3 Department of MBA, Ashland University, Ashland, OH 44805, USA.
Research Article
World Journal of Advanced Research and Reviews, 2023, 20(03), 2374–2379
Publication history:
Received on 28 October 2023; revised on 19 December 2023; accepted on 27 December 2023
Abstract:
This paper presents the Privacy-Preserving Autism Employment Data Trust, a machine learning-based federated data trust architecture that enables evaluation of workplace accommodations without raw data sharing. The framework incorporates differential privacy to safeguard privacy-sensitive health and employment information, and is aligned with the NIST AI Risk Management Framework, HIPAA, and the 21st Century Cures Act/ONC guidelines. Using a synthetic multi-site data set, the framework is shown to answer cross-site analytic queries with minimal accuracy loss while ensuring strong privacy. Findings demonstrate that this method strikes a balance between privacy, compliance, and utility while providing an evidence-based pathway to inclusive employment research.
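To illustrate the privacy-utility trade-off the abstract describes, the following is a minimal sketch of a differentially private cross-site count query using the Laplace mechanism. This is an illustrative assumption, not the paper's actual implementation: the function names, the single-count query, and the sensitivity-1 setting are all hypothetical choices for demonstration.

```python
import math
import random


def laplace_noise(scale):
    # Sample Laplace(0, scale) noise via inverse-CDF sampling.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))


def private_count(local_counts, epsilon):
    """Aggregate per-site counts and add Laplace noise calibrated to
    sensitivity 1 (adding or removing one individual changes the
    total count by at most 1), giving epsilon-differential privacy."""
    scale = 1.0 / epsilon
    return sum(local_counts) + laplace_noise(scale)


# Hypothetical example: three sites each report a local count of
# employees using a given accommodation; only the noisy total is shared.
result = private_count([10, 20, 30], epsilon=1.0)
```

At epsilon = 1.0 the noise has standard deviation roughly 1.4, so a total in the tens remains useful while no single site's raw records leave its boundary, which is the balance the abstract claims.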
Keywords:
Federated Learning; Differential Privacy; Autism Employment; Workplace Accommodations; AI Governance; NIST AI RMF
Copyright information:
Copyright © 2023 Author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.
