{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T23:38:00Z","timestamp":1761176280172,"version":"build-2065373602"},"reference-count":0,"publisher":"IOS Press","isbn-type":[{"value":"9781643686318","type":"electronic"}],"license":[{"start":{"date-parts":[[2025,10,21]],"date-time":"2025-10-21T00:00:00Z","timestamp":1761004800000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-nc\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2025,10,21]]},"abstract":"<jats:p>Federated Learning (FL) offers a decentralized approach to training large language models (LLMs), addressing critical concerns around data privacy and transmission costs. However, FL\u2019s inherent distributed training paradigm can lead to increased energy consumption and carbon emissions, especially with random or uninformed client selection. To tackle this issue, we propose SustainFed-LLM, a novel Q-learning-based client selection framework that integrates real-time renewable energy availability, carbon intensity data, and fairness considerations. By dynamically assessing client performance, spare capacity, and sustainability metrics, SustainFed-LLM optimizes client participation to minimize environmental impact while maintaining model accuracy. SustainFed-LLM significantly reduces energy consumption by up to 50%, while achieving convergence 30\u201370% faster compared to conventional selection strategies. We also analyze the communication overhead and computation cost, finding 30% fewer transmitted bytes and a 14% drop in FLOPS. The proposed framework also promotes fairer client participation, as evidenced by a reduced Gini coefficient. These findings underscore the potential of SustainFed-LLM to advance green AI, providing an effective pathway for large-scale sustainable and energy-efficient LLM training.<\/jats:p>",
"DOI":"10.3233\/faia251355","type":"book-chapter","created":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T09:59:14Z","timestamp":1761127154000},"source":"Crossref","is-referenced-by-count":0,"title":["SustainFed-LLM: Renewable Energy Aware Client Selection for Energy Efficient Federated Training of Large Language Models"],"prefix":"10.3233","author":[{"ORCID":"https:\/\/orcid.org\/0009-0005-9970-7544","authenticated-orcid":false,"given":"Sunbal","family":"Iftikhar","sequence":"first","affiliation":[{"name":"Technological University Dublin, Ireland"}]},{"ORCID":"https:\/\/orcid.org\/0009-0003-6980-8817","authenticated-orcid":false,"given":"Hassan","family":"Khan","sequence":"additional","affiliation":[{"name":"Data Science Institute, University of Galway, Galway, H91 TK33 Ireland"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3300-1152","authenticated-orcid":false,"given":"Steven","family":"Davy","sequence":"additional","affiliation":[{"name":"Technological University Dublin, Ireland"}]}],"member":"7437","container-title":["Frontiers in Artificial Intelligence and Applications","ECAI 2025"],
"original-title":[],"link":[{"URL":"https:\/\/ebooks.iospress.nl\/pdf\/doi\/10.3233\/FAIA251355","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,22]],"date-time":"2025-10-22T09:59:14Z","timestamp":1761127154000},"score":1,"resource":{"primary":{"URL":"https:\/\/ebooks.iospress.nl\/doi\/10.3233\/FAIA251355"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,10,21]]},"ISBN":["9781643686318"],"references-count":0,"URL":"https:\/\/doi.org\/10.3233\/faia251355","relation":{},"ISSN":["0922-6389","1879-8314"],"issn-type":[{"value":"0922-6389","type":"print"},{"value":"1879-8314","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,10,21]]}}}