Privacy-Preserving Probabilistic Voltage Forecasting in Local Energy Communities

Jean-François Toubeau, Fei Teng, Yi Wang, Leandro Von Krannichfeldt, Thomas Morstyn

Research output: Working paper › Preprint

Abstract

This paper presents a new privacy-preserving framework for the short-term (multi-horizon) probabilistic forecasting of nodal voltages in local energy communities. This task is becoming increasingly important for cost-effectively managing network constraints in the context of massive integration of distributed energy resources. However, traditional forecasting tasks are carried out centrally, by gathering all individual information in a single database, thus exposing private aspects of end-users. To avoid such privacy issues, this work relies on a distributed learning scheme, known as federated learning, wherein individuals' data are kept decentralized. The learning procedure is then augmented with differential privacy, which offers formal guarantees that the trained model cannot be reverse-engineered to infer sensitive local information. Moreover, the problem is framed using cross-series learning, which allows any new client joining the community to be smoothly integrated (i.e., cold-start forecasting) without being plagued by data scarcity. Results show that the proposed approach can achieve a trade-off between privacy and model performance for different architectures of deep learning networks.
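
The abstract describes the framework only at a high level; the sketch below illustrates, under explicit assumptions, how its main ingredients could fit together: a federated-averaging loop over community members, each local update clipped and perturbed with Gaussian noise for differential privacy, and a pinball (quantile) loss so the shared model produces probabilistic voltage forecasts. The linear per-quantile model, the synthetic data, and all names and hyper-parameters are illustrative placeholders, not the authors' architecture or implementation.

```python
# Hypothetical sketch of a DP-federated training loop for probabilistic
# voltage forecasting. Everything here (model, data, constants) is assumed
# for illustration only.
import numpy as np

rng = np.random.default_rng(0)

N_CLIENTS = 5        # households in the local energy community
N_FEATURES = 8       # lagged voltages, load, PV, calendar features, ...
QUANTILES = np.array([0.1, 0.5, 0.9])
CLIP_NORM = 1.0      # L2 bound on each client's update (DP sensitivity)
NOISE_STD = 0.1      # Gaussian noise scale; sets the privacy/accuracy trade-off
LR = 0.05
ROUNDS = 50

def pinball_grad(w, X, y):
    """Gradient of the average pinball loss for a linear multi-quantile model."""
    preds = X @ w                                   # (n_samples, n_quantiles)
    err = y[:, None] - preds
    # d(pinball)/d(pred): -q where err > 0, (1 - q) where err <= 0
    g = np.where(err > 0, -QUANTILES, 1.0 - QUANTILES)
    return X.T @ g / len(y)                         # (n_features, n_quantiles)

# Synthetic per-client data standing in for local smart-meter measurements.
true_w = rng.normal(size=N_FEATURES)
clients = []
for _ in range(N_CLIENTS):
    X = rng.normal(size=(200, N_FEATURES))
    y = X @ true_w + 0.3 * rng.normal(size=200)     # "nodal voltage" target
    clients.append((X, y))

w_global = np.zeros((N_FEATURES, len(QUANTILES)))
for _ in range(ROUNDS):
    updates = []
    for X, y in clients:
        # One local gradient step on the client's private data.
        w_local = w_global - LR * pinball_grad(w_global, X, y)
        delta = w_local - w_global
        # Clip the update and add Gaussian noise before sharing it, so the
        # aggregator never sees exact information derived from local data.
        norm = np.linalg.norm(delta)
        delta = delta * min(1.0, CLIP_NORM / (norm + 1e-12))
        delta += rng.normal(scale=NOISE_STD * CLIP_NORM, size=delta.shape)
        updates.append(delta)
    # Federated averaging of the privatized updates.
    w_global = w_global + np.mean(updates, axis=0)

# The shared model now maps local features to voltage quantiles (0.1, 0.5, 0.9).
print("forecast quantiles for one sample:", clients[0][0][:1] @ w_global)
```

In this toy setting, NOISE_STD and CLIP_NORM control the trade-off the abstract refers to: stronger noise yields stronger differential-privacy guarantees but degrades the sharpness of the forecast quantiles.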
Original language: Undefined/Unknown
Publisher: TechRxiv
Number of pages: 10
DOIs
Publication status: Published - 24 Jan 2022

Keywords

  • differential privacy
  • deep learning
  • federated learning
  • heterogeneous data
  • voltage forecasting
