Modified RL-Based Data-Driven Algorithm for Optimal Integrated Energy System Operation Incorporating P2P Energy Trading and DSM
Abstract
The distinct features of dairy farms shape the layout and functionality of their integrated energy systems (IES). This research introduces demand side management (DSM) for the IES of dairy farms, leveraging various energy devices and load types to reduce energy consumption. Peer-to-peer (P2P) energy trading is also considered for several reasons, including additional electricity bill savings. To implement DSM and P2P energy trading (P2P-ET), a Decision Tree Regression (DTR) model is used to forecast day-ahead PV power generation, utility grid energy prices, and the consumption of the different loads based on real-world data. However, the high uncertainty accompanying the decision variables makes the joint DSM and P2P-ET decision-making process more challenging, and the large search space created by their mixed decisions increases the computational effort. A modified multi-agent reinforcement learning (MARL) algorithm is therefore implemented for decision making, in order to handle the growing uncertainties arising from bidding actions, transaction amounts, and the forecast data on load profiles, renewable generation, and energy prices. To tackle this problem, the combined DSM and P2P-ET problem is formulated as a finite Markov decision process (FMDP). In the modified MARL, the mixed uncertainties are incorporated as additional states and action scenarios. Simulation results show that optimizing the P2P-ET and DSM strategies with the proposed MARL algorithm reduces the average daily energy cost and the average load by 23.57% and 20.73%, respectively.
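
To make the forecasting step concrete, the sketch below shows how a day-ahead DTR forecast could be set up with scikit-learn's DecisionTreeRegressor. It is only an illustration of the technique named in the abstract: the feature choices (hour of day, a lagged PV reading), the synthetic data, and the hyperparameters are assumptions, not the paper's actual inputs or model configuration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Placeholder data standing in for the real-world farm measurements:
# hour of day and a noisy lagged PV reading as inputs, next-day PV power as target.
hours = rng.integers(0, 24, size=1000)
clear_sky = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 10.0
lagged_pv = clear_sky + rng.normal(0.0, 0.5, size=1000)
X = np.column_stack([hours, lagged_pv])
y = clear_sky  # stand-in for the measured next-day PV profile

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# One DTR is fitted per forecast target (PV here); the price and load forecasts
# described in the abstract would be built analogously on their own features.
model = DecisionTreeRegressor(max_depth=6, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"MAE on held-out data: {mean_absolute_error(y_test, pred):.3f} kW")
```

The decision-making side can likewise be illustrated with a minimal sketch. The code below is a single-agent tabular Q-learning loop over a toy hourly FMDP whose actions combine a DSM load-shift level with a P2P buy/sell decision; it is not the paper's modified MARL algorithm, and the state definition, reward, price curves, and load profile are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy FMDP: state = hour of day (0..23); action = (DSM load-shift level, P2P trade decision).
HOURS = 24
DSM_LEVELS = [0.0, 0.5, 1.0]   # fraction of flexible load shifted away from the current hour
P2P_ACTIONS = [-1, 0, 1]       # sell 1 kWh to a peer, do nothing, buy 1 kWh from a peer
ACTIONS = [(d, p) for d in DSM_LEVELS for p in P2P_ACTIONS]

price = 0.10 + 0.10 * np.sin(np.arange(HOURS) / 24 * 2 * np.pi)   # illustrative grid price
p2p_price = 0.8 * price                                           # peer trades assumed cheaper
load = 5.0 + 2.0 * np.sin(np.arange(HOURS) / 24 * 2 * np.pi)      # illustrative load profile

def step_cost(hour, action):
    """Hourly cost: grid energy after DSM shifting plus the net P2P transaction."""
    dsm, p2p = action
    grid_energy = load[hour] * (1.0 - 0.3 * dsm)   # DSM trims part of the flexible load
    return grid_energy * price[hour] + p2p * p2p_price[hour]

Q = np.zeros((HOURS, len(ACTIONS)))
alpha, gamma, eps = 0.1, 0.95, 0.1

for episode in range(2000):
    for hour in range(HOURS):
        # Epsilon-greedy action selection over the joint DSM/P2P action set.
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[hour]))
        reward = -step_cost(hour, ACTIONS[a])
        next_hour = (hour + 1) % HOURS
        Q[hour, a] += alpha * (reward + gamma * np.max(Q[next_hour]) - Q[hour, a])

policy = [ACTIONS[int(np.argmax(Q[h]))] for h in range(HOURS)]
print("Hour 12 action (DSM level, P2P):", policy[12])
```

In the paper's setting, each prosumer would act as a separate agent and the forecast uncertainties would enter as additional state and action scenarios, which this single-agent toy deliberately omits.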
