The study of energy management strategies for microgrids can effectively balance energy supply and demand within the microgrid and ensure its smooth operation. In recent years, deep reinforcement learning algorithms have attracted widespread attention in the field of power system control owing to their powerful perception and decision-making abilities. However, existing research still has problems such as discretized action spaces, estimation bias, and the need for accurate state information. This paper therefore proposes a new microgrid energy management strategy based on deep reinforcement learning; furthermore, combined with source-load prediction, it constructs an energy management strategy based on state-information prediction, so as to obtain strategies with broader practical applicability. The specific research content is as follows:

(1) Among deep reinforcement learning algorithms for dynamic energy management in microgrids, the Deep Q-Network (DQN) algorithm has a discrete action space that makes fine-grained energy control difficult, while the Deep Deterministic Policy Gradient (DDPG) and Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithms suffer from estimation bias. To address this, a real-time energy scheduling strategy for microgrids based on the Softmax Deep Double Deterministic Policy Gradients (SD3) algorithm is proposed, and a Loss-Adjusted Prioritized (LAP) replay mechanism is introduced, yielding the improved algorithm SD3-LAP. A grid-connected microgrid comprising photovoltaic units, wind turbines, micro gas turbines, energy storage equipment, and power loads is established, with the objective of minimizing its operating cost while satisfying a series of constraints. The real-time energy dispatching problem of the microgrid is then recast as a Markov decision process, and the proposed method is verified through environment simulation. The experimental results show that, compared with the relevant baseline models, the energy scheduling strategy of the SD3-LAP
algorithm is more effective and robust, and that the LAP mechanism is broadly applicable: it can effectively improve the performance of energy scheduling strategies built on related reinforcement learning algorithms.

(2) Most existing microgrid energy management strategies assume an idealized environment in which the source-load power state information is known, which is not the case in practice. A deep reinforcement learning dynamic energy management strategy that incorporates source-load power state prediction is therefore proposed to enhance practical applicability. First, the hyperparameters of the source-load power prediction model are optimized using the Sparrow Search Algorithm (SSA) and an improved variant of it. Next, the resulting prediction model supplies source-load state forecasts as input to the SD3-LAP algorithm, enhancing its ability to handle unknown microgrid environments. Finally, validation is carried out on the microgrid built earlier. The experimental results show that the time-series model obtained with the improved algorithm outperforms the relevant baseline prediction models, and that the higher the prediction accuracy of the time-series model, the more reasonable the scheduling decisions produced by the SD3-LAP algorithm.

In summary, this paper constructs a microgrid model and proposes an energy management strategy based on the SD3-LAP algorithm, addressing known shortcomings of deep reinforcement learning algorithms for dynamic microgrid energy management. Considering that source-load information is unknown in real environments, it further proposes a dynamic energy management strategy that combines deep reinforcement learning with source-load power state prediction, and develops a more accurate time-series prediction model for it. Detailed experiments on the constructed microgrid model show that the proposed algorithm is effective and feasible in practice.
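Although the abstract only names the techniques, the two ideas in part (1) can be illustrated concretely. Below is a minimal NumPy sketch, not the thesis's implementation: (a) the softmax operator that SD3 applies to sampled Q-values, which smoothly interpolates between their mean and their maximum and thereby mitigates the over/under-estimation bias of DDPG/TD3; and (b) a LAP-style priority that clips small TD errors to a minimum priority so low-error transitions are still replayed. The function names and the `beta`/`alpha` values are illustrative assumptions.

```python
import numpy as np

def softmax_value(q_values, beta=0.05):
    # Softmax operator used by SD3: for beta -> 0 this approaches the
    # mean of the sampled Q-values, for beta -> inf it approaches the
    # max, giving a tunable trade-off between over- and under-estimation.
    z = beta * (q_values - q_values.max())  # shift for numerical stability
    w = np.exp(z)
    return float((w * q_values).sum() / w.sum())

def lap_priority(td_errors, alpha=0.4, min_priority=1.0):
    # LAP-style replay priority (simplified): clip |TD error| from below
    # so transitions with small errors keep a nonzero replay probability.
    return np.maximum(np.abs(td_errors), min_priority) ** alpha

q = np.array([1.0, 2.0, 3.0])
v = softmax_value(q, beta=0.05)  # lies strictly between mean(q) and max(q)
```

In a full agent, `softmax_value` would be applied to Q-values of actions sampled around the target policy when forming the Bellman target, and `lap_priority` would weight transitions in the replay buffer.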
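The SSA-based hyperparameter tuning in part (2) can likewise be sketched. The following is a simplified sparrow search in which producers explore and scroungers move toward the current best individual; the scout (danger-awareness) step and the thesis's improved variant are omitted, and the objective function is a stand-in for a prediction model's validation error, so every name and constant here is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(params):
    # Hypothetical stand-in for validation error of a time-series model,
    # with an assumed optimum at learning_rate=0.1, hidden_units=64.
    lr, hidden = params
    return (lr - 0.1) ** 2 + ((hidden - 64.0) / 64.0) ** 2

def ssa_search(obj, bounds, pop=20, iters=50, producer_frac=0.2, st=0.8):
    """Simplified Sparrow Search Algorithm (SSA) sketch."""
    lo, hi = bounds[:, 0], bounds[:, 1]
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    n_prod = max(1, int(producer_frac * pop))
    gbest, gbest_f = X[0].copy(), obj(X[0])
    for _ in range(iters):
        f = np.array([obj(x) for x in X])
        order = np.argsort(f)
        X = X[order]
        if f[order[0]] < gbest_f:                 # elitist tracking
            gbest, gbest_f = X[0].copy(), f[order[0]]
        best = X[0].copy()
        for i in range(n_prod):                   # producers explore
            if rng.random() < st:
                X[i] = X[i] * np.exp(-i / (rng.random() * iters + 1e-9))
            else:
                X[i] = X[i] + rng.normal(size=len(lo))
        for i in range(n_prod, pop):              # scroungers follow the best
            X[i] = best + np.abs(X[i] - best) * rng.normal(size=len(lo))
        X = np.clip(X, lo, hi)                    # respect search bounds
    return gbest, gbest_f

bounds = np.array([[0.001, 1.0], [8.0, 256.0]])   # learning rate, hidden units
best_params, best_f = ssa_search(loss, bounds)
```

In the setting described by the abstract, `loss` would be replaced by training the source-load prediction model with the candidate hyperparameters and returning its validation error.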