  • PES Members: Free
  • IEEE Members: $25.00
  • Non-members: $40.00
  • Length: 00:26:48
Panel 12 Sep 2022

This panel session video contains the following presentations:
1. Real-Time Simulation-Based Energy Management of Airport Microgrid for Electric Aircraft
This paper presents a future airport DC microgrid for electric aircraft together with an energy management strategy and a power control method, in which an upper control layer computes a robust-optimization-based energy management strategy for optimal scheduling decisions and a lower control layer handles transient power control of the electrical load. The scheme achieves optimal power scheduling while minimizing the DC microgrid's operating cost and harmful gas emissions. The power control layer provides sufficient voltage and power response speed for the airport electrical load. The proposed control strategy applies to electric aircraft airport DC microgrids that contain photovoltaic and other renewable generation units and energy storage systems, and supports interaction with the public power grid. Based on the Typhoon HIL 604 real-time simulator, a real-time simulation model of the airport microgrid, including photovoltaic, fuel cell, electrical energy storage and other renewable units as well as the airport power load, was developed. The RT-LAB 5700 platform was used to implement the microgrid power controller and solve the energy management optimization strategy. Finally, a real-time simulation platform of the airport microgrid for electric aircraft was constructed to verify and evaluate the proposed energy management strategy.
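
A minimal sketch of the two-layer structure described in this abstract: an upper scheduling layer that plans storage dispatch against a worst-case (robust) PV forecast, and a lower droop-style layer that corrects the power command around the DC bus voltage. All names, numbers, and the simple heuristics are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical two-layer airport DC microgrid controller (illustrative only).
PV_FORECAST = [0.0, 50.0, 120.0, 90.0, 10.0]   # kW per hour (assumed)
PV_UNCERTAINTY = 0.2                            # +/-20% forecast error band
LOAD = [80.0, 100.0, 140.0, 160.0, 90.0]        # airport load, kW (assumed)
PRICE = [0.06, 0.10, 0.15, 0.18, 0.08]          # $/kWh grid tariff (assumed)
ESS_POWER_MAX = 60.0                            # kW charge/discharge limit
ESS_ENERGY_MAX = 200.0                          # kWh capacity

def upper_layer_schedule(soc_kwh):
    """Hour-ahead robust schedule: assume worst-case PV (lower bound) and
    discharge the storage in the most expensive hours first."""
    pv_worst = [p * (1.0 - PV_UNCERTAINTY) for p in PV_FORECAST]
    net_load = [l - p for l, p in zip(LOAD, pv_worst)]   # grid need without ESS
    schedule = [0.0] * len(net_load)                     # kW discharge per hour
    for h in sorted(range(len(net_load)), key=lambda h: -PRICE[h]):
        p = min(ESS_POWER_MAX, max(0.0, net_load[h]), soc_kwh)
        schedule[h] = p
        soc_kwh -= p                                     # 1-hour steps: kW == kWh
    return schedule

def lower_layer_control(v_bus, p_setpoint, v_nom=750.0, droop=0.5):
    """Fast power/voltage layer: droop correction around the scheduled
    setpoint keeps the DC bus near its nominal voltage."""
    return p_setpoint + droop * (v_nom - v_bus)

if __name__ == "__main__":
    plan = upper_layer_schedule(soc_kwh=150.0)
    print("ESS discharge plan (kW):", plan)
    print("hour-2 power command at 745 V:", lower_layer_control(745.0, plan[2]))
```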

2. Online Scheduling of PV and Energy Storage System Based on Deep Reinforcement Learning
The photovoltaic and energy storage system (PV-ES), as a typical microgrid, is increasingly becoming an important component of the smart grid. Through effective energy storage management, a PV-ES can realize economic benefits and provide power support for distribution networks (DN). However, the intermittency of PV power and the high fluctuation of load power bring great uncertainty to the optimal scheduling of a PV-ES. The optimality of traditional optimization methods for online scheduling can be insufficient because of modeling complexity, strong conservatism, and heavy reliance on prediction accuracy. To address these issues, this paper proposes an online energy storage scheduling algorithm for PV-ES based on deep reinforcement learning (DRL). Taking the system's economic cost as the operation objective, the DRL agent uses high-dimensional PV-ES measurement data to directly drive continuous-action control of the energy storage and can flexibly adapt to uncertain environments. Test results on a typical PV-ES show the superiority and effectiveness of the method.
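
A small sketch of the decision problem this abstract describes, cast as a Markov decision process: the state is recent measurements (PV, load, price, state of charge), the action is a continuous storage charge/discharge power, and the reward is the negative electricity cost. The environment, numbers, and the random placeholder policy are assumptions; the paper trains a DRL agent in place of that placeholder.

```python
import random

ESS_POWER_MAX = 50.0     # kW (assumed)
ESS_ENERGY_MAX = 100.0   # kWh (assumed)
STEP_HOURS = 1.0

class PvEsEnv:
    """Illustrative PV-ES scheduling environment (not the paper's code)."""
    def __init__(self, pv, load, price):
        self.pv, self.load, self.price = pv, load, price
        self.t, self.soc = 0, 0.5 * ESS_ENERGY_MAX

    def state(self):
        # Measurements a DRL actor would observe at time t.
        return (self.pv[self.t], self.load[self.t], self.price[self.t], self.soc)

    def step(self, action_kw):
        # Clip the continuous action to the storage's power and energy limits.
        p = max(-ESS_POWER_MAX, min(ESS_POWER_MAX, action_kw))   # +discharge
        p = min(p, self.soc / STEP_HOURS)                        # no over-discharge
        p = max(p, -(ESS_ENERGY_MAX - self.soc) / STEP_HOURS)    # no over-charge
        self.soc -= p * STEP_HOURS
        grid_kw = max(0.0, self.load[self.t] - self.pv[self.t] - p)
        reward = -grid_kw * STEP_HOURS * self.price[self.t]      # negative cost
        self.t += 1
        return reward, self.t >= len(self.load)

if __name__ == "__main__":
    env = PvEsEnv(pv=[0, 40, 90, 60, 5], load=[70, 80, 120, 130, 75],
                  price=[0.06, 0.10, 0.15, 0.18, 0.08])
    total, done = 0.0, False
    while not done:
        s = env.state()                      # a trained DRL actor would map s to an action
        action = random.uniform(-ESS_POWER_MAX, ESS_POWER_MAX)   # placeholder policy
        r, done = env.step(action)
        total += r
    print("episode cost ($):", -round(total, 2))
```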

Chairs:
Dr Zaipatimah Ali
