Real-time Emulation of the Communication Layer for Microgrids using Hardware-in-the-Loop (HIL) Simulations

Z. Cai, M. Yu (FAMU-FSU College of Engineering)
M. Steurer, M. Sloderbeck, and K. Schoder (FSU Center for Advanced Power Systems)

Presenter & contact: Michael Mischa Steurer
Email: steurer@caps.fsu.edu, phone: 850-644-1629

This work was supported by the ERC Program of the National Science Foundation under Award Number EEC-08212121.

Microgrid RODEO Summit, Feb 20, 2014, Austin, TX
FSU Center for Advanced Power Systems

- Established at Florida State University in 2000 under a grant from the Office of Naval Research
- Focusing on research and education related to the application of new technologies to electric power systems
- Organized under the FSU VP for Research; strongly affiliated with the FAMU-FSU College of Engineering
- Member of ONR's Electric Ship R&D Consortium (ESRDC)
- ~$6 million annual research funding from ONR, DOE, and industry
- DOD-cleared facility at the Secret level

Research groups:
- Electric Power Systems
- Advanced Modeling and Simulation
- Advanced Control Systems
- Power Electronics Integration and Controls
- Thermal Management
- High Temperature Superconductivity
- Electrical Insulation/Dielectrics

Staffing:
- 25 full-time staff scientists, engineers, technicians, post-docs, and supporting personnel
- 7 FAMU-FSU College of Engineering faculty
- 45 students

Facility:
- 44,000 ft² of laboratories and offices, located in Innovation Park, Tallahassee
- Over $35 million in specialized power and energy capabilities funded by ONR and DOE
- Unique: 5 MW integrated hardware-in-the-loop (HIL) real-time simulation lab
Role of HIL Simulation throughout Technology Development

Power Hardware-in-the-Loop = PHIL; Controller Hardware-in-the-Loop = CHIL

- Modeling and simulation dominates the entire process
- CHIL contributes heavily from proof of concept through PHIL testing:
  - De-risks early development of the hardware (fast) controller and the application (slow) controller
  - De-risks PHIL experiments
- PHIL supports the model building and integration phases:
  - Experimental data for model construction and validation
  - Stimulation of the component through controlled transients
  - Integration testing through emulation of the target environment(s)

[Figure: relative effort over time, from proof of concept through development and model building to integration testing, for modeling and simulation, CHIL, PHIL, and limited hardware-only lab and in-situ testing]
FSU-CAPS Integrated HIL Facilities

- Integrated 5 MW hardware-in-the-loop (HIL) testbed:
  - 5 MW variable voltage / variable frequency converter: 4.16 kVac, 1.1 kVdc
  - 5 MW dynamometers: 225/450, 1,800/3,600; -12,000/24,000 RPM
  - 5 MW MVDC converters: 6/12/24 kV
- Real-time digital simulators (RTDS, OPAL-RT):
  - <2 µs time step in real time
  - Cyber-physical system simulation in real time
- Superconductivity and cryogenics:
  - AC loss and quench stability lab
  - Cryo-dielectrics high voltage lab
  - Cryo-cooled systems lab
- Low power and smart grid labs
FREEDM: Future Renewable Electric Energy Delivery and Management

- FREEDM system vision
- Energy Cell = Load ± Storage − Generation
The local power node
Distributed Grid Intelligence (DGI) in the FREEDM System

- Each SST has a corresponding DGI node (i.e., process) which coordinates with its peers
  - The DGI node commands the SST power level, voltage control, etc.
  - Surrogate for commanding local loads and sources on the LV side of SSTs
- DGI nodes negotiate desired ("optimal") system control settings (e.g., power levels) between all SSTs
  - They organize themselves into groups
  - They incrementally converge to the desired outcome
- The DGI agent-based operating system was developed by Dr. Bruce McMillin (MS&T)
  - Available for application control development
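The incremental convergence described above can be illustrated with a simple gossip-style averaging scheme. This is only a sketch of the general idea, not the actual DGI algorithm: each hypothetical node repeatedly nudges its power set point toward the average of its ring neighbors, so the group converges to a balanced level while the total commanded power is conserved.

```python
# Illustrative sketch only (not the actual DGI code): gossip-style averaging
# consensus, the kind of incremental negotiation by which distributed nodes
# can converge on balanced power set points.

def consensus_step(levels, alpha=0.5):
    """One round: each node moves toward the mean of its two ring neighbors."""
    n = len(levels)
    return [
        levels[i] + alpha * ((levels[(i - 1) % n] + levels[(i + 1) % n]) / 2 - levels[i])
        for i in range(n)
    ]

def negotiate(levels, rounds=50):
    """Iterate rounds until the power levels converge toward the group mean."""
    for _ in range(rounds):
        levels = consensus_step(levels)
    return levels

# Example: six nodes with unequal initial power commands (kW)
initial = [10.0, 40.0, 25.0, 5.0, 60.0, 10.0]
final = negotiate(initial)
# All nodes approach the mean (150/6 = 25 kW); the total power is conserved.
```

The neighbor-averaging update conserves the sum of all set points at every round, which mirrors the physical constraint that negotiated power commands must still balance total load.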
The FREEDM HIL Test Bed

- Provides a testing platform for the interaction between:
  - Power system
  - Power components (SSTs)
  - Local sources, loads, and storage (LSLS)
  - DGI processes
  - Communication network
- Allows integration of actual control, protection, and power devices
- Needs real-time capability

[Diagram: DGI nodes, SSTs, and LSLS units interconnected through the communication network and the grid]
Why Real-Time?

[Figure: timelines contrasting non-real-time and real-time execution; in the non-real-time case, inter-DGI data communication and DGI processing time are not simulated against the power simulation time]

Non-real-time testing may lead to false timing results.
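The false-timing hazard can be made concrete with a toy example. The deadline and processing-time values below are hypothetical, chosen only to illustrate the point: in an offline co-simulation, simulated time is frozen while the controller computes, so a controller that is actually too slow can still appear to meet its deadline.

```python
# Toy illustration (hypothetical numbers): why non-real-time testing can
# report that a timing deadline is met when in reality it is missed.

DEADLINE = 0.050  # s, deadline the power system imposes on the controller

def offline_deadline_met(proc_time):
    """Offline simulation: simulated time is paused while the controller
    computes, so processing appears to consume zero simulated time."""
    simulated_elapsed = 0.0
    return simulated_elapsed <= DEADLINE  # always True: a false result

def realtime_deadline_met(proc_time):
    """Real-time execution: wall-clock processing counts against the deadline."""
    return proc_time <= DEADLINE

# A controller that actually needs 100 ms:
# offline_deadline_met(0.100)  -> True  (misleading)
# realtime_deadline_met(0.100) -> False (the deadline is really missed)
```

This is exactly the discrepancy the slide's timeline figure depicts: only real-time execution exposes that communication and processing delays overlap the power simulation's own clock.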
Cyber-Physical RT Simulation

- RT simulation of controls and communications: distributed controls (DC) computing layer
- Fast data links between the DC layer and power components: RT simulator specific
- RT simulation of the electric power system
The Real-Time SITL (System-in-the-Loop) Module of OPNET

[Diagram: OPNET Modeler on the simulation host; packet filters and the System-in-the-Loop interface connect the simulated network through an API utility and Ethernet adapter to the real network]

- Simulation host: powerful workstation
- The API (WinPcap) facilitates the task of moving packets from the physical Ethernet adapter to the SITL interface in OPNET
- Real network: interfaces with the control agent hosts
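Conceptually, the SITL module acts as a filtered bridge between captured real-world frames and the simulated network. The sketch below illustrates only that bridging idea; the class and field names are hypothetical, and the real module uses WinPcap and OPNET's proprietary SITL API rather than anything shown here.

```python
# Conceptual sketch of the SITL bridging idea (hypothetical names; the real
# module uses WinPcap to capture frames from the physical Ethernet adapter
# and OPNET's SITL API to inject them into the simulation).
import queue

class SitlBridge:
    """Moves packets from a captured-frame source into the simulated network."""

    def __init__(self, packet_filter):
        self.packet_filter = packet_filter  # drops frames not meant for the sim
        self.to_sim = queue.Queue()

    def on_real_frame(self, frame):
        """Called for each frame captured on the physical adapter."""
        if self.packet_filter(frame):
            self.to_sim.put(frame)          # hand off to the simulated network

    def sim_receive(self):
        """Simulation side pulls the next filtered real-world frame."""
        return self.to_sim.get_nowait()

# Example: forward only frames addressed to a (hypothetical) simulated subnet
bridge = SitlBridge(lambda f: f["dst"].startswith("10.0."))
bridge.on_real_frame({"dst": "10.0.0.5", "payload": b"dgi-msg"})
bridge.on_real_frame({"dst": "192.168.1.9", "payload": b"other"})
# bridge.sim_receive() returns the 10.0.0.5 frame; the other was filtered out
```

The packet filters shown in the slide's diagram play the role of `packet_filter` here: they keep unrelated traffic on the real network from entering, and slowing down, the real-time simulation.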
OPNET Characterization

Added an artificial 1 ms delay via OPNET.
Impact of DGI Traffic on OPNET Performance

[Figure: OPNET performance with DGI communicating via OPNET in real time, comparing the baseline without DGI, 6 DGIs with timing pattern A (Mamba), and 6 DGIs with timing pattern B (TS-7800 ARM)]
DGI Real-Time Performance Examples

- Makes use of Formed Group and State Collection (version 1.4.0)
- Simple load balancing algorithm
- Timing pattern A (Mamba): GM 1.8 s, SC 410 ms, LB 250 ms, 310 ms/migration
- Timing pattern B (TS-7800 ARM): GM 13 s, SC 2.3 s, LB 1.3 s, 1.6 s/migration
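A quick back-of-envelope helper shows how these phase timings combine. The `cycle_time` function below is a hypothetical aid, not part of the DGI: it sums the pattern-A group management (GM), state collection (SC), and load balancing (LB) phase times from the slide, plus the per-migration cost.

```python
# Back-of-envelope combination of the pattern-A (Mamba) timings from the
# slide. The helper function itself is hypothetical, for illustration only.

GM = 1.8        # s, group management
SC = 0.410      # s, state collection
LB = 0.250      # s, load balancing phase
PER_MIGRATION = 0.310  # s per load migration

def cycle_time(migrations):
    """Wall-clock time for one GM + SC + LB cycle with N load migrations."""
    return GM + SC + LB + migrations * PER_MIGRATION

# e.g. a cycle with 4 migrations:
# cycle_time(4) = 1.8 + 0.410 + 0.250 + 4 * 0.310 = 3.70 s
```

Running the same arithmetic with the pattern-B (TS-7800 ARM) figures shows why the embedded boards are roughly an order of magnitude slower per control cycle.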
FREEDM Center-Wide Login to the HIL Testbed

- Access to the HIL testbed from any location with Internet access
- Developers can more efficiently diagnose issues
- Used regularly by MS&T's DGI team to gather the real-time data necessary for timing configurations
- Test scenarios can be performed and validated by a remote team using the testbed at FSU
- This feature has resulted in substantially improved cross-campus collaboration

[Diagram: remote user connecting over the Internet through a VPN-enabled firewall to the testbed]
Concluding Remarks

- Micro-grid R&D needs fully integrated cyber-physical test beds
- FSU-CAPS has established a real-time HIL test bed for micro-grid R&D which includes:
  - Power system models on RTDS
  - Communication models on OPNET
  - Distributed control processes on x86 ("Mamba") and embedded ARM boards
  - Currently supports up to 24 control nodes
- This test bed is available via remote access

[Photo: the FSU-CAPS micro-grid R&D hardware-in-the-loop cyber-physical system test bed, with Rockwell Automation controller, 6 x86 "Mamba" nodes, RTDS, and 6 ARM boards]