2025 - AVEVA World - San Francisco - Power & Utilities
NREL: AVEVA PI System and AI. Advances in future grid control rooms.
Join NREL's experts as they unveil the cutting-edge capabilities of eGridGPT, a fine-tuned Generative AI model designed for on-premise use in grid control rooms. This presentation will demonstrate how eGridGPT can seamlessly integrate with AVEVA PI System to offer operators, engineers, and corporate users enhanced guidance and decision support. Discover how this innovative AI solution can improve state estimation, boost variable energy forecasting, and optimize grid operations. By leveraging eGridGPT's unique features, attendees will learn to unlock new levels of automation, predictive analytics, and reliability within their power systems, ultimately leading to reduced downtime and improved operational efficiency.
Industry
Power and Utilities
Company
NREL
Speaker
Seong Choi
Mr. Seong Lok Choi is responsible for the development of NREL's eGridGPT for real-time grid operation in both EMS and ADMS, digital twin technology, and cybersecurity, shaping the control room of the future. Prior to NREL, Mr. Choi was an EMS support engineer and software architect with the Western Electricity Coordinating Council (WECC) and Peak Reliability Coordinator (PeakRC), and he was a software developer with Ameren. Mr. Choi is currently working toward a doctoral degree at the University of Michigan and holds a graduate degree from Washington University in St. Louis, Missouri.
Session Code
SESS-3
Transcript
My name is Seong Choi, like a singer's song.
I have worked at the National Renewable Energy Laboratory for five years, and if you don't know, NREL is one of the seventeen national laboratories under the Department of Energy.
NREL focuses on renewable energy, with three thousand seven hundred researchers in Denver, Colorado.
My job at NREL is to reconstruct the utility control room.
My research focus is on operators' decision making, to make the grid reliable, stable, affordable, and resilient.
Before joining NREL, I worked for seven years in the control room at Peak Reliability, monitoring the Western Interconnection, from the energy management system to the PI historian.
From this session, I want you to take three things away. One, AI understands your language. Two, the digital twin in this session is all about automation.
Three, advanced displays. We need to reduce the number of displays. How do we do that? That's what we are going to talk about today.
As you see, my English is not native.
I had to learn English in Korea before I came here to attend university.
Back then, my English tutor was Kramer from Seinfeld.
I love the comedy.
There was one phrase I could not understand. Someone would say, Kramer, stay out of this.
Kramer would answer, and people would laugh, but I didn't know what it meant.
I checked the English-Korean dictionary. It wasn't there.
After I came to the United States, I learned what it meant.
So learning English is very tough. Speaking is not that difficult; if you go to a foreign country and don't speak the language, you can use gestures, right?
But writing was very, very difficult.
I worked in the utility industry for twenty years, and I wrote code. I didn't need to write English; I just wrote code.
But when I joined NREL, a national lab, they said you need to write papers, you need to write proposals.
This was unbelievable to me. The writing was very, very tough.
And as you know, two years ago, ChatGPT came out.
It saved my writing.
Thanks, ChatGPT.
Twenty dollars a month, right? So one day I asked ChatGPT: hey, here is a web page about generating PI tag code. Read this, and can you write code against the PI historian to create a PI tag from that web page?
As you know, it wrote the code well, right? So did OpenAI actually hire PI developers to train this?
No, right? So I had questions about how they did it. That's how I started getting into AI.
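The transcript doesn't show the generated code, but a sketch of what such a create-a-PI-tag request might look like is below. The server URL and data-server WebId are made up; the endpoint shape (POST to `dataservers/{webId}/points`) follows the PI Web API documentation, but verify it against your own server before using it.

```python
import json

# Hypothetical sketch of code that creates a PI tag via the PI Web API.
# The base URL and WebId here are invented placeholders.

def build_create_point_request(base_url: str, dataserver_webid: str, tag_name: str):
    """Return the (url, json_body) pair for a create-point request."""
    url = f"{base_url}/piwebapi/dataservers/{dataserver_webid}/points"
    body = {
        "Name": tag_name,
        "Descriptor": "Created for the eGridGPT demo",
        "PointClass": "classic",
        "PointType": "Float32",
    }
    return url, json.dumps(body)

# To actually send it, POST the body with your HTTP client of choice
# and the credentials your PI Web API server expects.
url, body = build_create_point_request(
    "https://pi-server.example.com", "F1DS-EXAMPLE-WEBID", "WIND.TOTAL.MW"
)
```
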
Honestly, we didn't have any funding. Nobody wanted to give us funding. Back then, I had only one thousand PI tags. That was all I had to play with PI and AI together.
I begged for ten thousand dollars to get a GPU and spent time learning how an LLM, a large language model, works.
I followed a tutorial, and voila, this is the video I came up with.
Let me explain. In the future, I see the operator and a humanoid robot talking to each other, like Casper said this morning: collaboration.
The first question I asked is: what is the wind output generation now? I'm talking real time, not past history. And it came up with the answer.
It can use PI AF, through the PI Web API, to get the value. The next question is: is the system operating normally or not?
I don't think PI knows the definition of normal. Unless someone writes the equations, and many equations go into it, it cannot understand that.
This is where AI comes in.
Okay?
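A sketch of how the "what is the wind output now?" answer could be fetched: PI Web API exposes the latest recorded value of a stream at `GET /piwebapi/streams/{webId}/end`. The WebIds and the canned responses below are invented for illustration; a real client would look up the WebIds by tag name and make live HTTP calls.

```python
# Sketch: fetch current wind output from PI Web API end-of-stream values.
# No network calls here; canned responses stand in for real /end payloads.

def latest_value_url(base_url: str, stream_webid: str) -> str:
    """URL for the current (end-of-stream) value of one PI point."""
    return f"{base_url}/piwebapi/streams/{stream_webid}/end"

def total_wind_mw(end_responses: list[dict]) -> float:
    """Sum the 'Value' fields of several /end responses into total wind MW."""
    return sum(r["Value"] for r in end_responses)

# Example responses shaped like PI Web API /end payloads (invented values):
responses = [
    {"Timestamp": "2025-04-01T12:00:00Z", "Value": 150.0, "Good": True},
    {"Timestamp": "2025-04-01T12:00:00Z", "Value": 262.5, "Good": True},
]
total = total_wind_mw(responses)  # 412.5
```
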
Some of you are wondering how I did it, right? It's not just you.
Utilities and vendors are coming to NREL and asking how we did it. Today, the how would take too much time. There is a QR code down there; read it, and you'll probably figure it out. But my recommendation is: don't spend time on how, spend time on why and what.
So today's discussion is what and why.
The reason I don't want to spend time on how is, first, that the progress of AI is so fast that I cannot sleep on it. I turn on YouTube and listen to how they do it.
I cannot follow the progress.
The demo you just saw, I came up with last October. Last month, Google Gemini came out with the same thing. So you think about what would be the best display for operators or engineers.
Several months later, somebody will come out with it. I spent lines and lines of code to come up with one way to get the information, right?
Now it's just a few lines. So I don't want to spend time on that. Vendors are asking, should I put AI into my software? And I told them no.
You need to modularize. You don't know what will happen next.
Utilities are asking, hey, do I have to have my own model?
I'm sorry; because we are under NERC compliance, maybe I should say yes. But think about the AI models you hear about in the news: to come up with an AI model, you need to spend billions of dollars, right? One hundred thousand GPUs at thirty thousand dollars each.
That's too much.
So don't spend too much, but do get started. Become familiar with it.
How? By working with NREL; I can help you start your own AI models.
Now, in the past, utilities got media attention mainly because of blackouts or rate cases. Not this time. This time it is data centers and AI.
From the utility perspective, data center interconnection requests are forcing utilities to rethink grid modernization.
How to upgrade transmission, how to add generation.
On the other hand, AI is getting attention because of data: the volume and speed of it.
And we heard this morning: baby boomers are retiring, utilities need to reduce O&M costs, and they have to manage the supply chain.
So why is AI different this time? AI has been in the field for more than fifty years.
From nineteen eighty to nineteen ninety, it was all about human supervision.
Meaning, in the past, engineers wrote the software, like the programmable logic controller (PLC), the DCS, or SCADA. Engineers had to write something because the machine doesn't understand what operators are saying.
So engineers had to come up with the logic: if the input is this, then the output is x, y, z. Engineers were basically the translators between the machine and the operator.
But now ChatGPT, the large language model, has basically reduced that supervision.
We need collaboration. That's where the engineers and the operators come in. But to me, the era when engineers had to spend most of their time translating is almost over.
AI writes the software for you. This is a paradigm shift.
Because of this, utilities are now thinking: I have this vision, I have this goal; what technology is out there to meet it?
So in this session, I will talk about two things: AI and the digital twin.
Robots and drones are another big topic.
But if you at least understand what AI and the digital twin mean, that's what I want you to take from this session.
Before I go into detail, I want to explain how the control room evolved. Back when Thomas Edison invented the light bulb, there were no computers. As you see on the screen, one operator is on the phone, and another operator is calculating how many megawatts are needed to bring up the frequency. Right?
In the nineteen seventies, when energy management systems and all those mainframe servers came along, now you have a calculator.
So in the control room, now there is only one operator.
And all those systems are supporting the operator.
Then things changed.
As you heard from session to session, it's about weather issues, so much data, and renewable energy coming in.
Operators are dealing with an unpredictable, uncontrollable, unmanageable grid.
So to me, operators are playing tug of war between generation and demand.
How to balance them, and how to flow the power through the transmission and distribution lines without violating the limits. Right?
So their job is reliability: making sure the electricity is there twenty-four hours a day, keeping frequency and voltage within limits, and dispatching the cheapest generation.
So to address these operational challenges, with the baby boomers retiring and the data growing so fast, we need to help operators.
So to help them, we need to understand how they make a decision.
On normal days, it is basically schedule and dispatch.
But if something happens, then they need to go through what we call recognition-primed decision making.
Gary Klein developed it. From the situation, they recognize what it could lead to and what they need to run. That's how they move into emergency operation and handle it.
And then they return to normal operation. So this decision-making process is where operators need help the most.
So in the control room, they are monitoring SCADA displays and EMS displays, and then alarms come in, or the contingency analysis spits out results to the operator. Then they need to bring up very similar cases and run a study.
Based on the study, they find the mitigation options, and then issue the dispatch or command.
So to me, going from the situation to the patterns is where the AI is.
Then running the simulation and coming up with the mitigation, that's where the digital twin is.
And once we have all those, then you have to come up with a good display that operators will focus on.
What we are hearing from session to session is that operators are clicking through displays one by one to get the right information. Why don't we use AI to do it?
So you heard about the digital twin. They mentioned design and build. But in my case, the digital twin is about automating scenarios, at the same time, in real time.
It means that when you run into an emergency operation, you have to check the forecast and the outages, and you need to add weather scenarios.
If you have one or two scenarios, then you can run the study.
But what if there are one thousand scenarios, and you need the answer within thirty minutes? Can you do it? That's where the digital twin's value is.
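The thousand-scenario idea can be sketched as a fan-out/fan-in pattern. The "power flow" below is a one-line stand-in (flow scales linearly with load), not a real solver, and all the numbers are invented; the parallel structure is the same whether the worker calls a toy formula or a full digital-twin simulation.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy model: one line with a 100 MW limit and an 80 MW base flow.
# Each scenario scales load; a real worker would run a power flow study.
LINE_LIMIT_MW = 100
BASE_FLOW_MW = 80

def run_scenario(load_percent: int) -> bool:
    """Return True if this load level violates the line limit."""
    flow = BASE_FLOW_MW * load_percent / 100
    return flow > LINE_LIMIT_MW

scenarios = range(50, 150)  # load levels from 50% to 149%
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_scenario, scenarios))

violations = sum(results)  # scenarios an operator would need to mitigate
```

With this toy model, every scenario above 125% load violates the limit, so 24 of the 100 scenarios get flagged; swapping the worker for a real simulation call is what turns this sketch into the digital twin's batch study.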
So let's see this demo.
So I gave an image to eGridGPT and asked: hey, what does this diagram mean? eGridGPT was able to detect: oh yeah, this is the IEEE 13-bus test circuit from OpenDSS.
Then I asked a second question: can I close circuit breaker one?
And it says: wait, I need to run my power flow simulation. The result comes back, it interprets it: oh, I don't think you can close it.
So what this means is that the digital twin is able to grab the current case and run it: what if this circuit breaker closes, and what do I need to do? It knows where the data is. It knows how to run the simulation.
If you have AI and the digital twin together, then this becomes an AI orchestrator with AI agents.
You may hear "AI agent" defined differently, but the way I see it, an AI agent must know what it is doing and listen to what the orchestrator is telling it.
Currently, your tools are just input and output.
But who understands the input and the output? We need to use AI. The AI orchestrator reads the output, and based on the output, it talks to the agents and brings the results back. That's the concept: an AI orchestrator and agents.
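The orchestrator/agent idea can be sketched as below. In a real system an LLM would do the routing and interpretation; here simple keyword matching stands in for it, and both agents return canned data instead of calling PI or a power flow solver.

```python
# Minimal sketch of an AI orchestrator routing questions to tool "agents",
# running them, and turning their raw output into an operator-facing answer.

def wind_output_agent() -> dict:
    # Stand-in for a PI historian query; values are invented.
    return {"tag": "WIND.TOTAL.MW", "value": 412.5, "units": "MW"}

def breaker_agent(breaker: str) -> dict:
    # Stand-in for a digital-twin power flow: pretend closing CB1 is unsafe.
    return {"breaker": breaker, "safe_to_close": breaker != "CB1"}

def orchestrate(question: str) -> str:
    """Route a question to an agent and interpret the agent's output."""
    q = question.lower()
    if "wind" in q:
        r = wind_output_agent()
        return f"Current wind output is {r['value']} {r['units']}."
    if "breaker" in q or "close" in q:
        r = breaker_agent("CB1")
        verdict = "can" if r["safe_to_close"] else "cannot"
        return f"Simulation says you {verdict} close {r['breaker']}."
    return "No agent available for that question."
```

The point of the pattern is the separation: agents know how to do one thing (query data, run a study), and the orchestrator understands the question, picks the agent, and reads the output back to the operator.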
Whenever I had some really cool idea for bringing a display or tool to an operator, the operator would say: son, look at how many monitors I have. Yes, you have eight monitors.
How many displays? Four displays each. So I am looking at thirty-two displays, and you are bringing one more? Don't waste my time.
Reduce the displays. That was the key for me. The operator shouldn't need to know where the data is, and that's where AVEVA PI Connect is heading. I think that's the right direction.
But I'm going one step further: bring all those displays together, regenerate them with AI, and give the right information to the operator.
So this is just a proof of concept. Think of it this way: there was a big event, the total solar eclipse we had last year.
If totality passes over, what generation dispatch do you recommend?
You will see the numbers changing, the eclipse crossing toward the eastern side, and on the right side you see the generation changing.
Same concept. Why do they have to click the buttons, right?
Can AI create the displays?
So, three things. One, AI understands what you are saying.
Two, come up with a digital twin. It's really easy if you have the IT staff and the engineers; building a digital twin in the control room doesn't take a long time.
Three, advanced displays. How do we reduce the number of displays? I have to manage ten thousand displays.
Nobody will read them all, right? So how do we come up with streamlined, advanced displays?
Thank you.