How automation tools could level up storage interconnection
The energy industry doesn’t have a technology problem when it comes to interconnection studies: the power-flow solvers that make those studies work are, by now, mature and capable. What still causes friction and slows down the interconnection process is the data feeding those tools.
“The actual running of a power-flow study is the fastest part of the entire interconnection process,” said Chris Ariante, co-founder and CEO of software company Nira Energy, which automates the pre- and post-processing steps of interconnection and transmission planning. He told ESS News that the real delays for developer projects happen in those stages.
Every project has to be modeled from scratch before studies can be conducted, which can take weeks of work for large cluster studies, and any time the interconnection queue changes, the model must be updated. That work grows rapidly as queues lengthen.
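To see why the workload compounds, consider a back-of-the-envelope sketch. The numbers and the pairwise-restudy assumption below are illustrative, not from the article: if each queue change forces every project in a cluster to be re-evaluated against every other project, the number of case rebuilds grows roughly with the square of the queue length.

```python
# Illustrative only: assumes each queue change triggers a restudy in which
# every project must be re-modeled against every other project in the cluster.
def restudy_case_count(queue_size: int, queue_changes: int) -> int:
    """Rough count of model rebuilds: pairwise project interactions per
    restudy, repeated for each change to the interconnection queue."""
    pairwise = queue_size * (queue_size - 1) // 2
    return queue_changes * pairwise

# A 50-project queue vs. a 200-project queue, each with 10 queue changes:
print(restudy_case_count(50, 10))   # 12250
print(restudy_case_count(200, 10))  # 199000
```

A 4x longer queue produces roughly 16x the rebuild work under this toy model, which is the dynamic Ariante describes: the solver run itself stays cheap while the case management around it balloons.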
“[Running the power flow] is maybe 5% of the workload,” he added, pointing out that the “real analysis” starts after: manually combing through raw constraint violations, identifying what upgrades are needed and why, running sensitivity checks and determining how to allocate upgrade costs across hundreds of projects.
This is where late-stage surprises come from, Ariante explained. Any small movement in the interconnection queue, or a change in assumptions, can shift a project from needing a moderate upgrade to “shouldering hundreds of millions in upgrade costs alone.”
Interconnection delays aren’t really a “math problem,” he added: “It’s the manual data management that makes the process slow, risky and unpredictable for developers.”
Storage makes the equation trickier. Unlike standalone solar, battery energy storage systems need to be evaluated under both charging and discharging modes, each of which can produce different system impacts. Establishing a baseline understanding of network effects for hybrid projects can require running several scenarios to determine how they interact with the grid.
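As a rough illustration of why storage multiplies the study burden, the sketch below enumerates the operating-mode combinations a hybrid solar-plus-storage project might need to cover. The mode and load-level names are hypothetical placeholders, not any ISO’s actual study matrix, and `evaluate_case` is a stub standing in for a real power-flow run:

```python
from itertools import product

# Hypothetical study matrix: standalone solar is typically studied as an
# injection only, while a battery must be checked in both charging (load)
# and discharging (injection) modes, across multiple system conditions.
battery_modes = ["charging", "discharging"]
load_levels = ["summer_peak", "light_load"]

def evaluate_case(mode: str, load: str) -> str:
    # Stub: a real implementation would build the network model here
    # and hand the case to a power-flow solver.
    return f"{mode} @ {load}"

cases = [evaluate_case(m, l) for m, l in product(battery_modes, load_levels)]
print(len(cases))  # 4 study cases for storage vs. 2 for solar-only
```

Each added dimension (seasons, dispatch assumptions, queue positions) multiplies the case count, which is why hybrid projects need “several scenarios” just to establish a baseline.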
That volatility makes it tricky for storage projects to pencil out. Time between ISO decision points is often limited, so developers are frequently forced to make capital commitments based on only one or two tested scenarios. In Ariante’s view, it’s a structural blind spot that puts developers in a tough position: “Developers need better tools to help predict costs at a faster and more reliable rate.”
“When every shift in assumptions requires re-running the analysis from end to end, it’s hard for teams to keep a current view of where capacity actually exists,” he pointed out. That’s where automation tools like Nira that focus on the most time-consuming steps of the process could have their largest impact, he said: they let engineers run more scenarios faster without rebuilding cases by hand.
Many critics worry that integrating software into their workflows could introduce “black box” results; when done correctly, Ariante said, automation should increase, not reduce, transparency. In practice, that means using a transparent, reproducible methodology and a power-flow solver that everyone can access and that produces reliable results.
“Engineers should always be able to see how the case was built, what assumptions were used and why a particular upgrade showed up,” he noted.