Posted on 08/13/2008 2:02:30 PM PDT by shove_it
Five years after the worst blackout in North American history, the country's largest power providers say the problems that turned out the lights on 50 million people have largely been resolved, but they fear that larger, systemic issues could soon lead to even bigger and more damaging outages.
(Excerpt) Read more at biz.yahoo.com ...
I remember arguing with some idiot liberal on the then-operating Yahoo boards who claimed, through six degrees of separation, that it was Bush’s fault.
What’s their answer? Wait, I know: Raise Taxes!
Oddly, we only lost power for a few seconds here. And a good thing, too, as it gets hot in the high desert in summer.
Well... nothin’ has changed here at the epicenter of that blackout: Akron, Ohio.
Digital controls have been installed in the Systems Operations Centers for the electrical grid. All new stuff.
That’s what an international task force concluded a year later, listing six FirstEnergy failures as causes, including the most low-tech of all — failure to cut down trees under transmission lines. Investigators also noted 10 other FirstEnergy operational deficiencies as contributing causes.
The Akron-based utility disputed the findings at the time but has since spent millions of dollars on upgrades and training. The company also adopted a scorched earth policy about trees. There won’t be any. (A tree under an overheated and sagging transmission line in Walton Hills was identified as part of the chain of events that led to the blackout.)
What we’ve learned since the blackout of ‘03
http://blog.cleveland.com/business/2008/08/what_weve_learned_since_the_bl.html
Links of articles concerning blackout
North American Power Grid-lock of 2003
http://www.indusscitech.net/blackout2003.html
It was just a tree limb, just ask _jim.
Digital controls have been in System Operation Centers for decades. Of course, the one used now by FirstEnergy works properly, which cannot be said for the control system that was in service during the blackout.
A lot has changed since the blackout, mostly aimed at improving monitoring of the transmission system, and at giving NERC some regulatory teeth in making sure that utilities and reliability coordinators adhere to the rules regarding operation of the transmission grid.
The upside is that the industry is serious about preventing another blackout. The downside is that there are thousands of pages of procedures and regulations where there were once a few hundred.
There's still room for improvement, and supply is not increasing as fast as demand, so we could well see the same kind of rolling blackouts in the US that California has resorted to a few times over the last few years.
It was part of it.
How and Why the Blackout Began in Ohio, Summary
https://reports.energy.gov/B-F-Web-Part2.pdf
Page 45
U.S.-Canada Power System Outage Task Force
After 15:05 EDT, some of FE’s 345-kV transmission lines began tripping out because the lines were contacting overgrown trees within the lines’ right-of-way areas.
It was the mylar balloons!! :)
If by public utility you mean a municipal power company, you need say no more.
You won’t get any upgrades on your distribution system until your local politicians manage to get a federal grant. Your politicians won’t get a grant until they hire a professional to apply for the grant.
Isn’t socialism wonderful.
Five years already? I was in grad school at the time, and when the power went out I was working a side job in the meatpacking district. Walked from 14th and 9th all the way to the Staten Island ferry (the line for the Brooklyn Ferry stretched for miles, so I skipped it). Caught two buses in Staten Island, which was PITCH BLACK, until I made it to Bay Ridge. Spent the evening having an informal block party with my neighbors.
In that case here is something to think seriously about.
How far away from your plant is the shareholder-owned utility?
Depending on how far away it is and what your plant’s electrical usage is, the power company would more than likely be very willing to run new service out to your plant.
It is surprising how much better service you get when a company’s profit depends on keeping you happy.
You know? I’m not really sure exactly where the power is generated. Our Public Utility is in Sagamore Hills, about 20 miles North of Akron. The PUCO gave me a tip several years ago to DEMAND that we be put on the same status as hospitals, police stations, etc. That has forced the electric company to put us in a priority position for “first response” in case of outages. But we have also installed a large generator of our own that can run the whole plant during emergencies.
Reading the article gives me no sense or impression that there is any engineered guarantee that the electrical transmission system is as predictable as the SCADA systems being employed would imply.
There are plenty of firms willing to install SCADA without a clue as to system modeling.
Most problems I’ve seen in the industry are due to off-the-shelf construction of distribution and transmission networks lacking in definitive robust design. The systems function, so those who contract and build them are really never held accountable for their design.
As in this case, the few players, and perhaps the only ones, who might have the professional responsibility and desire to produce a well-designed system are hammered for a failure caused by others’ lack of design.
Electronic monitoring of these systems is somewhat laughable. Certain parameters are good to measure, and telemetry is useful for communicating system responses to dynamic loading, but unless the entire system has been designed as a whole, it will not be properly modeled, let alone controllable in real time.
(The best real-time system is frequently the system itself.)
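To make the point above concrete: the parameter monitoring a bare SCADA installation gives you amounts to comparing telemetered values against static limits. A minimal sketch of that kind of threshold check is below — a hypothetical illustration only, not any utility’s actual alarm logic; the line names are real circuits from the 2003 outage report, but the ratings and function names are made up for the example. Note what it cannot do: without a network model behind it, it reports symptoms after the fact rather than predicting how load redistributes when a line trips.

```python
# Hypothetical sketch: per-line telemetry compared against static thermal
# ratings. Real SCADA/EMS deployments pair readings like these with a
# network model (state estimation, contingency analysis); a bare threshold
# check such as this cannot anticipate cascading overloads.

THERMAL_RATINGS_MVA = {  # illustrative ratings, not actual values
    "Harding-Chamberlin 345kV": 1170,
    "Hanna-Juniper 345kV": 1320,
}

def check_loading(telemetry_mva: dict, margin: float = 0.9) -> list:
    """Return alarm strings for lines loaded above `margin` of rating."""
    alarms = []
    for line, flow in telemetry_mva.items():
        rating = THERMAL_RATINGS_MVA.get(line)
        if rating is not None and flow > margin * rating:
            alarms.append(
                f"{line}: {flow:.0f} MVA exceeds "
                f"{margin:.0%} of {rating} MVA rating"
            )
    return alarms
```

A check like this fires only once a line is already near its limit, which is exactly the gap the posts above describe: measurement without system-wide design and modeling.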