Major fail. The backup didn't work, so why couldn't they operate it manually?
Class action anyone?
At least it's not a regular occurrence. The M5 tunnel in Sydney gets clogged up on a regular basis due to car breakdowns/accidents. No idea who the muppet was that decided 2 lanes in some of our major tunnels was a good idea either - talk about being short-sighted!!
We also have the M2, which is currently undergoing 'an upgrade' expected to take 2 years (18 months so far). It has SO many different speed zones due to the 'roadworks' that it has doubled the travel time for some poor buggers heading to work, and they're still getting stung with the tolls, which add up to around $9 one way.
Don't even get me started on the stupidity of having to slow down to either 40 or 60 to go through the now cashless tolls (yep, and they wonder why it has a flow-on effect).
Having used the Melbourne freeways, I reckon they run a hell of a lot smoother than ours do up here. Although I guess when yours fark up, they really fark up - fark'n!
I knew about it before I left for work, and texted a workmate who was going to drive his very grumpy TR6 to work this morning.
I went along Nepean Hwy/Beach Rd (from Seaford to the city) and it took 1h45, which wasn't actually THAT bad for that way.
Workmate went along the Monash/CityLink from Berwick to the city, left home at 7am, and walked in the door at work at 11. The last 5ish km took nearly 3 hours; the Monash was flowing well until Yarra Boulevard, at which point it turned to ****, and there's nowhere else to get off.
I somehow think some heads might roll at Transurban over such a chronic systems failure.
I also want to know how I get a job as the CEO of a company that operates toll roads and get paid $7.6 million a year for what really can't be THAT complex a job.
One of the benefits of working nights this week. I was asleep the whole time
I've been in the CityLink control room a few times. Pretty cool, with all the mission-control-like screens and ****. A lot of the warning systems are automated, in that they detect when cars suddenly brake and slow down etc. Doesn't help if your monitor goes blank in rush hour though.
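From what I could tell it's basically threshold logic over the loop detector feeds. Something like this, I'd guess - a toy sketch where all the names and thresholds are mine, definitely not CityLink's actual code:

```python
# Toy incident detector: flags a detector station when the average speed
# drops sharply between two consecutive polling intervals, or when
# traffic is crawling. All thresholds invented for illustration.

SPEED_DROP_THRESHOLD = 0.4   # flag if speed falls 40%+ between polls
MIN_SPEED_KMH = 30           # or if traffic drops below a crawl

def check_station(station_id, prev_avg_kmh, curr_avg_kmh):
    """Return an alert string if this station looks like an incident."""
    if prev_avg_kmh > 0:
        drop = (prev_avg_kmh - curr_avg_kmh) / prev_avg_kmh
        if drop >= SPEED_DROP_THRESHOLD:
            return (f"ALERT {station_id}: sudden braking "
                    f"({prev_avg_kmh:.0f} -> {curr_avg_kmh:.0f} km/h)")
    if curr_avg_kmh < MIN_SPEED_KMH:
        return f"ALERT {station_id}: congestion, avg {curr_avg_kmh:.0f} km/h"
    return None

# e.g. check_station("EB-12", 78, 35) -> sudden braking alert
```

Of course, an alert is no use if the screen it's meant to pop up on has gone blank.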
The gloves are off, the wisdom teeth are out
What are you on about?
I actually think they did the right thing by closing it under the circumstances. Assuming they were telling the truth, they had no control over the exhaust fans and, more importantly, the deluge systems in case there was a fire.
Mont Blanc Tunnel fire anyone?
I work with systems that have triple redundancy, just in case some dork playing Mario on a system net hits CTRL/ALT/DEL at an inopportune time.
Like when there's an incoming about to **** everyone's day!!!
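For anyone wondering what triple redundancy actually buys you: three independent units compute the same answer and a voter takes the majority, so one flaky unit gets outvoted. A rough sketch of the idea (my own illustration, nothing to do with the real kit):

```python
from collections import Counter

def tmr_vote(a, b, c):
    """Majority vote over three redundant channels.

    Tolerates any single channel returning a wrong (or garbage) answer.
    """
    value, votes = Counter([a, b, c]).most_common(1)[0]
    if votes >= 2:
        return value
    raise RuntimeError("No majority: more than one channel has failed")

# One dork hitting CTRL/ALT/DEL on channel b doesn't matter:
print(tmr_vote(42, None, 42))   # -> 42
```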
This is one of the state's most critical arterials, and all we get is eight hours of primary-school tech-speak about "teams investigating" and "we're looking into it".
So they're saying a core switch failed, and then the backup systems failed as well. Assuming we're talking about network infrastructure, there is absolutely no excuse for that in such a critical piece of infrastructure. It's not that hard to design for that sort of failure, and even if they had multiple failures, there should have been maintenance contracts and procedures in place whereby equipment could be replaced within 4 hours.
Mind you, if the system in place is 13 years old without a refresh, it may well be end-of-life and unsupportable, which is even more inexcusable.
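And "designing for that sort of failure" isn't magic, either. At its crudest it's a heartbeat check that swaps over to a standby path the moment the primary stops answering. A hand-wavy sketch (hostnames and timings invented, obviously not Transurban's setup):

```python
import socket
import time

# Hypothetical management addresses for a primary core switch and its
# hot standby. Purely illustrative.
PRIMARY = ("core-sw-1.example", 443)
STANDBY = ("core-sw-2.example", 443)

def alive(host_port, timeout=2.0):
    """Crude reachability probe: can we open a TCP connection?"""
    try:
        with socket.create_connection(host_port, timeout=timeout):
            return True
    except OSError:
        return False

def watchdog():
    active = PRIMARY
    while True:
        if not alive(active):
            active = STANDBY if active == PRIMARY else PRIMARY
            print(f"Failing over to {active[0]}")
        time.sleep(5)   # poll every 5 seconds
```

Real gear does this at the protocol level (VRRP, redundant supervisors, and so on) rather than in a script, but the principle is the same, and it's decades-old stuff.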
I used to work in healthcare, and people died if infrastructure wasn't working properly. Gives you a different perspective on redundancy....
We only ever had one patient death that was MAYBE linked to a fault, and he/she must have been pretty close to going anyway. We did have a VERY close call when I got a site to power off their UPS for their comms equipment, only to find out that it was sitting next to an identical UPS that was powering the intensive care ward. Of course they turned off the wrong one. Luckily there were no patients in there at the time....
PS: I liked it... much less traffic about compared to usual, I thought.