Optus Outage - How is it affecting you?

5G requires 4G to be operational, as it piggybacks on the 4G network; I suspect that if something is wrong with the 4G/5G core, they are bringing up the older 3G network (which, funnily enough, is being phased out) to get telephony (calls and SMS) working, and will likely hold back data services until 4G/5G comes back online.
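To make that ordering concrete, here is a purely illustrative toy sketch in Python (made-up names, hugely simplified, not how any real mobile core works): voice and SMS can come back via the legacy 3G path, while data waits for the 4G/5G core.

```python
# Toy model of the fallback described above. Entirely speculative -
# it only shows the ordering: voice/SMS can ride on 3G, data waits for 4G/5G.

def available_services(core_4g_up: bool, core_3g_up: bool) -> set[str]:
    """Return the service classes that could plausibly be offered."""
    if core_4g_up:
        # 5G (non-standalone) anchors on the 4G network, so both
        # effectively return together once the 4G core is healthy.
        return {"voice", "sms", "data_4g", "data_5g"}
    if core_3g_up:
        # Legacy 3G fallback: telephony first, data held back.
        return {"voice", "sms"}
    return set()

print(available_services(core_4g_up=False, core_3g_up=True))  # voice and SMS only
print(available_services(core_4g_up=True, core_3g_up=True))   # full service
```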

ETA - I have an LTE signal now (4G) but no 5G yet.
Do you work in the field? You're very knowledgeable.
 
On the Optus network - just got service back for now at 1348, while I was reading this post!! Yay!! 😀

Edit: looks like it's the data service that has returned, not the phone service, as I can't dial out or have my family dial in - that's all Optus mobile services, but not the landline, which is with a different carrier.

Further update: finally able to make and receive calls at 1450.
 

On the plus side, I got through to Qantas reservations immediately (not HBA, I don't think). So if you still have phone capability and need any Qantas calls done, this might be the best time.
Must be a slow day: HBA just called me (SG) to clarify my problem with the OWA itinerary.
 
Must be a slow day: HBA just called me (SG) to clarify my problem with the OWA itinerary.
I’ve been using the Qantas number as a phone test, as I know it won’t get answered quickly. While 5G shows on the phone, this is the reality.

[Attached screenshot: IMG_5307.png]
 
So scuttlebutt is that the root cause was "configuration changes on the backbone routers that coordinate network traffic between our data centers caused issues that interrupted this communication. This disruption to network traffic had a cascading effect on the way our data centers communicate, bringing our services to a halt."

I happen to know that routing and firewall changes are always scheduled overnight on either a Tuesday or a Thursday, so this would make sense. Clearly the changes were not tested adequately in non-prod before being deployed to production, so when it tried to fail over to non-prod (which doubles as Disaster Recovery), it hit the same error.
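As a rough illustration of the discipline being described (not Optus' actual tooling - every name, file and check below is invented), the change-window idea boils down to: push to non-prod, run smoke tests, and only then touch production.

```python
# Hypothetical sketch of a gated change window: push the config change to
# non-prod, run smoke tests, and only deploy to production if they pass.
# Hostnames, filenames and checks are all made up for illustration.

import subprocess

def smoke_test_nonprod() -> bool:
    """Very crude reachability checks against non-prod data centres."""
    checks = [
        ["ping", "-c", "3", "nonprod-dc1.example.internal"],
        ["ping", "-c", "3", "nonprod-dc2.example.internal"],
    ]
    return all(subprocess.run(c, capture_output=True).returncode == 0 for c in checks)

def deploy(config_file: str, environment: str) -> None:
    # Stand-in for the vendor's real router management tooling.
    print(f"pushing {config_file} to {environment} backbone routers")

if __name__ == "__main__":
    change = "backbone-routing-change.cfg"
    deploy(change, "non-prod")
    if smoke_test_nonprod():
        deploy(change, "production")
    else:
        print("non-prod checks failed - change aborted before it reaches production")
```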

Will be interesting to see who the sacrificial goat ends up being, given recent cuts.
 
So when you call Optus to cancel and they offer to bundle, say “No”!

Fortunately our NBN is with TPG, but both my Flexiroam and Eskimo eSIMs (data only) roam on Optus - so no joy there! I was millimetres from buying a cheap Telstra prepaid eSIM a few weeks ago for roaming... Anyway, Optus was restored at about the same time I connected to QF wifi in the QF F Lounge in SYD.
 
So scuttlebutt is that the root cause was "configuration changes on the backbone routers that coordinate network traffic between our data centers caused issues that interrupted this communication. This disruption to network traffic had a cascading effect on the way our data centers communicate, bringing our services to a halt."
One would think they would have backups or rollback functionality implemented with the update. Sure, testing may have found the potential issue, but there is no guarantee it'll always catch every pitfall.
 
So scuttlebutt is that the root cause was "configuration changes on the backbone routers that coordinate network traffic between our data centers caused issues that interrupted this communication. This disruption to network traffic had a cascading effect on the way our data centers communicate, bringing our services to a halt."

So, in other words, someone accidentally kicked the plug out of the wall?
 
One would think they would have backups or rollback functionality implemented with the update. Sure, testing may have found the potential issue, but there is no guarantee it'll always catch every pitfall.

Well they obviously did roll back. But they spent a fair bit of time isolating the cause, then processing all the roll-backs.

All implementation plans have roll-backs, but in a large corporate environment any nightly change window may contain changes for 1-25 projects or BAU maintenance tasks, some of which are incremental on top of each other.

A roll-back is not as simple as clicking "undo", and in the case of network changes the config is pushed out and needs to propagate across individual elements - which is problematic if the bad config was preventing them from connecting to those elements in the first place. Rumour has it staff could not even VPN in.
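A minimal sketch of that propagation problem (device names invented, everything simplified): the known-good config has to be re-pushed element by element, and any element the bad config cut off from the management network is stuck until someone reaches it out-of-band.

```python
# Rough sketch of the roll-back problem described above: the old config has
# to be re-pushed to every network element, and any element the bad config
# has cut off from the management network can't be reached until someone
# restores connectivity some other way. Device names are invented.

from dataclasses import dataclass

@dataclass
class Element:
    name: str
    reachable: bool          # can the management network still talk to it?
    config_version: str

def roll_back(elements: list[Element], good_version: str) -> list[str]:
    """Push the known-good config to every reachable element; report stragglers."""
    stranded = []
    for el in elements:
        if el.reachable:
            el.config_version = good_version   # re-push and wait for it to apply
        else:
            stranded.append(el.name)           # needs out-of-band / on-site access
    return stranded

fleet = [
    Element("bb-router-syd-1", reachable=True,  config_version="bad"),
    Element("bb-router-mel-1", reachable=False, config_version="bad"),
]
print("still stranded:", roll_back(fleet, good_version="last-known-good"))
```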

Someone will lose their job over this one.
 
One would think they would have backups or rollback functionality implemented with the update. Sure, testing may have found the potential issue, but there is no guarantee it'll always catch every pitfall.

There have been many outages at telcos and cloud providers in the last year or two that have come about due to config changes - especially routing changes, and sometimes DNS changes. You'd think they'd have a way of testing this stuff...
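As one tiny example of the kind of pre-flight check being wished for here (just an assumption about what such a test might look like, with made-up hostnames and addresses): a DNS change could at least be resolved and compared against the change plan before the window closes.

```python
# One very small example of a pre-flight check on a DNS change: confirm the
# record resolves and points where the change ticket says it should.
# Hostname and addresses are invented for illustration.

import socket

def dns_preflight(hostname: str, expected_ips: set[str]) -> bool:
    """Resolve the host and check the answers against the change plan."""
    try:
        infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    except socket.gaierror:
        print(f"{hostname}: does not resolve at all - do not proceed")
        return False
    answers = {info[4][0] for info in infos}
    ok = answers <= expected_ips
    print(f"{hostname}: resolved to {answers}, expected within {expected_ips} -> {ok}")
    return ok

# e.g. run against a staging resolver before the change window closes
dns_preflight("portal.example.com", {"203.0.113.10", "203.0.113.11"})
```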
 
There have been many outages at telcos and cloud providers in the last year or two that have come about due to config changes - especially routing changes, and sometimes DNS changes. You'd think they'd have a way of testing this stuff...
Microsoft's pre-release testing is pretty rank. Those with laptops on auto-upgrade, including our original server, were wiped out when they pushed through an update that hadn't been tested on all major brands.
 
So, in other words, someone accidentally kicked the plug out of the wall?
They completed half of the IT repair process of unplugging it and plugging it in again! Surprising it took Optus hours to figure out they needed to plug it back in!
 
Sure, testing may have found the potential issue, but there is no guarantee it'll always catch every pitfall.
It's agile ::) adapt and go, deal with the problems later :P which might end up being costly.
 
What I would be curious to know is how robust Optus' network failure plan is. Are the key technical engineers issued with multiple mobile phones, for example, with SIM cards connected to (separate) competitor networks, to mitigate communication failures in the event of a network-wide outage such as this? The original failure is somewhat less important than how the business is structured to deal with such disasters.

A number of years back, at a previous employer, a few select staff were permitted to be issued with multiple mobile phones to account for discrepancies in coverage at different remote locations. Not managers and the like, but critical operational staff who had an absolute requirement to be contactable and to contact others when out and about at various locations. It's pretty basic stuff, but you have to wonder, given how long it took Optus to get a handle on this issue and (presumably) roll something back.
 
My alarm is a radio; as soon as it went off, the leading news item was the outage. Those unaware must not own a radio or TV.
I had a truck driver at work complaining about phone reception. Said he was with Optus (he would have started work around 1am). I assumed it was the normal phone reception issue at the shopping center.

Then a merchandiser said her phone was dead due to Optus being down, and she had to use the store wifi to contact her work's servers to find out what she had to do. She didn't know what she'd do at other stores without wifi.
 
Well they obviously did roll back. But they spent a fair bit of time isolating the cause, then processing all the roll-backs.

All implementation plans have roll-backs, but in a large corporate environment any nightly change window may contain changes for 1-25 projects or BAU maintenance tasks, some of which are incremental on top of each other.

A roll-back is not as simple as clicking "undo", and in the case of network changes the config is pushed out and needs to propagate across individual elements - which is problematic if the bad config was preventing them from connecting to those elements in the first place. Rumour has it staff could not even VPN in.

Someone will lose their job over this one.
At work, whenever they do an IT change, such as rolling out updates to a register or upgrading the RF devices, they always do it a few devices at a time over a number of nights, so that if there is a problem it only impacts a handful of the devices, not all of them.
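For illustration, a staggered rollout like that might look something like the sketch below (device names and the health check are made up, and a real rollout would obviously be driven by proper management tooling).

```python
# Sketch of the staggered rollout described above: update a small batch of
# devices each night, and halt the rollout if any batch reports problems,
# so only that batch is affected. Names and the health check are invented.

from typing import Iterable

def batches(devices: list[str], per_night: int) -> Iterable[list[str]]:
    for i in range(0, len(devices), per_night):
        yield devices[i:i + per_night]

def healthy(device: str) -> bool:
    return True   # stand-in for a real post-update health check

def staged_rollout(devices: list[str], per_night: int = 3) -> None:
    for night, batch in enumerate(batches(devices, per_night), start=1):
        print(f"night {night}: updating {batch}")
        if not all(healthy(d) for d in batch):
            print("problems detected - halting rollout; only this batch is affected")
            return
    print("rollout complete")

staged_rollout([f"register-{n:02d}" for n in range(1, 10)])
```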
 
