Network Overdrive has been delivering Managed IT Services in Australia for over 20 years.
In that time, we’ve seen the cybersecurity landscape change significantly – both in the volume of attacks and in how sophisticated those attacks have become.
We’ve watched as organised crime has taken over what used to be called “hacking”, turning it from an individual pursuit into a state-sponsored industry.
We believe we have to talk about this – a lot. We have to stop treating it as “someone else’s problem” and talk about the real and present threat to every Australian business and every Australian organisation.
“Business Australia warns that they are seeing a ransomware attack on small businesses every 11 seconds.” news.com.au
We know this – because we’re survivors. So we’re talking about cybercrime a lot – along with other cybercrime survivors. We’re talking so much that we’ve made it to news.com.au in this article: “Aussie companies Langs Building Supplies and Network Overdrive hit by cyberattacks”.
Our bitter experience on July 3rd
On 3rd July 2021 at 3 am, we were hit by the largest ransomware attack in history (at that point in time). Not only did it hit our systems – it hit ALL our customers!
It didn’t happen because of something we did wrong – but we still had to deal with it. It was the biggest challenge we’d ever faced in our business – and the worst thing that has ever happened to us.
As a service business, we expect 10% of our customers to call us. That Saturday morning, while our systems were down, 100% of our customers called us. And every one of them wanted 100% of our attention.
We’ve survived to tell the tale
Not only did we survive, but all our customers also survived AND continue to service their clients.
So let me take you back a bit to understand why we’re still standing. Then you’ll understand what we and our customers have done to make ourselves resilient in the face of the inevitable attacks coming in the next few years.
Unfortunately, many of our competitors will not be ready!
We started by being worried
Through the first half of 2020, I’d been watching cybercrime developments and getting more and more worried about how many bad actors were operating – and how fast and smart they had become. Cybercrime had become an organised global crime industry.
The need to act crystallised in September 2020, when one of our network members was expanding into the US. He wanted to sell water quality technology (including software) into the US – and got slammed with a whole lot of IT security requirements. The US (pushed by Californian laws) now has a whole different level of cybersecurity requirements that he’d never encountered – and he had to meet them to sell his product.
To me – watching him from the outside – it was clear that the ability to meet leading-edge cybersecurity requirements was now a business opportunity. Those who had proper, secure IT systems would be able to sell into places that others could not.
So for us as an IT Consulting and Services business, it would be important to invest in the best security.
Our next step was to make sure we knew what “the best security” actually was. We looked at the standards and at our business context and decided to investigate the American cybersecurity standards set by NIST.
We set out to develop our company’s security to a standard that the Americans would like – implementing the US National Institute of Standards and Technology’s cybersecurity best practices and standards. https://www.nist.gov/cybersecurity
We invested in expert advice
So, after doing this research, in January 2021 we got an external security consultancy to do an audit – and paid a fair amount of money for it. After the audit, we spent a fair amount more implementing its recommendations.
The audit identified fifteen to twenty recommended changes – and the two main ones were:
1) change the way we did backups. They said our backups were good by Australian industry best practice – but that wasn’t enough to be safe. The future requires an independent, standalone backup – one able to instantly recover whole systems.
2) practise our recovery plans – create a simulation of a disaster in our business and make sure that everybody knew what to do, that they could do it, AND THAT IT WORKED. (One small example of what an automated restore check can look like is sketched below.)
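To make that second recommendation concrete, here is a minimal sketch of the kind of automated restore check we mean: restore last night’s backup into an isolated folder, then verify it against a manifest of checksums recorded when the backup was taken. The paths and file layout here are hypothetical placeholders, not our actual setup – adapt them to whatever backup product you use.

```python
# A minimal, illustrative restore check. MANIFEST and RESTORE_DIR are
# hypothetical placeholders - point them at your own backup product's output.
import hashlib
import json
import pathlib
import sys

MANIFEST = pathlib.Path("/backups/nightly/manifest.json")   # hypothetical: {relative_path: sha256}
RESTORE_DIR = pathlib.Path("/restore-test/nightly")         # hypothetical: where the test restore landed


def checksum(path: pathlib.Path) -> str:
    """Return the SHA-256 digest of one file."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_restore(manifest_path: pathlib.Path, restore_root: pathlib.Path) -> bool:
    """Check that every file listed in the manifest was restored and is intact."""
    expected = json.loads(manifest_path.read_text())
    ok = True
    for relative_path, expected_hash in expected.items():
        restored_file = restore_root / relative_path
        if not restored_file.is_file():
            print(f"MISSING from restore: {relative_path}")
            ok = False
        elif checksum(restored_file) != expected_hash:
            print(f"CORRUPT in restore: {relative_path}")
            ok = False
    return ok


if __name__ == "__main__":
    # A failing exit code should page a human. A backup you haven't proven
    # you can restore is not really a backup.
    sys.exit(0 if verify_restore(MANIFEST, RESTORE_DIR) else 1)
```

A script like this only proves the files came back intact – the full drill still needs people walking through the plan end to end, which is exactly what the simulation was for.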
We acted on that advice – immediately
So we wrote the audit cheque in January 2021 – and started implementing the recommendations. We did it even though we were still dealing with coronavirus impacts on our operations and our customers. It took some time – but what we had learned from the audit was SO important that we knew we had to do it quickly.
We told our customers about what we were doing – but they mostly didn’t have the resources to make the recommended backup changes at that time.
We got the changes made and did our first simulation in May 2021. We knew we had upped our security game to a whole new level.
Our investment paid off
On July 3rd 2021 at 3 am we got hacked. (July 2nd in the US – the start of their Independence Day long weekend.)
If we had not done this work over the eight months before that attack, I honestly think I wouldn’t have a business.
At three AM on July 3rd the malware attack happened. It was a supply chain attack through one of our software suppliers – Kaseya. A bad actor had got into THEIR systems and changed Kaseya’s software to upload malware. The malware got partially loaded onto our systems AND into our customers’ systems.
But we HAD done the preparation and we HAD changed our backups and we did have a plan and we had TESTED that plan.
So by seven AM we had developed and delivered individual action plans for all our customers. How? By sharing our action plan with them. (All while doing our own recovery.)
Twenty-four hours later we had our own systems back online (and it only took that long because we were also very busy helping all our customers).
By Sunday morning around three AM we had all our own IT systems back in place. That meant we could help our customers properly. Without the highest level of backup, their recovery processing took longer – but they did recover.
Recovering from backups can be a lengthy process. If your backups are cloud-based and you have to recover EVERYTHING from the cloud, it can take 2-3 WEEKS to pull it all back across the NBN. And if you’ve done tape backups to off-site storage, then you have to go and get those tapes.
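To give a sense of the arithmetic behind that claim, here is a rough back-of-envelope sketch. The data volume, link speed and efficiency figures below are hypothetical examples, not our actual numbers:

```python
# Rough back-of-envelope estimate of a full cloud restore over one internet link.
# All figures are hypothetical examples.
def full_restore_days(data_terabytes: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Days to pull everything back down over a single internet link."""
    data_bits = data_terabytes * 8e12                    # terabytes -> bits
    usable_bits_per_sec = link_mbps * 1e6 * efficiency   # allow for real-world overhead
    return data_bits / usable_bits_per_sec / 86_400      # seconds -> days


# e.g. 10 TB of servers and file shares over a 50 Mbps NBN service:
print(f"{full_restore_days(10, 50):.1f} days")  # roughly 23 days - about 3 weeks
```

The exact numbers will differ for every business, but the shape of the problem doesn’t: big data sets plus ordinary internet links means weeks, not hours.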
It took between 1 and 3 weeks to get our customers back to full operations, prioritising urgent applications like Payroll and Debtors.
And as things started to resolve, there were key questions that needed to be answered:
- What did the hackers do? Did they exfiltrate (steal) data from us?
- Do I need to take action to comply with mandatory data breach laws?
- What do I tell my clients?
- What is this going to cost me? (Data/systems recovery is generally free on a managed services contract – but if you can’t take orders or do business or get paid there’s a big cost to your business)
- Will I still have a business? Has my business reputation been damaged?
Over this time, it became plain to us that a lot of effort needs to go into the human side of communications and recovery – talking to our customers about what happened, and helping them talk to their customers about what was lost and what wasn’t lost.
Collaboration minimised the impact of the Kaseya attack
Kaseya actually was fantastic, even though they were the immediate “source” of our pain. Their response as a multinational company to a small Australian business – when they had so many others to deal with – was excellent. It emphasised to me that there is real value in partnerships, in diverse business IT solutions, and in having people with the capacity to respond in extreme situations.
This event had been very public – at least in tech circles. It even got the attention of the US president. The FBI got involved – so there was a lot of free forensics. The result was – for this ransomware event – we knew exactly how they got in and we knew exactly how much damage they did. So we knew for sure that if we blocked that way in then we were safe – it wasn’t going to happen again. That is very rare in today’s world.
Another positive was Kaseya’s systems – because they did detect the attack in mid-stream. They got hit – but they realised something was up FAST. So they were closing down the holes almost as fast as the attack unfolded. The attackers realised they were busted and rushed to push out their ransomware to try and do as much damage as they could – but they were limited to locking down systems.
The result was that the amount of time that the hackers were in our systems was measured in minutes – not days or weeks. So while they could do some damage in terms of disrupting systems – they didn’t have time to do anything else.
Everyone helped everyone. It was not only great in that moment, but it also made a big $$$ difference – because the cost of recovery goes down when cooperation goes up.
The cost of recovery per organisation is significantly less if there are multiple victims who collaborate. They can share knowledge – about the attack details, about how to recover and about how to improve their network virus protection.
So what have we learned?
The changing nature of ransomware
Hackers USED to go after business operations – the goal was to stop a business from using their systems to do business. Then they would demand a ransom to get you back to work.
Not any more.
Now infiltrators sneak into your systems and hunt around for all sorts of data they can leverage – or threaten to publish unless you pay. Their new game is extortion – and they’ll try it on everyone.
In the Langs Building case:
“after they worked out the attack had been unsuccessful and we weren’t paying, they sent an email to all staff they knew of in the organisation telling them [they] had their data and they would sell it on the black market if we didn’t pay,” news.com.au
(They were lying, and Langs could prove they were – fortunately. Could you if this happened to you?)
It’s not just ransom and extortion – it’s also theft
Infiltrators have different tactics for different businesses. With a small business, they may not try blackmail. They may just lurk until you have a big invoice to pay – then change the bank account details and redirect your payment into their own account.
It gets worse, too. In 99% of ransomware events, there are multiple attacks – because HOW the attacker got in couldn’t be identified. So often the hacker is still there – hiding somewhere unexpected like the smart TV in the boardroom. You recover, they come back; you recover again, they come back again.
Hacking has changed, so protection HAS to change
IT Security needs a community – no one can do it on their own. Get expert advice and ACTION it. Find experts who understand the importance of collective protection. Keep working on it – the job of security is never “done”.
Backups are essential AND your disaster recovery must be tested and practised. You don’t want to find out after the fact that your “secure cloud recovery” will take 3 weeks to complete. That’s why choosing trustworthy cloud computing services – and knowing how quickly you can recover from them – is a crucial part of keeping your business productive.
Partnership counts – regardless of legalese. Get to know your technology partners and make security a regular conversation. Part of why we survived is that Kaseya was amazing. They detected the attack, stopped it before it was complete, and were really proactive about supporting their partners through the recovery.
Dwell time and data exfiltration are way worse than vandalism and shutdowns.
You can’t keep it a secret – notification is mandatory. You have a legal obligation to tell your customers. So normalise it. Have open conversations about the risks BEFORE you get hit. Plan the conversations and have them regularly. You’ll make your customers more “sticky” – loyal to you.
Review your security today
Would your business stay in business if your systems were shut down for 1-2 months? Shut down so thoroughly you couldn’t make payroll or take customer payments.
If you’re an Australian business interacting with the online world, you need security that meets a current global standard – something like GDPR or NIST – rather than just a local norm.
Why?
The goals of ransomware have changed – they now include financial theft, data theft, extortion and blackmail – not just disruption.
The task of prevention has changed – it MUST include dwell time detection and internal discrepancy monitoring AS WELL AS backups, firewalls and traditional barrier methods.
Get a plan AND talk to your customers about your plan – so they know ahead of time what will happen when (not if) your system is compromised. Help them understand the new forms of attack they are facing.
Ask for help – get advice and guidance now
Be prepared to be embarrassed – because there WILL be things uncovered that you won’t like. But the alternative is worse. You can get over embarrassment – but your business might not survive a cyberattack.
I got advice – and it was embarrassing and it seemed expensive at the time. However, I would probably be out of business if I hadn’t taken action.
I took action to protect my business against cybercrime – and you should too. NOW!