In its latest quarterly cyber security report, CERT NZ (Computer Emergency Response Team) reveals a disturbing increase in scams and sheds light on the growing exploitation of Artificial Intelligence (AI) by cyber criminals, signalling that the digital battlefield is evolving.
According to the report, scams spiked by 12% compared with the previous quarter, while financial losses skyrocketed by 66% to a staggering NZD $5.8 million. The tactics behind investment scams have taken a cunning twist: reports indicate that malicious actors are using search engine ads and slick, professional-looking documents to lure unsuspecting victims. Evidence suggests these new tactics are working, with the report highlighting that millions of New Zealand dollars were stolen in a single month.
The Impact of AI on Cybercrime
Unfortunately, the growing sophistication of tactics does not stop there. The rise of AI as a tool for scammers is another significant trend highlighted in the report. While we may not yet grasp the full impact of AI on scams, cybercriminals are harnessing it to create more convincing phishing emails, craft deceptively realistic investment advice, and even impersonate individuals during live chats. Although AI-driven scams have not been widely reported to CERT NZ, experts warn that it is only a matter of time before these sophisticated techniques become more prevalent.
The following is an extract from our Executive Cyber Security Report for June 2023:
For all the hype and bluster about generative chat AI being able to write custom malware, the area where we've seen the greatest impact from the likes of ChatGPT is fraud and social engineering, which often leads to business email compromise (BEC). Cybercriminals have used GPT to write highly convincing phishing emails with perfect grammar, to produce phishing emails at far greater volume, and as a framework for continuing to interact with their victims. While some traditional methods used by anti-phishing technologies are still valid (such as looking for malicious attachments, links or cryptocurrency addresses), many vendors are having to switch to behaviour-based analysis. This includes checking whether an email user regularly sends or receives communications from a given external address. It is further complicated by threat actors who establish contact with someone at the target (victim) organisation outside of normal channels, sometimes reaching out via social media platforms, chat software, SMS or phone.
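The behaviour-based signal described in the extract, flagging inbound mail from an external address the recipient has never corresponded with, can be sketched as follows. This is a minimal illustrative example, not a description of any vendor's actual implementation; the class name, domain, and addresses are all hypothetical.

```python
from collections import defaultdict


class FirstContactDetector:
    """Sketch of one behaviour-based anti-phishing signal: flag inbound
    mail from external addresses the recipient has never corresponded
    with before, a common precursor to BEC attempts."""

    def __init__(self, internal_domain: str):
        self.internal_domain = internal_domain
        # history[recipient] = set of external senders previously seen
        self.history = defaultdict(set)

    def is_external(self, address: str) -> bool:
        return not address.endswith("@" + self.internal_domain)

    def check(self, sender: str, recipient: str) -> bool:
        """Return True if this message should be flagged for review."""
        if not self.is_external(sender):
            return False  # internal mail: out of scope for this signal
        first_contact = sender not in self.history[recipient]
        self.history[recipient].add(sender)  # record for future checks
        return first_contact
```

In practice a real product would combine many such signals (sender reputation, display-name spoofing, reply-to mismatches) rather than rely on first contact alone, since, as the extract notes, attackers may pre-establish contact through other channels.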
The report underscores the need for heightened vigilance and caution among individuals engaging in online activities. It lists several red flags to watch for, such as suspiciously high investment returns, and recommends careful scrutiny of email and web domains, verification of businesses through official registers, and consultation with regulatory authorities before making financial decisions. Individuals are advised to remain sceptical and exercise caution when dealing with potential scammers.
Additionally, the report advises securing social media profiles to prevent the misuse of personal information, as scammers can use such data to create more realistic fake accounts or launch social engineering scams. The importance of staying informed and receiving security awareness training is emphasised to empower individuals and organisations in safeguarding against cyber threats.
To conclude, CERT NZ's Q1 Cyber Security Insights Report highlights the alarming surge in scams, coupled with the exploitation of AI by cybercriminals. It serves as a reminder for individuals and businesses alike to remain vigilant, adopt robust security measures, and stay informed about the evolving tactics employed by scammers, so as to protect themselves from financial and personal harm.
Interested in discovering how Security Services and Intelligence could safeguard your business, its people and its data from these kinds of attacks? Contact us to find out more.