Be aware: AI voice cloning scam attempt against Odfjell

20.01.2025
We want to bring your attention to a growing threat now targeting our company: voice phishing, also known as "vishing." Cybercriminals are increasingly using advanced AI technology to mimic the voices of trusted individuals, making it crucial for us to stay vigilant.

We recently had an incident where one of our employees was called by someone pretending to be CFO Terje Iversen. The caller used an AI-generated voice that sounded identical to our CFO and requested that a substantial amount of money be transferred. Our colleague followed Odfjell protocol, ended the conversation, and called Terje back on a known number to verify the request. Terje could then confirm that the call was fraudulent.
 

What is Voice Phishing? 

Voice phishing involves attackers using AI to replicate the voice of someone you know, such as a colleague, manager, or even a family member. This can lead to unauthorized access to sensitive information or to fraudulent transactions. Attackers will often ask for money transfers, create a sense of urgency, and become aggressive to pressure you into complying.
 

How to Protect Yourself:

  • Be Skeptical: Always be cautious when receiving unexpected calls, especially if the caller is requesting sensitive information or urgent actions.
  • Verify Identity: If you have any doubts about the caller's identity, make an excuse to end the call politely. For example, you can say, “I'm in the middle of something. Can I call you back in a few minutes?”
  • Call Back: Use a known and trusted number to call the person back, rather than the number that called you. This simple step can help verify that you are indeed speaking to the right person.
  • Report Suspicious Calls: If you receive a suspicious call, report it to the IT security team immediately. Your vigilance can help protect the entire organization.

Remember: It's always better to be safe than sorry. Taking a moment to verify the caller's identity can prevent potential security breaches and protect our organization's sensitive information. Stay safe and secure!