MTN chair Mcebisi Jonas is once again the subject of a cyber scam in which fraudsters impersonate the former deputy finance minister on messaging platforms.
The mobile operator alerted the public on Monday to a new set of scams where Jonas is being impersonated to solicit money.
MTN said it had been made aware “of incidents related to the false solicitation of funding under the guise of requests for sponsorships or raising of funds towards various causes”.
In the most recent case, “the fraudster(s) reached out via voice call from various non-listed numbers and introduced himself as MTN group chair Mcebisi Jonas or at times, the fraudster may ask for funds to rescue the chair from some ‘danger’ or ‘difficult situation’”.
It appears that the tactics being used remain largely unchanged.
A year ago Jonas was the subject of a scam in which fraudsters contacted targeted victims, introduced themselves as MTN representatives and requested an urgent EFT payment to a specific recipient, purportedly towards MTN’s outreach, social and foundation projects.
These scams amount to a form of social engineering on the part of the fraudsters.
According to cybersecurity firm Kaspersky, social engineering is a manipulation technique that exploits human error to gain private information, access or valuables.
In cybercrime, these “human hacking” scams tend to lure unsuspecting users into exposing data, spreading malware infections, or giving access to restricted systems. Attacks can happen online, in person and via other interactions.
Other public figures, including Electoral Commission of SA (IEC) chief electoral officer Sy Mamabolo, have also been impersonated on social media and messaging platforms.
This problem is now being made worse by technologies such as artificial intelligence (AI).
According to Trend Micro’s 2023 Midyear Cybersecurity Threat Report, malicious actors abuse AI technology to accurately impersonate real people as part of their attacks and scams. “In fact, imposter scams such as virtual kidnapping are becoming increasingly rampant.
“In the case of virtual kidnapping, malicious actors are able to create a deepfake voice of their victim’s child and use it as proof that they have the child in their possession to pressure the victim into sending large ransom amounts.”
At the same time, ChatGPT and other AI tools are enabling criminals to automate the gathering of information, formation of target groups, and identification of vulnerable behaviours.
This is helping to lure big-name victims, also known as “big fish”, in what are now referred to as harpoon whaling attacks.
MTN is calling on the public to report any suspicious calls or other forms of communication.