I've taken a much-needed respite from writing during the past six months. Now, I'm back -- and the first thing I have to tell you is that I am an idiot.
In the time I've been away, I've had a lot of exposure to generative artificial intelligence (AI) and large language models (LLMs) as a content creation professional -- both as a powerful tool for improving productivity and as a creative assistant. But as with any technology, there is a dark side and the potential for abuse.
Also: 6 harmful ways ChatGPT can be used by bad actors
Last week, I was almost certainly the target of an AI-assisted phishing attempt. I almost fell victim to it, even though I have written about this subject professionally and was previously employed as a threat analyst at a major infosec company specializing in shielding enterprises against phishing attacks.
Should I have known better? Absolutely.
But as human beings, we are only as good as our training at recognizing a phish -- and part of that capability is being able to tell the fake from the real, and to teach the ancient lizard brain to scream at us when something smells wrong.
Also: Generative AI brings new risks to everyone. Here's how you can stay safe
However, if something scans as sufficiently authentic, then even someone with a lot of experience can end up doing something foolish. And that person was me.
During the past few weeks, I have received emails that closely resemble invoices from Stripe, a payment processor often used for cryptocurrency transactions. Each is an HTML-formatted message that looks very authentic and even includes PDF attachments that look like invoices for cryptocurrency purchases through Coinbase.
Phishing email posing as a PayPal Invoice. Screenshot by Jason Perlow

Enclosed PDF with faked Coinbase payment via PayPal, with a convincing 888 customer support phone number. Screenshot by Jason Perlow

I'm used to seeing phishing emails that are far less convincing because they have easily detectable formatting, phrasing, and spelling errors. Even the email subject fields sometimes look like gibberish -- not official comms from an authentic vendor or service provider.
Also: The best travel VPNs right now: Expert tested and reviewed
Additionally, the phishing attempts I'm used to contain links that point to fake bank or vendor sites with mysterious URLs, where you are prompted to enter credentials and end up giving away your passwords. Whenever I receive these kinds of phishing emails, they set off my alarms, and I report them.
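If you're curious what that gut check looks like in practice, it's easy enough to automate. Here's a minimal Python sketch -- my own illustration, assuming the message has been saved locally under a made-up filename, suspicious.eml -- that pulls every link out of a message and flags any whose host doesn't line up with the domain in the From: header. It's a crude heuristic, not a real anti-phishing tool, but it captures the "where does this link actually go?" habit.

    # Rough sketch: compare each link's host against the claimed sender domain.
    # "suspicious.eml" is a made-up filename for a locally saved message.
    import re
    from email import policy
    from email.parser import BytesParser
    from urllib.parse import urlparse

    with open("suspicious.eml", "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)

    sender_domain = (msg["From"] or "").split("@")[-1].strip("> ").lower()
    body = msg.get_body(preferencelist=("html", "plain"))
    text = body.get_content() if body else ""

    for url in set(re.findall(r'https?://[^\s"\'<>]+', text)):
        host = (urlparse(url).hostname or "").lower()
        # Naive suffix check; real tools compare registered domains properly.
        if not host.endswith(sender_domain):
            print(f"Link points outside {sender_domain}: {url}")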
Many of those emails are so obviously phishy that they don't even end up in my inbox; they go straight to spam. I've also trained my brain never to click on a link, and I didn't click on any of the links in this email, either.
However, in this instance, Gmail didn't flag the phishing attempt as spam. The invoice and email language were so well written and formatted that AI was very likely used to mimic what one of these Stripe invoices would look like -- well enough to slip past both Gmail's filters and my own human ones. And since I do not buy cryptocurrency, I don't know what a "real" invoice looks like.
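One thing AI-generated prose can't fake as easily is the authentication trail. Gmail stamps every message with an Authentication-Results header recording whether SPF, DKIM, and DMARC checks passed for the domain the mail claims to come from, and you can inspect it yourself via "Show original." Here's another small sketch, again assuming a locally saved suspicious.eml, that prints anything that didn't pass. Treat it as an illustration rather than a verdict -- plenty of legitimate mail fails one of these checks, too.

    # Rough sketch: surface SPF/DKIM/DMARC results that the receiving server
    # (e.g., Gmail) recorded in the Authentication-Results header.
    from email import policy
    from email.parser import BytesParser

    with open("suspicious.eml", "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)

    results = msg.get_all("Authentication-Results") or []
    if not results:
        print("No Authentication-Results header found")
    for header in results:
        for check in ("spf", "dkim", "dmarc"):
            if f"{check}=pass" not in header.lower():
                print(f"{check.upper()} did not come back as 'pass' -- look closer")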
Also: What to know about the SanDisk/Western Digital data loss disaster
Now, there is no easy way to prove that AI optimized any of the text of the email or the PDFs, but given the free and easy access to AI tools during the past year, it's increasingly likely they were used in their construction.
This is one of the dangers of generative AI. These tools can be used to generate text and images to produce fraudulent communications that look flawless enough to pass inspection.
The other stage of this very convincing phishing campaign is the toll-free 888 support number on the invoice, which instructs you to call if you don't recognize the transaction.
Also: Scammers are using AI to impersonate your loved ones
Although 800 and 888 phone numbers are frequently spoofed for telemarketing purposes, they are also used by legitimate organizations for call centers because the FCC tracks their issuance. However, as I learned, bad guys are using them for their own call centers, too.
PayPal's actual phone number. Screenshot by Jason Perlow

Because I was worried about my email being used to make financial transactions, I called the 888 number on the invoice, believing it to be PayPal itself, and connected to a busy call center in India, where the rep knew enough details about me to sound scarily authentic.
Also: We're not ready for the impact of generative AI on elections
He told me that some devices in Ohio, China, Texas, and New Jersey were attempting to make crypto purchases through this Stripe service via Coinbase, but that a final gating step had prevented the transactions from going through. Whew.
Google will happily use generative AI to tell you the authentic phone number. Hopefully. Screenshot by Jason Perlow

However, to remove those devices from my PayPal profile -- and to flag my account so I would stop receiving these bogus emails -- I needed to send him the codes, tied to my Amazon account, that he would send to my email address and phone number.
I should have woken up right then and there, but it was 8am and my first espresso drink had not fully saturated my bloodstream.
Also: The best security keys right now: Expert tested
Those codes are the two-factor authentication (2FA) codes you would use to get back into your Amazon account if you had 2FA set up -- and you should never give them to anyone.
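To make the point concrete, here's a hypothetical sketch -- the function names and numbers are mine, not Amazon's -- of how a service verifies one of those one-time codes on its own servers. The only legitimate thing you ever do with the code is type it back into the site or app that sent it; there is no workflow in which a human on the phone needs to hear it.

    # Hypothetical illustration: the service issues a short-lived code and is
    # the only party that can verify it. Nobody else ever needs to see it.
    import hmac
    import secrets
    import time

    pending = {}  # user -> (code, expiry); a real system would store only a hash

    def issue_code(user: str) -> str:
        code = f"{secrets.randbelow(1_000_000):06d}"  # 6-digit one-time code
        pending[user] = (code, time.time() + 300)     # valid for five minutes
        return code                                   # delivered to the user's phone or email

    def verify_code(user: str, attempt: str) -> bool:
        code, expiry = pending.pop(user, ("", 0.0))
        return time.time() < expiry and hmac.compare_digest(code, attempt)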
I was actually dumb enough to send this guy the first 2FA code, and then -- right as he asked which Mastercard I was using, so they could "flag" it -- my lizard brain woke up. I hung up the phone and immediately reset both my PayPal and Amazon passwords.
I then found PayPal's actual telephone number -- 1 (888) 221-1161 -- and spoke to its fraud and security department, which told me I was a phishing victim and to forward the emails to its abuse address, phishing@paypal.com.
Also: AI gold rush makes basic data security hygiene critical
Oh, and that PayPal call center was based in India and sounded exactly like the fake one, with an equally legitimate-looking 888 number. Be careful, people.
Have you almost fallen victim to a highly polished phishing attack? Talk back and let me know.