Welcome to NerdWallet’s Smart Money podcast, where we answer your real-world money questions. In this episode:
Learn how scammers use AI voice cloning to deceive victims, and how you can protect yourself from this and other AI-related fraud.
How can you protect yourself from AI-driven scams that target your finances?
What new scams are happening as technology advances?
Hosts Sean Pyles and Sara Rathner discuss the alarming use of AI in scams and the future of fraud to help you understand how to safeguard your personal security. They begin with a discussion of AI-driven voice scams, with tips on recognizing potential fraud, staying informed about scam tactics, and the importance of open discussion in taking power away from scammers.
Then, scam expert Bob Sullivan, author of “Stop Getting Ripped Off” and host of the podcast The Perfect Scam, joins Sean to discuss the broader implications of AI technology in scams. They discuss the potential for AI to personalize phishing attacks, the ease of creating convincing fake audio, and the importance of skepticism in the face of unexpected calls. Plus: the need for technology companies to embed safeguards, the role of societal learning in approaching unexpected calls, and the importance of verifying any financial requests you receive.
Episode transcript
This transcript was generated from podcast audio by an AI tool.
We already know that our robot overlords are coming, but in the meantime, while they plot, their artificial intelligence skills are being put to use by bad actors all over the world, utilizing technology to bilk people out of their money. That includes using AI to copy someone’s voice and demand ransom for a non-existent kidnapping.
I had a full conversation with my daughter. It was interactive. There was no pause. There was no break. There was nothing that would lead me to believe that it wasn’t her. So when the mom that stepped outside called 911, she came back in and she said, “Hey, 911 tipped me off that there’s a scam where they use AI and they can replicate anyone’s voice.” I didn’t believe it. It gave me hope, but I didn’t believe it.
Welcome to NerdWallet’s Smart Money Podcast. I’m Sean Pyles.
And I’m Sara Rathner. And Sean, that clip is as creepy as it gets.
It is, and the story we’re going to hear today is as creepy and as awful as it gets as we wrap up our Nerdy deep dive into scams and identity theft and how to protect yourself from all of it so you don’t lose your life savings. Today we’re going to examine the future of the scam industry and the expanding role of AI.
Yeah, I have to say, and I know you’ve touched on this in several of your interviews already, this is exhausting. I mean, it’s hard to listen to this and not think, yeah, no matter what, I’m screwed. They’re going to get me unless I spend all this time and effort protecting myself. And who has the time for that?
I hear you, Sara, and it’s easy to feel somewhat defeated by all these organized criminals whose sole job is to steal our identities, which technology seems to make easier and easier for them, and to scam us in ways that we can’t even conceive of until it happens.
I mean, I’d rather spend more time taking naps, honestly. I don’t do that enough and I’m really sleep-deprived, which is probably making me more susceptible to scams, honestly.
Yeah, I am totally there with you. But Sara, I think we’ve also provided listeners, you included, with some really practical ways to fully arm ourselves that don’t take an undue amount of effort. And as we’ve been saying, one of the most important takeaways from this series, I think, is for everyone to realize that there is no immunity here. This stuff can happen to anyone regardless of how old you are, how much schooling you’ve had, how much money you make, or where you live. It’s a universal risk, and the more we talk about it, the more power we take away from the bad actors.
All right. Well, the idea that AI is getting in on the action is slightly terrifying. You mentioned our robot overlords at the top of the show, and I guess they’re coming for everybody’s bank account PINs.
If only it were that simple, Sara. AI is being deployed in sophisticated ways to manipulate our emotions, find vulnerabilities in software that we rely on every day, and generally make our lives like something out of that show Black Mirror. So in this episode, we’re going to explore things like how AI is being used in scams, what the deal is with these AI voice scams, and what hellish development we might see next in the world of scams. To start, we hear from a woman named Jennifer DeStefano. She lives in Arizona and had an experience that no one should ever go through, but one that provides a window into one of the ways that scammers can reach into your heart and try to pull money from your bank account.
All right. We want to hear what you think too, listeners. So tell us your stories of identity theft or getting scammed or share how you’re working to fight it or recover from it. Leave us a voicemail or text the Nerd Hotline at 901-730-6373. That’s 901-730-NERD. Or email a voice memo to [email protected]. Here’s Sean with our first guest.
Jennifer DeStefano. Welcome to Smart Money.
Thank you so much for having me.
So Jennifer, you experienced an AI voice scam. Can you set the scene for us? What was that day like before you got this phone call?
It was just a normal day. I had two children that were up training for a ski race and I had my daughter, she was at dance, so I was going to go pick her up and then hopefully joining my other two kids later in the weekend. So I went to pull up to the studio and get out of my car to go get her, and I got a phone call and it came in from an unknown number. Originally I was going to ignore it, but knowing that I had two of them that were practicing for a ski race and unknown can be medical, you just never know, just in case I decided to answer it.
When I answered it, I said hello, and I was getting out of my vehicle, so I had all my stuff in my hands. I was walking through the parking lot, so I had the phone on speaker, and it was my older daughter crying and sobbing saying, “Mom, Mom, I messed up.” I didn’t think anything of it. She ski raced for a number of years, so it was a very familiar phone call. And I said, “Okay. What happened?” And she goes, “Mom, I messed up.” And I said, “Okay. What’d you do?” And then all of a sudden a man came on and he said, “Put your head down, lay back.”
And at that point I thought she’d gotten really hurt, maybe being taken down in a toboggan. So then I started to get really concerned. I’m like, “Wait, wait, wait. What’s going on? What’s happening, Bri? What’s going on?” And then this man gets on the phone as she starts saying, “Mom, help me. These bad men have me. Help me, help me, help me.” Her voice starts to fade off, crying and sobbing and pleading for me, and this man gets on the phone. He goes, “Listen here, I have your daughter.”
“You call the police, you call anybody, I’m going to pop her stomach so full of drugs and have my way with her and then drop her for dead in Mexico.” And at that point I had my hand on the door handle of the dance studio, and I walked inside the room and just started screaming for help. Fortunately there happened to be three other moms there who know me well. I was asking my younger daughter to get her dad on the phone, call her brother, call anybody. So one of the moms jumped up and ran over to my younger daughter to say, “Let’s go find your dad. Let’s figure this out.”
Another mom said, “I’m going to go call 911.” She stepped outside to go call the police, and the third mom sat beside me so she could hear everything the man was saying as I was trying to figure out where my daughter was, what’s going on.
And so it’s a perfectly normal day. You’re about to get your kids after a day of them doing their activities, you get a phone call and within 30 seconds your world is turned upside down.
Completely upside down. I had no idea what was going on. I had a full conversation with my daughter. It was interactive. There was no pause. There was no break. There was nothing that would lead me to believe that wasn’t her. So when the mom that stepped outside called 911, she came back in and she said, “Hey, 911 tipped me off that there’s a scam where they use AI and they can replicate anyone’s voice.” She’s like, “It could have been a voice recording.” I’m like, “It was definitely not a voice recording. It was interactive. I was asking her questions. She was responding to me. It was not a recording.” And she’s like, “Well, they can do anything.” I’m like, “But it was her crying. It was her sobbing. I know it’s my daughter. It wasn’t a recording.”
And what thoughts are going through your head as you’re having this conversation with what sounds exactly like your daughter?
I didn’t for a second not believe it. It wasn’t until another mom actually got my daughter on the phone and I talked to her and she reassured me that she was who she really was, and I could finally wrap my head around it. And then I finally believed her and then I knew it was a scam.
How much time elapsed from the time that you answered the phone to when your actual daughter was speaking to you and you were reassured the phone call that you got wasn’t legitimate?
So the whole phone call actually took four minutes, but that’s where time freezes in that panic and fear.
Right. Oh God, that’s heartbreaking. So do you know how the scammers got your daughter’s voice and maybe why they targeted you specifically?
So I had a bunch of different thoughts on that. She’s done a few interviews related to school, sports, whatnot, but that still doesn’t explain the crying and sobbing. It doesn’t explain that conversation. The voice recording on her phone is her prepubescent voice, so it’s not her current voice. So I honestly have no idea, and that’s a lot of what’s scary. At first it was, are they following me? Is it targeted? Do they know something? But then you hear how it had happened to a number of other people in different capacities, and you realize it’s a lot more arm’s length.
They were demanding money to be hand delivered to them. So they were making arrangements to come pick me up in a white van with a bag over my head. I had to have all the cash. They were going to take me to my daughter, and if I didn’t have all the cash, then we were both dead.
God, how much were they asking for?
It was originally a million dollars. And then he came up with a number of $50,000 when I pushed back that that wasn’t possible.
And to this day, it’s unclear why you specifically got this call?
Okay. And so after the phone call ended, I assume you hung up on the scammer when you realized that your daughter was safe.
So once I realized my daughter was safe, I actually had them on mute and they were furious that I wasn’t making final arrangements for a pickup. And then I picked the phone back up and I called them out and said, you don’t have my daughter, this is a scam and I’m going to make sure that this is going to come to a stop and I’m going to do anything I can to stop you. And I hung up on him.
God, what are you doing on an individual or maybe even a family level to safeguard yourselves? Have you guys established a safe phrase that you might use to confirm your identities?
So we did create a safe word, and then it’s a lot of communication. Where are you? Who are you with? Where are you going? So that way if I do get a phone call or anybody gets a phone call, you can easily put it through the test. Does this make sense? Is this where they’re supposed to be? Is this even possible? Do you know the code word? Do you have some identifiers? If I didn’t know where my daughter was supposed to be, I wouldn’t have been able to locate her as fast as I did. And I had her brother, I had all of her siblings coming together in response to help me as well. So everybody was in full communication. You have to communicate and you have to seek help.
Well, Jennifer, is there anything that you would like to leave listeners with?
Just awareness, have these conversations, sometimes maybe tough conversations, especially with children. But you have to have the conversations, have safe words, know where your kids are at. You have to have these conversations and make sure you safeguard your family.
Well, Jennifer DeStefano, thank you for sharing your story with us.
Thank you so much for having me. I really appreciate it.
Sara, I found this story just heartbreaking. I mean, at least they found out it was a scam before handing over money or before Jennifer offered herself up to scammers. But not everyone is so fortunate. Imagine how hard it is to say no to something like this when a loved one seems to be in jeopardy.
Yeah, there was a piece recently in The Cut written by a journalist who knew she would never, ever fall for something like this (don’t we all think that?) and who ended up handing over $50,000 in a shoebox to a stranger in a large SUV. I don’t think anybody ever sees themselves doing that. I’m glad Jennifer DeStefano, with the help of friends, didn’t let it get that far.
And there’s hope that help will come from more than friends. Earlier this year, the Federal Trade Commission proposed new rules that would prohibit the impersonation of individuals; it recently enacted rules that prohibit impersonating government agencies or businesses. This proposed rule would extend to, well, us. The proposal is currently in a comment period, so if you feel so moved, go to the FTC’s website, ftc.gov, and comment.
Well, next we’re going to talk with another journalist, Bob Sullivan, who’s been covering the scam world for years now. He hosts a podcast for the AARP called The Perfect Scam and is the author of Stop Getting Ripped Off, among other books. We’re going to talk about the future of the scam world and how to protect yourself as technology continues to make it easier for the bad guys. That’s coming up in a moment. Stay with us.
Bob Sullivan, I’m so glad you could join us on Smart Money.
Thanks so much for having me.
So Bob, the first question I have for you is how do I know that you are the real Bob Sullivan and not an AI-generated Bob Sullivan?
This is an excellent question, and I’m glad that you’ve started there. You can’t, really. In fact, I did an episode on my own podcast recently where I had someone clone my voice and rather persuasively introduce the podcast, although family members pointed out to me that there were just little things that didn’t quite sound right, a little nasally. So either I was AI or maybe I had a bad cold or something. It’s hard to tell.
So in this series we’ve talked about identity theft, identity fraud, and the scam world, and I’m hoping that today you can give us a warning about the future of all of this and the role that artificial intelligence or AI is going to play and in fact is playing. So to start, when did we first start seeing AI being put to use in this way? Do you remember a specific AI-generated fraud or scam where you said, oh wow, this is something new?
Well, I have to be honest with you and say that I sit here reading emails about scams and fraud all day long, and I have not seen evidence of the kinds of things that a lot of folks are talking about right now, which is voice cloning or deepfake videos being used to fool people. Here are a couple of things that I am worried about, however. All the data collection that we have, the criminals now have access to it, and it’s going to be very easy for criminals to use that data to really carefully craft their phishing pitches so that they’ll know exactly when you are transactional, for example.
Then they’ll know precisely when you order something from Amazon or what your zip code says about your income, and they’ll know how to attack the right person at the right time with the right message. That’s the kind of artificial intelligence I’m worried about: criminals using big data to essentially perfectly hone their crimes. But there’s one other thing I’d really like to mention, because enough experts have told me about it that I am quite concerned, and that’s this idea of generative AI, where a tool like ChatGPT can engage in conversations and learn.
We have told people forever that one of the ways they might recognize that they’re talking to a criminal over email or in chat or in a game is bad grammar, or sentences that don’t quite make sense, non sequiturs. Well, ChatGPT is getting very good at holding intelligent-sounding conversations. Let’s start by saying it’s probably going to eliminate the bad grammar problem. But even more than that, imagine a tool that learns along the way just the right things to say to romance someone, using a formula that’s been tested in the real world, or the right things to say to get someone to follow the instructions for an investment scam.
I think these tools are going to learn how to carry on these conversations in ways that we’ve never seen at large scale, and that’s the kind of artificial intelligence that I’m worried about being used in scams.
Okay. And can you talk us through how these AI voice cloning scams do work, whether they’re pervasive or not?
Sure. Well, I mean, there are services. The fellow who did it to me signed up for a website that lets you do this for $5 a month, and the first month is 80% off. So for literally one US dollar, you can upload samples of my voice or anyone’s voice and then generate, for a potential scam victim, something that sounds incredibly realistic. I think there’s one thing that’s important to understand about what’s different about voice cloning. I don’t know if you remember the movie Sneakers; it’s one of my favorite hacker movies.
But in that movie, they basically needed a voice passport in order to enter a highly secure building, and they needed the authority figure to say things so that they could piece together, cut-and-paste style, a certain sentence. So one way you might imagine this works is someone tricks me into saying, my mother is in distress and I need you to send money to this wire account, but that’s not it. Instead, what’s powerful about AI voice cloning is that with just a few sentences from me, they can extrapolate my intonation and my pausing and make me say anything.
So you don’t need a whole lot of vocabulary in order to make a really, really effective, almost fully independent voice clone.
Well, I’d like to walk our listeners through some of the ways that fraudsters and scammers are putting this technology to work right now in ways that are shocking even to you. Can you share one or two examples that you know of that will give us a sense of just how bonkers this new era is?
Well, let me go back to the big data example. Foreign governments and large hacker organizations do have what would look to most people like a credit reporting agency on all of us. They have thousands of bits of data about all of us that they can use against us, and it’s data that they’ve been compiling for years. So they know what your tendencies are, they know where you shop, they know where you are. We never talk nearly enough about the theft of location data. All our cell phones are tracking devices.
And so a criminal could know when you’re walking past a store and send you a precisely timed invitation to buy something at a discount, or, even worse, send you a note saying, Bob, a bank in Ireland just tried to put a $2,000 charge on your account, say yes or no. And I would believe that message right now, because I was just in Ireland. Those kinds of highly sophisticated, highly targeted crimes, enabled by massive amounts of data that can now be searched instantaneously, that’s the kind of thing that really scares me.
And those examples are highly specific and individualized, which makes them all the more believable. So it makes it hard to trust anything that’s inbound to us.
Absolutely. And this is a tragedy because technology enables so many wonderful things. It is a terrible thing that we have all of these dark stories as this gray cloud around tech that’s going to prevent a lot of people from even trying to use it, and it’s going to make all of us feel just a little bit insecure because we know these sorts of bad and dangerous things can happen to us. The best example of this is in the health arena. We’re so far behind in what electronic health records could be in America right now.
When you go to the hospital, you’re lying on a gurney and there’s someone asking you over and over again whether you’re allergic to penicillin, and you were just in a car accident. And that’s ridiculous. There are many reasons for it, but a big one is that we are so concerned about criminals misusing this data, or companies misusing this data, that we are decades behind where we could be with things like electronic health records.
Earlier this episode, I spoke with a woman who received an AI voice scam call from what sounded like her daughter, and it of course wasn’t her daughter. But after everything settled down, she still doesn’t really understand how these people got her daughter’s voice. Her daughter isn’t really on social media, and this woman is also very unclear as to why she was targeted. So do you know how scammers are capturing people’s voices and why they might choose to target one person over another?
So I don’t know. I think for the vast majority of young people, it would be fairly trivial to examine a couple of TikTok videos and get enough of a voice sample to fake their voices. There are people who are not on social media and whose recorded voices aren’t on, say, any school websites or anything like that, but I think they are few and far between. So I think most people should assume that a criminal could absolutely get enough audio samples of your voice to do this to you. I can’t speak to that specific instance or why that person was targeted or why that child was targeted.
The only thing that concerns me is I don’t think we should give anyone the impression that this is happening on a widespread basis. It’s not. 99% of these kinds of calls are still being done by human beings in boiler rooms. Nevertheless, this absolutely can be done, and it can be done really inexpensively. And as I just mentioned, all of us are vulnerable to this. Even if you don’t have any social media, you’d be shocked at how many pieces of your life have been posted by other people.
So it’s out there, and again, it takes very little. We’re talking probably less than a minute of audio to generate a fake you.
What do you think we’re supposed to try to do to combat this? I mean, using me as an example, I host this podcast, you host one too. Our voices are out there just waiting for scammers to take a clip and make us say whatever they want, call our loved ones and use that voice to try to get their money. How do we fight that?
Yeah, you and I are screwed.
Sorry. But the best advice I’ve heard on this came from another expert, so I can’t claim it myself, but I think it’s very good advice. At the beginning of the Photoshop era, people saw pictures where the pyramids had been moved and weren’t skeptical of that. We just thought photographs couldn’t lie. I think nowadays, for the most part, and certainly not everybody, but for the most part, if you saw a crazy picture of Joe Biden riding on a camel or something, there would be at least a piece of you that would say, wait a minute, this might be fake.
There’s now an impulse that things you see might very well be faked. I’m hoping that our level of 21st century digital sophistication gets there quickly enough with audio that your parents and my parents will have a predisposition to think, if this is a weird phone call from Bob or Sean, it could be fake. And I think that’s the sort of learning curve we all have to go through, kind of as a society.
Well, let’s turn to some tactical ways that people can try to protect themselves. Can you tell us about the importance of things like passkeys, biometrics, and other ways to authenticate that you are who you are when you get a call from someone, or when you allegedly call someone else?
I’m glad you brought that up. When it comes to voice printing in particular, there are new technologies, a little bit like image watermarking, that they’re discussing putting on voices. So you can imagine something, even something inaudible, embedded in an audio phone call that the technology company or the phone company uses seamlessly to verify that you are you, sort of like a Verisign email or whatnot. So there are people working on technologies that would help with this verification. I’m not a fan of putting these really hard things onto individual consumers.
I think it’d be much better if the technology companies were forced to solve these problems because I can’t give my mom advice on how to verify how I might contact her at every platform that ever is going to exist. That advice is going to get outdated almost immediately.
Given that we do live in the world that we are living in, I’m trying to think about ways that I can protect myself and my family. After I began doing research into 21st century scams, I established a safe phrase, so if my family gets a call that alleges it’s from me and I’m in a panic, they’ll say, “Hey, what’s the safe phrase?” And I will tell them that phrase, if it’s actually me. And if it’s not, then the scammer’s going to try to divert them some other way, I’m sure.
I do think that’s great, and I don’t mean to trivialize any of that, but I would like to point out most people in security would say you’ve also created a vulnerability because someone armed with that phrase could easily disarm someone in your family, right?
That’s true. Although the phrase has only been uttered in person when we agreed on what the phrase is. So we’ve tried to keep it as away from recording devices as possible to the extent that we can.
The only real point in my saying that was that none of these things are foolproof, so it’s good to have that in mind. I think the one thing that helps all the time is this: whatever we’re talking about here, whatever technology we’re using, whatever the story is, in the end it’s almost inevitably a cover story for “give me money.” There’s an ask of some kind. And steeling people against the ask is really, really important. The best way to do that is interruption. The best way to do that is to train everybody, in every circumstance, whatever is happening, to stop and talk to an independent third party, whether that be a family member or a financial professional or something.
If your son’s in jail in Europe and he needs bail money immediately, take the 15 seconds to talk to someone not involved in the situation and hear the words come out of your own mouth. When you get a phone call you don’t expect, hang up, go to the company’s website yourself, and call back on the official published number. That solves about 99% of these problems.
Well, Bob, I’m asking this of all of the experts that we’re talking with for this series. So I’m going to ask you too, have you ever experienced a scam or identity theft or fraud?
No, but I’ve certainly been through a bunch of credit card-style identity thefts; fortunately, knock on wood, nothing that we would consider deeply involved identity theft.
Well, Bob, do you have any hopeful thoughts as we wrap up this series, which has been a bit of a bummer as we’ve talked about fraud and scams and people losing their life savings to technology assisted terrible people?
Yeah. So I spend all week talking to people who’ve had their life savings stolen from them in all manner of ways, and it’s hard to stay optimistic. I think there’s a whole bunch of factors coming into play here. We have an aging population, many of whom thankfully have a lot of savings, and they’re an easy target. And as I’ve mentioned, we have all of these tools that make it so much easier for someone halfway around the world to steal money instantly in untraceable ways. This has never happened in human history before, so this is the golden age of crime.
However, we are all talking about it now, so that’s really positive. Here’s the most optimistic thing I can tell you. Young people, software designers, engineers inside companies are now getting out of school having taken ethics classes and social impact classes, and they’re starting to push back on their managers when they come up with tools like this. The tide will turn when enough people who have a grandparent who’s been the victim of a scam work at a software company and say, we have to put this protection into this device before we release it to the world. And I do think those conversations are happening. So I am actually optimistic about that.
That’s good to hear. And is there anything else that you wanted to mention that we didn’t touch on?
What we find is that a really, really big obstacle to fixing this problem is shame and embarrassment. Many, many people won’t come forward after they’ve been a victim of a crime like this because they feel stupid; they call themselves stupid. All the language around scam crimes tends to focus on the individual instead of the system. If you read a news story about a person who fell for a home improvement scam, that just doesn’t sound the same as someone who was robbed at gunpoint.
But that person was the victim of a crime. That’s what happened at the end of the day.
They’re a victim of a crime, and we work hard on the language that we use to stress that there was a crime. There’s something about if we say, well, that person fell for this scam. Well, I would never fall for that scam. You can sort of put it at arm’s length, and that makes it a little easier to not do anything about the problem. And it takes the focus off the criminal. We kind of think the criminals are clever and sexy. But more than anything, we want to try to get away from the idea of shame because when someone is embarrassed because they are a victim of a crime, they don’t come forward.
The statistics don’t reveal the true nature and breadth of the crime. Everybody will tell you this: all of this crime is wildly underreported. So however big the numbers seem to be, they’re at least double what we hear from the Federal Trade Commission and whatnot. And so anything I can do to relieve the stigma of being a victim of a crime like this, I’m all for it.
Bob Sullivan, thank you so much for helping us out today.
Thanks a lot for having me.
So Sara, after four episodes of hearing from experts and people who have experienced scams, I’m in a state of what I would call bleak optimism. The world right now is rife with scammers and their methods of duping innocent people are evolving at a rapid pace. But simultaneously, I can’t remember a time where scams and fraud were more present in the cultural conversation. Yes, it is fully a tragedy that our means of communication are so compromised that we cannot trust a call from a loved one in what seems like their most dire moment.
That really can’t be overstated. But hopefully the increased awareness of these scams will help people avoid sending money to bad actors and mitigate feelings of shame that people carry after enduring a scam. And hey, maybe one day our government will make some laws that help tamp down on the rampant scams that we’re all facing.
And there’s this saying in journalism, if your mother tells you she loves you, fact check it. Well now you have to. So that’s the world we live in.
If anyone contacts you at all, fact check it.
Yeah. And text them on the side and be like, “Hey, are you calling me from jail right now?” And they’ll be like, “No.”
I think the bottom line for everyone listening is to exercise extreme caution when you speak with anyone online and before you send money to anyone ever.
If somebody is asking you for money and you don’t really know who they are, they are not who they tell you they are. How’s that? How’s that for a general rule?
All right. Well, for now, that’s all we have for this episode and this Nerdy deep dive about scams and ID theft and fraud. If you have a money question about any of this or anything else, turn to the Nerds and call or text us your questions at 901-730-6373. That’s 901-730-NERD. You can also email us at [email protected]. Visit nerdwallet.com/podcast for more info on this episode and remember to follow, rate and review us wherever you’re getting this podcast.
This episode was produced by Tess Vigeland. Sean helped with editing, Kevin Berry helped with fact checking, and Sara Brink mixed our audio.
And here’s our brief disclaimer, we are not financial or investment advisors. This nerdy info is provided for general educational and entertainment purposes and may not apply to your specific circumstances.
And with that said, until next time, turn to the Nerds.