Hey there, good lookin’. I’ve spent the last few months rearing my little ginger delight, and that baby does not like to share. I thought I’d spend my maternity leave snuggling/feeding/changing/playing with my newborn, but I also thought I’d have enough time to get bored and to parlay that boredom into creativity. I thought I’d be writing during the supposed “downtime,” but this ginger allows minimal downtime. And when I do get some, he ensures it’s at the end of the day, when I’m running on fumes and it takes every ounce of remaining energy to hold my eyelids open for whatever crime/drug/war show we’re currently watching.
But let’s table this personal moan for later, shall we? I want to moan about something more irritating than how formulaic every show on Netflix and Amazon has gotten. At the risk of sounding dramatic, it’s been eating away at my soul. It was the last thing I thought about last night as I tried my damnedest to fall asleep:
When did everyone turn on Florida?
It may just be my chosen entertainment, but so much of what I watch (e.g., The Good Place, Broad City) and listen to (e.g., “Pod Save America”) is coming for Florida so hard, y’all (how hard is it? 😂). Anyway, it’s hurt my feelings. My closest family is all there. So many of my good friends are either there, from there, or have family there. I loved growing up there.
The Florida bashing started casually, among friends. I was on a ski trip back in February, and one of my friends made no fewer than 900 jokes about my handgun-shaped state that week. He’s of Midwestern origin, and (I’m paraphrasing) he was all, “Florida is the WORST. You’re all alligators, humidity, and arsonists named Crystal Metheny.”
It put me on the defensive, y’all, but I was mostly just confused. Where was he getting this? I had a charmed childhood with sunshine, swimming pools, sandy beaches, and Disney on the weekends. We wore shorts and ate citrus like it was going out of style. I thought Florida was the Hawaii of the East Coast.

Also, call me old-fashioned, but I wasn’t done with Los Angeles. I thought we all enjoyed tearing that city a new one, did we not? I used to live in New York, and I distinctly remember everyone in the city rallying together to collectively shit on Los Angeles on the daily. We agreed that the traffic was hell on earth, most of the people were dull and vapid, and the same Shangri-La-like weather every day was painfully boring.
No matter that we all had friends and family there; it was almost like they were in on the joke. They couldn’t actually get offended when they got to live amongst gorgeous, sprawling, Spanish-tiled homes and all that Hollywood royalty.
In hindsight, we were probably jealous. New York winter lasts a solid eight months, and we trudged through the rain toting groceries and bragging that we were so glad we didn’t have to drive. Also, we were bulletproof. No one dared retaliate and come for New York. Pop culture defends New York: it was and is the setting of so many iconic TV shows, movies, and books.

I’ve been living abroad for nearly seven years now, so I can no longer huddle inside the protective force field New York provides. I’m a Floridian, and Europeans, at least, think that’s pretty cool. Their eyes light up when I tell them. I’m harassed with “WHY would you ever leave?”-type comments on the daily.
Americans seem to disagree, though, and I’m confused about when that happened. I want answers. Again, I’ve been gone, so maybe I’ve missed something. Maybe Floridians agreed to be the comic relief of a disgruntled nation. Maybe I’m just overly sensitive and annoyed that I’m not in on the joke. Mostly, I’m annoyed that I keep being put in the position of having to defend Florida. Can’t we just go back in time and agree that LA is the worst?