What is something that was a normal thing people said to you when you were younger, that now is super weird or horrifying?
When I was young, I was told by lots of people that "women play at sex to get love, and men play at love to get sex." It was obviously meant as a warning against having premarital sex, but Jesus Christ, y'all aren't even gonna try to hold men accountable or have any expectations of them, just dump some kind of faux purity onto the shoulders of an actual child. At the time it seemed normal, but now it's horrifying.