This is totally a tangent now and not related to the original topic anymore, but I'm just curious: is it only a trope from movies and TV shows, or are American parents really that afraid of their teenage kids having sex? Apart from maybe very religious families, I've never seen this be much of an issue where I grew up (Western Europe); people just accept that teenagers are going to have sex.
I'm not American, so I can't speak for them, but I think it has a lot to do with America's religious roots in orthodox Protestantism and in largely Catholic immigration.