So sick of shows talking about children ruining lives & relationships, insulting a significant portion of the population. And God forbid children see these shows & believe they're a burden to their parents instead of a blessing. There are many countries whose populations are in decline because people aren't having children, instead living in an extended adolescence, turning cities into Never Never Lands — one of the many symptoms of a society in decline, a culture partying its way to destruction. Look at history, at the behavior of cultures before their empires went up in flames: promiscuity, disdain for what we now call blue collar work, sacrificing children (literally & via birth control), growing acceptance of what we now call alternative lifestyles, including homosexuality & pedophilia, and pornography, and rejection of the conservative values, morals & religion of previous generations. Hollywood & the left are doing everything they can to destroy society, starting with families, normalizing hatred of the most innocent & vulnerable lives.