What Has Happened To Us?

Published August 13, 2015 by Angela

As I manage three blogs and read through what the community is saying about the topics I write about, I find it so very sad to see so many women with newborn babies struggling to raise them alone. Women who were told to “get rid of it,” but didn’t. Women who were promised, “I will be there forever,” when it really meant “I will be there until someone else catches my eye.” Women who suddenly found themselves divorced with little children because “family life” became too much for the father (I use that term VERY loosely).

Why is it okay to just walk away? To say, “It is not my problem”?

What happened to our morals, our foundation, our right and desire to live the American dream?

It is gone. And the people who still stick up for it are the outcasts, the freaks, the ones who reject change.

Believe you me, I would rather it still be like it was when I was a child in the ’70s and ’80s, when parents beat your butt, you went outside and played until the streetlights came on just to give your parents some peace of mind, and you did NOT talk back.

One comment on “What Has Happened To Us?”

  • This is sad… If you want my theory, it’s part of a me-centered shift in our culture. People used to live for a higher purpose than their own happiness. Now, if something doesn’t make you happy, get rid of it. Kids and a family don’t always make you happy. People used to be taught to stick with things, that there would be highs and lows, just hang on. (And you were stigmatized by your family if you left your girlfriend/wife, with or without children.) I don’t know. There are good things about a modernizing society, and some sucky things…
