saggyjones
Well-Known Member
No, they shouldn't. Employers are not responsible for your personal health. This country was founded on the ideals of leaders like Franklin and Hamilton -- both self-made men who came from nothing and made something of their lives. Throughout our history, that idea has been carried on from Andrew Jackson to Frederick Douglass to Abraham Lincoln to Andrew Carnegie to Oprah Winfrey.
Not everyone has the capability to rise up like those people did, because most people don't have any extraordinary skills.
USMC the Almighty said: Self-responsibility is the backbone of this country, not relying on other people to take care of you. If you want Socialism -- go to Europe (though with all of the Muslims flooding into the EU, it's only a matter of time before they come under Sharia law).
So what if it's the backbone of this country? Why are you so afraid of change? I believe companies should be required to provide health care because it would 1. help the economy (like I stated above) and 2. make sure everyone has health insurance. That doesn't mean equal insurance for everyone, but there needs to be a standard -- like a minimum wage for health care.