Let me start by saying I reap the benefits of Capitalism, and it has done much to foster human development. Sometimes, however, making a buck gets in the way of, or outright destroys, the true purpose of an industry. Medicine comes to mind. There's no profit in curing illness, so illness is treated instead. That seems like the mindset of a sociopath rather than that of someone who wishes to help people. Yet here Americans are, ignorant of any better way.

Health insurance is another bilking of the American people. The idea is that if one pays into the system, the company will provide funding in the event of an emergency or health crisis. Unfortunately, the company decides what it will help pay for. Oh, and in the USA, having health insurance is required by law. Either one pays an exorbitant amount for coverage through whatever company their employer partners with, or they find their own. Lower-income families can file for benefits from their respective state. I myself have been in a situation where my paycheck wasn't enough to cover either the insurance my employer partnered with or private insurance, yet the state told me I made too much to qualify for public aid.

Ideally, having insurance should alleviate or absolve one's financial responsibility when visiting a doctor or hospital or filling a prescription. It doesn't exactly work that way, though. Oftentimes the payments a person makes toward insurance equal the payments the insurance makes on a medical bill. I've had a hospital bill where my insurance paid absolutely nothing. So what's the point of having insurance, then? Oh, right... Americans who don't have health insurance are breaking a law.

I've stumbled upon a fallacy here. How can one be a good consumer and feed the greedy system of Capitalism for an adequate lifetime if they have poor health and die at an early age? Wouldn't it benefit the economy if all members were able to work while blindly consuming and living to a ripe old age?
Isn't that the major goal of Capitalism? Wouldn't some sort of healthcare plan that isn't a business make such a thing possible? In the end, some industries ought not to be businesses, and healthcare is one of them. How long until the greater portion of the US populace understands this?