Now that we’re on the road to universal health care, a question that I think we should be asking is, shouldn’t we just do away with the health insurance system? And most types of insurance, for that matter?
I realize that the knee-jerk reaction is to point out how far-fetched it would be for our country to head in that direction. After all, so many jobs depend on, and so much money is made by, the insurance industry that the resistance it would mount against any such initiative is bound to be extraordinary.
However, ultimately these companies are making money by providing (and in many cases, withholding) protection that really shouldn’t be treated as a luxury. If a person is sick and needs health care, they should be able to receive it, regardless of what type of coverage they might have, if any at all.
The federal government already spends enormous sums investigating and preventing insurance fraud, which suggests we already accept that the government should be involved in these matters.
Obviously, this means we end up paying more in taxes, but in the long run it’s bound to cost the average citizen less than health insurance premiums, life insurance premiums, house insurance premiums, and car insurance premiums combined. People don’t like having their taxes raised, sure, but the tradeoff is more security in the things that matter most, and a healthier, safer society as a whole.
Some people will dismiss these arguments simply because they make me sound like a socialist, but I don’t think it makes you a socialist to argue that some programs and services just shouldn’t be privatized.