I believe we should have a state/government run health insurance program for EVERYONE in this country. This is what I believe, though I know many people are horrified at the thought.
When the show I worked on went on hiatus a few months ago I was briefly left without health insurance, as I could not afford to pay for COBRA. When the stimulus plan kicked in, it suddenly became affordable for me, though it was still my biggest expense other than rent and student loan bills. It was really scary not being covered, because I knew that one minor illness or accident, one hospital or maybe even a doctor's visit, would leave me in debt for years. I don't make a lot of money right now. Hopefully one day I will, but right now I am getting by by budgeting carefully and hoping that nothing goes wrong. Many people are in the same position.
I had a family member pass away from cancer a few years ago. It's a devastating disease. Because he lived in Europe, his family was not left with crippling debt or bankruptcy on top of the loss of their loved one. When people get very sick here, they had better be covered and they had better stay covered, because otherwise not only will they be in debt for the rest of their lives, they risk running out of money to get treated at all, or they get subpar treatment.
And if they recover, and then lose their coverage for whatever reason, good luck ever getting covered again. Insurance companies don't want to touch people with a history of serious illness with a ten-foot pole. You'd better hope you stay healthy forever!
Add to this the fact that public education is free (which it should be, by the way) while if you are sick you're on your own, and the system seems really fucked up. One day, down the line, if I have kids, I can teach them to read and teach them history, etc., if I have to. There are books out there to teach ME what I need to teach them. People homeschool all the time.
I cannot learn to administer chemo from a book if someone I love gets sick. I cannot learn to do a triple bypass or how to create antibiotics in my kitchen.
As much as I think free public education is important, I think free healthcare (well, not free, but public, since nothing in this world is free) is more important by far. If I had to choose for my family, I would choose health care every time. If you are well enough off, your kids probably go to private schools and have private healthcare anyway. If you are not rich, the state should provide both. But you can't fix dead. You can teach someone to read or whatever at any point, but you can't bring them back if they die from lack of care.
So why does everyone basically agree that education is a fundamental right in this country but healthcare isn't? Why are some conservatives horrified at the thought of a government-run program? I'm not saying education is not important. I think it's critical. I think without it we would collapse back into the dark ages. I would fight for it to the end. But do people not feel the same way about health care? Why is getting sick somehow our own problem?
I was just at a website with a lot more facts and data than I have, and it is worth reading, but do we really need a ton of statistics to understand this? Isn't it common sense?