And the United States, to its credit, provides just about all of the above.
I don't get this idea that we're obligated to care that foreigners don't think we're the best, or to concern ourselves with their opinions of our country or our culture. I don't care what the people in the next state over think of my state. Why would I care what someone on the other side of the world, who will never set foot on American soil, thinks?
My idea of "live my life" is that I'm not on the hook for the healthcare expenses of every Tom, Dick, and Harry who doesn't take care of himself and is under no obligation to use healthcare resources wisely.