Should every employer in the US be required to provide health insurance coverage to its employees? If so, to what degree? If not, should there instead be an individual mandate requiring all individuals to purchase coverage for themselves and their family members?