Is health care a basic human right?

What are your feelings on the United States’ current health care system? Do you think health care has improved or worsened under the Affordable Care Act? Is health care a basic human right? Do politicians have the right to decide what Americans do with their bodies?
Think carefully: do you answer yes in some cases and no in others?