What is health insurance?

Health insurance is a promise by an insurance company or health plan to provide or pay for health care services in exchange for the payment of premiums.

Health care in the United States is delivered and insured in many different ways.