Who Really Makes Your Health Care Decisions?
Gone are the days when the patient-doctor relationship was just between the patient and the doctor. In American health care today, insurance companies, hospital systems, and pharmaceutical companies all have a say in virtually every decision your doctor makes about your health.
by Dr. Rebecca Bub
August 27, 2019