Why Our Health Matters shows how the American way of health has gone wrong, creating the crisis in which our country is now embroiled. Dr. Weil identifies the roots of the problem, showing how medical schools, insurance companies, and pharmaceutical companies have failed us, while also pointing the way to a solution.