Health is one of the most important and complex issues facing our country today, and the workplace has a central role to play. Not only is the workplace the source of health insurance for most Americans, it also significantly influences worker health, both in how job conditions protect or threaten workers' health and safety and in how the job promotes or interferes with personal wellness.