
As I understand it, the current American healthcare system has its roots in war industries beginning to offer health insurance during WWII. Since wages were frozen, this was a way to attract needed workers.

What if, instead, the US government decided that a healthy workforce was critical to the war effort? So it set up a program of universal healthcare to maximize wartime production. After the war, the program proved popular enough that the Truman administration kept it going. By the time the Republicans took back the White House, it was entrenched in American society.

What changes do you see in American society? If this has already been discussed, please point me to the appropriate threads.

Regards

Stubear1012