It's 1942. The Popular Front has led France and Britain to victory over Germany, Japan continues choking on China, blah blah blah....
The real question is what happens to America. I can't picture American society without the changes the World War and Cold War brought, and I'm curious what others think.