I've been kicking around this WI idea: after their betrayal by Philip IV of France in 1307, the Knights Templar sail to America and, over the next two centuries, conquer the Americas and Christianize the population. I know it's against conventional wisdom, but there is some evidence of European activity in the Americas during the late 14th century.
Any thoughts?