I've noticed that a lot of the time, when for whatever reason East Asians discover the Americas, it results in Chinese colonies in southern California. The problem is, from what I understand of Chinese culture, that doesn't seem likely. Basically, they thought the world already belonged to them, and if barbarians weren't smart enough to recognize that fact, it was their own folly. So I can see the Chinese discovering America but not doing much about it. But what about the Japanese or Koreans? I don't know much about what they were doing at the time, assuming a non-existent or virtually non-existent European presence outside their corner of the world.