WI: Effect of Japanese written with Romaji rather than Kanji, etc.

What if, sometime after World War Two, the US occupation forces abolished the previously used writing systems as a way to democratize Japan? IOTL there was some talk of doing this, as it was felt that the use of the previous system(s) was holding the average Japanese back from becoming literate. What happened OTL was that most people were found to know how the systems worked, and it was decided to simply stick with the established ones.
But what if that was not how it went down? What if, instead of sticking with the three separate written forms of Japanese, the American occupation forces decided to try to wipe out any vestiges of the old ways and cut modern Japan off from its past?
What would the ramifications be for Japan following this? How would she be viewed by the rest of Asia if Japanese was written in a Roman alphabet rather than the more typical Asian scripts? And what would the US-Japanese relationship likely be like?
This is just an example of kanji and all its symbols to give you an idea of the complexity of the language. For a non-native user it is crazy difficult (Chinese still beats it easily though, from what I hear).
[Attachment: Kanji map.png]
 
I could see phasing out kanji, but hiragana and katakana are way more tailored to the Japanese language than the Latin alphabet. The only people who would benefit would be Westerners who are trying to learn Japanese. If you want to see an Asian language that adopted the Latin alphabet, look at the hot mess that is Vietnamese.
 
I could see phasing out kanji, but hiragana and katakana are way more tailored to the Japanese language than the Latin alphabet. The only people who would benefit would be Westerners who are trying to learn Japanese. If you want to see an Asian language that adopted the Latin alphabet, look at the hot mess that is Vietnamese.

Why is Vietnamese a "hot mess"?
 
One consequence is that adopting romaji makes all previously written texts completely unreadable to future generations. And there are a lot of them. Some texts might be transliterated, and choosing which ones is an interesting problem. All in all, it can significantly change the cultural identity of modern Japanese. It would be easier for them to read European languages - afaik it was/is a big problem for the average Japanese to learn a foreign language.

Changing a writing system is a big undertaking. It can be done successfully without much turmoil only when there is either no strong literary tradition or that tradition is not familiar to the majority of the population.
 
Why is Vietnamese a "hot mess"?

The Latin Alphabet is generally a poor choice for a tonal language with lots of homophones. See pinyin, which is a great system for Westerners learning how to pronounce Chinese, but would make for an exceptionally confusing primary system of reading and writing the language. Imo the best route for Vietnamese would've been to replace their earlier use of hanzi with a script tailored towards their language, probably an abugida descended from the Khmer script or something completely new like the Koreans did with Hangul.
 
One consequence is that adopting romaji makes all previously written texts completely unreadable to future generations. And there are a lot of them. Some texts might be transliterated, and choosing which ones is an interesting problem. All in all, it can significantly change the cultural identity of modern Japanese. It would be easier for them to read European languages - afaik it was/is a big problem for the average Japanese to learn a foreign language.
That was the whole concept of changing the system. The US officials felt that for Japan to democratize, the old culture of Japan needed to be done away with.
A new way to read the language means that no one in the future can read about the samurai and get ideas about a new empire.
 
Agree that the Latin alphabet is awkward, with too many letters with too many different sounds.
Sadly, English spelling got really confusing after the Great Vowel Shift (post-Shakespeare and Queen Elizabeth I).
A phonetic version of English would only use the letter "K" when referring to kittens. IOW, no double sound for the letter "C."
Simplified English would need to add a new letter for the "th" consonant sound, since I do not know of any other language that uses that distinctive sound (th).

I would also like to see a doubled "rr" when a letter is rolled (Spanish, French and German).

Speaking of German (apologies for the pun), modern German spelling is easier to read than modern English because they always use the letter "K" when referring to kittens.

Apparently the Serbs solved the pronunciation problem by adding a few extra letters with distinctive sounds.
 
Romaji, as others have pointed out, is horrible for Japanese. I understand Japanese businessmen, for instance, trace kanji on the table between them in conversation, because there are so incredibly many homonyms.

You notice that Japanese is still written mostly with kanji, with kana only in a supporting rôle (grammatical particles, foreign words, furigana for schoolchildren and others learning the language). There's a reason for this.
 
Mesdames et messieurs, allow me to introduce a native Japanese attempt to Romanize their language - Nihonsiki. It's basically ready-made for use should the occupation authorities go for it.
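For readers unfamiliar with it, the difference between Nihonsiki and the Westerner-oriented Hepburn system can be sketched in a few lines. The mapping below is a toy illustration covering only some of the syllables where the two systems diverge, not an official table:

```python
# Toy transliterator contrasting Nihonsiki with Hepburn for a handful
# of the kana where the two systems differ; all other kana are romanized
# identically and are passed through unchanged here.
NIHONSIKI = {"し": "si", "ち": "ti", "つ": "tu", "ふ": "hu",
             "じ": "zi", "ぢ": "di", "づ": "du",
             "しゃ": "sya", "ちゃ": "tya", "じゃ": "zya"}
HEPBURN   = {"し": "shi", "ち": "chi", "つ": "tsu", "ふ": "fu",
             "じ": "ji", "ぢ": "ji", "づ": "zu",
             "しゃ": "sha", "ちゃ": "cha", "じゃ": "ja"}

def transliterate(kana, table):
    # Greedy left-to-right match, trying two-kana digraphs first.
    out, i = "", 0
    while i < len(kana):
        if kana[i:i+2] in table:
            out += table[kana[i:i+2]]; i += 2
        elif kana[i] in table:
            out += table[kana[i]]; i += 1
        else:
            out += kana[i]; i += 1  # pass through unmapped kana as-is
    return out

print(transliterate("しちじ", NIHONSIKI))  # sitizi ("seven o'clock")
print(transliterate("しちじ", HEPBURN))    # shichiji
```

Note how Nihonsiki keeps each kana row regular (si/ti/tu follow sa/ta/ta patterns), which is exactly the "tailored to native needs" point made below.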
 
For the record, I'm not a native user.

Pfft, Kanji, Hanzi, Hanja, and other derived writing systems are really easy. There's something called a radical which indicates the meaning, pronunciation, etc. There are also way too many homophones for it to be feasible to use only hiragana and katakana.

It would be easier for them to read European languages - afaik it was/is a big problem for the average Japanese to learn a foreign language.

Incorrect. The major difficulty for English (and, well, any language) learners is the grammatical features present or not present in the learned language. Japanese is an agglutinative language heavily influenced by an analytic language, Chinese, while English, descended from fusional Proto-Indo-European, shifted to a more analytic language. The other difficulty is sounds not present in the learner's language; for example, the Japanese language only allows a CV syllable structure, with the sole exception of "N", the only consonant that can come at the end of a syllable, causing endless difficulties for native Japanese speakers.

Toraburu (Trouble), Waifu (Wife), Oirushokku (Oil crisis), etc.
 
Note that the following is from someone who is interested not only in linguistics in general, but also in the East/Southeast Asian linguistic area; there's a lot in those languages, across all their language families, which even to this day fascinates me. At the same time, I've done quite a bit of reading on this topic, so while I don't know everything off the top of my head, I can give a general, realistic idea of what most linguists would postulate.

I could see phasing out kanji, but hiragana and katakana are way more tailored to the Japanese language than the Latin alphabet. The only people who would benefit would be Westerners who are trying to learn Japanese. If you want to see an Asian language that adopted the Latin alphabet, look at the hot mess that is Vietnamese.

In general, no writing system is tailored to suit one language or another. That is the beginning of the fall into a trap many linguists warn against, which is conflating speech with writing. Spoken languages change all the time; written languages by their very nature are more conservative and tend to reflect older states of the language. At the time postulated by the OP, kana usage reflected Classical Japanese pronunciation more so than Modern Japanese, which meant that (much like modern Korean orthography) it was morphophonemic and tended to reflect the underlying grammar instead of the language as it was spoken. Whether it was a good idea to change kana usage or not is anyone's guess, but Japanese reformers both before WW2 and after (with quite a bit of success) advocated reforming the writing system to match the modern pronunciation. This was also the underlying rationale behind Nihonsiki Romanization, which not only was a way for Japanese people to write their own language in Romanization (and thus make it more "modern") but also a way to make the language easier to learn by having the writing reflect the pronunciation rather than the morphophonology and grammar. As a result, Nihonsiki makes no concessions to Western learners of the language, since the user is assumed to know the intrinsic rules of pronunciation, including the allophones.
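To make the "kana reflected Classical Japanese" point concrete, here are a few textbook historical spellings next to the modern spellings the post-war reforms adopted (examples chosen by me for illustration, not taken from the thread):

```python
# Well-known historical kana spellings versus the modern (post-1946
# reform) spellings that match current pronunciation. These are my own
# illustrative examples of the morphophonemic/archaic spellings the
# pre-war orthography preserved.
HISTORICAL_TO_MODERN = {
    "けふ": "きょう",          # kyō, "today"
    "てふてふ": "ちょうちょう",  # chōchō, "butterfly"
    "ゐる": "いる",            # iru, "to be/exist" (obsolete kana ゐ)
}

for old, new in HISTORICAL_TO_MODERN.items():
    print(f"{old} -> {new}")
```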

As for your comment, and to address this post below:

Why is Vietnamese a "hot mess"?

Vietnamese is actually an exception; from an objective point of view, Quốc ngữ represents the language in all its diversity very well, considering the vast changes that took place between Middle Vietnamese (the stage of the language represented by the orthography) and the modern varieties. The reason why it looks so difficult and alien in the first place is that the Jesuit missionaries (no doubt aided by local Vietnamese clergy) relied on the Portuguese language and its orthography as its basis (with some input from Italian, as particularly noted by <(n)gh> - a combination which does not exist in Portuguese, which would instead - like Spanish - use <(n)gu>). However, around this time - the 15th to 17th century, which the orthography reflects - Portuguese itself was undergoing a great deal of change, particularly in its phonology, as it transitioned from its medieval state to its modern form (and in the process began to split between the Portuguese spoken in Brazil, which best represents Classical Portuguese in many areas, and the Portuguese spoken elsewhere), so the fact that the missionaries were able to make the orthography work at all is quite remarkable. Some of the quirks of the Vietnamese alphabet, therefore, are basically Europeans trying to pronounce Middle Vietnamese using sounds familiar to them. Take, for example, the letter <d>, which is pronounced /z/ in Hanoi and /j/ in Saigon/HCMC. In Middle Vietnamese, this letter instead represented /ð/ - a common allophone in Iberian Romance languages for /d/ when not after a consonant (or nasal vowel, in the case of Portuguese) or a pause (e.g. absolute word-initial position). The representation of <g> in Vietnamese also reflects the alternation in Iberian Romance languages between */g/ and */dʒ/.
The best evidence for the Portuguese roots of the orthography is <nh> and the difference between <s> and <x>, the latter best preserved in southern Vietnam; <nh> is a Portuguese digraph borrowed from Occitan/Provençal to represent the same sound for which Spanish used either <nn> or (currently) <ñ>, while the difference between <s> and <x> is because Middle Vietnamese /ʂ/ and /ɕ/ sounded to the Portuguese Jesuits like, respectively, /s/ and /ʃ/. Actually, from a European point of view, Vietnamese was easy to Romanize, since its sound system and grammar were both comparatively very simple, even once one mastered the tones, compared with the Portuguese language and all its complexities. Apparently the orthography did its job quite well, since there are records of missionaries who were able to preach in Vietnamese. So it's not a "hot mess"; it's a reflection of the situation at the time the language was recorded in a Latin orthography, and for representing the current language (particularly considering the huge phonetic differences between Hanoi in the north, Saigon/HCMC in the south, and the heavily conservative rural central dialects) it works well in its own right. Learning its features is no different from, say, trying to learn Czech or Polish - let alone Greek, with its own complex etymology-based orthography.

The Latin Alphabet is generally a poor choice for a tonal language with lots of homophones. See pinyin, which is a great system for Westerners learning how to pronounce Chinese, but would make for an exceptionally confusing primary system of reading and writing the language.

In my personal (and thus subjective) opinion, every writing system, even the ones that render virtually every single phoneme almost precisely, is a poor choice for any language - once you render a language into writing, once the spoken form changes the written form will have a hard time catching up. Having said that, I think that for Japanese (and also for Chinese, where I first heard this argument) the homophone problem is greatly overblown. OK, so the Japanese managed to borrow Chinese loanwords twice at different stages throughout their history (thus reflecting different regional pronunciations), but that doesn't affect my overall theory. First off, most characters in Japanese and Chinese dictionaries tend to be obsolete or archaic, so all one focuses on is a subset of a larger set. Yet precisely because one assumes that the obsolete/archaic characters are still in current use, and with so many pronounced similarly, one thus assumes that there is a homonym problem. In actuality there really isn't a homonym problem with either language, and the reason for that is the trap I mentioned earlier of confusing speech with writing. By that token, English has a homonym problem, French also has one, and so do Greek, Hausa, Arabic, and most Polynesian languages - in fact, thanks to phonological processes at work, there isn't a language which does not have a few homophones here and there. That is a natural part of language, and one works out which word means what according to context. With Mandarin Chinese in particular, the whole homonym thing is overblown, as it is a naturally polysyllabic language, a fact which is masked by the writing system. Japanese is no different, IMO.
(BTW, Standard Japanese uses a pitch accent, not tone - only Kansai dialect has anything resembling tone, and even then there are whole areas of Japan which do not have a pitch accent at all, but just regular stress which depending on the region may or may not be associated with any form of accent.)
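The pitch-accent point can be made concrete with the textbook "hasi" triplet (my own illustration, written here in Nihonsiki; pitch patterns are the standard Tokyo ones):

```python
# The classic set of words that plain romaji would collapse into one
# spelling. Pitch is given as H/L per mora, with the pitch of a
# following particle in parentheses; kanji keeps them visually distinct,
# and Tokyo pitch accent keeps them audibly distinct.
HASI = {
    "箸": ("hasi", "chopsticks", "HL(L)"),
    "橋": ("hasi", "bridge",     "LH(L)"),
    "端": ("hasi", "edge",       "LH(H)"),
}

for kanji, (reading, gloss, pitch) in HASI.items():
    print(f"{kanji}: {reading} ({gloss}), pitch {pitch}")
```

In context, of course, a romaji reader would disambiguate these the same way a listener does, which is the poster's point.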

Second off, relating to your Pinyin example in particular - Pinyin is basically from the same school as Nihonsiki, which was tailored to suit native needs with total disregard for Western expectations. Throughout the 20th century there were attempts in China to Romanize their language (or, rather, the Beijing vernacular which ended up as the standard language) because of a still-prevalent view that the characters are a hindrance to literacy. Fair enough, even if the tones became a point of disagreement, with some systems (like GR) going for tonal spelling and others going for adding diacritics to the letters. Nor were Mandarin speakers alone in this - there were multiple Latin orthographies for almost every coastal variety of Chinese, with a significant portion (like POJ for Amoy/Taiwanese) gaining widespread acceptance. In those cases, the reasoning is simple - learning the various diacritics for tonal representation, or the spelling conventions in the case of GR, is no different than learning how to write the vowels in most South and Southeast Asian languages which use a Brahmi-based script, or how to interpret the vowels in Semitic languages, or when to use the various accents and diacritics in many European languages. (In fact, one of the criticisms among Anglophones of Esperanto is its diacritics, even though using diacritics of some form is common practice among most writing systems, with English in its normal written form a glaring exception. Even in Chinese characters, the semantic "radical" is basically a diacritic for a largely English-esque syllabary dating back to and reflecting the pronunciation of Old Chinese.) It's something that's just an intrinsic part of the particular Latin orthography that one uses. On the other hand, you have Sin Wenz, which, had things gone differently, could have been the standard orthography for Chinese (and, presumably, also many of its minority languages) in the PRC.
Unlike most Romanization schemes, Sin Wenz dispenses entirely with the tones, its creators cleverly figuring its users could rely on context to distinguish between various words. This tactic actually makes sense - all writing systems have a tendency towards underspecification, meaning that several features of one's speech can be - and often are - omitted from actual writing. What might seem like a bug or omission to some can be seen by others as a feature of the system. One does not necessarily need phonological accuracy in writing (as English readily demonstrates), nor does one necessarily need grammatical accuracy in writing. Tendencies towards both in writing tend to become pretty cumbersome after a while, which is a reflection of the conservative nature of writing in general. It's for that same reason that in Soviet Central Asia (particularly modern-day Kazakhstan and Kyrgyzstan), when the Dungan people (descended from Chinese Muslims who fled in the 19th century after a series of rebellions) began adapting the Russian Cyrillic script to their language, they adopted the same principle as Sin Wenz and omitted tones altogether. This, combined with some homophones of its own, means that one often relies on context when writing or reading, which is actually pretty simple for Dungan - a language which I've always been fascinated by - and thus most Dungan people don't get confused. To modern Dungan people, reading and writing in their own language, with their own Cyrillic alphabet, is a part of their identity, and consequently competence in that language is still pretty strong despite 'competition' from Russian and Turkic languages.
The same would also be true, if - for example - northern Japan became a Communist state under strong Soviet influence, and the local Communist leaders decided that in order to boost literacy the Cyrillic script would be adopted to replace kanji and kana, and was most likely true when Turkey switched to the Latin alphabet from its previous Arabic script under Atatürk. No writing system is confusing for those who choose to learn to read and write it, just like no language is difficult for those who choose to learn to speak it, all things considered.
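The underspecification strategy is easy to demonstrate: removing the tone diacritics from Pinyin, leaving tone to context as Sin Wenz and Dungan Cyrillic effectively do, is a few lines of code. This is a rough sketch of my own; note that it also strips non-tonal marks such as the umlaut in "ü", which a real orthography would keep:

```python
import unicodedata

def strip_tones(pinyin: str) -> str:
    # Decompose each letter into base character + combining marks (NFD),
    # then drop the combining marks. This mimics the Sin Wenz / Dungan
    # strategy of omitting tone from writing altogether.
    decomposed = unicodedata.normalize("NFD", pinyin)
    return "".join(c for c in decomposed if not unicodedata.combining(c))

print(strip_tones("mā má mǎ mà"))  # -> ma ma ma ma
```

The four Mandarin tones of "ma" all come out identically, exactly the ambiguity that Sin Wenz asks context to resolve.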

Imo the best route for Vietnamese would've been to replace their earlier use of hanzi with a script tailored towards their language, probably an abugida descended from the Khmer script or something completely new like the Koreans did with Hangul.

Actually, according to some linguists, Hangeul was not really completely new - more likely, it's related to 'Phags-pa, which is related to the Mongolian script, and thus ultimately related back to the same Northwest Semitic abjad which is the base of virtually all writing systems in the world. And much like your Vietnamese example for the Latin script, I could just as well point to Thai, since it adopted the Khmer script - then the language underwent huge changes, including tonogenesis, which means that the Thai 'alphabet' is in a similar position to Vietnamese. Which does not bode well for Vietnamese, even with its shared Mon-Khmer origins. Then again, that would have been the case no matter what writing system the Vietnamese chose to write their language, even if the Vietnamese had somehow converted to Islam en masse and begun using the same Arabic-derived Jawi script currently used for Malay alongside its Latin orthography. The route the Vietnamese chose was the one that they felt was the best choice.


Romaji, as others have pointed out, is horrible for Japanese.

Not necessarily.

I understand Japanese businessmen, for instance, trace kanji on the table between them in conversation, because there are so incredibly many homonyms.

From what I know about that, it's basically an old wives' tale. Japanese typewriters, on the other hand, . . .

You notice that Japanese is still written mostly with kanji, with kana only in a supporting rôle (grammatical particles, foreign words, furigana for schoolchildren and others learning the language). There's a reason for this.

And not necessarily a good one, IMO.
 
So then it would seem that even making the Japanese written form into a Latin-based alphabet would not make it any easier for Europeans and Westerners in general to understand?
 
Agree that the Latin alphabet is awkward, with too many letters with too many different sounds.
Sadly, English spelling got really confusing after the Great Vowel Shift (post-Shakespeare and Queen Elizabeth I).
A phonetic version of English would only use the letter "K" when referring to kittens. IOW, no double sound for the letter "C."
Simplified English would need to add a new letter for the "th" consonant sound, since I do not know of any other language that uses that distinctive sound (th).

I would also like to see a doubled "rr" when a letter is rolled (Spanish, French and German).

Speaking of German (apologies for the pun), modern German spelling is easier to read than modern English because they always use the letter "K" when referring to kittens.

Apparently the Serbs solved the pronunciation problem by adding a few extra letters with distinctive sounds.
Don't mix English with the Latin alphabet as a whole. Phonetic languages like Italian have no problems with said alphabet - while there are a couple of curious rules, they are universal, and if you know the sounds, you can read every single word. There might be some confusion about where the word is stressed, but that doesn't depend on the alphabet itself.
 
So then it would seem that even making the Japanese written form into a Latin-based alphabet would not make it any easier for Europeans and Westerners in general to understand?

To a certain degree. For instance, I'm not fluent in French, but because I had two years of high school French, I can understand French Wikipedia (plus some French text to a certain degree, without being able to speak it). I had two years of college Japanese - because of that, I can comprehend Japanese-language music as well as spoken dialogue in anime to a certain degree (I can't speak it). If Japanese were written minus kanji (either in kana or in Latin script), I would likely be able to read Japanese to a certain extent as well, like I can do with French (it isn't too hard a language). The kanji really do present a huge barrier to a Westerner comprehending a Japanese text, because they take far more time to learn than anything else in the language (though they do serve as a handy vocab lesson as you try to learn them).

But at the same time, the level of homophones in Japanese would get really annoying. I know that writing Japanese all in kana looks awkward (unless you have a simple sentence where there is no kanji) - I can't give any examples off the top of my head where writing it the normal way is more efficient than writing it all in kana (or, god forbid, romaji), but there are plenty.
 
Don't mix English with the Latin alphabet as a whole. Phonetic languages like Italian have no problems with said alphabet - while there are a couple of curious rules, they are universal, and if you know the sounds, you can read every single word. There might be some confusion about where the word is stressed, but that doesn't depend on the alphabet itself.

Damn right.

(I wish that thorn and eth had stayed in English orthography SO MUCH)
 
Also, a bit of a pop culture effect (from my knowledge - I am not a programmer): this would make Japanese video games function differently with regard to text (since Japanese script takes up far less space to say just as much, if not more). There are some Japanese games which were and are notoriously difficult to translate from a mechanical standpoint because of the coding, as well as the fact that many workarounds had to be found for how the script works. It is also noted that Japanese games are far easier to translate into Chinese, simply because you don't need to rewrite everything like you would if you were translating into English. So that would presumably mean that many more Japanese games, visual novels, etc. would have fan translations, in addition to others which official translators may have taken on. Granted, these wouldn't be the games we know because of butterflies, but if you have a Japanese game industry, this is the result if they are forced to use romaji.
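A quick character count illustrates the space problem (the phrase is my own example; counts are Unicode code points, and on old consoles each kana typically occupied one fixed-width tile):

```python
# Compare how many characters the same phrase needs in kana vs romaji.
# Kana pack one syllable (mora) per character, so romaji roughly doubles
# the on-screen text length here.
kana = "ありがとうございます"      # "thank you" in kana
romaji = "arigatou gozaimasu"    # the same phrase in romaji

print(len(kana), len(romaji))  # -> 10 18
```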

I'm sure plenty of other butterflies might come out of getting rid of kanji/kana (although using exclusively kana would get you a similar effect). You will fundamentally change the light novel style of fiction, for instance, since, after all, you are making it take more space to convey the same content compared to what you could do with OTL's Japanese language.
 