Alternate/Earlier Path to Computing

Specifically: movable type and typewriters (precursors to word processing today) seem to thrive in OTL due to Western adoption of a standard alphabet. The Chinese actually had woodblock printing, but it was unwieldy due (as I understand it) to the extensive character set of their writing system. The same issue appears to be true for other complex writing systems.

Of course, computing itself originates in mathematics and the need for number crunching. It really cranks up in times of war with the need to keep track of extensive records (thinking about the advances in WWII in both the Allied and Axis camps). That's not a language-bound need, and we know of many cultures throughout the world that made tremendous mathematical advances.

So my question is: what factors could lead to a rise of computing in other parts of the world outside of Europe/America, and/or earlier? Do we find a way to make movable type more amenable to complex writing systems? Do we have to reengineer other cultures to have greater economic/industrial needs for number crunching? Maybe have imperial China remain open to exploration rather than turn inward, maybe keep the Mongols in power longer, maybe an Inca empire that is able to withstand colonization and develop its quipu system further? Maybe a classical world dominated by Egypt or Persia instead of Greece and Rome?

I'm open to any and all ideas.
 

Philip

It really cranks up in times of war with the need to keep track of extensive records (thinking about the advances in WWII in both the Allied and Axis camps)

I disagree. Record keeping was not the principal driver. Providing numerical solutions to differential equations and completing exhaustive searches, among others, were far bigger drivers.
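To make that concrete: a firing-table entry is just a differential equation integrated step by step, repeated for every combination of charge and elevation. Here is a minimal sketch (forward Euler in Python; the drag model and parameter values are illustrative, not historical) of the kind of calculation that rooms of human computers, and later ENIAC, ground through:

```python
import math

def trajectory(v0, angle_deg, drag=0.0005, dt=0.01, g=9.81):
    """Forward-Euler integration of 2D projectile motion with quadratic drag."""
    a = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(a), v0 * math.sin(a)
    t = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # quadratic air drag opposes the velocity vector; gravity pulls down
        ax, ay = -drag * speed * vx, -g - drag * speed * vy
        x, y = x + vx * dt, y + vy * dt
        vx, vy = vx + ax * dt, vy + ay * dt
        t += dt
    return x, t  # range (m) and time of flight (s)

# one entry of a firing table; a real table needed thousands of these
rng, tof = trajectory(v0=450.0, angle_deg=35.0)
print(f"range ~ {rng:.0f} m, time of flight ~ {tof:.1f} s")
```

A single table needed thousands of such trajectories, which is exactly the workload the US Army funded ENIAC to automate.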

what factors could lead to a rise of computing in other parts of the world outside of Europe/America, and/or earlier?

OTL, it was tracking the stars. This was the single most important motivation in the early development of mathematics and mechanical computing.
 
You're of course correct that mathematical computation is the primary driver of advances in computing technology; certainly both creating ever more complex codes and deciphering them relied on it (thinking Enigma and Turing here). I meant that war tends to expedite needs that might take longer if left wholly to the private sector.
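(As an aside on the "exhaustive searches" point above: much of the codebreaking workload was trying keys against a known scrap of plaintext until one fits. A toy illustration in Python, using a Caesar cipher purely as a stand-in, since Enigma itself was far more complex:

```python
import string

def shift(text, key):
    """Caesar-shift the letters of text by key positions."""
    return "".join(
        string.ascii_uppercase[(ord(c) - 65 + key) % 26] if c.isalpha() else c
        for c in text.upper()
    )

def crack(ciphertext, crib="ATTACK"):
    """Exhaustive search: try all 26 keys, keep any whose output contains the crib."""
    return [(k, shift(ciphertext, -k)) for k in range(26)
            if crib in shift(ciphertext, -k)]

ciphertext = shift("ATTACK AT DAWN", 7)
print(crack(ciphertext))  # [(7, 'ATTACK AT DAWN')]
```

The Bombe did something loosely analogous over a vastly larger keyspace of rotor settings.)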

Star tracking is certainly a place to start in looking at the need for numerical computation: creating calendars to track religious holy days is evident in many cultures around the world. But what gets people beyond earlier techniques of dating? I don't know that Babbage et al. were primarily concerned with stars in their work. Economic needs and the need for technological advantage surely play into the development beyond theory into actual practice.
 
Specifically: movable type and typewriters (precursors to word processing today) seem to thrive in OTL due to Western adoption of a standard alphabet. The Chinese actually had woodblock printing, but it was unwieldy due (as I understand it) to the extensive character set of their writing system. The same issue appears to be true for other complex writing systems.
Movable type printing and typewriters (mechanical word processors) are precursors to OTL word processing, but not required for the development of computers or software word processing. The common keyboard arrangement (QWERTY) is a legacy from typewriters, where it was designed to minimise jamming, which is not an issue for computer keyboards.
The first word processor ran on a computer in 1976, long after computers were developed and entered mass production.

Woodblock printing was invented and used by the Chinese because they valued highly accurate printing. A page is carved as a single block, so the size of the character set makes it neither faster nor slower.
Movable type printing was also invented by the Chinese, but rarely used because of its inaccuracy (typos). Movable type does depend on the size of the character set and is slowed down by more characters.

Of course, computing itself originates in mathematics and the need for number crunching. It really cranks up in times of war with the need to keep track of extensive records (thinking about the advances in WWII in both the Allied and Axis camps). That's not a language-bound need, and we know of many cultures throughout the world that made tremendous mathematical advances.
Pre-WWII, there were many scientists working on computer development, and they would have continued to advance the field without the war.
Charles Babbage designed an early prototype computer in 1822; tabulating machines were sold in the 1890s and used in the 1890 US Census.
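For context on what Babbage's design actually did: the Difference Engine mechanised the method of finite differences, where, once the starting differences of a polynomial are set, every further table entry needs only addition. A minimal sketch in Python (the example polynomial, x^2 + x + 41, is one Babbage himself used in demonstrations):

```python
def tabulate(initial_differences, count):
    """initial_differences = [f(0), delta f(0), delta^2 f(0), ...] for a polynomial."""
    diffs = list(initial_differences)
    values = []
    for _ in range(count):
        values.append(diffs[0])
        # each difference absorbs the one below it: addition only, no multiplication
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# tabulate f(x) = x**2 + x + 41: f(0) = 41, delta f(0) = 2, delta^2 f = 2 (constant)
print(tabulate([41, 2, 2], 8))  # [41, 43, 47, 53, 61, 71, 83, 97]
```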

WWII significantly set back computer technology, certainly by years and perhaps by decades. Both sides classified their calculating machines and prevented the publishing of research papers.
The Colossus computers were built for code-breaking purposes during the Second World War, but remained classified for thirty years, so that by the time they became public knowledge they had long been surpassed. All the designs and documentation were incinerated at the end of the war to preserve secrecy. The designer (Tommy Flowers) was unable to continue developing computers, as all of his research was top secret; rather than start again from scratch, he returned to working on telephone exchanges.
Alan Turing's first paper on computers, "On Computable Numbers", was published in 1936. The war kept him from computer development (he designed the Bombe at Bletchley Park, but that was limited and secret). After the war he designed the Automatic Computing Engine, but he couldn't convince people it was viable (some research proving it was possible remained classified), and only a much reduced form was eventually built.
If all the computer scientists had been put into a coma at the beginning of WWII and then awoken at its end, computer technology would probably have advanced faster than OTL. They would have been able to share ideas rather than have important research declared secret and rendered unusable.


So my question is: what factors could lead to a rise of computing in other parts of the world outside of Europe/America, and/or earlier? Do we find a way to make movable type more amenable to complex writing systems? Do we have to reengineer other cultures to have greater economic/industrial needs for number crunching? Maybe have imperial China remain open to exploration rather than turn inward, maybe keep the Mongols in power longer, maybe an Inca empire that is able to withstand colonization and develop its quipu system further? Maybe a classical world dominated by Egypt or Persia instead of Greece and Rome?
Movable type printing played no role, so no.
The quipu system would not have produced computers.
Rome and Greece (generally) used calculation more than Egypt or Persia, so if the latter had dominated, progress would likely have been slower.
The Mongols did not use much maths; their skillset was the wrong one to lead to computers. Maybe their empire would have helped, maybe it would have hindered; it is difficult to predict due to its short duration.
If China had focused on intercontinental sailing, the navigational needs would eventually have led to computing.


You're of course correct that mathematical computation is the primary driver of advances in computing technology; certainly both creating ever more complex codes and deciphering them relied on it (thinking Enigma and Turing here). I meant that war tends to expedite needs that might take longer if left wholly to the private sector.

Star tracking is certainly a place to start in looking at the need for numerical computation: creating calendars to track religious holy days is evident in many cultures around the world. But what gets people beyond earlier techniques of dating? I don't know that Babbage et al. were primarily concerned with stars in their work. Economic needs and the need for technological advantage surely play into the development beyond theory into actual practice.
Enigma was an electromechanical encryption device, nothing to do with computing at all (no calculations involved).
War can push the development of some technologies, but it is a short-term gain for a long-term loss. Funding for one project can be far greater than during peace, but that ignores the many peacetime projects cancelled by the war.

Dates and calendars were an important part of ancient astronomy. But since 45 BC (in the Roman Empire, spreading from there to much of Europe) calendars have been static and (generally) unchanging. You learn the mnemonic "Thirty days hath September, April, June, and November; all the rest have thirty-one, except for February, which has 28 days, and one day more we add to it each year in four" (or similar) and you can easily know dates without much calculation.
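That really is the whole algorithm. A sketch of it in Python, just to underline how little computation a fixed calendar demands (Julian rule only; the Gregorian reform of 1582 adds the century exceptions):

```python
# the entire "computation" a fixed calendar demands: a lookup table
# (the mnemonic) plus the every-fourth-year leap rule
DAYS = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def days_in_month(month, year):
    """Julian rule: February gains a day every fourth year."""
    if month == 2 and year % 4 == 0:
        return 29
    return DAYS[month - 1]

print(days_in_month(2, 1532), days_in_month(9, 1532))  # 29 30
```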

Agriculture and sailing needed detailed and accurate predictions of the positions of the Earth, the Sun, and the stars. Almanacs were the second-best-selling printed books (after the Bible) in the 17th century.
Farmers needed to know which dates were best (that year) for planting and harvesting each crop; they would try (weather permitting) to plant on the ideal date.
Ships used stars to navigate; if you wanted to sail directly from London to New York, you needed computers (from 1613 to 1950, "computer" was a job performed by humans) to have calculated a very precise star chart for that year.
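To show why those pre-computed tables mattered: even the simplest position fix, the noon sight, is useless without an almanac giving the sun's declination for that exact date. A hedged sketch in Python of the easiest case (sun at meridian passage, due south of a northern-hemisphere observer; the numbers are illustrative):

```python
def noon_latitude(observed_altitude_deg, sun_declination_deg):
    """Latitude from a noon sight: sun at meridian passage, due south
    of a northern-hemisphere observer. The declination comes from the
    almanac -- the pre-computed table -- for that exact date."""
    zenith_distance = 90.0 - observed_altitude_deg
    return zenith_distance + sun_declination_deg

# sun 50 deg above the horizon at local noon, almanac declination +10 deg
print(noon_latitude(50.0, 10.0))  # 50.0 -> you are at 50 deg N
```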

Charles Babbage was initially backed by the British Royal Astronomical Society in developing his computer, but (partly due to arguments with many people, including the Astronomer Royal) after several years the Treasury stopped providing him money.
He stated that he was inspired to create a mechanical computer after discovering errors in astronomical tables produced by computers (people whose job was calculation).

By the 19th century, engineering required extensive (and time-consuming) calculations. A combination of engineering and financial calculation needs was the main driver of computer (machine) development from the 1950s to the 1980s.
 
Thanks for the detailed reply: I appreciate your going through each point, especially engaging with the pros and cons of war advancing technology. From the discussion, it sounds like we're tying the development of (and need for) mechanical computers to the Industrial Revolution, which would tie the premise of my question to jumpstarting that engine a tad earlier. But is it necessary?

Let's take the example of China and intercontinental sailing. I assume here we're talking about increased trade between China and the rest of East Asia, such as the Philippines, Thailand, etc. As I understand it, China abandoned treasure fleets and trading expeditions partway through the Ming Dynasty, in a long period of retrenchment. So, assuming we can provide factors that would allow China to remain more outward-looking and/or expansionist, what kind of computing needs develop?
 
Thanks for the detailed reply: I appreciate your going through each point, especially engaging with the pros and cons of war advancing technology. From the discussion, it sounds like we're tying the development of (and need for) mechanical computers to the Industrial Revolution, which would tie the premise of my question to jumpstarting that engine a tad earlier. But is it necessary?
Computers (mechanical or electrical) are essentially post-Industrial Revolution; it is not really possible to have them before.

Let's take the example of China and intercontinental sailing. I assume here we're talking about increased trade between China and the rest of East Asia, such as the Philippines, Thailand, etc. As I understand it, China abandoned treasure fleets and trading expeditions partway through the Ming Dynasty, in a long period of retrenchment. So, assuming we can provide factors that would allow China to remain more outward-looking and/or expansionist, what kind of computing needs develop?
Thailand can be reached by coastal sailing; the Philippines can be reached by island-hopping.
Regular multi-week sea journeys are needed to create the demand for computing.
If you can see land, then you can use that to navigate. For one or two days without sight of land you can use dead reckoning, but the longer your ship goes without seeing land, the further off course you will become (without stars, radio, or GPS).
Detailed star charts are only needed if you are spending many days in the ocean away from land.
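A toy model in Python of that error growth (the speed and heading-error figures are invented for illustration): a small, persistent compass or current error compounds into a large cross-track miss.

```python
import math

def cross_track_error_nm(days, speed_knots=5.0, heading_error_deg=3.0):
    """Miles off course from a small constant compass/current error."""
    distance_nm = speed_knots * 24 * days  # distance run at a steady speed
    return distance_nm * math.sin(math.radians(heading_error_deg))

for days in (1, 2, 7, 21):
    print(f"{days:2d} days out of sight of land -> "
          f"~{cross_track_error_nm(days):.0f} nm off course")
```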

Possibly, if China found something incredibly valuable (a plant that only grows there?) on the island of Guam, and needed regular ship travel to retrieve it.
 
Computers (mechanical or electrical) are essentially post-Industrial Revolution; it is not really possible to have them before.

The first computers were little more than oversized calculators. Calculation machines certainly existed before the Industrial Revolution.

https://en.wikipedia.org/wiki/Pascal's_calculator
https://en.wikipedia.org/wiki/Stepped_reckoner
https://en.wikipedia.org/wiki/Mechanical_calculator
https://en.wikipedia.org/wiki/Mechanical_computer

Al-Jazari's castle clock may well be the first programmable analogue computer. It dates from the early 13th century.

Heck, it could be argued that the antikythera mechanism was a form of mechanical computer.

As for sailing long distances, what you need is accurate timekeeping. You don't need a computer for that, although it helps for accuracy.
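Right, and the arithmetic shows why: the Earth turns 15° per hour, so a chronometer still set to home-port time converts directly into longitude. A sketch in Python (illustrative times; it ignores the equation-of-time correction, which is precisely where the pre-computed almanac tables come back in):

```python
def longitude_west_deg(chronometer_at_local_noon):
    """Degrees west of the chronometer's home meridian; the Earth
    turns 15 degrees per hour, so the time difference IS longitude."""
    return (chronometer_at_local_noon - 12.0) * 15.0

# the sun crosses your meridian while the Greenwich-set chronometer
# reads 16:52 -> about 73 deg W, roughly the longitude of New York
print(longitude_west_deg(16 + 52 / 60))
```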
 
Heck, it could be argued that the antikythera mechanism was a form of mechanical computer.

According to The Perfectionists by Simon Winchester, the antikythera mechanism would have been "an analog computer, essentially" but without machine tools it couldn't be made precisely enough to work accurately (the goal was to predict the motions of the planets). If you somehow improve the accuracy of Greco-Roman manufacturing, you could evolve the antikythera mechanism into a sort of computer 2000 years early.
 
WWII significantly set back computer technology, certainly by years and perhaps by decades. Both sides classified their calculating machines and prevented the publishing of research papers.
The Colossus computers were built for code-breaking purposes during the Second World War, but remained classified for thirty years, so that by the time they became public knowledge they had long been surpassed. All the designs and documentation were incinerated at the end of the war to preserve secrecy. The designer (Tommy Flowers) was unable to continue developing computers, as all of his research was top secret; rather than start again from scratch, he returned to working on telephone exchanges.
Alan Turing's first paper on computers, "On Computable Numbers", was published in 1936. The war kept him from computer development (he designed the Bombe at Bletchley Park, but that was limited and secret). After the war he designed the Automatic Computing Engine, but he couldn't convince people it was viable (some research proving it was possible remained classified), and only a much reduced form was eventually built.
If all the computer scientists had been put into a coma at the beginning of WWII and then awoken at its end, computer technology would probably have advanced faster than OTL. They would have been able to share ideas rather than have important research declared secret and rendered unusable.
This is certainly a novel argument, considering the limited resources that were available to people interested in computing before the war compared to during or after it. It is also, I note, an argument which completely ignores American efforts in the field, which is rather curious given the existence of the Harvard Mark I and ENIAC, which were not classified and which directly led into the development of the first stored-program computers (ENIAC in particular).

The American military's involvement in the development of the computer also led to the Moore School Lectures, which produced quite a lot of sharing of ideas, and it facilitated the development of the idea that computing was its own field instead of being half electrical engineering and half mathematics. Finally, the war led to heavy investments in a number of technologies that would go on to be important in the earliest computers, for example mercury delay-line memory.

It's hard to seriously argue that the war didn't lead to significant advances in computing, more or less similarly to the way that it led to major advances in nuclear physics.

Also, Turing is overrated, in the sense that it took some time for people to understand the centrality of Turing machines to computing. In the 1940s, "On Computable Numbers" was, like the lambda calculus, seen as a mathematical toy of little practical importance. I'm not sure even Turing really thought of it as being particularly important to practice the way people do now.

According to The Perfectionists by Simon Winchester, the antikythera mechanism would have been "an analog computer, essentially" but without machine tools it couldn't be made precisely enough to work accurately (the goal was to predict the motions of the planets). If you somehow improve the accuracy of Greco-Roman manufacturing, you could evolve the antikythera mechanism into a sort of computer 2000 years early.
It was a sort of computer; to quote an article in the April Communications of the ACM (located here, though you may not be able to read all of it), "The device determines the approximate position of the sun, the moon, and...possibly the [known] planets...It predicted or described solar and lunar eclipse possibilities...and calculated the phases of the moon". Additionally, also according to the article, "[t]he mechanism is so mature it can hardly be a unique device" and it might have been "primarily for educational and philosophical purposes".
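The "computation" was baked into the gear ratios. Reconstructions of the gear train have the moon pointer making 254 turns for every 19 of the sun's (the Metonic relation); a quick check in Python of how good that tooth-count approximation is against modern values:

```python
from fractions import Fraction

TROPICAL_YEAR = 365.2422    # days, modern value
SIDEREAL_MONTH = 27.32166   # days, modern value

gear_ratio = Fraction(254, 19)  # moon-pointer turns per turn of the sun gear
true_ratio = TROPICAL_YEAR / SIDEREAL_MONTH

print(f"gear train: {float(gear_ratio):.5f} lunar revolutions per year")
print(f"astronomy:  {true_ratio:.5f}")
print(f"relative error: {abs(float(gear_ratio) - true_ratio) / true_ratio:.1e}")
```

As Winchester's point suggests, the mathematics encoded in the teeth was essentially exact; cutting the gears precisely enough was the hard part.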
 