Sexiest Topic on AH.com: ALTERNATE ACCOUNTING

This is a very interesting thread.

I think Venice and Florence came quite close to it.
Seems like you know a lot about that: What exactly is the charge/discharge system?

Anyway, without knowing the details, I am quite convinced that, say, Pisan and Lübeckian bookkeeping in the 15th century looked quite different.

Charge-discharge was the "normal" accounting system in mediaeval times. It was a single-entry system, whereby a "debtor", say the land steward of a lord, was "charged" with producing a revenue - usually the produce and rents of the estate. The "debtor" (the term wasn't used, I have applied it anachronistically) discharged himself by showing the disbursements he had made. So if the steward was supposed to collect 1000 marks, he could produce an account showing that he had paid out 300 marks on the lord's behalf for this or that, rendering the other 700. There was provision for the use of fictitious entities to cover losses, shortfalls etc.

One interesting aspect of it is that the accounts were not necessarily currency-based.

So the "charge" might be 500 marks, 50 hens, 2 barrels of wine and a dozen sheep.

The steward would discharge his account by producing tallies from the lord's house steward for 49 hens, 2 barrels of wine and a dozen sheep, proof of 100 marks given in alms to his lord's account, an assertion of one hen dead of disease, and rendering of 400 marks in coin. The tallies from the house steward would then be reckoned against his stubs. And the 49 hens, 2 barrels, and 12 sheep would then become a charge in the house steward's account, to be discharged by him in his turn.
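(Just to make the reckoning mechanical: the whole procedure boils down to an item-by-item subtraction of discharge from charge, something like this little Python sketch. The figures are the ones from the example above; the item names and the code itself are only my illustration, not anything a mediaeval clerk would recognise.)

from collections import Counter

def settle(charge: Counter, discharge: Counter) -> Counter:
    """Charge-discharge reckoning: whatever was charged but not accounted
    for in the discharge is still owed by the accountant."""
    remaining = Counter(charge)
    remaining.subtract(discharge)
    return +remaining  # keep only items still outstanding

# The steward's charge for the year -- note it is not purely currency-based.
charge = Counter({"marks": 500, "hens": 50, "wine barrels": 2, "sheep": 12})

# His discharge: tallies from the house steward, alms, one allowed loss, coin rendered.
discharge = Counter({
    "marks": 100 + 400,   # alms on the lord's account + coin rendered
    "hens": 49 + 1,       # delivered against tallies + one dead of disease
    "wine barrels": 2,
    "sheep": 12,
})

print(settle(charge, discharge))  # Counter() -- nothing outstanding, the account balances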

As to when double entry came along, Edwards in his "History of Financial Accounting" (Can you believe that someone actually wrote a history of it!) gives a probable date of 14C to 15C in Italy. Like most things it didn't happen all at once, and charge-discharge was still the accounting system of English Universities in the mid 19C.

One thought. The ancient Greeks managed a pretty sophisticated mathematical understanding without a zero. Including the use of pi.

Once the lord agreed the discharge of the account, he would write 'Quietus' across it: the account was 'silent', closed. Hence "with a bare bodkin his own quietus make".
 
Arabic numerals were vital for the development of Europe because of the ease of calculating compound interest - try doing that in Roman numerals!
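(For what it's worth, the calculation being gestured at is just repeated multiplication - trivial in place-value notation. The figures below are made up, purely for illustration.)

# 100 marks lent at 5% per annum, compounded yearly for 10 years
principal, rate, years = 100, 0.05, 10
amount = principal * (1 + rate) ** years
print(round(amount, 2))  # 162.89 -- now imagine grinding that out in Roman numerals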
 

As to when double entry came along, Edwards in his "History of Financial Accounting" (Can you believe that someone actually wrote a history of it!) gives a probable date of 14C to 15C in Italy. Like most things it didn't happen all at once, and charge-discharge was still the accounting system of English Universities in the mid 19C.


Fascinating stuff! Can you give me a proper citation for that book? I'd like to try to find a copy for myself, it sounds interesting.

Thanks in advance! :)

Arabic numerals were vital for the development of Europe because of the ease of calculating compound interest - try doing that in Roman numerals!

Yeah, banking would be set back a ways... heh, the more thought one puts into a PoD, the greater the number of effects one uncovers, it seems to me.

I'm wondering - with banking set back, economic development would probably take a hit. Hm... I can't help but wonder whether this TL would see any variant methods for raising capital. Any thoughts as to possibilities there?
 
One thought. The ancient Greeks managed a pretty sophisticated mathematical understanding without a zero. Including the use of pi.
Yes and no; it depends on the school of thought involved. For example, various stories state that Pythagoras rejected the existence of irrational numbers (and thus, by implication, the existence of pi, e, and a whole host of very cool numbers). What is truly essential to fully understand the real numbers is the demonstration that transcendental numbers exist. Once one accepts the existence of the transcendentals, the true differences between the real and natural numbers come into focus.

Arabic numerals were vital for the development of Europe because of the ease of calculating compound interest - try doing that in Roman numerals!

Arabic numerals themselves do not save time; what matters is the creation of a method of digital representation. More specifically, what is required is a concrete way of representing rational numbers (i.e., numbers that can be represented as the ratio of two integers).

For example, using some modern notation, we can represent any finite decimal as the ratio of two numbers:

.1 = I/X
.2 = I/V
.3 = III/X
.4 = II/V
.5 = I/II
.99 = XCIX/C
22/7 = XXII/VII

and so on and so forth. It is from this basis we can do most of our needed calculations.
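(To see that mechanically, here is a small Python sketch: it takes a finite decimal, reduces it to a ratio of integers, and writes both sides as Roman numerals. This is only an illustration of the representation point above, not a claim about how anyone would actually have calculated.)

from fractions import Fraction

# Minimal Roman-numeral rendering for positive integers (illustrative only).
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"), (100, "C"),
         (90, "XC"), (50, "L"), (40, "XL"), (10, "X"), (9, "IX"),
         (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

def decimal_to_roman_fraction(text):
    frac = Fraction(text)  # "0.4" -> 2/5, already in lowest terms
    return f"{to_roman(frac.numerator)}/{to_roman(frac.denominator)}"

for d in ["0.1", "0.2", "0.3", "0.4", "0.5", "0.99"]:
    print(d, "=", decimal_to_roman_fraction(d))
# 0.1 = I/X, 0.2 = I/V, 0.3 = III/X, 0.4 = II/V, 0.5 = I/II, 0.99 = XCIX/C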

India in general tended to be pretty advanced in Mathematics. Not that I mean to denigrate B von A - I think this is an interesting and fresh POD, but I just took exception to the idea that mathematical developments of this sort were a product of the Enlightenment. The Enlightenment was a very specific reaction to the European context.

Yes and no; what evolved from the Enlightenment was the search for full mathematical formalism and simplification. It wasn't until the late 19th and early 20th centuries that such formalism was fully realized (mostly through the efforts of Hilbert and his ilk, although the list of mathematicians who contributed to the rise of formalism is a long one). Such searches originated from attempts at the removal of Euclid's Fifth Axiom, which eventually led to the creation of Hyperbolic Geometry. Once hyperbolic geometry was developed fully, it became apparent that the underlying basis of mathematics needed to be explored.

So in conclusion, the entirety of the world developed mathematical ideas and principles (and thus your point is valid); it is simply the Europeans who have dominated its modern development, for the reasons I gave above.

But these are only marginal conditions. I can see no systematic reason (apart from the arguments mentioned) why the algebraic insights up to Gauss could not have become known 400 years earlier ...

Mostly this is due to historical factors; people sought not to overturn Euclid, but to vindicate his axiomatic system by constructing a system in which the fifth axiom is unneeded (and thus the entirety of the universe is founded on Euclidean Geometry, which is nice and easy to comprehend). With the developments in absolute geometry from the 17th to 19th century, mathematics began probing its underlying foundations with renewed vigor.

That being said, calculus is an extremely useful tool for many purposes. There might be a completely different approach, a different "language" which would allow for the same, or at least partially the same, insights and computations in the applications as calculus did for us. I know a couple of different "languages" for imagining the concept of calculus, but they all use some idea of a continuously varying number.

This area is probably the most difficult to probe; with a sufficiently rigorous mathematical base (namely those axioms of set theory that allow us to arrive at the reals [we'll just use ZFC since that's the most commonly taught axiom system]), we can arrive at the real numbers and thus calculus. Although calculus development was twofold: Newton developed calculus mostly from physical observations, while Leibniz approached the subject from a much more abstract direction (that is not to say that he approached the subject with sufficient rigor).

All in all, the development of mathematics is very closely tied with the development of the other sciences until the 17th and 18th centuries, then mathematics shot way, way ahead :p.

Anyway, that's my ¥2 on this topic.

For some minor speculation, butterflying away Arabic numerals may lead to a "better" (in a technical sense) numerical system, preferably one based on powers of two - a hexadecimal world, anyone? Ah, what a wonderful world...
 
Thanks for the hint about the Indian calculus. Unfortunately, I couldn't find a lot on the details. One description said they did not establish a general concept out of it.

Obviously, I was wrong, and calculus may be as "semi-transcendent" as mathematics in general; in particular, largely independent from particular historical circumstances.

I cannot comment on the Kerala scientists and their findings as I don't know their representation exactly. However, with mathematical concepts it does not suffice that they have once popped up. It is important


  • how deeply the developers themselves understand the concept,
  • whether there is an application for it.


As to the first point, negative numbers have been developed several times, and mostly did not (or never) really become generally accepted. This was often due to the fact that the concept was not really developed to some maturity.

===================================================

Also, thank you very much for the description of the charge-discharge system, Jedediah Scott.
Obviously, this was suitable for accounts which had to be balanced at the end.
Do you also happen to know how merchants would book?
In normal times, they would accrue fortunes. So how do you include a positive balance here?
And are the items supposed to be inventories (e.g. "property"), or changing items (e.g. "revenue")?

===================================================

Hunam:
I don't think it's very useful to talk about (relatively recent) axiom systems, or to describe fractions in Roman numerals. Of course, we can do anything in Roman numerals because we have learned it in Arabic numerals (there is even a calculator computing in Roman numerals - also internally!).
But I don't think the idea of continuity is compatible with an integer system based on Roman numerals.
 
Hunam:
I don't think it's very useful to talk about (relatively recent) axiom systems, or to describe fractions in Roman numerals. Of course, we can do anything in Roman numerals because we have learned it in Arabic numerals (there is even a calculator computing in Roman numerals - also internally!).
But I don't think the idea of continuity is compatible with an integer system based on Roman numerals.

To be fair, it is also equally difficult to picture "continuity" (i.e., density) with Arabic numerals; the entire concept of density developed as a result of calculus and set theory, not the other way around (to be precise, you are using continuity incorrectly, as continuity refers to mappings between sets; the concept you are referring to is the density of sets, specifically the density of the reals and rationals). The demonstration of the density of the rationals and irrationals is the important leap mathematics requires, and to do that merely requires a way of representing rationals (as per my previous post with Roman numerals) and the proper development of rudimentary set theory. So it is possible to develop calculus by resorting to any arbitrary numeral system (since they are merely arbitrary ways of representing the same concept).

On the calculator statement, are you implying that the calculator does its calculations based on Roman algorithms? I find it hard to believe, assuming the calculator is built up from standard logic gates (as is every modern electronic device). Could you please clarify? Thanks.
 
Hunam: Sorry, you are confusing proofs and definitions (as we use them today) with the history of the ideas involved.

You are arguing that calculus is based on set theory, while set theory is over a century younger than calculus.
"Continuity" is a property a mapping between topological spaces may have (not general sets, as you said btw) - I was referring to the idea of a continuum, which obviously lies at heart of any notion of a continuous mapping. Also the concept of "density" was developped afterwards to have a logically "clean" explanation for what mathematicians had done for a long time before: For instance, write down infinite series and work with them as with ordinary numbers.



The continuum has a very interesting history.
What did people know about the "*set"/entirety of all numbers at the end of the 17th century?
They knew the rationals, and (convergent) infinite series of rationals.
However, they did not really have a clean notion of convergence or limit, but they worked with them.
The continuum therefore was, in the imagination of Leibniz and his contemporaries, very rich: It contained "clouds" of infinitesimally close points around each number. Therefore they could write

(x+ dx)^2 = x^2 + 2x dx + (dx)^2 = x^2 + 2x dx (because if dx is infinitesimal, then (dx)^2 vanishes), and hence d (x^2)/ dx = 2x.
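(A quick numerical way to see the same computation, purely as an aside: the difference quotient below plays the role of Leibniz's dx, and the surviving "+ h" is exactly the contribution of the (dx)^2 term that gets dropped as h shrinks.)

def difference_quotient(f, x, h):
    # For f(x) = x^2 this works out to ((x+h)^2 - x^2) / h = 2x + h.
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2
for h in (0.1, 0.001, 0.000001):
    print(h, difference_quotient(f, 3.0, h))  # creeps towards 2*3 = 6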

Some scholars noticed that there were logical gaps in this reasoning, but it took roughly 150 years until people such as Weierstraß and Kronecker could make sense of it, by developing "epsilontics" (which you certainly take for granted).

It took even longer until a logically correct construction of the reals was found.

It is a great achievement of these mathematicians to find a way of deriving all these objects from bare logic. However, their job was rather apologetic: They just established what had been in use for centuries.

The questions "What is a number?" and later "What is a function?" were not solved by a definitory fiat, but by many experiments, many of them with wide logical gaps.



Oh, and the Roman calculator: That's nothing useful, it was a fun work of a computer scientist. I don't exactly know how it is constructed, but it is not hard to guess: As basic building blocks, you don't have x-bit numbers, but y Roman-digit numbers, where each Roman digit is again represented by an appropriate number of bits (similar to, but more costly than, the "binary representation of decimal numbers" concept). Then you have to implement the gates to perform what they should, like addition.
I wanted to say with that: Once we understand the concept, we can represent it in many different ways. However, Roman numerals do not really favor the (naive) observation of convergence effects.
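(I don't know how that particular gadget works either, but just to show that "pushing the symbols around" can be made completely mechanical without ever passing through place value, here is a toy Roman adder in Python - pure string-replacement rules, and no claim that this is how the actual calculator is built.)

# Toy Roman-numeral adder: pure symbol manipulation, no Arabic place value inside.
EXPAND = [("CM", "DCCCC"), ("CD", "CCCC"), ("XC", "LXXXX"),
          ("XL", "XXXX"), ("IX", "VIIII"), ("IV", "IIII")]
CONTRACT = [("DCCCC", "CM"), ("CCCC", "CD"), ("LXXXX", "XC"),
            ("XXXX", "XL"), ("VIIII", "IX"), ("IIII", "IV")]
COMBINE = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
           ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]
ORDER = "MDCLXVI"

def roman_add(a, b):
    def expand(s):
        for compact, spread in EXPAND:
            s = s.replace(compact, spread)
        return s
    s = expand(a) + expand(b)                  # 1. strip subtractive forms, pool the symbols
    s = "".join(sorted(s, key=ORDER.index))    # 2. sort symbols largest-first
    for run, carry in COMBINE:                 # 3. carry: five I's make a V, two V's an X, ...
        s = s.replace(run, carry)
    for spread, compact in CONTRACT:           # 4. put subtractive forms back
        s = s.replace(spread, compact)
    return s

print(roman_add("XLVIII", "XIV"))   # LXII (48 + 14 = 62)
print(roman_add("MCMXCIX", "I"))    # MM   (1999 + 1 = 2000)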
 
You are arguing that calculus is based on set theory, while set theory is over a century younger than calculus.
Well, calculus is based on set theory (set theory being one of the most primitive axiom systems); we just came to that conclusion much later. Though you are right, calculus evolved much earlier than set theory.

"Continuity" is a property a mapping between topological spaces may have (not general sets, as you said btw)
Fair enough; I wasn't being precise enough, a capital crime (in mathematics anyway).

I was referring to the idea of a continuum, which obviously lies at the heart of any notion of a continuous mapping. Also the concept of "density" was developed afterwards to have a logically "clean" explanation for what mathematicians had done for a long time before: For instance, write down infinite series and work with them as with ordinary numbers.
The history is true; I was arguing from the reverse (i.e., establishing our axioms, we arrive at calculus). My argument more explicitly stated is simply that we will arrive at this point once we have the proper set of axioms, regardless of however we choose to represent numbers and concepts. Of course, reaching that point is a separate discussion, and an interesting history.

The continuum has a very interesting history.
Some scholars noticed that there were logical gaps in this reasoning, but it took roughly 150 years until people such as Weierstraß and Kronecker could make sense of it, by developing "epsilontics" (which you certainly take for granted).
I have studied the continuum, as well as real analysis. I use Weierstraß formulation all the time, for the obvious reasons. Although I don't so much "take it for granted" as "it will be a waste of my time to use anything else, and a waste of breath justifying this other method." Also, it is what everyone else uses (in classes), so why bother utilizing anything else (and taking extra time that never is available)?

It is a great achievement of these mathematicians to find a way of deriving all these objects from bare logic. However, their job was rather apologetic: They just established what had been in use for centuries.
All true. The history of these developments should be taught within math classes, but it is rarely thus. I bothered reading these histories; I failed to be sufficiently thorough in my explanation (another mathematical capital crime, unless it's a proof by intimidation :D).

The questions "What is a number?" and later "What is a function?" were not solved by a definitory fiat, but by many experiments, many of them with wide logical gaps.
All true, all already established. My argument, again, was once we arrive at the necessary axioms, the rest of it can be derived by "definitionary fiat." Perhaps we get some eccentric & brilliant French noble in the early 17th century to lay the proper foundation and rigor from what his mind cooked up in a dream one night in the midst of a battle (cookie to the one that gets the reference).

I wanted to say with that: Once we understand the concept, we can represent it in many different ways. However, Roman numerals do not really favor the (naive) observation of convergence effects.
True, but neither does any integral representation (like Arabic numerals). It takes a more nuanced approach to develop this knowledge (like the convergence of rational series), but such developments do not appear to be dependent on the number system implemented, but rather on the number of esoteric examinations of the behavior of certain mathematical objects. After all, once the Romans can construct s = SUM(n=0, infinity, 1/(2^n)), then they can observe that s -> 2.
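(For the record, that observation is just watching partial sums settle down; exact rational arithmetic makes the shrinking gap explicit, whatever symbols you happen to write it in.)

from fractions import Fraction

s = Fraction(0)
for n in range(8):
    s += Fraction(1, 2 ** n)        # 1 + 1/2 + 1/4 + ...
    print(n, s, "gap to 2:", 2 - s)
# the gap halves every step: 1, 1/2, 1/4, ..., 1/128 -- so s -> 2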

Oh, and the Roman calculator: That's nothing useful, it was a fun work of a computer scientist. I don't exactly know how it is constructed, but it is not hard to guess: As basic building blocks, you don't have x-bit numbers, but y Roman-digit numbers, where each Roman digit is again represented by an appropriate number of bits (similar to, but more costly than, the "binary representation of decimal numbers" concept). Then you have to implement the gates to perform what they should, like addition.
Based on your explanation, sounds like a basic calculator. The only thing changed is the input/output device (which is really, really easy to reprogram. It's the logic and underlying architecture that takes all the effort and time).
 
Zyzzyva will be interested in this one.

My personal feeling is that it would have lots of knock-on effects, but is hard to do: Arabic numerals are so patently superior to Roman ones that it's hard to see how this could be achieved with any kind of peaceful east-west contact whatsoever.

India in general tended to be pretty advanced in Mathematics. Not that I mean to denigrate B von A - I think this is an interesting and fresh POD, but I just took exception to the idea that mathematical developments of this sort were a product of the Enlightenment. The Enlightenment was a very specific reaction to the European context.

Yeah, but it's still a Hero of Alexandria / Chinese woodcut press scenario: whatever else you can say about the Enlightenment, it took these ideas and ran like hell with them.

One thought. The ancient Greeks managed a pretty sophisticated mathematical understanding without a zero. Including the use of pi.

Ehhh, their understanding of pi was pretty sketchy at best. Well, maybe not Archimedes', but he accomplished amazing things despite the numbers he was shackled with, not because of them.


Brother! :)

Arabic numerals themselves do not save time; what matters is the creation of a method of digital representation. More specifically, what is required is a concrete way of representing rational numbers (i.e., numbers that can be represented as the ratio of two integers).

For example, using some modern notation, we can represent any finite decimal as the ratio of two numbers:

.1 = I/X
.2 = I/V
.3 = III/X
.4 = II/V
.5 = I/II
.99 = XCIX/C
22/7 = XXII/VII

and so on and so forth. It is from this basis we can do most of our needed calculations.

You're looking at this too much as a formalist and not enough from a historical perspective. Roman numerals suffice but they're an absolute bitch to work with. I suppose you can do anything decimals can do using just roman numerals and... continued fraction representation, I guess?... but it would be an ungodly pain in the neck. Math would be technically possible but would go nowhere - witness Archimedes: he was as smart as Newton, certainly as smart as Leibniz, and he got 90% of the way to inventing calculus in the 220s BC - but he didn't, because the disaster of a number system he had to use just had too much overhead to use properly.

Yes and no; what evolved from the Enlightenment was the search for full mathematical formalism and simplification. It wasn't until the late 19th and early 20th centuries that such formalism was fully realized (mostly through the efforts of Hilbert and his ilk, although the list of mathematicians who contributed to the rise of formalism is a long one). Such searches originated from attempts at the removal of Euclid's Fifth Axiom, which eventually led to the creation of Hyperbolic Geometry. Once hyperbolic geometry was developed fully, it became apparent that the underlying basis of mathematics needed to be explored.

Huh? No, that was the big thing of the 19th C. Between the invention of the calculus and about 1820 or so nobody did formalism: that came about because of all the problems people using, eg, naive "infinitesimal" calculus ran into. Claiming that the quest for formalism existed before then is reading far too much direction into history to be plausible.

Mostly this is due to historical factors; people sought not to overturn Euclid, but to vindicate his axiomatic system by constructing a system in which the fifth axiom is unneeded (and thus the entirety of the universe is founded on Euclidean Geometry, which is nice and easy to comprehend). With the developments in absolute geometry from the 17th to 19th century, mathematics began probing its underlying foundations with renewed vigor.

Where are you getting these dates from? Saccheri didn't publish until 1733, and there was a century of dead space between him and Lobachevsky. Again (except the doesn't-really-count-cuz-he-backed-off Saccheri) it's all 19th C.

This area is probably the most difficult to probe; with a sufficiently rigorous mathematical base (namely those axioms of set theory that allow us to arrive at the reals [we'll just use ZFC since that's the most commonly taught axiom system]), we can arrive at the real numbers and thus calculus. Although calculus development was twofold: Newton developed calculus mostly from physical observations, while Leibniz approached the subject from a much more abstract direction (that is not to say that he approached the subject with sufficient rigor).

Of course he didn't approach it with sufficient rigour, the concept of sufficient rigour in modern terms didn't exist until Cauchy!

...Huh, I meet another mathematician and immediately begin tearing him a new one. The nature of the internet, I suppose. :eek:;)

The history is true; I was arguing from the reverse (i.e., establishing our axioms, we arrive at calculus). My argument more explicitly stated is simply that we will arrive at this point once we have the proper set of axioms, regardless of however we choose to represent numbers and concepts. Of course, reaching that point is a separate discussion, and an interesting history.

But it's really, really implausible; with the exception of a couple of really modern fields almost all math starts out with some naive version and doesn't get formalized until it starts to run into problems as a result. You can't just run that backwards: like how it's possible to build single-wing planes with 1880s tech but it didn't happen because they needed the experience creating grotesque kites-with-engines first before they could start paring the design down.

I have studied the continuum, as well as real analysis. I use Weierstraß formulation all the time, for the obvious reasons. Although I don't so much "take it for granted" as "it will be a waste of my time to use anything else, and a waste of breath justifying this other method." Also, it is what everyone else uses (in classes), so why bother utilizing anything else (and taking extra time that never is available)?

Because if the Wright brothers had had to build a plane that could carry 250 people across the Atlantic in six hours from the first go, we'd still be using Zeppelins.

All true. The history of these developments should be taught within math classes, but it is rarely thus. I bothered reading these histories; I failed to be sufficiently thorough in my explanation (another mathematical capital crime, unless it's a proof by intimidation :D).

Haha! My status as resident mathie (historian) is still intact! :D

True, but neither does any integral representation (like Arabic numerals). It takes a more nuanced approach to develop this knowledge (like the convergence of rational series), but such developments do not appear to be dependent on the number system implemented, but rather on the number of esoteric examinations of the behavior of certain mathematical objects. After all, once the Romans can construct s = SUM(n=0, infinity, 1/(2^n)), then they can observe that s -> 2.

But they won't, because doing so in the Roman system is far far far more trouble than it's worth, and generalizing to, eg, Cauchy sequences or whatever is such a leap as to be laughable. Besides, even the concept of using a symbol for a variable was not really common practice until the 1500s or so, IIRC (Diophantus did it, but only for the number being solved for and thus only for one number at a time. Plus nobody else followed his lead for a millennium or so).

I think you've been seriously let down by the lack of math history courses at your university. (So have I, really, I just read a hell of a lot about it.) The fact that nowadays we can develop all of this stuff so directly and abstractly does not mean that anyone could have at any time. We have the benefit of centuries of earlier work that enable us to see why, eg, the epsilon-delta definition of continuity is so much better than the intuitive version, and the mindset that developed out of that history, which is quite a recent development - if you tried to teach a modern calc-101-for-math-majors course in 1800, you'd be laughed out of town; hell, there are parts that would seem awkward and useless in 1880. Not because you're not teaching real math, but just because it's so dependent on modern mathematical mentality that it would seem bizarre and pointless. I'm a platonist; I think the conclusions math comes to are true - but it's still based on an enormous pile of contingent historical development, and you can't just kick that out from under it and claim "the conclusions are true, so they could still be developed".

...Er, by which I mean welcome, fellow mathie. :eek:
 
What he said.

Mathematics was VERY fuzzy but it worked. Until it didn't, and THEN you get modern formalism.

From a modern perspective, you can drive trucks through the holes in Euclid, who was considered the absolute epitome of logical thoroughness for most of two millennia. Newton and Leibniz's theory was... based not so much on sand as mud. However there was a strong underlying bedrock that wasn't discovered for another century or two.

(MSc Math, Queens, 1981 - no you're not the only mathematician)
 

I'm a bit of a dunce at Maths, does all this mean that 10% of the Government deficit is.....

__________________________________________________________
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
?
 
I'm a bit of a dunce at Maths, does all this mean that 10% of the Government deficit is.....

__________________________________________________________
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
MMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMM
?

No, it would be either ten million Ms-with-bars or ((((X)))). (For 10% thereof, one million Ms-with-bars or (((M))) respectively.)
 
Retarded actually. I do however understand the sagan unit.

Uh... that's my sig. It's a joking reference to this

GoddamnBatman.jpg


notorious comics panel (I don't think Roberto would ever be that insulting on his own, and I certainly wouldn't have sigged it if he had). It appears at the bottom of all my posts, so it's definitely not specifically directed at you. :eek:
 
You're looking at this too much as a formalist and not enough from a historical perspective. Roman numerals suffice but they're an absolute bitch to work with. I suppose you can do anything decimals can do using just roman numerals and... continued fraction representation, I guess?... but it would be an ungodly pain in the neck. Math would be technically possible but would go nowhere - witness Archimedes: he was as smart as Newton, certainly as smart as Leibniz, and he got 90% of the way to inventing calculus in the 220s BC - but he didn't, because the disaster of a number system he had to use just had too much overhead to use properly.

Actually, I was thinking of continued fraction representation (as horribly, horribly inefficient as it is, it can get the job done). True, I do view much of this through formalism, but formalism's non-existence doesn't imply that its ideas can't be derived from other things. It's just much, much, much more unlikely (rather than outright impossible). Again, find someone who is the right combination of smart and crazy and give them just the right amount of education and we can arrive at new knowledge (which, correct me if I'm wrong, but wasn't that the way things were done prior to the advent of scientific rigor?)

Where are you getting these dates from? Saccheri didn't publish until 1733, and there was a century of dead space between him and Lobachevsky. Again (except the doesn't-really-count-cuz-he-backed-off Saccheri) it's all 19th C.
More like I couldn't remember the exact date, so I ballparked it from what I remembered :eek:.

Of course he didn't approach it with sufficient rigour, the concept of sufficient rigour in modern terms didn't exist until Cauchy!
But it's really, really implausible; with the exception of a couple of really modern fields almost all math starts out with some naive version and doesn't get formalized until it starts to run into problems as a result. You can't just run that backwards: like how it's possible to build single-wing planes with 1880s tech but it didn't happen because they needed the experience creating grotesque kites-with-engines first before they could start paring the design down.
True, it is really, really, really implausible. But not impossible; again, find someone sufficiently crazy and maybe humanity gets lucky (e.g., the guys who invented fire, the brick, the wheel, farming, Stonehenge, gunpowder, &c.)

The fact that nowadays we can develop all of this stuff so directly and abstractly does not mean that anyone could have at any time.
I'm a platonist; I think the conclusions math comes to are true - but it's still based on an enormous pile of contingent historical development, and you can't just kick that out from under it and claim "the conclusions are true, so they could still be developed".
Hm, I'll stew on this. I readily agree it is nigh impossible, but to put it in familiar terms, the probability of this outcome is small but nonzero.

I take quite a different tack. I don't see the development of mathematics as necessarily dependent on history, due to the nature of the subject. After all, it is reason removed from context at its heart of hearts. Although, context has played a huge role in its development, for better or worse.

...Huh, I meet another mathematician and immediately begin tearing him a new one. The nature of the internet, I suppose. :eek:;)
*sniff* The sudden change in your attitude has driven me from mathematics forever! I'll completely change my life and study one of those useless fuzzy subjects, like law, engineering, or history! (;) to lawyers, engineers, and historians). *sniff*

:D

Anyway, grazie, señor Zyzzyva.
 
I'm not going to claim to understand even half of what's been discussed here, but I do find this topic rather interesting.
 