Dewi Morgan

Mathematical notation considered harmful.
"Giant zigzag, sitting on two symbols separated by a scarab's butt." - The diagram is three lines high, and displayed as an image. What does it MEAN? How, using the tools available, can we find out?

[Image: the three-line diagram in question, a capital sigma sitting over "o ∈ O", applied to f(o).]

It means:

FOREACH (o IN O)
    x += f(o)

...which is still unacceptable. It's TERRIBLE. It would quite possibly get a programmer fired, but is considered perfectly acceptable for math professors.

Mathematicians think mathematical notation is acceptable in teaching. It is NOT acceptable, for several reasons.

1) It is not copy-pastable.
2) It is not searchable - Google is blind to these things: pasting in an omega will not give you what you want.
3) It is not linkable. So someone seeing a sum or union symbol has literally NO way of finding out what they mean. An arrow pointing to the right... what does that mean? This arrow has two bars. Hrm. Nope. I got nothing.
4) They feel it is sufficient to provide an explanation in these esoteric alchemical symbols; therefore many wiki pages are left incomplete, where otherwise an intelligible explanation might be given; thus they reduce the sum of human knowledge.
5) They choose the world's worst variable names. The above code is NOT acceptable. When they write programs, they write them with global variables 'a' through 'z', then start again at 'aa', 'ab' - this is not a rarity; it is commonplace amongst math professors in most faculties. It is how they think: without clarity or any need to explain their workings.

eventProbability = 0
FOREACH (outcome IN eventOutcomeList)
    eventProbability += probabilityOf(outcome)

Now that's getting closer to the right way to do it. Without esoteric one-letter symbols to represent things, you stand a chance of being able to see at a glance what is going on.
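
For concreteness, here's that same computation as a runnable C++ sketch. The names are carried over from the pseudocode above; the types and the die-roll stand-in are my own inventions, purely for illustration:

#include <vector>

// Hypothetical stand-in: in a real program this would look up or
// compute the probability of a single outcome.
double probabilityOf(int outcome) {
    return 1.0 / 6.0; // e.g. any one face of a fair six-sided die
}

// The "giant zigzag" (capital sigma): sum probabilityOf(outcome) over
// every outcome in the list, with every part given a readable name.
double eventProbability(const std::vector<int>& eventOutcomeList) {
    double probability = 0.0;
    for (int outcome : eventOutcomeList) {
        probability += probabilityOf(outcome);
    }
    return probability;
}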

Mathemagical symbols were designed for writing quickly on blackboards. They are the WRONG TOOL for writing on computers.

"But Dewi, we have no better tool" - what, other than English, every other written language, and every single programming language? Well, if you're not willing to use any of those, then perhaps design a better terminology, then, and push for it to be accepted. But I can guarantee there is a clear and concise terminology in most programming languages to represent every one of those concepts, and that this terminology will at least be copyable, linkable, and searchable.

And don't cry that the terminology is unfamiliar and doesn't have global acceptance in the mathematical community: that is an EXCELLENT feature, for it will mean that you will link to somewhere that clearly defines all your symbols in the language of the rest of your article, and you will actually pick decent variable and function names, instead of arbitrary letters!

And in the commons: I'd be willing to bet that more people know C/C++ or PHP than know how to read math markup. There's a reason for that, and it's not because these people are stupid: it's because math markup is more user-hostile than C/C++!


[When I wrote this in a BoingBoing comment, I got the response "So how about a link to your clear and concise universal grammar/language that all can use to do everything. I personally know several gurus who would be totally willing to throw mad mad money in your (that/this) direction for such a solution to this very general problem. Seriously, dude, we're waiting."

However, I'm afraid that I have no intention of linking "English, every other written language, and every single programming language" - if a reader is unable to find references to these languages themselves, they should probably not be writing, period. But, as a hint: you want what we call a "dictionary". They've been around for at least 4,300 years, so you should be able to pick one up quite cheap. For what it's worth, "mad money" has already been spent on the development of these languages. They already exist, and are very well refined and capable of expressing any nuance you might imagine. They don't need further investment from "gurus".]
  • When I wrote math essays I was always frustrated by this trouble. I had to type each symbol in MathType, then copy-paste it into the Word document. However, I take a sort of pride in my professional work - the more complex it is, the more satisfied I am. Recently I've been struggling with my BINARY OPERATION; the symbols DO NOT mean what they're supposed to. You know, human beings just like to make things complicated to show that "we are not the same as monkeys!"
    • Boolean algebra, perhaps?

      I'm wondering if, here, the writer means Boolean, rather than binary.

      In that case, I can understand why. In some representations of Boolean algebra, "." means AND, and "+" means OR. Which is not immediately intuitive, and seems perhaps the opposite of what it should be.

      These operations are variously shown as:

      Conjunction(xy), x AND y, x & y, x && y, x ^ y, Kxy, x . y
      Disjunction(xy), x OR y, x | y, x || y, x v y, Axy, x + y
      Negation(x), NOT x, !x, ¬x, Nx, "x with a bar on top that I can't draw here."

      But I think the traditional Boolean symbols were selected as an aide-mémoire that some rules hold true for Booleans, as with regular arithmetic: a+0=a, a.0=0, a.1=a.

      And there are more things that work in traditional algebra and would also have worked whichever way round you picked the symbols for Boolean algebra: a+b=b+a, a.b=b.a, a.(b.c)=(a.b).c, a+(b+c)=(a+b)+c, and a.(b+c)=(a.b)+(a.c) (the last of which works only this way round in traditional math, but both ways for Booleans).

      And there are things which hold true only for Booleans, like: a = a+a = a.a = a.(a+b) = a+(a.b), and a+1=1.
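
      All of those identities involve at most three variables, each with only two possible values, so they can be checked by brute force in a few lines. Here's a throwaway C++ check of my own (a sketch, nothing canonical about it):

      #include <cassert>

      int main() {
          // Exhaustively check the Boolean identities above over every
          // combination of variable values.
          for (bool a : {false, true}) {
              assert(a == (a || a));       // a = a+a
              assert(a == (a && a));       // a = a.a
              assert((a || true) == true); // a+1 = 1
              for (bool b : {false, true}) {
                  assert(a == (a && (a || b))); // a = a.(a+b)
                  assert(a == (a || (a && b))); // a = a+(a.b)
                  for (bool c : {false, true}) {
                      // Distribution works both ways round for Booleans:
                      assert((a && (b || c)) == ((a && b) || (a && c)));
                      assert((a || (b && c)) == ((a || b) && (a || c)));
                  }
              }
          }
          return 0;
      }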

      So its utility as an aide-mémoire is a bit debatable, yeah. The programming habit of using entirely separate symbols (& and |, or AND and OR) to represent Boolean operations is in my opinion much better than trying to recycle the arithmetic ones. Much the same reasoning explains why operator overloading is generally frowned upon, and why some languages omit it entirely.
  • Mathematicians make up new words every day

    (Anonymous)
    I think it's pretty clear you don't do mathematics or know any mathematicians. Mathematicians *invent* new definitions on a daily basis, and the concepts are far more complicated and nuanced than you imagine. They honestly need to invent these new words because there is nothing existing to express what they're talking about. Part of the reason you don't understand that is because you're afraid to actually learn anything about mathematics before judging it as completely nonsensical. The problem is not the notation; a twelve-year-old can learn the notation. The problem is your ability to go outside of your comfort zone to learn something difficult and new.
    • So do programmers: we just do it better.

      I suppose I should respond to this anon, though it feels like they only read the title, rather than the post. Let's not make that same mistake, and instead address each point.

      > "I think it's pretty clear you don't do mathematics or know any mathematicians."

      Ad hom. Also takes the view that I am addressing the use of such symbols between experts in the field, which I think I very clearly am not, so also a little strawmannish.

      Also happens to be false. I'm a programmer: that means that math is a natural part of my life. But I don't call it "f(x)"; I use "fourierTransform(signal);" or "quaternion.increaseYaw(angle);". Still some strange symbols, but the next person to try to maintain my code would at least have a chance.

      > "Mathematicians *invent* new definitions on a daily basis, and the concepts are far more complicated and nuanced than you imagine."

      The same is true, to a greater extent, of programmers. However, where mathematicians value simplicity, economy, brevity, and terseness, we value readability above all.

      So when a mathematician needs something to represent an item contained within a set, he asks himself "which single symbol should I use for this?" and picks an arbitrary symbol and modifies it with a strike, dash, bar, rotation, or something he personally understands, but nobody else will until he explains it to them. So he picks a scarab's ass, and in his head calls it "element of".

      When a programmer invents a new definition, we ask ourselves "what's the best term to describe this?" We strive not to reuse terms. So if we have code to define a container, what should we call the thing contained? Object and item are already used. Containable? ElementOf? We will give this much thought: we won't simply call it 'o' and be done.
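
      To make that concrete, here's a minimal C++ sketch of the kind of naming deliberation I mean. Container and Containable are hypothetical names, chosen for this example:

      #include <algorithm>
      #include <vector>

      // The thing contained gets a considered name, not 'o'.
      template <typename Containable>
      class Container {
          std::vector<Containable> containables;
      public:
          void add(const Containable& containable) {
              containables.push_back(containable);
          }

          // "Element of", spelled out as a readable membership test.
          bool contains(const Containable& candidate) const {
              return std::find(containables.begin(), containables.end(),
                               candidate) != containables.end();
          }
      };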

      "They honestly need to invent these new words because there is nothing existing to express what they're talking about."

      Firstly, an upward-facing scarab-butt is not a word; it's a symbol. Secondly, "element of" is a perfectly acceptable pre-existing English term for it. As I wrote, "I can guarantee there is a clear and concise terminology in most programming languages to represent every one of those concepts, and that this terminology will at least be copyable, linkable, and searchable." Simply claiming "oh no there isn't" is not a valid rebuttal.

      "...you don't understand that ... you're afraid to actually learn ... judging it as completely nonsensical ... go outside of your comfort zone to learn something ..."

      And this passage is why I left off responding to this, and left off this topic entirely, for so long. Armchair psychology from an anon? If that's all I'd get for this, why even bother? It was dispiriting that someone could miss the point SO BADLY.

      But looking at it now, it's a perfect illustration of the kind of overweening arrogance and blindness that is used to preserve this particular terrible status quo.

      When I read an article about a chemical, I see stuff written in English: https://en.wikipedia.org/wiki/Paraquat is a fairly typical example. Chemistry information is shown both in a sidebar and in the body text, but even though the article is about a chemical, the topic remains accessible to a layman.

      In contrast, consider any math page, no matter how popular the topic might be to lay-people: https://en.wikipedia.org/wiki/Klein_bottle. While this is an unusually good page in that it at least makes an effort to explain some of the topics in English as well as symbols, this is far from true of all.

      To me this feels like a selective form of blindness. Chemists know that they are using specialist terms, and willingly explain them. Mathematicians appear to feel that everyone should understand the notation - no matter how unsuitable it is for their medium - and that explaining it in English is... I don't know. I'm really not sure what their motivation for avoiding being intelligible might be, and I don't want to sink to contemptuous armchair-psychology as the other anon poster did, and suggest that perhaps it's "out of their comfort zone". Pointless? Beneath them? Impossible?

      Even in chemistry, arcane alchemical symbols quickly gave way to better notations: why not in math?
  • (Anonymous)
    This is a very superficial analysis of a complex problem. The whole purpose of mathematical notation is to facilitate communication, not between novices who want to learn about the subject from Wikipedia, but between people who spend a considerable amount of time dealing with the concepts that mathematical notation is designed to express. The particular notation itself is incidental; what's important is that it's fit for purpose. "Fit for purpose" doesn't mean the same thing on a computer as on a sheet of paper.
    • It seems we agree

      I'm puzzled by your response. You seem to be phrasing it as disagreement, but you agree with me on every point you make. It's almost as if you had only read the title, and not the post; an impression strengthened by the fact that you don't address any of the points I made in it.

      Firstly, you and I both agree that there *is* a problem. I'm happy to agree that the problem's too complex for me to cover in a single blog post. I can, at best, scratch the surface of why it's bad, provide an example of how mathematical notation could be expressed intelligibly, pose some solutions, and hope that my readers "get it". It seems not all of them did.

      Next, you agree with me that mathematical notation is intended for terse communication between people who already know the subject inside-out, rather than novices.

      And finally, you agree that "fit for purpose" differs between the blackboard and the computer. That is exactly what I meant when I wrote "Mathemagical symbols were designed for writing quickly on blackboards. They are the WRONG TOOL for writing on computers."

      So, it seems that we agree that mathematical notation is unfit for purpose on a computer; and that it is unfit for purpose for pedagogy; and that this is a complex problem.