### Mathematical notation considered harmful

"Giant zigzag, sitting on two symbols separated by a scarab's butt." - The diagram is three lines high, and displayed as an image. What does it MEAN? How, using the tools available, can we find out?

It means:

```
x = 0
FOREACH (o IN O)
    x += f(o)
```

...which is still unacceptable. It's TERRIBLE. It would quite possibly get a programmer fired, but is considered perfectly acceptable for math professors.

Mathematicians think mathematical notation is acceptable in teaching. It is NOT acceptable, for several reasons.

1) It is not copy-pastable.

2) It is not searchable - Google is blind to these things: pasting in an omega will not give you what you want.

3) It is not linkable. So someone seeing a sum or union symbol has literally NO way of finding out what they mean. An arrow pointing to the right... what does that mean? This arrow has two bars. Hrm. Nope. I got nothing.

4) They feel it is sufficient to provide an explanation in these esoteric alchemical symbols; therefore many wiki pages are left incomplete, where otherwise an intelligible explanation might be given; thus they reduce the sum of human knowledge.

5) They choose the world's worst variable names. The above code is NOT acceptable. When they write programs, they write them with global variables 'a' through 'z', then start again at 'aa', 'ab' - this is not a rarity, it is a commonplace amongst math professors in most faculties. It is how they think: without clarity or any need to explain their workings.

```
eventProbability = 0
FOREACH (outcome IN eventOutcomeList)
    eventProbability += probabilityOf(outcome)
```

Now that's getting closer to the right way to do it. Without esoteric one-letter symbols to represent things, you stand a chance of being able to see at a glance what is going on.
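As a sketch of what that pseudocode might look like in a real language - Python here, and the die-rolling event and the `probability_of` helper are invented for illustration, not taken from any particular source:

```python
# Hypothetical example: the probability of rolling an even number
# on a fair six-sided die. Names are illustrative, not canonical.
def probability_of(outcome):
    """Each face of a fair die is equally likely."""
    return 1 / 6

event_outcomes = [2, 4, 6]  # the outcomes that make up the event

event_probability = 0.0
for outcome in event_outcomes:
    event_probability += probability_of(outcome)

print(event_probability)  # 0.5
```

Python's built-in `sum()` would even shrink the loop to one line while keeping the descriptive names: `sum(probability_of(o) for o in event_outcomes)`.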

Mathemagical symbols were designed for writing quickly on blackboards. They are the WRONG TOOL for writing on computers.

"But Dewi, we have no better tool" - what, other than English, every other written language, and every single programming language? Well, if you're not willing to use any of those, then perhaps design a better terminology, then, and push for it to be accepted. But I can guarantee there is a clear and concise terminology in most programming languages to represent every one of those concepts, and that this terminology will at least be copyable, linkable, and searchable.

And don't cry that the terminology is unfamiliar and doesn't have global acceptance in the mathematical community: that is an EXCELLENT feature, for it will mean that you will link to somewhere that clearly defines all your symbols in the language of the rest of your article, and you will actually pick decent variable and function names, instead of arbitrary letters!

And in the commons: I'd be willing to bet that more people know C/C++ or PHP than know how to read math markup. There's a reason for that, and it's not because these people are stupid: it's because math markup is

*more user-hostile than C/C++*!

*[When I wrote this in a BoingBoing comment, I got the response "So how about a link to your clear and concise universal grammar/language that all can use to do everything. I personally know several gurus who would be totally willing to throw mad mad money in your (that/this) direction for such a solution to this very general problem. Seriously, dude, we're waiting."*

*However, I'm afraid that I have no intention of linking "English, every other written language, and every single programming language" - if a reader is unable to find references to these languages themselves, they should probably not be writing, period. But, as a hint: you want what we call a "dictionary". They've been around for at least 4,300 years, so you should be able to pick one up quite cheap. For what it's worth, "mad money" has already been spent on the development of these languages. They already exist, and are very well refined and capable of expressing any nuance you might imagine. They don't need further investment from "gurus".]*

## It seems we agree

dewimorgan: Firstly, you and I both agree that there *is* a problem. I'm happy to agree that the problem's too complex for me to cover in a single blog post. I can, at best, scratch the surface of why it's bad, provide an example of how mathematical notation could be expressed intelligibly, and pose some solutions, and hope that my readers "get it". It seems not.

Next, you agree with me that mathematical notation is intended for terse communication between people who already know the subject inside-out, rather than novices.

And finally, you agree that "fit for purpose" differs between the blackboard and the computer. That is exactly what I meant when I wrote "Mathemagical symbols were designed for writing quickly on blackboards. They are the WRONG TOOL for writing on computers."

So, it seems that we agree that mathematical notation is unfit for purpose on a computer; and that it is unfit for purpose for pedagogy; and that this is a complex problem.