### Mathematical notation considered harmful.

"Giant zigzag, sitting on two symbols separated by a scarab's butt." - The diagram is three lines high, and displayed as an image. What does it MEAN? How, using the tools available, can we find out?

It means:

```
FOREACH (o IN O)
    x += f(o)
```

...which is still unacceptable. It's TERRIBLE. It would quite possibly get a programmer fired, but is considered perfectly acceptable for math professors.

Mathematicians think mathematical notation is acceptable in teaching. It is NOT acceptable, for several reasons.

1) It is not copy-pastable.

2) It is not searchable - Google is blind to these things: pasting in an omega will not give you what you want.

3) It is not linkable. So someone seeing a sum or union symbol has literally NO way of finding out what they mean. An arrow pointing to the right... what does that mean? This arrow has two bars. Hrm. Nope. I got nothing.

4) They feel it is sufficient to provide an explanation in these esoteric alchemical symbols; therefore many wiki pages are left incomplete, where otherwise an intelligible explanation might be given; thus they reduce the sum of human knowledge.

5) They choose the world's worst variable names. The above code is NOT acceptable. When they write programs, they write them with global variables 'a' through 'z', then start again at 'aa', 'ab' - this is not a rarity, it is a commonplace amongst math professors in most faculties. It is how they think: without clarity or any need to explain their workings.

```
eventProbability = 0
FOREACH (outcome IN eventOutcomeList)
    eventProbability += probabilityOf(outcome)
```

Now that's getting closer to the right way to do it. Without esoteric one-letter symbols to represent things, you stand a chance of being able to see at a glance what is going on.
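As a runnable sketch of that same idea (Python here, with made-up coin-flip probabilities standing in for the outcome list):

```python
# Hypothetical example data: each outcome mapped to its probability.
probability_of = {"heads": 0.5, "tails": 0.5}

def total_probability(outcomes):
    """Sum the probability of each outcome in the event."""
    event_probability = 0.0
    for outcome in outcomes:
        event_probability += probability_of[outcome]
    return event_probability

print(total_probability(["heads", "tails"]))  # → 1.0
```

Every name here is copyable, searchable, and self-describing; no key is needed to decode it.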

Mathemagical symbols were designed for writing quickly on blackboards. They are the WRONG TOOL for writing on computers.

"But Dewi, we have no better tool" - what, other than English, every other written language, and every single programming language? Well, if you're not willing to use any of those, then perhaps design a better terminology, then, and push for it to be accepted. But I can guarantee there is a clear and concise terminology in most programming languages to represent every one of those concepts, and that this terminology will at least be copyable, linkable, and searchable.

And don't cry that the terminology is unfamiliar and doesn't have global acceptance in the mathematical community: that is an EXCELLENT feature, for it will mean that you will link to somewhere that clearly defines all your symbols in the language of the rest of your article, and you will actually pick decent variable and function names, instead of arbitrary letters!

And in the commons: I'd be willing to bet that more people know C/C++ or PHP than know how to read math markup. There's a reason for that, and it's not because these people are stupid: it's because math markup is

*more user-hostile than C/C++*!

[When I wrote this in a BoingBoing comment, I got the response: "So how about a link to your clear and concise universal grammar/language that all can use to do everything. I personally know several gurus who would be totally willing to throw mad mad money in your (that/this) direction for such a solution to this very general problem. Seriously, dude, we're waiting."

However, I'm afraid that I have no intention of linking "English, every other written language, and every single programming language" - if a reader is unable to find references to these languages themselves, they should probably not be writing, period. But, as a hint: you want what we call a "dictionary". They've been around for at least 4,300 years, so you should be able to pick one up quite cheap. For what it's worth, "mad money" has already been spent on the development of these languages. They already exist, and are very well refined and capable of expressing any nuance you might imagine. They don't need further investment from "gurus".]

## Boolean algebra, perhaps?

In that case, I can understand why. In some representations of Boolean algebra, "." means AND, and "+" means OR. Which is not immediately intuitive, and seems perhaps the opposite of what it should be.

These operations are variously shown as:

- Conjunction(xy), x AND y, x & y, x && y, x ^ y, Kxy, x . y
- Disjunction(xy), x OR y, x | y, x || y, x v y, Axy, x + y
- Negation(x), NOT x, !x, ¬x, Nx, "x with a bar on top that I can't draw here."

But I think the traditional Boolean symbols were selected as an aide-mémoire that some rules hold true for Booleans, as with regular arithmetic: a+0=a, a.0=0, a.1=a.

But there are more things that work in traditional algebra and would also have worked whichever way round you picked the symbols with Boolean algebra: a+b=b+a, a.b=b.a, a.(b.c)=(a.b).c, a+(b+c)=(a+b)+c, and a.(b+c)=(a.b)+(a.c) (which last works only this way round for traditional math, but both ways for Boolean).

And there are things which hold true only for Booleans, like: a = a+a = a.a = a.(a+b) = a+(a.b), and a+1=1.
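Since the Boolean domain has only two values, every identity above can be checked exhaustively in a few lines; a quick sketch, using "+" as OR and "." as AND per the traditional notation:

```python
from itertools import product

OR = lambda a, b: a | b   # "+" in the traditional notation
AND = lambda a, b: a & b  # "." in the traditional notation

for a, b, c in product((0, 1), repeat=3):
    # Identities shared with ordinary arithmetic:
    assert OR(a, 0) == a                                  # a + 0 = a
    assert AND(a, 0) == 0                                 # a . 0 = 0
    assert AND(a, 1) == a                                 # a . 1 = a
    assert OR(a, b) == OR(b, a)                           # a + b = b + a
    assert AND(a, AND(b, c)) == AND(AND(a, b), c)         # a.(b.c) = (a.b).c
    assert AND(a, OR(b, c)) == OR(AND(a, b), AND(a, c))   # a.(b+c) = a.b + a.c
    # Identities that hold ONLY for Booleans:
    assert a == OR(a, a) == AND(a, a)                     # a = a+a = a.a
    assert a == AND(a, OR(a, b)) == OR(a, AND(a, b))      # absorption
    assert OR(a, 1) == 1                                  # a + 1 = 1
    assert OR(a, AND(b, c)) == AND(OR(a, b), OR(a, c))    # dual distributivity

print("all identities hold")
```

Eight cases, checked mechanically; the loop passes without raising, confirming every identity listed above.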

So its utility as an aide-mémoire is a bit debatable, yeah. The programming habit of using entirely separate symbols (& and |, or AND and OR) to represent Boolean operations is in my opinion much better than trying to recycle them. For a similar reason, operator overloading is treated with caution in many style guides, and some languages (Java and Go, for example) omit it entirely.