The Watering Hole: September 21 – The ASCII System

A usable numbering notation arose out of necessity during the early computer age. Using raw binary was out of the question, and the choice of octal was dictated by economy of design. Two problems arose from this practice: only upper-case letters could be defined, and the numerals 0 through 9 were assigned the octal code values 60 through 71 (60, 61, 62, 63, 64, 65, 66, 67, 70 and 71 respectively).
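As a quick illustration (a minimal Python sketch of my own, using today's ASCII values rather than any particular early machine), the digits really do land on octal 60 through 71:

    # Show the octal code point of each digit character.
    # In ASCII the digits '0' through '9' occupy octal 060 through 071.
    for digit in "0123456789":
        print(digit, format(ord(digit), "o"))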

While the logic circuitry did not care about such representations, it blew the minds of early logic developers working toward base-10 arithmetic. This led to a code point layout in which the low-order digit was base 16 (hexadecimal) and the high-order digit was a base-4 number, so the numerals 0 through 9 could be represented as 30, 31, 32, 33, 34, 35, 36, 37, 38 and 39 in a 6-bit code. The limited number of code points restricted the characters that could be represented without resorting to up-shifts and down-shifts. Earlier 5-bit codes relied on numeric shifts as well. A slew of 5-bit code sets arose out of the confusion, but the Baudot and Murray sets rose to the top by the early 1900s. It is from the Baudot code set that we get the term 'baud'.
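The mixed-radix reading described above is easy to demonstrate. The sketch below is only an illustration (the helper name split_code_point is mine, and it applies the split to today's ASCII digit values masked down to six bits), but it shows how the top two bits form a base-4 digit and the bottom four bits form a hexadecimal digit, yielding 30 through 39:

    def split_code_point(value):
        """Split a 6-bit value into a high base-4 digit and a low hex digit."""
        high = value >> 4   # top two bits: a base-4 digit, 0..3
        low = value & 0xF   # bottom four bits: a hexadecimal digit, 0..F
        return high, low

    for digit in "0123456789":
        high, low = split_code_point(ord(digit) & 0x3F)  # keep only six bits
        print(digit, f"{high}{low:X}")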

This limitation led to the adoption of the ASCII code points. This is a modern ASCII translation table:
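(For anyone who wants to regenerate such a table, a short Python sketch along these lines will print the printable range with decimal, octal and hexadecimal code points; the formatting is just one possible layout.)

    # Print the printable ASCII range (32-126) with decimal, octal,
    # and hexadecimal code points, four entries per row.
    entries = [f"{n:3d}  {n:03o}  {n:02X}  {chr(n)!r}" for n in range(32, 127)]
    for start in range(0, len(entries), 4):
        print("   ".join(entries[start:start + 4]))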

This simplified the job of the designers because the numerics were in a form more compatible with human experience. Life would have been even easier if the Arabs had originally come forth with a hexadecimal numbering system instead of base ten. The only problem there would have been the representation of the numbers four and two: in a four-finger system they would be rendered as _|__ and __|_ respectively.

This is our open thread. Please feel free to offer your own comments on this or any other topic.
