In most regards, we have come an extremely long way from the
first computers. There have been ridiculous advances in size, cost, speed, power
consumption and portability. But what’s interesting is that today, outside of
extravagant purpose-built supercomputers, everybody’s running Intel with the
same architecture. It’s like going to the car lot and having your choice of 10
different makes, and they’ve all got the same engine and suspension. Apple
shows up with a different interior and a different paint color. Let’s say,
Victoria Plum Firemist. Dell can’t figure out what color to paint their car so
they look at Apple and paint theirs “Purple”.
That’s the state of machines today.
But it wasn’t always so. In fact, until the PC revolution,
there was no common “best” way to build a machine. For the first decade or so
of commercial machines, you chose either a scientific computer or a business-oriented computer. Both were general purpose, but a scientific machine excelled
at complex math (maybe it had hardware floating point) while a business machine
was better at sorting and merging (high speed I/O). You might buy a Univac 1103
for your hydrodynamics problems, and a Univac II for payroll and inventory. The
eventual availability of compact, affordable memory and reduced circuit size
meant one machine could excel at both tasks. History credits this to the famed
IBM 360 which was a make-or-break proposition for the company*. If you’ve ever
wondered why it was called the 360, now you know: to encompass the complete
circle of computing demands.
But a bent towards one field or the other is only the broadest of distinctions. Today a byte is fixed at 8 bits, and on the early micros the byte was also the machine word. 8-bit microprocessors spurred the microcomputer industry in the late ’70s. Four bits was too few (there was an Intel 4004), and 16 bits was, at that stage, just out of reach for most applications. But it was realized very early on that the bigger
the computer word, the more “work” that could be done in a single computing
cycle. And even in 1952, people were trying to get all the computing speed they
could out of their vacuum tube machines (which were remarkably fast). One
machine might specify an instruction, a source address and a destination
address (in binary of course) in a single word. Words of 64 bits, or even more, were not
uncommon 50 years ago. A large word also had the advantage that large numbers
could be manipulated without splitting them into segments, but that’s another
story. Other machines, such as the NORC, might even specify the addend address,
the augend address and the destination address in a single word. Others included
the address of the next instruction within that word.
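Just to make that concrete, here’s a rough sketch (in Python, with a made-up field layout rather than any real machine’s format) of how an opcode, three operand addresses and a next-instruction address might all be packed into one 64-bit word:

```python
# A made-up single-word instruction layout (64 bits), NOT any real machine's format:
#   opcode (8) | addend addr (14) | augend addr (14) | dest addr (14) | next-instruction addr (14)

OPCODE_BITS, ADDR_BITS = 8, 14
ADDR_MASK = (1 << ADDR_BITS) - 1

def pack(opcode, addend, augend, dest, next_instr):
    """Pack all five fields into one 64-bit instruction word."""
    word = opcode
    for field in (addend, augend, dest, next_instr):
        word = (word << ADDR_BITS) | (field & ADDR_MASK)
    return word

def unpack(word):
    """Recover the five fields from a packed word."""
    fields = []
    for _ in range(4):
        fields.append(word & ADDR_MASK)
        word >>= ADDR_BITS
    next_instr, dest, augend, addend = fields
    opcode = word & ((1 << OPCODE_BITS) - 1)
    return opcode, addend, augend, dest, next_instr

# "Add the contents of 0100 and 0200, put the result at 0300, then fetch the next
# instruction from 0051": the whole operation described in a single word.
w = pack(0x01, 0o100, 0o200, 0o300, 0o051)
assert unpack(w) == (0x01, 0o100, 0o200, 0o300, 0o051)
```

Even at 64 bits, those 14-bit address fields reach only 16,384 locations apiece, which gives a taste of why designers agonized over every bit of the layout.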
And here’s the thing… Nobody had yet settled the matter of
whether decimal arithmetic or binary arithmetic was the superior approach. That
would take another 10 or 15 years. Having the machine work entirely in binary
meant more efficient use of memory, but it also meant that conversion into, and
out of, the machine would be required if it were to be human readable. And then
we have 1’s complement, 2’s complement, excess-3 (XS3), and so on. Decimal machines sidestepped the conversion problem by storing and operating on the digits directly as BCD, but 4 bits are required for each digit, so there’s a memory tradeoff: of the sixteen possible codes, you essentially throw away the values 10 through 15.
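For a feel of what that storage looks like, here’s a minimal sketch (plain packed BCD in Python, not any particular machine’s digit format): each decimal digit takes its own 4-bit nibble, and the codes 10 through 15 simply never appear.

```python
def to_bcd(n):
    """Pack a non-negative integer into packed BCD, one decimal digit per 4-bit nibble."""
    bcd, shift = 0, 0
    for ch in reversed(str(n)):
        bcd |= int(ch) << shift   # each nibble holds 0-9; codes 10-15 go unused
        shift += 4
    return bcd

def from_bcd(bcd):
    """Unpack a packed-BCD value back into an ordinary integer."""
    digits = []
    while True:
        digits.append(str(bcd & 0xF))
        bcd >>= 4
        if bcd == 0:
            break
    return int("".join(reversed(digits)))

assert hex(to_bcd(1952)) == "0x1952"   # the nibbles read just like the decimal digits
assert from_bcd(to_bcd(1952)) == 1952
```

Four bits per digit means a ten-digit number costs 40 bits, where pure binary would need only 34.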
And what then, of letters? IBM preferred its own BCD character code, the 6-bit forerunner of EBCDIC, which assigned 6 bits to each character; others preferred some flavor of 7-bit ASCII. Not only are the two incompatible, but ASCII packs the alphabet into one contiguous run, so the codes sort naturally: G is greater than F, and F is greater than E, with no surprises anywhere in between. IBM’s codes scatter the letters into separate groups with gaps between them and place the digits after the alphabet, so collating takes more care. And how might the machine tell the difference between a number and a letter? If two zeroed "zone" bits are tacked onto each BCD digit, memory is wasted, though clever designers might then press those two bits into other service (in the 1401, for instance, they carried things like the sign of a field).
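You can see the collating headache in a few lines of Python, using the modern cp037 code page as a stand-in for IBM’s encodings (the old 6-bit codes differ in detail, but the scattered alphabet is the same idea):

```python
# Compare how the alphabet lands in ASCII vs. EBCDIC (code page 037).
letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

ascii_codes  = [ord(c) for c in letters]        # 0x41..0x5A, one unbroken run
ebcdic_codes = list(letters.encode("cp037"))    # A-I, J-R, S-Z in separate blocks

# ASCII: each letter's code is exactly one more than its predecessor's.
assert all(b - a == 1 for a, b in zip(ascii_codes, ascii_codes[1:]))

# EBCDIC: the run breaks after I and after R, so "next letter = next code" fails.
breaks = [letters[i + 1] for i, (a, b) in enumerate(zip(ebcdic_codes, ebcdic_codes[1:])) if b - a != 1]
print(breaks)   # ['J', 'S']
```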
But as the saying goes, that’s not all! Why must the machine
word length be a fixed number of digits? What if it were open-ended, with the ends defined by some sort of bit pattern? This would certainly make for a much more efficient machine and a lot less wasted memory. And so was the IBM 1401 born.
It far outsold the conservative projections of most everyone at IBM and became
the most popular computer in the world up until the microcomputer movement.
Words could be variable in length, their extent marked off by an extra bit carried in each character position, known as the word mark.
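Here’s a toy illustration of the idea (Python, loosely modeled on the 1401’s word marks rather than a faithful emulation): every storage position carries a character plus a word-mark flag, and a field is read by scanning until the flag turns up.

```python
# Each storage position holds a character plus a word-mark flag (True/False).
# Loosely modeled on the 1401: a field is addressed at its low-order (rightmost)
# end and extends leftward until a position with the word mark set is reached.

memory = [
    ('J', True),  ('O', False), ('N', False), ('E', False), ('S', False),   # name field
    ('0', True),  ('4', False), ('2', False), ('5', False),                 # amount field
]

def read_field(addr):
    """Collect characters from addr leftward, stopping at the word mark (inclusive)."""
    chars = []
    while addr >= 0:
        char, mark = memory[addr]
        chars.append(char)
        if mark:
            break
        addr -= 1
    return ''.join(reversed(chars))

print(read_field(4))   # JONES - five characters, no padding to a fixed size
print(read_field(8))   # 0425  - four characters, packed right next to it in storage
```

No storage is spent rounding JONES up to some fixed word size, which is exactly the savings the variable-length design was selling.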
So now we have scientific and business, BCD and binary,
fixed length (of many varieties), variable length, and encoding differences.
Oh yes, and checking facilities. How does one ensure the
exact number of bits has been recorded to tape when the blocks are written at
several thousand bits per second and the loss of just one bit will completely
invalidate the data? To say nothing of swapping reels between drives whose alignment varied from unit to unit. Well, there was no one way to do it. “All 1’s” checkers, checksum
counts, even/odd parity bits, redundant channel recording, redundant CPU
comparisons, and a host of other techniques were favored by various companies
for various machines. And all worked pretty darn well.
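Parity is the easiest of these to show. Here’s a minimal sketch (generic even/odd parity on a 6-bit character, not any particular drive’s recording format):

```python
def parity_bit(char_code, odd=True):
    """Compute the parity bit recorded alongside a 6-bit character."""
    ones = bin(char_code & 0o77).count("1")
    even_bit = ones % 2                         # 1 if the data bits hold an odd count of 1s
    return even_bit ^ 1 if odd else even_bit    # odd parity: make the total count of 1s odd

def check(char_code, recorded_bit, odd=True):
    """Re-derive the parity on readback and compare with what was recorded."""
    return parity_bit(char_code, odd) == recorded_bit

code = 0b100101                        # a 6-bit character as written to tape
p = parity_bit(code)                   # odd parity, just as an example
assert check(code, p)                  # an intact character passes
assert not check(code ^ 0b000100, p)   # a single dropped (or added) bit is caught
```

A single lost bit flips the count and the mismatch shows up immediately; two lost bits in the same character cancel out, which is one reason the bigger machines layered checksums and redundant recording on top.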
And still, we haven’t discussed instruction architecture,
which can get pretty wild. Or memory systems and addressing.
All of these differences meant lots of variety, and competition, and new ideas in CPU and architecture design. That’s something exceedingly rare today, and likely never to be revisited, given that software development is done primarily in high-level languages and coders no longer have to give any thought to the nuts and bolts of the machine.
And as long as we can put more transistors on a chip, and make memory smaller, cheaper,
and faster, there’s no need to care. The downside is that we’ve narrowed our
path of computer evolution, which is only one of many nails in the coffin for the
personal computer.
*One of many reasons the 360 was a gamble was that up until
this time, IBM had attempted some semblance of compatibility within its scientific line, and separately within its business line. The 360 was
totally incompatible with ALL existing machines, and it would be some time
before emulators (or useful software in general) would be available.