Free Republic
Smoky Backroom

To: PatrickHenry

Bravo!

Well done my friend. :-)


6,961 posted on 08/21/2006 5:32:26 AM PDT by RadioAstronomer (Senior member of Darwin Central)


To: RadioAstronomer
Something I wrote but never posted, for a thread now lost in the depths of time.

Over the past several weeks I have been contemplating the direction this now-defunct thread took. I thought I would share some of those thoughts this evening.

Shermy, get the wayback machine! :-)

Back in the 1960s I watched a show called "Star Trek". One of the themes that ran through many of the episodes was the impact of computers, both on the starship and on the worlds it visited. From that moment on, I was fascinated with computers and machine intelligence. Moore's law had yet to be formulated and the microprocessor had not yet been invented. Heck, not long before, in 1947, the very first transistor had been invented, and on April 25, 1961, the first patent was granted for the integrated circuit. And as they like to say, "the rest is history."

Anyhow, when Star Trek was first being aired, RTL, DTL, and TTL logic families were still king of the hill. Many of the computers of that time were large mainframes that used electromechanical interfaces such as teletypes. There were articles about learning machines in publications such as Scientific American and the like; however, they gave only the first glimmers of where computer technology would lead us. Gemini and Apollo were king of the hill for NASA, and the future looked promising (at least to some of us).

Let me digress a bit further into the culture of those times. Overpopulation was a great worry, the Vietnam War had provoked a huge backlash from the counterculture, and racial clashes were commonplace. I still remember walking into bookstores and seeing the posters of the time showing a ruined society with waves of people everywhere. Biker gangs, drugs, and hippies were all the rage. During this tumultuous time, progress continued on these tiny circuits that would later evolve into the microprocessor. The hi-fi, the television, and the telephone were about all that the average house had to reflect the great leaps that were happening. Banking was the closest most folks got to computing, and the news often carried stories about computer banking errors. This did not instill trust in the new technology among the general public. I often heard "what good is it?" or "newfangled" during that time frame. Oh, most folks knew NASA needed computers; to the general masses, however, they seemed more of a bother than a boon.

This culminated toward the end of the '60s and into the first part of the '70s. Books such as "Future Shock" by Alvin Toffler were written about the "information age". The supposition was that technology would "explode" at such a pace that the average person would be lost in the sea of technology and end up rejecting it and/or being buried by it. I personally did not adhere to that mindset. I so wanted my own computer. Unfortunately, computers were still the realm of either sci-fi or companies. Individuals just did not own their very own computers. I would talk to my dad and others and, more often than not, would receive the reply: "what would you do with it?"

To this day I remember my neighbor taking me to the CDC core memory plant. What a treat! I stood in awe looking at all of the machines, terminals, teletypes, and computers in this huge building. I was in heaven. This was the place where the memory arrays were built for those huge mainframes. I was allowed into the room with the sea of workstations where women strung the tiny ferrite beads onto strands of wire under magnifying glasses. If you have never seen a core memory plane in real life, you are missing out on a real work of art. Each core plane was strung with gold, red, and green wires not much thicker than sewing thread. The cores themselves were so tiny that you could barely see the hole in the center of each. All those colored wires made the core plane glitter with tiny rainbows of light under the fluorescent lighting. I did make one BIG faux pas: I brushed a fingertip over the top plane of one of the core stacks to feel it, not realizing I was causing damage that would take almost two days of work to repair. I still feel bad about that to this day.

Enter the microprocessor. Up until this point, the central processing unit of a computer had been made up of a number of different circuits, starting with tubes and relays in the very early machines, then migrating to transistors and finally to discrete integrated circuits. In 1971 this all changed: Intel Corporation created the world's first "CPU" on a chip. It was only a four-bit processor containing 2,300 transistors. This "brain" on a chip was quickly followed by eight-bit processors that became the mainstay throughout the rest of the '70s. As complexity progressed, Gordon Moore, a co-founder (and later CEO) of Intel, noticed a particular trend: the number of transistors on the average processor doubled roughly every two years (often quoted as every 18 months). This quickly became known as Moore's Law. It has led to single processors (such as the Itanium 2) with approximately 400 million transistors inside, a far cry from the humble beginnings of 2,300. It would take roughly 174,000 Intel 4004 processors to equal the transistor count of one Itanium 2. The clock speed of CPUs has increased as well. With the advent of massive parallel processing and clustering of CPUs, the humble microprocessor has evolved into the "supercomputer" realm, something that would have boggled the mind back in 1971.
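As an aside, the transistor arithmetic above is easy to sanity-check. Here is a minimal Python sketch, assuming the 2,300 and 400 million transistor figures quoted above and a two-year doubling period (the 4004 and Itanium 2 counts are round numbers, so treat the results as ballpark only):

```python
import math

# Rough transistor counts from the discussion above (assumptions, not exact figures).
I4004_TRANSISTORS = 2_300            # Intel 4004, 1971
ITANIUM2_TRANSISTORS = 400_000_000   # Itanium 2, early 2000s (approximate)

# How many 4004s equal one Itanium 2?
ratio = ITANIUM2_TRANSISTORS // I4004_TRANSISTORS
print(ratio)  # -> 173913, i.e. roughly 174,000 chips

# How many Moore's-Law doublings does that ratio represent?
doublings = math.log2(ITANIUM2_TRANSISTORS / I4004_TRANSISTORS)
print(round(doublings, 1))  # -> 17.4 doublings

# At one doubling every two years, that works out to about 35 years --
# the right ballpark for the span from 1971 to the early 2000s,
# given how rough the input numbers are.
print(round(doublings * 2))  # -> 35
```

The same back-of-the-envelope method works for any pair of chips: divide the transistor counts, take the base-2 logarithm, and multiply by the doubling period.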

6,962 posted on 08/21/2006 5:57:12 AM PDT by RadioAstronomer (Senior member of Darwin Central)


