Posted on 01/29/2017 12:45:19 PM PST by oblomov
In 1971, Intel, then an obscure firm in what would only later come to be known as Silicon Valley, released a chip called the 4004. It was the world's first commercially available microprocessor, which meant it sported all the electronic circuits necessary for advanced number-crunching in a single, tiny package. It was a marvel of its time, built from 2,300 tiny transistors, each around 10,000 nanometres (or billionths of a metre) across, about the size of a red blood cell. A transistor is an electronic switch that, by flipping between on and off, provides a physical representation of the 1s and 0s that are the fundamental particles of information.
In 2015 Intel, by then the world's leading chipmaker, with revenues of more than $55bn that year, released its Skylake chips. The firm no longer publishes exact numbers, but the best guess is that they have about 1.5bn to 2bn transistors apiece. Spaced 14 nanometres apart, each is so tiny as to be literally invisible, for they are more than an order of magnitude smaller than the wavelengths of light that humans use to see.
Everyone knows that modern computers are better than old ones. But it is hard to convey just how much better, for no other consumer technology has improved at anything approaching a similar pace. The standard analogy is with cars: if the car from 1971 had improved at the same rate as computer chips, then by 2015 new models would have had top speeds of about 420 million miles per hour. That is roughly two-thirds the speed of light, or fast enough to drive round the world in less than a fifth of a second. If that is still too slow, then before the end of 2017 models that can go twice as fast again...
(Excerpt) Read more at theguardian.com ...
I remember way back in high school, I wrote a very nice accounting application for a company local to me in assembly language. IIRC, it was a Honeywell 316. Probably the entire app was 10 Kbytes.
“Comcast On Demand requires that I purchase...”
Get a Netmaster and tell them to pound sand.
One of my tasks back in the 1980s was to rewrite 370 mainframe system code from 24 bit to 31 bit. The old programs were rife with usage of the extra bits, because of clever programmers. It took a lot of work to split off some of that into separate routines and programs. At the time, I took advantage of the rewrite to create my own backdoors in system routines. Twenty years later they were still there. For instance, I could trigger via JCL (job control language) that my application programs get highest priority over all the thousands of other programs running simultaneously. Made my co-workers envious.
FTA: Everyone knows that modern computers are better than old ones. But it is hard to convey just how much better
As soon as you load software on it, the "fast" computer slows down. I have Windows 7 and found that the needed Malwarebytes causes folders to take as long as 15 seconds to open, as it scans them every time you open them. The only fix is in the Malwarebytes settings, to tell it not to scan. This kinda defeats the purpose: Malwarebytes can block ransomware.
I was a proud user of the 4004 Intel chip. It was a four-bit microprocessor we controlled with a Texas Instruments monitor that had cassette tapes for recording the data. I still have a couple of the tapes, kept just for nostalgia. Intel bragged that it took a team of a couple dozen engineers a whole 9 months to design the chip.
I can see why; for very few humans these days seem to even HAVE a conscience!
My first computer had 256 BYTES of usable memory.
It loaded info at 300 characters per second.
I still have my first computer.
I see what you did here...
No one will ever need more than 640k of memory.
Not when you load it; only when you run it.
What does your truck have to do with any of this?
That quote was supposedly Mr. Bill, although he denies it.
The one I remember from the era (1983-ish) was some industry pundit declaring that 16-bit computers were overkill since the main use of small computers was word-processing, and words were composed of 8-bit characters. I remember thinking, he's wrong, I hope he's wrong. He was wrong.
At the time, I had been helping my brother shop for word-processing systems for his law office.
I think it’s a matter of time. Here are 2 cool videos of robotic bricklaying. That’s a much easier job to tackle because bricks are nice, easy, regular shapes to work with. But as we learn more, we’ll be able to automate more.
https://www.youtube.com/watch?v=4YcrO8ONcfY
https://www.youtube.com/watch?v=kXJbNY6-ejM
Personally, I’d love a robotic housekeeper and yard worker. No, not a Roomba and a robotic lawnmower. I want MORE!
The year...and its simple design has allowed it to last 46 years, in comparison to a piece of crap computer....
There are far too many variables on job sites that these machines will find difficult to overcome... I have built houses from the foundation up, and two bricklayers with a good tender can outperform it and, more importantly, construct a better-quality wall that will last multiple decades... too many little things that the machine just can't do. It is not cost-efficient on sites....
So what are you trying to say? That we should scrap computers and just stick to driving 46-year-old trucks?
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.