Posted on 06/23/2023 3:39:25 PM PDT by aimhigh
Researchers have developed a new design for computer memory that could both greatly improve performance and reduce the energy demands of internet and communications technologies, which are predicted to consume nearly a third of global electricity within the next ten years.
The researchers, led by the University of Cambridge, developed a device that processes data in a similar way to the synapses in the human brain. The devices are based on hafnium oxide, a material already used in the semiconductor industry, together with tiny self-assembled barriers that can be raised or lowered to allow electrons to pass.
This method of changing the electrical resistance in computer memory devices, and allowing information processing and memory to exist in the same place, could lead to the development of computer memory devices with far greater density, higher performance and lower energy consumption. The results are reported in the journal Science Advances.
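The idea of putting processing and memory in the same place can be illustrated with a toy model of a resistive crossbar, where each cell's conductance stores a weight and a vector-matrix multiply falls out of Ohm's and Kirchhoff's laws. This is a generic in-memory-computing sketch, not the Cambridge group's actual device physics; all names and values here are illustrative assumptions.

```python
import numpy as np

# Toy model of analog in-memory compute (illustrative only, not the
# article's device): conductances G[i][j] are the stored "memory";
# applying voltages V to the rows produces column currents
# I[j] = sum_i V[i] * G[i][j], i.e. I = G^T @ V -- the multiplication
# happens where the data lives, with no separate memory fetch.

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))  # 4x3 array of cell conductances
V = np.array([0.2, 0.5, 0.1, 0.9])      # input voltages on the 4 rows

I = G.T @ V  # 3 column currents: storage and processing in one place
print(I)
```

In a real resistive-memory device the analog readout replaces the shuttling of operands between a CPU and DRAM, which is where much of the claimed energy saving would come from.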
(Excerpt) Read more at eurekalert.org ...
Resistance is futile.
It’s the graphics processors that are the current issue with power consumption.
It’s the graphics processors that are the current issue with power consumption.
************
What about the massive power consumption of Bitcoin mining?
Take them out of grades 1, 2 & 3 in all schools.
Whew! We can finally save duh erf!
Isn’t that mostly an issue if you’re mining for bitcoin and the like?
BCM is Graphics card powered.
Best to get a NEIGHBOR TO PROVIDE THE POWER.
Bubble memory
Nothing wrong with articles about basic research (precursor to applied research), but the articles rarely talk about what it takes to get theories and bench-scale experiments to commercial production.
When I was in R&D, there was about a 90% drop-out rate at each stage of development:
1. Basic research: Concepts, theoretical analyses, lab work
2. Proof of concept and applied research
3. Pilot plant
4. Production
5. Commercial success and profitability
Costs went up 10X at each stage and only 10% survived each stage.
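The arithmetic implied by those numbers can be checked quickly: with five stages, there are four stage transitions, so 10% survival per transition compounds to roughly 1 in 10,000 projects reaching commercial success, while per-stage costs grow 10,000-fold. A minimal sketch of that back-of-the-envelope calculation, using only the poster's stated rates:

```python
# Sanity-check the poster's R&D pipeline numbers (10% survival and
# 10X cost growth per stage) -- illustrative arithmetic, not study data.

stages = ["Basic research", "Proof of concept", "Pilot plant",
          "Production", "Commercial success"]

survive_rate = 0.10              # fraction surviving each stage transition
cost_growth = 10.0               # cost multiplier at each stage transition
transitions = len(stages) - 1    # 4 transitions from stage 1 to stage 5

final_survival = survive_rate ** transitions  # ~0.0001
final_cost = cost_growth ** transitions       # 10000x the basic-research cost

print(f"Projects surviving to market: {final_survival:.4f} (~1 in 10,000)")
print(f"Relative cost at final stage: {final_cost:.0f}x")
```

So if the poster's rates hold, a commercially successful product represents about one idea in ten thousand, at four orders of magnitude more cost than the original bench work.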
Worked on bubble memory at Intel in the late '70s.
BCM is Graphics card powered.
***********
Aren’t ASIC rigs manufactured specifically for mining Bitcoin used as well?
Very interesting. Thanks for that insight.
Same in the USAF in the 80s. Just feeling nostalgic.
And the absolutely worthless tailing from the mining operations! Huge piles of bits!
Gadzooks! A marble machine memory!