Why’d they use the term 16 gigabit? Is 16 gigabit something other than 2 gigabyte?
I think the way to think about it is that when you buy a storage device, it is actually composed of maybe 9 such chips (8 for data and one for parity), or perhaps 17 (16 for data and 1 for parity). So while the *chips* are quantified in gigabits, the *devices* on which they sit are quantified in gigabytes. That's just the way the industry does things. I wouldn't be surprised if 16-gigabit chips pave the way for 16-gigabyte devices. And that's a lot of storage.
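To make the arithmetic concrete, here's a minimal sketch (assuming 8 bits per byte; the 8-data-plus-1-parity layout above is just a hypothetical example, not a specific product):

```python
BITS_PER_BYTE = 8

def gigabits_to_gigabytes(gigabits: float) -> float:
    # Convert a capacity in gigabits to gigabytes, assuming 8 bits per byte.
    return gigabits / BITS_PER_BYTE

# A single 16 Gb chip holds 2 GB.
print(gigabits_to_gigabytes(16))               # 2.0

# A hypothetical device built from 8 data chips of 16 Gb each
# (the parity chip adds no usable capacity):
data_chips = 8
print(data_chips * gigabits_to_gigabytes(16))  # 16.0 GB usable
```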
It's the same amount of memory, assuming 8 bits per byte.
I think the bits could be used in other word sizes too, so gigabits may be the standard way to specify this part.
September 11, 2006: Samsung Electronics Co., Ltd., the world leader in advanced semiconductor technology solutions, today announced that it has developed the industry's first 40-nanometer (nm) memory device. The new 32 Gigabit (Gb) NAND flash device is the first memory to incorporate a Charge Trap Flash (CTF) architecture, a revolutionary new approach to further increase manufacturing efficiency while greatly improving performance.
The 32 Gb NAND flash memory can be used in memory cards with densities of up to 64 gigabytes (GB).
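As a rough back-of-the-envelope check (assuming 8 bits per byte and ignoring any parity or overhead chips), that 64 GB figure works out to sixteen of the 32 Gb chips:

```python
card_gigabits = 64 * 8               # 64 GB expressed in gigabits = 512 Gb
chips_needed = card_gigabits // 32   # divide by 32 Gb per chip
print(chips_needed)                  # 16
```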
The article seems to be about one- or two-year-old technology.
It's the main reason soda companies went to 2-liter bottles. Consumers think they're getting much more product than if the bottle said half a gallon (which it essentially is).