Eventually most of the AI processing will be done on local devices without the necessity for as many server farms.
Perhaps in ten years, your iPhone will have an embedded LLM as powerful as the ones we have now.
I would think that might depend on the size of the universe of data the A.I. model must process to answer the question. If that universe lies entirely within the data the business unit already has, then yes, the A.I. model does not need data from the massive server farms.
But the larger questions will demand data outside the business unit's universe, and perhaps beyond that business segment of the economy altogether; for such large universes, the data will come from large server farms, such as "the cloud" now represents. The "size" question then becomes: does the inquiring unit (the local processor) have the capacity to process all the data called for? Or does the unit asking the question instead call out to an A.I. server farm, which crunches the answer and returns it to the calling unit, much as A.I. questions are handled today through an ordinary Internet browser, whether on a computer or a cell phone?
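The local-versus-server-farm decision described above can be sketched as a simple routing rule. This is only an illustration, assuming the decision reduces to two questions: does the data the question needs fit within the device's local corpus, and can the device process that much data? All names and numbers here are hypothetical.

```python
# Hypothetical sketch: route a question to local processing or to a
# remote A.I. server farm, depending on the question's data universe.

def route_query(question_scope_gb, local_corpus_gb, device_capacity_gb):
    """Return 'local' if the device both holds the data the question
    needs and can process it; otherwise 'cloud' (i.e., send the
    question out to a server farm and receive the answer back)."""
    fits_in_corpus = question_scope_gb <= local_corpus_gb
    fits_in_capacity = question_scope_gb <= device_capacity_gb
    if fits_in_corpus and fits_in_capacity:
        return "local"
    return "cloud"

# A business-unit question whose data lives entirely on the device:
print(route_query(question_scope_gb=2, local_corpus_gb=10, device_capacity_gb=8))
# An economy-wide question exceeding both the local corpus and capacity:
print(route_query(question_scope_gb=500, local_corpus_gb=10, device_capacity_gb=8))
```

In practice the test would be far richer than raw gigabytes, but the shape of the decision, answer on-device when possible, call out when not, is the same.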
[I remember some actuarial applications we worked on many years ago. A few of the major report programs ran fine and produced quality results, but between the complex mathematical questions being asked and the mountains of data those questions had to process, it was hours before the computer spit out the full report.]
And I expect some A.I. models will begin to be made into smaller versions of themselves, specializing in particular areas or forms of inquiry, and yes, some of them will even be written directly onto chips, as firmware, fitting uses on many types of devices. Some A.I. models will be models for making specialized models.