My Dream AI Laptop of the Future
So I was “dream shopping” . . . you know, looking for tech you can’t afford to buy! I’m sure I’m not the only one.
After doing a lot of testing on my own hardware, I have a decent understanding of what I need to do AI processing locally and portably. Okay, some of the development “might” have to be done in the cloud, especially if you want to train your own AI on specific training datasets with a very powerful GPU. In that case, you certainly won’t just be using one of the already trained, free open-source AIs available for download from Hugging Face.
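To make that concrete, here is roughly what grabbing one of those pre-trained open-source models looks like. This is a minimal sketch assuming the huggingface_hub Python package; the repo and filename shown are illustrative (a community GGUF mirror of the model I mention below), not a specific recommendation.

```python
# Minimal sketch: downloading a pre-trained open-source model from Hugging Face.
# Assumes `pip install huggingface_hub`; the repo_id/filename are illustrative.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-OpenOrca-GGUF",   # assumed community GGUF mirror
    filename="mistral-7b-openorca.Q8_0.gguf",      # 8-bit quantized weights
)
print("Model saved to:", model_path)
```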
My point is, I could run a pre-trained, very powerful AI locally on a high-end laptop of today that might cost $3500 (with a nice carrying case thrown in). You can find much cheaper options if you choose a workstation computer instead, but then you would lose portability.
I’m guessing tomorrow’s laptop (5 years hence) should have the same or more cognitive computational power at half the price.
My research has shown that if you want to run, say, mistral-7b-openorca.Q8_0.gguf, you will need about 13 GB of VRAM on your NVIDIA GPU (mine is not the best performer for gaming, but it has a lot of VRAM for AI).
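And here is the kind of thing you can run once the file is on disk. A minimal sketch assuming the llama-cpp-python bindings built with GPU (CUDA) support; offloading all layers is what eats that roughly 13 GB of VRAM.

```python
# Minimal sketch: running a quantized GGUF model locally with llama-cpp-python.
# Assumes `pip install llama-cpp-python` built with CUDA so layers offload to the GPU.
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-7b-openorca.Q8_0.gguf",  # the file downloaded above
    n_gpu_layers=-1,  # offload every layer to the NVIDIA GPU (this is where the ~13 GB goes)
    n_ctx=4096,       # context window; larger contexts use more memory
)

out = llm("Q: Why run an AI model locally instead of in the cloud? A:", max_tokens=128)
print(out["choices"][0]["text"])
```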
WOW! Check out the specs on this Intel CPU (and I’m partial to AMD). Now imagine that coupled with an Intel NPU mentioned in recent news, and you have the recipe for some serious computational cognitive crunching, once software tools become available to use it.
Ultimately, you might ask me: why spend all that money on a laptop? To answer that, I will have to imagine the future.
First let’s see what an AI thinks . . .
Putting on my futurist hat and imagining wildly!
The year is 2028 and all professionals have AI assistants. That means lawyers, doctors, artists, movie directors, actors, musicians, software managers, journalists, CEOs, stockbrokers, any profession you can think of will have an AI assistant custom trained for specific use cases. It’s what will give them an edge over others in the same profession.
Operating systems will have AI designed into their cores, as will many applications (apps). Smart wearable devices, watches, glasses, VR/AR wearables, and talking AI cameras will have embedded AI. Most of it will work just fine from the cloud. However, there will always be a need to securely protect the private, proprietary, and personal data used for training these custom AIs. AIs will ingest very important multimodal data, and some of it won’t be trusted to the cloud; it just won’t or can’t be.
So the answer in these use cases is a machine powerful enough to take with you wherever you go in your professional life. Remember, tomorrow’s laptop will be cheaper anyway and have more power than today’s best. Not such a wild assumption when you really think about it.
Just now, some open source models are beginning to use a technique called MoE (Mixture of Experts), getting exceptional performance without even using the popular cloud-based GPT models (you know which ones).
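To show the idea, and not any particular model’s implementation, here is a toy Mixture-of-Experts layer in PyTorch: a small gating network routes each token to its top-2 experts, so only a fraction of the total parameters do any work for a given token.

```python
# Toy Mixture-of-Experts layer (illustrative only, not any specific model's code).
# A gating network picks the top-k experts per token; only those experts run.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoE(nn.Module):
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)  # the router
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )

    def forward(self, x):  # x: (tokens, dim)
        scores = self.gate(x)                            # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)   # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                    # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(10, 64)        # 10 "tokens" of width 64
print(ToyMoE()(tokens).shape)       # torch.Size([10, 64])
```

The win is that total parameter count can grow with the number of experts while the per-token compute stays roughly constant, which is exactly why this technique is attractive for local hardware.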
Who knows how these free open source AI models will improve in the future? I suspect they may be able to run on the local hardware I’m imagining for an AI laptop of the future.
So now let’s imagine the hardware:
CPU with 32–64 cores, a mixture of efficiency and performance cores, and multiple NPUs/GPUs
128 GB of DDRx main memory minimum (a quick memory-sizing sketch follows this list). All inside a low-power, high-performance package (using liquid cooling?)
Modular components that are swappable and upgradable
Enabled accessories: pens, VR/AR glasses . . .
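And here is the quick back-of-the-envelope check that those memory specs are in the right ballpark. This is a rough sketch of the usual estimate, parameter count times bytes per weight, with an assumed overhead factor for context and runtime buffers.

```python
# Rough memory estimate: parameter count x bytes per weight, plus overhead.
# The 20% overhead figure is a loose assumption for context (KV cache) and buffers.
def model_memory_gb(params_billions, bits_per_weight, overhead=0.20):
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 7B model at 8-bit (Q8_0-style): ~8 GB of weights plus overhead; a long context
# pushes real-world usage toward the ~13 GB I mentioned earlier.
print(f"7B  @ 8-bit: {model_memory_gb(7, 8):.1f} GB")
# A hypothetical 70B model at 4-bit still fits comfortably in 128 GB of main memory.
print(f"70B @ 4-bit: {model_memory_gb(70, 4):.1f} GB")
```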
Well that’s about it for “My Dream AI Laptop of the Future”. If you enjoyed this article, please comment, clap me up, and follow.