7 points
"However, if it is performance you are concerned about, 'it's important to note that GPUs still far outperform NPUs in terms of raw performance,' Jessop said, while NPUs are more power-efficient and better suited for running perpetually."
Ok, so if you want to run your local LLM on your desktop, use your GPU. If you’re doing that on a laptop in a cafe, get a laptop with an NPU. If you don’t care about either, you don’t need to think about these AI PCs.
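If you'd rather let the software make that call at runtime, here's a rough sketch using ONNX Runtime (an assumption on my part; the provider names depend on your hardware and which build you installed — QNN is the Qualcomm NPU backend):

```python
# Minimal sketch: prefer a discrete GPU for raw throughput, fall back to an
# NPU for power efficiency, then CPU. Assumes ONNX Runtime is installed;
# which providers actually show up depends on your hardware and build.
import onnxruntime as ort

def pick_provider() -> str:
    available = ort.get_available_providers()
    for provider in ("CUDAExecutionProvider",   # discrete NVIDIA GPU
                     "QNNExecutionProvider",    # Qualcomm NPU
                     "CPUExecutionProvider"):   # always-available fallback
        if provider in available:
            return provider
    return "CPUExecutionProvider"

print(pick_provider())
```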
2 points
Or use a laptop with a GPU? An NPU seems to be just slightly upgraded onboard graphics.