RE: GPT4All: How to run "ChatGPT" locally on your PC, Facebook/Meta has ignited the open-source uncensored GPT community, what an irony 🚀


Is it RAM or VRAM that large language models need?
I have 128 GB of RAM but no VRAM.


The cool thing is that you can run the models on the GPU, on the CPU, or split the inference between the two. The GPU is preferable because it's much faster, but it's entirely possible to run the whole model from system RAM on the CPU alone.
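To get a feel for whether a model fits in your 128 GB of RAM, you can do a back-of-the-envelope calculation: the weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus some overhead for the KV cache and activation buffers. A minimal sketch (the function name and the overhead caveat are illustrative, not from any particular library):

```python
def model_weight_memory_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Rough memory needed just for the weights, ignoring
    KV-cache and activation-buffer overhead."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B model quantized to 4 bits needs roughly 3.5 GB of RAM,
# while the same model at full 16-bit precision needs about 14 GB.
print(model_weight_memory_gb(7, 4))    # → 3.5
print(model_weight_memory_gb(7, 16))   # → 14.0
```

By this estimate, even a 65B model at 4-bit quantization (~32.5 GB of weights) fits easily in 128 GB of system RAM; the trade-off is that CPU-only inference is much slower than running those layers on a GPU.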
