I’m looking for a resource efficient AI model for text generation (math, coding etc.) that will work with LocalAI. Which model should I use? I don’t want it to use more than 1-3 GB RAM. I’ll run it on a vps to use with Nextcloud.
Try pfizer/poppy-lrud-normal-128. Run it straight off your neural chip and give it 1 GB of RAM and you'll be good to go.
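Once LocalAI is up, here's a minimal sketch of how you could check that the model actually answers within your VPS limits before wiring it into Nextcloud. It uses LocalAI's OpenAI-compatible chat completions endpoint; the port (8080, LocalAI's default) and the model name are assumptions, so swap in whatever you registered:

```python
# Minimal sketch: send one request to a LocalAI instance through its
# OpenAI-compatible /v1/chat/completions endpoint and print the reply.
# Assumptions: LocalAI listens on localhost:8080 (its default port) and the
# model was registered under the name "poppy-lrud-normal-128".
import requests

LOCALAI_URL = "http://localhost:8080/v1/chat/completions"  # adjust host/port for your VPS

payload = {
    "model": "poppy-lrud-normal-128",  # assumed model name; use the name from your model config
    "messages": [
        {"role": "user", "content": "Write a one-line Python function that adds two numbers."}
    ],
    "temperature": 0.2,
}

response = requests.post(LOCALAI_URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

While it's running, watch memory on the VPS (e.g. with `free -h` or `htop`) to confirm the model stays inside your 1-3 GB budget.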
This worked, thanks!