[ml] langchain runs local model officially

Undescribed Horrific Abuse, One Victim & Survivor of Many gmkarl at gmail.com
Thu Apr 6 05:13:20 PDT 2023


On 4/6/23, efc at swisscows.email <efc at swisscows.email> wrote:
> I run alpaca.cpp on a laptop with 8 GB ram and the 7B model. Works pretty
> well.
>
> I would love to find a project that would enable me to go to the 13B
> model, but have not yet found one that enables me to run that on only 8 GB
> ram.

4-bit quantization and mmap?

I'm not using them myself yet, but people are doing these things.
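A rough back-of-the-envelope sketch of why 4-bit quantization could make the difference (assuming ~4 bits per weight and ignoring the per-block scale overhead that real quantization formats like q4_0 add, so actual files run somewhat larger):

```python
def model_gib(n_params, bits_per_weight):
    """Raw weight storage in GiB, ignoring per-block quantization overhead."""
    return n_params * bits_per_weight / 8 / 1024**3

for name, n in [("7B", 7e9), ("13B", 13e9)]:
    fp16 = model_gib(n, 16)
    q4 = model_gib(n, 4)
    print(f"{name}: fp16 ~{fp16:.1f} GiB, 4-bit ~{q4:.1f} GiB")

# 13B at 4 bits is roughly 6 GiB of weights, under the 8 GB ceiling;
# mmap then lets the OS page the file in on demand instead of
# copying the whole model into RAM up front.
```

So by this estimate a 4-bit 13B model squeaks under 8 GB where the fp16 version (~24 GiB) never could, and mmap keeps the resident set down further.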


More information about the cypherpunks mailing list