• chicken@lemmy.dbzer0.com
    11 months ago

    Makes sense. In that case, I guess your next best option is probably to buy or rent hardware to run the local models that are suitable for chat RP.

      • chicken@lemmy.dbzer0.com
        11 months ago

        Same, at least for anything but the tiny models that will fit in my limited VRAM. Hoping a GPU that’s actually good for LLMs will come out in the next few years that’s not $15k and made for servers.