Every AI is getting bad. I’m using some off-brand (some might say indie) ones, and they’re slower, dumber, and have more paid options by the day.
ChatGPT: if not for cheating in school, I wouldn’t use it at all, probably. Phind.com was my go-to favorite; now it can’t search the web for shit, the GPT option is paid, and it’s just dumb. You.com: my friend used this, and it straight-up ignored my request yesterday.
Do others have the same experience? If yes, follow-up question: are they dumbing down AI and keeping the power to themselves? Second follow-up: how can I self-host a good AI, and what do I need for it to work?
Look into Ollama and Mixtral variants… You’ll be limited by things like GPU memory, though.
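For anyone who wants to try: the basic Ollama workflow is just a couple of commands. Model tags and sizes below are rough and from memory; check the Ollama model library for the current ones.

```shell
# After installing Ollama from its official site:
ollama pull mixtral:8x7b   # downloads quantized weights (roughly 26 GB)
ollama run mixtral:8x7b    # interactive chat right in the terminal

# A smaller option if you're short on RAM/VRAM:
ollama pull mistral:7b     # roughly 4 GB quantized
```

`run` starts the model in a REPL; you can also hit Ollama’s local HTTP API once the daemon is running.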
Mixtral 8x7B seems fairly capable to me so far. It’s just that I need to wait a few minutes for it to reply, given I’m running it on a 1st gen Ryzen…
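The wait makes sense once you do the back-of-envelope math: Mixtral 8x7B has roughly 46.7B total parameters (only ~13B are active per token, but all the weights still have to sit in RAM), so even 4-bit quantization needs on the order of 23 GB, and an old CPU has to stream all of that. A quick sketch of the arithmetic (parameter count is an approximation):

```python
# Rough weight-memory estimate for running a model locally.
# Ignores KV cache and runtime overhead, so real usage is higher.

def model_size_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate size of the weights alone, in GB."""
    return n_params * bits_per_param / 8 / 1e9

total_params = 46.7e9  # assumed total parameter count for Mixtral 8x7B

print(f"fp16:  {model_size_gb(total_params, 16):.0f} GB")  # ~93 GB
print(f"4-bit: {model_size_gb(total_params, 4):.0f} GB")   # ~23 GB
```

That’s why quantized GGUF builds and lots of system RAM are the usual answer for CPU-only setups.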
Any models you’d recommend that fit into 4GB of VRAM?
I’ve tried Deepseek Coder, and it certainly works well for quickly churning out bash scripts for whatever purpose I can possibly think of.
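For a sense of the kind of throwaway script I mean, here’s a hand-written sketch (not actual model output): list the N largest files under a directory.

```shell
#!/usr/bin/env bash
# Report the N largest files under a directory.
set -eu

dir="${1:-.}"     # directory to scan (default: current)
count="${2:-5}"   # how many files to list

# %s = size in bytes, %p = path (GNU find); errors are suppressed.
find "$dir" -type f -printf '%s\t%p\n' 2>/dev/null \
  | sort -rn \
  | head -n "$count" \
  | awk -F'\t' '{ printf "%8.1f MB  %s\n", $1 / 1048576, $2 }'
```

Stuff like this is exactly what a small local coding model can hand you in seconds.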
Similar story here with an old 10xx GPU. I’ve just started tinkering with dolphin-mixtral-8x7b, but it’s early days.
Dolphin is just peak amusement to me. I’ve asked so many weird things of it that it’s basically just cowering in a corner now, afraid of answering any question for fear of getting another kitten killed.
What’d you say is a good GPU for it?
An Nvidia card with as much memory as possible - the newer the better.
I’m also beginning to look into dedicated accelerators like the Coral, but at first blush, it looks like the lack of onboard memory will be a massive bottleneck.
Dang, I got an AMD one. Should I even try? (16GB)
That I don’t have any experience with - I hear it’s harder, but not impossible.
Perplexity and Poe are my go-tos for this.
I’ll try them tho. Already a + for Perplexity: I don’t need to log in.
Yep, that one’s my current fav.