- cross-posted to:
- technology@lemmy.world
Running AI models without matrix math means far less power consumption—and fewer GPUs?
I don’t think that making LLMs cheaper and easier to run is going to “pop that bubble”, if it even is a bubble. If anything, this will boost AI applications tremendously.
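For anyone wondering how you run a model “without matrix math”: as I understand the article, the trick is constraining weights to ternary values {-1, 0, +1}, so every multiply-accumulate collapses into plain additions and subtractions. A toy sketch (my own illustration, not the paper’s code; `ternary_matvec` is a made-up name):

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product where every weight is -1, 0, or +1.

    Because the weights are ternary, each output element is just a sum
    of (possibly negated) inputs -- no multiplications are needed, which
    is where the power savings come from on suitable hardware.
    """
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i in range(W.shape[0]):
        # Add inputs where the weight is +1, subtract where it is -1.
        out[i] = x[W[i] == 1].sum() - x[W[i] == -1].sum()
    return out

# Sanity check against an ordinary matmul.
rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8))   # random ternary weight matrix
x = rng.standard_normal(8)
assert np.allclose(ternary_matvec(W, x), W @ x)
```

On a CPU or GPU this is no faster than a real matmul, of course; the claimed gains come from custom hardware (the researchers used FPGAs) where skipping the multiplier units saves silicon and power.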