• funkajunk@lemm.ee
    6 months ago

    The next tech is probably going to be dedicated GPUs, or similar hardware, for running personalized AI

    • Kid_Thunder@kbin.social
      6 months ago

      It’s already here. I run AI models on my GPU, trained on data from various sources, for both search/GPT-style chat and images. You can do this basically point-and-click with GPT4All, which bundles a chat client and lets you pick from popular models without needing to know much or touch the CLI. It gives you a ChatGPT-like experience offline, using your GPU if it has enough VRAM for the particular model you pick, or falling back to your CPU if it doesn’t. I don’t think it does images, but there are other projects out there that make that easy to run on your own hardware too.
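
      For anyone who wants to go one step past the point-and-click client, here’s a minimal sketch using GPT4All’s Python bindings. The model filename is just an example (use whatever .gguf model you’ve actually downloaded), and the device argument assumes the bindings can see your GPU; this is a sketch, not the only way to set it up.

      ```python
      # Minimal sketch: running a local chat model via GPT4All's Python bindings.
      # The model filename is an example; substitute whichever .gguf model you
      # downloaded through the GPT4All client or its model list.
      from gpt4all import GPT4All

      # device="gpu" asks the bindings to run on the GPU if it has enough VRAM;
      # omit it (or pass "cpu") to fall back to CPU inference.
      model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="gpu")

      # chat_session() keeps conversation history, so you get a ChatGPT-like
      # multi-turn chat entirely offline.
      with model.chat_session():
          reply = model.generate(
              "Explain the difference between VRAM and unified memory.",
              max_tokens=200,
          )
          print(reply)
      ```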

      • cybersandwich@lemmy.world
        6 months ago

        The M-series Macs with unified memory and ML cores are insanely powerful and much more flexible, because your 32 GB of system memory doubles as GPU VRAM, etc.
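
        As a rough illustration of what unified memory buys you, here’s a small sketch using PyTorch’s MPS backend on Apple Silicon; the tensor size is arbitrary and only meant to show that GPU allocations come out of the same pool as system RAM.

        ```python
        # Sketch: on Apple Silicon, PyTorch's MPS backend allocates GPU tensors
        # from the same unified memory pool as ordinary system RAM, so model
        # size isn't limited by a separate fixed VRAM budget.
        import torch

        if torch.backends.mps.is_available():
            device = torch.device("mps")
        else:
            device = torch.device("cpu")  # fallback on non-Apple hardware

        # An 8192 x 8192 float32 matrix (~256 MB) living in unified memory;
        # the size is arbitrary, just to demonstrate a GPU-resident allocation.
        x = torch.randn(8192, 8192, device=device)
        y = x @ x  # runs on the GPU when the MPS device is selected
        print(device, y.shape)
        ```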