• IngeniousRocks (They/She) @lemmy.dbzer0.com
    4 days ago

    AMD doesn’t know I selfhost generative AI models.

8 GB is barely sufficient for my needs, and I often need multi-GPU modifications to offload parts of my workflow to my smaller GPU, which lacks both the CUDA cores and the VRAM to make an appreciable difference. I’ve been searching for a used K80 in my price range to solve this problem.
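The kind of multi-GPU split I mean can be sketched roughly like this: assign pieces of the model to whichever card has VRAM left. This is a minimal, hypothetical greedy placement in plain Python (the layer names, sizes, and device budgets are made up for illustration; real frameworks do this with their own device-map logic):

```python
# Hypothetical sketch: greedily place model layers onto GPUs by VRAM budget.
# Layer names/sizes and GPU budgets below are illustrative, not real measurements.

def assign_layers(layer_sizes_gb, gpu_budgets_gb):
    """Fill each GPU in order; move to the next one when a layer won't fit."""
    placement = {}
    gpus = list(gpu_budgets_gb.items())
    idx = 0          # index of the GPU currently being filled
    used = 0.0       # VRAM already allocated on that GPU
    for layer, size in layer_sizes_gb.items():
        # advance to the next GPU while the current one can't hold this layer
        while idx < len(gpus) and used + size > gpus[idx][1]:
            idx += 1
            used = 0.0
        if idx == len(gpus):
            raise MemoryError(f"not enough total VRAM for {layer}")
        placement[layer] = gpus[idx][0]
        used += size
    return placement

layers = {"embed": 1.5, "block0": 3.0, "block1": 3.0, "head": 1.0}
gpus = {"cuda:0": 8.0, "cuda:1": 4.0}   # 8 GB main card plus a smaller helper GPU
print(assign_layers(layers, gpus))
```

With an 8 GB budget the first three layers land on `cuda:0` and the head spills onto `cuda:1`, which is exactly the situation where the helper card’s weaker specs start to hurt.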

      • IngeniousRocks (They/She) @lemmy.dbzer0.com
        4 days ago

I know I’m not, but that doesn’t mean gamers wouldn’t benefit from more VRAM as well.

Just one example: Nvidia’s implementation of MSAA is borked if you’ve only got 8 GB of VRAM, so all those new, super-pretty games need their graphics pipelines hijacked and the antialiasing replaced with older variants.

        Like, I’m not gonna go around saying my use case is normal, but I also won’t delude myself into thinking that the average gamer wouldn’t benefit from more VRAM as well.

        • KoalaUnknown@lemmy.world
          4 days ago

Sure, but if they want more VRAM they can just buy the 16 GB version. It doesn’t hurt to have a cheaper option.