• filister@lemmy.world · 2 days ago

    What is amazing in this case is that they achieved this while spending a fraction of the inference cost that OpenAI is paying.

    Plus they are a lot cheaper for users too. But I am pretty sure that the American government will ban them in no time, citing national security concerns, etc.

    Nevertheless, I think we need more open source models.

    Not to mention that NVIDIA also needs to be brought back down to earth.

    • demesisx@infosec.pub · 2 days ago

      Even if they get banned, any startup could replicate their work if it is truly open source. The best thing about their solution is that it breaks the CUDA monopoly that NVDA has enjoyed. Buy your puts when NVDA bounces, because that stock is GOING DOWN. There’s no world where a company that makes GPUs is worth more than both Apple and Microsoft. It’s inevitable.

      • toffi@feddit.org · 2 days ago

        Never forget, kids: the market can stay irrational much longer than you can stay solvent.

        • demesisx@infosec.pub · 2 days ago

          True. That’s why I tend to make small plays instead of being an absolute degenerate gambler.

      • Pieisawesome@lemmy.world · 2 days ago

        It’s written in PTX, NVIDIA’s low-level instruction set, which is part of the CUDA ecosystem.

        Hardly going to affect NVIDIA.
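
        To make that concrete, here is a minimal sketch (an illustrative kernel of my own, not DeepSeek’s actual code) of what hand-written PTX looks like in practice: the PTX instruction is embedded through the asm() escape that CUDA provides, and it still compiles only with NVIDIA’s toolchain, for NVIDIA GPUs.

        ```cuda
        // Minimal sketch: inline PTX inside a CUDA kernel (illustrative only).
        // PTX is NVIDIA's virtual ISA; "writing PTX" still means targeting
        // NVIDIA hardware through NVIDIA's own toolchain.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void add_one(int *out, const int *in) {
            int v = in[threadIdx.x];
            int r;
            // Inline PTX: r = v + 1, via CUDA's asm() escape hatch.
            asm("add.s32 %0, %1, 1;" : "=r"(r) : "r"(v));
            out[threadIdx.x] = r;
        }

        int main() {
            const int n = 4;
            int h_in[n] = {1, 2, 3, 4}, h_out[n];
            int *d_in, *d_out;
            cudaMalloc(&d_in, n * sizeof(int));
            cudaMalloc(&d_out, n * sizeof(int));
            cudaMemcpy(d_in, h_in, n * sizeof(int), cudaMemcpyHostToDevice);
            add_one<<<1, n>>>(d_out, d_in);
            cudaMemcpy(h_out, d_out, n * sizeof(int), cudaMemcpyDeviceToHost);
            for (int i = 0; i < n; ++i) printf("%d ", h_out[i]); // prints: 2 3 4 5
            printf("\n");
            cudaFree(d_in);
            cudaFree(d_out);
            return 0;
        }
        ```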

        • demesisx@infosec.pub · edited · 2 days ago

          It certainly does.

          Until last week, you absolutely NEEDED an NVIDIA GPU equipped with CUDA to run virtually all AI models.

          Today, that is simply not true. (watch the video at the end of this comment)

          I watched the video and my initial reaction to this news was validated and then some: it made me even more bearish on NVDA.

          Edit: corrected and redacted.