• 0 Posts
  • 123 Comments
Joined 2 years ago
Cake day: June 19th, 2023

  • andallthat@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 2 months ago (edited)

    LLMs can’t do protein folding. A specifically trained machine-learning model called AlphaFold did. Here’s the paper.

    Developing, training and fine-tuning that model was a research effort led by two guys who won a Nobel for it. AlphaFold can’t hold a conversation or give you hummus recipes; it knows nothing about the structure of human language, but it can identify patterns in the domain where it was specifically and painstakingly trained.

    It wasn’t “hey ChatGPT, show me how to fold a protein”, is all I’m saying, and the “superhuman reasoning capabilities” of current LLMs still fall ridiculously short on much simpler problems.


  • As a paid, captive squirrel, focused on spinning my workout wheel and getting my nuts at the end of the day, I hate that AI is mostly a (very expensive) solution in search of a problem. I am being told “you must use AI, find a way to use it”, but my AI successes are few and mostly non-repeatable. My current AI use case is: try it once for non-vital, non-time-sensitive stuff; if at first you don’t succeed, just give up; if you do succeed, you’ve saved some time for more important stuff.

    If I try to think like a CEO or an entrepreneur, though, I sort of see where these people might be coming from. They see AI as the new “internet”: something that, for good or bad, is getting ingrained in everything we do, and that could bankrupt your company if you try too hard to do things “the new way”, but could also make you quickly fade into irrelevance if you keep doing things the old way.

    It’s easy, with the benefit of hindsight, to say now “haha, Blockbuster could have bought Netflix for $50 million and now they are out of business”, but all the people who watched that happen see AI as the new disruptive technology that can spell great success or complete doom for their current businesses. All hype? Maybe. But if I were a CEO I’d probably be sweating too (and having a couple of VPs at my company wipe up the sweat with dollar bills).



  • I don’t know how much Musk can be separated from Starlink. Not only because Starlink, as part of SpaceX, is privately held, but also because the main reason they now have a superior service to offer is the fucktons of money they got from government customers, which is also tied to Musk’s actions.

    A big part of Musk’s involvement with politics comes from the fact that everything he does, from EVs to rockets to, now, big energy-guzzling datacenters for AI, needs a lot of government backing: if not in terms of direct contracts, then at least in terms of regulation and incentives.

    Even his direct involvement with Trump wasn’t because he suddenly became a Nazi (he’s probably always been one, according to his own family) but in order to become even more entangled with government investments, even trying to control NASA directly.

    And not only the US government. I remember Musk suddenly being everywhere in Europe pitching Starlink; Meloni’s government in Italy was grilled for allegedly agreeing to a big contract with Starlink.




  • The article makes a good point that it’s less about replacing a knowledge worker completely and more about industrializing what some categories of knowledge workers do.

    Can one professional create a video with AI in a matter of hours instead of days, without needing actors, scriptwriters and professional equipment? Apparently yes. And AI can even translate it into multiple languages without translators and voice actors.

    Are they “great” videos? Probably not. Good enough and cheap enough for several uses? Probably yes.

    Same for programming. The completely autonomous AI coder doesn’t exist, and many are starting to doubt it ever will with the current technology. But if GenAI speeds up development even modestly, to the point where it takes maybe 8 developers to do the work of 10, that’s a 20% drop in demand for developers, which puts downward pressure on salaries too.

    It’s like agriculture. Technology never produced completely automated ways to plow fields or harvest crops, but one guy with a tractor can now work a field in a few hours by himself.

    With AI all this is still mostly hypothetical, in the sense that OpenAI and co. are burning money and resources at a pace that looks hard to sustain (let alone grow), and it’s unclear what the cost to consumers will be once the dust settles and these companies need to make a profit.

    But still, while we’re laughing at all the failed attempts to make AI truly autonomous in many domains, we might be missing the point.