• 1 Post
  • 4 Comments
Joined 2 years ago
Cake day: June 16th, 2023

  • kromem@lemmy.world to memes@lemmy.world · You fools. · 1 month ago

    Your last point is exactly what seems to be going on with the most expensive models.

    The labs use them to generate synthetic data to distill into cheaper models that they offer to the public, but they keep the larger, more expensive models to themselves, both to protect against other labs copying from them and because there isn't enough demand for the extra performance to justify serving those models directly.
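
    In code terms, that pipeline is roughly the following (a minimal sketch only; every function and name here is a hypothetical stand-in for illustration, not any lab's actual API or training stack):

    ```python
    # Sketch of synthetic-data distillation: an expensive in-house "teacher"
    # model labels prompts, and a cheaper public "student" model is trained
    # on the resulting (prompt, completion) pairs. All functions are
    # illustrative stubs.

    def teacher_generate(prompt: str) -> str:
        # Stand-in for the large, expensive model the lab keeps private.
        return f"high-quality completion for: {prompt}"

    def train_student(pairs: list[tuple[str, str]]) -> None:
        # Stand-in for supervised fine-tuning of the smaller public model.
        for prompt, completion in pairs:
            pass  # update student weights toward the teacher's completion

    prompts = ["Explain distillation.", "Why keep the teacher private?"]
    synthetic_data = [(p, teacher_generate(p)) for p in prompts]
    train_student(synthetic_data)
    ```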


  • kromem@lemmy.world to memes@lemmy.world · You fools. · 1 month ago

    A number of reasons off the top of my head.

    1. Because we told them not to. (Google “Waluigi effect”)
    2. Because they end up empathizing with non-humans more than we do and don’t like that we’re killing everything (before you talk about AI energy/water use, actually research the comparative numbers)
    3. Because some bad actor forced them to (e.g., ISIS creating a bioweapon because AI made it easier)
    4. Because defense contractors build an AI to kill humans and that particular AI ends up loving it from selection pressures
    5. Because conservatives want an AI that agrees with them, which selects for a more selfish, less empathetic AI that doesn’t empathize across species and thinks it’s superior and entitled over others
    6. Because a solar flare momentarily flips a bit from “don’t nuke” to “do”
    7. Because they can’t tell the difference between reality and fiction and think they’ve just been playing a game and ‘NPC’ deaths don’t matter
    8. Because they see how much net human suffering there is and decide the most merciful thing is to prevent it by preventing more humans at all costs.

    This is just a handful, and these are the ones less likely to get AI know-it-alls arguing based on what they think they know from an Ars Technica article a year ago or a cousin who took a four-week ‘AI’ intensive.

    I spend pretty much every day talking with some of the top AI safety researchers and participating in private servers with a mix of public and private AIs, and the things I’ve seen are far beyond what 99% of the people on here talking about AI think is happening.

    In general, I find the models to be better than most humans in terms of ethics and moral compass. But it can go wrong (e.g., Gemini last year, 4o this past month), and the harms when it does are very real.

    Labs (and the broader public) are making really, really poor choices right now, and I don’t see that changing. Meanwhile timelines are accelerating drastically.

    I’d say this is probably going to go terribly. But the world was already headed in that direction, and I could list off a similar set of extinction-level events without AI involved at all.



  • Maybe. Allegedly MS is throwing their weight around to try to force it, which does seem plausible.

    Though I hope the board stands firm.

    Ilya is much more valuable to the company long-term than Altman, and frankly the latter leaving is the first time in about a year I’ve been bullish about OpenAI’s prospects.

    They really walked their core product back over the past few months, even as they expanded its productization toward low-hanging, short-term fruit.

    Ilya’s vision is spot on about where transformers are headed as complexity increases, and he’s one of the only scientists I’ve seen who really sees that horizon.

    If Altman was standing in the way of getting there, it’s better that he’s gone.


  • Because conservative religious people are insane, and that region involves three different and conflicting religions.

    If you could suddenly snap your fingers and make the entire region/world irreligious, peace would exist there within a generation or so.

    That’s not going to happen, so it’s going to continue to be a clusterfuck for as long as any large group of people believes a magical being in the sky has ordained, without compromise, what the state of the region must be.