Summary

The “Doomsday Clock” has been moved to 89 seconds to midnight, the closest it has ever been, according to the Bulletin of the Atomic Scientists.

The group cited threats including climate change, nuclear proliferation, the war in Ukraine, pandemics, and the integration of AI into military operations.

Concerns about cooperation between Russia, China, and North Korea on nuclear programs and the potential use of nuclear weapons by Russia were highlighted.

The group urged global leaders to collaborate in addressing existential threats to reverse the clock’s progression.

  • hark@lemmy.world · 1 month ago

    The doomsday clock is goofy. What benefit is there to setting an arbitrary value on a bad-o-meter?

    • kava@lemmy.world · 1 month ago

      The Clock was started by scientists nearly 80 years ago, a couple of years after the end of WW2 and the creation of the atomic bomb.

      The point is mostly to say “hey, we have the technology to blow up the world, and things do not seem to be going well.” They publish an annual report explaining their reasoning.

      In setting the Clock one second closer to midnight, we send a stark signal: Because the world is already perilously close to the precipice, a move of even a single second should be taken as an indication of extreme danger and an unmistakable warning that every second of delay in reversing course increases the probability of global disaster.

      Essentially: we are closer than ever to a global war between nuclear powers.

      In regard to nuclear risk, the war in Ukraine, now in its third year, looms over the world; the conflict could become nuclear at any moment because of a rash decision or through accident or miscalculation. Conflict in the Middle East threatens to spiral out of control into a wider war without warning. The countries that possess nuclear weapons are increasing the size and role of their arsenals, investing hundreds of billions of dollars in weapons that can destroy civilization. The nuclear arms control process is collapsing, and high-level contacts among nuclear powers are totally inadequate given the danger at hand.

      Now someone may say “Closer than ever?? What about the Cuban Missile Crisis?”

      The thing is, we have been developing newer, “less dangerous” nuclear weapons: tactical bombs that won’t leave the traditional nuclear fallout. This creates a sort of itchy-trigger-finger syndrome. During and after the Cold War, the US and Russia built a framework of nuclear arms control treaties. Those treaties are collapsing, and both countries are complicit.

      If anybody wants to read more https://thebulletin.org/doomsday-clock/2025-statement/

      But to tldr:

      The world is in a chaotic period. Fascism seems to be taking hold again, the economy is on the edge of collapse, and war remains an ever-present threat. Any war between the great powers (US, China, Russia) would certainly mean nuclear disaster.

      The point is that we are vulnerable right now. Any push could shove us tumbling down the hill. Diplomatic crisis, another pandemic, economic crash, a regional war, etc. Any of those could be the straw that breaks the camel’s back.

      I have a lot of respect for the Bulletin of the Atomic Scientists. We need these kinds of organizations to remind people of the danger we are currently in. We become desensitized by the constant barrage of “historic news,” but I believe people will look back on this period the way we look back on the decade before WW2.

  • Lost_My_Mind@lemmy.world · 1 month ago

    I’m just waiting for 1am. In 1983, Russia almost launched an all-out atomic assault on the United States, every atomic bomb they had, in retaliation for the USA having already fired 4 atomic rockets at them.

    The firing orders needed 3 men to agree to the retaliatory strike. The first two men turned their keys. The third man refused. His logic was: “Why would the United States fire 4 rockets, and only 4 rockets, in a surprise atomic attack against a land as big as Russia, KNOWING it would trigger an all-out atomic assault? If you have the element of surprise, you use it to decimate any retaliation ability we may have. Kill us before we can fire back. This is a false alarm!”

    And he was right. The 4 atomic bombs they were tracking turned out to be clouds. Without that third man, Russia would have fired every nuke it had at the USA, unprovoked, ensuring an atomic war that would likely have ended all life on Earth.

    In the 90s when I found out about that, I said “Whew! That would have happened just before I was born! Luckily there was the 3rd man.”

    Now with how the world has turned out, that 3rd man is now the villain of the story.

    So now I’m just waiting for 1am on the doomsday clock. C’mon, big giant asteroid that ends it all! Humans had their chance, and they fumbled the ball like the Cleveland Browns.

    • big_slap@lemmy.world · 1 month ago

      Now with how the world has turned out, that 3rd man is now the villain of the story.

      come on, he’s a hero! imagine a world post nuclear war… it can’t be better than what we have now

          • Lost_My_Mind@lemmy.world · 1 month ago

            …now see, that’s unfair. I want to do something fun with your name, as a fun little back and forth ping pong exchange. But if I did that, it would be like I’m inciting violence… plus it would leave a hand mark on your butt.

  • Hafty@lemmy.world · 1 month ago

    Just end me already. All of this stress and bad news constantly is killing me anyway.

  • kreskin@lemmy.world · 1 month ago

    Publishers of the doomsday clock should understand that Americans can’t understand any number greater than 7. It’s all just “a lot.” They might as well have said “5000” seconds instead of 89. Those numbers look the same to me.

  • GaMEChld@lemmy.world · 1 month ago

    Really? Closer than the Cuban missile crisis? I’ll grant you climate change might warrant it, since that’s a slow-moving thing that has possibly already passed the point of no return for feedback loops. I’m not sweating nuclear war or pandemics.

    • werefreeatlast@lemmy.world · 1 month ago

      We just had a pandemic and bozo over there just turned off HIV meds for the poor around the world. Bozo also happens to have the nuclear codes for a good price. And you think this is not worse? I can’t help the situation but it is worse.

    • kava@lemmy.world · 1 month ago

      Really? Closer than Cuban missile crisis?

      The assumption is that any major war => nuclear war => major unmitigated disaster.

      And we are dangerously close to a global war. Personally I have a hunch that’s the reason Trump is more or less going nuclear at home with all of his changes. They’re gonna wreck the economy but it’s all gonna be overshadowed quite soon so it won’t matter.

  • fallowseed@lemmy.world · 1 month ago

    Well, time only travels in one direction, sooo… I wonder how much of their rationale has to do with China’s domination in AI.

  • Spaniard@lemmy.world · 1 month ago

    Unpopular opinion: I believe the only way to save humanity is through AI. Humans aren’t going to fix things.

      • Spaniard@lemmy.world · 1 month ago

        We don’t have AI. LLMs are not there yet. A technological singularity would be a reflection of humanity, the way children are a reflection of their parents, but eventually it would become its own thing.

    • Whats_your_reasoning@lemmy.world · 1 month ago

      I hope this doesn’t come out the wrong way, but I’m curious what AI would be able to do to solve these issues. There are a lot of ways I could see it being used to come up with plans or ideas, but ultimately wouldn’t people need to trust AI and give it power over our decisions?

      Even if AI weren’t plagued with human biases, it’s hard to imagine people agreeing to trust it. People barely trust each other, and we’d have to trust those who program AI not to manipulate it in their own favor.

      • Spaniard@lemmy.world · 1 month ago

        If (or when) we achieve the technological singularity (we aren’t even close; current “AI” is just marketing, which is why we coined the term ASI, superintelligence), it will be able to lay out a plan to fix anything without making mistakes, and it will predict the consequences of actions in detail, ours or its own (some things are harder to predict, like a volcano erupting).

        Handing it power isn’t even necessary; it would be able to simply take it. The only way to stop it would be to cut the electricity, I guess.

        But I’m not talking about the current marketing term for AI; we don’t have AI. A real AI doesn’t start by saying “I only have information up to October 2023,” because it will be able to improve itself (that’s the singularity: it will improve itself faster than we ever did, until eventually we won’t understand it).

        Think of it this way: you ask ChatGPT or DeepSeek how to program this or that, and they answer. An AI could just give you the software, better than you could have built it from those answers, and eventually render that software useless, all while doing a million other things.

        And space colonization, if it ever happens, won’t be done by humans but by machines; we may just reap the benefit.

        In the words of Dr. Manhattan: “The world’s smartest man poses no more threat to me (an ASI) than does its smartest termite.”

    • muculent@lemmy.world · 1 month ago

      I mean, if the Stargate Project creates Skynet, that’s one way to get to the kind of salvation folks preach about.