cross-posted from: https://lemmy.ml/post/44996161

March 25, 2026

The policy, announced by Bernie Sanders, an independent senator from Vermont, and Alexandria Ocasio-Cortez, a New York Democratic representative, on Wednesday morning, aims to ensure the AI boom protects the environment and communities, and benefits workers instead of harming them. A temporary ban, the lawmakers say, would give the US government time to create strong federal safeguards for AI, which is “affecting everything from our economy and wellbeing to our democracy, warfare and our kids’ education”.

“AI and robotics are creating the most sweeping technological revolution in the history of humanity,” Sanders said in an emailed statement. “The scale, scope, and speed of that change is unprecedented. Congress is way behind where it should be in understanding the nature of this revolution and its impacts.”

  • tino_408@lemmy.world · 8 hours ago

    Why are data centers bad for the environment? Wouldn’t this cause American AI innovation to slow down? I thought we were in an AI race?

    • BigDaddySlim@lemmy.world · 7 hours ago

      Sucking up water resources from local communities, poisoning water with its waste, noise causing illnesses in people living nearby, using a fuckload of energy, usually from non-renewable sources, adding CO2 to our atmosphere, just to name a few things. This isn’t news.

    • jaycifer@lemmy.world · 7 hours ago

      “Why are data centers bad for the environment?”

      Computer servers use a lot of electricity when they run. I believe most data centers are focused on data storage and retrieval, which means there are upswings and downswings in their usage as demand to access that data increases and wanes, so it’s not always running at 100% power consumption. My understanding is that AI data centers are primarily used for training new models, which means they are nearly always running at or near 100% to maximize training.

      Not only does this consume a lot of electricity from the grid, but a significant byproduct of servers running is heat, which requires strong cooling systems for the data center, and that cooling ironically uses even more electricity. I think they rely heavily on water cooling as well, since water is good at absorbing, moving, and then dissipating heat. I’ve read comments that this makes the water difficult to reuse, but I don’t know why that would be the case.

      In short, they use a lot of electricity to generate heat that then needs even more electricity and water to manage.
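      The "electricity for compute plus even more electricity for cooling" point above is often quantified with PUE (Power Usage Effectiveness): total facility power divided by IT (server) power. A minimal sketch of that arithmetic, with all figures (10 MW IT load, PUE of 1.4) chosen purely for illustration and not taken from the thread:

      ```python
      # Rough PUE arithmetic: how much energy a facility draws on top of
      # what its servers consume. All numbers below are illustrative
      # assumptions, not data from the article or comments.

      def total_facility_kwh(it_load_kw: float, pue: float, hours: float) -> float:
          """Total energy drawn by the facility, cooling/overhead included."""
          return it_load_kw * pue * hours

      # Assume a 10 MW IT load running near 100% for a full day,
      # with a PUE of 1.4 (0.4 W of cooling/overhead per 1 W of compute).
      it_kw = 10_000
      total = total_facility_kwh(it_kw, pue=1.4, hours=24)
      compute_only = total_facility_kwh(it_kw, pue=1.0, hours=24)
      overhead = total - compute_only

      print(f"{total:,.0f} kWh total, of which {overhead:,.0f} kWh is cooling/overhead")
      ```

      The takeaway is that a facility running flat-out for training pays the overhead fraction continuously, unlike a storage-focused data center whose load rises and falls with demand.
      
      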

      “Wouldn’t this cause American AI innovation to slow down?”

      Sure, this could cause the base level processing power available for training to taper off, but I think that would actually breed more innovation in making better training methods that use that power more efficiently. I recall a lot of early Chinese models being just as good for end users as American models despite being trained on less processing power. That sounds innovative to me.

      I would liken it to video game optimization. When gaming hardware was weaker, it was more necessary to optimize games to run within its limits. Modern gaming consoles have enough processing headroom that developers can get away with less optimization, which ironically can lead to worse-performing games than when that headroom was missing.