• lnxtx@feddit.nl · 6 points · 2 days ago

    Are 7 nm chips more energy-intensive to manufacture than older 100 nm ones?
    Or is it just scale: more chips to manufacture, more energy needed?

    • n3m37h@sh.itjust.works · 14 points · 2 days ago

      Cutting-edge chips consume more electricity to manufacture because there are a crapload more steps than on older processes. All chips are made on the same-size silicon wafers regardless of the fabrication process.

      Gamers Nexus has some good videos about chip manufacturing if you're interested.

      • sugar_in_your_tea@sh.itjust.works · 2 points · 3 hours ago

        I’d be interested in a “payback period” for modern chips: how long the power savings of a modern chip take to pay back its manufacturing costs. Basically, calculate performance per watt with some benchmark and compare that to manufacturing cost (perhaps excluding R&D to simplify things).
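        A rough back-of-the-envelope sketch of that payback calculation; every number below is made up purely for illustration, none are measured values:

```python
# Energy-payback sketch: how long a more efficient chip takes to "repay"
# the energy spent manufacturing it. ALL numbers are illustrative
# assumptions, not real data.

EMBODIED_ENERGY_KWH = 30.0   # assumed manufacturing energy per chip
OLD_CHIP_WATTS = 95.0        # assumed draw of the older chip
NEW_CHIP_WATTS = 65.0        # assumed draw at equal performance

savings_watts = OLD_CHIP_WATTS - NEW_CHIP_WATTS
payback_hours = EMBODIED_ENERGY_KWH * 1000 / savings_watts
print(f"Payback after roughly {payback_hours:.0f} hours of use")
```

        With these made-up numbers the newer chip repays its manufacturing energy after about a thousand hours of use; the real embodied-energy figure per chip is exactly the kind of number fabs don’t publish.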

        • n3m37h@sh.itjust.works · 2 points · 2 hours ago

          Honestly, if you go through all the node changes you could do the math and figure it out. For example, N3 to N2 is a 15–20% performance gain at the same power usage.

          It wouldn’t be exact, but I doubt any company will tell you how much power is used in the creation of a single wafer.
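          Compounding the per-node gains is simple math; in this sketch only the last figure (N3 to N2, ~15–20%) comes from the thread, and the earlier node numbers are placeholders I made up:

```python
# Compound assumed perf-per-watt gains across several node transitions.
# Only the final figure (N3 -> N2, ~15-20%) is quoted in the thread;
# the others are placeholder assumptions.
gains = [0.15, 0.15, 0.175]  # assumed fractional gain per node change

factor = 1.0
for g in gains:
    factor *= 1 + g
print(f"Cumulative perf/W improvement: {factor:.2f}x")
```

          So three node jumps at those assumed rates would compound to roughly a 1.5x efficiency improvement overall.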

    • Valmond@lemmy.world · 7 up / 1 down · 2 days ago

      Older chips definitely consume more watts per unit of processing power; newer ones are usually better on top of that too.

      Talking about usage, not construction.