Cutting-edge chips consume more electricity to manufacture because there are a crapload more process steps than in older fabs. Leading-edge chips are all made on the same 300 mm silicon wafers regardless of the process node.
Gamers Nexus has some good videos about chip manufacturing if you're interested.
I’d be interested in a “payback period” for modern chips, as in, how long the power savings in a modern chip takes to pay for its manufacturing costs. Basically, calculate performance/watt with some benchmark, and compare that to manufacturing cost (perhaps excluding R&D to simplify things).
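As a rough sketch of that payback calculation, you could compare the embodied manufacturing energy per die against the power saved at equal performance. All the numbers below (kWh per wafer, dies per wafer, wattages) are made-up illustrative assumptions, not measured fab data:

```python
# Hypothetical energy-payback sketch: every number here is an
# illustrative assumption, not real fab or vendor data.

def payback_hours(fab_energy_kwh_per_wafer, dies_per_wafer,
                  old_watts, new_watts):
    """Hours of operation until the energy spent manufacturing one die
    is offset by its lower power draw at the same performance."""
    fab_energy_wh_per_die = fab_energy_kwh_per_wafer * 1000 / dies_per_wafer
    watts_saved = old_watts - new_watts
    return fab_energy_wh_per_die / watts_saved

# Assume ~2000 kWh to process one 300 mm wafer, 300 good dies per wafer,
# and a new chip drawing 80 W instead of 100 W for the same workload.
hours = payback_hours(2000, 300, 100, 80)
print(round(hours))  # ~333 hours, about two weeks of 24/7 operation
```

With those guesses the manufacturing energy pays for itself quickly under constant load, but the answer swings wildly with utilization and with the (unpublished) per-wafer energy figure.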
Honestly, if you go through all the node changes you could do the math and figure it out. N3 to N2, for example, is roughly a 15-20% performance gain at the same power usage.
It wouldn't be exact, though, and I doubt any company will tell you how much power goes into producing a single wafer.
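If you did walk through the node changes, the per-generation gains would compound multiplicatively. A minimal sketch, where the per-node percentages are rough illustrative figures (only the N3-to-N2 range comes from the comment above):

```python
# Compound per-node perf-at-same-power gains into one overall factor.
# The gain percentages used below are rough illustrative assumptions.

def compound_gain(per_node_gains):
    """Multiply a list of fractional per-node gains (0.15 = +15%)."""
    factor = 1.0
    for g in per_node_gains:
        factor *= 1.0 + g
    return factor

# Four node jumps at ~15% each (the low end of the N3 -> N2 claim):
print(round(compound_gain([0.15, 0.15, 0.15, 0.15]), 2))  # 1.75
```

So even at the conservative end, a few node generations stack into a large perf/watt improvement, which is what drives any payback estimate.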
Are 7 nm chips more energy-intensive to manufacture than older 100 nm ones?
Or is it just scale: more chips to manufacture, more energy needed?
“Thanks Steve”
Older chips definitely consume more watts per unit of performance, and newer ones are usually faster in absolute terms on top of that too.
Talking about usage, not construction.