Intel Arc B580 review: A $249 RTX 4060 killer, one-and-a-half years later

GPU power consumption numbers under load. Credit: Andrew Cunningham

Power consumption is another place where the Battlemage GPU plays a lot of catch-up with Nvidia. With the caveat that software-measured power usage numbers like ours are less accurate than numbers captured with hardware tools, the B580, when fully loaded, consumes somewhere between 120 and 130 W in Hitman and Borderlands. This is a tad higher than the 4060, but it’s lower than either Radeon RX 7600.

It’s not the top of the class, but looking at the A750’s power consumption shows how far Intel has come: the B580 beats the A750’s performance every single time while consuming about 60 W less power.

A strong contender, a late arrival

The Intel Arc B580. Credit: Andrew Cunningham

Intel is explicitly targeting Nvidia’s GeForce RTX 4060 with the Arc B580, a role it fills well at a low starting price. But the B580 is perhaps more damaging to AMD, which positions both of its 7600-series cards (and the remaining 6600-series stuff that’s hanging around) in the same cheaper-than-Nvidia-with-caveats niche.

In fact, I’d probably recommend the B580 to a budget GPU buyer over any of the Radeon RX 7600 cards at this point. For the same street price as the RX 7600, Intel is providing better performance in most games and much better performance in ray-traced games. The 16GB 7600 XT has more RAM, but it’s $90 to $100 more expensive, and the B580’s 12GB is still reasonably future-proof and decent at 1440p.

All of that said, Intel is putting out a great competitor to the RTX 4060 and RX 7600 a year and a half after both of those cards launched—and within just a few months of a possible RTX 5060. In other words, Intel is selling mid-2023’s midrange GPU performance in late 2024. There are good arguments for building a budget gaming PC right this minute, before potential Trump-administration tariffs can affect prices or supply chains, but assuming the tech industry can maintain its normal patterns, it would be smartest to wait and see what Nvidia does next.


2024-12-12 17:50:18
