
Unfortunately lots of games, especially indie ones, don't optimize for IGPUs at all.

I had a 13500K, and I've played many not-so-demanding games where I'd expect at least 30 fps at the lowest video settings, but in reality I get something like 5 fps.



I think this is mostly because iGPUs used to be quite bad, so developers didn't consider them worth targeting (even though they would have been good enough for some low-end indie games).

But that has shifted over the past few generations, especially on the AMD side and increasingly on the Intel side as well (and given that the new Arc integrated graphics perform similarly to, and sometimes even better than, AMD's current top-end 780M, this trend should continue).

Things like GPU scarcity and rising prices have obviously also contributed to this shift.

Game development pipelines are quite long, so we are only now seeing releases of titles that were developed against these new realities. And indie titles in particular see little optimization after release, since those studios typically don't have the budget or staff to improve existing titles and work on new ones at the same time.


Maybe for older titles. But in the last two years, indies have been forced to optimize for iGPUs because of the Steam Deck, whose custom AMD APU is effectively an integrated GPU.



