Icyguy Posted June 15, 2023 (edited)
I don't know why, but whenever I launch a game through LaunchBox it runs at 30 fps. I used other emulators before and they had the same problem, but when I switched back it went back to 60 fps. A Wii U game runs perfectly fine, but I'm using RetroArch for the other systems and those still run at 30 fps, and I don't know why. Can anyone help me, please?
Edited June 15, 2023 by Icyguy
Lordmonkus Posted June 16, 2023
Check that RetroArch is set to the proper refresh rate, and also check that your display is set to 60Hz or higher instead of 30Hz. It's common for 4K TVs hooked up to a PC to default to 30Hz and need to be changed.
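In case it helps, the refresh-rate setting lives in retroarch.cfg (or under Settings > Video in the RetroArch UI). A minimal sketch of the relevant lines, assuming a default install; the values shown are examples to check against your display, not recommendations:

```ini
# retroarch.cfg -- relevant video settings (example values)
video_vsync = "true"
video_refresh_rate = "60.000000"
video_fullscreen = "true"
```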
skizzosjt Posted June 16, 2023
I was recently thrown a curveball when I set up a new PC that was going to use a different display than my current main system. My main system's display is 4K 120Hz, but this other TV is 4K 60Hz. I didn't expect any issues going to the 60Hz TV... well, boy was I wrong. RetroArch was acting like a SOB! I didn't have issues with other emulators or native games, if I remember right, but RetroArch emulation was definitely messing up, with slow gameplay due to a low frame rate. If I turned v-sync off it would run at full speed, but the screen tearing was unbearable, so that's not an option I would recommend.

I could tell from RetroArch's display menu that it was rendering at 4K but only at 30Hz (or maybe 29.XXHz), since that's what was selected when the issue was occurring. I would keep changing it to 4K 60Hz and it would work at that resolution/refresh rate until it was closed; the next time it was opened, the problem returned. I was chasing my tail!

Turns out it was my TV SETTINGS that were forcing RetroArch to adjust. In my TV's settings I can choose which HDMI protocol version to use, the options being Auto, 1.4, or 2.0 (it's an HDMI 2.0-capable port). I always left this on Auto and never had an issue. However, once I changed this setting to exclusively use HDMI 2.0, the whole refresh-rate-dropping problem went away. So the problem was that the port/cable was auto-negotiating down to v1.4, which maxes out at 30Hz at 4K resolution (unless you use chroma subsampling like 4:2:0). The bit that clued me in was that the problem also went away if I ran a resolution under 4K; even one increment down, at 3200x1800, I could run 60Hz fine. Without that clue I'd probably still be looking for the solution, lol.
The other bit that helped was that my TV looked like it was doing an extra "going blank" cycle... I don't know what to call it from a technical standpoint, but it's that moment when the screen goes black while a resolution or refresh rate change is being made. It was doing that at least one extra time, which made me think the TV was auto-adjusting whenever it detected a fullscreen window. I'm telling my rambling story so others will also think to check their TV/monitor's settings, since this kind of issue may not be purely PC-related when it pops up.
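For anyone curious why an HDMI 1.4 link tops out at 4K/30Hz while HDMI 2.0 handles 4K/60Hz, the bandwidth math works out roughly like the sketch below. This is my own back-of-the-envelope calculation, not something from the posts above; the pixel clocks are the standard CTA-861 timings and the link limits are nominal effective data rates, so treat the exact numbers as approximations:

```python
# Rough check of which 4K video modes fit on an HDMI 1.4 vs 2.0 link.
# Link limits are effective data rates after 8b/10b encoding
# (HDMI 1.4: 10.2 Gbps raw, HDMI 2.0: 18.0 Gbps raw).
HDMI_1_4_GBPS = 8.16
HDMI_2_0_GBPS = 14.4

def video_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Data rate a video mode needs, in Gbit/s (blanking included in pixel clock)."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

modes = {
    "4K/30Hz 4:4:4": video_gbps(297.0),      # CTA-861 pixel clock for 2160p30
    "4K/60Hz 4:4:4": video_gbps(594.0),      # CTA-861 pixel clock for 2160p60
    "4K/60Hz 4:2:0": video_gbps(594.0, 12),  # 4:2:0 halves the bits per pixel
}

for name, need in modes.items():
    fits_14 = "fits" if need <= HDMI_1_4_GBPS else "NO"
    fits_20 = "fits" if need <= HDMI_2_0_GBPS else "NO"
    print(f"{name}: needs {need:.2f} Gbps | HDMI 1.4: {fits_14} | HDMI 2.0: {fits_20}")
```

Running this shows 4K/60 4:4:4 exceeding the HDMI 1.4 budget while 4K/30 and 4K/60 4:2:0 fit, which matches what skizzosjt observed when the TV negotiated down to 1.4.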