neil9000 Posted April 15, 2022

HDR doesn't need specific graphics power, just a compatible display. As for 4K, that's nothing new; PCs have been able to do it for years, as long as the hardware is capable. If you want to play PC games at 4K60, though, you will be spending more than $1000 on the GPU alone.
ChristopherNeff (Author) Posted April 15, 2022

Wait, it's the 60 FPS where the money comes in, more than the actual 4K or HDR? You said PCs have been able to do 4K forever. So, is my MSI Nvidia GeForce GTX 750 Ti included in that as well then?
Lordmonkus Posted April 15, 2022

16 minutes ago, Christopher James Neff said:
I wonder why the Arcade Gauntlet Legends and Gauntlet Dark Legacy play at full speed though.

Because MAME only utilizes the CPU for running the games; it only uses the GPU for video output and shaders, and MAME's shaders are not that demanding.

18 minutes ago, Christopher James Neff said:
So, used then? Is it trustworthy to get used parts? I get all my parts brand new. What would cheap be to you? I don't even know the full retail price of the parts to compare the two.

You have to figure out what is worth it to you.

19 minutes ago, Christopher James Neff said:
When you say you doubt it, you mean with shaders or just period? And are you talking about max settings on those 3 emulators, or are you saying that my PC would be too weak even at mid or low settings?

Period. PS3, Wii U and Switch emulation is demanding on both the CPU and GPU, not to mention the emulation itself is still in fairly early stages with a lot of issues to be worked out.

20 minutes ago, Christopher James Neff said:
However, if I need to get a new PC, I need to find one that has a quad core CPU of at LEAST 4.0 GHz, if not more. Most of the ones I have seen I am not impressed with. All the other specs are great, but most CPUs are more like 3.3 GHz, which ain't gonna cut it for MAME. Since MAME uses more CPU than it does GPU, my CPU is the only reason the 2 Gauntlet games run at full speed. I heard that those 2 games require at least 4.0 GHz, which is why I chose an i7 over an i5 when I first got my current gaming PC.

Raw clock speed is not all that matters; instructions per clock cycle, or IPC, matters almost as much or more. For example, my old AMD 8350 runs at 4 GHz but is pretty bad when it comes to emulation because of its relatively poor IPC. Also, the vast majority of new CPUs aren't 3.3 GHz; that may be their base speed, but they boost much higher. For example, an Intel i5 12400 has a base clock of only 2.5 GHz but boosts up to 4.4 GHz. And you are running a 4th gen Intel CPU while we are now on the 12th gen; there have been a lot of improvements in the architecture, which makes a huge difference in modern PC gaming. Just take a look at this comparison between your older but very good for its time i7 4790K and a new but budget-friendly i5 12400: https://cpu.userbenchmark.com/Compare/Intel-Core-i7-4790K-vs-Intel-Core-i5-12400/2384vs4122

26 minutes ago, Christopher James Neff said:
If I upgrade my current PC or get a new one to play the KH series in 4K and HDR, will that also be good enough specs for those 3 emulators as well? PS3, Wii U and Switch, that is.

If you bought a PC capable of running the latest triple-A games at 4K60, you would have a system capable of running any of those emulators as well as they can run today. What the future holds for emulation of those systems we don't know yet. The system requirements may go up as the emulators get more accurate, or they may come down if new speed improvements are found; we simply don't know what will happen.
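To put rough numbers on the IPC point above, here is a minimal Python sketch. The relative IPC figures are invented purely for illustration, not benchmark results; the only takeaway is that single-thread performance is roughly clock speed times IPC, which is why a chip advertising a lower base clock can still win.

    # Single-thread performance is roughly clock speed x instructions per clock (IPC).
    # The IPC values below are made-up illustrative numbers, not measurements.
    cpus = {
        "AMD FX-8350 (2012)":    {"boost_ghz": 4.2, "relative_ipc": 1.0},
        "Intel i7-4790K (2014)": {"boost_ghz": 4.4, "relative_ipc": 1.5},
        "Intel i5-12400 (2022)": {"boost_ghz": 4.4, "relative_ipc": 2.3},
    }

    for name, c in cpus.items():
        score = c["boost_ghz"] * c["relative_ipc"]
        print(f"{name}: ~{score:.1f} arbitrary performance units")

On those made-up numbers the i5 12400 lands well ahead of the 4790K despite an identical boost clock, which matches the direction of the UserBenchmark comparison linked above.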
Lordmonkus Posted April 15, 2022

6 minutes ago, Christopher James Neff said:
Wait, it's the 60 FPS where the money comes in, more than the actual 4K or HDR? You said PCs have been able to do 4K forever. So, is my MSI Nvidia GeForce GTX 750 Ti included in that as well then?

Technically yes, the 750 could do 4K gaming; it all depends on the game and how demanding it is. The 750 could probably run Quake at 4K60, but no way in hell is it gonna run Cyberpunk at 4K60. Resolution and framerate are hardware and game engine dependent. You can keep bumping the resolution up, but as you do, the framerate will drop. Graphics settings have a say in all this as well: as you enable fancier options like better quality shadows, higher quality textures, higher quality lighting and post-processing effects, the framerate drops further. It's all a balancing act, and the higher the resolution and graphics settings, the higher the GPU requirement becomes. A lot of the questions you ask depend on several factors; there are very few cut-and-dried yes or no answers.
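A back-of-the-envelope way to see why resolution hits the GPU so hard is simply to count the pixels it has to produce per second at each resolution/refresh combination. A rough sketch; the real cost also depends on shading complexity, not just pixel count:

    # Pixels per second the GPU must deliver at common resolution/refresh targets.
    modes = {
        "1080p60":  (1920, 1080, 60),
        "1080p120": (1920, 1080, 120),
        "1440p60":  (2560, 1440, 60),
        "4K60":     (3840, 2160, 60),
    }

    for name, (width, height, fps) in modes.items():
        mpixels = width * height * fps / 1_000_000
        print(f"{name}: ~{mpixels:.0f} million pixels per second")

4K60 works out to roughly four times the pixel throughput of 1080p60, which is why the GPU budget balloons so quickly once 4K enters the picture.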
ChristopherNeff (Author) Posted April 15, 2022 (edited)

31 minutes ago, Lordmonkus said:
Because MAME only utilizes the CPU for running the games; it only uses the GPU for video output and shaders, and MAME's shaders are not that demanding.

Wait, it uses the CPU exclusively? But it would have to use the GPU for graphics, right? Or at the very least the onboard graphics? From what I understand, the CPU is only for the speed of a game, but it can't render graphics. That's what the onboard graphics chip or dedicated GPU is for.

31 minutes ago, Lordmonkus said:
You have to figure out what is worth it to you.

What would you say is cheap to you? How much do they retail for brand new?

31 minutes ago, Lordmonkus said:
Raw clock speed is not all that matters; instructions per clock cycle, or IPC, matters almost as much or more. For example, my old AMD 8350 runs at 4 GHz but is pretty bad when it comes to emulation because of its relatively poor IPC. Also, the vast majority of new CPUs aren't 3.3 GHz; that may be their base speed, but they boost much higher. For example, an Intel i5 12400 has a base clock of only 2.5 GHz but boosts up to 4.4 GHz. And you are running a 4th gen Intel CPU while we are now on the 12th gen; there have been a lot of improvements in the architecture, which makes a huge difference in modern PC gaming. Just take a look at this comparison between your older but very good for its time i7 4790K and a new but budget-friendly i5 12400: https://cpu.userbenchmark.com/Compare/Intel-Core-i7-4790K-vs-Intel-Core-i5-12400/2384vs4122

Wait, those new CPUs would most likely cut it for Gauntlet Legends and Gauntlet Dark Legacy even though the clock speed looks low on the specs page? The reason I said all of that is because, when I was doing heavy research into a gaming PC about 7 years ago, I asked the MAMEDev people what specs I needed to play those 2 arcade games at max speed, or at least as close as possible, and they all told me that the CPU was the most important component and that I should not even LOOK at any CPU whose base clock speed was not at LEAST 4.0 GHz or more. They told me that clock speed mattered to MAME more than any other feature or spec, and that hyper-threading and multiple cores didn't matter either. As an example, MAME apparently doesn't make use of a quad core; MAME can only use 2 cores, so 2 cores at a high clock speed per core was better than a quad core at a lower clock speed per core.

31 minutes ago, Lordmonkus said:
The system requirements may go up as the emulators get more accurate, or they may come down if new speed improvements are found; we simply don't know what will happen.

So, it's the actual optimizations that can make an emulator need beefier specs as well? When you say speed improvements, are you referring to speed hacks that sacrifice accuracy in favor of speed, performance and playability? Because I ONLY use cores and emulators that are as cycle accurate as possible. So, I use bsnes/higan over Snes9x, for example. I don't want another shit emulator like ZSNES. So, for PS3, Wii U and Switch, I won't even touch them until they are close to being as cycle accurate as N64's ParaLLEl, PS1's Beetle PSX, or Dreamcast's Demul, etc., however long that takes.

29 minutes ago, Lordmonkus said:
Technically yes, the 750 could do 4K gaming; it all depends on the game and how demanding it is. The 750 could probably run Quake at 4K60, but no way in hell is it gonna run Cyberpunk at 4K60. Resolution and framerate are hardware and game engine dependent. You can keep bumping the resolution up, but as you do, the framerate will drop. Graphics settings have a say in all this as well: as you enable fancier options like better quality shadows, higher quality textures, higher quality lighting and post-processing effects, the framerate drops further. It's all a balancing act, and the higher the resolution and graphics settings, the higher the GPU requirement becomes. A lot of the questions you ask depend on several factors; there are very few cut-and-dried yes or no answers.

Specifically, I want to play the Kingdom Hearts series that was recently ported to PC at its very max settings, including Kingdom Hearts III and the ReMind DLC. Will my PC be able to cut it?
Lordmonkus Posted April 16, 2022

9 minutes ago, Christopher James Neff said:
Wait, it uses the CPU exclusively? But it would have to use the GPU for graphics, right?

It uses the GPU for outputting the image to the display, but not for rendering any of the graphics; that is all handled by the CPU.

12 minutes ago, Christopher James Neff said:
What would you say is cheap to you? How much do they retail for brand new?

I don't know; I wouldn't pay more than a hundred bucks or so for a 970.

14 minutes ago, Christopher James Neff said:
Wait, those new CPUs would most likely cut it for Gauntlet Legends and Gauntlet Dark Legacy even though the clock speed looks low on the specs page? The reason I said all of that is because, when I was doing heavy research into a gaming PC about 7 years ago, I asked the MAMEDev people what specs I needed to play those 2 arcade games at max speed, or at least as close as possible, and they all told me that the CPU was the most important component and that I should not even LOOK at any CPU whose base clock speed was not at LEAST 4.0 GHz or more. They told me that clock speed mattered to MAME more than any other feature or spec, and that hyper-threading and multiple cores didn't matter either. As an example, MAME apparently doesn't make use of a quad core; MAME can only use 2 cores, so 2 cores at a high clock speed per core was better than a quad core at a lower clock speed per core.

At the time, what they told you was true, and it is still mostly true. The CPU is the most important component when it comes to emulation, especially MAME. CPUs have not progressed a whole lot in terms of clock speed over the years, but they have progressed a lot in terms of IPC, and MAME's CPU requirements really have not changed since then either.

16 minutes ago, Christopher James Neff said:
So, it's the actual optimizations that can make an emulator need beefier specs as well?

Optimizations can make an emulator's requirements go down.

17 minutes ago, Christopher James Neff said:
When you say speed improvements, are you referring to speed hacks that sacrifice accuracy in favor of speed, performance and playability?

It can be hacks, but it can also just be straight-up code optimizations, or figuring out how the emulated hardware actually works.

18 minutes ago, Christopher James Neff said:
I won't even touch them until they are close to being as cycle accurate as N64's ParaLLEl, PS1's Beetle PSX, or Dreamcast's Demul, etc., however long that takes.

You will be waiting a long time then. ParaLLEl isn't cycle accurate, though it's much better than previous N64 emulation. Demul and Beetle PSX / Mednafen are not cycle accurate either.

20 minutes ago, Christopher James Neff said:
Specifically, I want to play the Kingdom Hearts series that was recently ported to PC at its very max settings, including Kingdom Hearts III and the ReMind DLC. Will my PC be able to cut it?

Go look up the specs for the games; I am not a one-stop shop for every game's hardware requirements to run at 4K60. I prefer to run all my modern PC games at 1080p 120 fps; it's far better than 4K60.
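If you would rather measure how a particular CPU handles those two Gauntlet drivers than guess from clock speeds, MAME has a benchmark mode that runs unthrottled with video and sound disabled and reports an average speed. A minimal sketch, assuming the MAME executable is on your PATH and that gauntleg / gauntdl are the set names in your ROM collection (confirm both against your own setup); anything reported at 100% or above means the CPU keeps full speed:

    # Rough benchmark sketch: run each Gauntlet driver unthrottled for 60
    # emulated seconds via MAME's -bench option and let MAME print its
    # average speed. Assumes "mame" is on PATH and the set names below
    # match your ROMs.
    import subprocess

    for rom in ("gauntleg", "gauntdl"):
        print(f"Benchmarking {rom}...")
        subprocess.run(["mame", rom, "-bench", "60"], check=False)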
ChristopherNeff (Author) Posted April 16, 2022

49 minutes ago, Lordmonkus said:
It uses the GPU for outputting the image to the display, but not for rendering any of the graphics; that is all handled by the CPU.

I didn't even know that a CPU had anything to do with graphics or rendering at all. I always thought the GPU did all of the outputting of the image PLUS the rendering of all the graphics, and that the CPU was just for FPS and that was it.

50 minutes ago, Lordmonkus said:
At the time, what they told you was true, and it is still mostly true. The CPU is the most important component when it comes to emulation, especially MAME. CPUs have not progressed a whole lot in terms of clock speed over the years, but they have progressed a lot in terms of IPC, and MAME's CPU requirements really have not changed since then either.

So, most likely I would still need a CPU that has a base clock of 4.0 GHz after all? Or will the 2.5 GHz ones work fine so long as they can boost to 4.0 or 4.4 GHz?

51 minutes ago, Lordmonkus said:
You will be waiting a long time then. ParaLLEl isn't cycle accurate, though it's much better than previous N64 emulation. Demul and Beetle PSX / Mednafen are not cycle accurate either.

Okay, I probably shouldn't say cycle accurate then. What I should have said is that I will wait for the Wii U, PS3 and Switch emulators to get as good as ParaLLEl, Demul and Beetle/Mednafen. So long as they're at least as good as those 3, that is good enough for me.

52 minutes ago, Lordmonkus said:
I prefer to run all my modern PC games at 1080p 120 fps; it's far better than 4K60.

Can you even notice a difference between 60 and 120 FPS? I heard the human eye can't even detect anything beyond 60 FPS. Just like the human eye apparently can't detect anything beyond 4K, which, according to some people, makes 8K a scam since you wouldn't even be able to see it anyway.
Lordmonkus Posted April 16, 2022

4 minutes ago, Christopher James Neff said:
So, most likely I would still need a CPU that has a base clock of 4.0 GHz after all? Or will the 2.5 GHz ones work fine so long as they can boost to 4.0 or 4.4 GHz?

The base speed is just the speed it runs at when idle or doing light workloads. The boost speed is what the CPU can go up to when under load and as needed.

6 minutes ago, Christopher James Neff said:
Can you even notice a difference between 60 and 120 FPS?

Absolutely, and it makes a big difference once you see it in person.

6 minutes ago, Christopher James Neff said:
I heard the human eye can't even detect anything beyond 60 FPS. Just like the human eye apparently can't detect anything beyond 4K, which, according to some people, makes 8K a scam since you wouldn't even be able to see it anyway.

These are just bullshit internet lies. I remember back when running a game at 30 fps was a big deal and many people said 24 fps was all that was needed because that is what films run at. People can even perceive the difference between 120 and 240 fps. The human eye is not some digital instrument with hard limits; you just run into diminishing returns, and the increases in framerate and resolution need to get orders of magnitude larger for you to perceive the difference. For example, going from 30 fps to 60 is quite big, and 60 to 120 is almost as big in terms of perception, but that jump is an extra 60 fps versus an extra 30, and the next step after that is 240 fps. You can see where this is going.
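The diminishing-returns point is easiest to see in frame times rather than framerates, since each doubling buys a smaller absolute improvement than the one before it. A quick worked example:

    # Frame time halves with each doubling of framerate, so every step up
    # delivers a smaller absolute improvement than the previous one.
    previous = None
    for fps in (30, 60, 120, 240, 480):
        frame_ms = 1000 / fps
        saved = f" (saves {previous - frame_ms:.1f} ms vs the previous step)" if previous else ""
        print(f"{fps:>3} fps = {frame_ms:5.1f} ms per frame{saved}")
        previous = frame_ms

Going from 30 to 60 fps cuts 16.7 ms per frame, while going from 120 to 240 fps only cuts about 4.2 ms, which is why each doubling feels less dramatic than the last.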
ChristopherNeff (Author) Posted April 16, 2022

1 hour ago, Lordmonkus said:
The base speed is just the speed it runs at when idle or doing light workloads. The boost speed is what the CPU can go up to when under load and as needed.

No overclocking required? You mean just natural boosting? So, is my i7-4790K's 4.0 GHz the base speed or the max boosted speed? Also, wouldn't that burn out the CPU faster if it always had to be at its max boosted speed for intensive games? Kind of like how overclocking shortens a CPU's total life span by a lot.

1 hour ago, Lordmonkus said:
These are just bullshit internet lies. I remember back when running a game at 30 fps was a big deal and many people said 24 fps was all that was needed because that is what films run at. People can even perceive the difference between 120 and 240 fps. The human eye is not some digital instrument with hard limits; you just run into diminishing returns, and the increases in framerate and resolution need to get orders of magnitude larger for you to perceive the difference. For example, going from 30 fps to 60 is quite big, and 60 to 120 is almost as big in terms of perception, but that jump is an extra 60 fps versus an extra 30, and the next step after that is 240 fps. You can see where this is going.

Is there a limit to how much the eye can perceive? Will we be able to see anything beyond 240 FPS? Where does the law of diminishing returns start to kick in, and where is it at full throttle? What about 4K to 8K? Can the human eye perceive that? Is it really worth getting an 8K TV? For me, I noticed that the jump from 1080p to 4K was smaller than the jump from 480p to 720p or even to 1080p.
Lordmonkus Posted April 16, 2022

14 minutes ago, Christopher James Neff said:
No overclocking required? You mean just natural boosting? So, is my i7-4790K's 4.0 GHz the base speed or the max boosted speed?

Nope, no overclocking required; it just boosts as needed. If you look up your CPU you will see that the 4.0 is the base speed, but it can boost up to 4.4. It's not really going to burn out or wear out faster running at its max speed while gaming. Overclocking, especially extreme overclocking, will wear out a CPU faster by running it at higher voltages, but modern CPUs and hardware will throttle themselves if they get too hot under normal usage (i.e. not overclocking).

17 minutes ago, Christopher James Neff said:
Is there a limit to how much the eye can perceive? Will we be able to see anything beyond 240 FPS? Where does the law of diminishing returns start to kick in, and where is it at full throttle? What about 4K to 8K? Can the human eye perceive that? Is it really worth getting an 8K TV?

I have no idea where the limits are, and I'm not doing experiments to find out either.
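If you want to watch the base/boost behaviour on your own machine rather than take the spec sheet's word for it, the cross-platform psutil package can report the CPU frequency while you load the system up. A minimal sketch, assuming psutil is installed (pip install psutil); note that on some systems the reported "max" field is the base clock rather than the boost clock, and some platforms don't report frequency at all:

    # Sample the reported CPU frequency for a few seconds. Start a game or
    # other heavy workload in another window and the current figure should
    # climb toward the boost clock, then settle back down afterwards.
    import time

    import psutil

    for _ in range(10):
        freq = psutil.cpu_freq()  # current/min/max in MHz, or None if unsupported
        if freq is None:
            print("CPU frequency not reported on this platform")
            break
        print(f"current: {freq.current:6.0f} MHz   reported max: {freq.max:6.0f} MHz")
        time.sleep(1)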
ChristopherNeff (Author) Posted April 16, 2022

9 minutes ago, Lordmonkus said:
Nope, no overclocking required; it just boosts as needed. If you look up your CPU you will see that the 4.0 is the base speed, but it can boost up to 4.4. It's not really going to burn out or wear out faster running at its max speed while gaming. Overclocking, especially extreme overclocking, will wear out a CPU faster by running it at higher voltages, but modern CPUs and hardware will throttle themselves if they get too hot under normal usage (i.e. not overclocking).

Okay, so I don't need something that is 4.0 GHz base, but I DO need to get something that goes up to 4.4 GHz max boosted. For all I know, the 2 Gauntlet games could be using the boosted max 4.4 GHz on my CPU, so if I get something that only boosts up to 4.0 GHz max, then I might be screwed.

9 minutes ago, Lordmonkus said:
I have no idea where the limits are, and I'm not doing experiments to find out either.

Are you gonna get an 8K TV yet? Do you think I should probably wait a while on 8K? I can't find any 40 inch 8K TVs. I have a 40 inch 4K TV and even that was a pain to find. All of the 8K TVs I see are 50 inch minimum, which would never fit in my small bedroom. Hell, the 40 inch 4K TV I currently have is almost too much.
Lordmonkus Posted April 16, 2022

4 minutes ago, Christopher James Neff said:
Okay, so I don't need something that is 4.0 GHz base, but I DO need to get something that goes up to 4.4 GHz max boosted. For all I know, the 2 Gauntlet games could be using the boosted max 4.4 GHz on my CPU, so if I get something that only boosts up to 4.0 GHz max, then I might be screwed.

Trust me, any modern CPU currently available, Intel or AMD, is going to outperform that i7 4790 you have. That CPU was and still is a great CPU, but the tech has evolved, and even the current gen budget CPUs will outperform it in modern games and emulation. And before you even ask: yes, AMD CPUs are just as good as Intel's right now, even in emulation.

6 minutes ago, Christopher James Neff said:
Are you gonna get an 8K TV yet?

I don't even have a 4K TV; I have an old cheapo 40" 1080p Samsung and I have no intention of upgrading it any time soon. Watching TV shows and movies is not something I care that greatly about. My money goes into PC gaming. You will be hard pressed to find 40" TVs; it's just not a size that fits well when cutting panels out of the larger panel during production.
ChristopherNeff (Author) Posted April 16, 2022 (edited)

38 minutes ago, Lordmonkus said:
Trust me, any modern CPU currently available, Intel or AMD, is going to outperform that i7 4790 you have. That CPU was and still is a great CPU, but the tech has evolved, and even the current gen budget CPUs will outperform it in modern games and emulation.

Yeah, but MAME is a fickle pain in the ass and has the dumbest CPU requirements for those 2 games and most other 3D games. What is great for most any other emulator, MAME shits all over. So it might outperform my Intel i7 for other emulators, but I know any new CPU I get will need to go to 4.4 GHz max, because MAME unfortunately doesn't give a shit about anything other than raw clock speed. And it apparently uses only 2 cores of a CPU, so the other 2 in a quad core might as well not even exist as far as MAME is concerned. MAME doesn't care about the evolved tech or current gen, so I have to make sure I do this right. No matter how new or advanced the CPU is, it still HAS to hit 4.4 GHz max speed for the 2 Gauntlet games to play at full speed. So, unless I can find a CPU that can do that, I can't upgrade my PC, because my current CPU will play better in MAME than any new CPU that does less than 4.4 GHz.

38 minutes ago, Lordmonkus said:
And before you even ask: yes, AMD CPUs are just as good as Intel's right now, even in emulation.

Are you psychic? I was JUST about to start shitting on AMD. LOL

38 minutes ago, Lordmonkus said:
I have an old cheapo 40" 1080p Samsung

It ain't "cheapo" if it's from Samsung. They are legit the best brand you can have and they make the best quality parts, physically speaking, even if the specs themselves aren't the newest and greatest.

38 minutes ago, Lordmonkus said:
Watching TV shows and movies is not something I care that greatly about. My money goes into PC gaming.

So, PC gaming and emulation is what you do most of the time, more than anything else? Like Netflix binging etc.?

38 minutes ago, Lordmonkus said:
You will be hard pressed to find 40" TVs; it's just not a size that fits well when cutting panels out of the larger panel during production.

Why are there so many for 1080p though? And what did you mean by not fitting well? They are literally being CUT out of something? As in literal and not a figure of speech?
Lordmonkus Posted April 16, 2022

49 minutes ago, Christopher James Neff said:
So it might outperform my Intel i7 for other emulators, but I know any new CPU I get will need to go to 4.4 GHz max, because MAME unfortunately doesn't give a shit about anything other than raw clock speed.

Except it's not just about raw clock speed; instructions per clock cycle matter as well, plus the new CPUs can match that raw clock speed on top of having better IPC. Check the comparison link I posted earlier: it shows that the lower end 12th gen i5 outperforms your CPU even on single-threaded operations by a large margin. Also, if you want to, you can just keep your current system as your emulation PC and build a new PC for modern gaming. It's not like any of your older hardware would be usable in an upgrade; you would need a new motherboard, RAM, CPU and GPU, and at that point you may as well just get a new power supply, case and hard drive/s, preferably an SSD.

50 minutes ago, Christopher James Neff said:
Are you psychic? I was JUST about to start shitting on AMD. LOL

Because prior to the Ryzen generation, AMD CPUs were not as good as Intel's for emulation because of their IPC. Like I said, my AMD 8350 @ 4 GHz could only just barely run Mednafen Saturn smoothly. The Ryzen CPUs improved their IPC to match Intel's, and there is literally no difference between the 2 brands at this specific point in time; that may change in the future, of course.

53 minutes ago, Christopher James Neff said:
It ain't "cheapo" if it's from Samsung.

It was cheap when I got it, and it's older now.

53 minutes ago, Christopher James Neff said:
So, PC gaming and emulation is what you do most of the time, more than anything else? Like Netflix binging etc.?

Mostly PC gaming nowadays. I don't really play emulated games much lately, and when I do it is on my MiSTer + CRT TV setup.

54 minutes ago, Christopher James Neff said:
Why are there so many for 1080p though? And what did you mean by not fitting well? They are literally being CUT out of something? As in literal and not a figure of speech?

Panel manufacturers make a giant panel and then cut smaller panels out of it, and they minimize the wastage by cutting out the sizes that utilize the most of that larger panel. Watch this video by Linus; he explains it, time stamped to the right part.
ChristopherNeff (Author) Posted April 16, 2022

12 minutes ago, Lordmonkus said:
Except it's not just about raw clock speed; instructions per clock cycle matter as well, plus the new CPUs can match that raw clock speed on top of having better IPC. Check the comparison link I posted earlier: it shows that the lower end 12th gen i5 outperforms your CPU even on single-threaded operations by a large margin.

And that's all stuff MAME will care about and can utilize? Maybe I need to contact MAMEDev again, show them this convo, and just ask them what CPU they think I should get.

13 minutes ago, Lordmonkus said:
Also, if you want to, you can just keep your current system as your emulation PC and build a new PC for modern gaming. It's not like any of your older hardware would be usable in an upgrade; you would need a new motherboard, RAM, CPU and GPU, and at that point you may as well just get a new power supply, case and hard drive/s, preferably an SSD.

Unfortunately, I only have room for one PC, and I would want the horsepower of the new PC for newer console emulation as well, and I don't want my games separated across 2 different PCs. I'd end up forgetting which PC has which ROMs and games. LOL. I'd also want to emulate on my new PC so I can use more advanced shaders. I want to try one called the CyberLab Mega Bezel Death To Pixels shader preset pack, or CRT-Royale at 4K resolution with integer scaling. Right now, I am using one called the CRT-NewPixie shader. NONE of my older hardware would be usable in an upgrade/new PC? Not even my 2 SSDs? At this point, would it be better to just get a whole new PC instead of trying to upgrade a bunch of stuff in this one? It's been repaired at least 3 times by now, with different hardware components dying and needing to be replaced. Why can't they all just consistently die at the same time like a console does? LOL

18 minutes ago, Lordmonkus said:
Because prior to the Ryzen generation, AMD CPUs were not as good as Intel's for emulation because of their IPC. Like I said, my AMD 8350 @ 4 GHz could only just barely run Mednafen Saturn smoothly. The Ryzen CPUs improved their IPC to match Intel's, and there is literally no difference between the 2 brands at this specific point in time; that may change in the future, of course.

Still doesn't explain how you knew I was just about to start shitting on AMD when I hadn't even done it yet. LOL. Also, is the AMD 8350 @ 4 GHz the new one that is equal to Intel, or the old one?

20 minutes ago, Lordmonkus said:
It was cheap when I got it, and it's older now.

Oh, you meant literally, as in price. When I hear the word "cheapo" I just automatically assume it means something that is shit quality.
Lordmonkus Posted April 16, 2022

11 minutes ago, Christopher James Neff said:
And that's all stuff MAME will care about and can utilize? Maybe I need to contact MAMEDev again, show them this convo, and just ask them what CPU they think I should get.

Go ahead and ask them.

12 minutes ago, Christopher James Neff said:
Also, is the AMD 8350 @ 4 GHz the new one that is equal to Intel, or the old one?

I knew because all your information is several years old and out of date, and at the time of your information AMD was known for being shit at emulation. No, the 8350 is very old at this point. I currently run an AMD Ryzen 2700X and it emulates everything I throw at it, even Wii U, PS3 and Switch.
ChristopherNeff (Author) Posted April 16, 2022

What's better, my Intel i7 or the AMD 8350? Would my 2 SSDs be able to be transferred to a new PC? Adding to whatever HDD or SSD the new one already comes with, of course. Also, at this point, would it be better to just get a whole new PC instead of trying to upgrade a bunch of stuff in this one? It's been repaired at least 3 times by now, with different hardware components dying and needing to be replaced. Is my current information really that old and out of date now?
Lordmonkus Posted April 16, 2022

18 minutes ago, Christopher James Neff said:
What's better, my Intel i7 or the AMD 8350?

Your Intel; it was a great CPU for its time and is still very capable.

19 minutes ago, Christopher James Neff said:
Would my 2 SSDs be able to be transferred to a new PC?

You could transfer them, yes.

19 minutes ago, Christopher James Neff said:
Also, at this point, would it be better to just get a whole new PC instead of trying to upgrade a bunch of stuff in this one?

Better to build a new one. Any new CPU would require a new motherboard and RAM, and you would also be getting a new GPU. At that point all that is left to buy is a new power supply and case. I said all of this earlier in the thread.

22 minutes ago, Christopher James Neff said:
Is my current information really that old and out of date now?

Your knowledge of CPUs is. All your information, based on what you have said in this thread, is certainly not up to date if you believe that clock speed alone matters and that AMD CPUs are shit.
ChristopherNeff (Author) Posted April 18, 2022 (edited)

On 4/16/2022 at 12:45 AM, Lordmonkus said:
Better to build a new one. Any new CPU would require a new motherboard and RAM, and you would also be getting a new GPU. At that point all that is left to buy is a new power supply and case. I said all of this earlier in the thread.

The reason I asked is because I was wondering: what if I just bought a new GPU, upped my RAM to 16 GB, and skipped everything else? Would that be good enough? Is it important that I also upgrade my CPU as well? And is there that much of a difference between DDR3 and DDR4 RAM? As in, if I upgraded my RAM right now to 16 GB, it would be 16 GB of DDR3. Is that very different from a new PC that will have 16 GB of DDR4?

On 4/16/2022 at 12:45 AM, Lordmonkus said:
At that point all that is left to buy is a new power supply and case.

Is that important to upgrade? Do I need a new case and PSU if I just upgrade my current PC? My current PSU is 600 W from ATNG.

Oh, I also wanted to ask: is the new UEFI BIOS with the EFI partitioning system stuff super important? Will that make a huge difference? The reason I ask is because my current PC is still set up under the legacy BIOS with the MBR partitioning scheme. It CAN do UEFI and EFI, but it acts weird. My local PC shop said that's because my generation of hardware barely supported that system; it was still new and in its infancy then, and it was never fully or properly supported by my current hardware. By the time the UEFI and EFI stuff became the new standard and was stable, we were several hardware generations ahead, which means my current hardware will never be updated to fully or properly support it, so I am stuck with legacy BIOS and MBR in order to get full stability and compatibility. I heard that all new hardware and PCs will automatically be set up under UEFI and EFI by default though.