This was the dream from the Xbox 360 days, and now we're finally here.
Dunno, after upgrading to 1440p last year, 1080p ain't cutting it anymore. And 120fps guarantees Dirt 5 visuals. NO THANK YOU. I didn't spend $725 CAD on a console so my games could look like they're straight out of late 7th gen.
120fps is pointless for most console gamers... only a few TVs support it, and console gamers don't play on monitors. 60fps is a welcome addition, though. Hopefully every game at least has a 60fps option.
120fps is pointless for most console gamers... only a few TVs support it, and console gamers don't play on monitors. 60fps is a welcome addition, though. Hopefully every game at least has a 60fps option.
False. I am a ferocious cow, and I play on monitors only.
It's about fucking time consoles started catching up to PC. As long as you don't buy an XSS, you should be good for at least 1080p/60fps throughout the generation. However, some developers may choose to push the CPU with advanced simulations that would force the game to run at 30fps at any resolution. I don't see this becoming common, though. A lot more gamers are starting to see the value of a higher frame rate, and most developers will cater to them by offering at least a 60fps option at lower resolutions.
120fps is pointless for most console gamers... only a few TVs support it, and console gamers don't play on monitors. 60fps is a welcome addition, though. Hopefully every game at least has a 60fps option.
I remember playing SNES on an old CRT monitor back in 1996 ;-). And when I got an Xbox 360 around 2006, I used it exclusively with a 1440x900 monitor. I was off of television gaming long ago, and by this point I wouldn't be surprised if quite a few console gamers used monitors.
120fps is pointless for most console gamers... only a few TVs support it, and console gamers don't play on monitors. 60fps is a welcome addition, though. Hopefully every game at least has a 60fps option.
I remember playing SNES on an old CRT monitor back in 1996 ;-). And when I got an Xbox 360 around 2006, I used it exclusively with a 1440x900 monitor. I was off of television gaming long ago, and by this point I wouldn't be surprised if quite a few console gamers used monitors.
Overwhelmingly still televisions, though. So, no sense in aiming for 120fps. It would be nice to actually see a poll on the ratio of TV to monitor console gamers. On a forum that isn't half dead, I mean. A poll thread would be lucky to get to four pages here. My thread asking if people would use the next-gen consoles for 4K movies only got to a freaking two.
I've been here 18 years watching PC gamers whistle away as console gamers rejoice at finally getting something PC gamers have had for years and years and years already...
It's about fucking time consoles started catching up to PC. As long as you don't buy an XSS, you should be good for at least 1080p/60fps throughout the generation. However, some developers may choose to push the CPU with advanced simulations that would force the game to run at 30fps at any resolution. I don't see this becoming common, though. A lot more gamers are starting to see the value of a higher frame rate, and most developers will cater to them by offering at least a 60fps option at lower resolutions.
What do you mean, as long as you don't get an XSS?
Lemmings told me that was a pure uncut 1440p 1080p machine.😂
I'm just hoping 60fps is an option for most games going forward. I doubt it'll ever be standard, and I worry that once they leave the old consoles behind, devs will start pushing other stuff and we'll be back to square one.
I'm just hoping 60fps is an option for most games going forward. I doubt it'll ever be standard, and I worry that once they leave the old consoles behind, devs will start pushing other stuff and we'll be back to square one.
Not if they push 4K they won't be. And they are going to, because most people gaming are expected to be doing so on a 4K screen. You won't be sustaining 60fps throughout the next generation.
I'm just hoping 60fps is an option for most games going forward. I doubt it'll ever be standard, and I worry that once they leave the old consoles behind, devs will start pushing other stuff and we'll be back to square one.
Not if they push 4K they won't be. And they are going to, because most people gaming are expected to be doing so on a 4K screen. You won't be sustaining 60fps throughout the next generation.
If the game runs at 4K/30fps, then it can also run at 1440p/60fps, as long as it isn't CPU bottlenecked.
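A quick back-of-envelope check of that claim (the numbers below are illustrative, and they assume GPU frame time scales linearly with pixel count, which real renderers only approximate):

```python
# Rough check of the "4K/30 implies 1440p/60" claim, assuming the
# GPU-bound part of frame time scales linearly with pixels rendered.

def pixels(width, height):
    return width * height

# 4K renders 2.25x the pixels of 1440p
RATIO = pixels(3840, 2160) / pixels(2560, 1440)

budget_4k30 = 1000 / 30              # ~33.3 ms per frame at 4K/30
gpu_time_1440p = budget_4k30 / RATIO  # ~14.8 ms if purely pixel-bound

print(f"{RATIO:.2f}x pixels, ~{gpu_time_1440p:.1f} ms/frame at 1440p")
```

~14.8 ms fits inside the 16.7 ms budget that 60fps requires, which is why the claim holds, but only until the CPU side of the frame itself takes longer than 16.7 ms.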
Dirt 5 looks like it was made for a mobile device. Even PGR3 on the Xbox 360 looks better and has much better physics than this pile of dirt, no pun intended. Heck, even Dirt 3 looks better. PGR3, for those who aren't old enough, was an Xbox 360 launch title that actually did 30fps right. It had a good motion blur system that made the game always feel smooth, thanks also to a steady 30+fps and a good vsync implementation.
My point is, these days it's more about a spec sheet than actual development talent or innovation. 4K 120fps means NOTHING if the game lacks any form of art style, graphical fidelity, physics system, or innovative gameplay mechanics. Dirt 5 has no collision physics, bland environments, and cars that feel like a PS2 game. Forza Horizon 4 looks miles ahead and runs better on a GTX 1080, god bless Playground Games (many of them came from Bizarre, PGR3's developer). What has made most developers this lazy or untalented? Oh I know, it's the fact that most of the industry revolves around micro-transactions, skins, loot boxes, and a f*cking spec sheet. Talent and innovation aren't rewarded like before, that's why Rockstar has milked the shit out of GTA 5 for around seven years now (though it enabled them to make RDR2, god bless that game), that's why EA thrives, and that's why gaming has mostly stood still since 2007.
There are some innovative games and studios out there, such as Respawn Entertainment, which spawned (no pun intended) Titanfall 2 and Apex Legends (god bless those games), but even they use a custom version of an age-old graphics engine called Source. The Crysis remake is another laughing-stock example of prioritising 4K resolution and textures, lighting changes, and kool-aid color saturation over the physics of even a 2007 game. THEY DOWNGRADED A 2007 GAME AND THEN RESOLD IT, CAN YOU BELIEVE IT? No wonder, as they had to code all their gameplay ideas for the potato Jaguar CPU, which made its way into consoles because the last generation was all about 4K textures, higher resolution, and better graphics. And PC gamers, nobody cares if you can flick-shot someone better at 240fps in a competitive, energy-drink-fueled gaming competition; that shit has no home on consoles. 30fps with excellent vsync and motion blur is more than enough for single-player games, and a stable 60fps is more than enough for multiplayer console games.
The push for 4K 120fps has set back a whole generation of innovation for another 5 years. For graphics and gameplay to feel like a generational leap, you want a single-player game to push BOTH THE CPU AND GPU to their limits while still maintaining a steady framerate, not a game that suddenly runs at 240fps when you scale back some graphics. Scaling might work for GPU loads, but that does not go for CPUs. If you design gameplay for a 1440p/30fps single-player experience on an 8-core, 16-thread Ryzen monster CPU, a 5 GB/s monster SSD, and a 10+ TFLOP GPU with 16 GB of VRAM, then expect dynamic environments and AI to reach new heights (provided there's a god-blessed, talented development team, that is). But disappointingly, the coming 2 years of cross-gen are a waste of time, as games on XSX will essentially be 4K 120fps versions of what a Jaguar CPU can run. That is a sign of CPUs not being coded for and used to the fullest.
Lastly, thanks casuals and agenda-pushing hermits, your need for 480fps at 8K will set back gaming innovation for the coming 10 years, and thanks Red Bull-sipping, Cheeto-laced controller freaks, your willingness to buy into all these shady business practices makes producers pull the plug on innovative developers and forces the industry to create addictions rather than interesting games. I rest my case / end rant.
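The "scaling works for GPUs but not CPUs" point above can be sketched with a toy model: per-frame cost is roughly the larger of the CPU and GPU workloads, and only the GPU side shrinks with resolution (all numbers below are invented for illustration):

```python
# Toy model of why dropping resolution stops buying frame rate once the
# CPU becomes the bottleneck: only GPU work scales with pixel count.

def fps(cpu_ms, gpu_ms_at_4k, pixel_fraction):
    """Frames per second when rendering at pixel_fraction of 4K pixels."""
    frame_ms = max(cpu_ms, gpu_ms_at_4k * pixel_fraction)
    return 1000 / frame_ms

CPU_MS = 8.0        # simulation/AI cost, independent of resolution
GPU_MS_4K = 33.3    # hypothetical GPU cost at full 4K

print(fps(CPU_MS, GPU_MS_4K, 1.00))  # ~30 fps, GPU-bound
print(fps(CPU_MS, GPU_MS_4K, 0.25))  # ~120 fps, right at the CPU limit
print(fps(CPU_MS, GPU_MS_4K, 0.10))  # ~125 fps: the CPU is now the wall
```

Below a quarter of the 4K pixel count, the frame rate flatlines at the CPU's ~125 fps ceiling no matter how far the resolution drops, which is the rant's point in miniature: heavier simulation raises `CPU_MS` and caps frame rate at every resolution.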
120fps is pointless for most console gamers... only a few TVs support it, and console gamers don't play on monitors. 60fps is a welcome addition, though. Hopefully every game at least has a 60fps option.
Pretty much this.
1080p @ 60fps, or 4K @ 30fps.
The problem, of course, is that these are averages, so a good chunk of time is spent in the sub-30fps range when at 4K.
But yeah, I've been saying this for years; we made the jump to 4K (for purely marketing purposes) before we could even do 1080p right. I'm really worried about all this talk of 8K.
Resolution is easy; doing it right, and doing it effectively, is a whole other ballgame.
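The "these are averages" point above is easy to demonstrate with made-up frame times: a run that averages 30 FPS can still spend a large share of its wall-clock time below 30 FPS.

```python
# Why average FPS hides dips: one hitch frame drags a "30 FPS average"
# run well below 30 FPS for a big share of its wall-clock time.
# Frame times are invented for the example.

frame_times_ms = [25.0, 25.0, 25.0, 58.3]   # three quick frames, one hitch

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
slow_time = sum(t for t in frame_times_ms if t > 1000 / 30)
slow_share = slow_time / sum(frame_times_ms)

print(f"average: {avg_fps:.1f} fps")            # ~30 fps on paper
print(f"time below 30 fps: {slow_share:.0%}")   # ~44% of the run
```

This is why benchmarking outlets report 1%/0.1% lows alongside averages: the average alone can look fine while the experience stutters.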
I've been here 18 years watching PC gamers whistle away as console gamers rejoice at finally getting something PC gamers have had for years and years and years already...
That's how I feel haha.
Consoles are too busy trying to be big fish in a small pond; meanwhile, PC is out in the ocean doing its own thing.
You don't know if they're going to get to that point this gen. They could find ways to make the ray tracing less costly on the GPU, and compression tech could get better.
Why is everyone saying we've seen the next-gen jump with this first wave of games?
They never get 100% out of the system right out of the gate, and they might find ways to do it.
Look at the Xbox One: out of the gate the games were 720p instead of 1080p, and later on most of the games were 1080p.
I get what you're saying and it could be true, but we have not seen what these systems can do with this first wave of games.