• Must have Hardware Ray Tracing for newer games?

    From Trimblebracegirdle2@noreply@pugleaf.net.invalid to comp.sys.ibm.pc.games.action on Tue Dec 9 21:33:01 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    Maybe the gone-obsolescent thing which got me recently was my GPU.
    I was happy with my GTX 1080 Ti and thought it was OK for a few more years.
    Then out came "Indiana Jones and the Great Circle"
    and one or two other games, which I might want, with
    a must-have hardware ray-tracing requirement.

    I convinced myself that all games would shortly need that
    and made my 1st big hardware purchase in years ** RTX 5070 **
    a bargain at £450 (I MUST have a bargain price!)
    A change for me, as for many many years my GPUs have been
    used, 3-4-year-old, previously top-of-the-range cards.
    I'm now not certain that hardware ray tracing will
    be that essential?
    But the RTX 5070 ("Oooo! gasp, see it shine") upscaling features are nice.

    regards TrimbleBracegirdle @@@
    (\__/)
    (='.'=) This is Bunny. Copy and paste Bunny into your
    (")_(") signature to help him gain world domination.
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Rin Stowleigh@nospam@nowhere.com to comp.sys.ibm.pc.games.action on Tue Dec 9 18:30:18 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On Tue, 09 Dec 2025 21:33:01 +0000, Trimblebracegirdle2 <noreply@pugleaf.net.invalid> wrote:

    Maybe the gone-obsolescent thing which got me recently was my GPU.
    I was happy with my GTX 1080 Ti and thought it was OK for a few more years.
    Then out came "Indiana Jones and the Great Circle"
    and one or two other games, which I might want, with
    a must-have hardware ray-tracing requirement.

    I convinced myself that all games would shortly need that
    and made my 1st big hardware purchase in years ** RTX 5070 **
    a bargain at £450 (I MUST have a bargain price!)
    A change for me, as for many many years my GPUs have been
    used, 3-4-year-old, previously top-of-the-range cards.
    I'm now not certain that hardware ray tracing will
    be that essential?
    But the RTX 5070 ("Oooo! gasp, see it shine") upscaling features are nice.

    regards TrimbleBracegirdle @@@
    (\__/)
    (='.'=) This is Bunny. Copy and paste Bunny into your
    (")_(") signature to help him gain world domination.

    Ray tracing is badly overhyped; I typically don't even turn it on
    because in most games it's not that noticeable or relevant to
    providing a sense of immersion.

    However, buying a card that supports it is not a bad idea, because the
    same card will generally give you better performance at higher
    resolutions, smoother framerates, and more of the other options that
    do matter. It's just that you don't have to use the extra horsepower
    for ray tracing.

    The newer cards do frame generation, which as far as I can tell is a
    technology for folks who want to play at 4K or whatever in
    single-player games (frame gen would be a bad idea if you want to play
    well, as in competitively) but don't want to spend a fortune and are
    willing to tolerate increased input lag. That's not for me: I don't
    need 4K or want to be at that point on the treadmill, and I'm
    certainly not willing to put up with increased input lag. 1440p is
    increasingly the new norm.
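    A rough way to see where that extra input lag comes from: with 2x
    interpolation the card has to finish real frame N+1, then hold it
    back while the generated in-between frame is displayed first. A toy
    Python sketch of that timing; the numbers are illustrative
    assumptions, not measurements of any real GPU:

        # Toy latency model for 2x frame interpolation ("frame gen").
        # All numbers are made-up illustrative assumptions.
        render_ms = 20.0                  # render time per real frame (50 fps)
        display_interval = render_ms / 2  # doubled output cadence (100 fps shown)

        # Without frame gen, a frame reflecting your input appears as
        # soon as it finishes rendering.
        lag_native = render_ms

        # With interpolation, real frame N+1 must finish rendering, then
        # wait while the generated (N, N+1) blend is scanned out first.
        lag_framegen = render_ms + display_interval

        print(f"native:    ~{lag_native:.0f} ms input-to-display")
        print(f"frame gen: ~{lag_framegen:.0f} ms input-to-display, twice the frames")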
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From rms@rmsmoo@moomoo.net to comp.sys.ibm.pc.games.action on Tue Dec 9 17:13:27 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    and made my 1st big hardware purchase in years ** RTX 5070 **
    a bargain at £450 (I MUST have a bargain price!)

    IMHO this is a fine purchase that will serve you well for years.
    Presumably you have a mobo+CPU that's a couple of gens old or newer to
    feed it well, and Indiana Jones would be a great showcase game! As Rin
    says, the Frame Gen feature is especially important.

    rms

    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Rin Stowleigh@nospam@nowhere.com to comp.sys.ibm.pc.games.action on Tue Dec 9 20:40:03 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On Tue, 9 Dec 2025 17:13:27 -0700, "rms" <rmsmoo@moomoo.net> wrote:

    As Rin says, the
    Frame Gen feature is especially important

    unnn.... :) Yes it is, if you want to run at higher resolutions/frame
    rates than the rig is otherwise capable of, and the game's sense of
    fun doesn't depend on input-lag-sensitive play. Most folks in this
    newsgroup (and other use cases, like console games) don't care as much
    about the super-responsiveness of low input lag, and will get more
    enjoyment out of high framerates/detail at high resolutions... I
    think that's where the technology pays off.

    Even still, I think the 5070 will be a nice card for most folks. I
    think it's roughly comparable in performance to what I'm running now
    (4080S)... a little less "native" performance traded off for a bit
    more future-proofing with regard to features like frame gen and
    overall support.

    I bought this rig last year and the 4080S still rips through
    everything at 1440p, so I'd think mainstream gamers could easily get
    6+ years out of a 5070 as long as the CPU isn't bottlenecking it.
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Justisaur@justisaur@yahoo.com to comp.sys.ibm.pc.games.action on Wed Dec 10 08:34:00 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On 12/9/2025 3:30 PM, Rin Stowleigh wrote:
    On Tue, 09 Dec 2025 21:33:01 +0000, Trimblebracegirdle2 <noreply@pugleaf.net.invalid> wrote:

    The newer cards do frame generation, which as far as I can tell is a
    technology for folks who want to play at 4K or whatever in
    single-player games (frame gen would be a bad idea if you want to play
    well, as in competitively) but don't want to spend a fortune and are
    willing to tolerate increased input lag. That's not for me: I don't
    need 4K or want to be at that point on the treadmill, and I'm
    certainly not willing to put up with increased input lag. 1440p is
    increasingly the new norm.

    I'm hearing newer AAA games are bloated crapware that you need frame
    generation for, even if you aren't running 4K, just to get decent
    framerates. :(

    I play a lot of action games (Souls-likes) where input lag is
    important, though in PvP it's probably washed out by network latency
    anyway. There's no way I'm using frame generation. It sounds like
    it's the way of the future, though.

    If it doesn't all end up just playing on tablets anyway.
    --
    -Justisaur

    ø-ø
    (\_/)\
    `-'\ `--.___,
    ¶¬'\( ,_.-'
    \\
    ^'
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Rin Stowleigh@nospam@nowhere.com to comp.sys.ibm.pc.games.action on Wed Dec 10 11:38:46 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On Wed, 10 Dec 2025 08:34:00 -0800, Justisaur <justisaur@yahoo.com>
    wrote:

    On 12/9/2025 3:30 PM, Rin Stowleigh wrote:
    On Tue, 09 Dec 2025 21:33:01 +0000, Trimblebracegirdle2
    <noreply@pugleaf.net.invalid> wrote:

    The newer cards do frame generation, which as far as I can tell is a
    technology for folks who want to play at 4K or whatever in
    single-player games (frame gen would be a bad idea if you want to play
    well, as in competitively) but don't want to spend a fortune and are
    willing to tolerate increased input lag. That's not for me: I don't
    need 4K or want to be at that point on the treadmill, and I'm
    certainly not willing to put up with increased input lag. 1440p is
    increasingly the new norm.

    I'm hearing newer AAA games are bloated crapware that you need frame
    generation for, even if you aren't running 4K, just to get decent
    framerates. :(

    Plus, they are designed for 50+ hours of player engagement (2.5 hours
    of which involve actual control; the rest is watching cutscenes or
    assisted animations). So it seems like frame gen is geared more toward
    the movie-watching variety of gamer rather than the participating
    type.
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Spalls Hurgenson@spallshurgenson@gmail.com to comp.sys.ibm.pc.games.action on Wed Dec 10 12:30:36 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On Tue, 09 Dec 2025 21:33:01 +0000, Trimblebracegirdle2 <noreply@pugleaf.net.invalid> wrote:

    Maybe the gone-obsolescent thing which got me recently was my GPU.
    I was happy with my GTX 1080 Ti and thought it was OK for a few more years.
    Then out came "Indiana Jones and the Great Circle"
    and one or two other games, which I might want, with
    a must-have hardware ray-tracing requirement.



    There are a few games that /require/ ray-tracing, but these are -as
    yet- the exception. Eventually they will become the norm, but I don't
    think it's going to happen any time soon. After all, a huge chunk of
    the gaming market still plays on /laptops/, and few publishers are
    going to willingly exclude so many people.

    I'll be honest; in the games I've tried it, I have a hard time
    noticing the difference between ray-traced lighting and the baked-in
    effects. I /can/ see it, but it's not a difference that really jumps
    out at me when I'm actually in the game. That's not to say ray-tracing
    is worthless; it gives the developers a lot more options (for
    instance, free reflections!) but it isn't a must-have effect.
    Certainly it has yet to justify the performance hit it creates. It's
    one of those technologies that will, in the long run, prove its
    worth... but we're still in the early days. I think developers are
    going to be relying on old-school methods for a long time to come.
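
    In a nutshell, both approaches answer the same question, "can this
    point see the light?"; baked lighting answers it once at build time,
    ray tracing answers it live every frame. A deliberately tiny Python
    sketch of that idea (one made-up light, one made-up occluder, nothing
    like a real engine's API):

        # Baked vs. ray-traced lighting, reduced to one yes/no question:
        # "can this point see the light?" One point light, one spherical
        # occluder; a real engine asks this millions of times per frame.
        LIGHT = (0.0, 5.0, 0.0)
        SPHERE_C, SPHERE_R = (0.0, 2.5, 0.0), 1.0

        def shadow_ray_blocked(p):
            # Does the segment p -> LIGHT hit the occluder sphere?
            ab = tuple(LIGHT[i] - p[i] for i in range(3))
            ac = tuple(SPHERE_C[i] - p[i] for i in range(3))
            ab2 = sum(v * v for v in ab)
            t = max(0.0, min(1.0, sum(ac[i] * ab[i] for i in range(3)) / ab2))
            closest = tuple(p[i] + t * ab[i] for i in range(3))
            d2 = sum((closest[i] - SPHERE_C[i]) ** 2 for i in range(3))
            return d2 <= SPHERE_R * SPHERE_R

        # "Baked": answer the question offline for fixed sample points
        # and ship the answers. Cheap lookup at runtime, but frozen:
        # move the light or the occluder and the lightmap is wrong.
        LIGHTMAP = {x: 0.0 if shadow_ray_blocked((x, 0.0, 0.0)) else 1.0
                    for x in range(-5, 6)}

        def shade_baked(p):
            return LIGHTMAP[round(p[0])]

        # "Ray-traced": answer it live every frame. Real work per pixel,
        # but everything can move -- hence dynamic shadows, and the
        # "free reflections" are the same trick with a bounced ray.
        def shade_raytraced(p):
            return 0.0 if shadow_ray_blocked(p) else 1.0

        print(shade_baked((0.0, 0.0, 0.0)), shade_raytraced((4.0, 0.0, 0.0)))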

    Which isn't to say that if you're building a new PC you shouldn't get
    a GPU that supports ray-tracing. You build for the future, after all.
    But unless you're running really outdated hardware, it's also not
    anything I'd rush out to upgrade just for ray-tracing capabilities.

    It also very much depends on what you use your GPU for. If you're big
    into triple-A super-blockbuster games (or hardware tech demos) you'll
    more likely want a newer GPU. If you are gaming at 4K resolutions,
    even a vaunted 1080 Ti starts chugging. But if you're more into
    smaller indie games, or still game at lower resolutions*, then a lot
    of that extra performance is wasted.


    That said...


    Prices for computer electronics are going to go up dramatically. AI
    has already hogged all the GPUs (and there's a non-zero possibility
    Nvidia might do a Micron and decide to give up the consumer market
    entirely), and skyrocketing RAM prices are going to make GPUs even
    more expensive. So if you /are/ in the market, buy sooner rather than
    later.





    ----
    * which aren't going away anytime soon. Remember, the Steam Deck is
    still only pushing 1280×800.


    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Justisaur@justisaur@yahoo.com to comp.sys.ibm.pc.games.action on Thu Dec 11 09:57:57 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On 12/10/2025 9:30 AM, Spalls Hurgenson wrote:
    On Tue, 09 Dec 2025 21:33:01 +0000, Trimblebracegirdle2 <noreply@pugleaf.net.invalid> wrote:

    Maybe the gone-obsolescent thing which got me recently was my GPU.
    I was happy with my GTX 1080 Ti and thought it was OK for a few more years.
    Then out came "Indiana Jones and the Great Circle"
    and one or two other games, which I might want, with
    a must-have hardware ray-tracing requirement.



    There are a few games that /require/ ray-tracing, but these are -as
    yet- the exception. Eventually they will become the norm, but I don't
    think it's going to happen any time soon. After all, a huge chunk of
    the gaming market still plays on /laptops/, and few publishers are
    going to willingly exclude so many people.

    I'll be honest; in the games I've tried it, I have a hard time
    noticing the difference between ray-traced lighting and the baked-in
    effects. I /can/ see it, but it's not a difference that really jumps
    out at me when I'm actually in the game. That's not to say ray-tracing
    is worthless; it gives the developers a lot more options (for
    instance, free reflections!) but it isn't a must-have effect.
    Certainly it has yet to justify the performance hit it creates. It's
    one of those technologies that will, in the long run, prove its
    worth... but we're still in the early days. I think developers are
    going to be relying on old-school methods for a long time to come.

    I notice it looks slightly better in ER (like 5%?), but really, if it
    were causing deep frame dips I'd turn it off and never look back. They
    added it after the game was complete, though.

    Cyberpunk 2077 is supposedly the poster child for ray tracing. I
    don't play without it, and it looks maybe 10% better: most noticeable
    on water and holograms, with pretty much no effect on people. It
    possibly makes the general distant background look worse, especially
    when driving, as it looks like greenscreen effects against the car.

    Definitely not something I'd upgrade just for. Much like the AI
    framegen.

    My son's been talking about getting a 5080 for Christmas so '77 looks
    better with some crazy 'photo-real' mods. I tried to keep from
    laughing at him, and carefully explained that's way beyond our budget
    and I wasn't going to be spending that kind of money on a card that's
    only maybe good for that one thing.
    --
    -Justisaur

    ø-ø
    (\_/)\
    `-'\ `--.___,
    ¶¬'\( ,_.-'
    \\
    ^'
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Spalls Hurgenson@spallshurgenson@gmail.com to comp.sys.ibm.pc.games.action on Fri Dec 12 10:24:55 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On Thu, 11 Dec 2025 09:57:57 -0800, Justisaur <justisaur@yahoo.com>
    wrote:



    Cyberpunk 2077 is supposedly the poster child for ray tracing. I
    don't play without it, and it looks maybe 10% better: most noticeable
    on water and holograms, with pretty much no effect on people. It
    possibly makes the general distant background look worse, especially
    when driving, as it looks like greenscreen effects against the car.


    Cyberpunk 2077 surprised me with its ray-tracing. Not so much with how
    much better it made things look (I'd agree with that '10%
    improvement') but with how little impact it had on performance. I
    always keep it on, but I'd be just as happy with the game if it were
    disabled. It makes that little difference.

    Okay, sure, ray-tracing makes the screenshots look nicer, but during
    gameplay? When things are whizzing past me at 40mph or people are
    shooting guns at me? It's practically invisible.

    I have faith in the technology: that in time it will become even more
    powerful, that developers will become more skilled at implementing
    its capabilities, and that there will then be a distinct difference
    between games with and without ray-tracing.

    But right now? It's a high-end feature that just isn't necessary for
    anyone except XTR3m3 G4m3rZ. If you're buying new (and have the cash
    to spare), sure, grab an RTX card and have fun. But if you've already
    got a decent PC, there's no need to upgrade yet, and if money is a
    concern, you probably won't get value on the dollar by dropping an
    extra $500 just for that feature.

    Maybe with the next generation of hardware (assuming we aren't all on
    thin clients, streaming everything), but for now? Save your dosh.


    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From candycanearter07@candycanearter07@candycanearter07.nomail.afraid to comp.sys.ibm.pc.games.action on Fri Dec 12 21:50:07 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    Justisaur <justisaur@yahoo.com> wrote at 17:57 this Thursday (GMT):
    On 12/10/2025 9:30 AM, Spalls Hurgenson wrote:
    On Tue, 09 Dec 2025 21:33:01 +0000, Trimblebracegirdle2
    <noreply@pugleaf.net.invalid> wrote:

    Maybe the gone-obsolescent thing which got me recently was my GPU.
    I was happy with my GTX 1080 Ti and thought it was OK for a few more years.
    Then out came "Indiana Jones and the Great Circle"
    and one or two other games, which I might want, with
    a must-have hardware ray-tracing requirement.



    There are a few games that /require/ ray-tracing, but these are -as
    yet- the exception. Eventually they will become the norm, but I don't
    think it's going to happen any time soon. After all, a huge chunk of
    the gaming market still plays on /laptops/, and few publishers are
    going to willingly exclude so many people.

    I'll be honest; in the games I've tried it, I have a hard time
    noticing the difference between ray-traced lighting and the baked-in
    effects. I /can/ see it, but it's not a difference that really jumps
    out at me when I'm actually in the game. That's not to say ray-tracing
    is worthless; it gives the developers a lot more options (for
    instance, free reflections!) but it isn't a must-have effect.
    Certainly it has yet to justify the performance hit it creates. It's
    one of those technologies that will, in the long run, prove its
    worth... but we're still in the early days. I think developers are
    going to be relying on old-school methods for a long time to come.

    I notice it looks slightly better in ER (like 5%?), but really, if it
    were causing deep frame dips I'd turn it off and never look back. They
    added it after the game was complete, though.

    Cyberpunk 2077 is supposedly the poster child for ray tracing. I
    don't play without it, and it looks maybe 10% better: most noticeable
    on water and holograms, with pretty much no effect on people. It
    possibly makes the general distant background look worse, especially
    when driving, as it looks like greenscreen effects against the car.

    Definitely not something I'd upgrade just for. Much like the AI
    framegen.

    unfortunately, the new stuff will probably be more expensive, to
    justify it...

    My son's been talking about getting a 5080 for Christmas so '77 looks
    better with some crazy 'photo-real' mods. I tried to keep from
    laughing at him, and carefully explained that's way beyond our budget
    and I wasn't going to be spending that kind of money on a card that's
    only maybe good for that one thing.


    I REALLY do not get the whole push for "photorealism", honestly. Like,
    sure, it looks like the real world or whatever, but I play games to
    have fun, not stare at the landscape.

    also i way prefer stylization and super cute fantasy creature stuff
    anyways
    user <candycane> is generated from /dev/urandom
    --- Synchronet 3.21a-Linux NewsLink 1.2
  • From Justisaur@justisaur@yahoo.com to comp.sys.ibm.pc.games.action on Fri Dec 12 14:42:04 2025
    From Newsgroup: comp.sys.ibm.pc.games.action

    On 12/12/2025 7:24 AM, Spalls Hurgenson wrote:
    On Thu, 11 Dec 2025 09:57:57 -0800, Justisaur <justisaur@yahoo.com>
    wrote:



    Cyberpunk 2077 is supposedly the poster child for ray tracing. I
    don't play without it, and it looks maybe 10% better: most noticeable
    on water and holograms, with pretty much no effect on people. It
    possibly makes the general distant background look worse, especially
    when driving, as it looks like greenscreen effects against the car.


    Cyberpunk 2077 surprised me with its ray-tracing. Not so much with how
    much better it made things look (I'd agree with that '10%
    improvement') but with how little impact it had on performance. I
    always keep it on, but I'd be just as happy with the game if it were
    disabled. It makes that little difference.

    Okay, sure, ray-tracing makes the screenshots look nicer, but during
    gameplay? When things are whizzing past me at 40mph or people are
    shooting guns at me? It's practically invisible.

    I have faith in the technology: that in time it will become even more
    powerful, that developers will become more skilled at implementing
    its capabilities, and that there will then be a distinct difference
    between games with and without ray-tracing.

    But right now? It's a high-end feature that just isn't necessary for
    anyone except XTR3m3 G4m3rZ. If you're buying new (and have the cash
    to spare), sure, grab an RTX card and have fun. But if you've already
    got a decent PC, there's no need to upgrade yet, and if money is a
    concern, you probably won't get value on the dollar by dropping an
    extra $500 just for that feature.

    Maybe with the next generation of hardware (assuming we aren't all on
    thin clients, streaming everything), but for now? Save your dosh.

    I may have to backtrack a bit.

    I've been playing Control, and the graphics have been bugging me, so
    I've been fiddling around with the settings. For some reason, like
    many games, it defaults to medium or low. I actually can't run on
    ultra even at 1080p, though it's more a VRAM issue, as some of the
    textures don't load. So I'm running it on "only" high.

    It was feeling very jerky, so I turned on ray tracing, vertical sync,
    and DLSS just to see what each would do. Vertical sync and DLSS each
    made the jerkiness noticeably better, and together they made it feel
    almost buttery. I was surprised DLSS did anything, as I'm running at
    native resolution. I did once notice some weird interpretations of
    letters on a sign while it was doing its thing for a moment, which at
    first I thought was supposed to be part of the weird occult/SCP stuff
    going on.

    Ray tracing is pretty noticeable in the highlights. It's perhaps more
    that everything doesn't look very good without it, as if they just
    said, "who cares if it looks shitty for the plebs." I'd rate it about
    20% better looking.

    I can't get HDR working, though; it says I don't have DirectX 12, but
    I do, I checked. I've never had that problem before. Ray tracing
    might be much less noticeable were HDR on; it may be compensating for
    the lack of it.
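
    For anyone else who hits that DirectX 12 complaint, one quick,
    Windows-only sanity check is whether the OS-level DX12 runtime DLL
    even loads. A small Python sketch; note it only proves the runtime
    exists, not that the GPU supports the feature level the game wants:

        # Windows-only: check whether the DirectX 12 runtime DLL is present.
        # This says nothing about GPU feature-level support, which may be
        # what a game's "no DirectX 12" error is actually complaining about.
        import ctypes

        try:
            ctypes.WinDLL("d3d12.dll")
            print("d3d12.dll loaded: DX12 runtime is installed")
        except OSError:
            print("d3d12.dll missing: no DX12 runtime on this install")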

    Ultimately I think it's just that I have more of a dislike of the
    'fog' or 'dust' that seems to be everywhere.
    --
    -Justisaur

    ø-ø
    (\_/)\
    `-'\ `--.___,
    ¶¬'\( ,_.-'
    \\
    ^'
    --- Synchronet 3.21a-Linux NewsLink 1.2