If you own an RTX-series card, how often do you find yourself using RTX?

If you just want to say "fuck Nvidia" or "it's proprietary bullshit", I'd politely request you to ignore this toot.



Even though the sample size is very low, it's interesting to see that RTX isn't used as often as I thought.

Is Nvidia pulling an Intel here? (referring to Linus Torvalds' statement on AVX-512 taking up part of the transistor budget)

@finlaydag33k I have a GTX card, and there are some features. I don't use any of them though, except for NVENC. Again, the Nvidia experience has been awful, so...

@finlaydag33k I missed the poll, but I sometimes use it. Main reason is that barely any of my games make any use of it.

I occasionally turn it on in Minecraft while messing around, but it drops performance hard on my 3060.

Gonna try out the Portal RTX upgrade that comes out this Oct/Nov.

They're all really just fun things to try out, but not worth the performance hit.

@LibreNyaa Yea, this is kind of what I expected.

I have quite literally no games that support it except Minecraft (although I don't use Bedrock, only Java - I do have a Bedrock license tho), but wanted to see whether this was just me or a bit more widespread.

Especially considering that it's basically built on proprietary tech rather than open tech (from what I know), it's kinda yikes that they market it so heavily when barely anyone can really use it.

@finlaydag33k That's the whole point too, it's just a marketing gimmick. It's cool tech, but they made it without the intent of putting any actual effort into it beyond the initial release.

I don't think they ever planned to actually support it properly.

@LibreNyaa Honestly, I think the main reason it failed is that devs don't wanna invest much time in it, 'cause nobody really uses it (due to the performance impact).

I think they just shouldn't have released RTX yet when the performance impacts were so high.
Especially not considering its proprietary nature.

@finlaydag33k @LibreNyaa

I've played *a lot* of bRTX Minecraft. ~1200 hours or so

The community that has formed around it is dying because Mojang/MS aren't concerned enough to fix game-breaking, legitimately epilepsy-inducing bugs in it, and refuse to let the code out so that independent devs can fix it themselves.

About 4 out of the 6 big packs that I know of converted to Java or quit working on bRTX altogether because it's so frustrating.

@justifier So even M$ doesn't see enough of a future in it to really keep updating it, yikes.


Pretty much.

It's not that they don't want to from the looks of it. They see the hype around it.

It's just that the hardware vendor they went with (Radeon) for this product cycle can't run it for the general population at 30 fps in UHD (Xbox users), so from the looks of it they didn't bother to allocate engineers to work on it.

Pure speculation ofc, but it seems like they're waiting for next-gen hardware to bother with it, if they bother at all.

@justifier Wait, how did we get from RTX on PC to Radeon in Xbox? :|


"I've played *a lot* of bRTX Minecraft. ~1200 hours or so"

We were talking about Bedrock Minecraft from my first post.

bRTX - Bedrock Minecraft with RTX on

Microsoft/Mojang haven't in the least been pushing their work to the public.

There was a leaked build that allowed Xbox users to use RTX on Bedrock, but it was removed almost immediately - and RTX was actually one of the marketing points they pushed when they debuted this gen's Xbox products.

Mojang's/Microsoft's interest in pushing/maintaining the feature will be dependent on it being able to run on Xbox.

If they can't get it to run there, they absolutely won't invest expensive engineer hours into it any more than necessary.

Since it can't run on this generation's hardware at a reasonable FPS in UHD, even at fairly low settings (10-12 render distance), it makes the hardware look bad.

Why would they add support for features that make their products look subpar?

@justifier No not really.
I think the reason they're not pushing DXR (or fixing the feature in RenderDragon to begin with) is that there just isn't much demand for it.
I doubt they'd go "yea no, our console can't run it, therefore we'll just not fix it for PC either"...

@justifier @finlaydag33k I haven't been able to use it at all; ever since like 6-7 updates ago, it's just said "resource fallback, not enough memory, there may be errors" or something like that when I turn RTX on.

I have a 3060 with 12 GB of VRAM and 32 GB of RAM. I hate that bug the most, especially with the lighting breaking in the water.


I get that as well with the same RAM capacity and a 6900 XT. When I saw it, it made me laugh.

The worst bug for me has been the cloud-popping bugs. I have a very bright screen, and clouds popping in and out hurts my eyes. And yes, I can and do turn the brightness down, but I always have to turn it back up for productivity, and when I forget...

Whole thing is a mess frankly, which is a shame because its potential is worlds above what's possible on Java right now.

