
> @"maddoctor.2738" said:

> > @"coso.9173" said:

> > I wonder why people who don't even work for ANet have managed to do this, while the people who do work there haven't done something similar, if it improves the game substantially.

>

> Because the game (core and Heart of Thorns) still lists Windows XP SP3 as the minimum requirement. The developers have always been reluctant to upgrade the core game in any way, and that goes for graphics too. Now it's either because they have the hardware data and a lot of players are running this game on potatoes, or because the team doesn't want to mess with existing engine code and would rather pile new things on top of it, or because they simply don't see engine improvements as a worthwhile investment.

 

I tried to run this game on Vista last year for a friend who had an old gaming laptop, and it simply wouldn't run on that OS. I can't imagine it can run on XP. If someone has succeeded in doing this, I'd be fascinated to get more info. Sorry, I know this is a little off topic, but I really want to know if this is possible.


> @"phokus.8934" said:

> The game doesn't run DX12 with d912pxy. It still runs DX9, as dictated by the game engine. The only thing d912pxy does is translate some DX9 functions into DX12.

>

> This is merely semantics, but if you use this mod you're still running GW2 with DX9.

 

Incorrect semantics or otherwise, you're entitled to your opinion, though.


> @"Super Hayes.6890" said:

> > @"maddoctor.2738" said:

> > > @"coso.9173" said:

> > > I wonder why people who don't even work for ANet have managed to do this, while the people who do work there haven't done something similar, if it improves the game substantially.

> >

> > Because the game (core and Heart of Thorns) still lists Windows XP SP3 as the minimum requirement. The developers have always been reluctant to upgrade the core game in any way, and that goes for graphics too. Now it's either because they have the hardware data and a lot of players are running this game on potatoes, or because the team doesn't want to mess with existing engine code and would rather pile new things on top of it, or because they simply don't see engine improvements as a worthwhile investment.

>

> I tried to run this game on Vista last year for a friend who had an old gaming laptop, and it simply wouldn't run on that OS. I can't imagine it can run on XP. If someone has succeeded in doing this, I'd be fascinated to get more info. Sorry, I know this is a little off topic, but I really want to know if this is possible.

 

Does your friend have Path of Fire unlocked? The Windows XP SP3 requirements are for the core game and Heart of Thorns. Although with certain places, like Lion's Arch and the Heart of the Mists, it gets tricky because they use the updated visuals, so I'm not sure if the official minimum requirements still stand.

 

By the way, the Path of Fire minimum requirements list Windows 7 64-bit, although the visual update happened with Draconis Mons, which was part of Season 3.


> @"maddoctor.2738" said:

> > @"coso.9173" said:

> > I wonder why people who don't even work for ANet have managed to do this, while the people who do work there haven't done something similar, if it improves the game substantially.

>

> Because the game (core and Heart of Thorns) still lists Windows XP SP3 as the minimum requirement.

Well, that doesn't have anything to do with it; they could just switch between DX9 and DX12 like any other game does.

 

The investment of time and resources, though, that's what's holding it back. Simply put, there's probably no passionate engine coder left at ANet, and there's no will in management to push it through either.


All MMOs raise their requirements over time with new expacs, so why wouldn't GW2 do it?

People having old computers isn't really a good enough reason. They're limiting their own game and how much they can improve it just for old computers that get older every year.


> @"maddoctor.2738" said:

> > @"coso.9173" said:

> > I wonder why people who don't even work for ANet have managed to do this, while the people who do work there haven't done something similar, if it improves the game substantially.

>

> Because the game (core and Heart of Thorns) still lists Windows XP SP3 as the minimum requirement. The developers have always been reluctant to upgrade the core game in any way, and that goes for graphics too. Now it's either because they have the hardware data and a lot of players are running this game on potatoes, or because the team doesn't want to mess with existing engine code and would rather pile new things on top of it, or because they simply don't see engine improvements as a worthwhile investment.

 

At this point, trying to keep the game compatible with XP would be more work than just dropping support for it altogether. According to the Steam hardware survey, 96% of the OSes out there are non-XP anyway.

 

I mean, they dropped 32-bit support completely. Actually, now that I think about it, XP SP3 only came in a 32-bit flavor; the 64-bit version only went up to SP2. This means that by no longer supporting 32-bit, ANet indirectly no longer supports XP SP3.


> @"Mack.3045" said:

> > @"phokus.8934" said:

> > The game doesn't run DX12 with d912pxy. It still runs DX9, as dictated by the game engine. The only thing d912pxy does is translate some DX9 functions into DX12.

> >

> > This is merely semantics, but if you use this mod you're still running GW2 with DX9.

>

> Incorrect semantics or otherwise, you're entitled to your opinion, though.

What I said isn't an opinion but rather a fact, and that's not something you can argue with. It's weird that you've dug your heels in so deep on this, though.

 

 


> @"Dawdler.8521" said:

> > @"dandamanno.4136" said:

> > I mean, they dropped 32-bit support completely.

> Well, at least that was about time - the last 32-bit desktop CPU was 15 years ago, lol.

CPUs, sure, but we're talking about operating systems, not processors. The last Windows OS available in a 32-bit version was Windows 10, and they only decided to stop selling it and go 64-bit-only _this_ year.

 


d912pxy increased my FPS in Lion's Arch from 50-55 to 75-80; at various world bosses with A LOT of players, from 25-30 FPS to 40-45 FPS; and in WvW with 3 squads against each other, from 30-40 FPS to 50-60 FPS.

I'm using a kinda old i7-4770K boosted to 4.6 GHz and a GTX 1080 (not Ti), with 16 GB of RAM.

So d912pxy will work for most players.

 


> @"phokus.8934" said:

> > @"Mack.3045" said:

> > > @"phokus.8934" said:

> > > The game doesn't run DX12 with d912pxy. It still runs DX9, as dictated by the game engine. The only thing d912pxy does is translate some DX9 functions into DX12.

> > >

> > > This is merely semantics, but if you use this mod you're still running GW2 with DX9.

> >

> > Incorrect semantics or otherwise, you're entitled to your opinion, though.

> What I said isn't an opinion but rather a fact, and that's not something you can argue with. It's weird that you've dug your heels in so deep on this, though.

>

>

 

Nothing about it is weird; I get my facts straight from Megai, the developer, and from the user base, not from someone with a different opinion on the forums.

I'm not going to disregard a year's worth of Discord discussions with Megai and a few other C++ devs who've contributed to the project. As mentioned previously, one of the ANet GW2 programmers is in the Discord channel and contributes to the discussions too. It's all about where you get your facts from.

 

Implying "semantics" and that I'm "digging in my heels" does nothing to change my opinion, which is based on real, factual information. It's ironic you'd bring that up. Who should I believe... the developer, the contributing devs, and a GW2 programmer and engineer... or you? I'll go with the first group, thanks!

 

Yes, d912pxy is a DX9-to-DX12 wrapper and there's a translation layer in between, but for all GPU-intensive purposes the game is running in DX12 while it's in use. The game is being rendered with the DirectX 12 API.


> @"maddoctor.2738" said:

> > @"Super Hayes.6890" said:

> > > @"maddoctor.2738" said:

> > > > @"coso.9173" said:

> > > > I wonder why people who don't even work for ANet have managed to do this, while the people who do work there haven't done something similar, if it improves the game substantially.

> > >

> > > Because the game (core and Heart of Thorns) still lists Windows XP SP3 as the minimum requirement. The developers have always been reluctant to upgrade the core game in any way, and that goes for graphics too. Now it's either because they have the hardware data and a lot of players are running this game on potatoes, or because the team doesn't want to mess with existing engine code and would rather pile new things on top of it, or because they simply don't see engine improvements as a worthwhile investment.

> >

> > I tried to run this game on Vista last year for a friend who had an old gaming laptop, and it simply wouldn't run on that OS. I can't imagine it can run on XP. If someone has succeeded in doing this, I'd be fascinated to get more info. Sorry, I know this is a little off topic, but I really want to know if this is possible.

>

> Does your friend have Path of Fire unlocked? The Windows XP SP3 requirements are for the core game and Heart of Thorns. Although with certain places, like Lion's Arch and the Heart of the Mists, it gets tricky because they use the updated visuals, so I'm not sure if the official minimum requirements still stand.

>

> By the way, the Path of Fire minimum requirements list Windows 7 64-bit, although the visual update happened with Draconis Mons, which was part of Season 3.

 

I didn't think about that. He does have PoF so that makes sense.


> @"phokus.8934" said:

> The game doesn't run DX12 with d912pxy. It still runs DX9, as dictated by the game engine. The only thing d912pxy does is translate some DX9 functions into DX12.

>

> This is merely semantics, but if you use this mod you're still running GW2 with DX9.

 

I get what you're saying, but Mack is right: the game is fully rendered in DX12 while using d912pxy. That's what the initial debate between him and Ayrilana was about.


> @"phokus.8934" said:

> The game doesn't run DX12 with d912pxy. It still runs DX9, as dictated by the game engine. The only thing d912pxy does is translate some DX9 functions into DX12.

>

> This is merely semantics, but if you use this mod you're still running GW2 with DX9.

 

What you've said is mostly wrong and partially right. d912pxy is a DirectX 12 translation layer for Direct3D 9 that allows GW2 specifically to run in DirectX 12. I don't think the fact that d912pxy is a wrapper was in question; whether the game was being rendered in DX12 was. Using d912pxy amounts to an implementation of the DX12 runtime for GW2. You are not running GW2 on DX9 while using the wrapper.
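
To make the distinction concrete, here's a conceptual sketch of how a D3D9-to-D3D12 proxy works (hypothetical class and glue, not d912pxy's actual source): the game keeps calling the D3D9 interface it was built against, but the implementation behind that interface records D3D12 commands, so the GPU only ever sees DX12 work.

```cpp
#include <d3d9.h>
#include <d3d12.h>

// Conceptual proxy device: the real d912pxy implements the full
// IDirect3DDevice9 vtable; only one call is sketched here.
class ProxyDevice9 /* : public IDirect3DDevice9 */ {
public:
    // The game issues the D3D9 draw call it has always used...
    HRESULT DrawIndexedPrimitive(D3DPRIMITIVETYPE type, INT baseVertexIndex,
                                 UINT minVertexIndex, UINT numVertices,
                                 UINT startIndex, UINT primCount)
    {
        // ...and the wrapper re-issues it as a D3D12 command. The index count
        // here assumes D3DPT_TRIANGLELIST; a real layer handles every topology.
        const UINT indexCount = primCount * 3;
        cmdList_->DrawIndexedInstanced(indexCount, 1, startIndex,
                                       baseVertexIndex, 0);
        return D3D_OK;
    }

private:
    ID3D12GraphicsCommandList* cmdList_ = nullptr; // DX12 does the real work
};
```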


Thought I'd post some technical info here regarding how the D3D12 translation layer works with d912pxy.

 

Thank you to Megai, the developer of d912pxy, for this information **(source)**.

 

_I did not write the following material and explanation, nor am I a C++ programmer or developer; I'm merely posting it for reference for those interested._

 

If you would like to discuss this with Megai in more detail, or have questions regarding d912pxy, you are invited to join the d912pxy Discord channel here:

 

https://discord.com/invite/fY9KADf

 

**d912pxy and the D3D12 translation layer**

 

**What does this do?**

This translation layer provides the following high-level constructs (and more), which the GW2 engine's D3D9-style rendering is mapped onto in the pipeline.

 

**Resource binding**

The D3D12 resource binding model is quite different from D3D9 and prior. Rather than having a flat array of resources set on the pipeline which map 1:1 with shader registers, D3D12 takes a more flexible approach which is also closer to modern hardware. The translation layer takes care of figuring out which registers a shader needs, managing root signatures, populating descriptor heaps/tables, and setting up null descriptors for unbound resources.
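
As a rough illustration of the gap being bridged (simplified names, no error handling; illustrative only, not d912pxy's actual code): a D3D9-style `SetTexture(0, tex)` fills a flat slot that maps 1:1 to shader register `s0`, while under D3D12 the layer has to stage a descriptor into a shader-visible heap and bind a descriptor table through the root signature.

```cpp
#include <d3d12.h>

// Sketch of what a single D3D9-style "SetTexture(0, tex)" costs under D3D12.
// Assumes root parameter 0 is a descriptor table covering register t0.
void BindTextureSlot0(ID3D12Device* device,
                      ID3D12GraphicsCommandList* cmdList,
                      ID3D12DescriptorHeap* shaderVisibleHeap,   // CBV/SRV/UAV heap
                      D3D12_CPU_DESCRIPTOR_HANDLE srvForTexture, // staged SRV
                      UINT heapSlot)
{
    const UINT increment = device->GetDescriptorHandleIncrementSize(
        D3D12_DESCRIPTOR_HEAP_TYPE_CBV_SRV_UAV);

    // Copy the texture's SRV into the slot of the shader-visible heap that the
    // root signature's descriptor table will point at.
    D3D12_CPU_DESCRIPTOR_HANDLE dst =
        shaderVisibleHeap->GetCPUDescriptorHandleForHeapStart();
    dst.ptr += heapSlot * increment;
    device->CopyDescriptorsSimple(1, dst, srvForTexture,
                                  D3D12_DESCRIPTOR_HEAP_TYPE_CBV_SRV_UAV);

    // Point root parameter 0 (the descriptor table) at that slot.
    D3D12_GPU_DESCRIPTOR_HANDLE table =
        shaderVisibleHeap->GetGPUDescriptorHandleForHeapStart();
    table.ptr += heapSlot * increment;
    cmdList->SetGraphicsRootDescriptorTable(0, table);
}
```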

 

**Resource renaming**

D3D9 and older have a concept of DISCARD CPU access patterns, where the CPU populates a resource, instructs the GPU to read from it, and then immediately populates new contents without waiting for the GPU to read the old ones. This pattern is typically implemented via a pattern called "renaming", where new memory is allocated during the DISCARD operation, and all future references to that resource in the API will point to the new memory rather than the old. The translation layer provides a separation of a resource from its "identity," which enables cheap swapping of the underlying memory of a resource for that of another one without having to recreate views or rebind them. It also provides easy access to rename operations (allocate new memory with the same properties as the current, and swap their identities).
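
A minimal sketch of the renaming idea in isolation (hypothetical types, not the project's code): the buffer object the app holds keeps its identity, but a DISCARD lock swaps the allocation underneath it.

```cpp
#include <cstddef>
#include <memory>

// Stand-in for a GPU allocation; a real layer would suballocate upload memory.
struct Allocation {
    explicit Allocation(size_t n) : size(n), cpuPtr(::operator new(n)) {}
    ~Allocation() { ::operator delete(cpuPtr); }
    size_t size;
    void*  cpuPtr;
};

class RenamableBuffer {
public:
    explicit RenamableBuffer(size_t size)
        : current_(std::make_shared<Allocation>(size)) {}

    // D3D9-style Lock(..., D3DLOCK_DISCARD): hand the CPU fresh memory instead
    // of stalling until the GPU finishes reading the old contents. A real layer
    // would park the old shared_ptr on a deferred-release list keyed by fence.
    void* LockDiscard() {
        current_ = std::make_shared<Allocation>(current_->size);
        return current_->cpuPtr;
    }

    // The buffer's identity (this object) never changes, so views and bindings
    // that reference it keep working across renames.
    void* Data() const { return current_->cpuPtr; }

private:
    std::shared_ptr<Allocation> current_;
};
```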

 

**Resource suballocation, pooling, and deferred destruction**

D3D9-style apps can destroy objects immediately after instructing the GPU to do something with them. D3D12 requires applications to hold on to memory and GPU objects until the GPU has finished accessing them. Additionally, D3D9 apps suffer no penalty from allocating small resources (e.g. 16-byte buffers), where D3D12 apps must recognize that such small allocations are infeasible and should be suballocated from larger resources. Furthermore, constantly creating and destroying resources is a common pattern in D3D9, but in D3D12 this can quickly become expensive. The translation layer handles all of these abstractions seamlessly.
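
For example, a deferred-destruction queue might look like this sketch (assumed shape, not the project's real code): a release is parked with the fence value of the last submission that used the resource, and collected once the GPU has passed it.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <deque>
#include <utility>

using Microsoft::WRL::ComPtr;

class DeferredDeleter {
public:
    // Called where a D3D9 app would simply Release() a live resource.
    void Defer(ComPtr<ID3D12Resource> res, UINT64 lastUsedFenceValue) {
        pending_.emplace_back(lastUsedFenceValue, std::move(res));
    }

    // Called once per frame: drop everything the GPU has finished with.
    void Collect(ID3D12Fence* fence) {
        const UINT64 completed = fence->GetCompletedValue();
        while (!pending_.empty() && pending_.front().first <= completed)
            pending_.pop_front(); // ComPtr releases the resource here
    }

private:
    std::deque<std::pair<UINT64, ComPtr<ID3D12Resource>>> pending_;
};
```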

 

**Batching and threading**

Since D3D9 patterns generally require applications to record all graphics commands on a single thread, there are often other CPU cores sitting idle. To improve utilization, the translation layer provides a batching layer which can sit on top of the immediate context, moving the majority of work to a second thread so it can be parallelized. It also provides threadpool-based helpers for offloading PSO compilation to worker threads **(see the guide on using pre-compiled shader packs with d912pxy and setting the configurable "Read PSO Cache -1")**. Combining these means that compilations can be kicked off at draw time on the application thread, and only the batching thread needs to wait for them to be completed. Meanwhile, other PSO compilations are starting or completing, minimizing the wall-clock time spent compiling shaders.
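
A cut-down sketch of that draw-time kick-off (the `CreateGraphicsPipelineState` call is the real D3D12 API; the surrounding glue and names are assumptions): the application thread launches compilation and moves on, and only the batching thread blocks on the future when it actually needs the PSO.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <future>

using Microsoft::WRL::ComPtr;

// Application thread, at draw time: start compiling and return immediately.
// Note: the desc's embedded pointers (shader bytecode etc.) must stay alive
// until the task completes.
std::shared_future<ComPtr<ID3D12PipelineState>> CompilePsoAsync(
    ID3D12Device* device, D3D12_GRAPHICS_PIPELINE_STATE_DESC desc)
{
    return std::async(std::launch::async, [device, desc] {
        ComPtr<ID3D12PipelineState> pso;
        device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
        return pso;
    }).share();
}

// Batching thread, when replaying the recorded draw: wait only if this PSO
// is not ready yet; other compilations keep running in parallel meanwhile.
void SetPipeline(ID3D12GraphicsCommandList* cmdList,
                 const std::shared_future<ComPtr<ID3D12PipelineState>>& pso)
{
    cmdList->SetPipelineState(pso.get().Get()); // blocks until compiled
}
```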

 

**Residency management**

This layer incorporates the open-source residency management library to improve utilization on low-memory systems.
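
The explicit paging primitives that library is built around look roughly like this (`MakeResident`, `Evict`, and the budget query are real D3D12/DXGI calls; the trimming policy here is purely illustrative):

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>

// Illustrative budget check: page a cold resource out when over budget,
// keep it resident otherwise. A real residency manager tracks usage per
// object and evicts least-recently-used sets.
void TrimIfOverBudget(ID3D12Device* device, IDXGIAdapter3* adapter,
                      ID3D12Pageable* coldResource)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    ID3D12Pageable* objs[] = { coldResource };
    if (info.CurrentUsage > info.Budget)
        device->Evict(1, objs);        // page out a cold resource
    else
        device->MakeResident(1, objs); // keep hot resources resident
}
```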


> @"Mack.3045" said:

> Thought i'd post some technical info here regarding how the D3D12 transitional layer works with the d912pxy.

>

> Thank you to Megai the developer of the D912pxy for this information **(source)**.

>

> _I did not write the following material and explanation, nor am i a C++ programmer or developer, i'm merely posting it for reference for those interested._

>

> If you would like to discuss this with Megai in more detail, or have questions regarding the d912pxy, you are invited to join the d912pxy discord channel here

>

> https://discord.com/invite/fY9KADf

>

> **Dx912pxy and the D3D12 transitional layer.**

>

> **What does this do?**

> This translation layer provides the following high-level constructs (and more) for the GW2 engine to implement in the rendering pipeline.

>

> **Resource binding**

> The D3D12 resource binding model is quite different from D3D9 and prior. Rather than having a flat array of resources set on the pipeline which map 1:1 with shader registers, D3D12 takes a more flexible approach which is also closer to modern hardware. The translation layer takes care of figuring out which registers a shader needs, managing root signatures, populating descriptor heaps/tables, and setting up null descriptors for unbound resources.

>

> **Resource renaming**

> D3D9 and older have a concept of DISCARD CPU access patterns, where the CPU populates a resource, instructs the GPU to read from it, and then immediately populates new contents without waiting for the GPU to read the old ones. This pattern is typically implemented via a pattern called "renaming", where new memory is allocated during the DISCARD operation, and all future references to that resource in the API will point to the new memory rather than the old. The translation layer provides a separation of a resource from its "identity," which enables cheap swapping of the underlying memory of a resource for that of another one without having to recreate views or rebind them. It also provides easy access to rename operations (allocate new memory with the same properties as the current, and swap their identities).

>

> **Resource suballocation, pooling, and deferred destruction**

> D3D9-style apps can destroy objects immediately after instructing the GPU to do something with them. D3D12 requires applications to hold on to memory and GPU objects until the GPU has finished accessing them. Additionally, D3D9 apps suffer no penalty from allocating small resources (e.g. 16-byte buffers), where D3D12 apps must recognize that such small allocations are infeasible and should be suballocated from larger resources. Furthermore, constantly creating and destroying resources is a common pattern in D3D9, but in D3D12 this can quickly become expensive. The translation layer handles all of these abstractions seamlessly.

>

> **Batching and threading**

> Since D3D9 patterns generally require applications to record all graphics commands on a single thread, there are often other CPU cores that are idle. To improve utilization, the translation layer provides a batching layer which can sit on top of the immediate context, moving the majority of work to a second thread so it can be parallelized. It also provides threadpool-based helpers for offloading PSO compilation to worker threads **(see the guide on using pre-compiled shader packs with the d912pxy and setting the configurable "Read PSO Cache -1" )** Combining these means that compilations can be kicked off at draw-time on the application thread, and only the batching thread needs to wait for them to be completed. Meanwhile, other PSO compilations are starting or completing, minimizing the wall clock time spent compiling shaders.

>

> **Residency management**

> This layer incorporates the open-source residency management library to improve utilization on low-memory systems.

 

Enjoyed reading the technical info, thanks Mack.

