
When will this game update its engine to DX12?



> @"TinkTinkPOOF.9201" said:

> > @"Emberstone.2904" said:

> > > @"TinkTinkPOOF.9201" said:

> > > > @"Trise.2865" said:

> > > > Do you have ANY idea what it will take to rewrite the entire game engine?

> > >

> > > What does rewriting the game engine and the API it uses have to do with each other?

> > >

> > > People seem not to understand that DX12 is an API; it is NOT a game engine, nor does changing the API a game engine uses mean starting over. Some game engines support many APIs. An API is an interface; the game engine is the functionality, which can be made up of all sorts of assets, including tools used by the developer to speed up creation of the game, such as authoring tools, model editors, importers, etc. Most game engines include an API, but the API does not define the game engine and can be changed.

> > >

> > > The API standardizes how the engine interacts with other software and hardware so each side knows what to expect; direct hardware access is also forbidden by most OSes, such as Windows, Linux, and macOS. However, each piece of hardware is different. AMD and NVIDIA are not even the same kind of architecture, and that doesn't include proprietary hardware designs such as RT cores for AI or ray-tracing acceleration. Those vendor-specific drivers would never work without a standard API such as DX12.

> >

> > Yes, the API can be changed, but let's spin this a little bit differently.

> >

> > Let's say you have a program written in C++, but you want to port it to Java. Most programming languages offer the exact same set of features at a base level (various kinds of loops, function libraries with a plethora of math/memory/etc. operations figured out for you, etc.). Even if you're only using basic libraries containing functions that can be, more or less, found across many different languages (like square roots), a program with 10,000 lines of code in total will still take a pretty large amount of time to make the like-for-like translation, and then do all the necessary R&D to ensure it works as it did before. This isn't a perfect example, but it should get the point across.

> >

> > Now, I'm pretty sure a dev said somewhere that GW2 has more than 10,000 lines of code, and surely a game of this scale uses a lot of custom-made stuff in its graphics pipeline rather than canned DX9 functions in order to get the game to work the way they want it to. Making that translation on custom code on a scale this large likely won't happen, even if it's just an API change. Sure they wouldn't be changing the core logic of the game, but every graphics call is going to have to be altered and that's still no small feat. DX12 is markedly different than DX9.

> >

> > So, even if you're just translating GW2's code from one form to another, there's still a boatload of work to do to get it all done and ensure it works. And it can't just be a simple translation from DX9 to 12 since there are some major differences that have to be accounted for.

> >

> > Don't get me wrong, I would absolutely love to see this game get a big engine update. Dipping to 20 FPS unless I lower shadows and character models in Dragon's Stand is not fun, especially when I have an overclocked 8700K and GTX 1070. My GPU is barely utilized even at 1440p in these scenarios since the CPU can't keep up. But like I said, because of the time investment a drastic update to the graphics pipeline likely isn't going to happen.

> >

> > But I'm praying that it does with you.

>

> C++ and Java would be closer to the engine, as both already have their own APIs defined in the SDK. Java and C++ are also very different animals: Java assumes a full-fledged OS, while C++ could be used to program the OS itself. One assumes an OS and an application of some sort as the output, while the other makes almost no assumptions about anything. C++ can be used for almost anything, and as such its standard library assumes almost nothing about what it will have available and doesn't depend on such features.

>

> In the case of GW2 we are talking about porting it to DX12, not rewriting the whole engine as your example above would require. I'm not saying it's as easy as some people assume; however, people seem to assume either that it's the click of a button or that it requires a ground-up rebuild of the engine, when it's really closer to the middle. As has already been stated, a single unpaid person has written a wrapper for GW2 that delivers many of DX12's performance benefits. So what you seem to be saying is that ANet's whole dev team is less capable than this single unpaid programmer?

 

... you completely missed the point of what I said. It was a metaphor about like-for-like translation of an application, not an actual insinuation that they should port anything to another language. The languages in the metaphor were also arbitrary. I know C++ and Java are extremely different animals.

 

Also, I'm sure ANet has taken a look at d912pxy. Given the traction it's getting, it would be pretty unwise to ignore it, if only from a security perspective. If it were that easy to get a huge performance boost, something would probably have been implemented in the game by now. People on these forums always talk like the devs are clueless, but when it comes to Guild Wars, they know more than any of us. Anyway, if you actually do some reading on d912pxy, it still has a multitude of stability issues (random crashes, system hangs, etc., depending on your luck and hardware), and it also doesn't work well with GCN-based AMD graphics cards, last I knew, according to some posts on the GitHub page. It's a good solution for many people, but that particular implementation is far from a perfect one.

 

Anet probably hasn't done anything about it because it's going to take a lot more work than that one lone programmer has accomplished to ensure any wrapper or DX12 implementation in GW2 works properly, without any outstanding issues or hardware incompatibilities on setups that should run it fine (see: RX 580s, etc.). Things in-house have to be held to a much higher standard than the work of a random guy on Reddit.



> @"Emberstone.2904" said:

> > @"TinkTinkPOOF.9201" said:

> > > @"Emberstone.2904" said:

> > > > @"TinkTinkPOOF.9201" said:

> > > > > @"Trise.2865" said:

> > > > > Do you have ANY idea what it will take to rewrite the entire game engine?

> > > >

> > > > What does rewriting the game engine and the API it uses have to do with each other?

> > > >

> > > > People seem not to understand that DX12 is an API; it is NOT a game engine, nor does changing the API a game engine uses mean starting over. Some game engines support many APIs. An API is an interface; the game engine is the functionality, which can be made up of all sorts of assets, including tools used by the developer to speed up creation of the game, such as authoring tools, model editors, importers, etc. Most game engines include an API, but the API does not define the game engine and can be changed.

> > > >

> > > > The API standardizes how the engine interacts with other software and hardware so each side knows what to expect; direct hardware access is also forbidden by most OSes, such as Windows, Linux, and macOS. However, each piece of hardware is different. AMD and NVIDIA are not even the same kind of architecture, and that doesn't include proprietary hardware designs such as RT cores for AI or ray-tracing acceleration. Those vendor-specific drivers would never work without a standard API such as DX12.

> > >

> > > Yes, the API can be changed, but let's spin this a little bit differently.

> > >

> > > Let's say you have a program written in C++, but you want to port it to Java. Most programming languages offer the exact same set of features at a base level (various kinds of loops, function libraries with a plethora of math/memory/etc. operations figured out for you, etc.). Even if you're only using basic libraries containing functions that can be, more or less, found across many different languages (like square roots), a program with 10,000 lines of code in total will still take a pretty large amount of time to make the like-for-like translation, and then do all the necessary R&D to ensure it works as it did before. This isn't a perfect example, but it should get the point across.

> > >

> > > Now, I'm pretty sure a dev said somewhere that GW2 has more than 10,000 lines of code, and surely a game of this scale uses a lot of custom-made stuff in its graphics pipeline rather than canned DX9 functions in order to get the game to work the way they want it to. Making that translation on custom code on a scale this large likely won't happen, even if it's just an API change. Sure they wouldn't be changing the core logic of the game, but every graphics call is going to have to be altered and that's still no small feat. DX12 is markedly different than DX9.

> > >

> > > So, even if you're just translating GW2's code from one form to another, there's still a boatload of work to do to get it all done and ensure it works. And it can't just be a simple translation from DX9 to 12 since there are some major differences that have to be accounted for.

> > >

> > > Don't get me wrong, I would absolutely love to see this game get a big engine update. Dipping to 20 FPS unless I lower shadows and character models in Dragon's Stand is not fun, especially when I have an overclocked 8700K and GTX 1070. My GPU is barely utilized even at 1440p in these scenarios since the CPU can't keep up. But like I said, because of the time investment a drastic update to the graphics pipeline likely isn't going to happen.

> > >

> > > But I'm praying that it does with you.

> >

> > C++ and Java would be closer to the engine, as both already have their own APIs defined in the SDK. Java and C++ are also very different animals: Java assumes a full-fledged OS, while C++ could be used to program the OS itself. One assumes an OS and an application of some sort as the output, while the other makes almost no assumptions about anything. C++ can be used for almost anything, and as such its standard library assumes almost nothing about what it will have available and doesn't depend on such features.

> >

> > In the case of GW2 we are talking about porting it to DX12, not rewriting the whole engine as your example above would require. I'm not saying it's as easy as some people assume; however, people seem to assume either that it's the click of a button or that it requires a ground-up rebuild of the engine, when it's really closer to the middle. As has already been stated, a single unpaid person has written a wrapper for GW2 that delivers many of DX12's performance benefits. So what you seem to be saying is that ANet's whole dev team is less capable than this single unpaid programmer?

>

> ... you completely missed the point of what I said. It was a metaphor about like-for-like translation of an application, not an actual insinuation that they should port anything to another language. The languages in the metaphor were also arbitrary. I know C++ and Java are extremely different animals.

>

> Also, I'm sure ANet has taken a look at d912pxy. Given the traction it's getting, it would be pretty unwise to ignore it, if only from a security perspective. If it were that easy to get a huge performance boost, something would probably have been implemented in the game by now. People on these forums always talk like the devs are clueless, but when it comes to Guild Wars, they know more than any of us. Anyway, if you actually do some reading on d912pxy, it still has a multitude of stability issues (random crashes, system hangs, etc., depending on your luck and hardware), and it also doesn't work well with GCN-based AMD graphics cards, last I knew, according to some posts on the GitHub page. It's a good solution for many people, but that particular implementation is far from a perfect one.

>

> Anet probably hasn't done anything about it because it's going to take a lot more work than that one lone programmer has accomplished to ensure any wrapper or DX12 implementation in GW2 works properly, without any outstanding issues or hardware incompatibilities on setups that should run it fine (see: RX 580s, etc.). Things in-house have to be held to a much higher standard than the work of a random guy on Reddit.

 

I didn't miss your point. It was an incorrect point, as your example is what people are claiming would be a full rewrite from the ground up, which is not the case. A better example would be a Java-based program being ported to a new Java API, as there are literally dozens of Java APIs. Again, this is an API change, not a programming language or game engine change.

 

The wrapper is buggy because it is a wrapper, not native support. DX12 is very well understood at this point; however, the programmer who made it doesn't have access to the game code and has to use workarounds and tricks that someone with engine access would not need. On top of that, he is working backwards: since he has no access to the engine or how it was designed, he has to figure all of that out and work out how to turn its output into a DX12 wrapper. Troubleshooting that, or fixing bugs when you literally only have access to one part of the puzzle, is nearly impossible, and yet he is still making it work. That makes his job harder, not easier.


> @"TinkTinkPOOF.9201" said:

> They will never do it for GW2; you might see it in GW3, however. GW was built on a custom engine, and when GW2 was being made, they decided to use the same engine and just modify it for the new game. I do not think they really understood the impact of this on large-scale PvE and PvP. The better option would have been one of the off-the-shelf third-party engines that many games use. However, since they already had an engine that only needed some changes, I can only assume this was a cost-cutting decision, as it required no new licensing costs. I think another reason was that the engine was DX9 and they had plans to release GW2 on console, which used a modified DX9 API at the time. The Xbox 360 was also a three-core system, and I think over time they came to see that it just didn't have the CPU power to handle the game properly. I have seen this hinted at but never actually discussed in detail with the community.

>

>

> > @"Zaklex.6308" said:

> > > @"hunkamania.7561" said:

> > > Serious question, since the game's performance is suboptimal for large-scale PvE and PvP.

> >

> > Not necessarily, since the game is CPU intensive and not GPU intensive... as in, most functions are done on the CPU, the GPU hardly does anything, and ArenaNet themselves have stated several times that the benefits are far outweighed by the drawbacks.

>

> The game is not CPU intensive; people misunderstand why the game is CPU limited but not CPU intensive. DX9 does not allow for much multi-threading: its main render pipeline is not parallel and as such loads up only a single core. You can see this by looking at a detailed process monitor while the game runs, where one thread will stand out because it uses up a full CPU thread on its own. As such, the CPU might only hit 30% load and the GPU only 40% load while you are already capped on FPS, because that one thread has peaked.

>

> DX12 allows for less overhead and better multi-threading; with that said, some work can't be processed in parallel. Systems that struggle with the game have been shown to boost FPS massively using the DX12 wrapper someone made for it, and that is from just a wrapper that has its own overhead, made by a single person who isn't getting paid for it. If ANet did this with native DX12 support, I have no doubt the game would see large improvements.

 

Did you miss my last line in my original comment? ArenaNet devs themselves have stated several times that the boost we think we'd get is not worth the drawbacks (which in this case means taking development time away from new content and systems). They've examined it, run models... it fails the cost/benefit test every time.


> @"Zaklex.6308" said:

> > @"TinkTinkPOOF.9201" said:

> > They will never do it for GW2; you might see it in GW3, however. GW was built on a custom engine, and when GW2 was being made, they decided to use the same engine and just modify it for the new game. I do not think they really understood the impact of this on large-scale PvE and PvP. The better option would have been one of the off-the-shelf third-party engines that many games use. However, since they already had an engine that only needed some changes, I can only assume this was a cost-cutting decision, as it required no new licensing costs. I think another reason was that the engine was DX9 and they had plans to release GW2 on console, which used a modified DX9 API at the time. The Xbox 360 was also a three-core system, and I think over time they came to see that it just didn't have the CPU power to handle the game properly. I have seen this hinted at but never actually discussed in detail with the community.

> >

> >

> > > @"Zaklex.6308" said:

> > > > @"hunkamania.7561" said:

> > > > Serious question, since the game's performance is suboptimal for large-scale PvE and PvP.

> > >

> > > Not necessarily, since the game is CPU intensive and not GPU intensive... as in, most functions are done on the CPU, the GPU hardly does anything, and ArenaNet themselves have stated several times that the benefits are far outweighed by the drawbacks.

> >

> > The game is not CPU intensive; people misunderstand why the game is CPU limited but not CPU intensive. DX9 does not allow for much multi-threading: its main render pipeline is not parallel and as such loads up only a single core. You can see this by looking at a detailed process monitor while the game runs, where one thread will stand out because it uses up a full CPU thread on its own. As such, the CPU might only hit 30% load and the GPU only 40% load while you are already capped on FPS, because that one thread has peaked.

> >

> > DX12 allows for less overhead and better multi-threading; with that said, some work can't be processed in parallel. Systems that struggle with the game have been shown to boost FPS massively using the DX12 wrapper someone made for it, and that is from just a wrapper that has its own overhead, made by a single person who isn't getting paid for it. If ANet did this with native DX12 support, I have no doubt the game would see large improvements.

>

> Did you miss my last line in my original comment? ArenaNet devs themselves have stated several times that the boost we think we'd get is not worth the drawbacks (which in this case means taking development time away from new content and systems). They've examined it, run models... it fails the cost/benefit test every time.

 

Except every other game that was CPU limited and switched to DX12 shows that is not the case. There are no "models to run"; you act like this is some mathematical theory that needs to be proven, and it doesn't work that way. There is well-known, documented data on how and where DX12 helps performance; CPU-limited render pipelines are literally one of the main selling points of DX12, and they are the main performance issue with GW2.


> @"TinkTinkPOOF.9201" said:

> I didn't miss your point. It was an incorrect point, as your example is what people are claiming would be a full rewrite from the ground up, which is not the case. A better example would be a Java-based program being ported to a new Java API, as there are literally dozens of Java APIs. Again, this is an API change, not a programming language or game engine change.

>

> The wrapper is buggy because it is a wrapper, not native support. DX12 is very well understood at this point; however, the programmer who made it doesn't have access to the game code and has to use workarounds and tricks that someone with engine access would not need. On top of that, he is working backwards: since he has no access to the engine or how it was designed, he has to figure all of that out and work out how to turn its output into a DX12 wrapper. Troubleshooting that, or fixing bugs when you literally only have access to one part of the puzzle, is nearly impossible, and yet he is still making it work. That makes his job harder, not easier.

 

You're still missing it. The point I was trying to make is that everything which talks to DX9 is going to have to be screwed with to talk to DX12, and that's very likely a significant portion of the rendering pipeline.

 

Nothing productive is happening here, so I'll take my leave, since it's just going back and forth. We don't know how Guild Wars 2 is made, and ultimately all we can do is speculate anyway.

 

 


The game is poorly optimized: it doesn't really multithread and definitely does not use multiple cores. I like how people say it's our PCs; some of us play on new hardware, Ryzen/Intel six- and eight-core machines with great video cards, and we still see a performance hit because the game is not optimized. To be quite honest, my old Win 7 PC plays the game as well as my eight-month-old rig. Wonder why? Maybe because this game was created when PCs like my old Win 7 machine were top tech; it's smoother for them because it was built around that technology. That was seven years ago, an eternity in tech time. I think most people don't want the game updated because that would mean they'd have to upgrade their PCs; many have probably never played this game at full settings, and they get by on low settings and are happy with it because they don't want to upgrade. ANet knows this; it won't make the game better, because then it would lose a lot of players who can't upgrade their PCs.

 

So let's be honest here: the game runs poorly even on the best machines, and it could always run better.


> @"Tiviana.2650" said:

> So let's be honest here: the game runs poorly even on the best machines, and it could always run better.

 

I think that's actually a good reason not to change it, though... If ANet optimizes it for new machines and tech, where does that leave anyone who doesn't have those things? I'm betting most of the casual audience this game appeals to isn't rushing out every six months to upgrade either. They probably couldn't even tell you what they were running...

 


> @"Tiviana.2650" said:

> So let's be honest here: the game runs poorly even on the best machines, and it could always run better.

 

Sure. The game could always run better. You could say that for any game to be honest. As far as running poorly on the best machines, well I may not have the best, but it runs well enough for me.


> @"Obtena.7952" said:

> I think that's actually a good reason not to change it, though... If ANet optimizes it for new machines and tech, where does that leave anyone who doesn't have those things? I'm betting most of the casual audience this game appeals to isn't rushing out every six months to upgrade either. They probably couldn't even tell you what they were running...

 

Eh, what? How would an upgrade to game performance only affect better PCs?

 


> @"Stand The Wall.6987" said:

> > @"Obtena.7952" said:

> > I think that's actually a good reason not to change it, though... If ANet optimizes it for new machines and tech, where does that leave anyone who doesn't have those things? I'm betting most of the casual audience this game appeals to isn't rushing out every six months to upgrade either. They probably couldn't even tell you what they were running...

>

> Eh, what? How would an upgrade to game performance only affect better PCs?

>

 

My question is how an upgrade to game performance would impact people with older PCs. Am I out of date in thinking that older machines would run slower if the tech the game was built on improved? That's how it used to work.

 

I think the basic message here is that an upgrade isn't a win for everyone, unless I completely misunderstand the relation between hardware and software these days. If the game is to work at an optimal, functional level for the widest audience, the target should be the players' most common setup, not the best one.


> @"Ashantara.8731" said:

> Really? Can you run any MS Office products on it? Or an Adobe suite with Photoshop, etc.? If so, that would be great, as Windows 8.1 will certainly be the last Windows I'll use, unless MS stops their silly Windows 10 policy and goes back to an OS that doesn't constantly update to new versions, causing tons of issues for gamers and others. (P.S. AFAIK, there are alternatives to Windows, Linux, and Mac altogether, but I would have to thoroughly look into them first.)

 

MS Office I'd have to look into, as well as current Adobe suite stuff, since I'm using an older version of Photoshop (CS6, which I have not tested yet) and I don't even use Office on Windows anyway (LibreOffice for me). But if I can get my hands on those programs, I can definitely give them a go. I know some stuff won't run in Wine, period, but most will. Also, I tested the iLok Manager on my Arch laptop and it works so far: it can see my licenses, and the laptop is detected by the manager. I need to see if it works with PLAY 6 (which so far works, but with some graphical glitches), and if it does, I'll be pretty happy (once I work out the glitches in the PLAY UI).

 

If it's not worth it, well then, my laptop will probably become my permanent Windows machine for music creation only and my main PC will be converted to Linux, by imaging the current laptop SSD to the PC once January hits.


> @"Zaklex.6308" said:

> > @"TinkTinkPOOF.9201" said:

> > They will never do it for GW2; you might see it in GW3, however. GW was built on a custom engine, and when GW2 was being made, they decided to use the same engine and just modify it for the new game. I do not think they really understood the impact of this on large-scale PvE and PvP. The better option would have been one of the off-the-shelf third-party engines that many games use. However, since they already had an engine that only needed some changes, I can only assume this was a cost-cutting decision, as it required no new licensing costs. I think another reason was that the engine was DX9 and they had plans to release GW2 on console, which used a modified DX9 API at the time. The Xbox 360 was also a three-core system, and I think over time they came to see that it just didn't have the CPU power to handle the game properly. I have seen this hinted at but never actually discussed in detail with the community.

> >

> >

> > > @"Zaklex.6308" said:

> > > > @"hunkamania.7561" said:

> > > > Serious question, since the game's performance is suboptimal for large-scale PvE and PvP.

> > >

> > > Not necessarily, since the game is CPU intensive and not GPU intensive... as in, most functions are done on the CPU, the GPU hardly does anything, and ArenaNet themselves have stated several times that the benefits are far outweighed by the drawbacks.

> >

> > The game is not CPU intensive; people misunderstand why the game is CPU limited but not CPU intensive. DX9 does not allow for much multi-threading: its main render pipeline is not parallel and as such loads up only a single core. You can see this by looking at a detailed process monitor while the game runs, where one thread will stand out because it uses up a full CPU thread on its own. As such, the CPU might only hit 30% load and the GPU only 40% load while you are already capped on FPS, because that one thread has peaked.

> >

> > DX12 allows for less overhead and better multi-threading; with that said, some work can't be processed in parallel. Systems that struggle with the game have been shown to boost FPS massively using the DX12 wrapper someone made for it, and that is from just a wrapper that has its own overhead, made by a single person who isn't getting paid for it. If ANet did this with native DX12 support, I have no doubt the game would see large improvements.

>

> Did you miss my last line in my original comment? ArenaNet devs themselves have stated several times that the boost we think we'd get is not worth the drawbacks (which in this case means taking development time away from new content and systems). They've examined it, run models... it fails the cost/benefit test every time.

 

When devs say something has no benefit, what they really mean is that the people who require higher profits don't think it's worth the investment. A change like this is meant for long-term improvement; those particular people have short-term focuses.

 

Most old games go through some change at some point, even small KR games do. GW2 is one of the few games I've seen that is stubborn about never changing in that aspect.


I can see ANet's point that enhancing the game engine may not add much to sales (though that's hard to measure). To me, what would make sense is the ability to turn down other players' effects. People want that simply because the game is too flashy, gives them headaches, etc., and I would think it would also improve game performance.

 


> @"crepuscular.9047" said:

> > @"mercury ranique.2170" said:

> > > @"crepuscular.9047" said:

> > >

> > > the GPU side of things is fine; the problem is on the CPU side, which requires a massive amount of computation: model positioning, skill execution, damage counters, etc.

> > >

> > > ANet just keeps adding more and more things that the CPU needs to calculate within seconds; it's really bloated.

> > > Unless ANet figures out a way to properly split calculation processes across CPU cores to work in parallel, we are pretty much stuck like this; that said, multi-threaded processing is difficult because the computation is very serial.

> >

> > You assume this is because they are incapable. I think that assumption is wrong and that it is on purpose.

> > GW1 and GW2 have always been friendly to low-end computers. I can run the game reasonably well on computers with an Intel GPU, and GW1 could run on almost every system capable of running Windows XP. This has the advantage of being inviting and open to almost all new players to the genre, and even to online games; people who are new will not likely want to invest in a brand-new computer that costs a four-figure amount. So the philosophy seems to be that someone must be able to install the game and try it out regardless of their system. That is why it is this way. They know it limits those with high-end systems (I also have a high-end rig), but it is a well-thought-through trade-off, not a lack of skill.

>

> Oh, I don't assume, I know: my old 4770K ran perfectly smoothly on medium settings for years until PoF came out, and then it was unplayable unless set to the lowest settings.

> Especially when people jump into mob spawn groups with the raptor's tail swipe, the machine goes nuts, maxing out the CPU and freezing for a couple of seconds.

>

> I was basically forced to upgrade to what I'm running now, an 8700K, thanks to PoF.

 

I'm running an i3 CPU with Intel HD Graphics and can run everything. The quality is not good, but it runs. This is the core of my point: you want higher quality, but that goes against the concept of making the game available to most systems. And it is even doubtful that DX12 support would make a huge difference.


> @"Emberstone.2904" said:

> > @"TinkTinkPOOF.9201" said:

> > I didn't miss your point. It was an incorrect point: what you used as an example is what people are claiming would be a full rewrite from the ground up, which is not the case. A better example would be a Java-based program being ported to a new Java API, as there are literally dozens of Java APIs. Again, this is an API change, not a programming language or game engine change.

> >

> > The wrapper is buggy because it is a wrapper, not native support. DX12 is very well understood at this point; however, the programmer who made it doesn't have access to the game code and has to use workarounds and tricks that someone with engine access would not need. On top of that, he is working backwards: since he has no access to the engine or its design, he has to figure all of that out and then work out how to feed its output into a DX12 wrapper. Troubleshooting or fixing bugs when you literally only have access to one part of the puzzle is nearly impossible, and yet he is still making it work. That makes his job harder, not easier.

>

> You're still missing it. The point I was trying to make is that everything which talks to DX9 is going to have to be screwed with to talk to DX12, and that's very likely a significant portion of the rendering pipeline.

>

> Nothing productive is happening here, so I'll take my leave since it's just going back and forth. We don't know how Guild Wars 2 is made, and ultimately all we can do is speculate anyway.

>

>

 

You do understand you are talking in circles, right? You also understand that when you talk about [the rendering pipeline](https://docs.microsoft.com/en-us/windows/desktop/direct3d11/overviews-direct3d-11-graphics-pipeline "the rendering pipeline") you ARE talking about the API, drivers, and GPU, right?


> @"Blocki.4931" said:

> Maybe it's not completely out of the question, but it's highly unlikely as of now. Wouldn't it also mean that people with older rigs get screwed over? That wouldn't be the best decision.

 

Not if they made it optional, though it would probably be more work to have multiple versions running against the same servers.


 

> @"mercury ranique.2170" said:

> This is the core of my point. You want higher quality, but that goes against the concept of making the game available to most systems. And it is even doubtful that DX12 support would make a huge difference.

 

 

 

> @"Obtena.7952" said:

> My question is how an upgrade to game performance would affect people with older PCs. Am I so old to think that older machines would run slower if the tech the game was built on was improved? That's how it used to work.

>

> I think the basic message here is that an upgrade isn't a win for everyone, unless I completely misunderstand the relation between hardware and software these days. If the game is to work at an optimal functional level for the widest audience, the target should be the most common setup among players, not the best one.

 

 

This thread is about a change that would improve performance across the board, though, not just on newer/high-end systems. Multicore/multithreading predates this game, after all. The reason it's not in GW2 is that the engine is an adapted version of the original GW engine.

 

The extent of the improvement would vary for each system, which I assume is the reason Anet doesn't consider the change cost-effective.

 


> @"AlexxxDelta.1806" said:

>

> > @"mercury ranique.2170" said:

> > This is the core of my point. You want higher quality, but that goes against the concept of making the game available to most systems. And it is even doubtful that DX12 support would make a huge difference.

>

>

>

> > @"Obtena.7952" said:

> > My question is how an upgrade to game performance would affect people with older PCs. Am I so old to think that older machines would run slower if the tech the game was built on was improved? That's how it used to work.

> >

> > I think the basic message here is that an upgrade isn't a win for everyone, unless I completely misunderstand the relation between hardware and software these days. If the game is to work at an optimal functional level for the widest audience, the target should be the most common setup among players, not the best one.

>

>

> This thread is about a change that would improve performance across the board, though, not just on newer/high-end systems. Multicore/multithreading predates this game, after all. The reason it's not in GW2 is that the engine is an adapted version of the original GW engine.

>

> The extent of the improvement would vary for each system, which I assume is the reason Anet doesn't consider the change cost-effective.

>

 

The thread is about the desire to improve performance. People claim they have subpar performance due to GW2 being CPU-bound rather than GPU-bound and ask for this to be changed; this, however, could leave low-end systems unable to play the game anymore. I am not against DX12 support, but I doubt it would change a lot, as GW2's performance is much more tied to the CPU being the bottleneck. Some claim this is sloppy design; I say it is smart for business, as it supports more low-end systems.


> @"mercury ranique.2170" said:

> > @"AlexxxDelta.1806" said:

> >

> > > @"mercury ranique.2170" said:

> > > This is the core of my point. You want higher quality, but that goes against the concept of making the game available to most systems. And it is even doubtful that DX12 support would make a huge difference.

> >

> >

> >

> > > @"Obtena.7952" said:

> > > My question is how an upgrade to game performance would affect people with older PCs. Am I so old to think that older machines would run slower if the tech the game was built on was improved? That's how it used to work.

> > >

> > > I think the basic message here is that an upgrade isn't a win for everyone, unless I completely misunderstand the relation between hardware and software these days. If the game is to work at an optimal functional level for the widest audience, the target should be the most common setup among players, not the best one.

> >

> >

> > This thread is about a change that would improve performance across the board, though, not just on newer/high-end systems. Multicore/multithreading predates this game, after all. The reason it's not in GW2 is that the engine is an adapted version of the original GW engine.

> >

> > The extent of the improvement would vary for each system, which I assume is the reason Anet doesn't consider the change cost-effective.

> >

>

> The thread is about the desire to improve performance. People claim they have subpar performance due to GW2 being CPU-bound rather than GPU-bound and ask for this to be changed; this, however, could leave low-end systems unable to play the game anymore.

 

No, this thread is about utilizing multiple CPU threads, which is something DX12 allows for. It doesn't exclude lower-end systems that can currently run PoF.

 

> @"mercury ranique.2170" said:

> I am not against DX12 support, but I doubt it would change a lot, as GW2's performance is much more tied to **the CPU being the bottleneck**. Some claim this is sloppy design; I say it is smart for business, as it supports more low-end systems.

 

This thread is pretty much about alleviating the CPU bottleneck, although I too doubt it would make a huge difference. Even _some_ difference is welcome, though. I agree it's smart business to optimize your game for the average user instead of the high end; I own a mid-level rig that needs upgrading, after all. But their decision to go with DX9 came from reusing the older GW engine for GW2. You can see some speculated reasons for that choice in the posts above, besides the obvious one of costing less than starting from scratch.


> @"Obtena.7952" said:

> My question is how an upgrade to game performance would affect people with older PCs. Am I so old to think that older machines would run slower if the tech the game was built on was improved? That's how it used to work.

>

> I think the basic message here is that an upgrade isn't a win for everyone, unless I completely misunderstand the relation between hardware and software these days. If the game is to work at an optimal functional level for the widest audience, the target should be the most common setup among players, not the best one.

 

Pretty sure everyone would see an increase; most people have DX12. People with better PCs would see a bigger performance increase because their machines are capable of more. At least that's how I understand it.

 


> @"TinkTinkPOOF.9201" said:

> > @"Emberstone.2904" said:

> > > @"TinkTinkPOOF.9201" said:

> > > I didn't miss your point. It was an incorrect point: what you used as an example is what people are claiming would be a full rewrite from the ground up, which is not the case. A better example would be a Java-based program being ported to a new Java API, as there are literally dozens of Java APIs. Again, this is an API change, not a programming language or game engine change.

> > >

> > > The wrapper is buggy because it is a wrapper, not native support. DX12 is very well understood at this point; however, the programmer who made it doesn't have access to the game code and has to use workarounds and tricks that someone with engine access would not need. On top of that, he is working backwards: since he has no access to the engine or its design, he has to figure all of that out and then work out how to feed its output into a DX12 wrapper. Troubleshooting or fixing bugs when you literally only have access to one part of the puzzle is nearly impossible, and yet he is still making it work. That makes his job harder, not easier.

> >

> > You're still missing it. The point I was trying to make is that everything which talks to DX9 is going to have to be screwed with to talk to DX12, and that's very likely a significant portion of the rendering pipeline.

> >

> > Nothing productive is happening here, so I'll take my leave since it's just going back and forth. We don't know how Guild Wars 2 is made, and ultimately all we can do is speculate anyway.

> >

> >

>

> You do understand you are talking in circles, right? You also understand that when you talk about [the rendering pipeline](https://docs.microsoft.com/en-us/windows/desktop/direct3d11/overviews-direct3d-11-graphics-pipeline "the rendering pipeline") you ARE talking about the API, drivers, and GPU, right?

 

The game's* rendering pipeline (it can also refer to your project's handling of graphics calls; semantics). In many cases, porting your graphics engine isn't even a "middle of the road" job in terms of project scale. Might want to try some real graphics programming sometime; WebGL is an easy place to start these days.

 

Anyway, whether you use Vulkan, OpenGL, DirectX, etc. (and which version of any of them) will drastically affect how you design your graphics code (how you set up and load your buffers, how your shaders work, even the order in which you draw each vertex of an object sometimes matters), and certain pieces of it will inevitably affect some of the game's core logic.
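On the vertex-order point specifically: Direct3D's default front-face winding is clockwise while OpenGL's is counter-clockwise, so identical vertex data can be back-face culled under one API and drawn under the other. A minimal sketch of the winding test itself (hypothetical names, 2D for simplicity):

```cpp
#include <array>

// Direct3D culls counter-clockwise triangles by default, OpenGL culls
// clockwise ones, so the same vertex order can render under one API and
// vanish under the other: one small way the API leaks into assets and code.
struct Vec2 { float x, y; };

// Signed area > 0 means counter-clockwise winding (in a y-up coordinate system).
float signedArea(const std::array<Vec2, 3>& tri) {
    return 0.5f * ((tri[1].x - tri[0].x) * (tri[2].y - tri[0].y) -
                   (tri[2].x - tri[0].x) * (tri[1].y - tri[0].y));
}

bool isCounterClockwise(const std::array<Vec2, 3>& tri) {
    return signedArea(tri) > 0.0f;
}
```

In practice both APIs let you configure the front-face convention, but defaults like this are exactly the kind of detail a port has to audit everywhere.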

 

It's not a full rewrite of the game, not even close, but making your graphics work is usually a bigger portion of your code than you appear to think for a project of this scale. I want DX12, but I'm not going to sugarcoat it by just calling it "an API."

 

Tl;dr: It isn't worth it to Anet.

 

Ok. I'm officially done biting.


> @"Emberstone.2904" said:

> > @"TinkTinkPOOF.9201" said:

> > > @"Emberstone.2904" said:

> > > > @"TinkTinkPOOF.9201" said:

> > > > I didn't miss your point. It was an incorrect point: what you used as an example is what people are claiming would be a full rewrite from the ground up, which is not the case. A better example would be a Java-based program being ported to a new Java API, as there are literally dozens of Java APIs. Again, this is an API change, not a programming language or game engine change.

> > > >

> > > > The wrapper is buggy because it is a wrapper, not native support. DX12 is very well understood at this point; however, the programmer who made it doesn't have access to the game code and has to use workarounds and tricks that someone with engine access would not need. On top of that, he is working backwards: since he has no access to the engine or its design, he has to figure all of that out and then work out how to feed its output into a DX12 wrapper. Troubleshooting or fixing bugs when you literally only have access to one part of the puzzle is nearly impossible, and yet he is still making it work. That makes his job harder, not easier.

> > >

> > > You're still missing it. The point I was trying to make is that everything which talks to DX9 is going to have to be screwed with to talk to DX12, and that's very likely a significant portion of the rendering pipeline.

> > >

> > > Nothing productive is happening here, so I'll take my leave since it's just going back and forth. We don't know how Guild Wars 2 is made, and ultimately all we can do is speculate anyway.

> > >

> > >

> >

> > You do understand you are talking in circles, right? You also understand that when you talk about [the rendering pipeline](https://docs.microsoft.com/en-us/windows/desktop/direct3d11/overviews-direct3d-11-graphics-pipeline "the rendering pipeline") you ARE talking about the API, drivers, and GPU, right?

>

> The game's* rendering pipeline (it can also refer to your project's handling of graphics calls; semantics). In many cases, porting your graphics engine isn't even a "middle of the road" job in terms of project scale. Might want to try some real graphics programming sometime; WebGL is an easy place to start these days.

 

Ah yes, the good old move of talking down to someone by suggesting I do some "real graphics programming" when, as you admitted above, you don't know how a game is made, because you don't have an argument.

 

The first part, all I can say is "what?". Semantics? Draw calls are a part of the rendering pipeline, which is handled by the API and GPU. Draw calls are held up by DX9 because, unlike DX12 with its multithreading, DX9 has to wait for API overhead processed on the CPU before those calls can be sent to the GPU to render the frame. That is, again, one of the main selling points of DX12: frame time is reduced because you are not waiting on API overhead to be processed by the CPU before the calls are made, resulting in less of a CPU bottleneck, greater loading of the GPU, and higher frame rates.
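The contrast between single-threaded submission and per-thread command recording can be sketched with a toy model; the `CommandList` type below is a mock stand-in that just records draw IDs, not real D3D code:

```cpp
#include <thread>
#include <vector>

// Toy stand-in for a GPU command list: it only records draw-call IDs.
struct CommandList { std::vector<int> draws; };

// DX9-style (conceptually): every draw goes through one immediate context,
// so the CPU-side API/driver overhead for all draws is paid serially on
// a single thread.
CommandList recordSingleThreaded(int numDraws) {
    CommandList cl;
    for (int i = 0; i < numDraws; ++i) cl.draws.push_back(i);
    return cl;
}

// DX12-style (conceptually): each thread records its own command list
// independently; the lists are then submitted together, spreading the
// CPU-side recording overhead across cores.
std::vector<CommandList> recordMultiThreaded(int numDraws, int numThreads) {
    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;
    for (int t = 0; t < numThreads; ++t) {
        workers.emplace_back([&lists, t, numDraws, numThreads] {
            // Each thread writes only its own list, so no locking is needed.
            for (int i = t; i < numDraws; i += numThreads)
                lists[t].draws.push_back(i);
        });
    }
    for (auto& w : workers) w.join();
    return lists;
}
```

This is only the shape of the idea; real D3D12 command lists also involve allocators, fences, and pipeline state objects, which is part of why the port is real work rather than a flag flip.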

 

>

> Anyway, whether you use Vulkan, OpenGL, DirectX, etc. (and which version of any of them) will drastically affect how you design your graphics code (how you set up and load your buffers, how your shaders work, even the order in which you draw each vertex of an object sometimes matters), and certain pieces of it will inevitably affect some of the game's core logic.

>

> It's not a full rewrite of the game, not even close, but making your graphics work is usually a bigger portion of your code than you appear to think for a project of this scale. I want DX12, but I'm not going to sugarcoat it by just calling it "an API."

>

> Tl;dr: It isn't worth it to Anet.

>

> Ok. I'm officially done biting.

 

Please, go on and explain how shaders and vertex order affect the switch to DX12.

 

I never said it would be a small amount of work. I actually stated that both the people who think it's the click of a button and those who think it's a full ground-up rewrite of the game are wrong; it is closer to the middle.

 

Calling it an API is not "sugarcoating it"; that is, flat out, matter of fact, [what it is.](https://en.wikipedia.org/wiki/Direct3D "what it is.")


> @"Stand The Wall.6987" said:

> > @"Obtena.7952" said:

> > My question is how an upgrade to game performance would affect people with older PCs. Am I so old to think that older machines would run slower if the tech the game was built on was improved? That's how it used to work.

> >

> > I think the basic message here is that an upgrade isn't a win for everyone, unless I completely misunderstand the relation between hardware and software these days. If the game is to work at an optimal functional level for the widest audience, the target should be the most common setup among players, not the best one.

>

> most people have DX12.

>

 

Source?

