
The Future: GW2 in 4K



> @"kharmin.7683" said:

> > @"Korval.3751" said:

> > Blizzard invested money back into their engine to modernize it.

>

> I really wish people would stop comparing GW2 to WoW. It's not remotely close to the same model. WoW has more resources due to its subscription model whereas it's like pulling teeth to get anyone to spend any cash in the GW2 Gem Shop. People complain constantly about the cost of items and/or whine because things aren't free. If GW2 players spent more, then perhaps ANet would have the financial resources to make the changes that the OP desires.

 

Comparing GW2 to WOW is absolutely a fair comparison. I don't care about Blizzard's resources versus ArenaNet's resources. That's totally irrelevant to the point.


> @"phokus.8934" said:

> It was implied by you that you're getting a 3-5ms ping to the servers; otherwise, why would you give out superfluous information? You'd be comparing oranges to mashed potatoes otherwise.

 

No, what I communicated was that because I get an average 3-5 ms ping online, that is the kind of ping one might reasonably expect from GW2. If ArenaNet's servers are running at shit performance and can't get below 100 ms, then that's all on ArenaNet and has nothing to do with my connection speed and low latency. In other words, fiber optics don't lie.

 

> @"phokus.8934" said:

> Also consider the following: you're once again making a bad comparison - an MMO versus an FPS that doesn't carry the infrastructure of an MMO. This is purely speculation, but I can only imagine that ArenaNet went with the Guild Wars game engine due to their familiarity with it. If they went with, say, the Unreal engine, there might've been a significant learning curve, and who knows what the licensing of the Unreal engine looks like for a game like GW2.

 

Here's the problem. I compared GW2 to WOW, and everyone was like, "No! You can't compare GW2 to WOW!" So my response was, "Okay, let's look at Bioshock Infinite, which came out in 2013 running on DirectX 11." And everyone was like, "No! You can't compare GW2 to BSI." People, seriously. WTF?! It's a flipping 2012 game, and it's an MMO. Comparing it to WOW and BioShock Infinite is entirely acceptable. You're not going to find a PERFECT 1:1 comparison because one does not exist.

 

> @"phokus.8934" said:

> In a perfect world, I'd love to see GW2 push our high end machines to run at max FPS with full settings but being the realist I am, I'm fine with what GW2 has and its performance.

 

Please don't toss out the "in a perfect world" adage. ArenaNet doesn't run a charity. They run a business, and their business is gaming. They chose to release their games on PC. It is ArenaNet's responsibility to ensure their main product stays current with modern hardware. If they ignore this, then they do so at the risk of their business.


So aside from performance... is the UI scaling better in 4K now? Previously it was very tiny and the game was unplayable as a result.

Most MMOs have this issue, which is why 4K is still years away from being worth it... which is hilarious because when it eventually is, 8K or something else will replace it.


Scaling is improved, but still not great (one can increase text size and icon sizes to a reasonable level, but I still find the minimap too small).

Given the current level of 4K adoption, it may not make sense to improve performance just for 4K. But a lot of people playing at 1080p complain about performance, so any rewrite to use more CPU threads or better utilize the graphics card would help a lot of people. The general trend seems to be more about adding cores than making the cores faster (e.g., AMD Ryzen).

And while gameplay may be more important than graphics, at certain points poor graphics (or poor performance) has an impact on gameplay. I'll give Anet credit in that they seem to be improving the quality of graphics in new releases, but it also means performance suffers, and when you get into some fight and FPS drops really low, that just hurts gameplay.

 


> @"Korval.3751" said:

> > @"kharmin.7683" said:

> > > @"Korval.3751" said:

> > > Blizzard invested money back into their engine to modernize it.

> >

> > I really wish people would stop comparing GW2 to WoW. It's not remotely close to the same model. WoW has more resources due to its subscription model whereas it's like pulling teeth to get anyone to spend any cash in the GW2 Gem Shop. People complain constantly about the cost of items and/or whine because things aren't free. If GW2 players spent more, then perhaps ANet would have the financial resources to make the changes that the OP desires.

>

> Comparing GW2 to WOW is absolutely a fair comparison. I don't care about Blizzard's resources versus ArenaNet's resources. That's totally irrelevant to the point.

 

Just because you don't care doesn't make it irrelevant.

 

/shrug


[https://youtube.com/watch?v=TCzoMyIFHCk](https://www.youtube.com/watch?v=TCzoMyIFHCk "https://youtube.com/watch?v=TCzoMyIFHCk")

 

I need to retest with "native" rendering, as I forgot to set that before recording the video.

MAX settings at 4K with native rendering was around 7-12 FPS average.

4K on low settings with "native" was around 70-100 FPS depending on the area, with lows between 30-50 FPS.

 

Sorry, it's hard to recall the exact numbers. Also, the FPS drop from subsampling to native isn't much.

 

The GPU was a low-profile XFX RX 550 4GB.

 

ArenaNet has set their minimum requirements to support old dual-core CPUs. The game leans more on the CPU than the GPU, so it doesn't require much of a GPU for 4K gaming.

 

They factored in the cost of rebuilding the engine and decided the performance-gain-to-cost ratio was not worth the risk.

 

I'm still curious whether it's really an engine problem or a server problem. The only way to tell is for them to run in-house testing of 4K gaming on the newest PCs and see if FPS drops occur. If drops don't occur, then the problem lies with the servers and not the engine.

 

Maybe they can livestream this testing on their next stream *hint* ArenaNet. (I'd watch this.)

 

On the overclocking side of things, it's the "bus speed" (usually a number like "200"), not the "multiplier" (usually written as "x10"), that tends to give the bigger gains, because it raises more than just the core clock. This gave great performance gains on the old FX series of CPUs for gamers. Mess with the "bus speed" first, then the "multiplier".
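For anyone unfamiliar with the distinction being drawn here, a minimal sketch of the arithmetic (made-up ratios, roughly matching FX-era platform conventions): the core clock is bus × multiplier, but the bus/reference clock also feeds the memory and northbridge clocks, which is why bus overclocks tend to lift overall performance more than a pure multiplier bump.

```python
# Illustrative only: derived clocks on a hypothetical FX-era platform.
def effective_clocks(bus_mhz, cpu_multiplier, mem_ratio=8.0, nb_multiplier=10):
    return {
        "cpu_core_mhz": bus_mhz * cpu_multiplier,    # e.g. 200 x 20 = 4000 MHz
        "memory_mhz": bus_mhz * mem_ratio,           # DRAM clock scales with the bus
        "northbridge_mhz": bus_mhz * nb_multiplier,  # so does the memory controller / NB
    }

print(effective_clocks(200, 20))  # baseline
print(effective_clocks(220, 20))  # +10% bus: core, RAM and NB all rise
print(effective_clocks(200, 22))  # +10% multiplier: only the core rises
```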

 

I recommend a RAM overclock, specifically the TIMINGS: on my Athlon 64 X2 5200+ I tweaked the RAM timings and FPS went from 70 to 100 in the Phantasy Star Online 2 benchmark. Higher-frequency RAM is better (usually around a 10 FPS increase), but tightening the timings gave a bigger FPS increase. I've read that overclocking or adjusting timings can damage the OS registry, though whether that's true is up for debate.

 

ArenaNet could do a FundMe/Kickstarter to pool funds for a game engine overhaul (adding new features) and an optimization pass.


Unless I'm mistaken, GW2 still runs on DirectX 9. This is where the problem lies, and the reason why World of Warcraft doesn't have the same issue. Despite that game being the same age as the engine for GW2, it runs on DX11 and now supports DX12.

 

The tl;dr of why DX9 is poop is that it essentially duplicates resource consumption when running graphics, an issue that was largely eliminated in DirectX 10. If GW2 is using it, then it is surpassing what was really expected of DX9 in its prime, and we're going to feel it.
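To put a concrete picture on that "duplicates resource consumption" point: under DirectX 9 the runtime keeps a system-memory backing copy of every managed-pool resource (so it can restore video memory after a lost device), so the same asset data effectively lives twice; DX10 dropped that model. A rough sketch with made-up texture sizes:

```python
# Illustrative only: memory accounting for hypothetical texture assets under
# DX9's managed pool (VRAM copy + system-memory shadow) versus a DX10+-style
# model where the resource lives once in video memory.
textures_mb = {"terrain_atlas": 64, "character_atlas": 128, "ui_atlas": 16}

vram_only = sum(textures_mb.values())   # DX10+ style: one copy
dx9_managed = vram_only * 2             # DX9 managed pool: VRAM + sysmem shadow

print(f"One copy (DX10+ style): {vram_only} MB")
print(f"DX9 managed pool total: {dx9_managed} MB")
```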



> @"Stralgarr.7561" said:

> So aside from performance... is the UI scaling better in 4K now? Previously it was very tiny and the game was unplayable as a result.

 

Yes it is. Very playable.

 

> Most MMOs have this issue, which is why 4K is still years away from being worth it... which is hilarious because when it eventually is, 8K or something else will replace it.

 

Not sure about 8K, but 4K appears to work fine in most games. The only area where I've seen issues is a couple of distribution platforms like Origin and uPlay. Some of the menus in those apps don't scale to 4K, so they look super tiny.


> @"jishi.7568" said:

> [https://youtube.com/watch?v=TCzoMyIFHCk](https://www.youtube.com/watch?v=TCzoMyIFHCk "https://youtube.com/watch?v=TCzoMyIFHCk")

>

> I need to retest with "native" rendering, as I forgot to set that before recording the video.

> MAX settings at 4K with native rendering was around 7-12 FPS average.

> 4K on low settings with "native" was around 70-100 FPS depending on the area, with lows between 30-50 FPS.

>

> Sorry, it's hard to recall the exact numbers. Also, the FPS drop from subsampling to native isn't much.

>

> The GPU was a low-profile XFX RX 550 4GB.

>

> ArenaNet has set their minimum requirements to support old dual-core CPUs. The game leans more on the CPU than the GPU, so it doesn't require much of a GPU for 4K gaming.

>

> They factored in the cost of rebuilding the engine and decided the performance-gain-to-cost ratio was not worth the risk.

>

> I'm still curious whether it's really an engine problem or a server problem. The only way to tell is for them to run in-house testing of 4K gaming on the newest PCs and see if FPS drops occur. If drops don't occur, then the problem lies with the servers and not the engine.

>

> Maybe they can livestream this testing on their next stream *hint* ArenaNet. (I'd watch this.)

>

> On the overclocking side of things, it's the "bus speed" (usually a number like "200"), not the "multiplier" (usually written as "x10"), that tends to give the bigger gains, because it raises more than just the core clock. This gave great performance gains on the old FX series of CPUs for gamers. Mess with the "bus speed" first, then the "multiplier".

>

> I recommend a RAM overclock, specifically the TIMINGS: on my Athlon 64 X2 5200+ I tweaked the RAM timings and FPS went from 70 to 100 in the Phantasy Star Online 2 benchmark. Higher-frequency RAM is better (usually around a 10 FPS increase), but tightening the timings gave a bigger FPS increase. I've read that overclocking or adjusting timings can damage the OS registry, though whether that's true is up for debate.

>

> ArenaNet could do a FundMe/Kickstarter to pool funds for a game engine overhaul (adding new features) and an optimization pass.

 

Thanks! You provide great insights and tips. I think a FundMe/Kickstarter campaign would instantly make people think ArenaNet is out of money, and that might make people leave the game. The best thing to do is to defer an expansion in favor of an overhaul of their engine. They're going to need to do this eventually; better sooner than later.


> @"Tolmos.8395" said:

> Unless I'm mistaken, GW2 still runs on DirectX 9. This is where the problem lies, and the reason why World of Warcraft doesn't have the same issue. Despite that game being the same age as the engine for GW2, it runs on DX11 and now supports DX12.

>

> The tl;dr of why DX9 is poop is that it essentially duplicates resource consumption when running graphics, an issue that was largely eliminated in DirectX 10. If GW2 is using it, then it is surpassing what was really expected of DX9 in its prime, and we're going to feel it.

 

Yes. That's why in an earlier comment I mentioned that Bioshock Infinite (released a year later, in 2013) runs on DX11. With maxed settings, that game hovers around 120 FPS. Now, granted, BSI isn't an MMO, so it doesn't have the same open environments. However, there is no doubt that upgrading GW2 to DX12 would bring a HUGE performance boost. I think Anet should skip an expansion and use those resources to port their engine to DX12. Yes, I know people will be upset by that, but in the long run it will be a good move. It's either that or work on GW3, and I'm pretty sure Anet doesn't want to do that.


I'm pretty sure all the past posts regarding the speed of Guild Wars 2 mention that CPU clock speed and instructions per cycle are the main bottleneck. This has to do with the age of the engine and choices made in game engine programming 15 years ago. This game is VERY CPU bound; its performance is mostly bound to the maximum speed you can get out of 2 cores.

 

The basis of the software engine used for the game is almost 15 years old. This software engine is owned by ArenaNet; because of that, we do NOT have a monthly fee. However, we have neither a DX11 nor a DX12 option, nor Vulkan, not even OpenGL... Some of those options would allow easier migration and implementation on other platforms. But the base engine is slow by modern standards.

The basis for the engine was perfectly optimized, a long time ago, for 4 vs 4 and 8 vs 8 player battles; we now run 70 vs 70 vs 70 player battles on the same engine. That shows the engine is good, but some things are a bit outdated. DirectX 9.0c has been around for nearly 15 years now. The interesting thing is that DirectX is mostly OS-platform exclusive, so a lower version covers more platforms. This might not be relevant today, but in 2007, when the development of Guild Wars 2 began, it was extremely relevant. (See the timeline below.)

 

This game is DX9. That was done for compatibility... Not everyone has a nice 1080 Ti. I myself still game on my 5-year-old set of SLI'd 780s; even combined they are inferior to a 1080, but they were high-end cards, especially compared to what the average consumer had in 2012. At launch I played this game on a passively cooled Nvidia 9600 GT... at ~15-22 FPS... The DX9 standard was outdated before the game was launched, BUT the DX9 API has been a relatively good implementation and is supported by almost all PCs still in use, even if they are outdated. I know people who play on potato and/or almost-bricked laptops and desktops 8-10 years old. They also provide income for ArenaNet. I regularly hear people complain of 2-5 FPS in zerg or boss fights, so there are still pretty low-end machines in WvW and PvE as well...

 

I have had my now 5-year-old i7 3930K overclocked for years from 3.2 GHz to 4.0 GHz and I saw close to a 25% FPS improvement (at the cost of more heat and more power usage, yes...). It has been stable for the better part of those years. The number of characters that need to be rendered seems to bottleneck framerates the most. This is noticeable in WvW with blob vs blob vs blob fights in primetime: having 150-200 people clash completely clogs all communication, and lowering the number of characters on screen fixes this immediately.

 

The resolution doesn't seem to matter much, just the number of characters shown. At 1280x1024 or 1920x1200 I get exactly the same framerates. No idea about 4K, but I normally do use supersampling. Overclocking my GPUs? No remarkable change in FPS whatsoever...

 

------------

 

GUILD WARS 2 DEVELOPMENT TIMELINE:

Note: Guild Wars 2 uses a modified engine based upon the original Guild Wars engine.

 

o DirectX 9.0 (2002) was made for Windows 98, 98SE, ME, 2000 and XP

o AMD/ATi introduced DirectX 9.0 capable gpu's around 2002

o DEVELOPMENT OF ORIGINAL GUILD WARS STARTS 2003 (<- *ENGINE DEVELOPED* )

o Nvidia introduced DirectX 9.0 capable gpu's around 2003

o ORIGINAL GUILD WARS RELEASED IN 2005

o Nvidia and AMD/ATI introduced DirectX 9.0c GPU's around 2005

o DirectX 10 (2006) was implemented with Vista.

 

o DEVELOPMENT OF GUILD WARS 2 STARTED IN 2007 (<- *ENGINE MODIFIED* )

o AMD/ATi introduced DirectX 10 capable GPU's in 2007

o Nvidia introduced DirectX 10 capable GPU's in 2008

o FIRST PLAYABLE PREVIEW OF GUILD WARS 2 2009

o AMD/ATi Introduced DirectX 11 capable GPU's in 2009

o DirectX 11 (2009) was implemented with Windows 7, 8 and 10

o Nvidia Introduced DirectX 11 capable GPU's in 2010-2011

 

o GUILD WARS 2 RELEASED IN 2012 (<- *RELEASE* )

 

o DirectX 12 (2015) was implemented with Windows 10

 

The GTX 900 series (2014) were the first true DX12 advertised GPU's from Nvidia.

The Radeon R5/R7/R9 (200 series) (2013) were the first true DX12 cards from AMD/ATi.

DirectX 12 implementation, however, was only available from 2015, as the software was only released with Windows 10.

Both Nvidia and AMD/ATi later included a lot of other GPUs in the DirectX 12 capable lineup, nowadays starting from 2011 models on, but my GTX 780 (2013) was sold as a DirectX 11.1 capable card at the time.

 

While people can argue DirectX 11 should have been used, the problem was and will be the fact that:

A.) the engine of Guild Wars 2 is derived from the original Guild Wars engine, and

B.) DirectX 11 was not standard among the potential player base in 2009, as a lot of people had no faith in Windows Vista or Windows 7, so those were used alongside older XP and 2000 installs, all of which did support DirectX 9.0c.

 

Completely rewriting the engine is something potentially useful to NCSoft and ArenaNet, BUT it is costly and time-consuming, and the migration would be costly as well. Since the introduction of Windows 10 most people will have access to DirectX 12 software, but their hardware might still be older and incompatible...

 

--------------

 

The best config for this game is a stupefyingly overclocked (5 GHz? 5.5 GHz?) 4-core CPU with a mid-range GPU... It will want an SSD for faster loading and 8 GB of overclockable RAM (overclockable because if you clock up the CPU you'd best clock the RAM as well), so you will not get memory problems (the 8 GB, not the speed).

If you want a 4K screen with everything maxed out? Go for it! But the requirements are not much greater. The GPU only does scaling and filling based upon the DirectX 9.0 API... Having another player or NPC present is a lot more data to process, and because they are player characters they are not predictable.

 

I have my GPUs set up so they do not throttle down while idle for a moment, as that destroys my framerate... This happens because the game creates next to no load for the GPUs, so they tend to drop into power saving and the PCIe bandwidth is limited. So: maximum performance mode...

Also, SMAA tends to work better than FXAA for some people... Getting your settings optimized tends to help as well; you can max out everything except maybe the shadows and reflections.

 

-------------

 

tl;dr: GW2 is CPU BOUND; overclock the CPU and see improvement. Having a 1080 Ti in Guild Wars 2 is like driving your Bugatti Veyron on an old dirt road. You will get from A to B, and in good stretches really, really fast, but in the bad spots you'll be as fast as a horse-drawn carriage, and the guy with the horse-drawn carriage will have fewer good moments :) That's what happens when the basic game engine (the road) is ancient... and so is part of the traffic... having 100 people driving the same way, some in refurbished Beetles or Cinquecentos, doesn't help.


> @"Korval.3751" said:

> Recently, I upgraded to a machine capable of steady 60+ FPS at 4K.

> ...

> The most disappointing was Guild Wars 2.

 

I agree with your sentiment that Guild Wars 2 could use more performance optimizations. They've done some things to improve this over the years, but it's not enough to make the game perform well in highly populated areas on any modern system.

 

The rest of your post shows that you've put a lot of work into testing many different games, but don't seem to be aware of what your system's components actually do. You're accounting for things that are unrelated to framerates (internet speed, SSD throughput) while ignoring an obvious potential bottleneck in many games (CPU). A simple test with different in-game settings would've pointed out the obvious: Guild Wars 2 is heavily CPU-limited in highly populated areas like Lion's Arch.

 

An area with a high player count in GW2 puts an inordinate load on your processor. This load is not spread across multiple cores, but largely dependent on single-core speed. This is why your performance tanks whenever there are a lot of players on screen. It's also why most settings have no impact, whatsoever, on performance in those scenarios. The only settings that do are those that offload the CPU (Shadows & Reflections, which are partially CPU-dependent in GW2 and many other games) or mitigate the player-overload (lower player model quantity / quality & maybe turn off friendly player names).

 

The performance problem you've stumbled upon has **_nothing_** to do with 4K resolution. The game will perform just as poorly in LA if you drop down the resolution to 1080p. Indeed, my native resolution is 1440p and if I bump it to 4K (using DSR) my framerate in LA doesn't change _at all_ even though that's more than double the pixel count.

 

For future reference in your quest for determining 4K compatibility:

* Internet connectivity: problems result in rubber-banding, skill misfiring, action delay, longer load times, disconnects & pop-in of players. This never causes framerate issues.

* Hard-disk throughput: problems result in pop-in of objects / textures / players, or longer loading times. This never causes framerate issues.

* Processor: check whether changing resolution or GPU-bound detail settings (anti-aliasing / ambient occlusion / texture quality / post processing) has an impact on performance. If it doesn't, like in GW2, then the game is CPU-bound (see the sketch after this list).

* Resolution: always check lower resolutions. It's pointless to point at a game saying it isn't 4K proof when the performance is similarly poor at 1080p, like in Guild Wars 2.
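A minimal sketch of that processor check, assuming you've already measured average FPS yourself (with any FPS overlay) at a light and a heavy GPU load; the helper and numbers below are purely illustrative:

```python
def looks_cpu_bound(fps_light_gpu_load, fps_heavy_gpu_load, tolerance=0.10):
    """Compare average FPS at a light GPU load (e.g. 1080p, low AA) against a heavy
    one (e.g. 4K, max AA). If the drop stays within `tolerance`, the GPU was never
    the limiter, which points at the CPU."""
    drop = (fps_light_gpu_load - fps_heavy_gpu_load) / fps_light_gpu_load
    return drop <= tolerance

# Example numbers measured in a busy hub like Lion's Arch (illustrative values):
print(looks_cpu_bound(42, 41))    # True  -> almost no change, CPU-bound
print(looks_cpu_bound(120, 65))   # False -> big drop, GPU-bound
```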

 

 

> @"Korval.3751" said:

> No. I'm not overclocking. I talked with several professional system builders, and they said overclocking has a benefit when there are deficiencies but with my computer, the sum of each part IS the whole. In other words, everything I have combined makes each game run super fast. But not GW2.

 

Indeed, you aren't overclocking, because you can't with this CPU.

 

As for those professional system builders, they were almost certainly talking about general performance in most games. The load distribution on your PC's components differs from scenario to scenario. For most games that means the GPU is the bottleneck that drags down performance. In the case of Guild Wars 2 that load distribution is completely different from most games and mostly falls onto your CPU once player counts climb or shadow/reflection effects get crazy. As such, in this scenario, your CPU **is** the bottleneck / deficiency in your setup.

 

An i7 8700K would've been a much better pick for GW2 even at just stock speeds. Although even overclocked to 4.5GHz+ that processor probably still wouldn't hit much higher than 45 FPS in LA (if that).

 

 

> @"Korval.3751" said:

> I'll upgrade the RAM eventually, but not for GW2. RAM is way too expensive.

 

Don't bother, unless you're looking to get a 32GB kit for video/3D rendering which your system (again, CPU) isn't set up to do anyway.

 

RAM speed and latency have no meaningful impact on game performance. See techbuyerguru's 2016 article on DDR4 performance in games, Crucial's official recommendations, etc. Just save that money for upgrading the important parts (CPU/GPU/PSU/mobo) of your PC somewhere down the line. Also, you can probably boost that kit just fine...

 

 

> @"Korval.3751" said:

> Displays: (2) LG 27"" 4K UHD IPS screens

 

4K on 27" monitors!? Do you wear microscopes?


I don't think that Guild Wars 2 showing sub-par performance in 4K is a problem at all.

As pointed out, mass 4K adoption is still some years away; not 2 or 3 years, but more like 4 to 5.

I think it is more reasonable to expect GW3 (or a new game under a different brand) in that time frame, rather than additional content for GW2.

And I expect Anet to come out with a different engine by then (either proprietary or licensed).

 

Also, I have a feeling that for GW2, catering to potato-PC owners is more lucrative than addressing top-performance pursuers


> @"Korval.3751" said:

> > @"Tolmos.8395" said:

> > Unless I'm mistaken, GW2 still runs on DirectX 9. This is where the problem lies, and the reason why World of Warcraft doesn't have the same issue. Despite that game being the same age as the engine for GW2, it runs on DX11 and now supports DX12.

> >

> > The tl;dr of why DX9 is poop is that it essentially duplicates resource consumption when running graphics, an issue that was largely eliminated in DirectX 10. If GW2 is using it, then it is surpassing what was really expected of DX9 in its prime, and we're going to feel it.

>

> Yes. That's why in an earlier comment I mentioned that Bioshock Infinite (released a year later, in 2013) runs on DX11. With maxed settings, that game hovers around 120 FPS. Now, granted, BSI isn't an MMO, so it doesn't have the same open environments. **However, there is no doubt that upgrading GW2 to DX12 would bring a HUGE performance boost**. I think Anet should skip an expansion and use those resources to port their engine to DX12. Yes, I know people will be upset by that, but in the long run it will be a good move. It's either that or work on GW3, and I'm pretty sure Anet doesn't want to do that.

 

Wrong. You misunderstand what DX actually does.

 


GW2 is not CPU gated, it's RAM gated. Your RAM timings on that kit are trash, which is resulting in the CPU being starved. This is why you aren't seeing 100% usage on a single core: GW2 doesn't even need your entire core when it's stuck waiting on slow RAM.

 

I'm sitting here playing GW2 at 60 FPS at 4K on a 1060; don't give me that "oh the engine sucks" crap. I'm going to add this thread to my collection of "people who think spending lots of money automatically makes their computer good."

 

> @"Droniac.8153" said:

> RAM speed and latency have no meaningful impact on game performance. See techbuyerguru's 2016 article on DDR4 performance in games, Crucial's official recommendations, etc. Just save that money for upgrading the important parts (CPU/GPU/PSU/mobo) of your PC somewhere down the line. Also, you can probably boost that kit just fine...

This is demonstrably wrong. RAM doesn't matter when you're bottlenecking on the GPU, but in any game that isn't bottlenecking on the GPU, RAM absolutely does matter. To a point: beyond 3200 MHz it gets rather pointless, but going from 2133 MHz to 3200 MHz will show an advantage in a number of games.

 

> @"Droniac.8153" said:

> > @"Korval.3751" said:

> > Displays: (2) LG 27"" 4K UHD IPS screens

 

> 4K on 27" monitors!? Do you wear microscopes?

I play on a 24" 4k monitor. It's fine, and the pixel density is really nice.


> @"GenghisKhan.7842" said:

> I don't think that Guild Wars 2 showing sub-par performance in 4K is a problem at all.

> As pointed out, mass 4K adoption is still some years away; not 2 or 3 years, but more like 4 to 5.

> I think it is more reasonable to expect GW3 (or a new game under a different brand) in that time frame, rather than additional content for GW2.

> And I expect Anet to come out with a different engine by then (either proprietary or licensed).

>

> Also, I have a feeling that for GW2, catering to potato-PC owners is more lucrative than addressing top-performance pursuers

 

As an MMO provider you have to take a wide range of customers into account hardware-wise as well. Not everybody has a lot of money to spend on a rig. I think the OP is just being unrealistic in expecting 60 fps in an MMO with 100 players in his vicinity. I have a pretty good rig and I stay over 40 fps. That's perfectly fine (and yes, I run at 4K). When it's quieter around me I can hit around 100 fps at 4K. So really, for an MMO, that's pretty good. If it were a single-player game that'd be different, but an MMO is a completely different type of game from a single-player game, so the OP really is just making incorrect comparisons.

 

I mean, single-player games and WoW? I can get over 200 fps in GW1 as well at 4K, and that game is about the same age as WoW. This is just a matter of unrealistic expectations, because he wants to compare this game to ancient MMOs and single-player games, neither of which makes any sense.


> @"Tabasco.1743" said:

> > @"Korval.3751" said:

 

> ...Really, it makes what they have achieved all the more impressive. ArenaNet has been iterating on this software for more than a decade, and in the process designed some kitten creative effects and workarounds in the absence of DX11 libraries.

>

> I get where you're coming from though. I wish some things worked better than they do, and better _is better_. (For lack of a better word.)

 

Which just makes it more disappointing that they would rather bury their heads in the sand and ignore the problem; it's not like DX11 wasn't a thing already when GW2 launched. They chose to stay DX9-only, and that has its costs. Threads like these keep popping up, and honestly, they have merit... I mean, if I spend 1500++€ on a machine, I expect my 6-year-old game to run better than the newest AAA games. And that just isn't the case.

I'm not even going to address 4K; even at 1080p it's a stretch to hold a solid 60 fps, especially when you hit higher-population areas.


> @"ReaverKane.7598" said:

> > @"Tabasco.1743" said:

> > > @"Korval.3751" said:

>

> > ...Really, it makes what they have achieved all the more impressive. ArenaNet has been iterating on this software for more than a decade, and in the process designed some kitten creative effects and workarounds in the absence of DX11 libraries.

> >

> > I get where you're coming from though. I wish some things worked better than they do, and better _is better_. (For lack of a better word.)

>

> Which just makes it more disappointing that they would rather bury their heads in the sand and ignore the problem; it's not like DX11 wasn't a thing already when GW2 launched. They chose to stay DX9-only, and that has its costs. Threads like these keep popping up, and honestly, they have merit... I mean, if I spend 1500++€ on a machine, I expect my 6-year-old game to run better than the newest AAA games. And that just isn't the case.

> I'm not even going to address 4K; even at 1080p it's a stretch to hold a solid 60 fps, especially when you hit higher-population areas.

 

I recently bought a new PC, a laptop halfway decent for gaming: i7, 16GB RAM, SSD, GTX 1050 4GB.

At 1080p, The Witcher 3 runs great at 60 FPS on High; Guild Wars 2 runs well: 70 FPS on lower detail.

GW2's relative performance indeed looks awful, but its absolute performance is above the "very good" mark, which is where I stop caring.


> @"Droniac.8153" said:

> RAM speed and latency have no meaningful impact on game performance. See techbuyerguru's 2016 article on DDR4 performance in games, Crucial's official recommendations, etc. Just save that money for upgrading the important parts (CPU/GPU/PSU/mobo) of your PC somewhere down the line. Also, you can probably boost that kit just fine...

>

 

This myth again. RAM speed and latency definitely helps tremendously in CPU limited games like GW2. Especially if you manually tune all the 1st, 2nd and 3rd timings. XMP Profiles, which are usually used for reviews and tests, are utter garbage.

Having 30+ fps at boss fights with medium character model limit and everything else set to the max (eg field of view and supersampling too) is quite possible with current hardware if you take the time to tune everything


> @"Malediktus.9250" said:

> > @"Droniac.8153" said:

> > RAM speed and latency have no meaningful impact on game performance. See techbuyerguru's 2016 article on DDR4 performance in games, Crucial's official recommendations, etc. Just save that money for upgrading the important parts (CPU/GPU/PSU/mobo) of your PC somewhere down the line. Also, you can probably boost that kit just fine...

> >

>

> This myth again. RAM speed and latency definitely helps tremendously in CPU limited games like GW2. Especially if you manually tune all the 1st, 2nd and 3rd timings. XMP Profiles, which are usually used for reviews and tests, are utter garbage.

> Having 30+ fps at boss fights with medium character model limit and everything else set to the max (eg field of view and supersampling too) is quite possible with current hardware if you take the time to tune everything

 

Well, you also get 30+ fps with current hardware with medium character model limit without tuning anything else... Thing is, 30fps isn't great.


> @"Crinn.7864" said:

> This is demonstrably wrong. RAM doesn't matter when you're bottlenecking on the GPU, but in any game that isn't bottlenecking on the GPU, RAM absolutely does matter. To a point: beyond 3200 MHz it gets rather pointless, but going from 2133 MHz to 3200 MHz will show an advantage in a number of games.

 

> @"Malediktus.9250" said:

>This myth again. RAM speed and latency definitely helps tremendously in CPU limited games like GW2. Especially if you manually tune all the 1st, 2nd and 3rd timings. XMP Profiles, which are usually used for reviews and tests, are utter garbage.

 

The benchmarks I referred to were of course for normal games and not the handful of CPU-limited ones like GW2, so I thought you might be right about this for GW2.

Thank you both for giving me this opportunity to test the effects of RAM performance in Guild Wars 2.

 

I compared in-game performance and CPU utilization for my system at 2133CL13/15 vs 2666CL13/15 (neither XMP).

This should easily highlight and amplify any memory bottleneck the game has, with a substantial (10+ fps) framerate difference in that scenario.

I did this in the worst-performing, most-visited, area of Lion's Arch, namely the TP/Bank area.

This was done at the same in-game time of day and with similar player counts.

 

**Memory Test Results:**

2133 CL13: 41 fps avg / 26 fps min

2133 CL15: 41 fps avg / 25 fps min

2666 CL13: 43 fps avg / 29 fps min (fps was slightly more stable than 2133)

2666 CL15: 43 fps avg / 28 fps min (see above)

CPU Utilization was pretty much identical in all tests.

GPU Utilization was around 50% in all tests.

 

I also tested the effects of CPU overclocking with my normal 2666 CL13 for comparison.

The uptick is very similar at 500MHz, seeing as the RAM uptick was 533MHz.

This overclock should have very limited results if there's a significant memory bottleneck in place.

 

**CPU Overclock Test Results:**

5820k @ 3.7GHz: 33 fps avg / 14 fps min

5820k @ 4.2GHz: 43 fps avg / 29 fps min (fps was substantially more stable)

 

**Totals:**

Memory overclock: **2-4 fps** gain from 2133CL15 to 2666CL13 - small stabilizing effect.

CPU overclock: **10-15 fps** gain from 3.7GHz to 4.2GHz - substantial stabilizing effect.

Player Model Quantity: **4-7 fps** gain from High to Medium - noticeable stabilizing effect.
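To put the comparison in relative terms, a quick back-of-the-envelope calculation using only the averages reported above (just arithmetic on those numbers, not new measurements):

```python
# Relative gains from the results above: a ~25% memory clock bump (2133 -> 2666)
# bought ~5% average FPS, while a ~13.5% CPU clock bump (3.7 -> 4.2 GHz)
# bought ~30% average FPS.
def pct_gain(before, after):
    return 100.0 * (after - before) / before

print(f"RAM clock +{pct_gain(2133, 2666):.0f}% -> avg FPS +{pct_gain(41, 43):.0f}%")
print(f"CPU clock +{pct_gain(3.7, 4.2):.1f}% -> avg FPS +{pct_gain(33, 43):.0f}%")
```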

 

 

> @"Crinn.7864" said:

> This is demonstrably wrong.

 

I believe I have just demonstrated the opposite, on my system anyway.

The benchmarks and Crucial's statement I mentioned don't seem to agree with you either.

There are exceptions of course, like PUBG where you could gain dozens of fps by tweaking RAM, but Guild Wars 2 doesn't appear to be one.

 

 

> @"Crinn.7864" said:

> I play on a 24" 4k monitor. It's fine, and the pixel density is really nice.

 

I'm sure it is in Windows and in Guild Wars 2, but what do you do when you use one of the many applications or most games that don't support scaling?

I've used 4K on small screen sizes before, it's pleasant right until you hit that no-scaling wall, which is still all-too-common.

 

 

> @"Malediktus.9250" said:

> RAM speed and latency definitely helps tremendously in CPU limited games like GW2.

 

Perhaps it does on your system, but on mine even a massive change in memory frequency & latency apparently only results in a small improvement.

Meanwhile a similar CPU overclock has an enormous effect, illustrating that any memory bottleneck is relatively insignificant.

It's certainly not worth stating that "there is no CPU bottleneck, only a RAM bottleneck" the way Crinn did.

And my CPU/memory combination is reasonably comparable to the OPs in terms of raw performance.

 

 

> @"Malediktus.9250" said:

> Having 30+ fps at boss fights with medium character model limit and everything else set to the max (eg field of view and supersampling too) is quite possible with current hardware if you take the time to tune everything

 

Not news to me.

I did say that you could probably hit 45+ fps in the busiest parts of LA with the right processor and overclock.

 

 

> @"ReaverKane.7598" said:

> Well, you also get 30+ fps with current hardware with medium character model limit without tuning anything else... Thing is, 30fps isn't great.

 

This.


> @"ReaverKane.7598" said:

> > @"Malediktus.9250" said:

> > > @"Droniac.8153" said:

> > > RAM speed and latency have no meaningful impact on game performance. See techbuyerguru's 2016 article on DDR4 performance in games, Crucial's official recommendations, etc. Just save that money for upgrading the important parts (CPU/GPU/PSU/mobo) of your PC somewhere down the line. Also, you can probably boost that kit just fine...

> > >

> >

> > This myth again. RAM speed and latency definitely helps tremendously in CPU limited games like GW2. Especially if you manually tune all the 1st, 2nd and 3rd timings. XMP Profiles, which are usually used for reviews and tests, are utter garbage.

> > Having 30+ fps at boss fights with medium character model limit and everything else set to the max (eg field of view and supersampling too) is quite possible with current hardware if you take the time to tune everything

>

> Well, you also get 30+ fps with current hardware with medium character model limit without tuning anything else... Thing is, 30fps isn't great.

 

30 isn't great, but at least it's enough to use G-Sync or FreeSync.

But you're right, it's not great considering I'm using a 1080 Ti @ 2 GHz, an 8700K @ 5 GHz (4.8 GHz cache) and 16GB of 4266-17-17-17-28 RAM with all the subtimings tuned to just 1 off from causing crashes. Maybe DDR5 will make those speeds more accessible; we will see.

 


> @"Kheldorn.5123" said:

> This is a 6-year-old game on a 12-year-old engine. Reality check required.

WoW is running on a 17+ year old engine, an upgraded version of the Warcraft 3 engine. And Blizzard upgraded it so that it could use more CPU threads, and they upgraded it to support DX11 and 12.


The difference is that WoW is a subscription-based game, so it has more money and also more to lose (if people get fed up and stop playing & paying, that hurts them). GW2 is buy-once, so if people get fed up and stop playing, Anet just loses out on microtransactions (or potential purchases of the next expansion). There is probably some lost revenue from people trying the F2P version, saying "performance sucks", and deciding not to buy it, but that is hard to measure.

I'm sure most everyone would be happy if GW2 had a newer & better-performing engine; however, it would not directly make them much money. In comparison, a new expansion does make them a bunch of money. If you can only do one of those due to resources, I think it is pretty clear which one gets done.

 

