Gehenna.3625 Posted August 7, 2018

> @"Jamesst.7523" said:
> Thank you all for your kind answers and help. I think I realized I need a better motherboard, or a Z-series one, to overclock my CPU. Apparently it's running at 4.3 GHz after setting the CPU ratio to 47 in the motherboard settings.
> @"Malediktus.9250" Would you please explain in a bit more detail how the 4.7 GHz turbo ends up as 4.3 GHz? I don't understand how the 4.7 GHz turbo is 4.3 on all cores at all.
> And if I understand correctly, if I only enable 1 core at 4.7 GHz as you said, will the game run smoother? That's something I'm looking forward to hearing more about from you. Thank you all for your kind help again.
> @"Gehenna.3625" I'll try what you said, but could you mention some statistics about how much the FPS went up after you tried something other than the HDMI cable?

Well, I didn't specifically check the fps, but the result was instantly noticeable: the sluggishness disappeared and the graphics were much crisper. I hope you'll understand I'm not going to dig that old cable back out and hook it up again :)

Also, the CPU at 4.3 GHz should be more than plenty, so I don't see that as an issue. There could also be server-side issues that you may not be aware of; MMOs are a bit notorious for that.

Just a question, since I don't remember if you mentioned it: did you compare GW2's performance to other MMOs, or only to single-player games?
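To unpack the 4.7 vs 4.3 GHz confusion: Intel's Turbo Boost picks a clock based on how many cores are active, so the advertised 4.7 GHz only applies under a single-core load. A minimal sketch of that as a lookup table; only the 1-core (4.7 GHz) and all-core (4.3 GHz) figures come from this thread, and the intermediate bins are typical published values assumed here for illustration:

```python
# Approximate Turbo Boost bins for an i7-8700K, in GHz.
# Endpoints (4.7 single-core, 4.3 all-core) are from the thread;
# the 2-5 core steps are assumed typical values, not verified.
TURBO_BINS = {1: 4.7, 2: 4.6, 3: 4.5, 4: 4.4, 5: 4.4, 6: 4.3}

def turbo_clock(active_cores: int) -> float:
    """Boost clock (GHz) for a given number of busy cores."""
    clamped = min(max(active_cores, 1), 6)
    return TURBO_BINS[clamped]

# A game that keeps all six cores busy runs at the all-core bin,
# not the headline turbo speed:
print(turbo_clock(1))  # lightly threaded load -> 4.7
print(turbo_clock(6))  # all cores busy        -> 4.3
```

So seeing 4.3 GHz with the game running is expected behaviour, not a failed overclock.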
Magnus Godrik.5841 Posted August 7, 2018

> @"Emberstone.2904" said:
> > @"Magnus Godrik.5841" said:
> > Your gpu may be holding your cpu back. And if you OC your rig, make sure you do it manually; that auto kitten sucks. To get its full processing potential you need fast memory, otherwise you'll never hit that 4.7. On my rig I needed XMP memory modules to hit 5.0. I didn't like the temps, so I dropped it to 4.7 with a steady 65°C under full load. Your monitor can affect this as well, especially refresh rates and resolutions. The chip may be fantastic, but it can't reach its full gaming potential with an outdated card.
>
> I wish people would stop saying the GPU is the problem. If I can max the game on a Radeon HD 7950 (a midrange card from 2012) and get 60 FPS in most of the game, excluding world events, then a 1060 3GB will have no problem.
>
> No, the VRAM isn't holding him back. I just finished a full-map Octovine in Auric Basin and I'm at 1.3GB of VRAM used on my GTX 1070. 3GB is plenty.
>
> The issue is elsewhere.

A 1070 is over 30% better than a 1060. Memory does affect the CPU in some ways, especially if you are trying to OC. It could be anything from a bad slot to just wrong settings.
Goettel.4389 Posted August 7, 2018

I can only give you one reference: my i7-7600K/16GB/GTX 1070 churns out 100+ frames maxed out in core Tyria, and even with large zergs it never drops so low that I feel I need to check the numbers, so it's probably always at least 60-ish. In a DS meta squad I have to drop either Character Model Limit or Character Model Quality to low or lowest to stay at 40+, and even then it dips into the high 30s sometimes. Shadows of course make some difference to these numbers, but not enough on my 1070 to matter; still, when I'm really into the game I do set them to off to have as much fluidity as possible.

PoF frames are worse than HoT's. It seems clear the engine has difficulty coping with the increased texture load and... stuff (I'm no techie).

Edit: playing at 3440x1440.
Emberstone.2904 Posted August 7, 2018

> @"Magnus Godrik.5841" said:
> > @"Emberstone.2904" said:
> > > @"Magnus Godrik.5841" said:
> > > Your gpu may be holding your cpu back. And if you OC your rig, make sure you do it manually; that auto kitten sucks. To get its full processing potential you need fast memory, otherwise you'll never hit that 4.7. On my rig I needed XMP memory modules to hit 5.0. I didn't like the temps, so I dropped it to 4.7 with a steady 65°C under full load. Your monitor can affect this as well, especially refresh rates and resolutions. The chip may be fantastic, but it can't reach its full gaming potential with an outdated card.
> >
> > I wish people would stop saying the GPU is the problem. If I can max the game on a Radeon HD 7950 (a midrange card from 2012) and get 60 FPS in most of the game, excluding world events, then a 1060 3GB will have no problem.
> >
> > No, the VRAM isn't holding him back. I just finished a full-map Octovine in Auric Basin and I'm at 1.3GB of VRAM used on my GTX 1070. 3GB is plenty.
> >
> > The issue is elsewhere.
>
> A 1070 is over 30% better than a 1060. Memory does affect the CPU in some ways, especially if you are trying to OC. It could be anything from a bad slot to just wrong settings.

The GPU core is irrelevant to this discussion. The core is the primary difference between a 1060 and a 1070; the architecture of the two cards, however, is identical, so how they both buffer textures and such in VRAM will be identical.

TL;DR: if I see 1.3GB of VRAM used on my 1070, he will see the same VRAM used on his 1060 in the same area with the same settings. Even across different architectures and brands you should still see roughly the same VRAM used, as the game itself is the primary factor determining what gets stored in the card's memory.

As for the GPU core itself, a 1060 3GB is overkill for this game given the era it was designed for (mid-2012). The game can easily be maxed on it given enough CPU performance, but since GW2 is GW2, he'll never get 60 FPS everywhere on high settings; zergs destroy framerates. System memory can be a potential problem here, and I'd recommend the OP take all his sticks out and try them in each slot one at a time to make sure every stick and slot works, but it has nothing to do with VRAM.
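A back-of-the-envelope sketch of why VRAM usage tracks the game's assets rather than the card tier: a texture of a given size costs the same bytes on any card. The sizing formula below (uncompressed RGBA plus roughly a third for the mipmap chain) is a generic approximation for illustration, not GW2's actual allocator, which also uses compression and render targets:

```python
# Rough texture VRAM estimate: width * height * bytes-per-pixel,
# plus ~1/3 extra for the full mipmap chain. Illustrative only.
def texture_bytes(width: int, height: int, bpp: int = 4,
                  mipmaps: bool = True) -> int:
    base = width * height * bpp
    return base * 4 // 3 if mipmaps else base

mib = texture_bytes(2048, 2048) / 2**20
print(f"one 2048x2048 RGBA texture with mips: ~{mib:.1f} MiB")
```

By this rough math, the ~1.3GB figure above corresponds to a few dozen large textures plus everything else resident, which fits comfortably in 3GB regardless of which GPU core sits behind the memory.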
Jamesst.7523 (Author) Posted August 7, 2018

OK guys, riddle me this: I just turned the "Shadows" option in the graphics settings from Ultra/High to Low and was astonished by a 20-30 FPS difference. From my previous info, shadows are a GPU setting, but when I fiddle with it, the GPU usage % doesn't change at all, and neither does the CPU's. So I was wondering which it falls under, the GPU or the CPU?
GDchiaScrub.3241 Posted August 9, 2018

> @"Jamesst.7523" said:
> OK guys, riddle me this: I just turned the "Shadows" option in the graphics settings from Ultra/High to Low and was astonished by a 20-30 FPS difference. From my previous info, shadows are a GPU setting, but when I fiddle with it, the GPU usage % doesn't change at all, and neither does the CPU's. So I was wondering which it falls under, the GPU or the CPU?

ANET's description suggests that the option conflates dynamic shadows (those that move due to characters and such) and static shadows (like those cast by immobile buildings). **I do think you have to reload the map too.** If it's similar to the Unreal engine, then the static shadows would generate a texture-like thing (a light map) to be stored in VRAM, while the dynamic shadows may be processed by the [GPU](https://docs.unrealengine.com/en-us/Engine/Performance/GPU "GPU") or [CPU](https://docs.unrealengine.com/en-us/Engine/Performance/CPU "CPU"). (Those links may use too much jargon for this thread.)

_I don't know where you're getting % usage from, which makes it hard to ground this comment._ It would also help if you laid out all your parts (and monitor, for that matter) in the OP, not spread out in replies, similar in format to a random custom PC builder like [this](https://www.maingear.com/boutique/pc/configurePrd.asp?idproduct=2746 "this"). A list of everything. I would never recommend overclocking anything without knowing your power supply, then your motherboard, followed by cooling. So it's hard for me to suggest anything beyond disabling shadows if getting 60+ FPS is your thing. Or, if stutters between 40 and 60 are the issue, frame-limit the game to 30 FPS.
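One way to see why capping at 30 FPS can feel smoother than an uneven 40-60, and why the same "20-30 FPS" drop can mean very different things: think in frame-time budgets rather than raw FPS. A minimal illustration:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds the engine has to produce one frame at a given FPS."""
    return 1000.0 / fps

# Frame-time budget at common caps:
print(frame_time_ms(30))  # capped at 30 FPS -> ~33.3 ms per frame
print(frame_time_ms(60))  # 60 FPS           -> ~16.7 ms per frame

# The same FPS gain is worth less at higher framerates:
# going 40 -> 60 FPS frees about 8.3 ms per frame,
# while 90 -> 110 FPS frees only about 2 ms.
saved_low  = frame_time_ms(40) - frame_time_ms(60)
saved_high = frame_time_ms(90) - frame_time_ms(110)
print(f"{saved_low:.1f} ms vs {saved_high:.1f} ms")
```

A steady 33.3 ms cadence at a 30 FPS cap can subjectively beat frame times that bounce between 16 and 25 ms, which is the logic behind the frame-limit suggestion above.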
jishi.7568 Posted August 17, 2018

There's an option in the advanced "Power" settings in Windows 10 (also 7 and XP): when you navigate into the advanced options there's a category for minimum and maximum processor state. Set the minimum to 95% and leave the maximum at 100%. Then go into your Nvidia graphics settings and adjust every option for performance. Now try GW2 again with your OC options set, and update us on the CPU temp and the CPU and GPU usage.
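The same minimum/maximum processor-state values described above can also be set from an elevated command prompt using powercfg's documented power-setting aliases. A sketch, not a recommendation; it modifies your active power plan, so note your original values first:

```shell
:: Set minimum processor state to 95% and maximum to 100% for the
:: active power plan on AC power, then re-apply the plan so the
:: change takes effect.
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMIN 95
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR PROCTHROTTLEMAX 100
powercfg /setactive SCHEME_CURRENT
```

This is equivalent to the GUI path (Control Panel > Power Options > Change advanced power settings > Processor power management), just scriptable.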
Sylent.3165 Posted August 17, 2018

I have a similar rig. Try setting character model limit and the other options to low detail and mess with the shadows; it made a huge difference for me.
Cobrakon.3108 Posted August 17, 2018

It's weird to think that even a 6-year-old game can't run at high FPS on a current high-end system.
Jamesst.7523 (Author) Posted August 19, 2018

Thank you all so much for the information and advice. I'll try everything you suggested, and sorry for taking so long; I was on vacation. @"Sylent.3165" Would you please tell me your specs, and your in-game graphics settings with average FPS in certain places?
sorudo.9054 Posted August 19, 2018

Don't think too much about your GPU; it's the CPU you have to worry about. Besides, this game is poorly optimized: even a monster of a computer drops FPS like a brick.
Jamesst.7523 (Author) Posted August 19, 2018

Do you really think that an i7 8700K is not enough for GW2 and should be "worried" about?