World of Warcraft running on Windows Server 2012 R2



      I’m having some serious performance issues with World of Warcraft running on a Windows Server 2012 R2 (Datacenter) installation on my system, which features an Intel Core i7-3930K, an ASUS Rampage IV Extreme motherboard and a GeForce GTX 770 graphics card.

      I have been running Windows Server converted into a more “workstation”-like installation for some time, mainly because I’ve got loads of Western Digital RE4 2TB hard drives that are shared locally on my network and formatted with ReFS.
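      Just for reference, and purely as an illustration on my part (the drive letter below is a placeholder, not my actual layout), a quick Python sketch like this can be used on Windows to confirm which filesystem a volume is actually running, e.g. that a drive really reports ReFS:

      import ctypes

      def filesystem_name(root: str = "D:\\") -> str:
          # Ask Windows for the filesystem name (e.g. "NTFS" or "ReFS") of a volume root.
          fs_name = ctypes.create_unicode_buffer(32)
          ok = ctypes.windll.kernel32.GetVolumeInformationW(
              ctypes.c_wchar_p(root),   # volume root path, e.g. "D:\\"
              None, 0,                  # skip the volume label
              None, None, None,         # skip serial number, max component length, flags
              fs_name, len(fs_name),    # buffer that receives the filesystem name
          )
          if not ok:
              raise ctypes.WinError()
          return fs_name.value

      print(filesystem_name("D:\\"))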

      Normally I haven’t seen any major performance difference between Windows Server 2008 and Windows Vista, Windows Server 2008 R2 and Windows 7, Windows Server 2012 and Windows 8, or Windows Server 2012 R2 and Windows 8.1, so I find it odd that World of Warcraft is having performance issues all over the place.

      Running an Intel Core i7-3930K, 24GB of DDR3 2133MHz RAM and a GeForce GTX 770, with the operating system installed on an Intel 320-series 300GB SSD and the actual game on an Intel X25-M Gen2 120GB SSD, I shouldn’t really be seeing any major performance issues. And one thing that makes me feel something really odd is going on is the fact that lowering the resolution, disabling anti-aliasing and lowering the graphics settings does not really do much about the performance problems.

      With everything on the lowest settings and a resolution of 800×600 I’m still seeing a peak of perhaps 50-60 FPS, but it often dips to about 8-15 and the average is normally around 20-30 FPS. If I crank things up to my native 1920×1080 resolution, full anti-aliasing and high settings I’m still seeing roughly the same performance, while going ultra drops the peak to 30-40 FPS, the dips to around 5-10 FPS and the average to about 15-25 FPS.

      Booting Windows 8.1 Update 1, the very same hardware is fully capable of pushing ultra settings with peaks around 50-70 FPS, dips at about 25-30 and an average around 40-60, which is acceptable and more in line with what I would expect from the GTX 770. Running high settings more or less ensures 60+ FPS at all times, except in situations where you get limited by the servers because there are loads of people all around you.

      What gives? How come the performance under Windows Server 2012 R2 is not remotely close to the Windows 8.1 Update 1 performance? Why doesn’t lowering the in-game graphics settings have any impact on the performance numbers? Running 800×600 and the lowest settings should normally ensure 60+ FPS even on Intel HD 4000 graphics… even Intel HD 3000…
