Legend of Grimrock blew up my computer...

Having trouble running Grimrock 1, or wondering if your graphics card is supported? Look for help here.
Crash
Posts: 97
Joined: Fri Mar 02, 2012 5:20 pm

Re: Legend of Grimrock blew up my computer...

Post by Crash »

Spathi wrote:Unfortunately it is a fad to run games at the highest possible framerate, pushed by PC marketers and noobs. It is a bad idea and does not add anything to the visual experience as the screen will only be 60 or 120 fps anyway. It actually results in screen tearing and is far worse to look at if anything.
It is my understanding that the idea behind trying to achieve the highest possible frame rates is also to achieve the highest minimum frame rate. I don't know of any games that run at a consistent frame rate, and it's not that I need any game to run at 300 fps; it's when they slow down to 15 fps that it becomes a problem. I personally use VSync because I detest tearing, but I try to have a fast enough video card (or cards) to allow the highest frame rates with this feature enabled. It has also been my understanding that with VSync enabled, if the computer is not fast enough to always maintain 60 fps to match the display, for example, the frame rate will drop dramatically by design. This is an excerpt from a HardOCP article on the new adaptive VSync feature:

"The cure to tearing is to turn VSync on. What this does is cap the game's framerates to the highest native refresh rate of your display. This means on our 60Hz display, the game won't exceed 60FPS. As most people consider 60 FPS to be a very smooth gameplay experience, this sounds like there would be no drawbacks, but unfortunately there is. The problem with turning VSync on is that the framerate is locked to multiples of 60. If the framerate drops even just a little below 60 FPS VSync will drop all the way from 60 FPS to 30 FPS. This is a huge drop in framerate, and that large change in framerate becomes noticeable to the gamer. The result is called stuttering, and when you are playing a game that consistently changes between only 30 and 60 FPS, the game speeds up and slows down and you feel this difference and it distracts from the gameplay experience. What's worse is that if the framerate drops ever so slightly below 30 FPS the next step down for VSync is 20 FPS, and then the next step down is 15 FPS. "

Therefore, a great advantage of having a graphics engine and card capable of running at extremely high frame rates is avoiding drastic drops in frame rate that can adversely affect gameplay. LoG could be extremely difficult to play, with regard to some of the timing puzzles and combat, if such stuttering were occurring.
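
To make the math concrete, here is a quick sketch of why VSync steps down in those fixed increments (assuming a plain 60 Hz display with no adaptive VSync or triple buffering):

```python
# With VSync on, every frame must be shown for a whole number of
# refresh intervals, so the only reachable rates are refresh / n.
refresh_hz = 60  # assumed display refresh rate

for n in range(1, 5):
    print(f"frame held {n} refresh(es) -> {refresh_hz / n:.0f} FPS")

# Prints the exact ladder from the HardOCP excerpt:
# 60 FPS, 30 FPS, 20 FPS, 15 FPS
```

The moment the renderer can't finish inside one 16.7 ms interval, it has to wait for the next one, so the rate falls straight from 60 to 30 rather than to 59.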

I suspect the reason this game seems to cause computers to overheat is low CPU usage: without the CPU as a bottleneck, the video card can render at the limit of its capability, causing it to run at higher temperatures than with some other games. I've seen the same thing happen when benchmarking my cards with a 3DMark utility, where the fans would run at top speed during testing, a phenomenon I rarely see in the course of playing any game. Hopefully enabling VSync in LoG will help, and those with 120 Hz monitors who are concerned about high temperatures may wish to enable VSync and run the monitor at 60 Hz while playing LoG.
Spathi
Posts: 199
Joined: Tue Mar 20, 2012 4:33 am
Location: Melbourne Australia

Re: Legend of Grimrock blew up my computer...

Post by Spathi »

That is just marketing... the full monitor frame rate, or half or a quarter of it, is better than anything in between. If you want a few milliseconds less lag, then you need exactly double the frame rate.

Stuttering comes from a slow, cheap PCIe bus or broken drivers, and it is noticeable if you use two cards. The latency on PCIe 3 is lower, so stuttering will disappear, or at least return to the quality of AGP's parallel bus. (I'm not talking about bandwidth, but about the time it takes one tiny message to traverse the bus.) So this affects the control signals when two cards are trying to coordinate who has which triangles where. It also happens between the CPU and GPU on the bus; old single-threaded CPUs had this problem. Even though the first hyperthreaded CPUs added only about 3% more bandwidth performance when running flat out, they reduced stuttering, because the extra hyperthread helped remove delays between threads.

The drop in frames is noticeable because it increases the pause between frame changes, especially when it drops below 24 fps... well, probably 48, as I explain below. I would not call this anomaly stuttering, though. This is why movies chose 24: they researched what the human brain was comfortable with. Movies showed a black frame between each frame, so it was effectively 48 fps and the brain filled in the black one... this is why TVs and monitors need this (48) or higher. Your mains power comes out at 50 or 60 Hz, which is why those rates were chosen... but 48 Hz is what your brain wants to see, more so than 50 or 60, according to the research they did.
pilgrimboy wrote:Anyway, arguing that it isn't their fault when it appears to be is ridiculous. Something in this game is causing it to overheat computers. I would appreciate it if they fixed that, so I can play the game without worrying about frying my computer.
Dude, that is just ridiculous. I suppose they should have turned VSync on by default; it annoys the heck out of me looking for the setting in each and every game. They have put it in the patch, though. My guess is (I may be wrong) that this game has large textures or bitmaps for the walls... the card has to move those walls across the quadrants of the video card and use transcendental calculations that use up the pipelines faster than a game like Skyrim does. Most video cards should still be able to handle it, though. Maybe some video cards from 2006 to 2008 may struggle, but some from before then and most gaming cards from after should be fine.

This, for example, will struggle (maybe): http://en.wikipedia.org/wiki/R600_%28ASIC%29
A shader cluster is organized into 5 stream processing units. Each stream processing unit can retire a finished single precision floating point MAD (or ADD or MUL) instruction per clock, dot product (DP, and special cased by combining ALUs), and integer ADD.[3] The 5th unit is more complex and can additionally handle special transcendental functions such as sine and cosine.[3] Each shader cluster can execute 6 instructions per clock cycle (peak), consisting of 5 shading instructions plus 1 branch.[3]
The reason being that there may not be enough shader clusters to go around if the 5th unit is being used heavily.

That is, if it still uses the GPU heavily when you move while VSynced... I have not even looked, lol. (Crash is also right about the non-VSynced (and 120 Hz) case allowing too many frames; we are sort of dancing around two issues in this thread, maybe.)
Crash
Posts: 97
Joined: Fri Mar 02, 2012 5:20 pm

Re: Legend of Grimrock blew up my computer...

Post by Crash »

I should unlock VSync, plug into the 120 Hz monitor, and log temperature, fan speed, GPU usage, and CPU usage on my SLI and single-card systems to see what happens.
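
For anyone who wants to try the same thing on an NVIDIA card, here's a minimal logging sketch (assuming nvidia-smi is on the PATH; the exact query fields vary between driver versions, so check nvidia-smi --help-query-gpu):

```python
# Polls nvidia-smi once per second and appends temperature, fan speed
# and GPU utilization to a CSV file. Assumes an NVIDIA card with
# nvidia-smi available on the PATH.
import subprocess
import time

LOGFILE = "gpu_log.csv"  # hypothetical output file name
FIELDS = "timestamp,temperature.gpu,fan.speed,utilization.gpu"

with open(LOGFILE, "a") as log:
    while True:
        result = subprocess.run(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
            capture_output=True, text=True, check=True)
        log.write(result.stdout)
        log.flush()
        time.sleep(1)
```

CPU usage would need a separate tool (Task Manager, or a utility like HWiNFO), but this covers the GPU side.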
Lmaoboat
Posts: 359
Joined: Wed Apr 11, 2012 8:55 pm

Re: Legend of Grimrock blew up my computer...

Post by Lmaoboat »

This is probably just a shot in the dark, but if you're playing the Steam version, you might want to check the Steam overlay process. I remember that whenever I brought it up while playing Skyrim, it would start to slow down the game, and my computer would start overheating and sound like a leaf blower.
mercenar
Posts: 6
Joined: Tue Mar 27, 2012 10:12 pm
Location: Belgium

Re: Legend of Grimrock blew up my computer...

Post by mercenar »

It's funny that on EVERY forum for EVERY video game, there's at least one guy saying the game made his computer explode =D !!

May all those machines rest in peace or pieces...
e8hffff
Posts: 82
Joined: Mon Apr 09, 2012 10:01 pm

Re: Legend of Grimrock blew up my computer...

Post by e8hffff »

Most likely it's caused by a power supply that isn't beefy enough. Make sure your PSU has the amperage to drive your CPU and graphics card at full demand.
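
A rough way to sanity-check that (the wattages below are placeholders, not measured figures; look up your own parts' real draw): most of the load sits on the PSU's +12 V rail, and amps = watts / volts.

```python
# Back-of-the-envelope +12 V rail check with illustrative numbers.
cpu_watts = 125    # assumed CPU TDP
gpu_watts = 150    # assumed GPU board power
other_watts = 75   # rough guess for fans, drives, motherboard
headroom = 1.25    # ~25% margin so the PSU never runs flat out

total_watts = (cpu_watts + gpu_watts + other_watts) * headroom
amps_12v = total_watts / 12.0
print(f"~{total_watts:.0f} W total, ~{amps_12v:.0f} A on the +12 V rail")
```

If your PSU's label shows less 12 V amperage than that, it's a suspect.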
Last edited by e8hffff on Mon Apr 23, 2012 9:04 am, edited 1 time in total.
Arctor
Posts: 89
Joined: Mon Mar 26, 2012 7:50 pm
Location: Phoenix, Arizona, USA

Re: Legend of Grimrock blew up my computer...

Post by Arctor »

It should also be pointed out that not all video cards are created equal, even within the same product run. Variations in the thermal paste under the heat sink, for instance, can dramatically change the way two identical pieces of hardware behave. I had a video card in my desktop machine whose fan ran constantly, even during the simplest tasks. It burned itself up over several months. I replaced it with an identical card (the former was still under warranty and I received a direct replacement), and I've yet to actually hear the new one's fan kick in, even in games that used to get the first card blowing like a hurricane.

But really, it's that virus from 1987. It lives in the boot sector of your 3.5" floppies. Watch out for it.
armac911
Posts: 1
Joined: Sat Jun 09, 2012 1:00 am

Re: Legend of Grimrock blew up my computer...

Post by armac911 »

Yes, there is something odd in this game engine that forces the GPU to run at 95%+ load, but it's not the textures or the pixel shaders. I think it's some sort of complex algorithm in the engine (similar to FurMark) that forces the GPU to run at nearly 100%.

To all the people complaining: software, whether this game or any other, can't fry your PC. No software can. If your GPU or PSU died because of LoG, then there was something wrong with your computer in the first place. Always run stress tests when assembling or buying a new PC: run FurMark, Prime95 and LinX for a couple of hours. Stability testing is often overlooked.

Yes, I am experiencing the same high GPU usage as everyone else, but LoG didn't fry my PC. It's just running a little louder and wasting some extra electricity. That's all.

And yes, turning on V-Sync helps, of course. Without it I was getting 250-300 FPS at a GPU load of 94%; with V-Sync on, it locked to 60 FPS and GPU usage of 40%. That seems right. You can't see above 25 fps anyway, although you can "feel" them all the way up to 50-60. Everything above that is a waste of resources.
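
You can get the same saving without V-Sync by capping the frame rate in software. A minimal sketch of the idea (a generic game-loop shape, not Grimrock's actual engine code):

```python
# Frame limiter: sleep away the time the GPU would otherwise spend
# rendering frames nobody can see. Generic sketch, not engine code.
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # seconds per frame

def run_frame():
    """Placeholder for the game's update + render work."""
    pass

while True:
    start = time.perf_counter()
    run_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_TIME:
        # Idling here instead of starting another frame is exactly
        # what drops the GPU load from ~94% to ~40%.
        time.sleep(FRAME_TIME - elapsed)
```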

Phenom II X4 955
2 x AMD 6870
CM Haf 932
Limba
Posts: 13
Joined: Mon Apr 23, 2012 8:44 pm
Location: Finland - Oulu

Re: Legend of Grimrock blew up my computer...

Post by Limba »

Did NVIDIA ever fix that fan control bug?
It was overheating cards through faulty fan control.
This was some time ago.
glyn_ie
Posts: 120
Joined: Mon Mar 05, 2012 4:19 am

Re: Legend of Grimrock blew up my computer...

Post by glyn_ie »

To be fair to the developers, I would say that it's s***ty GPU design rather than a coding error. Graphics card drivers should protect against this kind of nonsense.