Legend of Grimrock blew up my computer...
Re: Legend of Grimrock blew up my computer...
That makes sense - when I run graphics benchmarks on my GTX 275 SLI system, the fans get extremely fast and loud, but they are never that active in other games. The cards must be working very hard without being CPU limited, just as in Grimrock. I agree that any card should be designed to handle that cooling load, because it is only a matter of time before some game pushes the card to its performance and temperature limits. I have heard of cases where a driver failed to control fan speed correctly; on very hot days, I have sometimes used a fan-speed utility to force the fans to top speed.
Re: Legend of Grimrock blew up my computer...
Guys, graphics cards do what they're told to do by games.
This is the only game I've ever played in my 2 years with this card that pushes it like this.
Yes, I agree, there is an easy explanation - the game runs so fast that it can ask for several hundred frames per second.
I.e. it can ask for pretty much everything the graphics card is capable of.
This draws power and generates heat for no in-game benefit.
The game should limit itself to 100 frames per second max, or something like that.
It's not a driver issue.
Running graphics cards that hard decreases their life spans whether it makes them explode immediately or not.
This should get fixed. It's risky for those with powerful computers: if one doesn't notice the increased fan noise, or doesn't know to change the VSync option from its default of "no limit", this game will wear down their expensive hardware.
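For what it's worth, the cap being asked for here is only a few lines of code. Here is a minimal sketch in Python (all names are hypothetical; a real game would do this inside its engine's render loop, and enabling VSync achieves a similar effect via the monitor's refresh rate):

```python
import time

TARGET_FPS = 100                  # hypothetical cap, as suggested above
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds each frame is allowed to take

def sleep_needed(elapsed, budget=FRAME_BUDGET):
    """How long to sleep so the frame takes at least `budget` seconds."""
    return max(0.0, budget - elapsed)

def run_frame(render, budget=FRAME_BUDGET):
    """Render one frame, then idle away the rest of the frame budget
    instead of immediately starting the next frame."""
    start = time.perf_counter()
    render()
    time.sleep(sleep_needed(time.perf_counter() - start, budget))
```

With a cap like this the GPU idles between frames instead of redrawing the same scene hundreds of times per second, which is exactly the load people in this thread are complaining about.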
Re: Legend of Grimrock blew up my computer...
I have to agree that there is something wrong with the game. I'm running a GTX 460 in a HAF 932 with four 200 mm fans at 800 rpm, completely dust-free, and the heat levels on my GPU are reaching critical levels while playing LoG (over 90 °C, while the maximum safe temperature for the card is 104 °C), checked with a software monitor and an Aerocool V12 hardware temperature monitor.
It's really the only thing that is bugging me; otherwise it is still my GotY so far!
Re: Legend of Grimrock blew up my computer...
THEaaron wrote: In this case, the game will generate something like 150-200 frames per second using all the power of your GPU - which puts a heavy load on it. This is just like a real GPU benchmark, where there are no limiting factors like the CPU.
So, just putting a framerate limit in the GPU driver would fix this problem easily.
For Nvidia users, you can use Nvidia inspector to limit framerate of any game/application.
Re: Legend of Grimrock blew up my computer...
Hamda wrote: It's not a driver issue.
Yes it is. And I am quite baffled why this seems to be so hard to see.
The game doesn't communicate directly with the hardware; the driver sits in between. A game doesn't have to be bothered with exactly what hardware a computer runs - it only uses an API (be it DirectX or OpenGL or whatever) and issues commands through it. These commands are processed by the driver. There is no way a game could possibly say "Now burn the hardware to death!!" - all it does is issue drawing commands to the underlying driver via the respective API. It is the responsibility of the driver to process these commands within the specifications of the hardware, so that it is not damaged.
The driver is always in between a game and the hardware, there is no way to bypass it for a non-privileged program. If this wasn't the case, then any malicious program, maybe a virus, could destroy your computer, simply by overloading the GPU. No credible OS would allow this to happen.
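The layering described above can be illustrated with a toy sketch (every name here is made up for illustration; a real game goes through DirectX or OpenGL, and a real driver is vendor code enforcing clocks, fan curves, and thermal limits):

```python
class ToyDriver:
    """Stands in for the GPU driver: it alone 'touches the hardware',
    and it is where hardware limits would be enforced."""
    def __init__(self):
        self.commands_executed = 0

    def submit_draw(self, vertex_count):
        # A real driver validates, schedules, and paces this work
        # within the card's specifications before the GPU sees it.
        self.commands_executed += 1
        return vertex_count  # pretend the triangles were drawn

class ToyGame:
    """The game never sees the hardware - it only issues API-style
    drawing commands to whatever driver is installed."""
    def __init__(self, driver):
        self.driver = driver

    def render_frame(self):
        self.driver.submit_draw(vertex_count=3)  # one triangle

driver = ToyDriver()
game = ToyGame(driver)
for _ in range(5):
    game.render_frame()
```

The point of the sketch is structural: the game can only ask for work; it has no channel through which to command the hardware to exceed its specifications.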
Re: Legend of Grimrock blew up my computer...
Hamda wrote: It's not a driver issue.
haffax wrote: Yes it is. And I am quite baffled why this seems to be so hard to see.
No, it's not.
You cannot overload a GPU (or a CPU). It will do what it is told as fast as it can; that's why it is there in the first place.
It will draw more power and generate more heat and it will be fine if it is cooled sufficiently.
Besides:
The GF104 Fermi GPU has a thermal threshold of 104 °C; if that is reached, it will simply blank your screen and run the fan at 100% to cool down.
However, if your power supply vaporizes, the damage can be huge all across the mainboard and the rest of the hardware.
Re: Legend of Grimrock blew up my computer...
It is my understanding that an overtaxed power supply may output unpredictably, which could mean that some users are experiencing problems with a video card whose power demands are too close to the limits of their power supply. Another possible explanation is a defective power supply, unable to provide what the card needs. I have had more than one PSU go boom under load, so now I'm using a 1200 W model.
Re: Legend of Grimrock blew up my computer...
No game, application, or other piece of non-driver software is going to have the ability to force your GPU (or any other piece of hardware) to overheat itself. The problem here is that these PCs probably have a combination of undersized power supplies and improper cooling/ventilation. Saying that "other games don't cause this to happen" is irrelevant; any game or application that taxes your graphics card in the same way would have the same effect. Properly cool your machine and you will not have these problems (and I don't mean properly cool it so it works with most of the games you play - properly cool it so that even when running at full capacity the card is not in a position to overheat).
- Grindlepol
Re: Legend of Grimrock blew up my computer...
Just a quick little note.
Well, it did not blow up my PC, but it did lock it up, though.
I uninstalled and reinstalled, but it still locked up the PC (not a blue screen - a total lock-up).
After reading the forum here I enabled VSync and wow, working fine; did the first level with no problems (PC problems, that is).
Waiting for the patch now - to adjust the volume.
Re: Legend of Grimrock blew up my computer...
Hamda wrote: It's not a driver issue.
haffax wrote: Yes it is. And I am quite baffled why this seems to be so hard to see. The game doesn't communicate directly with the hardware. There is the driver in between. A game doesn't have to be bothered with what hardware a computer exactly runs, but only uses an API (be it DirectX or OpenGL or whatever) and issues commands with it. These commands are processed by the driver. There is no way a game could possibly say: "Now burn the hardware to death!!" all it does is issue drawing commands to the underlying driver via the respective API. It is the responsibility of the driver to process these commands in a way that is within the specifications of the hardware, so that it is not damaged.
Sigh.
It's physics and unavoidable reality that if you drive the card to its limit and keep it there, it reduces lifespan. No, it shouldn't ever cause the card to just spontaneously blow up, and yes, it should never be possible to push a card to the point that it does explode. But if there is no benefit to pushing the card to that point, a game should not do it. If nothing else, the additional noise from increased fan speed should be enough to justify that rule of thumb. Or the additional power usage.
A graphics card going at full tilt for extended periods of time will reduce its life expectancy. It will also increase the chances of dormant power supply issues rearing their heads, which can wreak havoc on the whole system. This is not an ideal world in which this stuff doesn't matter in practice.
When a game does this for no human-discernible effect on performance, it is a flaw in the game's code. The game should throttle itself, as most games do. (Try going into a small simple room with no enemies in most shooters and watch the frame rate counter - it'll generally cap out at a nice round number, like 100. This is the *game* throttling its demands on the GPU.)
This isn't really debatable, and I suspect that if it's not fixed, this is the sort of thing Steam will pull the game from their servers for. It's a big deal, regardless of the apologists on this forum.
Luckily easily fixable, once one notices, by setting the vsync option. Very glad I noticed, due to the increased fan noise.