When a player runs most PC games for the first time, the graphics settings are chosen automatically, based on a rough estimate of what their computer can handle. So if the computer is high end and the game is relatively simple, the graphics are maxed out on the first run. Though I've noticed that games are sometimes quite far off in their estimates.
So my question is: how do developers determine these defaults? Are they actually querying the hardware, or are they running some sort of test in the background? Does anybody have any experience or ideas?
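To clarify what I mean by "querying the hardware", I imagine something like the rough sketch below: on Windows, DXGI can report the primary GPU's name and dedicated VRAM, which could then be mapped to a quality preset. The VRAM thresholds and preset names here are made up purely for illustration, and I assume real games do something more sophisticated (GPU lookup tables, timed tests, etc.).

```cpp
// Rough sketch of the "check the hardware" idea on Windows: read the primary
// GPU's description and dedicated VRAM via DXGI and map it to a preset.
// The thresholds and preset names are arbitrary examples, not a real heuristic.
#include <dxgi.h>
#include <iostream>
#include <string>

#pragma comment(lib, "dxgi.lib")

std::wstring PresetFromVram(unsigned long long vramBytes)
{
    const unsigned long long GiB = 1024ull * 1024ull * 1024ull;
    if (vramBytes >= 8 * GiB) return L"Ultra";   // thresholds invented for illustration
    if (vramBytes >= 4 * GiB) return L"High";
    if (vramBytes >= 2 * GiB) return L"Medium";
    return L"Low";
}

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    // Adapter 0 is normally the primary GPU the game would render on.
    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == DXGI_ERROR_NOT_FOUND)
    {
        factory->Release();
        return 1;
    }

    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);

    std::wcout << L"GPU: " << desc.Description << L"\n"
               << L"Dedicated VRAM (MiB): " << desc.DedicatedVideoMemory / (1024 * 1024) << L"\n"
               << L"Suggested preset: " << PresetFromVram(desc.DedicatedVideoMemory) << L"\n";

    adapter->Release();
    factory->Release();
    return 0;
}
```

Is it something along these lines, a background benchmark, or a combination of both?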