LEDs, like all semiconductor devices, are designed to operate at specific currents. Exceed that current and the PN junction fails (burns out). For these examples we'll assume an LED with a voltage drop of 3V and a 12V supply (I've also included the required power rating of the resistor in watts). Note that unless the LED is connected directly across the supply, the voltage across it will be 3V, but not necessarily at a safe working current. The remaining 9V is dropped across the series resistor, yielding the current and power dissipation figures shown below.
1Ω = 9000mA (9A) = 81W
10Ω = 900mA (0.9A) = 8.1W
100Ω = 90mA (0.09A) = 0.81W
250Ω = 36mA (0.036A) = 0.324W
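The figures above follow directly from Ohm's law and the power formula. The sketch below reproduces the table; the function name and structure are my own for illustration:

```python
def resistor_current_power(v_supply, v_led, r):
    """Current through and power dissipated by the series resistor."""
    v_r = v_supply - v_led   # voltage dropped across the resistor (9V here)
    i = v_r / r              # Ohm's law: I = V / R
    p = v_r * i              # power: P = V * I (equivalently I^2 * R)
    return i, p

# Reproduce the table: 12V supply, 3V LED drop
for r in (1, 10, 100, 250):
    i, p = resistor_current_power(12, 3, r)
    print(f"{r} ohm: {i * 1000:g} mA, {p:g} W")
```

Running this prints the same current and power values listed above, e.g. 250 ohm gives 36 mA and 0.324 W.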
As I mentioned above, you should use the nearest preferred-value resistor to the figure returned by the calculation. If the calculation happens to return a preferred value, use that. On no account should you use a resistor of lower value than the calculated figure, as the current through the LED would then exceed its rated value, shortening its life (if it doesn't burn out immediately).
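The "never go lower" rule can be automated by rounding up to the next preferred value. As a sketch, this uses the standard E12 series (the helper name and the 20mA example current are my own assumptions, not from the text above):

```python
import math

# E12 preferred values for one decade
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_preferred(r):
    """Smallest E12 value >= r, so LED current never exceeds its rating."""
    decade = math.floor(math.log10(r))
    for d in (decade, decade + 1):       # check this decade, then the next
        for v in E12:
            candidate = v * 10 ** d
            if candidate >= r - 1e-9:    # tolerance for float rounding
                return candidate
    raise ValueError("value out of range")

# A 20 mA LED on a 12 V supply with a 3 V drop: R = (12 - 3) / 0.02 = 450 ohm
print(next_preferred(450))  # -> 470.0 (the next E12 value up from 450)
```

Rounding up to 470 ohm gives roughly 19 mA, slightly under the 20 mA rating, which is exactly the safe side to err on.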