Hoagy/Copper/SineWave, at the risk of boring everyone with a further dose of electrical theory, I think you're all mistaken: I doubt there's any domestic lighting dimmer that works simply by adjusting the voltage. That's not how they work (if it were, they'd have to dissipate all the excess energy as heat themselves), so simply quoting Ohm's law isn't that relevant to what's going on.
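To put a number on that, here's a hypothetical back-of-envelope sketch (figures are illustrative: 230 V mains, a 60 W bulb treated as a constant resistance) of a "dimmer" built as a plain series resistor, i.e. straight Ohm's law voltage reduction:

```python
V_MAINS = 230.0
bulb_r = V_MAINS ** 2 / 60.0          # ~882 ohm "hot" resistance (assumed constant)

# Halve the bulb voltage with an equal series resistance acting as the "dimmer":
series_r = bulb_r
i = V_MAINS / (bulb_r + series_r)     # same current through both
p_bulb = i ** 2 * bulb_r              # power actually reaching the bulb
p_dimmer = i ** 2 * series_r          # power the "dimmer" must burn off as heat

print(round(p_bulb, 1), round(p_dimmer, 1))  # 15.0 15.0
```

In other words, dimming a 60 W bulb down to 15 W this way would mean the dimmer itself dissipating another 15 W inside the wall box, which is exactly why real dimmers don't do it.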
Dimmers work by switching on and off during each cycle of the AC mains. It is this switching, and the consequent severe distortion of the AC waveform, that can cause transformers to operate outside their ratings (the science bit: "due to the high harmonic content and out-of-phase components in the signal the transformer may saturate"). As an aside, it's this effect that can sometimes cause regular bulbs to "sing" when dimmed.
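For anyone curious, here's a minimal sketch of that switching (a leading-edge, TRIAC-style phase cut, with an assumed 325 V peak for 230 V mains): each half-cycle is held at zero until a "firing angle", then conducts. The RMS voltage falls, but it's the chopped edge, not a smooth voltage reduction, that creates the harmonics a transformer dislikes:

```python
import math

def phase_cut_rms(firing_angle_deg, peak=325.0, samples=10000):
    """RMS voltage over one half-cycle of a sine that is zeroed
    before the firing angle (illustrative numbers only)."""
    total = 0.0
    for i in range(samples):
        theta = math.pi * i / samples          # one half-cycle, 0..pi
        v = peak * math.sin(theta)
        if math.degrees(theta) < firing_angle_deg:
            v = 0.0                            # switch not yet triggered
        total += v * v
    return math.sqrt(total / samples)

print(round(phase_cut_rms(0)))    # full conduction, ~230 V RMS
print(round(phase_cut_rms(90)))   # half the energy removed, ~163 V RMS
```

Note that at a 90-degree firing angle the voltage jumps from 0 V to the full 325 V peak instantaneously, which is the distortion being talked about.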
Another science bit: dimmers can also introduce a DC offset into the signal. In a circuit with a transformer, the energy represented by this has nowhere to go other than to be dissipated, as heat, by the transformer. Even though transformers may have a thermal cut-out, the transformer will get hot, and potentially very hot, in these circumstances. Hence the danger of overheating, and the consequent fire risk.
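To illustrate where that offset comes from, here's a hypothetical sketch: if the dimmer fires at slightly different angles on the positive and negative half-cycles, the chopped waveform no longer averages to zero over a cycle, and that non-zero mean is the DC component. A transformer winding presents only its small DC resistance to it, so even a few volts of offset drive a disproportionate magnetizing current and heat:

```python
import math

def dc_offset(pos_angle_deg, neg_angle_deg, peak=325.0, samples=20000):
    """Mean (DC component) of one full cycle of a phase-cut sine where
    each half-cycle conducts only after its own firing angle."""
    total = 0.0
    for i in range(samples):
        theta = 2 * math.pi * i / samples           # full cycle, 0..2*pi
        v = peak * math.sin(theta)
        half_angle = math.degrees(theta % math.pi)  # angle within this half-cycle
        firing = pos_angle_deg if theta < math.pi else neg_angle_deg
        if half_angle < firing:
            v = 0.0                                 # not yet conducting
        total += v
    return total / samples

print(round(dc_offset(90, 90), 1))    # symmetric firing: DC component ~0 V
print(round(dc_offset(90, 100), 1))   # 10-degree asymmetry: ~9 V of DC
```

A perfectly symmetric dimmer cancels out; a cheap one with mismatched trigger points doesn't, and the transformer pays for the difference.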
My reason for persisting with this, by the way, is that Hoagy's statement "Either it will work or it won't, you won't damage anything" is, in my opinion, dangerous.