I think it would take a pretty extreme length difference, very sensitive ears, and a rather demanding speaker for this to matter.
12 AWG wire is roughly 1.25 mOhms/foot, so a 10' run adds about 25 mOhms total for the two conductors from amp to speaker and back. Consider a 10' run vs a 20' run of unequal wire. I'll take the extreme case of the speaker dipping down to 2 Ohms (which only some speakers do, and over a fairly narrow frequency range).
A 10' run produces a drop of about -0.108 dB into that load, a 20' run about -0.214 dB, so only about 1/10th of a dB of difference at that frequency.
At a more typical 6 Ohms across most of the audio band, we are talking a mere ~0.036 dB difference.
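The arithmetic above is just a voltage divider between the round-trip wire resistance and the speaker load. Here's a small sketch of it; the function name and the ~1.25 mOhm/ft figure for 12 AWG are my assumptions, not anything official:

```python
import math

# Per-foot resistance of one 12 AWG conductor (assumed ~1.25 mOhm/ft).
MOHM_PER_FT = 1.25e-3  # ohms per foot

def wire_drop_db(length_ft: float, load_ohms: float) -> float:
    """Level drop at the speaker terminals, treating the amp as an
    ideal voltage source: 20*log10(R_load / (R_load + R_wire)).
    The wire resistance doubles the run length (out and back)."""
    r_wire = 2 * length_ft * MOHM_PER_FT
    return 20 * math.log10(load_ohms / (load_ohms + r_wire))

for load in (2.0, 6.0):
    d10 = wire_drop_db(10, load)
    d20 = wire_drop_db(20, load)
    print(f"{load:.0f} ohm load: 10' = {d10:.3f} dB, "
          f"20' = {d20:.3f} dB, diff = {abs(d20 - d10):.3f} dB")
```

Running it reproduces the numbers above: about 0.107 dB of difference into 2 Ohms, and about 0.036 dB into 6 Ohms.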
So under some extreme length differences, demanding speakers, and 'golden' ears, well, maybe? But at those levels, the attenuation of the sound itself through the air, room effects, speaker-to-speaker and amp-to-amp variation, and a dozen other variables are far more likely to swamp it out. I didn't do the math on inductance differences, but I'm guessing those are minimal as well at these lengths and frequencies.
-ERD50