Is there such a thing as a rough guideline as to what is good, bad or appropriate?
A well-regarded pre-amp has the following:
Input Impedance: 100k ohms [this seems a fairly common figure]
Another that I've found details of has this specification:
Input Impedance: 25k ohms (50k ohms optional, 15k ohms minimum)
What difference would this make, or is it dependent on the amplification circuitry and design of the individual amplifier? Output gain on both is similar: 10 dB gain in the first instance, and 8 dB in the second (although 20 dB is also available from the manufacturer).
I think there is some guideline here with regard to power amplifier compatibility, but I've no idea how that works.
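My rough understanding (and I may well have this wrong) is that the source's output impedance and the next stage's input impedance form a voltage divider, so a higher input impedance loads the source less. A quick sketch of that idea, using a purely hypothetical source output impedance of 1k ohms driving the two input impedances above:

```python
import math

def loading_loss_db(z_out, z_in):
    """Signal loss in dB from the voltage divider formed by a source's
    output impedance (z_out) and the next stage's input impedance (z_in),
    both in ohms. Result is negative (a loss)."""
    return 20 * math.log10(z_in / (z_out + z_in))

# Hypothetical 1k ohm source into the two preamps' inputs:
print(round(loading_loss_db(1_000, 100_000), 2))  # 100k input: about -0.09 dB
print(round(loading_loss_db(1_000, 25_000), 2))   # 25k input:  about -0.34 dB
```

If that's right, both figures would lose only a fraction of a dB with a typical low-impedance source, which is why I'm unsure whether the difference between 25k and 100k actually matters in practice.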