Antimode input sensitivity and input signal
I'm toying with the idea of using an Antimode DSP device on my bass feed. It would have to go between the CDP and the inputs to the power amps in the bass speakers, i.e. most (fixed) attenuation would be done at the power amp inputs, after the Antimode, so the Antimode would have to accept whatever the CDP sends. Nominal output of the CDP is 2 V, and the Antimode is quoted as having a 1.65 V input sensitivity (which I take to mean maximum input signal) single-ended and 1.35 V balanced. Am I right in assuming this would not be a problem, because:
- 1.65V/1.35V are only about 1.7dB/3.4dB down on 2V (if I've done my sums right!)
- An average music signal will very rarely, if ever, reach the nominal 2V output and even peaks are probably recorded to leave a little headroom
- I would normally be applying a slight (variable) volume reduction via the CDP's variable output to compensate for differences in the recording level on different CDs
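For what it's worth, the arithmetic in the first bullet checks out. A quick sketch (voltage values taken from the post above):

```python
import math

def db_ratio(v_ref, v):
    """Voltage ratio in decibels: 20 * log10(v / v_ref)."""
    return 20 * math.log10(v / v_ref)

cdp_out = 2.0     # nominal CDP output, volts RMS
se_sens = 1.65    # quoted single-ended input sensitivity
bal_sens = 1.35   # quoted balanced input sensitivity

print(f"Single-ended: {db_ratio(cdp_out, se_sens):.2f} dB")  # ≈ -1.67 dB
print(f"Balanced:     {db_ratio(cdp_out, bal_sens):.2f} dB")  # ≈ -3.41 dB
```

So the sensitivities are about 1.7 dB and 3.4 dB below the 2 V nominal output, as stated.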
To answer your questions as asked:-
1) Yes, it would be a problem: if the CD player is outputting 2 V into an input with a 1.65 V sensitivity, clipping will take place very frequently.
2) On rock/pop CDs the average music level perhaps doesn't reach 2 V, but the peaks certainly will, very frequently.
3) If the volume adjustment is slight, then it may not be enough to avoid the problems in 1) and 2) above.
However, you need to check whether the Antimode device's quoted sensitivity is absolute or nominal. A number of modern devices list the nominal working sensitivity and allow considerable headroom above it. For example, my Behringer DEQ and DCX have an input sensitivity of +4 dBu (about 1.23 V) but the input doesn't clip until +22 dBu (about 9.8 V), giving 18 dB of headroom.
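A quick sanity check of those Behringer figures, using the standard reference of 0 dBu = 0.775 V RMS:

```python
def dbu_to_volts(dbu):
    """Convert a dBu level to volts RMS (0 dBu = 0.775 V)."""
    return 0.775 * 10 ** (dbu / 20)

print(f"+4 dBu  = {dbu_to_volts(4):.2f} V")   # ≈ 1.23 V
print(f"+22 dBu = {dbu_to_volts(22):.2f} V")  # ≈ 9.76 V
print(f"Headroom: {22 - 4} dB")               # 18 dB
```

On those numbers, a 2 V CDP output would sit comfortably inside an 18 dB headroom window, which is why the absolute-vs-nominal question matters so much.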
Thanks for a quick and helpful reply; I've asked the question. I suppose I could always use an in-line attenuator if these turned out to be absolute levels.
Yes, an input attenuator will work fine.
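If it came to that, a simple resistive divider would do the job. A sketch of the maths (the resistor values here are purely illustrative, and a real attenuator would also need to account for source and load impedance):

```python
def divider_out(v_in, r1, r2):
    """Unloaded resistive divider: v_out = v_in * r2 / (r1 + r2)."""
    return v_in * r2 / (r1 + r2)

# Hypothetical values: drop 2 V to just under the 1.35 V balanced limit.
v = divider_out(2.0, 4700, 10000)
print(f"{v:.2f} V")  # ≈ 1.36 V
```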