We absolutely beat this topic to death a few years ago, and to be honest, I still don't have a complete understanding of the fine details. What I'd like to throw out there is a discussion on why manufacturers null their coils the way that they do.
The specific coil I am working with now is my 4th concentric - this one is a 10" for the IDX project. To this point, they all have worked wonderfully.
The way I make my coils, I basically "copycat" one of my coils for the White's Spectrum - frequency, phasing, nulling, etc. My commercially made 9.5" coil is nulled with the Rx coil at about -90 degrees (leading the Tx signal) to around 100 mV P-P. (Performance gets copied but not the cost ; )
So we have several ways to null a coil - either Rx leading the Tx signal, Rx lagging it, or the deepest null... and heck if I can find any difference at all!! I can prove it to myself over and over again. GB, DISC and sensitivity all seem to be exactly the same no matter which way I null. The reason there seems to be no difference is that the "sample" into the demodulators gets flipped on the opposite side of a null, and so does the phase shift direction (relative to the Tx signal), hence the overall response through the demodulators is the same. A quick simulation of that idea is sketched below.
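To make that concrete, here's a minimal Python sketch of an ideal synchronous (I/Q) demodulator. It's only a toy model, not any particular detector's circuit: the 6.5 kHz Tx frequency, the 100 mV P-P residual nulled at +/-90 degrees, and the small "target" signal at 35 degrees are all made-up numbers just to illustrate the point. The baseline null residual flips sign between the leading and lagging cases, but the target-induced change in the demodulator outputs comes out identical either way.

```python
import numpy as np

FS = 1_000_000                        # sample rate, Hz (arbitrary)
F_TX = 6_500                          # Tx frequency, Hz (placeholder value)
t = np.arange(0, 0.02, 1 / FS)        # 20 ms window = whole number of Tx cycles

ref_i = np.cos(2 * np.pi * F_TX * t)  # in-phase demod reference (same phase as Tx)
ref_q = np.sin(2 * np.pi * F_TX * t)  # quadrature demod reference

def demod(rx):
    """Ideal synchronous demodulation: multiply by references, low-pass by averaging."""
    return np.mean(rx * ref_i), np.mean(rx * ref_q)

# Null residual: ~100 mV P-P at +/-90 degrees relative to Tx,
# i.e. the coil nulled with Rx leading (+90) or Rx lagging (-90) the Tx signal.
A_NULL = 0.050                        # 50 mV amplitude = 100 mV P-P
resid_lead = A_NULL * np.cos(2 * np.pi * F_TX * t + np.pi / 2)
resid_lag  = A_NULL * np.cos(2 * np.pi * F_TX * t - np.pi / 2)

# A small target signal at some arbitrary phase (numbers are illustrative only).
A_TGT, PH_TGT = 0.002, np.radians(35)
target = A_TGT * np.cos(2 * np.pi * F_TX * t + PH_TGT)

for name, resid in (("Rx leading", resid_lead), ("Rx lagging", resid_lag)):
    i0, q0 = demod(resid)             # baseline: coil over nothing
    i1, q1 = demod(resid + target)    # coil over the target
    di, dq = i1 - i0, q1 - q0         # target-only response once the baseline is removed
    print(f"{name}: baseline I/Q = ({i0*1e3:+.2f}, {q0*1e3:+.2f}) mV, "
          f"target response = {np.hypot(di, dq)*1e3:.3f} mV "
          f"at {np.degrees(np.arctan2(dq, di)):+.1f} deg")
```

Running it, the baseline I/Q readings flip polarity between the two nulling directions, while the target's amplitude and phase come out the same, which is consistent with GB, DISC and sensitivity looking identical on the bench.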
Case in point - some Nautilus detectors have an adjustment to manually feed back phase and/or amplitude from the Tx signal into the Rx signal to achieve a good null. No difference there either.
The only odd thing that happens while nulling to one of the sides is that large metal objects cause the Rx phasing to "roll" from one side to the other through the null point. (Coil rollover?)
So the question becomes this - If no difference can be found while bench testing, why did White's choose -90 degrees to null?
Don