In pulse induction (PI) detectors, a damping resistor is traditionally used in the TX stage to suppress the flyback voltage that occurs when the coil current is switched off. However, with this method, the energy stored in the coil is dissipated directly as heat, and control over the pulse waveform remains limited.
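To put rough numbers on that dissipation, here is a back-of-envelope sketch with purely illustrative values (300 µH coil, 200 pF total parasitic capacitance, 2 A peak current, 1 kHz repetition rate — not from any particular design):

```python
import math

# Illustrative example values, not from a specific detector:
L = 300e-6       # coil inductance, 300 uH
C = 200e-12      # total parasitic capacitance (coil + cable + FET), 200 pF
I_peak = 2.0     # coil current at switch-off, A
prf = 1000       # pulse repetition frequency, Hz

# Critical damping for the parallel RLC formed by coil, parasitic
# capacitance, and damping resistor: R_d = 0.5 * sqrt(L / C)
R_d = 0.5 * math.sqrt(L / C)

# Energy stored in the coil at switch-off; with a resistor it is
# all converted to heat, every pulse.
E_pulse = 0.5 * L * I_peak**2
P_avg = E_pulse * prf

print(f"critical damping resistor: {R_d:.0f} ohm")          # ~612 ohm
print(f"energy dissipated per pulse: {E_pulse*1e6:.0f} uJ")  # 600 uJ
print(f"average power burned: {P_avg:.2f} W")                # 0.60 W
```

So with these assumed numbers, a fair fraction of a watt goes into the damping resistor; this is the loss an active clamp would try to recover or at least redirect.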
Has anyone tried implementing an active clamp topology, i.e. a MOSFET plus a clamp capacitor, in place of the damping resistor? In principle, the coil energy could be redirected into the clamp capacitor at the moment the current is interrupted, limiting Vds stress and giving more controlled damping. In theory, this approach could offer lower losses, improved pulse repeatability, and potential EMI advantages.
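To make the clamp-capacitor question concrete, the simple energy-balance estimate I have in mind looks like this (again with assumed, illustrative values):

```python
import math

# Illustrative values only:
L = 300e-6        # coil inductance, H
I_peak = 2.0      # coil current at turn-off, A
V_supply = 12.0   # clamp capacitor assumed pre-charged to the supply rail, V
C_clamp = 100e-9  # trial clamp capacitor value, F

# Energy balance: coil energy transfers into the clamp capacitor,
#   0.5*L*I^2 + 0.5*C*V_i^2 = 0.5*C*V_f^2
# so the final clamp (and roughly the Vds) voltage is:
V_f = math.sqrt(V_supply**2 + L * I_peak**2 / C_clamp)

# Current ramps down approximately linearly against the clamp voltage
# (di/dt = -V_clamp / L), giving a turn-off time on the order of:
t_off = L * I_peak / V_f

print(f"peak clamp voltage: {V_f:.0f} V")       # ~110 V
print(f"approx. turn-off time: {t_off*1e6:.1f} us")
```

The trade-off is visible directly: a smaller clamp capacitor gives a faster current turn-off but a higher Vds excursion, so the MOSFET voltage rating bounds how small C_clamp can go.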
However, since the measurement window in PI systems is extremely critical, important design considerations would include whether the switching noise of the active clamp affects the RX stage, the avalanche robustness of the MOSFET, optimization of the clamp capacitor value, and precise gate timing control.
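For a rough feel of how the damping interacts with the sample delay, one can assume the flyback rings down with a simple exponential envelope exp(-alpha*t) in the near-critically-damped case, ignoring the polynomial prefactor of the exact response (all values hypothetical):

```python
import math

# Illustrative values, consistent with the earlier sketch:
R_d = 612.0        # damping resistance, ohm
C = 200e-12        # parasitic capacitance, F
V_flyback = 300.0  # initial flyback amplitude, V
V_rx = 1e-6        # RX front-end sensitivity floor, V

# Decay rate of the parallel RLC envelope: alpha = 1 / (2 * R * C)
alpha = 1.0 / (2.0 * R_d * C)

# Time for the residue to fall from V_flyback to the RX floor:
t_settle = math.log(V_flyback / V_rx) / alpha

print(f"settling to RX floor: {t_settle*1e6:.1f} us")  # ~4.8 us
```

With these assumptions the coil settles in a few microseconds, which is why any extra switching noise the active clamp injects near the end of that interval would matter so much for the earliest usable sample point.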
Has anyone implemented this method in practice or evaluated it through simulation and measurement? Compared to the conventional damping resistor approach, does it provide a meaningful real-world advantage, or is it generally avoided due to increased complexity and noise risk? I would appreciate hearing your experiences and recommendations.