Any channelized receiver suffers from a phenomenon referred to as the rabbit ear effect, which appears when an out-of-band pulse generates a distorted response at the output of the selected filter. In this paper, the rabbit ear effect is analyzed and a tri-band filter structure is proposed to identify and suppress it. The average likelihood detection strategy is adopted to optimize both the parameters of the tri-band filter and the detector structure. It is shown that, whereas competing methods reject all short-duration pulses, the proposed system preserves all in-band pulses while eliminating most out-of-band (i.e., rabbit ear) pulses.
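The average likelihood strategy referred to above can be illustrated with a minimal sketch (not the paper's actual detector): the likelihood ratio for a pulse of unknown frequency in white Gaussian noise is averaged over a grid of candidate frequencies (a uniform prior) and compared against a threshold. The function name, the frequency grid, and the AWGN signal model here are assumptions made purely for illustration.

```python
import numpy as np

def average_likelihood_ratio(x, freqs, fs, amp, sigma2):
    """Average the AWGN likelihood ratio over a grid of candidate
    pulse frequencies (uniform prior) -- an average-likelihood test."""
    t = np.arange(len(x)) / fs
    ratios = []
    for f in freqs:
        s = amp * np.cos(2 * np.pi * f * t)      # candidate pulse waveform
        # log-likelihood ratio for a known signal in white Gaussian noise:
        # ln L = (x.s - ||s||^2 / 2) / sigma^2
        llr = (x @ s - 0.5 * (s @ s)) / sigma2
        ratios.append(np.exp(llr))
    return float(np.mean(ratios))                # averaged likelihood ratio

# Toy demo: the averaged ratio is large only when a pulse is present.
rng = np.random.default_rng(0)
fs, n, f_true = 1000.0, 256, 100.0
t = np.arange(n) / fs
noise_only = rng.standard_normal(n)
with_pulse = np.cos(2 * np.pi * f_true * t) + rng.standard_normal(n)
grid = [80.0, 90.0, 100.0, 110.0, 120.0]
alr_noise = average_likelihood_ratio(noise_only, grid, fs, amp=1.0, sigma2=1.0)
alr_pulse = average_likelihood_ratio(with_pulse, grid, fs, amp=1.0, sigma2=1.0)
```

Declaring a detection whenever the averaged ratio exceeds a threshold yields the composite-hypothesis test; in the paper, the same averaging principle is applied jointly over the filter parameters and the detector structure.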