7th Mar 2021, 12:15 am   #12
ianbatty311
Tetrode
 
Join Date: Jul 2018
Location: Rosebud, Victoria, Australia
Posts: 50
Re: Noise in receivers

To GXtanuki

Um.....

Some years ago, at an HRSA Farm Meeting (thanks to Laurie Harris), I ran a receiver competition for really old sets.

Most had dials with neither frequency nor station calibration, so I brought along my trusty HP signal generator as a 'spotter' to help tune them to known local stations.

Jim Easson (hi, Jim) had an older set that he thought would work pretty well, and it was OK.

...until I turned off my HP.

Then the radio's output fell noticeably - you could easily hear the drop in gain.

Ummm... so I turned the HP back on, and we both listened. We could hear the 400 Hz modulation, so I switched the modulation off.

Well, Dave and Bill had given me a sig gen accurate to within a few hertz of the dial setting (you could just hear a low-frequency 'swoosh' kind of sound behind the station program - the beat between the generator and the station's carrier).

Turn the sig gen off, and the station sound drops.

It's heterodyne detection, where - within reason - adding in a very-near or actually-coherent signal pushes a diode demodulator up its non-linear curve to give more output.

It's the radio equivalent of the low-intensity bias light used to push camera tubes (ever since the Iconoscope) up out of the low-sensitivity 'inertial' part of their response curve.

Where was I?

Oh, yes -

so I get that there is a native noise level - usually mixer noise - and that adding a signal can push the demodulator up off its low-sensitivity region, effectively giving more amplification by moving operation onto the linear part of the curve.
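
If it helps to put numbers on that, here's a rough simulation sketch (purely illustrative - the 100 kHz carrier, the signal levels and the diode constants are assumptions of mine, not measurements from Jim's set or my HP): it runs a weak AM signal through an exponential-diode detector, first on its own and then with a strong unmodulated carrier added on the same frequency, and compares the recovered 400 Hz audio.

[CODE]
# Illustrative sketch only: not a model of Jim's set or the HP generator.
# The carrier frequency, signal levels and diode constants are assumed values.
import numpy as np

fs = 1_000_000                        # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)
fc = 100_000                          # stand-in "station" carrier, Hz
fm = 400                              # audio tone, Hz
m = 0.3                               # modulation depth

# weak AM signal - well down in the diode's square-law region
weak = 0.02 * (1 + m * np.cos(2 * np.pi * fm * t)) * np.cos(2 * np.pi * fc * t)
# strong unmodulated carrier on the same frequency (the added 'coherent' signal)
strong = 0.25 * np.cos(2 * np.pi * fc * t)

def audio_rms(v_rf):
    """Exponential-diode detector plus a crude low-pass; returns the RMS
    of the recovered audio (arbitrary units)."""
    i = 1e-9 * (np.exp(v_rf / 0.025) - 1.0)          # Is = 1 nA, Vt ~ 25 mV
    win = int(fs / 5000)                             # 200-sample moving average nulls the 100 kHz RF
    audio = np.convolve(i, np.ones(win) / win, mode="same")[win:-win]
    audio -= audio.mean()                            # drop the DC component
    return np.sqrt(np.mean(audio ** 2))

print(f"weak signal alone:            {audio_rms(weak):.3e}")
print(f"weak signal + strong carrier: {audio_rms(weak + strong):.3e}")
[/CODE]

The second figure comes out orders of magnitude larger: alone, the weak signal sits in the square-law region (audio roughly proportional to amplitude squared), while the big carrier lifts its modulation onto the near-linear part of the diode curve - the 'boost' Jim and I heard, in caricature.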

This is distinct from the phase noise argument, however...

So what do we think?

Is the phenomenon due to pushing the demodulator up its response curve (as was clearly demonstrated with Jim's radio), or to phase noise?

If it *is* phase noise (which the textbooks assert), where did I go wrong in using my HP generator to substitute for the LO in my transistor-radio experiment?

Your thoughts?