Noise in receivers

Hi, folks.

I'm looking for an answer to a mystery.

When I get a radio on the bench and turn it on at maximum volume, I get a certain amount of noise output, let's say 1 milliwatt.
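
(If anyone wants to relate that to a voltmeter reading across the speaker - assuming an 8-ohm voice coil, which is my assumption rather than a measurement - the sum is quick enough:)

Code:
# 1 mW of noise into an assumed 8-ohm speaker, as an RMS voltage
P = 1e-3            # noise output power, watts
R = 8.0             # speaker impedance, ohms (assumed, not measured)
V = (P * R) ** 0.5  # V = sqrt(P * R)
print(f"{V * 1000:.0f} mV RMS")   # about 89 mV RMS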

Then I connect a sig gen and bring the signal up from zero.

I can hear the 400 Hz modulation tone, but I also hear (and measure) an *increase* in the noise level - it can easily reach 10x the noise level I get with zero signal input.

As I increase the signal level, the tone output increases, and the noise eventually subsides until I can meet the standard 20 dB signal-to-noise figure.
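
(By "the standard 20 dB signal-to-noise" I mean the usual two-reading output measurement - modulation on, then modulation off with the carrier still applied - so strictly it's (signal+noise)-to-noise. A little sketch of the arithmetic, with made-up example readings:)

Code:
import math

def snr_db(v_mod_on, v_mod_off):
    """Output S/N in dB from two RMS output readings:
    400 Hz modulation on, then off (carrier still applied).
    Strictly this is (S+N)/N."""
    return 20 * math.log10(v_mod_on / v_mod_off)

# e.g. 500 mV out with the tone, 50 mV residual noise -> 20.0 dB
print(snr_db(0.500, 0.050))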

This is the puzzle: why does the noise level increase initially? I get why it decreases with stronger signal input, but why is it highest with the weakest signals?
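
To convince myself I wasn't imagining it, I knocked up a toy model: an ideal envelope detector fed with a steady carrier plus band-limited Gaussian noise, complex-baseband style. No AGC and no realistic IF filter shape, so it's a sketch of the idea, not of my receivers.

Code:
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
sigma = 1.0   # RMS of each noise quadrature, arbitrary units

# Complex-baseband IF: carrier of amplitude A plus Gaussian noise.
# An ideal envelope detector takes the magnitude; the "audio noise"
# is the AC part of that magnitude.
noise = rng.normal(0, sigma, N) + 1j * rng.normal(0, sigma, N)

for A in [0.0, 0.5, 1.0, 2.0, 5.0, 20.0]:
    env = np.abs(A + noise)   # detector output
    print(f"carrier {A:5.1f}: output noise {env.std():.3f} RMS")

Even this crude model shows the detected noise going *up* once a carrier appears (the carrier-times-noise beat terms take over from the noise-times-noise ones), but only by a few dB - nothing like the 10x I measure. So the toy model isn't the whole story, which is rather the point of my question.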

Which radios?

All superhets with good sensitivity, valve or transistor, with either a wire antenna or a loopstick.

... and I first noticed this back in 1963 testing VHF aircraft receivers in a military-grade screened room.

Yep, all sig gens, too: HP, Advance, Fluke, homebrew, whatever.

Outside interference?

Nup - I've got my own electrically quiet screened room for testing now (really quiet: not a peep from broadcast stations and no electrical interference).

Any ideas?

Ian