- Microwave Oven(s)
- Television(s)
- Cell Phones
- Cordless Telephone connected to the land-line
- Computer with a Monitor
- Fluorescent Lights
- Laptop with/without wireless networking
- Wireless Access Point for 802.1**
- Hand Held blow dryer for hair
- Electric heating element Stove
- Automatic Garage door opener with remotes in cars
- Speakers from a Stereo System
- Any Television Stations within 5 miles
- Any FM radio station within 5 miles
- Any AM Radio Station within 20 miles
- Electricity Lines within 100 feet of the ground (telephone poles?)
- Any power sub-stations within a mile
- City within 200 miles up wind
- A CB Radio enthusiast within the Neighborhood, in house or car
- An Amateur Radio Operator
Let's take the worst-case scenario here: 380 channels times 0.3 W == 114 W of radiated power maximum. That monitor they are sitting in front of, the TV they have, the 1800 W microwave they bought, the 900 MHz cordless phone, the cell phone they carry, etc. all expose them to MORE radiation (radiation being a generic term for a transmitted radio-frequency signal) than that tower does. Most towers have a dead spot roughly 500 yards in radius around the base: the antennas are designed to transmit horizontally and slightly downward rather than straight down onto the neighborhood, and they also minimize signal going up into space.
So, given a few massive generalizations in the calculations, I would guess that at *MOST*, at any one time, these people are being radiated with something on the order of 0.3 mW (0.0003 W) of cell-phone signal from the tower. There is *MORE* background noise being radiated from the Sun and the Earth to worry about. And watch out for those handheld hair dryers... YEOWCH!
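For anyone who wants to check the arithmetic, here is a quick sketch of the same estimate using the simplest possible model: an isotropic source and free-space inverse-square spreading, S = P / (4*pi*r^2). This deliberately ignores antenna gain, downtilt, and terrain (which, as noted above, push most of the energy toward the horizon rather than down at the neighbors), so the distances and results are illustrative assumptions, not measurements.

```python
import math

def power_density_w_per_m2(radiated_watts: float, distance_m: float) -> float:
    """Free-space power density of an isotropic source: S = P / (4*pi*r^2).

    Ignores antenna gain/downtilt, so this is only a rough upper-bound sketch.
    """
    return radiated_watts / (4 * math.pi * distance_m ** 2)

# Worst case from the text: 380 channels * 0.3 W each == 114 W total.
tower_watts = 380 * 0.3

# Assumed (hypothetical) distances from the tower, in meters.
for d in (50, 100, 500):
    s_mw = power_density_w_per_m2(tower_watts, d) * 1000  # convert W/m^2 -> mW/m^2
    print(f"{d:4d} m: {s_mw:.3f} mW/m^2")
```

Even at an unrealistically close 50 m directly under the main beam, the density is only a few milliwatts per square meter, and it falls off with the square of the distance, which is consistent with the sub-milliwatt figure guessed above.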