Hi,
Four hours over half a billion years isn't a great deal of difference. Also, wouldn't half a billion years be sufficient time for the radiation to stop or slow significantly?
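To put a number on it (my own back-of-the-envelope figures, not from the article): a 4-hour arrival delay accumulated over roughly 500 million years of travel works out to about one part in 10^12 in speed.

```python
# Rough fractional speed difference implied by a 4-hour delay
# over ~5e8 years of travel time. Numbers are illustrative.

HOURS_PER_YEAR = 365.25 * 24           # ~8766 hours per Julian year
travel_time_hours = 5e8 * HOURS_PER_YEAR
delay_hours = 4.0

fractional_delay = delay_hours / travel_time_hours
print(f"fractional speed difference ~ {fractional_delay:.1e}")  # ~9.1e-13
```

That is an extraordinarily small effect, which is the point: it's hard to see how it supports a wholesale rewrite of the EM spectrum.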
>>>But in my theory, light that has traveled hundreds of millions of years acts differently, and doesn't fit on the same EM spectrum as fresh locally emitted light.
>>>
>>>It suggests that frequency can drop in really old light, and the wavelength does NOT go up.
>>
>>How would that be possible? The speed of ANY wave - not only EM waves - must always be the product of wavelength * frequency. This is not advanced math, but simple geometrical reasoning. You can't circumvent that.
>
>
>Sure you can.
>
>Wavelength and frequency are classical concepts.
>
>My code constitutes a very non-classical physics.
>
>Remember the wave-particle duality?
>
>EM "waves" are different than sound waves or ocean waves, or anything like that.
>
>
>> For example, if light goes at 300,000 km/sec, and the frequency is one cycle per second, you'll have one wave crest every 300,000 km, since that is the distance the previous wave crest advanced in one second.
>>
>>Nor have countless experiments produced any evidence that light goes at any speed but "c".
>
>How about this:
>
>"Delayed gamma rays from deep space may provide the first evidence for physics beyond current theories."
>
>
>http://www.physorg.com/news110480559.html
>It seems to corroborate my hypothesis.
>
>
>
>>>If anyone can think of a way to test that, it would be pretty good proof.
>>
>>It should be possible, in principle, to shine a laser from the moon, or perhaps from a spacecraft further away, and measure any change in frequency. But I guess that first there would have to be a sound theory to justify such an experiment.
>
>We need to test it at distances where Hubble redshift is observed, hundreds of millions of light years.
>
>An experiment that involved us emitting the light would have to take hundreds of millions of years to complete.
>
>That's not going to work.
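For what it's worth, the wavelength-frequency relation quoted above is trivial to check numerically. A quick sketch using the standard defined value of c and a textbook frequency for green light (my example values, not anyone's measurement from this thread):

```python
# v = wavelength * frequency holds for any wave; for light in vacuum, v = c.
c = 299_792_458.0            # speed of light in m/s (defined value)

# Example: green light at ~540 THz
frequency_hz = 5.4e14
wavelength_m = c / frequency_hz      # comes out around 555 nm

# The product recovers c exactly, so lowering the frequency of old light
# WITHOUT the wavelength going up would mean the light slowed down --
# which is what would have to be tested.
print(f"wavelength ~ {wavelength_m * 1e9:.0f} nm")
```

So the claim isn't just a matter of classical vs. non-classical taste: dropping frequency at fixed wavelength is mathematically the same claim as light traveling slower than c.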
Regards,
N Mc Donald