A novel hypothesis for Hubble Redshift
From: Neil Mc Donald, Cencom Systems P/L, The Sun, Australia (13/07/2010 00:51:31)
To: 12/07/2010 23:11:19
Forum: Visual FoxPro
Category: Other, Miscellaneous
Thread ID: 01472304
Message ID: 01472308
Views: 81

Hi,
Great article. How do your models handle the following:

a) Very recent advances in solar panel design have found that photons recharge themselves, i.e. they interact with the solar panel and lose, say, 50% of their energy, but within 0.05 mm recharge themselves and then give up the same 50% of their energy again; this has been observed using multilayer substrates.

b) The Universe is accelerating in its expansion, which defies all laws of physics. Is it the fabric of space itself that is accelerating? If so, how are we observing it? Remember the frames-of-reference arguments.

c) All objects are moving away from Earth at equal rates. How can this be unless we are at the centre of the universe? We are supposed to be on the outer edges.


>Hey, I wrote these models in VFP, so I thought I'd share them here; I'd appreciate any feedback:
>
>
>Abstract: A model based on a novel interpretation of the observed Hubble redshift is compared and contrasted to a model based on the widely accepted expansion interpretation and also, for demonstration purposes, to a model based on the long-refuted tired light interpretation.
>
>Let us begin with an observation, some empirical evidence.
>
> Exhibit A:
> Hubble redshift is detected in electromagnetic radiation that has traveled cosmological distances.
>
>To explain Exhibit A, I submit the following conjecture:
>
> Conjecture:
> The redshift is a decrease in energy and frequency which will eventually reach 0. This is caused by the internal dynamics of electromagnetic radiation.
>
>The established theories of electromagnetism have been so well tested in our laboratories that they are considered above suspicion as the source of the observed redshift at cosmological scales. And because of the wide acceptance of the expansion interpretation, there seemed to be little reason to even consider changing the theories of light to accommodate Exhibit A.
>
>On the other hand, the expansion interpretation changes the apparent motion of every galaxy and the properties of space itself, limits the age of a Universe that contains vast superclusters, and introduces mysterious entities like dark energy. All of this increases the possibility of more elegant theories being discovered.
>
>Upon reflection, would changing the theories of EM radiation to accommodate a phenomenon detected in EM radiation, Exhibit A, be uncalled for?
>
>To explore that possibility, the conjecture needs to be developed into a hypothesis and worked into a model.
>
>In developing the hypothesis, the mindset here is that the empirical reality of Exhibit A might indicate a new principle of physics at cosmological scales, and might demonstrate that there are limits to the domain of applicability of many established theories, which are well tested but at much smaller scales. In other words, my hypothesis may contradict many other theories at cosmological scales, but only in the pursuit of best explaining what is actually observed at cosmological scales.
>
>According to the conjecture, the redshift is caused by the internal dynamics of EM radiation and not by the motion of the galaxy that emitted the light; the apparent recessional velocity of a redshifted galaxy is not its actual recessional velocity. This leads me to ask: is there a better way to state Hubble's Law (v = H * D), using some other physical magnitude in place of the galaxy's apparent recessional velocity?
>
>I've developed what I think is the proper alternative to Hubble's Law, which generates some pretty interesting predictions and consequences.
>
> Hypothesis:
> v = c - Ht
> where
> v = the speed of the light in a vacuum after it has traveled for a time t
> c = 299792.458 km/sec
> H = 21.77 km/sec per million years
> t = the duration of the photon's journey between emission and absorption, in millions of years
>
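>To get a feel for the numbers (this example is mine and not part of the derivation), the following one-liner prints the predicted speed after a journey of 6,000 million years:
>
> ? 299792.458 - 21.77 * 6000 && 169172.458 km/sec, a little over half of c
>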
>Given that the speed of a wave is its frequency * wavelength, and we observe a reduced frequency as empirical fact Exhibit A, it's not too difficult to see that, for a fixed wavelength, a reduced frequency means a reduced wave speed, just as the hypothesis predicts.
>
>To demonstrate the hypothesis, I've built a model of light which is emitted along an x axis at a speed determined by the hypothesis, v = c - Ht. Once the timer starts, the light takes off until it reaches a target 6 billion light years away, at which point the timer is checked and the results are displayed.
>
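>Here is a stripped-down sketch of that first model (the one million year step here is arbitrary; the full model with graphics is in Appendix A below):
>
> c = 299792.458
> H = 21.77
> v = c
> t = 0
> x = 0                && distance covered, in millions of light years
> do while x < 6000
> 	t = t + 1         && advance one million years
> 	x = x + v / c     && at v km/sec the light covers v/c million light years per million years
> 	v = v - H         && the hypothesis: the light slows by H every million years
> enddo
> ? t                  && about 8832 million years, i.e. the 8.83 billion year figure reported below
>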
>I've built two more models to compare and contrast the results with.
>
>In the second model, which is based on the dominant expansion hypothesis, the light does not obey my hypothesis, but instead always travels at c. On the other hand, the target that light is traveling toward is receding from the source of the light at a velocity v that increases proportionally with the distance D the light has traveled thus far, such that v = H * D.
>
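>A closed-form check (this is just algebra on that recession rule, not extra model code): with v = H * D, and D growing at one million light years per million years, the target sits at 6000 + H*t^2/(2*c) million light years after t million years while the light has covered t million light years, so the catch-up time is the smaller root of a quadratic:
>
> c = 299792.458
> H = 21.77
> ? (c - SQRT(c^2 - 2 * H * 6000 * c)) / H && about 8833 million years
>
>This is the same quadratic the v = c - Ht trip satisfies, which is why the two durations reported below come out equal.
>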
>In the third model, which is based on the discredited tired light models, light may lose some energy based on an unexplained interaction, but it doesn't slow down nor does it encounter an increasing distance to its target.
>
>The models are written in the Visual FoxPro programming environment and provided in Appendix A of this paper. When they finish, t = 8.83 billion years in the first two models and t = 6 billion years in the third. A video of screenshots of the models running is available on the Internet at:
>
> Video:
> http://www.youtube.com/watch?v=k0JTD3FkWjc
>
>Here is a graph that shows the final result of the models.
>
> Graph:
> (included at the end of the video)
>
>It can be seen from these results that while the distance covered by the v=c-Ht and tired light models is the same, the duration of the trip is larger than the tired light prediction by equal amounts in both the v=c-Ht and expansion models. v=c-Ht may not predict increasing distances, like the Big Bang does, but it does predict increasing durations, identical to the Big Bang.
>
>It stands to reason that if a solar panel collecting energy X in 24 hours were to start collecting the same energy X in 26 hours, then (assuming the change had occurred in the source and not the panel) the increased duration would imply a decreased frequency of the incoming light. Because the increases in duration predicted by the expansion model and the v=c-Ht model are equal, it would also stand to reason that both models predict identical redshifts, the empirically observed Exhibit A.
>
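>As a quick worked number for that argument (the 24 and 26 hour figures are just the example above): if the same train of wave crests arrives over 26 hours instead of 24, the observed frequency is 24/26, or about 92%, of the emitted frequency, a redshift of 26/24 - 1, or roughly 0.08.
>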
>The increase in duration is a feature shared by the v=c-Ht and expansion models, creating a general class of models to which the tired light model does not belong, as it cannot predict Exhibit A and is again ruled out.
>
>Predictions
>
>That leaves the expansion model and v=c-Ht. Even though these models have been shown to both predict an equally increasing duration, there are some differences in the cosmologies they predict, which will be examined so that tests may be devised to determine which of the models is the best fit for the whole cosmos.
>
>v=c-Ht predicts a finite range of light, whereas the established models have an indefinite range of light. Consequently, the distances of the established model must be increasing in an expansion that rewinds back to a Big Bang. Thus, the established models, with an indefinite range of light, predict that the Universe has a finite age and size.
>
>On the other hand, v=c-Ht, with its finite range of light, predicts an indefinite age and size of the Universe. If there were galaxies beyond the finite range of light, we would never see them. This would be confirmed by observing structure in the cosmos older than the expansion model allows for.
>
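>For concreteness, the size of that finite range follows directly from the hypothesis (this is my arithmetic, not an extra assumption): the light stops when v reaches 0, at t = c/H, and since its speed falls linearly from c to 0 it covers half that many million light years along the way.
>
> c = 299792.458
> H = 21.77
> ? c / H, c / H / 2 && about 13772 million years of travel and 6886 million light years of range
>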
>Further, as there are no increasing distances in the v=c-Ht model, it predicts shorter distances between galaxies, and thus a stronger force of gravity should be observed between galaxies than with expanded distances.
>
>Criticisms
>
>The first critical flaw often pointed out, where this hypothesis's predictions and what is observed seem to be in conflict, is that the measured wavelength of redshifted light is increased, whereas my theory predicts the frequency will decrease along with the speed, which would mean a static wavelength. On the contrary, the light's speed is dependent on the time between emission and absorption. That means that once the light interacts with the measurement device, it will have been re-emitted. Since the time the light has been traveling since emission will now be small rather than cosmological, it will be re-emitted at c. The light won't magically regain its redshifted frequency or energy, Exhibit A, but since the hypothesis predicts the light will be traveling at c, its wavelength is predicted to be larger too.
>
>And that is what is observed of the light coming from the diffraction grating. This criticism actually works in favor of the hypothesis. A common reaction to this claim is that it's an ad-hoc resolution to the criticism. But clearly this behavior is dictated by the hypothesis, even if it works differently from the established theories, which should be addressed here.
>
>I'll use Special Relativity as an example, as it seems to be the theory most in disagreement with a decreasing speed of light. The response to this criticism is that Exhibit A requires us to change our picture of Special Relativity, and it can easily be demonstrated how and why by considering light cones in Special Relativity.
>
>No matter what scale you're talking about, somewhere the light cones will intersect. But even under the expansion model of redshift, this won't apply to the light at all scales. In the Big Bang, the distance between the light sources is expanding, and at distances beyond Hubble's Limit the light cones won't intersect at all because the expansion is too great.
>
>The novel finite range of light model, on the other hand, suggests the light cones curve, making "goblet" or wineglass shapes, rather than cone-like martini glasses.
>
> Video:
> http://www.youtube.com/watch?v=Te4AJJTCMXk
>
>These light goblets, and their deviations from their light cone counterparts, are suggested to be the source of a range of cosmological observations, from the fall-off in surface brightness and the time dilation in supernova light curves to, of course, Hubble redshift, Exhibit A.
>
>More predictions and tests are being worked out of the hypothesis; in the meantime, go look at the night sky sometime. Did all of that expand from a single point? Or is it possible light doesn't travel forever, and maybe there's even unfathomably more out there beyond what light is able to show us?
>
>You be the judge.
>
>Appendix A:
>
>
>clear
>lEscape = .f.
>lPictures = .f.
>on escape lEscape = .t.
>
>DECLARE Sleep IN Win32API INTEGER nMilliseconds
>if lPictures
>	Declare Integer formtobmp IN "PCT_DLL.dll" integer hwnd,String bmpFileName
>	lcFile = sys(2015)
>endif
>
>if type("_screen.target1") = "O"
>	_screen.RemoveObject("target1")
>endif 
>if type("_screen.target2") = "O"
>	_screen.RemoveObject("target2")
>endif 
>if type("_screen.target3") = "O"
>	_screen.RemoveObject("target3")
>endif 
>
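>* ntimescale = simulation steps per million years; graphscale and ygraphscale scale the
>* plotted positions (in km) and elapsed time down to screen pixels.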
>ntimescale = 10
>graphscale = 14E+19
>ygraphscale = 200/ntimescale
>
>c = 299792.458
>millyearseconds = 60 * 60 * 24 * 365 * 1000000
>xtarget = 6000 * c * millyearseconds
>xtarget2 = xtarget
>_screen.AddObject("target1", "target")
>_screen.target1.top = 10
>_screen.target1.left = xtarget/graphscale
>_screen.target1.visible = .t.
>_screen.AddObject("target2", "target")
>_screen.target2.top = 40
>_screen.target2.visible = .t.
>_screen.target2.left = xtarget/graphscale
>_screen.AddObject("target3", "target")
>_screen.target3.top = 70
>_screen.target3.visible = .t.
>_screen.target3.left = xtarget/graphscale
>_screen.Cls()
>
>c = 299792.458
>H = 21.77
>c1 = c
>x = 0
>x2 = 0
>x3 = 0
>t = 0
>v2 = 0
>t = 0
>do while not lEscape and ;
>	(empty(_screen.target1.caption) or ;
>	empty(_screen.target2.caption) or ;
>	empty(_screen.target3.caption))
>	
>	* Take a picture
>	* Take a screen shot before we go
>	if lPictures and mod(t, 500) = 0
>		_screen.Caption = transform(t) + " million years"
>		retVal = formtobmp(_vfp.HWnd ,fullpath(lcFile + transform(t) + ".bmp"))  
>	endif 
>	
>	t = t + 1/ntimescale
>
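>	* Model 1: v = c - Ht. The light's speed c1 drops by H per million years as it advances.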
>	if xtarget > x
>		x = x + (c1 * (millyearseconds/ntimescale))	
>		c1 = c1 - (H/ntimescale)
>
>		_screen.ForeColor = rgb(255, 0, 0)
> 		_screen.Circle(5, x/graphscale, 20)
>*		_screen.Circle(5, _screen.Width/2 * (x/graphscale), _screen.Height - t/ygraphscale)
>	endif
>	if empty(_screen.target1.Caption) and xtarget <= x
>		_screen.target1.Caption = transform(t)
>	endif 
>
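>	* Model 2: expansion. The light stays at c while the target recedes at v = H * D,
>	* where D is the distance the light has covered so far.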
>	if xtarget2 > x2
>		x2 = x2 + (c * (millyearseconds/ntimescale))
>		* These work out the same, it's Hubble's Law
>		*v2 = H * (x2 / (c * millyearseconds))
>		v2 = v2 + (H/ntimescale)
>		xtarget2 = xtarget2 + v2 * (millyearseconds/ntimescale)
>		_screen.target2.left = xtarget2/graphscale
>
>		_screen.ForeColor = rgb(0, 0, 255)
>		_screen.Circle(5, x2/graphscale, 50)
>*		_screen.Circle(5, _screen.Width/2 * (x2/graphscale), _screen.Height - t/ygraphscale)
>	endif	
>	if empty(_screen.target2.Caption) and xtarget2 <= x2
>		_screen.target2.Caption = transform(t)
>	endif 
>
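>	* Model 3: tired light. The light stays at c and the target stays put.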
>	if xtarget > x3
>		x3 = x3 + (c * (millyearseconds/ntimescale))	
>
>		_screen.ForeColor = rgb(0, 255, 0)
>		_screen.Circle(5, x3/graphscale, 80)
>*		_screen.Circle(5, _screen.Width/2 * (x3/graphscale), _screen.Height - t/ygraphscale)
>	endif
>	if empty(_screen.target3.Caption) and xtarget <= x3
>		_screen.target3.Caption = transform(t)
>	endif 
>
>enddo
>_screen.Caption = transform(t) + " million years"
>wait window 
>if lPictures
>	retVal = formtobmp(_vfp.HWnd ,fullpath(lcFile + transform(t) + ".bmp"))  
>endif
>
>
>_screen.RemoveObject("target1")
>_screen.RemoveObject("target2")
>_screen.RemoveObject("target3")
>clear dlls
>
>define class target as label
>	caption = ""
>	BorderStyle = 1
>	Width = 50
>	Height = 20
>enddefine 
>

Regards, N Mc Donald