Two atomic clocks are synchronized while on Earth before one is launched (within a GPS satellite) into a medium-altitude Earth orbit. The satellite orbits the Earth at an altitude of 22,000 km, with an orbital velocity of 15,000 km/hr. After 1000 orbits, estimate the time difference between the two atomic clocks. (Assume that only special relativity effects are present.)
Expert's answer
Answer on Question #43043, Physics, Relativity
According to special relativity, the rate of a clock is greatest according to an observer who is at rest with respect to the clock. In a frame of reference in which the clock is not at rest, the clock runs more slowly, as expressed by the Lorentz factor.
The Lorentz factor is defined as:
γ = 1/√(1 − v²/c²) = 1/√(1 − β²)
where:
- v is the relative velocity between the inertial reference frames,
- β is the ratio of v to the speed of light c,
- c = 3⋅10⁸ m/s is the speed of light in a vacuum.
Thus,
γ = 1/√(1 − 4166.67²/(9⋅10¹⁶)) = 1.000000000096451
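As a quick numerical check (not part of the original answer), a few lines of Python reproduce this Lorentz factor from the figures given in the problem:

```python
import math

c = 3e8                     # speed of light in vacuum, m/s
v = 15000 * 1000 / 3600     # 15,000 km/hr converted to m/s (≈ 4166.67 m/s)

beta = v / c
gamma = 1 / math.sqrt(1 - beta**2)   # exact Lorentz factor
gamma_approx = 1 + beta**2 / 2       # first-order expansion, valid for v << c

print(gamma)   # ≈ 1.0000000000965
```

Since γ − 1 is only about 10⁻¹⁰ here, the expansion 1 + β²/2 is often the better choice in practice: when only γ − 1 is needed, β²/2 gives it directly without any loss of precision from subtracting two nearly equal numbers.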
Time dilation
The time Δt′ between two ticks, as measured in the frame in which the clock is moving, is longer than the time Δt between the same ticks as measured in the rest frame of the clock:
Δt′=γΔt
For time Δt we have:
Δt = Distance/velocity = 2π(R + h)N / v
where R = 6.371⋅10⁶ m is the Earth's mean radius.
Thus,
Δt=4166.672π(6.371⋅106+22⋅106)⋅1000=4.27824⋅107 s
The time difference between the two atomic clocks is
δt=Δt′−Δt=γΔt−Δt=Δt(γ−1)
Thus,
δt = 4.27824⋅10⁷ ⋅ (1.000000000096451 − 1.0) = 0.00413 s = 4.13⋅10⁻³ s
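Putting the steps together, a short end-to-end check (a sketch, using β²/2 as the standard small-velocity approximation for γ − 1) confirms the final figure:

```python
import math

c = 3e8                   # speed of light, m/s
v = 15000 * 1000 / 3600   # orbital speed, m/s
R = 6.371e6               # Earth's mean radius, m
h = 22e6                  # orbital altitude, m
N = 1000                  # number of orbits

dt = 2 * math.pi * (R + h) * N / v   # total elapsed time, s
beta = v / c
delta_t = dt * beta**2 / 2           # δt = Δt(γ − 1) ≈ Δt·β²/2
print(delta_t)   # ≈ 4.13e-3 s
```

So over 1000 orbits the orbiting clock lags the Earth-bound clock by about 4 milliseconds, consistent with the result above.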