Interesting question, but what would be even more interesting to ponder is what would happen if a spaceship crew began to broadcast live video/audio as they launched, and then accelerated to near the speed of light. Since time passes much more slowly for the spaceship crew at near light speed than it does for those on Earth, would the Earthbound viewers see the broadcast keep slowing down as the craft approached light speed, until eventually it seemed to be running at a fraction of real time? My guess is that this is precisely what would happen, but I'm curious if anyone has other ideas.
The transmission would run at normal time and speed. It would just be delayed.
Think about this.
The speed of light is the limit to how fast the spacecraft can fly.
But what if the spacecraft were to reach 99% of the speed of light, just 1% short of the limit?
The light beams (or radio waves) broadcast back toward base would be travelling at the speed of light.
But what speed of light? The speed of light as measured relative to the ship that transmitted them? The ship is travelling away from Earth at 99% of the speed of light. So would the radio/light waves actually be received here at 1% of their original velocity?
Or would the radio waves be received here at the local speed of light?
Is that conflict what relativity is all about?
Anyway, that’s all. :D
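To put numbers on those questions (my own sketch, not from anyone in this thread): the signal always arrives at c in every frame of reference, that is the core postulate of relativity. What changes is the signal's frequency and apparent timing, scaled by the standard relativistic Doppler factor for a receding source, sqrt((1 − β)/(1 + β)) where β = v/c:

```python
import math

def doppler_factor(beta):
    """Relativistic Doppler factor for a source receding at speed beta = v/c.

    This scales both the received frequency (redshift) and the apparent
    rate at which the broadcast plays back on Earth. The waves themselves
    still arrive at exactly c -- only their frequency and timing change.
    """
    return math.sqrt((1 - beta) / (1 + beta))

# At 99% of light speed, the broadcast would appear to run at roughly
# 7% of real time -- slowed playback, not waves moving at "1% velocity".
factor = doppler_factor(0.99)
print(f"playback rate at 0.99c: {factor:.4f} of real time")  # ~0.0709
```

So the resolution of the apparent conflict is that the waves are received here at the speed of light, full stop; the ship's recession shows up as redshift and slowed playback, never as a slower wave.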
GPS satellites move at way less than the speed of light, but GPS calculations still need to account for Special and General Relativity. Special Relativity slows the satellites' clocks by about 7 microseconds a day, but General Relativity speeds them up by about 45 microseconds/day. The net is, they gain about 38 microseconds per day. From this link:
To achieve this [high] level of precision, the clock ticks from the GPS satellites must be known to an accuracy of 20-30 nanoseconds. However, because the satellites are constantly moving relative to observers on the Earth, effects predicted by the Special and General theories of Relativity must be taken into account to achieve the desired 20-30 nanosecond accuracy.
Because an observer on the ground sees the satellites in motion relative to them, Special Relativity predicts that we should see their clocks ticking more slowly (see the Special Relativity lecture). Special Relativity predicts that the on-board atomic clocks on the satellites should fall behind clocks on the ground by about 7 microseconds per day because of the slower ticking rate due to the time dilation effect of their relative motion [2].
Further, the satellites are in orbits high above the Earth, where the curvature of spacetime due to the Earth's mass is less than it is at the Earth's surface. A prediction of General Relativity is that clocks closer to a massive object will seem to tick more slowly than those located further away (see the Black Holes lecture). As such, when viewed from the surface of the Earth, the clocks on the satellites appear to be ticking faster than identical clocks on the ground. A calculation using General Relativity predicts that the clocks in each GPS satellite should get ahead of ground-based clocks by 45 microseconds per day.
The combination of these two relativistic effects means that the clocks on-board each satellite should tick faster than identical clocks on the ground by about 38 microseconds per day (45-7=38)! This sounds small, but the high precision required of the GPS system demands nanosecond accuracy, and 38 microseconds is 38,000 nanoseconds. If these effects were not properly taken into account, a navigational fix based on the GPS constellation would be false after only 2 minutes, and errors in global positions would continue to accumulate at a rate of about 10 kilometers each day! The whole system would be utterly worthless for navigation in a very short time.
The engineers who designed the GPS system included these relativistic effects when they designed and deployed the system. For example, to counteract the General Relativistic effect once on orbit, the onboard clocks were designed to "tick" at a slower frequency than ground reference clocks, so that once they were in their proper orbit stations their clocks would appear to tick at about the correct rate as compared to the reference atomic clocks at the GPS ground stations. Further, each GPS receiver has built into it a microcomputer that, in addition to performing the calculation of position using 3D trilateration, will also compute any additional special relativistic timing calculations required [3], using data provided by the satellites.
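The figures quoted above can be sanity-checked from first principles. Here is a rough sketch (my own arithmetic, using first-order textbook approximations and published GPS orbit parameters, not anything taken from the linked article):

```python
import math

C = 2.99792458e8        # speed of light, m/s
GM = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6       # mean Earth radius, m
R_ORBIT = 2.6571e7      # GPS orbit radius (~20,200 km altitude), m
DAY = 86400.0           # seconds per day

# Special Relativity: orbital speed from circular-orbit mechanics,
# then the first-order time-dilation rate v^2 / (2 c^2).
v = math.sqrt(GM / R_ORBIT)                  # ~3.9 km/s
sr_loss = (v**2 / (2 * C**2)) * DAY          # satellite clock falls behind

# General Relativity: gravitational frequency shift between the
# potential at the surface and the potential at orbital altitude.
gr_gain = (GM / C**2) * (1/R_EARTH - 1/R_ORBIT) * DAY  # clock runs ahead

net = gr_gain - sr_loss
print(f"SR loss : {sr_loss*1e6:.1f} us/day")   # ~7 us/day
print(f"GR gain : {gr_gain*1e6:.1f} us/day")   # ~46 us/day
print(f"net     : {net*1e6:.1f} us/day")       # ~38 us/day

# Uncorrected, a timing error becomes a ranging error at the speed of light:
print(f"position drift: {net * C / 1000:.1f} km/day")  # ~11 km/day
```

The numbers land close to the 7, 45, and 38 microseconds per day quoted above, and the accumulated ranging error is on the order of the ~10 km/day the article mentions.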
The Earth-based receiver would have to compensate not for the delay, but for the modification to the signal. The wavelength would get longer the faster the ship went, a process called “redshifting”. The information itself would also arrive more slowly. In a 30-frames-per-second video, one frame naturally takes 1/30 of a second to watch, and viewing the video in real time depends on each frame arriving within that 1/30 of a second. Eventually the frame reception would slow to the point where one frame takes significantly longer than 1/30 of a second to receive, producing an effect like watching a video in slow motion. So if you’re watching live as the ship accelerates, you’ll have to compensate for signal redshift, and even then you’ll get a slower and slower frame rate and stuttering audio until it becomes unwatchable.
In the future, video would have to be sent with some form of signaling to indicate the entire message has been sent so the viewer can compile and watch it as close to real time as possible.
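To put numbers on that frame-rate slowdown (my own illustration; the same receding-source Doppler factor that stretches the carrier wavelength also stretches the spacing between frames):

```python
import math

SOURCE_FPS = 30.0  # frame rate as transmitted by the ship

def received_fps(beta):
    """Apparent frame rate on Earth for a ship receding at v = beta * c.

    Each frame still arrives intact; the frames just arrive further
    apart, scaled by the relativistic Doppler factor.
    """
    return SOURCE_FPS * math.sqrt((1 - beta) / (1 + beta))

# Playback gets slower and slower as the ship approaches light speed:
for beta in (0.1, 0.5, 0.9, 0.99, 0.999):
    print(f"v = {beta:.3f}c -> {received_fps(beta):5.2f} fps")
```

At 0.99c the received rate is only about 2 frames per second, which matches the slow-motion, eventually unwatchable picture described above.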
“Interesting question, but what would be even more interesting to ponder is what would happen if a spaceship crew began to broadcast live video/audio as they launched, and then accelerated to near the speed of light.”
That’s where my mind went as well.