Yes. A radio transmission from a source to a destination 12 light years away, with the destination receding at 75% of the speed of light, would appear slowed down to the recipient. This is a consequence of relativity: time dilation, compounded by the signal's steadily increasing travel distance.
Time dilation occurs when an object moves relative to an observer at a significant fraction of the speed of light. According to the theory of relativity, each observer measures the other's clock as running slow. In this case, the recipient at the receding destination would measure the source's clock, and hence the transmission it emits, as running slow.
The degree of time dilation can be calculated using the Lorentz factor, denoted by γ. The Lorentz factor is given by the equation:
γ = 1 / sqrt(1 - (v^2 / c^2))
Where:
- v is the relative velocity between the source and the destination (0.75c, where c is the speed of light).
- c is the speed of light in a vacuum (approximately 299,792,458 meters per second).
In this scenario, the Lorentz factor can be calculated as follows:
γ = 1 / sqrt(1 - 0.75^2) = 1 / sqrt(0.4375)
γ ≈ 1.512
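The arithmetic can be checked with a short Python sketch (the function name `lorentz_factor` is just illustrative):

```python
import math

def lorentz_factor(beta):
    """Lorentz factor for a relative speed given as a fraction of c (beta = v/c)."""
    return 1.0 / math.sqrt(1.0 - beta ** 2)

gamma = lorentz_factor(0.75)
print(f"gamma = {gamma:.3f}")  # gamma = 1.512
```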
This means that time on the receding destination, as measured from the source's frame, passes approximately 1.512 times slower (and symmetrically, the recipient measures the source's clock as slowed by the same factor).
As a result, the radio transmission would be perceived as slowed down by the recipient. Time dilation alone would stretch each transmitted second to about 1.51 received seconds (equivalently, about 0.66 seconds of signal content per second of reception). On top of this, each successive wavefront has farther to travel as the destination recedes, so the total observed stretch is the relativistic Doppler factor sqrt((1 + v/c) / (1 - v/c)) = sqrt(1.75 / 0.25) = sqrt(7) ≈ 2.65: one second of transmitted signal takes about 2.65 seconds to play out at the receiver.
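The total slowdown seen by a receding recipient, which combines time dilation with the growing light travel distance, can be sketched with the standard relativistic Doppler formula (the function name `doppler_stretch` is illustrative):

```python
import math

def doppler_stretch(beta):
    """Factor by which a signal is stretched in time for a receiver
    receding at speed beta = v/c (relativistic Doppler effect)."""
    return math.sqrt((1.0 + beta) / (1.0 - beta))

stretch = doppler_stretch(0.75)
print(f"stretch = {stretch:.3f}")  # stretch = 2.646
```

For v = 0, the stretch factor is 1 (no slowdown), and it grows without bound as v approaches c.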