
The radiation resistance of a half-wavelength antenna depends on its design, construction, and environment. In general, a thin half-wavelength dipole in free space, one of the most common antenna types, has a radiation resistance of approximately 73 ohms (Ω).
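As a quick numeric check, the textbook closed form for a thin half-wave dipole is R_rad = (η₀/4π)·Cin(2π) ≈ 30·Cin(2π), where Cin(x) = γ + ln(x) − Ci(x). Here is a minimal Python sketch of that calculation, assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.special import sici  # sici(x) returns (Si(x), Ci(x))

ETA0 = 376.730313668            # impedance of free space, ohms

x = 2 * np.pi
_, ci = sici(x)
cin = np.euler_gamma + np.log(x) - ci   # Cin(x) = gamma + ln(x) - Ci(x)
r_rad = ETA0 / (4 * np.pi) * cin        # ~ 30 * Cin(2*pi)
print(f"Half-wave dipole R_rad ~ {r_rad:.2f} ohms")  # prints ~73.1
```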

The radiation resistance is the portion of the antenna's input impedance associated with power radiated as electromagnetic waves into space. Note that the radiation resistance alone does not give a complete picture of the antenna's impedance, which also includes a reactive part (about +j42.5 Ω for a thin half-wave dipole) and any ohmic loss resistance.
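To see what that impedance means in practice, you can compute the reflection coefficient and VSWR against a feed line. This sketch assumes the common textbook value Z ≈ 73 + j42.5 Ω for a thin half-wave dipole and an illustrative 50 Ω line:

```python
# Mismatch of a thin half-wave dipole (textbook value ~73 + j42.5 ohms)
# against an assumed 50-ohm feed line.
Z_ANT = complex(73.0, 42.5)   # approximate dipole input impedance, ohms
Z0 = 50.0                     # feed-line characteristic impedance, ohms

gamma = (Z_ANT - Z0) / (Z_ANT + Z0)          # voltage reflection coefficient
vswr = (1 + abs(gamma)) / (1 - abs(gamma))   # voltage standing-wave ratio
print(f"|Gamma| = {abs(gamma):.3f}, VSWR = {vswr:.2f}")  # ~0.371, ~2.18
```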

It's worth mentioning that the actual radiation resistance may deviate from the theoretical value due to nearby objects, ground effects, and other environmental conditions. Different antenna designs and lengths also yield different radiation resistances, so the value in a practical installation may differ from the 73 Ω figure.
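To illustrate the length dependence, the radiation resistance of a thin center-fed dipole (assuming a sinusoidal current distribution, referenced to the current maximum, lossless wire in free space) can be obtained by numerically integrating the far-field pattern:

```python
import numpy as np
from scipy.integrate import quad

ETA0 = 376.730313668   # impedance of free space, ohms

def r_rad(length_over_lambda):
    """Radiation resistance (ohms, referenced to the current maximum)
    of a thin center-fed dipole of the given electrical length."""
    kl2 = np.pi * length_over_lambda   # k*l/2 for length l in wavelengths

    def integrand(theta):
        num = np.cos(kl2 * np.cos(theta)) - np.cos(kl2)
        return num**2 / np.sin(theta)

    integral, _ = quad(integrand, 0.0, np.pi)
    return ETA0 / (2 * np.pi) * integral   # ~ 60 * integral

for l in (0.25, 0.5, 0.75, 1.0):
    print(f"l = {l:.2f} lambda : R_rad ~ {r_rad(l):6.1f} ohms")
# l = 0.50 lambda reproduces the ~73-ohm figure above.
```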
