Relativistic Interpretation of Magnetic Fields
18 years 11 months ago #13038
by north
quote: Originally posted by Thomas

In standard textbooks (e.g. the Berkeley Physics Course, Feynman), the magnetic field of a current-carrying wire is derived from the principles of relativity by considering the length contraction of the charge distributions in different reference frames. This yields a result exactly identical to the one predicted by electrodynamics. However, the derivation is always done only for an infinite wire. For a finite wire, one finds that the results in fact differ: it is straightforward to show (see my page Magnetic Fields and Lorentz Force) that a linear charge distribution of length L has, to second order, an electric field

E = Q*(1/r^2 - L^2/(2*r^4)),

where Q is the total charge, L the length of the wire, and r the perpendicular distance from the wire. This means that any length contraction of L has no effect on the first (monopole) term, and the net monopole field due to the ions and electrons is zero. Length contraction would thus only produce a magnetic field falling off asymptotically as ~1/r^4. Electrodynamics, however, yields a field ~1/r^2 for the finite wire (see any derivation of the Biot-Savart law).

Apart from simply noting this apparent inconsistency between relativity and classical electrodynamics, I wonder what the correct far-field behaviour of a current system actually is. Does anybody know of observational or experimental evidence on this point? (For clarification: I am referring to the static magnetic fields of DC currents here, not the radiation field of AC currents.)

Thomas
www.physicsmyths.org.uk
www.plasmaphysics.org.uk

(end quote)
Thomas,

How do you conceptualize your question?
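As a numerical sanity check on the expansion quoted above, the sketch below evaluates the exact perpendicular electric field of a uniform finite line charge and compares it with the monopole term and the two-term expansion. It assumes the field point lies on the perpendicular bisector of the wire (Gaussian units, with the Biot-Savart part in units where mu0/4pi = 1); for that particular geometry the second-order coefficient works out to 1/8 rather than the 1/2 quoted, but the point at issue is unchanged either way: the finite-length correction enters only at order L^2/r^4, while the Biot-Savart field of the same finite segment falls off as 1/r^2.

```python
import math

Q = 1.0   # total charge on the wire (arbitrary units)
L = 1.0   # wire length
I = 1.0   # current, for the Biot-Savart comparison

def e_exact(r):
    # Exact perpendicular E-field of a uniform finite line charge,
    # evaluated on the perpendicular bisector (Gaussian units):
    #   E = Q / (r * sqrt(r^2 + L^2/4))
    return Q / (r * math.sqrt(r**2 + L**2 / 4))

def e_monopole(r):
    # Leading (monopole) term only.
    return Q / r**2

def e_second_order(r):
    # Two-term expansion of e_exact for r >> L; the correction is
    # of order L^2/r^4 (coefficient 1/8 for this bisector geometry).
    return Q * (1 / r**2 - L**2 / (8 * r**4))

def b_finite_wire(r):
    # Biot-Savart field of a finite straight segment on its
    # perpendicular bisector (units with mu0/4pi = 1):
    #   B = I * L / (r * sqrt(r^2 + L^2/4))
    return I * L / (r * math.sqrt(r**2 + L**2 / 4))

for r in (5.0, 20.0, 100.0):
    exact = e_exact(r)
    # The two-term expansion should beat the bare monopole term.
    assert abs(e_second_order(r) - exact) < abs(e_monopole(r) - exact)
    # B approaches the 1/r^2 form I*L/r^2 in the far field.
    assert abs(b_finite_wire(r) * r**2 / (I * L) - 1) < 0.01
```

This only checks the internal consistency of the expansions for one geometry; it does not settle the physical question of which far-field behaviour is observed.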