WHICH IS MORE "FUNDAMENTAL,"
ELECTRIC CURRENT OR CHARGE?
(c)1996 William J. Beaty
Here's a perspective on Electricity which I've yet to encounter elsewhere.
The Ampere is far easier to measure than the Coulomb, if the measurement
must be made with extreme precision. Time (seconds) is also easy to
measure. Therefore, wherever physical measurement standards are concerned,
the Ampere and the Second are what get measured directly, and the value of
the standard Coulomb is derived from those measurements.
Unfortunately this concept has a very large and unwarranted impact on
physics teaching and on educational materials. It appears in an ugly,
twisted form. It appears as an assertion that electric current is more
"real" than charge. It even appears as educators' unexamined assumption
that electric currents are real, while electric charge is merely an
abstract concept. Not true!
The same error appears in K-12 school books, where it is claimed that
"quantity of electricity" is measured in Amperes, not in Coulombs. The
same books assert that "current electricity" is a fundamental type of
electrical substance, and they frequently state that "current" is the
substance which flows within wires. Yet the "stuff" that flows in wires
is called charge, not current.
The same problem appears in higher-level texts where students are taught
about Amperes, but rarely are they given enough information to properly
visualize Amperes in terms of flowing Coulombs. Sometimes students are
taught first about electric current and Amperes, and only later (if at
all) taught about Coulombs and about the electron sea of metals which does
the flowing.
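As a quick illustration of the missing visualization (my own hedged sketch,
not drawn from any particular textbook): one Ampere simply means one Coulomb
of charge passing a given point each second, and since each electron carries
roughly 1.602e-19 Coulombs, a one-Ampere current is about 6.2 x 10^18
electrons drifting past per second. A few lines of Python make the
conversion explicit (the function name and sample currents are only
illustrative):

    # A rough sketch: expressing a current in "coulombs per second" and in
    # electrons per second, to help visualize an Ampere as flowing charge.
    # (Illustrative only; the electron charge value is approximate.)

    ELECTRON_CHARGE = 1.602e-19   # coulombs carried by one electron (approx.)

    def describe_current(amperes):
        coulombs_per_second = amperes          # 1 A is 1 C/s by definition
        electrons_per_second = coulombs_per_second / ELECTRON_CHARGE
        print(f"{amperes} A = {coulombs_per_second} coulombs per second "
              f"= about {electrons_per_second:.2e} electrons per second")

    describe_current(1.0)     # about 6.24e+18 electrons per second
    describe_current(0.02)    # 20 mA, a small LED-sized current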
The idea that "Amperes are more real" also appears subtly all through non-
science electronics texts, where authors focus on current, on amperes.
They talk constantly about the flowing motion of "current," yet rarely
if ever mention the flowing charge.
The misconception has spread so far that it has infected electrical
engineering. Our textbooks teach us about "current carriers," and the law
of "conservation of current" in circuits. Neither one exists.
Charge-carriers exist. Charge is conserved. But electric current can
appear and vanish, and doesn't fall under any conservation law. Conservation
of current? Particles made out of current? It's just bizarre!
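A concrete way to see the difference (a toy sketch of my own, with arbitrary
component values, not taken from any textbook): when a capacitor charges
through a resistor, the current starts out large, dwindles, and effectively
vanishes, yet the charge it delivered does not vanish; it sits on the
plates. The little simulation below shows the current dying away while the
accumulated charge remains.

    # Toy RC-charging simulation: the current appears and then dies away,
    # but the charge it delivered accumulates on the capacitor and stays.
    # (Component values are arbitrary; this is only an illustration.)

    V = 9.0        # battery voltage, volts
    R = 1000.0     # resistance, ohms
    C = 1e-3       # capacitance, farads
    dt = 0.01      # time step, seconds

    charge = 0.0   # coulombs on the capacitor plate
    for step in range(1000):               # simulate 10 seconds
        current = (V - charge / C) / R     # amperes flowing right now
        charge += current * dt             # the conserved "stuff" piles up

    print(f"final current: {current:.6f} A")   # nearly zero -- it vanished
    print(f"final charge:  {charge:.4f} C")    # about C*V = 0.009 C, still there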
Also the same distorted concept appears in the widespread conviction that
charge is ghostly and unimportant, while electric current is real and
substance-like.
And it appears in the idea that electric charge only applies to "static
electricity", a phenomenon thought to be mostly useless, (static cling,
doorknob sparks) or even dangerous (lightning). Conversely, Electric
Current is supposed to apply to modern technology of nearly every kind.
I have long wondered where these various misconceptions arose, but recently
I've come to suspect that they share a common origin. I believe that our
method of standardizing physical units might be the cause. The common
thread of these misconceptions is the fallacy that amperes are more
*fundamental* than coulombs, where the word "fundamental" is mistakenly
used in a popular sense, rather than with a specialized meaning pertaining
to standard physical units: "fundamental" units, versus "derived" units.
In the everyday world, and using a popular meaning of "fundamental," we
would say that conserved quantities are more fundamental than rates. For
example, kilograms are more fundamental than kilograms-per-second, volume is
more fundamental than flow rate, distance (meters) is more fundamental than
speed, and Joules of energy are more fundamental than Watts of energy flow.
After all, it would be ridiculous to introduce the concept of speed to
students who have little understanding of distance or time, or to introduce
gallons-per-second to someone who had never encountered water before.
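To make the parallel concrete, here is a small sketch of my own (not part of
the original argument): summing a flow rate over time recovers the conserved
quantity. Below, a current in amperes (coulombs per second) is added up,
sample by sample, to give the charge in coulombs that flowed, just as adding
up gallons-per-second over time gives gallons.

    # Sketch: a flow rate summed over time recovers the conserved quantity.
    # Here the "rate" is current in amperes (coulombs per second), and the
    # recovered quantity is charge in coulombs.  (Illustrative example only.)

    def total_charge(current_samples, dt):
        """Approximate Q as the sum of I * dt over the samples (a crude integral)."""
        return sum(i * dt for i in current_samples)

    dt = 1.0                                  # seconds between samples
    currents = [0.2 * t for t in range(10)]   # a current ramping from 0 A to 1.8 A

    print(total_charge(currents, dt), "coulombs flowed")   # 9.0 coulombs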
But in electricity, many believe the opposite, that current is real and that
charge is abstract. They'll fight fiercely to defend their belief. Some
authors describe Amperes as fundamental units, and
point out that Coulombs are derived from Amperes. They may introduce the
Ampere to students who have no grasp of the Coulomb. They write as if
"electricity" is always measured in Amperes, while also writing that charge
is an abstract and hard-to-understand concept involving a strange unit called
...not the Coulomb, but instead, the Ampere-second. (Charge is amperes
TIMES seconds, so obviously it must be an abstract mathematical concept?)
I say, on the contrary. Coulombs are fundamental, while Amperes are just
a convenient simplification, a jargon-term meaning "Coulombs per second."
Yes yes, the Coulomb unit is a Derived Standard, while the Ampere unit is
a Fundamental Standard. Even so, electric charge is fundamental, and
electric current is just the flow-rate of charge. Coulombs are a conserved
quantity, like a "stuff," while Amperes are not.
This does seem to violate the statement that "Amperes are fundamental,
while Coulombs are derived." But this statement is really saying that
"the physical standard for Amperes is directly measured, while the
standard for Coulombs is derived from Amperes and Seconds." It's not
saying that Amperes are conceptually fundamental; it only tells us which
unit's standard is easier to measure with precision.
When explaining electricity, I suggest that we avoid trying to teach students
about the
alternate definitions of the word "fundamental", and avoid teaching about
Fundamental versus Derived units (at least in the lower grades). We
should stick with the definition of "fundamental" which most students
already know. I believe that the inexperienced learner will find
much more sense in the statement:
CHARGE IS MORE FUNDAMENTAL THAN CURRENT. THE COULOMB IS A FUNDAMENTAL
ENTITY, WHILE THE "AMPERE" IS SHORTHAND FOR "COULOMBS PER SECOND."
It is the charge which is "real," while the current is a rate, a flow, an
abstract concept.
Going further ...I also believe that students would be better served if the
term "Ampere" were held back until later grades, so that elementary
electricity could be taught based on the concept of charge, and on flow
rates measured in charge-per-second. If additional terms were introduced,
they should be
the terms "Coulomb" and "Coulombs per second." The term "Ampere" is
engineering shorthand, and should only be introduced to students who have
lots of experience thinking in terms of "coulombs per second."
Now, how does one convince a textbook publisher to take these ideas
seriously, when he or she can open most any physics book and find the
clear statement that "amperes are the fundamental unit"?