Energy

In physics, energy (Greek: ἐνέργεια energeia "activity, operation"[1]) is an indirectly observed quantity. It is often understood as the ability a physical system has to do work on other physical systems.[2][3] Since work is defined as a force acting through a distance (a length of space), energy is always equivalent to the ability to exert pulls or pushes against the basic forces of nature, along a path of a certain length.

The total energy contained in an object is identified with its mass, and energy, like mass, cannot be created or destroyed. When matter (ordinary material particles) is changed into energy (such as energy of motion, or into radiation), the mass of the system does not change through the transformation process. However, there may be mechanistic limits as to how much of the matter in an object may be changed into other types of energy and thus into work on other systems. Energy, like mass, is a scalar physical quantity. In the International System of Units (SI), energy is measured in joules, but in many fields other units, such as kilowatt-hours and kilocalories, are customary. All of these units translate to units of work, which is always defined in terms of forces and the distances that the forces act through.

A system can transfer energy to another system by simply transferring matter to it (since matter is equivalent to energy, in accordance with its mass). However, when energy is transferred by means other than matter-transfer, the transfer produces changes in the second system, as a result of work done on it. This work manifests itself as the effect of force(s) applied through distances within the target system. For example, a system can emit energy to another by transferring (radiating) electromagnetic energy, but this creates forces upon the particles that absorb the radiation. Similarly, a system may transfer energy to another by physically impacting it, but in that case the energy of motion in one object, called kinetic energy, causes forces to act over distances (transferring energy) within the object that is struck. Transfer of thermal energy by heat occurs by both of these mechanisms: heat can be transferred by electromagnetic radiation, or by physical contact in which direct particle-particle impacts transfer kinetic energy.

Energy may be stored in systems without being present as matter, or as kinetic or electromagnetic energy. Stored energy is created whenever a particle has been moved through a field it interacts with (requiring a force to do so), but the energy to accomplish this is stored as a new position of the particles in the field—a configuration that must be "held" or fixed by a different type of force (otherwise, the new configuration would resolve itself by the field pushing or pulling the particle back toward its previous position). This type of energy, "stored" by force fields and by particles that have been forced into a new physical configuration in the field by work done on them by another system, is referred to as potential energy. A simple example of potential energy is the work needed to lift an object in a gravity field, up to a support. Each of the basic forces of nature is associated with a different type of potential energy, and all types of potential energy (like all other types of energy) appear as system mass, whenever present. For example, a compressed spring will be slightly more massive than before it was compressed. Likewise, whenever energy is transferred between systems by any mechanism, an associated mass is transferred with it.

Any form of energy may be transformed into another form. For example, all types of potential energy are converted into kinetic energy when the objects are given freedom to move to a different position (for example, when an object falls off a support). When energy is in a form other than thermal energy, it may be transformed with good or even perfect efficiency to any other type of energy, including electricity or production of new particles of matter. With thermal energy, however, there are often limits to the efficiency of the conversion to other forms of energy, as described by the second law of thermodynamics.

In all such energy transformation processes, the total energy remains the same, and a transfer of energy from one system to another results in a loss in one that compensates the gain in the other. This principle, the conservation of energy, was first postulated in the early 19th century, and applies to any isolated system. According to Noether's theorem, the conservation of energy is a consequence of the fact that the laws of physics do not change over time.[4]

Although the total energy of a system does not change with time, its value may depend on the frame of reference. For example, a seated passenger in a moving airplane has zero kinetic energy relative to the airplane, but non-zero kinetic energy (and higher total energy) relative to the Earth.

History
Main articles: History of energy and timeline of thermodynamics, statistical mechanics, and random processes

The word energy derives from the Greek ἐνέργεια energeia, which possibly appears for the first time in the work of Aristotle in the 4th century BCE.
Thomas Young – the first to use the term "energy" in the modern sense.

The concept of energy emerged out of the idea of vis viva (living force), which Gottfried Leibniz defined as the product of the mass of an object and its velocity squared; he believed that total vis viva was conserved. To account for slowing due to friction, Leibniz theorized that thermal energy consisted of the random motion of the constituent parts of matter, a view shared by Isaac Newton, although it would be more than a century until this was generally accepted. In 1807, Thomas Young was possibly the first to use the term "energy" instead of vis viva, in its modern sense.[5] Gaspard-Gustave Coriolis described "kinetic energy" in 1829 in its modern sense, and in 1853, William Rankine coined the term "potential energy". It was argued for some years whether energy was a substance (the caloric) or merely a physical quantity, such as momentum.

William Thomson (Lord Kelvin) amalgamated these ideas into the laws of thermodynamics, which aided the rapid development of explanations of chemical processes by Rudolf Clausius, Josiah Willard Gibbs, and Walther Nernst. It also led to a mathematical formulation of the concept of entropy by Clausius and to the introduction of laws of radiant energy by Jožef Stefan.

During a 1961 lecture[6] for undergraduate students at the California Institute of Technology, Richard Feynman, a celebrated physics teacher and Nobel Laureate, said this about the concept of energy:

There is a fact, or if you wish, a law, governing all natural phenomena that are known to date. There is no known exception to this law—it is exact so far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens. It is not a description of a mechanism, or anything concrete; it is just a strange fact that we can calculate some number and when we finish watching nature go through her tricks and calculate the number again, it is the same.
—The Feynman Lectures on Physics

Since 1918 it has been known that the law of conservation of energy is the direct mathematical consequence of the translational symmetry of the quantity conjugate to energy, namely time. That is, energy is conserved because the laws of physics do not distinguish between different instants of time (see Noether's theorem).

Energy in various contexts

The concept of energy and its transformations is useful in explaining and predicting most natural phenomena. The direction of transformations in energy (what kind of energy is transformed to what other kind) is often described by entropy (equal energy spread among all available degrees of freedom) considerations, as in practice all energy transformations are permitted on a small scale, but certain larger transformations are not permitted because it is statistically unlikely that energy or matter will randomly move into more concentrated forms or smaller spaces.

The concept of energy is widespread in all sciences.

In the context of chemistry, energy is an attribute of a substance as a consequence of its atomic, molecular or aggregate structure. Since a chemical transformation is accompanied by a change in one or more of these kinds of structure, it is invariably accompanied by an increase or decrease of energy of the substances involved. Some energy is transferred between the surroundings and the reactants of the reaction in the form of heat or light; thus the products of a reaction may have more or less energy than the reactants. A reaction is said to be exergonic if the final state is lower on the energy scale than the initial state; in the case of endergonic reactions the situation is the reverse. Chemical reactions are usually not possible unless the reactants surmount an energy barrier known as the activation energy. The speed of a chemical reaction (at a given temperature T) is related to the activation energy E by the Boltzmann population factor \( e^{-E/kT} \), that is, the probability that a molecule has energy greater than or equal to E at the given temperature T. This exponential dependence of a reaction rate on temperature is known as the Arrhenius equation (see the short numerical sketch below). The activation energy necessary for a chemical reaction can be supplied in the form of thermal energy.
In biology, energy is an attribute of all biological systems, from the biosphere to the smallest living organism. Within an organism it is responsible for the growth and development of a biological cell or organelle. Energy is thus often said to be stored by cells in the structures of molecules of substances such as carbohydrates (including sugars), lipids, and proteins, which release energy when reacted with oxygen in respiration. In human terms, the human equivalent (H-e) (human energy conversion) indicates, for a given amount of energy expenditure, the relative quantity of energy needed for human metabolism, assuming an average human energy expenditure of 12,500 kJ per day and a basal metabolic rate of 80 watts. For example, if our bodies run (on average) at 80 watts, then a light bulb running at 100 watts is running at 1.25 human equivalents (100 ÷ 80), i.e. 1.25 H-e. For a difficult task of only a few seconds' duration, a person can put out thousands of watts, many times the 746 watts in one official horsepower. For tasks lasting a few minutes, a fit human can generate perhaps 1,000 watts. For an activity that must be sustained for an hour, output drops to around 300 watts; for an activity kept up all day, 150 watts is about the maximum.[7] The human equivalent assists understanding of energy flows in physical and biological systems by expressing energy units in human terms: it provides a “feel” for the use of a given amount of energy.[8]
In geology, continental drift, mountain ranges, volcanoes, and earthquakes are phenomena that can be explained in terms of energy transformations in the Earth's interior,[9] while meteorological phenomena like wind, rain, hail, snow, lightning, tornadoes and hurricanes are all a result of energy transformations brought about by solar energy on the atmosphere of the planet Earth.
In cosmology and astronomy the phenomena of stars, novae, supernovae, quasars and gamma-ray bursts are the universe's highest-output energy transformations of matter. All stellar phenomena (including solar activity) are driven by various kinds of energy transformations. Energy in such transformations is either from gravitational collapse of matter (usually molecular hydrogen) into various classes of astronomical objects (stars, black holes, etc.), or from nuclear fusion (of lighter elements, primarily hydrogen).
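As a numerical sketch of the chemistry point above, the following Python snippet evaluates the Boltzmann population factor \( e^{-E/kT} \) for an assumed activation energy of 50 kJ/mol at two temperatures; the values are illustrative assumptions, not data from the text.

import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro constant, 1/mol

def boltzmann_factor(activation_energy_J_per_mol, temperature_K):
    # Fraction of molecules with energy >= E at temperature T.
    e_per_molecule = activation_energy_J_per_mol / N_A
    return math.exp(-e_per_molecule / (k_B * temperature_K))

for T in (300.0, 350.0):
    print(T, boltzmann_factor(50e3, T))   # the factor grows steeply with temperature

The steep growth of this factor with temperature is the exponential dependence captured by the Arrhenius equation.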

Energy transformations in the universe over time are characterized by various kinds of potential energy that have been available since the Big Bang, later being "released" (transformed to more active types of energy such as kinetic or radiant energy) when a triggering mechanism is available.

Familiar examples of such processes include nuclear decay, in which energy is released that was originally "stored" in heavy isotopes (such as uranium and thorium) by nucleosynthesis, a process that ultimately used the gravitational potential energy released from the gravitational collapse of supernovae to store energy in the creation of these heavy elements before they were incorporated into the solar system and the Earth. This energy is triggered and released in nuclear fission bombs. In a slower process, radioactive decay of these atoms in the core of the Earth releases heat. This thermal energy drives plate tectonics and may lift mountains via orogenesis. This slow lifting represents a kind of gravitational potential energy storage of the thermal energy, which may later be released as active kinetic energy in landslides, after a triggering event. Earthquakes also release stored elastic potential energy in rocks, a store that has been produced ultimately from the same radioactive heat sources. Thus, according to present understanding, familiar events such as landslides and earthquakes release energy that has been stored as potential energy in the Earth's gravitational field or as elastic strain (mechanical potential energy) in rocks. Prior to this, they represent release of energy that had been stored in heavy atoms since the collapse of the long-destroyed supernova stars that created these atoms.

In another similar chain of transformations beginning at the dawn of the universe, nuclear fusion of hydrogen in the Sun also releases another store of potential energy which was created at the time of the Big Bang. At that time, according to theory, space expanded and the universe cooled too rapidly for hydrogen to completely fuse into heavier elements. This meant that hydrogen represents a store of potential energy that can be released by fusion. Such a fusion process is triggered by heat and pressure generated from gravitational collapse of hydrogen clouds when they produce stars, and some of the fusion energy is then transformed into sunlight. Such sunlight from our Sun may again be stored as gravitational potential energy after it strikes the Earth, as (for example) water evaporates from oceans and is deposited upon mountains (where, after being released at a hydroelectric dam, it can be used to drive turbines or generators to produce electricity). Sunlight also drives many weather phenomena, save those generated by volcanic events. An example of a solar-mediated weather event is a hurricane, which occurs when large unstable areas of warm ocean, heated over months, give up some of their thermal energy suddenly to power a few days of violent air movement. Sunlight is also captured by plants as chemical potential energy in photosynthesis, when carbon dioxide and water (two low-energy compounds) are converted into the high-energy compounds carbohydrates, lipids, and proteins. Plants also release oxygen during photosynthesis, which is utilized by living organisms as an electron acceptor, to release the energy of carbohydrates, lipids, and proteins. Release of the energy stored during photosynthesis as heat or light may be triggered suddenly by a spark, in a forest fire, or it may be made available more slowly for animal or human metabolism, when these molecules are ingested, and catabolism is triggered by enzyme action.

Through all of these transformation chains, potential energy stored at the time of the Big Bang is later released by intermediate events, sometimes being stored in a number of ways over time between releases, as more active energy. In all these events, one kind of energy is converted to other types of energy, including heat.

Distinction between energy and power

Although in everyday usage the terms energy and power are essentially synonyms, scientists and engineers distinguish between them. In its technical sense, power is not at all the same as energy, but is the rate at which energy is converted (or, equivalently, at which work is performed). Thus a hydroelectric plant, by allowing the water above the dam to pass through turbines, converts the water's potential energy into kinetic energy and ultimately into electric energy, whereas the amount of electric energy that is generated per unit of time is the electric power generated. The same amount of energy converted through a shorter period of time is more power over that shorter time.
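Stated as a relation, power is the energy converted per unit time,

\( P = \frac{\Delta E}{\Delta t} . \)

For example, converting 3.6 MJ (one kilowatt-hour) of energy in one hour corresponds to an average power of \( 3.6\times 10^{6}\,\mathrm{J} / 3600\,\mathrm{s} = 1\,\mathrm{kW} \), while converting the same 3.6 MJ in one minute corresponds to 60 kW.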

Conservation of energy
Main article: Conservation of energy

Energy is subject to the law of conservation of energy. According to this law, energy can neither be created (produced) nor destroyed by itself. It can only be transformed.

Most kinds of energy (with gravitational energy being a notable exception)[10] are subject to strict local conservation laws as well. In this case, energy can only be exchanged between adjacent regions of space, and all observers agree as to the volumetric density of energy in any given space. There is also a global law of conservation of energy, stating that the total energy of the universe cannot change; this is a corollary of the local law, but not vice versa.[6][11] Conservation of energy is the mathematical consequence of translational symmetry of time (that is, the indistinguishability of time intervals taken at different times)[12] (see Noether's theorem).

According to the conservation of energy, the total inflow of energy into a system must equal the total outflow of energy from the system plus the change in the energy contained within the system.
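Written as an equation, this balance reads

\( E_{\mathrm{in}} = E_{\mathrm{out}} + \Delta E_{\mathrm{system}} , \)

where \( \Delta E_{\mathrm{system}} \) is the change in the energy contained within the system.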

This law is a fundamental principle of physics. It follows from the translational symmetry of time, a property of most phenomena below the cosmic scale that makes them independent of their locations on the time coordinate. Put differently, yesterday, today, and tomorrow are physically indistinguishable.

This is because energy is the quantity which is canonically conjugate to time. This mathematical entanglement of energy and time also results in the uncertainty principle: it is impossible to define the exact amount of energy during any definite time interval. The uncertainty principle should not be confused with energy conservation; rather, it provides mathematical limits on how precisely energy can in principle be defined and measured.

In quantum mechanics energy is expressed using the Hamiltonian operator. On any time scale, the uncertainty in the energy is given by

\( \Delta E \Delta t \ge \frac { \hbar } {2 } \)

which is similar in form to the Heisenberg uncertainty principle (but not really mathematically equivalent thereto, since H and t are not dynamically conjugate variables, neither in classical nor in quantum mechanics).

In particle physics, this inequality permits a qualitative understanding of virtual particles, which carry momentum and whose exchange with and between real particles is responsible for the creation of all known fundamental forces (more accurately known as fundamental interactions). Virtual photons (which are simply the lowest quantum mechanical energy states of photons) are also responsible for the electrostatic interaction between electric charges (which results in Coulomb's law), for spontaneous radiative decay of excited atomic and nuclear states, for the Casimir force, for van der Waals forces and some other observable phenomena.

Applications of the concept of energy

Energy is subject to a strict global conservation law; that is, whenever one measures (or calculates) the total energy of a system of particles whose interactions do not depend explicitly on time, it is found that the total energy of the system always remains constant.[13]

The total energy of a system can be subdivided and classified in various ways. For example, it is sometimes convenient to distinguish potential energy (which is a function of coordinates only) from kinetic energy (which is a function of coordinate time derivatives only). It may also be convenient to distinguish gravitational energy, electric energy, thermal energy, and other forms. These classifications overlap; for instance, thermal energy usually consists partly of kinetic and partly of potential energy.
The transfer of energy can take various forms; familiar examples include work, heat flow, and advection, as discussed below.
The word "energy" is also used outside of physics in many ways, which can lead to ambiguity and inconsistency. The vernacular terminology is not consistent with technical terminology. For example, while energy is always conserved (in the sense that the total energy does not change despite energy transformations), energy can be converted into a form, e.g., thermal energy, that cannot be utilized to perform work. When one talks about "conserving energy by driving less," one talks about conserving fossil fuels and preventing useful energy from being lost as heat. This usage of "conserve" differs from that of the law of conservation of energy.[11]

In classical physics energy is considered a scalar quantity, the canonical conjugate to time. In special relativity energy is also a scalar (although not a Lorentz scalar but a time component of the energy-momentum 4-vector).[14] In other words, energy is invariant with respect to rotations of space, but not invariant with respect to rotations of space-time (= boosts).

Energy transfer

Because energy is strictly conserved and is also locally conserved (wherever it can be defined), it is important to remember that by the definition of energy the transfer of energy between the "system" and adjacent regions is work. A familiar example is mechanical work. In simple cases this is written as the following equation:

\( \Delta{}E = W \) (1)

if there are no other energy-transfer processes involved. Here \( \Delta E \) is the amount of energy transferred, and W represents the work done on the system.

More generally, the energy transfer can be split into two categories:

\( \Delta{}E = W + Q \) (2)

where Q represents the heat flow into the system.

There are other ways in which an open system can gain or lose energy. In chemical systems, energy can be added to a system by adding substances with different chemical potentials, whose potential energy is then extracted (both of these processes are illustrated by fueling an automobile, a system which gains energy thereby without the addition of either work or heat). Winding a clock would be adding energy to a mechanical system. These terms may be added to the above equation, or they can generally be subsumed into a quantity called the "energy addition term E", which refers to any type of energy carried over the surface of a control volume or system volume. Examples may be seen above, and many others can be imagined (for example, the kinetic energy of a stream of particles entering a system, or energy from a laser beam, adds to the system energy without being either work done or heat added in the classic senses).

\( \Delta{}E = W + Q + E \) (3)

where E in this general equation represents other additional advected energy terms not covered by the work done on the system or the heat added to it.

Energy is also transferred continually between potential energy (\( E_p \)) and kinetic energy (\( E_k \)) and back again; this interplay is governed by the conservation of energy. In a closed system, energy cannot be created or destroyed; therefore, the initial energy and the final energy will be equal to each other. This can be demonstrated by the following:

\( E_{p,i} + E_{k,i} = E_{p,f} + E_{k,f} \) (4)

The equation can then be simplified further, since \( E_p = mgh \) (mass times the acceleration due to gravity times height) and \( E_k = \frac{1}{2} mv^2 \) (half the mass times the velocity squared). The total energy is then found by adding the two terms: \( E_p + E_k = E_{total} \).
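As a minimal numerical sketch of equation (4), the following Python snippet checks that the gravitational potential energy lost by a freely falling object equals the kinetic energy it gains; the mass and drop height are assumed values chosen only for illustration.

import math

g = 9.81                                   # acceleration due to gravity, m/s^2

def speed_after_fall(height_m):
    # From E_p,i + E_k,i = E_p,f + E_k,f with E_k,i = 0 and E_p,f = 0:
    # m*g*h = (1/2)*m*v^2, so v = sqrt(2*g*h); the mass cancels.
    return math.sqrt(2 * g * height_m)

m, h = 2.0, 5.0                            # assumed mass (kg) and drop height (m)
v = speed_after_fall(h)
print(m * g * h)                           # initial potential energy, ~98.1 J
print(0.5 * m * v**2)                      # final kinetic energy, ~98.1 J (equal, as required)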
Energy and the laws of motion

In classical mechanics, energy is a conceptually and mathematically useful property, as it is a conserved quantity. Several formulations of mechanics have been developed using energy as a core concept.

The Hamiltonian

The total energy of a system is sometimes called the Hamiltonian, after William Rowan Hamilton. The classical equations of motion can be written in terms of the Hamiltonian, even for highly complex or abstract systems. These classical equations have remarkably direct analogs in nonrelativistic quantum mechanics.[15]
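For a system described by generalized coordinates \( q_i \) and conjugate momenta \( p_i \), these classical equations of motion take the standard Hamiltonian form

\( \dot{q}_i = \frac{\partial H}{\partial p_i}, \qquad \dot{p}_i = -\frac{\partial H}{\partial q_i} , \)

where \( H(q,p,t) \) is the Hamiltonian; for many simple systems it is just the total energy expressed in terms of coordinates and momenta.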

The Lagrangian

Another energy-related concept is called the Lagrangian, after Joseph Louis Lagrange. This is even more fundamental than the Hamiltonian, and can be used to derive the equations of motion. It was invented in the context of classical mechanics, but is generally useful in modern physics. The Lagrangian is defined as the kinetic energy minus the potential energy.
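With kinetic energy T and potential energy V, the Lagrangian just defined is \( L = T - V \), and the equations of motion follow from the Euler–Lagrange equation

\( \frac{\mathrm{d}}{\mathrm{d}t}\frac{\partial L}{\partial \dot{q}_i} - \frac{\partial L}{\partial q_i} = 0 \)

for each generalized coordinate \( q_i \).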

Usually, the Lagrange formalism is mathematically more convenient than the Hamiltonian for non-conservative systems (such as systems with friction).
Noether's Theorem

Noether's (first) theorem (1918) states that any differentiable symmetry of the action of a physical system has a corresponding conservation law.

Noether's theorem has become a fundamental tool of modern theoretical physics and the calculus of variations. A generalization of the seminal formulations on constants of motion in Lagrangian and Hamiltonian mechanics (1788 and 1833, respectively), it does not apply to systems that cannot be modeled with a Lagrangian; for example, dissipative systems with continuous symmetries need not have a corresponding conservation law.

Energy and thermodynamics
Internal energy

Internal energy is the sum of all microscopic forms of energy of a system. It is the energy needed to create the system. It is related to the potential energy, e.g., molecular structure, crystal structure, and other geometric aspects, as well as to the motion of the particles, in the form of kinetic energy. Thermodynamics is chiefly concerned with changes in internal energy and not its absolute value, which is impossible to determine with thermodynamics alone.[16]
The first law of thermodynamics

The first law of thermodynamics asserts that energy is conserved[17] and that heat flow is a form of energy transfer. For homogeneous systems, with a well-defined temperature and pressure, a commonly used corollary of the first law is that, for a system subject only to pressure forces and heat transfer (e.g., a cylinder-full of gas), the differential change in the internal energy of the system (with a gain in energy signified by a positive quantity) is given as

\( \mathrm{d}E = T\mathrm{d}S - P\mathrm{d}V\,, \)

where the first term on the right is the heat transferred into the system, expressed in terms of temperature T and entropy S (in which entropy increases and the change dS is positive when the system is heated), and the last term on the right hand side is identified as work done on the system, where pressure is P and volume V (the negative sign results since compression of the system requires work to be done on it and so the volume change, dV, is negative when work is done on the system).

This equation is highly specific, ignoring all chemical, electrical, nuclear, and gravitational forces, as well as effects such as the advection of any form of energy other than heat and pV-work. The general formulation of the first law (i.e., conservation of energy) is valid even in situations in which the system is not homogeneous. For these cases the change in internal energy of a closed system is expressed in a general form by

\( \mathrm{d}E=\delta Q+\delta W \)

where \( \delta Q \) is the heat supplied to the system and \( \delta W \) is the work applied to the system.

Equipartition of energy

The energy of a mechanical harmonic oscillator (a mass on a spring) is alternately kinetic and potential. At two points in the oscillation cycle it is entirely kinetic, and at two other points it is entirely potential. Over the whole cycle, or over many cycles, the net energy is thus equally split between kinetic and potential. This is called the equipartition principle: the total energy of a system with many degrees of freedom is equally split among all available degrees of freedom.
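Quantitatively, in classical statistical mechanics this principle takes the form of the equipartition theorem: each quadratic degree of freedom carries an average thermal energy

\( \langle E \rangle = \tfrac{1}{2} k_B T , \)

so that, for example, a monatomic ideal gas of N atoms (three translational degrees of freedom per atom) has internal energy \( U = \tfrac{3}{2} N k_B T \).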

This principle is vitally important to understanding the behavior of a quantity closely related to energy, called entropy. Entropy is a measure of evenness of a distribution of energy between parts of a system. When an isolated system is given more degrees of freedom (i.e., given new available energy states that are the same as existing states), then total energy spreads over all available degrees equally without distinction between "new" and "old" degrees. This mathematical result is called the second law of thermodynamics.

Oscillators, phonons, and photons

In an ensemble (connected collection) of unsynchronized oscillators, the average energy is spread equally between kinetic and potential types.

In a solid, thermal energy (often referred to loosely as heat content) can be accurately described by an ensemble of thermal phonons that act as mechanical oscillators. In this model, thermal energy is equally kinetic and potential.

In an ideal gas, the interaction potential between particles is essentially a delta function, which stores no energy: thus, all of the thermal energy is kinetic.

Because an electric oscillator (LC circuit) is analogous to a mechanical oscillator, its energy must be, on average, equally kinetic and potential. It is entirely arbitrary whether the magnetic energy is considered kinetic and whether the electric energy is considered potential, or vice versa. That is, either the inductor is analogous to the mass while the capacitor is analogous to the spring, or vice versa.

1. By extension of the previous line of thought, in free space the electromagnetic field can be considered an ensemble of oscillators, meaning that radiation energy can be considered equally potential and kinetic. This model is useful, for example, when the electromagnetic Lagrangian is of primary interest and is interpreted in terms of potential and kinetic energy.

2. On the other hand, in the key equation \( m^2 c^4 = E^2 - p^2 c^2 \), the contribution \( mc^2 \) is called the rest energy, and all other contributions to the energy are called kinetic energy. For a particle that has mass, this implies that the kinetic energy is \( p^2/2m \) at speeds much smaller than c, as can be proved by writing \( E = mc^2 \sqrt{1 + p^2 m^{-2}c^{-2}} \) and expanding the square root to lowest order. By this line of reasoning, the energy of a photon is entirely kinetic, because the photon is massless and has no rest energy. This expression is useful, for example, when the energy-versus-momentum relationship is of primary interest.
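Carrying out the expansion mentioned in item 2 explicitly, for momenta small compared with mc,

\( E = mc^2\sqrt{1 + \frac{p^2}{m^2 c^2}} \approx mc^2\left(1 + \frac{p^2}{2m^2 c^2}\right) = mc^2 + \frac{p^2}{2m} , \)

so the leading correction to the rest energy is the familiar Newtonian kinetic energy \( p^2/2m \).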

The two analyses are entirely consistent. The electric and magnetic degrees of freedom in item 1 are transverse to the direction of motion, while the speed in item 2 is along the direction of motion. For non-relativistic particles these two notions of potential versus kinetic energy are numerically equal, so the ambiguity is harmless, but not so for relativistic particles.
Work and virtual work
Main articles: Mechanics, Mechanical work, Thermodynamics, and Quantum mechanics

Work, a form of energy, is force times distance.

\( W = \int_C \mathbf{F} \cdot \mathrm{d} \mathbf{s} \)

This says that the work (W) is equal to the line integral of the force F along a path C; for details see the mechanical work article.
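A minimal numerical sketch of this line integral approximates \( W = \int_C \mathbf{F} \cdot \mathrm{d}\mathbf{s} \) by summing \( \mathbf{F} \cdot \Delta\mathbf{s} \) over short segments of a polygonal path; the force field and path below are assumptions chosen for illustration.

import numpy as np

def work_along_path(force, points):
    # Approximate the line integral of force . ds along a polyline of points.
    W = 0.0
    for a, b in zip(points[:-1], points[1:]):
        midpoint = 0.5 * (a + b)            # evaluate the force at the segment midpoint
        W += np.dot(force(midpoint), b - a)
    return W

# Example: uniform gravity on a 1 kg mass, F = (0, -9.81) N,
# along a straight vertical drop from (0, 10) m to (0, 0) m.
F = lambda r: np.array([0.0, -9.81])
path = [np.array([0.0, y]) for y in np.linspace(10.0, 0.0, 101)]
print(work_along_path(F, path))             # ~98.1 J of work done by gravity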

Work and thus energy is frame dependent. For example, consider a ball being hit by a bat. In the center-of-mass reference frame, the bat does no work on the ball. But, in the reference frame of the person swinging the bat, considerable work is done on the ball.


Quantum mechanics
Main article: Energy operator

In quantum mechanics, energy is defined in terms of the energy operator as a time derivative of the wave function. The Schrödinger equation equates the energy operator to the full energy of a particle or a system; its results can be considered as a definition of the measurement of energy in quantum mechanics. The Schrödinger equation describes the space- and time-dependence of the slowly changing (non-relativistic) wave function of quantum systems. The solution of this equation for a bound system is discrete (a set of permitted states, each characterized by an energy level), which results in the concept of quanta. In the solution of the Schrödinger equation for any oscillator (vibrator) and for electromagnetic waves in a vacuum, the resulting energy states are related to the frequency by the Planck equation \( E = h\nu \) (where h is Planck's constant and \( \nu \) the frequency). In the case of an electromagnetic wave these energy states are called quanta of light or photons.
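As a small worked example of the Planck relation (using an assumed frequency of about 5.5×10^14 Hz, typical of green visible light):

\( E = h\nu \approx (6.63\times 10^{-34}\,\mathrm{J\,s})(5.5\times 10^{14}\,\mathrm{Hz}) \approx 3.6\times 10^{-19}\,\mathrm{J} \approx 2.3\,\mathrm{eV} . \)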
Relativity

When calculating kinetic energy (the work needed to accelerate a mass from zero speed to some finite speed) relativistically, using Lorentz transformations instead of Newtonian mechanics, Einstein discovered an unexpected by-product of these calculations: an energy term which does not vanish at zero speed. He called it rest mass energy, an energy which every mass must possess even when at rest. The amount of energy is directly proportional to the mass of the body:

\( E = m c^2 , \)

where

m is the mass,
c is the speed of light in vacuum,
E is the rest mass energy.

For example, consider electron-positron annihilation, in which the rest mass of the individual particles is destroyed, but the inertia equivalent of the system of the two particles (its invariant mass) remains (since all energy is associated with mass), and this inertia and invariant mass is carried off by photons which, though individually massless, as a system retain their mass. This is a reversible process (the inverse process is called pair creation) in which the rest mass of particles is created from the energy of two (or more) annihilating photons. In this system the matter (electrons and positrons) is destroyed and changed to non-matter energy (the photons). However, the total system mass and energy do not change during this interaction.

In general relativity, the stress-energy tensor serves as the source term for the gravitational field, in rough analogy to the way mass serves as the source term in the non-relativistic Newtonian approximation.[14]

It is not uncommon to hear that energy is "equivalent" to mass. It would be more accurate to state that every energy has an inertia and gravity equivalent, and because mass is a form of energy, then mass too has inertia and gravity associated with it.
Energy and life
Main article: Bioenergetics
Basic overview of energy and human life.

Any living organism relies on an external source of energy—radiation from the Sun in the case of green plants; chemical energy in some form in the case of animals—to be able to grow and reproduce. The daily 1500–2000 Calories (6–8 MJ) recommended for a human adult are taken in as a combination of oxygen and food molecules, the latter mostly carbohydrates and fats, of which glucose (C6H12O6) and stearin (C57H110O6) are convenient examples. The food molecules are oxidised to carbon dioxide and water in the mitochondria:

C6H12O6 + 6O2 → 6CO2 + 6H2O
C57H110O6 + 81.5O2 → 57CO2 + 55H2O

and some of the energy is used to convert ADP into ATP

ADP + HPO42− → ATP + H2O

The rest of the chemical energy in the carbohydrate or fat is converted into heat: the ATP is used as a sort of "energy currency", and some of the chemical energy it contains, when it is split and reacted with water, is used for other metabolism (at each stage of a metabolic pathway, some chemical energy is converted into heat). Only a tiny fraction of the original chemical energy is used for work:[18]

gain in kinetic energy of a sprinter during a 100 m race: 4 kJ
gain in gravitational potential energy of a 150 kg weight lifted through 2 metres: 3 kJ
daily food intake of a normal adult: 6–8 MJ
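These illustrative figures can be checked with a short calculation; the sprinter's mass and final speed below are assumptions chosen only to reproduce the order of magnitude quoted above.

g = 9.81                                   # m/s^2

print(150 * g * 2.0)                       # weight lifter: m*g*h ~ 2.9e3 J, about 3 kJ
print(0.5 * 80 * 10.0**2)                  # sprinter: (1/2)*m*v^2 with ~80 kg at ~10 m/s = 4 kJ
print(6e6 / 4e3)                           # a 6 MJ daily intake is over a thousand times larger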

It would appear that living organisms are remarkably inefficient (in the physical sense) in their use of the energy they receive (chemical energy or radiation), and it is true that most real machines manage higher efficiencies. In growing organisms the energy that is converted to heat serves a vital purpose, as it allows the organism's tissues to be highly ordered with regard to the molecules from which they are built. The second law of thermodynamics states that energy (and matter) tends to become more evenly spread out across the universe: to concentrate energy (or matter) in one specific place, it is necessary to spread out a greater amount of energy (as heat) across the remainder of the universe ("the surroundings").[19] Simpler organisms can achieve higher energy efficiencies than more complex ones, but the complex organisms can occupy ecological niches that are not available to their simpler brethren. The conversion of a portion of the chemical energy to heat at each step in a metabolic pathway is the physical reason behind the pyramid of biomass observed in ecology: to take just the first step in the food chain, of the estimated 124.7 Pg/a of carbon that is fixed by photosynthesis, 64.3 Pg/a (52%) are used for the metabolism of green plants,[20] i.e. reconverted into carbon dioxide and heat.
Measurement
A schematic diagram of a calorimeter, an instrument used by physicists to measure energy.

Because energy is defined as the ability to do work on objects, there is no absolute measure of energy. Only the transition of a system from one state into another can be defined, and thus energy is measured in relative terms. The choice of a baseline or zero point is often arbitrary and can be made in whatever way is most convenient for a problem.
Methods

Methods for the measurement of energy often rely on the measurement of still more fundamental quantities, namely mass, distance, radiation, temperature, time, electric charge and electric current.

Conventionally the technique most often employed is calorimetry, a thermodynamic technique that relies on the measurement of temperature using a thermometer or of intensity of radiation using a bolometer.
Units
Main article: Units of energy

Throughout the history of science, energy has been expressed in several different units such as ergs and calories. At present, the accepted unit of measurement for energy is the SI unit of energy, the joule. In addition to the joule, other units of energy include the kilowatt hour (kWh) and the British thermal unit (Btu). These are both larger units of energy. One kWh is equivalent to exactly 3.6 million joules, and one Btu is equivalent to about 1055 joules.[21]
Energy density
Main article: Energy density

Energy density is a term used for the amount of useful energy stored in a given system or region of space per unit volume.

For fuels, the energy per unit volume is sometimes a useful parameter. In a few applications, comparing for example the effectiveness of hydrogen fuel to gasoline, it turns out that hydrogen has a higher specific energy (energy per unit mass) than gasoline, but, even in liquid form, a much lower energy density (energy per unit volume).
Forms of energy
Main article: Forms of energy
Heat, a form of energy, is partly potential energy and partly kinetic energy.

In the context of physical sciences, several forms of energy have been defined. These include:

Thermal energy, thermal energy in transit is called heat
Chemical energy
Electric energy
Radiant energy, the energy of electromagnetic radiation
Nuclear energy
Magnetic energy
Elastic energy
Sound energy
Mechanical energy
Luminous energy
Mass (E=mc²)

These forms of energy may be divided into two main groups: kinetic energy and potential energy. Other familiar types of energy are a varying mix of both potential and kinetic energy.

Energy may be transformed between these forms, some with 100% energy conversion efficiency and others with less. Items that transform between these forms are called transducers.

The above list of the known possible forms of energy is not necessarily complete. Whenever physical scientists discover that a certain phenomenon appears to violate the law of energy conservation, new forms may be added, as is the case with dark energy, a hypothetical form of energy that permeates all of space and tends to increase the rate of expansion of the universe.

Classical mechanics distinguishes between potential energy, which is a function of the position of an object, and kinetic energy, which is a function of its movement. Both position and movement are relative to a frame of reference, which must be specified: this is often (and originally) an arbitrary fixed point on the surface of the Earth, the terrestrial frame of reference. It has been attempted to categorize all forms of energy as either kinetic or potential: this is not incorrect, but neither is it clear that it is a real simplification, as Feynman points out:

These notions of potential and kinetic energy depend on a notion of length scale. For example, one can speak of macroscopic potential and kinetic energy, which do not include thermal potential and kinetic energy. Also what is called chemical potential energy is a macroscopic notion, and closer examination shows that it is really the sum of the potential and kinetic energy on the atomic and subatomic scale. Similar remarks apply to nuclear "potential" energy and most other forms of energy. This dependence on length scale is non-problematic if the various length scales are decoupled, as is often the case ... but confusion can arise when different length scales are coupled, for instance when friction converts macroscopic work into microscopic thermal energy.

Transformations of energy
Main article: Energy transformation

One form of energy can often be readily transformed into another with the help of a device: for instance, a battery converts chemical energy to electric energy, and a dam converts gravitational potential energy to kinetic energy of moving water (and of the blades of a turbine) and ultimately to electric energy through an electric generator. Similarly, in the case of a chemical explosion, chemical potential energy is transformed to kinetic energy and thermal energy in a very short time. Yet another example is that of a pendulum. At its highest points the kinetic energy is zero and the gravitational potential energy is at a maximum. At its lowest point the kinetic energy is at a maximum and is equal to the decrease of potential energy. If one (unrealistically) assumes that there is no friction, the conversion of energy between these processes is perfect, and the pendulum will continue swinging forever.

Energy gives rise to weight when it is trapped in a system with zero momentum, where it can be weighed. It is also equivalent to mass, and this mass is always associated with it. Mass is also equivalent to a certain amount of energy, and likewise always appears associated with it, as described in mass-energy equivalence. The formula E = mc², derived by Albert Einstein (1905), quantifies the relationship between rest mass and rest energy within the framework of special relativity. In different theoretical frameworks, similar formulas were derived by J. J. Thomson (1881), Henri Poincaré (1900), Friedrich Hasenöhrl (1904) and others (see Mass-energy equivalence#History for further information).

Matter may be destroyed and converted to energy (and vice versa), but mass cannot ever be destroyed; rather, mass remains constant for both the matter and the energy during any process in which they are converted into each other. However, since \( c^2 \) is extremely large relative to ordinary human scales, the conversion of an ordinary amount of matter (for example, 1 kg) to other forms of energy (such as heat, light, and other radiation) can liberate tremendous amounts of energy (~9×10^16 joules, about 21 megatons of TNT), as can be seen in nuclear reactors and nuclear weapons. Conversely, the mass equivalent of a unit of energy is minuscule, which is why a loss of energy (loss of mass) from most systems is difficult to measure by weight, unless the energy loss is very large. Examples of energy transformation into matter (i.e., kinetic energy into particles with rest mass) are found in high-energy nuclear physics.
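A short check of the figure quoted above, using \( E = mc^2 \); the TNT equivalence assumes the conventional value of 4.184×10^15 joules per megaton.

c = 2.998e8                 # speed of light, m/s
m = 1.0                     # kilograms of matter fully converted

E = m * c**2
print(E)                    # ~8.99e16 J, i.e. roughly 9x10^16 joules
print(E / 4.184e15)         # ~21.5 megatons of TNT equivalent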

Transformation of energy into useful work is a core topic of thermodynamics. In nature, transformations of energy can be fundamentally classed into two kinds: those that are thermodynamically reversible, and those that are thermodynamically irreversible. An irreversible process is one in which energy is dissipated (spread) into empty energy states available in a volume, from which it cannot be recovered into more concentrated forms (fewer quantum states) without degradation of even more energy. A reversible process is one in which this sort of dissipation does not happen. For example, conversion of energy from one type of potential field to another is reversible, as in the pendulum system described above. In processes where heat is generated, quantum states of lower energy, present as possible excitations in fields between atoms, act as a reservoir for part of the energy, from which it cannot be recovered in order to be converted with 100% efficiency into other forms of energy. In this case, the energy must partly stay as heat, and cannot be completely recovered as usable energy, except at the price of an increase in some other kind of heat-like disorder in quantum states in the universe (such as an expansion of matter, or a randomization in a crystal).

As the universe evolves in time, more and more of its energy becomes trapped in irreversible states (i.e., as heat or other kinds of increases in disorder). This has been referred to as the inevitable thermodynamic heat death of the universe. In this heat death the energy of the universe does not change, but the fraction of energy which is available to do work through a heat engine, or be transformed to other usable forms of energy (through the use of generators attached to heat engines), grows less and less.
See also

Index of energy articles
Index of wave articles

Notes and references

^ Harper, Douglas. "Energy". Online Etymology Dictionary. Retrieved May 1, 2007.
^ "Retrieved on 2010-Dec-05". Faculty.clintoncc.suny.edu. Retrieved 2010-12-12.
^ "Retrieved on 2010-Dec-05" (PDF). Retrieved 2010-12-12.
^ Lofts, G; O'Keeffe, D; et al. (2004). "11 — Mechanical Interactions". Jacaranda Physics 1 (2nd ed.). Milton, Queensland, Australia: John Wiley & Sons Australia Ltd. p. 286. ISBN 0-7016-3777-3.
^ Smith, Crosbie (1998). The Science of Energy – a Cultural History of Energy Physics in Victorian Britain. The University of Chicago Press. ISBN 0-226-76420-6.
^ a b Feynman, Richard (1964). The Feynman Lectures on Physics; Volume 1. U.S.A: Addison Wesley. ISBN 0-201-02115-3.
^ "Retrieved on May-29-09". Uic.edu. Retrieved 2010-12-12.
^ Bicycle calculator - speed, weight, wattage etc. [1].
^ "Earth's Energy Budget". Okfirst.ocs.ou.edu. Retrieved 2010-12-12.
^ "E. Noether's Discovery of the Deep Connection Between Symmetries and Conservation Laws". Physics.ucla.edu. 1918-07-16. Retrieved 2010-12-12.
^ a b The Laws of Thermodynamics including careful definitions of energy, free energy, et cetera.
^ "Time Invariance". Ptolemy.eecs.berkeley.edu. Retrieved 2010-12-12.
^ Berkeley Physics Course Volume 1. Charles Kittel, Walter D Knight and Malvin A Ruderman
^ a b Misner, Thorne, Wheeler (1973). Gravitation. San Francisco: W. H. Freeman. ISBN 0-7167-0344-0.
^ The Hamiltonian MIT OpenCourseWare website 18.013A Chapter 16.3 Accessed February 2007
^ I. Klotz, R. Rosenberg, Chemical Thermodynamics - Basic Concepts and Methods, 7th ed., Wiley (2008), p.39
^ Kittel and Kroemer (1980). Thermal Physics. New York: W. H. Freeman. ISBN 0-7167-1088-9.
^ These examples are solely for illustration, as it is not the energy available for work which limits the performance of the athlete but the power output of the sprinter and the force of the weightlifter. A worker stacking shelves in a supermarket does more work (in the physical sense) than either of the athletes, but does it more slowly.
^ Crystals are another example of highly ordered systems that exist in nature: in this case too, the order is associated with the transfer of a large amount of heat (known as the lattice energy) to the surroundings.
^ Ito, Akihito; Oikawa, Takehisa (2004). "Global Mapping of Terrestrial Primary Productivity and Light-Use Efficiency with a Process-Based Model." in Shiyomi, M. et al. (Eds.) Global Environmental Change in the Ocean and on Land. pp. 343–58.
^ Ristinen, Robert A., and Kraushaar, Jack J. Energy and the Environment. New York: John Wiley & Sons, Inc., 2006.

Further reading

Alekseev, G. N. (1986). Energy and Entropy. Moscow: Mir Publishers.
Crowell, Benjamin (2011) [2003]. Light and Matter. Fullerton, CA: Light and Matter.
Ross, John S. (23 April 2002). "Work, Power, Kinetic Energy". Project PHYSNET. Michigan State University.
Smil, Vaclav (2008). Energy in nature and society: general energetics of complex systems. Cambridge, USA: MIT Press. ISBN 0-262-19565-8.
Walding, Richard, Rapkins, Greg, Rossiter, Glenn (1999-11-01). New Century Senior Physics. Melbourne, Australia: Oxford University Press. ISBN 0-19-551084-4.
