Luminous Efficacy Explained: Why Lumens per Watt Is the Key Number in Every Fixture Specification

What luminous efficacy means, how it is measured, and why 120 lm/W has become the threshold that separates energy-compliant fixtures from those that fail to meet current global standards.
When comparing two light fixtures that produce the same amount of light, the one that draws less electrical power to do so is the more efficient product. Luminous efficacy is the metric that makes this comparison possible: it expresses how much visible light a fixture produces for every watt of electricity it consumes, stated in lumens per watt (lm/W). The higher the number, the more efficiently the fixture converts electrical energy into light.
This single figure carries considerable weight in specification decisions. It determines whether a fixture meets the minimum thresholds set by energy regulations in the markets where it will be installed. It predicts the operating cost of the fixture over its lifetime. And it provides a standardised basis for comparing fixtures across different technologies, power ratings, and manufacturers — without which comparisons based on wattage or lumen output alone can be misleading.
The definition: what luminous efficacy actually measures
Luminous efficacy is defined as the ratio of luminous flux to input power: lm/W = total lumens output ÷ total watts consumed. A fixture that emits 1,200 lumens while drawing 10 watts has a luminous efficacy of 120 lm/W. A fixture that emits the same 1,200 lumens while drawing 15 watts has a luminous efficacy of 80 lm/W — it produces the same light, but at a higher energy cost.
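The arithmetic is simple enough to express directly. As a minimal sketch (hypothetical helper function, figures taken from the example above):

```python
def luminous_efficacy(lumens: float, watts: float) -> float:
    """Fixture-level luminous efficacy in lm/W: total lumens out per watt consumed."""
    if watts <= 0:
        raise ValueError("input power must be positive")
    return lumens / watts

# Same 1,200 lm output at two different power draws:
print(luminous_efficacy(1200, 10))  # 120.0 lm/W
print(luminous_efficacy(1200, 15))  # 80.0 lm/W
```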
The lumens figure used in the calculation must be measured at the fixture level, not at the LED chip or light source alone. A bare LED chip may carry a rated efficacy significantly higher than the fixture in which it is installed, because the fixture assembly introduces optical losses through its lens, diffuser, or reflector; thermal losses, since the LEDs run at a higher junction temperature inside an enclosed fixture than under chip-level test conditions; and driver losses from converting mains voltage to the low-voltage DC required by the LEDs. The fixture-level efficacy — sometimes called system efficacy — is the number that matters for specification purposes, as it reflects the actual performance of the complete product as installed.
Four factors that determine a fixture's luminous efficacy
The efficacy of the LED package itself — measured at the chip level under controlled conditions — sets the ceiling for what the fixture can achieve. Higher-grade chips with more advanced phosphor conversion and tighter binning start at a higher baseline.
The LED driver converts mains AC power to the low-voltage DC the LEDs require. This conversion is never perfectly efficient — driver losses typically account for 8–15% of input power. A higher-efficiency driver directly raises the fixture's overall lm/W figure.
Every optical element — diffuser, lens, reflector, cover glass — absorbs or redirects a portion of the light before it leaves the fixture. Optical design choices that minimise these losses while achieving the required beam angle preserve more of the LED's output.
LED efficacy decreases as junction temperature rises. A fixture with inadequate thermal management runs its LEDs hotter, reducing their output and therefore the fixture's effective lm/W in real operating conditions compared with what was measured at the test bench.
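These four stages compound multiplicatively: each one retains only a fraction of the output the previous stage delivers. As a rough illustrative model (not a formal standard; all figures below are hypothetical), fixture-level efficacy can be sketched as the chip efficacy discounted by each loss stage:

```python
def system_efficacy(chip_lm_per_w: float,
                    driver_eff: float,
                    optical_eff: float,
                    thermal_factor: float) -> float:
    """Rough fixture-level efficacy: chip efficacy discounted by each loss stage.

    Each factor is the fraction of output retained at that stage
    (illustrative model only, not a photometric standard).
    """
    return chip_lm_per_w * driver_eff * optical_eff * thermal_factor

# Hypothetical figures: 180 lm/W chip, 90% driver efficiency,
# 88% optical transmission, 93% output retained at operating temperature.
print(round(system_efficacy(180, 0.90, 0.88, 0.93), 1))  # 132.6 lm/W
```

The result illustrates why a chip rated at 180 lm/W commonly yields a fixture in the low 130s: no single stage is dramatically lossy, but the losses stack.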
How luminous efficacy has evolved across light source technologies
The historical context of lm/W figures helps to place the current 120 lm/W threshold in perspective. The incandescent lamp — the dominant light source through most of the twentieth century — operated at approximately 10–15 lm/W. The halogen lamp, an evolution of the incandescent, reached roughly 15–25 lm/W. Both technologies generated light through thermal radiation, heating a filament until it glows, a process that converts most of the electrical input to heat rather than light.
Fluorescent lamps improved substantially on this baseline, reaching 60–100 lm/W in their more efficient forms — a step change made possible by the different mechanism of fluorescent excitation. Compact fluorescent lamps brought comparable efficacy to the form factor of the screw-base lamp, though with trade-offs in light quality and dimming behaviour.
LED technology has advanced further and faster than any previous lamp type. Early LED products entering the market around 2010 offered 50–70 lm/W at the fixture level. By the mid-2010s, mainstream commercial products were regularly exceeding 100 lm/W. Current production-standard products from quality manufacturers routinely fall in the 120–160 lm/W range at the fixture level, and laboratory results on advanced chips have demonstrated values considerably higher. The 120 lm/W threshold that now appears in regulatory frameworks worldwide represents current mainstream attainability, not the leading edge.
"A fixture's wattage tells you its cost to run. Its lumen output tells you how much light it produces. Only luminous efficacy — the ratio of the two — tells you whether it is doing both efficiently."
The 120 lm/W threshold in global energy standards
The figure of 120 lm/W appears across a range of regulatory and certification frameworks as the minimum efficacy expected of energy-compliant LED products. Understanding why this specific threshold has been adopted — and how it is applied in different markets — is useful for anyone purchasing or specifying fixtures for international projects.
The DLC Qualified Products List sets minimum efficacy requirements by product category. Commercial luminaires targeting rebate eligibility must meet or exceed thresholds that, in most categories, now reach or exceed 120 lm/W. The Premium tier sets higher minimums still.
ENERGY STAR certification for residential luminaires sets efficacy requirements by product category. Requirements have been progressively tightened and many categories now require luminaire-level efficacy at or above 80–100 lm/W, with updated specifications continuing to rise.
EU Ecodesign regulations have phased out inefficient lamps and established minimum efficacy thresholds for luminaires placed on the EU market. Successive phases have raised minimum values; the top energy class corresponds to efficacy consistent with or above 120 lm/W at the fixture level.
China's mandatory energy efficiency standards for LED products establish minimum lm/W requirements by product category and energy efficiency grade. Grade 1 (highest) requirements for many LED lamp and luminaire categories reach or approach 120 lm/W, with standards subject to revision on a regular cycle.
The Equipment Energy Efficiency (E3) programme sets MEPS for lighting products sold in Australia and New Zealand. Requirements for LED lamps and luminaires have been aligned with international trends, with minimum thresholds for many categories now setting 120 lm/W as the expected standard for compliant products.
Luminous efficacy vs. related metrics that are often confused with it
Several related figures appear on specification sheets and data tables alongside luminous efficacy, and it is worth understanding precisely how each differs — because conflating them leads to specification errors that only become apparent after installation.
LED chip efficacy or package efficacy is the lm/W figure measured at the bare LED package, typically under standard test conditions of 25°C junction temperature and a defined drive current. This figure is always higher than the fixture-level efficacy because it does not account for driver losses, optical losses, or the thermal rise that occurs during normal fixture operation. A chip rated at 180 lm/W may be the basis for a fixture that achieves 130 lm/W at the system level — both figures are accurate, but they measure different things.
Luminous intensity, measured in candela (cd), describes how much light is emitted in a particular direction rather than in total. It is relevant to beam angle and distribution characterisation but says nothing about efficiency. A fixture could have very high intensity in a narrow beam while being highly inefficient — or vice versa.
Luminous efficacy of radiation (LER) measures how efficiently a given spectral power distribution converts to human-visible light, weighted by the eye's sensitivity curve: lumens per radiant watt, rather than per electrical watt. It therefore sets a theoretical ceiling on the luminous efficacy of any source with that spectrum, but it is distinct from luminous efficacy as used in fixture specification. LER is of interest in photometric research but does not appear on product data sheets in the same way.
Reading efficacy data on product specification sheets
When reviewing a fixture specification sheet, the luminous efficacy figure should be stated as a luminaire or system efficacy — not as chip or driver efficacy alone. The measurement conditions should correspond to a recognised photometric testing standard: LM-79 in the United States, EN 13032 in Europe, or equivalent national standards. These standards define how fixtures should be conditioned before measurement, at what ambient temperature, and how the photometric sphere or goniophotometer is calibrated.
LM-79 testing, specified by the Illuminating Engineering Society, is particularly significant because it measures the complete luminaire at its actual operating temperature rather than at a controlled low-temperature baseline. This matters because LEDs produce less light when hot than when cold, and a test conducted at artificially low temperatures will report optimistic lm/W values that the fixture will not achieve in a real installation. Independent LM-79 test reports from accredited laboratories provide more reliable data than manufacturer-conducted tests or chip-level ratings presented as fixture-level claims.
When evaluating fixture specifications, locate the LM-79 test report rather than relying solely on summary data in a brochure. Verify that the lumens and wattage figures on the report match those being quoted, that the test was conducted at a standard ambient temperature (typically 25°C), and that the report comes from an accredited independent laboratory. A fixture whose published efficacy cannot be traced to a third-party test report warrants additional scrutiny before specification.
How efficacy interacts with colour rendering index and colour temperature
Luminous efficacy and colour rendering index (CRI) exist in a trade-off relationship that is important to understand. The CRI of a white LED is determined by the phosphor blend used to convert the blue LED emission to a broad-spectrum white light. Phosphor blends that produce a higher CRI — more accurate colour rendering, closer to the Ra 90+ range — generally do so by adding red-wavelength components to the spectrum. Red wavelengths are relatively inefficient in terms of the human eye's sensitivity response (which peaks in the green region), so adding more red content for better CRI typically reduces the lm/W figure at a given drive current.
The practical implication is that a fixture rated at Ra 90 will typically have a lower luminous efficacy than an otherwise equivalent fixture rated at Ra 80. When specifying against a minimum efficacy threshold such as 120 lm/W, the CRI requirement must be stated alongside the efficacy requirement, not treated as an independent variable. A fixture that reaches 125 lm/W at Ra 80 may achieve only 108 lm/W at Ra 90 — satisfying the efficacy threshold with one CRI specification but not the other.
Colour temperature also interacts with efficacy. In general terms, cool white LEDs (5000–6500K) tend to offer slightly higher lm/W than warm white (2700–3000K) at comparable CRI levels, because the spectral distribution of cool white sources aligns more closely with the eye's peak sensitivity. This difference is typically modest — within the range of 5–10% — but is worth accounting for when specifying warm white sources against a tight efficacy threshold.
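Because CRI and efficacy trade off against each other, a specification check should treat them as a joint requirement rather than two independent filters. A minimal sketch (hypothetical `Fixture` type; figures taken from the Ra 80 / Ra 90 example above):

```python
from dataclasses import dataclass

@dataclass
class Fixture:
    efficacy_lm_w: float
    cri: int  # Ra value

def meets_spec(f: Fixture, min_efficacy: float = 120.0, min_cri: int = 80) -> bool:
    """Both criteria must hold together; checking efficacy alone hides CRI trade-offs."""
    return f.efficacy_lm_w >= min_efficacy and f.cri >= min_cri

# Hypothetical Ra 80 and Ra 90 variants of the same fixture:
print(meets_spec(Fixture(125.0, 80), min_cri=80))  # True
print(meets_spec(Fixture(108.0, 90), min_cri=90))  # False: Ra 90 variant misses 120 lm/W
```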
Practical efficacy ranges by fixture application type
Direct/indirect panel and linear fixtures for office applications now routinely achieve 120–160 lm/W at the system level. This category has seen the most rapid efficacy improvement and includes many products above the 120 lm/W threshold at accessible price points.
Recessed downlights span a wide efficacy range depending on optical design, aperture size, and trim type. Products with enclosed optical assemblies or narrow beam angles introduce more optical loss. Quality fixed-aperture downlights from established manufacturers regularly exceed 120 lm/W.
Industrial high bay fixtures — designed for large, open spaces with mounting heights above 6m — are among the highest-efficacy products in the market. The combination of high power, simple optical design, and mature thermal management has driven efficacy in this category well above 120 lm/W.
LED street lighting has been among the most efficacy-driven categories globally, with municipal procurement typically requiring evidence of compliance with energy performance thresholds. Many street lighting tenders specify a minimum of 120–130 lm/W as a basic entry requirement.
Decorative fixtures prioritise form, material, and visual character alongside light output. Optical assemblies are often open, with significant upward-directed or scattered light not counted in useful lumen output. Efficacy in this category is typically lower and not always the primary specification driver.
Track-mounted accent fixtures balance efficacy with beam precision, colour rendering (often Ra 90+ for retail and gallery applications), and optical control. High-CRI requirements reduce the ceiling on achievable efficacy; well-engineered products in this category can still reach or exceed 120 lm/W.
Why wattage alone is not a reliable comparison basis
In the era of incandescent lamps, wattage was a reliable shorthand for brightness because all incandescent lamps of the same wattage produced roughly the same amount of light — efficacy was consistent across the technology. The widespread adoption of LED lighting has invalidated this shorthand entirely. Two LED fixtures rated at 20W may produce 1,600 lumens and 2,400 lumens respectively, depending on their efficacy. Comparing them by wattage alone tells a buyer nothing useful about the light they will receive.
The consequence of wattage-based comparisons for buyers is real and measurable. A fixture specified as a "20W LED downlight" without an efficacy figure or lumen output specification could be delivering 80 lm/W or 130 lm/W — a difference of 62% in light output for the same electricity consumption, and potentially a difference in whether the product meets regulatory requirements in its installation market.
Specifying on the basis of lumen output and luminous efficacy together — rather than wattage — resolves this ambiguity. A requirement stated as "minimum 1,500 lm at minimum 120 lm/W" defines both the quantity of light and the efficiency at which it must be produced, leaving no room for a fixture to meet one criterion by sacrificing the other.
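Stated that way, the two criteria also imply a wattage ceiling, since watts = lumens ÷ (lm/W). A small sketch of the arithmetic (hypothetical helper):

```python
def max_wattage(min_lumens: float, min_efficacy: float) -> float:
    """Wattage ceiling implied by a joint lumen and efficacy requirement."""
    return min_lumens / min_efficacy

# "Minimum 1,500 lm at minimum 120 lm/W" caps power draw:
print(max_wattage(1500, 120))  # 12.5 W
```

Any compliant fixture must therefore draw 12.5 W or less to deliver the required 1,500 lm at the required efficiency.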
"Specifying only wattage is like hiring a staff member by their salary without knowing what their job is. Efficacy is the job description."
Efficacy over the fixture's lifetime: lumen maintenance and its interaction with rated efficacy
Luminous efficacy as measured at the beginning of a fixture's life will not remain constant over its operating hours. LEDs depreciate — they produce less light as they age, a process driven primarily by phosphor degradation and changes in the LED chip itself. A fixture that measures 130 lm/W at hour zero may measure 104 lm/W at hour 50,000 if it maintains 80% of its initial lumen output (an L80 rating in the terminology of lumen maintenance standards).
The lumen maintenance rating — expressed as L70, L80, or L90 at a stated number of hours — describes the percentage of original output retained at that point in the fixture's life. When evaluating efficacy in the context of a project's lifetime operating cost, it is worth considering what the effective average efficacy will be across the fixture's expected service life, not only its initial value. A fixture with high initial efficacy but poor lumen maintenance may deliver less over time than one with slightly lower initial efficacy but superior long-term stability.
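As a back-of-envelope sketch of that lifetime view (hypothetical helpers; the averaging assumes, simplistically, linear lumen depreciation and constant power draw):

```python
def efficacy_at_maintenance(initial_lm_w: float, fraction_retained: float) -> float:
    """Efficacy once output has depreciated to the given fraction of initial."""
    return initial_lm_w * fraction_retained

def average_efficacy_linear(initial_lm_w: float, end_fraction: float) -> float:
    """Mean efficacy over life, assuming linear depreciation from 100% to end_fraction."""
    return initial_lm_w * (1 + end_fraction) / 2

# The L80 example from the text: 130 lm/W initially, 80% retained at 50,000 h.
print(round(efficacy_at_maintenance(130, 0.80), 1))  # 104.0 lm/W at hour 50,000
print(round(average_efficacy_linear(130, 0.80), 1))  # 117.0 lm/W average over life
```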
Thermal management is the primary determinant of lumen maintenance quality. Fixtures that run their LEDs at lower junction temperatures — through better heat sink design, lower drive currents, or higher-grade thermal interface materials — demonstrate superior lumen maintenance and therefore sustain their efficacy advantage over their operating lifetime.
Smile Lighting Co., Ltd.