LM Losses: Can You Handle the Truth?

Nicholson and Cruise II

While developing our LED Product Performance and Quality Ranking Report, I have encountered LED suppliers who have very different perspectives on providing LM loss data to buyers. Two examples remind me of the characters played by Tom Cruise and Jack Nicholson in the movie A Few Good Men. I thought the analogy would illustrate the importance of an informed buyer.

Is it fair to compare buyers and suppliers to the characters of Lieutenant Kaffee (“I want the truth”) and Colonel Jessup (“You can’t handle the truth”)? Probably not, but I hope more buyers are like Lieutenant Kaffee and fewer suppliers view the world like Colonel Jessup.

The Backstory

Developing the LED Product Rankings requires persuasion – which is understandable, because we ask suppliers to provide data that requires third-party testing. It’s not unusual for a supplier to push back on some of our testing requirements. In the case of LM testing, it is my professional opinion that these tests are essential, but not everyone agrees. The following is a dramatization of discussions with two suppliers who have different perspectives on LM testing.

Conversation with Supplier A

Me: “Bill (not his real name), we would also like to get your LM79 and TM21 Ratings.”

LED Supplier Bill: “Marc, we are focused mostly on performance and have really great specs in that regard. We think that is most important to our customers.”

Me: “That’s great, and I would agree that performance is important, but so is quality, and one measure we like to use is LM ratings.”

LED Supplier Bill: “Well, we do have LM80 tests from the chip manufacturer, which as you know is one of the best in the world (I agree). We’re kind of new to the business and we haven’t tested LM79 or TM21 because we believe that the LM80 tests speak for themselves.”

Me: “Bill, LM80 tests are one indicator of quality, but what really matters to our members are fixture-level tests. As you know, OEMs make design decisions that can be very different from one another, and those decisions can result in big differences in LM loss rates.” (He agrees.)

LED Supplier Bill: “I’d really rather not spend the money if I don’t need to, but okay, I agree that they’re more valuable than LM80 tests alone and the quality of our design should hold up well under the scrutiny. Can you recommend a third-party that can test in a hurry?”

Before we were done with the call, I had introduced him to a third-party testing organization and he was busy getting quotes and making arrangements for a test (absolutely true). This is the kind of supplier that buyers like Lt. Kaffee should want to work with.

Conversation with Supplier B (Colonel Jessup)

Me: “Hi Colonel Jessup. Can I call you CJ?”

LED Supplier CJ: “No. ‘Sir’ will be fine.” (Okay, that didn’t really happen.)

Me: “Okay . . . uh Sir, we would also like to get your LM79 and TM21 Ratings.”

LED Supplier CJ: “Marc, the problem I have with LM ratings is they use lumens as the metric. It’s an erroneous measure that is bound to lead the buyer astray regarding making true performance comparisons. IES and DLC are working to define appropriate metrics for grow lights right now.”

I have some insight into the disagreements about LM testing (see Bob Erhardt’s comments at the bottom of an earlier post). For the most part, they are related to the testing environment of horticulture lighting (i.e., temperatures), and have very little to do with lumens as a metric. LM losses are caused by flaws (called dislocations) in the materials that conduct electrons. These flaws reduce efficiency, and they worsen with higher temperatures, higher drive currents, and age. High-quality manufacturers have fewer dislocations. More importantly, a good power and thermal management design doesn’t accelerate their aging. Differences among chips and OEM designs are isolated and measurable at the fixture level, and it doesn’t matter whether they are measured in lumens or in photon flux.

Me: “Sir, there is no question that the semantics and interpretation of lumen maintenance losses are confusing for buyers of grow lights. Specifically, the nuanced differences in the PAR spectrum and color shift are not measured. However, LM loss rates are absolutely valuable (and necessary) for evaluating distinctions in the quality of the manufacturing process and materials of the chips themselves and, more importantly, the OEM design decisions on drive current and thermal management. We think buyers want to be informed.”

LM ratings may not be perfected for grow lights, but they are the best test of quality we currently have. I would disagree that they lead the buyer astray; in fact, they are more often ignored (or worse, misrepresented). LM ratings ‘tease out’ lesser-quality chips and shortcuts by OEMs. They serve as a flashing red light that something under the cover is wrong. It is our intention to drive this point home to our members by educating them and by demanding that OEMs disclose the specifications. The result will be a far better buyer decision-making process and a more honest marketing culture.

Who do You Want as Your Supplier?

My harshest interpretation is that CJ doesn’t believe buyers are capable of understanding the nuances of LM testing and therefore thinks it shouldn’t factor into their evaluation – lest they be led astray. In truth, I am probably being a bit unfair to CJ, but his reluctance to take the time to inform buyers, for any reason (so long as the content is accurate), is counterintuitive to me.

In contrast, our first LED supplier – Bill – isn’t doubting the buyer’s capacity to understand. He believes in an informed buyer. I like Bill!

Can Buyers Handle the Truth?

I believe so, and I am committed to the idea that more information, articulated in the right way, is always a good thing. What do you think? Please take the poll below, comment below, and share this blog with others. And register for the First Edition of the LED Product Performance and Quality Ranking Report.



Risks of LED Lighting – Part III: More on LM Losses

Why Don’t Manufacturers Want to Talk About LM Losses?

We received dozens of comments about our lumen maintenance loss article, often with a recommendation of a useful information source. In more than one instance, readers referenced Philips Greenpower LEDs and provided a link to a 39-page brochure with the tagline ‘there’s more to light’.

We searched for lumen maintenance references in the brochure. After all, the readers posted the link in response to an article on LM losses. Here’s what we found:

The brochure mentions light recipes (13 times), plant growth (12 times), increased results of some type (12 times), light intensity levels (5 times), and long life (3 times). There was no mention of LM80 testing, Lx ratings, or TM21 estimates of useful life.

So we checked the Philips Greenpower specification sheet for details of lumen maintenance losses, and there it was: an L90 rating of 25,000 hours. There are no L80 or L70 ratings. There is no TM21 estimate. There is also no indication of the drive current or the testing temperature – both of which have a material impact on LM losses. Why not?
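For readers unfamiliar with TM21, it is essentially a prescribed curve fit: LM80 measurements of light output over time are fit to an exponential decay, and the fit is extrapolated to estimate when output will fall to 90% (L90) or 70% (L70) of initial. Here is a minimal Python sketch of that idea, using made-up LM80-style readings; the real standard is stricter (it prescribes which portion of the data to fit, minimum sample sizes, and a cap on how far the projection may be extrapolated), so treat this as an illustration, not the standard itself.

```python
import math

def tm21_fit(hours, lumen_fraction):
    """Least-squares fit of L(t) = B * exp(-alpha * t) on log-transformed
    data, in the spirit of a TM21 projection (simplified sketch)."""
    n = len(hours)
    ys = [math.log(l) for l in lumen_fraction]
    mx = sum(hours) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(hours, ys))
             / sum((x - mx) ** 2 for x in hours))
    alpha = -slope                      # decay rate per hour
    B = math.exp(my + alpha * mx)       # fitted initial fraction
    return B, alpha

def hours_to_lp(B, alpha, p):
    """Projected hours until output falls to fraction p (0.90 for L90)."""
    return math.log(B / p) / alpha

# Illustrative (made-up) LM80-style readings: hours, fraction of initial flux
hours = [0, 1000, 2000, 4000, 6000]
frac = [1.00, 0.995, 0.99, 0.98, 0.97]
B, alpha = tm21_fit(hours, frac)
print(round(hours_to_lp(B, alpha, 0.90)))  # projected L90 hours
print(round(hours_to_lp(B, alpha, 0.70)))  # projected L70 hours
```

The point of the sketch is simply that an L90 or L70 figure is the output of a model fit to LM80 data, which is why the underlying test conditions (drive current, temperature) matter so much.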

We checked with two of Philips’ closest competitors. Neither publishes LM-related information on their horticulture light specification sheets. We probed a little deeper with one of the two and found a spec sheet for a non-horticulture light that uses the same basic LED component and configuration. There we found a single LM-related data point – an L70 point greater than 75,000 hours at 40°C.

To make comparisons, we thought we would chart the two data points along with LM80 data for a high-quality Cree LED component, which had the most detailed dataset. Take a look at the graph below and see if you can tease out any meaningful information.

Lumen Maintenance Data

If it seems like an apples-and-oranges comparison, you’re right. Philips provides an L90, but nothing more. The competitor provides an L70, but no L90 to compare. And perhaps most confusing, the competitor uses a test temperature that is lower than the prescribed minimum of the IES testing standard. This is problematic. Why? Fixture temperatures will vary based on variables outside of the manufacturer’s control – how closely the fixtures are arranged to each other, airflow around the fixtures, the consistency of cooling in the building, etc. So long as both manufacturers use the same standard test temperatures, the local variances will be roughly the same for both fixtures (like two basketball teams both playing on 10′ baskets). In this case, however, the competitor is supplying a figure that would have to be considered ‘best possible circumstances’ (like one team playing on 8′ baskets). You just can’t compare the two.
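To see just how forced the comparison is, consider what it would take to put the two figures on a common footing. If you assume a single-exponential decay L(t) = exp(−αt) (an assumption, not something either spec sheet justifies), an L90 rating implies an L70: the time to reach fraction p is ln(1/p)/α, so t70/t90 = ln(1/0.7)/ln(1/0.9) ≈ 3.39. A quick sketch:

```python
import math

def implied_lp(t_known, p_known, p_target):
    """Under an assumed single-exponential decay L(t) = exp(-alpha * t),
    convert an Lp rating at one maintenance level to another. This is a
    back-of-the-envelope conversion, not a TM21 projection."""
    alpha = math.log(1.0 / p_known) / t_known    # decay rate implied by the known rating
    return math.log(1.0 / p_target) / alpha      # hours to reach the target fraction

# Philips' published L90 of 25,000 hours, restated as an implied L70
print(round(implied_lp(25_000, 0.90, 0.70)))
```

That comes out to roughly 84,600 hours – seemingly ahead of the competitor’s >75,000-hour L70 – but the exercise only underlines the problem: the conversion rests on an assumed decay model, and the mismatched test temperatures undermine even that. The honest answer remains that the two published numbers cannot be compared.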

So what is the takeaway?

  1. Comprehensive, standardized LM data is almost never included in specification sheets, even though that is the purpose of LM80 testing.
  2. Horticulture LED Manufacturers don’t want to talk about LM losses.

So why not? We think the relative differences between manufacturers are large enough to have a greater (negative) impact on total farm production over time than the relative differences in ‘light recipes’. In other words, the buyer should give equal or greater weight to LM loss differences than to light recipe differences, because the former will have a greater impact on their economics. Unfortunately, you won’t get that information unless you demand the data and do the math yourself.