While developing our LED Product Performance and Quality Ranking Report, I have encountered LED suppliers with very different perspectives on providing lumen maintenance (LM) loss data to buyers. Two examples remind me of the characters played by Tom Cruise and Jack Nicholson in the movie A Few Good Men. I thought the analogy would illustrate the importance of an informed buyer.
Is it fair to compare buyers and suppliers to the characters of Lieutenant Kaffee (“I want the truth”) and Colonel Jessup (“You can’t handle the truth”)? Probably not, but I hope more buyers are like Lieutenant Kaffee and fewer suppliers view the world like Colonel Jessup.
Developing the LED Product Rankings requires persuasion, which is understandable: we ask suppliers to provide data that requires testing by a third party. It’s not unusual for a supplier to push back on some of our testing requirements. In the case of LM testing, it is my professional opinion that the tests are essential, but not everyone agrees. The following is a dramatization of discussions with two suppliers who have different perspectives on LM testing.
Conversation with Supplier A
Me: “Bill (not his real name), we would also like to get your LM-79 and TM-21 ratings.”
LED Supplier Bill: “Marc, we are focused mostly on performance and have really great specs in that regard. We think that is most important to our customers.”
Me: “That’s great, and I would agree that performance is important, but so is quality, and one measure we like to use is LM ratings.”
LED Supplier Bill: “Well, we do have LM-80 tests from the chip manufacturer, which as you know is one of the best in the world (I agree). We’re kind of new to the business and we haven’t tested LM-79 or TM-21 because we believe that the LM-80 tests speak for themselves.”
Me: “Bill, LM-80 tests are one indicator of quality, but what really matters to our members are fixture-level tests. As you know, OEMs make design decisions that can differ greatly from one another, and those decisions can result in big differences in LM loss rates.” (He agrees.)
LED Supplier Bill: “I’d really rather not spend the money if I don’t need to, but okay, I agree that they’re more valuable than LM-80 tests alone, and the quality of our design should hold up well under scrutiny. Can you recommend a third party that can test in a hurry?”
Before we were done with the call, I had introduced him to a third-party testing organization and he was busy getting quotes and making arrangements for a test (absolutely true). This is the kind of supplier that buyers like Lt. Kaffee should want to work with.
Conversation with Supplier B (Colonel Jessup)
Me: “Hi Colonel Jessup. Can I call you CJ?”
LED Supplier CJ: “No. ‘Sir’ will be fine.” (Okay, that didn’t really happen.)
Me: “Okay . . . uh, Sir, we would also like to get your LM-79 and TM-21 ratings.”
LED Supplier CJ: “Marc, the problem I have with LM ratings is they use lumens as the metric. It’s an erroneous measure that is bound to lead buyers astray when making true performance comparisons. IES and DLC are working to define appropriate metrics for grow lights right now.”
I have some insight into the disagreements about LM testing (see Bob Erhardt’s comments at the bottom of an earlier post). For the most part, they are related to the testing environment of horticulture lighting (i.e., temperatures) and have very little to do with lumens as a metric. LM losses are caused by flaws (called dislocations) in the materials that conduct electrons. These flaws reduce efficiency and worsen with higher temperatures, higher drive currents, and age. High-quality manufacturers have fewer dislocations. More importantly, a good power and thermal management design doesn’t accelerate their aging. Differences among chips and OEM designs are isolated and measurable at the fixture level, and it doesn’t matter whether they are measured in lumens or in photon flux.
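For readers curious how a TM-21-style projection turns lumen maintenance measurements into a lifetime estimate, here is a minimal sketch in Python. TM-21 fits an exponential decay model to measured lumen maintenance and extrapolates the time to L70 (70% of initial output). The data points below are hypothetical, and a real TM-21 report also caps how far the projection may be extrapolated beyond the test duration, so treat this as an illustration of the idea rather than a compliant implementation.

```python
import math

def tm21_l70(hours, maintenance):
    """Project hours to L70 (70% of initial output) by fitting an
    exponential decay model, phi(t) = B * exp(-alpha * t), to
    lumen-maintenance data -- the same model TM-21 uses."""
    t = [float(h) for h in hours]
    y = [math.log(m) for m in maintenance]  # linearize: ln(phi) = ln(B) - alpha*t
    n = len(t)
    t_bar = sum(t) / n
    y_bar = sum(y) / n
    # Ordinary least-squares slope and intercept in log space
    slope = (sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y))
             / sum((ti - t_bar) ** 2 for ti in t))
    alpha = -slope
    B = math.exp(y_bar - slope * t_bar)
    return math.log(B / 0.70) / alpha  # hours until output decays to 70%

# Hypothetical test points: elapsed hours vs. fraction of initial lumens
hours = [1000, 2000, 3000, 4000, 5000, 6000]
maintenance = [0.995, 0.991, 0.986, 0.982, 0.977, 0.973]
print(f"Projected L70: {tm21_l70(hours, maintenance):,.0f} hours")
```

The point for buyers: the same chip can yield very different projections depending on the fixture-level drive current and thermal design, which is why fixture-level LM-79/TM-21 data matters and LM-80 data alone does not tell the whole story.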
Me: “Sir, there is no question that the semantics and interpretation of lumen maintenance losses are confusing for buyers of grow lights. Specifically, the nuanced differences in the PAR spectrum and color shift are not measured. However, LM loss rates are absolutely valuable (and necessary) to evaluate distinctions in the quality of the manufacturing process and the materials of the chips themselves, and more importantly, the OEM design decisions on drive current and thermal management. We think buyers want to be informed.”
LM ratings may not be perfected for grow lights, but they are the best test of quality we currently have. I would disagree that they lead the buyer astray; in fact, they are more often ignored (or worse, misrepresented). LM ratings tease out lesser-quality chips and shortcuts by OEMs. They serve as a flashing red light that something under the covers is wrong. It is our intention to drive this point home to our members by educating them and by demanding that OEMs disclose the specifications. The result will be a far better buyer decision-making process and a more honest marketing culture.
Who do You Want as Your Supplier?
My harshest interpretation is that CJ doesn’t believe buyers are capable of understanding the nuances of LM testing and therefore thinks it shouldn’t factor into their evaluation, lest they be led astray. In truth, I am probably being a bit unfair to CJ, but his reluctance to take the time to inform buyers, for any reason (so long as the content is accurate), is counterintuitive to me.
In contrast, our first LED supplier, Bill, doesn’t doubt the buyer’s capacity to understand. He believes in an informed buyer. I like Bill!
Can Buyers Handle the Truth?
I believe so, and I am committed to the idea that more information, articulated in the right way, is always a good thing. What do you think? Please take the poll below, comment below, and share this blog with others. And register for the First Edition of the LED Product Performance and Quality Ranking Report.