
When I was fiddling with LED circuits back in the 70s (!) I experimented with turning the LED on and off with a square wave. If you turned it on and off rapidly enough, your eye did not notice the flicker and perceived the LED as fully bright. (Adjusting both the frequency and the duration of the "on" part.)

Hence, you could get some decent power savings doing this.

I wonder if this is commonly known. I've mentioned it to a couple EEs over the years, and they were able to reduce the power consumption of their devices.



It's widely known among EEs. It's used for lots of interesting things, such as temperature control, motor control and positioning, and LED lighting. You can do it in hardware old-style with a 555 timer or hex inverter, but most modern systems I've worked with do it with a microcontroller.
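
For anyone who wants to try the microcontroller version, here is a minimal sketch, assuming an ATmega328-style AVR (the register names and the pin are specific to that part; any MCU with a PWM-capable timer works the same way):

    #include <avr/io.h>

    /* Fast PWM on OC0A (pin PD6 on an ATmega328): the timer counts 0..255
       and the pin is high while the count is below OCR0A, so OCR0A sets
       the duty cycle directly. */
    int main(void)
    {
        DDRD  |= (1 << PD6);               /* LED pin as output */
        TCCR0A = (1 << COM0A1) | (1 << WGM01) | (1 << WGM00); /* non-inverting fast PWM */
        TCCR0B = (1 << CS01);              /* clk/8: roughly 3.9 kHz PWM at 8 MHz */
        OCR0A  = 64;                       /* ~25% duty: dimmer, ~25% average current */
        for (;;) { }                       /* the timer hardware does the rest */
    }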

An addendum to this that you may find interesting -- I've experimented with turning the LED on for a few microseconds at higher than rated current, then off for tens of milliseconds. The average current stays far below the specifications. This results in very high apparent brightness per unit of power consumption.

Using the IV curve of the LED, this also let me eliminate the typical current-limiting resistor. The power savings are more than the power cost of the MCU that controls it (modern low-power microcontrollers are awesome).

Anyway, the end result is a little LED + CR2032 cell + magnet that you stick to furniture, and it runs for about 3 years. I made it so that elderly people I know who wake up at night to go to the bathroom don't bump into furniture (especially in an unfamiliar place, like while traveling). Without creating a thing they have to think about often. If you're curious, I posted the code here: https://github.com/seanboyce/tinylight
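
For the curious, the timing pattern looks roughly like this. This is only an illustrative sketch, not the actual tinylight firmware: the pin, pulse width and rest time are made-up numbers, and the real thing would sleep the MCU between pulses instead of busy-waiting.

    #define F_CPU 1000000UL    /* adjust to your clock */
    #include <avr/io.h>
    #include <util/delay.h>

    /* Pulse the LED hard for a few microseconds, then rest for tens of
       milliseconds.  Average current = peak current * on_time / period,
       e.g. 30 mA * 5 us / 30 ms = 5 uA average, while the eye integrates
       the bright pulses into a usable glow. */
    #define LED_PIN PB0        /* hypothetical pin */

    int main(void)
    {
        DDRB |= (1 << LED_PIN);
        for (;;) {
            PORTB |=  (1 << LED_PIN);   /* brief high-current pulse */
            _delay_us(5);
            PORTB &= ~(1 << LED_PIN);   /* long rest keeps the average */
            _delay_ms(30);              /* current far below the rating */
        }
    }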

An additional one you might like: I did PWM for LED dimming in the tens of MHz for some 1 W red LEDs. This is for my wife -- when she has a migraine she prefers very dim red light to complete darkness. In the MHz range, there's no visible flicker by a long shot (although it costs a little more power). Most PWM systems I've seen that flicker use lower-frequency signals.

It must have been cool to play with LEDs in the 70s. We sort of take them for granted now, but they are so awesome. Truly we live in an age of wonders.


> It's used for lots of interesting things, such as temperature control, motor control and positioning, and LED lighting.

A pretty slow version of this is used in your microwave or air fryer. But they typically use a temperature sensor (like a bimetallic strip) to turn the heating element on for at least a few seconds and then off again; they don't use a timer.

This is known as bang-bang control.

https://en.wikipedia.org/wiki/Bang%E2%80%93bang_control
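
In code, bang-bang control is just a comparison with a bit of hysteresis so the heating element doesn't chatter around the setpoint. A toy sketch (the numbers are arbitrary, not from any particular appliance):

    #include <stdbool.h>

    /* Toy bang-bang (on/off) controller with hysteresis: heat below the
       low threshold, stop above the high one, and keep the previous
       state in between. */
    static bool heater_on = false;

    void control_step(double temp_c)
    {
        const double setpoint = 180.0;   /* target temperature, deg C */
        const double band     = 5.0;     /* hysteresis half-width */

        if (temp_c < setpoint - band)
            heater_on = true;            /* full power */
        else if (temp_c > setpoint + band)
            heater_on = false;           /* off */
        /* between the thresholds: leave heater_on unchanged */
    }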


Microwaves never cease to amaze me. If I were in some alternate timeline where they weren't ubiquitous, and you told me such a thing would soon be in everyone's homes, there's no way I would believe you.

I mean, people putting their food in a Faraday cage and then pointing a cavity magnetron at it to cook it? It sort of sounds far-fetched.

Bimetallic strips are also so awesome. Every time I hear that little 'click' noise, I think about them.

Is there a term for knowing just enough about the technology that underlies daily life, that you feel mildly absurd using it for mundane tasks?


Thank you for the awesome reply!


In the now distant past, all manufacturers of electronic components published very extensive datasheets, application notes and user handbooks for all the devices that they were selling.

One could typically learn much more electronics from the application notes or maintenance manuals of the vendors than from university courses.

This included LEDs. For instance, Hewlett-Packard published a good handbook for their LEDs, in which many useful design techniques were explained, including what you mention: LEDs may have higher luminous efficiency at very high currents, so for a given luminous flux you may save energy by driving them with pulsed current.

Multiplexing the interfaces of multi-digit/multi-character LED displays (e.g. for clocks or calculators) not only reduces the number of wires in the interface, it also improves energy efficiency, because only one digit/character is powered at any instant, at a much higher current but for a much shorter time than in a non-multiplexed display.
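
A rough sketch of one scan step of such a multiplexed display, assuming a 4-digit 7-segment unit; the two driver functions are hypothetical stand-ins for whatever port writes the real hardware needs:

    #include <stdint.h>

    /* Only a single digit is driven at any instant, at a higher current
       than a static display would use; cycling through the digits fast
       enough makes them all appear lit. */
    extern void write_segments(uint8_t pattern);  /* hypothetical segment port */
    extern void enable_digit(int index);          /* hypothetical; -1 = all off */

    static const uint8_t font[10] = {             /* segment patterns for 0-9 */
        0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F
    };

    uint8_t digits[4];           /* the four digits currently shown */

    void scan_tick(void)         /* call from a periodic timer interrupt */
    {
        static int current = 0;
        enable_digit(-1);                      /* blank briefly to avoid ghosting */
        write_segments(font[digits[current]]);
        enable_digit(current);                 /* drive just this one digit */
        current = (current + 1) % 4;
    }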

During that golden era of electronics documentation, it could still be difficult to get hold of vendor documentation, even though it was usually free, if you were located far away, e.g. in another country.

When the Internet appeared, it solved this problem for a while: you could be at the other end of the world and still easily access the datasheets, application notes and user manuals.

Unfortunately, soon after that, towards the end of the nineties and much more so since 2000, the quality of technical documentation degraded tremendously, so you now have easy access, but to much less useful information.


Too many companies nowadays just straight up lock all relevant documentation behind a contract and an NDA.

You want anything more than a 2-page marketing fluff piece? Talk to sales! If you agree to an MOQ of 50,000 and sign away your soul in a two-mile-long NDA, then you can have a look at the documentation. With one eye only. For a couple of minutes.


PWM. This is a widely used method for dimming.


It's also widely known to cause headaches. I swapped out all my PWM bulbs and couldn't be happier. More info here: https://flickeralliance.org/


IKEA has, at least in Sweden, started publishing how much their lights flicker. From my measurements, the published figure is actually an upper limit: the dimmable ones flicker less over most of their dimming range, for example.

Now, I only measured two bulbs, but I am pretty darn happy with those results. I also opened one of their chargers and haven't looked elsewhere for chargers since. The thing was even better built than my Apple charger (the 45 W Sjöss is actually quite crazy; the 30 W had some issues).


I have a video projector, and if I flick my eyes across the screen I can perceive the separate R G B frames. But just watching it, it's fine.


The interstimulus interval (ISI) for vision is much longer than the flicker periods or frame intervals of most displays and projectors. However, flicker can still be perceived through temporal aliasing. For lighting, even simple motion in the scene can reveal flicker, and waving your spread fingers in front of your eyes is a sure way to detect it.

What you're describing is likely saccadic masking, where the brain suppresses visual input during eye movements. It "freezes" perception just before a saccade and masks the blur, extending the perception of a "frame" up to the point in time of the sharp onset of masking. That's how you get a still of a partially illuminated frame instead of the blended together colors.

I’m no expert in this, but if you're curious, check out the Wikipedia pages on interstimulus interval, saccadic masking, chronostasis, and related research.


That's DLP, and it's because DLP uses a color wheel spinning at 3x the frame rate.


Not only DLP. Some laser projectors use a color wheel to get the same effect; others use multiple lasers but pulse the colors individually, which gives a similar result. There are probably laser projectors that do all colors simultaneously too, just as there are DLP projectors with three DLP chips and prisms etc. to split and combine the light. My laser projector does the pulsing. I find it slightly less distracting than I remember color wheels being, but it's very similar; thankfully the family has gotten used to it, and I only notice it on content that encourages a lot of fast eye movement.


Scanning displays like a laser projector or a CRT will flash any individual spot, but the overall illumination is almost continuous, so people have fewer issues with them.


I'm talking about laser illuminated panel projectors used for viewing film or video content, not scanning lasers used for light shows. Scanning a laser beam is a lot more difficult than scanning an electron beam, so rasterised displays from a scanning laser are limited compared to CRTs.


Ah, those are just wider-gamut conventional projectors.


I believe the colour wheels now spin twice as fast, which may reduce the effect somewhat. But for me it's not a very nice effect, which is why I use a 3-chip projector. Unfortunately, 3-chip DLP is prohibitively expensive (cinemas use them), but JVC DLA projectors are good. Although it looks like the newer 4K models are also prohibitively expensive :/


I cannot stand DLP projectors, so I got an Epson 3LCD and have no issues with it; it's not crazy expensive either.


LCDs have typically been considered inferior for home cinema usage. Their only advantages are the price and the lack of a colour wheel. DLA is a sort of middle ground that is technically an LCD but works more like DLP. None of this matters if your room isn't set up right or you're projecting in a not completely dark room, though.


I know. I was just pointing out that flicker doesn't affect me, for which I'm glad. I was a bit concerned before I bought it, due to the complaints about the flicker.


Yes, I get those as well from some cheap LEDs. Though if you flicker LEDs fast enough, it's fine.


How could the same modulation achieve "dimming" and "fully bright" simultaneously?


For traditional light bulbs, a PWM signal actually makes the light noticeably dimmer, with a very similar effect to simply reducing the current. This is because the mechanism is heating up the filament, which happens on a much slower time scale than the PWM period.

In practice, traditional dimmers are not quite PWM, as they do not generate a square wave. Instead they generate a sine wave with portions of each cycle clamped to 0.

LEDs already need driver circuitry to condition the relatively high AC voltage into a stable, lower DC voltage. Dimmable LEDs create a stable DC supply from the chopped-up AC power, then use the width of the active portions of the AC as a signal to drive their own dimming logic.
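
To put a number on that clamped sine: assuming an ideal resistive load, the fraction of full power delivered depends only on how far into each half-cycle the dimmer starts conducting (the firing angle), which is what a dimmable driver reads back as the dimming level. A quick numeric sketch:

    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Leading-edge phase-cut dimmer into an ideal resistive load: the
       waveform is zero until 'alpha' radians into each half-cycle, so
       the delivered power fraction is ((pi - alpha) + sin(2*alpha)/2) / pi. */
    static double power_fraction(double alpha)
    {
        return ((M_PI - alpha) + sin(2.0 * alpha) / 2.0) / M_PI;
    }

    int main(void)
    {
        /* Firing at 90 degrees clamps half of each half-cycle to zero and
           delivers exactly 50% of full power; firing at 45 degrees gives ~91%. */
        printf("alpha = 90 deg -> %.0f%% power\n", 100.0 * power_fraction(M_PI / 2.0));
        printf("alpha = 45 deg -> %.0f%% power\n", 100.0 * power_fraction(M_PI / 4.0));
        return 0;
    }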


That's a good PWM explanation, but I think GP was asking a different question. Walter claimed that if the PWM frequency was high enough, you would perceive it as full brightness while it used less power.

It obviously wouldn't work that way when lighting a room, but what about for an LED indicator light that you look at directly? I don't know enough to form an opinion.


Without any more information on frequencies and measurements it's impossible to know. My guess is either A) they did not perceive the dimming, and/or B) the frequency was high enough that the inductive and capacitive effects of the circuitry became relevant and filtered the PWM signal into a DC signal that drew the same amount of power as the full DC voltage would.


The measurable brightness of the LED follows straightforwardly from the ratio of the time it's on to the time it's off. Shift the ratio so it's off more than it's on and it gets dimmer.

The "fully bright" part is a consequence of human vision. Your brain is making sense of limited, noisy information coming from your eyes. A flickering light appears brighter than the light in steady state, lots of still images shown in rapid succession look like they're moving, as far as you can tell you can perceive the full range of colour out of the corner of your eyes, and the dress could be either colour.


Also (and I think more important, but I might be wrong), we don't perceive light intensity linearly but logarithmically. A 50% duty cycle appears brighter than half as bright, and a 90% duty cycle might be only barely perceptibly dimmer than full brightness.
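
This is also why LED dimmers usually map the requested brightness through a power-law ("gamma") curve before turning it into a duty cycle; a linear duty ramp seems to spend most of its range near full brightness. A small sketch, using a gamma of 2.2 as a common approximation (the exact perceptual curve is debated):

    #include <math.h>
    #include <stdio.h>

    /* Map a perceived-brightness request (0..255) to a PWM duty (0..255)
       with a power-law curve.  With an exponent of ~2.2, asking for 50%
       perceived brightness ends up around 22% duty, and ~90% duty looks
       only slightly dimmer than full on. */
    int main(void)
    {
        const double gamma_val = 2.2;
        for (int level = 0; level <= 255; level += 51) {
            double duty = 255.0 * pow(level / 255.0, gamma_val);
            printf("perceived %3d/255 -> duty %3.0f/255\n", level, duty);
        }
        return 0;
    }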


The apparent brightness is set by the on/off ratio, which is called the duty cycle. A 50% duty cycle would mean that half the time the light is on, and half the time it is off.

If the cycling on and off is done at, say, 10 kHz, you don't see any flicker; it's just perceived as a dimmer light.


You don't. But your eyes perceive brightness non-linearly, which means that to your eye, 80% brightness is very close to 100%.


Perhaps less widely known is that if you market a commercial product that actually uses PWM to modulate LED intensity, you're liable to be litigated against by Philips for patent infringement?

Wild...yeah, I know. Heard that from a buddy in Austin over a decade ago. Vaguely recall that he had to redesign using some sort of current driver instead to avoid the legal encumbrance.


Probably no longer true; patents only last 20 years.


To be fair, it's probably naive to blindly assume that a company like Philips, with a long history of litigious behavior, isn't playing patent continuation games.


Not only is this widely known, it is the main method by which multicolor RGB lights exist. They are actually a package with red, green, and blue LEDs in it, and each color channel is individually "dimmed" using this technique (called pulse width modulation, or PWM) to produce many more colors (much like a TV or LCD/OLED display).

The white produced by turning R, G, and B fully on is quite ghastly, so modern ones include a separate white LED (and frequently a second, warm-white LED) in the package, for a total of four or five individual LEDs in a single light (RGBW or RGBWW).
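
In code, producing a color is just a few independent duty cycles, one per channel. A sketch assuming 8-bit PWM channels and a hypothetical set_duty() that writes the hardware compare register:

    #include <stdint.h>

    /* Hypothetical per-channel PWM writer: 0 = off, 255 = fully on. */
    extern void set_duty(char channel, uint8_t duty);

    /* Mix a color by giving each LED in the package its own duty cycle.
       An RGBW part also gets a dedicated white channel instead of
       relying on full-on R+G+B for white. */
    void set_color(uint8_t r, uint8_t g, uint8_t b, uint8_t w)
    {
        set_duty('R', r);
        set_duty('G', g);
        set_duty('B', b);
        set_duty('W', w);
    }

    /* Example: a dim warm orange - mostly red, a little green, no blue.
       set_color(200, 60, 0, 0); */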


He's not talking about dimming. He's talking about the perceived brightness when you look directly at an indicator LED.


I watched a fascinating video of a guy explaining, in detail, how vacuum tube radios worked. Quite a joy to watch.

I'm glad I didn't take up Electrical Engineering as a major. All the fun of using signal generators and oscilloscopes is gone. Just program a Raspberry Pi to do it, or use a circuit simulator on your computer. Sigh. EE is no fun without getting accidentally zapped by AC once in a while or letting the smoke out of a transistor.

It's like hotrodding a car by plugging in a laptop. Ehhh no.


That's more 'computer engineering', no? EE also has things like Power Systems, RF, and solid state if you want to go into chip design and the like.


I thought this was very common knowledge, we used the technique in our first year EE project.


That’s literally how LED drivers have worked for half a century.


It is widely known and has been utilized extensively for decades because the junction is more efficient when cool. Running it at a lower duty cycle allows for higher peak current/light output.


I have a couple LED night lights that are triggered by a photocell to only turn on in the dark. I've often wondered if the trigger circuit consumed more power than just leaving the LED on.


It likely consumes next to no power: a proper switching MOSFET has negligible resistance when switched on and a very high gate resistance, so the photocell does not have to supply any significant current. Possibly the current-limiting resistor in the power supply of the LED circuit dissipates more than the trigger circuit.
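
A back-of-the-envelope comparison, where every value is an illustrative guess rather than a measurement of any particular night light:

    #include <stdio.h>

    /* Rough power budget: photocell divider + MOSFET gate vs. the LED it
       switches.  All component values are illustrative guesses. */
    int main(void)
    {
        const double vcc          = 3.0;     /* supply voltage, V */
        const double divider_ohms = 1e6;     /* photocell + bias resistor */
        const double gate_leak_a  = 10e-9;   /* MOSFET gate leakage, A */
        const double led_a        = 10e-3;   /* LED current when lit, A */

        double trigger_w = vcc * (vcc / divider_ohms + gate_leak_a);  /* ~9 uW */
        double led_w     = vcc * led_a;                               /* ~30 mW */

        printf("trigger circuit: ~%.1f uW\n", trigger_w * 1e6);
        printf("LED while on:    ~%.0f mW\n", led_w * 1e3);
        return 0;
    }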


This is why I love HN! Only here you’ll find the actual guy who invented dimming!


I should have patented it! Mwuhahahaha


Nominative determinism strikes again! (Or is this anti-determinism?)



