I need some LED power advice


Psab keel

Master Member
RPF PREMIUM MEMBER
LEDs are current-sensitive devices. Typically you don't need much voltage to power an LED; you usually want a supply voltage close to the forward voltage drop of the LED itself. When using an LED driver this may vary a bit depending on the voltage requirements of the driver itself, but if you overdrive the LED you can easily blow it.

So for example, if the forward voltage drop (the minimum voltage required to light your LED) is, say, 3.1 V, you want a power source close to 3.1 volts. Plus you have to make sure the driver can accommodate the flashing feature. Typically a driver's description will say something about PWM (Pulse Width Modulation), which pulses the current to the LED so it can flash or fade by rapidly switching the supplied current on and off.
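As a rough illustration of how PWM dimming works (this assumes a typical 20 mA small LED, a figure not specified in this post):

```python
# Sketch: under PWM the LED is switched fully on and off very fast,
# so perceived brightness (and average current) scales with the duty cycle.
def average_current_ma(peak_ma: float, duty_cycle: float) -> float:
    return peak_ma * duty_cycle

print(average_current_ma(20, 0.5))  # 50% duty cycle -> 10.0 mA average
```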
 
Last edited:

Travis McGee

Active Member
Even when the light itself does the animation? Would it be better if I went with a resistor instead of the driver?
 

Travis McGee

Active Member

I looked at the driver and didn't see anything about PWM in its description or additional information.
 

Psab keel

Master Member
RPF PREMIUM MEMBER
Think of it this way: in my example, if the Vf (forward voltage) of your LED is 3.1 V and you use 3.5 volts to power it, then you need a resistor to drop the remaining 0.4 volts and balance the circuit. To find the correct resistor value you use Ohm's Law. For simplicity's sake, you can find Ohm's Law calculators online with a Google search to get the correct resistor value for your application.
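As a sketch of that Ohm's Law calculation, using the 3.1 V / 3.5 V numbers above and assuming a common 20 mA LED current (not stated in the post):

```python
# Series resistor: R = (supply voltage - forward voltage) / LED current
def series_resistor_ohms(v_supply: float, v_forward: float, i_amps: float) -> float:
    return (v_supply - v_forward) / i_amps

# 3.5 V supply, 3.1 V forward drop, 20 mA -> the resistor drops 0.4 V
print(round(series_resistor_ohms(3.5, 3.1, 0.020), 1))  # 20.0 ohms
```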


I didn't fully read the description before. If you use the LED you linked in your original post, you only need a resistor. Typically you would need a circuit or driver to do that, but this is much simpler. As for your power source, like I said, you want it as close to the forward voltage of the LED as possible, because then you can use a lower-value resistor. Since LEDs need current limiting rather than high voltage, the more voltage you pump into your LED above its forward voltage, the more resistance you need to balance your circuit. Then you get into the more complicated matters of heat management for your resistor and LED, and that's more complex than what you need for your setup.

I hope that makes sense. It's been a number of years since I learned this directly from my late stepfather, who taught me all about it. He was an electrical engineer and in the last few years of his life worked for an LED illumination company. Plus I've been building in-hilt LED lightsabers for almost 20 years.

I hope this helps you out!
 

Travis McGee

Active Member

This helps a lot. I also noticed the pictures of the LED include a resistor selection table that helps you find the resistor you need for your supply voltage.
 

zapwizard

Sr Member
RPF PREMIUM MEMBER
To clarify what Psab keel is saying: if you're driving a single LED and you have control over the voltage source, then yes, running the supply as close to the LED voltage as possible means you have less wasted power at the resistor or driver.

To contradict what Psab keel is saying: the voltage driving the LED doesn't matter so long as the current is regulated. A 3.1 V LED drops the supply voltage by 3.1 V; you can drive it from any voltage you want, so long as you can keep the current limited and can dissipate the remaining power. That little driver can work at up to 45 V. But at 45 V with only one 20 mA LED, roughly (45 − 3.1) × 0.02 ≈ 0.84 W would be dissipated at the driver, and it would probably burn up from the lack of a heatsink. So that is of course not a practical voltage to use.

You don't need any resistor when using that constant-current LED driver. Drivers are great at getting rid of the worry about which resistor to select, they also ensure your LED doesn't burn out due to voltage surges, and they will keep the LED at full brightness until the 9 V battery dips below ~5.1 V.

A resistor is the traditional way to limit the current through an LED when your source voltage is fixed and stable. But since you're running off a battery, the driver is actually better.

Be aware that the LED driver will still dissipate heat, just like a resistor. In your first post you correctly list 5.1 V for the LED and driver. You calculate the remaining voltage: 9 V − 5.1 V = 3.9 V. So 3.9 V × 0.02 A = 0.078 W will be dissipated by the driver or resistor.
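That dissipation figure can be double-checked with a one-liner, using the same 9 V / 5.1 V / 20 mA numbers from the post above:

```python
# Power burned in the resistor or driver = voltage it drops * LED current
def dissipated_watts(v_supply: float, v_load: float, i_amps: float) -> float:
    return (v_supply - v_load) * i_amps

print(round(dissipated_watts(9.0, 5.1, 0.020), 3))  # 0.078 W
```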

You also only need one LED driver per serial chain of LEDs, so long as your serial LEDs add up to less voltage than your source: (LED1 + LED2 + LED3 + 2 V for the driver) ≤ source voltage.

As a rough example: let's say you want to design a series of lights to run off 100 volts DC, and each LED is 3.1 V at 20 mA. You can run 30 LEDs in series using just one driver: 3.1 V × 30 LEDs = a 93 V drop before the driver, so the driver sees just 7 V at 20 mA, or 0.14 W.
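A quick sketch of that series-chain sizing; the 2 V of driver headroom is the figure assumed earlier in this post, and the thread's example of 30 LEDs sits comfortably inside the limit:

```python
# How many series LEDs fit under a source voltage, leaving driver headroom
def max_series_leds(v_source: float, v_forward: float, v_headroom: float = 2.0) -> int:
    return int((v_source - v_headroom) // v_forward)

v_source, v_f, i_amps = 100.0, 3.1, 0.020
chain_drop = 30 * v_f             # 30 LEDs -> ~93 V dropped across the chain
driver_v = v_source - chain_drop  # remaining voltage seen by the driver
print(max_series_leds(v_source, v_f))                   # LEDs that fit at 100 V
print(round(driver_v, 1), round(driver_v * i_amps, 2))  # 7.0 V, 0.14 W
```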
 

nkg

Sr Member
RPF PREMIUM MEMBER
Yes, I definitely like little current regulators. They increase the cost of the installation, but then you know you can feed a whole range of voltages to the LED without risking any problems. I did a museum installation with multiple LEDs and put a current regulator on each one. I feel it was worth it. :)
 

Travis McGee

Active Member
Cool, thank you. I'll have to get some another time; I just bought a bunch of resistors for now. Thanks for your help.
 
