LED Heating problem

sapper36

Hey guys - Need some help from the electronics gurus:

Here's what I'm trying to hook up.

12V 500mA DC source

Two green 2.8V LEDs hooked up in parallel

260 ohm resistor hooked to one end of the LEDs

I thought I calculated this right, but my resistor keeps heating up way more than I'm comfortable sealing inside a model. If I use higher-ohm resistors I don't seem to get the full brightness out of the LEDs. Am I doing this wrong, or should I just build in a heat sink?
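
For reference, here's a rough Python sketch of the numbers (assuming both LEDs share the one 260 ohm resistor):

```python
# Back-of-envelope check: 12V supply, two 2.8V green LEDs in
# parallel, sharing a single 260 ohm series resistor.
V_SUPPLY = 12.0   # supply voltage (V)
V_LED = 2.8       # forward voltage of the green LEDs (V)
R = 260.0         # series resistor (ohms)

v_resistor = V_SUPPLY - V_LED      # voltage the resistor has to drop
i_total = v_resistor / R           # total current through the resistor (A)
p_resistor = v_resistor * i_total  # power burned off as heat (W)

print(f"resistor drops {v_resistor:.1f} V")
print(f"total current: {i_total * 1000:.1f} mA")
print(f"dissipation: {p_resistor:.2f} W")
# -> drops 9.2 V, passes ~35 mA, sheds ~0.33 W of heat --
#    more than a typical 1/4W part is rated for.
```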
 
You could try going down in battery size and resistor size. That might help, and it will still keep the LEDs bright. You'll just have to change the battery more often.
 
Well... actually it's not a battery but a DC transformer. Also, these won't be the only LEDs powered in the circuit - it will also be powering about 10-15 other LEDs. I'm just trying to get this one node balanced out first.

Thanks
 
Need to know the current draw of the LEDs. You need to go higher in resistance, not lower, if you're heating up the resistor.

Jim
 
A resistor's byproduct is heat. If that's the only thing you're running off your 12 volt transformer, I'd drop that voltage way down (get a lower-voltage transformer).

If you have other things that will run off that transformer too, you'd better hook everything up first. The extra load will make the voltage drop, and that can play weird tricks on other parts.
 
Yes, we need to know the current draw of the LED. Resistors have wattage ratings too. Perhaps you need to go up a size or two in wattage for your particular resistor, to help it dissipate the heat buildup.
 
Another thought occurred to me - do the LEDs need to be in parallel in this leg? Connect them in series instead: the LEDs drop more voltage on that leg, the current draw is halved, and your resistor dissipates less. Assuming a 20mA current per LED:

Parallel LEDs = 2.8V drop, 40mA current. 12V - 2.8V = 9.2V. 9.2V / 0.04A = 230 ohms. 9.2V x 0.04A = 0.368W, so a 1/2W resistor.

Series LEDs = 5.6V drop, 20mA current. 12V - 5.6V = 6.4V. 6.4V / 0.02A = 320 ohms. 6.4V x 0.02A = 0.128W, so a 1/4W resistor.

You'll need to adjust for the actual current of your LEDs of course.
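
If it helps, here's that same math as a quick Python sketch (still assuming 20mA per LED - plug in your LEDs' real rating):

```python
def led_resistor(v_supply, v_led_total, i_total):
    """Series resistor value and its dissipation for one LED leg."""
    v_r = v_supply - v_led_total  # voltage left over for the resistor
    return v_r / i_total, v_r * i_total  # Ohm's law R = V/I, power P = V*I

# Parallel leg: one 2.8V drop, currents add (2 x 20mA = 40mA)
print(led_resistor(12.0, 2.8, 0.040))  # -> about 230 ohms, 0.368 W (1/2W part)

# Series leg: drops add (2 x 2.8V = 5.6V), current stays at 20mA
print(led_resistor(12.0, 5.6, 0.020))  # -> about 320 ohms, 0.128 W (1/4W part)
```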

Jim
 
As said above, these solutions are worth repeating...

The best solution is to use a lower-voltage power supply...

Short of that, the LEDs should be wired in series to maximize the voltage consumed by the LEDs themselves, taking the load off the resistor...

The resistor's byproduct is heat: in order to drop the voltage, it converts the excess volts to heat... So when you start out with 12 volts and only use 2.8, the resistor has to burn off 9.2 volts... Wire the LEDs in series and it only has to burn off 6.4 volts... This will result in much less heat given off... Also, you should be using at least a 1/2 watt resistor for the 9.2 volt scenario, or a 1/4 watt for the 6.4 volt scenario...

And if all else fails and you don't want to do one of the preferred solutions, go to a resistor with a bigger wattage rating, say at least 1 watt... it will dissipate the heat better...
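
As a footnote to that, here's a small sketch of picking the wattage - the 2x safety margin is just a common rule of thumb, not anything official:

```python
# Hypothetical helper: smallest standard resistor wattage that gives
# at least a 2x margin over the calculated dissipation.
STANDARD_WATTAGES = [0.125, 0.25, 0.5, 1.0, 2.0]  # common through-hole sizes

def pick_wattage(p_dissipated, margin=2.0):
    needed = p_dissipated * margin
    for w in STANDARD_WATTAGES:
        if w >= needed:
            return w
    raise ValueError("too much dissipation for a standard small resistor")

print(pick_wattage(0.368))  # parallel leg -> 1.0 (matches the 1 watt advice)
print(pick_wattage(0.128))  # series leg   -> 0.5
```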
 