Old 12-23-17, 07:05 PM
  #5  
grizzly59
Senior Member
 
Well, pretty much. The specs for an LED should give a max current and a forward voltage drop. Subtract the voltage drop from the supply voltage, then find a resistor that allows less than the max current with the remaining voltage. Example: DC 6v supply, simple series circuit, LED rated 1.8v drop, 30ma max. 6v - 1.8v = 4.2v, and 4.2v / 0.030a = 140 ohms. For practical use, derate the current to say 80%: 0.8 * 30ma = 24ma, and 4.2v / 0.024a = 175 ohms. The parts bin will probably have a 180 ohm, call it good. If you don't have specs for the LED, start with a resistor you're sure is more than enough (maybe one that gives about 5ma), hook it up, turn it on, and measure the voltage drop across the LED with a meter. Then step the current up (lower the resistance) until you're satisfied or you burn it up.
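
If it helps, here's a quick Python sketch of that same arithmetic. The 6v / 1.8v / 30ma numbers are just the example above, not specs for any real LED, and the E12 list is a generic standard-value table, so adjust for whatever parts you actually have.

# Series resistor for an LED: sketch of the math in this post.
E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]  # standard resistor values

def series_resistor(v_supply, v_led, i_max, derate=0.8):
    i_target = i_max * derate                  # e.g. 0.8 * 0.030a = 0.024a
    r_target = (v_supply - v_led) / i_target   # e.g. 4.2v / 0.024a = 175 ohm
    # Smallest standard value at or above the target keeps current under the limit.
    standard = sorted(v * 10 ** d for v in E12 for d in range(6))
    r_standard = next(r for r in standard if r >= r_target)
    return r_target, r_standard

r, std = series_resistor(6.0, 1.8, 0.030)
print(f"target ~{r:.0f} ohm, use {std} ohm")   # target ~175 ohm, use 180 ohm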

And everything HTupolev said above.
