I'm almost too embarrassed to ask because it seems such a simple task. However, I need to make sure I get this right.
I have an 'off-the-shelf' string of Xmas LEDs (series lights) with a single-cell battery (1.2 V) and a small solar panel… That's the easy bit…
I want to change the single-cell battery (1.2 V) to a 12 V rechargeable battery with a bigger solar panel, which is now mounted on the roof.
In other words, the voltage has to be dropped to 1.2 volts, with a maximum current draw of 1 amp. (The spec on the LED string says Bulb Rating: 3 V, 0.015 W.)
I purchased a “ADJUSTABLE SWITCHING POWER SUPPLY MODULE IN 4V-35V OUT 1.5V-30V LM2596S” which I can use to bring the voltage down but not the current.
I have to reduce the current from the power supply's maximum output of 3 amps down to 1 amp…
What is the best way I can do that?
You’re already done!
When working with electronics it's critical that devices be supplied with exactly the right voltage. The good news is that the current is determined by the device using it, not by the supply. Using water as an analogy, think of voltage as the pressure of the water in a pipe and current as the flow of water through it. Too much pressure (voltage) will break your device, but a big pipe at the proper pressure just means there is more water (current) available.
Your lights will only draw as much current as they need. Any extra available current will simply go unused.
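To put rough numbers on that: the quoted bulb rating (3 V, 0.015 W) implies how much current each bulb actually asks for. Here's a back-of-the-envelope sketch — the 50-bulb count is a hypothetical stand-in for your actual string, not something from the spec:

```python
# Estimate the current an LED string draws from its power rating.
# Assumes the quoted spec: each bulb is 3 V, 0.015 W. The bulb count
# is a guess for illustration; substitute your real string length.
bulb_voltage = 3.0      # volts per bulb (from the quoted spec)
bulb_power = 0.015      # watts per bulb (from the quoted spec)
bulb_count = 50         # hypothetical; count your string

current_per_bulb = bulb_power / bulb_voltage          # I = P / V
print(f"Current per bulb: {current_per_bulb * 1000:.0f} mA")

# If the bulbs sit in parallel across the supply, their currents add:
total_parallel = current_per_bulb * bulb_count
print(f"Worst case (all bulbs in parallel): {total_parallel:.2f} A")
```

Even in the worst case that's a small fraction of an amp, so a supply "rated" for 3 A is no more dangerous than one rated for 1 A — the string simply won't ask for more.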
I hope that helps! Let me know if you have any questions!
Thank you. My understanding was that LEDs had to have a fixed/maximum amperage because they will draw too much current if they are not limited… QUOTE: "If you connect an LED directly to a current source it will try to dissipate as much power as it's allowed to draw, and, like the tragic heroes of olde, it will destroy itself. That's why it's important to limit the amount of current flowing across the LED."
Can you clarify the above statement for me.
Thank you again
Ahhh, it depends on whether your LEDs are connected in series or parallel. Christmas lights are typically in series, and I forgot all about that! You have controlled the voltage to be 3 V; using Ohm's Law (V = I × R) we can calculate what resistor value we need to limit the current: R = V / I. I'm concerned that the power supply will not limit the current to 500 mA — that's just what it is rated for. I think what might happen is it will join the LED strip in a blaze of glory, but perhaps enough resistance would prevent that. I'm a little rusty on my electrical theory, so I'll invite someone else to this conversation who might know!
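For what it's worth, the series-resistor calculation can be sketched numerically. The supply voltage, LED forward drop, and target current below are illustrative assumptions, not measured values — check your LEDs' datasheet before trusting any of them:

```python
# Sketch of sizing a current-limiting series resistor via Ohm's law.
# All numbers are illustrative assumptions, not measured values.
supply_voltage = 3.0    # volts, as set on the buck converter
led_voltage = 2.0       # assumed forward drop of the LED(s); check the datasheet
target_current = 0.005  # amps (5 mA), implied by a 3 V / 0.015 W bulb rating

# The resistor only needs to drop the difference between supply and LED:
resistance = (supply_voltage - led_voltage) / target_current
power_dissipated = (supply_voltage - led_voltage) * target_current
print(f"Series resistor: {resistance:.0f} ohms")
print(f"Resistor power:  {power_dissipated * 1000:.1f} mW")
```

The power figure matters too: here it's tiny, so any common quarter-watt resistor would cope.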
I’m even more rusty on the electrical theory. I last dabbled in it in 1966 so I thought I would ask.
Let’s see what comes back.
I'd prefer not to fall back on a resistor, hoping there might be something more sophisticated.
Where did you get the 1 A limit from? If that is just what the current supply is rated at, I think you will be fine to use the 3 A one: either rating would be far more than enough to burn out the LEDs if the string actually drew that much, so the rating itself isn't what protects them (though without seeing the string I can't be sure, and it is considered bad practice to design circuits that draw overcurrent).
My advice would be to put a potentiometer into the circuit and slowly wind down the resistance; if the lights get brighter than they were on the original supply before the resistance reaches its minimum, you may have a problem.
A resistor will be the easiest way to limit the current, but you could also use a transistor to build a more advanced limiter. I would use something like a BJT limiter or a MOSFET-based one.
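As a rough sketch of how the BJT version sets its limit: in the classic two-transistor circuit, a second BJT steals base drive from the pass transistor once a sense resistor drops about one base-emitter voltage (~0.6 V). The limit current below is an illustrative assumption, not a recommendation for this particular string:

```python
# Sense-resistor sizing for a classic two-BJT current limiter.
# The limiter trips when the sense resistor drops about one V_BE.
# All values are illustrative assumptions.
vbe = 0.6               # volts, typical silicon base-emitter turn-on
limit_current = 0.25    # amps; hypothetical limit chosen for illustration

sense_resistor = vbe / limit_current    # R_sense = V_BE / I_limit
sense_power = vbe * limit_current       # power dissipated at the limit
print(f"Sense resistor: {sense_resistor:.1f} ohms")
print(f"Power at limit: {sense_power:.2f} W (pick a resistor rated higher)")
```

Note the trade-off: the limit tracks V_BE, which drifts with temperature, so this is a protective clamp rather than a precision current source.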
Much appreciated… I’ll explore your suggestions