Testing the voltage drop across a strip
My excess of LEDs helps a bit here. I theorise there are two ways you run into this issue, both interwoven, and both can be easily circumvented by injecting power along the length of the strip - however, if the strips come in IP-rated sheaths, the waterproofing is gone.
I wired up 10 m (600 LEDs) of strip with a couple of headers on the end and ran it with each LED set to output RGB (45, 45, 45). You can clearly see the colour change at the end of the strip.
Code and further details for this experiment
I used a 5 V, 6 A PSU to power the strip; above a brightness of about 30 the colour starts to change.
At this point the voltage drop is about 1.2 V.
I speculate that most people would be using a 2-4 A PSU and maybe 5 m per power supply in the best case.
Testing with a brightness of 128 (going above doesn't make much of a difference), the voltage drop was ~1.35 V; at 255 (max brightness) the voltage drop was ~1.45 V.
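To put those measured drops in perspective, a quick back-of-the-envelope calculation (using only the figures from the test above) shows what the far end of the strip actually sees:

```python
SUPPLY_V = 5.0  # nominal PSU output

# Measured voltage drops at each brightness setting (from the test above)
drops = {30: 1.2, 128: 1.35, 255: 1.45}

for brightness, drop in drops.items():
    far_end_v = SUPPLY_V - drop
    pct = 100 * drop / SUPPLY_V
    print(f"brightness {brightness}: far end sees {far_end_v:.2f} V ({pct:.0f}% lost)")
```

At full brightness nearly a third of the supply voltage is gone by the end of the strip, which is plenty to explain the visible colour shift.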
Conclusion: ideally we would be sensing with an offset and gain so that the analogue input is mapped between the power supply's voltage (V) and V/2, to get the most out of the 12-bit ADC on the ESP32.
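That offset-and-gain idea could be sketched as follows - a hypothetical front-end that subtracts V/2 and then applies gain so the remaining half-rail spans the whole 12-bit range (the function name and clamping behaviour here are my own assumptions, not part of the actual build):

```python
ADC_MAX = 4095  # full scale of the 12-bit ADC on the ESP32

def volts_to_counts(v_sense, v_supply=5.0):
    """Map the window [v_supply/2, v_supply] onto the full 0-4095 range.

    Offset: subtract v_supply/2. Gain: stretch the remaining half-rail
    across the whole ADC. Readings outside the window are clamped.
    """
    lo = v_supply / 2
    span = v_supply - lo
    v_sense = min(max(v_sense, lo), v_supply)  # clamp to the sensing window
    return round((v_sense - lo) / span * ADC_MAX)

print(volts_to_counts(5.0))   # top of the window -> 4095
print(volts_to_counts(2.5))   # bottom of the window -> 0
print(volts_to_counts(3.75))  # midpoint -> 2048
```

Compared with sensing the whole 0-5 V range, this doubles the resolution over the region the supply will actually sit in.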
import machine
import neopixel
import utime

print('----------RUNNING----------')
print('----------RX Board----------')

# 600-LED (10 m) strip driven from pin 21
np = neopixel.NeoPixel(machine.Pin(21), 600)

c = 45
c = (c, c, c)  # equal R, G and B
print(c)

for i in range(600):
    np[i] = c
    # np[i-1] = (0, 0, 0)  # uncomment to run a single-pixel chase instead
    utime.sleep_ms(5)
np.write()
To keep things easy to source and assemble, a voltage divider will work a treat.
To find the upper bound of the ADC I referred to the MicroPython documentation (it is also in the ESP32 electrical specifications datasheet). 1 V is the maximum unattenuated voltage (not a concern on most other devboards) - and I'll stick to this, as any software mishaps won't break anything.
With some cheeky math I arrived at a voltage divider consisting of 10k and 2k2 ohm resistors. Check out the Excel sheet for how the voltage is mapped out.
Obviously you'd want something a bit more balanced in practice, but if someone wants to make one of these, that's x more parts they have to source. KISS.
At this point it is paramount that the microcontroller is powered on first, THEN the LED strip; voltages fed back into pins that aren't powered up can destroy them. That's because an input has a maximum voltage rating of something like 0.7 x Vdd - powering the system up in the wrong order will, over time, destroy the ADC pin (maybe even the whole MCU).
In the final design a MOSFET will control the strip being powered on, hopefully also saving some power, as the WS2812's quiescent current is 1 mA each - at large numbers it adds up!
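To see why that matters, the quiescent draw of the 600-LED test strip alone (using the 1 mA per-LED figure quoted above):

```python
QUIESCENT_MA = 1.0  # per-LED quiescent current quoted above
NUM_LEDS = 600

idle_ma = QUIESCENT_MA * NUM_LEDS
print(f"{idle_ma / 1000:.1f} A just sitting idle with every LED off")
```

That's 0.6 A burned doing nothing - a sizeable chunk of a typical 2-4 A supply, so gating the strip's power rail with a MOSFET is well worth the extra part.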
EDIT: I just thought I'd broken the LED strip, but a wire had come loose. I tested the output with the whole system at 5 V (no LEDs on) and across the power supply directly (5.1 V), and the output was still below 1 V - a max of 0.94 V with the PSU directly. The IO is tolerant up to 3.3 V, so it should be good.