I am new to the forum.
I am currently undertaking a project to build a 1:1 scale K-2SO droid from the Star Wars movie series.
I am 3D printing the parts now, from a great group called Droid Division on Facebook.
The finished print stands a little over 7" tall, so it is a HUGE project.
As part of the print files they provide animatronic eyes that move.
This is where I fall down a bit. I am a NooB when it comes to electronics and writing code. So I am hoping someone will take pity on me and help me out.
There are two features I want to add to this project.
Simple animatronic eye movement and pan and tilt object tracking.
But I have very little idea where to begin.
I will be using an ESP32 because that’s what I have on hand.
So, First things first.
I’d like the eye mechanism to do random movements, with the central position (looking straight forward) as position 1:
2 3 4
5 1 6
7 8 9
I think position 1 should come up around 40% of the time. So: a random generator where anything above 40% picks a position from 2 ~ 9, with a random pause interval of 0.5 to 3 seconds before generating a new position.
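For what it’s worth, that behaviour can be sketched in plain Python. This only shows the position/timing logic; the servo calls are left out, and the (pan, tilt) angle pairs are made-up placeholders you’d tune to your mechanism:

```python
import random

# Hypothetical (pan, tilt) servo angles for the 3x3 grid of eye positions.
# Position 1 is the centre; adjust the angles to suit your linkage.
POSITIONS = {
    1: (90, 90),
    2: (60, 60),  3: (90, 60),  4: (120, 60),
    5: (60, 90),                6: (120, 90),
    7: (60, 120), 8: (90, 120), 9: (120, 120),
}

def next_position():
    """Centre (position 1) roughly 40% of the time, otherwise a random other position."""
    if random.random() < 0.4:
        return POSITIONS[1]
    return POSITIONS[random.randint(2, 9)]

def next_pause():
    """Random pause of 0.5 to 3 seconds before the next move."""
    return random.uniform(0.5, 3.0)
```

In the main loop you’d call `next_position()`, move both servos to those angles, then `time.sleep(next_pause())` and repeat.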
Task two is object tracking: a pan and tilt function where it tracks a person’s face and positions the head to follow their movement.
Can I use the same controller to do both functions at the same time?
So, any help anyone can offer to point me in the right direction would be very much appreciated.
Bear in mind, I am a complete beginner with this, so please be kind.
Welcome to the forum!!
AWESOME PROJECT!!! This has got to be one of the coolest 3D prints I have seen (a bit biased - profile pic aha)
The ESP32 might be a bit of a stretch for facial recognition, but it will certainly be able to handle the random movements and controlling the servos. Tim at Core has some excellent guides on facial recognition with the pan and tilt HAT (combining a couple might be a bit tough to get perfect, since there is some distance offset from the eyes to the neck).
PS: I’d love to make my own, are the files open sourced/publicly available?
I am not being picky but do you mean 7 feet tall or 7 inches. The pic looks like 7 feet (7’) but the text says 7 inches (7"). Note the different quote marks, double for inches and single for feet.
Like you say, a HUGE project and a very impressive looking robot indeed.
PS I am assuming you live somewhere that still uses feet and inches. I am old so am used to dual systems.
That’s immense! Looks awesome, @Liam120347 's got you started with some info.
There are some guides around online that cover ESP32s doing facial recognition, though I’m not too sure how smooth or real-time it is. With some tuning and overclocking, @Tim 's guide gets the delay down quite a bit.
This does introduce complexities in getting two modules talking to each other - UART would most likely be your best bet!
Yes, it is 7 feet tall.
Actually I live in OZ, but the specs for the print are in imperial.
Actually it doesn’t really need facial recognition. I just want it to track an object, whether that be a person entering the room or the cat walking past.
Welcome to the forum
That’s a super impressive project, I can only imagine the print times required to get all of the individual pieces done!
Is the photo from your build or another member of the Droid Division? It would be great to see updates on your build as it comes together over time.
Object tracking is computationally far less intensive than object recognition, so if you don’t need it to recognise specific objects, go for the simpler implementation. There are still plenty of challenges to a good object tracking algorithm on its own.
My first thought to implementing object tracking on something like this would probably be to try and combine a super basic PIR motion sensor to trigger when something is moving, with a Time of Flight laser imager or ultrasonic sensor, which can pan around and locate the nearest object.
I haven’t had a search yet to see if there are any good object tracking libraries out there but I think those would be the hardware sensors best suited to locating objects a good distance away from the head of the droid without the expense of a LIDAR system.
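The “sweep around and find the nearest object” part of that idea is simple enough to sketch in Python. This is pure logic; the `(angle, distance)` readings would come from whichever ToF or ultrasonic sensor you end up using:

```python
def nearest_angle(readings):
    """Given (pan_angle, distance_mm) samples from one sweep,
    return the pan angle of the closest object."""
    angle, _ = min(readings, key=lambda sample: sample[1])
    return angle

# Example sweep: five samples across 120 degrees of pan.
sweep = [(0, 2400), (30, 1800), (60, 950), (90, 1200), (120, 2600)]
# nearest_angle(sweep) -> 60
```

You’d point the head at that angle, then re-sweep a narrower arc around it to keep tracking.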
The picture is of another member’s build.
I have only the head and right leg done so far.
A PIR sensor is hard to use because it is difficult to build it into the droid without it being too obvious. I’m thinking along the lines of an ESP32-CAM or similar, as it is easier to hide. (?)
The tracking tutorial above looks to be a good option, although I am not too sure whether it would be self-contained on the Pi or rely on an external computer to work.
The face tracking in the tutorial is all done using the Pi’s computing power with no external resources. This does mean that OpenCV will only really run on the 4GB and 8GB Pi 4 boards because it needs all that extra RAM and CPU grunt the other Pi models aren’t capable of providing.
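For reference, once OpenCV hands you a face bounding box each frame, the pan/tilt part boils down to a proportional nudge toward the frame centre. A minimal sketch of just that step (the capture/detection calls and servo writes are left out, the gain is a made-up starting point, and whether you add or subtract each error depends on how your servos are mounted):

```python
FRAME_W, FRAME_H = 640, 480
GAIN = 0.05  # degrees of correction per pixel of error; tune on real hardware

def track_step(pan, tilt, face_box):
    """Nudge (pan, tilt) servo angles toward a detected face.

    face_box is (x, y, w, h), the format OpenCV's detectMultiScale returns.
    Angles are clamped to the 0-180 range typical of hobby servos.
    """
    x, y, w, h = face_box
    err_x = (x + w / 2) - FRAME_W / 2  # positive: face is right of centre
    err_y = (y + h / 2) - FRAME_H / 2  # positive: face is below centre
    pan = max(0.0, min(180.0, pan - GAIN * err_x))
    tilt = max(0.0, min(180.0, tilt + GAIN * err_y))
    return pan, tilt
```

Running this once per frame gives a smooth follow, because each step only moves the head a fraction of the way toward the face.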
OK, I’ll go with a Pi instead. I have a 4B here somewhere.
It’d probably be easier to drive the servos anyway from what I can tell.
Awesome project! The finish on your prints looks excellent!
The only input I have on this for this problem is on the “random movement” idea.
For small, constrained movements, perhaps a pair of servos per eye would be the way to go. I can’t speak too much about the mechanics of it without having the head open in front of me, but as far as the general code goes, servo libraries take an angle integer, and convert that to a PWM signal that the servos use as their “target angle”.
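To demystify the angle-to-PWM part: hobby servos typically expect a pulse of roughly 1–2 ms repeated every 20 ms, and the library maps your angle onto that pulse width. The mapping is just linear interpolation, something like this (assuming the common 1000–2000 µs range; real servos vary, so check the datasheet):

```python
def angle_to_pulse_us(angle, min_us=1000, max_us=2000, max_angle=180):
    """Map an angle in degrees to a servo pulse width in microseconds."""
    return min_us + (max_us - min_us) * angle / max_angle

# With these defaults, 90 degrees maps to 1500 us (centre).
```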
The Python library random has the choice function that will return an item from a list. You could “weight” the center position more strongly by including it more times in the list, i.e.:
from random import choice

positions = [45, 90, 90, 135]  # 90 (the center) appears twice, so it is chosen more often
angle = choice(positions)
# This is crude; for a real weight for the actual center, you'd have a longer
# list with (pan, tilt) tuples in it, and repeat [90, 90] more than all the others
EDIT: If you’re using a Pi, you’ll need an extra board on top to drive the servos, the Linux OS of the Pi gets in the way of the precise timings needed by the servos to my knowledge:
You can control two servos with just a Pi for testing, but any more will need the HAT:
The eye mechanism is a very simple idea that allows side-to-side movement of each eye in its “socket” and up-and-down movement by tilting the frame.
Awesome, looks like the model designers have already provisioned for micro-servos in there, so the hardware side of things is covered.
I’d see if any of our servos match the dimensions in CAD (hard to get a sense of scale from here):
Since the linkages reduce the servo count to 2, you could do this with just a Pi.
If you need a hand with anything else, be sure to ask us
I already have two MG90 servos for it. It is the coding I have no idea how to do.
I have printed the mechanism and mounted the servos, but I’m not too sure where to go from there.
I have two NeoPixel Jewels for the eyes. I need to work out how to wire them to the Pi so they light up a soft blue. There is an opaque diffuser in front of each eye.
Adafruit’s Neopixel uberguide is the definitive tutorial for everything Neopixel.
This section covers connecting them to the Pi, but the whole tutorial is well worth reading start to finish.