PTZ (pan-tilt-zoom) DSLR camera tracker with liftoff detection project

The Rocketry Forum


curtisheisey

In my quest for ever better photos of launches, I have decided to take on a gimbal DSLR camera mount and rocket tracker project. It would have stand-off detection of rocket liftoff and track the rocket for the first 1000 feet. I plan to initially use a 28 mm lens at about 50 feet standoff. Eventually I would like to work up automated track and slew. Finally, tie in zoom control of a DSLR with a telephoto, but that is a ways down the road.

I would use a Raspberry Pi for the computer/controller.

I figure this would be in several phases.

1) Build camera gimbal
I found this wonderful gimbal on Thingiverse: https://www.thingiverse.com/thing:3375167
I probably need to beef up the plastic gears with metal ones. I estimate the initial vertical slew rate is going to be 30 deg/s to 60 deg/s.

2) Create liftoff detection and use canned slew profiles. Collect data and develop tracking algorithms.
I could use the LightWare SF30 range finder. This has a high update rate and a 50 meter range. I could either point it at a fin, or just above the launch rail.
Adafruit has an 8x8 pixel IR camera, but I don't know if it has the range. I could use a blow torch at 50 feet as a surrogate. Also, the update rate is not great (10 samples/s).

3) Incorporate tracking algorithms.

There are some good open source video analytics packages - openCV, Processing, etc.

4) Work on DSLR zoom


This will keep me out of trouble for the next year. Ideas and comments welcome.

 

I would guess that at a distance of 50 ft a pan rate of 60 deg/sec would be insufficient.
 
I would guess that at a distance of 50 ft a pan rate of 60 deg/sec would be insufficient.
+1, it would be better to use a telephoto lens at a greater distance, but a shorter lens should work well enough for testing purposes. I am definitely interested in following your progress on this project.
 
Honestly, I'm not sure the IR tracking method would be the best. You could do the same job with a regular camera by just looking for a bright spot in the frame, which would be the plume. This would also let you use much less expensive, higher resolution cameras. With only 8x8 pixels, that IR sensor is going to make it pretty hard to keep the tracked object in frame. Better yet, you can just use the video output of the DSLR itself to point the tracker.

You basically have two main choices for how to track an object.
The first and most straightforward method is to have a tracker on the rocket itself, which sends the rocket's position over RF down to your optics mount. Then all you need is a GPS coordinate for the mount itself, and it's pretty simple math to get pointing angles for your mount. Add some basic PID or other closed-loop control and then you just need to tune things to get it working right.
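The pointing math itself is only a few lines. Here is a rough Python sketch, assuming you flatten both GPS fixes into a local north/east/up frame (plenty accurate at these distances; the example coordinates are made up):

```python
import math

def pointing_angles(mount_lat, mount_lon, mount_alt,
                    rkt_lat, rkt_lon, rkt_alt):
    """Return (azimuth_deg, elevation_deg) from the mount to the rocket.

    Flat-earth approximation, which is plenty accurate over a few
    thousand feet of separation.
    """
    R = 6371000.0  # mean earth radius, meters
    dlat = math.radians(rkt_lat - mount_lat)
    dlon = math.radians(rkt_lon - mount_lon)
    north = dlat * R
    east = dlon * R * math.cos(math.radians(mount_lat))
    up = rkt_alt - mount_alt

    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    ground = math.hypot(north, east)
    elevation = math.degrees(math.atan2(up, ground))
    return azimuth, elevation

# Example: rocket 300 m up, slightly north-east of the mount
print(pointing_angles(42.0, -71.5, 0.0, 42.001, -71.499, 300.0))
```

Those two angles become the setpoints for your pan and tilt loops.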

The other method is computer vision. There has been a huge amount of development in this area in recent years, so there are very likely some good, well-behaved programs around that will do the vast majority of the work for you. At the very least, I know there was an optically tracking paintball turret program around 10 years ago. A computer vision solution would let you track based off an outline, the plume, or a bright paint job on the rocket, with very good results.
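For the bright-spot approach, something like this OpenCV sketch is the core of it (camera index, field-of-view numbers, and the brightness threshold are placeholders to tune):

```python
import cv2

HFOV_DEG, VFOV_DEG = 65.0, 40.0   # assumed field of view of the tracking camera

cap = cv2.VideoCapture(0)          # tracking camera (index is a placeholder)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (11, 11), 0)   # suppress single hot pixels
    _, max_val, _, (x, y) = cv2.minMaxLoc(gray)  # brightest spot = plume, hopefully

    h, w = gray.shape
    if max_val > 230:                            # "plume" brightness threshold
        pan_err  = (x - w / 2) / w * HFOV_DEG    # degrees right of frame center
        tilt_err = (h / 2 - y) / h * VFOV_DEG    # degrees above frame center
        # feed pan_err / tilt_err into the gimbal's closed-loop controller
        print(f"pan {pan_err:+.1f} deg, tilt {tilt_err:+.1f} deg")
```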

If you get fancy, you can combine the two options I mentioned: use the RF telemetry to get a general pointing angle, then use the computer vision to center the rocket in the view.
 
There is/was a camera tracking system that used an RF fob to track something like a runner. I can't remember the brand name, but it was determined in an old thread that it would most likely be too slow in pan/tilt to work for rockets.
 
The beacon tracking has been suggested to me by several people. I would like a configuration that just uses video tracking, so that I can point it at anyone's rocket. Maybe use the beacon for my own rockets when the video gets out of range.

I was thinking that the thermal camera could be used for liftoff detection. However, the update rate is not great (10 samples/s). Simple video processing might suffice, even just looking at frame-to-frame differences and then applying some threshold algorithm. There are several good open source video processing packages out there.
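Something like this is what I have in mind for the frame-difference trigger, a rough OpenCV sketch where the camera index and both thresholds are just placeholders I'd have to tune at the pad:

```python
import cv2

cap = cv2.VideoCapture(0)                 # detection camera (index is a placeholder)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

PIXEL_DELTA = 40        # per-pixel change that counts as "motion" (tune)
AREA_FRAC   = 0.005     # fraction of the frame that must change to call liftoff (tune)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)
    _, mask = cv2.threshold(diff, PIXEL_DELTA, 255, cv2.THRESH_BINARY)
    changed = cv2.countNonZero(mask) / mask.size
    if changed > AREA_FRAC:
        print("liftoff detected")          # fire the shutter / start the slew profile
        break
    prev = gray
```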

I'll probably need two set-ups eventually. Wide-angle and close-up, and telephoto long range.

I did find another pan/tilt design that may be more suitable: https://www.instructables.com/id/Motorized-panorama-and-timelapse-pan-tilt-sytem/ . I could connect the servo drive right to the camera mount. For the close-up version, I probably don't need a pan axis, just tilt.
 
I am eager to see if this works. This could be a great project.
 

The GPS TM over RF is nice because there is a lower programming threshold. On the other hand, it takes a lot more set-up to work, and you lose pointing angles if the GPS's max velocity is exceeded or you lose TM lock.

The thermal camera, I'm guessing, will not save you much work over implementing with a normal camera. If it is hot out, it's likely the heat from the motor will get washed out by the ambient temps, especially with the low resolution of that IR sensor. If you had one around 320x280 or in that neighborhood, you would stand a better chance of seeing it. The other issue is that when the plume starts, there is a very large ball of hot gas that goes everywhere after the plume reflects off the ground, so if you are trying to track off of it, you will have a hard time.

So I think launch detect is better done with optical means than with IR. You could always trigger it manually if it is a big problem, and just have the camera pan up 5 or so degrees at a fixed rate until it starts the video processing and searches for the rocket.

I don't really like that second mount much; it seems like it would have bad wobble. If you can, you generally also want a counterweight on your mount to balance out the mass of your camera/lens/mount. This makes things easier on your motors and in tuning your PIDs, as the required torque isn't varying as much with angle. You could always gain-schedule with respect to pointing angle, I suppose, to make up for that type of variance.
 
This is another of those projects that I have been brainstorming for many years. As with most of the other grand ideas, I decided that this is beyond my knowledge and skill levels. But here are some thoughts I had that may be useful.

I was considering processing the real-time video output. That has gotten easier than when I first considered the idea, not just because of better computing hardware and software, but because most DSLRs now support 4K video. You can set the camera to record 4K but produce the final video at 1080p (4K UHD is 3840x2160; 1080p is 1920x1080, exactly a quarter of the pixels). That gives you 1/4 of the image to publish and 3/4 to use for motion tracking. Of course, the video won't be as nice as if you recorded in 4K and zoomed in so that the rocket filled the frame, but it certainly will be watchable.

In addition to providing more area to process in real time, the system only has to keep the rocket within the large 4K frame, since you can center the rocket in the 1080p frame in post.

The software should be easier than most feedback-based control software. ("Easier," but not necessarily easy!)

Rather than having the feedback directly control the pointing of the camera, you would have it tune a model of the rocket's flight and use that model to drive the camera's movement. From lift-off to ejection, the model should be simple - the rocket follows an arc. Transitioning from flight to recovery would be a challenge, but doable, I would think. But even if tracking was lost at ejection, it would still be cool.
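To make that concrete, here is a rough Python sketch of the boost-phase model: assume constant acceleration, fit it to the last few observed elevation angles at a known standoff distance, and predict where to point next. The standoff distance and the sample numbers below are made up:

```python
import math
import numpy as np

STANDOFF_FT = 50.0          # horizontal distance from camera to pad (assumed)

def fit_accel(times, elev_deg):
    """Estimate a constant vertical acceleration (ft/s^2) from recent
    (time, elevation-angle) observations of the boosting rocket."""
    t = np.asarray(times)
    h = STANDOFF_FT * np.tan(np.radians(elev_deg))   # altitude implied by each angle
    # h = 0.5 * a * t^2  ->  least-squares fit of a
    return 2.0 * np.sum(h * t**2) / np.sum(t**4)

def predict_elevation(a, t):
    """Pointing angle the model expects at time t after liftoff."""
    h = 0.5 * a * t * t
    return math.degrees(math.atan2(h, STANDOFF_FT))

# Example: three observations early in the boost, then look ahead one frame
obs_t = [0.2, 0.3, 0.4]             # seconds after liftoff
obs_el = [7.3, 16.2, 27.3]          # degrees, made-up numbers
a = fit_accel(obs_t, obs_el)
print(f"fitted accel ~{a:.0f} ft/s^2, "
      f"predicted angle at t=0.5 s: {predict_elevation(a, 0.5):.1f} deg")
```

The feedback then only has to correct the model, not chase the rocket directly.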

Processing the video in real time is not something I have any experience with, so I am not sure how hard that will be from either the hardware or software side. It is something a lot of systems do, though, so references should be available online to help.
 
I haven't thought about how to control the zoom and focus during the flight.

Theoretically, if the tracking is smooth enough, the camera's autofocus should work. But, does your DSLR support autofocus of video? Mine doesn't.

A mirrorless camera might be a better option. In addition to supporting autofocus for video, they are usually lighter and, if you design the system to take stills instead of video, they won't shake when taking a photo.

Also, some mirror-less (or just plain video) cameras are controllable through USB, Bluetooth, or whatever which would offer a way to control the zoom programmatically.
 
I haven't tried to do anything like this, although I have done some embedded programming of motor drive devices, but I have followed a few others that have started projects like this.

If I recall correctly, the biggest issue was getting the feedback loop fast enough to drive the camera angle to follow a fast rocket during the first several hundred feet of flight. The closer you are to the rocket, the faster the deg/s movement of the camera has to be, and of course the faster the rocket, the faster the deg/s has to be too. The other side of that is that the closer you are, the sooner the required deg/s slows down, since you are looking close to vertical much sooner than if you were farther away; but that means your acceleration and deceleration rates will be higher.

Your overall deg/sec rate won't exceed about 60 deg/sec, but your instantaneous drive rate might need to be several hundred deg/sec at different times during the flight. I agree with dhkaiser; I don't think 60 deg/s will be enough.
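If you want to put numbers on it, here is a quick Python check of the tilt rate needed to follow a constant-acceleration boost; the acceleration figure is just an assumption to play with:

```python
import numpy as np

def peak_slew_deg_s(standoff_ft, accel_g, alt_limit_ft=1000.0):
    """Peak tilt rate (deg/s) needed to follow a constant-acceleration boost."""
    a = accel_g * 32.2                      # ft/s^2
    t = np.linspace(0.001, np.sqrt(2 * alt_limit_ft / a), 2000)
    h = 0.5 * a * t**2                      # altitude
    v = a * t                               # vertical speed
    # theta = atan(h/d)  ->  dtheta/dt = v*d / (d^2 + h^2)
    rate = np.degrees(v * standoff_ft / (standoff_ft**2 + h**2))
    return rate.max()

for d in (50, 100, 300):
    print(f"{d:4d} ft standoff, 10 g boost: peak ~{peak_slew_deg_s(d, 10):.0f} deg/s")
```

The required rate falls off quickly as the standoff distance grows, which is the trade-off being described above.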

Since the motor profile will be all about acceleration and deceleration, do you know what current loads your drive circuits will need to handle? They could be surprisingly high with the required accel/decel rates. Your current limits in the drive circuits could limit your accel/decel more than the motor or drive line.

If you use video processing, what kind/size of processor do you need to get a feedback loop that can update as fast as your video frames?

Thanks for the thread, this will be interesting and please post the engineering/calculations behind the design.
 
I plan to use a Raspberry Pi B+. It has a 1.3 GHz ARM processor. Nonetheless, the video processing and control-loop latency will be a challenge.

I plan to use a separate video camera for the tracking and just set the DSLR for stills. I have a Sony A65 mirrorless that does about 5 frames/s.

I figure the very first part of liftoff can use a fixed altitude/time profile, then the video can acquire track as it goes up.

Good point about very high initial accelerations and pulling a lot of current on the stepper. I was going to use a NEMA23 stepper and just direct drive it to the tilt.
 
I finally got back around to working on this. First of all, I wanted to thank everyone for their feedback and comments. This has given me a lot of useful information to mull over.

I have realized that I need a couple of set ups:
- Very close range with a GoPro. I might be able to use the 8x8 thermal array to trigger. There are a number of off-the-shelf lightweight gimbal units with servos.
- 50 ft range with a 28 mm DSLR. Use the lidar for triggering.
- 100 ft range with a telephoto and DSLR. Maybe use video analytics to trigger.

Also, I realized that for just photographing lift-off to 600 feet, I can get away with 1D camera slew, simplifying the mechanical set-up (vs. 2D slew).

I will need a horse of a motor to obtain several hundred degrees/sec. I found some on ServoCity, https://www.servocity.com/153-rpm-spur-gear-motor-w-encoder for $10: 153 RPM, 1,000 oz-in torque, but it can draw up to 20 A. I found a motor controller that can drive 13 A continuous and 30 A peak: https://www.robotshop.com/en/cytron-13a-5-30v-single-dc-motor-controller.html . I can use a continuous-turn pot as an absolute encoder.
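For anyone following along, here is a rough direct-drive sizing check in Python; the camera mass, CG offset, and demanded angular acceleration are all guesses to be refined:

```python
import math

# Rough direct-drive tilt-axis sizing check (all numbers are assumptions to edit)
rpm          = 153          # gear motor no-load speed
cam_mass_kg  = 1.4          # camera + lens
cg_offset_m  = 0.10         # camera CG distance from the tilt axis (no counterweight)
accel_dps2   = 3000.0       # demanded angular acceleration, deg/s^2

top_speed_dps = rpm * 360.0 / 60.0                 # max tilt rate the motor can give
inertia = cam_mass_kg * cg_offset_m**2             # point-mass approximation, kg m^2
tau_accel = inertia * math.radians(accel_dps2)     # torque to accelerate, N m
tau_grav  = cam_mass_kg * 9.81 * cg_offset_m       # worst-case gravity torque, N m
OZ_IN_PER_NM = 141.6                               # N m -> oz-in

print(f"top speed      : {top_speed_dps:.0f} deg/s")
print(f"accel torque   : {tau_accel * OZ_IN_PER_NM:.0f} oz-in")
print(f"gravity torque : {tau_grav * OZ_IN_PER_NM:.0f} oz-in (zero if counterweighted)")
```

With those assumptions, both torques come out well under the motor's 1,000 oz-in rating, and 153 RPM works out to roughly 900 deg/s at the output shaft.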

Initially, I can use canned slew profiles for several classes of rocket weights and thrusts, and then fold in the tracking as a second project phase. Or I can just use a canned profile initially and then switch over to tracking; that would buy some time to acquire the track.

I have the Lidar trigger unit assembled and ready to unit test. At 50 ft, the Lidar has roughly a 1.5 ft square field of view, so I can aim it just above the launch rail, or for very large rockets, I can probably aim it at the intersection of the top of the fin and rocket body.

The Lidar is a LightWare SF-30B: 50 m range, high sample rate, 0.5 degree beam divergence. The processor is a SparkFun RedBoard Turbo (M0, Arduino footprint) with an I2C LCD display. I have a pot to set the range threshold, so when the measured distance is less than the threshold, I trigger. The rocket needs to cut the beam, but the launch rail must not trigger it.
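The trigger logic itself is simple. Here is a rough Python version of it (the real unit runs on the RedBoard under the Arduino IDE; the serial port, baud rate, and one-ASCII-reading-per-line format below are assumptions to check against the SF30 manual):

```python
import serial   # pyserial

METERS_TO_FEET = 3.281
port = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)   # placeholder port/baud

threshold_ft = 30.0        # set from the panel pot in the real unit
triggered = False

while True:
    line = port.readline().decode(errors="ignore").strip()
    if not line:
        continue
    try:
        dist_ft = float(line) * METERS_TO_FEET     # assumes one reading per line, meters
    except ValueError:
        continue                                   # skip garbled readings
    in_trigger = dist_ft < threshold_ft            # rocket (not the rail) has cut the beam
    if in_trigger and not triggered:
        print("TRIGGER")                           # close the shutter-release line here
    triggered = in_trigger
```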


Once I get the trigger solid, I can start working on the camera slew.

 
I proposed something like this a couple of years ago on this site, but I'm too lazy to find the link right now. I also was thinking of using IR for tracking, because (a) it should continue to work after burnout (and all the way down to the ground, assuming the rocket comes down in one piece) and (b) I also wanted to be able to point at any rocket, vs. depending on a beacon. I really thought there were IR sensors better than 8x8 to be had, but honestly I wouldn't bet against an 8x8 sensor being good enough for the purpose here.

I'm excited to see someone try this. I think it has the potential to be really great.
 
Yes, I would like to be able to point this at any rocket, and not depend on a beacon or telemetry feed.

There are also some hobby 64x64 IR sensors at the "low end" worth investigating, in addition to the 8x8 that I found. High-end IR sensors get *very* expensive, e.g. many $100s, even $1000s.

So the main issue is the IR getting washed out over range. Also, sampling rate can be a limitation. Some of the sensors have a mode where, if any pixel goes above a threshold, the sensor raises an interrupt. That could possibly be a means of triggering on liftoff, and it can be lower latency than reading out all the pixels all the time. However, triggering and tracking are two different requirements, with different implications for latency and sensitivity.

One thing I want to do once I get the Lidar trigger going is to collect some thermal and video data.
 
The Lidar is a LightWare SF-30B: 50 m range, high sample rate, 0.5 degree beam divergence.
Wow, pretty pricey item. FWIW, I've had some success with Sharp laser rangefinders, although they have to be right next to the rocket. A mechanical switch of some sort that the rocket could sit on ("squat switch") would also be an alternative.
 
... A mechanical switch of some sort that the rocket could sit on ("squat switch") would also be an alternative.

What about an SST flow switch under the nozzle? As soon as any exhaust comes out, the switch would trigger, probably before an AP motor comes to pressure or the rocket moves.
 
mikec, Handeman: One of my design objectives is stand-off detection of the liftoff. I've tried to deal with mechanical switches, break wires, and laser triggers on fins, but found the set-up to be cumbersome and unreliable, and stringing cables is a royal pain. I just want to point it at any rocket and go.

That said, one could easily modify the code to read an external trigger of some sort and use a mechanical trip instead of the expensive sensor. There is a Garmin Lidar that is about half the cost of the SF-30B. It has wider beam divergence, but it should still work: https://www.sparkfun.com/products/14599 .

Actually, for flexibility, I'll probably just add a 3.5 mm phono jack and code it up as an external trigger option.
 
I am elated at my initial test results at a CMASS launch yesterday. I set it up and got a great shot on the first try.

It is picking up some jitter in the measurements. Wind? I will need to add a filter; it was consequently getting a lot of false positives. Also, at max range it was giving some erroneous values, and I might have a bug in the decoding software that reads the measurements from the Lidar over a serial stream.
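One simple option for the filter: require a few consecutive readings below the threshold, with a short median filter on top to knock down single-sample glitches. Rough Python sketch (the window sizes are guesses to tune against the Lidar's sample rate):

```python
from collections import deque
from statistics import median

N_CONSECUTIVE = 3          # filtered samples that must agree before we trigger (tune)
WINDOW = 5                 # median-filter window (tune)

recent = deque(maxlen=WINDOW)
below_count = 0

def update(dist_ft, threshold_ft):
    """Feed each new Lidar reading in; returns True once a debounced trigger fires."""
    global below_count
    recent.append(dist_ft)
    filtered = median(recent)              # knocks down single-sample glitches
    if filtered < threshold_ft:
        below_count += 1
    else:
        below_count = 0
    return below_count >= N_CONSECUTIVE
```

The cost is a few samples of extra latency, which at the Lidar's update rate should be small compared with the camera's shutter lag.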

Again, the Lidar is pointed above the launch rod. It is set to trigger the camera when the distance reading is less than a threshold value (set by the pot). The max Lidar range is 160 ft, so I set the threshold at about 30 ft. I took a swag at where to aim the camera, accounting for approximately 60 ms of trigger-circuit and camera latency.

The rocket was slow (some drag on the rail), but I got it in two frames (Sony A65, fps=9) with a 28 mm lens. When a trigger occurs, I hold the shutter on for a 1 second burst. The camera is set for manual focus and manual exposure.

The display indicates Lidar distance reading (ft), trigger distance value, and 1 or 0 to indicate if we are in a trigger state or not.

Photos shown, no crop in the vertical!

I broke a wire and ran my battery down, so I only got one test in.

So more work to do, but good progress.

 
One of my design objectives is stand-off detection of the liftoff. I've tried to deal with mechanical switches, break wires, and laser triggers on fins, but found the set-up to be cumbersome and unreliable, and stringing cables is a royal pain. I just want to point it at any rocket and go.
Absolutely, I hear you. For my tests I was using an analog 10-80cm Sharp distance sensor pointed at the side of the rocket, and my Arduino box controlled a wireless shutter release to trigger my camera. So there wasn't a lot of wire running or setup, although the Sharp sensor is a little sensitive to sunlight and rocket color (for some reason I could never get it to work on a fluorescent yellow rocket.) Unfortunately the Sharp analog sensors are pretty slow, so I am now experimenting with a 10cm Sharp digital sensor.

Congrats on your first successful test!
 
Ahh, mikec, great minds think alike.

You might check out this ultrasonic distance sensor: 254 inch range, 20 Hz update rate, though you might get a faster update rate at shorter distances. The update rate is a bit slow, but it is a sensor that will work in full daylight.
https://www.sparkfun.com/products/8502

I'd love to see your circuit and notes on the wireless shutter release. How much latency does that add?
 
I'd love to see your circuit and notes on the wireless shutter release. How much latency does that add?
I'm using an Oppilas wireless remote control https://www.amazon.com/PIXEL-Oppilas-RW-221-L1-Wireless/dp/B004DOUOI2 -- I hacked into the half/full button press pads in the remote and drive them through 2501 optocouplers controlled from the Arduino. The Sharp analog rangefinder takes about 40 milliseconds to output each sample, but through the whole chain into my Panasonic FZ150 I was seeing about 150 ms of latency, most of it probably in the camera itself (I was trying to prefocus but maybe wasn't successful).

Thanks for pointing out that ultrasonic sensor, I may try that if the Sharp digital rangefinders don't work adequately.
 
mikec: For the Panasonic FZ150, you should be getting a 14 ms shutter delay when prefocused (see https://www.imaging-resource.com/PRODS/FZ150/FZ150DAT.HTM ). Nice: 12 fps continuous shooting! If you are getting daylight interference on the analog rangefinder, you might consider a cone-type baffle. Even one a few inches in diameter and a few inches long should screen out a lot of background daylight.

For my wireless link between the trigger and the camera(s), I plan to use a pair of Adafruit Feather M0 boards with RFM69 transceivers. This is in the 915 MHz ISM band and easily has a range of 300 feet. The module has a CPU, transceiver, and enough I/O pins, and I can use the Arduino IDE to program them. Using the RadioHead library, I am getting a 7 ms round-trip time between sending a 1-character message and getting an ack back, so the latency between the trigger sensor and camera would be half that. With < 10 ms in the Lidar, 3.5 ms in the comm link, and 50 ms in my Sony A65, I am aiming for < 70 ms total latency between trigger and first photo.
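As a sanity check on that budget, a few lines of Python to total it up and see how far the rocket has risen before the first frame (the boost acceleration is an assumption):

```python
latency_ms = {
    "lidar sample": 10.0,        # figures from the post above
    "radio one-way": 3.5,
    "camera shutter lag": 50.0,
}
total_s = sum(latency_ms.values()) / 1000.0

accel_g = 15.0                               # assumed average boost acceleration
accel = accel_g * 32.2                       # ft/s^2
travel_ft = 0.5 * accel * total_s**2         # rocket rises this far before frame 1

print(f"total latency: {total_s * 1000:.1f} ms")
print(f"rocket travel during latency at {accel_g:g} g: {travel_ft:.2f} ft")
```

At that acceleration the rocket only climbs about a foot during the latency window, which is why aiming the 28 mm frame a bit above the trigger point works.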

I plan to come up with a generic Tx/Rx node with the same hardware and software, such that it can be used both at the trigger end and at the camera. I will also use it as a monitoring node with a buzzer or something to indicate the trigger, and have a manual override.

I will publish all my designs and code. After I get it working, I plan to investigate some sort of cost reduction by using some other sensor than the expensive LIDAR.

The LIDAR is totally sweet, though. From 25 feet, it has about a 4 inch field of view. I was able to quickly aim it at an HPR right at the top of the fin and body tube. I could move the rocket one inch up and trigger it.

 
This is a great project. I’ve often thought about something like this for very high flights to confirm separation, but the tech gets crazy the higher the rocket goes. It reminds me of this SpaceX video:

https://petapixel.com/2018/02/23/spacex-shoots-amazing-footage-rockets/

It looks like your tech is working just fine, but one idea I had was to use an Android phone, an external telephoto lens, and TensorFlow image classification. Based on where the rocket was in the frame, the phone would send commands to a gimbal or servo to adjust up/down/left/right. My son's robotics team in high school did the same thing last year to have a robot autonomously track a yellow ball, and it was surprisingly easy. That would probably work if all rockets were white and/or you could reliably track bright exhaust. You could use the phone to also capture the photos/video, or have it ride piggyback on a DSLR just for tracking.
 
It looks like your tech is working just fine, but one idea I had was to use an Android phone, an external telephoto lens, and TensorFlow image classification.

Using a cell phone as the tracker is a cool idea. For some reason, I never thought about an app being able to process live video from the camera, but I just realized that it is the way "augmented reality" apps work.
 