
An Optical Rocket Tracking Platform Project

The Rocketry Forum


bandman444

Well-Known Member
Joined
Feb 6, 2010
Messages
2,200
Reaction score
9
So this is a project that has been bouncing around in my head for a while, and now, after completing another large Arduino-based electronics project, I think I might know enough to start making progress toward it.

But before I start, I need to know a little more about what I don't know so that I can develop some baby steps toward the final objective:


An Optical Rocket Tracking Platform




Here are my goals for the project:


  • Be able to track a rocket's flight optically from launch until landing
  • Be compatible with various optical viewing systems (DSLRs, Camcorders, Telescopes w/CCDs, other devices with a video output)
  • Be open source so that others can edit, modify, and improve the platform
  • Allow for a reasonably high level of zoom capability
  • Capable of tracking any rocket to any altitude such that atmospheric conditions become the limiting case
  • Be able to record and display the video during the flight (i.e., have a monitor showing the live camera view in addition to recording it)

    Stretch goals

  • Smart enough for multistage tracking (i.e., two platforms running, one tracking each stage)
  • Require minimal to no extra hardware onboard the vehicle being tracked
  • Allow for easy incorporation into a "live" webcam view of a launch with multiple launches going on. (Program launch pad directions, then a user inputs the launch pad being used and it locks on for that flight until the user selects another pad to target)


Originally I had thought of using a BigRedBee GPS to send APRS packets to the ground station. With a GPS at the ground station as well, a standardized coordinate system could be used to find altitude and azimuth angles relative to a given vector. While probably the easiest to implement, the slow downlink would leave large gaps in tracking. My plan for that was to compute the average 3-D velocity between the latest data point and the one before it, keep moving the platform in that direction until the next data point is received, then re-evaluate the guessed location and iterate. This strategy breaks down very quickly, though: when is the video truly the most interesting? During launch. Right when GPS data is the least helpful and the most motion is occurring.
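The extrapolate-and-correct idea can be sketched in a few lines. This is just an illustration of the math, not a working tracker; the function names, coordinate convention, and numbers are all made up for the example:

```python
# Dead reckoning between slow APRS/GPS fixes: estimate a velocity from the
# last two fixes, then coast along it until the next fix arrives.

def estimate_velocity(p0, t0, p1, t1):
    """Average 3-D velocity (m/s) between two fixes (x, y, z in meters)."""
    dt = t1 - t0
    return tuple((b - a) / dt for a, b in zip(p0, p1))

def extrapolate(p1, t1, vel, t_now):
    """Predicted position at t_now, coasting from the latest fix."""
    dt = t_now - t1
    return tuple(p + v * dt for p, v in zip(p1, vel))

# Example: two fixes 1 s apart, rocket climbing at 300 m/s
p0, t0 = (0.0, 0.0, 100.0), 10.0
p1, t1 = (5.0, 0.0, 400.0), 11.0
vel = estimate_velocity(p0, t0, p1, t1)   # (5.0, 0.0, 300.0)
guess = extrapolate(p1, t1, vel, 11.5)    # (7.5, 0.0, 550.0)
```

When the next fix arrives, the guess is discarded and the velocity is re-estimated from the two newest points, which is exactly where the scheme falls apart during boost: the velocity changes far faster than the fixes arrive.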

Next I thought: what about more data back from the rocket? A special payload could be flown that gives the ground station a faster reaction time to changes in altitude. Filtered altitude data could be sent down quickly, at perhaps 20 Hz or more, and used in conjunction with the GPS data to better track the rocket.
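One simple way to blend a fast altitude stream with slow GPS fixes is a complementary filter. To be clear, this is my suggested sketch, not a design from the thread; the class name and weights are placeholders:

```python
# Complementary filter: the fast (~20 Hz) filtered baro stream tracks
# short-term motion, while occasional GPS fixes correct long-term drift.

class AltitudeFuser:
    def __init__(self, alpha=0.9):
        self.alpha = alpha   # weight kept on the fast (baro-driven) estimate
        self.alt = 0.0       # fused altitude estimate, meters

    def update_baro(self, baro_delta):
        """Called at ~20 Hz with the change in filtered baro altitude."""
        self.alt += baro_delta

    def update_gps(self, gps_alt):
        """Called at ~1 Hz; pulls the estimate toward the GPS fix."""
        self.alt = self.alpha * self.alt + (1 - self.alpha) * gps_alt

f = AltitudeFuser(alpha=0.9)
f.update_baro(100.0)   # baro says we've climbed 100 m since the last update
f.update_gps(110.0)    # GPS fix nudges the estimate toward 110 m
# f.alt is now 101.0
```

The appeal is that the platform can react at the baro rate during boost while the GPS keeps the estimate from drifting over a long flight.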

My latest idea is the one I think would be the most robust, and the method I would most like to see work, but it is also the most unknown to me: actual analysis of the optical data. It is relatively easy to track a colored ball in a room using just the video feed from a camera, and I think the same method could work for a rocket against open sky (look for a black object on a uniform blue background). The tricky part will be the launch-tower-to-sky transition, particularly since the smoke trail could confuse the system.
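The "dark object on uniform bright sky" case reduces to thresholding and finding a centroid. A real implementation would run this on live frames with OpenCV (e.g. `cv2.inRange` plus `cv2.moments`); the toy version below works on a tiny grayscale grid just to show the idea:

```python
# Threshold-and-centroid tracking sketch. Frame is a list of rows of
# grayscale pixel values (0 = black, 255 = white).

def find_target(frame, threshold=128):
    """Centroid (row, col) of all pixels darker than threshold, or None."""
    row_sum = col_sum = count = 0
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            if px < threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None   # lost the target (e.g. cloud, smoke, off-frame)
    return (row_sum / count, col_sum / count)

sky = [[255] * 5 for _ in range(5)]
sky[1][3] = 0                 # the "rocket": one dark pixel
center = find_target(sky)     # (1.0, 3.0)
```

The smoke-trail problem shows up here directly: the exhaust plume is also dark against sky, so the centroid gets dragged down the trail unless the plume is masked or the search is restricted to a window around the last known position.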


In the next couple of months I would like to build and test the platform that will move the camera, or at least a prototype. I am far more comfortable with the mechanical design and implementation than with the software nuances. Once I run the numbers and figure out what speed and resolution I need for the rotational axes, I can get that built pretty quickly. Once it is operating as intended, I hope it will be as simple as getting the software to output commands like "go 5 steps up, 4 steps left", "now 7 steps up, 5 steps left" from the real-time video feed.
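That "N steps up, M steps left" interface is just a mapping from pixel error to stepper counts. A hedged sketch, assuming you know the camera's degrees-per-pixel and each axis's degrees-per-step (all numbers below are invented for the example):

```python
# Convert the target's pixel offset from frame center into stepper commands.

def steps_to_center(centroid, frame_size, deg_per_px, deg_per_step):
    """Return (pan_steps, tilt_steps) to drive the target to frame center.

    centroid:   (row, col) of the tracked target in pixels
    frame_size: (height, width) in pixels
    """
    cy, cx = centroid
    err_x = cx - frame_size[1] / 2    # positive: target right of center
    err_y = frame_size[0] / 2 - cy    # positive: target above center
    pan = round(err_x * deg_per_px / deg_per_step)
    tilt = round(err_y * deg_per_px / deg_per_step)
    return pan, tilt

# 640x480 frame, 0.05 deg/pixel, 0.1 deg/step (assumed numbers)
cmd = steps_to_center((200, 340), (480, 640), 0.05, 0.1)
# cmd -> (10, 20): 10 steps right, 20 steps up
```

Note that deg_per_px changes with zoom, so a zoom lens means this gain has to be re-calibrated (or read back from the lens) on the fly.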


Let me know what you think. If you have some suggestions on how I can best proceed, let me know. I am happy to continue to learn as much as I can from this project.
 

neil_w

Marginally Stable
TRF Supporter
Joined
Jul 14, 2015
Messages
11,023
Reaction score
3,872
Location
Northern NJ
I proposed something like this a while back (you can find it if you search). I think it sounds like a great project. A very fast gimbal is required.

After thinking about it quite a bit, I became concerned that there were too many difficulties with a straightforward optical approach, and convinced myself that infrared would work better. The motor should put out a very strong heat signature all the way until the rocket lands, and should even be findable at pretty high altitudes.

The infrared imager should not need to be high resolution; it just needs to be fast. I wouldn't be surprised if it could work with as few as 64 pixels or something like that.
 

bandman444

Well-Known Member
Very interesting. I have not considered IR before. I will look into it.

Great thread; it looks like it covers a lot of the concerns I have thought about. The big breakthrough there is that when you are far away and zoomed in, the required rate of rotation is not that high. Of course, this project is geared toward far-away, high-speed flights, so you will always be thousands of feet from the launch.
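That distance effect is easy to quantify. For a rocket moving perpendicular to the line of sight, the required slew rate is roughly v/d radians per second; the speeds and ranges below are assumed numbers, just to show the scale:

```python
import math

# Back-of-envelope slew rate: a target moving v m/s perpendicular to the
# line of sight at range d meters needs about v/d rad/s of rotation.

def slew_rate_deg(v_mps, range_m):
    return math.degrees(v_mps / range_m)

# ~Mach 1 (340 m/s) crossing at 1,000 ft (~305 m) vs 3,000 ft (~915 m)
near = slew_rate_deg(340, 305)   # ~64 deg/s
far = slew_rate_deg(340, 915)    # ~21 deg/s
```

So tripling the standoff distance cuts the worst-case gimbal speed by a factor of three, which is why being "1,000s of feet from the launch" actually helps the mechanics.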
 

neil_w

Marginally Stable
TRF Supporter
I was just poking around the FLIR website... they do have APIs for their IR cameras, but the frame rate is 9 Hz. Is that fast enough to track? I dunno, seems doubtful, but if you're far enough away and the camera is sufficiently wide-angle, maybe it could work. Would need to run some numbers.

https://developer.flir.com/
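A quick check of those numbers (speed, range, and field of view below are all assumed placeholders, not FLIR specs):

```python
import math

# How many degrees does the target move between frames at a given frame
# rate, and what fraction of a wide field of view is that?

def deg_per_frame(v_mps, range_m, fps):
    return math.degrees(v_mps / range_m) / fps

# 340 m/s crossing at 1 km range, 9 Hz camera, assumed 50-degree FOV
move = deg_per_frame(340, 1000, 9)   # ~2.2 deg of travel per frame
fov_fraction = move / 50             # ~4% of the frame per update
```

At a few percent of the frame per update, the target stays in view between frames, so 9 Hz looks marginal but maybe workable at long range with a wide lens; zoomed in tight, it clearly isn't.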
 