bandman444
Well-Known Member
So this is a project that I have had bouncing around in my head for a while, and now, after completing another large Arduino-based electronics project, I think I might know enough to start making progress toward it.
But before I start, I need to know a little more about what I don't know so that I can develop some baby steps toward the final objective:
An Optical Rocket Tracking Platform
Here are my goals for the project:
- Be able to track a rocket's flight optically from launch until landing
- Be compatible with various optical viewing systems (DSLRs, Camcorders, Telescopes w/CCDs, other devices with a video output)
- Be open source so that others can edit, modify, and improve the platform
- Allow for a reasonably high level of zoom capability
- Capable of tracking any rocket to any altitude such that atmospheric conditions become the limiting case
- Be able to record and display the video during the flight (i.e., a monitor showing the live camera view in addition to recording it)
Stretch goals
- Smart enough for multistage tracking (i.e., two platforms, one tracking each stage)
- Require minimal to no extra hardware onboard the vehicle being tracked
- Allow for easy incorporation into a "live" webcam view of a launch with multiple launches going on (program the launch pad directions ahead of time, then a user inputs the pad being used and the system locks on for that flight until the user selects another pad to target)
Originally I had thought of incorporating a BigRedBee GPS to send APRS packets to the ground station; with a second GPS at the ground station, a standardized coordinate system could be used to find an altitude and azimuth angle relative to a given vector. While probably the easiest to implement, the slow downlink would leave large gaps in tracking. My plan was to compute the average 3D velocity between the last two fixes, keep moving the platform in that direction until the next data point is received, then re-evaluate the guessed location and iterate. That strategy breaks down very quickly, though. When is the video truly the most interesting? During launch. Right when GPS data is the least helpful and the most motion is occurring.
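The dead-reckoning idea above can be sketched in a few lines. This is a minimal illustration, assuming the GPS fixes have already been converted into a local east/north/up frame centered on the ground station; the function names and the (t, x, y, z) fix layout are my own, not anything from BigRedBee or APRS:

```python
import math

def extrapolate_position(fix_prev, fix_curr, t_now):
    """Linearly extrapolate position between slow GPS fixes.

    Each fix is (t, x, y, z): seconds, then meters east/north/up of the
    ground station. Velocity is estimated from the last two fixes and
    assumed constant until the next packet arrives.
    """
    t0, x0, y0, z0 = fix_prev
    t1, x1, y1, z1 = fix_curr
    dt = t1 - t0
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    tau = t_now - t1  # time since the most recent fix
    return (x1 + vx * tau, y1 + vy * tau, z1 + vz * tau)

def to_az_alt(x, y, z):
    """Convert a local east/north/up offset into azimuth/altitude in degrees."""
    az = math.degrees(math.atan2(x, y)) % 360.0          # degrees east of north
    alt = math.degrees(math.atan2(z, math.hypot(x, y)))  # degrees above horizon
    return az, alt
```

Between packets, the platform would slew along the extrapolated az/alt; each new fix resets the extrapolation. This is exactly where the scheme struggles at launch: acceleration violates the constant-velocity assumption.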
Next I thought: what about more data back from the rocket? A special payload could be flown that gives the ground station a faster reaction time to changes in altitude. Filtered altitude data could be sent back really quickly, at a rate of maybe 20 Hz or more, and used in conjunction with the GPS data to better track the rocket.
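One common way to combine a fast altitude stream with occasional GPS fixes is a basic complementary filter: the fast sensor supplies short-term motion, and each GPS fix slowly corrects the fast sensor's bias. A hypothetical sketch (the class name, `alpha` gain, and update interface are all my invention):

```python
class AltitudeFuser:
    """Blend a fast (e.g. 20 Hz) filtered-altitude stream with slow GPS fixes.

    The fast stream drives the estimate between fixes; each GPS fix nudges
    an estimated bias so the fused altitude doesn't drift away from GPS.
    """

    def __init__(self, alpha=0.05):
        self.alpha = alpha   # how strongly each GPS fix pulls the estimate
        self.offset = 0.0    # estimated bias of the fast sensor, meters

    def update_fast(self, fast_alt):
        """Called at the high rate; returns the current fused altitude."""
        return fast_alt - self.offset

    def update_gps(self, fast_alt, gps_alt):
        """Called whenever a GPS fix arrives; corrects the bias estimate."""
        self.offset += self.alpha * ((fast_alt - self.offset) - gps_alt)
        return fast_alt - self.offset
```

With a small `alpha` the estimate stays smooth at 20 Hz while converging toward GPS over several fixes.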
My latest idea is the one I think would be the most robust, and the method I would most like to see work, but it is also the most unknown to me: actual analysis of the optical data. It is relatively easy to track a colored ball in a room using just the video feed from a camera. I think this method could work for a rocket in open sky (look for a dark object on a uniform blue sky), but I think the launch-tower-to-sky transition will be tricky, particularly when the smoke trail could confuse the system.
In the next couple of months I would like to build and test the platform that will move the camera, or at least a prototype. I am far more comfortable with the mechanical design and implementation than I am with the software nuances. Once I run the numbers and figure out what speed and resolution I will need for the rotational axes, I could get that built pretty quickly. Once I have it operating as intended, I hope it would be as easy as getting the software to output "go 5 steps up, 4 steps left," "now 7 steps up, 5 steps left" from the real-time video feed.
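For running those numbers, the conversion from a pixel error (target offset from frame center) to step commands is a one-liner once the camera's field of view and the mount's steps-per-degree are known. A sketch under a simple pinhole-camera assumption (all names and sign conventions are mine, and the signs will depend on how the mount is wired):

```python
def pixel_error_to_steps(err_x_px, err_y_px, frame_w_px, fov_deg, steps_per_deg):
    """Convert the target's pixel offset from frame center into stepper commands.

    Assumes a pinhole model where degrees-per-pixel = fov_deg / frame_w_px,
    and that both axes share the same pixel pitch.
    """
    deg_per_px = fov_deg / frame_w_px
    pan_steps = round(err_x_px * deg_per_px * steps_per_deg)
    tilt_steps = round(err_y_px * deg_per_px * steps_per_deg)
    return pan_steps, tilt_steps
```

For example, with a 10-degree field of view across 1000 pixels and 200 steps per degree, a 50-pixel error works out to 100 steps; this also shows why high zoom tightens the speed and resolution requirements on the axes.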
Let me know what you think. If you have suggestions on how I can best proceed, let me know. I am happy to keep learning as much as I can from this project.