Featherweight Tracker iPhone App (iFIP)

The Rocketry Forum


Adrian A

https://youtu.be/R3vGVNKkoVk

Kevin is still on his way back from Black Rock without a good internet connection, so I'm posting this for him.

This is a screen recording from the iPhone I used to track Kevin's minimum diameter M rocket, which he flew at BALLS on Sunday. I only got Kevin a full working set of hardware a week before BALLS, so all the focus has been on functionality rather than aesthetics. Still, it's a functional app with a cool feature that lets you point your phone toward your rocket. The arrow at the top shows the azimuth (left-right) direction, and the bubble level on the left shows you how much to tilt the phone to make it line up with where the rocket is in the sky. It's fun and intuitive to use on a real flight in person, but in this screen capture video it looks pretty random because you can't see the phone motions that the arrow and bubble are responding to.
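For anyone curious about the math behind that cue, here is a rough sketch in Swift of how the azimuth and elevation to the rocket can be computed from the two GPS fixes. This is only an illustration, not the app's actual code, and the names are made up:

import CoreLocation

// Illustration only: given the phone's and the rocket's GPS fixes, compute
// the compass bearing and elevation angle needed to point at the rocket.
struct PointingSolution {
    let azimuthDegrees: Double    // compass bearing from phone to rocket
    let elevationDegrees: Double  // angle above the horizon
}

func pointingSolution(phone: CLLocation, rocket: CLLocation) -> PointingSolution {
    let lat1 = phone.coordinate.latitude * .pi / 180
    let lat2 = rocket.coordinate.latitude * .pi / 180
    let dLon = (rocket.coordinate.longitude - phone.coordinate.longitude) * .pi / 180

    // Initial great-circle bearing from the phone to the rocket, 0-360 degrees
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let azimuth = (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)

    // Elevation angle from the ground distance and the altitude difference
    let groundDistance = phone.distance(from: rocket)     // meters, ignores altitude
    let altitudeDelta = rocket.altitude - phone.altitude  // meters
    let elevation = atan2(altitudeDelta, groundDistance) * 180 / .pi

    return PointingSolution(azimuthDegrees: azimuth, elevationDegrees: elevation)
}

The arrow then just has to show the difference between that azimuth and the magnetometer heading, and the bubble the difference between that elevation and the phone's tilt from the motion sensors.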

There are some data dropouts and data transmission errors that came from having the ground station in my pocket, and sometimes having my body between the ground station and the rocket. The first few points during the liftoff are weird in the recording because the GPS was trying to figure out how it just went from sitting still to a vertical Mach 1.3. But by the time I took my eyes off the rocket liftoff and started looking at the phone, the GPS had it figured out.

After reviewing the data for this flight that was logged in the tracker in the rocket, I realized that a couple of the odd data points came from a new feature I was trying to add to the firmware that morning, just before the flight. More on that later. Anyway, consider this a work in progress.
 
A couple of comments about the app:

1- It's a little rough now because this was my very first iPhone app. Seeing as my first iPhone app also had to include Bluetooth communications, GPS, accelerometer, magnetometer, and some graphics calculations/drawing, I'm pretty happy that we had it functional for BALLS!

2- For a good example of "pointing to the rocket", you can go to time 0:48 to 0:52 in the video, where Adrian gets the phone pointed at the rocket at 15K ft and ~3,000 feet downrange. At that point, the rocket is at 79 degrees elevation and the phone is at 80 degrees of 'tilt'. The rocket is also at 164 degrees heading, and Adrian has the phone pointed at 163 degrees according to the magnetometer.

3- As Adrian noted, sometimes the arrows jump around due to a bad packet. We have ways to address those, so that should go away, but you'll sometimes see the elevation drop to near zero (as if the rocket was suddenly on the ground) and then come back up again. At the 1:21 mark there is a bad packet where the rocket suddenly jumps ~23 million feet away before it comes back again. Again, we will address those oddities (a simple filtering sketch appears a bit further down)… ;-)

4- Some comments on the layout:

- The black circle represents the elevation of the rocket relative to the phone. The red circle represents the tilt of the phone. If the red circle is inside the black circle (within 5 degrees), then the phone is pointed at the rocket elevation-wise and the circle turns green.
- The red/green arrow at the top represents the left/right direction to the rocket. If you are pointing within 5 degrees of the rocket, the arrow is green.
- If both of the above are green, you are pointed within 5 degrees of the rocket in both elevation and heading (a quick sketch of that check follows point 5 below).
- Lat/Lon are for the rocket. Alt is the altitude of the rocket above what the phone thinks its own altitude is; the phone isn't high accuracy, so sometimes that delta jumps around when the rocket is on the ground. Dist is the distance over ground to the rocket. I'll add line-of-sight distance if that is useful. Elev is the elevation angle to the rocket and Direction is the heading in degrees to the rocket. VertV and HorzV are vertical and horizontal velocity in fps - we'll probably add metric as an option. Headed is the direction the rocket is headed according to the GPS. RSSI and SNR have to do with radio / signal quality. The other numbers at the bottom are some shrapnel left over from development, but they include Adrian's heading and speed (262 / 0.0) as well as the phone's elevation and coordinates. The 3 represents a 3D lock for the GPS, and the time shows whether it is still receiving packets. The little red arrow at the bottom is total shrapnel - it was my first arrow drawn in iOS, just to convince myself I could do it ;-).

5- I may add a circular compass but in reality it takes up a lot of screen space and you are either walking to the rocket (so you care about +/- 45 degrees at the top of the compass) or you aren’t walking to the rocket (and don’t care about the compass…).
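To make point 4 a bit more concrete, here's a rough sketch of the kind of checks behind the green cues and the Dist field. The names are illustrative only, not the app's real code, and the line-of-sight part is the "maybe later" field, not something in the app today:

import CoreLocation

// Sketch only: the "within 5 degrees" test behind the green arrow/circle cues.
// Angles wrap around, so 359 degrees vs. 1 degree should count as 2 degrees apart.
func isAligned(targetDegrees: Double, actualDegrees: Double, tolerance: Double = 5.0) -> Bool {
    let diff = (targetDegrees - actualDegrees + 540).truncatingRemainder(dividingBy: 360) - 180
    return abs(diff) <= tolerance
}

// "Dist" as distance over the ground, plus the line-of-sight slant range that
// could be added later, both from the same two GPS fixes.
func distances(phone: CLLocation, rocket: CLLocation) -> (ground: Double, lineOfSight: Double) {
    let ground = phone.distance(from: rocket)              // meters over the ground
    let altitudeDelta = rocket.altitude - phone.altitude   // meters
    return (ground, (ground * ground + altitudeDelta * altitudeDelta).squareRoot())
}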

Regardless, I’ll be cleaning up the display and maybe breaking it into different screens. Nobody’s brain processes Lat/Lon on the fly so they likely don’t even need to be on that screen. Likewise after landing, you don’t need the rocket VertV, HorzV or Heading.
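Back to the bad packets from point 3 for a moment: purely as a sketch (not necessarily how we will end up handling it), one simple screen is to reject any fix that implies an impossible jump from the last good one. The names and the speed ceiling here are placeholders:

import CoreLocation

// Sketch only: drop a GPS fix if it implies an absurd speed relative to the
// last accepted fix (e.g. the rocket "teleporting" millions of feet away).
// The 2,000 m/s ceiling is an arbitrary illustration, not a tuned value.
final class PacketSanityFilter {
    private var lastGood: CLLocation?
    private let maxSpeed = 2_000.0   // meters per second

    func accept(_ fix: CLLocation) -> Bool {
        guard let previous = lastGood else {
            lastGood = fix           // first fix: nothing to compare against yet
            return true
        }
        let dt = fix.timestamp.timeIntervalSince(previous.timestamp)
        guard dt > 0 else { return false }   // stale or out-of-order packet

        let impliedSpeed = previous.distance(from: fix) / dt
        if impliedSpeed <= maxSpeed {
            lastGood = fix
            return true
        }
        return false                 // drop the outlier, keep the last good fix
    }
}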

Trivia - As Adrian noted, I didn't have any complete hardware until about a week before BALLS…. To make progress prior to that, I took a KML file from a TeleMetrum, read it with a C program, converted the information into the format Adrian said he was going to give me, and then fed that over UART to the prototype development board for our Bluetooth module as if it came from Adrian/GPS…. I then added extra Bluetooth commands to 'command' the phone to think it was at a location near the launch site in the KML file. A lot of this was developed based on a KML flight file from the Kloudbusters site in Kansas…. Thanks to Todd Kerns of the Albuquerque Rocket Society for finding me a KML file... Later, when we save the received data on the phone, I will likely add a way for you to play it back as if you were at the launch site again - and maybe share it via a web site like we do with Raven data files…

Now back to sleep as I need to drive the rest of the way home tomorrow…. And after 500 miles, my trailer is still dropping chunks of Black Rock dust anywhere I stop…!?

/kjs
 
Very cool stuff, neat to see it in action.

As an option you might consider aural feedback for pointing. I worked on a project where I used a low or high tone to indicate the platform needed to be raised or lowered, and a low/high warble on the tone to indicate too far left or right. This provided real-time feedback without visually distracting the user.

With a tone, the user could hold the iPhone at arm's length in front of them and it would help guide them to point at the rocket. Otherwise you have to be watching the iPhone screen, which makes it a lot harder to pick up the rocket visually.

Just something to think about.


Tony
 
Very cool stuff, neat to see it in action.

As an option you might consider aural feedback for pointing. I worked on a project where I used a low or high tone to indicate the platform needed to be raised or lowered, and a low/high warble on the tone to indicate too far left or right. This provided real-time feedback without visually distracting the user.

Tony

Tony, I was thinking of some type of audio feedback - or vibration. I like your tones concept so will look into that. But yes, you want to be looking at the point in space where you think the rocket should be and not looking at the phone. For walking to the rocket, I figured some type of audio feedback to keep you within +/- 5 degrees of the destination... Thank you for the feedback!
 
Any plans on porting this over to Android? I have no plans on switching over to an iPhone.

Yes, I have switched back and forth between Android and iOS, so I plan to support them both. My goal is to update / improve / finalize the iOS interface and then try to make an Android version that is as close as possible to the same thing. I want to avoid starting the Android version and then having to update both of them to a new format...

Thank you for the feedback!
 
Tony, I was thinking of some type of audio feedback - or vibration. I like your tones concept so will look into that. But yes, you want to be looking at the point in space where you think the rocket should be and not looking at the phone. For walking to the rocket, I figured some type of audio feedback to keep you within +/- 5 degrees of the destination... Thank you for the feedback!
Friendly neighborhood dev here...watch out for feature creep! ;) Do what you want, but simple, intuitive & functional is a great way to ship apps.
 
Friendly neighborhood dev here...watch out for feature creep! ;) Do what you want, but simple, intuitive & functional is a great way to ship apps.
As a former developer I agree. But if Kevin already has the math in place to move the graphics in relation to the required attitude, then converting that to modulate an audio tone is *fairly* trivial.

As I mentioned I developed a product that used tones to tell a user where to point. I can't think of a simpler, more intuitive way to do the same thing with a rocket tracker app. The functionality it would bring to the user would mimic an RDF tracker but with a much higher degree of accuracy.

And clearly this is not Kevin's first rodeo!


Tony

ps: comments motivated by a selfish desire to have such a product in my hot little hands!
 
GrouchDuke and manixFan are both correct... "deliver quickly, avoid feature creep, implement 'trivial/useful' features, improve from there, deploy updates automatically / frequently" (my ad-lib rewording)...

If we had production hardware ready, I would deploy what I have, use the iPhone's (and Android's) features to update releases, and use video emails / the rocketry forum to educate on the changes... I was actually able to update the iOS build for the iOS App Store twice from the Black Rock Playa for Adrian to install.... (I love Verizon..! Adrian with AT&T had no service - I had to provide a hotspot for him to use ... ;-) ). But I think we have maybe one hardware rev left (I have to talk to Adrian again) - and I'd like to clean up the iOS 'screens' before I try to replicate them on Android. My goal is to follow the path of "upgrade iOS features, release, user validation, fixes, replicate to Android, user validation, fixes, repeat"... If Java supports BTLE, then I might consider a laptop-based ground station as well, but I want the phones working first...

ps- in my day job, in the mid 1990s, I was told I couldn't deploy a Windows app (it had to be web) because I couldn't keep the Windows app up to date for everyone... so my first 'F-That - must do' was an 'auto-updater', so that when they started the app it would automatically update their installation from my central 'master' copy... I deployed it in 1997 and have used it for 20 years with very minor changes ever since... (originally used HTTP, now my own custom data service)! Interestingly enough, I could release Production, Beta, Pilot, Prototype and Developer builds and use ACLs (Access Control Lists) to control who could see and install the different builds... iOS and Android stores provide similar features now (iOS TestFlight for the team, then beta testers, then production) - so I don't have to reinvent it.... [some young engineers ask "why didn't you just use XYZ...?" (for a number of things) ... my reply is "it didn't exist then..." ;-) ]

pps- we also want to investigate 'down the wire' updates over Bluetooth... we expect firmware updates early on as well, and we don't want to include extra complexity if we can simply update the tracker/ground station via the phone...
 
As an option you might consider aural feedback for pointing. I worked on a project where I used a low or high tone to indicate the platform needed to be raised or lowered, and a low/high warble on the tone to indicate too far left or right. This provided real-time feedback without visually distracting the user.

Tony - as a followup option to this, is there any reason not to use straight voice? If you are outside a range of, say, +/- 5 degrees, the phone simply says "left 12", "down 5", etc. I figured you'd be the one to ask since you have experience with the 'tone' methodology...
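To make the idea concrete, this is roughly what I'm picturing, using the built-in iOS speech synthesizer. The names and sign conventions are made up for the sketch (positive meaning the rocket is to the right of / above where the phone is pointed), not the app's actual code:

import AVFoundation

// Sketch of the "straight voice" option; illustrative only.
let synthesizer = AVSpeechSynthesizer()

func speakCorrection(headingError: Double, elevationError: Double, tolerance: Double = 5.0) {
    var parts: [String] = []
    if abs(headingError) > tolerance {
        parts.append("\(headingError > 0 ? "right" : "left") \(Int(abs(headingError).rounded()))")
    }
    if abs(elevationError) > tolerance {
        parts.append("\(elevationError > 0 ? "up" : "down") \(Int(abs(elevationError).rounded()))")
    }
    // Say nothing if already on target, and don't talk over the previous callout
    guard !parts.isEmpty, !synthesizer.isSpeaking else { return }
    synthesizer.speak(AVSpeechUtterance(string: parts.joined(separator: ", ")))
}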

thanks!
 
Tony - as a followup option to this, is there any reason not to use straight voice? If you are outside a range of, say, +/- 5 degrees, the phone simply says "left 12", "down 5", etc. I figured you'd be the one to ask since you have experience with the 'tone' methodology...

thanks!

That would be awesome.
 
Tony - as a followup option to this, is there any reason not to use straight voice? If you are outside a range of, say, +/- 5 degrees, the phone simply says "left 12", "down 5", etc. I figured you'd be the one to ask since you have experience with the 'tone' methodology...

thanks!
The tone is much faster than voice and very dynamic. You immediately know if you overshoot the angle or haven't moved enough. The dynamic tone would help the user get a feel for the rate of descent or drift due to wind.

Hold your arm out high and slowly lower it. Now imagine a tone that varied if you were too high or too low. Very quickly you learn what rate you need to lower your arm to maintain the neutral tone.

A tone can also encode both vertical and horizontal position at the same time, with pitch/volume indicating attitude and a high/low warble indicating heading. That's very hard and slow to do with voice. If the voice says 'left 12 and down 5' you don't get any feedback until the next voice output tells you whether you have moved the correct amount or not. It's like skating to where the puck has been instead of where it's going to go. The tone feedback 'trains' you to follow the descent/drift to keep the tone steady. It's current and continuous feedback vs. historical and discrete.
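Roughly, the mapping I'm describing could look like this on the phone - just a sketch to show the idea, with placeholder constants and names, not anything from the actual app:

import AVFoundation

// Sketch of the dual-axis tone: pitch tracks the elevation error and a warble
// (amplitude wobble) speeds up with the heading error.
final class GuidanceTone {
    private let engine = AVAudioEngine()
    private var tonePhase = 0.0
    private var warblePhase = 0.0

    // Degrees of pointing error, updated by the tracking code.
    // (A real implementation would synchronize access with the render thread.)
    var elevationError = 0.0
    var headingError = 0.0

    func start() throws {
        let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
        let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)

        let source = AVAudioSourceNode { [weak self] (_, _, frameCount, audioBufferList) -> OSStatus in
            guard let self = self else { return noErr }
            let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
            for frame in 0..<Int(frameCount) {
                // Pitch: 600 Hz on target, shifted by elevation error, clamped to an audible band
                let frequency = min(max(600 + 40 * self.elevationError, 200), 2_000)
                // Warble rate: steady tone when the heading is on target
                let warbleRate = min(abs(self.headingError), 20.0)

                self.tonePhase += 2 * .pi * frequency / sampleRate
                self.warblePhase += 2 * .pi * warbleRate / sampleRate
                let amplitude = 0.25 * (1 + 0.5 * sin(self.warblePhase))
                let sample = Float(amplitude * sin(self.tonePhase))

                for buffer in buffers {
                    buffer.mData?.assumingMemoryBound(to: Float.self)[frame] = sample
                }
            }
            return noErr
        }
        engine.attach(source)
        engine.connect(source, to: engine.mainMixerNode, format: format)
        try engine.start()
    }
}

Whether a frequency shift or a volume change feels more natural for the vertical axis is exactly the kind of thing those real-world tests would sort out.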

Of course it should be pretty easy to try it either way. A few real world tests will indicate if my method translates to a rocket tracker or not.


Tony
 
The tone is much faster than voice and very dynamic.
Tony

Tony - I was thinking about this later, 'offline' (after I posted), and I think you are/will be right - the tone is easier to 'train' more users with than degrees of up/down/left/right via voice commands... thank you for confirming your thoughts as well!
 
Tony - I was thinking about this later, 'offline' (after I posted), and I think you are/will be right - the tone is easier to 'train' more users with than degrees of up/down/left/right via voice commands... thank you for confirming your thoughts as well!

Don't forget that a directional antenna isn't necessary for this system, even for a 100,000+ foot flight, so there is nothing to steer except your eyes. Personally, I would rather have the altitude and speed called out, especially for the early part of the flight when my eyes are still following the rocket anyway. If the rocket went out of sight but I'd like to try to see it coming down on the main chute, I would watch the phone for the main chute deployment altitude and then use the displayed arrows to point the phone at the rocket. But I could see the tones being nice when the rocket is on the ground, so that you don't need to look back and forth between the phone, the ground where you're walking, and wherever you're trying to spot the rocket. So for me, limiting the tones to left/right would be just fine.
 
....Personally, I would rather have the altitude and speed called out, especially for the early part of the flight when my eyes are still following the rocket anyway....
I agree that during ascent it would make a lot more sense to have things called out verbally. Directional tracking via a 'pointer' isn't really even necessary; if the unit is communicating with the base station, it is already sending the GPS location of the rocket.

But clearly some form of directional feedback is very useful during descent. Since you already have the programming to move the two indicators, converting that to auditory feedback should be simple. Adrian is of course correct that direction is more important than elevation. As the software matures, and with the benefit of field experience, Kevin can perhaps allow the user to choose what type of feedback they find most useful.

Regardless of the specifics, having a new GPS system on the market will be great for the hobby.


Tony
 
I'd love to have an app that tells me if the unit is powered up and working right so I don't have to drill excessive holes in my airframe or peer into a vent hole to find the LED... Being deaf has its drawbacks sometimes.
 
I'd love to have an app that tells me if the unit is powered up and working right so I don't have to drill excessive holes in my airframe or peer into a vent hole to find the LED... Being deaf has its drawbacks sometimes.

Funny thing.. I had a tracker from another supplier (since we didn't have a Featherweight one yet) and if I was trying to work with it, the beeper would wake my wife even though I couldn't hear that frequency... I actually removed the piezo from that one because it was of no use to me - and yet still woke my wife...
 
I think a mix of voice (when needed / useful) and tones (when more efficient) might be good. I can say that for my M flight, I spent more time looking at my phone to figure out which direction it was in than I spent looking at the sky (where the phone was pointing) to see if I could actually see it... If that could be replaced by tones so that I didn't have to look at the phone, then I think maybe I would have caught part of its descent...
 
Funny thing.. I had a tracker from another supplier (since we didn't have a Featherweight one yet) and if I was trying to work with it, the beeper would wake my wife even though I couldn't hear that frequency... I actually removed the piezo from that one because it was of no use to me - and yet still woke my wife...

If it was the Eggfinder LCD, removing the piezo works, or converting the USB single-channel receiver to battery and Bluetooth will eliminate the beeping problem. Out in the field, the beeping is advantageous.
If you removed the piezo and still woke up your wife, you must be too danged noisy in the house at night! Kurt
 
:) .... I think it was one of the TeleMetrum products - and now I see my wording was confusing... after removing the piezo, it did not wake my wife ;-)
 