Open Rocket Simulated Apogee Accuracy

The Rocketry Forum


El Cheapo

Has anyone compared altimeter-measured time to apogee with OpenRocket's simulated time to apogee? I'm curious because I was reading on the PML site, and they mention that most simulations err on the negative side by 10-15% on time to apogee. If that is the case, one could probably get a better feel for delay times by subtracting an average of 12.5%.

Thanks for the input.
 
My guess here is that most people are over-estimating the surface finish and fin trueness of their rockets. Once these factors are accounted for properly, the simulator will return proper results.
 
Or should I have said that simulations overestimate by 10-15%? Either way, you get the idea.
 
10-15% is inside the noise of motor performance variation combined with atmospheric conditions, including winds aloft. Sim accuracy, for any sim, should only be used as an indicator of the general performance envelope.
 
So, as an example, let's say the simulated time to apogee is 10 sec. If I were to take a middle ground of 12.5% error, would it be fair to assume that the real-world time to apogee would be closer to 8.75 sec?
 
No. It could be anywhere between 7.5 and something above 10 seconds. You don't have the ability to control for enough variables. That's where experience and judgment come into play when picking delays (not unlike there being no substitute for seat time in a race car).
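The spread described here can be sketched numerically. This is a minimal illustration only; the 0-25% over-prediction band is an assumption taken from this discussion, not measured data:

```python
# Illustrative only: the range of plausible real-world apogee times
# for a simulated time, given an assumed over-prediction band.

def apogee_time_range(t_sim, err_low=0.0, err_high=0.25):
    """Return (earliest, latest) plausible apogee times if the sim
    may over-predict time to apogee by err_low to err_high."""
    return t_sim * (1 - err_high), t_sim * (1 - err_low)

lo, hi = apogee_time_range(10.0)
print(f"10 s simulated -> roughly {lo:.1f} s to {hi:.1f} s real")
# With a band this wide, "subtract 12.5%" is just the midpoint of a
# large interval, not a reliable correction.
```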

So, as an example, let's say the simulated time to apogee is 10 sec. If I were to take a middle ground of 12.5% error, would it be fair to assume that the real-world time to apogee would be closer to 8.75 sec?
 
E.C. -
You are asking a good (and often asked) question, i.e., what are the typical differences observed between simulation and reality.

I don't have a good explanation of WHY, but it does seem that, even taking conservative assumptions about rocket surface finish into account, RockSim and OpenRocket tend to overpredict altitude much more often than they underpredict. My guesses: sim rockets are "perfectly straight" and have "perfect fins." I am also currently trying to investigate whether friction from the launch rod, non-straight rods, and/or rod-whip effects might be playing a significant role.

All that said, YOU asked about TIME to apogee, which is different from what is more often looked at: peak ALTITUDE. In "normal" flights, where the rocket is coasting (and slowing down) prior to apogee (and where the parachute does NOT come out until at or after apogee), the TIME-to-apogee prediction will always be far more accurate than the prediction of peak altitude. While I believe 10-15% accuracy on peak altitude prediction is about right for most sims, I think you will find the accuracy of the time to apogee is more in the 0-10% or maybe even 0-5% range.
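This claim can be sanity-checked with a toy vertical-coast integration. All numbers here, including the burnout speed and drag constant, are invented for illustration and are not from any actual flight:

```python
# Toy check: integrate a vertical coast dv/dt = -g - k*v^2 and see
# how a 30% error in the drag constant k moves apogee time vs.
# apogee altitude. Burnout speed and k values are made up.

def coast(v0, k, g=9.81, dt=0.001):
    """Return (time, altitude gained) coasting up from speed v0."""
    v, h, t = v0, 0.0, 0.0
    while v > 0:
        v -= (g + k * v * v) * dt
        h += v * dt
        t += dt
    return t, h

t1, h1 = coast(100.0, 0.001)    # "true" drag
t2, h2 = coast(100.0, 0.0013)   # drag 30% higher than the sim assumed
print(f"time shift:     {100 * (t1 - t2) / t1:.1f}%")
print(f"altitude shift: {100 * (h1 - h2) / h1:.1f}%")
# The percentage shift in apogee altitude comes out larger than the
# percentage shift in time to apogee, consistent with the post above.
```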

-Kerry


DaveL -
Thanks for posting the link to my OpenRocket presentation.
 
In "normal" flights, where the rocket is coasting (and slowing down) prior to apogee (and where the parachute does NOT come out until at or after apogee), the TIME-to-apogee prediction will always be far more accurate than the prediction of peak altitude.

It is also worth noting that a rocket gains very little altitude during its last second (or more, for longer flights) prior to apogee and accelerates rapidly post-apogee. Therefore, if you are using a timed delay for ejection, it is always better to set your delay on the short side. You will lose some altitude, but this is not an issue unless you are in competition. On the plus side, any lateral movement will be reduced and you will have much less chance of zippering on deployment.
 
Has anyone compared altimeter-measured time to apogee with OpenRocket's simulated time to apogee? I'm curious because I was reading on the PML site, and they mention that most simulations err on the negative side by 10-15% on time to apogee. If that is the case, one could probably get a better feel for delay times by subtracting an average of 12.5%.

This question actually decomposes into two questions:

1. What factors influence simulation quality?

The deciding quantity for the trajectory simulation is the product Cd*rho*A, where Cd is the drag coefficient, rho is the density of the air, and A is the cross-sectional area of the rocket. If you get that product right, your simulation will be exact, even if each individual factor is completely wrong. As a result, winds aloft are irrelevant (at least to this order). If you underestimate drag, your sim will be wrong.
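A short sketch makes this point concrete (values are arbitrary): two parameter sets whose individual factors differ, but whose product Cd*rho*A agrees, produce the same drag deceleration.

```python
# Only the product Cd*rho*A (together with mass) matters for the
# drag deceleration, not the individual factors. Arbitrary values.

def drag_decel(cd, rho, area, v, mass):
    """Drag deceleration a = 0.5 * Cd * rho * A * v^2 / m."""
    return 0.5 * cd * rho * area * v * v / mass

# Two different-looking parameter sets with the same Cd*A product:
a1 = drag_decel(cd=0.50, rho=1.225, area=0.004, v=50.0, mass=1.0)
a2 = drag_decel(cd=0.75, rho=1.225, area=0.004 / 1.5, v=50.0, mass=1.0)
print(a1, a2)   # equal (to rounding): each factor "wrong", product right
```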

rho: Density changes are not that large. A density change of 10% corresponds to a temperature change of about 30 K, or to an altitude change of about 1000 m. So density alone cannot account for such large differences.
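A back-of-envelope check of these round numbers, using the ideal gas law at fixed pressure and an isothermal barometric formula (standard-atmosphere values are assumed, not taken from the post):

```python
import math

# Sanity check: how much temperature or altitude change corresponds
# to a 10% air density change? Round standard-atmosphere values.

T0 = 288.0     # K, approximate sea-level temperature
H = 8400.0     # m, approximate atmospheric scale height

dT = 0.10 * T0                 # rho ~ 1/T at fixed pressure
dh = H * math.log(1 / 0.9)     # rho ~ exp(-h/H), solve for 10% drop
print(f"10% density change: ~{dT:.0f} K or ~{dh:.0f} m")
# Both come out near the "about 30 K / about 1000 m" figures above.
```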

A: Relatively easy to measure, not much room for error.

That leaves the Cd as the main problem. Cd is difficult to measure (who has a wind tunnel in the basement?) and difficult to compute (sim programs disagree wildly about Cd). In addition, Cd is not a constant: Cd during the motor burn is much smaller than during coast, due to the reduced base drag. So if you want accurate simulations, you should probably fly a few times and determine the matching Cd.
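The "fly a few times and determine the matching Cd" step could be sketched like this, using a closed-form quadratic-drag coast formula and bisection. Every number below is hypothetical, and the model ignores both the variation of Cd with speed and the burn/coast base-drag difference just noted:

```python
import math

# Back out the Cd that reproduces a measured coast-phase altitude
# gain. All inputs (burnout speed, measured gain, area, mass) are
# hypothetical illustrations.

def coast_gain(v0, cd, rho, area, mass, g=9.81):
    """Altitude gained coasting up from speed v0 with quadratic drag."""
    k = 0.5 * cd * rho * area / mass
    return math.log(1 + k * v0 * v0 / g) / (2 * k)

def fit_cd(v0, measured_gain, rho, area, mass, lo=0.1, hi=2.0):
    """Bisect for the Cd whose coast altitude matches the flight."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if coast_gain(v0, mid, rho, area, mass) > measured_gain:
            lo = mid     # too little drag -> sim too high -> raise Cd
        else:
            hi = mid
    return 0.5 * (lo + hi)

cd = fit_cd(v0=120.0, measured_gain=550.0, rho=1.225, area=0.004, mass=1.2)
print(f"fitted coast Cd ~ {cd:.2f}")
```

A cloud of such fitted values over several flights, as suggested below with the nomogram overlay, characterizes the rocket's average Cd.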

2. How large is the effect of an error in rocket data or atmospheric conditions on various parameters?

A simple way to determine these influences are the performance nomograms you can download from the resource section of the aerotech website: https://www.aerotech-rocketry.com/c...ogs_Flyers_Data_Sheets/aerotech_nomograms.pdf. The nomograms show time to apogee and altitude depending on various parameters. I recommend that you have a look at the J350W diagram (attached), a motor familiar to many.

You'll notice that for a heavy rocket, a change in Cd, diameter, or density has very little influence on the altitude or time to apogee. Changing Cd means you move vertically in the grid, and for a large rocket mass the curves are very steep, so a vertical change doesn't move you far from one of the curves. The weight, however, has a very large impact. Make a 200 g error on your 7 kg rocket (thrust-to-weight is bad in this case; it's only an example), and your sim will be off by 10%.

For light rockets, the Cd*diameter*rho term becomes more important. With a large diameter and/or high drag, there is even a region where the time hardly changes with weight if the rocket is light enough. The same happens with altitude if your rocket is even lighter, but most rockets are heavier than that. The upshot is that for light rockets you'll notice errors in the altitude before you'll notice errors in apogee time.

Since the altitude curves are always steeper than the apogee time curves, apogee time is always more affected by Cd/diameter/density errors, and altitude is always more seriously affected by mass.

In any event, a simple rule like "just subtract 10% from the delay time" is probably too simplistic. I'd rather try to find out where on the diagram your rocket sits. The nomograms are grouped by the motors one might use with a certain size of rocket. All nomograms in a group use the same scales, so your rocket (its Cd/diameter/mass combination) is the same point on all nomograms. So simply put a transparency on top of the nomogram and mark the point where your flight ended (time and altitude). After a few flights you should get a cloud of points that characterizes the average Cd of your rocket fairly well. You can then use that Cd for future simulations.

And we haven't discussed motor variance yet.

Best regards

Andreas Müller

View attachment j350w.pdf
 
It is also worth noting that a rocket gains very little altitude during its last second (or more, for longer flights) prior to apogee and accelerates rapidly post-apogee. Therefore, if you are using a timed delay for ejection, it is always better to set your delay on the short side. You will lose some altitude, but this is not an issue unless you are in competition. On the plus side, any lateral movement will be reduced and you will have much less chance of zippering on deployment.

Actually, it's exactly the other way around. Before apogee, gravity and drag both decelerate the rocket. After apogee, gravity accelerates the rocket while drag counteracts it. Therefore the rocket brakes "harder" before apogee than it accelerates afterwards. For practical purposes, drag can usually be neglected near apogee, because the rocket is slow.
Earth's gravitational acceleration is 32.2 ft/s^2 (9.81 m/s^2). If, for example, deployment is 3 seconds too early or too late, the rocket will be traveling at approximately 100 ft/s, either upwards or downwards.
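This estimate in one small loop (drag neglected near apogee, as noted above, so a deployment t seconds off meets the airstream at roughly v = g*t):

```python
# Airspeed at deployment vs. timing error, drag neglected near apogee.

G_SI = 9.81          # m/s^2, standard gravity
M_TO_FT = 3.28084    # meters to feet

for t_err in (1, 2, 3):
    v = G_SI * t_err
    print(f"{t_err} s early/late -> ~{v:.0f} m/s (~{v * M_TO_FT:.0f} ft/s)")
```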

Reinhard
 
Therefore, if you are using a timed delay for ejection, it is always better to set your delay on the shorter side.
Of course, the delay time itself is uncertain by 20%, or even more in some cases (*cough* Aerotech *cough*). It seems like I've gotten more zippers from early ejections than late ones, so I personally tend to go a second or two longer. Many times a rocket doesn't descend ballistically after apogee anyway (backsliding, for example).
 
I'll throw a little anecdotal data out there for consideration, as I've just started working with Open Rocket over the winter and have been kind of testing myself with it. I recently built a scratch rocket my daughter drew up, using Open Rocket simulations to decide on final fin size, shape, and such. I then recreated the Open Rocket model as I built the rocket based on actual sizes and weights of the individual components, and double checked the final measured weight and CG with the Open Rocket model's weight and CG, to give me some level of comfort that I had built the sim model correctly.

I ran simulations with Estes A8-3, B6-4, and C6-5 motors. Not looking at time, but where in the flight (altitude) deployment would happen, the simulations for B6-4 and C6-5 motors showed just about perfect deployment at apogee. The A8-3 showed slightly late deployment, about ten feet into the descent. The first test flight, due to trees at the location, I chose an A8-3. Figuring that I could repair a broken rocket easier than I could fish one out of the top of a tree, I risked the late deployment. It was VERY late, with the parachute not even getting a chance to open. After making the repairs, I decided to try a B4-2 motor for the second test flight. I ran a sim in Open Rocket, which showed deployment just before apogee. The second test flight went perfectly. Live, deployment looked really early, but when watching it on video, it was not as early as it had looked. In fact, I'd say it was VERY close to what was predicted.

I did not have an altimeter in the rocket either flight to compare the altitude with the predicted, and I am not great at estimating that just by looking. My deployment evaluation was based on when it happened relative to apogee, so if the rocket only reached half the predicted altitude, that would explain why deployment seemed so late with the A8-3. Having said that, I did try to guestimate altitude based on rooflines and trees and such, and I believe the altitudes were both fairly close to the predicted. Also, just before each flight, I ran additional simulations looking up wind speeds, ground elevation, and such so that the simulation was as close to actual conditions as I could get it.

So with the A8-3, it looks like the simulation was way off, and with the B4-2, it looks like it was dead on. At the March club launch, I plan on flying this rocket a few times on various motors with an AltimeterOne in it so I can compare the results with Open Rocket simulations. I'll also try to get video of each flight so later I can look at time in addition to altitude. I may even throw the AltimeterOne in two kit rockets I've worked into Open Rocket to compare their flights with the simulations as well. This is more a test of how well I am doing with Open Rocket than a test of how accurate Open Rocket is. But I'll report back the results if folks are interested.
 
All that said, YOU asked about TIME to apogee, which is different from what is more often looked at: peak ALTITUDE. In "normal" flights, where the rocket is coasting (and slowing down) prior to apogee (and where the parachute does NOT come out until at or after apogee), the TIME-to-apogee prediction will always be far more accurate than the prediction of peak altitude. While I believe 10-15% accuracy on peak altitude prediction is about right for most sims, I think you will find the accuracy of the time to apogee is more in the 0-10% or maybe even 0-5% range.

-Kerry
The attached plot illustrates Kerry's point that the time to apogee predicted by a sim is much less affected by non-vertical flight than the apogee altitude is. In a vertical flight, this rocket reaches an apogee of 1283' in 8.8 seconds. If the launch rod angle were 24 degrees off vertical, the apogee would drop to 1074', but the time to apogee would only drop to 8.0 seconds. You lose 209' of apogee, but only 0.8 seconds of time to apogee! That's a 16% loss in altitude, but only a 9% shortening of the time to apogee.
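These numbers can be roughly reproduced with a drag-free model (the simulation above includes drag, so this is only a sanity check): tilting the rod by theta scales the vertical burnout velocity by cos(theta), so time to apogee scales by cos(theta) while apogee altitude scales by cos(theta) squared.

```python
import math

# Drag-free sensitivity of apogee time vs. altitude to rod angle:
# t_apogee ~ v*cos(theta)/g, altitude ~ (v*cos(theta))^2 / (2g).

theta = math.radians(24)
time_factor = math.cos(theta)        # fractional time remaining
alt_factor = math.cos(theta) ** 2    # fractional altitude remaining
print(f"time: -{100 * (1 - time_factor):.0f}%  "
      f"altitude: -{100 * (1 - alt_factor):.0f}%")
# Close to the ~9% time / ~16% altitude losses quoted in the post.
```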

Bob

View attachment time to apogee sensitivity.pdf

 
Live, deployment looked really early, but when watching it on video, it was not as early as it had looked. In fact, I'd say it was VERY close to what was predicted.
That raises the question of what really defines deployment. Is it when the ejection charge fires, or when the laundry actually comes out and becomes effective? How much time passes between those two events, and is it small enough to be insignificant in the overall scheme of things? Deployment gets treated as instantaneous in all the sim packages, but we all know that nothing in the real physical world is instantaneous.
 