To test your integration, you can find what gravity offset you would need to get the apogee timing right with your data, then see how far off it is from 1. It doesn't have to be 1.0000; in fact, as mentioned in a previous post, a 1.000 offset is not the optimal choice.
I think I have found the issue...
Using the logged data, I was able to model velocity in Excel for the ADXL. First, I baselined it by integrating velocity from "average Lin Z", using my log timestamps (recorded to only three decimal places) to determine the time step. Compared to the onboard 500 Hz calculated velocity, that baseline crossed zero within half a second. Very, very close.
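For anyone who wants to reproduce the spreadsheet model in code, here is a minimal sketch of the same integration. The Sample struct, field names, and apogeeTime() are illustrative, not my actual logger format, and gOffset is any extra correction applied on top of the 1 G already subtracted from the data:

```cpp
#include <cstddef>
#include <vector>

// One logged sample: timestamp in seconds (three decimals in my log) and
// the "average Lin Z" reading in G.
struct Sample { double t; double linZ; };

// Integrate acceleration to velocity with the trapezoid rule and return
// the time of the first downward zero crossing (the modeled apogee), or
// -1 if velocity never crosses zero in this span.
double apogeeTime(const std::vector<Sample>& log, double gOffset) {
    const double G = 9.80665;  // m/s^2 per G
    double v = 0.0;
    for (std::size_t i = 1; i < log.size(); ++i) {
        double dt = log[i].t - log[i - 1].t;
        double a0 = (log[i - 1].linZ - gOffset) * G;
        double a1 = (log[i].linZ - gOffset) * G;
        double vNext = v + 0.5 * (a0 + a1) * dt;       // trapezoidal step
        if (v > 0.0 && vNext <= 0.0) return log[i].t;  // zero crossing
        v = vNext;
    }
    return -1.0;
}
```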
I know that my average Lin Z data has exactly 1 G subtracted from the raw values, so I adjusted the G offset, subtracting values until the integrated velocity's zero crossing lined up with apogee. In this model my G offset would need to be 0.842 G to peg the timing of apogee.
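Doing that adjustment by hand in a spreadsheet works, but the search can also be automated. A minimal sketch, assuming the apogeeTime() helper above and a known apogee timestamp from the flight; the ±0.5 G window and the fitOffset name are my own assumptions:

```cpp
// Bisect over the extra gravity offset until the modeled apogee time
// matches the one observed in flight. A larger subtracted offset makes
// velocity decay faster, pulling the modeled apogee earlier.
double fitOffset(const std::vector<Sample>& log, double observedApogee) {
    double lo = -0.5, hi = 0.5;  // search window in G (an assumption)
    for (int i = 0; i < 40; ++i) {
        double mid = 0.5 * (lo + hi);
        double t = apogeeTime(log, mid);
        // Too late (or never crossing zero): the offset must be larger.
        if (t < 0.0 || t > observedApogee) lo = mid;
        else hi = mid;
    }
    return 0.5 * (lo + hi);
}
```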
I then went back to my log files and looked at 20 minutes of data from before launch (not in the file I posted). When I averaged the resting "average Lin Z", it was actually 0.174 G instead of 0. If I add that bias discrepancy to the 0.842, I land at 1.016 G.
So, I think the smoking gun is the bias offset error. As David (@UhClem) points out above, I should be calculating that bias while sitting on the pad instead of depending on my calibration at home. The ADXL375 is incredibly noisy, but when sitting on the pad sampling at 500 Hz, it becomes clear exactly what the bias should be to remove gravity.
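On-pad calibration could be as simple as averaging the resting readings before arming. A minimal sketch, where readLinZ() is a hypothetical stand-in for however the ADXL375 sample actually gets read:

```cpp
#include <cstddef>

double readLinZ();  // hypothetical: returns one "Lin Z" sample in G

// Average the noisy resting readings to estimate the bias. At 500 Hz,
// a 20-second window gives 10,000 samples, which averages the ADXL375's
// noise down far better than a one-time bench calibration.
double calibratePadBias(std::size_t nSamples) {
    double sum = 0.0;
    for (std::size_t i = 0; i < nSamples; ++i) {
        sum += readLinZ();  // pacing to hold the 500 Hz rate would go here
    }
    return sum / double(nSamples);  // mean resting reading = bias in G
}
```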
In the meantime, I will also switch my floats over to doubles. I am using an ESP32, whose hardware FPU only handles single precision, so doubles are emulated in software. I am not sure that it mattered here, but just in case I'll switch to double precision and see if I get a performance drop.
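One way to make that experiment easy to flip back is a compile-time type alias. This is just a sketch; the real_t and USE_DOUBLE names are my own, not anything from my flight code:

```cpp
// Flip the whole integration pipeline between float and double with one flag.
#ifdef USE_DOUBLE
using real_t = double;  // 64-bit, software-emulated on the ESP32
#else
using real_t = float;   // 32-bit, handled by the hardware FPU
#endif

struct VelState {
    real_t velocity = 0;  // m/s, integrated from acceleration
    real_t biasG    = 0;  // gravity/bias offset in G
};

// One integration step: v += (a - bias) * g * dt, all in the chosen precision.
real_t stepVelocity(VelState& s, real_t accelG, real_t dtSec) {
    const real_t G = real_t(9.80665);  // m/s^2 per G
    s.velocity += (accelG - s.biasG) * G * dtSec;
    return s.velocity;
}
```

If the apogee timing barely moves with doubles, that points the finger back at the bias rather than precision.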
Thanks,
Mike