The FIT SDK claims to handle the endianness of the various fields for each message. I do see that there is some logic for this; however, running my C++ program to decode the example Activity.fit file yields different results on a Windows system and a Linux system.
For example, for the first record, `mesg.GetTimestamp()` on Windows returns 702940946, which seems correct based on the CSV value for the same record. On Linux, however, the value is 319284777. If I swap the bytes on Windows before printing the result, I get that same value.
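For context, the timestamp is printed from a record listener along the lines of the SDK's decode example. This is only a sketch: the class name `RecordPrinter` is mine, and I have left out the `fit::Decode` / `fit::MesgBroadcaster` wiring because the exact `Read()` overload differs between SDK versions.

```cpp
#include <iostream>

#include "fit_record_mesg.hpp"
#include "fit_record_mesg_listener.hpp"

// Invoked by the fit::MesgBroadcaster for every record message decoded.
class RecordPrinter : public fit::RecordMesgListener
{
public:
    void OnMesg(fit::RecordMesg& mesg) override
    {
        // GetTimestamp() returns the timestamp as a 32-bit value (FIT_DATE_TIME).
        // This line prints 702940946 on Windows and 319284777 on Linux for the
        // first record of Activity.fit.
        std::cout << "timestamp: " << mesg.GetTimestamp() << std::endl;
    }
};
```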
I assumed that the FIT SDK, in handling endianness, would return a value that is correct on whatever system it's running under. Instead, it seems to read the bytes from the file in the correct order but always return the value in the same byte order, regardless of the endianness of the host system.
Am I wrong about this? It seems I would just have to convert the endianness of every value whenever I run in a Linux environment.
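To make concrete what "convert the endianness of every value" would mean, here is a minimal sketch of the kind of helper I am considering. The names `HostIsLittleEndian` and `Swap32` are mine, and the Linux value from above is used only as sample input; I am not assuming the swap is actually the right fix.

```cpp
#include <cstdint>
#include <cstring>
#include <iostream>

// Returns true when the host stores multi-byte integers little-endian.
bool HostIsLittleEndian()
{
    const std::uint32_t probe = 1;
    std::uint8_t firstByte = 0;
    std::memcpy(&firstByte, &probe, 1);
    return firstByte == 1;
}

// Reverses the byte order of a 32-bit value (e.g. a FIT_UINT32 timestamp).
std::uint32_t Swap32(std::uint32_t v)
{
    return (v >> 24) |
           ((v >> 8) & 0x0000FF00u) |
           ((v << 8) & 0x00FF0000u) |
           (v << 24);
}

int main()
{
    // The value printed on the Linux box, used here only as sample input.
    const std::uint32_t linuxValue = 319284777u;

    std::cout << "host is " << (HostIsLittleEndian() ? "little" : "big") << "-endian\n";
    std::cout << "as printed:   " << linuxValue << "\n";
    std::cout << "byte-swapped: " << Swap32(linuxValue) << "\n";
}
```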