I am working on reverse engineering an encoding.
A number between -12 and 12, with two decimal places, is encoded using an unknown method, and the result is a pair of strings, as follows:
value   string1      string2
0       AAAAAAAAAA   AAAAAAAA
0.01    AAAAB7FK5H   4XqEPwAA
0.02    AAAAB7FK5H   4XqUPwAA
0.03    AAAAC4HoXr   UbiePwAA
1       EAAAAAAAAA   AADwPwAA
2       IAAAAAAAAA   AAAAQAAA
3       MAAAAAAAAA   AAAIQAAA
4       QAAAAAAAAA   AAAQQAAA
5       UAAAAAAAAA   AAAUQAAA
6       YAAAAAAAAA   AAAYQAAA
7       cAAAAAAAAA   AAAcQAAA
8       gAAAAAAAAA   AAAgQAAA
9       kAAAAAAAAA   AAAiQAAA
10      oAAAAAAAAA   AAAkQAAA
11      sAAAAAAAAA   AAAmQAAA
12      wAAAAAAAAA   AAAoQAAA
-12     T///8AAAAA   AAAowAAA
-1      ////8AAAAA   AADwvwAA
-0.64   ////97FK5H   4XrkvwAA
2.01    IAAAAUrkfh   ehQAQAAA
(any additional samples can be provided upon request)
Obviously, I would like to determine the algorithm used to do this conversion.
My research indicates it MIGHT have to do with base64 encoding, using the alphabet ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/ (A=0, B=1, ..., /=63).
For instance, the first column, translated to decimal and then divided by 4 (discarding the remainder), gives the integer part of the value when it is positive, or 16 minus its absolute value when it is negative.
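To make that concrete, here is a small Python sketch of the partial decoding I have so far. It assumes my reading that only the first character of string1 is needed for the integer part; the function name and everything else is just illustrative:

    # Sketch of my partial decoding so far (assumption: only the first
    # character of string1 is needed to recover the integer part).
    B64_ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

    def integer_part(string1):
        """Integer part hidden in string1, as observed in the samples above."""
        first = B64_ALPHABET.index(string1[0])  # A=0, B=1, ..., /=63
        return first // 4                       # divide by 4, discard remainder

    # Examples taken from the table above:
    print(integer_part("EAAAAAAAAA"))  # value  1    -> 1
    print(integer_part("wAAAAAAAAA"))  # value 12    -> 12
    print(integer_part("T///8AAAAA"))  # value -12   -> 4  (= 16 - 12)
    print(integer_part("////97FK5H"))  # value -0.64 -> 15 (= 16 - 0.64, truncated)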
I have had less luck with the decimal part; I just don't have the knowledge.
If anyone can give me a hint or a solution, it would be immensely appreciated.
I am a photographer with basic math and programming knowledge, and this would be the central piece needed to complete a script that will optimize my workflow, which currently requires me to manually read these values, compute them (in Excel), and write them back (in decimal).
Thank you.