I have a program that calculates distances between protein strands. It was originally written on an old SGI machine and compiled with cc; now I'm trying to get it running on a new Linux machine, compiling with gcc. Most of the output matches on Linux, but some of the numbers (roughly 5-10% of them) deviate slightly from the expected values.
Is it possible that the precision of doubles and floats differs between the two machines/compilers? If it isn't a precision issue, does anyone have other ideas about what might be causing this?
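For reference, here is a stripped-down sketch of the kind of arithmetic involved (this is not my actual code, just the same shape of calculation on made-up coordinates):

```c
#include <stdio.h>
#include <math.h>

/* Illustrative stand-in for the distance calculation; the real program
 * reads atom coordinates from files, but the arithmetic looks like this. */
static double dist(const double a[3], const double b[3])
{
    double dx = a[0] - b[0];
    double dy = a[1] - b[1];
    double dz = a[2] - b[2];
    return sqrt(dx * dx + dy * dy + dz * dz);
}

int main(void)
{
    double a[3] = {12.345678, -3.210987, 45.678901};
    double b[3] = { 7.654321, 18.901234, -2.468013};

    /* Printing many digits makes last-bit differences between the two
     * platforms visible. */
    printf("%.17g\n", dist(a, b));
    return 0;
}
```

My guess is that the intermediate expression `dx*dx + dy*dy + dz*dz` might be rounded differently on the two FPUs before the sqrt, but I'm not sure whether that alone could explain the differences I'm seeing.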