Here's something I'm throwing out to all the code monkeys who post here, I think there's a few anyway.
I've just moved departments at work and I've been given the task of upgrading an old engineering simulation model we have. The model currently runs on an IBM RS/6000 (RISC-based AIX server). It's pretty intense: most of the code is written in Fortran, with smatterings of C and COBOL!
There are a few requirements, the main one being that it has to run fast. It's a pretty complicated model that typically runs 10,000-20,000 iterations per simulation and is programmed to stop when a suitable convergence value is met. The other big drama is accuracy and floating point precision. This thing is old; it has to be compiled with the compiler in Fortran 77 mode because it doesn't support Fortran 90 for some reason, and I believe all the floats are 8-byte (double precision). Also it has to take its inputs from .dat files. Since most of the data for it is now stored in proper databases, I'd like to use SQL, which should speed it up a bit as well (fewer file read operations).
I'm pretty tempted to re-write the whole thing in C, since I'm pretty sure I can get it to run a lot faster (this means a lot of manual memory allocation etc.). Obviously with something this old and this big it's going to be a hassle. I have to admit though I'm only semi-comfortable with C; I'd much rather work in a nice modern OO language like, say, Java, but I'm pretty sure this isn't a good idea.
So, a couple of questions:
How does a RISC processor affect floating point precision? Will this be more dependent on the architecture or the compiler? Ideally I'd like to make this model as portable as possible.
Java: is it completely ludicrous to expect a high-level language like Java to be able to handle all this intense number crunching?
Has anyone done anything similar, upgrading programs written in Fortran? I'd never seen any Fortran code till this morning...