Not sure this is the right place to ask, but here goes.
I have a task that will involve multiple modeling & simulation tools and data sets developed across my company. You can think of it as Tool_A feeds Tool_B feeds Tool_C, etc., iterating to achieve some result. It’s more complicated than that, but you get the idea.
These tools vary in their maturity, with some pieces yet to be written and some going back decades. Languages are primarily C/C++ and FORTRAN. Most of the tools read & write their data to files; some are SQL database oriented. Performance is such that all of this can probably initially run on a single large machine, but we’ll eventually need to distribute the work across multiple machines. Wholesale re-architecting of these tools isn’t on the table, but it’s possible to open them up and make some changes, if required.
To put it diplomatically, most of the team is oriented more toward physical sciences, math and engineering than software.
So, what I’m looking for is some system/set of tools/whatever you call it to manage & coordinate this enterprise. Something to basically run the overall process: start the individual tools, monitor the progress of each step, manage the selection, translation & transfer of data between tools, assess decision points, etc., all while logging & providing feedback on progress to the user.
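To make the kind of coordination I mean concrete, here's a minimal sketch of the sequencing, error checking, and logging we need (the tool names and flags are hypothetical stand-ins, and the real thing would also need data translation between steps and distribution across machines):

```python
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("pipeline")

def run_step(name, cmd):
    """Run one tool as a subprocess, log its progress, return True on success."""
    log.info("starting %s", name)
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        log.error("%s failed (rc=%d): %s", name, result.returncode,
                  result.stderr.strip())
        return False
    log.info("finished %s", name)
    return True

def run_pipeline(steps):
    """Run each (name, command) step in order; halt at the first failure."""
    for name, cmd in steps:
        if not run_step(name, cmd):
            return False  # a decision point: here we simply stop the chain
    return True

if __name__ == "__main__":
    # Hypothetical file-passing chain: each tool reads the previous tool's output.
    steps = [
        ("Tool_A", ["tool_a", "--in", "input.dat", "--out", "a_out.dat"]),
        ("Tool_B", ["tool_b", "--in", "a_out.dat", "--out", "b_out.dat"]),
        ("Tool_C", ["tool_c", "--in", "b_out.dat", "--out", "c_out.dat"]),
    ]
    sys.exit(0 if run_pipeline(steps) else 1)
```

Essentially we want this, but robust: retries, parallel branches, progress reporting, and eventually multi-machine execution, without us maintaining the plumbing ourselves.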
We’ve hacked stuff like this together in the past with incrontab & scripting, but it’s always been fragile and hard to maintain. It seems there has to be a better way, but I don’t know how to search for it. I don’t even know what to put in the tags field!