At our institution, we have a large collection of observational data (precipitation, temperature, altitude, and many other variables) from different projects and in different formats. Everything currently sits in a single folder, with one subfolder per dataset. Given the volume of data, we now need a new archive where datasets can be stored and retrieved based on their metadata (version, years covered, variable, resolution, etc.).
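For concreteness, here is roughly what a metadata record for one dataset might look like. The field names and values below are illustrative assumptions, not our actual schema:

```python
# Illustrative metadata record for one dataset; field names and
# values are assumptions based on the examples above, not a fixed schema.
example_record = {
    "name": "precipitation_alps",
    "version": "2.1",
    "variable": "precipitation",
    "years_covered": [1990, 2020],
    "resolution": "1km",
    "format": "NetCDF",
    "source_directory": "/data/projects/alps/precip",
}
```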
We are looking for the best application to implement this, based on the upload and retrieval workflows we require, which are listed below.
We would like to implement the following workflow to upload a new dataset to the archive:
- The end user fills in an online metadata form
- The form includes a required field for the dataset's source directory
- Once submitted, the system validates the metadata (checking that all fields are present and consistent) and copies the dataset to a separate location where all observations are stored (see the sketch after this list).
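A minimal sketch of what this ingest step could do, assuming metadata records shaped like the example above; `ARCHIVE_ROOT`, the set of required fields, and the archive directory layout are all hypothetical:

```python
import shutil
from pathlib import Path

# Hypothetical archive location and required fields.
ARCHIVE_ROOT = Path("/archive/observations")
REQUIRED_FIELDS = {"name", "version", "variable", "years_covered",
                   "resolution", "source_directory"}

def ingest(record: dict) -> Path:
    """Validate submitted metadata, then copy the dataset into the archive."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing metadata fields: {sorted(missing)}")

    src = Path(record["source_directory"])
    if not src.is_dir():
        raise ValueError(f"source directory not found: {src}")

    # Basic consistency check: years_covered must be an ordered pair.
    start, end = record["years_covered"]
    if start > end:
        raise ValueError("years_covered must satisfy start <= end")

    # Copy the dataset into an assumed <name>/<version> layout.
    dest = ARCHIVE_ROOT / record["name"] / record["version"]
    shutil.copytree(src, dest)  # fails if this version already exists
    return dest
```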
To retrieve data, users must:
- access the database via a web form
- filter by the necessary metadata
- get a download link (or a location in the archive); a sketch of the filtering step follows this list
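For illustration, the filtering step might reduce to something like the following over an in-memory metadata index; `find_datasets`, the record layout, and `ARCHIVE_ROOT` are assumptions, and in practice the query would sit behind the web form:

```python
from pathlib import Path

ARCHIVE_ROOT = Path("/archive/observations")  # hypothetical archive location

def find_datasets(index: list[dict], **filters) -> list[Path]:
    """Return archive locations of datasets whose metadata match all filters."""
    matches = [
        rec for rec in index
        if all(rec.get(key) == value for key, value in filters.items())
    ]
    return [ARCHIVE_ROOT / rec["name"] / rec["version"] for rec in matches]

# Example query a web form might translate into:
# locations = find_datasets(index, variable="precipitation", resolution="1km")
```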
The system is intended primarily for internal use. We have started looking at existing applications to simplify this (we would prefer not to code everything from scratch).
Data Manager Kit and Open Data Kit are under evaluation. What are the main applications typically used for this type of workflow?