Ranking Interoperability standards #275
isaacullah started this conversation in Standards: Interoperability
Each item is ranked twice: first as MINIMAL, CORE, or IDEAL, indicating whether it should be required or merely aspired to, and then as TECHNICALLY EASY, MODERATE, or DIFFICULT, indicating how hard it may be to implement.
- Building a coupled model from process-oriented software components (models or model elements). **MINIMAL, TECHNICALLY DIFFICULT**
- Building "system of systems" models by assembling sub-models of particular systems (for example, a "lake system" model integrated with a "watershed system"). **IDEAL, TECHNICALLY DIFFICULT**
- Operating models/components in multiple different frameworks. **CORE, TECHNICALLY MODERATE**
- Operating models and data inputs/outputs efficiently as part of a sequence of tasks (approach: use/encourage file formats that are both standardized and open). **CORE, TECHNICALLY MODERATE**
- Swapping input data sources (for example, comparing the behavior of a model with two different satellite-based inputs of land cover, as opposed to having the model hard-wired to one particular source). **CORE, TECHNICALLY MODERATE to DIFFICULT**
- Controlling parameter values and behavior without recompiling. **CORE, TECHNICALLY MODERATE**
- Operating a model on multiple platforms. **CORE, TECHNICALLY MODERATE** (for some coding languages), **TECHNICALLY DIFFICULT** (for others)
- Retrieving information about a model's current state, including state variables (implementation question: direct memory exchange vs. file-based exchange vs. web API). **CORE, TECHNICALLY DIFFICULT**
- Pausing and continuing model execution. **IDEAL, TECHNICALLY MODERATE to DIFFICULT**
- Adjusting model variables and/or control parameters during a run (for example, to support data assimilation). **IDEAL, TECHNICALLY DIFFICULT**
- Computing derivatives where applicable, to facilitate operations such as sensitivity analysis, optimization, and inference (note: participants held different views on whether this belongs in a standard, in a "best practice" guideline, or nowhere). **IDEAL, TECHNICALLY DIFFICULT**
- Metadata and documentation related to interoperability. **MINIMAL, TECHNICALLY EASY**
- Clarity and precision in definitions of parameters and variables (ontology). **MINIMAL, TECHNICALLY EASY**
- Data items to include in metadata: scale (space and time), typical run time, limits (e.g., range of calibration data). **MINIMAL, TECHNICALLY EASY**
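To make the state-retrieval and run-time-adjustment items concrete, here is a minimal sketch of a component interface in that spirit. The method names (`initialize`, `update`, `get_value`, `set_value`) are modeled loosely on the CSDMS Basic Model Interface, but the class and model here are hypothetical toys, not part of any standard:

```python
# Hypothetical sketch: a framework can query and adjust a model's state
# through named-variable accessors, without touching its internals.

class LinearReservoir:
    """Toy storage model: dS/dt = inflow - k * S."""

    def initialize(self, params):
        self.k = params.get("k", 0.1)                # outflow coefficient [1/day]
        self.storage = params.get("storage", 100.0)  # state variable [mm]
        self.time = 0.0

    def update(self, inflow=0.0, dt=1.0):
        # Advance one explicit-Euler step.
        self.storage += (inflow - self.k * self.storage) * dt
        self.time += dt

    def get_value(self, name):
        # Expose current state by variable name; a file-based or web-API
        # exchange would wrap a call like this.
        return getattr(self, name)

    def set_value(self, name, value):
        # Lets e.g. a data-assimilation routine overwrite state mid-run.
        setattr(self, name, value)


model = LinearReservoir()
model.initialize({"k": 0.2, "storage": 50.0})
model.update(inflow=10.0)
print(model.get_value("storage"))  # 50 + (10 - 0.2*50)*1 = 50.0
model.set_value("storage", 80.0)   # "assimilate" an observation
```

The same accessor pattern underlies the direct-memory, file-based, and web-API exchange options mentioned above; only the transport layer differs.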
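"Controlling parameter values and behavior without recompiling" usually comes down to reading parameters from an external configuration file at startup. A small illustration, with made-up keys and an open, standardized format (JSON) of the kind the list encourages:

```python
# Sketch of run-time parameter control: the model reads a plain JSON
# configuration file at startup, so changing k, the timestep, or the run
# length requires editing a text file, not recompiling the model.
import json
import tempfile

config_text = '{"k": 0.25, "dt": 0.5, "n_steps": 4}'  # illustrative values

# Write the config to a temporary file, standing in for a user-edited file.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    f.write(config_text)
    config_path = f.name

with open(config_path) as f:
    cfg = json.load(f)

# Run a toy decay model entirely driven by the loaded configuration.
storage = 100.0
for _ in range(cfg["n_steps"]):
    storage -= cfg["k"] * storage * cfg["dt"]

print(round(storage, 3))  # 100 * (1 - 0.25*0.5)**4 = 58.618
```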
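"Pausing and continuing model execution" can be prototyped as checkpointing: serialize the model's full state, then restore it later and keep stepping. The snippet below uses Python's `pickle` purely for illustration; a standard would likely prefer an open, language-neutral format, and the state dictionary here is invented:

```python
# Sketch of pause-and-continue via checkpointing a state dictionary.
import io
import pickle

state = {"time": 10.0, "storage": 42.0, "k": 0.1}  # hypothetical mid-run state

buf = io.BytesIO()
pickle.dump(state, buf)      # "pause": persist the complete state
buf.seek(0)

restored = pickle.load(buf)  # "continue": reload and resume stepping
restored["storage"] += (-restored["k"] * restored["storage"]) * 1.0
restored["time"] += 1.0
print(restored["time"], round(restored["storage"], 1))  # 11.0 37.8
```

The hard part in practice is the "full state" requirement: every internal variable, not just the obvious ones, must survive the round trip, which is why this item is ranked MODERATE to DIFFICULT.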