
HMC Project Overview
Within HMC, project calls are an instrument for fostering the continuous development of the HMC community - the annual call is a key means of activating and engaging the Helmholtz Association.
On this page you will find an overview of all projects that have received funding through the HMC project calls to date. 22 projects are currently funded across the three completed calls: 7 projects were selected for funding in the third call in 2022, 6 in the second call in 2021, and 9 in the first call in 2020.
Feel free to browse the projects, filter them by year, or read up on their current status.

ADVANCE: Advanced metadata standards for biodiversity survey and monitoring data: Supporting of research and conservation
The project aims to support metadata generation with interoperable metadata standards, using semantic artefacts that facilitate access to, integration of, and reuse of data from biodiversity research at Helmholtz Centres for research and applied biodiversity conservation. It focuses on monitoring data and on integrating data from the terrestrial, freshwater, and marine realms.

ALAMEDA: A scalable multi-domain Metadata management platform
ALAMEDA proposes a standards-based metadata management system that prioritizes the end-user perspective. This will allow users to access, search, and compare meta-information across databases, with automated metadata enrichment utilizing available DataHub tools, implemented exemplarily for soil moisture and stable isotope geochemistry.

AutoPeroSol: Towards automatic data management and a common ontology for perovskite solar cell device data
This HMC project will create an automated data flow to facilitate data and sample sharing among partners and collaborations in the field of next generation multi-junction solar cell devices.

eFAIRs: Enhancing FAIRness in seismological data management
The project plans to achieve FAIR data management of ocean bottom seismometer (OBS) data sets by integrating them into the routine data management workflows for long-term archival at seismological data centres. Moreover, consistent with the land stations, we plan to integrate instrument PIDs and automate the metadata creation process.

ELN-DIY-Meta: ELN-Driven InteroperabilitY for Metadata
In this project, a method for metadata and data transfer between the open-source ELNs Chemotion and Herbie will be developed. The project will aim for a generalization of the specific process via the description of a general guideline and the implementation of available standards. The concept will be demonstrated for the use case of polymer membrane research.

FAIR WISH: FAIR Workflows to establish IGSN for Samples in the Helmholtz Association
The project establishes standardised IGSN workflows for samples in the Earth Science community within the Helmholtz Association. The IGSN is a globally unique, citable, and persistent identifier (PID) for physical samples with discovery functionality on the internet.

FDO-5DI: FAIR Digital Objects for 5D imagery of our and other planet(s)
The aim of this project is to develop interoperable metadata recommendations in the form of FAIR digital objects (FDOs) for 5D (i.e. x, y, z, time, spatial reference) imagery of Earth and other planet(s). The main expected benefit is greater effectiveness and efficiency in managing, publishing, and interpreting imaging data for both domains, from exploring the ocean floor to planetary surfaces and everything in between.

HARMONise: Enhancing interoperability of marine biomolecular (meta)data across Helmholtz Centres
This collaborative project will develop sustainable solutions and digital cultures to enable high-quality, standards-compliant curation and management of marine biomolecular metadata to better embed biomolecular science in broader digital ecosystems and research domains.

HELIPORT: HELmholtz ScIentific Project WORkflow PlaTform
The project aims at developing a workflow platform which accommodates the complete life cycle of a scientific project and links all corresponding programs and systems to create a more FAIR and comprehensible project description.

HELPMI: Helmholtz Laser-Plasma Metadata Initiative
HELPMI will develop a metadata standard for experimental data of the global laser-plasma community. To date, this community widely uses openPMD, an open meta-standard established for the domain of simulations.

HERMES: Helmholtz Rich Metadata Software Publication
The goal of this project is to support researchers in publishing their research software, in a way that makes it findable, comprehensible, citable and reusable. The key to this is the creation, curation and deposit of rich metadata with software publications.

MEMAS: Metadata Enriched Manufacturing data for Automated Simulation
Manufacturing of composite parts involves multiple process steps, each of which generates large datasets from processing machines or surrounding sensors. With the help of the recently developed data management system shepard, the MEMAS project aims to store and connect these manufacturing data in a persistent way.

MetaCook: The Metadata Cookbook
MetaCook creates a framework for the preparation of (1) FAIR vocabularies and (2) FAIR ontologies. The VocPopuli software engages users in Git-based collaborative composition of controlled vocabularies, which are then converted into ontologies via semi-supervised machine learning with the help of OntoFAIRCook.

MetaMap3: Metadata generation, enrichment and linkage across the three domains health, environment and earth observation
MetaMap³ deals with the compilation, generation, enrichment, and mapping of machine-readable and interoperable metadata schemas for exemplary data from the three domains Health; Earth & Environment; and Aeronautics, Space & Transport.

Metamorphoses: Metadata for the merging of diverse atmospheric data on common subspaces
This project will develop enhanced standards for storage-efficient decomposed arrays, as well as tools for the automated generation of standardised Lagrangian trajectory data files, thus enabling optimised and efficient synergetic merging of large remote sensing data sets.

MetaMoSim: Generic metadata management for reproducible high-performance-computing simulation workflows
The project aims to develop a generic, cross-domain metadata management framework to foster the reproducibility of HPC-based simulation science, and to provide workflows and tools for the efficient organisation, exploration, and visualisation of simulation data.

MISO: Metadata for Ionospheric and Space Weather Observations
The project aims to develop metadata standards for the interoperability of spaceborne data from different satellites and communities.

PATOF: From the Past to the Future: Legacy Data in Small and Medium-Scale - a Blueprint for PUNCH and Other Disciplines

SECoP@HMC: A standardized interface for sample environment metadata and control - SECoP integration into experiment control systems
The Sample Environment Communication Protocol (SECoP) provides a generalized way of controlling measurement equipment, with a special focus on sample environment (SE) equipment. In addition, SECoP offers the possibility to transport SE metadata in a well-defined way.

STAMPLATE: SensorThings API Metadata ProfiLes for eArTh and Environment
Time-series data are crucial sources of reference information in all environmental sciences. Helmholtz Centres from the research field Earth and Environment operate some of the largest measurement infrastructures worldwide. To ensure the consistency and comparability of (meta)data from these infrastructures according to the FAIR principles, standardised interfaces and metadata conventions are required. A state-of-the-art, community-driven framework for time-series and sensor (meta)data that is jointly adopted across different scientific fields and communities is still missing.
The main aim of the STAMPLATE project is to develop the domain specific technical and semantic foundations for