The Handbook is currently under development and may change at any point; it is not meant for production use.

Maturity model


Indicators

Indicator Maturity Levels
Strategy for RDM
  1. No RDM strategy is defined
  2. Initial RDM strategy planned or drafted
  3. RDM strategy approved by the organisation
  4. RDM strategy is actively communicated
  5. RDM strategy fully implemented with regular revisions
Personnel for RDM
  1. No staff have explicit DM duties
  2. There are staff who have DM duties
  3. There are dedicated DM staff
  4. There is a dedicated DM coordination team
Internal RDM goals and KPIs (operational improvement)
  1. No RDM goals or KPIs defined
  2. RDM goals defined without KPIs
  3. (Some) RDM goals and KPIs defined (but not monitored)
  4. RDM goals defined and monitored via KPIs
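
Level 4 of the indicator above implies that RDM goals are tracked through measurable KPIs. As a purely illustrative sketch (the KPI, the data source, and the field names are hypothetical and not prescribed by this model), such an indicator could be computed as simply as:

```python
# Illustrative only: the KPI and the project attributes are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    has_approved_dmp: bool  # assumed attribute tracked by the organisation

def dmp_coverage(projects: list[Project]) -> float:
    """Return the share of projects with an approved DMP (one possible RDM KPI)."""
    if not projects:
        return 0.0
    return sum(p.has_approved_dmp for p in projects) / len(projects)

projects = [
    Project("PROJ-A", True),
    Project("PROJ-B", False),
    Project("PROJ-C", True),
]
print(f"DMP coverage KPI: {dmp_coverage(projects):.0%}")  # prints "DMP coverage KPI: 67%"
```
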
Adherence to relevant governing RDM policies at the international, European, national, local, and organisational levels
  1. Governing policies not taken into consideration
  2. Governing policies partially taken into consideration
  3. Governing policies fully taken into consideration
Research data governance: policies and procedures
  1. Minimal or no documented RDM policies and procedures. Individual researchers or departments manage research data independently.
  2. Some RDM policies and procedures are defined but not yet fully implemented across the organization.
  3. Well-defined and written RDM policies and processes, supported by technology and formal oversight. A unified approach to research data management in the organisation.
  4. Proactive and continuous improvement of well-established research data governance.
Organisational RDM efforts align with relevant RDM best practices at the international, European, national, local, and organisational levels (harmonisation)
  1. No consideration
  2. Some mentioned in strategy
  3. Networking and alignment actively pursued
Funding sources for RDM efforts
  1. No specific funding
  2. Funding from individual grant-based projects or minimal stable funding (annual allowance)
  3. Large stable funding or combination of grant-based and stable funding
  4. Combination of grant-based, stable funding and detailed cost recovery model

Indicator Maturity Levels
Legal framework for research data
  1. Legal processes are ad hoc and reactive, with no standardized framework or templates regarding RDM. Legal tasks are handled informally by various departments. Agreements go through minimal review.
  2. Standard templates and contractual clauses are being introduced, but not consistently used. The legal department exists but does not yet support project initiation or have fully defined processes for RDM issues.
  3. The legal framework is well-defined and standardized across the organization. Dedicated personnel are responsible for supporting the establishment of a legal framework covering RDM requirements. Comprehensive templates and standardized contractual clauses for RDM issues are in place.
  4. Continuously improved and optimized legal framework for research data; regularly updated to reflect RDM best practices and regulatory changes. Use of advanced, customizable templates and optimized contractual clauses for RDM issues.
IT security for research data (RD) and associated services
  1. Security measures for RD and services are ad hoc or minimal. There are few or no documented policies and/or procedures. Staff have limited awareness of IT security. Basic protections like antivirus software and standard firewalls are used reactively.
  2. Implementation of documented policies/procedures for security of RD; regular staff training on the recognition of and response to cyber threats; access control and password management systems are in place; regular software updates and patches.
  3. Incident response and disaster recovery plans for RD are in place and tested; proactive security measures, vulnerability testing and assessment; Multi-Factor Authentication (MFA) is enforced for sensitive systems; tools for advanced threat detection and response are in place.
  4. ISO certification or similar (e.g., ISO, NIST, ...); real-time threat monitoring for RD and automated response mechanisms are in place; regular audits and continuous improvement.
Data protection compliance of research data
  1. The organisation has a part-time Data Protection Officer (DPO) with general (consulting) responsibilities for RD, who actively monitors data protection compliance. Mostly, there is no standardised framework or templates regarding RDM, and most of the work is reactive.
  2. The organisation has a full-time DPO. Guidelines, documentation, and consultations for RD are actively provided by the DPO. There are no procedures in place to involve the DPO.
  3. Established systems for compliance of RD with data protection legislation (ROPA, consent management, DPIA, etc.). There are no formal procedures to involve the DPO; a DPO is sometimes involved at the planning stage, but not always.
  4. Procedures are in place and the organisation willingly and actively involves a DPO at the right stages of the project. The top leadership of the organisation understands the necessity of the DPO and of the processes regarding data protection. The organisation conducts regular audits (internal or external), and ongoing evaluation of data protection for RD takes place. Established procedures exist for the exercise of Data Subject Rights.
Ethics / ELSI for research data (regarding animal / human subjects, Nagoya Protocol matters)
  1. Ethical considerations for RD are minimal and reactive. There are no formal guidelines or processes for ensuring ethical compliance in research.
  2. Basic ethical guidelines are established, but not consistently applied. Awareness of ethical issues is growing, and some training is provided.
  3. Ethical guidelines regarding RD are well-defined and consistently applied. Regular training and monitoring ensure compliance, and there is a clear process for addressing ethical issues.
  4. Ethical standards regarding RD are fully integrated into the research process. Continuous improvement and proactive measures are in place, with advanced training and robust mechanisms for ensuring ethical integrity.

Indicator Maturity Levels
RDM Training
  1. No RDM training offered to staff
  2. Ad-hoc RDM training provided, no or limited targeted approach
  3. Regular training and/or broad audience reach with targeted contents
  4. A structured program is in place and train-the-trainer support is made available to internal trainers
Availability of RDM training material
  1. Not available
  2. Available on request
  3. Accessible to staff and partners only
  4. Open, multilingual/English and widely promoted
Framework for networking opportunities for organisational RDM expertise at different levels (e.g., local, national, international)
  1. Only internal interaction – RDM expertise is only shared and accumulated within the organisation, i.e., no further networking
  2. Informal interaction with external RDM experts – Sharing and exchange of RDM expertise is reaching beyond the own institute, but only informally
  3. Minimal network participation – Formal interaction with external RDM expert network(s), with occasional collaboration with external partners
  4. Formal support for RDM expert networking participation – Encouragement and/or funds to attend regular forums, meetings, or platforms enable regular collaboration and sharing of RDM expertise
RDM information / guidelines for researchers
  1. None
  2. Unclear and hard to find – Information is scattered or outdated; staff don’t know where to look
  3. Partially available – Some centralised sources exist and are occasionally updated; visibility varies
  4. Clearly visible and maintained – There are accessible, up-to-date sources that are actively shared across the organisation
Flow of RDM communication and responsiveness
  1. One-way and slow – Information is shared top-down only; feedback is rare or unanswered
  2. Some feedback and interaction – Staff can ask questions or raise issues, but response time and engagement vary
  3. Responsive and two-way – Communication is timely and interactive, with active listening and feedback loops in place
Formalization of support for RDM services (e.g., tools offered as a service, and similar)
  1. Services present but only informally supported
  2. Services formally supported and documented
  3. Professional service provision with ticketing system, usage agreements, formal documents, templates, etc.; relevant services are integrated with each other

Indicator Maturity Levels
RDM software and resources used by end-users
  1. No software or resources for data management present or recommended
  2. Reference to generic software and resources
  3. Software and guidelines provided and documented for researchers / relevant groups / relevant RDM processes
  4. Software and guidelines adopted by researchers / relevant groups / relevant RDM processes
Research data organisation standards
  1. No standard present or recommended
  2. Reference to generic best practices
  3. Guidelines / standards provided and documented for researchers / relevant groups / relevant RDM processes
  4. Guidelines / standards adopted by researchers / relevant groups / relevant RDM processes
Documentation standards
  1. No standard present or recommended
  2. Reference to generic best practices
  3. Guidelines / standards provided and documented for researchers / relevant groups / relevant RDM processes
  4. Guidelines / standards adopted by researchers / relevant groups / relevant RDM processes
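
To move the two indicators above from "provided and documented" towards "adopted", guidelines can be made checkable in practice. A minimal sketch, assuming a hypothetical organisational guideline that every dataset folder contains a README.md, a data directory, and a LICENSE file (the required items are an assumption, not part of this model):

```python
# Hypothetical check of a dataset folder against an organisational documentation guideline.
from pathlib import Path

REQUIRED_ITEMS = ["README.md", "data", "LICENSE"]  # assumed guideline, for illustration only

def check_dataset(folder: str) -> list[str]:
    """Return the items required by the (hypothetical) guideline that are missing."""
    root = Path(folder)
    return [item for item in REQUIRED_ITEMS if not (root / item).exists()]

missing = check_dataset("/path/to/dataset")  # placeholder path
if missing:
    print("Missing items:", ", ".join(missing))
else:
    print("Dataset folder matches the documentation guideline.")
```
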
Technical research infrastructure
  1. No technical infrastructure for research / RDM is available
  2. Technical infrastructure for research / RDM is available; general usage recommendation
  3. Technical infrastructure for research / RDM facilitates data handling along the research data life cycle; more specific usage recommendations
  4. Usage guidelines (and/or SOPs) and automated processes are present in the technical infrastructure for research / RDM regarding, e.g., storage conditions, archiving rules, deletion rules
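
Level 4 of the indicator above mentions automated processes for storage conditions, archiving rules, and deletion rules. A minimal sketch of one such automated rule, assuming a hypothetical 10-year retention period and placeholder storage locations:

```python
# Minimal sketch of an automated archiving rule; the retention period and paths
# are assumptions for illustration, not recommendations of the maturity model.
import shutil
import time
from pathlib import Path

RETENTION_SECONDS = 10 * 365 * 24 * 3600   # assumed 10-year retention period
ARCHIVE_DIR = Path("/srv/archive")         # hypothetical archive location

def archive_expired(storage_dir: str) -> None:
    """Move files older than the retention period from active storage to the archive."""
    root = Path(storage_dir)
    if not root.is_dir():
        return  # nothing to do if the (placeholder) storage location does not exist
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    now = time.time()
    for path in root.iterdir():
        if path.is_file() and now - path.stat().st_mtime > RETENTION_SECONDS:
            shutil.move(str(path), str(ARCHIVE_DIR / path.name))

archive_expired("/srv/active-research-data")  # hypothetical active storage location
```
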
Research data publication
  1. No guideline present or recommended for research data publication
  2. Reference to generic best practices of research data publication
  3. Specific guidelines and standards are provided regarding data and metadata
  4. Research data publication follows a standardized, documented process and support is available
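
Levels 3 and 4 of the indicator above call for specific metadata guidance and a standardised, documented publication process. As a hedged illustration, here is a minimal descriptive metadata record loosely modelled on the mandatory fields of the DataCite Metadata Schema (all values are invented placeholders, and the structure is simplified):

```python
import json

# Loosely modelled on the mandatory DataCite fields; simplified and illustrative only.
record = {
    "identifier": {"identifierType": "DOI", "identifier": "10.1234/example-doi"},  # placeholder DOI
    "creators": [{"name": "Doe, Jane", "affiliation": "Example University"}],
    "titles": [{"title": "Example research dataset"}],
    "publisher": "Example Institutional Repository",
    "publicationYear": 2025,
    "resourceType": {"resourceTypeGeneral": "Dataset"},
}

print(json.dumps(record, indent=2))
```
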
Catalog of data assets
  1. Lists of data assets capturing basic (non-standardized) information do not exist or exist only within separate sub-units.
  2. A central catalog exists (although incomplete) with basic metadata about data assets.
  3. A central catalog exists. Procedures are in place to capture most of the datasets of organizational interest. Search allows data assets of interest to be found based only on basic metadata.
  4. Centralized searchable catalog with standardized metadata allowing search of datasets in alignment with organizational goals and retrieval of metadata via API. All assets have a unique accession number.
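
Level 4 of the indicator above describes a searchable catalog whose metadata can be retrieved via an API and whose assets carry unique accession numbers. A minimal sketch, assuming a hypothetical catalog endpoint, accession number, and response format:

```python
# Hypothetical catalog API; the endpoint, accession number and response fields are assumptions.
import json
import urllib.request

CATALOG_URL = "https://catalog.example.org/api/assets"  # hypothetical endpoint

def get_asset_metadata(accession_number: str) -> dict:
    """Retrieve the metadata record for a data asset by its accession number."""
    with urllib.request.urlopen(f"{CATALOG_URL}/{accession_number}") as response:
        return json.load(response)

metadata = get_asset_metadata("DS-2025-0001")  # example accession number
print(metadata.get("title"), metadata.get("license"))
```
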
Accessibility of research data
  1. No standards are in place for the actual format of access, and no SOPs exist for access decision-making (e.g., unofficial approval by management)
  2. A standard access channel is in place; data access requests have to be submitted.
  3. Data use conditions are reviewed upon request. A full audit trail is in place for access.
  4. Data is easily accessible and retrieval is optimised for efficiency. Access is (semi-)automated, with APIs and efficient transfer channels allowing fast automated retrieval, following standard secure protocols. Data accessibility is seamless, with advanced search and retrieval capabilities following domain standards.
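
Level 4 of the indicator above mentions (semi-)automated access via APIs over standard secure protocols. A minimal sketch of authenticated retrieval over HTTPS; the endpoint, dataset identifier, and bearer-token scheme are assumptions for illustration only:

```python
# Hypothetical secure data access; endpoint and token handling are assumptions.
import urllib.request

ACCESS_URL = "https://data.example.org/api/files"  # hypothetical data access endpoint

def download_dataset(dataset_id: str, token: str, target_path: str) -> None:
    """Download an approved dataset over HTTPS using a bearer token."""
    request = urllib.request.Request(
        f"{ACCESS_URL}/{dataset_id}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(request) as response, open(target_path, "wb") as out:
        out.write(response.read())

download_dataset("DS-2025-0001", token="<access-token>", target_path="dataset.zip")
```
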
Data management planning
  1. Nothing offered
  2. Best practice and guidelines available
  3. Consultancy and training available, with more standardized guidelines; DMP writing may be required of some researchers
  4. DMP tool available, and machine actionability facilitates further integration; DMP writing may be required for all projects
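
Level 4 of the indicator above refers to machine actionability of DMPs. A heavily simplified sketch of a machine-actionable DMP record, loosely inspired by the RDA DMP Common Standard (field names are abridged and all values are invented):

```python
import json

# Heavily simplified and loosely inspired by the RDA DMP Common Standard (maDMP);
# abridged field names and invented values, for illustration only.
madmp = {
    "dmp": {
        "title": "DMP for the Example project",
        "created": "2025-09-17",
        "dataset": [
            {
                "title": "Sequencing results",
                "personal_data": "no",
                "distribution": [
                    {
                        "access_url": "https://repository.example.org/ds-0001",  # placeholder
                        "license": "CC-BY-4.0",
                    }
                ],
            }
        ],
    }
}

print(json.dumps(madmp, indent=2))
```
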

Version information

  • Version: 0.1.5 [Strategy updates by Vassilios]
  • Release date: 2025-09-17T12:38:11.869Z