Work package WP1 – RDM Audit assesses RDM practices at UH in order to identify gaps and requirements, and to transfer knowledge from experienced RDM practitioners to all staff holding valuable data. We are employing two methods to carry out the audit: a DAF survey and interviews.
DAF survey: we have carried out a survey based on the DAF methodology over the last month. The questions we asked were mostly faithful to the DAF online tool, with some tweaks to accommodate local infrastructure. The result is closer to the survey used by Orbital at Lincoln than to Iridium at Newcastle. The survey was circulated to around 600 staff via our “Research Grants News and Funding Opportunities” newsletter, with follow-up reminders sent by our information managers to schools and research centres. We have had 60 responses so far from senior researchers, principal investigators, research students, lecturers and research fellows. We have extended the open period at the request of a couple of research groups who want to consider it at their next regular group forums. The results already make interesting reading and will be published here soon.
Interviews: We have designed an interview protocol for carrying out semi-structured interviews with selected researchers across different disciplines at the University of Hertfordshire. The protocol was designed using the following sources:
- University of Southampton generic interview schedule and University of Oxford Interview Framework (from the DAF Implementation Guide)
- Data Management Plan Checklist (DCC)
- University of Bath Postgraduate DMP template
- Twenty Questions for Research Data Management (Oxford DMPonline Project)
When evaluating benefit, the first port of call will often be a hard financial metric:
- does RDM as a whole cost less now than before we started?
- are we winning more research grants as a result of RDM good practice?
Given the relatively short timescale of the project and our complete lack of existing RDM accounting, we cannot answer these questions. This leaves us considering a softer set of metrics:
- has the usage of robust centralised storage increased during the life of the project?
- has the use of Data Management Plans increased?
- how many datasets have we published in support of our publications?
Even these questions will not be easy to answer, because they have not previously been asked, but they are probably measurable over the period of the project.
There are less quantifiable but still tangible benefits to be recorded too. For example, RDTK has already led to a closer relationship between our information systems providers and research administrators, which had become distanced by organisational restructuring. For another example, as a result of a tangential intervention from RDTK, our largest ‘departmental’ facility (an 80-core HPC cluster and 200TB SAN) is about to move out of its less than ideal premises and into one of our purpose-built data centres, making it much less prone to downtime or disaster.
In order to capture this kind of collateral benefit and to try to get the individual researcher’s perspective we believe it is worth considering factors like ‘increasing awareness regarding RDM good practice’, ‘improving staff confidence in developing a DMP’, as well as the ‘usage of resources and organisational capacities and infrastructure to support RDM activities’. To this end we have added a section to our interview protocol which asks the respondents about their competencies in these areas. At the conclusion of the project we will return to our interviewees to see if their competencies have improved.
These measures of benefit may not show us an explicit return on investment, but to paraphrase ViDaaS’s James Wilson – it is better to measure what you can than what you can’t, and ‘soft’ benefits are known to yield hard results (see the latter part of the JISCMRD launch event: the thematic session on the business case for RDM).