RDAP 16 Poster: Evaluating Research Data Management Education: The Good, the Bad, and the Ugly

Posted on 15-Apr-2017



Evaluating Research Data Management Education: The Good, the Bad, and the Ugly

Regina Raboin, Assoc. Dir. Research & Education, regina.raboin@umassmed.edu

Amanda Rinehart, Data Management Services, Rinehart.64@osu.edu

Amy Koshoffer, Science Informationist, koshofae@ucmail.uc.edu

Tiffany Grant, Research Informationist, joffritm@ucmail.uc.edu

Carlson, J., Sapp Nelson, M., Johnston, L.R., & Koshoffer, A. (2015). "Developing Data Literacy Programs: Working with Faculty, Graduate Students and Undergraduates." ASIS&T Bulletin, 41(6), 14-17.

The Good:

- Addressed professional needs and OSTP/Federal Agencies requirements (OSU, UC, UMMS)
- Flexible curriculum development; modular, case studies, websites, customizable (UMMS)
- Cultivating relationships (UC, UMMS, OSU)
- Raised awareness (UC, UMMS, OSU)
- Individual consultations (UC, OSU)
- Group presentations (UC)
- Referrals (UC, OSU)

The Bad:

- Not specific to administrative student data or unfunded research (UC)
- Shorter sessions, less time between them (UC, UMMS)
- Not enough tool/software coverage (OSU)
- Too much content (UMMS)
- Time concerns (UC, UMMS)
- Updated content needed (UMMS)

Addressing (the Ugly):

- Pre-class surveys, customization (UMMS)
- Discipline- or audience-specific content (UC, OSU)
- Pilot courses, NEMDMC MOOC (UMMS), Vet Med (OSU)

As research data management services become more common, so does evaluation of those services. Although there is some recent research on assessing faculty and student research data management skills, little of it addresses assessment of the way we teach research data management (Carlson et al. 2015). We detail our own experiences with assessing data education activities and list the lessons learned.

Photo courtesy of kunstfoto at https://www.flickr.com/photos/darknetportal/1258650180, altered to remove background, CC BY-NC-SA 2.0

Lessons Learned:

- Seek out and involve institutional collaborators.
- Don't be afraid to switch it up: you can tweak the next session based on the feedback from the last.
- You can't please everyone all the time, so don't try! Give up comprehensiveness in favor of low-hanging objectives; attendees will come back for more if you are immediately useful.
- It's OK not to know the answer. Many times, no one knows the answer, because it hasn't been determined yet; credibility is key!
- Expect attrition from registration numbers; anything from 20-50% attendance is typical.
- Be aware that attendees, depending on their expertise, may need a more customized learning experience.
- Be flexible and don't take criticism personally!
