TRANSCRIPT
PRODUCT OR PROCESS?
CREATING PATHWAYS AND CATALYZING ADVENTURE IN THE ARCHIVES WITH THE SMITHSONIAN TRANSCRIPTION CENTER
Meghan Ferriter, Ph.D. (@meghaninmotion)
16 July 2015
Question:
In what ways might crowdsourcing &
knowledge-sharing inform cultural
heritage professional practice?
What can be gained from, and what is risked by, inviting crowds to work with your collections?
Can you bring together collections,
workflows, and crowdsourced
transcription to effectively extend
research and engagement?
Better… BEST
SPOILER: Yes, it appears so!
Smithsonian Transcription Center
Digital Smithsonian: Priorities
• Enhance the in-person visitor experience
• Digitize the collections
• Make content easy to find & use
• Spark engagement and participation
THE GOAL?
Create: indexed, searchable text
THE RESULT
Engagement
Connections
Access
Workflow
Prioritization + Collaboration
Is it worth the risk? And what about the challenges?
How does it work?
• Peer-review Process: Transcribe - Review - Approve
• ANYONE can Transcribe, anywhere, around the world, at any time
• Only registered volunteers can Review
• A final pass by Smithsonian staff for Approval
The TC Workflow
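The Transcribe → Review → Approve flow above can be sketched as a small state machine. This is purely an illustration of the rules described in the talk; the class, states, and method names are hypothetical, not the TC's actual implementation:

```python
from enum import Enum, auto

class State(Enum):
    OPEN = auto()            # anyone may transcribe
    IN_REVIEW = auto()       # registered volunteers review
    NEEDS_APPROVAL = auto()  # awaiting the staff's final pass
    APPROVED = auto()        # done

class Page:
    """One page moving through Transcribe -> Review -> Approve."""

    def __init__(self):
        self.state = State.OPEN
        self.text = ""

    def transcribe(self, text):
        # ANYONE can transcribe while the page is open
        if self.state is not State.OPEN:
            raise ValueError("page is not open for transcription")
        self.text = text
        self.state = State.IN_REVIEW

    def review(self, reviewer_registered, accept):
        # only registered volunteers can review
        if self.state is not State.IN_REVIEW:
            raise ValueError("page is not awaiting review")
        if not reviewer_registered:
            raise PermissionError("only registered volunteers may review")
        # a rejected transcription reopens the page for more editing
        self.state = State.NEEDS_APPROVAL if accept else State.OPEN

    def approve(self, staff):
        # final pass by Smithsonian staff
        if self.state is not State.NEEDS_APPROVAL:
            raise ValueError("page is not awaiting approval")
        if not staff:
            raise PermissionError("only staff may approve")
        self.state = State.APPROVED
```

The key design point the slides emphasize is asymmetric permissions: transcription is open to anyone, while review and approval are progressively restricted.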
MOTLEY CREW: (informal) n. - a roughly organized assembly of characters of various backgrounds, appearance, and character
DRAWING A MOTLEY CREW
…Containing characters of conflicting personality, varied backgrounds and a wide array of methods for overcoming adversity - achieved through the narrative using the various specialties, traits and other personal advantages of each member
EARNING LIFELONG LEARNERS?
Risks & Concerns
• Misbehavior & Vandalism
• Quality & Trustworthiness
• Process & Workflow - Center & Unit
• Resources - Units & Types of Volunteers
• Motivation - If We Build It, Will They Come?
Answering Risks & Concerns
• Misbehavior & Vandalism - not so much, so far
• Quality & Accuracy - ongoing evaluation
• Process & Workflow - Improving Design, Distributing Tasks
• Resources - Deepening understanding: Units & Volunteers
• Motivation - They're here! Best ways to extend engagement?
THE RESULT
Engagement
Connections
Access
Workflow + Prioritization + Collaboration
Transcription Center: By The Numbers
In just over 2 years, since June 2013: 5,187 Digital Volunteers from 176 countries have transcribed & reviewed 107,529 pages, including
• 27,440 pages from Archives Collections
• 47,394 Biodiversity Specimens & Labels
SUCCESS!
• Making PDFs (product) available to the Public
• Providing a Notes field to communicate
• Connecting People, Discoveries, & Contexts of Projects
• Integrating New Units
• Watching Community form, collaborate, (fight!), & grow
• Connecting TC data to external sources
• 7 Day Review Challenges + HangTime + Knowledge
• Improving workflows and addressing SI groups' needs
RESULTS
Engagement
How We Do It
• Focus on Process AND Product
• Collaborative & Cross-Promotional
• Two-way/multi-directional learning
• Soft gamification in social programming - collaborative competition
• Focus on skills acquisition & sharing
• No public leaderboard
• Behind-the-scenes access & acknowledgement
Collaborative Competition & Highlighting Discovery
• Helping researchers by making indexed, searchable text available through the Collections Search Center, providing PDF downloads from our project pages, and extracting data for collections records
• Exploring what has been transcribed to highlight hidden stories with #TranscribeTuesday, #FridayFinalLines and showcasing challenges with #MondayMindmelter
• Working together in our #7DayReviewChallenge, Contribute & Connect, and Hangouts
• Creating new research questions and enriching global resources relating to the wealth of people, organizations, places and topics in Smithsonian Collections
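The goal of "indexed, searchable text" named above can be illustrated with a toy inverted index. This is a hypothetical sketch for illustration only (`build_index` and the sample pages are my own); the Collections Search Center's real indexing is certainly more sophisticated:

```python
from collections import defaultdict

def build_index(pages):
    """Map each lowercased word to the set of page ids containing it."""
    index = defaultdict(set)
    for page_id, text in pages.items():
        for word in text.lower().split():
            index[word.strip(".,;:!?")].add(page_id)
    return index

# Two invented sample "transcribed pages"
pages = {1: "field notes on gorilla anatomy",
         2: "anatomy sketches and labels"}
index = build_index(pages)
# index["anatomy"] now points to both pages; index["gorilla"] only to page 1
```

Once transcription produces plain text, even this simple structure turns page images that were previously opaque into content a researcher can query by keyword.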
We Stay Busy
• Share projects
• Leave notes in Projects
• Share their success
• Ask for help from other #volunpeers
Working Together
Contribute…
…and Connect
RESULTS
Connections
RESULTS
Access
PDF Download (free)
Accessing PDFs
From our 6 Archives: 2002 downloads in the last year… with no additional effort required from archival staff!
RESULTS
Quality?
Assessing Quality
• Peer review
• Evolving and Iterative Learning
• Precise & Scoped + Specific to Unit
• Giving us more than we need + Refining Qs
• Volunteers "get" peer review… sometimes!
Henry Cushier Raven: OCR vs. Volunteers
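Comparisons of OCR output against volunteer transcription, as in the Henry Cushier Raven field books, are commonly quantified with character error rate (edit distance per reference character). The slides do not state which metric was used, so this Levenshtein-based sketch is an assumption, not the talk's method:

```python
def edit_distance(a, b):
    # classic dynamic-programming Levenshtein distance
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def char_error_rate(reference, hypothesis):
    # edits needed to turn the hypothesis into the reference,
    # normalized by the reference length
    return edit_distance(reference, hypothesis) / max(len(reference), 1)
```

With a trusted transcription as the reference, the same function scores either an OCR pass or a volunteer's draft, which is what makes a head-to-head comparison possible.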
RESULTS
Workflow & Prioritization
Challenges Remain
• Resources: Time, Money, Brains
• Maintaining Balance, Ecology of Site
• Digitizing Material
• Workflow demands
• Sharing & Helping Other Crowdsourcing/CitSci/Digital Humanities Projects
• Participation by New Units