
OER synthesis and evaluation
Phase II Start-Up Meeting – September 2010
UK OER II synthesis and evaluation
Allison Littlejohn, Lou McGill
Helen Beetham, Isobel Falconer
Caledonian Academy
Glasgow Caledonian University, UK
www.academy.gcal.ac.uk

Role of the evaluation team
• Support projects in their evaluation work
  − Wiki, blog, evaluation resources, one-to-one support
  − Collate evidence across projects
• Evaluate the programme overall against its original objectives:
  − OERs released and collected
  − Practices around OER reviewed/reformed/cascaded
  − Lessons learned about OER release, management, discovery and use
  − Benefits in terms of challenges and stakeholders
• Report on findings, e.g. alternative approaches to OER, approaches that are sustainable and usable, evidence of uptake and use, identified benefits to users
• Link with other members of the support team and across strands – including with the 'users' study

What will we actually do?
• A synthesis/evaluation framework with existing evidence/findings that we will build on
• 1-2-1 feedback on the evaluation components of your plan, plus mapping to the framework
• Resources to support evaluation/tracking
• Interim findings/blog posts to help you track and contribute to emerging issues
• Support to projects (with a named contact and project pairings)
• Final report

Our resources
• Wiki: https://oersynth.pbworks.com/OER-Synthesis-and-Evaluation-Project
  − report from the UK OER Pilot Phase
  − synthesis framework
  − evaluation resources, e.g. 'Which Evidence', 'Evaluating OERs', 'Tracking OERs'
• Blog: http://oersynthesis.jiscinvolve.org/wp/
  − updates about our work and findings
  − commentary on events and issues in the OER community
• OER InfoKit: https://openeducationalresources.pbworks.com/

Pilot phase findings: motivations to release
Motivations are typically multiple, complex, and open to change as the process unfolds. They might include:
• personal academic/professional reputation
• share-and-share-alike
• institutional reputation and attracting potential students
• commitment to the open education agenda
• capacity building, e.g. staff skills, content management
• outreach and public engagement goals
• other public interest agendas (e.g. public health, climate change)
• efficient content development (e.g. in niche/declining subjects)
• enhancing learner access and choice (e.g. work-based, international, lifelong learners)
• changing modes of learning (e.g. peer-to-peer, learner-directed)
• building curriculum partnerships (e.g. with industry)

Pilot phase findings: sustainable approaches
• Individual showcasing
  − reputation enhancement, personal/professional rewards, individual values (openness, public interest, quality), learner focus
• Institutional showcasing
  − attracting students, learner choice, (international) reputation, potential learners and partners as end-users, influencing
• Share and share alike
  − tightly-knit subject/topic communities, learners and other teachers as end-users, sharing practice, scholarship, collaboration
• Capacity building
  − staff skills, institutional strategies (e.g. LTA, content management), change awareness, sustainable development
• Long-term sustainability
  − close the loop with re-use – demand/supply, discovery, re-usability, value
  − communities that are already sharing are OER-ready
  − staged release helps manage risk and gives more control

Pilot phase findings: aspects of openness
Different motivations to release → different definitions of 'open' and different priorities for open content, e.g.:
• Re-usability vs integrity (granularity issues)
• Generic, often skills-based vs topical (tied to subject benchmarks/professional body requirements?)
• Professionally produced RLOs/multimedia vs 'shared back of envelope'
• Context-free vs various means of representing educational context, level, values, purpose
• Personal, institutional, community branding vs no branding/third-party branding
• Different hosting solutions (push/pull, Web 2.0/repository)
• Linked-to vs embedded content elements (updating?)

Pilot phase findings: benefits
We are seeing evidence (and we need more) that OER release can:
• have institutional marketing potential
• enhance visibility to stakeholders (employers, potential learners, franchise/partner colleges)
• catalyse change in institutional strategy and practice
• support new partnerships around content development
• support sharing/discussion of teaching practice
• be part of a development strategy for centres of excellence (scholarship/teaching)
• be attractive to established scholars (legacy) and to new ones (building reputation)...

We still need to find out...
• more about OER use and re-use
• who OERs are showcased to, and with what impact
• what kinds of communities benefit from OER sharing and/or have existing 'open' practices
• the professional development implications of OER
• the impact of cascade activities on capacity and organisational readiness
• how Web 2.0 tools can be used to maximise discoverability
• how repositories, including JorumOpen, are being used and how effectively
• the impact of collection/collation on discovery and re-use
• the impact of OER release in specific topic areas and case study areas to meet sector challenges
• ...

Strand-specific evaluation issues
• Release (including OMAC)
  − Technical and organisational, legal and quality issues, motivations and benefits cases, integration into programmes
• Collection/collation
  − Technical and usability issues, user engagement, evidence of discoverability and other benefits
• Cascade
  − Evidence of capacity building, organisational readiness issues, guidance/support needed by the sector
• Tracking and user studies
  − User engagement, quantitative and qualitative evidence of use and re-use