23 things & the EBLIP toolkit
"What resources/tools should be included to in an EBLIP toolkit at Libraries Using Evidence – eblip.net.au to ensure it is a resource that library practitioners across the world will find useful to support application of EBLIP? How can Library 2.0 techniques further enhance the toolkit’s usefulness?".
This is the question upon which a project I am involved with is based, and it will be the focus of my thinking as I work through the "23 things". I expect one deliverable of the project will be a report/issues paper outlining our experience in exploring and implementing "Library 2.0" throughout the toolkit's development (and therefore informing decisions about using Library 2.0 tools beyond the project's pages, on other online pages/services).

I spent a few hours travelling by train today, so had a bit of time to ponder things! I started to consider what sort of evaluation measures might be used. My initial thoughts are:
Carry out a Library 2.0 skills/tools audit (individual perspective from all team members, plus what is used on our intranet/internet sites)
* as at start of Learning 2.0/project (they coincide)
* end of Learning 2.0
* end of project
* 6 months post Learning 2.0/project
*** note to self - this needs to happen this week, before teammates get stuck into the course!
Using a scale of something like
* no experience/use
* dabbled with
* trialling/investigating/looking good
* "I'm hooked"/it's firmly in place
To investigate
* were skills developed?
* were new technologies implemented?
* was implementation sustained & were further developments actioned?
* Which ones were successful? Which ones weren't? Why? *This needs a qualitative data source - a comments section in the audit? A discussion topic in the project evaluation love-in (focus group style)? The project team documenting experiences/reflections as the exploration is undertaken (diaries/project team meeting minutes)? (A rough sketch of how the audit data might be recorded follows this list.)
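Just to pin the idea down for myself, here is a very rough sketch, in Python, of how the audit responses could be captured so they can be compared across the four audit points. This is purely illustrative - the scale and audit points are the ones above, but the function names, example tool names and data layout are invented for the sake of the sketch, not anything the project has decided on.

```python
from enum import IntEnum

class Usage(IntEnum):
    """The proposed scale, lowest to highest."""
    NO_EXPERIENCE = 0   # no experience/use
    DABBLED = 1         # dabbled with
    TRIALLING = 2       # trialling/investigating/looking good
    HOOKED = 3          # "I'm hooked"/it's firmly in place

# The four audit points proposed above.
AUDIT_POINTS = ["start", "end of Learning 2.0", "end of project", "6 months post"]

# audit[audit_point][team_member][tool] = (rating, free-text comment)
audit: dict[str, dict[str, dict[str, tuple[Usage, str]]]] = {}

def record(point: str, member: str, tool: str, rating: Usage, comment: str = "") -> None:
    """Record one team member's rating (and optional comment) for one tool at one audit point."""
    audit.setdefault(point, {}).setdefault(member, {})[tool] = (rating, comment)

def change(member: str, tool: str, earlier: str, later: str) -> int:
    """How far has a tool moved between two audit points? Positive = skills/use developed."""
    return int(audit[later][member][tool][0]) - int(audit[earlier][member][tool][0])

# Example with made-up data:
record("start", "me", "Blogs and blogging", Usage.DABBLED)
record("end of project", "me", "Blogs and blogging", Usage.HOOKED, "team blog now firmly in place")
print(change("me", "Blogs and blogging", "start", "end of project"))  # -> 2
```

Even if this ends up as a spreadsheet rather than anything coded, the same shape (audit point x team member x tool, with a rating plus a comment) would give both the quantitative change and the qualitative "why" in one place.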
I wonder if this would be something any of the "real Learning 2.0" participants would be interested in for their own work unit/s? It would be interesting to see how the results of the audits differ between a hospital/academic library and an innovative public library. Are different tools "loved" by different workplaces, or do we all agree that the same good ones are good and the same tricky ones are tricky? Is this sort of evaluation [or similar] already happening at PLCMC?
This is meant to be a fun experience, but so much the better if it is 'reportable' fun!
2 Comments:
Possible things to include in the audit, from Stephen Abram:
* RSS (really simple syndication)
* Wikis
* New and revised programming methods like AJAX and APIs
* Blogs and blogging
* Commentary and comments functionality
* Personalization and “My Profile” features
* Personal media such as podcasting and MP3 files
* Streaming media audio and video formats
* Reviews and user-driven ratings
* Personalized alerts
* Web services
* Instant messaging and virtual reference, including co-browsing
* Folksonomies, tagging, and tag clouds
* Photos (e.g. Flickr, Picasa)
* Social networking software
* Open Access, Open Source, Open Content
* Socially driven content
* Social bookmarking (such as del.icio.us)
A list of Web 2.0 tools:
http://www.listible.com/list/complete-list-of-web-2-0-products-and-services