Tuesday, August 31, 2010

Summary of Week Five

Daffodils by di_the_huntress
Spring is here and you have all got a spring in your step as well. You have been very busy indeed with the mock evaluations, and with thinking about your evaluation projects. The thorough approach you took in the tasters and mock evaluations has made my heart skip a beat. It is very exciting to anticipate what you will come up with when doing the real thing. There has been some shuffling of partners, and hopefully everyone is feeling more confident about the collaborative approach to the projects. The details are on the wiki in the section about Evaluation Planning.

Mareena uses a wonderful analogy for evaluation, which all of us will be able to relate to - clothes buying. See if you agree that when she measures the quality of the clothes "in terms of price" she is evaluating from a cost-effectiveness perspective. Price must surely influence the decision to buy, and its weight will vary in different contexts and situations. For example, you may spend more than normal on one item if buying a garment for a wedding. How does the context influence different types of evaluation?

Veronique and Mareena have prepared a comprehensive description of the Searching for Information module, which they evaluated. They have identified some items which could cause considerable frustration for students on dial-up connections. At the level of digital information literacy for which this module was designed - lower level users - such problems could be very off-putting. This demonstrates the necessity of getting the balance between "fancy stuff" and "usable stuff" right. They have done a good job here, balancing constructive suggestions with the "fish hooks" they encountered. I like the way each person has referred to the other's perceptions of the module, and Veronique has also referred to others' evaluations, e.g., Alex's suggestion about the length of modules like this.

Lisa and Alex have identified some important factors about the computer taster and have taken an educational perspective which includes the provision of interaction opportunities. It is interesting that they have picked up on the need for online communication to supplement the f2f lectures and the online and computer-based activities. I agree that the evaluation can be used to find out what "would work" as well as what is working. They have made some good suggestions - does anyone know whether they match what the evaluator discovered in his report in the Exemplars section?

Alex, Jon and Misha are the first people to put up their choice of eLearning Guidelines. The explanation Alex has provided with each one indicates how new eLearning is to her organisation and some of the challenges they face. Jon has taken a different approach to his choice of guidelines, re-describing them to give a bigger-picture view of the issues from his perspective. Misha has chosen a mix appropriate for both students and staff. Remember you can re-word the guidelines to suit your context, and also make up your own using the framework.

Jon mentions feeling some "excitement" about learning creeping in. He makes some valid points about the danger of over-analysing. My experience is that this fear has led either to no analysis at all, or to very poor estimations of what is possible or necessary. Getting the balance right is key to progressing, as he says, and likewise, I believe it is the end user who must be considered pivotal to the whole process. How well do you believe educational organisations do this, or are they too concerned with economics?

Katie, Alex and Rachel teamed up to evaluate the Digital Information Literacy OIL module. They were very creative and organised their evaluation on wikispaces with the different perspectives - corporate and primary school - side by side. See if you agree with them. I agree that offering the module as part of a blended approach would be much more powerful, and that making the module more user-friendly for dipping in and out would also work. Excellent evaluation!

Lisa and Alex have also evaluated the Business module. They have highlighted some of the issues which can arise with pre-packaged learning objects. First of all, it is very hard to design something which "suits all the people all the time" - in terms of navigation, content, layout, learning preference and style. Although there are style sheets for this type of thing, the minute the developer wants to get creative, they may become irrelevant. As they say, working through a self-paced module like this is boring if the level of interaction isn't just right for each individual - it is like reading a textbook from start to finish. They make good points about the length of the module and the amount of content.

The one thing which I find frustrating with these modules is the hoohah required to customise them. That is why my preference has moved away from static learning objects such as this, customisable as they may in theory be, towards open developments such as those on a wiki. One of the modules - Digital Information Literacy - started its life on a wiki, but the project brief was to create Learning Objects. One of the developers wanted to push the wiki option but was outnumbered - I guess he was right all along. What are your thoughts?

Dana has posted an overview of the reasons for his choice of evaluation - it will be an effectiveness evaluation - and he will evaluate "the smiley face approach". This is a resource Dana has developed to teach the eukaryotic cell in the Bachelor of Nursing, to help students understand and always remember the basis of life. I really like the reflective element which is emerging in these posts, and the next one - Working together with Louise - is particularly reflective, describing the difficulties which can occur when teachers are trying to be students, and also the challenges which can arise when working collaboratively on projects.

Katie has formulated an excellent big evaluation question. It appears she will not only need to find out what sort of technologies/tools the teachers would like to use, but also what they will need to use to keep up with educational trends, and maybe later she can find out more about what is going to be relevant for students to use. I have suggested that a good starting point might be Derek Wenmouth's post - 100 Ways Google Can Make You a Better Educator. Some of you may know him - he critiques the latest strategies and technologies for the school sector. Katie has made a great start with the thinking for her part of the evaluation plan. Where will she head next?

Misha is feeling behind the 8 ball, but her post makes it seem like she has been doing this blogging lark for a while. The guidelines she has chosen are very appropriate for the situation described. Being a "fingernail ahead" must be excruciating - hanging off the cliff alone and wondering if anyone is going to come and save her. She mentions that old chestnut, "time" - there is never enough, and staff are always concerned with what takes priority, so their up-skilling generally gets left until last. Why, I wonder, is course development always seen as "the thing you do" on top of everything else, with the word quality rolled out as an afterthought? Harrumph! It makes me grumpy too!

Her plan to do "a Needs analysis around the digital literacy levels of our students and more particularly staff" is an excellent choice. I like the 3 Fs! People may be interested in some research I have recently completed with Oriel and others on Digital Information Literacy capability - staff and students - Digital Information Literacy: Supported Development of Capability in Tertiary Environments.

We found that time to play was essential for supporting people to develop confidence in using a range of digital technologies in our information rich world. 

I hope you can all see evaluation as a way to celebrate what is working well, as well as seeing it from a problem-based focus. Remember, whether planning a needs analysis or an effectiveness or maintenance evaluation, you can look at something from several different angles and perspectives. However, for this project, you need to keep your approach focused and not let it get too big and unwieldy. When choosing your sampling tools for collecting data, make sure you enable triangulation so that more reliable information can be obtained. So far there is a mix of needs analysis, effectiveness and maintenance evaluation projects planned.
