Hi all,
Please find my initial draft results here, summarized, for comment.
Thanks.
Debra
Thursday, June 18, 2009
Saturday, June 13, 2009
Initial Thoughts on Results
Some initial thoughts now that the evaluations are completed...
I chose to do a small user review questionnaire, as well as peer group discussions and an 'expert review'. I got 100% completion, and as a bonus, one of the peers became an 'expert reviewer' as well (he's our eLearning Advisor). This was extremely valuable to me in terms of overall usability and looking at the actual instructional design issues.
The instrument that made the biggest impression on me was the questionnaire on Moodle. Moodle was chosen because the programme resources are to be delivered to students on Moodle. I only had 5 students participating, but I could see exactly when each questionnaire was completed online, and I could download the results (including any comments) straight into an Excel spreadsheet for analysis. Moodle also emails you a summary of each completed evaluation. Now I wish that I had asked more users! Like Herve, though, I was very surprised at the amount of effort it took to get even that small number of users to complete. I seemed to have timed the evaluation perfectly for the real-time programme development, but badly in terms of asking for people's time: people were on leave or about to go on leave, and there was general mayhem with the end of term coming up, I think.
I think I will present the quantitative results pretty much as Bronwyn suggested: % frequency and means for each question. I think there are clear trends in the question groups. These trends, and the comments I received, can also be related to the sub-questions via a table.
A bar graph of the final results could show the trends/numbers well (given the small number of users participating, and the Likert scale responses).
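For illustration, here is a minimal sketch (Python, with pandas and matplotlib) of how the spreadsheet downloaded from Moodle could be turned into % frequencies and means per question, and then a bar graph. The file name, the Q1..Qn column naming, and the 1-5 Likert coding are all assumptions for the example, not the actual export format.

```python
# Minimal sketch: % frequencies, means, and a bar graph for Likert responses.
# Assumptions: results exported from Moodle to 'evaluation_results.xlsx'
# (hypothetical file name), one column per question named Q1..Qn, coded 1-5.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_excel("evaluation_results.xlsx")
questions = [c for c in df.columns if c.startswith("Q")]

# % frequency of each response option (1-5), per question
freq = df[questions].apply(lambda col: col.value_counts(normalize=True) * 100)
freq = freq.reindex(range(1, 6)).fillna(0).round(1)
print(freq)

# Mean response per question
means = df[questions].mean().round(2)
print(means)

# Bar graph of the means - with only 5 users, simple means are about all the data supports
ax = means.plot(kind="bar", ylim=(0, 5))
ax.set_ylabel("Mean response (1 = low, 5 = high)")
ax.set_title("Mean Likert response per question")
plt.tight_layout()
plt.show()
```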
The emphasis for me was not on the quantitative data but on the qualitative data (comments and feedback) received from the discussions with peers/the expert (and the 'students'). Basically, I had several discussions with the peers as they progressed through the programme. Given the amount of time the evaluation took, they all needed more than one sitting to complete it. And they all completed their questions on paper before the final discussion with me (so I have those as well as my own notes). I underestimated the amount of time needed.
Presentation of the qualitative data will be as summarized comments, grouped under the various headings used, perhaps in table form as well.
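As a sketch of that table, the grouping could be done in pandas as well; the headings and comments below are made-up placeholders echoing the themes of this post, not the actual feedback.

```python
import pandas as pd

# Hypothetical example rows - placeholder headings/comments, not the real data.
comments = pd.DataFrame({
    "heading": ["Getting started", "Getting started", "Content", "Instructional design"],
    "comment": [
        "Log-on instructions were easy to misread",
        "Hard to find the programme in the Moodle list",
        "Content and learning support were all there",
        "Some 'getting started' screens need fine-tuning",
    ],
})

# One row per heading, with the summarized comments joined together
summary = comments.groupby("heading")["comment"].agg("; ".join).reset_index()
print(summary.to_string(index=False))
```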
The stage of development of the online programme meant that access was via guest log-ons, for both Moodle and the GET SET programme. This process was not as seamless as I had hoped: while I had pre-arranged the guest log-ons and access, they did not arrive at quite the right time, and I handed out my questionnaires/information sheets before having the log-ons. I had to backtrack quite a bit, with lots of talking through the process of logging on (interesting how instructions can be 'lost' and not quite read properly as well). Even one of my peers misread the instructions to use an enrolment key when logging on.
I had been so used to working on the programme these last few months that I did not realise, either, how hard it would be for others to actually 'find' it in the list of all programmes on Moodle - a minor hiccup! It was unintentionally hidden in a layer below what people normally see :-(.
What stood out, basically, was that the pedagogy of the programme was great - the content and learning support were all there; it was the instructional design and the general 'getting started' instructions that needed fine-tuning. No real surprises resulted, but some valuable and timely feedback occurred, which is currently being implemented.
I believe that my overall sub-questions were answered, with sufficient feedback to make some key changes in time for the programme to be released generally.
On to the summary of the results...
Debra
Sunday, June 7, 2009
Progress at Last
I have completed my plan; it's been marked, I've incorporated the feedback, and it is currently being implemented. Bronwyn suggested I share it.
Please find my final plan here on Google Docs as a web page.
In terms of 'progress', in real time: my plan included gathering responses from peers, an 'expert reviewer', and student-user reviewers. To date, I have talked with 2 of the 3 peers (one has become elusive :-( ), and received feedback from the expert reviewer (still to 'interview' him); the users have yet to complete the questionnaire for me. So, nearly there. Some interesting and valuable feedback is already coming in, so I am very keen to get into the analysis this week. I actually think, after agonising over the writing of them, that I have asked the right questions in my questionnaires!
Thanks
Debra