Sunday, May 26, 2013

W3: Effective Web Instruction

Ted Frick and Elizabeth Boling simplify the process of designing computer mediated learning in Effective Web Instruction: Handbook for Inquiry-Based Process.

This easy to read handbook concisely describes the essential steps for creating effective computer based training. It is split up into 6 main parts: Getting Started, Conducting the Analysis, Preparing for Testing a Prototype, Testing a Prototype, Building a Web Prototype, and Assessing and Maintaining the site.

This summary will cover the first two parts: Getting Started and Conducting the Analysis.

Getting Started
In Getting Started, Frick and Boling explain that to design inquiry-based web instruction, one must follow an iterative process of making "empirical observations to answer questions and make decisions" (p. 4), giving you a chance to reevaluate and adjust the training as necessary throughout the design process.

Frick and Boling stress that a designer need not be an expert in all stages of the design process, as long as they hire good people for the design team whose strengths complement their teammates' weaknesses. They list the most important members of a web-based instruction design team as: analyst; instructional designer; information and graphic designer; web technology specialist; evaluation/usability testing specialist; and web server & system administrator.

Conducting the analysis

Develop your Instructional Goals
Frick and Boling assert that the first thing to do when developing instructional goals is to determine who the "stakeholders" are. The stakeholders are the people most directly (and sometimes indirectly) affected by the training and its outcome: the people who are most invested in the training and most affected by its success (or failure). In the case of my project, the stakeholders are
  • me (as the Language Instruction Specialist of the department, it's my job to teach pedagogy principles and lead professional development),
  • my boss (because his reputation is affected by the work CeLCAR publishes) 
  • CeLCAR, the department I work for (part of our grant stipulates providing professional development opportunities for LCTL teachers and language developers, and every four years we have to reapply for funding)
  • US Department of Education/Title VI (because they provide the majority of our funding and evaluate our materials)
  • Indiana University/College of Arts and Sciences (because they provide partial funding and it is their name on the final project)
  • Language Developers (they will be using the training to help them develop language textbooks)
  • LCTL Teachers (because they will be using the finished product to teach their classes)
  • LCTL Students (because they will be using the textbooks to learn the languages)
For this project, I plan on focusing on the stakeholders I have immediate access to: my boss, CeLCAR, the language developers, LCTL Teachers, and LCTL students.

Form the Instructional goals and identify their indicators
Once you have determined the stakeholders, you can think about what you want the learners to be able to do at the end of the training. Frick and Boling go one step further and encourage designers, while working on instructional goals, to think about how to assess whether or not a learner has acquired the desired ability/skill/knowledge. I especially enjoyed this discussion, because these concepts are mostly new to me (they are very different from what we discussed in my Testing courses, which focused more on how to write good traditional tests). The examples given on the differences between "more or less efficient and more or less authentic" (p. 11) were very useful for illustrating the different levels of efficiency and authenticity available.

Obviously, I wrote my instructional goals before thinking about mastery, and I am still unsure of how I am going to assess them. This means that as I start thinking about mastery assessment (due next week), I expect I might have to revise my instructional objectives as well. (It is an iterative process, after all!)

Learner Analysis
Conducting a thorough learner analysis is important for determining motivation, predicting possible difficulties, planning learning, etc. In the case of my project, I am very familiar with my learner audience. I feel my biggest obstacles will stem from the fact that my learners:
  • are second language speakers
  • work full time (they could view this training as another drag on their time)
  • often lack formal language education experience (I must be careful not to teach beyond their zone of proximal development)
I will have to continue working on a thorough learner analysis in order to get a more complete view of my learners (their comfort with technology, etc.).

Context Analysis
Frick and Boling go to lengths to explain that web-based instruction should never be chosen "just because you can." Instead, a designer should have sufficient justification for choosing a web-based context over standard pen-and-paper and/or classroom delivery.

In the case of my project, my justification is: 
  • time (instructors/developers can complete the training on their own and at their own speed, with no meetings to attend that interfere with their many other duties)
  • money (we can't afford to host a multi-developer in-person training)
  • distance (many of our potential developers are located overseas and are unable to attend in person)
I'm still not sure of the resources we will need, besides a computer and internet access. I will work on this more as the project develops.

1 comment:

  1. Hi Amber,

    It seems for the most part, your project is lining up nicely with the guidelines detailed in the first part of this handbook. I also had to rewrite my original objectives to align with the mastery assessment. I know we have talked about alignment in previous classes, but this class has really helped me understand how objectives and assessment should work together and support each other.

    Kassie
