Below are the steps of the development research process based on the instructional design model of Dick, Carey, and Carey. This design model is used in educational research that involves the development of tools, plans, teaching materials, and media.
Identify Instructional Goal(s)
The first step in the model is to determine what new information and skills you want learners to have mastered when they have completed your instruction, expressed as goals. The instructional goals may be derived from a list of goals, from a performance analysis, from a needs assessment, from practical experience with learning difficulties of students, from the analysis of people who are doing a job, or from some other requirement for new instruction.
Conduct Instructional Analysis
After you have identified the instructional goal, you determine step by step what people are doing when they perform that goal and also look at subskills that are needed for complete mastery of the goal. The final step in the instructional analysis process is to determine what skills, knowledge, and attitudes, known as entry skills, are needed by learners to be successful in the new instruction. For example, students need to know the concepts of radius and diameter in order to compute the area and the circumference of a circle, so those concepts would be entry skills for instruction on computing area and circumference.
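As a minimal sketch of that example (not part of the Dick and Carey model itself), the target skills of computing area and circumference build directly on the entry-skill concepts of radius and diameter:

```python
import math

def circle_measures(radius: float) -> tuple[float, float]:
    """Compute the area and circumference of a circle.

    The entry-skill concepts are radius and diameter (diameter = 2 * radius);
    the target skills apply the formulas A = pi * r^2 and C = pi * d.
    """
    diameter = 2 * radius                 # entry-skill concept
    area = math.pi * radius ** 2          # target skill: area
    circumference = math.pi * diameter    # target skill: circumference
    return area, circumference

print(circle_measures(3.0))  # approximately (28.27, 18.85)
```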
Analyze Learners and Contexts
In addition to analyzing the instructional goal, there is a parallel analysis of the learners, the context in which they will learn the skills, and the context in which they will use them. Learners' current skills, preferences, and attitudes are determined along with the characteristics of the instructional setting and the setting in which the skills will eventually be used. This crucial information shapes a number of the succeeding steps in the model, especially the instructional strategy.
Write Performance Objectives
Based on the instructional analysis and the description of entry skills, you write specific statements of what learners will be able to do when they complete the instruction. These statements, derived from the skills identified in the instructional analysis, identify the skills to be learned, the conditions under which the skills will be demonstrated, and the criteria for successful performance.
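For illustration only, continuing the circle example, a performance objective can be represented as a simple record with those three components; the structure and field names below are assumptions made for this sketch, not part of the model:

```python
from dataclasses import dataclass

@dataclass
class PerformanceObjective:
    """Hypothetical three-part objective: skill, conditions, criteria."""
    skill: str       # what the learner will be able to do
    conditions: str  # the conditions under which the skill is demonstrated
    criteria: str    # the standard for successful performance

objective = PerformanceObjective(
    skill="compute the area and circumference of a circle",
    conditions="given the circle's radius and a calculator",
    criteria="correct to two decimal places on 9 of 10 items",
)
print(objective)
```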
Develop Assessment Instruments
Based on the objectives you have written, you develop assessments that are parallel to and measure the learners' ability to perform what you described in the objectives. Major emphasis is placed on relating the kind of skills described in the objectives to the assessment requirements. The range of possible assessments for judging learners' achievement of critical skills across time includes objective tests, live performances, measures of attitude formation, and portfolios that are collections of objective and alternative assessments.
Develop Instructional Strategy
Based on information from the five preceding steps, you then identify the strategy to use in your instruction to achieve the goal. The strategy will emphasize components that foster student learning, including such preinstructional activities as stimulating motivation and focusing attention, presentation of new content with examples and demonstrations, active learner participation and assessment, and follow-through activities that relate the newly learned skills to real-world applications. The strategy will be based on current theories of learning and results of learning research, the characteristics of the media that will be used to engage learners, the content to be taught, and the characteristics of the learners who will participate in the instruction. These features are used to develop or select materials and plan instructional activities.
Develop and Select Instructional Materials
In this step you use your instructional strategy to produce the instruction. This typically includes guidance for learners, instructional materials, and assessments. (In using the term instructional materials we include all forms of instruction such as instructor's guides, student reading lists, PowerPoint presentations, case studies, videos, podcasts, computer-based multimedia formats, and web pages for distance learning.) The decision to develop original materials will depend on the types of learning outcomes, the availability of existing relevant materials, and the developmental resources available to you. Criteria for selecting from among existing materials are also provided.
Design and Conduct Formative Evaluation of Instruction
Following completion of a draft of the instruction, a series of evaluations is conducted to collect data used to identify problems with the instruction or opportunities to make the instruction better. This type of evaluation is called formative because its purpose is to help create and improve instructional processes and products. The three types of formative evaluation are referred to as one-to-one evaluation, small-group evaluation, and field trial evaluation. Each type of evaluation provides the designer with a different set of information that can be used to improve instruction. Similar techniques can be applied to the formative evaluation of existing materials or classroom instruction.
Revise Instruction
The final step in the design and development process (and the first step in a repeat cycle) is revising the instruction. Data from the formative evaluation are summarized and interpreted to identify difficulties experienced by learners in achieving the objectives and to relate these difficulties to specific deficiencies in the instruction. The dotted line labeled "Revise Instruction" in the diagram of the model indicates that the data from a formative evaluation are not simply used to revise the instruction itself, but are used to reexamine the validity of the instructional analysis and the assumptions about the entry skills and characteristics of learners. It may be necessary to reexamine statements of performance objectives and test items in light of collected data. The instructional strategy is reviewed, and finally all of these considerations are incorporated into revisions of the instruction to make it a more effective learning experience. In actual practice, a designer does not wait to begin revising until all analysis, design, development, and evaluation work is completed; rather, the designer is constantly making revisions in previous steps based on what has been learned in subsequent steps. Revision is not a discrete event that occurs at the end of the ID process, but an ongoing process of using information to reassess assumptions and decisions.
Design and Conduct Summative Evaluation
Although summative evaluation is the culminating evaluation of the effectiveness of instruction, it generally is not a part of the design process. It is an evaluation of the absolute or relative value of the instruction and occurs only after the instruction has been formatively evaluated and sufficiently revised to meet the standards of the designer. Since the summative evaluation is usually not conducted by the designer of the instruction but instead by an independent evaluator, this component is not considered an integral part of the instructional design process per se.
Procedures used for summative evaluation are receiving more attention today than in previous years. This increased attention is due to interest in the effectiveness of web-based instruction across organizations, states, and countries. For example, will web-based instruction developed for learners in Utah, which is very transportable electronically, be effective for students in the Caribbean or China? What would experts in learning conclude about the instructional strategies within very attractive materials that were developed "a world away"? Terms such as learner verification of materials effectiveness and assurances of materials effectiveness are resurfacing now that materials transportability is much more economical and effortless.
The nine basic steps represent the procedures employed when using the systems approach to design instruction. This set of procedures is referred to as a systems approach because it is made up of interacting components that together produce instruction to satisfy needs expressed in a goal. Data are collected about the system's effectiveness so that the final product can be improved until it reaches the desired quality level.
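A rough sketch of that iterative, data-driven loop is shown below; the step names follow the model, while the function names and stopping rule are illustrative assumptions rather than a real API:

```python
# Illustrative sketch only: the callables passed in are placeholders;
# only the step names are taken from the model.

ANALYSIS_AND_DESIGN_STEPS = [
    "identify instructional goal(s)",
    "conduct instructional analysis",
    "analyze learners and contexts",
    "write performance objectives",
    "develop assessment instruments",
    "develop instructional strategy",
    "develop and select instructional materials",
]

def systems_approach(design_and_develop, formative_evaluation, revise, quality_met):
    """Produce instruction, then evaluate and revise until it meets the standard."""
    instruction = design_and_develop(ANALYSIS_AND_DESIGN_STEPS)
    # Formative evaluation (one-to-one, small-group, field trial) feeds
    # revision of the instruction and of earlier analysis decisions.
    while not quality_met(instruction):
        data = formative_evaluation(instruction)
        instruction = revise(instruction, data)
    # A summative evaluation by an independent evaluator would follow here,
    # outside the design process itself.
    return instruction
```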