State the purpose of a testing program.
To ensure a quality testing process is implemented to effectively assess the trainee's achievement of the learning objectives.
State the roles and responsibilities of the following for an effective testing program:
A: Naval Education Training Command(NETC)
B: NETC N7
C: Learning Center Commanding Officer
D: Learning Center Director of Training
E: Learning Center Learning Standards Officer
F: Course Curriculum Model Manager(CCMM)
G: Curriculum developer
H: Learning Site Commanding Officer/ Officer in Charge
I: Learning Site Testing Officer
J: Course Supervisor
K: Participating Activities
- A: NETC// testing policy and guidance
- B: NETC N7// testing policy and guidance oversight and monitors compliance by Centers.
- C: LCCO// serves as CCA; manages sites and DETs; resolves differences; incorporates TYCOM test banks, as appropriate.
- D: LCDT// Ensures testing program(s) are conducted, oversees development of testing plans.
- E: LCLSO// provides guidance to curriculum developers on testing, monitors Total Quality Indicators (TQI) and test item analysis and remediation programs.
- F: approves test design, maintains master test item bank.
- G: designs and develops the testing, admin guides, and the tests.
- H: Implements the testing plan, designates Testing Officer(s), designates the course supervisor.
- I: LSTO// Administers tests, oversees grading, secures tests, maintains test bank(s), coordinates/manages revisions, conducts in-service (IS) training.
- J: CS// Ensures, monitors, and validates admin, security, and test item analysis.
- K: PA// provides comments, feedback, new test items and maintains test and test item analysis data.
State the primary course source data for creating test items
JDTA, OCCSTDS, CTTL/PPP table, COI
List usable course source data to be used when the primary course source data is not available or has not been created
combination of: OCCSTDs, CTTL, PPP Table, and a COI
Define the following tests:
- A: Formal// test is graded and is used in the calculation of the trainee's final grade.
- B: Informal// May or may not be graded; regardless, the grade will not be used in the calculation of the trainee's final grade.
For each item below, define the three proficiency levels it contains:
- A: Skill//
- Level 1: imitation
- Level 2: repetition
- Level 3: habit
- B: Knowledge//
- Level 1: knowledge/comprehension
- Level 2: application/ analysis
- Level 3: synthesis/ evaluation
List the five categories for performance and knowledge tests.
- Pre-test: for validation of material, acceleration, prerequisite, advanced organizer
- Progress: tests blocks of instruction
- Comprehensive test: within-course or final exam
- Oral test: normally by board (panel of evaluators); assesses the trainee's comprehension
- Quiz: short test to assess achievement of recently taught material
Discuss the process of piloting a test.
It is a review process to assess test reliability and validity and to make corrective adjustments before actually collecting data from the target population.
Describe the use of each test instrument as it relates to knowledge and performance tests.
A: Job Sheet
B: Problem Sheet
C: Assignment Sheet
D: Multiple Choice
E: True or False
F: Matching
G: Completion
H: Labeling
I: Essay
J: Case Study
K: Validation of Test Instruments
- A: JS// direct the trainees in the step-by-step performance of a practical task they will encounter in their job assignment.
- B: PS// present practical problems requiring analysis and decision making similar to those encountered on the job
- C: AS// designed to direct the study or homework efforts of trainees
- D: MC// the most versatile of all knowledge test item formats
- E: T or F// provide only two possible answers.
- F: Matching// defined as two lists of connected words, phrases, pictures, or symbols.
- G: Comp// free response test items in which the trainees must supply the missing information from memory.
- H: Label// used to measure the trainee's ability to recall facts and label parts in pictures, schematics, diagrams, or drawings
- I: Essay// require trainees to answer a question with a written response.
- J: CS// poses a complex issue; used when a comprehensive understanding of the material is required.
- K: VTI// after test instruments have been constructed, and before they are actually assembled into a test, the content must be validated.
What are the two types of grading systems used in testing?
- Criterion-Referenced Test
- Norm-Referenced Test
Discuss test failure policies and associated grading criteria within your learning environment.
Test (if failed), re-train, re-test. If passed, the highest score the trainee can receive is 80%.
Discuss how the criticality of a skill learning objective is determined during performance test design.
Criticality of performance points to the need for selecting tasks for training that are essential to job performance, even though those tasks may not be performed frequently.
Identify the ten sections of a testing plan.
- Course Data
- Course Roles and Responsibilities
- Course Waivers
- Test Development
- Test Administration
- Course Tests and Test Types
- Grading Criteria
- Test and Test Item Analysis
Discuss how the criticality of the knowledge needed to perform a task is determined during knowledge test design.
Criticality determines which learning objectives to assess through formal testing and which learning objectives should be assessed by informal testing.
State the purpose of test and test item analysis
- To determine statistical validity, test and test item analysis techniques are required.
- 1) Difficulty index
- 2) Index of discrimination
- 3) Effectiveness of alternatives
In a remediation program, discuss the primary and secondary goals:
- Primary goal is to motivate and assist trainees in achieving the critical learning objectives of a course by providing additional instructional study time.
- Secondary goal is to remove barriers to learning.
Discuss the three methods of remediation available to instructors:
- Targeted: assists the trainee who is having difficulty accomplishing an objective(s) and/or understanding the material, during normal classroom time.
- Scalable: assists the trainee who is having difficulty accomplishing objectives or understanding the material for a major portion of a course, during normal classroom time.
- Iterative: one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, using a total-recall approach with one or a combination of text, labs, flash cards, etc.; the trainee completes a minimum of 20 questions per objective area with a minimum score of 80 percent.
Define the following sections of a remediation program:
A) Retest
B) Setback
C) Drop from training and attrites
D) Counseling
E) Academic Review Boards (ARBs)
- A) Retest: if a trainee does not achieve a minimum passing grade, the retest may cover the portion of the test the trainee had difficulty with or the entire test.
- B) Setback: when a trainee has to be retrained due to academic challenges and is placed into a following class.
- C) Drop from training: when the trainee is unsuitable, unable, and/or unwilling to complete the course, the trainee is dropped from the course. Trainees who are discharged from the Navy are classified as attrites.
- D) Counseling: preventive counseling instituted in "A" and "C" schools; should include counseling for performance and personal problems.
- E) ARBs: convened when other means of academic counseling, remediation, and an initial academic setback have failed to improve trainee performance.