360 Degree Academic Performance Assessment Model

Best Practices for Writing Objective Test Items
College of Nursing
January 7, 2011

Presenter: Dr. James Coraggio, Director, Academic Effectiveness and Assessment
Contributor: Alisha Vitale, Collegewide Testing Coordinator

Former Life

- Director of Test Development, SMT
- Director of Measurement and Test Development, Pearson
- Taught EDF 4430 Measurement for Teachers, USF

Purpose

This presentation will address the importance of establishing a test purpose and developing test specifications. It will explain how to create effective multiple-choice test questions, and it will provide item-writing guidelines as well as best practices to prevent students from simply guessing the correct answers.

Agenda

- Purpose of a Test
- Prior to Item Writing
- Advantages of Objective Tests
- Types of Objective Tests
- Writing Multiple-Choice Items
- The Test-wise Student
- Test Instructions
- Test Validity

Purpose of a Test

To clearly delineate between those who know the content and those who do not. A test should determine whether the student knows the content, not whether the student is a good test-taker. Likewise, confusing and tricky questions should be avoided to prevent incorrect responses from students who know (and understand) the material.

Prior to Writing Items

- Establish the test purpose
- Conduct the role delineation study/job analysis
- Create the test specifications

Establish the Test Purpose

Initial questions:
- How will the test scores be used?
- Will the test be designed for minimum competency or content mastery?
- Will the test be low-stakes, moderate-stakes, or high-stakes (consequences for examinees)?
- Will the test address multiple levels of thinking (higher order, lower order, or both)?
- Will there be time constraints?

Responses to these initial questions have implications for the overall length of the test, the average difficulty of the items, the conditions under which the test will be administered, and the type of score information to be provided. Take the time to establish a singular purpose that is clear and focused so that goals and priorities will be effectively met.

Conduct the Job Analysis

The primary purpose of a role delineation study or job analysis is to provide a strong linkage between the competencies necessary for successful performance on the job and the content on the test. This work has already been conducted for the National Council Licensure Examination for Registered Nurses [see Report of Findings from the 2008 RN Practice Analysis: Linking the NCLEX-RN Examination to Practice, NCSBN, 2009].

Create Test Specifications

Test specifications are essentially the blueprint used to create the test. They operationalize the competencies that are being assessed. The NCLEX-RN Examination has established test specifications [see 2010 NCLEX-RN Detailed Test Plan, April 2010, Item Writer/Item Reviewer/Nurse Educator Version].

Test specifications:
- Support the validity of the examination
- Provide standardized content across administrations
- Allow for subscores that can provide diagnostic feedback to students and administrators
- Inform the student (and the item writers) of the required content
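The blueprint idea above can be sketched in code. This is a minimal illustration, not part of the presentation: the content areas and percentage weights below are hypothetical (they are not taken from the NCLEX-RN test plan), and the helper names are invented for the example.

```python
# Hypothetical test blueprint: each content area gets a target weight,
# and a draft item bank is checked against those targets.

blueprint = {
    "Safe and Effective Care": 0.30,
    "Health Promotion": 0.20,
    "Psychosocial Integrity": 0.15,
    "Physiological Integrity": 0.35,
}

def items_per_area(blueprint, test_length):
    """Translate percentage weights into item counts for one test form."""
    return {area: round(weight * test_length)
            for area, weight in blueprint.items()}

def coverage_gaps(blueprint, drafted):
    """Compare drafted item counts against the blueprint targets;
    a positive gap means more items are still needed in that area."""
    targets = items_per_area(blueprint, sum(drafted.values()))
    return {area: targets[area] - drafted.get(area, 0)
            for area in blueprint}

counts = items_per_area(blueprint, 100)
print(counts["Physiological Integrity"])  # 35 items on a 100-item form
```

A check like `coverage_gaps` is one way the blueprint "informs the item writers of the required content": writers can see at a glance which areas are under-supplied before the form is assembled.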

Item Development

After developing the test specifications, item development can begin. The focus of the remainder of this presentation will be on creating appropriate objective items.

Objective Tests

- Measure several types (and levels) of learning
- Cover wide content in a short period of time
- Offer variations for flexibility
- Are easy to administer, score, and analyze
- Are scored more reliably and quickly

What types of learning cannot be measured?

Types of Objective Tests

Written-response:
- Completion (fill-in-the-blank)
- Short answer

Selected-response:
- Alternative response (two options)
- Matching
- Keyed (like matching)
- Multiple choice

Written-response

Single questions/statements or clusters (stimuli).

Advantages:
- Measure several types of learning
- Minimize guessing
- Point out student misconceptions

Disadvantages:
- Time to score
- Misspelling and writing clarity
- Incomplete answers
- More than one possible correct response (novel answers)
- Subjectivity in grading

Completion

Example: A word that describes a person, place, or thing is a ________.

1. Remove only key words
2. Place blanks at the end of the statement
3. Avoid multiple correct answers
4. Eliminate clues
5. Paraphrase statements
6. Use answer sheets to simplify scoring

Short Answer

Example: Briefly describe the term proper noun. ____________________________

Terminology: stimulus and response.

1. Provide an appropriate blank (word(s) or sentence)
2. Specify the units (inches, dollars)
3. Ensure directions for clusters of items are appropriate for all items

Selected-response

Select from provided responses.

Advantages:
- Measure several types of learning
- Measure the ability to make fine distinctions
- Administered quickly
- Cover a wide range of material
- Reliably scored
- Multiple scoring options (hand, computer, scanner)

Disadvantages:
- Allow guessing
- Distractors can be difficult to create
- Student misconceptions not revealed
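One standard way to blunt the guessing disadvantage of selected-response items is a formula score. The presentation does not prescribe this; the sketch below shows the classic correction for guessing, score = R - W/(k-1), where k is the number of options per item.

```python
def corrected_score(num_right, num_wrong, num_options):
    """Classic correction for guessing on selected-response tests:
    score = R - W/(k - 1), where k is the number of options per item.
    Omitted items are neither rewarded nor penalized, so blind
    guessing gains nothing on average."""
    return num_right - num_wrong / (num_options - 1)

# A student answers 40 of 50 four-option items correctly, misses 8,
# and omits 2; the formula discounts the expected gain from guessing.
print(corrected_score(40, 8, 4))  # 40 - 8/3, about 37.33
```

The later test-taking slides note that students should "never leave a question blank" unless marks are lost for wrong answers; this formula is exactly the scoring rule that changes that advice.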

Alternative Response

Example:
T F 1. A noun is a person, place, or thing.
T F 2. An adverb describes a noun.

1. Explain the judgments to be made
2. Ensure answer choices match
3. Explain how to answer
4. Limit each item to one idea to be judged
5. Use positive wording
6. Avoid trickiness, clues, and qualifiers

Matching Item

Column A:
__ Person, place, or thing.
__ Describes a person, place, or thing.

Column B:
a. Adjective
b. Noun

Terminology: premises and responses.

1. Clear instructions
2. Homogeneous premises
3. Homogeneous responses (brief and ordered)
4. Avoid one-to-one matching

Keyed Response

Responses:
a. A noun
b. A pronoun
c. An adjective
d. An adverb

___ Person, place, or thing.
___ Describes a person, place, or thing.

Like matching items, but with more response options.

MC Item Format

What is the part of speech that is used to name a person, place, or thing?
A) A noun*
B) A pronoun
C) An adjective
D) An adverb

MC Item Terminology

- Stem: sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response
- Options: the possible responses, consisting of one and only one correct answer
- Key: the correct response
- Distractor: a wrong response; plausible but not correct, and attractive to an underprepared student

Competency

Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students. Assessing lower-division students on graduate-level material is an unfair expectation. The competent student should do well on an assessment; items should not be written for only the top students in the class.

Clarity

- Clear, precise items and instructions
- Correct grammar, punctuation, and spelling
- Address one single issue
- Avoid extraneous material (teaching)
- One correct or clearly best answer
- Legible copies of the exam

Bias

Tests should be free from bias:
- No stereotyping
- No gender bias
- No racial bias
- No cultural bias
- No religious bias
- No political bias

Level of Difficulty

Ideally, test difficulty should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area).

To make a multiple-choice item more difficult, make the stem more specific or narrow and the options more similar. To make an item less difficult, make the stem more general and the options more varied.
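Whether a revision actually moved an item toward that middle level can be checked empirically. The sketch below computes the standard item difficulty index (the p-value: proportion of examinees answering correctly); the response data are made up for illustration and are not from the presentation.

```python
# Item difficulty is commonly summarized by the p-value: the proportion
# of examinees answering the item correctly. For a four-option item,
# "middle difficulty" is often taken as roughly midway between the
# chance rate (0.25) and 1.0, i.e., somewhere around 0.6.

def difficulty_index(responses):
    """Proportion correct; responses is a list of 0/1 item scores."""
    return sum(responses) / len(responses)

item_scores = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 10 examinees, 7 correct
p = difficulty_index(item_scores)
print(p)  # 0.7 -- slightly easier than the middle target
```

A very high p-value suggests the stem could be narrowed or the options made more similar; a very low one suggests the opposite revision.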

Trivial and Trick Questions

Avoid trivia and tricks. Avoid humorous or ludicrous responses. Items should be straightforward: they should cleanly delineate those who know the material from those who do not. Make sure every item has value and contributes to the final score.

The test-taking guidelines that follow are drawn from:
http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf

Test Taking Guidelines

When you don't know the answer: as with all exams, attempt the questions that are easiest for you first and come back to the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank; make a calculated guess if you are sure you don't know the answer. Here are some tips to help you guess intelligently.

Use a process of elimination. Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are the options in the right range? Is the measurement unit correct? Does it sound reasonable?

Look for grammatical inconsistencies. In extension-type questions, a choice is nearly always wrong if the question and the answer do not combine to make a grammatically correct sentence. Also look for repetition of key words from the question in the responses; if words are repeated, the option is worth considering. E.g.: "The apparent distance hypothesis explains ..." / b) "The distance between the two parallel lines appears ..."

Be wary of options containing definitive words and generalizations. Because they can't tolerate exceptions, options containing words like "always," "only," "never," and "must" tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often.

Favor look-alike options. If two of the alternatives are similar, give them your consideration. E.g.: A. tourism consultants, B. tourists, C. tourism promoters, D. fairy penguins.

Favor numbers in the mid-range. If you have no idea what the real answer is, avoid the extremes.

Favor more inclusive options. If in doubt, select the option that encompasses the others. E.g.: A. an adaptive system, B. a closed system, C. an open system, D. a controlled and responsive system, E. an open and adaptive system.

Please note: none of these strategies is foolproof, and they do not apply equally to the different types of multiple-choice questions, but they are worth considering when you would otherwise leave a blank.

Test-wise Students

- Are familiar with item formats
- Use informed and educated guessing
- Avoid common mistakes
- Have testing experience
- Use time effectively
- Apply various strategies to solve different problem types

To counter the test-wise student:

- Vary your keys: don't let students succeed by always picking option C
- Avoid "all of the above" and "none of the above"
- Avoid extraneous information: it may assist in answering another item
- Avoid item pairs or "enemies" that give away each other's answers
- Avoid clueing with the same word in the stem and the key
- Make options similar in terms of length, grammar, and sentence structure; an option that differs stands out
- Avoid clues

Item Format Considerations

- Put the needed information in the stem
- Avoid negatively stated stems and qualifiers; highlight qualifiers if they must be used
- Avoid irrelevant symbols (&) and jargon
- Use a standard, set number of options (four is preferred)
- Ideally, tie each item to a course objective

Test Directions

Directions should:
1. State the skill measured
2. Describe any resource materials required
3. Describe how students are to respond
4. Describe any special conditions
5. State time limits, if any

Ensure Test Validity

- Congruence between items and course objectives
- Congruence between items and student characteristics
- Clarity of items
- Accuracy of the measures
- Item formatting criteria
- Feasibility: time, resources

Questions?

