About the L.A.R.E.
Adrienne W. Cadle, PhD
Changes to the Exam Blueprint for 2017

Section      Previous Sub Sections    Current Sub Sections
Section 1    2                        5
Section 2    2                        3
Section 3    2                        3
Section 4    1                        4
• Greater detail on what’s covered on the exam
• Greater emphasis on sustainable LA practices
• Better organization of LA concepts
Section 1: Project and Construction Management

Previous Examination Blueprint:
• Project Management
• Bidding and Construction

2017 Examination Blueprint:
• Pre-Project Management
• Project Management
• Bidding
• Construction
• Maintenance
Section 2: Inventory and Analysis

Previous Examination Blueprint:
• Site Inventory
• Analysis of Existing Conditions

2017 Examination Blueprint:
• Site Inventory
• Physical Analysis
• Contextual Analysis
Section 3: Design

Previous Examination Blueprint:
• Concept Development
• Design Development

2017 Examination Blueprint:
• Stakeholder Process
• Master Planning
• Site Design
Section 4: Grading, Drainage, and Construction Documentation

Previous Examination Blueprint:
• Construction Documentation

2017 Examination Blueprint:
• Site Preparation Plans
• General Plans and Details
• Specialty Plans
• Specifications
Length of Each Exam
• Section 1: 85 scored items + 15 pretest items = 100 items
• Section 2: 70 scored items + 10 pretest items = 80 items
• Section 3: 85 scored items + 15 pretest items = 100 items
• Section 4: 105 scored items + 15 pretest items = 120 items

What’s a pretest item?
Pretesting New Items
• Included on each exam
• Not scored
• Collect item statistics
How is an item supposed to perform?

Quality of an item:
• Item difficulty
• Item discrimination
• Distractor analysis
Item Difficulty
• Item difficulty is calculated using the p-value statistic
• This is the proportion of candidates who answer the item correctly
• Item difficulty ranges from 0.00 to 1.00
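For illustration, here is a minimal Python sketch of the p-value calculation; the 0/1 response data are made up, not actual L.A.R.E. results.

```python
# Hypothetical 0/1 responses to a single item (1 = correct, 0 = incorrect).
responses = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]

# Item difficulty (p-value) is simply the proportion of candidates answering correctly.
p_value = sum(responses) / len(responses)
print(f"Item difficulty (p-value): {p_value:.2f}")  # -> 0.70
```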
Item Discrimination
• Item discrimination is calculated using the point-biserial correlation
• This is the relationship between a candidate’s response to an individual item (correct or incorrect) and their overall raw score
• Item discrimination ranges from -1.00 to 1.00
Item Discrimination (1 = correct, 0 = incorrect)

Raw Score    Item 0001    Item 0002    Item 0003
100%         1            0            1
98%          1            0            1
97%          1            0            1
86%          1            0            1
75%          1            0            1
62%          0            1            1
59%          0            1            1
58%          0            1            1
54%          0            1            1
43%          0            1            1
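To show how the point-biserial correlation separates these three items, here is a small Python sketch (not CLARB's scoring software) that applies the standard formula to the response patterns and raw scores in the table above.

```python
import numpy as np

def point_biserial(responses, scores):
    """Point-biserial correlation between 0/1 item responses and total raw scores."""
    responses = np.asarray(responses, dtype=float)
    scores = np.asarray(scores, dtype=float)
    p = responses.mean()                    # proportion answering correctly (item difficulty)
    if p in (0.0, 1.0):
        return float("nan")                 # everyone right (or wrong): correlation is undefined
    m1 = scores[responses == 1].mean()      # mean raw score of candidates who got the item right
    m0 = scores[responses == 0].mean()      # mean raw score of candidates who got the item wrong
    s = scores.std()                        # population standard deviation of raw scores
    return (m1 - m0) / s * np.sqrt(p * (1 - p))

raw_scores = [100, 98, 97, 86, 75, 62, 59, 58, 54, 43]
item_0001  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]   # high scorers right, low scorers wrong
item_0002  = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]   # high scorers wrong, low scorers right
item_0003  = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]   # everyone right

print(point_biserial(item_0001, raw_scores))  # strongly positive: the item discriminates well
print(point_biserial(item_0002, raw_scores))  # strongly negative: a red flag
print(point_biserial(item_0003, raw_scores))  # nan: no variance, so no discrimination information
```

Item 0001 is the pattern a well-behaved item should show; the negative value for Item 0002 and the undefined value for Item 0003 are the kinds of statistics that get an item flagged during review.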
Item Statistics
[Chart: item discrimination (-1.00 to 1.00) plotted against item difficulty (0.00 to 1.00)]
Distractor Analysis
Which movie won the Academy Award for Best Picture in 2017?
A. Lion (12%)
B. Fences (5%)
C. Moonlight (35%)
D. La La Land (48%)

More people are choosing distractor “D” than the correct answer
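A distractor analysis is essentially a tally of how often each option was chosen. Here is a hedged Python sketch using the percentages above, with option C (Moonlight) as the keyed answer.

```python
from collections import Counter

# Hypothetical per-candidate responses matching the percentages shown above (100 candidates).
responses = ["A"] * 12 + ["B"] * 5 + ["C"] * 35 + ["D"] * 48
key = "C"  # Moonlight

counts = Counter(responses)
for option in "ABCD":
    share = counts[option] / len(responses)
    marker = "  <- keyed answer" if option == key else ""
    print(f"{option}: {share:.0%}{marker}")

# Flag the item if any distractor draws more responses than the keyed answer.
if any(count > counts[key] for option, count in counts.items() if option != key):
    print("Flag for review: a distractor outperforms the key.")
```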
When do we write these items?
Annual Activities Calendar
• January: Exam Committee meets
• April: L.A.R.E. is administered
• June: Exam Committee meets
• August: L.A.R.E. is administered
• December: L.A.R.E. is administered

The committee prepares for each meeting and, at each meeting, reviews results from the previous administrations.
About the Exam Committee

Exam Committee Members
• Sections 1 and 2: six (6) volunteers
• Sections 3 and 4: ten (10) volunteers, plus one dedicated drafter

Exam Committee Activities
• Review the results of every examination
• Review each new exam form
• Write and review pretest items
Exam Committee Activities: Review Results of Previous Administrations
• Review poorly performing items (see the sketch below)
  • Item difficulty outside of normal ranges
  • Negative or low item discrimination
  • A higher percentage of candidates picking any distractor over the correct answer
• Review candidate comments (Yes, we read ALL of them!)
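The three statistical checks above can be made concrete with a short Python sketch; the thresholds used here are illustrative assumptions, not CLARB's published criteria.

```python
def flag_item(difficulty, discrimination, option_counts, key):
    """Return the reasons (if any) an item would be pulled for committee review."""
    reasons = []
    if not 0.30 <= difficulty <= 0.90:            # assumed "normal" difficulty range
        reasons.append("difficulty outside the normal range")
    if discrimination < 0.10:                     # negative or low point-biserial
        reasons.append("negative or low discrimination")
    if any(count > option_counts[key]             # any distractor chosen more often than the key
           for option, count in option_counts.items() if option != key):
        reasons.append("a distractor outperforms the key")
    return reasons

# Example: the Best Picture item from the distractor-analysis slide, with an assumed discrimination value.
print(flag_item(0.35, -0.05, {"A": 12, "B": 5, "C": 35, "D": 48}, key="C"))
# -> ['negative or low discrimination', 'a distractor outperforms the key']
```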
What Happens to the Poorly Performing Items?
• Items are edited and pretested again to see if they will perform differently
• Items are deemed “un-savable” and are deleted
Form Reviews
• Content Imbalance: Do we have too many items devoted to one topic? (See the sketch below.)
• Bad Pairs: Do we have an item that gives away the answer to another item?
• Gross Errors: Do we have an incorrect key?
• Grammatical and Typographical Errors
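As one example, the content-imbalance check reduces to counting items per topic against the blueprint; the topic names come from the Section 4 blueprint above, but the target counts are invented for illustration.

```python
from collections import Counter

# Hypothetical blueprint targets (items per topic) and the topic of each item on a draft form.
blueprint_targets = {"Site Preparation Plans": 25, "General Plans and Details": 40,
                     "Specialty Plans": 20, "Specifications": 20}
draft_form_topics = (["Site Preparation Plans"] * 25 + ["General Plans and Details"] * 50 +
                     ["Specialty Plans"] * 15 + ["Specifications"] * 15)

actual = Counter(draft_form_topics)
for topic, target in blueprint_targets.items():
    if actual[topic] != target:
        print(f"{topic}: {actual[topic]} items on the form, but the blueprint calls for {target}")
```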
Write and Review Pretest Items
• Each Exam Committee is tasked with writing enough pretest items for the next exam form(s)
• Each Exam Committee is tasked with writing items to specific content areas and specific item types, based on need

Item types by section:
• Sections 1 and 2: multiple choice, multiple response
• Sections 3 and 4: multiple choice, multiple response, hotspot, drag and place
Drafting a Drag and Place Item
1. Draft the item on paper
2. Create the base image and project elements (tokens) on the computer
3. Save each base and token as a separate image
4. Import each base and token into the item banking software
5. Link the base image and tokens in a new item
6. Write the stem of the item (the question part of the question)
7. Create the scoring regions (keys and distractors) (see the sketch below)
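The item banking software itself is not described here, but a scoring region ultimately comes down to a geometric test: does the dropped token land inside a keyed region or a distractor region? A hypothetical Python sketch with made-up coordinates:

```python
def in_region(x, y, region):
    """True if the point (x, y) falls inside a rectangular scoring region."""
    left, top, width, height = region
    return left <= x <= left + width and top <= y <= top + height

key_region = (120, 80, 60, 40)                                 # keyed (correct) drop zone
distractor_regions = [(300, 80, 60, 40), (120, 200, 60, 40)]   # plausible but incorrect drop zones

drop_point = (145, 95)  # where the candidate released the token
if in_region(*drop_point, key_region):
    print("Token scored as correct")
elif any(in_region(*drop_point, region) for region in distractor_regions):
    print("Token landed in a distractor region")
else:
    print("Token landed outside any scoring region")
```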
What did I not cover that you wanted me to cover?
THANK YOU!
Adrienne W. Cadle, PhD
[email protected] | @DrACadle