Test Validation
What About Validation?
What Is Validation? What Is The TestGenius Validation Wizard?
What is validation? This question is a complex and important one, but for our purposes here we’re going to address it in terms of “content validity.” Content validation is a process whose design is codified in Section 14.C, and whose reporting is codified in Section 15.C, of the Uniform Guidelines on Employee Selection Procedures (UGESP). Basically, the process of content validation involves a two-way linkage of knowledge, skills, and abilities (KSAs) to job duties, on the one hand, and to the test, on the other.
This linkage, if written as a sentence, would look something like this:
The KSA (A) is necessary to perform the critical duty (B) and the KSA (A) is measured by the test (C).
So, as you can see, the test ends up being linked directly to the job duties via the KSAs it measures, as in the sketch below.
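To make the linkage concrete, here is a minimal sketch in Python of how a KSA can be recorded alongside both the duty it supports and the test that measures it. The job content and test names are invented for illustration; they are not drawn from TestGenius itself.

```python
# Hypothetical sketch of the two-way content-validation linkage:
# each KSA is tied both to a critical job duty and to a test that measures it.
from dataclasses import dataclass

@dataclass
class Linkage:
    ksa: str    # knowledge, skill, or ability
    duty: str   # critical job duty the KSA supports
    test: str   # test that measures the KSA

linkages = [
    Linkage(ksa="Accurate data entry",
            duty="Enter customer orders into the billing system",
            test="10-key data entry test"),
    Linkage(ksa="Business writing",
            duty="Draft correspondence to clients",
            test="Written communication test"),
]

# Reading each linkage back in the sentence form used above:
for link in linkages:
    print(f"The KSA ({link.ksa}) is necessary to perform the critical duty "
          f"({link.duty}) and is measured by the test ({link.test}).")
```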
This validation process can be extremely difficult to carry out in a way that results in a defensible validation report. Thankfully, the Geniuses behind TestGenius have distilled years of experience into an easy-to-use Wizard that takes only moments to complete.
Here’s How The TestGenius Validation Wizard Works:
1. Set the testing software into Validation Mode. Your account manager will be glad to show you how if you don’t know (but, basically, you go into the Admin Program > System Customization > check the box to complete the validation survey > OK).
2. Once that is done, select seven to ten incumbent employees to take the tests for you as subject matter experts (SMEs). If you don’t have seven SMEs, select a “reasonable sample,” but choose only those who will give a good effort and provide helpful responses. Try to get a mix of ethnicities and genders that is representative of your applicant and worker populations. When the SMEs log into the software, they are greeted by a screen telling them that the software is in “Validation Mode.”
3. The SMEs take the tests designated for each job title, just as an applicant would. They start and complete each test, giving each a good effort, and their scores are reported back to them at the end of each one.
4. After the scores for a particular test are reported, the SME is greeted with a survey that introduces the Validation Wizard. They begin by providing their names and some bio-demographic information that is compiled into the validation report. The bio-demographics include gender, race/ethnicity, and the number of years of job-related experience they have.
5. The survey continues by asking each SME critical questions about the job-relatedness of the test. Each question is designed to address specific areas of sections 14.C and 15.C of the UGESP. We ensure that the KSA being measured by the test is (1) appropriate for the job, (2) linked to specific duties on the job, (3) required on the first day of the job, (4) not something that is normally trained on the job or learned in a brief orientation, and (5) not presented in a way that is more difficult than when it is used on the job.
After all that, we remind each SME of their own score on the test. Then we ask them, given that score, their time on the job, and their experience with the position, what in their opinion is a minimum qualifying score for someone entering the position for the first time. The suggested scores are aggregated, and four defensible, job-related cutoff scores are presented to the organization to choose from as pass/fail cut scores (illustrated in the sketch below).
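To give a feel for how SME suggestions can be turned into candidate cut scores, here is a rough Python sketch. The sample scores and the aggregation rule (the mean, and the mean lowered by one, two, and three standard errors) are assumptions chosen for illustration only; they are not the actual TestGenius calculation.

```python
# Hypothetical illustration of aggregating SME-suggested minimum scores into
# four candidate cutoff scores. The rule below (mean minus 0, 1, 2, and 3
# standard errors of the mean) is an assumption, not the TestGenius method.
from statistics import mean, stdev
from math import sqrt

sme_suggestions = [72, 75, 70, 80, 68, 74, 77]   # example scores from 7 SMEs

avg = mean(sme_suggestions)
sem = stdev(sme_suggestions) / sqrt(len(sme_suggestions))  # standard error of the mean

candidate_cutoffs = [round(avg - k * sem, 1) for k in range(4)]
print(candidate_cutoffs)   # [73.7, 72.2, 70.6, 69.1]
```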
At the end, a professional validation report is compiled with all of the requisite information, so that, in the event of a questioned hiring practice, an employer need only log in to the system and print a validation report to provide to an applicant, attorney, or representative of the court. While we can make no guarantees regarding the outcome, we can say that we have never heard of this particular validation process, when properly administered, failing to provide our clients with the defense they needed for the test’s validity or the setting of the cutoff scores.