The Examination is a written examination composed of 200 multiple-choice, objective questions with a total testing time of four (4) hours. The examination is based on a content outline developed from a job analysis completed in 2013, which surveyed diabetes educators about the tasks they performed. Questions on the Examination are linked directly to one or more of these tasks. Each question is therefore designed to test whether the candidate possesses the knowledge necessary to perform the task and can apply that knowledge to a job situation.
Exam Construction and Scoring
The National Certification Board for Diabetes Educators (NCBDE) develops the Certification Examination for Diabetes Educators (Examination) with the technical assistance of a testing agency. The two organizations work together to construct and validate the examination. NCBDE periodically conducts a survey of diabetes educators' practice – often called a practice or job analysis. The study surveys Certified Diabetes Educators to determine the significance of specific tasks to a CDE's practice. The practice analysis information is used to develop the examination content outline and to determine the percentage distribution of items across content areas. The subject matter and weight of each item on the examination therefore reflect data validated by this periodic study.
NCBDE selects Certified Diabetes Educators who represent the multidisciplinary aspect of the profession to serve on its Examination Committee. The Examination Committee drafts the examination's multiple-choice items, which are then edited and validated by the testing agency and approved by the Committee for inclusion on the examination. The Examination Committee and the testing agency review all examination items for subject matter, validity, difficulty, relevance, bias, and importance to current practice. All items are evaluated, classified, and revised by the Examination Committee and the testing agency for conformance to psychometric principles. Each item is pretested prior to use and must meet statistical parameters before being used as a scored item.
When a practice analysis is completed, it is usually necessary to develop a new examination form to reflect the updated examination content outline and to review the minimum passing point/score. A Passing Point Study is conducted by a panel of experts in the field, using the Angoff method to set the minimum passing score. NCBDE's most recent analysis was completed in 2013, and the new examination content outline was implemented starting with the administration of the spring 2014 examination. In conducting the Passing Point Study, the experts evaluated each question on the spring 2014 examination to determine how many correct answers were necessary to demonstrate the knowledge and skills required to pass, while ensuring that the passing score remained consistent with the intended purpose of the examination.
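As an illustration only (the ratings below are hypothetical, not NCBDE data), the core arithmetic of the Angoff method can be sketched: each expert estimates the probability that a minimally competent candidate would answer each item correctly, and the recommended raw passing score is the sum of the average ratings across experts.

```python
# Illustrative sketch of the Angoff standard-setting calculation.
# Each row is one hypothetical expert's estimated probability that a
# minimally competent candidate answers each item correctly.
def angoff_cut_score(ratings):
    """Return the recommended raw passing score: the sum, over items,
    of the mean rating across experts."""
    n_experts = len(ratings)
    n_items = len(ratings[0])
    item_means = [sum(expert[i] for expert in ratings) / n_experts
                  for i in range(n_items)]
    return sum(item_means)

# Three hypothetical experts rating a four-item mini-exam.
ratings = [
    [0.90, 0.60, 0.75, 0.50],
    [0.85, 0.65, 0.70, 0.55],
    [0.95, 0.55, 0.80, 0.60],
]
print(round(angoff_cut_score(ratings), 2))  # recommended raw cut: 2.8
```

In an actual Passing Point Study the panel rates every item on the full examination, and the resulting sum informs the minimum passing score; this sketch shows only the averaging step.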
Scores are reported as raw scores and scaled scores. A raw score is the number of correctly answered questions; a scaled score is statistically derived from the raw score. The total score determines whether a candidate passes or fails; it is reported as a scaled score ranging between 0 and 99. The minimum scaled score needed to pass the examination has been set at 70 scaled score units.
The reason for reporting scaled scores is that different forms of the examination may vary in difficulty. Because new forms of the examination are introduced each year, a certain number of questions in each content area are replaced. These changes may cause one form of the examination to be slightly easier or harder than another form. To adjust for differences in difficulty, a procedure called "equating" is used. The goal of equating is to ensure fairness to all candidates. In the equating process, the minimum raw score (number of correctly answered questions) required to equal the scaled passing score of 70 is statistically adjusted (or equated). For instance, if one form is slightly more difficult than another, a slightly lower raw score is required to reach the scaled passing score of 70; if a form is slightly easier, a slightly higher raw score is required.
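To make the effect of equating concrete, here is a simplified sketch (all numbers are hypothetical; NCBDE's actual equating uses psychometric models, not this linear formula). The conversion is anchored so that each form's raw passing score maps to the scaled cut of 70, which means the same raw score can yield different scaled scores on forms of different difficulty.

```python
# Hypothetical linear raw-to-scaled conversion anchored at the passing score.
# max_raw of 175 assumes 175 scored items (200 questions minus 25 pretest).
def raw_to_scaled(raw, raw_pass, max_raw=175, max_scaled=99):
    """Map a raw score to the 0-99 scale so that raw_pass maps to 70.
    Scores below the cut interpolate between 0 and 70; scores above it
    interpolate between 70 and max_scaled."""
    if raw <= raw_pass:
        return 70 * raw / raw_pass
    return 70 + (max_scaled - 70) * (raw - raw_pass) / (max_raw - raw_pass)

# The same raw score of 120 on two forms with different raw passing scores:
print(round(raw_to_scaled(120, raw_pass=120), 1))  # harder form: 70.0 (pass)
print(round(raw_to_scaled(120, raw_pass=125), 1))  # easier form: 67.2 (fail)
```

The design point is that fairness is defined at the cut score: a candidate who demonstrates the required level of knowledge passes regardless of which form they were given.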
The examination questions are developed and reviewed for relevancy, consistency, accuracy, and appropriateness by individuals with expertise in diabetes education. Twenty-five of the 200 questions are new questions that have not been used on previous Examinations. Inclusion of these questions allows for the collection of meaningful statistics about new questions; these pretest questions are not used in the determination of individual Examination scores. They are not identified and are scattered throughout the Examination so that candidates will answer them with the same care as the questions that make up the scored portion of the Examination. This methodology assures candidates that their scores are the result of sound measurement practices and that scored questions are reflective of current practice.
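The scoring rule described above can be sketched as follows (the answer key, responses, and pretest positions are invented for illustration): pretest items are simply excluded when the raw score is counted, even though the candidate answers all 200 questions.

```python
# Illustrative sketch: pretest items are excluded from the raw score.
# All data below is hypothetical; candidates cannot tell which items
# are pretest, so they answer every question with equal care.
def raw_score(answers, key, pretest_positions):
    """Count correct answers on scored items only."""
    pretest = set(pretest_positions)
    return sum(1 for i, (a, k) in enumerate(zip(answers, key))
               if i not in pretest and a == k)

key     = ["A", "C", "B", "D", "A", "B"]
answers = ["A", "C", "D", "D", "A", "B"]
# Suppose items at positions 2 and 5 are unscored pretest items.
print(raw_score(answers, key, pretest_positions=[2, 5]))  # prints 4
```

On the actual Examination this corresponds to 175 scored items out of 200, with the remaining 25 used only to gather statistics for future forms.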
For additional information about the examination, please refer to the current Handbook.
For more information on the testing experience itself, you are encouraged to review a video available from PSI/AMP (NCBDE's testing agency). This short video provides an overview of computer-based testing procedures. NCBDE and PSI/AMP hope this video will increase your comfort level by outlining the process leading up to the administration and letting you know what to expect on your testing day.