
NAVLE 101: How the Exam Is Created, Administered, and Scored

Author: Heather Case, DVM, MPH, DACVPM, CAE

A Guide for Veterinary Licensure Candidates

If you’re preparing for the North American Veterinary Licensing Examination® (NAVLE®), you may have wondered: How is this exam built? Who writes the questions? How is it scored? This overview breaks down the process behind one of the most important steps on your path to becoming a licensed veterinarian.

Ensuring Quality and Objectivity in the NAVLE Development Process

The NAVLE is developed, written, and administered by the International Council for Veterinary Assessment (ICVA) and is required for veterinary licensure in every U.S. and Canadian jurisdiction.

For 25 years, ICVA has overseen the NAVLE as an objective measure of competency for entry-level private clinical practice. The National Board of Medical Examiners (NBME) supports the NAVLE program by providing editorial review, psychometric expertise, and separate scoring services, creating a clear division between those who develop the exam and those who score it. This independence strengthens the fairness, validity, and objectivity of the examination and helps ensure it continues to meet rigorous professional testing standards.

Each version of the NAVLE, known as a test form, is built from a standardized blueprint that defines the topics, species, and competencies every exam must cover. This blueprint is based on a large-scale Practice Analysis that identifies the real-world knowledge and skills essential for entry-level veterinary practice.

Balanced Representation from Subject Matter Experts (SMEs)

The NAVLE is developed through a rigorous and standardized item development process.

  • Subject Matter Experts (SMEs) in veterinary medicine write all test questions after completing formal training in best practices for item construction.
  • Every submitted question is edited and reviewed for technical accuracy by the NBME.
  • Questions undergo multiple rounds of review by additional SMEs.
  • Before being used as scored items, all questions first appear on the NAVLE as unscored pre-test items, which allows ICVA to collect item performance data without affecting candidates’ scores.

All test forms follow the same published blueprint. Automated test assembly (ATA), a computerized process that selects and arranges questions according to strict statistical and content rules, ensures that each test form is as similar as possible in difficulty. The process laid out above is a continuous cycle of writing, reviewing, pre-testing, and analyzing that maintains the exam’s high technical and professional standards. A 2020 review by California’s Office of Professional Examination Services confirmed that NAVLE development, administration, and scoring meet all relevant guidelines and standards.
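To make the idea of automated test assembly more concrete, here is a deliberately simplified sketch in Python. Everything in it is hypothetical: the item pool, the content-area quotas, and the greedy nearest-difficulty rule are invented for illustration, and the actual ATA software used for the NAVLE applies far more constraints than this.

```python
# Conceptual sketch of automated test assembly (ATA).
# Hypothetical illustration only: the item pool, quotas, and greedy
# selection rule are invented; the real NAVLE assembly process uses
# proprietary software with many more statistical and content rules.

def assemble_form(pool, quotas, target_difficulty):
    """Pick items so each content area meets its quota and each
    chosen item's difficulty is as close as possible to the target."""
    form = []
    for area, needed in quotas.items():
        # Items in this content area, closest-to-target difficulty first
        candidates = sorted(
            (item for item in pool if item["area"] == area),
            key=lambda item: abs(item["difficulty"] - target_difficulty),
        )
        form.extend(candidates[:needed])
    return form

pool = [
    {"id": 1, "area": "canine", "difficulty": 0.55},
    {"id": 2, "area": "canine", "difficulty": 0.80},
    {"id": 3, "area": "equine", "difficulty": 0.60},
    {"id": 4, "area": "equine", "difficulty": 0.30},
    {"id": 5, "area": "bovine", "difficulty": 0.50},
]

form = assemble_form(pool, {"canine": 1, "equine": 1, "bovine": 1}, 0.5)
print([item["id"] for item in form])  # → [1, 3, 5]
```

The toy rule above picks one item per content area with difficulty nearest 0.5, which is the essence of balancing forms: every form covers the same blueprint categories while staying close to a common difficulty profile.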

NAVLE Format, Timing, and Testing Requirements

The NAVLE is a secure, proctored examination administered only in person. It includes:

  • 360 clinically relevant multiple-choice questions
  • Six (6) blocks of 60 questions
  • 7.5 hours total testing time
  • Approximately 15–20% of questions include graphics such as photos or radiographs, with zoom and contrast tools available

The NAVLE is offered during three annual testing windows at select Prometric testing centers across North America and at international sites. All candidates are limited to five attempts to pass the NAVLE.

NAVLE Testing Accommodations

The ICVA is committed to the principles of the Americans with Disabilities Act (ADA) and to providing equal exam access to all candidates. NAVLE candidates with documented disabilities as defined by the ADA who are applying through a U.S. state or territorial licensing board must review and complete the Accommodation Request Packet to be considered for testing accommodations. Candidates applying in Canada should contact the Canadian National Examining Board (NEB) for more information on the NEB accommodation process.

Standardized Scoring and Equating Processes

Once a candidate completes the exam, data files are delivered electronically to the NBME.

  • Responses are converted to raw scores (number of correct answers).
  • Raw scores are converted into proficiency estimates.
  • Proficiency estimates are then transformed into three-digit scaled scores (200–800) using equating, a statistical method that accounts for slight differences in difficulty across test forms.
  • The passing score is set using a criterion-referenced process, meaning candidates pass by meeting a fixed standard rather than by competing against each other. A scaled score of 425 represents the minimum passing level.

All scoring is done using proprietary NBME software, and candidates are identified only by a unique identifier, not by name or academic background. There is no human involvement in the scoring of the test.
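As a rough illustration of the raw-to-scaled conversion described above, the sketch below uses a simple linear mapping. Only the 200–800 scale and the 425 passing score come from the NAVLE itself; the linear transform and the `form_adjustment` stand-in for equating are invented for this example, since the actual conversion is performed by proprietary NBME software.

```python
# Illustrative sketch of score scaling; hypothetical numbers.
# The actual NBME conversion uses proprietary equating and proficiency
# estimation; only the 200-800 scale and the 425 pass mark are real.

def scale_score(raw_correct, total_items=360, form_adjustment=0.0):
    """Map a raw score to the 200-800 scale with a simple linear
    transform; form_adjustment stands in for equating, which
    compensates for small differences in difficulty across forms."""
    proportion = raw_correct / total_items    # crude proficiency proxy
    adjusted = proportion + form_adjustment   # 'equated' proportion
    scaled = 200 + adjusted * 600             # map [0, 1] -> [200, 800]
    return round(min(max(scaled, 200), 800))  # clamp to the scale

PASSING = 425  # minimum passing scaled score on the NAVLE

score = scale_score(260)  # 260/360 correct on a form of average difficulty
print(score, score >= PASSING)  # → 633 True
```

The point of the `form_adjustment` parameter is conceptual: equating nudges scores so that a candidate who happened to receive a slightly harder form is not penalized relative to one who received a slightly easier form.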

NAVLE Results and Performance Reporting

Candidates receive:

  • A pass/fail designation
  • Their three-digit scaled score
  • A diagnostic performance report across content areas

Scores are delivered directly to candidates from the NBME through a secure online portal. Candidates can preview the format using the Sample NAVLE Interactive Score Report.

How the NAVLE Passing Standard Is Set

The passing standard, the level of knowledge required to pass, is periodically evaluated through a formal ‘standard setting’ process to ensure it remains relevant, valid, defensible, and psychometrically sound, and that it continues to reflect the knowledge required for safe, effective entry-level veterinary practice.

During this process, experts evaluate how well candidates should be expected to perform on each item to determine the pass/fail cut scores for the examination. The SME panels are made up of volunteer veterinarians from across the veterinary profession, representing a cross section of all species encountered on the NAVLE. All volunteer SMEs are trained on the standard-setting procedure and conduct in-depth item reviews. This process:

  • Is conducted periodically using established best practices
  • Uses expert judgment from a consortium of SMEs representing diverse backgrounds, practice areas, and geographic regions
  • Is based on the NAVLE Practice Analysis
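One widely used family of standard-setting methods that matches this description is the Angoff approach, in which each expert estimates, item by item, the probability that a minimally competent candidate would answer correctly, and the averaged estimates are summed into a raw cut score. The post does not name the specific method ICVA uses, so treat this sketch purely as a hypothetical illustration with invented numbers.

```python
# Hypothetical sketch of an Angoff-style standard-setting exercise.
# The post does not name ICVA's exact method; this shows one common
# approach, and all judgment values below are invented.

def angoff_cut_score(judgments):
    """judgments[e][i] = expert e's estimate of the probability that a
    minimally competent candidate answers item i correctly.
    The raw cut score is the per-item mean, summed over all items."""
    n_items = len(judgments[0])
    item_means = [
        sum(expert[i] for expert in judgments) / len(judgments)
        for i in range(n_items)
    ]
    return sum(item_means)

# Three experts judging a five-item mini exam
judgments = [
    [0.8, 0.6, 0.9, 0.5, 0.7],
    [0.7, 0.5, 0.8, 0.6, 0.6],
    [0.9, 0.7, 0.9, 0.4, 0.8],
]
print(round(angoff_cut_score(judgments), 2))  # → 3.47
```

Here the panel's averaged judgments imply that a minimally competent candidate should answer about 3.47 of the 5 items correctly; scaled up to a full exam, that kind of figure becomes the raw cut score.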


For the most recent NAVLE standard setting, three separate exercises were conducted to broaden SME participation and avoid groupthink.

The Role of Practice Analysis

All licensure exams[1] rely on practice analysis to ensure the content reflects actual, current practice. The NAVLE Practice Analysis describes:

  • Veterinary work context
  • Species and diagnoses encountered
  • Clinical and professional competencies
  • Foundational and basic veterinary science knowledge

The NAVLE’s current blueprint is based on the 2017 practice analysis. The practice analysis surveys thousands of practicing veterinarians across species and practice types throughout North America to identify the critical competencies that entry-level veterinarians in private clinical practice must master.

An updated blueprint is underway, with completion expected in 2026.

The NAVLE is more than an exam—it’s a carefully designed, continuously evaluated assessment built to ensure that newly licensed veterinarians are prepared for safe and effective practice. Understanding how it is created, administered, and scored can help you walk into exam day with confidence, knowing that the process behind the NAVLE is as rigorous and evidence-based as the profession it serves.

Have more questions? Be sure to check out our comprehensive FAQ page.

 

[1] A licensure exam (or licensing exam) is a formal test required by a regulatory authority to determine whether an individual has the minimum knowledge and skills necessary to safely practice a profession. Passing the exam is one of the key steps needed to obtain a professional license.

SETTING A HIGHER STANDARD TOGETHER®