Chapter 1A: Preliminaries

Thank you for purchasing licenses for FastTest! FastTest is an Internet application designed to support the entire test development cycle. It helps you improve the assessment development process and make your assessments more psychometrically rigorous. This manual provides detailed instructions on how to manage users, create banks of items, assemble the items into tests, build test sessions to administer to examinees, schedule the examinees, deliver tests online, and generate reports.

The Two Components of FastTest 

There are two parts to FastTest. The Test Development part is where you write items, store statistics, and build tests (and, optionally, print them). The Test Delivery part is where you create test sessions, schedule examinees, and deliver tests to examinees online. The Test Delivery part is optional: if you administer tests by paper-and-pencil and therefore have not purchased test credits (test codes), much of the Test Scheduler Tab and Delivering Tests portions of the manual will not apply to you.

Navigating FastTest

FastTest is designed to be as intuitive and user-friendly as possible. In recognition of the fact that users interact with software in different ways, there are three ways that a user can work with FastTest: left-clicking on buttons, right-click drop-down menus, and keyboard shortcuts.

Figure 1.1: Item Explorer
item explorer tab

Figure 1.1 shows the Item Explorer, in the view that a user sees when they first log into their workspace. There are six prominently displayed buttons in the upper left: New, View, Edit, Delete, Sync, and Import. The functionality associated with each is discussed in detail later in this manual, but for now note that the primary buttons will be presented in this location. Many of these functions can also be performed by right-clicking on item banks, categories, or items to bring up a drop-down menu. For example, a new item can be created either by selecting a bank/category, left-clicking on the New button, and selecting New Item, or by right-clicking on a bank/category and selecting New Item from the list that appears. See the Item Explorer Tab for more information.

Keyboard Shortcuts

If a dialog is open, pressing the “Esc” key does the same thing as the “Cancel” button (or the “×” in the top right corner). Pressing the “Enter” key will submit the dialog. (Note: this will not work if the current focus is in a text area because the “Enter” key does a newline in text areas.)

Application Tabs

The four tabs shown near the top of Figure 1.1 represent the test development cycle simplified to four primary steps.

  1. Item Explorer – uploading, writing, sorting, and reviewing test items
  2. Test Assembler – assembling items into test forms
  3. Test Scheduler – scheduling and delivering tests to examinees online (paper versions of tests can be printed and managed in the Test Assembler tab)
  4. Report Manager – analysis of examinee results, items/banks, and tests

All four tabs are visible to workspace administrators. For users in specific roles, tab visibility is limited by relevance. A person who serves only as a Test Scheduler (uploading lists of examinees, scheduling times, and accessing results) will not see the Item Explorer or Test Assembler tabs. This is to enhance the security of the assessment program.

Chapter 2A: Workspaces and Users


Workspaces represent a testing program, such as a profession or school. Each FastTest account comes with one workspace and one administrator account for that workspace.

  • Users (item writers, etc.) are then assigned to a workspace by the administrator.
  • Each workspace must have at least one administrator, meaning that if you want additional workspaces to keep different parts of your testing program completely separate, you must purchase additional administrator licenses.
  • Note: multiple banks can be created within a workspace, and different users in your organization can have access to only designated banks.


The workspace is the highest level in the hierarchy of your testing program. Each workspace contains item banks, which contain categories (domains), which contain items.

  • For example, a school district might set up two separate workspaces, one for internal formative testing and the other for official summative testing; each workspace would have banks for each subject (like Science or Math); then each bank would have categories to organize items on the subject (Biology, Chemistry, and Physics). However, users of the internal workspace would not have access to the official testing workspace.
  • The user accounts within a workspace are created and managed by the workspace administrator.
  • Non-administrative users are taken directly into their workspace to see the banks and tests available.
Figure 2.1: Workspace Homepage

workspace homepage

Creating New Users

  • Click New User
    • Information tab
      • First and last name and email are required. Other fields shown are optional
    • User Defined tab
      • Contains custom fields that have been created in the “custom fields” editor accessible from the workspace homepage

new user

  • Roles tab
    • Choose the appropriate role(s) for user
      • Workspace role is optional
      • Item bank level role is required
    • Determine Bank access
      • Note: the system defaults to No Banks, which restricts access to all items
    • Determine Content Hierarchy access
      • Note: system defaults to No Content
      • If a Content Hierarchy exists (set up at the Admin level), users must be given access to content at some level to gain access to items.

new user roles


The Configure button controls the metadata of the workspace.

  • Change the skin of the workspace
  • Upload logo
    • A large file size may decrease the speed of each screen loading
  • Modify which data fields are displayed regarding users (see below)
  • Other options include setting the workspace’s IRT D scaling factor and assigning labels to the user-defined statistic fields for items.
Figure 2.2: Workspace Configuration

workspace configuration

  • The Asset Properties tab provides the option to prevent creating assets that have duplicate names.
    • If this box is unchecked, assets with duplicate names cannot be created or updated, whether individually or when uploading a zip file.

asset properties

  • The Delivery Properties tab provides options regarding Test Code emails, Group Test Codes, and the Test Session Results Reports.
    • The Default Test Results Email can be used to enter one or more email addresses that will receive a blind copy of all Test Results Emails.

delivery properties


Content Hierarchy

Content hierarchy is used to categorize and organize items. Users can create nested folders based on topic, knowledge area, or any other categorical structure. Administrators can control user access by assigning content areas. The content hierarchy is fully customizable, making it a very powerful tool.


  • From the Administration page, select Content Hierarchy. A new tab will open.
  • Create a content hierarchy based upon your assessment’s needs. Name the area and click Save.
  • Create additional content areas by selecting the green plus signs – there is no limit to the number of levels.
  • Once the content hierarchy is created, close the tab.
  • Within the workspace, create item banks and categories.
    • Under the Information tab in the item editor, assign items to a content area as they are created.
  • Note: There is also an option to import a preexisting content hierarchy.
    • Clicking Import will open the dialog shown below
    • Click to download the sample import file if needed

content hierarchy import

    • An existing content hierarchy can also be used to make changes
    • Click Export, make adjustments, and save the .xlsx file
    • Import the updated file by clicking Import


The following example is based upon a hypothetical school situation in which teachers may be assigned to review items based on grade and/or subject.  Therefore, we will have a Content Hierarchy based on subject, and Item Banks based on grade.  In the example below Level 1 is History. If a user was creating a content hierarchy for an entire school, he/she might have a number of subjects listed as Level 1 content areas. Each level of content breaks down the content areas into more specific categories. In this instance, the first level is the subject, the second level is the area of history, and the third level is a class within that area of history.

history content hierarchy
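Conceptually, the hierarchy is a set of nested levels. The Python sketch below illustrates the structure of the History example above; note that the Level 2 area name "US History" is an assumption for illustration, since only the Level 1 and Level 3 names are given in the example.

```python
# Illustrative sketch of the example content hierarchy:
# Level 1 = subject, Level 2 = area of history, Level 3 = class.
# "US History" as the Level 2 area is assumed for illustration.
content_hierarchy = {
    "History": {                      # Level 1: subject
        "US History": {               # Level 2: area (assumed name)
            "Modern US History": {},  # Level 3: class
        },
        "World History": {},          # another Level 2 area
    },
}

# Each deeper level breaks the content into more specific categories.
"Modern US History" in content_hierarchy["History"]["US History"]  # True
```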

Item Banks

Create meaningful item banks and categories to distinguish where certain items are held. See below for an example.

history content item bank

This example includes a content area called Modern US History (in the hierarchy, level 3). There is also a folder within the item bank Grade 11 called Modern US History. Having both of these labels could be valuable in assigning roles, because a teacher could be assigned to only Grade 11 items (and therefore would only see Modern US History and World Government and Politics) or could be assigned to the content area (within the hierarchy) of Modern US History. This way the teacher would only see the items within that content area.

Assigning content areas allows administrators to restrict user access on a much larger scale.

It is important to note that users cannot be assigned to item categories (folders).  Access for users can only be limited at the bank level.

Item Creation

As items are created, nest them appropriately in their respective category. See below for an example.

history items

Also assign items an area within the content hierarchy. See below for an example.

assign content hierarchy

A newly created item will not automatically be assigned a place within the content hierarchy. However, if you duplicate an item, it will retain its assignment within the hierarchy.

User Access

  • From the workspace administration page, select New User (or select the User you wish to Edit).
  • Under the Roles tab, restrict their access appropriately. See below for examples.

Grade 10 Teacher: This teacher will see all content for Grade 10.

user role access workspace

World History Teacher: This teacher will see World History content for every grade.

world history content


Grade 10 World History Teacher:

world history grade 10

Custom Fields


The Custom Fields button, next to the Configure button, allows you to define custom fields for system users, items in your bank, tests, and examinees. For example, all items might be associated with a “Task objective.” You could then create a field called Task Objective, which would appear in the Item Editor for each item. All users might have a “Security status,” which can be created here.

Figure 2.3: New Custom Field Dialog

custom fields

The image above shows the dialog for defining custom fields. Each custom field has a required name and an optional description and default value. The type of the custom field can be specified to give the user a better experience when inputting a value. For example, a date type custom field will have a calendar widget associated with it. The available types are:

input widget

A custom item field can also be used “for review.” For cases when it is desirable to get input from multiple users (such as Angoff Index), the custom field can be used for review by checking the box “Use for review”. Normally, each item has a single value associated with it for each custom field, but review fields allow for any privileged user to add a value to the field.  Note: custom fields used for review do not have default values.

Once you have finished your review process and collected Angoff ratings, you can right-click an Item Bank and select Export → Review Field. The generated spreadsheet will have a row for each item and a column for each user who added a value to the review field.

Custom examinee fields will appear in the Edit Examinee Dialog and can be used when developing a Demographics Screen in the Edit Test Options Dialog. They will also appear as columns in the Export Examinee Data CSV export, accessible on the Test Scheduler tab.



The bottom part of the screen you see in Figure 2.1  pertains to the management of users. A list of users for the workspace is presented here. The columns visible in this list are controlled by the check boxes in Figure 2.2. You can see that the sample workspace in the figure has two users, Napoleon Bonaparte and Julius Caesar.


  • Click the New User button to create a new user or Edit User to edit an existing user. Either button will bring up the window shown below.
  • Specify personal information regarding a user and specify his or her roles (shown below in Figure 2.5) and a password for the user. Fields marked with a red asterisk (*) are required.
    • Note that you need both a first and last name for each user; this is because the user ID will be created with those names. The default login is the first letter of the first name and the last name (with spaces removed). For example, if the user’s name is John Smith, the default login is jsmith.
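The default-login rule described above can be sketched as follows (an illustration only, not FastTest's actual code; the lowercasing is inferred from the jsmith example):

```python
def default_login(first_name: str, last_name: str) -> str:
    """First letter of the first name plus the last name, spaces removed.
    Lowercasing is inferred from the manual's jsmith example."""
    return (first_name[0] + last_name.replace(" ", "")).lower()

default_login("John", "Smith")          # "jsmith"
default_login("Napoleon", "Bonaparte")  # "nbonaparte"
```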

The User Defined tab presents fields that were defined by the workspace administrator using the Custom Fields functionality.

Figure 2.4 Edit User Dialog

user defined


User Roles

There are six roles for a user in addition to workspace administrator. As shown in Figure 2.5, users can be assigned any number of roles to delineate exactly what they are allowed to do in the workspace.

Figure 2.5: User Roles Screen

user roles

These roles can be modified at a later date. This allows you to set up new users with limited access to start with, such as a read-only Browse User, until they become familiar with the system. The user can then be given more access. The following table further delineates available functionality.
user role1

Remember that multiple roles can be assigned to specifically control what users can do.

  • For example, the Test Creator can only create tests and edit test metadata; they cannot add items to assemble tests because they do not have rights to view the item bank.
    • This role is intended to manage existing tests or create new tests that await assembly by other users.
  • If you want the user to also be able to assemble tests, you can add the “Browse User” role to their account, which will allow them to browse through the item bank to select items for tests.

If all users have been set up as necessary, and you have created any custom fields you need, you can proceed to Enter Workspace.

Chapter 2B: Workflow Management


FastTest is built with a flexible workflow management module that is designed to manage large numbers of item writers and reviewers across a range of content areas, supporting the validity of your assessments.  Based on the concept of item status, this is an essential aspect of best practices in item banking. Numerous resources exist to help you establish guidelines and processes for content development at your organization; one of our founders wrote this review on a landmark book for item development.  Because FastTest is configurable, you can decide on the best workflow process for you and then set up FastTest to meet your needs, rather than be forced into a pre-defined workflow.  There are three aspects to Workflow Management: defining the workflow (Admin only), changing item status, and assigning items to users.


Defining the Workflow

Workspace administrators can define the statuses they want to use and how the workflow can proceed.

  • From the workspace homepage (the Administrator landing page), click on the button for Workflow Management.
  • A new window will appear that allows administrators to establish a list of statuses that represent your stages of item development, and the possible paths of status change
    • For example, the workflow can be set so that an item can only change to “Active” status after being “Review 2” status, which can only occur after being “Review 1” status, thereby requiring that all items be reviewed by at least two experts.
  • A default list of four statuses is provided, as shown in the example below. Administrators can delete these or add new statuses, and then change the sequence of the workflow.
    • The check boxes denote possible changes.  In this example, it is not possible for an item to go back to being New after having been in Review status.

workflow management
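Conceptually, a workflow is a set of statuses plus the allowed transitions between them. The sketch below illustrates this (the four statuses are FastTest's defaults mentioned elsewhere in this manual; the specific transition set is an illustrative assumption mirroring the example where an item cannot return to New after entering Review):

```python
# Illustrative workflow: which status changes the check boxes allow.
allowed_transitions = {
    "New":     {"Review"},
    "Review":  {"Active", "Retired"},  # cannot go back to New
    "Active":  {"Retired"},
    "Retired": set(),
}

def can_change_status(current: str, target: str) -> bool:
    """True if the workflow permits moving an item from current to target."""
    return target in allowed_transitions.get(current, set())

can_change_status("New", "Review")  # True
can_change_status("Review", "New")  # False: not a possible change
```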


Changing item status

Users can then assign items to each status as they proceed through their work, ensuring that items meet development and review requirements at all stages.  This digital paper trail serves as documentation of content aspects of validity.

  • To change the status of an item, click on it in the Item Explorer tab, then click the Workflow button above.
    • This button provides options to review, change the item’s status, assign the item(s) to a user, or preassign the item(s). These options can also be accessed via right-clicking on an item.
    • These commands can also be used on multiple items when relevant, for example SHIFT-clicking to select a number of items and then changing them all to Review status.


Assigning Items to Users

Administrators can assign items to specific users (authors/reviewers) to complete each stage of development.

  • Using the “Assign Item(s) To” command allows administrators to designate which item(s) to assign to a user.
  • This action also works with multiple items, so for example, you can select a number of items and assign them all to a user to review.
  • A user can then view a list of all items assigned to them, review them, and change the status to the next status after reviewing each.

assign items

Preassigning Items to Users

Administrators can preassign items to specific users (authors/reviewers) to complete each status that has been created in the workflow.

  • Using the “Preassign Item(s)” from the Workflow button or by right-clicking allows administrators to preassign items to users at specific points, or statuses, in the workflow.
  • If a workspace has multiple item statuses, such as the example shown below, users will receive an email notification that an item has been assigned to them when the item status reaches their assignment level.
    • From the example below, after Jane Carlson looks at the item and changes the status to ‘Review,’ John Smith will receive an email that the item is now assigned to him.

preassign items


  • Multiple items can be highlighted by SHIFT-clicking to preassign them at one time
    • Note that if multiple items are highlighted that do not have the same preassignment configuration, preassignment is disabled.
    • Clicking ‘Force Assignment’ will allow for a force change to those statuses.

 force assign

Chapter 3A: The Item Explorer Tab – Creating Items, Banks, and Categories



The Item Explorer Tab is where you create items and organize them into banks and categories. Items can be entered directly via your browser or uploaded into the system. The image below shows the landing page for a bank in the Item Explorer, before any items are shown. Note that the main pane on the right (“item search results pane”) gives tips on how to present a list of items.

Figure 3.1 The Item Explorer Tab

item explorer tab


Navigating a Workspace

When a user has entered a workspace, five tabs are visible: Item Explorer, Test Assembler, Asset Manager, Test Scheduler, and Report Manager. Use of these tabs is the primary focus of this manual.

Also note the additional links on the edges of the screen. In the upper right corner of Figure 3.1 on the Item Explorer Tab page, there are five links.

  • Clicking on the workspace name link takes the user back to the workspace overview screen (Figure 2.1 on the Workspaces page) (This is available to workspace admins only).
  • Clicking on Edit Profile allows the user to edit personal information as well as change his or her password.
  • Clicking on the Manual link will open this manual, and clicking on the Help link opens an email for requesting help.
  • Clicking Logout will log the user out of FastTest.

At the bottom of the screen is the logo for Assessment Systems Corporation, the developer of FastTest. The user can click on the logo to visit the company’s website. In the center of the bottom are additional links to return to the workspace overview page, to logout, to access the manual, and to write a help email, as well as a link to the Terms of Use for FastTest. The current version is also displayed and clicking it will bring up the Version History page of the manual.

Creating Banks and Categories

Having created a workspace to store your items and tests, the next step is to create one or more banks into which you will save your questions/items and from which you will draw items to create your tests.

  • To create a new bank, select New → New Item Bank with the button above the bank listing. A dialog box will then open in which you name your bank and provide a description of it.
    • Note that only workspace administrators can create banks.

FastTest’s item banks are designed to accommodate hierarchical structures so that your banks can reflect the structure of a curriculum, the needs of large item banking projects, or any other application for which structured item banks are useful.

  • You can place items in any category of the hierarchical bank structure.
  • Items can be freely moved (by drag-and-drop) among the categories within a bank or between banks within a workspace.
  • In addition, bank categories (including all of the items in them and subcategories below them) can also be moved both within and between banks by drag-and-drop.

The menu of category commands is available by right-clicking on any category. In addition to creating a new item or category, you can edit the category information and delete the category.

  • The menu also contains commands regarding the items in that category
    • Show all items in the item listing
    • Export items
    • Import statistics for existing items
    • Export statistics of items.

When you create a new category, or edit an existing category, you will see a dialog with two tabs, as shown in Figure 3.2 below.

  • The Information tab contains the name, short description, and location (path) of the category within the bank.
  • The second tab is a larger text field that allows you to record specific objectives or guidelines for items in that category.

These objectives will be visible whenever a user is editing an item, to remind them exactly what content or other information items in a given category must take into account.

Figure 3.2: Edit Category Dialog

Edit Category Dialog

When creating a new category, you can also create any number of subcategories at the same time by specifying the folder path for the subcategories on each line.

  • For example, if you needed to create a category named Algebra with three subcategories named Linear, Quadratic, and Graphs, you would enter each category path on its own line in the Name field.
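Such a multi-line Name entry might look like the sketch below; note that the slash-separated path syntax shown is an assumption for illustration, and the exact syntax FastTest expects may differ.

```text
Algebra
Algebra/Linear
Algebra/Quadratic
Algebra/Graphs
```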

If you are creating a new bank and already know the folder structure you wish to use, this method allows you to create the entire folder structure at once.

Viewing Banks, Items, and Categories

After banks and categories are created or imported, they are visible in the Item Explorer Tab. The same is true for items. Figure 3.3 below shows a list of items in a Math bank.

  • The number of items in the bank or category is presented in parentheses as (x of y).
    • The first number indicates the number of items contained in the immediate root of the folder while the second number indicates the total number of items at all levels within the folder.
    • In this example, there are 11 items in the Math bank; 3 are in the root, while the remaining items are in categories (Algebra, Arithmetic, Calculus, & Geometry).
Figure 3.3: Item Explorer showing a list of items
item explorer banks
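The (x of y) counts described above can be sketched as a recursive tally (an illustration only; the folder structure here is hypothetical but matches the Math bank example):

```python
# Hypothetical Math bank: 3 items in the root, 2 in each of four categories.
math_bank = {
    "items": ["i1", "i2", "i3"],
    "Algebra":    {"items": ["i4", "i5"]},
    "Arithmetic": {"items": ["i6", "i7"]},
    "Calculus":   {"items": ["i8", "i9"]},
    "Geometry":   {"items": ["i10", "i11"]},
}

def counts(folder):
    """Return (x, y): items in the immediate root, items at all levels."""
    immediate = len(folder.get("items", []))
    total = immediate + sum(
        counts(sub)[1] for key, sub in folder.items() if key != "items"
    )
    return immediate, total

counts(math_bank)  # (3, 11), displayed as "(3 of 11)"
```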

Note, however, that if you import categories and their items, they will not be immediately visible on the screen. You need to click the Sync button so that the screen shows the state of your bank after the items have been imported. You will receive an email when the import is complete.

The Item Explorer is similar to Microsoft Windows Explorer, in that items are arranged in a nesting system of folders, which can be moved by drag-and-drop, renamed, and minimized/maximized by clicking on the plus/minus sign.

  • To view a list of immediate items in a category or bank, click the category or bank’s label.
  • To view a list of all items contained in a bank or category, drag the bank or category into the list space or right click the bank or category and select Show All Items.
  • Large numbers of items are paginated; you can control the number of items per page, and which page to view, immediately above the list.

Copying an Existing Item

An existing item can be copied by right-clicking on the item in the list of items (Figure 3.3). A menu will appear with an option to Duplicate Item. Items can be selected and moved (by drag-and-drop) from the list of items to a bank or category on the left.

Adding Items to Your Banks

FastTest provides two ways for you to enter items into your banks: directly through the Edit Item dialog, or with import files. The Edit Item dialog (Figure 3.4) appears when you create a new item or open an existing item.

  • A new item can be created by clicking the New button, or by right-clicking any bank or category, and then selecting New Item.
    • Note: In order to create a new item via the New button, the bank or category that will contain the item must first be selected.
Figure 3.4: Edit Item Dialog – Information Tab
edit item info tab


Creating Items Directly in FastTest’s Edit Item Dialog

The Edit Item dialog contains six default tabs.

  • The Information Tab records basic information about the item, such as name, version status, and author. It also lists all tests that currently use the item.
  • The Content Tab is where you enter the text of the item and the answers.
  • The Statistics Tab holds item statistics and graphs IRT functions.
  • The Comments Tab records comments made on the item by reviewers.
  • The Review Tab provides a summary of the values for custom fields used for review.
  • The Objectives Tab simply presents the objectives of that category to provide guidance to item writers.

These tabs will be discussed in more detail below, under Editing items.

Importing Multiple Items From a Word Processor or Text File

Existing banks can be imported in seven different formats:

  1. PC FastTest 2.0 export format
  2. QTI 1.2 (Question and Test Interoperability)
  3. QTI 2.1
  4. Word XML
  5. Tagged RTF/TXT file (@@signs)
  6. Simple, text-based import
  7. Excel XLSX
  • To import items, click the Import button or right-click on a bank and select Import → Items.
  • A dialog window (Figure 3.5) will then appear where you specify the path of the file to import and the destination bank.
    • Note that file uploads are limited to 100 MB, so it might be necessary to separate your files before importing. Optionally, you can have a uniform style applied to the items that are imported. This style can be defined differently for each bank (on the Formatting tab of the Edit Bank dialog).
  • The menu of commands from right-clicking on a bank is shown in Figure 3.6.
Figure 3.5: Item Import Dialog

item import1

Figure 3.6: Item Bank Right-Click Menu

item right click

FastTest supports the following import formats:

import format table2

Wondering which method fits best for you? Follow the questions below for a recommendation.

  1. Do I currently use another banker that can export to the QTI 1.2 or QTI 2.1 format?
    Yes: Use the QTI 1.2 or QTI 2.1 format.
    No: My current item banker exports to RTF format or another text format, or my items are currently in Word documents or a similar state. Proceed to Question 2.
  2. Do I have ancillary information to import, such as Author or Statistics?
    Yes: Proceed to Question 3.
    No: Use the Word XML format (if item text is rich/formatted ) or the simple text importer (if item text is plain text).
  3. Do I need to import tables or text boxes?
    Yes: Use the Word XML format to import items and images, then use the Import Statistics feature to import the remaining information.

Importing Item Metadata to Existing Items

If you want to update item metadata to existing items, you can import a CSV file with the desired changes. This is especially helpful if your items were imported with an import format that does not support metadata but only supports item content (such as the Word XML import).

  • To import metadata, you must first generate an import file.
    • Right click on the bank or category that contains the items you wish to update and select Export → Item Metadata. This will download a CSV file that you can open, make changes, save a copy, and reimport. You can delete any columns that you do not wish to make changes to, but the UNIQUE ID column must remain intact.

The metadata you are able to change includes the following:

metadata you can change


metadata you can change2

Note: When you save a copy of the exported file to a new location, it MUST be saved as a .CSV file. By default, Excel will save it as “Unicode Text (*.txt)”, so a Save As must be done to explicitly change the file type to “CSV (Comma Delimited).” Importing will overwrite existing values and leaving a cell blank will cause that value to be erased.

Certain columns have specific requirements.

  • The columns related to statistics fields must be numbers.
  • The STATUS column must be one of New, Review, Active, or Retired.
  • The VERSION column can only be changed by creating a new version of an item in FTW.

After making changes, and saving as a CSV file, you are ready to import the file.
To import the file, right click on a bank or category and select Import → Item Metadata. It does not matter which you choose because the importer will use the ITEM KEY field. A dialog will appear which you can use to browse for your file and upload.

Changes made to the CSV file and imported will immediately be reflected in FastTest. For example, changing the category of an item will result in the item being moved from the previous category into the new category.
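If you prefer to edit the exported file with a script rather than Excel (which avoids the file-type pitfall noted above), a minimal sketch looks like this. The column names follow the manual's description, but the data shown is hypothetical:

```python
import csv
import io

# Hypothetical exported item metadata (column names per the manual).
export_csv = "UNIQUE ID,NAME,STATUS\n101,Item A,New\n102,Item B,Active\n"

rows = list(csv.DictReader(io.StringIO(export_csv)))
for row in rows:
    if row["STATUS"] == "New":
        # STATUS must remain one of New, Review, Active, or Retired.
        row["STATUS"] = "Review"

# Write the updated rows back out as comma-delimited CSV,
# leaving the UNIQUE ID column intact.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["UNIQUE ID", "NAME", "STATUS"],
                        lineterminator="\n")
writer.writeheader()
writer.writerows(rows)
```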

Chapter 3B: The Item Explorer Tab – Editing Items



As mentioned, items can be opened for editing by right-clicking on them in the items list or simply double-clicking them. This opens the Edit Item dialog, which has seven tabs:

  • Information
  • Content
  • Statistics
  • Comments
  • Review
  • Objectives
  • Rubrics

An item is locked if another user currently has it open; a workspace administrator has the option to override this and unlock the item. An item will unlock automatically once a user saves or closes the item, logs out of the system, or loses the session (times out).

The Information Tab

The Information Tab, shown in Figure 3.4 in Chapter 3A, contains information about an item that is not related to content or psychometrics. The following is a list of the fields.

Info tab fields

Custom item fields and the list of tests that use the item are also shown on the Information Tab. The History button displays important events in the item’s history.

The Content Tab

The Content Tab is where you enter the actual text of the item (shown below). The stem is entered in the box on top. To add an answer, click the Add Answer link at the bottom. The number of answers is unlimited.

Figure 3.7: Edit Item Dialog – Content Tab

IE content tab

The correct answer, or key, is specified by utilizing the Answer Weight field. This appears beside the answer marker in an input box.

  • For conventional number-correct scoring, specify the correct answer with a weight of 1.0 and the incorrect answers with a weight of 0.0.
    • The answer weight is used directly in the score calculations.
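As a sketch of this rule (not FastTest's actual code), scoring a conventional item reduces to looking up the weight of the selected answer:

```python
# Minimal sketch of weight-based scoring: the weight of the selected
# answer is used directly in the score calculation.
def score_response(answer_weights, selected_index):
    """Return the points earned for one conventional item."""
    return answer_weights[selected_index]

# Four-option item keyed to option B (index 1).
weights = [0.0, 1.0, 0.0, 0.0]
print(score_response(weights, 1))  # 1.0 — correct answer
print(score_response(weights, 2))  # 0.0 — incorrect answer
```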

At the top of the Content Tab, you will see the Response Type drop-down. This allows you to specify the type of response expected from the examinee. The types currently supported by FastTest are described in the table below.

Examples of each Item Response type can be found in Appendix K: Item Authoring.

 item response type table


Multiple Response Options

  • Partial Credit multiple response item
    • Examinees will receive some credit for the correct answers they choose.
      • Item writers mark which answers are correct by entering appropriate answer weights for each answer.
        • For example, in the sample item below the two correct answers are each worth .5 points.
        • The examinee can earn 1 point, .5 points, or 0 points on this item.

MR Partial credit item

    • The number of answers an examinee may select can be controlled using the drop-down menu (MR menu button) located to the right of the Response Type drop-down.
    • Click on Scoring to open the window shown below.

MR partial scoring options

  • The Limit Examinee Responses drop-down restricts examinees to selecting no more than the indicated number of responses.
    • NOTE: Unlimited will allow examinees to select all answers and therefore receive credit by default for having included the correct answers.
  • All-or-nothing multiple response item
    • Examinees will only receive credit if they correctly select all of the correct answers.
      • Item writers mark which answers are correct by checking the box to the left of the answer content.
      • For example, in the sample item below the two correct answers are indicated by the check marks.
      • The examinee can earn full credit or zero credit.

MR all or nothing item

    • The credit given for the item can be specified using the drop-down menu (MR menu button) located to the right of the Response Type drop-down.
    • Click on Scoring to open the window shown below.

MR scoring options

    • Enter the desired Total Points for the item.
    • The Limit Examinee Responses drop-down restricts examinees to selecting no more than the indicated number of responses.
      • NOTE: Unlimited will allow examinees to select all answers and therefore receive credit by default for having included the correct answers.
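The two multiple-response scoring rules described above can be sketched as follows. This is illustrative Python, not FastTest's implementation:

```python
# Partial credit: sum the weights of every answer the examinee selected.
def score_partial(answer_weights, selected):
    return sum(answer_weights[i] for i in selected)

# All-or-nothing: full credit only if the selected set exactly matches
# the keyed set; otherwise zero.
def score_all_or_nothing(correct, selected, total_points=1.0):
    return total_points if set(selected) == set(correct) else 0.0

weights = [0.5, 0.0, 0.5, 0.0]               # two keys worth .5 each
print(score_partial(weights, {0, 2}))         # 1.0 — both keys chosen
print(score_partial(weights, {0}))            # 0.5 — one key chosen
print(score_all_or_nothing({0, 2}, {0, 2}))   # 1.0 — exact match
print(score_all_or_nothing({0, 2}, {0}))      # 0.0 — incomplete
```

Note how the partial-credit rule also illustrates the Unlimited warning: selecting every answer still earns full credit, because distracters carry a weight of 0.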


Likert-Type Items

Likert-Type items have a weighted scale associated with them. A scale is a reusable set of responses.

  • A scale can be defined by clicking the Create/Edit Scales link that appears on the left hand side when Likert-Type is selected.
    • A new dialog will appear that shows all of your currently defined scales and allows you to create, edit, and delete scales. Creating a new scale will present you with the following dialog:
Figure 3.8: Edit Scale Dialog

edit scale dialog

  • Each response is given a width, a label, and a weight.
  • When a test is scored, the weight of the selected response acts the same as the weight of a selected (traditional) answer.
    • Optionally, either the first or the last response can be given a blank weight to act as an N/A option. The N/A option will be recorded as answered, but it will not affect the overall score. It will also be visually separated from the other responses in the scale during the online test.
    • Options can be given the same weight to collapse categories for IRT scoring. This is typically done when IRT analysis finds that particular options have too few respondents to calibrate with IRT (too small a sample size); those options are combined with options that received more responses.
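The scale behavior described above can be sketched as follows. This is illustrative only: the labels and weights are invented, and treating a selected N/A option as contributing zero to the total is an assumption based on the description.

```python
# A scale is a reusable set of (label, weight) responses. A blank
# (None) weight marks an N/A option: recorded as answered, but it
# does not affect the overall score.
scale = [
    ("N/A", None),
    ("Disagree", 1),
    ("Neutral", 2),     # same weight as "Agree": collapsed categories
    ("Agree", 2),
    ("Strongly Agree", 3),
]

def score_likert(scale, selected_index):
    label, weight = scale[selected_index]
    return 0 if weight is None else weight

print(score_likert(scale, 4))  # 3 — weight of the selected response
print(score_likert(scale, 0))  # 0 — N/A: answered, no score impact
```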

Scale Layout controls how the stem is placed relative to the scale:

  • Stem Left
    • Stem text is to the left of the scale and its radio buttons. In the testing engine, the scale labels will be printed once at the top of the page in a fixed pane and many Likert items will be able to reference that pane.
  • Stem Top
    • Stem text is above the scale. The stem and scale each take up the same width. The scale is repeated for each item.
  • Stem Embedded
    • The scale is part of the stem. Requires special modification of the underlying markup (HTML) of the item. Documentation pending.


Drag and Drop Items

Drag and drop items offer three options: Place Answers (Insert Blanks), Place Answers (Insert Lists), and Single List.

  • Place Answers – Blanks and Images
    • Include instructions for the item in the first box.
    • The Target Container holds what is presented to the examinee and is where they will place their answers.
      • To create a target element in text, first enter all of the text.
        • If the item includes an image, insert the image(s) as well.
        • The distance is the amount of pixel variance given to examinees when the Snap check box is selected.
          • A low value like 20 will require the placement to be relatively exact.
          • A higher value like 200 will allow the examinee to drag the element just in the area of the target and it will “snap” to the correct location.
      • Highlight the text that you want to replace with a blank and select Insert Blank.
        • The highlighted text will automatically be removed and replaced with a blank.
        • If the blank is the last word in a sentence before a period, do not include the period when highlighting.
        • Note: Utilize the Preview function to ensure proper placement.

Figure 3.9: Place Answers – Blanks and Images

placeable answers

    • Enter the key(s) and distracters in the answer boxes below.
      • The answer weight for the keys/correct responses must be greater than 1. Incorrect answers/distracters are weighted 0.
      • Only answers given a weight greater than 1 will be options for placing correct answers.

Figure 3.10: Answers and Weights

 placeable answers and weights

    • Click on Place Answers to open the Place Answers Window, shown below.
      • The answers previously given a weight greater than 1 will be available to drag and drop to the correct blank or be placed on an image.
      • Drag your answers from the top box to the desired location in the question below.
        • In the example, we have dragged “Saint Paul” to the target zone in “The capital of Minnesota is ___________.” Saint Paul is the capital of the State of Minnesota.
      • Placeable zones on images are created based on where you place the correct answer and the distance you set using the drop-down in the Item Editor window.
        • The distance is the amount of pixel variance given to examinees when the Snap check box is selected.
          • A low value like 20 will require the placement to be relatively exact.
          • A higher value like 200 will allow the examinee to drag the element just in the area of the target and it will “snap” to the correct location.
      • Clicking Save will return you to the Item Editor.
      • Click Save to finish creating the item.


Figure 3.11: Place Answers Window

 place answers and windows

Figure 3.12: Place Answers – Exam View

place answers exam view
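The Snap distance behavior described above might be modeled as a simple distance check. The sketch below is a rough illustration only, assuming the distance is measured in pixels from the target's center:

```python
import math

# A dropped element "snaps" to the target only if it lands within the
# configured pixel distance of the target location.
def snaps(drop_xy, target_xy, distance):
    dx = drop_xy[0] - target_xy[0]
    dy = drop_xy[1] - target_xy[1]
    return math.hypot(dx, dy) <= distance

target = (400, 300)
print(snaps((410, 305), target, 20))   # True  — placement nearly exact
print(snaps((520, 380), target, 20))   # False — too far for a low value
print(snaps((520, 380), target, 200))  # True  — generous snap zone
```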

  • Place Answers – Lists
    • Multiple lists can be included in an item by using the Insert List button
    • Include instructions for the item in the first box
    • Prepare the Target Container by entering all surrounding text
      • Note: Leave ample line breaks so that the text is not disrupted when the list is inserted.
    • Place your cursor appropriately and click Insert List. A green box will appear in the Target Container.
    • Enter your answers and weights

Figure 3.13: Place Answers – Lists

placeable lists


  • Click Place Answers to open the Place Answers window
    • Answers weighted greater than 1 will be shown in the top box
    • Place the answers in the correct lists and click Save to return to the Item Editor


Figure 3.14: Lists Place Answers Window

place answers window

Figure  3.15: Place Lists Exam View

place answers lists exam view

Single List Item Response

  • Type the instructions or question for the item in the top box.
  • Type the desired target list title into the Target Container box
    • This labels the area into which the examinee will drop their answers.
      • In the example below the title is ‘Cities Located in Minnesota’
  • Provide the key(s) and distracters in the answer boxes.
    • Key(s)/correct answers must be given a weight greater than 0 (e.g., .25, .5, 1)
    • Distracters/incorrect answers are weighted 0.
  • Set the List Limit using the drop down option located towards the top of the Window.
    • The chosen number is the maximum number of answers the examinee is allowed to place in the Target Container.
      • For example, if you are asking them to select 2 from a list of 5, the List Limit is 2.
      • Note: If this setting is left unaltered, examinees could get credit by dragging all options into the box.

Figure 3.16:  Single List Response Type

single list item

  • The figure below depicts how the single list response type is presented to an examinee in an exam.

Figure 3.17: Single List Exam View

single list exam view
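The List Limit note above can be illustrated with a small sketch (not FastTest's code). With no limit, an examinee who drags every option into the container still earns full credit, because distracters are weighted 0:

```python
# Single-list scoring sketch: placed answers contribute their weights.
# A list_limit caps how many options the examinee may place.
def score_single_list(answer_weights, placed, list_limit=None):
    if list_limit is not None and len(placed) > list_limit:
        raise ValueError("engine prevents placing more than the limit")
    return sum(answer_weights[i] for i in placed)

weights = [0.5, 0.0, 0.5, 0.0, 0.0]   # two keys among five options
print(score_single_list(weights, {0, 2}, list_limit=2))  # 1.0
print(score_single_list(weights, {0, 1, 2, 3, 4}))       # 1.0 (!)
```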


Proofing

  • Type your item instructions or general question in the top box
  • The Target Container contains the text that the examinee will be editing.
    • Type the text including the errors that need to be identified and corrected.
    • You can then select the text elements that you wish to be editable or proofable.  To create a text element, use your cursor to highlight the word(s), sentence, etc.
    • Once highlighted, click the PR button on the toolbar (). This will create an editing option for the examinee.
    • To add the correct answer (correction to be made by examinee) to the error, click on the text element with the error and then click Add Answer.
    • Type the correct response.
      • In the example below, “too” is corrected to “two.”
  • Note: If browsers have automatic spell check turned on, misspelled words will be underlined, thus giving away the answers. Spell check will need to be disabled for exams containing Proofing items.

Figure 3.18: Creating a Proofing Item

proofing1

  • Proofing items are presented to the examinee as shown below.
    • The examinee clicks on the error, and a box appears above for the examinee to type the correction into.

Figure 3.19: Proofing Item Exam View



Fill in the Blank

  •  Enter the appropriate text in the top box
    • Instructions, reading passage, etc.
  • In the Target Container, enter the text that will include the blanks
    • Wait to insert blanks until all text is entered to maintain correct formatting
  • Place the cursor where the blank will go and click Insert Blank
  • Only the blank highlighted in green corresponds to the keyed answers shown below
  • Use the arrows (<<) below the Target Container to move between blanks and keyed answers
    • Make sure to enter keyed answers for each blank
    • Use the preview function to view all blanks with corresponding keyed answers
  • Note: Spelling is taken into account when scoring

FITB item
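A sketch of exact-match blank scoring, per the spelling note above (illustrative only; the keyed answers shown are invented):

```python
# Spelling counts: a response earns credit only if it exactly matches
# one of the keyed answers entered for that blank.
def score_blank(keyed_answers, response):
    return 1 if response in keyed_answers else 0

keys = ["Saint Paul", "St. Paul"]       # multiple keyed spellings
print(score_blank(keys, "Saint Paul"))  # 1 — exact match
print(score_blank(keys, "Saint Pual"))  # 0 — misspelling, no credit
```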

Item Layout

Next to the Response Type is Item Layout. Clicking the button brings up a new dialog with layout options, as seen below:

Figure 3.20: Edit Item Layout

edit item layout

Two check boxes exist for adding an Asset Links Pane and/or adding a separate assets pane that creates a Split-Pane View (See below).

  • The Answer Marker is where you specify whether the answers should be marked A-B-C-D or 1-2-3-4, or follow the format of whatever test the item is used in.
  • The Show Rationale button toggles the display of rationale entry boxes in the editor.
    • Rationale can be specified at the item level and/or for each response (boxes under the stem/response).
  • Above each answer, buttons exist for reordering and deleting that answer.

The Content Tab includes a complete set of text formatting buttons like those in a word processing application. Click the Show Toolbar button to make it visible. You can:

  • Change the font
  • Change the alignment
  • Create bulleted lists
  • Insert symbols and equations
  • Highlight text

A description of each button is visible as a hint if you let your cursor hover over the button.

Scoring and Rationale options

  • The Scoring button controls how the item is scored within a test
    • Sum scored (default)
    • Unscored
      • Item will not be taken into account when scoring a test
    • Hand scored
      • An admin or user must enter a score for the item after it has been delivered
      • Items that are scored using Rubrics must be set to hand scored
        • Requires rescoring a test after hand scored items have been given a score

scoring button

  • The Show Rationale button allows item writers to include rationale when creating an item
    • Extra text fields appear under the stem and each answer (shown below)
    • Can be viewed when previewing a test under the Test Assembler tab
    • Rationale statements are included in the solutions key that can optionally be shown to examinees after completing a test


The Asset Manager

Your test items can include multimedia such as graphic images (pictures), audio files, and video files.

FastTest supports:

  • Images: .jpg, .png, and .gif files
  • Video: .avi, .flv, and .mpg files
  • Audio: .mp3 and .ogg files
  • Text (HTML)
  • PDF files (currently available for use as reference materials – see Chapter 4A)
  • Adobe Illustrator (.ai) files, with the option to convert to .png or .jpg
  • The maximum size of an uploaded multimedia file is 100 MB.
    • Multiple assets can be uploaded at once through a zip file.


To add an asset (multimedia) to an item:

  • Click the Insert/Edit Multimedia button in the toolbar (insert asset).
    • This will open the Asset Manager window.  The Asset Manager stores all of the files that have been uploaded within your workspace, making them easy to reuse across items.
  • Select the asset you want to use and click Add.
  • Close the Asset Manager window.
Figure 3.21: Asset Manager

asset manager

To upload a new asset:

  • Individual Asset Upload
    • Open the Asset Manager
    • Select the folder you want to upload to, or create a new folder by clicking on New under Asset Folders
    • Clicking New will display the screen shown in Figure 3.22
      • Input a name (required) and a description (optional)
      • Select the type of file that will be uploaded using the drop-down
      • Click Choose File to browse and select a file
    • Clicking Upload Batch will display the screen shown in Figure 3.23
      • Batch upload requires a zipped file that contains only supported file types
        • The upload will fail if the zipped file contains 1 or more unsupported files
    • Clicking Save will upload the file and it will then appear in the folder
      • A batch upload may take more time. Refresh the page to show the new assets
    • Note: Create a folder structure to better organize the workspace’s assets.
      • Can easily move assets between folders by dragging
      • Assets are automatically arranged in alphabetical order


Figure 3.22: Uploading a New Asset

new asset upload

  • You can preview an asset by clicking on it.
    • This page also allows you to edit the asset name or description, replace the asset by uploading a new file under the same name, and/or download the asset.
  • The table at the bottom of the page details the asset’s usage in the workspace.
    • For example, in the image below the asset is being used in an item named “sample.”

asset usage

Text Assets

  • A text asset could be used to represent a reading passage that is used by multiple items.
    • Text assets are created using a rich text editor (much like items).
  • To create a new text asset, select New → Text Asset.
    • A dialog will appear with Name and Description and the rich text editor for entering the text (shown below).
    • To get existing documents into a text asset, copy and paste directly into the rich text editor and clean up the formatting as needed.

Figure 3.23: Text Assets

text asset

Editing Assets

  • Once an image has been inserted, its size can be changed by selecting the image in the text and then clicking the Insert/Edit Image button in the toolbar.
    • This will bring up a dialog that allows specification of the height/width of the image.
      • (Note: if only the height or width is specified, the image will automatically scale to maintain the aspect ratio.) Audio and video cannot currently be resized.
  • To replace an old file in the Asset Manager select Replace File → Choose File.
    • Optionally, the new file can be previewed before saving.
    • Note: the asset WILL be replaced in all items it is currently used in, including within protected tests.
      • If the replaced asset is not showing in the item editor, reload the page and/or log out and log back in.
  • Additionally, workspaces have the option to be configured to show/edit asset links while previewing an item.
    • To enable this, click Configure → Item Properties → Add links to edit assets when previewing an item.

configure asset options

Other Methods of Associating Assets with Items

Assets can be inserted directly into the item text, but this can be insufficient if the asset is very large or needs to be reused across items. FastTest offers two additional ways of associating assets with items: an asset links pane and a split pane view. These options are accessible on the Edit Item Layout dialog.

Asset Links Pane
For presenting large images to examinees on a computer screen, the maximum screen space should be used with the minimum amount of scrolling. Asset Links can be used to achieve this by displaying assets in overlay dialogs that appear when a link is clicked.

  • To turn on the asset links pane, open the Edit Item Layout
  • Check the box labeled “Add a fixed pane to the top of the item for asset links”
    • A new pane shown below will appear above the item stem.
Figure 3.24: Asset Links Pane

asset links pane

Clicking Add Link will bring up the Asset Manager, where you will select the Asset to link to. After clicking Insert, an input box will appear where you can enter the text of the link that the examinee will click on to view the overlay. Clicking the name of the asset below the input allows you to view the asset and clicking Remove will delete it.

Split-Pane View
An item can be displayed using a split pane view. One side of the view would contain the item stem and answers and the other side would contain one or more embedded assets. This is especially useful in displaying reading passages alongside items.

  • To turn on a split-pane view, click on Layout, and check the box labeled “Add a separate pane for embedded assets.”
  • Choose between the two split pane types:
    • horizontal (top/bottom split)
    • vertical (left/right split, as shown below)
  • A new pane will appear in the item editor.

Clicking Embed Asset will bring up the Asset Manager, where you will select the Asset to embed. Upon inserting, the asset will appear embedded in the pane (shown below). Click Remove to delete it.

Figure 3.24: Item with Vertical Split-Pane

split pane view

You can control the percent of the screen height/width that the assets pane will use by dragging the resizer or entering a number in the input. The pane will scroll as needed when presented to the examinee.

  • Note that the size of the panes as seen in the item editor does not necessarily reflect what the examinee sees. The item editor reflects how the item will appear to the examinee on the smallest supported resolution (1024×768).
    • If the examinee has a larger monitor, the test engine will take advantage of all of the screen space to minimize the presence of scrollbars. To get the best representation of how an item will appear to an examinee, always preview the test.

Embed Asset in Multiple Items in One Operation

If multiple items are going to use the same asset, the assets can be embedded in one operation.

  • Create the items without adding the assets.
    • In the Item Explorer tab, highlight the items that will use the same asset
      • Use the Shift and Ctrl key functions for multiples
    • Right-click and select Embed Assets. This will open the window shown in the figure below.
    • Select Add a separate pane for embedded assets
      • Select a Horizontal or Vertical split
    • Click Add Asset, this will open the Asset Manager window
      • Select the asset to embed and click Add.  You may now close the Asset Manager window.
    • Click Save

Figure 3.25: Embed Assets in Multiple Items

multiple items for an asset

If items have the same configuration of linked and/or embedded assets and are placed next to each other in a test, they will appear together on the same page. Currently, a maximum of 8 items can be on the same page in this way.

  • Randomizing the item order when the test is delivered will prevent this page sharing from taking place as the items will likely no longer be adjacent. This problem will be solved with the new notion of Test Sections in a future release.
    • Note: this page sharing is similar to how Likert items using the same scale will be presented on the same page.

The Equation Editor

FastTest comes with an equation editor built into the Content Tab ( equation editor button ). When clicked, the Equation Editor dialog (shown below) will appear and the user can form equations.

  • The Image Font Size provides some control on the font size in the outputted image. (The image can be resized later, as well, like any other image.)
  • Once the equation has been formed, clicking Add will save the equation as an image and insert it into the item text at the last known cursor position.
  • Existing equations can be edited by double clicking the equation image.
Figure 3.26: Insert Equation Dialog

equation editor


The Statistics Tab

Figure 3.27: Edit Item Dialog – Statistics Tab

stats tab

The Statistics Tab contains fields for item statistics, whether classical, item response theory (IRT), or user-defined. These can all be simply typed into the appropriate text boxes.

  • To enter IRT statistics (called parameters in that context), select the relevant IRT model from the drop-down menu.
    • “User Stat” fields provide a place for any statistics other than the common ones already specified.
    • These labels can be customized on the Items tab of the Configure Workspace dialog by the workspace administrator.

The graph on the right shows the item response function or item information function, whichever is selected in the drop-down menu above. You can view and export the actual values for these graphs by clicking the View Data button in the upper right.
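The curves shown on this tab follow standard IRT formulas. As an illustration (not FastTest's internal code), here is the three-parameter logistic (3PL) item response function with the conventional 1.7 scaling constant; whether FastTest applies that constant is an assumption:

```python
import math

# 3PL item response function: probability of a correct response at
# ability theta, given discrimination a, difficulty b, and guessing c.
def irf_3pl(theta, a, b, c):
    return c + (1.0 - c) / (1.0 + math.exp(-1.7 * a * (theta - b)))

# At theta == b, the probability is exactly halfway between c and 1.
p = irf_3pl(theta=0.0, a=1.2, b=0.0, c=0.2)
print(round(p, 2))  # 0.6
```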

Graded Response Model

To import parameters for the Graded Response Model (GRM) for multiple items:

  1. Change one item to the GRM (as shown below) and hand-enter the parameters. Save and close.
  2. Run a metadata export – this provides the file that can then be used to import additional metadata.
  3. Add the parameters to the items that require them. Save the file.
  4. Import the metadata.


GRM parameters
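For reference, the GRM computes the probability of each score category as the difference between adjacent boundary curves. The sketch below is illustrative math only, with invented parameter values:

```python
import math

# Graded response model: with discrimination a and ordered thresholds
# bs, the boundary curve P*(k) is a logistic in theta, and category
# probabilities are differences of adjacent boundaries.
def grm_probs(theta, a, bs):
    star = [1.0]
    star += [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in bs]
    star.append(0.0)
    return [star[k] - star[k + 1] for k in range(len(bs) + 1)]

probs = grm_probs(theta=0.0, a=1.0, bs=[-1.0, 0.0, 1.0])
print([round(p, 3) for p in probs])
print(round(sum(probs), 6))  # probabilities always sum to 1.0
```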

Comments Tab

The Comments Tab is an open place for comments on the item. Users with the Item Reviewer role can record comments on items but are unable to modify any fields. FastTest automatically records the date the comment was made and the identity of the commenter. Clicking one of the comment buttons will bring up the comment dialog. Unlike other dialogs in FastTest, this dialog will not block interaction with the background, so the commenter is free to change tabs in the Edit Item dialog.

Review Tab

The Review Tab summarizes the values for custom item fields marked for review. The data is presented in a tabular format: columns are the custom item fields used for review and rows are users. This tab is only accessible to workspace administrators and bank managers. See Reviewing Items.

Objectives Tab

The objectives tab presents the objectives or guidelines that have been defined for the category to which the current item belongs. The purpose of this tab is simply to provide guidance to item writers. If categories of content in your testing program do not have specific objectives, this tab can be ignored.

Rubrics Tab

The Rubrics Tab allows users to access rubrics, which can be created only when Item Marking has been activated by an administrator editing the workspace.

  • From the workspace homepage click Edit
  • Go to the License tab
  • Check the box Allow Online Marking
  • There will now be an Item Marker tab within the workspace, as shown below

item marker tab


Rubrics are useful for hand scoring items, such as short answer or essay response types, within the system.

Creating Rubrics and Defining Scores

  • Under the Item Marker tab, click on Create/Edit Rubrics
    • Note: Only Admins are allowed to create rubrics. Non-admin users can assign and mark rubrics.
  • Clicking Create New Rubric results in the screen shown below
    • Enter a name and description of the scoring rubric

new rubric

  • Click Create score to add score options for the rubric
    • Enter a name
    • Select the desired score
    • Provide a description
    • The Explanation box can be used to provide examples of an examinee response that warrants the score
    • Create multiple score options for each rubric
      • Scores range from 1 to 5
    • Once all score options are created click Save Rubric
      • Saved rubrics will be displayed under the Create/Edit Rubrics link
  • The example below is for a spelling rubric. The score shown would award 5 points, indicating the student’s response contained 1 or fewer spelling errors.

spelling rubric

Assigning and Marking Rubrics

  • Once a rubric has been created it can be assigned to an item
    • Note: The scoring for the item must be set to Hand Scored
      • Under the Content Tab, click the Scoring button and select Hand Scored
    • Note: Once assigned to an item the rubric can no longer be edited
  • From the Edit Item dialog under the Rubrics tab click Add
  • Select the rubric, click Save
  • Once an item has been added to a test, protected, and delivered it will appear under the Item Marker tab as shown below.


  • Selecting the box for the test session will cause individual items within that test to appear below
  • Once an item is selected, click the Assign or continue marking item button, which opens the Mark an item page shown below
    • The question and examinee’s response are shown on the right side
    • Mark, or score, the response by selecting from the score options on the left
      • Clicking the question mark will provide the description of the scores within the rubric

mark an item

  • Once all of the items within a protected test have been scored via the Item Marker tab, the examinee’s test will need to be rescored.
    • This can be done under the Test Scheduler tab


Multiple Markers

If you have a marking situation where you need more than one person to mark each student response (such as two teachers reading each student essay), this can be enabled with the Marking Scheme drop-down on the Marking Settings screen, as shown below.

  • To access Marking Settings, click on the Rubric name from the Administration page.

admin page

  • Because a student will receive only one score on each rubric, the subsequent drop-down specifies how to handle marker disagreement: use the average mark or the highest mark.
    • For example, if one teacher gives a mark of 2 points on grammar and the other gives a 3, the final score can be 2.5 or 3. Note that if you use IRT to score the tests, the model only accepts whole numbers.

marking settings
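The resolution rules above can be sketched as follows. This is illustrative only; note that Python's built-in round() uses banker's rounding, so 2.5 rounds to 2, and an IRT-bound score would need an explicit rounding policy.

```python
# Resolve two markers' scores into one: average mark or highest mark.
def resolve_marks(marks, scheme="average"):
    if scheme == "highest":
        return max(marks)
    return sum(marks) / len(marks)

marks = [2, 3]                            # the grammar example above
print(resolve_marks(marks))               # 2.5 — average
print(resolve_marks(marks, "highest"))    # 3   — highest
print(round(resolve_marks(marks)))        # 2   — banker's rounding
```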

A third approach to dealing with marker disagreement is adjudication: bringing in an additional marker to provide another rating. That rating can then be included in the average or serve as the final mark for the student response; this is selected in “How to resolve rubric score with adjudication.” As an administrator, you can choose the score difference that triggers adjudication.



Chapter 3C: Reviewing and Searching for Items


Reviewing Items

In addition to commenting, item reviewers are able to enter data about an item through the use of custom item fields. To learn how to define custom item fields used for review, see Custom Fields on the Workspaces page.

  • To review an item, right click an item in the item list and select Review Item.
  • A dialog will appear that shows the item alongside the input fields (Figure 3.16).
  • Each reviewer will have his or her values saved for each item.
Figure 3.16: Item Review Dialog

item reviewer

  • The check box labeled “Automatically save when clicking next or previous” will make reviewing faster by allowing users to enter values and immediately continue reviewing without being prompted to save.
  • Users are also provided with a check box labeled “Show Answer Key” and can choose to turn the feature on or off.

The data for a specific custom field can be exported in a CSV for all items in a bank and/or category.

  • Simply right-click the bank or category containing the items that have been reviewed, and select Export → Review Field.
  • A dialog will appear that allows you to select the custom field that you want to export.

Note: only custom item fields used for review will appear here. In the generated CSV, each row represents an item and each column represents a user.
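The exported layout can be processed with a few lines of Python; the header names below are invented for illustration:

```python
import csv
import io

# Hypothetical review-field export: each row is an item, each
# remaining column is a reviewer's value for that item.
raw = """Item,reviewer_a,reviewer_b
MATH-001,4,5
MATH-002,3,3
"""

for row in csv.DictReader(io.StringIO(raw)):
    ratings = [int(v) for k, v in row.items() if k != "Item"]
    mean = sum(ratings) / len(ratings)
    print(row["Item"], mean)  # per-item average across reviewers
```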

Searching for Items

FastTest has extensive item search capabilities. You can search for items that you want to find for editing purposes, or if you have a test open, you can search for items to be added to a test.

  • The item search options are accessed from the area in the upper right of the Item Explorer bank view (not when an item is already open for editing).
  • There is a quick search box immediately available, which searches for the specified string in the item’s name, description, keywords, author, and source.
  • The Advanced Search link brings up a new dialog (Figure 3.17) to search on additional fields.
    • This dialog is useful in test assembly, as you can search for items that meet your statistical/psychometric specifications.
    • Advanced searches can be saved by clicking Save at the bottom of the dialog.
    • To access saved searches open the Advanced Search and click Open at the bottom of the dialog.
Figure 3.17: Advanced Item Search Dialog

advanced item search

Chapter 3D: Deleting Categories and Items in the Bank, and Item Versioning

Deleting Categories and Items in the Bank

  • To delete a category, right-click on the category and select Delete Category, or highlight the category in the Item Explorer tree and either click the red Delete button or press the Delete key.
  • You will be asked if you also wish to delete subcategories and/or items.
    • By default, items and subcategories in a deleted category will not be deleted but will be moved up a level in the tree.
  • To delete an item or items, select items in the item list and either right-click on one of the items and select Delete Item(s), click the red Delete button, or press the Delete key.
    • Use the Control key to select multiple individual items or the Shift key to select a range of items. You can also right-click and Select All.

Note: Selections are not preserved if the current page of items is changed.

  • An item can be deleted only if it does not appear in any tests.
  • If you want to delete an item and it appears in one or more tests, you will first have to either delete the test(s) or delete that item from the tests in which it appears.
    • A list of tests in which an item appears can be seen on the Information tab in the Item Editor when the item is opened for editing. This is to help ensure that accurate records are kept of past tests.

Note: Items that are part of tests that have been taken cannot be deleted. Instead, these items can be set to the “Retired” status in the Edit Item dialog, and they will no longer appear in the item list or bank structure. Use advanced search to find items in the “Retired” state.

Item Versioning

If you attempt to change the content or IRT statistics of an item that is part of a protected test, the changes will be saved as a new version of the item.

  • Note that changes made to tabs other than the content tab will not cause an item to be versioned. A message will appear at the bottom of the content tab for an item if it is part of a protected test.
Figure 3.18: Item Content Tab with Versioning Warning

item versioning warning

Sometimes a typo is discovered in an item and a group of examinees is already scheduled to take it. Normally, fixing the typo would cause the item to be versioned, and a new test would have to be built, swapping out the old version for the new one. To shortcut this process, FastTest allows Workspace Administrators to make changes to the text and/or answer weights without causing the item to version.

  • At the bottom of the tab, a checkbox will appear that is labeled “Make change without versioning”. Once checked, certain actions will be disabled, such as adding answers, removing answers, and changing the item response type.
    • Note that if an answer needs to be added/deleted or the item response type needs to be changed, the item must be versioned.
  • Updating a protected item without versioning can also be used to fix scoring issues in the case that the correct answer was miskeyed.
  • Update the answer weights without versioning and rescore the affected examinees.
    • As with fixing typos, this should be done with extreme caution, since live data is being modified.

You can also manually create a new version of an item that is not part of a protected test by selecting Save New Version at the bottom of the dialog.

Figure 3.19: Item Content Tab

item content tab

Chapter 4A: The Test Assembler Tab – Information and Online Delivery Tabs

After all of the items you need have been added to your banks, you can begin to assemble tests. Tests are assembled and managed in the Test Assembler tab, the second tab in FastTest. You begin by creating a new test in this tab, then returning to the Item Explorer tab to add items to the test.

  • Tests are organized into groups (folders).
    • Before creating any tests, it is recommended that you first create a group for the test to go into by clicking the New button and then selecting New Group.
    • Then, either right-click on the group folder or select the group and click the New button, and select New Test.
      • For example, if you are going to be creating vocabulary tests, create a group named “Vocabulary” as shown in Figure 4.1.
Figure 4.1: The Test Assembler Tab

test assembler tab

Creating a New Test

When you create a new test, or edit an existing test, a new dialog (the Edit Test Options dialog) will open with tabs for specifying important information about the test (Figure 4.2).

The Information Tab

Figure 4.2: New Test Dialog – Information Tab

information tab

  • The first tab is the Information tab, which contains fields for basic descriptive information such as the test name and description.
    • The “Answer Marker” refers to how you want your test answers (also known as options or alternatives) labeled when the test is delivered to examinees.
      • You can have letters (ABCD) or numbers (1234), or no markers if delivered online (examinee clicks the answer text or clicks on a radio button or check box; see Online Delivery tab).
      • This will not override what you have selected for the individual items, but will be applied if you indicated for each item that the marker would be specified at the test level.
  • You can also set the test to be public or private. A private test can only be edited and used by its author.
    • For example, the various teachers in a school can only use and modify their own tests and not everyone else’s tests.
  • Tests can have custom fields, which allow a workspace to add any additional fields that may be missing from FastTest.
    • The example workspace above has one custom field called Grade Level.

Not all of the fields in the Edit Test Options dialog need to be specified immediately – you can return to complete the fields after adding items to your test.

The Online Delivery Tab

Figure 4.3: New Test Dialog – Online Delivery Tab

test assembler online delivery tab

The Online Delivery tab (Figure 4.3) specifies information for online delivery of the test.

  • Select the type of test from the Test Method drop-down. Each test method is discussed in detail in the following sections.
  • You can specify the text the examinee presses to submit the test and the text of the confirmation message the examinee agrees to before it is submitted.
  • A time limit can be set here in minutes (leave at zero for an untimed test). The test will be submitted automatically once the time limit is reached.
  • Status icons, which let an examinee see the completion status of the questions or jump to specific questions, can be shown or hidden.
  • The test can optionally be taken by examinees anonymously, meaning they do not need to provide their name.

A test has many timing options.

  • A time limit can be set on the test.
    • Once the time expires during a test, the examinee’s test will automatically be submitted, and any items left unanswered will be counted as skipped (even required items).
    • FastTest supports extra time accommodation through the use of a multiplier.
      • If a test has a time limit and allows for extra time accommodation, examinees that require extra time will have a new time limit equal to the old time limit times the multiplier.
        • For example, if the time limit for a test is 1 hour and its multiplier is 1.5, examinees requiring additional time would have a time limit of 1 hour 30 minutes. The multiplier will also apply to any other time limits in the test (i.e., in a test section).
    • If no time limit is set, the elapsed time can be shown to the examinee.
    • Time that the examinee spends on a test will be recorded regardless of what options are used.
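
The extra-time arithmetic described above can be sketched in a few lines of Python; the function name is ours for illustration, not part of FastTest:

```python
def adjusted_time_limit(limit_minutes: float, multiplier: float) -> float:
    """Apply an extra-time accommodation multiplier to a base time limit.

    The same multiplier would also apply to any section-level limits.
    """
    return limit_minutes * multiplier

# A 60-minute test with a 1.5x accommodation yields a 90-minute limit.
print(adjusted_time_limit(60, 1.5))  # → 90.0
```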


  • Checking the box ‘Allow examinees to log out of the test’ will create a new button for exiting the online test before it is completed.
    • The examinee can then log back into the test at a later time by using the original test code.
  • Checking the box Show test name will display the test name in the delivery engine.

Enabling Access to Reference Materials

FastTest can provide a link for examinees to have access to reference material during an exam.

  • Checking the box Give examinee access to reference material during test will cause the Select Asset Folder button and Reference Material Link Text text box to appear.

access to reference material

  • Select which asset folder to provide access to
    • Note: Only PDF assets within the chosen folder will be accessible to examinees
    • Folder hierarchy will be maintained and any PDFs located in subfolders will be available to examinees
      • Option to select subfolder only
  • Option to individualize the reference material link text
    • Defaults to ‘Reference Library’
  • Click Save
  • During an exam, the Reference Library link will be located above the item navigation

ref library link

  • Clicking on the link will open a new window with a list of the PDF assets from the chosen asset folder

Ref library link window

  • Examinees can click on the file name to open the document
  • To return to the list, click on Open a Different Document located in the upper right corner
  • Click on the X to close the reference material window

reference opened

Chapter 4B: The LOFT Options Tab

This tab includes options unique to a LOFT.

Figure 4.4: New Test Dialog – Loft Options Tab

LOFT options tab

A linear-on-the-fly test (LOFT) is a test delivery method where every examinee receives the same number of items, but a different set of items than other examinees.

  • This makes it halfway between a traditional linear test (where every examinee receives the same number and the same set of items) and an adaptive test (where examinees receive a different number of items as well as a different set of items).
  • With LOFT, every examinee will have their own test form constructed when they take the test, based on the specifications you provide, which greatly enhances exam security.

LOFTs can be constructed by selecting a total number of items, or by balancing across content domains.

  • For example, under the standard model of linear tests, the same 100-item test form with 20 items in each of the five domains is delivered to every examinee.
  • With LOFT, you can establish a pool of 150 items with 30 in each area.
    • FastTest's intelligent test generator will custom-build a test for each examinee by selecting 20 out of the 30 items in each area.
      • Because every person will have a different test, the content is much more secure.
      • Additionally, the assessment has greater perceived fairness because all the tests will be built from the same specifications, and every examinee is presented the same number of items.
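
The domain-balanced selection described above can be illustrated with a short sketch. This shows the general idea of building a per-examinee form from a blueprint, not FastTest's actual generator:

```python
import random

def assemble_loft_form(pool, targets, seed=None):
    """Build one examinee's LOFT form by randomly sampling the target
    number of items from each content domain of the pool."""
    rng = random.Random(seed)
    form = []
    for domain, n_items in targets.items():
        form.extend(rng.sample(pool[domain], n_items))
    return form

# A 150-item pool: 30 items in each of five domains; select 20 from each.
pool = {d: [f"{d}{i:02d}" for i in range(30)] for d in "ABCDE"}
form = assemble_loft_form(pool, {d: 20 for d in pool}, seed=42)
print(len(form))  # → 100
```

Each examinee (each call with a different seed) receives a different 100-item form drawn from the same specifications.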

The LOFT delivery method is specified when editing test options.

  • When creating a new test, or editing an existing one, go to the Online Delivery tab of the Edit Test Options menu and select LOFT from the Test Method drop-down menu.
  • Selecting the LOFT method will create a new tab entitled LOFT Options.
    • This new tab allows test editors to edit the Content Constraints and other specifications for this test method.

Note: If there are no items in the current test, then you will need to return to Edit Items to add test questions. Items will need to be added to the test in order to set the options for this test method.

There are two ways a LOFT can be used:

  1. Build the test only by selecting a total number of items
  2. Build the test according to the Content/Domain specifications: use the Edit Content Constraints button to choose a target number of items for each individual Test Section.
    • First, Test Sections need to be built that implement the desired content outline.
    • The user will then need to return to the LOFT Options tab and check the box next to Enable Content Constraints in order for the test to give the constraints precedence.
    • The Constraint Input Type can be based on a percentage or on the number of items.
    • Enabling content constraints also allows calculation of subscores in the test, which is often necessary for student feedback reports. To allow subscores, go to the Scoring tab and check the box towards the bottom entitled Enable Subscores, as seen below.
Figure 4.5: Edit Content Constraints with Number of Items Mode

enable content constraints number

Figure 4.6: Edit Content Constraints with Percent Mode

enable content constraints

When all of the settings have been determined, click Save at the bottom of the dialog box.

Chapter 4C: The CAT Tab

This tab (Figure 4.7) includes options unique to a CAT.

Figure 4.7: New Test Dialog – CAT Tab

CAT tab

CAT is a delivery method that uses IRT scoring to deliver items intelligently to each examinee. The test adapts itself to each examinee, so that high-ability examinees do not waste time on easy items and low-ability examinees are not discouraged by difficult items. A CAT consists of five components:

  1. Item pool – a set of items calibrated with a psychometric model (e.g., IRT);
  2. Initial θ (ability estimate) – where the algorithm should begin;
  3. Item selection method – the process of matching items to examinee;
  4. θ estimation method – the mathematical approach to determining θ based on responses to the items that have been administered to an examinee;
  5. Termination criterion – the mathematical and/or practical constraint that must be satisfied for an examinee’s test to end.

The item pool is defined by you in the process of developing the test in FastTest. This leaves the remaining four components to be specified.

Initial θ
CAT selects each item for an examinee individually by determining which item in the bank is most appropriate for their ability level. The ability level (θ) estimate is updated after each item.

  • For the first item, however, there is no θ estimate because no items have been administered yet; there is no way to score the examinee, so a temporary θ must be assigned.
  • The simplest method is to assign every examinee to start at the same level, typically the average of the distribution, often θ = 0.0 (Option 1).
    • This has the drawback that every examinee will see the same first item unless there is randomization in the item selection (next paragraph).
    • A simple way to address this is to randomly pick an initial θ for each examinee within given bounds (Option 2).
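
Option 2 can be sketched as a single random draw within bounds. This is illustrative only; the bounds shown are assumptions, not FastTest defaults:

```python
import random

def initial_theta(lower=-1.0, upper=1.0, rng=random):
    """Option 2: draw each examinee's starting theta uniformly at random
    within the given bounds, so the first item varies across examinees."""
    return rng.uniform(lower, upper)

draws = [initial_theta() for _ in range(5)]
print(all(-1.0 <= t <= 1.0 for t in draws))  # → True
```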

Item Selection Method
A typical CAT adapts itself to each examinee by selecting the item with the most information at the current θ estimate (Option 1).

  • This is the most efficient way to deliver items that are most appropriate for each examinee and obtain a precise final score with as few items as possible.
    • However, some testing programs wish to insert some amount of randomization (Option 2).
    • This randomization will, instead of picking the single item with the highest information, identify the x items with the highest information and randomly select among them.
    • This is extremely useful in two situations.
      • First, if the test is high-stakes and there is the possibility that the first few items will become well-known amongst examinees. Utilizing randomization will greatly increase the number of items being utilized across the population at the beginning of the test, aiding in security.
      • Second, if examinees will be taking the test more than once, the second test will likely lead down the same path into the bank; randomization will reduce the number of items that are seen again during the second test.
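
A minimal sketch of this randomesque rule follows; the data structure is assumed for illustration and is not FastTest's API:

```python
import random

def select_item(item_info, x=5, rng=random):
    """Pick the next item: identify the x items with the highest
    information at the current theta estimate, then choose one of
    them at random. With x=1 this reduces to plain
    maximum-information selection (Option 1)."""
    top_x = sorted(item_info, key=item_info.get, reverse=True)[:x]
    return rng.choice(top_x)

# Hypothetical information values at the current theta estimate.
info = {"item1": 0.8, "item2": 1.2, "item3": 0.5, "item4": 1.0}
print(select_item(info, x=1))  # → item2
```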

θ Estimation Method
FastTest is designed to use maximum likelihood estimation (MLE) of θ. Because MLE is undefined for nonmixed response vectors early in a test, where an examinee's responses are all incorrect or all correct (always the case after the first item), Bayesian maximum a posteriori (MAP) estimation is used as a temporary θ estimation method in those cases. The test reverts to the less biased MLE method once a mixed response vector is obtained (at least one correct and at least one incorrect response).
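
For the Rasch (1PL) model, MLE of θ reduces to a short Newton-Raphson iteration. This is textbook IRT shown only to illustrate the idea; FastTest's internal implementation is not exposed:

```python
import math

def mle_theta_rasch(b_params, responses, theta=0.0, n_iter=25):
    """Newton-Raphson MLE of theta under the Rasch model.

    Requires a mixed response vector (at least one 0 and one 1);
    otherwise the estimate diverges, which is exactly why MAP is
    used early in a CAT."""
    for _ in range(n_iter):
        # Probability of a correct response to each item at current theta.
        probs = [1.0 / (1.0 + math.exp(-(theta - b))) for b in b_params]
        gradient = sum(u - p for u, p in zip(responses, probs))
        information = sum(p * (1.0 - p) for p in probs)
        theta += gradient / information  # Newton step
    return theta

# Three items with difficulties -1, 0, 1; responses correct, correct, incorrect.
theta_hat = mle_theta_rasch([-1.0, 0.0, 1.0], [1, 1, 0])
```

At the MLE, the expected number correct equals the observed number correct; the estimate does not depend on the starting value for a mixed response vector.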

Termination Criterion
There are two approaches to terminating the CAT.

  1. Like conventional tests, a CAT can be ended after an arbitrary number of items that is the same for each examinee (e.g., 100 items for all examinees). This is Option 1.
    • However, the sophistication of CAT also allows for variable-length testing, where the test is concluded when a certain criterion is satisfied.
  2. Option 2 has two choices.
    • First, you can end the test when the standard error of measurement (SEM) falls below a certain point. This ensures that all examinees have scores of equal precision, something that is nearly impossible with a fixed-form conventional test.
    • The second choice is to end the test when no items remaining in the bank provide a certain level of information, which is designed to ensure that all items appropriate for a given examinee are used.
  • Additionally, you can set a minimum and maximum for the test length. A minimum is useful to ensure, for example, that all examinees receive at least 20 items. A maximum is intended to prevent the CAT from continuing until the entire item pool is used, which it will do if either choice for Option 2 is too strict.
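
The variable-length stopping logic above can be sketched as a single predicate; the names and structure here are assumptions, not FastTest's API:

```python
def should_terminate(n_administered, current_sem, min_items, max_items, sem_target):
    """Variable-length stopping rule: never stop before the minimum
    length, always stop at the maximum, and otherwise stop once the
    standard error of measurement (SEM) falls to or below the target."""
    if n_administered < min_items:
        return False
    if n_administered >= max_items:
        return True
    return current_sem <= sem_target

# Precise already, but below the 20-item minimum: keep testing.
print(should_terminate(10, 0.25, min_items=20, max_items=50, sem_target=0.30))  # → False
```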

When a test is delivered via CAT, certain options are locked elsewhere in the Edit Test window.

  • For example, the option of allowing examinees to mark items for review is disabled.
  • Scoring is also fixed to IRT scoring. Accordingly, all scored items in a CAT must have an IRT model and parameters before the test can be protected.
  • Unscored items (such as instructional and survey items) can also be placed in a CAT, but they will appear before and/or after the scored items.
    • To make unscored items appear before the scored items, place them together as the first items in the test item list.
    • Any unscored item that is not grouped at the beginning of the item list will appear at the end of the CAT.

Content Constraints
While computerized adaptive testing (CAT) is based on item response theory, which assumes a unidimensional trait, many testing applications have content areas or domains across which they desire to spread items.

  • For example, a mathematics test might have a blueprint which calls for 75% algebra items and 25% geometry items.
    • When building a traditional fixed-form test, this can be explicitly controlled for; a 20-item test would have 15 algebra items and 5 geometry items.
    • A CAT exam can be of variable length, so the algorithm needs to dynamically keep track of the item content distribution and select the next item appropriately.
      • If a CAT exam had delivered 19 items, 15 of which were algebra, then the CAT algorithm must know to select a geometry item next.

CATs implement this by constantly keeping track of the target proportions (0.75 and 0.25 in this example) and the actual proportions at any given time in the test.

  • The target proportions are specified by the test designer based on the blueprint of the exam, in the Content Constraints dialog window of the CAT tab.
  • Note that the exam pool must first be constructed according to the blueprints; that is, each item is specified as an algebra or geometry item.
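
This bookkeeping can be sketched as follows, using the 75/25 blueprint from the example. The sketch illustrates the idea of tracking target versus actual proportions; it is not FastTest's algorithm:

```python
def next_domain(target_props, delivered_counts):
    """Select the content domain whose actual proportion of delivered
    items falls furthest below its target proportion."""
    total = sum(delivered_counts.values())
    def deficit(domain):
        actual = delivered_counts[domain] / total if total else 0.0
        return target_props[domain] - actual
    return max(target_props, key=deficit)

# 19 items delivered, 15 of them algebra: geometry is furthest below
# its 25% target, so a geometry item should be selected next.
targets = {"algebra": 0.75, "geometry": 0.25}
delivered = {"algebra": 15, "geometry": 4}
print(next_domain(targets, delivered))  # → geometry
```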

Clicking the Edit Content Constraints button at the top of the CAT tab will bring up the following dialog. Note: this button only appears upon editing an existing test, not when creating a new test.

Figure 4.8: Content Constraints Dialog

cat content constraints

To enable Content Constraints, check the box at the top. The dialog lists all of the test sections defined for the test as well as the number of items in each test section. Each test section has an associated Target Percentage, which reflects the percentage of total items administered that should come from that section. The Target Percentages must add up to 100; you will be prevented from deleting a test section that has a positive Target Percentage assigned to it.