Alison Cernich, Ph.D.
Director, Neuropsychology, VA Maryland Health Care System
Assistant Professor, Neurology & Psychiatry, University of Maryland School of Medicine
The views and opinions expressed during this talk do not represent the opinion of the Department of Veterans Affairs or the Department of Defense.
Conflicts of interest: None
History of computerized cognitive testing
Implementation of computerized cognitive testing
Technical issues in computerized assessment
Human factors & usability
In the beginning was the word, and the light, and those could be displayed on the PC. And yea, though it was large and slow, it held enormous opportunity.
The WAIS was automated in 1969 by Elwood and Griffin.
Space introduced a psychological testing console in 1975.
Computerization of the Category Test from the Halstead-Reitan battery in 1975 included changing stimuli, response recording, and measurement of response latency.
The late 1970s saw the development of early versions of computerized adaptive testing (CAT) by the Navy's research and development division.
Expansion in the 1980s led to the development of the APA's Guidelines for Computer-Based Tests and Interpretations (1986).
Incorporation of physiologic, imaging, and sensory data
Goal of the measure
- Cognitive screen
- Comprehensive assessment
- Stand-alone assessment of a cognitive domain
- Diagnostic tool
Repeatability and stability of performance
Equivalence to the paper-and-pencil form
Construct validation if a paper version does not exist
Ecological validation
Potential for error introduced by the system
- Speed of the processor
- Minimum timing-accuracy thresholds of a standard OS
- Available memory needed to run the program
- Interference from the OS, anti-virus, or update software
Refresh cycle or refresh rate
- Accurate timing of the display
- Intended interstimulus interval
Resolution
- Monitor type
- Monitor size
Assurances need to be included with respect to timing of the display, through recording and default adjustment of screen-resolution settings.
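The refresh-rate constraint above can be made concrete with a short sketch (illustrative, not from the talk): on a raster display, a stimulus can only change at frame boundaries, so an intended interstimulus interval is quantized to whole multiples of the refresh cycle.

```python
# Sketch: stimulus intervals are quantized to whole refresh cycles,
# so the display rounds an intended interval to the nearest multiple
# of the frame duration.

def achievable_interval_ms(intended_ms: float, refresh_hz: float) -> float:
    """Return the closest interval the display can actually show."""
    frame_ms = 1000.0 / refresh_hz                   # one refresh cycle
    frames = max(1, round(intended_ms / frame_ms))   # at least one frame
    return frames * frame_ms

# A 60 Hz monitor (frame ~16.67 ms) can show 50 ms exactly (3 frames),
# but an intended 20 ms interval collapses to a single ~16.67 ms frame.
print(round(achievable_interval_ms(50, 60), 2))   # → 50.0
print(round(achievable_interval_ms(20, 60), 2))   # → 16.67
```

This is one reason vendors should record the refresh rate in use and report the interval actually presented, not just the one intended.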
Mouse
- Types (infrared/optical)
- Polling rate (Hz)
- Pixel accuracy (dpi)
Keyboard
- Sampling rate
- Greater effect of age or motor impairment
Touch screen
- Varying types
- Greater room for variation in stimulus position, presentation, and user adjustments in these parameters
Personal digital assistants/smartphones
- Differences in display, input capability, and potential additions of an accelerometer or GPS
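Polling rate matters because a response can only be registered at the next poll. A minimal sketch of the worst-case latency this adds (the figures are illustrative; 125 Hz is the common default USB HID polling rate):

```python
# Sketch: a device polled at P Hz can add up to 1/P seconds of latency
# to a recorded reaction time, when the press lands just after a poll.

def worst_case_input_error_ms(polling_hz: float) -> float:
    """Maximum latency (ms) added by input-device polling."""
    return 1000.0 / polling_hz

print(worst_case_input_error_ms(125))   # typical USB keyboard/mouse → 8.0
print(worst_case_input_error_ms(1000))  # high-rate gaming mouse → 1.0
```

An 8 ms ceiling is invisible for most clinical purposes but matters for any claim of millisecond-level reaction-time precision.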
Claims regarding accuracy of timing, especially at the millisecond level, require an additional level of assurance:
- Standard lab cards that are independent of the system clock
- Real-time operating systems
- Programming solutions using high-precision event timers (HPETs)
- External chronometry verification
If timing is important to the nature of the test, it should be verified and reported to the user, and the effects of varying from the methods used to verify the timing should be detailed.
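As a small illustration of the software side of such verification, the sketch below (mine, not from the talk) empirically estimates the step size of Python's high-resolution timer. A sub-millisecond step is necessary, but not sufficient, for millisecond claims; scheduling and display latency still require external chronometry, such as a photodiode on the screen.

```python
# Sketch: estimate the observed step of the high-resolution timer by
# taking the smallest nonzero difference between consecutive reads.
import time

def observed_timer_step_s(samples: int = 1000) -> float:
    """Smallest nonzero gap between consecutive perf_counter reads."""
    deltas = []
    prev = time.perf_counter()
    for _ in range(samples):
        now = time.perf_counter()
        if now != prev:
            deltas.append(now - prev)
        prev = now
    return min(deltas) if deltas else float("inf")

step = observed_timer_step_s()
print(f"observed timer step: {step * 1e6:.2f} microseconds")
```

A vendor making timing claims should report this kind of measurement for the platforms it supports, alongside the external verification described above.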
"The Internet is not something you just dump something on. It's not a truck. It's a series of tubes. And if you don't understand, those tubes can be filled. And if they are filled, when you put your message in, it gets in line and it's going to be delayed by anyone that puts into that tube enormous amounts of material, enormous amounts of material." -SEN Ted Stevens (R-AK) June 28, 2006
Bandwidth: the speed of flow of information on a transmission path
- Varies by type of connection
- All can be affected by service-provider fluctuations
- All have varying maximum data rates
Type of information exchange: real-time vs. store-and-forward
Transparency with respect to best methods for implementation, and enhancing user understanding of how the test operates, would be beneficial.
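To make the bandwidth point concrete, a back-of-the-envelope sketch (the payload size and connection speeds are illustrative assumptions, not figures from the talk):

```python
# Sketch: rough transfer time for a test's stimulus payload at different
# connection speeds, ignoring latency and protocol overhead.

def transfer_seconds(payload_megabytes: float, bandwidth_mbps: float) -> float:
    """Idealized time to move a payload over a link of the given bandwidth."""
    payload_megabits = payload_megabytes * 8   # bytes → bits
    return payload_megabits / bandwidth_mbps

# A hypothetical 50 MB battery of stimuli over three connection types:
for label, mbps in [("DSL", 1.5), ("cable", 25.0), ("fiber", 300.0)]:
    print(f"{label}: {transfer_seconds(50, mbps):.1f} s")
```

The gap between a few seconds and several minutes is why real-time delivery and store-and-forward delivery place very different demands on the examinee's connection.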
Consideration of the programming language used
Challenges of user verification: access may be available to a population for which the test was not intended
Problems with informed consent and release of test data
Difficulties with implementation in unmonitored or uncontrolled situations
Need to ensure that obsolete or outdated versions are no longer available
Need for a mechanism to communicate examination results
Reference: the Report of the APA Internet Task Force (2004)
Data storage
- Data security and back-up
- Firewall protection
- Local vs. remote storage
- The cloud
Issues with electronic transmission
- HIPAA requirements
- Potential regulations or rules depending on setting
- Security of data transmission
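One small building block of secure transmission can be sketched as follows. This is a generic integrity check, not any vendor's protocol: sending a cryptographic digest alongside a result record lets the receiver detect corruption or tampering in transit. It does not by itself satisfy HIPAA, which also requires encryption and access controls.

```python
# Sketch: detect alteration of a result record in transit by comparing
# SHA-256 digests computed at the sender and the receiver.
import hashlib
import json

def fingerprint(record: dict) -> str:
    """Stable SHA-256 digest of a result record (hex string)."""
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

record = {"subject": "anon-001", "score": 42}   # hypothetical record
sent = fingerprint(record)

# The receiver recomputes the digest; any change breaks the match.
assert fingerprint(record) == sent
record["score"] = 43
assert fingerprint(record) != sent
```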
We haven't added the people yet.
Implementation
- Cost
- System implementation requirements
- User qualifications
- Level of supervision required during testing
- Time required for the battery
- Use, modification, administration, and termination
- Troubleshooting instructions
Interpretation
- Information on the back end, including scoring algorithms and results generation
- How to use the clinical reports generated, and caveats to their use
- Considerations that could impact the validity of the measure
Computer familiarity
Accommodation for disability
- Motor impairments and accommodation
- Accommodations for low vision or low hearing
Clarity of instructions
Ease of use
Engagement with the test and level of effort
BENEFITS
- Standardization of presentation
- Unobtrusive and sensitive reaction timing
- Potential repeatability or availability of alternate forms
- Flexibility or responsiveness of the battery to performance level (CAT)
- Rapid and accurate data scoring, storage, retrieval, and comparison to previous assessments
- Potential for assessment of multiple dimensions of human performance
- Efficient presentation of complex stimuli
- Potential to deliver assessment remotely
- Potentially lower cost
LIMITATIONS
- Equal attention needed for technical and psychometric development
- Difficult to flexibly account for patient factors
- Inability to test the limits or adjust the testing strategy in the session
- Some restrictions on the cognitive domains one is able to test
- Effects of age, disability, and computer naiveté on performance
- Computer phobias or inexperience on the part of the provider
- Difficulties with verification of the user in remote settings
- Potentially higher initial investment, depending on system requirements
Assurance that the test is valid for the purpose for which it was developed, with supporting data
Technical issues with respect to implementation, external verification, and quality assurance are clarified
Instructions for the user are clear and provide information on tasks, administration, modification, and validation of alternate forms
If CAT or IRT protocols are used, documentation of how they were developed and implemented
Data gathered, scoring protocols, and output are easy to understand, and interpretation guides are included
Data storage, security, and retrieval are transparent
Test security is paramount, both for the client and the manufacturer
Human factors are included to ensure appropriate accommodations