46TH ANNUAL CONGRESS OF THE SAEVA, SKUKUZA, 16–20 FEBRUARY 2014
per 2–3 MP monitor so that the information displayed on the screen most accurately
depicts the information stored in the digital matrix. Stated another way,
"squeezing" a large image onto a small monitor results in a loss of diagnostic
information. The image review software functions of window, level, and zoom should
always be used because they increase diagnostic accuracy.4,9,25 Cropping tools or
automatic collimation detection should be used at the time of image production to
minimize the white areas surrounding the exposed portion of the image, because these
areas produce excessive backlighting.
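The window/level operation mentioned above maps stored pixel values to display intensities. A minimal sketch of the standard linear windowing function defined in the DICOM standard (PS3.3, VOI LUT module), assuming 12-bit pixel data, an 8-bit display, and illustrative center/width values:

```python
def window_level(pixel, center, width, out_min=0, out_max=255):
    """Map a stored pixel value to a display intensity using the
    linear window/level (VOI) function from the DICOM standard."""
    lower = center - 0.5 - (width - 1) / 2
    upper = center - 0.5 + (width - 1) / 2
    if pixel <= lower:
        return out_min          # below the window: fully dark
    if pixel > upper:
        return out_max          # above the window: fully bright
    # Linear ramp between the window bounds.
    return round(((pixel - (center - 0.5)) / (width - 1) + 0.5)
                 * (out_max - out_min) + out_min)

# Illustrative wide window over 12-bit data (center/width are examples):
print(window_level(0,    center=2048, width=4096))  # darkest
print(window_level(2048, center=2048, width=4096))  # mid-gray
print(window_level(4095, center=2048, width=4096))  # brightest
```

Narrowing the width steepens the ramp, which is why adjusting window and level can reveal contrast detail that a fixed mapping would compress away.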
Features that improve the clinical utility of software programs include the hanging
protocol, the default image resolution, and the user interface. Hanging protocols
require that specific keywords be incorporated into the DICOM header information
so the software can recognize them and hang or display the radiographs or other
images in a specified order and orientation. Having an appropriate hanging protocol
can greatly increase radiologists’ productivity.
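The ordering logic a hanging protocol performs can be sketched in plain Python. The keyword-to-slot mapping below (lateral view hung first, then ventrodorsal) is purely illustrative, and the dicts stand in for DICOM header fields such as Series Description; a real viewer would read these tags from the DICOM file itself:

```python
# Illustrative hanging protocol: keyword -> display slot.
# Real protocols match keywords in DICOM header tags such as
# Series Description or View Position.
HANGING_PROTOCOL = {"LATERAL": 0, "VD": 1}

def hang(images):
    """Sort image records into hanging-protocol order.
    Views with no matching keyword are hung last."""
    def slot(img):
        desc = img["SeriesDescription"].upper()
        for keyword, position in HANGING_PROTOCOL.items():
            if keyword in desc:
                return position
        return len(HANGING_PROTOCOL)  # unrecognized views go last
    return sorted(images, key=slot)

images = [
    {"SeriesDescription": "THORAX VD"},
    {"SeriesDescription": "ABDOMEN"},
    {"SeriesDescription": "THORAX LATERAL"},
]
for img in hang(images):
    print(img["SeriesDescription"])
```

Because `sorted` is stable, images that share a slot keep their acquisition order, which matches the expectation that a protocol fixes view order without otherwise reshuffling a study.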
The default resolution should be as high as possible because important diagnostic
decisions are made very rapidly upon first viewing a radiograph. The user interface is a
key determinant of radiologist productivity: a radiologist should be able to use the
image viewing system with little or no training, and the system should be user
friendly.29 In one study, reviewers' eyes were fixed on the menu options of a software
program for 20% of the total time spent reviewing bone radiographs.32
Software options that either stack the remaining images or display the remaining
images in a thumbnail format can greatly aid in study evaluation.27 The user interface
differs between software programs and this is an important point of comparison in
purchasing decisions.
References
1. Ballance D. DICOM and the network. Vet Rad Ultrasound 2008;49:S29–S32.
2. Wright M, Ballance D, Robertson ID, et al. Introduction to DICOM for the practicing veterinarian. Vet Rad Ultrasound 2008;49:S14–S18.
3. American College of Radiology. ACR technical standards for teleradiology. Reston, VA: American College of Radiology; 2002.
4. Bacher K, Smeets P, De Hauwere A, et al. Image quality performance of liquid crystal display systems: influence of display resolution, magnification and window settings on contrast-detail detection. Eur J Radiol 2006;58:471–479.
6. Badano A. PACS equipment overview: display systems. RadioGraphics 2004;24:879–889.
8. Balassy C, Prokop M, Weber M, et al. Flat-panel display (LCD) versus high-resolution grayscale display (CRT) for chest radiography: an observer preference study. Am J Roentgenol 2005;184:752–756.
9. Batchelor J. Monitor choice impacts diagnostic accuracy and throughput. AuntMinnie.com, May 4, 2002. http://www.auntminnie.com/index.asp?Sec=rca&Sub=scar_2002&pag=dis&ItemId=53236 (accessed September 11, 2007).
10. Doyle AJ, Le Fevre J, Anderson GD. Personal computer versus workstation display: observer performance in detection of wrist fractures on digital radiographs. Radiology 2005;237:872–877.
11. Graf B, Simon U, Eickmeyer F, et al. 1K versus 2K monitor: a clinical alternative free-response receiver operating characteristic study of observer performance using pulmonary nodules. Am J Roentgenol 2000;174:1067–1074.