Optimizing CT Imaging Protocols: Factors That Affect the Radiation Dose
February 1, 2007 | Evaluations & Guidance
Imaging protocols play a key role in controlling dose. Good quality assurance and practices such as the use of reference values have made protocols more consistent and, consequently, have helped keep doses within accepted limits (as we discuss in Radiation Dose in Computed Tomography: What You Can Do to Minimize It). Nevertheless, there is still room for improvement: A survey in the United Kingdom found that the CT dose delivered by routine protocols can vary by up to a factor of five for the same exam (Shrimpton et al. 2005).
Typically, a radiology department will set standard protocols for specific exams. The choice of parameters is usually made by radiologists with the guidance of the CT manufacturer and medical physicists. The key question in any protocol development is how the required clinical information can be reliably obtained with minimal x-ray dose. The information content within any image depends on three quantities: signal, noise, and resolution. The main parameters affecting these quantities are discussed below. Readers requiring a detailed introduction to CT concepts and terminology can refer to our Explanation of Key CT Terms.
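To make the dose-versus-noise trade-off concrete, the sketch below uses a simplified first-order model that is standard in CT physics but is our illustration, not part of the survey data cited above: dose scales roughly linearly with the tube current–time product (mAs), while quantum noise scales roughly as 1/√mAs. The function names and example values are hypothetical.

```python
import math

def relative_dose(mas_new, mas_ref):
    """Dose scales approximately linearly with mAs (first-order model)."""
    return mas_new / mas_ref

def relative_noise(mas_new, mas_ref):
    """Quantum noise scales approximately as 1/sqrt(mAs) (first-order model)."""
    return math.sqrt(mas_ref / mas_new)

# Illustrative example: halving the mAs from 200 to 100
print(relative_dose(100, 200))   # dose falls to 0.5 of the reference
print(relative_noise(100, 200))  # noise rises by a factor of ~1.41
```

This is why a protocol cannot simply minimize mAs: the noise penalty grows as the dose falls, and the protocol designer must judge how much noise the clinical task can tolerate.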