Computed tomography


Computed tomography (CT) was first introduced into medical imaging in the mid-1970s. Today CT accounts for 24% of the total radiation dose to the population of the United States from all sources, and 49% of the medical radiation dose to that population [UNSCEAR, 2008]. CT provides significant diagnostic information and has largely eliminated the exploratory surgery that was common before its advent. Consequently, it is essential to ensure that CT provides the necessary diagnostic information, i.e., that image quality is acceptable for the task, while optimizing the radiation dose to the patient.

Important Principles

CT scans can deliver high radiation doses to the patient. Cases of skin erythema and epilation have been reported following CT perfusion studies.

It is essential to understand the factors affecting patient dose in CT. These include the selection of tube voltage (kV) and tube current, as well as the spacing of the acquired images. Moreover, it is not only the CT equipment and technical factors that matter. In perfusion studies, for example, the number of images acquired in a selected region of the body has a significant impact on patient dose. Consequently, the techniques and protocols selected by the imaging physician or radiographer can change patient doses dramatically.
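The influence of tube current and voltage on dose can be illustrated with a common rule of thumb: dose scales linearly with mAs and roughly with the square of the tube voltage. The sketch below is a simplified illustration only, not a dosimetry tool; the reference technique values (200 mAs, 120 kV) and the square-law exponent are assumptions for the example, and real scanner behavior varies.

```python
def relative_dose(mas, kv, ref_mas=200.0, ref_kv=120.0):
    """Estimate dose relative to a reference technique.

    Rule-of-thumb model (illustrative only): dose is proportional
    to the tube current-time product (mAs) and approximately to the
    square of the tube voltage (kV). Reference values are assumed.
    """
    return (mas / ref_mas) * (kv / ref_kv) ** 2

# Halving the mAs at the same kV halves the relative dose:
print(relative_dose(100, 120))             # 0.5
# Lowering 120 kV to 100 kV at the same mAs cuts dose by ~30%:
print(round(relative_dose(200, 100), 2))   # 0.69
```

Such back-of-the-envelope scaling shows why protocol choices (kV, mAs, and the number of acquisitions) dominate patient dose in practice.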

Introduction to References

The basics of CT are presented in the IAEA Handbook on the Physics of Diagnostic Radiology, in Cody and Mahesh, and in Sprawls. The intricacies of CT dose measurement are described in Nagel, and dose management is discussed in the ICRP publication.