Why use a medical grade monitor to view X-rays?
I get this question all the time: why should I pay thousands of dollars for a medical grade monitor to diagnose digital X-rays (CR/DR) when I can buy a very nice looking commercial off-the-shelf (COTS) monitor at the local computer store? I have boiled the answer down to six important reasons, which are (hopefully) simple to understand and will allow you to convey the argument to radiologists or administrators who have little technical or physics background.
1. A commercial grade monitor does not show all of the critical anatomical information. As the name implies, COTS monitors are intended for office automation, to display documents so they appear like a printed page. Their performance attributes are therefore weighted heavily toward being as bright as possible, so that text is easily resolved with minimal eyestrain. As a result, commercial displays attain maximum luminance well before the graphics card input reaches its maximum value. Remember that a typical graphics card can output 256 different values, each representing a distinct piece of valuable diagnostic information. Commercial monitors have been observed to max out at an input value as low as 200, which means values 201 through 255 are all mapped to the same luminance value: maximum. In other words, roughly 20 percent of the data is clipped, or simply eliminated.
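To put a number on that clipping, here is a minimal Python sketch; the saturation point of 200 is the observed example from above, not a universal figure.

```python
# Minimal sketch: how many of the 256 driving levels are lost when a COTS
# panel saturates at input value 200 (the observed example above)?
SATURATION_DDL = 200     # input value at which luminance maxes out (assumed)
TOTAL_DDLS = 256         # 8-bit graphics card output

clipped = (TOTAL_DDLS - 1) - SATURATION_DDL   # values 201..255 map to maximum
print(f"{clipped} of {TOTAL_DDLS} values clipped "
      f"({100 * clipped / TOTAL_DDLS:.0f}% of the grayscale lost)")
```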
By contrast, medical grade monitors are calibrated to map each distinct pixel value to a luminance you can actually detect, rather than following the natural response of the panel driven by the graphics card output. Unfortunately, it is normal for the natural COTS monitor response (uncorrected to DICOM) to yield the same measured luminance for multiple sequential input values, i.e., a flat spot in the response curve. These flat spots are especially obvious in the low range, i.e., the first 160 or so of the 256 values.
What is the impact of a flat response? As an example, on a commercial grade monitor the pixel values 101, 102, 103, 104, and 105 could all be mapped to a single luminance value on the screen. That means that a subtle nodule, identified by a difference between values 102 and 105, will simply disappear, as there is no distinction between these values on the monitor. Note that since the better part of the clinical information from the imaging modalities lies in the lower 50 percent of the luminance range, the flat spots fall exactly in the most critical region, where the ability to resolve pixels at different luminance values is compromised.
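The sketch below illustrates how flat spots arise. It assumes a native gamma-2.2 response quantized through an 8-bit internal panel LUT; that is a simplifying assumption (the exact mechanism varies by panel), but the collapsing of neighboring gray levels is the same effect.

```python
# Illustrative flat-spot demo: a native gamma-2.2 response quantized by an
# 8-bit internal panel LUT (a simplifying assumption; real panels differ).
GAMMA, LEVELS = 2.2, 256

def panel_step(ddl):
    """Quantized output step the panel produces for a given input value."""
    return round((LEVELS - 1) * (ddl / (LEVELS - 1)) ** GAMMA)

# Input values that land on the same output step as their neighbor are
# indistinguishable on screen: a flat spot in the response curve.
flat = [d for d in range(1, LEVELS) if panel_step(d) == panel_step(d - 1)]
print(f"{len(flat)} of {LEVELS} input values sit on a flat spot; "
      f"highest affected value: {max(flat)}")
```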
In conclusion, the potential to miss critical diagnostic information, both at the high end of the luminance range and in the flat spots of the response, should be the number one reason not to even consider a commercial grade monitor. Therefore, the first requirement is a monitor that is calibrated according to the DICOM standard, which truly maps each distinct pixel value to a luminance on the screen that the human visual system can detect as noticeably different. It is best to have this calibration done at manufacturing, so that the three RGB channels are optimally mapped onto the DICOM-compliant curve.
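For the technically inclined, the sketch below shows what "calibrated according to DICOM" means in practice: the Grayscale Standard Display Function (GSDF) from DICOM PS3.14 relates luminance to a just-noticeable-difference (JND) index, and calibration spaces the 256 driving levels evenly in JND space. The black and white levels used here are hypothetical.

```python
import math

# DICOM PS3.14 GSDF: log10(luminance) as a rational function of ln(j),
# where j is the just-noticeable-difference (JND) index, 1 <= j <= 1023.
A, B, C, D = -1.3011877, -2.5840191e-2, 8.0242636e-2, -1.0320229e-1
E, F, G, H = 1.3646699e-1, 2.8745620e-2, -2.5468404e-2, -3.1978977e-3
K, M = 1.2992634e-4, 1.3635334e-3

def gsdf_luminance(j):
    """Luminance in cd/m2 at JND index j."""
    x = math.log(j)
    num = A + C * x + E * x**2 + G * x**3 + M * x**4
    den = 1 + B * x + D * x**2 + F * x**3 + H * x**4 + K * x**5
    return 10 ** (num / den)

def jnd_index(lum):
    """Invert the GSDF by bisection (it is monotonic on [1, 1023])."""
    lo, hi = 1.0, 1023.0
    while hi - lo > 1e-6:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if gsdf_luminance(mid) < lum else (lo, mid)
    return lo

# Hypothetical display: black level 1 cd/m2, white level 400 cd/m2.
j_lo, j_hi = jnd_index(1.0), jnd_index(400.0)
lut = [gsdf_luminance(j_lo + i * (j_hi - j_lo) / 255) for i in range(256)]
print(f"{j_hi - j_lo:.0f} JNDs spread over 255 steps "
      f"({(j_hi - j_lo) / 255:.1f} JNDs per gray level)")
```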
2. Many commercial grade monitors don't have the required dynamic range. The maximum light output of a monitor is specified in cd/m2 (candela per square meter). A good quality commercial display can achieve 300 cd/m2, sometimes more if you are lucky. That maximum of 300 cd/m2 sits at the low end of any medical grade monitor, which might go up to 500 cd/m2 or more. Why do we need this much? The reason is that when a display is calibrated to DICOM, a percentage of the response is lost in the mapping process: at 300 cd/m2, applying the DICOM correction can be expected to decrease the maximum value by about 10 percent.
The human eye can handle a contrast ratio of about 250:1 under the ambient conditions of a typical viewing environment. Assuming the commercial display was made DICOM compliant with aftermarket software, the luminance ratio of the display and that of the eye would be very close. However, ambient light detracts from the ability to see low contrast information: this particular display would need to be in a low-light room to achieve a 250:1 luminance ratio inclusive of ambient light.
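A quick back-of-the-envelope sketch shows why: reflected ambient light adds to both the black and the white level, compressing the effective ratio. The luminance figures below are illustrative assumptions, not measurements.

```python
# Sketch: ambient light compresses the effective luminance ratio because it
# adds to both ends of the scale. All numbers are illustrative assumptions.
def effective_ratio(l_max, l_min, l_ambient):
    """Luminance ratio once reflected ambient light is included."""
    return (l_max + l_ambient) / (l_min + l_ambient)

L_MAX, L_MIN = 300.0, 1.0            # hypothetical 300 cd/m2 COTS display
for l_amb in (0.1, 0.5, 2.0):        # dark reading room, office, ER (assumed)
    print(f"ambient {l_amb:3.1f} cd/m2 -> "
          f"ratio {effective_ratio(L_MAX, L_MIN, l_amb):3.0f}:1")
```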
Medical displays are designed to operate between 400 and 600 cd/m2 as corrected to DICOM, with reserve luminance to sustain those levels over an extended life. Even if a monitor is calibrated, if there are not enough distinguishable luminance steps to map the pixel data into, you clip off part of the information: if you want to map 256 grayscale pixel values but have only 200 steps available, some of that information is lost. The required dynamic range also depends on where you are going to use the monitor. As you are probably aware, the brighter the room light, the more information you lose at the dark end of the scale, as you simply won't be able to distinguish details in the shadows.
There is a simple fix for that: the calibration takes the room light into account and makes sure the lowest pixel value is mapped to a luminance you can still detect. The whole range is shifted upward, which is important when the monitor is used in a bright area such as an ER or ICU. It is also good to have some "slack" in the dynamic range, because the light source of the monitor degrades over time, getting lower and lower (compare the output of an old light bulb). Therefore, the maximum brightness needed to map the whole data range should be about 350 cd/m2[1], assuming you use the monitor in a dark environment. If you are using it in a bright area, or if you want slack to accommodate the decrease in output over a period of, say, five years, you might want to go with 450-500 cd/m2, as the sketch below illustrates.
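Here is that headroom calculation, assuming a hypothetical backlight decay of 5 percent per year; actual decay rates vary by monitor and duty cycle.

```python
# Sketch: initial brightness needed so the display still reaches 350 cd/m2
# after five years, assuming a hypothetical 5%-per-year backlight decay.
TARGET = 350.0            # cd/m2 needed to map the full range in a dark room
DECAY_PER_YEAR = 0.05     # assumed; real rates depend on monitor and usage
YEARS = 5

required = TARGET / (1 - DECAY_PER_YEAR) ** YEARS
print(f"initial luminance required: {required:.0f} cd/m2")
# -> about 452 cd/m2, consistent with the 450-500 cd/m2 recommendation
```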
3. A medical grade monitor typically adjusts its output to compensate for start-up variations. The light output of a monitor varies while its temperature stabilizes, which takes about 30 to 60 minutes. You can leave the monitors on day and night, or switch them on automatically one hour before they are going to be used; however, either method will drastically reduce their lifetime. Better medical grade monitors have a built-in feedback mechanism that measures the light output and adjusts the current to the light source to maintain a constant output. The third requirement, therefore, is a medical grade monitor with a light output stabilizer.
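The sketch below simulates such a feedback loop in a deliberately simplified form: the backlight starts dim and recovers as it warms up, while a proportional controller trims the drive current to hold the target. It is a toy model under assumed parameters, not any vendor's actual control algorithm.

```python
import math

# Toy simulation of a light output stabilizer. The warm-up curve and the
# controller gain are assumptions for illustration, not vendor behavior.
TARGET = 400.0            # desired output in cd/m2
GAIN = 0.2                # proportional gain of the control loop (assumed)

current = 1.0             # normalized backlight drive current
for minute in range(61):
    # cold backlight is ~15% dim, recovering with a ~15-minute time constant
    efficiency = 400.0 * (1 - 0.15 * math.exp(-minute / 15))
    measured = current * efficiency
    current += GAIN * (TARGET - measured) / efficiency  # nudge toward target
    if minute % 10 == 0:
        print(f"t={minute:2d} min: {measured:5.1f} cd/m2")
```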
4. A medical grade monitor can usually keep a record of its calibration over time. One of the students in our PACS training told me that, for legal reasons, he had to produce the calibration record of a specific monitor dated two years back, to prove that when an interpretation was made on that workstation, there was no technical reason a specific finding could have been missed. In addition, you need access to these records on a regular basis anyway, to make sure the monitor is still operating within the acceptable range. This brings me to another point: many users seem to replace their monitors after a period of five years. If the monitors are still within calibration, there is no reason to do so. Therefore, the fourth requirement for a medical grade monitor is that you can store and retrieve its calibration records.
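What such record keeping might look like, reduced to its essence, is sketched below. The file format and field names are hypothetical, not the schema of any actual QA tool.

```python
import json, datetime, pathlib

# Minimal sketch of storing and retrieving calibration records, e.g. to
# produce a record from two years back. Format and fields are hypothetical.
LOG = pathlib.Path("calibration_records.jsonl")

def record_calibration(monitor_id, l_min, l_max, max_error_pct):
    entry = {
        "monitor": monitor_id,
        "date": datetime.date.today().isoformat(),
        "l_min_cd_m2": l_min,
        "l_max_cd_m2": l_max,
        "max_gsdf_error_pct": max_error_pct,  # worst deviation from DICOM
    }
    with LOG.open("a") as f:
        f.write(json.dumps(entry) + "\n")

def records_for(monitor_id):
    """All stored records for one monitor, in the order they were written."""
    with LOG.open() as f:
        return [r for r in map(json.loads, f) if r["monitor"] == monitor_id]

record_calibration("reading-room-1-left", 0.8, 420.0, 7.2)
print(records_for("reading-room-1-left"))
```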
5. A medical grade monitor is typically certified. There are recommendations for monitors defined by the ACR; they are somewhat technical and, in my opinion, not worded strongly enough. Also, most medical grade monitors are FDA approved, which is actually only a requirement if you are reading digital mammography. If you meet the requirements stated above you should be OK, but FDA approval does not hurt. You can check the FDA website and look up the manufacturer to see whether they have been approved. The fifth (optional) requirement, therefore, is FDA approval.
6. In addition to being able to see all of the grayscale, which is characterized by the contrast resolution, you also need to be able to distinguish the individual pixels, i.e., your monitor needs the right spatial resolution to show the individual details. Take a typical CR chest image, which might have a matrix size of 2000 by 2500 pixels; that amounts to 5 million pixels, or 5 MP. The standard configuration of a diagnostic monitor for X-rays is 3 MP, because a physician can zoom or use an electronic loupe to get a one-to-one mapping of each image pixel onto the screen. One could argue that a 2 MP monitor can be used as well, and that is correct, as long as you realize that it will take more time to make a diagnosis, as you will need to zoom more frequently. But if you are very cost sensitive, for example for a system placed in a developing country where money is a major issue, a 2 MP configuration would do. So, the sixth and final requirement is a 3 MP monitor configuration (assuming time is more important than cost); the sketch below puts some numbers on the trade-off.
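The matrix sizes used here are typical values, assumed for illustration.

```python
# Sketch: zooming needed to view a ~5 MP CR chest 1:1 on a 3 MP versus a
# 2 MP display. Matrix sizes are typical values, assumed for illustration.
def zoom_for_1to1(image_wh, display_wh):
    """Zoom factor and screenfuls of panning for a one-to-one pixel view."""
    iw, ih = image_wh
    dw, dh = display_wh
    factor = max(iw / dw, ih / dh)             # magnification beyond "fit"
    screenfuls = -(-iw // dw) * -(-ih // dh)   # ceiling division
    return factor, screenfuls

CR_CHEST = (2000, 2500)                        # ~5 MP matrix
for name, display in [("3 MP", (1536, 2048)), ("2 MP", (1200, 1600))]:
    factor, screens = zoom_for_1to1(CR_CHEST, display)
    print(f"{name}: zoom {factor:.2f}x, {screens} screenfuls to cover 1:1")
```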
Does this mean that a commercial grade monitor cannot be used? It depends. If you are willing to calibrate the monitor manually and on a regular basis, running a calibration check and making sure the correction can actually be applied by the monitor; if you take care of the warm-up time; if you have a monitor that meets the maximum brightness requirement; if you keep your calibration records; and if you are not worried that, in case of a legal dispute, the plaintiff will have enough expertise to challenge you with the fact that you used sub-standard components that could impact patient care, well... it is up to you. But I would think twice about it, especially as the price difference between a good quality medical grade monitor and a commercial grade monitor is not that great compared with the overall cost of a PACS system.
[1] The unit of measurement for luminance, commonly specified as brightness, is the cd/m2, which stands for candela per square meter.
Article Credits: Herman Oosterwijk
Expert trainer/consultant on DICOM, HL7, PACS, EHR