First, let’s look back at how the mammogram was developed. In 1917, a German surgeon named Albert Salomon performed the first mammograms on mastectomy specimens in an attempt to detect cancer cells on X-ray film. He described breast cancer as appearing like grains of sand, or calcifications. Unfortunately, the Nazis interrupted Salomon’s studies in the mid-1930s. By the mid-1960s, mammography had returned to X-ray clinics in the United States through the work of Dr. Robert Egan in Houston, Texas, whose mammograms could detect tumors as small as a grain of barley. During this period, a large prospective randomized trial enrolled women between 40 and 64 years of age, and by 1971 its results showed that mammography screening produced a survival benefit. Consequently, the American Cancer Society launched a massive campaign to promote screening mammograms.
Nearly 50 years later, that trial now shows a survival benefit for women over the age of 50 but minimal benefit for younger women. Thus, the question remains: when should a woman have her first mammogram? Recently, a Harvard University study looked at the same question from a different perspective. The study identified women who had been diagnosed with breast cancer between 1990 and 1999 and tracked them through 2007. Of 609 breast cancer deaths, 395 (65%) occurred in women who had never had a mammogram before their diagnosis. The most important finding, however, was that more than half of these breast cancer deaths were in women younger than 50, while only 13% occurred in women over 70. Also worth noting is that this study used “modern” mammography, which has continued to evolve over the last 50 years.
Today, the American Cancer Society recommends yearly mammograms starting at age 40 and continuing for as long as a woman is in good health. October is Breast Cancer Awareness Month—so be “aware” and get your mammogram.