
Reflections: Reflecting on Science 75 Years After Hiroshima and Nagasaki

August 14, 2020

Dylan Mori

The Manhattan Project was the largest collective scientific endeavor in United States history. It was a massive collaboration between scientists and engineers at national laboratories and universities. Over the five years it took to develop the atomic bomb, our understanding of the atom — the building block of all matter — grew enormously.

When all was said and done, it resulted in the deaths of an estimated 150,000 Japanese citizens. The cities of Hiroshima and Nagasaki were leveled. Because of the difficulty in retrieving samples and the nature of long-term epidemiological studies, it’s unclear how many cases of cancer in Japan were linked to the 1945 bombings. As a result, the total number of deaths resulting from the atomic bomb will likely never be known for sure.

As a scientist, it’s hard for me to romanticize this time in history. And history casts a long shadow.

A common rallying cry is that science is, by its very nature, “apolitical.” This slogan has gained traction recently in response to the Trump administration’s various anti-science measures. Its decision to curtail and suppress research on climate change and cut funding to the World Health Organization has drawn widespread criticism and backlash from the scientific community and from people who consider themselves pro-science. In a time when wearing a face mask is considered a political statement, it’s tempting to buy into the apolitical label.

But the claim of neutrality overlooks the greater systems in which research operates. Science is a framework of thinking and methodologies rather than a concrete system of morals or beliefs, and in the wrong hands, this can lead to harm.

The human experiments of Nazi Germany and Imperial Japan are an extreme example of this. The legacy of the Tuskegee syphilis experiment here in America can still be felt in racial inequities in health care to this day. (It should also be noted that the atrocities of Nazi research could not have happened without the support of the American company IBM.)

Beyond the actions of individual actors, however, lies an environment where an ethical or human-centric application of science is nearly impossible. At a time when competition for academic funding is at an all-time high, scientists are financially dependent on the whims of their government funders, which more often than not bend toward the political status quo.

On the industrial side, tech companies offer miracle technofixes to “improve” people’s lives, often at the expense of our individual privacy and necessary public goods and services. It’s difficult to look back at the major technological advances of the last 75 years and find one that has not been exploited for profit, war or surveillance (or some combination of the three).

In this framework, the development of the atomic bomb casts a much darker shadow. Its development could not have happened as rapidly as it did without the full power of the military-industrial complex behind it.

Some scientists, like Leo Szilard and Lilli Hornig, tried to mitigate the damage by petitioning against detonating the bomb without first giving Japan a chance to surrender. Some, like Oppenheimer, opposed the further development of nuclear weapons, foreseeing the cold arms race between the U.S. and the Soviet Union.

But in either case, the damage had already been done. Looking back at 1945, it seems clear that the destruction of Hiroshima and Nagasaki was the only possible outcome of the Manhattan Project.

It’s sometimes difficult to predict the long-term outcomes of the discoveries that scientists are making today. Most tech workers and scientists are not building bombs, but their breakthroughs might still have the potential to cause harm. However, greater accountability isn’t impossible if science becomes more democratized and is pursued with clearer intentions.

The construction of the Thirty Meter Telescope (TMT) on Mauna Kea in Hawaii is a modern example of this type of accountability. To the physicists advocating for the construction of the TMT, it was an unprecedented new opportunity to make discoveries in the field of astronomy. To Native Hawaiians, it was another colonial project on their land, on a holy site — one that ignored their sovereignty and right to self-determination.

Local resistance to the project resulted in the blockading of Mauna Kea by land protectors and activists, leading to years of conflict and legal battles. As of today, the land protectors and the project’s builders are at a standoff, and the future of the TMT remains unclear.

Land protectors do not have to bear this responsibility on their own. Greater accountability is possible within the scientific and tech establishments as well. Part of the solution lies in tech workers’ greater control over what they develop and how it is used.

Facial recognition has been one major target on this front because of its racially biased and discriminatory applications. Tech workers and activists in the #NoTechForICE movement have called on major companies such as Amazon, Microsoft and Salesforce to end their contracts with ICE.

They have gone on strike and held public demonstrations to stop the sale of facial recognition technology to ICE and to push universities to cut ties with these companies.

Like all things, tech doesn’t exist in a vacuum. The atomic bombing set the course of the second half of the 20th century, and it’s an important part of our own community’s history. Although researchers can’t predict the future, we can control the long-term vision for our technology. It’s a question of our political will, but it’s the clearest way forward to remake science in service to the people.

Dr. Dylan Mori is a postdoctoral fellow at the University of Colorado Anschutz Medical Campus and the president of the JACL Mile High chapter.