How can we reflect on ethical responsibility amid technological advancement?

This blog post examines the ethical responsibilities that rapidly advancing technology demands of humanity and explores how we might reflect on them.


Introduction

The exploration of human nature has likely persisted ever since we first found leisure beyond merely eating and surviving. Humans classified the living beings of the Earth into animals, plants, insects, microorganisms, and so forth, and animals were further divided into groups such as mammals, primates, reptiles, and amphibians. Humans are animals belonging to the class of mammals. Yet humanity continues to seek to distinguish itself from other animals. Six characteristics are often cited to mark that distinction: humans as rational beings, tool-using beings, playful beings, social beings, cultural beings, and ethical beings. Among these, being rational and ethical creates the most significant distinction. Humans can act independently, make judgments, and choose what they deem most valuable, and they can reflect on their choices to make better decisions next time. This rational thinking is the driving force that enables humans to form and sustain complex societies. Humans also create and use tools in diverse ways according to purpose, which defines them as instrumental beings. Tools have accompanied humans since their emergence, making them indispensable to any discussion of what it means to be human. Through tools, humans pursued goals beyond mere survival and achieved original accomplishments in fields as diverse as art, philosophy, and science.
Humans use technology to invent and improve tools, so technology, too, is an indispensable element in describing humanity. Starting with the creation of chipped stone tools, technology developed at an unstoppable pace through the medieval era and World War II. Although it took more than 100 years from the invention of the first automobile in 1769 to the first front-engined car, built by Panhard et Levassor in 1891, the pace of technological advancement accelerated dramatically thereafter. After the world's first mobile phone was developed by Dr. Martin Cooper and his team at Motorola in 1973, it took only about three and a half decades for the smartphone to reach everyday consumers with the release of the iPhone in 2007. Technologies derived from these rapid advancements themselves developed swiftly. Innovations like the internet, touchscreens, and fingerprint recognition, once groundbreaking and astonishing, are now commonplace around us and have become essential building blocks of new products. Technologies to watch in the 21st century, such as nanotechnology, biotechnology, and ubiquitous computing, may still be unfamiliar names to us. Yet within a generation, or even just a few years from now, they will undoubtedly permeate every aspect of our society.
Technological advancement and societal development progress at nearly the same pace. As technology advances, society undergoes a revolution, transforming into something entirely new. After the agricultural revolution and then the industrial revolution, we entered the information age, and now we stand on the threshold of the ubiquitous era. How much society will change is unknown. Accordingly, humanity itself must also change, in the spiritual and ethical realms. A changing society creates events we have never encountered before: new forms of life, new ways of thinking, and even new kinds of crime. Dilemmas will emerge that are difficult to resolve with past ethical standards, and these changes will demand ethical reflection not only from individuals but from society as a whole. That is why we must consider technological advancement and human ethics together.


What is the meaning of technology?

First, let us examine what technology means. The term 'technology' originates from the Greek word 'techne'. Originally, 'techne' referred to the practice of giving external form to what the human mind conceives. Today, 'technology' is most often used to mean the production of material goods. Three aspects constitute technology. First, technology is an artifact. A stone, for example, is simply a natural object found abundantly in nature; but when humans shaped that stone into a weapon or a tool, it became technology as an artifact. Second, technology is knowledge, in the sense that specific logic and know-how are required to create and use artifacts. Third, technology is an activity, encompassing the activities of both its creators and its users. Even if an artifact is produced through engineering knowledge, it loses its meaning as technology if it is never used. Technology, therefore, is not limited to material creations; it is deeply connected to the intellectual and social activities of the humans who create and use it.


The History of Technology

Technology emerged alongside humanity. It is a bit of a stretch to say it appeared with humans, since humans were not born holding torches in one hand and knives in the other, but it is certain that humans were the cause of its emergence. During the Paleolithic era, humans lived as hunter-gatherers. Finding it difficult to gather fruit and hunt animals with bare hands, they shaped stones into easy-to-hold forms, producing chipped stone tools, and went on to create spears, arrows, traps, and knives to make hunting easier. Through this process technology became essential for human survival and grew more sophisticated as human society developed. As humans discovered fire and began gathering around it, communities formed. They settled down, entered the Neolithic Age, and underwent the agricultural revolution. Their technology evolved accordingly: they crafted farming tools such as ground and polished stone axes, hoes, and sickles, and produced pottery for storing food. Bronze was not a material everyone could work, so the Bronze Age brought a specialized artisan class responsible for metal production and related activities. In antiquity, iron tools became possible, and devices such as the catapult and the crane attributed to Archimedes were used as weapons of war. In fact, antiquity saw relatively little technological progress because of the widespread belief that one should not interfere with nature.

However, technology advanced significantly during the Middle Ages. Although the Middle Ages are often called the Dark Ages of science, technology could develop because the dominant Christian worldview of the time recognized the value of putting nature to use, even while science stagnated. Agriculture, military technology, and power systems saw major advances in this period. Gunpowder, invented in China, reached Europe by the 13th century, enabling the use of cannons by the 15th century. Interestingly, gunpowder was not originally developed for military purposes: although the theory that it was first invented in China is widely accepted, it began as one of the medicinal ingredients used to treat boils and plague. Neither the crane mentioned earlier nor Chinese gunpowder was developed specifically for warfare. Yet the crane became a weapon in ancient sieges, and gunpowder and its successors still propel artillery shells today. We must recognize that human intervention transformed the purpose of these technologies. While technology is usually first put to use for survival and for improving daily life, its purpose can easily be bent to human intent.

Many technologies we enjoy today were developed during World War II. Technologies originally intended for non-military purposes were turned to war, and a remarkable variety of technologies were tested and deployed for war aims. Aircraft, for instance, once a means of carrying long-distance mail, were used as weapons of war during World War II. Physics was applied to determine the most effective shell designs, and chemistry was used to develop new explosives. Nuclear weapons, the example that comes to mind most readily when we discuss the dark side of technology, were also first built during this period. Even the internet, a technology we can no longer live without, was initially developed for military purposes.
Internet and computing technology has advanced at such a pace that we now live surrounded by invisible networks of computers. Mark Weiser, who felt that using computers had become too demanding, conceived of 'calm technology': computers that serve us without requiring the user's constant attention. That is the vision behind ubiquitous computing. Ubiquitous computing aims to fuse physical and digital space by embedding computers unobtrusively into real-world objects and environments so that information can flow between objects, people, and computers. Current technological trends already show fragmented but widespread signs of our progression toward the ubiquitous era: internet-connected home appliances (washing machines, refrigerators, microwaves), automatic metering and control systems (water, electricity, gas, lighting), remote-controlled robots that clean and guard homes, intelligent concept cars, wristwatch phones, and communication-enabled accessories such as earrings and rings. While these technologies help improve the quality of human life, they simultaneously raise new problems.
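To make the 'calm' idea concrete, here is a minimal, self-contained sketch in plain Python. The names (Reading, HomeHub) are invented purely for illustration and do not correspond to any real product or library; the point is only that an everyday object reports its state to a computer, and the person is bothered only when something actually needs attention.

from dataclasses import dataclass

@dataclass
class Reading:
    device: str
    temperature_c: float

class HomeHub:
    """Collects readings from embedded devices and surfaces only what matters to the person."""
    def __init__(self, alert_above_c: float):
        self.alert_above_c = alert_above_c

    def receive(self, reading: Reading) -> None:
        # Information flows from object to computer; the person is involved only on exceptions.
        if reading.temperature_c > self.alert_above_c:
            print(f"Alert: {reading.device} is at {reading.temperature_c} C, please check the door.")

hub = HomeHub(alert_above_c=8.0)
hub.receive(Reading("kitchen-fridge", 4.2))   # nothing happens; the technology stays calm
hub.receive(Reading("kitchen-fridge", 10.5))  # the hub quietly alerts the person

Even in this toy version, the design choice is visible: the computer works in the background, and the human is drawn in only when a decision is needed, which is exactly where questions of responsibility begin.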


What problems does technology pose?

The most significant technological advancement of World War II also brought its most severe consequence: nuclear weapons. The war ended when the United States dropped atomic bombs on Hiroshima and Nagasaki, leading to Japan's surrender. The destructive power of an atomic bomb is immense, capable of rendering an entire city virtually uninhabitable with a single detonation. A 15-kiloton bomb creates a blast radius of about 500 meters and a thermal radiation radius of about 3.5 kilometers from the hypocenter. Within roughly a second of detonation, almost everything within 500 meters of the hypocenter is vaporized. Between 500 meters and 1 kilometer, survival rates are about 30 percent even for those sheltered inside buildings. Within 3.5 kilometers, the thermal pulse produces temperatures of roughly 2,000 degrees Celsius in an instant; direct exposure carbonizes skin, while indirect exposure causes third-degree burns. The bombs dropped on Hiroshima and Nagasaki were of roughly the 16-kiloton and 21-kiloton class, respectively, and the damage they inflicted continues to be felt more than 70 years later. This destructive power starkly demonstrates how human-made technology can pose immense danger to nature and to humanity itself.
In December 1938, the German scientists Otto Hahn and Fritz Strassmann discovered nuclear fission. Fission occurs when a uranium-235 nucleus absorbs a neutron and splits into two lighter nuclei, releasing a vast amount of energy in the process. That energy can be used peacefully, boiling water to drive steam turbines, but its development into a lethal weapon of war was driven by the will of its users. This case reminds us how easily technology can be turned to destructive ends by human hands.
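For a sense of scale, one commonly cited fission reaction (one of many possible fragment pairs) can be written as:

$$\,^{235}_{92}\mathrm{U} + n \;\rightarrow\; \,^{141}_{56}\mathrm{Ba} + \,^{92}_{36}\mathrm{Kr} + 3n + \text{about } 200\ \mathrm{MeV}$$

Each such split releases on the order of 200 MeV, millions of times the energy of a typical chemical reaction; whether that energy heats water in a reactor or levels a city depends entirely on how it is harnessed.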
The most pressing problems with current technology are probably cyber violence and breaches of personal information. Cyber problems include spam email, computer viruses, and leaks of personal data. In 2017, many people suffered damage from ransomware: malicious software, often delivered by email, that locks a victim's documents and demands a substantial ransom to restore them. The damage was all the greater because no antivirus product offered a complete defense against it. In the coming ubiquitous society, where people and objects are connected through computers, we must also guard against the security threats that can be anticipated. If a device is stolen or lost, access rights to the network behind it could be compromised, and attacks such as battery depletion or signal interference could render ubiquitous computing technology completely unusable. These risks show that technological advancement can seriously threaten human safety and privacy.
While virus infections via email and hacking are crimes that require specific technical skills, a more frequent and in some ways more severe form of cybercrime exists: malicious comments. The Korean term for it, akpeul, combines ak (惡, 'evil') with peul, short for ripeul ('reply'), and literally means a vicious comment. Such comments frequently target celebrities and other public figures, and unlike criticism of a person or their actions, they consist of excessive and unwarranted condemnation. The severity of the problem was laid bare when the late Korean actress Choi Jin-sil, known to have suffered greatly from malicious comments, took her own life. Yet the issue remains unresolved, and discussions about an internet real-name system are still ongoing. Tragically, this year alone two singers have already lost their lives in connection with malicious comments. These crimes (malicious comments, spreading rumors, cyberbullying) require no special skills to commit; anyone can become the perpetrator. At this point we must reflect on the fact that human ethics has failed to keep pace with rapid technological advancement. Using technology demands more than mere proficiency: ethical use and a sense of responsibility will become ever more crucial.


What is the responsibility of the user employing technology?

Since the development of AI, some have worried that the day might come when machines dominate humanity, as so often depicted in movies and novels. Perhaps what humanity fears most about technological advancement is the moment when humans lose control over technology. The philosopher Jacques Ellul, who arguably opened the era of the philosophy of technology, argued that as modern technology advances, humans lose their freedom. The freedom Ellul speaks of is the state of being able to make choices without having to justify them; in modern technology, choices are made automatically. Furthermore, modern technology inevitably interconnects and expands: once one technology is developed, another must be developed to maintain it.
All of this seems true. But could technology actually escape human control? Humans created the first technologies, and humans remain both their sole creators and their sole users. For that very reason, human responsibility in the use of technology must be emphasized.
In his book 'The Ethics of Technology: The Practice of the Principle of Responsibility', Hans Jonas stated that technology permeates all human concerns and functions as a form of human power. Since all human actions require moral scrutiny, technology, being a form of action, also demands moral examination. Among the terms Jonas used is the 'ethical vacuum': the gap between the advancement of science and technology and an ethics that cannot keep pace with it. From Jonas's argument we can see that the user's responsibility, too, is inescapable.
Ethics refers to the principles that humans ought to follow or uphold, the norms governing human behavior. The issues discussed earlier (the development of nuclear weapons, the social problems stemming from the spread of the internet, and the threats posed by ubiquitous technology) all arose from a lack of ethical awareness. Why did people, decades ago, turn research aimed at generating more energy into weapons of mass destruction? Why, as the internet makes communication ever faster, do people plant viruses in others' computers, invade their privacy, and hide behind anonymity to spread rumors and malicious comments that wound one another? These questions demand fundamental reflection on how we use technology. As Hans Jonas pointed out, our society still has an ethical vacuum, and as technological development accelerates, that vacuum will only grow. Technology is merely a tool, neither inherently good nor evil; depending on the will of the humans who use it, it can become a weapon or a means of improving society. Therefore we must take responsibility for how we use technology, and ethical judgment and reflection are more crucial than ever.


Conclusion

In conclusion, reflection on technology itself is necessary, but reflection on the ethical consciousness of the humans who use it is equally important. From the start of formal education, students receive instruction in basic morality and ethics. Nevertheless, crime persists in society, and as technology advances, ever more diverse kinds of crime proliferate. Addressing this problem requires ethics education suited to a changing society. That process takes time and is likely still ongoing: there is inevitably a gap of at least one generation between educators and students, because that much time must pass before today's students finish their studies and become educators themselves. Technology, however, keeps advancing even within a span far shorter than a generation, so educators must thoroughly study the changed society and the technology that has developed. And the ethics at issue here is not only the ethics of technology. Technology ethics is something the researchers who develop technology must learn, while users, who will rely on technology for the rest of their lives, must learn the ethics of user responsibility. Of course, since this is already being taught, what needs to change is the method. Students should not be taught ethics as something crammed into their heads merely to pass exams. Even within schools, and even in seemingly minor matters, when an action violates ethical or moral standards, it must be made unmistakably clear that it is wrong.
This alternative may seem vague, abstract, and lacking in practicality. But human ethics cannot be engineered by any single method: ethical consciousness is shaped by individual will and by the social environment, and there are limits to how far it can be regulated or institutionalized. What we, as users of technology, can do is refuse to use it in harmful ways and take responsibility for our actions. If each individual makes that small effort, then in time the term 'ethical vacuum' will fade into history, and we will be humans who use technology wisely rather than humans dominated by it.


About the author

Writer

I'm a "Cat Detective": I help reunite lost cats with their families.
I recharge over a cup of café latte, enjoy walking and traveling, and expand my thoughts through writing. By observing the world closely and following my intellectual curiosity as a blog writer, I hope my words can offer help and comfort to others.