Over the years, Star Trek has become synonymous with amazing futuristic technology, as well as the philosophical quagmires of ethics. Since the very start, the shows have been political, making social commentary on the issues of the day through sci-fi allegory. The Next Generation was no different. One of the issues it approached was what it means to be human, explored through the android character of Data.

In his search to become human, Data often questioned the more unusual activities of humanity, including some of its questionable decisions and odd choices made in the name of protecting the greater good. Following that logic, there is a moment in his journey where he pulls the trigger on a nefarious villain named Fajo, fully intending to kill him despite it being strictly against his own ethical code. Why, and how, did Data try to kill this character?

Star Trek: Fajo's Death


The first important thing to remember about Data and his programming is that he is unburdened by the Three Laws of Robotics, Isaac Asimov's famous guidelines for robotic and synthetic lifeforms. These laws act as a programmed moral code, as opposed to one followed by an individual through choice and social constructs, as with most humans. The most important of these laws states that a robot must not injure a human being, or through inaction allow a human being to come to harm. Of course, in the world of Star Trek, 'human' would imply all sentient beings (a complicated notion in itself), but regardless, Data is not programmed with these rules. He is entirely capable of doing harm, and in fact does so on various occasions to foes of the iconic USS Enterprise-D. Data is repeatedly required to take part in combat situations. If anything, he is part of a crew that has killed quite a few sentient beings over the years, and he has often played a vital part in doing so.

Those who insist on programming the Three Laws into robots and synthetics do so out of fear that, without a forced moral code, their creations would go on a maddened rampage. Data, however, mainly through teaching but also through his own personality, does not go the same way as his twin brother Lore. He understands and abides by his own sense of ethics because he can understand morality. There is no ethical subroutine guiding his actions; instead, there is an incredibly complex array of judgment calls and decision-making. His process is as complicated as the train of thought that runs through a human's mind when an ethical decision has to be made.

Star Trek: Data

This all comes to a head in the controversial episode "The Most Toys," where Data pulls the trigger of a high-powered Varon-T disruptor on the villain Fajo. He does so not in a combat situation, but in a moment of cold-blooded, premeditated killing. It is not an unprovoked act, as Fajo had just murdered an innocent woman simply to make a point. He acted like it was nothing, and felt safe in the belief that Data could not do anything about it. Many humans, or members of other emotionally driven races, would have retaliated in enraged revenge, potentially killing Fajo much as Data attempted to. The difference is that Data, at this point in the timeline, does not feel emotion. He cannot feel anger toward this man for his actions. Instead, he responds in an entirely logical way.

Data calculates his options and evaluates that Fajo is a person capable of doing great harm to others, a serious threat to innocent people. He concludes that killing him is the morally right thing to do, as it would prevent further injustices. He says so himself: 'I cannot permit this to continue.' On paper, Data's ethical code is largely against this action: killing an unarmed opponent in cold blood, outside the remit of combat. However, Data weighs the consequences of not acting and chooses to break his own rule. It is simultaneously an exceptionally dark moment and a pivotal part of Data's journey toward becoming more human.

There is also something much darker to be found when reading between the lines of this situation, something that makes uncomfortable observations about humanity. Data wants to become human so badly that he often emulates human behavior, mirroring things he believes to be inherently human. He observed that his crewmates and friends on board the ship (such as disability icon Geordi La Forge) were not killers, and would never, as a rule, kill another outside of combat. However, he knows that if the situation were dire enough and enough hinged on it, they might. Humans throughout history have always done so, as has the fairly militaristic Starfleet. The notion of killing one to save others is prominent not only in real history, but in Star Trek history too. Data's logical conclusion is that murder is wrong, but that if Fajo were left alive, he would likely continue killing people. If Data were to permanently stop him, there would be only one death. Data judged, unemotionally, that one death was better than the possibility of many. There was no revenge or anger, simply math.

While Data's actions may have been what he believed to be right, despite going against some of his fundamental ethical guidelines, that is luckily not how things played out. Data made his decision, and at the very second he pulled the trigger to fire the shot that would have killed Fajo, both were beamed away. Audiences know he did in fact pull it, as fan icon Miles O'Brien notably detects the weapon discharging in transit and deactivates it as Data is beamed up. The episode ends with Data simply brushing the moment off, giving no indication of its significance or of how close he came to making a life-changing choice.
