Here's an alternative vantage point, my vantage point, one I think makes these kinds of ethical quandaries easier to navigate:
* I'm not a "white hat" or a "black hat"
* I'm not deliberately involved in any kind of "cyber" conflict
* I don't do what I do because I'm battling the forces of evil, or organized crime, or anything else
Instead: I do engineering. The same way a contract device driver developer does, or a Rails dev. I happen to work in a particularly challenging problem domain. My work happens to have some interesting implications. But those implications are not the reason I work in the field; I work here because it allows me to grapple with compilers, number theory, low-level networking, hardware, OS kernels, and every imaginable development platform. It's about the craft.
I find this vantage point, which appears amoral, makes the ethical dilemmas easier to resolve. If a company like Narus asks me to help them make a network monitoring system harder to evade, I don't have to put that request into some ethical framework that considers the good that application might do. I just turn the work down. Same goes for the US Government; no, sorry, not interested.
Total respect for Alex (the "white hat consulting company" he founded is iSec Partners, our sister company and former archrival). I get the sense that Alex engages intentionally with these dilemmas, that he wants to be a part of something larger than himself and, I think, larger than the craft. As a result, sure, he has to live a carefully examined life, and make sure the projects he's working on aren't skewing his compass. I admire him for picking his way through those problems. But I'm every bit as engaged with the field as Alex is, and I'm here to tell you that you don't have to get tangled up in these kinds of ethical problems if you don't want to.
Reading what you just wrote reminded me of the famous Edmund Burke quote: "All that is necessary for evil to triumph is for good men to do nothing".
If it had not been for the acquiescence of the engineers who took part in creating PRISM, XKeyscore, etc., we... well, we would not have PRISM, XKeyscore, etc. Increasingly there is no such thing as an "amoral" position on these matters -- you're either an entity who willingly chooses profit over principles, or you do something to defeat the evil as you see it (or, at least refuse to take part in it). In this day and age, individual conscience is one of the last remaining defenses against the many evils, new or old, mercurial or familiar. It falls on all of us to consider the moral ramifications of our actions, in the workplace and outside it, and to choose carefully, to the extent we comfortably can, so that humanity continues to prosper.
I don't mean this to be a thoughtless, idealistic anti-NSA tirade, I'm frankly very okay with folks working on hip new technology that catches the bad guys, I just think your decision framework which is devoid of any ethical considerations is highly, highly dangerous and I wish for the good of us all that it doesn't catch on.
I think your philosophy is the more dangerous one. "Evil" and "Good" are highly subjective and not nearly as straightforward as you're imagining. You see yourself as battling evil. So do the creators of PRISM. Nobody does anything because they think it's bad for the country, or humanity. We are all the heroes of our own story. The greater good has been used all through history to justify horrific acts, and those people genuinely thought they were doing the right thing.
> or you do something to defeat the evil as you see it (or, at least refuse to take part in it)
Refusing to take part doesn't absolve you of anything, according to your philosophy. That's exactly what Thomas says he does - refuses to take part. And by refusing to take part, you're still doing nothing to thwart what you see as evil.
You bring up good points, and I agree that a lot of these issues are very difficult to grapple with: it's difficult to pin down how evil something is, how much you're contributing to it, and whether you should take part in it when you've got mouths to feed at home. But I also think that an informed and learned individual in this day and age will recognize that dragnet surveillance encroaches on fundamental rights of human privacy. If I were an employee at NSA and had been asked to implement some part of PRISM I would protest within the proper confines of law, and ask to be given other work which I would be ethically okay with.
> Refusing to take part doesn't absolve you of anything, according to your philosophy
I think in this context it's fair to interpret a refusal to work on something you deem evil not as inaction but as an act that makes it harder for evil to prevail. If most good men did this, those commissioning the work would either find no one to complete it, or only people who could not do it well or do it completely.
To further clarify, what I am really saying is that making decisions at work detached from any ethical considerations is a problem; these are not just engineering problems -- they affect people, in good ways or bad. I hope everyone would make an earnest effort to determine the morality of the tools, laws, policies, etc. they are in charge of creating or maintaining -- by consulting the existing literature and discussing the moral considerations of their work with peers and others -- and then decide if they really want to be a part of it. And, as it happens, since a lot of this work requires high competency, whenever you find yourself in a situation where determining the morality of your work is exceedingly difficult, there is a good chance you can easily find good work elsewhere that offers the right engineering challenges without the difficult ethical questions.
> it's difficult to pin down how evil something is
You completely missed my point. You can't know how "evil" something is because "evil" is a point of view, not an objective fact.
> I think in this context it's fair to interpret a refusal to work on something you deem evil not as inaction but as an act that makes it difficult for evil to prevail. If most good men did this they either would not find persons to complete the work or only be able to find persons who cannot do it well or do it completely.
I think you would prefer that this were the case, but it's not. It assumes that everybody else in the world with the training required to do the task also turns it down. It also assumes that a young, idealistic programmer with a talent for [crypto/big data/whatever] isn't convinced he's helping to protect his fellow Americans by taking the very job you turned down. In a nutshell, it assumes everybody shares your moral values, which is demonstrably not the case.
This brings up the question, if somebody is going to do that job anyway, is it enough that it's not you? In other words, is turning down the job enough to resolve your ethical dilemma? Personally, I choose Thomas's method of avoiding jobs that even make me think about it, especially when it comes to surveillance and privacy.
I understand it differently, maybe because after much deliberation I fell into the same position. I interpret the parent comment to mean that the questions posed are very, very relevant and worth grappling with, but that knowing the answers is not a prerequisite to working in the chosen field.
I don't know about the original commenter, but I fell into this position as the best one for me, based on: "I am overwhelmed by the number of things that I need to know to make a judgement of good/bad here, but don't or cannot know. There is too much random chance in my life to figure out how my actions play out. Until I grow wiser, let me do what chance has laid in my way, knowing full well that I am operating in the dark."
The big, muddying parameters for me to answer 'is what I am doing right?' were:
1. In what context?
2. Over what timeframe?
The larger the context, the longer the timeframe, the more the number of competing principles I had to prioritize, often in inconsistent ways over different aspects of life. In the end, I defaulted to the original commenter's position, with the blind optimism that I would somehow, somewhere in the future get more clarity and wisdom through experience.
Exactly this. Alex Stamos' ethical strategy is perilous; it forces him to do a balancing act when asked to help companies like Narus or the USG, because his work for those organizations could help more than it hurt. I'm confident Alex can perform those ethical acrobatics, but I prefer to avoid them altogether.
One, I don't think there's an HN law that all threads must follow some precise linearity: there's such a thing as free-form debate. Two, my comment did have something to do with yours. Three, you very often reply like this -- "did you read what I said?" -- and while it doesn't matter much to me personally, can you please make an effort to communicate such things in a more civilized manner? The tone of such comments is often toxic and inflammatory.
The idea you took away from my comment was the opposite of the idea that it communicated. You decided that my comment was an endorsement of amorality. It wasn't. Now that you've been called out on that, you wriggle and writhe through all sorts of meta-commentary to avoid acknowledging your misperception.
Funny, I also happen to think that the idea you took away from my comment was the opposite of the idea that it communicated.
Okay, forgive me for getting even more meta here, but: You're operating under the assumption that I would doggedly stick to my misinterpretation (if it were the case), or maybe that I have some agenda to distort the messages in your comments? My comment history suggests otherwise: I'm more than happy to back off, apologize, and recant any misguided statements I make if it's pointed out rationally. Why doesn't the idea of calmly, non-abrasively trying to explain the breakdown in communication occur to you? Why are you so quick to elevate differences into an us-vs-them orientation on a personal level?
>If it had not been for the acquiescence of engineers who took part in the creation of PRISM, XKeyscore, etc. we... well, we would not have PRISM, XKeyscore, etc.
Oh, come on, man. If you think this is true in the fullest sense, you are not a thinking individual. Mere technological feasibility is 99% of the battle; implementation is the last, most inevitable step.
>the famous Edmund Burke quote: "All that is necessary for evil to triumph is for good men to do nothing".
Googler. The quote may be famous, but Burke is not. Like most men with noble fighting words, he had more noble words in him than he did noble fighting.
The argument is presumably that if you won't do it, they'll find someone else who will.
That's a pretty shit argument for being the one who folds. If everyone refuses, it doesn't get built. If you refuse and they find someone else, that person may not be as good as you, and may build a less effective surveillance apparatus that is easier for white hats to neutralize or dismantle.
Engineering has practical consequences. If you build something that gets democracy activists killed, you're the one who has to live with that. There are plenty of cool problems to solve that don't involve the construction of a surveillance state.
I missed you at Defcon for multiple reasons, not the least being the opportunity to get your feedback on the talk as delivered. Maybe we can run a pan-NCC internal conference this fall and see what everybody else is working on. Chicago is nice and central between SF and Manchester.
A big part of the talk was my theory that our industry can no longer claim neutrality; like medicine or law, our actions have become innately entwined with ethical dilemmas that I feel are better dealt with explicitly and ahead of the moment of decision. I don't think you necessarily disagree, since you lay out two lines you are not willing to cross, even if you don't specify your reasoning.
I expect somebody as seasoned and experienced as you can make these decisions subconsciously without violating your basic principles. Younger, less experienced individuals may find this a greater challenge, and they were the real target of my talk.
In my eyes your actions definitely make you a white hat, even if you avoid the label.
Well, professional engineers and technicians already have this. In the UK, from the Engineering Council:
"All professional engineers and technicians are bound by the Codes of Conduct of their professional engineering institutions."
And in the US, from the NSPE:
Engineers, in the fulfillment of their professional duties, shall:
1. Hold paramount the safety, health, and welfare of the public.
2. Perform services only in areas of their competence.
3. Issue public statements only in an objective and truthful manner.
4. Act for each employer or client as faithful agents or trustees.
5. Avoid deceptive acts.
6. Conduct themselves honorably, responsibly, ethically, and lawfully so as to enhance the honor, reputation, and usefulness of the profession.
Correct me if I'm wrong, but you're saying the ethically questionable work will not be the most challenging from an engineering perspective, and thus less interesting to a hard-core security engineer. If that's your position, then I think it needs to be fleshed out beyond saying you can amorally and categorically reject contracts from Narus or the US Government. What is fundamentally uninteresting about their systems? I'm sure a lot of the engineers working on PRISM et al. found it very technically challenging and rewarding work.
> he wants to be a part of something larger than himself
In fact, you too are part of something larger than yourself and as you work you make decisions that affect it (us) whether or not you think about it. Ignoring an ethical quandary isn't the same as escaping it.
So what ethical stance are you saying you take? You identify as "amoral" and seem to use that to mean "simply self-interested", where self-interest involves doing a craft you enjoy. But then you say there are jobs you wouldn't take for reasons external to the technology. cgag's question is a good one, and I think the contradiction there points to a flaw in the approach of starting out by thinking you can avoid ethical choices. Since you can't really, the only result will be that you make them without thinking them through.
It's categorical rejection. Instead of opening up that avenue and having to navigate the murky waters of cognitive dissonance and moral obligation weighed against lucrative opportunity, you simply don't open the door. You stick to a client base that is meaningful (relatively so) but not highly impactful or potentially dangerous down the road.
This is an excellent basic strategy. I've applied it to my own work: avoiding government or defense contracting work is an obvious first step; cutting out Wall St. proper and stock/bond/asset trading is a likely second.
To call this position "non-participation" or "categorical rejection" is intellectually dishonest, and to say this stance is some kind of blanket protection from future ethical considerations is borderline trolling.
Engineers need to learn the art of building social and political capital. Also patience.
Finances, a scientific mindset, and a moral conscience alone don't get you real change. I have seen too many bright people "rant and run" when confronted with morally ambiguous or uncomfortable situations. The only way to deal with this, if you really believe in something, is to stick around and convince others in the group, one by one, over the long term. There are no shortcuts or hacks to this process.