A federal court has ordered Apple to develop software to
help unlock the encrypted iPhone of a terrorist. Apple opposes the order, citing
larger privacy concerns. Many have said that Apple is wrong and that we need the
information to understand what happened on that fateful day in San Bernardino.
Apple’s concern is that creating this software, even for a
single case, would make it available not only to Apple but also to the
government in general and to the “black hats” of the coding world. In Apple’s
view, creating such a key poses a greater risk to all users than any possible
benefit in this single case.
I can’t help but think about it this way: the government has
a search warrant for a house but no key. They have tried and failed to break
down the door. A court orders the local locksmith to create a key for the
lock. There is no known way of doing this, so the locksmith will have to
spend time and money to figure one out. Since it is a court order, he will be
compelled to comply even if he doesn’t want to, despite having had no part in
any crime. He also knows that once he creates this new key, it will be able
to open any house in the world. Understandably, he is reluctant to create
such a key.
Many in government have said that a person should not be
made to do something against their moral or religious code, and we know from
the “Citizens United” decision that corporations are people. How, then, can
those same government officials ask a “person” (Apple) to go against its
ethical code without being hypocritical?
Where do ethics and security meet, and where do they diverge?
Is it better to risk many to save a few? Is it better to lock up many,
including innocents, to be sure you don’t miss any of the guilty? Share your
thoughts.