
Are Apple’s privacy concerns phony?

February 17, 2016

A federal judge has ordered Apple to provide backdoor access to the iPhone belonging to the now-deceased San Bernardino terrorists.

Apple, citing security and privacy concerns, says it does not have and will not create such an access point, fearing that installing such code on all its phones would leave users vulnerable to hacking attacks.

Leaving aside the question of how truly private our lives are in actual practice, this argument seems strangely short-sighted.

First, the judge’s order only covers a phone formerly owned by one or more dead terrorists. Their privacy rights died with them.

Second, it should be possible to design code that could only be used under certain specific circumstances, and even limited to an after-purchase update sent only to certain phones.

Third, if Apple is so sure it is the only incorruptible and final arbiter of privacy, it should make it so that an iPhone can be unlocked or its privacy settings changed remotely only by an Apple employee in a secure government location, and then only upon receipt of a court order.

Failing that, perhaps Apple’s devices, which are manufactured in a country that is hostile to the U.S., should carry a substantial national security fee, payable at manufacture, so the government can hire people to crack the security code.

It’s not unlikely that some bright 16-year-old hacker or freelance cybercriminal has already written the code. Perhaps a prize could pry it out of them.

This is of course tied back to the public furor over NSA bulk data collection exposed by Edward Snowden.

It can be fairly argued that if the government had a better way to place only specific devices on the data collection list, the bulk collection wouldn’t be necessary. After all, one of the biggest and most truthful knocks on bulk data collection is that it is ineffective due to sheer volume.

From a business standpoint, Apple is betting that its refusal to comply with the court order will result in more sales. In the age of nanny cams spying on you, who wouldn’t want a phone that not even the U.S. government can crack?

That’s going to work fine until there is another mass-casualty event, and it’s discovered that all the bad guys were using iPhones to plan and carry out the attack.

Any further terrorist events post-San Bernardino and Paris could leave Apple, Google, or any of the large tech and data management firms open to a class action lawsuit of gargantuan proportions if their technology enabled the attack.

There is a practical answer to this problem of safety vs. privacy. For the sake of persons not yet killed or maimed, it would behoove Apple to find it.

From → op-ed
