Apple v FBI

It does make very secure devices the device of choice for bad guys. I don't know that there is any good answer to this - I want protection from bad guys, and I'm just not all that hung up on privacy the way some people are - but even I see this isn't an easy one.

-ERD50

For me it's an easy one: unlock the phone!

I will agree philosophically that it's a challenging situation, but I come at it from the practical view of:
- we don't have any online privacy to begin with
- I don't have anything the FBI might be interested in
- we are at war in a sense and the rules become different in this case
 
For me it's an easy one: unlock the phone!

I will agree philosophically that it's a challenging situation, but I come at it from the practical view of:
- we don't have any online privacy to begin with
- I don't have anything the FBI might be interested in
- we are at war in a sense and the rules become different in this case

I pretty much agree with that (and others may disagree - that's fine, just different viewpoints).

But I think there is still a pragmatic issue. It not only protects your data, but your payment info. If Apple provides a backdoor, they do need to worry about that backdoor getting out and in the hands of bad guys. They become a target. It could be better to simply have no backdoor than to try to protect that backdoor.

Maybe a backdoor could be constructed where it is software plus some complex hardware - maybe 3 different pieces, kept in safes with the old multiple-key locks, and the phone is moved, under guard, from place to place, so the hardware pieces are never unlocked at the same time. Different designers for each piece of hardware (or I suppose, different keys that are stored and locked with the device, but not exposed unless needed to replace the hardware), all done under a high-level protocol.
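
The idea can be sketched in software too, with plain old secret splitting: no single custodian's piece tells you anything, and the key only exists when every share is brought together. Just a toy illustration (XOR shares, made-up names), not anything Apple actually does:

```python
# Toy sketch: split a key into 3 XOR shares held by different custodians.
# Any one or two shares look like random noise; only all three together
# recover the key.
import secrets
from functools import reduce

def split_key(key: bytes, n: int = 3) -> list[bytes]:
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = bytes(reduce(lambda a, b: a ^ b, bs) for bs in zip(key, *shares))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    return bytes(reduce(lambda a, b: a ^ b, bs) for bs in zip(*shares))

key = secrets.token_bytes(32)        # the hypothetical unlock key
parts = split_key(key)               # hand one piece to each custodian
assert combine(parts) == key         # key only exists when all are present
```

The math part is easy, of course - the hard part is the "high level protocol" around the custodians.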

Sounds like the plot for T-Al's next book? ;)

-ERD50
 
I can see both sides as well, though I tend to side with Apple/Google/Facebook's position. It's not as easy or simple as cracking just this one phone, if I understand this correctly. It would likely compromise the privacy of many. Maybe naive, but we have iPhones largely because we trust Apple more than Google/Samsung.

It's not as if there is no other evidence they have to work with. There are still lots of people who don't have smartphones. It wasn't that long ago that no one did. Yet crimes were solved.

Interesting what Snowden unleashed...
 
- we are at war in a sense and the rules become different in this case
Oh yeah - dump all those individual protections. We are at war! Never mind liberty.

We don't keep any personal documents online nor use iCloud backups. I don't see any link between online privacy and this case.

The FBI has been pressuring Apple for years to create a backdoor for them to access any iPhone. Apple won't do it for several reasons, not the least being that once there is a backdoor, the bad guys will figure out how to access it too.
 
The FBI/CIA knows who the best hackers are. Some can probably be found at Blackhat/Defcon. Hey, in the US these conferences are held in Las Vegas, so why doesn't the FBI put up some money with a "specification" and a challenge? If these guys/gals (hackers) are so good, simply offer the first of them to win (capture the flag, in Defcon parlance) $10m to design and effectively demonstrate the code/method necessary to break into such a device, as specified. If successful, the FBI gets what they want, someone (the hacker) is rewarded for their efforts/skills, and Apple would know they need to fix their code.
 
The FBI/CIA knows who the best hackers are. Some can probably be found at Blackhat/Defcon. Hey, in the US these conferences are held in Las Vegas, so why doesn't the FBI put up some money with a "specification" and a challenge? If these guys/gals (hackers) are so good, simply offer the first of them to win (capture the flag, in Defcon parlance) $10m to design and effectively demonstrate the code/method necessary to break into such a device. If successful, the FBI gets what they want, and Apple would know they need to fix their code.

I believe that the security of a device like the iPhone is actually getting so good as to be essentially un-hackable. Yeah, I know, never say never, but - you have a device with extremely close-coupled hardware, very detailed and involved internal software checks, challenges, and validations, and a very dedicated group of very smart people who have been working on this for a long, long time.

This is different from trying to protect data that gets sent over the web or other external methods. We are talking about the data that resides only in encrypted form, deep inside the hardware of that device. You just can't get to the data w/o passing ALL the tests. And when you get to long alpha-numeric encryption keys for all this, brute force just won't do it.
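
To put rough numbers on why brute force just won't do it - assuming roughly 80 ms per attempt (a figure I've seen quoted for the phone's hardware-bound key-derivation step; treat it as my assumption), the worst-case times look like this:

```python
# Back-of-the-envelope brute-force times, assuming every guess is forced
# through the device's slow key-derivation step (~80 ms per try is my assumption).
SECONDS_PER_TRY = 0.08
YEAR = 60 * 60 * 24 * 365

def worst_case(alphabet: int, length: int) -> float:
    """Seconds to try every possible passcode of this alphabet and length."""
    return alphabet ** length * SECONDS_PER_TRY

print(worst_case(10, 4) / 60)       # 4-digit PIN: ~13 minutes
print(worst_case(10, 6) / 3600)     # 6-digit PIN: ~22 hours
print(worst_case(62, 10) / YEAR)    # 10-char alphanumeric: ~2 billion years
```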

Again, I'm talking about any data stored only internally in the device, and not the stuff that also goes external. But even with the long keys, that stuff could be near impossible to decode as well, if the keys are only stored internally in that phone.

-ERD50
 
I have it set on my iPhone to wipe after 10 failed attempts.

As do I .... I also:

  • use more than 4 digits in my passcode
  • remove access to Siri when locked
  • remove "Reply with Message" when locked
  • remove access to Wallet when locked
  • disable SMS preview
  • enable "Ask to Join Networks" for Wi-Fi
  • turn off Frequent Locations


Speaking of frequent locations... try this: on your iPhone, go to Settings->Privacy->Location Services.
Scroll to the bottom and go into System Services->Frequent Locations; you will see all the places you have been recently - home address, malls, work....
 
I believe that the security of a device like the iPhone is actually getting so good as to be essentially un-hackable. Yeah, I know, never say never, but - you have a device with extremely close-coupled hardware, very detailed and involved internal software checks, challenges, and validations, and a very dedicated group of very smart people who have been working on this for a long, long time.

This is different from trying to protect data that gets sent over the web or other external methods. We are talking about the data that resides only in encrypted form, deep inside the hardware of that device. You just can't get to the data w/o passing ALL the tests. And when you get to long alpha-numeric encryption keys for all this, brute force just won't do it.

Again, I'm talking about any data stored only internally in the device, and not the stuff that also goes external. But even with the long keys, that stuff could be near impossible to decode as well, if the keys are only stored internally in that phone.

-ERD50

No one said it was easy, :) but if they offered $10m, it might be worth the effort. Heck there's freeware encryption code "out there" that was written by small teams, that's never been hacked/cracked. Soooooo, I'm not sure anyone could hack the Apple device, but it would be interesting to see.
 
The problem with providing a backdoor is that once it's done, the criminals will use other means to protect/encrypt their data. If corporations are forced to provide backdoors into their products, then all we've done is given the government/criminals a means to collect data on anyone. IMO, that's not good.

I feel the government's pain on this issue, but I don't think they have any easy solutions. If you're part of a criminal/terror organization, why not create your own app to communicate among your members and store all your data? The app can be protected with a strong passcode/encryption and self-destruct if you enter an invalid passcode three times. And since it's not provided by a corporation that the government can force into compliance, there's not much they can do. Not an easy problem to solve.
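
To illustrate how little it takes, here's a rough sketch of exactly that kind of app logic, using the off-the-shelf Python 'cryptography' package (the names and the three-strikes handling are just my invention for illustration):

```python
# Rough sketch: passcode-encrypted blob that destroys itself after 3 bad tries.
# Uses the third-party 'cryptography' package (pip install cryptography).
import os
import base64
from cryptography.fernet import Fernet, InvalidToken
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

MAX_TRIES = 3

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Slow, memory-hard key derivation so guessing passcodes is expensive.
    kdf = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1)
    return base64.urlsafe_b64encode(kdf.derive(passcode.encode()))

class SelfDestructStore:
    def __init__(self, passcode: str, secret: bytes):
        self.salt = os.urandom(16)
        self.blob = Fernet(derive_key(passcode, self.salt)).encrypt(secret)
        self.failures = 0

    def read(self, passcode: str) -> bytes:
        if self.blob is None:
            raise RuntimeError("data already wiped")
        try:
            return Fernet(derive_key(passcode, self.salt)).decrypt(self.blob)
        except InvalidToken:
            self.failures += 1
            if self.failures >= MAX_TRIES:
                self.blob = None          # "self-destruct": ciphertext discarded
            raise ValueError("wrong passcode")

store = SelfDestructStore("correct horse", b"member list and plans")
print(store.read("correct horse"))        # works only with the right passcode
```

The point being, none of this requires a corporation's cooperation - the building blocks are already public.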
 
No one said it was easy, :) ...

I'm saying that "not easy" is a gross, gross understatement of what it probably takes to crack a device with these coupled hardware/software protections.

OK, maybe Apple slipped and there is some trick way to get in? Not impossible, but remember the only entry is through the fingerprint module or a passcode. It's just a very different beast we are talking about here. And after several attempts, delays are built in to slow down brute-force attempts (IIRC from a recent blog, a one-hour delay between attempts after 10 or so bad tries). It's just so tightly coupled that getting around these delays does not appear feasible.
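
For what it's worth, the escalating-delay behavior is simple to picture in a few lines - the thresholds below are just my recollection from blog posts, not an Apple spec:

```python
# Sketch of escalating lockout delays between passcode attempts.
# The threshold/delay numbers are from memory, not an official Apple spec.
import time

DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 9: 60 * 60}   # failed tries -> wait (sec)
WIPE_AT = 10                                           # optional: erase after 10

def delay_before_next_try(failures: int) -> int:
    applicable = [wait for tries, wait in DELAYS.items() if failures >= tries]
    return max(applicable, default=0)

def try_passcode(guess: str, real: str, failures: int) -> bool:
    if failures >= WIPE_AT:
        raise RuntimeError("device wiped")             # if the wipe option is on
    time.sleep(delay_before_next_try(failures))        # enforced cool-down
    return guess == real
```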



I'm not sure anyone could hack the Apple device, but it would be interesting to see.

So would a perpetual motion machine.

Maybe someone will crack it, and I'll eat those words. But I think the probability is really, really, really low. And maybe worth more to the bad guys than any legitimate $XXM reward?

-ERD50
 
Here's an article on CALEA, which was passed in 1994 (and has since evolved) and specifies what the telecoms must provide to support wiretapping. But of course, back then we didn't have smartphones that stored information. So perhaps one effect of the Apple vs FBI dispute will be to evolve the law further, to clarify what should happen in such cases.

https://en.wikipedia.org/wiki/Communications_Assistance_for_Law_Enforcement_Act
 
The problem with providing a backdoor is that once it's done, the criminals will use other means to protect/encrypt their data. If corporations are forced to provide backdoors into their products, then all we've done is given the government/criminals a means to collect data on anyone. IMO, that's not good.

I feel the government's pain on this issue, but I don't think they have any easy solutions. If you're part of a criminal/terror organization, why not create your own app to communicate among your members and store all your data? The app can be protected with a strong passcode/encryption and self-destruct if you enter an invalid passcode three times. And since it's not provided by a corporation that the government can force into compliance, there's not much they can do. Not an easy problem to solve.
Yes - the criminals will use other means, AND they can also use the new backdoor "feature" to gain access to the phones of non-criminals.
 
If you wonder why Apple is so paranoid about security, read this from Krebs on home surveillance products.

http://krebsonsecurity.com/2016/02/this-is-why-people-fear-the-internet-of-things/

As I noted in a recent column IoT Reality: Smart Devices, Dumb Defaults, the problem with so many IoT devices is not necessarily that they’re ill-conceived, it’s that their default settings often ignore security and/or privacy concerns. I’m baffled as to why such a well-known brand as Foscam would enable P2P communications on a product that is primarily used to monitor and secure homes and offices.
 
For me it's an easy one: unlock the phone!

I will agree philosophically that it's a challenging situation, but I come at it from the practical view of:
- we don't have any online privacy to begin with
- I don't have anything the FBI might be interested in
- we are at war in a sense and the rules become different in this case


I do not care that we are at war.... heck, let them break down your door and search because someone phoned in a tip and said you were a terrorist.... really?




BTW, it is not just this phone.... they had the NYC chief of police say that he has a number of phones that he wants access to.... no terrorist connections at all.... if they can order it done for the terrorist's phone, they can order it for any phone...

Also, look at what happened to the forfeiture laws that were put into place to go after the 'drug lords'.... now they take grandma's car because the grandkid was driving and got pulled over with some drugs... not to sell, just had some... if you give them an inch, some will take a mile (or more)....
 
Oh yeah - dump all those individual protections. We are at war! Never mind liberty.

I do not care that we are at war.... heck, let them break down your door and search because someone phoned in a tip and said you were a terrorist.... really?

Well, having extremely close and personal connections to both 9-11 and the Boston Bombings, I'm sure that my perspective is a bit skewed, but you had to be there, I guess.

How close? A relative and a neighbor's sister on flight 11.
DW was directly across the street from the devastation on Boylston St., less than 200 feet away. It took me two hours to find her.

So, yeah, from my vantage point, we're at war; you can look at what's in my phone.
 
I see that we are at war. War with terror, war with drug cartels, war with North Korean hacking, ditto Russian and Chinese, and on and on.

Security is a powerful defense; it is a bastion of freedom that can make the Internet of Things, the internet, and online transactions secure. Being forced by this precedent to sacrifice so much future security for a dubious outcome is extremely short-sighted.

A cybersecurity expert was talking on NPR about being asked to build a back door into their product. The question was raised whether anyone could hold out in a black-van scenario or a family-hostage situation. They decided not to create the back door.
 
For me it's an easy one: unlock the phone!

I will agree philosophically that it's a challenging situation, but I come at it from the practical view of:
- we don't have any online privacy to begin with
- I don't have anything the FBI might be interested in
- we are at war in a sense and the rules become different in this case
Your comment on having nothing the FBI might be interested in is just a specific instance of the old "I've got nothing to hide" argument. But that's been shown over and over to be nothing more than a lazy intellectual response to an individual case.

Here's a good comment I read in a paper about the topic.

By saying "I have nothing to hide," you are saying that it’s OK for the government to infringe on the rights of potentially millions of your fellow Americans, possibly ruining their lives in the process. To me, the "I have nothing to hide" argument basically equates to "I don’t care what happens, so long as it doesn’t happen to me."
If you've got curtains, a shredder, a password on your computer or phone, you've got things you prefer others not have access to. Otherwise, feel free to post your SSN, mother's maiden name, etc on the internet. This also applies to your "no privacy online" comment. Obviously we do, and Apple and others are trying to strengthen it.

I do not care that we are at war.... heck, let them break down your door and search because someone phoned in a tip and said you were a terrorist.... really?

It happens already, and it has a name - swatting. All it takes is getting someone POed at you, and the next thing you know the cops are breaking down the door.

https://en.wikipedia.org/wiki/Swatting

Here's a good article about it.

http://www.nytimes.com/2015/11/29/magazine/the-serial-swatter.html?_r=0
 
The feds having access to smartphones wouldn't have prevented 9/11 or any other terrorist attack. That ticking-time-bomb scenario doesn't work if they have to go through millions or billions of passcode combinations.

The cops have gotten lazy; now that people are putting all their personal info in their phones, they think there is a single point of access for all that info. But they can't get those phones until after the terrorist act has been perpetrated.

The other part is, if the bad guys lose their phones, they can remotely wipe the iPhone or just change their plans.

What did the cops do before smartphones? Modern smartphones haven't been out even 10 years. Are they slacking on developing other sources of intel because, if they can crack the suspects' phones, they get all the evidence?

Now I'm getting alerts that the DoJ filed a motion to compel Apple to do this and said Apple securing their product is only about marketing. Makes me think less of this attorney general, who's been talked about as a potential SCOTUS nominee, and this administration in general.


If they put out a compromised firmware, it will get out in the wild and users would be vulnerable not only to hackers but to thieves. iPhone theft went way down after they added activation lock, which requires an iCloud password to make the iPhone usable at all. But if people can unlock the device or flash any firmware, then that brings back the incentive to steal iPhones.
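
For anyone unfamiliar with activation lock, the flow is basically this (a purely hypothetical sketch with invented names - not Apple's actual implementation):

```python
# Purely hypothetical sketch of an activation-lock style flow (names invented).
# Point: a wiped or stolen device stays unusable until the original owner signs in.
class LockServer:
    def __init__(self):
        self.locks = {}                       # device serial -> (account, password)

    def enable_lock(self, serial, account, password):
        self.locks[serial] = (account, password)

    def may_activate(self, serial, account, password) -> bool:
        if serial not in self.locks:          # never lock-enabled: activate freely
            return True
        return self.locks[serial] == (account, password)

server = LockServer()
server.enable_lock("SN123", "owner@example.com", "hunter2")
print(server.may_activate("SN123", "thief@example.com", "guess"))    # False: brick
print(server.may_activate("SN123", "owner@example.com", "hunter2"))  # True
```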

Truth is, terrorism is a very low-probability threat that only touches a few people. There are much greater everyday threats to the health and safety of Americans. Hell, the threat of being mugged or worse for your $800 iPhone is a much higher probability than becoming a victim of terrorism.
 
So, yeah, from my vantage point, we're at war; you can look at what's in my phone.

Just because you decided that you're willing to accept warrantless searches, does not mean the rest of us should be compelled to.

Governments have shown, throughout history and geography, that when they take civil liberties in the name of war/terrorism/whatever the problem du jour is, they will use those new powers 1% of the time for the intended purpose and 99% of the time for unrelated things just because it serves their purposes. And usually that ends up being abusive.

I hope Apple continues to fight this.
 
Also, look at what happened to the forfeiture laws that were put into place to go after the 'drug lords'.... now they take grandma's car because grand kid was driving and got pulled over with some drugs... not to sell, just had some... if you give them an inch some will take a mile (or more)....
Not to mention the routine forfeiture of small business funds simply because they "appeared" to be limiting their cash deposits to under $10,000. Didn't matter that there was absolutely no criminal activity involved. No trial, nada. But a sudden cash infusion into a law enforcement budget. Talk about creating something ripe for abuse!

These "expanded" powers are inevitably abused.
 
Just because you decided that you're willing to accept warrantless searches, does not mean the rest of us should be compelled to.

Governments have shown, throughout history and geography, that when they take civil liberties in the name of war/terrorism/whatever the problem du jour is, they will use those new powers 1% of the time for the intended purpose and 99% of the time for unrelated things just because it serves their purposes. And usually that ends up being abusive.

I hope Apple continues to fight this.


Although I empathize with the victims of terrorism, I agree that Apple should fight this. Having to give one's intellectual property to the government sets a dangerous precedent.


Sent from my iPhone :) using Early Retirement Forums
 
The govt trying to compel a company to expend its resources and damage the value of its product in a case where the company isn't even a party?

I'd fight this too.

I wonder if Apple would change their tune if the IRS sent an audit notice to all on the Board of directors...

Or if the Government blocked the airwaves for any phone company that did not give them the keys.

Let the court proceedings continue and throw the CEO in jail for contempt until they finish. And let the IRS continue to do the job that it is set up to do.

The cops have a warrant. Apple is using this as a cop-out. They certainly support other areas where the constitution is being eroded away.

Imagine if a gun dealer did not turn over the records on a gun used in a crime.
 
Just because you decided that you're willing to accept warrantless searches, does not mean the rest of us should be compelled to.

I was just explaining how my own perspective is admittedly skewed.

I wasn't implying that I was advocating warrantless searches for anyone. At least I didn't intend to do that.

As I noted in an earlier post, philosophically I find it a challenging and interesting situation; but for me personally --whether I like it or not-- it takes me to a different place.
 