Apple v FBI

LEO has no problems getting access to phone calls and SMS.

They're looking for more of the info that phones contain, such as location data, health data, and financial data.

And in this specific case, I don't think they really care about what's in that iPhone so much as that this is a well-known case they can use as leverage to set a precedent that opens up all encrypted devices.
 
I agree. Politics and philosophies aside, this is a marketing nightmare more than anything else.

They must know that they will eventually have to cave in.

From a strictly strategic perspective, the longer this goes on the worse it will be. Get it over with and move on.

IMHO

Agreed. I am generally not an Apple fan, not that it matters here. There is a court order. End of story.
 
Agreed. I am generally not an Apple fan, not that it matters here. There is a court order. End of story.


Well, not really.... that is why Apple is taking it to court to try and change the court order... IOW, if a court order is wrong, it can be overturned...
 
I'm saddened Apple hasn't offered to do this and I'm saddened the FBI can't do it.

Divided country. It's going to take something more than 9/11 to bring it back together.
 
Well, not really.... that is why Apple is taking it to court to try and change the court order... IOW, if a court order is wrong, it can be overturned...
+1

I'm saddened Apple hasn't offered to do this and I'm saddened the FBI can't do it.

Divided country. It's going to take something more than 9/11 to bring it back together.
+1

But I do not blame the FBI for not being able to do it. These things can be very complex, and you need the knowledge and tools of someone who initially designed it.

One thing I just thought of. There have been embarrassing leaks from the government, but it is when the sensitive material is in many hands. When the knowledge is limited to only a few that are carefully vetted, there are secret things that the government can keep very well, or has well-designed safeguards so that one rogue player cannot jeopardize the entire nation. Or so I hope. Examples include the mechanism and protocol for launching of nuclear weapons, or even development of new advanced systems.
 
I'm saddened Apple hasn't offered to do this and I'm saddened the FBI can't do it.

Divided country. It's going to take something more than 9/11 to bring it back together.

Tend to agree with this. Most are not aware that we are engaged in a new type of war that is not simple to define. The subject phone is probably seen as an outlier by the public. But there will be more incidents of terror to come. The perpetrators' digital life holds many clues that could be used in detecting and thwarting a future event.
 
The FBI cannot do this. Hacking protected software is illegal. Apple retains ownership of the source code.

Apple has a responsibility to protect its owners' and customers' interests. I don't like Apple, but I own some stock and switched over to their premium-priced wares for safety and security reasons. Take that away and Apple's premium value is gone. By extension, it really is a threat to Apple's entire business model.

As a citizen, I see that every new foothold into our rights is in short time twisted far beyond its original scope. My privacy is important to me. If others want to willingly give their privacy away, that is their choice; it isn't mine.

Any advantage gained by backdoor hacks will quickly be lost with third party encryption being used, but those privacy rights given away are gone forever.
 
Apple filed a motion to vacate. It is 65 pages.

Sit back and read. Lots of factual info on the iPhone and legal reasoning. Theodore Olson is one of the attorneys; he is no lightweight. The reading is not light. I have read all 65 pages.

A couple of excerpts below.

https://assets.documentcloud.org/documents/2722223/Apple-s-Motion-to-Vacate.pdf
"
Unfortunately, the FBI, without cons
ulting Apple or reviewing its public
guidance regarding iOS, changed the iCl
oud password associated with one of the
attacker’s accounts, foreclosing the possi
bility of the phone initiating an automatic
iCloud back-up of its data to a known Wi-Fi network,
see
Hanna Decl. Ex. X [Apple
Inc.,
iCloud: Back up your iOS device to iCloud
], which could have obviated the need
to unlock the phone and thus for the extr
aordinary order the government now seeks.
21
Had the FBI consulted Apple first, this
litigation may not have been necessary. "

"
III. ARGUMENT
A. The All Writs Act Does Not Provide
A Basis To Conscript Apple To Create
Software Enabling The Government
To Hack Into iPhones.
The All Writs Act (or the “Act”) does not provide the judiciary with the
boundless and unbridled power the
government asks this Court
to exercise. The Act is
intended to enable the federal c
ourts to fill in gaps in the
law so they can exercise the
authority they already possess by virtue of
the express powers granted to them by the
Constitution and Congress; it
does not grant the courts free-wheeling authority to
change the substantive law, resolve policy
disputes, or exercise new powers that
Congress has not afforded them."

"

In the section of CALEA entitled “Design of features and systems
configurations,” 47 U.S.C. § 1002(b)(1), the st
atute says that it “does not authorize any
law enforcement agency or officer—
(1) to require any specific design of
equipment, facilities, services,
features, or system configurations to be adopted by any provider of
a wire or electronic communication service, any manufacturer of
telecommunications equipmen
t, or any provider of
telecommunications support services.
(2) to prohibit the adoption of a
ny equipment, facility, service, or
feature by any provider of a wi
re or electronic communication
service, any manufacturer of tel
ecommunications equipment, or any
provider of telecommunications support services."
 
Tend to agree with this. Most are not aware that we are engaged in a new type of war that is not simple to define. The subject phone is probably seen as an outlier by the public. But there will be more incidents of terror to come. The perpetrators' digital life holds many clues that could be used in detecting and thwarting a future event.

+1

I know there are strong opinions on both sides and I respect both sides.

One thing, I am in IT and (in my opinion) it is ludicrous for Apple to say they have no way into a phone and they would have to create one. My application folks were laughing about that the other day. It's one thing to argue whether the gov has a right to that access or not, but I think it is naive to think they don't already have the capability. Again, JMHO.
 
My two cents:

I think Apple is trying to draw their line in the sand, and sadly it probably won't work. On a personal level, if there are too many keys out there to unlock stuff, sooner or later someone is going to get into our investments and steal our money out from under us. What then will you and I do?

In addition to major illness, we are much more likely to die in a storm, a car accident, or in one of these copycat senseless killings than we are in a terrorist attack. We are way more likely to have other bad things happen to us than a terrorist attack.

The perpetrators of the San Bernardino attack are gone. They destroyed their own phones ahead of time. That's where the information lies. It is unlikely there will be stuff of use on the work phone. The FBI are trying to distract from the mistake of changing the password by chasing Apple over this.

I fear the Pandora's box the government is trying to open. If the software to disable the lock on the phone gets out to ISIS, Iran, or North Korea, you and I could be looking at a lifetime of eating cat food because our accounts get wiped out. The bad guys buy weapons with our money. No thanks!


Sent from my iPhone using Early Retirement Forum
 
Sooner or later there's going to be a very clear case where a demonstrably bad person obviously has an iPhone with important information on it that could lead to the apprehension of other bad people and the saving of lives. The FBI will want it, and Apple will suffer a giant PR blowback if they don't provide it. I don't think Apple gains much ground right now by arguing against this particular order (because the attackers are dead, because the data extraction is now more difficult than it might be under other circumstances, because the attackers had other phones, etc). To be a meaningful "win" for Apple, they need to make the case that the risks of doing this will >always< outweigh the potential gain. I doubt they will be successful, because they'll have to convince lifelong jurists that the courts can't be trusted to make these distinctions (distinctions they make every day before issuing search warrants and subpoenas) and they'll have to make the case that Apple can't be trusted to keep this tool in-house (which is all the government has requested).
 
...

One thing, I am in IT and (in my opinion) it is ludicrous for Apple to say they have no way into a phone and they would have to create one. My application folks were laughing about that the other day. It's one thing to argue whether the gov has a right to that access or not, but I think it is naive to think they don't already have the capability. Again, JMHO.

I don't think it's laughable at all. I suspect that your opinion is based on your IT work, and network security is very different from secure hardware/firmware on a device with proprietary chips working with proprietary firmware all soldered down in such a way to even make physical removal very, very difficult (and probably useless anyhow, w/o the 'keys').

I have a bit of experience with some secure boot devices, though this was a while back, and my memory is fading on the details (which were under NDA anyway, so just as well). These devices initially boot with a set of keys that are used to lock down all the secure components. The resulting info they need to validate is stored in memory that can't be externally read. These devices won't talk to each other if they don't validate, and that validation process is all in 'burned in' firmware that cannot be bypassed. If Apple did not retain the keys used to lock that device (and maybe they didn't; keys they no longer hold can't be hacked out of them), they have no better chance of breaking in than anyone else.

What Apple does have, from what I understand, is the 'signing process' to load new software on the phone. What I don't know is whether some of the restrictions on number of tries, etc (what the FBI is asking for) are part of software that can be re-loaded, or is it part of the firmware in their 'secure enclave' that is not re-programmable.

edit/add: when I use the term 'secure boot', I don't mean what is on some of our computers, I'm talking about a device level system-on-chip boot level security, a totally different thing.
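To make the idea concrete, here is a minimal sketch of the validate-or-refuse-to-boot scheme described above (all names are invented for illustration; real secure boot chains verify public-key signatures in ROM code rather than bare digests, and the reference values live in storage that can't be read or rewritten from outside the chip):

```python
import hashlib

# Hypothetical reference digest. On real hardware this would be a
# signature-verification key burned in at manufacture, not readable
# or replaceable by any software update.
BURNED_IN_DIGEST = hashlib.sha256(b"trusted-firmware-image").digest()

def validate_firmware(image: bytes) -> bool:
    """Boot code refuses any image whose digest does not match."""
    return hashlib.sha256(image).digest() == BURNED_IN_DIGEST

print(validate_firmware(b"trusted-firmware-image"))   # → True
print(validate_firmware(b"tampered-firmware-image"))  # → False
```

The point is only that the check itself lives below anything a normal software update can reach.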

-ERD50
 
I don't think it's laughable at all. I suspect that your opinion is based on your IT work, and network security is very different from secure hardware/firmware on a device with proprietary chips working with proprietary firmware all soldered down in such a way to even make physical removal very, very difficult (and probably useless anyhow, w/o the 'keys').

I have a bit of experience with some secure boot devices, though this was a while back, and my memory is fading on the details (which were under NDA anyway, so just as well). These devices initially boot with a set of keys that are used to lock down all the secure components. The resulting info they need to validate is stored in memory that can't be externally read. These devices won't talk to each other if they don't validate, and that validation process is all in 'burned in' firmware that cannot be bypassed. If Apple did not retain the keys used to lock that device (and maybe they didn't; keys they no longer hold can't be hacked out of them), they have no better chance of breaking in than anyone else.

What Apple does have, from what I understand, is the 'signing process' to load new software on the phone. What I don't know is whether some of the restrictions on number of tries, etc (what the FBI is asking for) are part of software that can be re-loaded, or is it part of the firmware in their 'secure enclave' that is not re-programmable.

-ERD50

I get all that and understand firmware and IOS development, but stand by my opinion that apple has a way in. It is certainly just my opinion, though.
 
I get all that and understand firmware and IOS development, but stand by my opinion that apple has a way in. It is certainly just my opinion, though.

And you may be right - I just don't consider the possibility that Apple may not have a way in either to be 'laughable'.

From what I understand of all this (and I have not read every word, so please correct me if I'm wrong), Apple is telling the Feds they don't have a way in and they don't want to create one for fear it could leak out. And it seems like their 'way in' isn't simply unlocking the phone, but bypassing the limits on tries so the FBI can attempt a brute-force unlocking. Maybe Tim Cook is flat out lying to the Feds, I don't know, but would he really put himself in that position?

-ERD50
 
...What Apple does have, from what I understand, is the 'signing process' to load new software on the phone...

This is my understanding as well from reading the court order that was on a BBC Web page, a link to which I included in post #100.

It appears that the user's data is still encrypted and there's no back door to get at it, other than a brute force attack by guessing or trying all combinations.

The new software the FBI wants Apple to write will facilitate this brute-force passcode attempt by 1) disabling the auto-erase after 10 wrong tries, 2) eliminating the increasing delay between trials, and 3) allowing the passcode to be entered electronically (possibly via the Bluetooth link).

Apple acknowledges that this is feasible, and takes 10 programmers 2 to 4 days. However, it wants to destroy this software afterwards, and is fretful that it will have to repeat this exercise again and again.
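A toy model may help picture what the FBI is asking to have removed (this is illustrative Python, not Apple's code; the real limits are enforced by iOS and, on newer hardware, the Secure Enclave, and the delay values here are invented):

```python
# Toy model of the two protections at issue: auto-erase after 10
# wrong passcodes, and escalating delays between attempts. The
# requested software would disable both and accept passcodes
# electronically instead of via the touchscreen.

DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # seconds, illustrative

class PasscodeGuard:
    WIPE_AFTER = 10  # auto-erase threshold

    def __init__(self, passcode):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def next_delay(self):
        """Seconds enforced before the next attempt is accepted."""
        return DELAYS.get(self.failures, 0)

    def try_passcode(self, guess):
        if self.wiped:
            raise RuntimeError("data erased after too many failures")
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.WIPE_AFTER:
            self.wiped = True  # data keys destroyed; brute force now useless
        return False

# A brute-force run hits the wipe long before exhausting the space:
guard = PasscodeGuard("7391")
for guess in ("0000", "0001", "0002", "0003", "0004",
              "0005", "0006", "0007", "0008", "0009"):
    guard.try_passcode(guess)
print(guard.wiped)  # → True
```

With the wipe and the delays gone, nothing stops an attached computer from trying every combination as fast as the hardware allows.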
 
This is my understanding as well from reading the court order that was was on a BBC Web page, a link to which I included in post #100.

It appears that the user's data is still encrypted and there's no back door to get at it, other than a brute force attack by guessing or trying all combinations.

The new software the FBI wants Apple to write will facilitate this brute-force passcode attempt by 1) disabling the auto-erase after 10 wrong tries, 2) eliminating the increasing delay between trials, and 3) allowing the passcode to be entered electronically (possibly via the Bluetooth link).

Apple acknowledges that this is feasible, and takes 10 programmers 2 to 4 days. However, it wants to destroy this software afterwards, and is fretful that it will have to repeat this exercise again and again.

If Apple just built some of these multi-retry delays into the secure firmware of the chip itself (something that was not accessible by any software update), it would make a brute force attack on a 6 char alpha-numeric passcode essentially impossible - say one attempt per minute after X failed retries:

(36^6) ∕ (60 ⋅ 24 ⋅ 365) ≈ 4141.5189

36 char raised to the 6th power, divided by 60 min/hour times 24 hours/day, times 365 days/year, is thousands of years!

And that assumes only upper case alpha. With upper/lower and 10 digits, we are talking > 100,000 years at one try a minute.

Of course, that's to try every combo. On average, you'd reach it in half that time, a mere 2,000 to 50,000 years. :cool: Good guessing might cut that down by a large factor, but probably not enough to matter.
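The arithmetic above can be checked in a few lines:

```python
# Reproduce the back-of-envelope numbers above: years needed to try
# every 6-character passcode at one attempt per minute.
MINUTES_PER_YEAR = 60 * 24 * 365  # 525,600

def years_to_exhaust(alphabet_size, length=6):
    return alphabet_size ** length / MINUTES_PER_YEAR

print(round(years_to_exhaust(36), 1))  # uppercase + digits → 4141.5
print(round(years_to_exhaust(62)))     # upper/lower + digits → 108067
```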

-ERD50
 
I get all that and understand firmware and IOS development, but stand by my opinion that apple has a way in. It is certainly just my opinion, though.


I also believe there is a way in. Of course there is no good reason to step forward and show how it's done.

The better course for govt and vendors is to work on guidelines and solutions that can be triggered in the right way, given that legal judgment is in place.
 
I also believe there is a way in. ...

I'm curious why you would be so sure. You may be right, but what leads you to this?

Apple appears to take security very seriously. Security is a very difficult/complex thing to do right. Adding a back door adds complexity, it could have flaws which could reduce the security, and risks a leak of the knowledge. Some people just have to know about it. All they need is a Snowden-type guy on their team.

And what does Apple gain by adding a back door?

I'll apply Occam's Razor - the path of least resistance is for Apple to have no back door, and that is where I would place my bet. I could be wrong of course, but that is how I got to my view.

-ERD50
 
I don't understand why so many people are convinced that Apple already has a back door or other way in without modifying the existing iOS.
 
If you want this in the future, all congress needs to do is pass a law that every encrypted piece of software sold in the US must have a backdoor key that is given to the FBI.
 
Then they'll cede the mobile software industry to other countries.
 
Then they'll cede the mobile software industry to other countries.

U.S. authorities have had the cooperation of Apple and other manufacturers up to this point, and these companies still seem to be doing very well against foreign competition. Maybe another company will spring up in a country that offers better "protections" (in a narrow sense), but that threat exists today and doesn't seem to be taking market share right now.
 
If you want this in the future, all congress needs to do is pass a law that every encrypted piece of software sold in the US must have a backdoor key that is given to the FBI.

The FBI tried to get this kind of legislation passed a couple of times in the past decade. It never could get off the ground.

So now they're trying this approach.
 
For me it's an easy one: unlock the phone!

But I will agree philosophically that it's a challenging situation, but I come at it from the practical view of:
- we don't have any online privacy to begin with
- I don't have anything the FBI might be interested in
- we are at war in a sense and the rules become different in this case

Of course, there is the opposite view.

What I decide to do with my information is not the government's business. If I decide to display secrets on an internet forum, for example, the government should not construe that as an invitation to delve deeper into my life or life's work. Whether or not I have something to hide is of little consequence. There is nothing in the law (read: the Constitution) that says the measure of the government's intrusion into my life is based in any way on the level of secrets I may or may not have. Quite the opposite: it is because I do have things I want to hide from all types of people, to one degree or another, that the government needs judicial permission or some exigency to violate my right to secrecy. More importantly, I have the right to be left alone. Left alone to pursue any and all things that make me happy without being bothered by anyone.

Yes, we are at war. And it is good to point out that what the government is demanding of Apple is no different than conscription. They have simply replaced the gun with a phone. Since there is no emergency, I default to Apple's right to defend their work product. It is not Apple's job or duty to make anyone's job easier, including the government.

Ron
 