|
"Very few" is different from "no one"; I believe you used the latter.
Daniel Pfeffer wrote: An algorithm with a vulnerability could be perfectly encoded, but still be vulnerable to attack.
Sure, that is true of anything in this world, but that's the rationale for open sourcing projects... To allow other people other than the original designers to assess vulnerabilities.
|
|
|
|
|
Once you let the Genie out of the bottle, you can't put it back in...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
This section seems a bit strange:-
Quote: Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession.
If the file exists on the phone and was encrypted under the existing version of the OS, how would installing a new version of iOS make decryption any easier?
Also - wouldn't doing that utterly corrupt the chain of evidence, meaning anything discovered could not possibly be used in a civilian court of law?
|
|
|
|
|
No
Yes
If it's not broken, fix it until it is
|
|
|
|
|
They have enough evidence to go to any level of legal measure required. This is an attempt to get more information and intelligence.
|
|
|
|
|
I believe I read somewhere that there's currently a security measure that deletes the encryption key after too many failed login attempts. If I'm not mistaken, they're asking Apple to change that setting so that they can brute-force the password (i.e. make it so the phone doesn't delete anything when faced with a brute-force attack).
|
|
|
|
|
Duncan Edwards Jones wrote: If the file exists on the phone and was encrypted under the existing version of the OS, how would installing a new version of iOS make decryption any easier?
My understanding is that if you attempt bad passwords X number of times, the phone essentially bricks itself. The "new" iOS being requested by the courts/FBI would allow unlimited attempts, therefore making any phone that can have that OS installed brute-forceable.
|
|
|
|
|
We answered the same thing at just about the same time, so I guess that is the stated story.
I can see the concern: if this "modified" version of the OS got out into the wild, anybody could brute-force an iPhone.
|
|
|
|
|
Vark111 wrote:
My understanding is that if you attempt bad passwords X number of times, the phone essentially bricks itself. The "new" iOS being requested by the courts/FBI would allow unlimited attempts, therefore making any phone that can have that OS installed brute-forceable.
Ten attempts. Then the phone doesn't just block all info on the phone, it erases it completely. After that no tool can recover it; there is nothing to recover.
The code in question is a 4-decimal-digit code, so a brute-force attack requires only ten thousand tries (or, on average, half of that) - so few that it sounds neither very "brute" nor like very strong "force".
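For scale, the arithmetic behind that claim (a back-of-the-envelope sketch, not anything from Apple's actual implementation):

```python
# Brute-force effort for a short numeric passcode:
# an n-digit decimal PIN has 10**n possible values, and on
# average the correct one turns up halfway through the search.

def keyspace(digits):
    """Number of possible codes for a numeric PIN of this length."""
    return 10 ** digits

total = keyspace(4)
expected = total // 2  # average number of tries before success

print(total)     # 10000
print(expected)  # 5000
```

Even at a leisurely one attempt per second, the whole keyspace falls in under three hours.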
|
|
|
|
|
Duncan Edwards Jones wrote:
If the file exists on the phone and was encrypted under the existing version of the OS, how would installing a new version of iOS make decryption any easier?
Unless the user specifies the full encryption key every time the encrypted information is accessed, the software does know the key. It is stored somewhere in the file system. Move that flash (/disk, for general PCs) over to another machine, as a secondary storage device, and the key can be read by that other machine.
Sure, the key is usually encrypted; you won't find it in cleartext. But the OS/Application knows how to decrypt it. It must know, in order to decrypt the info for the proper user. But in a standard version, the OS/App refuses to do it until the operator has authenticated himself. The special OS edition on the other machine may be willing to decrypt the key without the owner authenticating himself, e.g. presenting a password or fingerprint.
Couldn't that info, given by the user, be (part of) what encrypts the key, so that an intruder would have to know that?
But the OS knows that, too. It must know the PW (or some transformation of it) in order to check that the user gives the right one. So the alternate OS version may pretend that it has just read from the user a PW corresponding to the expected one, even if no user ever specified anything.
Whether you install the alternate OS version on the same device or you move the storage device (flash/disk) to another machine makes no essential difference, as long as there exists a possibility for loading a new OS version without logging in to the machine. In the old days, that wasn't always the case, but with modern automatic over-the-air updates and fixes, it is probably possible to replace all essential parts of the OS that way.
The only safe encryption is where you are the one generating the key, the only one knowing it, and you never present it to the OS or to any application. For standard PC use, I would like to have a USB dongle where I can load, say, my X.509 certificates into a flash area that is not addressable across the USB interface; only the processor in the dongle can see it. So the PC sends the ciphertext across the USB interface, the dongle decrypts it, and returns the cleartext to the PC across the USB interface. (Or it receives cleartext and returns ciphertext.) In many applications (such as S/MIME), the ciphertext will not be the full document text but e.g. a one-time 3DES or AES256 key, used for the text body, but in principle, the dongle could encrypt/decrypt the entire text body.
This dongle could itself require authentication. E.g. it can have a Bluetooth [Smart] interface to your smartphone, requesting a 6-digit PIN to be keyed on the phone. No keylogger on the PC will be able to pick it up (the way it can pick up any PIN, PW or key you type at the PC keyboard). So accessing an encrypted document would require the right USB dongle with the proper keys loaded, the right smartphone for authentication, and knowledge of the PIN requested by the dongle. (Plus, implicitly, the ability to unlock the smartphone, e.g. by fingerprint.) In principle, a keylogger may be installed on the phone, but the risk of the intruder knowing how those typed digits are actually used - as a PIN code for some independent dongle - is rather small.
The biggest problem is to make e.g. an email program use that dongle for decrypting/encrypting the one-time key (or the entire text body). Even if there exist standard encryption APIs, there is a great risk that common mail programs insist on accessing the X.509 certificate itself; maybe they don't use that standard encryption API at all. So if I make myself such a dongle (in fact, I do have access to a programmable USB dongle that could do the job - I just have to learn to develop software for it!), I guess I would have to obtain some open-source email reader (such as Thunderbird) and adapt the source code for it. I guess that I might get the time to complete that project as soon as I retire as an old-age pensioner...
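The protocol shape described above can be sketched in a few lines. This is a toy model only: XOR stands in for real asymmetric crypto, the `Dongle` class and the hardcoded PIN are made up for illustration; the point it demonstrates is that the key never crosses the interface.

```python
import secrets

class Dongle:
    """Toy model of the USB dongle: the key lives only in here."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # never leaves the dongle
        self._unlocked = False

    def authenticate(self, pin):
        # Stand-in for the Bluetooth PIN exchange with the phone;
        # "123456" is a made-up PIN for this sketch.
        self._unlocked = (pin == "123456")

    def _xor(self, data):
        # XOR is a placeholder for real encryption -- do not reuse.
        return bytes(b ^ self._key[i % len(self._key)]
                     for i, b in enumerate(data))

    def encrypt(self, cleartext):
        if not self._unlocked:
            raise PermissionError("dongle locked")
        return self._xor(cleartext)

    def decrypt(self, ciphertext):
        if not self._unlocked:
            raise PermissionError("dongle locked")
        return self._xor(ciphertext)

# "PC" side: only ciphertext/cleartext ever cross the interface.
# As in the S/MIME case, we wrap a one-time session key, not the body.
dongle = Dongle()
dongle.authenticate("123456")
session_key = secrets.token_bytes(16)
wrapped = dongle.encrypt(session_key)
assert dongle.decrypt(wrapped) == session_key
```

A locked dongle refuses to decrypt at all, which is the property the post is after: a keylogger on the PC learns nothing useful, because the PC never holds the key.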
|
|
|
|
|
More people have been killed by babies with guns than by terrorists. Don't let hoopla and propaganda cloud your judgement. Yes, it was sad, but the media blew it up to play the fear card and make it seem like a much bigger problem than it really is. So it's not worth opening Pandora's box.
Jeremy Falcon
|
|
|
|
|
I say we ban babies!
|
|
|
|
|
Agreed. They don't do anything but cry and poop anyway. Who needs them.
Jeremy Falcon
|
|
|
|
|
It could be a marketing stunt on the part of Apple:
(1) Apple publish that they refuse to unlock phones, knowing damn well that they will end up unlocking them.
(2) Their sales go up and they gain market share from the Android users who think 'Apple have an ethical stance'.
(3) Apple then say that sadly they had no choice and unlock the phone - they come out of it smelling of roses.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
It is not like that... but Apple has no idea how to unlock the iPhone.
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|
You need to get the full story. The government has not asked Apple to "unlock" that phone. The government wants Apple to create and install software on that phone which makes it hackable. Software can be copied.
You may or may not love to hate Apple. But their words open another perspective: Customer Letter - Apple[^]
Life is too short
|
|
|
|
|
The question is...can you trust the government.
When you see how the IRS abused its power, I think the answer is obvious: No. Any tools given to the government will be used against real and 'perceived' enemies.
A 'perceived' enemy is someone you disagree with politically.
|
|
|
|
|
Apple are saying no because it will devalue their biggest-selling product. Even if we believe that it was special software written to access one particular phone, the fact that it could be done to anyone on a court order may well deter people from their products in the future.
Having said that, I find it disingenuous of Apple to stand up for security concerns when they've allowed such easy access to the data by, albeit legitimately, installed applications such as Facebook.
Ultimately we should consider any computing device, especially devices capable of over-the-air comms, as insecure anyway.
|
|
|
|
|
It not only could, it will. If one phone got unlocked, what's wrong with two? If two, what's wrong with three? And so on and so forth.
|
|
|
|
|
"The Gov isn't asking them to unlock EVERYONE's phone"
No, that's exactly what the government is demanding. They want a tool that will unlock any iPhone. And that is a dangerous precedent. If history has taught us anything, it is that no government should be trusted, at any time, to do the right thing, when the wrong thing is an option. It also represents a significant reduction in security, whose primary purpose is preventing hackers/crackers from gaining access to your data. If a backdoor is created, attackers will find it, and they will exploit it.
What can this strange device be?
When I touch it, it gives forth a sound
It's got wires that vibrate and give music
What can this thing be that I found?
|
|
|
|
|
I do not understand what this "No" means. The article says the authorities have the device. So if the device could be unlocked (it doesn't matter whether it has been, it matters that it could be), then everyone could unlock an iPhone (okay, without the source code it takes a little longer, but not that much). If the device is strongly encrypted (as it should be), no backdoor can unlock it; instead, a strong key would take a supercomputer several million years to brute-force. Finally, if the device is not really encrypted, or the private key can be reached through the hardware, or it is merely obfuscated, then the device is already unlocked in practice - just use the right tools (obfuscation is not security; it only stops power users from poking around the device).
So what does "We could, but we said 'No'!" mean? Are iPhones really secure, or are they only "secure" because normal users don't have the proper hardware or source code (the first is easy to build, the second can be reverse-engineered)? A really secure device should be impossible for even its manufacturer to unlock, short of wiping it.
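To put a rough number on "several million years" (back-of-the-envelope arithmetic only; the trillion-keys-per-second rate is an assumed, generous attacker):

```python
# Time to exhaust a 256-bit keyspace by brute force, assuming an
# attacker who can test a trillion keys per second.
SECONDS_PER_YEAR = 365 * 24 * 3600
rate = 10 ** 12                  # assumed keys tested per second
years = 2 ** 256 / rate / SECONDS_PER_YEAR
print(f"{years:.1e} years")      # roughly 3.7e57 years
```

Even a 128-bit key leaves the attack hopeless, which is why attackers go after the short PIN protecting the key rather than the key itself.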
|
|
|
|
|
From my understanding, what is being requested is a version of iOS that will not wipe the device if an incorrect password is typed more than 10 times. If the Feds can have a version of iOS that allows an unlimited number of password attempts, then they can eventually type the correct password and access the phone.
I suspect that allowing an unlimited number of sign-in attempts before wiping the phone is a pretty easy code change.
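The policy under discussion might look something like this sketch (an illustration of the idea, not Apple's actual code; the `Phone` class and the passcode are made up):

```python
WIPE_LIMIT = 10  # wrong guesses allowed before the key is erased

class Phone:
    """Toy model of the wipe-after-N-failures policy."""

    def __init__(self, passcode, enforce_limit=True):
        self._passcode = passcode
        self._failures = 0
        self._enforce_limit = enforce_limit
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._enforce_limit and self._failures >= WIPE_LIMIT:
            self.wiped = True  # key material gone; data unrecoverable
        return False

def brute_force(phone):
    """Try every 4-digit code in order."""
    for code in range(10_000):
        guess = f"{code:04d}"
        if phone.try_unlock(guess):
            return guess
    return None

# Stock policy: the phone wipes long before the search finishes.
assert brute_force(Phone("7412")) is None
# With the limit patched out, the search always succeeds.
assert brute_force(Phone("7412", enforce_limit=False)) == "7412"
```

In this toy model it really is a one-flag change, which is the poster's point: the hard part is legal and procedural, not technical.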
|
|
|
|
|
So what? If someone steals your device, he/she cannot unlock it without connecting directly to the processor bus (yes, there are devices that can do that). Normally the price of such hardware is sky-high for normal users (which a thief usually is). I don't think the government can't afford such hardware, so besides the legal problems, why do they need Apple? Yes, modifying source code is easier, cheaper and faster, but modifying machine code is not that difficult.
|
|
|
|
|
Once someone is convicted of a crime, have they not given up their right to privacy? Like a felon has given up the right to vote?
I'm not for spying on innocent citizens, but what about citizens that have been proven to NOT be innocent?
Liberty comes with a price, and so does wickedness.
|
|
|
|
|
I think it extremely interesting that the government is forced to go to the manufacturer to get the data.
The entire situation itself indicates that the government (which obviously includes the NSA) isn't all powerful when it comes to invasion of personal privacy.
I have to admit, my first reaction was, "why can't they just give the damn phone to Apple and have a government (FBI) representative (for chain of evidence reasons) present when the data is produced." That way the code-breaking capability doesn't leave Apple's "clean room" and reduces by many factors the vulnerability of such a program escaping into the wild.
However, if Apple did such a thing, the government would be knocking on their door to do it again in less time than it takes to say iPhone. Ah those pesky precedents.
I'll be stepping out shortly to get more popcorn for the rest of the show.
Talk amongst yourselves.
Cheers,
Mike Fidler
"I intend to live forever - so far, so good." Steven Wright
"I almost had a psychic girlfriend but she left me before we met." Also Steven Wright
"I'm addicted to placebos. I could quit, but it wouldn't matter." Steven Wright yet again.
|
|
|
|
|