
Ask Slashdot: Linux Security, In Light of NSA Crypto-Subverting Attacks?

timothy posted about 10 months ago | from the go-ask-theo-de-raadt dept.

Government

New submitter deepdive writes "I have a basic question: What is the privacy/security health of the Linux kernel (and indeed other FOSS OSes) given all the recent stories about the NSA going in and deliberately subverting various parts of the privacy/security sub-systems? Basically, can one still sleep soundly thinking that the most recent latest/greatest Ubuntu/OpenSUSE/what-have-you distro she/he downloaded is still pretty safe?"

472 comments

No. (4, Funny)

Anonymous Coward | about 10 months ago | (#44791833)

I think there's even a law for this kind of reply...

Ken Thompson, Anyone? (5, Interesting)

Jeremiah Cornelius (137) | about 10 months ago | (#44791915)

You can not add security, later.

In Unix systems, there’s a program named “login“. login is the code that takes your username and password, verifies that the password you gave is the correct one for the username you gave, and if so, logs you in to the system.

For debugging purposes, Thompson put a back-door into “login”. The way he did it was by modifying the C compiler. He took the code pattern for password verification, and embedded it into the C compiler, so that when it saw that pattern, it would actually generate code that accepted either the correct password for the username, or Thompson’s special debugging password. In pseudo-Python:

    def compile(code):
        if (looksLikeLoginCode(code)):
            generateLoginWithBackDoor()
        else:
            compileNormally(code)

With that in the C compiler, any time that anyone compiles login, the code generated by the compiler will include Thompson’s back door.

Now comes the really clever part. Obviously, if anyone saw code like what’s in that example, they’d throw a fit. That’s insanely insecure, and any manager who saw that would immediately demand that it be removed. So, how can you keep the back door, but get rid of the danger of someone noticing it in the source code for the C compiler? You hack the C compiler itself:

    def compile(code):
        if (looksLikeLoginCode(code)):
            generateLoginWithBackDoor(code)
        elif (looksLikeCompilerCode(code)):
            generateCompilerWithBackDoorDetection(code)
        else:
            compileNormally(code)

What happens here is that you modify the C compiler code so that when it compiles itself, it inserts the back-door code. So now when the C compiler compiles login, it will insert the back door code; and when it compiles the C compiler, it will insert the code that inserts the code into both login and the C compiler.

Now, you compile the C compiler with itself – getting a C compiler that includes the back-door generation code explicitly. Then you delete the back-door code from the C compiler source. But it’s in the binary. So when you use that binary to produce a new version of the compiler from the source, it will insert the back-door code into the new version.

So you’ve now got a C compiler that inserts back-door code when it compiles itself – and that code appears nowhere in the source code of the compiler. It did exist in the code at one point – but then it got deleted. But because the C compiler is written in C, and always compiled with itself, that means that each successive new version of the C compiler will pass along the back-door – and it will continue to appear in both login and in the C compiler, without any trace in the source code of either.

http://scienceblogs.com/goodmath/2007/04/15/strange-loops-dennis-ritchie-a/ [scienceblogs.com]
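
The bootstrap step in the last two paragraphs is the subtle part, so here is a toy sketch in the same pseudo-Python style. It is only an illustration: the dict standing in for a "compiler binary" and the helper names are invented for this example.

    # Toy model: a "compiler binary" is a dict recording whether the
    # trusting-trust back door is baked into it.
    CLEAN_BINARY = {"has_backdoor": False}

    def looks_like_compiler(source):
        return "compile" in source

    def compile(source, compiler_binary):
        """Compile `source` with `compiler_binary`, returning a new binary."""
        backdoor_in_source = "BACKDOOR" in source
        # A compromised compiler re-inserts the back door whenever it
        # recognises compiler source, even if that source is clean.
        propagated = compiler_binary["has_backdoor"] and looks_like_compiler(source)
        return {"has_backdoor": backdoor_in_source or propagated}

    dirty_source = "compile ... BACKDOOR ..."   # back door still in the source
    clean_source = "compile ..."                # back door deleted from the source

    gen1 = compile(dirty_source, CLEAN_BINARY)  # bootstrap: back door enters the binary
    gen2 = compile(clean_source, gen1)          # source is clean, binary is not
    gen3 = compile(clean_source, gen2)          # ...and it keeps propagating

    print(gen1, gen2, gen3)  # all three report has_backdoor: True

Once gen1 exists, deleting the back door from the source no longer helps, because every later generation is built by an already-compromised binary.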

Re:Ken Thompson, Anyone? (4, Informative)

Jeremiah Cornelius (137) | about 10 months ago | (#44791923)

Moral

The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.

http://cm.bell-labs.com/who/ken/trust.html [bell-labs.com]

Re:Ken Thompson, Anyone? (1, Interesting)

Anonymous Coward | about 10 months ago | (#44792041)

This argument is much, much too complicated. Plus, it can indeed be tracked down in the compiler binary. Compiling the compiler with an unrelated compiler will remove the malware in the compiler binary. You can use a really slow one for this effort, as you must use it only once.
In reality, there are more than enough bugs of the "Ping of death" style, which can be used. Read "confessions of a cyber warrior".
The worst thing Bell Labs brought into this world was the C and C++ languages and the associated programming style: char* pointers, the possibility of uninitialized pointers, and so on.

If Bell Labs had not foisted C and C++ on this world for "free", the government would have had to invent something to make their "cyber war space" possible. Wait, Bell Labs WAS the government.

If that's not enough, a single buffer overflow in firefox or Acrobat reader can trigger something like the Pentium F00F bug, and then they OWN THE CPU. Your stinking sandbox is wholly irrelevant at this time.

Go figure, sucker. Me, I am a C and C++ software engineering sucker, too.

Re:Ken Thompson, Anyone? (5, Interesting)

Anachragnome (1008495) | about 10 months ago | (#44792345)

"The moral is obvious. You can't trust code that you did not totally create yourself...."

I agree, but that doesn't really help us in the real world--writing our own code doesn't reasonably work out for most people. So, what's the solution to your dismal conclusion? Ferret out those that cannot be trusted--doing so is the closest we will ever come to being able to "trust the code".

So, how does one go about ferreting out those that cannot be trusted? The Occupy Movement had almost figured it out, but wandered around aimlessly with nobody to point a finger at when they should have been naming names.

The NSA has made it clear that making connections--following the metadata--is often enough to get an investigation started. So why not do the same thing? Turn the whole thing around? Start focusing on their networks. I can suggest a good starting point--the entities that train the "Future Rulers of the World" club. The "Consulting Firms" that are really training and placing their own agents throughout the global community. These firms are the world's real leaders--they have vast funding and no real limitations to who and where they exert influence. In my opinion, they literally decide who runs the world.

Pay close attention to the people associated with these firms, the inter-relatedness of the firms and the other organizations "Alumni" end up leading. Pay very close attention to the technologies involved and the governments involved.

Look through the lists of people involved, start researching them and their connections...follow the connections and you start to see the underlying implications of such associations. I'm not just talking about the CEO of Redhat (no, Linux is no more secure than Windows), but leaders of countries, including the US and Israel.

http://en.wikipedia.org/wiki/Boston_Consulting_Group [wikipedia.org]

http://en.wikipedia.org/wiki/McKinsey_and_Company [wikipedia.org]

http://en.wikipedia.org/wiki/Bain_%26_Company [wikipedia.org]

THIS is the 1%. These are the perpetrators of NSA surveillance, to further their needs...NOT yours. People with connections to these firms need to be removed from any position of power, especially government. Their future actions need to be monitored by the rest of society, if for no other reason than to limit their power.

As George Carlin once put it so well..."It's all just one big Club, and you are not in the fucking club."

Re:Ken Thompson, Anyone? (5, Informative)

dalias (1978986) | about 10 months ago | (#44792017)

Fortunately there is an effective counter-measure: http://www.dwheeler.com/trusting-trust/ [dwheeler.com]
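
For context, Wheeler's countermeasure is "diverse double-compiling": rebuild the suspect compiler's published source with an independent, trusted compiler, use that result to rebuild the source again, and compare the output bit-for-bit with what the suspect binary produces from the same source. A rough sketch of the check, with hypothetical compiler paths and assuming deterministic builds:

    import hashlib, subprocess
    from pathlib import Path

    # Hypothetical paths -- substitute your real toolchains and source file.
    SUSPECT = "/usr/bin/suspect-cc"      # binary under test (e.g. a distro compiler)
    TRUSTED = "/opt/tcc/bin/trusted-cc"  # independently sourced compiler
    SOURCE  = "compiler-source.c"        # claimed source of SUSPECT

    def build(compiler, source, output):
        """Build `source` with `compiler`; return the SHA-256 of the result."""
        subprocess.run([compiler, "-o", output, source], check=True)
        return hashlib.sha256(Path(output).read_bytes()).hexdigest()

    build(TRUSTED, SOURCE, "stage1")            # trusted compiler builds the source
    stage2 = build("./stage1", SOURCE, "stage2")  # stage1 rebuilds the same source
    regen = build(SUSPECT, SOURCE, "regen")       # suspect binary rebuilds its own source

    # With deterministic builds, a mismatch means SUSPECT is not what
    # SOURCE describes -- e.g. it carries a self-propagating back door.
    print("match" if stage2 == regen else "MISMATCH: investigate")

The trusted compiler only has to be independent and honest, not fast, which is why even a slow reference compiler works for this one-off check.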

Re:Ken Thompson, Anyone? (2)

rvw (755107) | about 10 months ago | (#44792263)

Fortunately there is an effective counter-measure:

http://www.dwheeler.com/trusting-trust/ [dwheeler.com]

So you compile the code using two different compilers. How can you be sure that both other compilers don't have a parent compiler that is infected?

Re:Ken Thompson, Anyone? (1)

1s44c (552956) | about 10 months ago | (#44792067)

Madness. But gcc isn't the only C compiler that can compile code that contains GNU extensions; there was another that could even compile a working kernel, but I forget its name. Plus, if you strip the GNU extensions out there are loads of alternative compilers.

Re:Ken Thompson, Anyone? (2)

TheRealMindChild (743925) | about 10 months ago | (#44792157)

icc, Intel's C compiler.

Re:Ken Thompson, Anyone? (3, Interesting)

gl4ss (559668) | about 10 months ago | (#44792111)

write your own login, use a different sourced compiler for compiling the compiler..

anyways, can we stop posting this same story to every fucking security story already? put it in a sig or something.

Not much worry with a source build (4, Interesting)

msobkow (48369) | about 10 months ago | (#44791835)

The big worry is not building from source, but builds delivered by companies like Ubuntu, which you have absolutely no guarantee are actually built from the same source that they publish. Ditto Microsquishy, iOS, Android, et. al.

The big concern is back doors built into distributed binaries.

Re:Not much worry with a source build (4, Interesting)

msobkow (48369) | about 10 months ago | (#44791859)

Another one that concerns me is Chrome, which on Ubuntu insists on unlocking my keystore to access stored passwords. I'd much rather have a browser store its passwords in its own keystore, not my user account keystore. After all, once you've granted access to the keystore, any key can be retrieved.

And, in the case of a browser, you'd never notice that your keys are being uploaded.

Re:Not much worry with a source build (2)

Concerned Onlooker (473481) | about 10 months ago | (#44791883)

In the Apple Keychain Access app the access to each key is restricted to a list of applications that are set by the user. You are allowed to grant access of a particular key to all applications, however.

Re:Not much worry with a source build (4, Insightful)

Zumbs (1241138) | about 10 months ago | (#44791889)

Then why do you use Chrome? Pulling stunts like that would make me uninstall a program in a heartbeat ...

Re:Not much worry with a source build (1, Informative)

reve_etrange (2377702) | about 10 months ago | (#44791905)

Much better to use LastPass or whathaveyou instead of the Chrome keystore, IMHO. For one thing, you're right about separating that from your user account keystore, but the Chrome keystore is also pretty insecure. LastPass makes a point of this during installation: once you've OK'd the install, it's able to silently access all your passwords.

Re:Not much worry with a source build (1)

Ingenium13 (162116) | about 10 months ago | (#44792139)

I don't think Chrome uses my Ubuntu keystore. It never asks for a password when opening Chrome, and it never requested access to the keystore. I'm using 12.04.

Re:Not much worry with a source build (0)

Anonymous Coward | about 10 months ago | (#44792151)

I wouldn't trust Google or Canonical. Google I don't have to explain. AFAIK there's no mention of Canonical in the recent revelations, but I would be surprised if they haven't been targeted, either by legal extortion or hacking or source code insertion. Ubuntu (at least the no-prefix version) is big enough to be a target. And they never valued people's privacy in the first place; see the Amazon lens debacle.

Re:Not much worry with a source build (4, Interesting)

AlphaWolf_HK (692722) | about 10 months ago | (#44791897)

Eventually you have to draw the line somewhere with regard to where you stop trusting. If the Linux kernel sources themselves contained a backdoor, I would be none the wiser, and neither would most of the world. Some of us have very little interest in coding, let alone picking through millions of lines of it to look for that kind of thing. And then of course there are syntactic ways of hiding backdoors that even somebody looking for one might miss.

Re:Not much worry with a source build (1)

hedwards (940851) | about 10 months ago | (#44791937)

You do, but if you're that worried, there's always truecrypt and keepassx. If you keep the database in a truecrypt encrypted partition, the NSA can't get at that within any reasonable period of time. You can also ditch the keepassx and just store it as plain text in the encrypted partition, but that's not very convenient.

Re:Not much worry with a source build (5, Insightful)

rvw (755107) | about 10 months ago | (#44792279)

You do, but if you're that worried, there's always truecrypt and keepassx. If you keep the database in a truecrypt encrypted partition, the NSA can't get at that within any reasonable period of time. You can also ditch the keepassx and just store it as plain text in the encrypted partition, but that's not very convenient.

Can you be sure that Truecrypt has no backdoors? If so, how?

Re:Not much worry with a source build (2)

ImdatS (958642) | about 10 months ago | (#44791941)

Mod up the parent!

Yes, that's actually my concern all the time. Of course, with open source, you could technically check the source of the system you are using. But then, you'd need to check every line of code, thinking exactly like the NSA (or what-not), in every piece of software you use, including the compiler you use to compile, and the compiler's compiler, etc., etc.

Additionally, you'd need to check the source of all the HW-components that come with their own BIOS, including the system's BIOS, networking chip's onboard software, and a lot more. Of course, you could reduce the number of checks if you wrote your own code for encryption that sits between your keyboard/mouse, memory, etc - meaning, if you really want to sleep soundly, you need to write your own encryption system end-to-end, i.e. from the first input (electricity flowing from e.g. the keyboard) to the last output. Even then, I wouldn't be sure if I hadn't forgotten anything in-between...

Re:Not much worry with a source build (0)

Anonymous Coward | about 10 months ago | (#44792131)

Additionally, you'd need to check the source of all the HW-components that come with their own BIOS, including the system's BIOS, networking chip's onboard software, and a lot more.

Which components outside of the motherboard come with their own BIOS? Are you conflating firmware with BIOS? They are not one and the same. It's like suggesting a mouse be checked to see if it contains a keyboard.

Pedantry aside, everything else is right on the money.

Re:Not much worry with a source build (4, Interesting)

Smallpond (221300) | about 10 months ago | (#44792267)

There was an attempt to backdoor the kernel [lwn.net] a few years back. I don't believe the perpetrators were ever revealed.

Re:Not much worry with a source build (1, Interesting)

roscocoltran (1014187) | about 10 months ago | (#44791919)

I can't help but feel scared by this SELINUX thing. You can tell me a hundred times that the code was reviewed (was it?), I still won't trust it. I'd like to be sure that just disabling it altogether is enough to stop it completely from... I don't know, opening a backdoor? C'mon, NSA code in the kernel?

Re:Not much worry with a source build (1)

1s44c (552956) | about 10 months ago | (#44792105)

I can't help but feel scared by this SELINUX thing. You can tell me a hundred times that the code was reviewed (was it?), I still won't trust it. I'd like to be sure that just disabling it altogether is enough to stop it completely from... I don't know, opening a backdoor? C'mon, NSA code in the kernel?

Plus it breaks things all over the place if you actually try to use it.

Can't we just throw out SElinux and pretend it never existed?

Re:Not much worry with a source build (5, Insightful)

M. Baranczak (726671) | about 10 months ago | (#44792123)

The NSA is a big organization. They do plenty of things that don't violate the Constitution, international treaties, or common sense.

SELinux is the least of our worries. It's not impossible to hide backdoors or vulnerabilities in an open-source product, but it is pretty difficult. And if the spooks managed to do it, they certainly wouldn't be putting their name on this product, because the people that they're really interested in are even more paranoid than you.

Re:Not much worry with a source build (1)

Anonymous Coward | about 10 months ago | (#44791921)

tl;dr:

Use Gentoo.

Re:Not much worry with a source build (1)

1s44c (552956) | about 10 months ago | (#44792115)

tl;dr:

Use Gentoo.

If the NSA have backdoors or deliberate exploitable bugs in the kernel it doesn't matter what distro or meta-distro you use.

OpenBSD is the last OS you can really be sure of.

Re:Not much worry with a source build (1)

M. Baranczak (726671) | about 10 months ago | (#44792173)

If they did it to Linux (and we still don't know for sure if they did, or what they did) then they could have done it to OpenBSD.

Re:Not much worry with a source build (2, Funny)

MaskedSlacker (911878) | about 10 months ago | (#44792207)

Or at least, they will have in ten years when the OpenBSD codebase catches up.

Re:Not much worry with a source build (1)

Truekaiser (724672) | about 10 months ago | (#44792125)

Also rip out SELinux, and roll your own IPsec implementation.

Re:Not much worry with a source build (1)

Anonymous Coward | about 10 months ago | (#44791927)

Ubuntu (and other FOSS distributions): they do publish the source, patches, and build scripts used, so you can recompile yourself and either use that version or compare the checksums against their own. PGP keys mean you can verify the builds are authentic from Ubuntu and not provided by another party.
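
As a rough illustration of the comparison step, here is a small Python sketch that hashes a locally rebuilt package and the distributed binary and reports whether they match. The file paths are hypothetical, and in practice non-reproducible builds (timestamps, embedded paths) will cause benign differences.

    import hashlib

    def sha256_of(path):
        """Return the SHA-256 hex digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical paths: the vendor's binary vs. your own rebuild from
    # their published source, patches, and build scripts.
    distributed = sha256_of("downloads/openssl_1.0.1.deb")
    rebuilt = sha256_of("local-build/openssl_1.0.1.deb")

    print("identical" if distributed == rebuilt else "differs: inspect further")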

Re:Not much worry with a source build (2)

WaffleMonster (969671) | about 10 months ago | (#44791985)

The big worry is not building from source, but builds delivered by companies like Ubuntu, which you have absolutely no guarantee are actually built from the same source that they publish. Ditto Microsquishy, iOS, Android, et. al.

The big concern is back doors built into distributed binaries.

So what is the practical difference between a "back door" and a security vulnerability anyway? They both remain hidden until found and they both can easily result in total ownage of the (sub)system.

History demonstrates the "open source" community is not immune to the injection of "innocent" security vulnerabilities into open source projects by way of human error. I find it illogical to assume intentional vulnerabilities would be detectable in source code when we have failed to detect innocent ones.

And as for your compiler argument: what guarantee do you have that the compiler itself is not compromised? How do you know epic fail is not being injected into the resulting executables during compilation?

Even if you can compile your own compiler, attacks have previously been demonstrated that are capable of compromising that additional layer of indirection.

Re:Not much worry with a source build (0)

Anonymous Coward | about 10 months ago | (#44792255)

The practical difference is that since somebody puts backdoors in on purpose, they know about them from day one. Whereas other security problems have to be discovered before they're exploitable.

Re:Not much worry with a source build (1)

lesincompetent (2836253) | about 10 months ago | (#44791995)

I tend to agree but... is somebody actually looking at the source and auditing the important bits?

Re:Not much worry with a source build (1)

1s44c (552956) | about 10 months ago | (#44792127)

I tend to agree but... is somebody actually looking at the source and auditing the important bits?

Many people are. Not all of them are on your side.

Re:Not much worry with a source build (2)

henryteighth (2488844) | about 10 months ago | (#44792027)

Did you build your own compiler? If not, how can you trust the binaries it produces? Have you dissected your CPU? How do you know it's executing the instructions you want and not quietly running other instructions too? As others have said, you have to draw the line somewhere. Personally, I have no trouble running a binary distribution (not sure why you pick on Ubuntu and not Redhat or Suse or Debian or FreeBSD, but meh)

What about the hardware or compiler? (4, Insightful)

BitterOak (537666) | about 10 months ago | (#44792035)

The big concern is back doors built into distributed binaries.

And what about the hardware? And how can you be sure the compilers aren't putting a little something extra into the binaries? There are so many places for NSA malware to hide that it's scary. Could be in the BIOS, could be in the keyboard or graphics firmware, could be in the kernel, placed there by a malicious compiler. Could be added to the kernel if some other trojan horse is allowed to run. And just because the kernel, etc. are open source doesn't mean they have perfect security. The operating system is incredibly complex, and all it takes is one flaw in one piece of code with root privileges (or without, if a local privilege escalation vulnerability exists anywhere on the system, which it surely does), and that can be exploited to deliver a payload into the kernel (or BIOS, or something else). Really, if the NSA wants to see what you're doing on your Linux system, rest assured, they can.

Re:Not much worry with a source build (-1)

Anonymous Coward | about 10 months ago | (#44792053)

Ubuntu is as wide open as a two dollar whore.

Re:Not much worry with a source build (0)

Anonymous Coward | about 10 months ago | (#44792077)

Do you go over all the source of Linux to ensure nothing detrimental is in there? If not you may as well be running the binaries.

Re:Not much worry with a source build (1)

1s44c (552956) | about 10 months ago | (#44792087)

Compile again from the source package and do a binary diff. There will of course be a few differences, so it might be hard to find real code differences.

It would be an interesting experiment anyway.

Re:Not much worry with a source build (1)

Skapare (16644) | about 10 months ago | (#44792253)

Just be sure you do not build from source with a compiler that was not built from source, or a compiler that was built from source by a compiler that was not built from source, or a compiler that was built from source by a compiler that was built from source by a compiler that was not built from source ... and so on. In other words, source is fine (as long as you read it all) but binary is not since it may be secretly infected.

No (-1)

Anonymous Coward | about 10 months ago | (#44791843)

No, you can't.

OpenBSD (1)

Anonymous Coward | about 10 months ago | (#44791845)

Short of writing it all yourself, I think OpenBSD is as close as you will find to a useful OS you can trust.

Re:OpenBSD (2, Interesting)

Predius (560344) | about 10 months ago | (#44792133)

Even that's no good if the problem is flaws in the spec rather than how it's implemented by OSes. If the NSA did things correctly they didn't have to meddle with actual Linux/BSD/etc src; they got flaws into the crypto definition itself that reduce the work needed to crack it. The better an OS follows the spec... the easier for the NSA to punch through.

Obama Fellatio HQ (-1, Troll)

Anonymous Coward | about 10 months ago | (#44791847)

Attention Obama Supporting Slashdot Douchenozzles!

You stupid fucking dorks, you supported Obama here like stink on shit, you have absolutely no right to complain. Shut up and enjoy your anal probe.

You have but yourselves to thank!!! Oh the joy of statism! Bend over douchenozzles and suck up the Obama socialism!!! Get in line for your food stamps, government cheese and a colonoscopy!

http://www.washingtonpost.com/world/national-security/obama-administration-had-restrictions-on-nsa-reversed-in-2011/2013/09/07/c26ef658-0fe5-11e3-85b6-d27422650fd5_story.html

"The Obama administration secretly won permission from a surveillance court in 2011 to reverse restrictions on the National Security Agency’s use of intercepted phone calls and e-mails, permitting the agency to search deliberately for Americans’ communications in its massive databases, according to interviews with government officials and recently declassified material.

In addition, the court extended the length of time that the NSA is allowed to retain intercepted U.S. communications from five years to six years — and more under special circumstances, according to the documents, which include a recently released 2011 opinion by U.S. District Judge John D. Bates, then chief judge of the Foreign Intelligence Surveillance Court. "

Tyranny!!! It's what's for dinner at Slashdot Socialist Central!

AES (1)

greenfruitsalad (2008354) | about 10 months ago | (#44791853)

i never understood why people go for AES. clearly, if NSA recommends it, in my view it is something to be avoided (i personally go for twofish instead). in ubuntu, ecryptfs uses aes by default, so i would not trust that.

Re:AES (5, Informative)

Digana (1018720) | about 10 months ago | (#44791953)

The last time that the NSA weakened an algorithm they recommended was by shortening the key for DES. Snowden confirms that properly implemented crypto still works, and Rijndael (AES) still seems strong. The problem isn't the algorithms, because the mathematics still check out. The things to fear are the implementations. Any implementation for which we are not free to inspect its source is suspect.

Re:AES (4, Informative)

cold fjord (826450) | about 10 months ago | (#44792045)

The last time that the NSA weakened an algorithm they recommended was by shortening the key for DES.

Minor correction: They strengthened the DES algorithm by substituting a new set of S-boxes which protected against an attack that wasn't publicly known at the time. They shortened the key space which made it more susceptible to brute forcing the key. Full strength DES has held up very well against attacks overall until its key length became a problem. It lasted much longer in use than intended.

I seem to recall that DES was never approved for protecting classified data, but that AES does have that approval.

Re:AES (2)

Threni (635302) | about 10 months ago | (#44792101)

Is there any particular reason why people don't strengthen AES (or any other symmetric encryption) by just reencrypting 1000 times? Perhaps interleaving each encryption with encrypting with the first 1, then 2 etc. It would make next to no difference for the end user, who's going to decrypt just once, but I imagine it would add a lot more time to the cracking of the encrypted data than increasing the size of the key.

Re:AES (4, Insightful)

WaffleMonster (969671) | about 10 months ago | (#44792129)

Is there any particular reason why people don't strengthen AES (or any other symmetric encryption) by just reencrypting 1000 times? Perhaps interleaving each encryption with encrypting with the first 1, then 2 etc. It would make next to no difference for the end user, who's going to decrypt just once, but I imagine it would add a lot more time to the cracking of the encrypted data than increasing the size of the key.

Exponents are actually what protects information, multiplication just makes people feel good.
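
A quick back-of-the-envelope check of that point, as a sketch: re-encrypting 1000 times with the same key multiplies a brute-force attacker's work by at most about 1000 (roughly 10 extra bits' worth), while every additional key bit doubles it.

    import math

    key_bits = 128
    repeats = 1000

    base_work = 2 ** key_bits              # brute-force trials, single encryption
    repeated_work = repeats * base_work    # ~1000x more work: linear
    bigger_key = 2 ** (key_bits + 10)      # 10 extra key bits: exponential

    print(math.log2(repeated_work))  # ~137.97 "effective bits"
    print(math.log2(bigger_key))     # 138.0 -- 1000 repeats buys barely 10 bits

And naive repeated encryption can be even cheaper to attack than this suggests, as the meet-in-the-middle attack on double-DES showed.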

Re:AES (4, Informative)

burne (686114) | about 10 months ago | (#44792175)

One Bruce Schneier is a (loud) advocate for increasing the number of rounds in AES. Currently it's 10 to 14 rounds, depending on key length, and he advocates increasing it to much more. His main reason for this is that there's a differential crypto-analysis attack against known plaintext data encrypted with reduced-round AES implementations. In short: if you know or control some of the encrypted data, you can extract bits of the key by comparing changes between encrypted known data. The bits you gain reduce the keyspace you need to search. AES according to the guidelines isn't vulnerable to this. Yet.

Re:AES (1, Interesting)

rvw (755107) | about 10 months ago | (#44792331)

Is there any particular reason why people don't strengthen AES (or any other symmetric encryption) by just reencrypting 1000 times? Perhaps interleaving each encryption with encrypting with the first 1, then 2 etc. It would make next to no difference for the end user, who's going to decrypt just once, but I imagine it would add a lot more time to the cracking of the encrypted data than increasing the size of the key.

It seems that encrypting the file multiple times with the same key is not safe, and tends to expose the flaws in the encryption method; it can end up less secure. Hashing the password with a random salt many times (like KeePass, which uses 5000 rounds), and then using that result to encrypt the file, should work, however. I'm not an expert on this matter; I'm just repeating what someone else replied when I asked the same question.
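
A minimal sketch of that key-stretching idea using Python's standard library. The salt, the iteration count, and the idea of feeding the result to a file-encryption routine are illustrative assumptions here, not a description of what KeePass actually does internally.

    import hashlib, os

    password = b"correct horse battery staple"
    salt = os.urandom(16)    # random, stored alongside the ciphertext
    iterations = 5000        # the round count mentioned above

    # PBKDF2-HMAC-SHA256: iterate the hash so each password guess costs
    # the attacker ~5000 HMAC evaluations instead of one.
    key = hashlib.pbkdf2_hmac("sha256", password, salt, iterations, dklen=32)

    print(key.hex())  # 256-bit value, suitable as input to a cipher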

Re:AES (0)

boristhespider (1678416) | about 10 months ago | (#44792215)

"Snowden confirms that properly implemented crypto still works"

Yes, because I'm going to trust everything he says. Honestly, kudos to him for leaking everything he has but am I *really* going to trust he knows *everything* and he also isn't going to misreport things for his own personal gain, or to mislead the Russians and Chinese, or to *help* the Russians and Chinese (who, after all, he ran to and was held by for a month) and mislead others, or any other conspiracy-lead scenario you can think of?

Surely, if you care at all about what Snowden decided to leak one thing you should really get from it is *don't fucking trust what someone says who may have vested interests*. And he has vested interests, merely different ones from the NSA (although not necessarily that different if his pious claims to not leaking anything that would hurt American security can be believed, which they may or may not be). And since we never know what someone's vested interests are, assume that everyone has vested interests. So don't trust anyone. Then it all becomes a game about how far you think you can go on anything.

Personally I take the view that what I do online is just going onto an enormous slagheap of metadata (and if they were bugging every single computer, which is technically possible, it would show up on network analyzers unless they've got to those too, which it doesn't seem they have). That metadata is kept for about three days before space runs out and it's pruned down to the most likely pieces of interest, which are kept for about six months to a year, at which point most of it is trashed to recover the space.

Paranoia is all well and good - and definitely, definitely do not trust anything *anyone* says, and that includes Snowden just as much as it includes shitheads like Obama and Putin (who is in a different world of shitheadedness entirely, no matter what the propaganda currently says) if there's even the vaguest reason to think they might want to keep one thing hidden or exaggerate another.

Healthy cynicism is the way to go, people. Mad paranoia is just ridiculous because ultimately the NSA/GCHQ/FSB/whoever really just do not care about you. Mad acceptance is equally ridiculous because every spy agency on the planet will always push its remit as far as possible to invade as much privacy as they can - because they would be remiss if they didn't, and hang the consequences.

Re:AES (1)

hedwards (940851) | about 10 months ago | (#44791963)

AES consists of well studied algorithms. Whether or not the NSA recommends it, it's still known to be secure by independent researchers. From what I understand the only breaks to it are marginally better than brute force, and not likely to result in the data becoming available in a useful period of time.

Re:AES (4, Funny)

greenfruitsalad (2008354) | about 10 months ago | (#44792057)

if the whole world goes for one cipher, then nsa can concentrate on creating and improving a single ASIC design for breaking it. we should be using hundreds of different algorithms. then they'd have to design hundreds of types of ASICs, build 100x more datacentres, increase taxation in USofA to 10x what it is now, yanks would rebel and overthrow that government and then there would be no more evil NSA. simples

Re:AES (2, Informative)

Anonymous Coward | about 10 months ago | (#44792061)

The AES was designed by Rijmen and Daemen, who are not working for the NSA (the former for a Belgian university, the other one for ST Microelectronics), after a public competition. Every element of its design (which is simple) was justified. If the NSA wanted something they could break, then why not design it themselves (as was the case with DES)?

The AES was chosen by the US government because it was apparently secure while fast and easy to implement. The academic crypto community widely considers it secure after more than 10 years of effort to break it (note that twofish does not look less secure, but what makes you think that the NSA could break the AES and not twofish ? In fact nobody can break any of them).

The point of the AES competition was to provide US companies (i.e. the public) with something secure enough against potential attacks from their competition.

Re:AES (1)

WaffleMonster (969671) | about 10 months ago | (#44792095)

i never understood why people go for AES. clearly, if NSA recommends it, in my view it is something to be avoided (i personally go for twofish instead). in ubuntu, ecryptfs uses aes by default, so i would not trust that.

Pick a government. If you trust the Russians use GOST. If you trust the Japanese use Camellia. The NSA has a dual role in spying and protection from spying. By intentionally selecting a vulnerable algorithm cleared for use in protection of classified secret information, they also KNOWINGLY compromise their mission to protect US secrets.

Obviously we can't reason about what we don't know. We must all make our own decisions regarding who/what to trust. I will add AES has been continually subject to attention by researchers because of its heavy use worldwide. To date there is nil public information to indicate it is subject to compromise when used properly.

If it is off (5, Insightful)

morcego (260031) | about 10 months ago | (#44791869)

You can sleep soundly if your computer is off and/or unplugged. Otherwise, you should always be on your guard.

Keep your confidential data behind multiple levels of protection, and preferably disconnected when you are not using it. Never trust anything that is marketed as 100% safe. There will always be bugs to be exploited, if nothing else.

A healthy level of paranoia is the best security tool...

Re:If it is off (1)

1s44c (552956) | about 10 months ago | (#44792033)

You can sleep soundly if your computer is off and/or unplugged.

That's the good advice that nobody takes. Putin went one step further and recommended using typewriters for confidential data.

Re:If it is off (2)

morcego (260031) | about 10 months ago | (#44792091)

You can sleep soundly if your computer is off and/or unplugged.

That's the good advice that nobody takes. Putin went one step further and recommended using typewriters for confidential data.

I'm more on the "never sleep soundly" side of things. Trusting you have a secure system is a good part of the problem. Even a typewriter had flaws.

Re:If it is off (2)

symbolset (646467) | about 10 months ago | (#44792141)

Hopefully manual typewriters. Some electric typewriters have been hacked.

Re:If it is off (0)

Anonymous Coward | about 10 months ago | (#44792245)

The electric typewriters were hacked by inserting hardware into them - they aren't computers that can run arbitrary code, they're just an electromechanical implementation of a mechanical device. You could very easily develop hardware that could act as a keylogger/transmitter for a mechanical typewriter; the only thing about electrics that makes it easier is the availability of electrical power to increase the device's endurance vs. batteries.

Re:If it is off (-1)

Anonymous Coward | about 10 months ago | (#44792259)

Considering that virtually every new PC these days will come pre-fitted with hardware (a TPM) to ensure that your machine is never under your control (your new PC is basically root-kitted at the factory)... you're damn fucking right that typewriters might be the only reasonable option if you really do want privacy.

Re: If it is off (1)

mike260 (224212) | about 10 months ago | (#44792309)

That's not even remotely what a TPM does.

Re:If it is off (1)

viperidaenz (2515578) | about 10 months ago | (#44792271)

As long as you burn the used ribbon afterwards, and nobody gets a hold of the typewriter afterwards either.

It has never been safe. (0)

Anonymous Coward | about 10 months ago | (#44791871)

Every encryption protocol you use has been sabotaged to be readable by them. You don't really think they will try 200 trillion keys to break your stream, do you?
No. They modified the protocols (to make them more secure) and of course never explained the changes. They just mandated it.

Re:It has never been safe. (4, Informative)

1s44c (552956) | about 10 months ago | (#44792009)

Every encryption protocol you use has been sabotaged to be readable by them. You don't really think they will try 200 trillion keys to break your stream, do you?
No. They modified the protocols (to make them more secure) and of course never explained the changes. They just mandated it.

Even the almighty NSA with its insanely high budget can't crack all the encryption. But it does make me wonder if I should avoid everything they recommend.

I suspect the NSA has developed custom hardware for the more common encryption types. Custom hardware was shown to work extremely well on DES by Deep Crack. http://en.wikipedia.org/wiki/EFF_DES_cracker [wikipedia.org]
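
For a sense of scale, a rough sketch of the arithmetic, using the commonly cited figure of roughly 90 billion keys per second for the 1998 EFF machine (an assumption for illustration; real search time also depends on luck and parallelism):

    des_keyspace = 2 ** 56       # 56-bit effective DES key
    keys_per_second = 90e9       # ~90 billion keys/s for Deep Crack

    worst_case_days = des_keyspace / keys_per_second / 86400
    print(round(worst_case_days, 1))  # ~9.3 days to sweep the whole keyspace

A 128-bit keyspace is about 2^72 times larger, which is why this style of custom hardware does not translate directly to modern ciphers.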

Yes. (2, Insightful)

Anonymous Coward | about 10 months ago | (#44791911)

You have to trust the integrity of Linus and the core developers.

If any of them let in such major flaws they would be found out fairly quickly... and that would destroy the reputation of the subsystem leader, and he would be removed.

Having the entire subsystem subverted would cause bigger problems, but more likely the entire subsystem would be reverted. This has happened in the past; most recently, the entire set of changes made for Android was rejected en masse. Only small, internally compatible changes were accepted, and these went through the usual analysis and (rather severe) modifications to make them compatible.

It is possible that this is part of the reason IPsec has never been accepted in the kernel networking code.

Re:Yes. (0)

Anonymous Coward | about 10 months ago | (#44792003)

You have to trust the integrity of Linus....

Welp, that answers that.

Comparatively (1)

ciscon (107483) | about 10 months ago | (#44791925)

There are no guarantees, but I'm much more concerned with what Microsoft/Oracle/Cisco does than what Redhat, Ubuntu, or OpenBSD throws into their builds/source.

Re:Comparatively (-1)

Anonymous Coward | about 10 months ago | (#44792075)

There was a time you could crash the oracle listener simply by doing

$ telnet oraserver.mycompany.com 1521

and then randomly hitting the keyboard. No passwords or user IDs required whatsoever.

It probably got better, but only superficially. MySQL security is also horribly shitty.

Essentially, go back to paper and longhand if you have ANYTHING critical.

There is no such thing as "Security"... (3, Insightful)

dryriver (1010635) | about 10 months ago | (#44791931)

or "Privacy" anymore. Perhaps there hasn't been for the last decade or so. We just didn't know at the time. ---- Enjoy your 21st Century. As long as people fail to defend their basic rights, there will not be such a thing as "security" or "privacy" again. My 2 Cents...

Re:There is no such thing as "Security"... (1)

donb3 (719527) | about 10 months ago | (#44791967)

or "Privacy" anymore. Perhaps there hasn't been for the last decade or so. .

Relatives of mine who are really smart software engineers do not have Facebook accounts and little to no web presence. They are over 50, so some of this may be generational, but for years, I wondered what they knew that they couldn't tell me. Now, I know.

Linux and RdRand (5, Informative)

Digana (1018720) | about 10 months ago | (#44791943)

There was recently a bit of a kerfuffle over RdRand [cryptome.org] .

Matt Mackall, kernel hacker and Mercurial lead dev, quit Linux development two years ago because Linus insulted him repeatedly. Linus called Matt a paranoid idiot because Matt would not allow RdRand into the kernel, because it was an Intel CPU instruction for random numbers that could not be audited. Linus thought Matt's paranoia was unwarranted and wanted RdRand due to improved performance. Recently Theodore Ts'o has undone most of the damage, but calls to RdRand still exist in Linux. I do not understand exactly whether there are lingering issues or not.

Re:Linux and RdRand (4, Funny)

Greyfox (87712) | about 10 months ago | (#44791961)

Yeah yeah and I'm having to go through the last couple years of E-mails and tell the various paranoid whackos, slightly demented old relatives and that one guy with the tinfoil that they were right and I was wrong. How do you think that makes ME feel?

Re:Linux and RdRand (0)

Anonymous Coward | about 10 months ago | (#44792109)

Like an asshat. Which isn't undeserved - especially after insulting someone with the tinfoil label.

Re:Linux and RdRand (1)

mdielmann (514750) | about 10 months ago | (#44792283)

Is it a label if he's actually wearing a tinfoil hat? Is the insult really because it's probably aluminum? Or is it an actual vintage tinfoil hat?

Re:Linux and RdRand (1)

KiloByte (825081) | about 10 months ago | (#44791975)

/dev/random has been fixed, anything that uses get_random_int() is not. And that function could trivially be modified to mix RdRand's output rather than use it exclusively.
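
The mixing idea in the parent is straightforward: hash the hardware RNG output together with the kernel's own pool, so an attacker who controls one source still cannot predict the result. A minimal sketch, with os.urandom standing in for the entropy pool and a placeholder function standing in for RdRand bytes:

    import hashlib, os

    def rdrand_bytes(n):
        """Placeholder for hardware RNG output (treated as untrusted here)."""
        return b"\x00" * n   # even a fully attacker-controlled value...

    pool = os.urandom(32)    # stand-in for the kernel entropy pool
    hw = rdrand_bytes(32)    # stand-in for RdRand output

    # Mixing: the output stays unpredictable as long as *either* input is.
    mixed = hashlib.sha256(pool + hw).digest()
    print(mixed.hex())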

Re:Linux and RdRand (1)

1s44c (552956) | about 10 months ago | (#44792177)

So Linus either doesn't know how critical good quality random data is to encryption or was deliberately weakening encryption in the kernel.

Linus rarely seems like he doesn't know what he is talking about.

You can't trust any mainstream Linux distro (1)

1s44c (552956) | about 10 months ago | (#44791949)

It's sad but you can't trust any mainstream Linux distro created by a US company, and you likely can't trust any created in other countries either. I'm not saying that as a pro-windows troll because you can trust MS's efforts even less.

I believe you can trust OpenBSD totally but it lacks many of the features and much of the convenience of the main Linux distros. It is rock solid and utterly secure though, and the man pages are actually better than any Linux distro I've ever seen.

The possibly bigger problem is that no matter what OS you use you can't trust SSL's broken certificate system either because the public certificate authorities are corruptible. And before someone says create your own CA, sure, for internal sites, but you can't do that for someone else's website.

Re:You can't trust any mainstream Linux distro (4, Interesting)

Noryungi (70322) | about 10 months ago | (#44792117)

I believe you can trust OpenBSD totally but it lacks many of the features and much of the convenience of the main Linux distros. It is rock solid and utterly secure though, and the man pages are actually better than any Linux distro I've ever seen.

Three points:

1) See the above discussion: you cannot trust anything that you did not create and compile yourself. With a compiler you wrote yourself. On a machine you created yourself from the ground up, that is not connected to any network in any way. OpenBSD does not make any difference if your compiler or toolchain is compromised.

2) Speaking of which, I cannot but note that OpenBSD had a little kerfuffle a while back, about a backdoor allegedly planted by the FBI in the OS. (Source 1 [schneier.com] ) (Source 2 [cryptome.org] ). I am willing to bet that (a) it's perfectly possible (though not likely), (b) if it was done, it was not by the FBI and (c) that the devs @openbsd.org are, right now, taking another long and hard look at the incriminated code.

3) Finally OpenBSD lacking features and convenience? Care to support that statement? I have a couple of computers running OpenBSD here, and they are just as nice - or even nicer - to use than any Linux. Besides, you don't choose OpenBSD for convenience - you use it for its security. Period.

The possibly bigger problem is that no matter what OS you use you can't trust SSL's broken certificate system either because the public certificate authorities are corruptible. And before someone says create your own CA, sure, for internal sites, but you can't do that for someone else's website.

This goes way beyond a simple question of OpenSSL certificates - think OpenSSH and VPN security being compromised, and you will have a small idea of the sh*tstorm brewing right now.

Re:You can't trust any mainstream Linux distro (1)

1s44c (552956) | about 10 months ago | (#44792239)

Point 1 - You're right of course. OpenBSD uses gcc too and it's unknown how much we can trust CPUs made by AMD or Intel.
Point 2 - Yep, saw that. I got the impression that backdoor may never have existed or if it did it was wiped out quickly. There isn't an easy way to prove it doesn't exist though.
Point 3 - There isn't anything like Ubuntu for OpenBSD, it doesn't 'just work' with modern hardware on things like laptops. OpenBSD is a very nice OS but it's not got cool Linux toys like LVM, ext4, systemd, easy errata updates, and so on. I love OpenBSD and run it on firewalls but it's not the same easy end user OS that Linux is. Conversely OpenBSD's pf beats Linux's iptables hands down so it's horses for courses.

Be afraid (1)

Anonymous Coward | about 10 months ago | (#44791969)

If the powers that be had their way, you would do nothing but lie in your bed with the sheets pulled up around your chin, your eyes darting left and right. Nice life you have there. It would be a shame if something... happened to it.

Meanwhile, if you care about keeping your data private, don't use encryption and think that you can just trust it all to keep it hidden. Your data might be safe, it might not. Be smarter. Learn from baseball players. They keep their signals safe, and they don't even need a computer to do it.

Subversion possible but unlikely and temporary (4, Insightful)

Todd Knarr (15451) | about 10 months ago | (#44791999)

It's possible the NSA did something bad to the code, but it's not likely and it won't last.

For the "not likely" part, code accepted into Linux projects tends to be reviewed. The NSA can't be too obvious about any backdoors or holes they try to put in, or at least one of the reviewers is going to go "Hey, WTF is this? That's not right. Fix it.". and the change will be rejected. That's even more true with the kernel itself where changes go through multiple levels of review before being accepted and the people doing the reviewing pretty much know their stuff. My bet would be that the only thing that might get through would be subtle and exotic modifications to the crypto algorithms themselves to render them less secure than they ought to be.

And that brings us to the "not going to last" part. Now that the NSA's trickery is known, the crypto experts are going to be looking at crypto implementations. And all the source code for Linux projects is right there to look at. If a weakness were introduced, it's going to be visible to the experts and it'll get fixed.

That leaves only the standard external points of attack: the NSA getting CAs to issue it valid certificates with false subjects so they can impersonate sites and servers, encryption standards that permit "null" (no encryption) as a valid encryption option allowing the NSA to tweak servers to disable encryption entirely, that sort of thing. There's no technical solution to those, but they're easier to monitor for.

Pointless Worrying (3, Insightful)

Luthair (847766) | about 10 months ago | (#44792007)

The NSA doesn't really need to have backdoors written into the systems; they have a lot of exploits in their bag of tricks that they've bought or found. Unfortunately the NSA only needs to find one exploit, but for truly secure systems we need to find and fix them all :/

Re:Pointless Worrying (2)

chr1st1anSoldier (2598085) | about 10 months ago | (#44792119)

Also, all they need is to route traffic over their hardware. Sure, you can use SSL and TLS, but I recall an article about the NSA having a majority of the keys from Certificate Authorities. Even if they do not have the correct CA key, I am sure they have a farm of computers ready to brute-force the key and get the information they need. And no, this doesn't give access to data stored locally on your hard drive, but if you ever upload that data anywhere it can be captured.

Absolute Anonymity (0)

Anonymous Coward | about 10 months ago | (#44792039)

Absolute Anonymity is a weapon of mass destruction and will never be allowed by any government.

Why no mention of SELinux? (0)

Anonymous Coward | about 10 months ago | (#44792079)

Seems like the obvious choice if you want to be more secure, because it's NSA appr.... wait a second.

Until proven otherwise... (0)

Anonymous Coward | about 10 months ago | (#44792089)

... we can only assume that the Linux Kernel is compromised and has government (and not necessarily the US government) backdoors in it.

nothing's safe, but there are obvious things to do (5, Interesting)

hedrick (701605) | about 10 months ago | (#44792093)

No, but there's no reason to think that Linux is worse than anything else, and it's probably easier to fix.

If I were Linus I'd be putting together a small team of people who have been with Linux for years to begin assessing things. From Gilmore's posting it seems clear that IPsec and VPN functionality will need major change. Other things to audit include crypto libraries, both in Linux and the browsers, and the random number generators.

But certainly some examination of SELinux and other portions are also needed.

I don't see how anyone can answer the original question without doing some serious assessment. However, I'm a bit skeptical whether this problem can actually be fixed at all. We don't know what things have been subverted, or what level of access the NSA and their equivalents in other countries have had to the code and algorithm design. They probably have access to more resources than the Linux community does.

what we will learn next (1)

Max_W (812974) | about 10 months ago | (#44792107)

I would not be surprised if visionaries and leaders of the computer industry, including FOSS, turn out to be generals, admirals and colonels. And that the leading technology companies are just departments of a single organization.

Government? What About Other Bad Guys? (5, Insightful)

rueger (210566) | about 10 months ago | (#44792153)

We are being told - and some of us suspected as much for a very long time - that the NSA & Co. track everything we do, and have the ability to decrypt much of what we think is secure; whether through brute force, exploits, backdoors, or corporate collusion.

Surely we should also assume that there are other criminal and/or hacker groups with the resources or skills to gain similar access? Another case of "once they know it can be done, you can't turn back."

I honestly believe that we're finally at the point where the reasonable assumption is that nothing is secure, and that you should act accordingly.

Yes (0)

Anonymous Coward | about 10 months ago | (#44792163)

I can guarantee that unless you're very much out of the ordinary -- meaning that you're either guilty of $1,000,000+ fraud (although even that is probably well on the low side), or else a member of organizations directly associated with extremist views -- then the NSA do not give the slightest fuck about you and it's an enormous arrogance to think they do. Honestly, just think about it: there are hundreds of millions of Americans, none of whom the NSA have a legal basis for probing, so they have to be at the very least slightly circumspect, and a good few billion people outside America who they have every right to snoop on. You'd have to fucking go some for them to give a fuck about you, and my guess is that, honestly, they really, really, really couldn't give a toss if you lived or died. That means they're not going to break into your personal desktop (honestly, why the hell *would* they?) and your imprint on their databases will go back to your internet metadata, none of which you've ever been assured by anyone at all is in the slightest bit private.

Everything I've said also goes for GCHQ who are my local legally-dubious group of spooks, and for anything I might ever say about them or, indeed, anything. They really don't care about me.

Can you sleep soundly? (5, Insightful)

cold fjord (826450) | about 10 months ago | (#44792181)

I think that depends on what keeps you up at night.

In one of the earlier stories today there was a post making all sorts of claims about compromised software, bad actors, and pointing to this paper: A Cryptographic Evaluation of IPsec [schneier.com] . I wonder if anyone bothered to read it?

IPsec was a great disappointment to us. Given the quality of the people that worked on it and the time that was spent on it, we expected a much better result. We are not alone in this opinion; from various discussions with the people involved, we learned that virtually nobody is satisfied with the process or the result. The development of IPsec seems to have been burdened by the committee process that it was forced to use, and it shows in the results. Even with all the serious criticisms that we have on IPsec, it is probably the best IP security protocol available at the moment. We have looked at other, functionally similar, protocols in the past (including PPTP [SM98, SM99]) in much the same manner as we have looked at IPsec. None of these protocols come anywhere near their target, but the others manage to miss the mark by a wider margin than IPsec.

I even saw calls for the equivalent of mole hunts in the opens source software world. What could possibly go wrong?

Criminals, vandals, and spies have been targeting computers for a very long time. Various types of security problems have been known for 40 years or more, yet they either persist or are reimplemented in interesting new ways with new systems. People make a lot of mistakes in writing software, and managing their systems and sites, and yet the internet overall works reasonably well. Of course it still has boatloads of problems, including both security and privacy issues.

Frankly I think you have much more to worry about from unpatched buggy software, poor configuration, unmonitored logs, lack of firewalls, crackers or vandals, and the usual problems sites have than from a US national intelligence agency. That is assuming you and 10 of your closest friends from Afghanistan aren't planning to plant bombs in shopping malls, or trying to steal the blueprints for the new antitank missiles. Something to keep in mind is that their resources are limited, and they have more important things to do unless you make yourself important for them to look at. If you make yourself important enough for them to look at, a "secure" computer won't stop them. You should probably worry more about ordinary criminal hackers, vandals, and automated probe / hack attacks.

Recall the NSA/FBI OpenBSD story? (2, Interesting)

Anonymous Coward | about 10 months ago | (#44792193)

Hmmm - all of a sudden this looks interesting again:

http://news.cnet.com/8301-31921_3-20025767-281.html [cnet.com]

Best Strategy - No encryption (1)

kawabago (551139) | about 10 months ago | (#44792211)

No one will bother with unencrypted text as it will be assumed to have nothing interesting. If a computer scanning your text and forgetting it bothers you, hide the real text inside other boring text. Obscurity by tedium.

"pretty safe?" (4, Insightful)

bill_mcgonigle (4333) | about 10 months ago | (#44792217)

Yes, it's "pretty safe". It's not absolutely safe or guaranteed to be safe. But if your other alternative is a hidden-source OS, especially one in US jurisdiction, then OSS is "pretty safe."

No, you are not safe (0)

drolli (522659) | about 10 months ago | (#44792227)

Even if you did use SSL to get a proper kernel image (which I doubt), SSL relies on companies issuing certificates, and those companies have been issuing bad certificates for much less important entities than the NSA.

So, no, you can not rely on that. Anybody could have given you any binary.

Hardware concerns (1)

Anonymous Coward | about 10 months ago | (#44792339)

Don't forget that there are concerns in the hardware as well. Are you using an NVIDIA chip, an ATI chip, a non-Atheros wireless chip, a non-HP printer (and even if it is HP, in many cases), etc.? All these things have non-free code where shit hides. Then there is also Chrome, Adobe Flash, Adobe Reader, Skype, and a number of other non-free components in most distributions. You have to be really careful. You may want to check out Trisquel. It's based on Ubuntu, compiled from scratch, patched for free-software reasons, and some privacy-related ones.
