
Exploiting Wildcards On Linux/Unix

Soulskill posted about a month ago | from the teaching-a-new-dog-old-tricks dept.

Security 215

An anonymous reader writes: DefenseCode researcher Leon Juranic found security issues related to using wildcards in Unix commands. The topic has been discussed in the past on the Full Disclosure mailing list, where some people saw this more as a feature than as a bug. There are clearly a number of potential security issues surrounding this, so Mr. Juranic provided five actual exploitation examples that stress the risks accompanying the practice of using the * wildcard with Linux/Unix commands. The issue manifests through specific options of chown, tar, rsync, etc. By using specially crafted filenames, an attacker can inject arbitrary arguments into shell commands run by other users, including root.


215 comments

Question... -- ? (5, Informative)

beh (4759) | about a month ago | (#47332419)

Who does NOT use -- in their scripts, if they're safety conscious?

        rm -i -- *

Problem solved?

Normal programs should stop processing options after a (standalone) "--" and take everything following it as regular parameters. getopt and similar libraries handle this automatically.

I really wouldn't class the "use of wildcards" as a security risk - the security risk is the developer who doesn't know what he's doing.
Would command-line handling be a security risk if someone added a --superuser-rm option to his code and then executed "rm -rf /" as root immediately afterwards?
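To make that concrete, here is a minimal sketch (hypothetical filenames) of how a crafted name turns into an option once the shell expands the wildcard, and how -- defuses it:

        $ touch -- -f precious1 precious2      # attacker creates a file literally named "-f"
        $ rm -i *                              # the shell expands this to: rm -i -f precious1 precious2
                                               # the injected -f comes last, so GNU rm deletes without prompting
        $ rm -i -- *                           # expands to: rm -i -- -f precious1 precious2
                                               # everything after -- is an operand, so you get prompted for each file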

what about? (1)

Joe_Dragon (2206452) | about a month ago | (#47332461)

drop tables *

Re:what about? (2)

Penguinisto (415985) | about a month ago | (#47333381)

# rm -rf *.*

(I actually saw a Windows-centric guy do that once as root while he was learning Linux. The look of horror on his face as the entire box began to delete itself was hilarious...)

Re:what about? (1)

Penguinisto (415985) | about a month ago | (#47333505)

Oh, thought of another one, just to mess with other admins:

# chattr +i /*.*

Question... -- ? (4, Informative)

Marneus68 (2329096) | about a month ago | (#47332489)

After years of using command line programs daily I never heard of -- before today. It was never brought up in school, nor did I see any specific thread / blog post on the subject. So to answer your question, I don't. I've never heard about that before. Where did you learn about that ?

Re:Question... -- ? (2)

drinkypoo (153816) | about a month ago | (#47332523)

Where did you learn about that ?

RTFM[anpages.] It's literally in the system documentation. Granted, not all commands have such an option. Knowing which do is your responsibility. Arguably, all commands should have such an option.

Re:Question... -- ? (1)

Marneus68 (2329096) | about a month ago | (#47332603)

I just checked, and indeed. Every application made with getopt implements this mechanism. It's good to know.

Re:Question... -- ? (5, Interesting)

locofungus (179280) | about a month ago | (#47332621)

Back in the (iirc) bsd 4.2 days, su was a suid shell script - at least on the machines I was using at the time.

Set up a symlink to su called -i

$ -i
# rm -- -i
#

There was a security bug handling suid shell scripts where the user was changed and then the #! interpreter was run, i.e. /bin/sh -i

and you got an interactive root shell :-)

Was very informative when the 'script kiddies' (although I don't recall that term existing in those days) had symlinks called -i in their home directory that they didn't know how to delete ;-)

Re:Question... -- ? (1, Informative)

PIBM (588930) | about a month ago | (#47332635)

Are you running commands as root on stuff when you don't know where it comes from?

If you absolutely have to run the query in a folder into which someone else has upload/file-creation rights, then at least use ./* if the tool doesn't support --.

This is not an issue if you work recursively on the directory holding whatever they want, which should cover most situations. A bad admin can run queries that are just as dangerous, or worse, pretty fast!

Always be cautious when running someone else's shell script; that's even more dangerous.

Re:Question... -- ? (1)

Penguinisto (415985) | about a month ago | (#47333479)

Always be cautious when running someone else's shell script; that's even more dangerous.

If you aren't capable of auditing an untrusted-sourced script before you run/use it, then don't run it or use it.

Seriously.

I grab (cut+paste) script bits from online when I'm lazy, but I always take the time to audit the chunk of text and ensure that it doesn't do anything dumb before I use/incorporate it. Doing this gives me two benefits: first, I can double-plus ensure that it doesn't do anything I don't want it to, and second, I learn a bit about the person who wrote it (and in some cases, I discover a trick or bit of info that I didn't know before).

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47332645)

You've never had a stupid program crash and create a file named "--" or something similar in its working directory? Now try to remove the file without knowing about the "--" command line options.

Re: Question... -- ? (2, Informative)

Anonymous Coward | about a month ago | (#47332683)

rm ./--

Re: Question... -- ? (1)

mooingyak (720677) | about a month ago | (#47332977)

The painfully obvious solution I never thought of....

Re:Question... -- ? (4, Informative)

hawguy (1600213) | about a month ago | (#47332705)

You've never had a stupid program crash and create a file named "--" or something similar in its working directory? Now try to remove the file without knowing about the "--" command line options.

rm ./--

Re:Question... -- ? (1)

TheCarp (96830) | about a month ago | (#47332827)

ROTFL, I never even thought of passing it as a path name!
It's amazing how the solution that is right there can be so hard to see. Like many, I found out about -- years ago, after swearing at my terminal for a while before resorting to reading the rm man page.

In fact, if your script generates path names instead of changing the working directory, it solves the entire original issue as well.

I might start using ./ a lot more now.
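To illustrate the path-name point (the directory here is hypothetical):

        # instead of: cd /srv/upload && rm -f *
        # expand with a prefix, so no expanded name can ever start with "-"
        rm -f /srv/upload/*        # absolute prefix
        rm -f ./*                  # or a relative one, if you are already in the directory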

Re:Question... -- ? (5, Funny)

Anonymous Coward | about a month ago | (#47333081)

I might start using ./ a lot more now.

So, you learned about ./ on /.?

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47333123)

/. too

Re:Question... -- ? (4, Funny)

TangoMargarine (1617195) | about a month ago | (#47333357)

after swearing at my terminal for a while before resorting to reading the rm man page.

I find that half the time the swearing comes after trying to read the man page. Then it's time to fire up the old Google...

Re:Question... -- ? (0)

Chris Mattern (191822) | about a month ago | (#47333561)

I might start using ./ a lot more now.

I suppose you could say you learned about ./ on /.

Re:Question... -- ? (1)

Anonymous Coward | about a month ago | (#47332703)

You went to school to learn bash? Is this community college? Doing.it.wrong.

Re:Question... -- ? (2)

TheCarp (96830) | about a month ago | (#47332731)

About as I would expect. A fellow admin and I were recently talking about the disown command, and how after more than a decade on the job we are still finding out about commands that have existed since we were kids running around on the playground.

Most admins find out about -- after they run into a situation where they accidentally created a file with a name like "-f". Go ahead, try to delete a file named "-f" any other way.

It works in many unix commands; "--" is a very common "end of options" signal. Really, any command that doesn't have a good syntactic reason not to support it probably should. Many of the old ones do.

Most people I know who have learned shell learned it on the job in one way or another. There are often a lot of gaps. I did too; it took me a long time to get into the habit of proper quoting, escaping, etc., and this is definitely an easy one to miss.

Re:Question... -- ? (1)

Bazman (4849) | about a month ago | (#47333113)

Any other way? How about this way:

rm ./-f

Re:Question... -- ? (3, Informative)

fnj (64210) | about a month ago | (#47333191)

Most admins find out about -- after they run into a situation where they accidentally created a file with a name like "-f". Go ahead, try to delete a file named "-f" any other way.

rm ./-f

is the most dead-simple way of doing it and is portable to non-GNU-based systems, although even BSD has the double-dash option nowadays.

And there is always the old standby of insulating the operation from bash command line expansion:
perl -e 'unlink "-f";'

You could also, within a minute or so, write and compile a literally two-line C program to remove it. I don't understand the mystery.

#include <unistd.h>
int main(void) { unlink("-f"); return 0; }   /* unlink() takes the name literally; no option parsing */

Re: Question... -- ? (0)

Anonymous Coward | about a month ago | (#47333433)

Double dash is specified by POSIX and it sure as heck didn't originate with GNU. I never knew it was absent from early BSD, but I'm pretty sure it was in some later AT&T versions.

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47332759)

I learned about it when I configured Samba to share to Windows clients. The issue was that the customer had some files starting with dashes, and an older Red Hat distro couldn't handle the prefix (depending on the command used) unless I added -- before the final directory parameters.

Re:Question... -- ? (1)

Anonymous Coward | about a month ago | (#47332857)

You are the kind/level of admin who would never be granted access to critical systems. You don't learn Unix in school. You learn it by reading. And '--' is right there in sh(1). Of all the things you should read in Unix, that's the most important one, and you never read it. Neither did your 'teacher'. That's bad.

Re:Question... -- ? (1)

TangoMargarine (1617195) | about a month ago | (#47333395)

Can someone explain to me why all these program manpage references have e.g. "(1)" after them? Is it referring to a specific prototype with X number of arguments to invoke it or something? There aren't multiple programs in the PATH with the same name in a Linux install AFAIK.

Re:Question... -- ? (1)

mooingyak (720677) | about a month ago | (#47332967)

Can't speak for the OP, but I once accidentally created a file name '-r'. Trying to remove it eventually led me to discover '--', but I don't expect most people to know about it.

Re:Question... -- ? (2)

beh (4759) | about a month ago | (#47333133)

Sorry if that appears harsh - but sometimes it pays to read manuals and try to understand what you're doing and how the stuff works.

I don't exactly remember when I learnt it first - but I DID already know when I also got told about it during my CS BSc degree course (probably 1st or 2nd year - which would place it at about 1998-2000).

If you need to code stuff "securely", you need to understand how stuff works -- I don't think of myself as a particularly apt security coder or hacker - I mainly specialise in internal systems integration, not so much web or other front-end stuff, so I have the luxury that I already know the data is "sane" before it gets to me - and I "only" need to figure out how to transform it and where to send it on.

Here are a few pointers, where you might read about it:

http://pubs.opengroup.org/onli... [opengroup.org]
"Guideline 10:
        The first -- argument that is not an option-argument should be accepted as a delimiter indicating the end of options. Any following arguments should be treated as operands, even if they begin with the '-' character."

Even wikipedia mentions it - even though not strictly a "developer" resource:

http://en.wikipedia.org/wiki/C... [wikipedia.org]

"In Unix-like systems, the ASCII hyphen-minus is commonly used to specify options. The character is usually followed by one or more letters. Two hyphen-minus characters ( -- ) often indicate that the remaining arguments should not be treated as options, which is useful for example if a file name itself begins with a hyphen, or if further arguments are meant for an inner command. Double hyphen-minuses are also sometimes used to prefix "long options" where more descriptive option names are used. This is a common feature of GNU software. The getopt function and program, and the getopts command are usually used for parsing command-line options."

If that's too far to go - try "man getopt" on your linux machine:

  "
            The parameters getopt is called with can be divided into two parts:
              options which modify the way getopt will parse (options and
              -o|--options optstring in the SYNOPSIS), and the parameters which are
              to be parsed (parameters in the SYNOPSIS). The second part will start
              at the first non-option parameter that is not an option argument, or
              after the first occurrence of `--'. If no `-o' or `--options' option
              is found in the first part, the first parameter of the second part is
              used as the short options string.
"

man rm - and even rm --help on linux show it:
"
              To remove a file whose name starts with a '-', for example '-foo', use
              one of these commands:

                            rm -- -foo
"
...though without explaining the "--" in general...

man chown doesn't mention it, but refers to the full documentation in texinfo and how to access it - that one says under "Common options"

"
    `--'
          Delimit the option list. Later arguments, if any, are treated as
          operands even if they begin with `-'. For example, `sort -- -r'
          reads from the file named `-r'.
"

The information is there - and in _lots_ of places - but it DOES require occasionally reading man pages or general intros, rather than using trial and error and just bodging around until something seems to work.

But, yes, it's a lot of material, and not everyone has the time to read everything -- for me this is also why I mostly rely on others to figure out system security issues... The problem to me seems more that a lot of "learn this in 5 mins" type tutorials don't include it purely for lack of time, and many just use those and still put the results up on the web somewhere.
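If you want to watch the end-of-options marker do its job, here is a small sketch using bash's getopts builtin (the option letters and script name are made up):

        #!/bin/bash
        # demo.sh: parse -v and -f ARG; everything after -- is left as operands
        while getopts "vf:" opt; do
            case $opt in
                v) echo "option: -v" ;;
                f) echo "option: -f with argument: $OPTARG" ;;
            esac
        done
        shift $((OPTIND - 1))
        echo "operands: $*"

        # $ ./demo.sh -v -- -f
        # option: -v
        # operands: -f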

 

Re:Question... -- ? (3, Interesting)

Anonymous Coward | about a month ago | (#47333321)

That is B.S.

If someone reads that, they do not think security. They think it is an escape to deal with files that start with - and that is where they file it in their head. You also have to understand about '*' and think about how the two would work together.

This is exactly why computer code is insecure.

Question... -- ? (1)

agent59517795 (1423109) | about a month ago | (#47333179)

After years of using command line programs daily I never heard of -- before today. It was never brought up in school, nor did I see any specific thread / blog post on the subject. So to answer your question, I don't. I've never heard about that before. Where did you learn about that ?

RFM

Re:Question... -- ? (1)

Ocrad (773915) | about a month ago | (#47333225)

Where did you learn about that?

In the manual of any argument [nongnu.org] parser [gnu.org] .

Re:Question... -- ? (2, Insightful)

Anonymous Coward | about a month ago | (#47332493)

So why would the expected method not be the default? This is exactly how security problems are born.

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47332923)

If the default behavior of a command was --, then you couldn't pass argument flags.

Re:Question... -- ? (0, Offtopic)

Anonymous Coward | about a month ago | (#47332617)

Why are you using wildcards in a script processing publicly-accessible directories in the first place?

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47332709)

I never do what you suggest.
I have never written a script that needed "rm -i" for one thing.
But even ignoring that, I never remove "*" in a script. If I have a bunch of temporary files, I will create a temporary directory (TMP="/dev/shm/tmp_delete_me.$$.${RANDOM}") and put them in there. I then "rm -rf $TMP" (the temporary directory).
I have needed to use wildcards with some files before, but I usually do have more than just a "*" in there. For example, "rm -f *.stub" is more common.

Re:Question... -- ? (2)

jones_supa (887896) | about a month ago | (#47333157)

Use "find" to delete the files. This way you avoid all the wildcard bombs. Look in /etc/init/mounted-tmp.conf in Debian/Ubuntu for an example:

# Remove all old files, then all empty directories
find . -depth -xdev $TEXPR $EXCEPT ! -type d -delete
find . -depth -xdev $DEXPR $EXCEPT -type d -empty -delete
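For the simpler everyday case, a sketch of the same idea without the Debian-specific variables (it empties the current directory, so mind where you run it):

        # find hands every entry to rm with a "./" prefix, so no name can be mistaken for an option;
        # the extra "--" is belt and braces
        find . -mindepth 1 -maxdepth 1 -exec rm -rf -- {} +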

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47332747)

Also, normal programs should stop processing options after encountering the first non-option argument. GNU tools often seem to stray from this convention for whatever reason, probably in order to stay compatible with some flavors of UNIX.

Re:Question... -- ? (1)

Gunstick (312804) | about a month ago | (#47333033)

this is the main cause of the problem: getopt messes it up, or something else in GNU does

the old-school unix systems do not behave that way: first the options, then the files.
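On a GNU system you can see the difference, since the coreutils honour POSIXLY_CORRECT (a sketch, assuming a file named foo exists):

        $ ls foo -l                      # GNU default: arguments are permuted, -l is taken as an option
        $ POSIXLY_CORRECT=1 ls foo -l    # POSIX mode: parsing stops at "foo", so "-l" is looked up as a file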

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47333377)

If the first file name starts with a dash, how do you know it's a file name?

Re:Question... -- ? (1)

Anonymous Coward | about a month ago | (#47332811)

This is not an exploit, it's basic stuff any first-year sysadmin should know. You know... quoting, ./, --, -i, -inum, IFS, etc. The author's stupid Linux 'ls' doesn't even order lexicographically correctly. And dashes at the end of filename arguments are not interpreted as dash-opts; that's not even POSIX.
If you want to learn unix and blog your newfound knowledge, great... just don't try to call it an exploit or secadv, you'll just make yourself look stupid.

Re:Question... -- ? (1)

Gunstick (312804) | about a month ago | (#47333049)

ordered ls:
LANG=C ls

Re:Question... -- ? (1)

PacoSuarez (530275) | about a month ago | (#47332961)

While that is indeed the solution, it is also true that it is too easy to forget. Perhaps one could modify all commands to require the use of the "--" separator, or to warn if it's not present, at least if some environment variable is set. That could be very helpful for people trying to write more secure code.

Re:Question... -- ? (2, Insightful)

Anonymous Coward | about a month ago | (#47332965)

"the security risk is the developer that doesn't know what he's doing."

Not the hacker who does know what he is doing.

Re:Question... -- ? (2)

godrik (1287354) | about a month ago | (#47333005)

Nope, you cannot just use --, because many commands do not understand --.

Here is an article by dwheeler (a frequent slashdotter, often cited for his technique countering the trusting-trust problem) about filenames.
http://www.dwheeler.com/essays... [dwheeler.com]

I believe he is mostly right. We should move to file systems that do not allow "stupid" names and be done with it.

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47333039)

This doesn't solve the problem completely, but handles most cases of abuse. Remember, in Unix you can have a pipe in a filename. So the wildcard could activate a pipe to a script that ignores your initial input and executes nasty things.

Re:Question... -- ? (1)

Gunstick (312804) | about a month ago | (#47333131)

No, it does not.
The shell is not the one doing the -rf.
But the shell is what handles pipes; rm does not do pipes.
So a pipe character in a filename will not open a pipe.
And I also tested it, to be sure of what I say.

Re:Question... -- ? (0)

Anonymous Coward | about a month ago | (#47333185)

Anything can be put in a context such that it's not a security risk. For example, Heartbleed isn't a security risk unless you expose it to untrustworthy people.

But that's just it -- developers aren't trustworthy in this scenario. I cut my teeth on Unix decades ago and I didn't know of the "--" feature. And even then, not everything uses getopt. If I can't figure out how to use a feature securely, directly or through software others have created, then for me it's insecure. And it sounds like for most people who use wildcards it's insecure, too.

Lets quote FD while we're at it (5, Informative)

Anonymous Coward | about a month ago | (#47332439)

posting the answer to this useless story that was posted to FD

Date: Thu, 26 Jun 2014 12:55:42 -0700
From: Michal Zalewski

> We wanted to inform all major *nix distributions via our responsible
> disclosure policy about this problem before posting it

I'm not sure how to put it mildly, but I think you might have been
scooped on this some 1-2 decades ago...

Off the top of my head, there's a rant about this behavior in "The
Unix-Haters Handbook", and there are several highly detailed articles
by David Wheeler published over the years (e.g.,
http://www.dwheeler.com/essays/filenames-in-shell.html).

Yup, it's a counterintuitive behavior that leads to security problems.
The odds of changing the semantics at this point are very slim. Other
operating systems have their own idiosyncrasies in this area - for
example, Windows is not a lot better with parameter splitting and
special filenames.

/mz

Re:Lets quote FD while we're at it (2, Interesting)

gweihir (88907) | about a month ago | (#47332657)

It may be counter-intuitive for people who have very little experience with a UNIX command line. Everyone else has at some point run into the issue of being able to create, but not easily delete, a filename like "-v". But people with very little UNIX command-line experience have zero business writing security-critical software that uses the command-line tools!

This is a complete non-issue. Incompetent people will usually screw security up and this is just one of the countless ways to do it.

Re:Lets quote FD while we're at it (0)

Anonymous Coward | about a month ago | (#47333045)

But people with very little UNIX command-line experience have zero business writing security-critical software that uses the command-line tools!

The problem is that bugs with quoting are non-obvious and turn things that should definitely be trivial and secure into potential security problems. And it's not just scripts; it's possibly everyday use of the shell that may get subverted.

Just consider all the Cross Site Scripting flaws, they are just another flavor of the same problem.

Re:Lets quote FD while we're at it (1)

Gunstick (312804) | about a month ago | (#47333067)

I just want to state that UNIX does not behave like that; it's GNU that does.

If only this was a Microsoft issue. (0, Flamebait)

jellomizer (103300) | about a month ago | (#47332451)

If it were a Microsoft issue, this would be seen as a bug and not a feature.

Linux/Unix follow an old OS design. Some aspects of its main way of doing things do not work in today's environment, which demands much more security.

Things have been upgraded: Telnet replaced with SSH, hacks on FTP to make it more secure. But the underpinning is still there, from back in the day when computers just needed to get things done.

Re:If only this was a Microsoft issue. (2, Interesting)

Anonymous Coward | about a month ago | (#47332597)

There is one great evil that Unix let into its filesystems long ago, one which Apple (which loves to generate or perpetuate evil) put into its filesystem and that Microsoft later allowed because it was expedient to align with earlier Apple practice: spaces in file names. If we forbade spaces as well as control characters, things would be much better.

Re:If only this was a Microsoft issue. (1)

jones_supa (887896) | about a month ago | (#47332701)

I still hate the trickery I always have to put into my scripts just to deal with spaces in filenames.

find /my/files -print0 | xargs -0 do_some_stuff

Re:If only this was a Microsoft issue. (1)

itzly (3699663) | about a month ago | (#47332713)

With zsh you can type: do_some_stuff /my/files/**/*

Re:If only this was a Microsoft issue. (1)

psmears (629712) | about a month ago | (#47333371)

With zsh you can type: do_some_stuff /my/files/**/*

... provided that the number of files fits into the command line argument space (a common reason for using find/xargs rather than, say, wildcards/backticks, aside from the security issues).

Re:If only this was a Microsoft issue. (0)

Anonymous Coward | about a month ago | (#47332863)

Why are spaces bad? All it needs is an escape sequence.

Re:If only this was a Microsoft issue. (1)

gweihir (88907) | about a month ago | (#47332605)

That is complete BS. Preventing users from doing things they legitimately want to do is not a valid approach to securing untrusted interfaces. The valid way is to sanitize the untrusted input before using it, and only a complete moron will pass a wildcard from an untrusted source, unless it cannot do any harm where it is going.
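A minimal sanitisation sketch in that spirit ("process" stands in for whatever the script actually does with each file):

        for f in *; do
            case $f in
                -*) printf 'refusing suspicious filename: %s\n' "$f" >&2; continue ;;
            esac
            process -- "./$f"      # prefix and -- as extra safety even after the check
        done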

Re:If only this was a Microsoft issue. (1)

beelsebob (529313) | about a month ago | (#47332717)

I'm pretty sure you'll have a hard time trying to find a user who legitimately wants to pass arguments to command line tools by naming a bunch of files according to those arguments ;)

Re:If only this was a Microsoft issue. (0)

Anonymous Coward | about a month ago | (#47333051)

Wildcard expansion happens in the shell, not in the program.
The program doesn't know if a particular argument came from a wildcard expansion or from somewhere else.

Re:If only this was a Microsoft issue. (0)

Anonymous Coward | about a month ago | (#47332979)

You don't seem to understand shell wildcard expansion very well. The wildcard is not passed to the invoked application.

The underlying applications are not broken, as they have no concept of the user's shell. They take a list of arguments and process those arguments.

Expansion of a wildcard occurs within the shell, and the shell has no pre-wired knowledge of how an application should work. I.e., it does not know that rm cannot determine that a dash-prefixed argument is really a file name. It simply passes the arguments on to the application. It does not matter that strange and powerful shell magic was used to create those arguments... i.e., *
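A two-line illustration of that point (hypothetical filenames; /bin/echo is just a stand-in for any tool):

        $ touch -- a.txt -n
        $ /bin/echo *          # the shell passes: /bin/echo -n a.txt
        a.txt$                 # GNU echo took "-n" as "suppress the newline", not as a file name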

Re:If only this was a Microsoft issue. (1)

gfxguy (98788) | about a month ago | (#47332787)

Maybe, but I don't think so. First of all, for this "exploit" to have been around so many years, it's interesting how I've never heard of it actually being used to hack or vandalize a system... second, if someone is already able to write arbitrary filenames, they're already into your system; if it's a normal user, you'd be able to track down who it is... it just seems like a really "weak" exploit, if I'd call it an exploit at all. IOW, IMO, nothing to see here.

Re:If only this was a Microsoft issue. (1)

TangoMargarine (1617195) | about a month ago | (#47333469)

Linux/Unix are an old design of an OS.

Old does not always equal worse. You ever hear the saying, "Those who do not know UNIX are doomed to reimplement it, poorly?" Similar to how rewriting code from scratch is very rarely the correct approach because you'll make a lot of the same mistakes over again.

nothing really new (1)

Anonymous Coward | about a month ago | (#47332497)

While the paper is an interesting writeup, there's nothing really new in there. A colleague used to exploit the same issue "for good," by touching a file named "-i" in directories he deemed important. Obviously, one could undo that by touching a file named "--". For users, I'd usually recommend always using ./* instead of just *, as well as directory/. instead of just directory
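Spelled out, the defensive trick looks like this (pick your own important directory):

        $ cd /some/important/dir
        $ touch -- -i          # a careless "rm *" now expands to "rm -i <other files>" and prompts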

User supplied input problem (0)

Anonymous Coward | about a month ago | (#47332543)

The root issue boils down to user supplied input not being checked.

PowerShell (1)

jones_supa (887896) | about a month ago | (#47332577)

Is the wildcard expanded by the shell in PowerShell?

Re:PowerShell (1)

The MAZZTer (911996) | about a month ago | (#47332649)

I assume from this article that Linux replaces * with filenames before the command sees it. AFAIK in DOS/Windows the wildcard is handled by each specific command: dir * displays the same listing as just plain dir does, while passing dir a bunch of directory names will display the contents of those directories (like ls does... I guess that explains that behavior! It always confused me). PowerShell, at least as far as Get-ChildItem goes, seems to work the same way as dir (except it does not take multiple directory names as parameters).

Re: PowerShell (0)

Anonymous Coward | about a month ago | (#47332767)

What if there's an actual file named *?

Re: PowerShell (1)

by (1706743) (1706744) | about a month ago | (#47332927)

> echo you need to use an escape sequence > \*
> cat \*
you need to use an escape sequence
> rm -i \*
rm: remove regular file ‘*’? y
> echo $SHELL
/usr/bin/zsh
>

Re: PowerShell (1)

ray-auch (454705) | about a month ago | (#47333015)

* and ? are illegal characters in Windows filenames, which prevents this. As is /, which is used to indicate parameters in the Windows command prompt (DOS style), which effectively means that the style of attack in TFA doesn't work. Except maybe for unix (GNU, cygwin etc.) apps on Windows which use "--" to indicate command options; "--" is allowed in Windows filenames, thus porting this Unix bug/hole/feature to Windows.

And of course Windows has other idiosyncrasies. Nothing is perfect.

Re:PowerShell (1)

Dishevel (1105119) | about a month ago | (#47333273)

What I like about PowerShell is that it runs on Windows. Windows really sucks at command line.

Re:PowerShell (2)

jones_supa (887896) | about a month ago | (#47333343)

Well, yeah. The object-oriented approach is pretty clever, for example. You don't have to sweat over spaces in file names breaking your scripts, and things like that.

Incompetent people will always mess things up... (2, Interesting)

gweihir (88907) | about a month ago | (#47332581)

Really, this is well-known, non-surprising and will not happen to anybody with a security mind-set. Of course it will happen in practice. But there are quite a few other variants of code injection (which this basically is) that the same people will get wrong. Complete input sanitisation is mandatory if you need security. I mean, even very early Perl-based CGI mechanisms added taint-checking specifically for things like this. If people cannot be bothered to find out how to pass parameters from an untrusted source securely, then they cannot be bothered to write secure software.

The fix is not to change the commands. The fix is to replace people who mess up something this elementary with people who actually understand security. Sorry, you cannot have developers who are cheap and competent at the same time, and even less so when security is important.

Re:Incompetent people will always mess things up.. (0)

Anonymous Coward | about a month ago | (#47332773)

Typical Linux user reaction: "you're holding it wrong!"

At least Microsoft takes these things seriously.

Re:Incompetent people will always mess things up.. (0)

Anonymous Coward | about a month ago | (#47333439)

At least Microsoft takes these things seriously.

Was this supposed to be a joke, FUD, or BS ?

Re: Incompetent people will always mess things up. (3, Insightful)

Anonymous Coward | about a month ago | (#47332893)

Wake up. Not everyone is a developer. Not everyone has even 2 minutes of unix philosophy.

My Users are scientists, and they get to trash their home space here. These types of issues are most likely to happen when they are writing a script and it makes files for what should have been options.

My job isn't to teach them unix, it's to keep them happy and productive. They make mistakes, I clean them up and help them through the frustration of things going wrong.

He could have researched a bit harder. (5, Interesting)

quietwalker (969769) | about a month ago | (#47332613)

I remember reading about this in the 1991 O'Reilly release of "Practical Internet and Unix Security". I'm pretty sure they even gave examples. They also laid out a number of suggestions to mitigate risk, including not specifying the current path, ".", in the root user's path so they must explicitly type the location of an executable script, and so on.

They also pointed out that some well-behaved shells eliminate certain ease-of-use-but-exploitable features when they detect that a privileged user is running them, and even on systems where that's not the standard, the default .bashrc or equivalent files often set up aliases for common commands that disable features like wildcard matching or color codes (which could be used, if you're very tricky, to match a filename color to the background color of the screen, among other things), the path restriction listed above, and many, many others.

It's really hard to secure shell accounts on systems, no matter how you try. Is this article just proof that the current generation of unix admins is rediscovering this? Should I be shaking my fist and telling the kids to get off my lawn? This was old news over two decades ago.

Re:He could have researched a bit harder. (1)

FudRucker (866063) | about a month ago | (#47333165)

To turn off wildcard expansion permanently: use the 'noglob' option. To set the 'noglob' option, execute the command below at the bash shell: set -o noglob. More often, the requirement is to turn off pathname expansion temporarily. This is especially useful if a wildcard is part of an argument to a program. http://ksearch.wordpress.com/2... [wordpress.com] Use set -f in such cases; execute 'set -f'. To reset the wildcard expansion property of the bash shell, execute 'set +f'.
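What that looks like in practice (hypothetical files):

        $ echo *
        file1 file2
        $ set -f               # or: set -o noglob
        $ echo *
        *
        $ set +f               # re-enable pathname expansion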

Stupid headline, stupid article, stupid ... (0)

Anonymous Coward | about a month ago | (#47332735)

The article reveals a deep lack of thought. It might just as well have been headlined, "exploiting command lines". I mean, for goodness sake, this is a deliberate and oft-needed, oft-used feature. If you can't handle it then stop using computers because you lack the necessary skills.

-- david newall

Sanitize crazyness (1)

watermark (913726) | about a month ago | (#47332795)

I understand why this works and I understand the need to sanitize user input, but this is dumb, even if there are workarounds. It's obvious what the intent of "tar cf archive.tar *" is supposed to be; it shouldn't be treating file names as additional arguments. Anyone actively using this "feature" for anything legitimate is dumb too.

This seems very similar to the whole "we need some other language than C" argument. Sure, you *can* make secure code with zero overflow vulnerabilities, but damn near all software has them. You can only blame the user/coder for so long for doing something "wrong", but when 90%+ of people are doing it "wrong" then you probably need to change how the thing works.

Re:Sanitize crazyness (2)

itzly (3699663) | about a month ago | (#47332851)

It's obvious what the intent of "tar cf archive.tar *" is supposed to be; it shouldn't be treating file names as additional arguments

The problem is that the * expansion is done by the shell, and the shell doesn't know the difference between file names and arguments.
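For reference, the GNU tar variant usually cited for this class of problem relies on exactly that. --checkpoint and --checkpoint-action=exec are real GNU tar options; the files below are a hypothetical attacker-created setup, shown as a sketch, not a recommendation:

        $ touch important.txt evil.sh
        $ touch -- '--checkpoint=1' '--checkpoint-action=exec=sh evil.sh'
        $ tar cf archive.tar *
        # the shell expands *, so tar sees the two --checkpoint arguments as options
        # and runs "sh evil.sh" when the first checkpoint fires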

Re:Sanitize crazyness (1)

Gunstick (312804) | about a month ago | (#47333097)

no, the problem is with gnu tar...

it sees cf file file file --whatever
and it uses --whatever as an option

on unix (not linux) it also sees cf file file --whatever
and tries to put the file "--whatever" into the tar archive.

linux (gnu) broke stuff which worked for ages in the unix world

Re:Sanitize crazyness (1)

jones_supa (887896) | about a month ago | (#47333297)

Yes, but what if the "--whatever" happens to be the first file name in the list?

Re:Sanitize crazyness (1)

Anonymous Coward | about a month ago | (#47333053)

It's obvious what the intent of "tar cf archive.tar *"

Obvious to who? You? Bash, which turned that into tar cf archive.tar file1 file2 file3? Tar, which never saw the *?

User data to control commands (4, Insightful)

jones_supa (887896) | about a month ago | (#47332829)

Systems where user data can accidentally get mixed in control commands are dangerous. In addition to this shell trick, another example would be HTML, where you have to be careful to not let raw HTML data through your guestbook messages so that visitors can't inject HTML into the messages.

With competent and careful system administrators you can avoid problems, but it's still kind of a fragile design in my opinion.

I lost file ownership! What happened? (0)

Anonymous Coward | about a month ago | (#47332835)

WILDCARD Bitches! Yeeeeeeehaw!!

.

Wildcards are used on other OSes, as well. (1)

Rambo Tribble (1273454) | about a month ago | (#47332925)

This would seem a problem with universal implications, and one that largely depends on local access by the malefactor.

in root? Am I missing something? (4, Interesting)

gb7djk (857694) | about a month ago | (#47332943)

Er.. most of the exploits are only possible if one is root and/or the directory is writable for some other user (e.g. leon in this case).

Since one is root, one can do anything anyway so why bother with all this misdirection? If someone leaves world writable directories lying around (especially without the sticky bit set), then they deserve everything they get. Or is this some kind of "trap the (completely) unwary sysadmin" wake up call? If I see some strange named file (especially if I know I didn't put it there) I would investigate very, very carefully what is going on. I can't be alone in this - surely?

Re:in root? Am I missing something? (1)

phantomfive (622387) | about a month ago | (#47332989)

In addition, they're only possible to use as a privilege escalation exploit, not to gain entrance into the system in the first place. So this is mainly only useable on multi-user systems, of which there aren't very many anymore.

Re:in root? Am I missing something? (2)

jones_supa (887896) | about a month ago | (#47333103)

No, you don't need root access. Let's say that you are in a group called "students", which has R/W permission for /work/students and all its subdirectories. You are in directory /work/students, and you want to remove all the files from that directory. Now some wiseass has created a file called "-rf" and you unknowingly end up destroying all the subdirectories too. This happens because the shell expanded the asterisk, instead of the "rm" program. The "rm" program happily interprets the "-rf" as an argument, even though it was originally a file name.
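A quick reproduction of that scenario (the file names are hypothetical):

        $ cd /work/students
        $ ls
        -rf   homework1   old-results
        $ rm *             # the shell expands this to: rm -rf homework1 old-results
        $ rm -- *          # safe: "-rf" is removed as an ordinary file, and the directories are refused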

This just in! (1)

jlv (5619) | about a month ago | (#47332991)

Unpacking 'shar' archives via 'sh' considered dangerous.

linux problem NOT unix problem! (3, Interesting)

Gunstick (312804) | about a month ago | (#47332993)

This is because the linux commands do not respect what the manual says:
man rm...

rm [OPTION]... FILE...

but in reality it's rather:

rm [OPTION|FILE]...

whereas on other unix systems it works as expected: first the options, then the arguments
HP-UX
rm *
rm: DIR1 directory

Solaris
rm *
rm: DIR1 directory

So screw the GNU tools, they mess things up for the "old unix sysadmins"

Here is a nice linux/unix trap:
x=a
y="rm z"
f=$x $y

So you expect f to contain: a rm z
not really...
z: No such file or directory
so the rm actually was executed

f=$x is a per-command environment variable assignment, so $y becomes the command that gets executed...
And that one works on any unix/linux
Recently patched in chkrootkit (CVE-2014-0476)

Re:linux problem NOT unix problem! (0)

Anonymous Coward | about a month ago | (#47333247)

Typos in the bug description. Ironic

Re:linux problem NOT unix problem! (0)

Anonymous Coward | about a month ago | (#47333305)

> So you expect f to contain: a rm z

Wait, what? No. I wouldn't expect that at all. It's behaving correctly. If I wanted to concatenate like you're expecting, I'd probably
f=$(echo "$x $y")
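For what it's worth, a plain quoted assignment gets the same result without the subshell:

        f="$x $y"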

Re:linux problem NOT unix problem! (0)

Anonymous Coward | about a month ago | (#47333575)

So screw the GNU tools, they mess things up for the "old unix sysadmins"

Screw bell labs people who released UNIX non-free [youtube.com] .

Computers were conceived to execute user commands (2)

INT_QRK (1043164) | about a month ago | (#47333253)

...so wouldn't it be more accurate to say that computers, like bulldozers, can be dangerous in the hands of malicious, ill-informed, inattentive, or incompetent users? If you know of any of these archetypes, try to make them smarter, but don't allow them root privileges to anything taller than an ankle-high weed. Give them some locked-down version of Windows, without admin privileges, with lots of monitoring tools and features. Consider helmets, knee-pads and child safety locks.

What's this... the 90's? (0)

Anonymous Coward | about a month ago | (#47333445)

Who's the writer? Some "security" guy that recently got a shell somewhere?

Any Unix admin professional is already familiar with this.
It's a pain? Not for me. Practicality hugely overcomes danger IMHO, but it's impossible to change right now.

It's a basic feature of how unix shells work, and there's no way to change it. It's like complaining about the rain.
Maybe someone will invent something new and make it popular... Good luck with that.

Definition of idiot (1)

n0ano (148272) | about a month ago | (#47333447)

Let me check my dictionary for the definition of idiot:

1. n: A user, especially a superuser, who uses * as an argument without first checking what * expands into.
2. n: A user who leaves his directories world-writable so others can put random garbage in them.

The one-line summary for this story: bad things happen to people who use a command without knowing what the command does.

root (0)

Anonymous Coward | about a month ago | (#47333539)

First they came for our Shell.
Then they came for our Emacs.
Now they have come for our *.
No, I shall stand! They will not take our noble asterisk.
