Linux

Reminiscing Old School Linux

t14m4t writes "While the Linux experience has improved dramatically over the years (remember the days of Kernel version 2.0? or even 1.2?), Tech Republic revisits some of the more-fondly-remembered artifacts of the Linux of years past. From the article: 'Of all the admin tools I have used on Linux, the one I thought was the best of the best was linuxconf. From this single interface, you could administer everything — and I mean EVERYTHING — on your Linux box. From the kernel on up, you could take care of anything you needed. With the dumbing down of the Linux operating system (which was actually a necessity for average user acceptance), tools like this have disappeared. It’s too bad. An admin tool like this was ideal for serious administrators and users.'"
  • by intellitech ( 1912116 ) * on Wednesday March 02, 2011 @08:52PM (#35364092)

    When anyone thought of the operating system, they thought of Linus.

    As a casual linux user, I believe it to still be the case, regardless of what your fluff might say.

  • the best of the best was linuxconf

    What's that? Something to replace vi with a GUI?

    (15 years experience as an admin, never came across it)

    • Re:what? linuxconf? (Score:5, Informative)

      by rwade ( 131726 ) on Wednesday March 02, 2011 @09:03PM (#35364186)

      Linux conf: http://tinyurl.com/4jfae7f [tinyurl.com]

      From wikipedia:

      Linuxconf is a configurator for the Linux operating system. It features different user interfaces: a text interface, a web interface and a GTK interface. Currently, most Linux distributions consider it deprecated compared to other tools such as Webmin, the system-config-* tools on Red Hat Enterprise Linux/Fedora, drakconf on Mandriva, YaST on openSUSE and so on. Linuxconf was deprecated from Red Hat Linux in version 7.1 in April 2001.

    • Yeah, that was weird. In one bullet point the author misses the days when Linux was "hard" to install, and in another he misses tools like linuxconf. No UNIX admin needs configuration tools to do his/her job. All you need is vi.
  • ...dialing into the University of Helsinki BBS line to download early Linux disk images. Horrendous international calling fees.

  • I must agree - that was a fantastic tool. I remember being upset when it disappeared (Red Hat 8 dropped it I believe?).

    • Re:Linuxconf (Score:5, Insightful)

      by nhaines ( 622289 ) <nhaines@@@ubuntu...com> on Wednesday March 02, 2011 @09:04PM (#35364194) Homepage

      I never had the pleasure of using it. However, making things easier in Linux isn't "dumbing down" the operating system. It's simply making things more accessible. Done properly, the fancy GUI stuff just snaps together with the existing CLI and config file stuff and then you get to choose the most appropriate way to manage and configure your system. That's a win for absolutely everyone.

      And that's what will keep Linux competitive--the ability to meet novice computer users alongside having the power and the efficiency for die-hard CLI lovers.

      • by armanox ( 826486 )

        I agree with that. There is a difference between making things easier and dumbing them down. I'm all for making things easier, but I can't stand it when something is dumbed down.

      • Re:Linuxconf (Score:4, Informative)

        by burne ( 686114 ) on Wednesday March 02, 2011 @09:09PM (#35364272)

        And that's what will keep Linux competitive--the ability to meet novice computer users alongside having the power and the efficiency for die-hard CLI lovers.

        Don't worry. linuxconf is every bit as capable as vi or emacs with regards to fucking up your fresh linux install. It's the user, not the interface, who makes the mistakes.

      • Re:Linuxconf (Score:4, Insightful)

        by bhcompy ( 1877290 ) on Wednesday March 02, 2011 @09:16PM (#35364350)
        Depends on what you consider easier. IIS 6 to IIS 7 decentralized an assload of config from a handful of possible locations to dozens of different applets, advanced settings sidebars, etc., and most of it isn't very descriptive as to what settings may be contained within. Not having used Linuxconf, I can't say for sure whether what replaced it was better, but changing something for the sake of changing it isn't always good.
    • I don't consider myself an old-school Linux user, but maybe I'm just in denial, because I remember Linuxconf fondly. I came to Linux a few years before Why-Too-Kay with a little experience as a Unix user but mostly as a WinDOS tech, and Linuxconf was an invaluable set of training wheels as I learned to set up Apache and Sendmail and BIND on the first *n*x system where I had root. I mostly use vi these days (and Webmin for daemons that I'm not as familiar with), but if it weren't for Linuxconf to get me ro
    • The true old-school admin tool was vi (or ed even). Linuxconf was already "dumbed down" in many veteran's eyes.

  • Old School (Score:5, Informative)

    by clang_jangle ( 975789 ) on Wednesday March 02, 2011 @09:00PM (#35364166) Journal
    On behalf of the many gentoo, arch, and slackware users, I'd like to point out that "old school Linux" is alive and well and more capable than ever, thanks.
  • "Dumbing Up" (Score:5, Insightful)

    by KingSkippus ( 799657 ) on Wednesday March 02, 2011 @09:01PM (#35364168) Homepage Journal

    I absolutely abhor the phrase "dumbing down" when used in this context.

    Linux used to be something used by a tiny minority of people who were primarily interested in hard-core computer science testing and research. It was their playground in which they could work their art. Making it more user-friendly has put it into the hands of people who are brilliant in other ways so that they can work their art. Are you a graphics guru? A UI wiz? A scripting genius? A music prodigy? A 3D design master? A business star? A poet laureate? If so, then Linux is now for you, too!

    It hasn't been "dumbing down" anything. If anything, it has been dumbing up--more and more people using it in smarter and smarter ways.

    And the beauty of the situation? If you're a hard-core computer scientist wanting to do testing and research with new stuff, it's still there for you, too [kernel.org].

    • Re:"Dumbing Up" (Score:4, Insightful)

      by Baseclass ( 785652 ) on Wednesday March 02, 2011 @09:09PM (#35364266)
      poppycock! I don't need your fancy schmancy graphical user interfaces and widgets.
      Back in my day we did everything via CLI and we liked it..
      Lynx is still the best browser out there.
      Now get off my lawn!
      • by 19061969 ( 939279 ) on Wednesday March 02, 2011 @11:17PM (#35365316)

        Lynx? You pussy. I use wget and parse the HTML with my *eyes* cos I'm so hard and cool and geeky, so git awf my lawn you nancy-boy and prance around with your new iPad somewhere else.

        Eeeh, nobbut like when I were young though, I used tut paper cards and tape... Eee, it were grand!

    • by r6144 ( 544027 ) <r6k&sohu,com> on Wednesday March 02, 2011 @09:36PM (#35364492) Homepage Journal

      Command-line tools usually have very well-documented configuration files, and even when they break, debugging is relatively easy.

      Now we often have configuration files (e.g. font configuration and internal stuff used by many GUI applications) spread over many poorly-documented locations. If the GUI is not enough or is buggy, which is often the case, it is quite hard to diagnose the issue even for an experienced user like me.

      After all, it usually takes much more work to design and program an acceptable GUI than a CLI with similar usability, at least for frequently-used software and users who can either type fast or do simple scripting. Developer time is scarce, so GUI tools are bound to lag behind in features, stability, usability, etc., and the world is complicated enough that a lot of effort is still needed to make things work at all.

    • Re: (Score:2, Interesting)

      I have 10+ years of experience as a Unix sysadmin, and that article was a serious WTF.

      1 - Linuxconf is anything but old school. I am old school, and I rarely leave my emacs session. Linuxconf was a dumbed-down, braindead tool, one of many. Certainly not old school.
      2 - Computing is always a challenge; if he has lost that, it's because he stopped looking for new challenges, or maybe all he wanted was a working printer. In any case, I find more challenges now when I have to use one of the automatic-for-the-peop

    • Re:"Dumbing Up" (Score:5, Insightful)

      by Blakey Rat ( 99501 ) on Wednesday March 02, 2011 @10:11PM (#35364764)

      My personal philosophy is that the instant you hear the term "dumbing down" you can ignore the speaker. They don't have any valid points to make.

      "Dumbing down" is just saying, "I don't like this, but I haven't bothered to spend any time figuring out why." With a side-order of "oh and I'm smarter than all of you."

    • by IICV ( 652597 )

      That's the main thing people don't seem to understand about computer science as it is practiced in the real world.

      We're not about making awesome computer stuff for the goal of making awesome computer stuff. We're, essentially, about making the most fantastically flexible and capable tools human kind has ever invented.

      But what good is a tool that requires six fingers to use?

      Personally, I think that in the future "computer science" won't really be a separate field of endeavor - like walking or throwing a ball

  • I don't miss the "challenge" one bit. If you're up for a challenge there are plenty of barebones and expert-friendly distros out there to cut your teeth over. However, things have progressed enough that if you're not prepared to use up what little free time you have tinkering around with shit to get it to work, we now have a lot more friendly options for people who want to actually USE their computers to do something useful.

    • I gotta agree -- Linux can be as simple or flexible as you want it to be. It's just a matter of your choice of distribution. This guy's post seems to be more a lament for how simple life used to be. As in, he used to have time to screw around with linux all the time -- now he has to spend his time actually producing, rather than having an excuse to tinker...

      • And that is the true reason behind the story.

        Every thing is always better way back when. When women stayed in the kitchens, your slaves did your gardening, and you had hours and hours of free time to do nothing but smoke weed, drink, and talk about the good old days.

        • Hey, believe me: nostalgia is my friend. I recently bought a new computer and dual-boot windows and FreeBSD on it. Frankly, I have no reason to have FreeBSD. I'm not a developer or system administrator and I find web browsing in the Unix environment to be a pain in the neck -- flash crashes the browser, etc.

          The only reason that I ever installed Linux in the first place was because I had a computer without a license and could not afford to buy Windows 95. If that computer had a working OS installed, I never

  • by thomasdz ( 178114 ) on Wednesday March 02, 2011 @09:04PM (#35364196)

    I downloaded the boot & root FLOPPIES and that's how I got online with Linux back in 1992

    http://groups.google.com/group/comp.os.linux/browse_thread/thread/3e0f1f1f1e33e1fe/a4f297acaa54597e?hl=en&q=dzubin+linux#a4f297acaa54597e [google.com]

  • V0.12 was the first version of Linux that I had played with... a full installation with all kinds of stuff fit on something like 6 3.5" floppy disks.
  • by bored ( 40072 ) on Wednesday March 02, 2011 @09:09PM (#35364282)

    Linuxconf was cool, but it had some major holes. I'm here to tell you that Yast, by the nature of having far more modules, is a _MUCH_ better solution.

  • The tree of life
  • linuxconf broke more than it fixed. I had only tried it a handful of times at the urgings of other enthusiasts. I hated having to undo all of the errors it would make on my machine. The idea was great, but I still think it's just not possible to make a one-size, be-all-to-beat-all admin tool for every distro without messing something up somewhere.

    I see selecting a linux distro to be kind of like getting married. Sure there are plenty of general rules of thumb that can help you achieve a successful marriage. B

  • by digitalsushi ( 137809 ) <slashdot@digitalsushi.com> on Wednesday March 02, 2011 @09:17PM (#35364362) Journal

    I'm probably the person on slashdot who has used linux the longest... yes, redhat goes all the way back to 5.2. I remember learning about NAT when splitting the ethernet with a Y jack didn't get me two internets (I expected a little fade, was all.) RadioShack didn't sell Ethernet signal boosters at the time.

    I always get a little upset when someone tells me they are "an expert" at linux, and then tell me they use an old distro full of security holes. A modern ubuntu is going to have way better security because it's new. Further, older linux kernels actually cause damage to the internet with trace levels of malignant packets, from protocols days gone by. http 1.0 is a common example of this, consider the fleets of cloud servers running web 2.0 that have to strain with a hefty http 1.0 connection from a netscrape 4.0 web browser on linux 5.0.

    I am glad that threads like this raise awareness ... I just hope that some people reading this post realize that, even though they have been a linux user for 25, 30 years, maybe just maybe they missed a few boats on the way. Most experts are not even running 2.6 kernels yet, which support IPv6 router advertisements. These RAs, as they are called, will configure the new Internet rapidly and I pray linux experts are not left in the dust when they don't get their autoconf info.

    • Re: (Score:2, Funny)

      by Anonymous Coward

      [notsureifserious.jpg]

    • The people who have used Linux the longest on slashdot would surely remember a time before Red Hat, let alone a late version like 5.2.

  • In the 1.2 kernel days, there was a really irritating bug that took forever to get fixed. The problem was that it would often not let you shut the system down or reboot until you deleted a file: /etc/shutdownpid

    Very strange, but knowing that little factoid certainly impressed some people who actually knew a lot more about Linux than I did. :)

    • by Rufty ( 37223 )
      Used to freak out people when I did # cat vmlinuz > /dev/fd0 to make a boot disk.
  • Seriously?

    So what does that make me? I switched to FreeBSD from Linux before linuxconf even existed.

    I'm pretty sure the guy writing it has no clue what 'old school' Linux actually was; he just seems to want something obscure and hard to use. Sounds more like a recently added fanboy than a long-term user.

    Do you remember Linux BEFORE X worked on it, let alone before anything like GTK/KDE was a glimmer in someone's eye?

    • You're the guy that says that Rock and Roll stopped in the 70s. Give it a rest. Linux in 1997 is old-school, believe it or not. If you really think that Linux -- or FreeBSD, for that matter -- is the same as it was in 1997, you're not paying attention.

    • Hard to imagine that with such a high UID.

    • Well not quite. My first experience with Linux was with Slackware 3.0 in 1996. That install program was horrid: one screw-up and you had to restart from the beginning because there was no 'back' button!

      Then there was getting X11 running. xf86config was a scary program to run, with lots of scary warnings about frying monitors if you selected the wrong modeline settings. GTK/KDE? ha... try fvwm. I remember Walnut Creek's ads showing impressive X screen shots; never could reproduce them. Kernel config was a
    • by Burdell ( 228580 )

      Yeah, I remember hearing about linuxconf, looking at it, and running away screaming. What a pile of crap (the source of mangled sendmail configs that got Linux a bad name in any sendmail newsgroup or mailing list).

      My first Linux boot was when H.J. Lu managed to fit a kernel and the root filesystem on a single 1.44 MB floppy for the first time. I remember TAMU, installing SLS (yeah, not so "soft" of a landing!), and then when Slackware was just a rip-off of SLS.

  • by DragonHawk ( 21256 ) on Wednesday March 02, 2011 @09:26PM (#35364428) Homepage Journal

    I miss not having 42 daemons running in the background to do stuff that could simply be a library or utility loaded/run when needed.

    I miss having the init system being a robust, straight-forward process of calling shell scripts in sequence.

    I miss only needing to reboot for kernel updates [launchpad.net].

    I miss having one sound subsystem that never worked, rather than countless sound daemons which never work.

    I miss having my immediately-after-logon process list fit in a single 80x25 terminal window.

    I miss not having everything complain that DBUS isn't running.

    I miss the Unix philosophy [wikipedia.org].

    It seems like Linux is just as good as MS Windows these days. Too bad. I liked it when Linux was an improvement over MS Windows.

    • by pz ( 113803 ) on Wednesday March 02, 2011 @10:40PM (#35364986) Journal

      I miss when a 266 MHz CPU and 64 MB of RAM was enough to do serious work under Linux.

      I miss RHL 6.2. That was as stable and clean an OS as I've used.

    • Aye. Lucky for us there are still free (even according to RMS' redefinition of the term) operating systems which continue to stick to the Unix philosophy. Even if not BSD, you can always go Gentoo and compile your system without all the dbus gconfd gstreamer esd pulseaudio crap. Emerge OSSv4 and you can even have sane and robust sound support.

      Don't worry; the dream isn't dead yet! It just may cost a couple hours of compiling.
    • by formfeed ( 703859 ) on Thursday March 03, 2011 @12:08AM (#35365622)

      I miss having a system that had decent documentation.
      I miss the time when important parts all had man pages.
      I miss being able to work my way through a script and not have everything hidden somewhere.

      And I miss being able to talk to people about a linux problem and getting a decent answer.
      In the good old days I could have a problem and someone would point in a direction, so I could find the answer and learn something in the process.

      Now? You either get the old-school answer, which breaks the fancy stuff because, for example, you shouldn't meddle with the permissions, fstab, links and mount points, but do some udev stuff...
      Or you get the "click-here-and-reboot", "just-upgrade", or "have-you-tried-reinstalling" kind of experts.

      On top of it, documentation is just missing, gvfs writes files I can't read. Data is hidden in some formats only the application designer knows. And I can't modify any of it, because more and more you don't get the answer but a why-would-you-wanna-do-that or that's-against-the-design answer.

    • by wvmarle ( 1070040 ) on Thursday March 03, 2011 @02:38AM (#35366322)

      It seems like Linux is just as good as MS Windows these days. Too bad. I liked it when Linux was an improvement over MS Windows.

      Really? Windows improved that much? Maybe I should give it a try. Do you still have to type "win" at the prompt after booting up?

    • by ToasterMonkey ( 467067 ) on Thursday March 03, 2011 @04:00AM (#35366536) Homepage

      I miss not having 42 daemons running in the background to do stuff that could simply be a library or utility loaded/run when needed.

      Daemons provide shared services with privilege separation, you know, that old school unix thing. /points at sendmail

      I miss having the init system being a robust, straight-forward process of calling shell scripts in sequence.

      Robust? With uncoupled running & enabled states? In sequence in the good old tradition of single core, single disk unix servers? No response to hung or dead services?

      I miss only needing to reboot for kernel updates.

      This is a flat out lie. The reason every other OS makes or tells you to reboot for changes to system code to take effect is because neither they nor Linux have any mechanism in place to guarantee all loaded/running code stays consistent with the replaced code on disk.
      When the heck did you think a libc patch fully goes into effect? What does "lsof | grep 'path inode'" say on these "no reboot" systems? What do you do with a whole data center that just got openssl patched? Hope for the best?
      What do you do, reload every single process on your system to feel better about not rebooting? Although it's theoretically possible - on a small scale, how is that in any way ideal in terms of uptime, stability, security, etc, compared to rebooting?
      It's an absolute shame that this myth is allowed to perpetuate... You yourself mentioned using shared libraries (to access common data I presume, else it would not be a daemon replacement); running processes could wind up with different versions of that library for an indeterminate amount of time. This leaves the shared data in a really FUN state!
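
      (For what it's worth, a rough sketch of that check -- assuming a reasonably recent lsof and a Linux-style /proc, and bearing in mind that lsof's output format varies between versions:)

      # List processes still holding open files whose on-disk copy has been deleted
      # (e.g. a libc or openssl that an update has since replaced).
      # +L1 selects files with a link count below one, i.e. unlinked-but-still-open.
      sudo lsof -nP +L1 2>/dev/null | grep '\.so' | awk '{print $1, $2, $NF}' | sort -u

      # Same idea without lsof: look for "(deleted)" shared-object mappings in /proc.
      for p in /proc/[0-9]*/maps; do
          grep -q '\.so.*(deleted)' "$p" 2>/dev/null && echo "${p%/maps}"
      done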

      I miss having one sound subsystem that never worked, rather than countless sound daemons which never work.

      I don't miss screwing around with sound on Linux _at_ _all_

      I miss having my immediately-after-logon process list fit in a single 80x25 terminal window.

      What's in your way?

      I miss not having everything complain that DBUS isn't running.

      If your init system didn't suck it would be [re]running.

      I miss the Unix philosophy.

      Which one? How to build a good OS 40 years ago?

      It seems like Linux is just as good as MS Windows these days. Too bad. I liked it when Linux was an improvement over MS Windows.

      Linux really has catching up to do, still, and always. It's pretty obvious to anyone who isn't completely oblivious to the last fifteen years of OS evolution outside Linux. It has reached the "good enough, cheap, unix-like server OS" goalpost and stood still for lack of leadership or vision.

    • Your complaints are almost entirely directed at the desktop environments. While it's unfortunate that they've gotten so massively bloated (even while doing next to nothing more than they did when they were tiny and dead simple) you certainly aren't forced to use them on linux or elsewhere. Xfce4 has gained a surprisingly large following for simply being about as simple as KDE/GNOME 1.x. I'm a blackbox devotee myself (openbox v2 actually), but either way, you can spare yourself the complication and waste

  • Meh (Score:3, Insightful)

    by smpierce ( 568838 ) <s_m_pierce@ y a hoo.com> on Wednesday March 02, 2011 @09:53PM (#35364630)
    I, for one, don't really miss the 'good old days' of downloading 28 or so floppies of SLS over a 14k modem, only to find that disk 7 has an error when you're attempting an install. Or working for days on writing and tweaking an xconfig file. I admit the excitement of running this 'cool new OS' is gone, but it is infinitely more usable, so now I can actually get my work done.
    • by Burdell ( 228580 )

      Hey, calculating X modelines by hand was fun! I managed to get an EGA/VGA monitor that was rated for a max of 800x560 to run 1024x768 within the frequency specs. It was a horribly low interlaced frame rate (couldn't have any light in the room or it gave you an instant headache) and the bandwidth was too low (so all the pixels were fuzzy), but it worked!
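
      (The arithmetic, roughly, for anyone who never had the pleasure -- the numbers below are only illustrative, not my old monitor's:)

      # An X Modeline is: name, dot clock (MHz), then the horizontal and vertical timings.
      # Modeline "1024x768i"  45.0   1024 1048 1208 1264   768 776 784 817  Interlace
      #
      # horizontal frequency = dot clock / horizontal total = 45 MHz / 1264 ~= 35.6 kHz
      # vertical field rate  = 35.6 kHz / 817 lines         ~= 43.6 Hz
      # interlaced, so complete frames arrive at roughly half that -- hence the flicker
      # and the headaches if the result strays outside what the monitor can sync to.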

  • In the days before broadband and cheap CD-R drives Linux updates used to come from Infomagic on CDs.

    I would eagerly await when the local computer store would get this quarterly update of "shovel-ware" CDs, and hidden in it would be a gem, the six-CDs-in-a-box of Linux, and maybe a Slackware distro too.

    It was sort of like having a geek Christmas every season, heading home and reading the package list, trying everything out, seeing what new drivers were now in the Kernel so I could get a better VGA card. And t

  • by cos(0) ( 455098 ) <pmw+slashdot@qnan.org> on Wednesday March 02, 2011 @10:01PM (#35364688) Homepage

    I have to chuckle at this:

    I know this is counterintuitive, but there are days I really miss the challenge (and the ensuing celebration) of old-school Linux. Back in the day, getting Linux installed gave many users reason to shout their own variation of “Hoorah” to the clouds.

    The challenge is there, if you venture out of in-kernel drivers and supported install scenarios. Yesterday I spent three hours trying to get Linux set up on my new HP Pavilion dm1z -- and I consider myself a competent Linux user.

    It took me a little while to set up LVM with the root filesystem on it. Documentation for configuring GRUB for LVM isn't great, and in some places on the web it is outright wrong. Fine, got that. Next, the wireless card is unsupported. To get it to work, you must get the driver from the manufacturer (who fortunately advertises Linux support), then apply patches to it from other sources to get the driver to compile with my kernel version. None of this is documented in one place -- different forums have various snippets that inched me forward. Believe me, I shouted "Hoorah" once I finally spilled enough sweat to get it to work. (After I got this to work, I wrote my own step-by-step instructions to save others the pain.)
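
    (For anyone attempting the same thing, the LVM half boils down to something like this -- a rough sketch with made-up device and volume names, not the exact commands I ran:)

    # Rough sketch; /dev/sda2, vg0 and the sizes are placeholders -- adapt to your layout.
    pvcreate /dev/sda2                 # mark the partition as an LVM physical volume
    vgcreate vg0 /dev/sda2             # create a volume group on it
    lvcreate -L 20G -n root vg0        # carve out a logical volume for /
    mkfs.ext4 /dev/vg0/root            # put a filesystem on it

    # Booting it is the fiddly part: the initramfs has to contain the LVM tools,
    # the kernel command line has to point at the mapped device, e.g.
    #   root=/dev/mapper/vg0-root
    # and if /boot itself lives inside the volume group, GRUB 2 also needs its
    # lvm module loaded (insmod lvm) before it can read the kernel at all.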

    Once I got past the wireless issues, I started X and determined that the Synaptics touchpad is misconfigured -- the hardware is touch-sensitive on the physical buttons, so pressing a touchpad button also moves the mouse. The issue appears to be fixed [launchpad.net], but it hasn't made it into the version of xf86-input-synaptics that Gentoo has. I had to clone the git repo of that driver, build it myself, and manually set up the rule that masks that area of the touchpad. And even now, it still doesn't work correctly. Now I don't move the mouse when I click, but I also cannot click and drag -- once I click, the cursor is fixed. Now this Linux user is stuck between a rock and a hard place.
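
    (Roughly what the masking rule looks like, in case anyone is fighting the same hardware -- the edge value below is a guess; read your own touchpad's coordinate range out of synclient -l first:)

    # One-off: tell the synaptics driver to ignore finger motion below a given
    # Y coordinate, i.e. the strip where the physical buttons sit.
    synclient AreaBottomEdge=4100

    # Persistent version, e.g. in /etc/X11/xorg.conf.d/50-synaptics-mask.conf:
    #   Section "InputClass"
    #       Identifier "touchpad button-area mask"
    #       MatchIsTouchpad "on"
    #       Option "AreaBottomEdge" "4100"
    #   EndSection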

    Challenges still abound, even on the most modern Linux kernel and distributions... just dare to venture out of the entrenched and supported hardware.

  • by Dishwasha ( 125561 ) on Wednesday March 02, 2011 @10:08PM (#35364742)

    I miss the dudes who hadn't even hit 20 yet but had receding hairlines, who used to say a bunch of stuff you couldn't quite tell was either crazy or genius, who wanted to convince you that this crazy Linux thing was awesome because it was in color (i.e. as opposed to black-and-white DOS or the VAX/VMS we dialed in to), and who thought it was totally reasonable to trust something that was *gasp* free (there's no way that something that was free was reliable enough to bother with).

  • Old School Linux (Score:5, Interesting)

    by atomic-penguin ( 100835 ) <wolfe21@@@marshall...edu> on Wednesday March 02, 2011 @10:40PM (#35364988) Homepage Journal

    Friends who are newcomers to Linux complain to me all the time about their wireless cards not working right out of the box. Then I share my first experiences with Linux to put things into perspective.

    A friend had bought a copy of Slackware 3.4 [utah.edu] from Walnut Creek CDROM (cdrom.com). We also had to buy a box of 100 floppy disks from the local office-supply big-box store. You see, there weren't a lot of manufacturers with BIOS support for booting from CD-ROM. In those days you couldn't just hop onto an OEM's website and download the latest BIOS flash image direct from the manufacturer to get support for CD-ROM booting.

    Even if you could have downloaded BIOS images from the manufacturer, I don't recall any OS installers that could bootstrap directly from CD-ROM; that was still a fairly new idea at the time. Both Windows 95 and Linux distribution installers had to have a floppy bootstrap first, then load an ATAPI driver to read the rest of the installation files from CD.

    In those days, if you hadn't bought the CD from Walnut Creek you had to stay up late, downloading floppy images over your 14.4 modem and checksumming them. Even if you had bought the CD, you would have to take the time to image that big box of floppy disks. Then you would have to check the disks for consistency (so you wouldn't get interrupted by a bad floppy half-way through the install). So we would trudge on through the night, making floppy sets. The floppy sets broke down like this:

    • A set (base) - 9 floppies
    • AP set (applications) - 6 floppies
    • D set (development/compilers) - 13 floppies
    • E set (emacs) - 8 floppies
    • F set (FAQs/documentation) - 3 floppies
    • K set (Kernel source) - 6 floppies
    • N set (networking support/applications) - 6 floppies
    • T set (TeX formatting) - 9 floppies
    • Tcl set (Tcl/Tk) - 2 floppies
    • X set (X Windows base) - 26 floppies
    • XAP set (X Windows Applications) - 5 floppies
    • XD set (X Windows Development headers/libraries) - 3 floppies
    • XV set (X View) - 3 floppies
    • Y set (BSD Games) - 2 floppies

    So a full install would require you to image 99 floppy disks, not even counting the boot and root install disks. Just to get a Linux system capable of compiling the kernel source and networking with other machines would take at least 45 floppy disks, individually imaged.

    If you want a GUI and some windowed applications, that would be 37 additional floppies. That is 82 floppy disks in all. The first time I installed Linux, I didn't know what to do with it. It was comparable to DOS, or even the OS on my old Commodore. It was just a basic shell and a blinking cursor, and the DOS commands I knew, besides "DIR", did not work. It was a proud moment to get the damned thing installed and booted up, even if you didn't know what the hell to do with it once you got to that point.

    A year or two later, at University, I could network-install RedHat from a local NFS mirror in less than a few hours. These days, you can do a full network install in a few minutes. DVD images can be downloaded through BitTorrent in less than an hour, and installed. You can even install Linux from a bootable USB flash drive that fits in your pocket.

    Most everything works out of the box, from desktop to enterprise-grade server hardware. Most of the wireless cards will work, with a little bit of tweaking and hunting down external firmware. Those new to Linux may not realize, or may simply forget, how far the technology has come in just a few years. Anyone that complains about how "hard" it is to install and use Linux, should try installing from floppy sets to get a little perspective.

    • by Tim C ( 15259 )

      My first Linux install was also Slackware from floppies. It sure does make today's installations seem a whole lot easier. The number of failed installs I had because one of the floppies was borked...

      Thing is though, "not working" is still "not working", whether you spend hours installing from floppies or half an hour installing from a DVD. Just because it is quicker and easier to get a system that mostly works apart from that one essential thing doesn't make it any better for the person staring at a PC they

  • by Nimey ( 114278 ) on Wednesday March 02, 2011 @11:11PM (#35365274) Homepage Journal

    Speaker Doom was a distribution of Doom for Linux, with Linux that'd been equipped with a PC Speaker driver, so you could get Sound Blaster-like sound effects without an actual sound card.

    Basically you got it as a zipfile and extracted it (UMSDOS filesystem), then ran a batch file to boot Linux from DOS, and /then/ Doom would launch. I think this came with v2.0.32 of the kernel. Don't know which distro it was derived from; Slackware maybe.

    This would have been 1998, and it's still available:

    http://www.doomworld.com/idgames/index.php?id=9704 [doomworld.com]

  • by Greyfox ( 87712 ) on Thursday March 03, 2011 @12:39AM (#35365810) Homepage Journal
    Downloading the 20-some-odd installation floppies of Slackware... I think that was 2. And forgetting to run FTP in binary mode for the first 2, so having to redownload them the next day. And then running ircii+epic in text mode on a 386 SX/16 that didn't have enough oomph to manage X11. Or running around South Florida looking for terminating resistors for two 10baseT Ethernet adapters so I could set up a little 2-machine network in my house...
  • by MaerD ( 954222 ) on Thursday March 03, 2011 @01:15AM (#35365952)
    Linuxconf was such crap. It *WAS* dumbing down Linux, making it far too easy for "admins" (and I do use the term loosely) to configure a system. In general it led to insecure systems, systems that were just plain badly configured and barely worked, or file corruption (especially if you ever hand-edited a file after linuxconf was done and then went in to edit some other thing with linuxconf).
    Having been on the tech support end of people using linuxconf, I can't believe anyone would remember it fondly. I can see wanting a simple interface that can configure *everything*, but I do prefer admins that have some clue what the options do.
  • by Anne Honime ( 828246 ) on Thursday March 03, 2011 @07:57AM (#35367172)

    Linuxconf is what introduced me to the 'immutable' attribute of ext2fs. After being bitten a couple of times by a reset of my soundcard parameters (specific ones at that; it was an IBM laptop with a strange all-in-one video+sound chip), I sought a solution, and I finally chattr'ed the config file to +i. End of the problem.

    So in a way, I'm grateful to linuxconf for enticing me into learning deeper and more arcane knowledge of Linux. But that's about all I found it useful for.
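
    (The trick itself, for anyone who wants it -- from memory, and the path is just a placeholder for whatever file keeps getting clobbered:)

    # Works on ext2/3/4 filesystems; needs root.
    chattr +i /etc/some.conf     # mark the file immutable -- even root-run tools can't rewrite it
    lsattr /etc/some.conf        # confirm the 'i' attribute is set
    chattr -i /etc/some.conf     # lift it again when you genuinely want to edit the file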

  • by g253 ( 855070 ) on Thursday March 03, 2011 @10:41AM (#35368492)
    When I started using linux, maybe around 8 years ago, you still had to manually mount / unmount volumes, which never bothered me, although I understand it had to be changed for Joe User. This was back when KDE was the Kool Desktop Environment; at the time I used WMaker and absolutely loved it. In WMaker, there was a small widget to mount / unmount volumes, and it always worked flawlessly -- one click to mount, one to unmount, never any trouble. But then, as time went by, auto-mount appeared, I started to use KDE / Gnome, and for years it was horrible. CDs would fail to be mounted, the CLI command wouldn't work, or unmounting would fail and the CD couldn't be ejected -- it was terrible. So personally, that was the tool I really missed.
    (I'm assuming automount works fine nowadays, I haven't used linux in years)
