
AMD Releases Open-Source Radeon HD 7000 Driver

An anonymous reader writes "AMD has publicly released open-source driver code for the Radeon HD 7000 series 'Southern Islands' graphics cards on Linux. This lets owners of AMD's latest generation of Radeon graphics cards use the open-source Linux driver rather than Catalyst, and there is also early support for AMD's next-generation Fusion APUs."

  • Llano: 3.3? (Score:4, Interesting)

    by whoever57 ( 658626 ) on Wednesday March 21, 2012 @01:20AM (#39423631) Journal
    I would be much more interested to know if Llano is fully supported in 3.3 kernel. With 3.2, if KMS is enabled, the screen blanks as soon as the radeon module is loaded (even before X starts).
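    If it helps anyone else seeing the same thing: the usual stopgap, assuming the blanking really is the radeon KMS path rather than X or missing firmware, is to disable kernel mode setting on the kernel command line until the kernel catches up. A sketch of the boot entry (the kernel image and root device are placeholders; only the last parameter matters):

      # GRUB 2 entry fragment -- hypothetical paths; radeon.modeset=0 disables KMS for the radeon driver
      linux /boot/vmlinuz-3.2 root=/dev/sda1 radeon.modeset=0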
  • by aztektum ( 170569 ) on Wednesday March 21, 2012 @01:24AM (#39423649)

    AMD could help itself a great deal by focusing on open-source support. Intel does a pretty damn good job supporting open source with its drivers, but they lack top-end graphics hardware. nVidia provides a solid binary, but their *NIX support lags behind Windows.

    If AMD becomes the number-one graphics hardware vendor on Linux, it could help even out their hot-and-cold CPU offerings.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      nVidia's Linux support has been solid for, like, a decade. Their blob works so damn well, and has worked well for so damn long, that even if the Magical Code Fairy came and blessed AMD/ATI with perfect *NIX modules, I would be hard-pressed to give up on nVidia. I'm all fidgety, waiting for the nearly-released 600-series cards, and you can totally bet those cards will have first-rate blob support, for us early adopters of the Linux persuasion.
      Buy a relatively new ATI card, and really, you just hope to Christ

      • I wouldn't choose them for a Windows machine anyway, because they make shitty drivers. You're doing yourself a favor... trust me. I've had a 4870x2 for a year or two now and I've never had such hell with drivers. Just imagine your video driver crashing on YouTube. Or simply refusing to play Flash videos properly when hardware acceleration is enabled. Bask in the glory of losing all your Catalyst settings with every single upgrade, or sometimes just from rebooting your computer.
        • I've been using a 5670 for over 3 years and never had a problem with their drivers in Windows, as I prefer sticking with the stable version of Catalyst. Hell, if I could I'd install the bare driver, but that's no longer an option, and it's Catalyst that has given ATI a bad name, not the driver itself. I have never had a driver failure since I installed the card under Vista. Yes, Vista. When it was released the driver was stable and never crashed. It wasn't the fastest, but then, I'm not a gamer.

          For all the Nv

          • It has become increasingly apparent that the 4xxx line was ATi's red-headed stepchild. Everyone with cards AFTER that line seems okay, but everyone with a card in that line has the exact same issues I do. It has, however, been bad enough that I don't think I'm ever going to buy an ATi card again. I'm not even kidding. You have no idea. I guess we'll see, but after all the bullshit I've had to deal with, regardless of whether it's the driver or Catalyst, I am VERY leery about buying another ATi card.
      • by tyrione ( 134248 )
        I'll give it up tomorrow; when the replacement to Bulldozer arrives, I'm giving up Nvidia. AMD's OpenCL support is far richer, and my applications are being designed to leverage OpenCL 1.1/2.0 for engineering and solid modeling.
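        To make "leveraging OpenCL" concrete, here is a minimal sketch in C that enumerates platforms and picks a GPU device. It sticks to standard OpenCL 1.1 entry points and assumes nothing about AMD's implementation in particular; build it against whichever vendor SDK is installed (e.g. gcc devices.c -lOpenCL).

        /* Minimal OpenCL 1.1 device enumeration -- illustrative sketch only. */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void)
        {
            cl_platform_id platforms[8];
            cl_uint nplat = 0;

            if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
                fprintf(stderr, "no OpenCL platforms found\n");
                return 1;
            }
            if (nplat > 8)
                nplat = 8;

            for (cl_uint p = 0; p < nplat; p++) {
                char plat_name[256] = "";
                clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                                  sizeof(plat_name), plat_name, NULL);

                /* Ask each platform for one GPU device; skip platforms without one. */
                cl_device_id dev;
                cl_uint ndev = 0;
                if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                                   1, &dev, &ndev) == CL_SUCCESS && ndev > 0) {
                    char dev_name[256] = "";
                    clGetDeviceInfo(dev, CL_DEVICE_NAME,
                                    sizeof(dev_name), dev_name, NULL);
                    printf("platform \"%s\": GPU \"%s\"\n", plat_name, dev_name);
                }
            }
            return 0;
        }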
    • by Anonymous Coward

      nVidia provides a proprietary blob, only for x86.

      FTFY.

  • ...methinks the next logical step for them would be a FOSS CPU architecture: their AMD64 (sans all of Intel's 32-bit x86 IP), put out under an OSS license of their choice for any design house/fab to use, with AMD living off licensing fees. That way, they don't have to worry about operational details, which they suck at anyway, and can focus solely on CPU design.
    • And this will do what?

    • If they give away their CPU architecture for free under an open license, how do they charge royalties? Who is going to make an AMD64 CPU with no x86 support?
      • I wasn't suggesting that they give it away for $0 - I was suggesting that they make the architecture open source, say with a license that allows anybody to study the code but charges a certain amount if someone wants to use it for commercial purposes, i.e. to fab a CPU. In other words, they could charge $x to a licensee for signing up, and $y for every chip sold. The licensee then just has to do incremental work on the design in terms of aligning it with whichever fabs they're working with, which won't involve AMD dire

        • I wasn't suggesting that they give it away for $0 - I was suggesting that they make the architecture open source, say with a license that allows anybody to study the code but charges a certain amount if someone wants to use it for commercial purposes, i.e. to fab a CPU.

          Also known as *not* making it open source... Unless you're talking about living off the patent licenses.

            • That's precisely what I'm talking about - living off patent licenses. Also, open source doesn't mean that people are free to do whatever they like - like I mentioned, if someone wants to use it for commercial purposes, AMD can attach the normal terms and conditions involved in selling. It would be something like the QPL, except that since the average person doesn't just go to a fab and hand them the models, the paid aspect of commercial usage would be more acceptable than it was with Qt/KDE.
            • by slew ( 2918 )

              Uhm, ARM doesn't "open-source" their architecture and it's pretty successful. The company that used to be called Sun did a community license similar to what you are suggesting with their SPARC core. Which do you think was more successful?

              I always find it surprising how the open-source vultures/jackals come out to attack the weak and wounded with their "suggestions" on how to run their business into the ground. It's as if they actually want the weak companies to die so they can feed off of the remains. Op

          • No, that would be "not making it free." Open source does not have to be free, unless you want to charge to view the source instead of just to use it.

    • I think it would make more sense for them to license ARM and make a competitor for Tegra. I'd certainly be interested in a netbook/pad based around that technology.
    • I had a quick Google of AMD64, and it is just a bunch of extensions to x86. It's pretty much just more registers, everything extended from 32 to 64 bits, and a new addressing mode. Mostly the same instructions. Good luck building an AMD64 CPU without an x86 license from Intel. Even if you could, good luck selling a CPU that can't run any software currently available.
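      That "bunch of extensions" point is visible from software, too: 64-bit support is just a feature bit in an extended CPUID leaf that AMD defined and Intel later adopted. A minimal sketch in C (GCC/Clang on x86, purely illustrative):

      /* Check the AMD64 / Intel 64 "long mode" feature bit via CPUID. */
      #include <stdio.h>
      #include <cpuid.h>

      int main(void)
      {
          unsigned int eax, ebx, ecx, edx;

          /* Extended leaf 0x80000001: EDX bit 29 is the long-mode (LM) flag. */
          if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
              printf("extended CPUID leaves not available\n");
              return 1;
          }
          printf("64-bit long mode supported: %s\n",
                 (edx & (1u << 29)) ? "yes" : "no");
          return 0;
      }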
  • I look forward to hearing from actual users how well these drivers work.

  • by Anonymous Coward

    I was one of the first people to give AMD credit for making the decision to release specifications when ATI was bought. How naive I was. We shouldn't be supporting AMD until they come clean and release sufficient specifications for use on free operating systems. Intel remains the only company with graphics chipsets that are well supported on GNU/Linux and free operating systems. The ATI 3D acceleration is still dependent on non-free software. Only the 2D works on free systems. Right now they just rele

    • by dmitrygr ( 736758 ) <dmitrygr@gmail.com> on Wednesday March 21, 2012 @02:08AM (#39423821) Homepage
      Don't get too excited. Some Intel chips use PowerVR, which has no OSS driver (from Intel or from anyone else); see here [imgtec.com]
      • Yes, like all the new Atom CPUs.
      • Some Intel chips use PowerVR, which has no OSS driver

        And no Windows driver either. Obviously, that's hyperbole. The chip does have Windows drivers and Linux drivers. The Windows ones are beyond terrible and the Linux ones were even worse.

        This may have changed since I last looked, but I'll bet the Intel partners were furious at being given a dud with such awful drivers.

        • by tlhIngan ( 30335 )

          Some Intel chips use PowerVR, which has no OSS driver

          And no Windows driver either. Obviously, that's hyperbole. The chip does have Windows drivers and Linux drivers. The Windows ones are beyond terrible and the Linux ones were even worse.

          This may have changed since I last looked, but I'll bet the Intel partners were furious at being given a dud with such awful drivers.

          PowerVR doesn't have any open-source drivers - the only ones you get are binary blobs.

          Of course, awful drivers is interesting, consider

          • Of course, awful drivers is interesting, considering an awful lot of smartphones are running the Linux kernel, and an awful lot of them have PowerVR chips powering them.

            Sure. I was referring to the Intel GMA500, which I had the misfortune of dealing with. It was in a super-ruggedized Toughbook. We had both Windows and Linux versions. The graphics drivers were not good on either, with Linux being much worse.

    • I'd switch to AMD permanently and buy a new AMD video card tomorrow if I were sure they were serious. I want decent 3D acceleration in the open source drivers for Linux. Neither Nvidia nor ATI ever delivered on this. The proprietary Catalyst driver is something like 5x the speed of the open source driver. Nvidia is even worse. That's totally unacceptable. Some years ago, ATI announced they were opening up, and I got ready to dump Nvidia. And then... it didn't happen.

      Intel? What a joke! Their video p

      • by Kjella ( 173770 ) on Wednesday March 21, 2012 @04:01AM (#39424279) Homepage

        ATI announced they were opening up, and I got ready to dump Nvidia. And then... it didn't happen.

        Actually, that's what did happen: they said they'd open up and for the most part they have - the instruction set for "decent 3D acceleration" is out there. A decent CPU analogy is that they promised x86_64 specs and you expected GCC. Releasing specs doesn't magically make a team that's 2-3% the size of the proprietary team 50 times as efficient; worse yet, the hardware changes radically from generation to generation - like now, from VLIW to GCN - which basically means starting over. And it keeps expanding with geometry shaders, tessellation, new display standards, new chips, etc., so it's a rapidly moving target.

        For example, Mesa just got OpenGL 3.0 support last month; the standard was released back in 2008. That's not just the lack of a driver - there isn't even an implementation to accelerate. Of course you could say that AMD should release their proprietary driver/OpenGL implementation, which would be nice indeed, but that isn't practical on so many levels and certainly isn't something they promised. Your post is essentially why nVidia doesn't want to get involved with OSS: it's "Whaaaaaaa, give us specs, we'll write the code." "Okay, here's specs." "Whaaaaaaaa, performance sucks, write the code too."

        • Of course you could say that AMD should release their proprietary driver/OpenGL implementation, which would be nice indeed, but that isn't practical on so many levels and certainly isn't something they promised.

          That's actually what they need to do if they want to stay relevant, because their competition has done it.

          And by competition, I don't mean nVidia. I mean the truly monstrously huge behemoth, whose claws and fangs are already dripping with so much AMD blood, yet who is dismissed every time graphics hardware c

          • Low-end integrated "crap" is what 95% of the world needs...That's even where AMD's own Llano is!...

            Many people do agree with you, which is why people are asking whether the Llano support in the open source drivers is working yet. (Anecdotal...) I have a moderate 46xx series ATI card in my Linux box which I can use happily with the open drivers. Performance isn't top notch, but I have never needed 100% of its graphics capabilities. My Llano laptop is still all Windows 7. In the future I would be happy with another AMD APU system once the open driver support is better. The performance available with Lla

          • by Kjella ( 173770 )

            That's actually what they need to do if they want to stay relevant, because their competition has done it. (...) You know who I mean.

            More like the other way around: today AMD and Intel both have OpenGL 3.0 support through Mesa, while AMD has OpenGL 4.2 through Catalyst/fglrx. If AMD went to the trouble of opening their implementation, then Intel would essentially get a free pass to that, not to mention an invaluable lesson in shader optimization tricks, and that'd benefit nVidia too. Even if it were possible, it'd not be in AMD's best interest.

            • by epyT-R ( 613989 )

              I'm not sure about that... the architectures are so different that any optimizations made to the shader compiler would be useless to other designs... hell, even different generations of chips require different optimizations.
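              Whichever way that argument goes, it is easy to check at runtime which stack a given box is actually using, since Mesa's open drivers and Catalyst/fglrx identify themselves differently in the GL strings. A minimal sketch (GLUT is assumed here only as a convenient way to get a context; any context-creation method works):

              /* Print the GL identification strings. */
              #include <stdio.h>
              #include <GL/glut.h>

              int main(int argc, char **argv)
              {
                  glutInit(&argc, argv);
                  glutCreateWindow("glinfo"); /* a GL context is current after this */

                  printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
                  printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
                  printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
                  return 0;
              }

              Mesa includes its own version string in GL_VERSION, so the two stacks are easy to tell apart.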

      • by tyrione ( 134248 )
        I couldn't care less whether they have a non-binary blob or not. I care about full OpenCL 1.x-2.x/OpenGL 4.x stack support for Linux. Taint the kernel all they want. I want solid drivers that just work.
    • by Daniel Phillips ( 238627 ) on Wednesday March 21, 2012 @03:09AM (#39424075)

      The ATI 3D acceleration is still dependent on non-free software. Only the 2D works on free systems.

      Complete nonsense. I am doing OpenGL development at this very moment using the fully open Radeon driver. Your post has too many inaccuracies to address. If it were possible to retract it, you probably should.

      • by paskie ( 539112 )

        How can I use acceleration with the Radeon driver and without the appropriate firmware binary blob? I think the GP is taking issue with that.

        • by MrHanky ( 141717 )

          While "true", the realistic alternative is the firmware blob residing in ROM on the graphics card.

          • Exactly. I don't have the source code for the processor microcode either. I can live with it, provided the API exposed to the driver is sufficiently complete.
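            For what it's worth, that microcode isn't linked into the kernel at all; drivers fetch it at runtime through the kernel's standard firmware interface, so it sits in /lib/firmware as a separate file. A sketch of that pattern (the firmware file name here is a placeholder, not the real Radeon microcode name):

            /* Sketch of the standard Linux firmware-loading pattern used by GPU
               drivers. The file name is hypothetical. */
            #include <linux/firmware.h>
            #include <linux/device.h>

            static int load_gpu_microcode(struct device *dev)
            {
                const struct firmware *fw;
                int err;

                /* Resolved as /lib/firmware/example/gpu_mc.bin by the firmware loader. */
                err = request_firmware(&fw, "example/gpu_mc.bin", dev);
                if (err) {
                    dev_err(dev, "microcode not found (%d)\n", err);
                    return err;
                }

                /* ... upload fw->data (fw->size bytes) to the chip's microcode RAM ... */

                release_firmware(fw);
                return 0;
            }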

  • consumers everywhere rejoice!
  • What the hell does "DCE6 display watermark support" mean?
    I googled for it and didn't find anything useful.
    It sounds ominously like Cinavia [wikipedia.org] for video.

    • What the hell does "DCE6 display watermark support" mean?
      I googled for it and didn't find anything useful.

      Whoa, this one really flies below the Google radar. DCE is part of the recent Radeon architecture [botchco.com]: a programmable display controller that produces the low-level digital signals to drive a wide variety of display types. As for "watermark", I did not turn up much on it beyond the patch [spinics.net]. You tell me and we'll both know.

      • by Anonymous Coward

        Watermark refers to empty/fill thresholds in the FIFOs between video memory and the display.

        • Watermark refers to empty/fill thresholds in the FIFOs between video memory and the display.

          Hah, so it's made-in-Linux terminology abuse. Somebody got confused between the correct "highwater mark" and the incorrect "high watermark".

    • by Kjella ( 173770 )

      Well, my understanding of the patch is sketchy, but DCE6 is just the display controller chip, and "watermark" in this context seems to be nothing but frame begin/end indicators or timings that depend on the number of pixels to draw, latency, display clock, etc., so you get a picture on screen and the display buffer is updated at the right time. It certainly has nothing to do with watermarks in the Cinavia sense.
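      If the AC above has it right about FIFO thresholds, the idea is easy to make concrete: the scanout FIFO has to buffer enough pixels to ride out the worst-case memory latency or the display underflows, and the watermark is the fill level the hardware compares against. A back-of-the-envelope sizing in C (illustrative numbers only, not the actual DCE6 register programming):

      /* Rough display-FIFO sizing -- illustrative numbers only. */
      #include <stdio.h>

      int main(void)
      {
          double pixel_clock_hz  = 148.5e6; /* e.g. 1920x1080 @ 60 Hz */
          double mem_latency_s   = 2e-6;    /* assumed worst-case memory latency */
          double bytes_per_pixel = 4.0;     /* 32-bit scanout format */

          /* Pixels the display consumes while memory requests are stalled. */
          double pixels_in_flight = pixel_clock_hz * mem_latency_s;

          printf("FIFO must cover about %.0f pixels (%.0f bytes)\n",
                 pixels_in_flight, pixels_in_flight * bytes_per_pixel);
          return 0;
      }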

  • Previous series? (Score:2, Interesting)

    by Anonymous Coward

    I am still waiting for a working 5700 series driver.

    The closed driver contains lots of bugs and is unstable; the open driver lacks features and has bad fan control. In short, one pile of failure.

  • by elwin_windleaf ( 643442 ) on Wednesday March 21, 2012 @09:30AM (#39426393) Homepage

    I found myself in the market for a graphics card recently, and after the research and hassle of figuring out what has been released as open source, I decided to delay the decision by sticking with an older NVIDIA card I had kicking around.

    Now that I know this series of AMD cards is supported with open source drivers, I'm much more comfortable running it in my Linux desktop than my old NVIDIA card, which requires their proprietary drivers.

  • I built a desktop system for myself last June with an AMD 'APU' in it. At the time people were talking August for ATI's open source reveal, so I put my old nVidia card in it. It's still there, obviously.

    Assuming these parts went to fab before I could buy them, this puts ATI's lead time on open source drivers for new chips at about a year. That's probably 1/3 the useful life of the part. Hopefully for the last 2/3rds I'll be able to take advantage of that power savings.

    Serious question: how do they te
