Experimental Virtual Graphics Port Support For Linux

With his first accepted submission, billakay writes "A recently open-sourced experimental Linux infrastructure created by Bell Labs researchers allows 3D rendering to be performed on a GPU and displayed on other devices, including DisplayLink dongles. The system accomplishes this by essentially creating 'Virtual CRTCs', or virtual display output controllers, and allowing arbitrary devices to appear as extra ports on a graphics card." The code and instructions are at GitHub. This may also be the beginning of good news for people with MUX-less dual-GPU laptops that are currently unsupported.
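
For the curious, here is a rough, kernel-side sketch of what "creating a virtual CRTC" might look like through the drm_crtc_init() interface of this era. This is a hedged illustration only: struct vcrtc, vcrtc_register(), and the sink pointer are hypothetical names, not taken from the actual vcrtcm code.

    /*
     * Hypothetical sketch: a GPU driver registers one extra CRTC that is
     * backed by an external sink (e.g. a DisplayLink dongle) instead of a
     * physical video port. Callback wiring is elided.
     */
    #include <linux/slab.h>
    #include <drm/drmP.h>
    #include <drm/drm_crtc.h>

    struct vcrtc {
        struct drm_crtc base;   /* embeds an ordinary DRM CRTC */
        void *sink;             /* external device that consumes the pixels */
    };

    static void vcrtc_destroy(struct drm_crtc *crtc)
    {
        struct vcrtc *v = container_of(crtc, struct vcrtc, base);

        drm_crtc_cleanup(crtc);
        kfree(v);
    }

    static const struct drm_crtc_funcs vcrtc_funcs = {
        .destroy = vcrtc_destroy,
        /* .set_config/.page_flip would forward the scanout buffer to
         * the external sink rather than to a physical encoder. */
    };

    int vcrtc_register(struct drm_device *dev, void *sink)
    {
        struct vcrtc *v = kzalloc(sizeof(*v), GFP_KERNEL);

        if (!v)
            return -ENOMEM;
        v->sink = sink;
        /* To userspace, this CRTC looks like any other output resource. */
        return drm_crtc_init(dev, &v->base, &vcrtc_funcs);
    }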
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Tuesday November 08, 2011 @01:34AM (#37982292)
    Comment removed based on user account deletion
    • Re: (Score:1, Informative)

      by Anonymous Coward

      Let's hope it doesn't "work" like PulseAudio.

      • Comment removed based on user account deletion
        • by Jonner ( 189691 )

          It doesn't sound to me like this is that much like either PulseAudio or Jack. Those are both sound systems based on userspace daemons focused on flexible sound mixing, while this virtual graphics system is within the Linux kernel and seems to be focused on simply moving pixels from one hardware device to some other device.

          Wayland [freedesktop.org] is more like PulseAudio or Jack for graphics. Its proponents think it has advantages over the much thicker, more complex daemon we've used for decades called the X11 server.

      • Re: (Score:3, Funny)

        no,no "work" is not the word you are looking for when describing pulse audio.

        I had a nightmare last night that PA was keeping ALSA captive demanding the
        release of 1000000 CPU cycles the system was keeping for thread scheduling.
        In the end we used an SCSI driver to nuke the damn thing to /dev/null using
        an NPTL.
        Unfortunately when we stormed the desolated daemon we found out the cruel
        things it had been doing to ALSA all along, leaving it a mutated and deformed
        carcass.
        /dev/rand spoke a few words about it's for

        • by jonwil ( 467024 )

          If you want to see the worst abuse of PulseAudio, check out the Nokia N900 Linux phone. It's using a combination of PulseAudio and a few other projects, along with a bunch of closed-source blobs, to do much of the audio work in the phone. (The closed-source blobs exist to protect certain proprietary algorithms for things like speaker protection and other things needed in a cellphone; exactly what is unknown, since Nokia hasn't documented them.)

      • by Jonner ( 189691 )

        PulseAudio works great for me and makes my life a lot easier, so it would be fine if it did.

      • Let's hope it doesn't "work" like PulseAudio.

        Fortunately, it seems not. These people actually seem to know what the fuck they're talking about.

    • Comment removed (Score:4, Informative)

      by account_deleted ( 4530225 ) on Tuesday November 08, 2011 @02:08AM (#37982432)
      Comment removed based on user account deletion
      • On the other hand, if you have spare CPU cycles, you could take that output video and compress it to MP4, which VNC doesn't yet support. Still, far less efficient than sending the 3D commands over the wire for the device on the other end to render.

      • by rdnetto ( 955205 )

        However, it would let you turn a laptop or tablet into a second monitor, which could be rather useful at times if you don't normally have a dual-screen setup.

    • X11 has done network-transparent video since forever. Screens that don't exist have been around a long time too (Xvnc).

      The part where this is better than existing solutions is that you get a hardware-accelerated framebuffer without having to attach it to a physical monitor. Thus, you could get a hardware-accelerated Xvnc, or create a virtual second head and network-attach it to a second computer. You might even do that over VNC, so it's not really an alternative to VNC... it's a new capability.

    • by Jonner ( 189691 )

      Does anyone know if this would provide a performance boost over something like VNC for similar things? Or how about the possibility of passing rendered output as a fake video capture card input to a virtual machine? I think I get what this does, but I'm kind of wondering how exactly it's better than current solutions to these problems.

      An obvious way to use this would be to target some kind of virtual frame buffer in regular RAM that VNC or other remote protocol could take advantage of. Currently, you have to point VNC to a real frame buffer that is displayed on a GPU's output to take advantage of the acceleration. However, if you switched the virtual frame buffer the GPU renders to, you could have acceleration for an arbitrary number of them as long as applications don't need to use acceleration features all the time.
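
      As a rough illustration of that "frame buffer in regular RAM" idea, here is a hedged userspace sketch using the KMS dumb-buffer ioctls (merged around the same time as this work). The device path and mode are assumptions, and error handling is trimmed:

        /* Hedged sketch: allocate a CPU-mappable "dumb" scanout buffer and
         * map it, so something like a VNC server could read the pixels a
         * (virtual) CRTC renders into. Assumes /dev/dri/card0 and a driver
         * with dumb-buffer support; error handling omitted for brevity. */
        #include <fcntl.h>
        #include <stdint.h>
        #include <string.h>
        #include <sys/ioctl.h>
        #include <sys/mman.h>
        #include <xf86drm.h>

        int main(void)
        {
            int fd = open("/dev/dri/card0", O_RDWR);
            struct drm_mode_create_dumb create = {
                .width = 1280, .height = 720, .bpp = 32,
            };
            struct drm_mode_map_dumb map = { 0 };
            uint32_t *pixels;

            /* Ask the kernel for a linear, CPU-visible buffer... */
            ioctl(fd, DRM_IOCTL_MODE_CREATE_DUMB, &create);

            /* ...and map it into our address space. */
            map.handle = create.handle;
            ioctl(fd, DRM_IOCTL_MODE_MAP_DUMB, &map);
            pixels = mmap(NULL, create.size, PROT_READ | PROT_WRITE,
                          MAP_SHARED, fd, map.offset);

            /* A remote-display daemon would read these pixels; here we
             * just clear the buffer to white to show the mapping works. */
            memset(pixels, 0xff, create.size);
            return 0;
        }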

  • I get the part about sending info to multiple places. Are they talking about sending different streams to these monitors/what-have-you? Otherwise it just sounds like tossing a splitter into the video signal.

    Yes, I did read TFA, and I guess I'm missing something.
    Help please?

    • by AHuxley ( 892839 ) on Tuesday November 08, 2011 @01:47AM (#37982350) Journal
      From the README at https://github.com/ihadzic/vcrtcm-doc/blob/master/HOWTO.txt [github.com]:
      "In a nutshell, a GPU driver can create (almost) arbitrary number of virtual CRTCs and register them with the Direct Rendering Manager (DRM) module. These virtual CRTCs can then be attached to devices (real hardware or software modules emulating devices) that are external to the GPU. These external devices become display units for the frame buffer associated with the attached virtual CRTC. It is also possible to attach external devices to real (physical) CRTC and allow the pixels to be displayed on both the video connector of the GPU and the external device."
      • by isama ( 1537121 )
        So if you were to tie this to Xinerama and VNC you would have something like DMX? Although I never got DMX to work, maybe because I'm a little lazy...
          I've used DMX with Chromium to give 3D-accelerated X across 28 monitors on 7 machines. It works, but performance can be terrible if you don't have the interconnect to deal with what you're rendering. With gigabit, basic X applications could cope, but Firefox with Google Maps would take seconds per redraw. Depending on the 3D app you /can/ get decent performance, though.

      • These external devices become display units for the frame buffer

        Looking at the HDMI specs for guidance, a high-res frame buffer might run 10Gbps. That's still considered a hard amount of data to push around inside a PC, right?
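
        For scale, here is a back-of-envelope check; the resolution and refresh rate chosen are assumptions:

          /* 2560x1600 (WQXGA) at 32bpp and 60Hz, uncompressed: */
          #include <stdio.h>

          int main(void)
          {
              double bits = 2560.0 * 1600 * 32 * 60;
              printf("%.1f Gbit/s\n", bits / 1e9); /* prints 7.9 */
              return 0;
          }

        That's in the same ballpark as HDMI 1.3's 10.2 Gbit/s TMDS budget, so 10Gbps is a fair estimate for a high-end mode.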

  • by Adriax ( 746043 ) on Tuesday November 08, 2011 @01:50AM (#37982362)

    Wonder if this could be used to create a GPU-accelerated sound system?
    Take the scene modeling, texture objects based on their acoustic properties, create light sources for every sound source, and output the scene to a sound device that translates the visual frame into a soundscape for output.

    Or am I just not up to date with audio acceleration technologies? (I've never upgraded beyond a cheap headset.)

    • Aureal3D (Score:4, Informative)

      by Chirs ( 87576 ) on Tuesday November 08, 2011 @02:18AM (#37982474)

      That's basically what the old Aureal technology did a decade ago--took the 3D scene data and passed it to the audio card for processing. It was awesome--Half-Life with four speakers was eerily realistic.

      • by Anonymous Coward

        A3D did great positional audio with only two speakers, like that demo that had bees flying all around you.

        • God I remember that demo! It was actually a little spooky... they'd fly behind you and the hairs on the back of your neck would stand up because your brain was telling you there was a huge bee back there.

          That came on my brand-new Compaq, which had Windows 98, an AMD K6-3D at about 200MHz, 32MB of RAM, and a 4GB HDD.

          And now I feel old...

        • I think the bee demo was from Sensaura. I worked up there for a few happy years until Creative ermmm... 'nuff said.

          Maybe both companies had a bee demo...

        • Nothing special, just an implementation of HRTF [wikimedia.org].
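
          For the curious, the core of HRTF rendering is just a pair of per-ear FIR convolutions. A hedged sketch (hrir_l/hrir_r are placeholder impulse-response arrays measured for one source direction):

            /* Hedged sketch: filter a mono source with per-ear head-related
             * impulse responses; the interaural delays and spectral shadowing
             * baked into the HRIRs are what make the bees sound like they're
             * behind you. */
            #include <stddef.h>

            void hrtf_render(const float *mono, size_t n,
                             const float *hrir_l, const float *hrir_r,
                             size_t taps, float *out_l, float *out_r)
            {
                for (size_t i = 0; i < n; i++) {
                    float l = 0.0f, r = 0.0f;

                    /* Direct-form FIR convolution, one filter per ear. */
                    for (size_t k = 0; k < taps && k <= i; k++) {
                        l += mono[i - k] * hrir_l[k];
                        r += mono[i - k] * hrir_r[k];
                    }
                    out_l[i] = l;
                    out_r[i] = r;
                }
            }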

    • Well, mainstream soundcards have been good enough for realistic sound since the 90s, so it isn't really a problem that needs to be offloaded to anything else. It's a lot easier to fake realistic audio in realtime than realistic graphics. Half-Life with an EAX setup sounded amazing, but it wasn't exactly photorealistic.

    • by Jonner ( 189691 )

      Though your idea is interesting, I doubt it could benefit from the virtual graphics approach described in TFA. This is about rendering pixels to arbitrary outputs, while it sounds like you're talking about much higher level manipulation. There are already capable, programmable DSPs for advanced audio processing such as the EMU10k1 series from Creative and I expect you could use OpenCL or something like it to do sound processing on a GPU if desired.
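
      For what it's worth, the OpenCL route is straightforward for simple per-sample work. A hedged host-side sketch (a GPU device is assumed present, error checks omitted) applying a gain kernel to a block of samples:

        /* Hedged sketch: run a trivial gain kernel over audio samples on
         * the GPU via OpenCL 1.x. Real 3D audio would do far more, but the
         * plumbing looks like this. Error handling omitted for brevity. */
        #include <stdio.h>
        #include <CL/cl.h>

        static const char *src =
            "__kernel void gain(__global float *s, float g) {"
            "    s[get_global_id(0)] *= g;"
            "}";

        int main(void)
        {
            float samples[4096] = { 1.0f };
            size_t n = 4096;
            cl_platform_id plat;
            cl_device_id dev;

            clGetPlatformIDs(1, &plat, NULL);
            clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

            cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
            cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);
            cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
            clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
            cl_kernel k = clCreateKernel(prog, "gain", NULL);

            /* Copy the samples to the device, scale them, read them back. */
            cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                        sizeof(samples), samples, NULL);
            float g = 0.5f;
            clSetKernelArg(k, 0, sizeof(buf), &buf);
            clSetKernelArg(k, 1, sizeof(g), &g);
            clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
            clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(samples), samples,
                                0, NULL, NULL);
            printf("first sample: %f\n", samples[0]); /* 0.5 */
            return 0;
        }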

  • Stick a webcam in front of the screen, compress/pipe webcam output to the remote client. Voila, instant 3D remote display!

    • Re:Pff, nothing new (Score:5, Interesting)

      by adolf ( 21054 ) <flodadolf@gmail.com> on Tuesday November 08, 2011 @03:10AM (#37982646) Journal

      Indeed -- not new, at all.

      Similar tricks were used a dozen or so years ago by Mesa 3D to get standalone 3dfx Voodoo cards to output accelerated OpenGL in a window on the X desktop. The 3D stuff rendered on a dedicated 3D card, and its output framebuffer was eventually displayed by a second, 2D-oriented card that actually had the monitor connected.

      • Perhaps doing it in a generic, hardware-agnostic way is new?

        • by adolf ( 21054 )

          Perhaps, depending on how hardware-agnostic the APIs in question were/are.

          Then again, VirtualGL [wikipedia.org] has been around for a bit too, which throws network transparency into the mix. I don't know how much more hardware-agnostic such a thing could be...

        • by Jonner ( 189691 )

          Yes, I think that's exactly why this is interesting. Increasingly, PCs have multiple video outputs of various types as well as multiple GPUs. If you can decouple the GPU used to render something from the output used to display it without a huge performance hit, that opens up all kinds of possibilities.

      • by zefrer ( 729860 )

        You neglect to mention that said standalone 3D cards were physically connected to the 2D card via a pass-through cable which was what sent the video signal from one card to the other, allowing it to appear on your monitor.

        This is a software solution achieving the same effect that will work on any card, even remote cards on different machines. Hardly the same thing.

        • by adolf ( 21054 )

          You neglect to remember that the Voodoo 1 and 2 were only capable of full-screen output using that passthrough cable, and had no conventional 2D processing capabilities of their own. The pass-through cable was essentially just a component of an automatic A/B switch: You could either visualize the output of one card, or of the other, but never both at the same time. (At least not by those means.)

          To render 3D stuff on a Voodoo 1/2 and have it displayed inside of a window instead of full-screen required[1]

  • Reminds me of plan9, beautiful design and concept.
    • Reminds me of plan9, beautiful design and concept.

      I agree about it being a beautiful design and concept. Why send expensive aggressive robots to dominate a new species you find on a new planet, when you can just raise their dead and control the masses with slow moving zombies?

      I sure hope I'm not misunderstanding your reference.

  • In the Kernel please (Score:5, Informative)

    by sgt scrub ( 869860 ) <saintium@NOSpAM.yahoo.com> on Tuesday November 08, 2011 @09:25AM (#37984046)

    David Airlie's HotPlug video work is really cool. I'm not surprised something bigger is coming out of it. What I really like are Ilija's thoughts on putting it in the kernel so the support extends beyond X. Below is from the dri-devel thread. http://lists.freedesktop.org/archives/dri-devel/2011-November/015985.html [freedesktop.org]

    On Thu, 3 Nov 2011, David Airlie wrote:

    >
    > Well, the current plan I had for this was to do it in userspace; I don't think the kernel
    > has any business doing it, and I think for the simple USB case it's fine, but it will fall over
    > when you get to the non-trivial cases where some sort of acceleration is required to move
    > pixels around. But in saying that, it's good you've done something, and I'll try and spend
    > some time reviewing it.
    >

    The reason I opted for doing this in kernel is that I wanted to confine
    all the changes to a relatively small set of modules. At first this was a
    pragmatic approach, because I live out of the mainstream development tree
    and I didn't want to turn my life into an eternal
    merging/conflict-resolution activity.

    However, a more fundamental reason for it is that I didn't want to be tied
    to X. I deal with some userland applications (that unfortunately I can't
    provide much detail of... yet) that live directly on top of libdrm.

    So I set myself a goal of "full application transparency". Whatever is
    thrown at me, I wanted to be able to handle without having to touch any
    piece of application or library that the application relies on.

    I think I have achieved this goal and really everything I tried just
    worked out of the box (with the exception of two bug fixes to the ATI DDX
    and Xorg, which are bugs with or without my work).

    -- Ilija

  • by fnj ( 64210 )

    WTF. Cathode ray tube controller? What an antiquated concept.

  • If this ever makes it out of the lab, the MPAA's gonna be on this like a ton of bricks.
