Pipes in GUIs

Caine asks: "While having some dead time at work, I spent some time thinking about how to write a good GUI, and about whether you could implement pipes in one. My thought was something like this: you could connect two programs and/or widgets with a GUI pipe. This could be visualized, if the user wanted, as a thin, pulsating red line or something. For example, you could pipe your browser's status/error window to a log colorer. You could also pipe things other than text, such as pictures; if you're watching a streaming movie, you could pipe it into a graphics program's filters. Basically these pipes would do the same as a normal "|", but with all kinds of data, and in a GUI. Has this ever been done or written about before? Any implementations?" Interesting idea, but before this would be possible, we'd have to introduce the concept of discrete input channels and output channels into our GUIs, and that's not as easy as it sounds.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    I think this is a great idea. In the save-file dialog, instead of choosing a file, there would be a button to choose a pipe. Other programs (e.g. a graphical "tar" app) could open one of these the same way.

    What we need is a collection of applications that can do something useful with this concept - sort of a seed to give others something to work with.

    Of course, you CAN create a named pipe in the file system and just output to it, then have another program read from it, but this only works if applications do pipe-friendly I/O, i.e., not mmap() and the like.
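The named-pipe approach described above can be sketched in a few lines; the `gui_pipe` name and the messages are invented for illustration, and, as the commenter notes, it only works when both ends do ordinary read()/write() I/O:

```python
import os
import tempfile

# Create a named pipe (FIFO) in the filesystem; any two programs that do
# plain read()/write() I/O can use it as a rendezvous point.
path = os.path.join(tempfile.mkdtemp(), "gui_pipe")
os.mkfifo(path)

if os.fork() == 0:
    # Child: plays the producing application (say, a browser's status window).
    with open(path, "w") as w:
        w.write("status: page loaded\n")
    os._exit(0)

# Parent: plays the consuming application (say, a log colorer).
with open(path) as r:
    line = r.readline()
print(line.strip())  # status: page loaded
```

An application that mmap()s its input instead of streaming it would see nothing useful here, which is exactly the limitation described above.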

  • A common filetype-description protocol, carried over Unix or TCP sockets, with two connections opened as a pipe: that shouldn't be so hard. Sockets are about the simplest thing to build pipes with.
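A minimal sketch of the socket idea above: a connected pair of Unix-domain sockets behaves like a bidirectional pipe, and a GUI framework could hand one end to each widget or application. The "filetype description" message below is made up for illustration:

```python
import socket

# socketpair() gives two already-connected endpoints, like a two-way pipe.
a, b = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)

a.sendall(b"<image width='640' height='480'/>")  # hypothetical description message
data = b.recv(1024)
print(data.decode())

a.close()
b.close()
```

Between unrelated processes you would instead bind() a named Unix or TCP socket, but the read/write discipline is the same.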
  • by stubob ( 204064 ) on Thursday August 17, 2000 @09:35AM (#848599) Homepage
    I'm not sure if this is what you mean, but the Java event model seems to allow this. For example, if you type in one text window, you can send the KeyPressedEvents to another window and handle them differently there. I've never messed around with anything more complicated than forwarding MouseEvents (moving the mouse in one window recenters a map in a different window.) Also, with the JVM design, different JFrames can communicate with each other simply through object references.

    I guess the same idea could be done with X by registering for and dispatching callbacks to other panels in the same way.

    -----
  • by bmetzler ( 12546 ) <bmetzlerNO@SPAMlive.com> on Thursday August 17, 2000 @11:07AM (#848600) Homepage Journal
    Aren't things like corba and bonobo methods for communicating with applications? Either I don't understand Bonobo, or I don't understand the question, or I've had too much Mountain Dew.
  • by Anonymous Coward
    How about taking any visible text or picture from one GUI application and being able to put it into a "clipboard." Then, you can "paste" that text or picture into the next GUI application and it will just work. The type of object will somehow be stored with the object itself in this clipboard. Next, we could create a script in the GUI application, call it a "macro". This macro could manipulate the text or picture however you like. Oh, wait, this has already been done by many others.
  • by hodeleri ( 89647 )
    XML would be a good way to get things like this done. If you could find a common namespace applications would easily be able to converse with each other to use only what they want.

    --
    Eric is chiseled like a Greek Goddess
  • From http://www.cs.cmu.edu/~rcm/papers/usenix00/usenix00.html

    "Attempts to introduce Unix shell features like pipelining into graphical user interfaces [3,6,7,8,15,16] have been unsuccessful, largely because they were not integrated well with existing applications, required extra work
    from application developers to expose hooks and programming interfaces, or were too hard to use."

    And the refs:

    [3]K. Borg. ``IShell: A Visual UNIX Shell.'' Proc. Conference on Human Factors in Computing Systems (CHI '90), 1990, pp 201-207.

    [6]P. E. Haeberli. ``ConMan: A Visual Programming Language for Interactive Graphics.'' Proc. ACM SIGGRAPH '88, 1988, pp 103-111.

    [7]T. R. Henry and S. E. Hudson. ``Squish: A Graphical Shell for Unix.'' Graphics Interface, 1988, pp 43-49.

    [8]B. Jovanovic and J. D. Foley. ``A Simple Graphics Interface to UNIX.'' Technical Report GWU-IIST-86-23, George Washington University Institute for Information Science and Technology, 1986.

    [15]F. Modugno and B. A. Myers. ``Typed Output and Programming in the Interface.'' Carnegie Mellon University School of Computer Science Technical Report, no. CMU-CS-93-134. March 1993.

    [16]F. Modugno and B. A. Myers. ``Pursuit: Visual Programming in a Visual Domain.'' Carnegie Mellon University School of Computer Science Technical Report, no. CMU-CS-94-109. January 1994.

  • by GuardianLion ( 161686 ) on Thursday August 17, 2000 @11:59AM (#848604)
    There is a programming environment that works much like that, called LabVIEW. They call it a "dataflow" language, and the design is all graphical: you build components and connect them together with "wires". Piping things off to other processes (within the environment, which was multi-tasking) was really easy. It'd be hell to write a browser in, since it is built for real-time test and measurement, but I got it to do some fairly useful stuff, including some disk-based communication to get around whiny IS types.

    It was actually a lot of fun to program in.
  • by hatless ( 8275 ) on Thursday August 17, 2000 @07:08PM (#848605)
    Plenty of GUI applications have been piping between each other for years now. Apart from some scientific and audio/video processing apps, the UI representation is usually different, but the purpose and function are the same. When you embed an updateable spreadsheet in a word processing document, that's piping. When you drag a document icon to a printer icon, you're piping to the application that renders the document for printing, which in turn pipes to the print spooler. Ditto when you drag and drop a compressed file's icon onto an icon representing a decompression program.

    Implementing a pipe metaphor in an entire UI framework isn't a bad idea, but it is one that will likely have a small, techy audience. When you use a nonverbal GUI to indicate what is being piped, and to where, you run into a lot of ambiguity. When you connect one widget or window to another, what is it you're piping?
    • the content of a widget, the underlying value passed by the widget, the functioning widget itself, or the bitmap/vector graphic contents of the widget?
    • at this moment in time, or updated dynamically?
    • Plain text? HTML? Richtext?
    At a certain point in the stagnant WIMP interface, getting at these distinctions involves wizards and property sheets, and it ends up looking and feeling like an IDE.

    Again, truly ubiquitous embedding and linking at the desktop environment level would be useful for programmers and for some power-user applications. But I wouldn't expect it to take the world by storm in the two-dimensional, mouse-driven windowing environments we use today, outside the self-contained applications (like audio processors) that it's used in already.

    Apply the concept to VR interfaces and tactile interfaces, and maybe it'll be something fluidly usable for everyday use.
  • A few years ago when I took an 'Introduction to Operating Systems' class as a freshman, my classmates and I were given a quick overview of the UNIX shell (using the MKS tools in DOS, ick!) Most of my fellow students hated it; I think they found it too arcane and complex.

    I had a background in computer programming, though, and appreciated the richness of the environment right away. UNIX is an operating system by programmers, for programmers: its user interface is a programming language!

    So you can follow me when I say I think this idea of using pipes in GUIs is just a special case of a bigger question -- how would a visual, GUI based programming language work?

    I have often visualized a programming language(?) for GUIs where one could drag & drop elements around to create windows, dialogs, etc. I've never taken this much further than wondering how the actions resulting from activating those elements in the final product would be represented in such a language, i.e., how do you tell the computer what those buttons do without resorting to textual code?

    In a way, I guess this is a catch-22 situation, like compiling a compiler -- at some point, somebody has to write the machine code.

  • There's Piper [bioinformatics.org], which is exactly what you are talking about... I think it assumes that programs always have stdin and stdout, and you simply draw a data flow diagram showing the connections between them. And then you can save a configuration for later use.

    The older thing like this that I know of is Khoros [khoral.com], a library of image processing utilities, which you could wire together with Cantata [byu.edu]. However, Cantata allows each module to have multiple inputs and outputs (for example, there might be a module which takes two images and blends them together). It seems like a really good idea to me, and I think the GUI could be reused for other tasks besides image processing, because Cantata is only a shell for starting up multiple processes and connecting their inputs and outputs together. Khoral Research tries to make money off this product, but it looks like you can still download some stuff from ftp://ftp.khoral.com/pub/khoros/. I was running it on my Linux box in 1996 or so, and prior to that had used it at ASU on a Sun.

  • BTW, LabVIEW 6.0 was just released this week, and it has been available for Linux since 5.0. See http://www.ni.com/linux [ni.com].

    I program in LabVIEW almost exclusively for my job, though I still prefer text-based languages.
  • Yes, XML should obviously be used. About the common namespace:

    One could make two apps: one to send its stdin to a Unix domain socket, and another to read from a Unix domain socket and "print" it to stdout.

    Window 1:

    myapp | in2sock /tmp/mysock

    Window 2:

    sock2out /tmp/mysock | otherapp

    And there you are, provided that you implement sane stream handling in your applications (for example: send 7 bytes describing the message length, then send the message of that length, encoded in XML).

    One "plus" of this compared to traditional pipes is that you can confuse the reading end by bombarding it from multiple apps.

    Ahh, sockets are beautiful.
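The length-prefixed framing suggested above (a fixed-width length header, then an XML payload) can be sketched as follows; `send_msg` and `recv_msg` are hypothetical names, and the 7-digit ASCII header is just the poster's convention:

```python
import io

def send_msg(stream, payload: bytes) -> None:
    # 7 ASCII digits giving the payload length, then the payload itself.
    stream.write(b"%07d" % len(payload))
    stream.write(payload)

def recv_msg(stream) -> bytes:
    header = stream.read(7)
    if len(header) < 7:
        raise EOFError("truncated length header")
    return stream.read(int(header.decode("ascii")))

# Round trip through an in-memory stream standing in for the socket.
buf = io.BytesIO()
send_msg(buf, b"<status>page loaded</status>")
buf.seek(0)
print(recv_msg(buf).decode())  # <status>page loaded</status>
```

Over a real socket, recv() can return short reads, so a production version would loop until the full header and payload have arrived.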

  • by dutky ( 20510 ) on Thursday August 17, 2000 @02:05PM (#848610) Homepage Journal

    There is a tool for MacOS called FilterTop [topsoft.org] which allows you to construct pipelines from special filter programs. You can then drag-and-drop files into the resulting filters (called 'droplets' in MacOS) and have the pipeline executed on the contents of the file, spitting the results out into another file afterwards.

    I think that there was something else for the Mac, released back around 1995, that did something similar by arranging icons on the desktop, but I can't find any reference to it through Google.

  • What about drag and drop?

    I don't think this needs a new kind of user interaction; just give the source application some draggable icon. If the user drags this icon to the destination app, a connection between the two is established, and they start exchanging data.

    In the classical drag-and-drop case, the data exchanged is just a piece of static data, but what prevents you from creating a permanent connection and dynamically transmitting data through it?

    One GUI addition that may be needed is some kind of disconnect button.
    Where I see this as useful is where I have seen it implemented before. I have used a Java GUI toolkit that allowed me to create a data pipe from one node to the next. This made the programming for the app simpler. I see a similar idea for a GUI.

    For the developer:

    What each window needs is some default output connectors and input connectors with defined data types. A great example of this is how the nodes in Maya work. The whole system is built of nodes with inputs and outputs. You can link an image file to a light, and that light projects the image. You link an expression that returns an image series, frame by frame, and now you have a slide/movie projector. The examples of its usefulness in development are endless.

    For the user:
    I can see where people, geeks and non-geeks alike, could love something like this. If I want the output of Notepad, perhaps some HTML I am working on, to be dynamically displayed in Netscape, I should be able to. Or I could be working on a web page, have the text editor linked to the HTML, the web development software linked to some JavaScript, and the GIMP linked to the images. Any change in any of the tools, either cached or flushed, would update the look of the page on the fly.

    Why is this good?

    It gets us back to the tool kit model, something that UNIX was based on. Build one tool, build it right. When somebody needs that tool functionality, pipe it. Do this with several tools and pipes and you can do anything. I see where too often the developer tries to throw too much functionality into the system (such as xemacs) and bloats it, where individual tools would work wonders.

    Also, if the pipes were non-blocking, each tool could live in its own process space. This way if netscape stopped responding, you would not lose the work.

    I see the benefits. I also see opponents pointing to similar existing features and saying they are close enough. One such example would be the ability to embed something in a Word doc: you can double-click on it and, through the magic of Windows, edit the data in its native app. This is close, but it is more of a cached-pipe approach. My problem with it is that the viewer (Word) drives the changes, whereas in my earlier example Netscape just displayed the results of the changes.
    I see the order difference as important because I don't want the viewing app to control how the content is generated; I just want it to display it.

    I don't know... maybe it is too late for me to think straight... but I see this as possibly being the next good direction for a GUI to take. What an idea!
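The non-blocking pipes mentioned above are easy to demonstrate: with the writing end set non-blocking, the producer keeps running even when the consumer (a hung Netscape, in the example) never reads. A small sketch:

```python
import os

r, w = os.pipe()
os.set_blocking(w, False)  # the writer will never stall waiting on the reader

# Nobody reads from r, standing in for a hung consumer. A blocking writer
# would hang once the pipe's kernel buffer filled; a non-blocking one
# raises BlockingIOError instead, so the producer stays responsive.
written = 0
try:
    while True:
        written += os.write(w, b"x" * 4096)
except BlockingIOError:
    pass

print("buffered", written, "bytes without blocking")
os.close(r)
os.close(w)
```

A real tool would react to BlockingIOError by buffering in user space or dropping updates, rather than giving up, but the isolation property is the point: one stalled tool cannot freeze the others.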

  • I hope he's not talking about the same type of pipe that they do on smokedot [smokedot.org]. (AHHH they are down! ahh!!)
  • Something I once saw might be interesting. It was a few years before anyone thought about HTML and the basic idea was to create a "well defined" data stream from arbitrary data (something XML will do, for example). Then you could use various filters which work only on the data in the stream which they know and pass the rest on unmodified.

    Then the next step is to identify the available data streams. You should probably allow these filters to be inserted in different places, for example in the data stream between keyboard/mouse and a GUI element, or between GUI elements (for example, to display additional data when some valid data has been entered in an element; think of the URL entry fields in web browsers that show the known matches).

    Another stream will be the various data streams the application generates and reads (i.e., the config files, or everything the user enters in, for example, a word processor).

    And lastly, you will probably want to define additional connection points for these pipes so you can remotely control the application (for example, you could create a new mega application by connecting several existing apps just like you would do in a shell).

    Some things to look at: RMI and CORBA for remote connectivity, XML for the data exchange format, ARexx and Tcl for how to open an application for remote control on a higher level than RMI/CORBA.
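The "filter that acts only on data it recognizes and passes the rest on unmodified" idea above is easy to sketch; the `url:` record tagging below is invented for illustration:

```python
# A pass-through filter: transform records you understand, forward the rest.
def url_highlighter(lines):
    for line in lines:
        if line.startswith("url:"):
            # This filter knows about URL records and decorates them...
            yield "url: <b>%s</b>" % line[4:].strip()
        else:
            # ...while unknown data passes through unmodified.
            yield line

stream = ["title: hello", "url: example.org", "body: some text"]
filtered = list(url_highlighter(stream))
print(filtered)
```

Because every filter preserves records it doesn't understand, such filters compose like shell filters: chaining two of them is just function composition over the stream.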
    In Plan 9 and Inferno, all objects communicate through network-transparent text or binary channels. The main Plan 9 user interface framework, acme, is built on such a set of files. The concept of "plumbing" allows combining different applications into systems of channels at run time.

    See acme command [bell-labs.com] acme "methods" [bell-labs.com] plumb command [bell-labs.com] plumb "methods" [bell-labs.com]

  • That's nice.

    Now for the question that is being asked -- how do you do this in a GUI? We're not talking about the backend stuff, which everyone knows about. We're talking about what do you do with your mouse and icons (or whatever) to graphically create a pipe between two applications. How do you manipulate applications -- in a generic fashion -- to set up a pipe-like construct?
  • Okay, great. You use XML.

    Now how do you actually get two GUI programs to talk to one another using this XML-based pipe? That's the question. It's not about the backend, it's about how you use a GUI to get two GUI programs to pipe to one another. What do you manipulate? How do you visually show the user that the programs are connected? Etc.
  • You know, that's such an intuitively obvious answer that I feel like slapping myself in the forehead for not having come up with it myself. (smack) Ah, that's better.