Pipes In GUIs
Caine asks: "While having some dead time at work, I spent some time thinking about how to write a good GUI. I started thinking about whether you could implement pipes in a GUI. My thought was something like this: you could connect two programs and/or widgets by a GUI pipe. This could be visualized, if the user wanted, as a thin, pulsating red line or something. For example, you could pipe your browser's status/error window to a log colorer. You could also pipe things other than text, such as pictures; if you're watching a streaming movie, you could pipe it into a graphics program's filters, for example. Basically, they would do the same as a normal "|" but with all kinds of data and in a GUI. Has this ever been done or written about before? Any implementations?" Interesting idea, but before this would be possible, we'd have to introduce the concept of discrete input channels and output channels into our GUIs, and that's not as easy as it sounds.
Good idea (Score:1)
What we need is a collection of applications that can do something useful with this concept - sort of a seed to give others something to work with.
Of course, you CAN create a named pipe in the file system and just output to it, then have another program read from it, but this only works if applications do pipe-friendly I/O, i.e., not mmap or the like.
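The named-pipe approach can be demonstrated without any GUI at all. Here is a minimal Python sketch (Unix only; the thread names and the FIFO path are invented for illustration) where one thread plays the "source" application and the main thread plays the "sink" reading from the same FIFO:

```python
import os
import tempfile
import threading

# Create a named pipe in a throwaway directory
fifo = os.path.join(tempfile.mkdtemp(), "gui-pipe")
os.mkfifo(fifo)

def producer():
    # open() on a FIFO blocks until the other end is opened too
    with open(fifo, "w") as f:
        f.write("status: page loaded\n")

t = threading.Thread(target=producer)
t.start()

# The "sink" application reads what the "source" wrote
with open(fifo) as f:
    line = f.readline()
t.join()
print(line, end="")  # -> status: page loaded
```

In a real setup the two ends would be separate processes, which is exactly why the mmap caveat above matters: only applications that read and write streams can sit on either end of such a pipe.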
I would say.. (Score:2)
java? (Score:3)
I guess the same idea could be done with X by registering for and dispatching callbacks to other panels in the same way.
Corba?? bonobo?? (Score:3)
I got it! (Score:1)
XML (Score:2)
Info and some pointers (Score:1)
"Attempts to introduce Unix shell features like pipelining into graphical user interfaces [3,6,7,8,15,16] have been unsuccessful, largely because they were not integrated well with existing applications, required extra work from application developers to expose hooks and programming interfaces, or were too hard to use."
And the refs:
[3] K. Borg. "IShell: A Visual UNIX Shell." Proc. Conference on Human Factors in Computing Systems (CHI '90), 1990, pp. 201-207.
[6] P. E. Haeberli. "ConMan: A Visual Programming Language for Interactive Graphics." Proc. ACM SIGGRAPH '88, 1988, pp. 103-111.
[7] T. R. Henry and S. E. Hudson. "Squish: A Graphical Shell for Unix." Graphics Interface, 1988, pp. 43-49.
[8] B. Jovanovic and J. D. Foley. "A Simple Graphics Interface to UNIX." Technical Report GWU-IIST-86-23, George Washington University Institute for Information Science and Technology, 1986.
[15] F. Modugno and B. A. Myers. "Typed Output and Programming in the Interface." Carnegie Mellon University School of Computer Science Technical Report CMU-CS-93-134, March 1993.
[16] F. Modugno and B. A. Myers. "Pursuit: Visual Programming in a Visual Domain." Carnegie Mellon University School of Computer Science Technical Report CMU-CS-94-109, January 1994.
Not exactly what you are looking for, but.. (Score:3)
It was actually a lot of fun to program in.
Um, Bonobo, KDE's DCOP, OLE, any drag-n-drop (Score:3)
Implementing a pipe metaphor in an entire UI framework isn't a bad idea, though it will likely have a small, techy audience. Using a nonverbal GUI to indicate what is being piped, and to where, runs into a lot of ambiguity: when you connect one widget or window to another, what exactly are you piping?
Again, truly ubiquitous embedding and linking at the desktop environment level would be useful for programmers and for some power-user applications. But I wouldn't expect it to take the world by storm in the two-dimensional, mouse-driven windowing environments we use today, outside the self-contained applications (like audio processors) that it's used in already.
Apply the concept to VR interfaces and tactile interfaces, and maybe it'll be something fluidly usable for everyday use.
GUIs and Programming (Score:1)
A few years ago, when I took an 'Introduction to Operating Systems' class as a freshman, my classmates and I were given a quick overview of the UNIX shell (using the MKS tools in DOS, ick!). Most of my fellow students hated it; I think they found it too arcane and complex.
I had a background in computer programming, though, and appreciated the richness of the environment right away. UNIX is an operating system by programmers, for programmers: its user interface is a programming language!
So you can follow me when I say I think this idea of using pipes in GUIs is just a special case of a bigger question -- how would a visual, GUI based programming language work?
I have often visualized a programming language(?) for GUIs where one could drag & drop elements around to create windows, dialogs, etc. I've never taken this much further than wondering how the actions resulting from activating those elements in the final product would be represented in such a language, i.e., how do you tell the computer what those buttons do without resorting to textual code?
In a way, I guess this is a catch-22 situation, like compiling a compiler -- at some point, somebody has to write the machine code.
I know of at least two X11 implementations... (Score:1)
The oldest thing like this that I know of is Khoros [khoral.com], a library of image processing utilities which you could wire together with Cantata [byu.edu]. Cantata allows each module to have multiple inputs and outputs (for example, there might be a module that takes two images and blends them together). It seems like a really good idea to me, and I think the GUI could be reused for other tasks besides image processing, because Cantata is only a shell for starting up multiple processes and connecting their inputs and outputs together. Khoral Research tries to make money off this product, but it looks like you can still download some stuff from ftp://ftp.khoral.com/pub/khoros/. I was running it on my Linux box in 1996 or so, and prior to that had used it at ASU on a Sun.
Re:Not exactly what you are looking for, but.. (Score:3)
I program in LabVIEW almost exclusively for my job, though I still prefer text-based languages.
Re:XML (Score:1)
One could make two apps: one to send its stdin to a Unix domain socket, and another to read from a Unix domain socket and "print" it to stdout.
Window 1:
myapp | in2sock /tmp/mysock
Window 2:
sock2out /tmp/mysock | otherapp
And there you are, provided that you implement sane stream handling in your applications (for example: send 7 bytes describing the message length, then send a message of that length, encoded in XML).
One "plus" of this compared to traditional pipes is that you can confuse the receiving end by bombarding it from multiple apps.
Ahh, sockets are beautiful.
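The length-prefixed framing described above can be sketched in Python. This is an illustrative toy, not the poster's actual `in2sock`/`sock2out` tools: it uses a 4-byte binary length header (via `struct`) rather than 7 ASCII bytes, and a local `socketpair` stands in for `/tmp/mysock`:

```python
import socket
import struct

def send_msg(sock, payload: bytes) -> None:
    # 4-byte big-endian length header, then the (XML) payload
    sock.sendall(struct.pack("!I", len(payload)) + payload)

def _recv_exact(sock, n: int) -> bytes:
    # recv() may return short reads, so loop until n bytes arrive
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-message")
        buf += chunk
    return buf

def recv_msg(sock) -> bytes:
    (length,) = struct.unpack("!I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

# Local demo: a connected pair of Unix sockets
a, b = socket.socketpair()
send_msg(a, b"<status>page loaded</status>")
msg = recv_msg(b)
print(msg.decode())  # -> <status>page loaded</status>
```

Framing is what makes the multiple-writers caveat above survivable: as long as each writer sends whole length-prefixed messages, the reader can at least tell where one message ends and the next begins.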
how about this... (Score:3)
There is a tool for MacOS called FilterTop [topsoft.org] which allows you to construct pipelines from special filter programs. You can then drag-and-drop files into the resulting filters (called 'droplets' in MacOS) and have the pipeline executed on the contents of the file, spitting the results out into another file afterwards.
I think that there was something else for the Mac, released back around 1995, that did something similar by arranging icons on the desktop, but I can't find any reference to it through Google.
Re:I would say.. (Score:1)
I don't think this needs a new kind of user interaction; just give the source application some draggable icon. If the user drags this icon to the destination app, a connection between the two is built up, and they start exchanging data.
In the classical drag-and-drop case, the data exchanged is just a piece of static data, but what prevents you from creating a permanent connection and dynamically transmitting data through it?
One GUI addition that may be needed is some kind of disconnect button.
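The drag-to-connect idea above, persistent connection plus a disconnect button, can be sketched without any real GUI. All class and method names here are invented for illustration:

```python
# Toy model of a persistent "GUI pipe": dropping the source's icon on a
# destination would create a Connection; data then flows until disconnect.

class Connection:
    def __init__(self, sink):
        self.sink = sink      # callable on the destination app's side
        self.open = True

    def send(self, data):
        if self.open:         # data flows only while connected
            self.sink(data)

    def disconnect(self):     # the "disconnect button"
        self.open = False

received = []
conn = Connection(received.append)  # created by the drop of the icon
conn.send("line 1")
conn.disconnect()
conn.send("line 2")                 # silently dropped after disconnect
print(received)  # -> ['line 1']
```

The point of the sketch is that, unlike one-shot drag and drop, the connection object outlives the gesture that created it.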
Not drag and drop, dynamic content/static connect (Score:2)
For the developer:
What each window needs is some default output connectors and input connectors with defined data types. A great example of this is how the nodes in Maya work: the whole system is built of nodes with inputs and outputs. You can link an image file to a light, and that light projects the image. Link an expression that returns an image series, frame by frame, and now you have a slide/movie projector. The examples of how useful this is in development are endless.
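The connector idea can be made concrete with a tiny dataflow-node sketch, loosely modeled on the Maya-style graph described above (the `Node` class and port names are invented, not Maya's API):

```python
# Minimal pull-based node graph: each node has named input connectors
# that can be wired to upstream nodes, and evaluates on demand.

class Node:
    def __init__(self, name, fn, inputs):
        self.name = name
        self.fn = fn              # computes this node's output
        self.inputs = inputs      # declared input connector names
        self.sources = {}         # input name -> upstream Node

    def connect(self, input_name, upstream):
        assert input_name in self.inputs, "no such input connector"
        self.sources[input_name] = upstream

    def evaluate(self):
        # Pull values through the graph, dependency-first
        args = {k: n.evaluate() for k, n in self.sources.items()}
        return self.fn(**args)

# An "image file" node feeding a "projector light" node
image = Node("image_file", lambda: "slide.png", inputs=[])
light = Node("projector", lambda tex: f"projecting {tex}", inputs=["tex"])
light.connect("tex", image)
print(light.evaluate())  # -> projecting slide.png
```

Swapping the `image_file` node for one that returns a different frame each call is exactly the slide/movie-projector trick: the wiring stays the same, only the upstream node changes.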
For the user:
I can see where people, geeks and non-geeks alike, could love something like this. If I want the output of Notepad, perhaps some HTML I am working on, to be dynamically displayed in Netscape, I should be able to do that. Or I could be working on a web page, have the text editor linked to the HTML, the web development software linked to some JavaScript, and the GIMP linked to the images. Any change in any of the tools, either cached or flushed, would update the look of the page on the fly.
Why is this good?
It gets us back to the toolkit model that UNIX was based on: build one tool, and build it right. When somebody needs that tool's functionality, pipe it. Do this with several tools and pipes and you can do anything. Too often the developer tries to throw too much functionality into the system (such as XEmacs) and bloats it, where individual tools would work wonders.
Also, if the pipes were non-blocking, each tool could live in its own process space. This way, if Netscape stopped responding, you would not lose your work.
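The non-blocking behavior described above is easy to demonstrate at the file-descriptor level. A sketch in Python (the status strings are invented; a real GUI would poll the pipe from its event loop):

```python
import os

# An ordinary pipe, with the read end switched to non-blocking mode
r, w = os.pipe()
os.set_blocking(r, False)

try:
    os.read(r, 1024)          # nothing written yet...
except BlockingIOError:
    # ...so instead of freezing, the reader just carries on
    status = "no data yet, GUI stays responsive"

os.write(w, b"<p>updated</p>")
data = os.read(r, 1024)       # now the update is there
print(status)
print(data.decode())
```

This is the property that lets the viewing tool keep servicing its own window even when the tool on the other end of the pipe has stalled.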
I see the benefits. I also see opponents pointing to similar ideas and saying they are similar enough to count. One such example is the ability to embed something in a Word doc: you can double-click on it and, through the magic of Windows, edit the data in its native app. That is close, but it's more of a cached-pipe approach. My problem with it is that the viewer (Word) is driving the changes, whereas in my earlier example, Netscape just displayed the results of the changes.
I see this order difference as important because I don't want the viewing app to control how the content is generated; I just want it to display it.
I don't know... maybe it is too late for me to think straight... but I see this as possibly being the next good direction for a GUI to take. What an idea!
Crossdot post? (Score:1)
XML-Like data filtering (Score:1)
Then the next step is to identify the available data streams. You should probably allow these filters to be inserted in different places: for example, in the data stream between keyboard/mouse and a GUI element, or between GUI elements (e.g., display additional data once valid data has been entered in an element; think of the URL entry fields in web browsers that show the known matches).
Another stream would be the various data streams the application generates and reads (i.e., the config files, or everything the user enters in, for example, a word processor).
And lastly, you will probably want to define additional connection points for these pipes so you can remotely control the application (for example, you could create a new mega application by connecting several existing apps just like you would do in a shell).
Some things to look at: RMI and CORBA for remote connectivity, XML for the data exchange format, ARexx and Tcl for how to open an application for remote control on a higher level than RMI/CORBA.
Plan 9 / Inferno does it... (Score:1)
In Plan 9 and Inferno, all objects communicate through network-transparent text or binary channels. Plan 9's main user interface framework, Acme, is built on such a set of files. The concept of "plumbing" allows combining different applications into systems, via channels, at run time.
See acme command [bell-labs.com] acme "methods" [bell-labs.com] plumb command [bell-labs.com] plumb "methods" [bell-labs.com]
Re:I would say.. (Score:2)
Now for the question that is being asked -- how do you do this in a GUI? We're not talking about the backend stuff, which everyone knows about. We're talking about what do you do with your mouse and icons (or whatever) to graphically create a pipe between two applications. How do you manipulate applications -- in a generic fashion -- to set up a pipe-like construct?
Re:XML (Score:2)
Now how do you actually get two GUI programs to talk to one another using this XML-based pipe? That's the question. It's not about the backend, it's about how you use a GUI to get two GUI programs to pipe to one another. What do you manipulate? How do you visually show the user that the programs are connected? Etc.
Re:I would say.. (Score:1)