Technology

The State of Remote Desktops? 484

frenchgates writes "It became clear to me (when my main machine had to be sent away for repairs for a week) that it's high time to finally divorce myself from any particular computer by using data and software accessible from any internet connected computer as much as possible. I'm talking Visual IDEs, productivity apps, powerful, easy to use email client, etc, all presented to me consistently from computer to computer on my remote virtual desktop. Is anyone seriously trying this? What are the best practices and best applications? What are the biggest shortcomings? What if I limit my demand to "accessible from any internet connected Windows machine with Java installed?" Are there good web sites devoted to this noble goal?"
This discussion has been archived. No new comments can be posted.

The State of Remote Desktops?

Comments Filter:
  • Re: (Score:2, Informative)

    Comment removed based on user account deletion
    • Re:VNC (Score:5, Informative)

      by damien_kane ( 519267 ) on Friday March 22, 2002 @04:39PM (#3209447)
I have used VNC before, and not only does it support acceptable refresh rates over a broadband connection, but it also has built-in support for connecting over a Java client (if enabled) through its own server.
      Because of this you can access it anywhere that you can open a browser.
      I highly recommend it.

      Remote Administrator ($hareware I believe) is also quite good.
      I used it for a project when I was in school... My friend and I set up a VPN between two networks and a roaming host (my laptop on a dialup connection).
      To display most of our data, as we required three internet connections (two networks + roaming host), we left our main setups at our houses and connected to them over Remote Administrator.
      It worked well and we received 98% on our presentation.
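The built-in Java viewer mentioned above is served by VNC's own small HTTP server, on a port derived from the display number. A minimal sketch of the port conventions (the host name is hypothetical):

```shell
# VNC port conventions: display :N accepts native viewers on TCP 5900+N,
# and the built-in HTTP server offers the Java applet page on 5800+N.
DISPLAY_NUM=1
VNC_PORT=$((5900 + DISPLAY_NUM))
HTTP_PORT=$((5800 + DISPLAY_NUM))
# Native viewer:  vncviewer somehost:1
# Any browser:    http://somehost:5801/
echo "vnc=$VNC_PORT http=$HTTP_PORT"
```

So "access it anywhere you can open a browser" really means: anywhere that can reach port 5800+N and run a Java applet.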
    • Re:VNC (Score:4, Informative)

      by edgarde ( 22267 ) <slashdot@surlygeek.com> on Friday March 22, 2002 @04:44PM (#3209487) Homepage Journal
Citrix works nicely if you've got the $$. I've seen it in use but have never set it up or administered it.

      VNC works great with Windows & Linux clients, and Linux servers (Windows servers are limited to a single desktop at this time I believe). You need to install a VNC client, but I consider it the best alternative.

Another product called Tridia VNC [tridiavnc.com] (here's a review [unixreview.com] from UnixReview.com [unixreview.com] ) works in any browser supporting Java 2. I find it inadequate for most users because the screen refreshes are poor, but I use it for my stuff and I'm good wherever I go.
      • by nowt ( 230214 )
        Citrix is really for using a single, powerful server and serving up the equivalent to many vnc sessions simultaneously. One sad drawback (apart for all the $$$) is sucky printer support.

        • Amen on the printer support. Especially in the XP version. That doesn't work anywhere near like they advertise. There are workarounds though.

          But I'll have to disagree with you on the single server part. I have three of them here that are load-balanced into one farm. It actually works well that way. As long as I don't get a user doing something to hog processor time.

          You are correct though when you say you need powerful servers. Some java sessions inside IE will take up 30M of memory. If I get 10 people doing that on a single box it won't take long for the box to go into vapor lock.
    • I wouldn't suggest VNC -- at least for Windows. It is just _too_slow_! I mean, it's perfectly fine for doing odd jobs. But you'd never want to use it as your main user interface!
    • Re:VNC (Score:2, Informative)

      If you're using a Mac, skip VNC. In my experience, the Mac server is very finicky and crashes reliably whenever anything graphic-intensive is invoked.

    • Re:VNC (problem) (Score:3, Insightful)

Problem with VNC on Windows - the client works fine, and the server works fine for remote administration, but true remote desktop work is better on *NIX. One can actually have multiple concurrent sessions on VNC for *NIX under X. One cannot do that in Windows.

      Besides, VNC doesn't include encryption. You can tunnel it through a VPN or SSH or IPSEC etc, but that's it.

Don't get me wrong - I LOVE VNC - I use it at EVERY client site as a remote administration and troubleshooting tool on Windows. I've sat on the mailing list in the past. Quentin Stafford-Fraser, Wez & co. at AT&T Labs and Cambridge U. do an OUTSTANDING job - but there are limitations (in MS Windows, mind you - not VNC) that make it not so great for Windows remote desktop applications. Built-in encryption would be nice too.
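Since VNC itself carries no encryption, the SSH tunneling mentioned above is the usual workaround. A minimal sketch (user and host names are hypothetical):

```shell
# Forward a local port to the VNC server's display :1 over SSH, then
# point the viewer at the local end so all traffic rides the tunnel.
DISPLAY_NUM=1
VNC_PORT=$((5900 + DISPLAY_NUM))   # display :N listens on TCP 5900+N
echo "ssh -L $VNC_PORT:localhost:$VNC_PORT user@vnchost"
echo "vncviewer localhost:$DISPLAY_NUM"
```

The viewer thinks it is talking to a local server; SSH carries the session to the real one.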

    • Re:VNC (Score:2, Informative)

      by mossmann ( 25539 )
      Tarantella does indeed kick ass, and not just for big companies. The 10 user Starter for Linux license is pretty affordable. A single Tarantella server can provide remote access from a Java client in a web browser to graphical or character applications running on Unix, Windows, AS/400, mainframes, and more.
  • Windows Terminal Server does this. There are companies that will host Terminal Services from their site and you can access them from anywhere.
  • by ABadDog ( 28370 )
    All you need is VNC [att.com].
  • Windows PC? (Score:2, Interesting)

    by Amarok.Org ( 514102 )
    Not bashing Microsoft, but...

Why limit yourself to a Windows desktop? You specifically mention Java; isn't the point of Java to be platform-independent?

Shouldn't the goal be accessibility from any type of machine?

    • by jgerman ( 106518 ) on Friday March 22, 2002 @04:43PM (#3209485)
      Technically java only runs on one platform. The JVM. ;)
      • No. Java IS the platform. Programs written in Java run on the Java environment, but if Java ran on itself, it'd be like that old saw about the Earth being a plate on the back of a giant turtle: "turtles all the way down". :)
No, Java the language runs in the Java environment, otherwise known as the JVM - in the same way that you said programs written in Java run on the Java environment. It's all semantics; the point is that the Java language is supposed (not always the case though) to be platform independent, but only because it will run on any environment with a JVM ported to it.
Didn't you read the post? He had to SEND HIS MACHINE AWAY to get fixed. Sounds like a dyed-in-the-wool Windows user to me, hehe <grin>. ;o
  • Depends on the goal (Score:3, Informative)

    by Lxy ( 80823 ) on Friday March 22, 2002 @04:34PM (#3209400) Journal
Are you talking about thin client or remote desktop access? VNC does pretty much what you're looking to do. One computer, one set of data, accessible from anywhere depending on how you set it up. I believe there's also a Java version that you could easily run off an Apache server.

    Otherwise check out www.ltsp.org for terminal services/thin client options.
    • VNC does pretty much what you're looking to do.

      I use PCAnywhere a lot for managing machines over a VPN, and I would kill myself if I had to use a system like this to do what this person is talking about. Unless you're running the PCAnywhere (or VNC) server on a really fast machine, and your network connection is at least 10base-T, you're going to hate working through this because of the lag-time in the UI.
  • http://www.xfree86.org [xfree86.org]. Free, open source, cross-platform. Everybody wins.
  • Not so fast (Score:5, Insightful)

    by bribecka ( 176328 ) on Friday March 22, 2002 @04:35PM (#3209404) Homepage
    it's high time to finally divorce myself from any particular computer by using data and software accessible from any internet connected computer as much as possible.

    The problem is, even if you're doing everything remotely, you're pretty much stuck using one computer as a central repository for everything--programs and data. Unless you are planning on keeping sensitive data all over the place, it all has to physically reside somewhere.

    And if you do replicate everything, what about keeping consistency?? This problem you have will always be around. Okay, so you use Hotmail as your email client so you can access it from everywhere...what about a Hotmail outage, or MS goes out of business? :)
    • Re:Not so fast (Score:2, Insightful)

      by kaisyain ( 15013 )
      you're pretty much stuck using one computer as a central repository for everything--programs and data

No, you're stuck using something that looks to the end user like one computer. It is pretty easy to have satisfactory amounts of redundancy with clustering, SANs, etc.

      And if you do replicate everything, what about keeping consistency??

      Amazingly enough people have been researching this for years and that's why we have things like RAID and Amoeba and AFS that solve all the complaints you've brought up.

      It's not that there aren't solutions out there, it's that the solutions aren't viable in many settings either because of bandwidth constraints or because people like having a computer they can call their own.
You have a replication process that handles the replication, i.e. you click backup and a program puts your data wherever you want it. The only realistic thing you can do at this point is keep your data on a server that's not likely to go down. My company had five-nines reliability on their servers, so we could do this and host the data there.
    • We actually have a pretty nice remote PC setup here at where I work.

The way it is set up is that the company standardized ALL of its PCs and laptops. So first off, everyone has the exact same "workstation": same processor, same RAM, hardware, etc. Then we have the exact same basic software installed on every single machine (MS Windows, Office, anti-virus, etc.). Therefore, when we download or connect to our workstation accounts, we don't have to download the application software as well. Everyone's personal account files (work-related files, email, etc.) then reside not on the PC in their office, but on a remote drive that can be accessed anywhere on the company's network - worldwide. The way it's set up now is that I can leave my Michigan office, fly down to Texas, go to a company PC there, log in with my user ID and password, and bring up all of my email and working documents and work there. When I'm done, I save everything, shut it down, and then I can access it again at another PC somewhere else. We have people here who access their files from around the world, as they travel to all of the company's locations.

Unfortunately, I can't comment on the server software we use that allows all of this, as I'm merely a user of the system, rather than an administrator. However, in the time I've been using it, it works VERY well, preventing downtime if the PC in your office goes down.

      Now the system does have its problems. If the server which handles your personal account is down, you can't access any of your files or email. If a network connection is down or slow, same problem. I usually make backups of very important files on my office PC so that if the network crashes, I can still work.

Now that I've described what we have, it makes me think that what I've described is more of a LAN or company-wide network accessible system, rather than internet-based, accessible by any dial-up or other connection. However, it does work, allowing all of us here to use just about any computer anywhere on the company network.
    • Re:Not so fast (Score:4, Interesting)

      by brer_rabbit ( 195413 ) on Friday March 22, 2002 @05:37PM (#3209892) Journal
      The problem is, even if you're doing everything remotely, you're pretty much stuck using one computer as a central repository for everything--programs and data.

That isn't painful with unison [upenn.edu]. I use this to sync my laptop and desktop. Unlike rsync, unison can propagate changes in *both* directions. This allows me to keep my home directory consistent. And for the paranoid, it can even be used over ssh.
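As a sketch of the two-way sync described above (the host name and paths are hypothetical), unison takes two "roots" and reconciles changes between them, with ssh:// syntax for the remote side:

```shell
# Two-way sync of a home directory between desktop and laptop.
# Edits made on either machine are propagated to the other.
LOCAL=/home/me
REMOTE=ssh://desktop//home/me   # unison's ssh:// root syntax
CMD="unison $LOCAL $REMOTE"
echo "$CMD"
```

Run the same command from the laptop whenever you switch machines, and the two home directories converge.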

    • Central Data (Score:5, Informative)

      by fm6 ( 162816 ) on Friday March 22, 2002 @06:25PM (#3210148) Homepage Journal
      The problem is, even if you're doing everything remotely, you're pretty much stuck using one computer as a central repository for everything--programs and data.

I used to work at Sun, and that's precisely the approach they use for the corporate WAN. It's partly about being able to access your data from anywhere, but it's mainly about the difficulty of backing up data that isn't on servers. (Though that always struck me as kind of strange, since Sun sells backup applications that catch workstation data.) Such a setup has obvious advantages, but there were glitches:

      • The only way to enforce this policy is to be very, very sticky about who gets a superuser password for their own workstation. I guess that's fine for admin people, but it can be pretty painful if you're a technical type and need to do some tweaking.
      • Any network or server issues, and everybody's in trouble. One amusing day, network traffic slowed to a crawl. Now, the standard text editor at Sun is a kitchen-sink implementation of XEMACS -- run entirely off the network. (Guess keeping it up to date was a priority!) Except for those who had their superuser passwords and had taken the time to do a local XEMACS, everybody found their editor stopping for about five minutes every time they did something that loaded a module. One guy who was on deadline had me come to his office and edit his files in Vi, according to changes he dictated to me.
• The whole setup requires a fairly complex NIS-driven automounter setup. The basic setup was quite sound, but basically broke if your automounter daemon crashed - and mine seemed to at least once a week. It worked out in the end: IS got tired of my service calls and let me have my superuser password!
      • If all your apps are on a server, you have to live with the configuration decisions of whoever maintains the server. Sometimes not the right ones...
• We were always running short of disk space. Never mind that terabytes of workstation disk space were going unused...
  • These people are (Score:5, Interesting)

    by Rupert ( 28001 ) on Friday March 22, 2002 @04:35PM (#3209405) Homepage Journal
http://www.uk.research.att.com/spirit/ [att.com]
  • Ugggh! .NET (Score:2, Interesting)

    by Eric Damron ( 553630 )
    What you describe is Microsoft's .NET and Passport strategy. I'm not trying to get modded down as Flamebait but to tell you the truth, the biggest disadvantage IMHO is that it would be controlled by a company that I don't trust. Specifically Microsoft.

    I like the concept of being able to access MY desktop from anywhere but it opens up a few security concerns. Security doesn't seem to be Microsoft's strong suit.
  • by Neil Watson ( 60859 ) on Friday March 22, 2002 @04:35PM (#3209412) Homepage
    That is the whole theory behind X.

    Getting people to close their slack jaws and look away from Microsoft is another story.
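The "whole theory behind X" is that the display protocol is network transparent: a client draws on whatever server DISPLAY points at. A minimal sketch (the host name is hypothetical; the port arithmetic is standard X convention):

```shell
# Raw X: DISPLAY=host:N tells clients to connect to TCP port 6000+N on
# "host". In practice it's safer to let sshd forward X instead, e.g.:
#   ssh -C -X user@workbox xterm
# (-X forwards X11, -C compresses, useful on slow links)
DISPLAY_NUM=0
X_PORT=$((6000 + DISPLAY_NUM))
echo "raw X display :$DISPLAY_NUM would use TCP port $X_PORT"
```

With ssh -X, sshd sets DISPLAY on the remote end automatically, so the remote app simply appears on your local screen.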
    • by electroniceric ( 468976 ) on Friday March 22, 2002 @06:36PM (#3210207)
I'm trying to restrain a rant, but this kind of pooh-poohing is exactly why Linux continues to look like everyone's kid brother.

      I am a student, with the opportunity of working at home. At school - fairly good T1. At home 256K DSL. Which means as connectivity goes, I'm actually quite well off. I run Mandrake and Windows on both ends. In this setup (note again that it is an above-average one), I can tell you that using X (over SSH with compression enabled), Matlab (java app) runs juuuuust barely fast enough to be usable. Any KDE/GNOME apps - forgetaboutit. I used VNC for about 20 minutes before getting tired of waiting for the pointer to catch up to my mouse. My then-roommate, who works for Microsoft, could easily use PPTP to connect to his TermServ machine over the same connection. Not at all sluggish. In fact, he could even do it over dialup (then it was sluggish).

      X windows does what it was designed to do - let you redirect displays over the local network, but it's not a long-distance remote access answer.

      If we Linuxites want remote connectivity for desktop apps, we'll need to figure out how to make higher-level RPC calls. Being a KDE user, I'd love to see this built into QT or KDE.

      That's the desktop part. Now the data storage part:
      In our glorious remote computing future, your data is stored in the "network cloud". Microsoft will implement this by selling Cloud Server 1.0, which only works if you have Microsoft Synchronization Server running on Whistlerhorn XPDQ.

But rather than trying to do things exactly the way MS does, we can do them the Linux way: make a "cloud" that you can tweak to your little heart's delight. Example: my cloud = my home box via DSL, an extra backup box at home, a work computer, and a PDA. Mandrake could hypothetically build a nice installer that sets up a generic configuration for adding storage to my cloud, and some preconfigured synchronization settings. It won't snap into a network quite as smoothly as MS Cloud Server, but if I want to change the kernel latency for the cloud-synching process, I can just go ahead and do that. All on my own machines...
  • Apple (Score:3, Informative)

    by BrianGa ( 536442 ) on Friday March 22, 2002 @04:36PM (#3209416)
    Apple [apple.com] is planning to introduce remote desktops [wincent.org].
This information was just speculation, speculation from before MacOS X Public Beta was even announced. It assumed that Quartz would be based on Display PostScript. MacOS X Server 1.x was, but MacOS X (and X Server 10.x) is not. The capabilities for remote display talked about from NeXTStep did not make it into Quartz, and thus are not in MacOS X.

      Notice the 2001 copyright date...
    • Re:Apple (Score:3, Informative)

      by 2starr ( 202647 )
      Already done. [apple.com]
  • by TimFreeman ( 466789 ) <tim@fungible.com> on Friday March 22, 2002 @04:36PM (#3209418) Homepage
    TightVNC is available here [tightvnc.com].
  • by kaladorn ( 514293 ) on Friday March 22, 2002 @04:36PM (#3209419) Homepage Journal
    If you want to be served graphics over a link, and want responsiveness and resolution, then you will require a high speed connection. Add to that the thought that if you plan to have a virtual desktop encompassing a large data store, you're talking about having this on-line somewhere and again you are talking about a good high speed connection. And of course, storage space.

For many of us, good high-speed connections are still the holy grail and things like VNC sort of work over the Internet, but if your server machine goes away, suddenly you don't have access to your data, etc., and over a slow link, VNC is kind of choppy.

    As the ubiquity of high speed links grows, and the cost of on-line storage and access goes down, and as the feasibility of decent data-security goes up, this kind of idea should become more generally interesting. It isn't a bad idea now... it just isn't a terribly viable business for anyone to get into yet I don't expect.
  • It became clear to me (when my main machine had to be sent away for repairs for a week) that it's high time to finally divorce myself from any particular computer by using data and software accessible from any internet connected computer as much as possible


It didn't occur to you to get a spare machine, and invest in a CDRW drive or ethernet hub instead?

You would still have to put your data and applications on a dedicated server. And if that server crashes you'll still lose your access...

And what about software licenses? Can you put an application on a server and access it through a remote desktop? (Microsoft doesn't permit this, I think.)
    • What Microsoft would like to see is a world where everyone would have a Microsoft Passport account. You could then logon to any computer that had an Internet connection (a fast one) and after verifying your identity they could pump the byte codes to the applications that you have licensed.

      This shifts Microsoft's business model from a seller of software to a seller of services. Instead of buying software like MS Office we would pay Microsoft for a service that would allow us to use the software from ANY PC.

      .NET and Passport work hand in hand and Microsoft is betting on being able to monopolize Internet services. And IMHO they are using their Monopoly power in the current OS market to do that.
  • On Macs (Score:3, Informative)

    by zephc ( 225327 ) on Friday March 22, 2002 @04:37PM (#3209427)
    Apple has their Apple Remote Desktop [apple.com] now, which is apparently pretty damn cool
While ARD is cool, and its predecessor "Apple Network Assistant" was cool, it is not set up to be what is envisioned here.

However, you could get a much better system if you simply had an Active Directory/Macintosh Manager system. Or one like Sun has, where you take a smart card and plug it into a not-so-thin client.
  • X Windows (Score:3, Insightful)

    by Philbert Desenex ( 219355 ) on Friday March 22, 2002 @04:37PM (#3209430) Homepage
    You need to start using X11. The Windows API - embodied in Win32 - simply has troubles if you "remote" it.

You need to start using a remotable ("network transparent") windowing system. All your apps will come along with it. All of the modern windowing systems (X11, Be, whatever Apple calls NeXTStep now) are network transparent. Use a modern OS and a modern windowing system will come along for the ride.

Oh wait - you want Word, I mean "productivity apps", to come along? I think you're stuck with being tied to a particular computer. And the situation there will only get worse - the DMCA and newer EULAs are going to make it harder and harder to do things like have a backup, use a remote desktop, etc.
    • The Windows API - embodied in Win32 - simply has troubles if you "remote" it.

      What do you mean? I've found that it is very easy to do most command line tasks with nothing more than a remote web browser after someone told me about the nifty "code red" tools that came pre-configured with my AOL subscription...
But with X you get tied to a similar problem: the application's well-being inevitably gets tied up with the client's. If the X server on the "client" dies, the X connection is closed, and the application using that connection will exit/close/whatever as a result. It's like sshing in and starting an ftp; if the client dies, the ftp client may die a horrible screaming death by not being able to access the controlling terminal. Even if it goes on fine, you have no way of ever seeing the output to know for sure. The solution for ssh stuff is to run screen, to abstract the interface from the current device of access. In much the same way, you can use something like VNC to provide the same abstraction for graphical apps.
    • The Windows API - embodied in Win32 - simply has troubles if you "remote" it.

      I hate to voice an unpopular opinion on Slashdot but my personal experience has been very different.

      I have a desktop machine, a laptop and a home computer. I have been using Remote Desktop (Microsoft Terminal Services Client) very regularly for about 8-10 months now. I use it to connect to my desktop when I'm at meetings (for demos, to start off or check on the status of a job, use software I don't have on my laptop, etc) as well as from home (after VPNing in to the Corporate network). I've also used it to connect to my home computer from work (when I've left it VPNed in)

      Maybe I'm not hitting the corner cases or demanding enough of it but I have yet to experience a single problem with Terminal Services. I can't think of a single task for which Terminal Services has not sufficed and in several instances the performance (window rendering, etc.) was a lot better than I had experienced over similar bandwidth connections with X11.

Of course, I've not had to worry about EULAs, licensing issues, etc. since my company has a site license for all the software I use.

      YMMV.
  • Sun Ray (Score:4, Interesting)

    by GeorgieBoy ( 6120 ) on Friday March 22, 2002 @04:37PM (#3209437) Homepage
    One of the niftier solutions I saw in use at Sun were Sun "Ray" stations, which were little boxes that had video/input/audio/etc. on them, no fan, and they were basically dumb terminals. You would insert your ID card and your desktop would come up immediately. It "just worked". Unfortunately it requires Sun hardware, but is quite interesting nonetheless. Citrix is the other environment that comes to mind. If you want free you'd need VNC though.
    • This is certainly implementable. Diskless terminals with a slot for a card with a micro drive on it.


It would be pretty cool. Even for home use: set up a server in a closet somewhere, RAID disks, etc. Your card would hold desktop configs (or where to look for configs) and everything else would be stored on the server. You could save the current state of your desktop, go to another terminal, and pick up right where you left off. Come to think of it, a microdrive is unnecessary. Username login with X11 would solve it. So all you really need is a HA server and some XTerms.

  • 1. VNC. I use this for some of the dev servers at work and it is reliable, and has greater stability than option #2.

2. PCAnywhere. Good, but expensive (as opposed to VNC, which is free). Also, it is a pain in the ass to upgrade this along with Windows, if that is your OS of choice.

    3. Last windows option i'll mention is Remote Desktop. You'll need a server to store the profiles, but this will probably take care of most of your needs, assuming you keep your most important apps server side.

    I know these are windows centric, but that is my current platform. Hope it helps a little.
I like the remote desktop features in WinXP.

I use it to access my home machine from work with the MS Terminal client (fits on a floppy) without any trouble at all. It's not great, and if it's not over a VPN you are open to being sniffed, but it is very simple to use. It also helps that I have a static IP on my DSL line.

    The one drawback is that it only allows one session at a time, if you are logged in at the console then you must end that session to log in remotely. Kind of a PITA.

  • I needed an answer to this today anyway, so perfect timing..

    what ports do I need to forward via ssh to use windows terminal server?

    not my choice.. I get to go home for the weekend if I can remote control the servers from there.
    • 3389. Found in the man page for rdesktop - http://www.rdesktop.org/ or apt-get install rdesktop.

      Wrt the original poster's question, I suggest a combination of ssh and putting your email on an IMAP server. That way all your email is kept server-side but your email client happily manipulates it as if it's local. Any decent email client supports IMAP. Of course, if you're not on *nix then the server side of the equation might be interesting...
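Putting the two answers above together (the gateway and server names are hypothetical): RDP listens on TCP 3389, so you forward a local port through the SSH-reachable box and point rdesktop at the local end.

```shell
# Tunnel Windows Terminal Server (RDP, TCP 3389) through an SSH gateway
# so the terminal server is never exposed directly.
RDP_PORT=3389
echo "ssh -L $RDP_PORT:termserv:$RDP_PORT user@gateway"
echo "rdesktop localhost"
```

The same pattern works for IMAP (port 143) if you'd rather not expose the mail server either.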
  • This is not exactly what you're looking for (eg it will only work within the domain you have), but it's something I make available to clients when I consult.

    Basically, you keep your applications and data on a server (it can easily be distributed across multiple servers if you like). When you log into a machine, it automatically installs required applications for you. Your data, desktop, and all that are available immediately.

    For something more what you're looking for, I played around with desktop.com for a while (dunno if it's still the same thing). It seemed nice, but limited.

    That's the only problem I have with using anything but an MS solution. I have yet to find one that provides as wide a range of apps, etc with a decent level of functionality. If there were a foundation that had those options, you bet I'd be on it. I wish I had more time to develop, though, so I'd have actual whining rights.
  • It doesn't provide all of what the original poster asks for (and maybe I'm missing the point entirely) but I get some of what the user asks for with good old telnet (well, ssh now) and running my own servers and using my old academic account. The old academic account I use for Usenet, so I have the same newsrc etc from home and work. And I rent webspace, so it's someone else's job to make sure my information is always available...my VisualIDE is emacs ;-)

    Joking aside, where your data lives is an issue you can't escape. Either you resort to sneaker net, burning to a rewritable cd or other portable media, or you store data on a remote server--which doesn't "divorce yourself from any particular computer".

Assuming you don't want to go to full unix remote shell mode, I'm not sure if Java or any other platform-independent software suite is going to meet your need. My realpolitik approach is that I'm almost always given a Microsoft desktop, so I've assembled my own collection of favorite tools (editors, compilers, etc) and burnt 'em to CD. So I can be up and running in a new environment quickly.
Maybe it's a pipe-dream, but some sort of web-based IDE (for Java, or .NET, or even for QT or MFC, or whatever) would be an incredible thing for collaborative development across multiple locations. Imagine the boon to Open Source development to be able to have the whole world look at your project through an IDE, exactly the way all of the core developers see it (of course, there would be multiple layers of security built in). You could easily allow for code submission, which could then be approved by the core team.

These days, it seems that learning the IDE is more tricky than learning the language itself. If a single IDE gained worldwide acceptance based on its Web interface, there would be millions of developers suddenly all able to work together. They could all be instantly collaborating on 100% free software for 100% free platforms! Linux would become so inundated with quality software written by the masses that the shrink-wrap industry would go belly-up!

    Ok, I'll put down the pipe.
  • TightVNC (Score:5, Informative)

    by xTK-421x ( 531992 ) on Friday March 22, 2002 @04:43PM (#3209482) Homepage
    For all the people recommending VNC, I also recommend TightVNC [tightvnc.com]. It's a branch of the VNC code except it's optimized for low bandwidth communication. I have found it to be much better than normal VNC. (Information below stolen from the homepage)

• Local cursor handling. Cursor movements do not generate screen updates any more; remote cursor movements are processed locally by the viewer, so you do not see the remote cursor pointer moving too slowly behind the local cursor.
    • Efficient compression algorithms. New Tight encoding is optimized for slow and medium-speed connections and thus generates much less traffic as compared to traditional VNC encodings.
• Configurable compression levels. You can choose any appropriate level of compromise between compression ratios and coding speed, depending on your connection speed and processor power.
    • Optional JPEG compression. If you don't care too much about perfect image quality, you can enable JPEG coder which would compress color-rich screen areas much more efficiently (and image quality level is configurable too).
    • Web browser access. TightVNC includes Java viewer with support for Tight encoding and local cursor feature (viewer applet may be accessed via built-in HTTP server as in the standard VNC).
    • Operating under Unix and Windows. All new features listed above are available in both Unix and Win32 versions of TightVNC.
    • Advanced Properties dialog in WinVNC. Unlike the standard VNC, TightVNC lets you set a number of advanced settings directly from the WinVNC GUI and apply the changed settings immediately. There is no need to launch regedit to set query options or connection priority, to allow loopback connections, to disable the HTTP server, etc.
    • Automatic SSH tunneling on Unix. Unix version of TightVNC viewer can tunnel connections via SSH automatically using local SSH or OpenSSH client installation.
    • And more. A number of other improvements, performance optimizations and bugfixes, see WhatsNew [tightvnc.com] and ChangeLog documents.
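    For what it's worth, here's a rough sketch of how those options get used in practice. The flag names are from my recollection of the TightVNC man pages, so double-check them against your version; the hostname is a placeholder:

```shell
# Start a TightVNC server on display :1 (Unix side)
vncserver :1 -geometry 1024x768 -depth 16

# Connect from the viewer, requesting Tight encoding with lossy
# JPEG compression (quality runs 0-9; lower = smaller but uglier)
vncviewer -encodings tight -compresslevel 9 -quality 5 myhost.example.com:1
```

    Over a modem you'd push -quality down; on a LAN you can skip the JPEG option entirely.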
  • Well, it depends... (Score:4, Informative)

    by Junta ( 36770 ) on Friday March 22, 2002 @04:45PM (#3209491)
    Are you still in control of the system hosting the remote desktop? Is there truly an expected higher reliability factor involved with that server? You need to carefully consider this question, as it may be the case that you are only buying yourself an imagined higher level of reliability.

    If you can justify your assumption, then it depends on platform.

    Under Unix systems, two very good tools come into play. screen provides very good abstraction of text-based applications from their controlling ptys. For X stuff, you are pretty much stuck with something like VNC. VNC is kinda bandwidth-heavy, but TightVNC (www.tightvnc.com) really helps on low bandwidth. VNC is a recommendation *only* if you need guaranteed persistence of apps, even if the client machine crashes or you need to relocate and cannot afford to close the app. If you just need to pull up apps as you need them, native X11 can be used from pretty much any client. From Windows you can use either Exceed or WeirdX (free) and you have remote access, but if your client machine goes haywire, so does your app. In this way, VNC is analogous to X11 the way screen is analogous to ssh or telnet: both prevent client problems from destroying control or output of an application.

    Now under Windows, Terminal Services can be used to fill this role. Your client disconnects and you can resume with another right where the screen left off. You might be able to get Citrix to do that as well, but my experience with Citrix has been more about providing X11-type functionality as opposed to VNC type reliability. VNC also works with Windows, but Terminal Services is a much more lightweight beast.

    All this said, I personally use VNC on a Unix system for long term graphical applications. That way if I need to reboot my desktop for some reason, the VNC sessions and the various screen controlled terminals will be available for pickup at my next convenience.
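    To make that workflow concrete, here's roughly what it looks like day to day (standard screen and VNC usage; the hostname is a placeholder):

```shell
# On the server: start a named screen session for text apps...
screen -S work
# ...detach with Ctrl-a d, then later, from any ssh login:
screen -r work        # reattach exactly where you left off

# For graphical apps: start a persistent VNC desktop on display :2...
vncserver :2
# ...and from whatever client machine you happen to be on:
vncviewer server.example.com:2
```

    Either way, the apps keep running on the server between connections, which is the whole point.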
  • by Lord Sauron ( 551055 ) on Friday March 22, 2002 @04:46PM (#3209501)
    On how and what you do with your machine. Forget it if you work with desktop publishing or use heavy graphics...

    Now if you want to do light stuff, such as instant-messaging you can use ICQ Lite [icq.com], a web-based ICQ client.

    For e-mail you can use any webmail. There are thousands.

    If you want to compile small programs, you can quite easily make a CGI that does this. It would get the program as input in a form and send back the compiled version.

    But, as already mentioned, VNC would be very helpful, as it lets you access your own machine from anywhere. And did you know that you can have a Unix VNC server and use Windows as the client? The opposite is also true. Heck, you don't even have to install a client; you can access it via a Java applet through a browser. So VNC would help a lot on your quest.
  • by compumike ( 454538 ) on Friday March 22, 2002 @04:49PM (#3209531) Homepage
    The Linux Terminal Server Project [ltsp.org] is exactly what you're talking about. I've been using it at home here to play around with for a few months now. It's really slick. I have a bunch of my old computers that would otherwise be in the dumpster that are right now serving as terminals. And they're pretty fast, since all the apps run on my big Athlon box.

    It works by netbooting from your server. Some kind of bootrom code [rom-o-matic.net], either on your network card or on a floppy disk, initializes the network card. It uses DHCP to find its own IP address, and then it uses TFTP to download a small Linux kernel over the network. This loads up and uses an NFS-mounted root to run an X server on the local computer. The X server connects back to the main server by XDMCP, and you get your XDM/GDM/KDM login window.
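    Most of that netboot chain is driven by the DHCP server config. A rough dhcpd.conf fragment for one terminal might look like this (the MAC address, IPs and paths are made up; check the LTSP docs for the actual layout on your version):

```shell
# /etc/dhcpd.conf fragment (hypothetical example)
host ws001 {
    hardware ethernet 00:50:56:ab:cd:ef;            # the terminal's NIC
    fixed-address 192.168.0.101;
    filename "/lts/vmlinuz.ltsp";                   # kernel fetched via TFTP
    option root-path "192.168.0.1:/opt/ltsp/i386";  # NFS-mounted root
}
```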

    The LTSP guys have done a great job packaging this all up. Take a look. And as for your requirement of running it on a Windows box, see Cygwin's XFree86 port [cygwin.com] to Windows. You can use it to connect with XDMCP. Of course, I don't know why you wouldn't just pop in a bootdisk...

    The biggest drawback to this approach is remote access security. Look at that paragraph and how many daemons and services you need to have running. But I imagine that if it was secured well enough, it'd be fine. Actually, there is a way to make this all go over VNC [att.com] (or VNC with compression [tightvnc.com]). It's not as fast, but at least that's only one TCP port and a lot easier to get by firewalls.

    There's a great bunch of guys working on this project. And it's nice to be able to connect to #ltsp on irc.openprojects.net and get the lead developers to answer your questions.

    Michael F. Robbins
  • Set up a Win2000 box with MS Office or any other apps you use. Configure Terminal Services as an application server. You'll need an additional license, which I have heard is very expensive. I think you may also need to configure a roaming profile, but since I've never done it I'm not sure. And you're done.

    Or just configure a roaming profile if you're on an NT/2000 network and try it that way. But any apps you use will need to be installed on the PC you're working from.
  • by Steffen ( 84872 ) on Friday March 22, 2002 @04:50PM (#3209537)
    I think that given current network technology, your goal is not really practical at present. The best you can do is have data stored in one central location, and then have a number of clients, each of which has software to interpret the data...

    Basically, IMAP, LDAP etc. would be a good bet, with other higher level solutions presenting a different set of problems (think passport)
  • First, don't overlook webapps. I used Squirrelmail [squirrelmail.org] for all my email for quite a while. It was great to be able to get all my mail (and archives) securely from ANYWHERE that had an SSL-capable browser and an internet connection. PHPGroupWare [phpgroupware.com] is another great example. Webapps are BY FAR the most flexible from the client perspective.

    Beyond this there are two more strong options.

    As about fifty people have said, VNC. VNC has the advantage of working on many platforms and being able to re-direct an entire desktop. VNC "becomes" a webapp via its ability to provide your desktop via any Java capable browser. This is a strong option for your situation.

    XFree86 is a good option for serving up individual apps, and is really handy when paired with ssh (-X option). This option is better suited to a large number of fixed clients (i.e. workstations) using a small number of programs (i.e. X clients (geez I hate the X terminology)) regularly. Not so great for your situation.
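    The ssh -X pairing is a one-liner in practice (the hostname here is a placeholder):

```shell
# Forward X11 over ssh and launch a single remote app on your display
ssh -X user@devbox.example.com xterm
# The xterm runs on devbox but draws on your local screen;
# if your local machine dies, though, the xterm dies with it.
```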

    Good Luck!

    -Peter
  • by zulux ( 112259 )
    Set up a VNC server on your Windows/Unix computer and you can use your desktop from any Java-enabled web browser. VNC also has efficient native clients for most popular platforms.

    Unix is a better choice, as your server can serve multiple people with desktops. Windows is limited (unless you buy Windows Terminal Server) to serving only one desktop.

    VNC is slightly better than X-Windows, as X-Windows closes your apps when you disconnect. VNC will leave them open, waiting for you to return.

    Trivia: A Unix server can serve graphical VNC/X-Windows desktops even if the server itself doesn't have a video card. Unix is that powerful.

  • I think that the place to start is with the way we communicate with each other. We have e-mail, but every time I go to a computer I have to set up the POP settings, and then I don't have my old e-mails categorized in folders and stuff, and I don't have my address book with all my contacts. Many people say we have web-based e-mail, but you know what? Web-based services suck. They are not interactive and flexible enough. What I want is a very flexible login protocol, where I can log into my server from any machine with the client software (doing most of the display stuff on the client side, and just transferring data and security information) with my login (account@server.net), and then I get whatever data I have stored there. More importantly, though, I get a way to be contacted. Let's say I'm logged into my server (terradot.org) and you're logged into yours (fuckmail.com). When I try to contact fuckmail.com and make a connection with (or simply send data to) you, that connection can be mediated by our servers (I don't need to know your IP and you don't need to know mine) if we want, or the servers can just refer us to each other so we can connect directly (depending on security preferences).

    What I'm really talking about is changing the paradigm from an IP-centric one to a user-centric one, and I'm envisioning a good new standard for internet communications. I don't know about you, but I have quite a few discordant communications systems on my machine (and trillian can't even stay connected for long). Eventually, this kind of framework could be increased to allow more kinds of connections.

    Of course, this is not that much different a vision than micro$oft has for .NET, but the key difference is that they seem to want to be the central authority, and I want to create an (open source) framework of distributed authority, where I could give a few dozen of my friends accounts on my machine running on my DSL, and then they would have a server as permanent as I want to make it. Then, whenever I go to a friend's house that has a good internet connection, I can log on, and when someone sends me a quick note (or an IM or whatever distinction ends up being made), I can receive it wherever I am. The client should allow multiple users to log into it, so all the people in a room could log in and then unlock their screens with a password (your server would also be a place, accessible by you, for your PGP key, perhaps).

    Anyway, I've been thinking about this for a long time, enough ranting for me, you get the idea. ;)

    Cheers, Joshua

  • Both Citrix ICA and Tarantella do this. Citrix [citrix.com] is proven on the Windows platform and is all in all a nice program. We have had great success with it here. There are plugins and clients for Linux, Solaris, Apple, and others. Tarantella [tarantella.com] I know less about, but its server runs on multiple OSes, and it has clients for several as well. Both applications give you a full remote desktop over the network in local clients or web plugins. They run on a server, so you can have several users on the same server doing different things at the same time. They also scale well and have load-balancing built in (Citrix does, anyway). They also provide straight-up remote application support. These programs are much better than VNC for a remote-desktop purpose -- VNC is bad over slow connections and handles images and screen redraws horribly. It would be really nasty to develop anything over it. And then there is the whole problem of multiple users, and what if my connection dies and I forget to lock my desktop? It's just not worth the risk.

    just my $0.02. hope this helps....
  • What happens when your central server has to be sent away for a week to get fixed? Sure, if your terminal breaks down, you can just use another terminal, but if the main server breaks down you're still stuck at the exact same point you are now.

    Now, what you could do, if you're willing to restrict yourself to x86 machines bootable from CDrom, is make yourself a little customized BBC [lnx-bbc.org] (help out with GAR [lnx-bbc.org], it rocks!) with all the apps you need, then burn a bunch of copies and carry 'em around, leave 'em at work, etc.

    Unfortunately, then you're still stuck with what to do with data. But hey, P2P's the hot pick of the year, right? Get together with a bunch of friends who have constant internet access and set up a little P2P network to share your docs across a number of physically separated machines. You'll have to figure out something more cagey for taking care of sensitive data, but I suppose if you trust the people you're P2Ping with and encrypt using keys stored on the BBC (you could even restrict access to the P2P network based on keys), you'd probably be pretty safe.

    I suppose if you're using that kind of encryption, you probably don't want to leave the BBCs all over the place like I suggested, but whatever. I digress.

    I'm guessing you're looking for something a bit easier, though. :P

  • I'm really surprised everyone is saying VNC (about 80%) or Xterminal (terminal server, LTSP, etc... about 20%).

    I do think both are very cool, but when I'm away from my personal computer, I find stuff like phpGroupWare [phpgroupware.org] and TWIG [screwdriver.net] to be most helpful. Basically, both are still in the usable but not yet completely polished phase of development. When phpGroupWare is done, I have fairly high hopes for it.

    In addition to allowing me to keep working when I don't have my own laptop with me or it's out for repair, I find the whole idea of Web Gateways much better for real "remote" work.

    XTerminals are best (IMHO) if you're looking for a single server with multiple user points on a fast network. But on a slower network, or more remote, I think web gateways would be better.

    I guess I'm missing why VNC is the ultimate solution here....

  • by Shotgun ( 30919 )
    I don't understand the question? I have a workstation here at my desk and one in the lab. Either one gives me the exact same desktop. In fact, if I log into any workstation on the network I get the exact same desktop with access to ALL my data.

    What kind of ass-backward, braindead system are you using that locks you down to one physical access point?

  • by redhog ( 15207 )
    If no-one has said this before:

    I would use VNC.

    There's a Java-applet VNC viewer. There's even a viewer for PalmOS! And there are servers for both Windows, which amounts to letting you remote-control the desktop (you cannot run several such servers on the same machine), and UNIX, where each VNC server also acts as an X server, allowing you to run any of your normal X apps, except any that require hardware-accelerated 3D...

    VNC allows you to disconnect your client, without taking down the VNC server, and thus your running applications, and then reconnecting from somewhere else and get back to your applications, exactly as you left them.

    Also, you can easily tunnel VNC through ssh to make it secure.
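    The ssh tunnel amounts to this (VNC display :1 conventionally listens on TCP port 5901; the hostname is a placeholder):

```shell
# Forward local port 5901 through ssh to the VNC server's port 5901
ssh -L 5901:localhost:5901 user@myserver.example.com

# In another terminal, point the viewer at the local end of the tunnel
vncviewer localhost:1
```

    The VNC traffic then rides inside the encrypted ssh session instead of crossing the net in the clear.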
  • Well, due to the lack of other informed comments, I'll say my piece.

    It's quite ironic that I'm reading Slashdot now while connected through my Personable.com [personable.com] desktop. It gives you a free Windows 2000 desktop which you can connect to either via the Citrix client or the M$ RDP client (a Java version of the latter is available on their site). They charge about $30/month for access to applications like Office XP, and $1 for every 10M of storage space you use. (Of course, you could install AbiWord for free if you are smart about it.)

    A rather nice service if the price doesn't bother you.

    The other option is to host your own. You could install VNC on just about any machine. TightVNC does far better compression and uses less CPU power, so check it out first. [tightvnc.com]

    VNC on a Windows machine gives you only one remote desktop, and it is a security risk if the box doesn't have complete physical security. Under Unix, VNC works just like Terminal Services, providing as many virtual desktops as you can use.

    If you consider using Windows 2000 Server with Terminal Services, you need to be aware of the licensing issues. You can always log in as an administrator and never have any issues, though I don't need to mention that that is a bad idea. If you wish to log in remotely as a user, you can do so for ~120 days before it locks you out and wants you to get a license for Terminal Services. Registering Terminal Services is a lot like registering XP: phone calls or internet, but it's not something you can get around.

    I hope that was useful.
  • by TheAwfulTruth ( 325623 ) on Friday March 22, 2002 @05:11PM (#3209705) Homepage
    Web-based anything blows! (Well, now that I got that out of the way.) I've found the net, and even local networks, to be far more unreliable than a local machine. I think you'd find the downtime caused by any number of network, server, internet or ISP failures to be far more problematic than a single machine failure.

    Just have a plan for a fast recovery (i.e. actually BACK UP your data frequently) should there actually be a catastrophic failure of your local machine.

    Getting to your mail or data is sort of nice as a secondary interface, but with all the security problems involved, and the general flakiness/slowness all around in accessing your programs or data over even a LOCAL network, I've never understood the appeal.
  • I don't know of an immediately available solution to your situation, but I am working on an application framework that will allow developers to create products that meet your needs. The framework is called Echo, and it is licensed under the GNU LGPL license. It's written entirely in Java, and runs in any Java Servlet container (v2.2 or higher). It enables a developer to create Web-based applications using a component/event-driven methodology and an API similar to Swing.

    Echo takes care of all HTML/JavaScript rendering and HTTP request handling for the developer. It currently supports Mozilla or IE 5.x+ browsers (it does not require any plug-ins or client-side Java). Its built-in capabilities are limited to those that a JavaScript-enabled browser can provide, but it is built to be extended, such that it is possible to create components that will embed complex DHTML/JavaScript-based or Applet-based widgets within an application.

    Echo is under development, with a stable release targeted in a month or two. The project is hosted at sourceforge.net at http://sourceforge.net/projects/echo [sourceforge.net], if you want the latest info, feel free to join the mailing list. Tutorials and a white paper are available at http://www.nextapp.com/products/echo [nextapp.com]
  • From what I understand talking to a friend, there are basically a few ways of doing remote desktops.

    Screenshots

    This method includes scraping the screen, compressing the bitmap and transferring it across the pipe. (I believe VNC uses this method)

    Intercepting Graphics Libraries

    This method requires that the software intercept calls to the operating system's graphics libraries. Rather than capturing large bitmaps, this method aims to be more efficient by capturing the basic drawing instructions themselves. I believe Citrix uses this method, but I could be wrong.

    Widgets

    Rather than capturing screens or graphics instructions, this method standardizes basic user interface components and their respective events.
    When the user clicks a button, the client sends a message to the server telling it the button was clicked. The server may send the client messages telling it to hide the button, or to give a textbox a new value.

    From what I understand, this is how X-Windows works.

    Question 1: What sort of method is this guy looking for?
    Question 2: What method(s) should *WE* be working on?
    Question 3: Does anybody have any other methods they would like to share?

  • ...do Microsoft board members get to "Ask Slashdot"?

    lol.... IDE's
  • It's still difficult to beat the good ol' Unix shell. I do virtually all of my Internet-type things (email, irc, Usenet) and can do development things (compiler on the server, decent editors like Vim and Emacs) using nothing more than an ssh client, from wherever I am. I just keep a copy of PuTTY on a floppy (or download the exe) and go.

    If I *really* need a GUI, I can use xvnc or simply use X forwarding with SSH. If I need X under MS Windows, I use cygwin to provide the X server, or alternately, just use the VNC client and xvnc on the server.

  • You are just using the wrong version of Windows.

    It cracks me up to see people continually trying to force MS Windows to do things that Unix has been able to do with NO ADDITIONAL SOFTWARE for YEARS.

    Face it people, MS doesn't get it and NEVER will. You just keep banging your head into a brick wall.

    To solve your problem, you have to change your mindset. Think outside the box - the Microsoft box that is. Run everything possible in a UNIX environment, only using Windows for the last couple remaining proprietary apps that tie you to Windows (second machine, VMWare, or whatever.)
  • Options. (Score:3, Informative)

    by jon_c ( 100593 ) on Friday March 22, 2002 @05:33PM (#3209859) Homepage
    So to restate your goal: you want to be able to use your computer remotely. There are several ways of doing this, each with its advantages and shortcomings; some work very well if you're on a LAN, others are better for slower connections. I'm sure there are more solutions than what I can think of here, but this is what comes to mind:

    VNC, Windows XP Remote Desktop and pcAnywhere. These programs allow you to control your actual desktop remotely, as if you were actually there in front of it. Unfortunately, the way this works is by streaming image data over the wire, which can be very slow; when browsing the web, for instance, a good deal of the data is images. For something like editing text (e.g. word processing), some of the programs are smart enough to just send text data, so the response time is acceptable, even over slow connections.
    Windows XP Remote Desktop is the best I've used so far. It seems to be very efficient, it allows you to 'share' your hard drives for easy copying of files, copy&paste of text works flawlessly, and it also streams music that's playing on your machine.
    Unfortunately, VNC and the like do not work for games, streaming video or any graphically intense application. They only work well with a broadband or LAN connection; while they will work over a slower pipe, it can be quite a painful experience.

    Telnet, ssh: command-line computing. Many people at Slashdot will swear by it, and to be sure, once you've mastered the tools they can be just as useful as their graphical counterparts. Vi, gcc, and mutt can be just as productive as Word, Visual Studio and Outlook; it just takes some getting used to. However, the tools can be limiting: you can't work with MS Word documents in vi, and you can't compile Win32 apps in gcc, so it depends greatly on the context of your work.
    The main advantages of command-line apps are their very low bandwidth requirements and portability. Machines from the 80's can support a telnet connection over a 300-baud modem, so you have no need for a modern Windows machine to connect to home. For some, this is more important than being able to use GUI-based apps.

    Web-based, client/server. Back in the .com boom there were some companies attempting to create full-blown office productivity apps in HTML, and they worked pretty well. A solid example is yahoo.com. They offer free (centralized) email with spell checking, notes, calendar and other stuff, all from your web browser. Web apps are not as powerful as client-side applications, but they are improving rapidly and will probably be better tomorrow; this is also where Microsoft and others are heading. Hailstorm (correct me if I'm wrong) is Microsoft's attempt to mix a client-side application that connects to server-side 'web services' to access your data. This may be exactly what you're looking for, but it's not out yet.

    I work away from my computer all the time. I use yahoo.com's email because I don't trust other domains to stay around, and I need my email even if my home computer isn't working. I use Windows XP Remote Desktop when I need to do something on my desktop, and I use ssh when I want to mess around with my Linux box at home or edit my SourceForge project page. They're all good solutions, but each is better suited to different tasks. I haven't used anything 'webservice'-like yet (except messing around with .NET), but I imagine I will within the next year or two.

    -Jon

  • by t0qer ( 230538 ) on Friday March 22, 2002 @06:47PM (#3210242) Homepage Journal
    This is as informative as it gets. I didn't read one post about Windows roaming profiles.

    What if I limit my demand to "accessible from any internet connected Windows machine with Java installed?"


    Ok, there are several ways of doing this on a Windows network or even a non-Windows network. Samba or an NT server, it doesn't matter.
    For the most basic way to get your profile across a network, just change your user account's profile to point to some network share. Now any time you log in, your desktop, screen settings, etc. will be accessible, as long as your programs are installed on the network too.

    Samba does sorta make this easier, with the whole $HOME directory analogy.

    It's that easy, none of this VNC crap. If you wanted to switch environments from Windows to *nix you could telnet into a *nix machine, or you could use ReflectionX to get a remote X display on your Windows machine. Best yet, you could use a *nix machine to connect to a *nix machine, because that would be so l33t0 kR4d D00D.

    It doesn't need to be overengineered, just go to your user manager and set the profile path.
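    If Samba is playing domain controller, the "just set the profile path" bit amounts to a couple of smb.conf lines. A sketch along the lines of the examples in the Samba HOWTO (the share name and filesystem path here are my own invention):

```shell
# smb.conf fragment (hypothetical example)
[global]
    logon path = \\%N\profiles\%U     # where each user's roaming profile lives

[profiles]
    path = /var/lib/samba/profiles
    writable = yes
    create mask = 0600                # keep profiles private to their owners
```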
  • by crimoid ( 27373 ) on Friday March 22, 2002 @06:53PM (#3210264)
    when my main machine had to be sent away for repairs for a week

    I'm trolling, but come on.... You send your PC away for repairs? What kind of geek are you!@?
  • True separation (Score:3, Informative)

    by tenman ( 247215 ) <slashdot.org@netsuai. c o m> on Friday March 22, 2002 @07:18PM (#3210404) Journal
    First let's talk physical removal from any machine. Even if you can't carry it around with you, you need not have it hard-wired to the box. These boxes from [avocent.com] are nice additions to keep you away from things like fan noise. And/or you might opt for an older, all-in-one machine that has an OS and can access the application server(s), like this one that you can find at [thefabricator.com].

    There are a ton of web-based email servers that host their own web client. Post.Office by [software.com] is the best of breed, with others playing in the field for less money. If your local "viewer" is a Windows-hosted box, you can use Exceed from [hummingbird.com] and you will find you can run X11 apps as if they lived on your box.

    You can find information about mirroring at [softmark.com], and more about load balancing at [inktomi.com]

    You can employ all of these to secure your "server" machine, and sleep soundly knowing that if you have a hardware failure, you can still be up and running. However, I must inform you that the absolute best way to remove problems from your machine is to deinstall Windows of any kind.
  • ugh... (Score:3, Interesting)

    by percey ( 217659 ) on Friday March 22, 2002 @10:04PM (#3211099)
    I've recently had to set up remote access at work for a bunch of consultants working on a project. We have non-public IPs for our in-house machines and a couple of webservers with public IPs, and then I discovered the magic of SSH port forwarding. We were able to forward pcAnywhere as well as several other TCP ports (the one drawback of SSH is that you can't forward UDP ports), which allowed them to use client programs across the internet without compromising security. In fact it's worked too well, and almost everyone in the department has asked me to set up their system for pcAnywhere over SSH. It's a poor man's VPN (a real VPN is what you would use to do it right). But with SSH you can forward VNC and X-Windows, and of course you have your SSH for your Unix systems. Does it work well? That depends on your bandwidth. Lots of bandwidth makes everything more palatable.
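    For the curious, each forwarded service is a single ssh invocation. pcAnywhere's data channel is TCP 5631 (its UDP 5632 status port is exactly the part ssh can't carry); the hosts below are placeholders:

```shell
# From the consultant's machine: tunnel pcAnywhere's TCP data port
# through the public gateway to an internal host
ssh -L 5631:10.0.0.15:5631 consultant@gateway.example.com

# Then point the pcAnywhere client at localhost instead of the
# internal IP; the UDP 5632 status traffic simply won't make it.
```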
