Any rxvt-Sized Unicode-Aware Terminal Emulators?

Viqsi writes "Just on a lark, I started a short while back to try to convert my environment totally to UTF-8. One of the big hangups that I've run into so far, however, is my X terminal emulator. I've been very happy with rxvt (I tend to have several $XTERMs open at once, so Low Memory is Good!), but it doesn't seem to support anything Unicode. A bit of searching has turned up nothing that isn't as big as or larger than xterm itself. So, the question -- are there any low-memory terminal emulators that support UTF-8, or any other Unicode encoding? (tabbed-window style terminals Don't Count, and that goes double for Konsole!)"
  • by Aknaton ( 528294 )
    Why is XTerm so large? I've been hearing about this for a while, as it is usually cited as a reason for using rxvt.

    I also have a question regarding Unix and Unicode.

    Although GNU/Linux and BSD systems can support Unicode libraries for the apps that need it, the OSes themselves use only ASCII (from what I understand). Has there been, or will there ever be, a form of Un*x that natively supports Unicode in all things? Or would doing such a thing create too many problems?
    • by RobKow ( 1787 )
      xterm is big because of the Tektronix graphical terminal emulation.
      • Re:Why (Score:2, Funny)

        by Anonymous Coward
        I thought it was so big because it has to contain the proper responses to anything that I type in it. For example "ls". How does it know?
    • Has there been, or ever will be, a form of Un*x that natively supports Unicode in all things? Or would doing such a thing create too many problems?
      I don't know how difficult it'd be, but what's the motivation? End users, and even application developers, may need to interact using Han or Cyrillic or Hangul characters. But it's hard to see why a kernel hacker needs more than 2^7 characters.
      • Re:Why yourself (Score:3, Insightful)

        by keesh ( 202812 )
        But it's hard to see why a kernel hacker needs more than 2^7 characters.


        Or more than 640 KB of RAM. Or more than a 33 MHz processor. For that matter, why do they need both uppercase and lowercase? Why do they want monitors? What's wrong with a VT100?

        The days of "sorry, no accents or unAmerican characters" are over. Unicode support at every level would be a big help for non-US-English development.
        • You know, there are applications that run perfectly fine in a small memory space, on a slow processor, and don't require fancy GUIs. I still enjoy playing old text-mode and low-rez computer games, designed in the days of 16K memory spaces and 1 MHz processors. (I play them on emulators, not on the hardware they were originally designed for, but that's a matter of convenience and desk space.) High-powered systems are fun, and sometimes even useful -- but not every app needs them.

          The Linux kernel is not a very big entity -- which is a prime reason for its success. I find it hard to see what use it can make of extended character sets. And even if adding such a feature to the kernel had some benefit, there's a cost in terms of size, speed, and risk of bugs and security holes. You need to weigh the benefits against the risks, not just assume every bit of software in the world has to support UTF-32768. And the plain fact is, there doesn't seem to be any benefit at all.

          Perhaps I'm wrong. But to prove me wrong, you're going to have to suggest some real-world examples or scenarios that contradict me. Reciting clichés about vaguely relevant history says nothing.

          Note that I'm not attacking the general concept of Internationalized software. In point of fact, I spend a lot of time documenting the International features of my own company's products. All serious development tools support Internationalization. But they support it from the run-time-library level on up, where 99.99% of all development occurs.

          • I don't understand.

            If you are a kernel hacker (I'm not....), and you want to use Chinese characters as variable names, why should you not be able to do that? Is that what you meant? Just which parts of a computer should only ever be referred to using glyphs from the Roman Empire, and why?

            I can't think of any reason for any part of a computer system being ASCII-only, except the reason:

            "It's quite hard to change, and there's not much demand". Which is fine, but given that most people in the world can't communicate using ASCII, it's surely only a temporary excuse....
            • Well, for starters, I doubt Linus would ever accept a patch with Chinese (or any other non-English-language) variable names. To be able to hack Linux you need to know at least enough English to understand what the existing code does. So, using English for your own code shouldn't be that big a problem.

              i18n is great, UTF-8 is great, but using non-English languages when hacking free software is not so great -- at least if you want to share your code.

              At one point, the macro language used in MS Word (WordBasic or whatever) was translated into Swedish in the Swedish version. Everything was translated, including IF-THEN-ELSE statements etc. It was completely useless.

              I'm not sure what gcc would say if you started using non-ASCII UTF-8 in your variable names, though...
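
              (For the record, C99 does define a way to spell non-ASCII letters in identifiers, via "universal character names"; whether a given gcc release actually swallows them is exactly the open question. A toy C sketch of what the standard permits, not a claim about any particular compiler -- the variable name is made up:)

              #include <stdio.h>

              int main(void)
              {
                  /* C99 6.4.2.1 / Annex D allow universal character names in identifiers.
                   * \u00e5 is the letter a-with-ring, so this declares a variable spelled
                   * "år" (Swedish for "year"). Older gccs may reject it outright; newer
                   * ones gate it behind -fextended-identifiers. */
                  int \u00e5r = 2002;
                  printf("%d\n", \u00e5r);
                  return 0;
              }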

            • If you are a kernel hacker (I'm not....), and you want to use Chinese characters as variable names, why should you not be able to do that?

              Because the common language of kernel hackers is English.

            • Ooh, how politically correct of you. God forbid anyone should be able to read the Linux source, much less its maintainers. It should be a tower of Babel with patches in Thai, Chinese, and Arabic.
            • Actually, the Romans didn't invent most of our glyphs. The Romans didn't have J, U, W, Y or Z, and only used K when writing Greek words. And they didn't have lower-case letters or punctuation. All invented later.

              But I digress. You seem to be assuming that "Unicode support" is this magic thing you can add to any program and suddenly support every Unicode character. In fact, no program directly supports the complete Unicode character set -- Unicode was never meant to be used that way. Instead you support an "encoding" that gives you access to a manageable subset of Unicode.

              The most widely-used encoding is UTF-16 (in practice often its fixed-width predecessor, UCS-2), which gives you a 2^16-character subset of Unicode. Most major programming languages support both ASCII and UTF-16. Some (notably Java and Visual Basic) support UTF-16 instead of ASCII. Unfortunately, the documentation for these languages usually refers to their "wide" characters as "Unicode" characters, as if Unicode were just a 16-bit "universal" character set. In fact, there are important Unicode characters that fall outside that 16-bit range.

              There's also UTF-8, a variable-width encoding that's backward-compatible with ASCII. I believe GCC already supports UTF-8, and could probably be made to support UTF-16, as most C compilers already do. And since GCC is self-hosting (compiled with itself), you could probably allow kernel programmers to use these extended character sets in their source code. But it's a tricky thing to do [cam.ac.uk], and it's difficult to see the benefit.
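
              (To make "variable-width and backward-compatible with ASCII" concrete, here is a small illustrative C sketch of the UTF-8 encoding rules -- code points below 0x80 stay single unchanged bytes, everything else grows to 2-4 bytes. The function name is made up for the example:)

              #include <stdio.h>

              /* Encode one Unicode code point (up to U+10FFFF) as UTF-8.
               * Returns the number of bytes written to out (1-4), or 0 if out of range.
               * A real implementation should also reject surrogates U+D800..U+DFFF. */
              static int utf8_encode(unsigned long cp, unsigned char out[4])
              {
                  if (cp < 0x80) {                      /* ASCII: one byte, unchanged */
                      out[0] = (unsigned char)cp;
                      return 1;
                  } else if (cp < 0x800) {              /* 110xxxxx 10xxxxxx */
                      out[0] = (unsigned char)(0xC0 | (cp >> 6));
                      out[1] = (unsigned char)(0x80 | (cp & 0x3F));
                      return 2;
                  } else if (cp < 0x10000) {            /* 1110xxxx 10xxxxxx 10xxxxxx */
                      out[0] = (unsigned char)(0xE0 | (cp >> 12));
                      out[1] = (unsigned char)(0x80 | ((cp >> 6) & 0x3F));
                      out[2] = (unsigned char)(0x80 | (cp & 0x3F));
                      return 3;
                  } else if (cp < 0x110000) {           /* four bytes: beyond the 16-bit range */
                      out[0] = (unsigned char)(0xF0 | (cp >> 18));
                      out[1] = (unsigned char)(0x80 | ((cp >> 12) & 0x3F));
                      out[2] = (unsigned char)(0x80 | ((cp >> 6) & 0x3F));
                      out[3] = (unsigned char)(0x80 | (cp & 0x3F));
                      return 4;
                  }
                  return 0;
              }

              int main(void)
              {
                  unsigned char buf[4];
                  int n = utf8_encode(0x20AC, buf);     /* U+20AC EURO SIGN -> E2 82 AC */
                  for (int i = 0; i < n; i++)
                      printf("%02X ", buf[i]);
                  printf("\n");
                  return 0;
              }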

              I hate the term "politically correct", but maybe it applies here. You seem to feel that any software that isn't character-set-agnostic is unfair to non-Western users. Putting such an assumption ahead of issues of reliability and security is a very poor kind of prioritizing.

    • Comment removed based on user account deletion
      • Plan 9 is not UN*X, but Mac OS X is, and it uses UTF-8 and UTF-16 for almost everything, and has a lot of fonts to support different languages (Japanese, Chinese [both Traditional and Simplified], Korean, and others).

        Also, don't use any of the M$ products for the Mac, because they do not support Unicode at all.

        Use OmniWeb for a web browser and just use TextEdit for a word processor.

        Also, Terminal uses UTF-8 as the default encoding; you can change this if you want.
          • Actually, Mac OS X is not UNIX(tm), but it is as much Unix as OpenBSD, Linux, or Windows NT/2000/XP are.

            Well, I wish OpenBSD had native UTF-8; currently it is totally locale-unaware and just makes 8bit==latin-1 assumptions.

            This does not mean I want NLS or I18N: localized error messages, locales in general, LANG= and LC_*= do suck a lot.
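
            (For context, the LANG=/LC_* machinery being complained about is also how a program finds out whether its locale is UTF-8 at all. A minimal, generic POSIX sketch -- nothing OpenBSD-specific, and the printed strings are just for illustration:)

            #include <stdio.h>
            #include <string.h>
            #include <locale.h>
            #include <langinfo.h>

            int main(void)
            {
                /* Adopt whatever the environment says (LANG, LC_CTYPE, LC_ALL, ...). */
                setlocale(LC_CTYPE, "");

                /* Ask which character encoding that locale implies. */
                const char *codeset = nl_langinfo(CODESET);
                printf("codeset: %s\n", codeset);

                if (strcmp(codeset, "UTF-8") == 0)
                    printf("locale says UTF-8\n");
                else
                    printf("no UTF-8 locale; 8-bit/latin-1 style assumptions it is\n");
                return 0;
            }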
            • Windows NT/2000/XP is nowhere near as much a UN*X as Linux and *BSD are, mainly because of the idea of a kernel and what goes into it. The Windows server in Windows NT/2K/XP is part of the kernel, while under UN*X it is a userland program. Also, Mac OS X and *BSD are based loosely around the UNIX source, and Linux is inspired by MINIX, which is inspired by UNIX.
            • No sense arguing with this type of person -- unless you're on this list [opengroup.org] you're not "Unix." Silly trademark issue. Apple hasn't paid the assload of money to get certified, nor has OpenBSD. So they're not "Unix." I say, who cares if it's called a car or an auto, it still does the same damned thing.
            • "The Windows server in Windows NT/2K/XP is part of the kernel"

              If I remember correctly, the "Windows" part of "Windows NT" is simply a "personality" which is hosted under the NT kernel. In the beginning Windows NT wasn't even going to be called "Windows" - it was just "NT". Only part-way through the development of the GUI-agnostic NT kernel was the Windows personality added on (there was also a short-lived OS/2 personality IIRC). So your statement isn't so accurate. Now perhaps it actually lives in kernel space, or over time has integrated more or less with the kernel proper, but making this claim seems to be a standard "criticism" of NT. For all intents and purposes, operating systems should come with GUIs. I'd be glad if Linux (which is now being pitched as a desktop OS) actually came with real GUI support in the kernel. Or at least shift the video drivers from XFree into the kernel.
              • Or at least shift the video drivers from XFree into the kernel.

                You might want to look at the kernel framebuffer support. That's exactly what it does. My [somewhat uninformed] understanding of why there are still video drivers in both places is that:
                1. The Linux kernel framebuffer support is not yet all that mature.
                2. XFree86 is designed to be used with more systems than just Linux, and as such, cannot rely on the existence of an underlying framebuffer API.
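
                (To make "kernel framebuffer support" a bit more concrete: userland talks to it through /dev/fb0 and a couple of ioctls. A minimal sketch, assuming a kernel built with fbdev and an existing /dev/fb0 node:)

                #include <stdio.h>
                #include <fcntl.h>
                #include <unistd.h>
                #include <sys/ioctl.h>
                #include <linux/fb.h>

                int main(void)
                {
                    int fd = open("/dev/fb0", O_RDONLY);
                    if (fd < 0) {
                        perror("open /dev/fb0");
                        return 1;
                    }

                    /* Query the current video mode from the kernel's framebuffer driver. */
                    struct fb_var_screeninfo vinfo;
                    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) == 0)
                        printf("%ux%u, %u bpp\n", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);
                    else
                        perror("FBIOGET_VSCREENINFO");

                    close(fd);
                    return 0;
                }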
            • NT isn't a Unix because it lacks a variety of very unix-y things. Stuff like BSD sockets, 'sh', an X implementation, a unix-like filesystem, unix-like permissions, and the visible filesystem layout is a nightmare (not that I'm a huge fan of unix-y layouts, but they're a darn sight better than Microsoft's layouts). It lacks a lot of unix-y libc stuff in a familiar format (fork() comes immediately to mind, as I recently used it in some code I sent somebody running Win2k and had them come back to me saying "wtf do I do now?"; I wound up sending them cygwin to install rather than learning any of the win32 api -- luckily this wasn't performance critical and I wasn't being paid for it).

              The fact that the kernel seems to be a bizarre mess is not, in my mind, what prevents NT from being a "Unix". Sit me down in front of a system where everything is in the kernel and runs as one big 'file' and I'll call it a Unix if it does what a Unix is supposed to do and does it well. I'll say the programmers need to have their heads extracted by a qualified proctologist and subsequently beaten with a sendmail manual, but I'll call it a Unix.
          • Mac OS X is BSD. BSD was UNIX long before X/Open owned the trademark. Were there any justice in the world, BSD would be grandfathered in as UNIX. After Ford purchased Mazda, my Miata didn't become a Ford. I suppose Apple is a big enough company to care about trademarks, so they could conceivably spend the money to make Darwin UNIX 95 compliant and certify it. Then they could legally call it UNIX instead of *NIX or UNIX-like.
    • by dvdeug ( 5033 )
      the OSes themselves use only ASCII (from what I understand).

      The OSes support arbitrary strings of 8-bit characters, which means they support UTF-8. There is no place in a modern Unix kernel where you would want to use UTF-8 but it won't let you, short of arbitrary hardware or standards limitations (weird foreign filesystems and whatnot).
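
      (A quick illustration of "arbitrary strings of 8-bit characters": the kernel never decodes filename bytes -- it only cares about '/' and the terminating NUL -- so a UTF-8 name passes straight through open(). The filename below is made up for the example; only whatever displays the name needs to understand the encoding:)

      #include <stdio.h>
      #include <fcntl.h>
      #include <unistd.h>

      int main(void)
      {
          /* "smörgåsbord.txt" written out as raw UTF-8 bytes (ö = C3 B6, å = C3 A5).
           * To the kernel these are opaque bytes, nothing more. */
          const char *name = "sm\xc3\xb6rg\xc3\xa5sbord.txt";

          int fd = open(name, O_WRONLY | O_CREAT | O_EXCL, 0644);
          if (fd < 0) {
              perror("open");
              return 1;
          }
          if (write(fd, "hello\n", 6) != 6)
              perror("write");
          close(fd);
          return 0;
      }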
    • by Tet ( 2721 )
      Why is XTerm so large? I've been hearing about this for a while, as it is usually cited as a reason for using rxvt.

      Yes, it's really large. It uses 2.3MB of RAM on my machine, of which 1.8 or so is shared with other processes. The sad fact is, though, that so is rxvt these days (and indeed, all current terminal emulators). xterm includes a Tektronix emulator, amongst other things, which 99% of users will never need (in fact, I'm the only person I know that has ever had a genuine need for it). As a reaction to xterm's size, one of my lecturers at university wrote xvt, a minimal terminal emulator, without the bloat. Over time, that evolved into rxvt, which is growing more and more features, and is no longer as small as it once was. It's still smaller than xterm, but that's not hard. For comparison, opening up a new term uses up the following amount of RAM (in KB) per new window:

      • 496: xterm
      • 296: rxvt
      • 856: gnome-terminal
      • 140: gnome-terminal --use-factory
      • 1068: konsole
      • 160: konsole (first new tab)
      • 36: konsole (subsequent new tabs)

      It should be noted that both konsole and gnome-terminal have massive startup costs, with konsole starting up 4 kdeinit processes, and gnome-terminal starting up gnome-pty-helper.
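
      (In case anyone wants to reproduce numbers like these: on Linux they ultimately come out of /proc. A rough C sketch that reads the first three fields of /proc/<pid>/statm and reports resident vs. shared memory -- pass a pid on the command line; values are converted from pages to KB:)

      #include <stdio.h>
      #include <unistd.h>

      int main(int argc, char **argv)
      {
          if (argc != 2) {
              fprintf(stderr, "usage: %s <pid>\n", argv[0]);
              return 1;
          }

          char path[64];
          snprintf(path, sizeof(path), "/proc/%s/statm", argv[1]);

          FILE *f = fopen(path, "r");
          if (!f) {
              perror("fopen");
              return 1;
          }

          /* First three statm fields: total program size, resident set, shared pages. */
          long size, resident, shared;
          if (fscanf(f, "%ld %ld %ld", &size, &resident, &shared) != 3) {
              fprintf(stderr, "unexpected statm format\n");
              fclose(f);
              return 1;
          }
          fclose(f);

          long pagekb = sysconf(_SC_PAGESIZE) / 1024;
          printf("resident: %ld KB (shared: %ld KB)\n", resident * pagekb, shared * pagekb);
          return 0;
      }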

  • by therealmoose ( 558253 ) on Thursday August 15, 2002 @08:46PM (#4080095)
    http://freshmeat.net/browse/158/?topic_id=158 [freshmeat.net]

    Ask Slashdot is becoming increasingly ridiculous, with the answer to almost every question to be found at either Google or within OSDN. I don't mean to flame the editors, but it would be good if they would be a little more selective WRT Ask Slashdot.

  • by sig ( 9968 ) on Thursday August 15, 2002 @09:31PM (#4080265) Homepage
    You can use gnome-terminal with the --use-factory option. It makes one process for all your terms, so if you have a lot of windows open, it doesn't use that much memory.
  • by zeda ( 415 ) on Thursday August 15, 2002 @11:52PM (#4080771)
    Memory is cheap, why worry.

    Using screen also helps.

  • by Anonymous Coward
    I know you said Konsole is out, and I can understand why. It does eat up quite a bit of memory initially. And yeah, you don't like it being tabbed, but it just takes a little getting used to. Every new "tab" in Konsole eats up less memory than Bash does. Give it a shot for a week.

    The shortcut for switching between consoles is LShift+RArrow to go to the next console and LShift+LArrow to go to the previous console.

    You can name your consoles, which is VERY handy. I frequently have 5-6 consoles open and name every one of them. It really helps. A little extra benefit of Konsole is that you can have customized profiles. One for Bash, one for ZSH, one for Midnight Commander, one for ...

    I didn't like it a lot at first but I grew to love it. Now I can't stand terminal programs that aren't like it.

    Mikey likes it.
  • I didn't want to point out *why* my criteria are the way they are, but here's a quick summary:

    1) I *like* having $XTERMs scattered everywhere. I use my $XTERMs as a Be-Anywhere-Doing-Anything type tool; ergo, concentrating them in one space would be counterproductive - hence, no tabbing. I'm odd that way.

    2) Memory isn't cheap on a laptop. I hate desktop systems. I'm odd that way.

    3) Excepting Opera (and it'll go the instant the Moz folks get their heads out of their asses about tabbing through links vs. tabbing through form elements), I want nothing at all to do with Qt. I just don't like it. I more or less tolerate Opera only because I used it back in my Windows days and so am used to it already. I'm odd that way.

    4) I tend to leave a lot of terminal windows open to preserve state that may or may not be important. I'm ... well, you can prob'ly grok the pattern by now.

    The suggestion for gnome-terminal --use-factory would probably be best (is Way cool; I never knew about that before), except that gnome-terminal's i18n is broken at the moment. :)

    Such are my motivations for asking. Now You Know. :)
    • The OS manages memory better than you, so let the OS do its job.

      Post a 'top' of your system and then everyone here can all get together and critique your memory usage. I'm sure there are other things you can stop using to save an extra K or two, so maybe you can splurge on an extra xterm.

      Don't expect other people to care all that much about your particular set of tradeoffs. If you care that much, take the xterm and cut out all the Tektronix graphics and other stuff yourself.

      If you are using xterms on multiple desktops, look into just using one and making it sticky.

      As far as saving state, nothing beats screen. It saves state even if X dies.

      You should get over your irrational problems with Qt also.

      You do realize that some of the old folks had to run good old xterms back in dinosaur times, on machines with half as much RAM as your laptop.
