Programming IT Technology

Innovation on the Edge? 229

MCassatt asks: "It's a truism in many fields that breakthroughs come from the edge: the scandalous Impressionists become pretty pictures for posters and umbrellas; the world of science fiction becomes the world of science. The wonderful, the fantastic, and the mad of today are tomorrow's mainstream. Are there examples of this in computer science? Not extreme programming, but extreme programs?"
This discussion has been archived. No new comments can be posted.

  • Extreme programs (Score:5, Insightful)

    by SeanTobin ( 138474 ) * <byrdhuntr AT hotmail DOT com> on Saturday April 26, 2003 @02:32PM (#5815534)
    Gnutella [gnutella.com]
    Bit Torrent [bitconjurer.org]
    Freenet [sourceforge.net]
    Reiserfs [reiserfs.org]
    Linux Kernel [kernel.org]
    Open SSH [openssh.org]
    Encrypted Filesystems [sourceforge.net]
    GnuPG [gnupg.org]

    At least in my opinion, p2p and crypto are the edges in coding right now. Both can be hugely successful if written properly, and a huge failure if not. Personally, I'm amazed that there aren't more p2p worms/remote exploits out there. Every now and then a crypto scheme gets broken from a weird angle, but in general they have been very successful as well.
  • by Anonymous Coward on Saturday April 26, 2003 @02:36PM (#5815547)
    To this day this prolific timewaster has left a very impressive swath of damage to productivity. If that isn't extreme, then I don't know what is.
  • Google Labs (Score:5, Interesting)

    by Anonymous Coward on Saturday April 26, 2003 @02:36PM (#5815551)
    Lots of fun little things to play with at Google Labs [google.com].
  • DMS (Score:5, Interesting)

    by supabeast! ( 84658 ) on Saturday April 26, 2003 @02:38PM (#5815561)
    http://www.lmcdms.com/

    DMS is the US Government's international secure email implementation. At a glance it looks like a bunch of crappy obsolete code and operating systems trying to do email, but when you stop and think about what DMS is doing, it is pretty damned impressive.
  • by sanpitch ( 9206 ) on Saturday April 26, 2003 @02:42PM (#5815585)
    Much of theoretical computer science is all about some crazy professor looking at a problem that he thinks is cool, without worrying about its utility. Then in a few years, somebody finds a practical application.

    • I can think of only a few: System/R becoming relational databases; the language work that became LALR parsers; some of the graph-theoretic work that became the Internet.

      In most cases it was a huge leap between research and the implementation that made it actually useful. Innovations in the course of a "simple matter of programming" are more often an intuitive understanding that is only later justified by research, sometimes prior but unknown research.

      Meantime, 99.9% of what comes out of theoretical labs is c
    • Computer scientists don't live in a vacuum. Most of the computer science researchers I know have particular real-world problems that they are trying to solve. Sometimes they also have intent of directly commercializing the stuff, sometimes they leave that to others .. but in almost every case I know of, they are aware of the problems they are trying to solve and are very aware of at least some of the potential real-world applications.

    • I am confused. You seem to be talking about physicists and mathematicians, not computer science professors. What area of computer science has actually been pushed forward by scientists recently? Perhaps you can name some examples, because I am at a loss. While they had a monopoly on computer systems, they were able to have a monopoly on innovation, but now that just isn't true.

      All the innovation I see come from people who actually write code instead of talking about it. Many of these people have no formal
  • Virii (Score:5, Insightful)

    by kinnell ( 607819 ) on Saturday April 26, 2003 @02:46PM (#5815605)
    Some guy thinks one day, "Life is just the replication of information. Computers can do that". We all love to hate them, but you could argue that conceptually, computer virii are as "alive" as organic virii. If that isn't an extreme idea, what is?
  • Palladium (Score:3, Insightful)

    by DuSTman31 ( 578936 ) on Saturday April 26, 2003 @02:51PM (#5815633)

    The question of what is on "the edge" can be answered by how much controversy the thing receives - something accepted by all will be mainstream. "The edge" denotes a radical departure, and whenever there's a radical departure there are going to be quite a few people complaining about it.

    It would seem to me that this whole Palladium situation is the most controversial software project in a while, so it could probably be termed "on the edge", too.

  • by 2057 ( 600541 ) on Saturday April 26, 2003 @02:51PM (#5815641) Homepage Journal
    The thing is, if a program is really innovative and radical, you really cannot tell. An example of radical new science at work would be an "ionic engine": it took principles found in earlier science and applied them. We don't have this in CS. Nobody comes up with a theory about how "the computer space" works and then tries to prove it, because everything is pretty much well documented and understood, since we created it. So you really can't have extreme programs, unless someone uses a function in a really weird way and gets it to do something else, and I really don't see a lot of that.
    • Computer Space (Score:5, Informative)

      by Vagary ( 21383 ) <jawarren@gmail.cAUDENom minus poet> on Saturday April 26, 2003 @03:03PM (#5815714) Journal
      "Computer space" is the object of study of computability theory. Turing Machines, Post Machines, the \lambda-Calculus, the Language of WHILE-programs, function (morphism) composition, etc. These are all theories about the nature of computer space. Since the Church-Turing thesis and complexity theory pretty much cover the fundamental physics of the space, instead we worry about different ways to visualise and apply the space. It's much closer to engineering than physics is style, but you must admit that there's some similarity.
      ...it took principles found from earlier sciences and applied them. we don't have this in CS.

      Yes you do! Examples?

      1) It almost goes without saying that a lot of, if not all, CS is based on mathematics/logic/physics...

      2) OOP: the concept of classes and specialization is a standard in semantic analysis at least going back to Aristotle. The hierarchy of classes was developed by the semioticians of the middle ages (or Bertrand Russell in the beginning of the 20th century...). Gottlob Frege (modern logic and class the

    • Nobody comes up with a theory about how "the computer space" works, and then tries to prove it, because everything is pretty much well documented and everything is understood because we created...

      Huh? There are thousands of researchers working diligently every day to figure out how "computer space" works. Computer science is a very young field, and at present, we are unable to answer even the most basic of questions concerning the nature of computation and its relation to time, space, information, rand

  • Voice recognition (Score:2, Insightful)

    by Timesprout ( 579035 )
    Should revolutionise computer usage when it gets more reliable in a few years. IBM [ibm.com] have been at it for a while now.
      haha, unless you have an impairment, voice recognition won't have much of an impact.

      It would be more efficient and natural to have a system that detects a specific movement/action and turns that into an event.
      Example: instead of saying "lights on", you would just make a flicking motion in the air, and the lights come on.

      Studies are starting to show that people find it uncomfortable and unnatural to speak into the air.

      Of course, giving a computer commands via voice in an office environment would just drive everyon
  • by iamatlas ( 597477 ) on Saturday April 26, 2003 @02:58PM (#5815683) Homepage
    "...the scandalous Impressionists become pretty pictures for posters..."

    I dunno... impressionist programming? It would only look like code from far away.

    Besides, Microsoft already makes programs that look useful from far away but crappy close up.
  • by bj8rn ( 583532 ) on Saturday April 26, 2003 @03:01PM (#5815699)
    The edge is where the known meets the unknown. That's where all innovation comes from - you find out or do something new, something that has never been done before. What new can you find in a territory already explored? Only a place that hasn't been explored yet (or some interesting bugs/plants/animals).
  • There are a bunch of different levels that this goes on - First there are new types of programs (p2p and crypto), then there are new languages (Java, etc.) and then there are programming approaches (OOP, etc.) and finally there are organizational approaches (OSS, distributed teams, etc.)
  • There is a possibility of people saying "In theory, our computers could do this..." But as soon as something makes it as far as actually being implemented, it's no longer fantasy but already in the realm of science. This is why there's very little "fantasy" in the computing world.
  • by seizer ( 16950 ) on Saturday April 26, 2003 @03:05PM (#5815722) Homepage
    I'd like to put forward this Turing machine [server.org.uk], implemented using the rules of Conway's game of Life. It astounded me when I first saw it, and it astounds me still. Have a look at some of the components using the provided applet. If you've ever played with Life, you'll know how hard it is to create anything non-random at all.

    Sweetcode [sweetcode.org] often has interesting pieces of programming too.
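For anyone who hasn't played with Life, the rules behind that Turing machine construction fit in a few lines. Here is a sketch in Python (the cell-set representation is my own choice, unrelated to the linked applet):

```python
# One generation of Conway's Game of Life, with live cells as a set of
# (x, y) pairs. A dead cell with exactly 3 live neighbours is born;
# a live cell with 2 or 3 live neighbours survives; everything else dies.
from collections import Counter

def life_step(live):
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A blinker oscillates with period 2:
blinker = {(0, 0), (1, 0), (2, 0)}
assert life_step(life_step(blinker)) == blinker
```

That these three rules suffice to build registers, clocks, and ultimately a Turing machine is exactly what makes the linked construction so astounding.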

  • I run Windows 2000 without a firewall! Can't get more dangerous than that!

    (I'm joking. I have a firewall and I am actually getting portscanned right now - wheeee)
  • by wildsurf ( 535389 ) on Saturday April 26, 2003 @03:09PM (#5815741) Homepage
    Having been involved in the development of Kai's Power Tools, I'd have to say that Kai's user-interface designs had a strong influence on what's out there today.

    Our philosophy while writing those programs was based on the observation that existing UI paradigms were created for processors hundreds of times slower than current machines; why not leverage that power to create interfaces beyond the standard buttons, menus, and 16x16 pixelated cursors?

    Say what you will, the OSX Dock (for example) is indisputably Kai-like. I think that's a good thing.
  • Clustering software (Score:4, Interesting)

    by mz001b ( 122709 ) on Saturday April 26, 2003 @03:11PM (#5815751)
    Software that enables one to turn a bunch of ordinary off-the-shelf computers into a distributed cluster to run message-passing programs on was pretty radical at the time, but now it seems everybody does it. I run my codes just as often on Linux clusters as on big IBM SP/3 machines, and for a lot of tasks, the Linux clusters cannot be beat.
  • by mark-t ( 151149 ) <markt.nerdflat@com> on Saturday April 26, 2003 @03:14PM (#5815762) Journal
    Extreme programs, I think, are programs of such overwhelming significance, that they create an indelible impression upon the way that people use computers afterwards.

    My votes would be for the following

    1. Unix
    2. Visicalc
    3. TeX
    4. GCC
    • Well, wouldn't the graphical user interface count as an indelible impression on computer use? Even XWindows changed a lot of UNIX use.
    • WTF did GCC change? Umm, ok, so it's a free compiler and some code isn't compatible, but it adds _nothing_ technically and is in many ways an inferior compiler. Whatever significance GCC has, has nothing to do with the code and everything to do with the license it was released under. If you're going to take the non-technical argument, then I can sort of see where you're coming from, but at the same time you must have tunnel vision, because you're missing the much more significant and popular applications on

    • GCC?!?!?!? You are aware that it wasn't the first C compiler, right? That at its introduction, others were clearly superior? That, while some might argue it, it's a pretty tough sell that it has ever been the best at what it does? The advantage is the license. The program is a reimplementation of something that already existed.
    • I'd also add Emacs to the list - all said and done, it *is* a neat piece of work.
  • Xtreme Programz! (Score:5, Interesting)

    by Captain Beefheart ( 628365 ) on Saturday April 26, 2003 @03:14PM (#5815765)
    I think running the Microsoft Word paperclip applet should be considered an extreme sport, at least. I think, on a serious note, that more and more people are going to start using apps that allow them to view data constructs in visual terms, like the network map thingamajig I saw for instant messaging the other day. It allows you to see circles, cliques, newbies, etc., and how they're distributed through the IM world. New ways of looking at data for those visual types.
  • by hoegg ( 132716 ) * <ryan.hoegg@NoSPaM.gmail.com> on Saturday April 26, 2003 @03:23PM (#5815811) Journal

    Check out the Avalon [apache.org] project. It is a framework encompassing the ideas of Component-Oriented Programming [apache.org] and Separation of Concerns [apache.org].

    Also, read about Aspect-Oriented Programming [aosd.net], which "modularize[s] crosscutting aspects of a system" by allowing a programmer to specify "aspects" of a class or component such as logging, security, remotability, and more.
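The core idea, factoring a crosscutting concern like logging out of the business logic, can be sketched in Python with a decorator standing in for AOP's "around advice". This is an analogy only, not Avalon or AspectJ syntax, and the function names are invented:

```python
# A crosscutting "logging aspect" kept separate from business logic.
# The decorator plays the role of advice applied around a join point.
import functools

def logged(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"enter {func.__name__}{args}")
        result = func(*args, **kwargs)
        print(f"exit {func.__name__} -> {result!r}")
        return result
    return wrapper

@logged
def transfer(src, dst, amount):
    # Pure business logic: no logging code mixed in.
    return f"moved {amount} from {src} to {dst}"

transfer("alice", "bob", 10)
```

Real AOP goes further, weaving such advice into many classes at once from a single declaration rather than marking each function by hand.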

  • by rusty0101 ( 565565 ) on Saturday April 26, 2003 @03:24PM (#5815816) Homepage Journal
    Spreadsheets. I am not aware of any other application that can be said to have had as much of an impact on computers. The spreadsheet on an Apple II was what brought personal computers into business, and was what gave users the power to do their own research and experimentation.

    Once personal computers came out, and Lotus came up with 1-2-3, the economics of volume production became powerful enough that costs dropped to the point that personal computers became usable for other activities (word processing was already being done on minis and mainframes, so it doesn't count; databases have been on mainframes for a very long time, etc.)

    Eventually costs got to the point where users could afford a computer simply to play games on. Of course, then games got to the point where a good gaming machine costs more than an excellent business-grade PC.

    -Rusty
    • replying to my own post, how droll...

      Some people may argue that PDAs are a significant innovation. All a PDA is, and really ever will be, is a portable computer that is capable of presenting a Personal Information Manager of some sort. PIMs have been around since at least the early 80s.

      Variations on the PIM concept existed as notes a programmer would put in his own source code to do certain things at certain times, as well as calendar and address book applications on various OSes, including Unix from the e
    • Hmm. I would only add that, having run spreadsheet courses at the time, in between PC support / programming / etc., the reaction of people was "I want one of these (PC) cos these spreadsheet things are so useful", but what they really meant was "I want one of these PC things because they are so much fun, and spreadsheets give me an excuse to put to my wife." Rationalisation, that fueled an industry... but hey, I was guilty of it myself.

  • I'd say (Score:3, Insightful)

    by Madcapjack ( 635982 ) on Saturday April 26, 2003 @03:46PM (#5815947)
    I'd say that the cutting edge is the stuff being done in A-life and evolutionary algorithms, and probably neural nets.
  • The analogy here is somewhat false. Science fiction is proto-science, as yet unresearched; but the equivalent in computer science isn't extreme programming, it's just science fiction again.

    The WWW and the Net were information systems I lusted over in David Brin's Earth, and 10 years later I'm living in the middle of it. I'm not saying the Earth is going to wake up and talk to us, but he did anticipate the Net well. Now if only we could implement his idea that you have to subscribe to a news feed to get the vote...

    Similar
  • The biggest, world-transforming programming breakthrough will be when we start to teach computers to behave like people, which will be a major step towards understanding who we are.

    "The Sims" is a very crude glimpse of this, but there's almost no 'psychology' in The Sims, and the sorts of science you learn in psychology class are almost entirely useless for this purpose.

    So one extreme 'fringe' involves wrestling with the literary side of behavior, trying to analyse and classify the real behaviors people d

  • I like to lump things that computers should do into two categories - communication/entertainment and mindless tasks. My crystal ball says there's still a lot to be explored in both of these areas.

    As far as mindless tasks go, there are plenty of applications for computing all around us: automating our work, controlling our living environments, checking what's in the fridge. A lot of this will be networked microchip stuff that will tie into a central computer somewhere to visualize them. A lot of it will be
  • by joelparker ( 586428 ) <joel@school.net> on Saturday April 26, 2003 @04:52PM (#5816252) Homepage
    The edge IMHO is *not* crypto, P2P, or Linux;
    These are known by mainstream techies today.

    Think instead of what these techies do *not* know.
    Remember when you first saw email or a web browser?

    These apps changed *so* much in our world.
    Think in that arena.. what could change so much?

    Cheers, Joel

  • Extreme programs? (Score:5, Insightful)

    by StevenMaurer ( 115071 ) on Saturday April 26, 2003 @04:58PM (#5816275) Homepage
    They are so common in this industry, we have our own special name for them: Killer Apps.

    Let's review, shall we?

    VisiCalc ...and its successors spawned a trillion dollar industry, made Steve Jobs a billionaire, and almost singlehandedly eliminated the profession of "bookkeeper".

    WordPerfect ...ditto for the profession of personal secretary. Only executives use them now.

    Mosaic ...let's see. Trillion dollar industry, hundreds of business models, hundreds of thousands of businesses, millions of lives and careers changed... seems pretty extreme to me.

    I could go on, but you get the idea...

    • WordStar was the big killer app when word processing first became a big deal. WordStar killed themselves through some stupid decisions. (They admitted their interface sucked, and built a better one, but didn't provide a good migration path. Since everyone then had to migrate, they looked around and decided WordPerfect was better.)

      Mind you, there were other word processors at the time. I doubt WordStar was first.

    • They are so common in this industry, we have our own special name for them: Killer Apps.

      Let's review, shall we?

      • VisiCalc
      • WordPerfect
      • Mosaic

      I think this list is interesting from two directions. First, because of the way that I would classify the apps from the users' perspective: smart paper, smart paper, and smart notebook/magazine. What other broad classes of killer app are there? Games and several other things go in the category of "smart TV". Besides paper and TV, what other "smart thing" do

  • They're Everywhere (Score:5, Insightful)

    by Bob9113 ( 14996 ) on Saturday April 26, 2003 @04:59PM (#5816277) Homepage
    Recent History:

    How about an operating system written as a substitute for massive commercial systems, written initially by one guy, then by a bunch of people collaborating, without direct compensation, via email? (Linux)

    How about a system to allow anyone with a computer and a pipe to publish structured hypertext and images for all the world to see? (Mosaic)

    How about a system for independent individuals to type to each other in real time? (talk, IM)

    How about a system for people without a static IP to share files? (P2P)

    How about a system for people to contribute spare CPU cycles to a collective social work? (Distributed.net, SETI@Home, Folding@Home)

    The Future:

    What's on the edge now that will be huge tomorrow? If I knew that I'd be in angel capital. (speaking of equity, how about online stock trading systems?)

    What's on the edge and either hasn't found a niche or isn't sufficiently advanced yet (and may never be)? 3DUIs, Freenet, Complex Adaptive Systems, Face Recognition; and those are less than a cube in the iceberg.
    • (I'm gonna get flamed for this...)

      How about a family of operating systems that has managed to capture over 90% of the small computer market?
      • (I'm gonna get flamed for this...)

        How about a family of operating systems that has managed to capture over 90% of the small computer market?


        I actually agree with this 100%. Microsoft got there because they were one of the revolutionaries in the 1980s, when feathered hair and skinny leather piano ties were the rage. But what have they done for me lately?
    • I have been playing Neverwinter Nights for the last week. And even in Linux I still try to move the cursor to the window edge to rotate my view to see any other docs or interesting events. It is a pretty natural interface. It would be interesting to see a similar window manager that allowed a 360 panorama with stairs going down or up .. and locked doors heh heh ... hmmm yeah ... really warming to the idea. No don't include the quake like one , I don't want to have to go into combat with files in order to de

  • Dance Dance Revolution is a wildly popular video-game, even with demographics that normally wouldn't be caught dead in an arcade (ie: girls.)

    One of the keys to its success is user-prompting... an interface where visual and audio cues (the arrows and the music) indicate what the user should do next to achieve a goal without rote memorization or exhaustive trial and error. This is something that will revolutionize UI design once it is better understood... a graphics program walking you through the steps need
  • by winter@ES ( 17304 ) on Saturday April 26, 2003 @05:43PM (#5816400)
    I think the recent stuff going on in AI research is pretty neat.

    I'm talking about stuff like Genetic Programming [amazon.com] and evolutionary algorithms.

    For instance, check out these guys [amazon.com] who built an evolving neural network that plays checkers. Neither of the coders even really knew how to play checkers when they first started breeding this thing. The latest evolved version became a world-class top ranked player on the Zone [zone.com]. They never "taught" it a single thing about checkers, it learned everything it knows on its own - playing in a winner-survives-to-breed soup of competing neural networks.

    In my opinion this tech has incredible potential for the future of software, as processing power outstrips our ability to program for it. Genetic programs, evolutionary algorithms, and evolved neural networks are able to search an almost unlimited problem space (using the most efficient hill-climbing algorithm ever devised... thanks, mother nature!)

    paulb
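The "winner survives to breed" loop described above can be shown in miniature with a (1+1) evolutionary algorithm on the classic OneMax toy problem. Bit-counting stands in here for the checkers fitness function; everything in this sketch is invented for illustration:

```python
# A minimal "winner survives to breed" loop: a (1+1) evolutionary
# algorithm maximizing the number of 1-bits (OneMax). The checkers
# work evolves neural-network weights with the same basic scheme.
import random

def evolve(n_bits=32, generations=2000, seed=0):
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n_bits)]
    for _ in range(generations):
        # Mutation: flip each bit independently with probability 1/n.
        child = [b ^ (rng.random() < 1.0 / n_bits) for b in parent]
        # Selection: the fitter of parent and child survives to breed.
        if sum(child) >= sum(parent):
            parent = child
    return parent

best = evolve()
print(sum(best))  # typically climbs to the optimum, 32
```

Nobody "teaches" the loop where the good solutions are; mutation proposes, selection disposes, which is exactly the breeding-soup dynamic the checkers player came out of.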
  • How about Emacs? (Score:2, Interesting)

    by GnuVince ( 623231 )
    Emacs is an editor that has been around for 20+ years. It is so extensible that you can use it as your debugger, you can use it to compile stuff, you can modify EVERY behaviour of it. You can also add lots of stuff like a doctor, a tetris game, an interface to gnuchess, etc. Emacs is also extremely stable and safe (no buffer overflows or stuff like that). Even though I don't use Emacs (I prefer Vim), I think it's one of the most extreme programs ever designed.
    • I've always loved the quote "emacs is a great operating system if only it had a decent editor"

      Code editors in Unix are WAY behind what you can get on a windows box. I loved Coderite in its day, now the visual development environments are much better at what emacs does, without the overhead of having to learn a new operating environment.

      Watching karma go down by the minute

      • Code editors in Unix are WAY behind what you can get on a windows box. I loved Coderite in its day, now the visual development environments are much better at what emacs does, without the overhead of having to learn a new operating environment.

        Thus speaks someone who can't use emacs. Not surprising - emacs is opaque and hard to learn. But once you have learned it you'll never go back.

        • You said it exactly. Why should I have to learn my editor ???

          The Windows editors do not require a learning curve; they just open documents and let me type away, giving context-sensitive help, coding tips, variable expansion, debug/edit/continue support, and much much more.

          I don't want to spend time learning an editor (yes, that is what I am doing now unfortunately) to be productive, I want my editor to make me productive

  • My nominees (Score:3, Insightful)

    by Aidtopia ( 667351 ) on Saturday April 26, 2003 @05:56PM (#5816451) Homepage Journal
    • Genetic Algorithms - write programs/solve problems by modeling a natural process that many people deny the existence of!
    • TeX & Metafont - revolutionized the quality of print; one of the first major free, open pieces of code; virtually unbreakable
    • Oberon - proving that fully functional software need not be bloated
  • by chriss ( 26574 ) <chriss@memomo.net> on Saturday April 26, 2003 @06:02PM (#5816473) Homepage

    Let's talk about extreme applications, those which changed how we see things and how we act. I do not believe that any application has ever been created without some precedent existing before, but there is often one specific version that gets used widely and opens the eyes of a lot of people.

    Spreadsheets: Visicalc was not the first, but the first on personal computers. These tools allow you to play with a number of different scenarios in a way you could never handle without them and therefore give a chance to see into the future.

    1st-person shooters: Doom (and Wolfenstein and hundreds of followers) realized at least some of the promises of virtual reality. An artificial world, created in real time, in a way that was realistic without too much burden on your own fantasy, dense and moody enough to really immerse yourself into that world. A copy of our real world as an interface to a computer, more coming.

    Communication (Email/News/Chat): The videotex system Minitel was pushed by France Telecom during the 80s and early 90s by giving away the (primitive) terminals for free. This is most likely the first electronic mass medium of its kind, with up to 35 million users, more than 50% of the whole population of France. It was used massively for mail and chat (and porn), but also included a micropayment system and was a huge e-commerce success more than a decade before the web became popular. Communication is the killer app of all killer apps.

    ebay: ebay is its own category (and, of course, it's an application), everything else is a copy. First worldwide successful C2C business, could not exist without the web, but has proved that the low cost of a medium can generate markets where there was no margin before. Removed the costs for advertising, customer service, handling etc. by reducing its own function to a mere communication enabler.

    Search engines: Google comes to mind, but Google is just a very clever version of Altavista; I do not remember who started it. Whenever you search in a text with your preferred text processor, you're using its search engine to run a full-text search, so it's not really new. But applied to an enormous body of data (unsorted, in contrast to classical databases), it gave us a kind of 'instant knowledge' unthinkable before. I own dozens of dictionaries and never leave without my Encyclopaedia Britannica (on my iBook), but nothing can compete with billions of pages of unstructured information at my fingertips.
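The core of such a full-text engine is just an inverted index mapping each word to the documents that contain it. A toy sketch in Python (the documents and the AND-only query model are invented for illustration; real engines add ranking, stemming, and much more):

```python
# A toy inverted index: the core data structure behind full-text search.
from collections import defaultdict

docs = {
    1: "the quick brown fox",
    2: "the lazy brown dog",
    3: "quick thinking wins",
}

# Build the index: word -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

def search(query):
    """Return ids of documents containing every query word."""
    ids = [index[w] for w in query.split()]
    return set.intersection(*ids) if ids else set()

print(search("quick brown"))  # {1}
```

The 'instant knowledge' effect comes from the same lookup being just as fast over billions of pages as over three.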

    Web browsers: Mosaic was for many people the first look into the computer interface of the near future. A system, easy to use from both a consumer and a producer perspective, at low cost, enabling the exchange of and access to anything that can be squeezed into HTML and some pictures.

    Bioinformatics tools: BLAST (Basic Local Alignment Search Tool), a dedicated database for storing, comparing, finding and annotating sequences of DNA etc., to be run at home (if you want to) or in your lab, or easily accessible on the web. It enabled researchers worldwide to get immediate access to the most current findings, therefore increasing the speed with which the human genome could be decoded (and stealing Celera's show). This kind of technology will speed up our acquisition of knowledge in many ways.

    When you look at this list, there are some common themes:

    • eases access to or handling of data
    • works on low-tech machines
    • encourages communication
    These will be found in a lot of 'extreme applications', be it p2p, encryption, proteomics or whatever.

    Chriss

  • by sasami ( 158671 ) on Saturday April 26, 2003 @06:16PM (#5816509)
    Funny that no one has mentioned any of the embedded systems that have had a broad, tangible impact on everyday life...

    - DVD video, Dolby Digital audio
    - Fly-by-wire aviation
    - CAT, PET, MRI
    - Automobile controllers
    - Routers and switches, to say nothing of ESS and its descendants
    - Toys
    - Credit card readers, ATMs

    Plus a dozen others on the tip of my tongue, and those are just the ones I'm aware of. Anyone care to post something about power grids and other infrastructure? How about applications in manufacturing, business, medicine, art, military, construction?

    More generally, the well-known When Things Start to Think [amazon.com] generally illustrates the kind of dramatic effects that can occur when you add just a bit of intelligence into a mundane object (or process).

    --
    Dum de dum.
  • by NewToNix ( 668737 ) on Saturday April 26, 2003 @09:07PM (#5816982) Journal
    The single most extreme program ever written is the plain old "copy" program.

    Whenever you use the "copy" program you are accomplishing the oldest and dearest dream mankind has ever had - you are both having your cake and eating it too.

    The ability to infinitely replicate something, each copy being absolutely identical to the first, but also infinitely distributable to however many desire it, is earth-shaking.

    This is the major thing humankind must learn to deal with in the future. More than any other single event or "discovery", the lowly copy program (and its brother "paste") will have a greater effect on the way we view our world than any other thing.

  • by Yoda2 ( 522522 ) on Saturday April 26, 2003 @10:06PM (#5817135)
    I've been developing software to help computers associate language with perception. Here's a recent workshop paper [cornell.edu] if you're interested. More info on my site (see sig).
  • by jms ( 11418 ) on Saturday April 26, 2003 @11:45PM (#5817415)
    I would count the early arcade games, and the Apple II games.

    These machines and programs jammed an enormous amount of programming functionality into incredibly tight spaces. Many of the old arcade programs ran in 4K, 8K, or 16K on 8-bit computers, on machines with clock speeds of under 1 MHz and effective instruction rates of mere hundreds of thousands per second. Even a fully loaded Apple II gave you under 32K of actual program space to work with, once you subtracted the low RAM, the hi-res graphics areas, and the BASIC ROM space, and people did a whole lot with that 32K.

    The last two games I've purchased (SimCity 4 and C&C Generals) require minimums of 500 MHz and 800 MHz processors respectively and 128M of RAM. Of course, they do a lot more, but they are certainly not 500, or 800, or 8000 times as entertaining as the Cocktail Space Invaders machine that graces my hall entryway and is such a hit when we throw parties.

    Early arcade games were heroic, wildly successful efforts. Truly examples of extreme programming.

  • by dj_virto ( 625292 ) on Saturday April 26, 2003 @11:48PM (#5817427)
    Without a doubt, video motion detection is going to be huge. Programs like Homewatcher, GOTCHA, and many others (I'm too lazy to set up links) can sense motion very accurately, take timestamped images, upload them to a webserver, send them via fax and email, call your phone, run external programs, etc., etc. If you live in a dangerous neighbourhood like me (and if economic downturns persist, perhaps you soon will), they are hugely useful. Coupled with cheap cameras and cheap low-power hard drives, systems like this could make crime very dangerous for the potential thief if they were extremely widespread.
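The basic trick such programs rely on is frame differencing: compare consecutive frames and count the pixels that changed. A bare-bones sketch in Python (the threshold, frame format, and sample frames are arbitrary choices for illustration; real systems add noise filtering and region grouping):

```python
# Frame differencing: the simplest form of video motion detection.

def motion_pixels(prev_frame, cur_frame, threshold=30):
    """Count pixels whose grayscale value changed by more than `threshold`.
    Frames are lists of rows of 0-255 intensity values."""
    changed = 0
    for prev_row, cur_row in zip(prev_frame, cur_frame):
        for p, c in zip(prev_row, cur_row):
            if abs(p - c) > threshold:
                changed += 1
    return changed

frame_a = [[10, 10, 10], [10, 10, 10]]
frame_b = [[10, 200, 10], [10, 10, 199]]
print(motion_pixels(frame_a, frame_b))  # 2
```

A detector fires when the changed-pixel count crosses some trigger level; the threshold is what keeps sensor noise from tripping it on every frame.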


  • The only radically different, as in 'different axioms' different, approaches to CS I've come across are quantum computing and molecular computing (as with using DNA to solve NP-hard problems):

    Both, if ever implemented, will not only increase our capabilities, but will also change (actually have somewhat changed) our conception of what calculation actually is.

    In fact, quantum computing has evolved from trying to apply different laws and axioms (quantum laws) to computer science.

    Now that is what I call
  • We will see computers becoming a cultural technique, and books become rare. That is only a decade away at most, imo. Display technology needs a little more improvement and mass-market capability, we need a little more tweaking in the 'human interface' dept., and surely some optimization in power consumption.
    Then books are going to start to go away. Demoting 500 years of means of information storage from pole position to #2 or 3 within a few years is quite a breakthrough if you ask me.
    Gathering information an
  • I do research in this area and I can't begin to tell you how many things can go wrong. The paint on the walls, the kind of lighting you use (quick mental experiment: halogen light cycles brightness at ~60Hz, video streams come in at, say, ~15Hz. Think you can get them to synchronize? Good luck.), shadows, reflections, etc., etc., etc. are all huge problems, so large that I still have a job. Yes, there are cheap (

    • Grrr... I keep forgetting that "<" has to be escaped in HTML. So, as I was saying, cameras are cheap ($100) and you can get a decent camera for not too much; however, except for the incredibly expensive IR and UV cameras (which cost tens of thousands of dollars), these cameras all have the same Achilles heel: light. When the lights go off (and I imagine you turn them out when you sleep), noise increases exponentially and you end up with images that are essentially useless. Vision systems can also be fooled; if you mov
