Ask Slashdot: Were Developments In Technology More Exciting 30 Years Ago? 231

dryriver writes: We live in a time where mainstream media, websites, blogs, social media accounts, your barely computer-literate next-door neighbor, and so forth frequently rave about the "innovation" that is happening everywhere. But as someone who experienced developments in technology back in the 1980s and 1990s, in computing in particular, I cannot shake the feeling that, somehow, the "deep nerds" who were innovating back then did it better and with more heartfelt passion than I can feel today. Of course, tech from 30 years ago seems a bit primitive compared to today -- computer gear is faster and sleeker nowadays. But it seems that the core techniques and core concepts used in much of what is called "innovation" today were invented one after the other back then, going back perhaps as far as the 1950s. I get the impression that much of what makes billions in profits today and wows everyone is mere improvement on what was actually invented and trailblazed for the first time 2, 3, 4, 5 or more decades ago. Is there much genuine "inventing" and "innovating" going on today, or are tech companies essentially repackaging the R&D and know-how that was brought into the world decades ago by long-forgotten deep nerds into sleeker, sexier 21st-century tech gadgets? Is Alexa, Siri, the Xbox, Oculus Rift or iPhone truly what could be considered "amazing technology," or should we have bigger and badder tech and innovation in the year 2018?
This discussion has been archived. No new comments can be posted.

  • SpaceX (Score:2, Informative)

    by Anonymous Coward

    Falcon 9 landings are pretty awesome!

  • SGML was a big innovation; combining it with Gopher/FTP to make the web was good stuff too. Ever since then we have focused on new ways to sell distractions to the bloated consumers. The market is about to correct our over-estimation of what that is worth, but in the meantime, I got into tech to change the world, not connect refrigerators to Twitter.

    • And then CSS and JavaScript screwed that pooch.

      • Uh, CSS or something like it was intended to exist from the very beginning of the web. But one might argue that the web in any former or current incarnation was a bad implementation of a great first idea.
  • by crgrace ( 220738 ) on Wednesday March 21, 2018 @07:32PM (#56301189)

    This question reminds me a lot of people who say "Music was so much better in the 1990s" or "Comic books are garbage now but they are so innovative in the 70s". Basically these people were more passionate about their hobbies (music, comics, computers, or whatever) when they were young than they are today. Therefore, anything going on "back in the day" was - almost by definition - so much more amazing than the pedestrian stuff we have today.

    I would say the idea that there were more exciting developments 30 years ago is ludicrous. In the last few years we have virtually the whole of human knowledge at our fingertips, we've had a huge resurgence of neural nets, we have rockets that can land themselves (!), actually useful brain-machine interface (for example deep-brain stimulation for epilepsy), self-driving cars, actually cool VR, electronic communications becoming ubiquitous, cheap single board computers that even a child can use (e.g. Raspberry-Pi), electric vehicles becoming mainstream, a technology for currency that is actually threatening to upset the applecart, and on and on and on.

    I was a teenager in the late 80s and early 90s and was deeply passionate about technology. I was excited about the Amiga, Unix, and C++. Those days have NOTHING on today.

    • by Junta ( 36770 ) on Wednesday March 21, 2018 @07:47PM (#56301339)

      People do tend to have greater reverence for things from their formative years. That said, while technology has obviously progressed, creative endeavors leave a lot of room for different dominant expressions. For example, if you were a fan of the adventure game genre, you would really like the 90s; similarly, space flight sims faded out. If you wanted an over-the-top action shooter, for a while games started taking realism too seriously. Same for music: there's some good music from before I was born, some of the worst music came out when I was a teenager, and the music that dominated later was pretty good.

      The other thing is who dominates the information outlets. Up until the mid 90s, the business-for-business's-sake folks hadn't really sunk their teeth into the industry, and it was dominated by people who were in it because they wanted to be. Nowadays there are a lot of people in it for the money, drowning out the continued substantive advancements.

      • by tlhIngan ( 30335 )

        It's human nature to be nostalgic. And "everything" was better 30 years ago if you ask people on any topic, from TV, to news, to elections and politicians to every other topic under the sun.

        And yes, some things were better back then, but it's really survivorship bias: we remember the good stuff and ignore the crap (music/tv/movies/news/politicians/toys/tech/cars/games/etc.), remembering only the few things that were notable. This goes for products too - everyone says things were built bet

        • It's human nature to be nostalgic. And "everything" was better 30 years ago if you ask people on any topic, from TV, to news, to elections and politicians to every other topic under the sun.

          I don't think that's the issue with technology. Everything sucked 30 years ago, but a year later it sucked a lot less. I'm typing this on a computer that's 4 years old and a new one is only marginally better. 20 years ago, a 4-year-old computer was practically an antique. In my lifetime, the home computer, mobile phone, Internet, and smartphone have all become ubiquitous, most from being niche products, smartphones from not existing at all.

          Each one of these had some transformative effect on society.
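The point above about computer aging can be put in rough numbers. Here is a minimal Python sketch, assuming performance simply doubles every fixed period (an idealized Moore's-law model; the doubling periods chosen are illustrative, not measured):

```python
# Sketch: how "old" a 4-year-old computer feels under different doubling periods.
# Assumes raw performance doubles every `doubling_period_years` (a simplification).

def relative_speed(age_years, doubling_period_years):
    """Speed of a brand-new machine relative to one `age_years` old."""
    return 2 ** (age_years / doubling_period_years)

# Circa the late 90s: performance roughly doubled every 18 months.
print(relative_speed(4, 1.5))   # a new machine is ~6.3x faster
# More recently: closer to every 5+ years for single-thread performance.
print(relative_speed(4, 5.0))   # a new machine is ~1.7x faster
```

Under an 18-month doubling period a 4-year-old machine is over 6x behind (it feels like an antique); under a 5-year period it is less than 2x behind, which matches the "only marginally better" feeling described above.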

    • But now tech discussions are usually about censorship and politics instead of tech. Just look at the massive bans today from YouTube and Reddit. The past few days before that it was about Facebook working with Russians to influence politics.

    • by Javaman59 ( 524434 ) on Wednesday March 21, 2018 @08:04PM (#56301421)

      This question reminds me a lot of people who say "Music was so much better in the 1990s" or "Comic books are garbage now but they are so innovative in the 70s". Basically these people were more passionate about their hobbies (music, comics, computers, or whatever) when they were young than they are today. Therefore, anything going on "back in the day" was - almost by definition - so much more amazing than the pedestrian stuff we have today.

      I would say the idea that there were more exciting developments 30 years ago is ludicrous. In the last few years we have virtually the whole of human knowledge at our fingertips, we've had a huge resurgence of neural nets, we have rockets that can land themselves (!), actually useful brain-machine interface (for example deep-brain stimulation for epilepsy), self-driving cars, actually cool VR, electronic communications becoming ubiquitous, cheap single board computers that even a child can use (e.g. Raspberry-Pi), electric vehicles becoming mainstream, a technology for currency that is actually threatening to upset the applecart, and on and on and on.

      I was a teenager in the late 80s and early 90s and was deeply passionate about technology. I was excited about the Amiga, Unix, and C++. Those days have NOTHING on today.

      I'm about 20 years older than that. Still, when I got into computing as a professional in my 20s I was excited by those things (Amiga, Unix, C++) and also some more academic things - AI and functional programming. I was also excited by "Client/Server" and networks. Now, all of the things I was excited by as cutting edge have been through two transitions: first, to commercial acceptance and required knowledge for programmers; and then, to ubiquity and invisibility. Meanwhile, some very smart people and aggressive startups have put all of these in the hands of everyone from teenagers to grandmas. Back in the 80's we may have dreamed of everyone having a computer and being connected, but we did not envisage how it would be. We probably thought of some giant international connection of PCs with people chatting through text consoles. We did not envisage the www, with all the world's news and knowledge being crowdsourced; we didn't envisage facebook/instagram/twitter, with ordinary people compulsively getting their latest info from each other; and we didn't envisage smartphones. WRT smartphones, that was Steve Jobs, and Steve Jobs only. Microsoft and Blackberry had a 10+ year lead on them, but never understood the possibilities, nor did anyone else.

      One common trend I've seen with programmers is that we dismiss the latest developments. In my time, we've dismissed the GUI ("I get more done through the command line"). Then we dismissed the GUI with colors. Then we dismissed the web (yes - I heard that!). Then we dismissed smart phones. Then we dismissed facebook. Etc. If it were up to us, we'd still be using mainframes with text consoles - which was the state of technology when I arrived in 1980. Undoubtedly, that would have been rejected by the luddites from 30 years before then.

      Good question! My answer is, as someone who was already a professional programmer 30 years ago, an emphatic NO!

      (I still love the Amiga, though!)

      • by doom ( 14564 )

        BLOCKQUOTE> One common trend I've seen with programmers is that we dismiss the latest developments. In my time, we've dismissed the GUI ("I get more done through the command line"). Then we dismissed the GUI with colors. Then we dismissed the web (yes - I heard that!). Then we dismissed smart phones. Then we dismissed facebook. etc.

        And we were right every time.

        • And we were right every time.

          :)

          Sent from the text-mode browser on your terminal attached to the PDP "Mini Computer" at work?

        • by doom ( 14564 )
          But if only we had a gui editor at slashdot to keep me from mangling html tags.
    • You just described products, not technology. Oddly, the top-grossing movies are based on 70s comic books. Your "neural nets" were first described in the 50s. Self-landing rockets were done in the 60s. Electric cars date from the 80s (1880s). You just think things are new because they are wrapped in shiny wrappers for your consumption.
      • Neural nets were envisioned in the Sixties, but not implemented. Self landing rockets never emerged from the cover of Analog until Elon made it so. The old electric cars couldn't get far on the lead-acid batteries of the day.

        Today's technology rocks because we Boomers are no longer holding it back.

        • by sconeu ( 64226 )

          Self landing rockets never emerged from the cover of Analog until Elon made it so.

          Depends on the definition... Surveyor says hello.

    • In some ways what you're saying is true, but I'm not old enough to have grown up listening to music from the 60's as it was fresh and new, but I do honestly think that it's better as a whole than music from any other decade since. I think there are certain time periods when you can somewhat objectively claim as being better at something than others without being blinded by nostalgia.

      For example I think the Sci-Fi novels when Asimov, Heinlein, et al. were in their prime are among some of the best and ther
    • by antdude ( 79039 )

      Ditto. Same here as an old fart. I noticed the current young generations are enjoying the current and upcoming stuff.

    • I don't generally side with the "music was better back in the day" folks either -- but it sure hasn't evolved much in the last 40 years, compared to the 1930's to 1970's.

    • by rtb61 ( 674572 )

      Tech was new though; sure, tech is better today, but back then the tech was new, nothing like it whatsoever, and that made it interesting. The thing I missed the most is what it would have been like growing up with developing tech, rather than already being a young adult in the work force. Then again, many today will miss out on killing things, hunting and fishing in real life, preparing, cooking and eating what you killed, versus the virtual version of it. The early days of shareware were also interest

    • I was a teenager in the late 80s and early 90s and was deeply passionate about technology. I was excited about the Amiga, Unix, and C++. Those days have NOTHING on today.

      Depends on your criteria, I guess.

      Yeah, a modern PC can do unfathomably more than my Vic-20 did.

      Does that make the typical modern PC user more excited and "techie" than the typical Vic-20 user was? Um, no.

  • Old (Score:5, Insightful)

    by RazorSharp ( 1418697 ) on Wednesday March 21, 2018 @07:33PM (#56301191)

    You're just old. It's common for things to feel fresh and exciting when you're young and then you feel cynical and apathetic when you're old. Young nerds are always excited about the new stuff. Old nerds tend to shrug off the new stuff because they were there to see what preceded it. I mean, you can feel like a trail blazer because of the computer work you did in the 80s, but that's no different than how my dad boasted about being a trailblazer for the computer work he did in the 70s. You can keep going back until you get to the nerd that invented the abacus.

    • I think there's more to it than that - not only were you more excited in your youth, but you had far less personal perspective. The progress of your youth was no less incremental, but you hadn't already spent decades watching the precursors.

      There's also a legitimate external component, though, if you're discussing computer technology specifically - the 80s and 90s were sort of the golden age of computing: impressive computing power had just become accessible to the public, and its performance was accelerating

      • The progress of your youth was no less incremental

        Video games were invented in my youth and I think that was pretty revolutionary. A lot of new computer inventions had to be made, like CLUTs, sprites, and hardware windows. I was never much interested in the games themselves, but rather in the technologies that made them possible.

        • When was your youth? A quick google suggests the first "video" (interactive CRT display) game was Spacewar in 1962, running on a PDP-1 the size of a large car. Computer games went back even further - "Bertie the Brain" played tic-tac-toe in 1950. When Pong came out in 1972 it was already an extremely crude "retro" game; its claim to fame was that you could play it on the tiny little Atari entertainment system, a computer affordable by middle-class households.

          Meanwhile sprites, hardware windows, etc, wer

          • Sorry, Pong was made by Atari, but for a dedicated console, the cartridge-based entertainment system wouldn't come out until 1977.

    • To go from CGA to EGA was a huge leap, in a world where monitors were few and most were monochrome. Compare that to your phone going from a small bezel to no bezel/curved screen or something.

      More examples -- from Wolfenstein to Doom, from home computer to a PC (or even Amiga), from no modem to modem and the world of BBSes...

      Human senses -- and that includes the mental faculties -- are logarithmic in nature. Going from not eating any sweets to eating dark chocolate is a much bigger leap than going from dark chocolat
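The logarithmic-senses claim above can be sketched numerically. A minimal Python example, assuming a Weber-Fechner-style model where perceived change scales with the log of the stimulus ratio (the 5% bezel figure is purely illustrative):

```python
import math

def perceived_jump(before, after):
    """Perceived magnitude of a change, measured in doublings (log base 2)."""
    return math.log2(after / before)

# CGA -> EGA: 4 colors -> 16 colors
print(perceived_jump(4, 16))       # 2.0 doublings
# Small bezel -> no bezel: assume ~5% more screen area
print(perceived_jump(1.00, 1.05))  # ~0.07 doublings
```

On this model the CGA-to-EGA jump registers as roughly 30 times larger than a bezel shrink, which is the argument being made above.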

      • Going from dark chocolate to sweet chocolate is a downgrade.

        • I just realized I meant to say milk chocolate but said sweet chocolate! Yes, it's often so sweet it's disgusting. But it is a jump, sweetness-wise.

      • by Kjella ( 173770 )

        To go from CGA to EGA was a huge leap, in a world where monitors were few and most were monochrome. Compare that to your phone going from a small bezel to no bezel/curved screen or something. (...) Human senses -- and that includes the mental faculties -- are logarithmic in nature.

        But the result of extrapolating that way is that the most exciting time in human history was when a caveman discovered fire, and it's been downhill ever since. That's clearly not true: we've had Dark Ages and Industrial Revolutions where human society has been stagnant or regressing, and other times where it has grown by leaps and bounds. And it's hard to compare, because every leap is fundamentally different - say, before and after electricity, or the world before and after computers.

        I think the mass spread o

  • Ah the good o'days (Score:5, Insightful)

    by bfmorgan ( 839462 ) on Wednesday March 21, 2018 @07:35PM (#56301215)
    To answer the question: yes, the passion was different then, and not just in the developers but in the users/clients. The users/clients were excited about the new technologies' promises and looked to the developers to lead them to the promised land. The developers were there because they wanted to be, not just because of the promise of riches or because their high school counselor said CS would be a good job. Many of the people I worked with in the 80s and 90s were self-taught even before getting into college. The passion was organic, and the excitement of the new computing paradigm fed that passion.
  • by llamalad ( 12917 ) on Wednesday March 21, 2018 @07:36PM (#56301233)

    We reached peak smartphone with the iphone 5. Past that, it's more crap we don't need (eye candy, tendrils of the surveillance state, ever more pixels).

    Alexa devices and Homepod are just a commercialized version of what geeks were doing 15 years ago, minus privacy and autonomy and self-sufficiency.

    The interesting stuff, imho, is happening outside of obvious IT stuff and more where it intersects with other niches. Electric cars, sure. But electric bikes, too. Drones. Blockchain.

    If phones were about serving their owners or making the world better, they would use their location-awareness to mute their ringers in offices, movie theaters, and waiting rooms, and turn off their creepy "Hey Siri" crap in bedrooms. There'd be undefeatable-via-software LEDs to indicate when cameras were in use, and we'd have exact control over what data apps got and to whom they could send it. And they'd have user-replaceable batteries.

  • by Junta ( 36770 ) on Wednesday March 21, 2018 @07:37PM (#56301251)

    The same sort of stuff is happening, but business has a better handle on marketing, press releases and so on.

    So there's a lot of do-nothing 'advances' clogging the tech media and drowning out true innovation. It creates a lot of cynicism to see so many hollow articles impersonating innovation, but it's just promoters dominating what you see more effectively than they ever have before.

  • by mikael ( 484 ) on Wednesday March 21, 2018 @07:43PM (#56301305)

    Desktop PCs moved from CGA (4-color palettes) to EGA (16 colors), then VGA (256 colors at 320x200), then SVGA/SXGA with 16-bit and then 32-bit color. CPU performance was doubling regularly. Every Intel/AMD chip had some super optimization that Byte magazine would document every few months. There were all sorts of accelerator boards based on i860s, TMS340x0s, transputers. Audio boards had just come out: AdLib, Sound Blaster, etc. I still remember the first 256-color demo I saw for VGA, the first four 24-bit color images I saw from the Hercules graphics boards.

    Today, we've got desktop PCs with dual-socket motherboards, quad SLI with hundreds of streaming processors, CPUs with 16+ cores, gaming wall projectors, VR, real-time computer vision with OpenCV (that's what the i860s and TMS340x0s were used for back then), and tablets and smartphones with more powerful GPUs than SGI workstations from the 1990s.
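For a sense of scale, the framebuffer sizes implied by the display modes mentioned above can be computed directly. A quick Python sketch (ignoring planar layouts and padding; raw pixel bits only):

```python
# Rough framebuffer sizes for common display modes, counting raw pixel bits only
# (planar/packed hardware details are ignored in this sketch).

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to hold one frame at the given resolution and depth."""
    return width * height * bits_per_pixel // 8

modes = {
    "CGA 320x200, 4 colors (2 bpp)":   framebuffer_bytes(320, 200, 2),
    "EGA 320x200, 16 colors (4 bpp)":  framebuffer_bytes(320, 200, 4),
    "VGA 320x200, 256 colors (8 bpp)": framebuffer_bytes(320, 200, 8),
    "4K 3840x2160, 32 bpp":            framebuffer_bytes(3840, 2160, 32),
}
for name, size in modes.items():
    print(f"{name}: {size:,} bytes")
```

The whole VGA mode-13h frame fits in 64,000 bytes, while a single modern 4K frame needs over 33 MB; three decades of progress in one ratio.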

    • Certainly with a lot of things. CPU processing. I remember saying that I don't upgrade my CPU until the new one is at least 4 times as fast as the last one, which typically was about 3 years. Now it seems I'll have to wait about 20 years or more at the rate we are going.

      Software. We went from small 30-40k programs to multi-gigabyte programs. We had the beginning of AIs, machine learning. We went from text-based systems (or line/screen based) to GUIs. The mouse, menus, buttons, dropdowns, windows, scr

      • So now that I said what really hasn't changed.. What has?

        Consumer SSDs.
        Phones.
        Social Media (-ish. We had the beginnings of social media, but it definitely wasn't as popular, and grandma wasn't on it).
        Online stores (Amazon, Best Buy, Walmart, pizza, JC Penney, etc).
        Everyone is on the World Wide Web today, not just techies.
        Email has essentially replaced mail (USPS for you millennials) for a large number of things. Mail today is mostly garbage.
        Death of the TV (Unless you can hook it up to a game system or str

  • by thinkwaitfast ( 4150389 ) on Wednesday March 21, 2018 @07:44PM (#56301319)
    I remember convincing my parents to drive me downtown and drop me off at the civic center where a computer convention was being held. I saw, for the first time, a color picture of a city street layout being panned down the screen. There were like 32 colors on the screen at one time!

    Up until this point, I had to suffer with b&w on my friend's TRS-80. I guess that was 36 years ago; still, things were revolutionary up until about 2000. Everything since has been copies, and copies of copies. No TV -> 40x40 black and white @ 16fps is a much bigger jump than 300x500 -> 8k retina resolution in 3D at 120Hz.

    I guess to answer the question more correctly, 30 years ago being 1988, the high-tech nerdy stuff was OS/2, Windows 2 and MINIX, the Roton [wikipedia.org], and the DC-X [youtube.com]. Now we have SpaceX doing this, but the first time is always the best. GPS is about the last truly innovative technology that I can recall

  • But that's mainly because I was 30 years younger back then and everything was pretty new and exciting whereas I'm now middle-aged and jaded by the Microsofts, Googles, Apples, and Amazons of the world.

    A lot of things that happened during the PC revolution were revolutionary, particularly from the point of view of users and businesses. Spreadsheets, for example, had a huge impact on business (for better or worse). I'd argue that smartphones are as revolutionary today.

    In some ways the pace of innovation has

    • Things that were exciting 30 years ago:

      VESA Local Bus
      Transputers
      Lotus Notes

      Yeah, those all turned out to be amazing, didn't they?
      • by caseih ( 160668 )

        Most of those things were only 20 years ago, not 30. Plus you have a different definition of "exciting." How about:
        - i386 architecture - true multitasking, virtual memory machines for the masses. Amazingly long-lived architecture.
        - Lotus 1-2-3, Microsoft Excel - if those aren't amazing and revolutionary, I don't know what is. These products changed the computer landscape
        - AutoCad - yes I know there were cad programs in the 70s, but in the 80s they actually became usable
        - practical laser

  • It would be more accurate to say that we're seeing a lot of technologies we've dreamed about for decades finally maturing. There's been voice recognition software since the 80s, but now we have truly accurate and, for the most part, free voice recognition. We have actual self-driving cars on the road today - granted, in a very small form, but it's coming very quickly. We have actual gene splicing happening on actual humans. We have actual cloning. Space travel is rapidly becoming available to citizens. It's a

    • coming very quickly

      Have been hearing this since the autos in DARPA's 2005 Grand Challenge. Back in the 80's and 90's, no one talked about computers maturing quickly, but their capabilities doubled nearly every month on some front. I haven't seen a self-driving car since the ones I worked on in college. I'm not even sure what doubling the capabilities of a self-driving car would even mean... though people back then were working on systems that could surpass a human driver for things like backing up three tractor trailers (I think

  • by 110010001000 ( 697113 ) on Wednesday March 21, 2018 @08:01PM (#56301405) Homepage Journal
    Published in the Washington Post in 1988:

    Using Internet and overlapping networks, thousands of men and women in 17 countries swap recipes and woodworking tips, debate politics, religion and antique cars, form friendships and even fall in love. But the networks that link tens of thousands of computers 24 hours a day also allowed the computer virus to spread much more rapidly, and with far greater potential for damage, than any previous electronic invader. That frightens many network visionaries, who dream of a "worldnet" with ever more extensive connections and ever fewer barriers to the exchange of knowledge. "The Internet is a community far more than a network of computers and cables," Stoll said. "When your neighbors become paranoid of one another, they no longer cooperate, they no longer share things with each other. It takes only a very, very few vandals to ... destroy the trust that glues our community together."

    Good thing THAT never happened!

  • When I was a kid in the 80's, sci-fi shows were all about flying cars, fast planes, spaceships etc. Sure, sometimes the moon exploded due to nukes and started traveling through the galaxy, but still there was a moon base on it.
    In almost all aspects, we are not there. In some areas we have even regressed - we no longer have supersonic commercial planes, our manned spaceflight is also more limited.
    Science fiction was too optimistic about all technology... except PHONES. They did not even come close to imagini

    • Joymakers (Score:4, Interesting)

      by fyngyrz ( 762201 ) on Thursday March 22, 2018 @12:01AM (#56303059) Homepage Journal

      They did not even come close to imagining our phones!

      Yeah, they did. Read "The Age of the Pussyfoot" by Frederik Pohl, first published (serial form) in 1966 and then re-released in book form in 1969. Not only did he come very close to imagining what we have now, he imagined things we don't yet have, but probably will eventually.

      There's a lot of great SF out there; I'd be really careful about presuming someone didn't think of our current tech in one form or another, cosmetic differences aside.

      • by Ecuador ( 740021 )

        Eh, how are (largish) devices imagined for 500 years in the future, which tap into a central mainframe-type unit for any processing and have no display, an indication that our current phones are not ahead of what sci-fi predicted for personal communicators?
        Sure, if you could find a single instance of an all-display pocket supercomputer, preferably not imagined for hundreds of years in the future, it would be interesting, and it still would not go against the point of my post: If you read/watch sci-fi (writt

  • by 89cents ( 589228 )
    Being a gamer and growing up with a Commodore 128 and moving to PC, there are a few things that really wowed me:

    1. Amiga graphics

    2. Macintosh resolution (although just black and white)

    3. Doom shareware - and creating a multiplayer LAN

    4. Star Control 2 music (mod files)

    5. GLQuake with a 3dfx Voodoo card

    6. Maybe SGI computers. Maybe ATI and nVidia graphics demos. I can't say anything else has really made my jaw drop. I loved changing technology and couldn't wait to see what came next, but these day e

    • IANAG, but I was always very into computer graphics and pretty much agree with you on everything. I did work on some graphics stuff with NASA while in college, but the most we had was maybe 5-6 years ahead of what was commercially available, like 24-bit color... but after 24-bit color, there's not much more you can do. I'm sure 48-bit color looks better (even SGI had 56-bit color, though most of that was for processing), but 24 bits is sufficient for 99% of all applications.
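The bit-depth arithmetic behind that claim is simple to check. A small Python sketch (the ~10M figure for distinguishable colors is a commonly cited rough estimate, not something stated in this thread):

```python
# How many distinct colors each bit depth can address -- context for the
# claim that 24-bit color is "sufficient" for most applications.

def color_count(bits_per_pixel):
    """Number of distinct colors addressable at a given bit depth."""
    return 2 ** bits_per_pixel

print(color_count(8))   # 256 -- classic VGA palette size
print(color_count(24))  # 16777216 -- above the ~10M colors humans are
                        # commonly estimated to distinguish
print(color_count(48))  # 281474976710656 -- headroom for processing, not viewing
```

At 24 bits the palette already exceeds common estimates of human color discrimination, which is why deeper formats mostly matter for intermediate processing (avoiding banding in gradients and repeated edits) rather than final display.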
  • Is Alexa, Siri, the Xbox, Oculus Rift or iPhone truly what could be considered "amazing technology," or should we have bigger and badder tech and innovation in the year 2018?

    None of that shit is innovative. The word that fits is flashy.

    Innovation is: reusable rockets, manned space stations, and unmanned war-fighting machines; it's full-take surveillance and social networking. It's genetic therapy and integrated cybernetic prosthetics. It's 3D-printed houses and drinking water from thin air. It's self-driving cars prevalent enough to kill people, and a rich guy shooting one into space because he can. It's high-profile hacks, and the buying and selling of personal inform

  • by AlanObject ( 3603453 ) on Wednesday March 21, 2018 @08:37PM (#56301667)

    The movie Hidden Figures had many excellent and authentic moments, and one of them was when one of the women got unintended access to "the IBM" (as they called it), picked up a book on FORTRAN, and taught herself how to program. With access and a basic grasp of logic she built herself a career.

    That's kind of how it happened for me and my peers, although we didn't have to steal our books from the whites-only section of the library. There were already good courses and professors in Computer Science at the university but their real value was mostly in giving the nascent programmer access to equipment on which to learn.

    In that day you could easily have a thousand people using one computer such as a CDC 6600. A hundred people on a PDP-11 in timesharing was not uncommon. Today I have easily more than 100 Intel cores for my personal use and access to many more. My Macbook Pro alone has eight cores and more storage, compute, and network power than a $20M supercomputer complex had back in 1970.

    So for me access was the key. Not everyone could get access to computing capability that could do anything meaningful. Back then programmers looked more like a mysterious priesthood what with their exclusive access to special locked rooms and intimidating looking equipment and the ability to command "thinking machines". Being a member of the club I suppose had an attraction.

    All the same I think I am having more fun today than I did then. There are so many more interesting things to play with.

  • by hey! ( 33014 ) on Wednesday March 21, 2018 @08:38PM (#56301685) Homepage Journal

    In some ways it was more exciting back then, in other ways it's more exciting now.

    In terms of what you can actually do, there's no comparison: today is much better. And a lot more is known about how to do things like testing and integrating large systems. But you don't so much stand on the shoulders of giants today as you do on great masses of talented but basically ordinary people. Back in the day if you didn't like the way a library worked you made your own routines. Today the volume of source required to produce the kind of applications we use today is so large you pretty much have to resign yourself to working around the mistakes of others.

  • by b0s0z0ku ( 752509 ) on Wednesday March 21, 2018 @08:39PM (#56301689)

    I.T. seems to be going in the direction of a better e-leash. How to better track your location via your phone, car, etc, how to track your habits to better throw ads in your face, how to sell things that used to be a one-time fee to you as a service paid monthly, how to keep connected to your boss 24/7, how to track citizens' travels as a government. Pardon my cynicism.

    On the other hand, recent developments in biomedical science, electric cars, renewable energy and private space travel have been amazingly cool. So it really depends on the actual technology.

    • THIS

      The reason you don't feel 'wowed' is that the current developments in the next fields of the revolution (much like computers back then) haven't really hit the mainstream yet.

      As someone in the field, I'd argue that the advent of CRISPR-Cas9 style gene editing, the implementation of DARPA's NESD project to directly interface with the brain, and the incredible success of Ginkgo Bioworks at designing new life forms to solve world material shortages make that old tech you reminisce about seem boring.

  • by Subm ( 79417 ) on Wednesday March 21, 2018 @08:53PM (#56301811)

    The survivors of the 30-year test of time probably look better than today's developments that haven't survived yet.

    Almost like there is a bias in the survivorship. Someone should come up with a name for this bias for noticing survivors.

  • by AbRASiON ( 589899 ) * on Wednesday March 21, 2018 @08:54PM (#56301825) Journal

    There was hope, well for me at least.

    Future was ever possible, Star Trek, Star Wars, amazing technological leaps, nothing bad could happen. I had no idea of the world we live in, economically, ecologically, socially. This place, I have no faith anymore, I have none. I do not in any way suspect we'll "make it". Not as we are, not as we've been. The inertia of global warming, the attitudes of the common man. The decisions of government.

    The only thing which could excite me right now would probably be a rogue planet on a collision course here, aka Melancholia.

    Slightly less sombre: I said to someone just this past weekend that the only thing which could give me faith for our future would be an Arrival (movie) situation; we might stand a chance then. If someone were to come peacefully and offer us seriously, incredibly advanced tech, maybe we'd figure shit out.

    Unlikely though, extremely.

    • by mikael ( 484 )

      We were in that situation back in the 1980's - "The Coming of the Chip" or "When the Chips are down". Back then the UK knew that a digital revolution was coming ... the microprocessor ... the paperless office ... the digital workplace ... all sorts of names ... home computing ... IT skills ...

      The first wave was when the newspaper printing presses were replaced with digital laser printers and workstations. Whole departments disappeared overnight. Email, spreadsheets and word processors changed the roles of se

  • The key technological developments of 30 years ago were things you could buy and hold in your hand or put in your house/car/whatever.

    The key technological developments of today are not "things"; at best they are "things as a service". The Echo is essentially a powered microphone with a wifi connection, which could have been done a decade ago. Alexa, however, is not a thing you can hold or kick or even own.

  • Back in the day, there were lots of new applications and interfaces that tried to do things in new ways. Some worked, some didn't.

    However, we seem to have gotten stuck in one of the neural network's sub-optimal potholes. Email apps today are basically identical to Eudora. Calendars still suck. Even tools like Slack are just warmed-over IRC.

    Just look at the UI for Kai's Power Tools. Whoa!

    While there may be an optimal UI for various use cases, there's no particular reason that the Eudora UI should be the one th

  • As someone with a 60-year view on things, I can say excitement about technology wanes and waxes with time, but currently we are in a waxing phase. Voice-activated AI is starting to permeate our lives, self-driving cars will be arriving soon, and space has become exciting again with SpaceX.

    Here are the main upticks in general interest in technology as I see them, starting before I was born.

    1920s Travel by Car
    1930s Electricity delivered to the home (wide adoption).
    1940s True air travel and widespread radio use.
    1950s Atomic Age, Automation, TV
    1960s Space Age, Mainframe Computers
    1970s First Lull, we did get VCR's and time shifting
    1980s Video Games, Personal Computer (OK both arrived in late 70’s, but this is wide adoption)
    Early 1990s Another Lull, Personal Computing great for business – home use doesn’t live up to promise, Video Games cool a bit.
    Late 1990s, Cell Phones, and the Internet (again wide adoption)
    Early 2000s, Another Lull, though the Internet was continuing to pick up steam, computers are truly useful at home now.
    Late 2000s, HDTV wide adoption (finally TV is improving at a rapid pace)
    Early 2010s, Smart Phones wide adoption
    Late 2010s, Voice recognition AI, AI in general is rapidly improving and being used widely in business, cusp of self-driving cars, Virtual Reality is now out of the lab, SpaceX has given us back the space-age.

    Compared to previous decades I think this one is only getting hotter and hotter – Not quite yet at 1960’s Space Age general interest in technology, but a close second (and the decade isn't over yet).

    • by mikael ( 484 )

      Early 1990's - that was the hardware accelerated SVGA graphics modes, plus audio multimedia like CD drives, MIDI soundcards, multimedia instructions SSE2,

      Early 2000's - the first programmable GPU's, audio has been sorted, superscalar CPU's (500MHz)

  • The internal combustion engine is merely a steam engine with the heat-driven gas expansion happening in the cylinder instead of an external boiler.
    A jet engine is simply a turbocharged internal combustion engine with the piston and valves removed.

    Looking back at these innovations, there isn't much difference between a steam engine from 1700 and a jet engine from the mid-1900s.

  • I think the huge leap forward was the 50's, with everybody finally getting vacuum cleaners, washing machines, refrigerators and penicillin. Those feel like era-defining upgrades. The way people lived their lives 30 years later was not fundamentally different. The music got better, but the tech behind the way people lived, worked and entertained themselves did not. Moore's law and the internet finally changed everyone's lives, and that too feels pretty era-defining.
  • by az-saguaro ( 1231754 ) on Thursday March 22, 2018 @12:12AM (#56303101)

    When I read the article, the first thing I thought of was a singular event that defined for me the transition between hands-on and brains-in enthusiasm for new technology versus passive disinterest or boredom with new technology. Byte magazine. For those who never had the opportunity to read it, it was hard-core nerd stuff, printed from 1975 to 1998. In those burgeoning early days of ICs and PCs, it was a great way to learn about digital and computer technologies. It was not a technical journal. It was a general-interest magazine, but Byte stories got in-depth on processor architectures, fab methods, system building, application programming, peripheral interfacing, and so on. It was a great way to stay informed about what was then a genuinely innovative and exhilarating set of new technologies. Remember, this was the era of Space Invaders, Pac-Man, Tron, Tandy, Commodore, Lotus, VisiCalc, and early PC-Mac. Then, in the early 1990's, the internet started to gain traction. General public enthusiasm bloomed with the dot-com era of the later decade, but in the early decade, the days of Netscape and Mosaic, Byte saw the future and decided to shift the magazine's focus. It went all in on the internet, changing its name to byte.com.

    I remember reading the first new edition, where the publisher explained the shift in focus. However, instead of adhering to their admired focus on technology reporting, they reported on where the internet could take you. Imagine a world class automobile magazine that was exalted for its in-depth articles on engine design, torque and hp, engine machining, carburetor specs and tuning, tire manufacture, highway engineering, and traffic control systems. Then suddenly in the 1960's, suburban shopping malls start popping up, so Auto Magazine then switches its entire format to describing what you can buy in the stores at the mall, which of course the car will take you to. It reports solely on where and how to go to the mall to buy shoes and clothes for a Sunday jaunt, or tires or a battery at Sears Auto, or fuses at Radio Shack. You could even buy other stuff at the Mall. Wow! That is what Byte became, a guide to online places and experiences. Bye bye Byte.

    It is easy to get beguiled by something solely because it is new. Back then, the internet was new, and it was exciting. But the underlying technology per se was perhaps too arcane or unseen for most people to care. The applied internet was what caught attention, those things that ordinary people could do with it, but not the physical infrastructure underneath. Even for the hobbyist or hacker, you couldn't just tap into an internet trunk on your own, so the technology itself became less tangible.

    It depends on how you define technology. Wireless is a great new technology that has radically altered how we do things. But "wireless" is just radio, telephone, and PC all commingled, and each of those is an old technology. Are iterative improvements or logical machine hookups the same as fundamental new technologies? It does seem that a lot of the new technologies of the past 30 years are iterative extensions or market-driven mashups of prior genuinely novel advancements.

  • I was there 30 years ago and IT just keeps getting more and more interesting. It was cool geek fun back then however now is fucking awesome and it seems to get better and better. Only problem is that it has attracted some real dicks to get involved, they need to go.

    My 2c

  • I think in the 90's, and definitely the time before that, there were physical/hardware limitations to a lot of ideas. A lot of the time people knew the value of this stuff; it's just that it had never been done before, and sufficiently powerful hardware didn't necessarily exist either. In other words, there were lots of situations where tech could be implemented to improve processes and just make it easier to do stuff.

    Where we are at now, hardware has definitely stagnated. The last 3-5 years, I'd say

    • by swb ( 14022 )

      Tyler Cowen, an economist, wrote an interesting book about how we've hit the great stagnation. All the low-hanging fruit of economic innovation has been realized, at least in the US. Women participate in the workforce, we've mostly maximized the productivity enhancements of technology (assembly lines, basic robotics, office automation), we've tapped into (and in some cases, nearly tapped out) cheap and simple energy sources, and so on.

      It's not to say there aren't marginal improvements, but they come with

  • Back then, each development brought something new, something you couldn't do before. Nowadays? Not so much: a decade ago my computer could render beautiful worlds in real time, or stream full-screen HD video from the internet, and today it still does that. Yes, the shadows are slightly softer and the hair is a bit more realistic, but those are refinements, not entirely new capabilities.

    Thirty years ago, having sampled sound was magic. The ability to manipulate objects the size of the screen was magic. Havin

  • Technological advances 30 years ago were for the benefit of their users. Technological advances today are for the benefit of their maker.

    I prefer to be the user to being the product. That's why they were more exciting and actually something to look forward to. The question back then was "how do I get it?" The question today is "How do I turn it off?"

  • When I switched from a green text-only monitor to a Color Graphics Adapter, I thought I was living in the future.

    Also, when I bought my first Levi's jeans on CompuServe in 1989 I thought, this is the way to shop.

    The first moving graphics were awesome as well.

    Then, when we started to buy all our books via Mosaic in 1991 (on books.com if memory serves), we were already a bit blasé about it.

  • Because I didn't have 30 years of learning how much bullshit is spouted by people with hidden money in the game.

    This is closely related to why politicians keep wanting to reduce the voting age.

  • There are too many fads and companies aping each other these days. Someone puts a notch in the screen, now all do it. Someone invents "material design" defying 50 years of usability research and it becomes the latest and greatest. It is a sad time in technology when eliminating features is celebrated like a huge accomplishment.
  • Developments have accelerated so quickly over the last 30 years, we may have simply worn ourselves out when it comes to being impressed by it all, or just so used to it that it doesn't register as being impressive.

    In other words, we're burned out. You can only have your mind blown for so long.

  • I bought an Apple ][+ with my parents in 1980. It came with a neat reference manual that had a printout of the ROM's assembler code and a schematic. You could tinker with component-level stuff then; I soldered a potentiometer into the 555 timer that controlled repeat-key speed (for variable-speed repeat; hey, I was only 15).

    Modern computers don't allow for such work, it's all board swaps and e-waste disposal. I doubt many current geeks have ever used a soldering iron.

    There are still Fun Things to Tinker Wi
  • The Space Shuttle was new, back then, and it was totally the cool technology. Way more cool than VR or 3D modeling
  • 30 years ago, RISC CPUs were just coming to the market.

    SPARC, ARM, MIPS and PA-RISC were all being released in the mid-late 80's, and it wouldn't be long until they were joined by the likes of Alpha.

    If you could afford them or had access to them, these RISC based machines were a step change in computing power compared to contemporary CISC CPUs of the time, which would have been 286 or 386 (if you were lucky) or 68K. They destroyed even the mini-computers of the time, such as VAXen. It must have been an exci

  • It's like R.E.M. said... Standing on the shoulders of giants leaves me cold.

    We're all standing on giants who were standing on giants who were standing on giants, and so on back to the stone age.

    The question is pretty pointless. They may as well have just asked: "How much older do you feel today than you felt 30 years ago?"
