



Ask Slashdot: Were Developments In Technology More Exciting 30 Years Ago? 231
dryriver writes: We live in a time where mainstream media, websites, blogs, social media accounts, your barely computer-literate next door neighbor and so forth frequently rave about the "innovation" happening everywhere. But as someone who experienced developments in technology back in the 1980s and 1990s, in computing in particular, I cannot shake the feeling that, somehow, the "deep nerds" who were innovating back then did it better and with more heartfelt passion than I can feel today. Of course, tech from 30 years ago seems a bit primitive compared to today -- computer gear is faster and sleeker nowadays. But it seems that the core techniques and concepts used in much of what is called "innovation" today were invented one after the other back then, going back as far as the 1950s maybe. I get the impression that much of what makes billions in profits today and wows everyone is mere improvement on what was actually invented and trailblazed for the first time 2, 3, 4, 5 or more decades ago. Is there much genuine "inventing" and "innovating" going on today, or are tech companies essentially repackaging the R&D and know-how that was brought into the world decades ago by long-forgotten deep nerds into sleeker, sexier 21st-century tech gadgets? Is Alexa, Siri, the Xbox, Oculus Rift or iPhone truly what could be considered "amazing technology," or should we have bigger and badder tech and innovation in the year 2018?
SpaceX (Score:2, Informative)
Falcon 9 landings are pretty awesome!
Re: (Score:3)
We're spinning plates (Score:2)
SGML was a big innovation; combining it with Gopher/FTP to make the web was good stuff too. Ever since then we have focused on new ways to sell distractions to the bloated consumers. The market is about to correct our over-estimation of what that is worth, but in the meantime, I got into tech to change the world, not connect refrigerators to Twitter.
Re: (Score:2)
And then CSS and JavaScript screwed that pooch.
Re: (Score:2)
Re: (Score:3)
It was pretty easy [floodgap.com], especially considering that many of us had access through a 2400 bps modem (we called them 2400 baud, but technically they signaled at 600 baud). Most of what you needed to find over Gopher was text files, so tools like troff/nroff were very handy for formatting them nicely on terminals.
Is the new technology more powerful and flexible? Without a doubt. On the other hand I didn't need an ad blocker back in those days. And people weren't remote exploiting my out-of-date gop
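The bps-vs-baud aside above actually checks out. A quick sketch of the arithmetic, assuming the V.22bis scheme those "2400 baud" modems typically used (the modulation details here are an assumption, not from the comment):

```python
# Sketch of the bps-vs-baud distinction: baud counts symbols per
# second, while bps counts bits per second. A V.22bis "2400 bps"
# modem signaled at 600 symbols/s, packing 4 bits into each symbol
# via a 16-point constellation (log2(16) = 4 bits per symbol).
symbol_rate = 600        # baud (symbols per second)
bits_per_symbol = 4      # 16-point QAM constellation
bit_rate = symbol_rate * bits_per_symbol
print(bit_rate)          # 2400 bits per second
```

So "2400 baud" was a misnomer, but a harmless one: the bit rate was 2400, the symbol rate was not.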
Depends on how old you are (Score:5, Insightful)
This question reminds me a lot of people who say "Music was so much better in the 1990s" or "Comic books are garbage now but they are so innovative in the 70s". Basically these people were more passionate about their hobbies (music, comics, computers, or whatever) when they were young than they are today. Therefore, anything going on "back in the day" was - almost by definition - so much more amazing than the pedestrian stuff we have today.
I would say the idea that there were more exciting developments 30 years ago is ludicrous. In the last few years we have virtually the whole of human knowledge at our fingertips, we've had a huge resurgence of neural nets, we have rockets that can land themselves (!), actually useful brain-machine interface (for example deep-brain stimulation for epilepsy), self-driving cars, actually cool VR, electronic communications becoming ubiquitous, cheap single board computers that even a child can use (e.g. Raspberry-Pi), electric vehicles becoming mainstream, a technology for currency that is actually threatening to upset the applecart, and on and on and on.
I was a teenager in the late 80s and early 90s and was deeply passionate about technology. I was excited about the Amiga, Unix, and C++. Those days have NOTHING on today.
Re:Depends on how old you are (Score:5, Insightful)
People do tend to have greater reverence for things from their formative years. However, while technology has obviously progressed, in terms of creative endeavors there's a lot of room for different dominant expressions. For example, if you were a fan of the adventure game genre, then you would really like the 90s. Similarly, the space flight sim faded out. If you wanted an over-the-top action shooter, for a while there games started taking realism too seriously. Same for music: there's some good music from before I was born, and I would say some of the worst music was from when I was a teenager, and the music that dominated later was pretty good.
The other thing is who dominates the information outlet. Up until the mid 90s, the business-for-business's-sake folks hadn't really sunk their teeth into the industry, and it was dominated by people who were in it because they wanted to be. Nowadays there are a lot of people in it for the money, drowning out the continued substantive advancements.
Re: (Score:2)
It's human nature to be nostalgic. And "everything" was better 30 years ago if you ask people on any topic, from TV, to news, to elections and politicians to every other topic under the sun.
And yes, some things were better back then, but it's really survivorship bias. We remember the good stuff and ignore the crap. We ignored all the crap (music/tv/movies/news/politicians/toys/tech/cars/games/etc) and remembered the few things that were notable. This goes for products too - everyone says things were built bet
Re: (Score:2)
It's human nature to be nostalgic. And "everything" was better 30 years ago if you ask people on any topic, from TV, to news, to elections and politicians to every other topic under the sun.
I don't think that's the issue with technology. Everything sucked 30 years ago, but a year later it sucked a lot less. I'm typing this on a computer that's 4 years old and a new one is only marginally better. 20 years ago, a 4-year-old computer was practically an antique. In my lifetime, the home computer, mobile phone, Internet, and smartphone have all become ubiquitous, most from being niche products, smartphones from not existing at all.
Each one of these had some transformative effect on society.
Re: (Score:2)
But now tech discussions are usually about censorship and politics instead of tech. Just look at the massive bans today from YouTube and Reddit. The past few days before that it was about Facebook working with Russians to influence politics.
Re:Depends on how old you are (Score:4, Interesting)
This question reminds me a lot of people who say "Music was so much better in the 1990s" or "Comic books are garbage now but they are so innovative in the 70s". Basically these people were more passionate about their hobbies (music, comics, computers, or whatever) when they were young than they are today. Therefore, anything going on "back in the day" was - almost by definition - so much more amazing than the pedestrian stuff we have today.
I would say the idea that there were more exciting developments 30 years ago is ludicrous. In the last few years we have virtually the whole of human knowledge at our fingertips, we've had a huge resurgence of neural nets, we have rockets that can land themselves (!), actually useful brain-machine interface (for example deep-brain stimulation for epilepsy), self-driving cars, actually cool VR, electronic communications becoming ubiquitous, cheap single board computers that even a child can use (e.g. Raspberry-Pi), electric vehicles becoming mainstream, a technology for currency that is actually threatening to upset the applecart, and on and on and on.
I was a teenager in the late 80s and early 90s and was deeply passionate about technology. I was excited about the Amiga, Unix, and C++. Those days have NOTHING on today.
I'm about 20 years older than that. Still, when I got into computing as a professional in my 20s I was excited by those things (Amiga, Unix, C++) and also some more academic things - AI and functional programming. I was also excited by "Client/Server" and networks. Now, all of the things which I was excited by as cutting edge have been through two transitions: one, to commercial acceptance and required knowledge for programmers; and, then two, ubiquity and invisibility. Meanwhile, some very smart people and aggressive startups have put all of these in the hands of everyone from teenagers to grandmas. Back in the 80's we may have dreamed of everyone having a computer and being connected, but we did not envisage how it would be. We probably thought of some giant international connection of PCs with people chatting through text consoles. We did not envisage the www, with all the world's news and knowledge being crowd sourced, we didn't envisage facebook/instagram/twitter with ordinary people compulsively getting their latest info from each other, and we didn't envisage smart phones. WRT smartphones, that was Steve Jobs, and Steve Jobs only. Microsoft and Blackberry had a 10+ year head start on them, but never understood the possibilities, nor did anyone else.
One common trend I've seen with programmers is that we dismiss the latest developments. In my time, we've dismissed the GUI ("I get more done through the command line"). Then we dismissed the GUI with colors. Then we dismissed the web (yes - I heard that!). Then we dismissed smart phones. Then we dismissed facebook. etc. If it were up to us, we'd still be using mainframes with text consoles. Which was the state of technology when I arrived in 1980. Undoubtedly, that would have been rejected by the luddites from 30 years before then.
Good question! My answer is, as someone who was already a professional programmer 30 years ago, an emphatic NO!
(I still love the Amiga, though!)
Re: (Score:2)
One common trend I've seen with programmers is that we dismiss the latest developments. In my time, we've dismissed the GUI ("I get more done through the command line"). Then we dismissed the GUI with colors. Then we dismissed the web (yes - I heard that!). Then we dismissed smart phones. Then we dismissed facebook. etc.
And we were right every time.
Re: (Score:2)
And we were right every time.
:)
Sent from the text-mode browser on your terminal attached to the PDP "Mini Computer" at work?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Neural nets were envisioned in the Sixties, but not implemented. Self landing rockets never emerged from the cover of Analog until Elon made it so. The old electric cars couldn't get far on the lead-acid batteries of the day.
Today's technology rocks because we Boomers are no longer holding it back.
Re: (Score:2)
Self landing rockets never emerged from the cover of Analog until Elon made it so.
Depends on the definition... Surveyor says hello.
Re: (Score:2)
Delivered twice a week, and hardly ever any listeria in your unrefrigerated and unpasteurized milk.
Re: (Score:2)
For example I think the sci-fi novels from when Asimov, Heinlein, et al. were in their prime are among some of the best and ther
Re: (Score:2)
Ditto. Same here as an old fart. I noticed the current young generations are enjoying the current and upcoming stuff.
Re: (Score:2)
I don't generally side with the "music was better back in the day" folks either -- but it sure hasn't evolved much in the last 40 years, compared to the 1930's to 1970's.
Re: (Score:2)
Tech was new then, though. Sure, tech is better today, but back then it was new, nothing like it whatsoever, and that made it interesting. The thing I miss the most is wondering what it would have been like growing up alongside developing tech, rather than already being a young adult in the work force. Then again, many today will miss out on killing things, hunting and fishing in real life, preparing, cooking and eating what you killed, versus the virtual version of it. The early days of shareware were also interest
Re: (Score:3)
I was a teenager in the late 80s and early 90s and was deeply passionate about technology. I was excited about the Amiga, Unix, and C++. Those days have NOTHING on today.
Depends on your criteria, I guess.
Yeah, a modern PC can do unfathomably more than my VIC-20 did.
Does that make the typical modern PC user more excited and "techie" than the typical VIC-20 user was? Um, no.
Re: (Score:2)
The Navlab self-driving van drove cross country in 1995. It required a lot of human intervention, but it existed, so yes, that was done.
If a vehicle requires a lot of human intervention, it is not 'self-driving.' Unless you are one of those people who enjoy word games and count a central heating thermostat as Artificial Intelligence.
Old (Score:5, Insightful)
You're just old. It's common for things to feel fresh and exciting when you're young and then you feel cynical and apathetic when you're old. Young nerds are always excited about the new stuff. Old nerds tend to shrug off the new stuff because they were there to see what preceded it. I mean, you can feel like a trail blazer because of the computer work you did in the 80s, but that's no different than how my dad boasted about being a trailblazer for the computer work he did in the 70s. You can keep going back until you get to the nerd that invented the abacus.
Old eyes seeing young technology. (Score:2)
I think there's more to it than that - not only were you more excited in your youth, but you had far less personal perspective. The progress of your youth was no less incremental, but you hadn't already spent decades watching the precursors.
There's also a legitimate external component though, if you're discussing computer technology specifically - the 80s and 90s were sort of the golden age of computing: impressive computing power had just become accessible to the public, and its performance was accelerating
Re: (Score:2)
The progress of your youth was no less incremental
Video games were invented in my youth and I think that was pretty revolutionary. A lot of new computer inventions had to be made, like CLUTs, sprites and hardware windows. I was never interested much in the games themselves, but rather the technologies that made them possible.
Re: (Score:2)
When was your youth? A quick google suggests the first "video" (interactive CRT display) game was Spacewar! in 1962, running on a PDP-1 the size of a large car. Computer games went back even further - "Bertie the Brain" played tic-tac-toe in 1950. When Pong came out in 1972 it was already an extremely crude "retro" game; its claim to fame was that you could play it on the tiny little Atari entertainment system, a computer affordable by middle class households.
Meanwhile sprites, hardware windows, etc, wer
Re: (Score:2)
Sorry, Pong was made by Atari, but for a dedicated console; the cartridge-based entertainment system wouldn't come out until 1977.
No, the deltas were much bigger back then (Score:3)
To go from CGA to EGA was a huge leap, in a world where monitors were few and most were monochrome. Compare to your phone going from small bezel to no bezel/curved screen or something.
More examples -- from Wolfenstein to Doom, from home computer to a PC (or even Amiga), from no modem to modem and the world of BBSes...
Human senses -- that includes the mental faculty -- are logarithmic in nature. From not eating any sweets to eating dark chocolate is a much bigger leap than to go from dark chocolat
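The logarithmic-senses point above can be sketched with a rough Weber-Fechner-style toy model (the sweetness numbers here are made up for illustration; the claim is only that perceived change tracks the log of the stimulus ratio):

```python
import math

# Weber-Fechner sketch: perceived change scales roughly with the
# logarithm of the stimulus ratio, so early jumps from a near-zero
# baseline dominate later incremental ones.
def perceived_jump(old, new):
    return math.log(new / old)

# Hypothetical sweetness levels in arbitrary units.
none, dark, milk = 0.1, 10.0, 20.0

print(perceived_jump(none, dark))   # log(100), about 4.6
print(perceived_jump(dark, milk))   # log(2), about 0.69
```

By that model, "no sweets to dark chocolate" is several times the perceived jump of "dark chocolate to something sweeter" - the same shape as CGA-to-EGA versus bezel-to-no-bezel.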
Re: (Score:2)
Going from dark chocolate to sweet chocolate is a downgrade.
Re: (Score:2)
I just realized I meant to say milk chocolate but said sweet chocolate! Yes, it's often so sweet it's disgusting. But it is a jump, sweetness-wise.
Re: (Score:2)
To go from CGA to EGA was a huge leap, in a world where monitors were few and most were monochrome. Compare to your phone going from small bezel to no bezel/curved screen or something. (...) Human senses -- that includes the mental faculty -- are logarithmic in nature.
But the result of extrapolating that way is that the most exciting time in human history was the caveman who discovered fire and it's been downhill ever since. That's clearly not true: we've had Dark Ages and Industrial Revolutions where human society has been stagnant or regressing, and other times where it has grown by leaps and bounds. And it's hard to compare because every leap is fundamentally different, like say before and after electricity. The world before and after computers.
I think the mass spread o
Re: (Score:2)
Re: (Score:2)
most amazing things I have been seeing
Anything you can share?
Ah, the good ol' days (Score:5, Insightful)
Peaked with the current level of technology (Score:5, Insightful)
We reached peak smartphone with the iPhone 5. Past that, it's more crap we don't need (eye candy, tendrils of the surveillance state, ever more pixels).
Alexa devices and Homepod are just a commercialized version of what geeks were doing 15 years ago, minus privacy and autonomy and self-sufficiency.
The interesting stuff, imho, is happening outside of obvious IT stuff and more where it intersects with other niches. Electric cars, sure. But electric bikes, too. Drones. Blockchain.
If phones were about serving their owners or making the world better, they would use their location-awareness to mute their ringers in offices and movie theaters and waiting rooms, and turn off their creepy "Hey Siri" crap in bedrooms. There'd be undefeatable-via-software LEDs to indicate when cameras were being used, and we'd have exact control over what apps got what data and to whom they could send it. And they'd have user-replaceable batteries.
More marketing (Score:3)
The same sort of stuff is happening, but business has a better handle on marketing, press releases and so on.
So there's a lot of do-nothing 'advances' that clog the tech media drowning out true innovation. It creates a lot of cynicism to see so many hollow articles impersonating innovation, but it's just promoters dominating what you see better than they ever have before.
Certainly with PC graphics (Score:3)
Desktop PC's moved from CGA (4-color palettes) to EGA (16 colors) to VGA (256 colors at 320x200), then to SVGA with 16-bit and then 32-bit color. CPUs were doubling in performance. Every Intel/AMD chip had some super optimization that Byte magazine would document every few months. There were all sorts of different accelerator boards based on i860's, TMS340x0's, transputers. Audio boards had just come out: AdLib, Sound Blaster, etc... I still remember the first 256-color demo I saw for VGA, and the first four 24-bit color images I saw from the Hercules graphics boards.
Today, we've got desktop PC's with dual socket motherboards, quad SLI with hundreds of streaming processors, CPU's with 16+ cores, gaming wall projectors, VR, real-time computer vision with OpenCV (that's what the i860's and TMS340x0's were used for back then), tablets and smartphones with more powerful GPU's than SGI workstations from the 1990's.
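The color-depth progression named above is easy to tally. A small sketch (the per-mode bit depths are the standard ones for those display modes; "32-bit" is listed as 24 bits of color since the extra 8 bits are typically alpha):

```python
# On-screen color counts for the display standards mentioned above.
# Each doubling of bit depth squares the number of representable colors.
modes = {
    "CGA":    2**2,    # 4 colors on screen (from a fixed palette)
    "EGA":    2**4,    # 16 colors
    "VGA":    2**8,    # 256 colors (mode 13h, 320x200)
    "16-bit": 2**16,   # 65,536 colors ("high color")
    "32-bit": 2**24,   # ~16.7M colors, plus an 8-bit alpha channel
}
for name, colors in modes.items():
    print(f"{name}: {colors:,} colors")
```

Which is the logarithmic-perception point again: 4 to 16 colors felt enormous, while 65,536 to 16.7 million mostly didn't.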
Re: (Score:3)
Certainly with a lot of things. CPU processing. I remember saying that I don't upgrade my CPU until the new one is at least 4 times as fast as the last one, which typically was about 3 years. Now it seems I'll have to wait about 20 years or more at the rate we are going.
Software. We went from small 30-40k programs to multi-gigabyte programs. We had the beginning of AIs, machine learning. We went from text-based systems (or line/screen based) to GUIs. The mouse, menus, buttons, dropdowns, windows, scr
Re: (Score:2)
So now that I said what really hasn't changed.. What has?
Consumer SSDs.
Phones.
Social Media (-ish. We had the beginnings of social media, but it definitely wasn't as popular, and grandma wasn't on it).
Online stores (Amazon, Best Buy, Walmart, pizza, JC Penney, etc).
Everyone is on the World Wide Web today, not just techies.
Email has essentially replaced mail (USPS for you millennials) for a large number of things. Mail today is mostly garbage.
Death of the TV (Unless you can hook it up to a game system or str
Re: (Score:2)
sent people to the Moon using equipment
That's because going to the moon, or driving or anything else physical means solving hundreds of physics problems a hundred times a second. Just because your computers are 1000x faster does not mean that there are 1000x more physics problems you need to solve. Yeah, you can always solve more for safety, or redundancy or a few other reasons, but only up to a point. The software for a rocket landing on the moon today would not be doing fundamentally anything different than Apollo other than more self checks a
Yes, a thousand times more (Score:4, Interesting)
Up until this point, I had to suffer with b&w on my friend's TRS-80. I guess that was 36 years ago, still things were revolutionary up until about 2000. Everything since has been copies and copies of copies. No TV -> 40x40 black and white @ 16fps is a much bigger jump than 300x500->8k retina resolution in 3D at 120Hz.
I guess to answer the question more correctly, 30 years ago being 1988, the high-tech nerdy stuff was OS/2, Windows 2 and MINIX, the Roton [wikipedia.org], and the DC-X [youtube.com]. Now we have SpaceX doing this, but the first time is always the best. GPS is about the last truly innovative technology that I can recall
In a word, yes! (Score:2)
But that's mainly because I was 30 years younger back then and everything was pretty new and exciting whereas I'm now middle-aged and jaded by the Microsofts, Googles, Apples, and Amazons of the world.
A lot of things happened during the PC revolution that were revolutionary, particularly from the point of view of users and businesses. Spreadsheets, for example, had a huge impact on business (for better or worse). I argue that smartphones are as revolutionary today.
In some ways the pace of innovation has
Re: (Score:2)
VESA Local Bus
Transputers
Lotus Notes
Yeah, those all turned out to be amazing, didn't they?
Re: (Score:2)
Most of those things were only 20 years ago, not 30. Plus you have a different definition of "exciting." How about:
- i386 architecture - true multitasking, virtual memory machines for the masses. Amazingly long-lived architecture.
- Lotus 1-2-3, Microsoft Excel - if you don't classify those as amazing and revolutionary, then I don't know what is. These products changed the computer landscape
- AutoCad - yes I know there were cad programs in the 70s, but in the 80s they actually became usable
- practical laser
Pretty great time... (Score:2)
It would be more accurate to say that we're seeing a lot of technologies we've dreamed about for decades finally maturing. There's been voice recognition software since the 80s, but now we have truly accurate and, for the most part, free voice recognition. We have actual self-driving cars on the road today, granted in a very small form, but it's coming very quickly. We have actual gene-splicing happening on actual humans. We have actual cloning. Space travel is rapidly becoming available to citizens. It's a
Re: (Score:2)
coming very quickly
I have been hearing this since the autonomous vehicles in DARPA's 2005 Grand Challenge. Back in the 80's and 90's, no one talked about computers maturing quickly, but their capabilities doubled nearly every month on some front. I haven't seen a self-driving car since the ones I worked on in college. I'm not even sure what doubling the capabilities of a self-driving car would even mean...though people back then were working on systems that could surpass a human driver for things like backing up three tractor trailers (I think
Article from 30 years ago (Score:5, Interesting)
Using Internet and overlapping networks, thousands of men and women in 17 countries swap recipes and woodworking tips, debate politics, religion and antique cars, form friendships and even fall in love. But the networks that link tens of thousands of computers 24 hours a day also allowed the computer virus to spread much more rapidly, and with far greater potential for damage, than any previous electronic invader. That frightens many network visionaries, who dream of a "worldnet" with ever more extensive connections and ever fewer barriers to the exchange of knowledge. "The Internet is a community far more than a network of computers and cables," Stoll said. "When your neighbors become paranoid of one another, they no longer cooperate, they no longer share things with each other. It takes only a very, very few vandals to ... destroy the trust that glues our community together."
Good thing THAT never happened!
We are too concentrated on phones (Score:2)
When I was a kid in the 80's, sci-fi shows were all about flying cars, fast planes, spaceships etc. Sure, sometimes the moon exploded due to nukes and started traveling through the galaxy, but still there was a moon base on it.
In almost all aspects, we are not there. In some areas we have even regressed - we no longer have supersonic commercial planes, our manned spaceflight is also more limited.
Science fiction was too optimistic about all technology... except PHONES. They did not even come close to imagini
Joymakers (Score:4, Interesting)
Yeah, they did. Read "The Age of the Pussyfoot" by Frederik Pohl, first published (serial form) in 1966 and then re-released in book form in 1969. Not only did he come very close to imagining what we have now, he imagined things we don't yet have, but probably will eventually.
There's a lot of great SF out there; I'd be really careful about presuming someone didn't think of our current tech in one form or another, cosmetic differences aside.
Re: (Score:2)
Eh, how are (largish) devices imagined for 500 years in the future, which tap into a central mainframe-type unit for any processing and have no display, an indication that our current phones are not ahead of what sci-fi predicted for personal communicators?
Sure, if you could find a single instance of an all-display pocket supercomputer, preferably not imagined for hundreds of years in the future, it would be interesting, and it still would not go against the point of my post: If you read/watch sci-fi (writt
Yes (Score:2)
1. Amiga graphics
2. Macintosh resolution (although just black and white)
3. Doom shareware - and creating a multiplayer LAN
4. Star Control 2 music (mod files)
5. GLQuake with a 3dfx Voodoo card
6. Maybe SGI computers. Maybe ATI and nVidia graphics demos. I can't say anything else has really made my jaw drop. I loved changing technology and couldn't wait to see what came next, but these days e
Re: (Score:2)
You got it all wrong- (Score:2)
Is Alexa, Siri, the Xbox, Oculus Rift or iPhone truly what could be considered "amazing technology," or should we have bigger and badder tech and innovation in the year 2018?
None of that shit is innovative. The word that fits is flashy.
Innovation is: Reusable rockets, manned space stations, and unmanned war-fighting machines, it's full take surveillance, and social networking. It's genetic therapy, and integrated cybernetic prosthetics. It's 3D printed houses, and drinking water from thin air. It's self driving cars being prevalent enough to kill people, and a rich guy shooting one into space because he can. It's high profile hacks, and the buying and selling of personal inform
More exciting? Maybe. Definitely DIfferent (Score:3)
The movie Hidden Figures had many excellent and authentic moments, but one of them was when one of the women got unintended access to "The IBM" (as they called it), picked up a book on FORTRAN, and taught herself how to program. With access and a basic grasp of logic she built herself a career.
That's kind of how it happened for me and my peers, although we didn't have to steal our books from the whites-only section of the library. There were already good courses and professors in Computer Science at the university but their real value was mostly in giving the nascent programmer access to equipment on which to learn.
In that day you could easily have a thousand people using one computer such as a CDC 6600. A hundred people for a PDP-11 in timesharing was not uncommon. Today I have easily more than 100 Intel cores for my personal use and access to many more. My Macbook Pro alone has eight cores and more storage, compute, and network power than a $20M supercomputer complex had back in 1970.
So for me access was the key. Not everyone could get access to computing capability that could do anything meaningful. Back then programmers looked more like a mysterious priesthood what with their exclusive access to special locked rooms and intimidating looking equipment and the ability to command "thinking machines". Being a member of the club I suppose had an attraction.
All the same I think I am having more fun today than I did then. There are so many more interesting things to play with.
Short answer: yes and no. (Score:5, Insightful)
In some ways it was more exciting back then, in other ways it's more exciting now.
In terms of what you can actually do, there's no comparison: today is much better. And a lot more is known about how to do things like testing and integrating large systems. But you don't so much stand on the shoulders of giants today as you do on great masses of talented but basically ordinary people. Back in the day if you didn't like the way a library worked you made your own routines. Today the volume of source required to produce the kind of applications we use today is so large you pretty much have to resign yourself to working around the mistakes of others.
Depends which technology... (Score:3)
I.T. seems to be going in the direction of a better e-leash. How to better track your location via your phone, car, etc, how to track your habits to better throw ads in your face, how to sell things that used to be a one-time fee to you as a service paid monthly, how to keep connected to your boss 24/7, how to track citizens' travels as a government. Pardon my cynicism.
On the other hand, recent developments in biomedical science, electric cars, renewable energy and private space travel have been amazingly cool. So it really depends on the actual technology.
Re: (Score:2)
The reason you don't feel 'wowed' is that the current developments in the next fields of the revolution (much like computers back then) haven't really hit the mainstream yet.
As someone in the field, I'd argue that the advent of CRISPR-Cas9-style gene editing, the implementation of DARPA's NESD project to directly interface with the brain, and the incredible success of Ginkgo Bioworks at designing new life forms to solve world material shortages make that old tech you reminisce about seem boring.
What survived the test of time (Score:3)
The survivors of the 30-year test of time probably look better than today's tech, which hasn't been through that filter yet.
Almost like there is a bias in the survivorship. Someone should come up with a name for this bias for noticing survivors.
One big difference back then. (Score:4, Insightful)
There was hope, well for me at least.
The future felt ever possible: Star Trek, Star Wars, amazing technological leaps; nothing bad could happen. I had no idea of the world we live in, economically, ecologically, socially. This place, I have no faith anymore, I have none. I do not in any way suspect we'll "make it". Not as we are, not as we've been. The inertia of global warming, the attitudes of the common man. The decisions of government.
The only thing which could excite me right now would probably be a rogue planet on a collision course here, aka Melancholia.
Slightly less sombre, I said to someone just this past weekend, the only thing which could give me faith for our future, would be an Arrival (movie) situation, we might stand a chance then. If someone were to come, peacefully and offer us advanced, seriously, incredibly advanced tech. Maybe we'd figure shit out.
Unlikely though, extremely.
Re: (Score:2)
We were in that situation back in the 1980's - "The Coming of the Chip" or "When the Chips are down". Back then the UK knew that a digital revolution was coming ... the microprocessor ... the paperless office ... the digital workplace ... all sorts of names ... home computing ... IT skills ...
The first wave was when the newspaper printing presses were replaced with digital laser printers and workstations. Whole departments disappeared overnight. Email, spreadsheets and word processors changed the roles of se
Yes and no (Score:2)
The key technological developments of 30 years ago were things you could buy and hold in your hand or put in your house/car/whatever.
The key technological developments of today are not "things"; at best they are "things as a service". The Echo is essentially a powered microphone with a wifi connection, which could have been done a decade ago. Alexa, however, is not a thing you can hold or kick or even own.
The rate of improvement has slowed (Score:2)
Back in the day, there were lots of new applications and interfaces that tried to do things in new ways. Some worked, some didn't.
However, we seem to have gotten stuck in one of the neural-network sub-optimal potholes. Email apps today are basically identical to Eudora. Calendars still suck. Even tools like Slack are just warmed-over IRC.
Just look at the UI for Kai's Power Tools. Whoa!
While there may be an optimal UI for various use cases, there's no particular reason that the Eudora UI should be the one th
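The "sub-optimal pothole" above is the local-minimum idea from optimization, borrowed for UI design. A minimal sketch of how plain gradient descent settles into whichever valley it starts near (the function, starting points, and step size here are invented for illustration):

```python
# Gradient descent on f(x) = x^4 - 3x^2 + x, which has two minima.
# Depending on where you start, you settle into a different valley,
# and the shallower one is never escaped without a bigger jolt.

def f(x):
    return x**4 - 3 * x**2 + x

def grad(x):            # derivative of f
    return 4 * x**3 - 6 * x + 1

def descend(x, lr=0.01, steps=2000):
    for _ in range(steps):
        x -= lr * grad(x)
    return x

left = descend(-2.0)    # converges to the deeper minimum (x ~ -1.30)
right = descend(+2.0)   # converges to the shallower minimum (x ~ 1.13)

print(f"start -2: x={left:.2f}, f={f(left):.2f}")
print(f"start +2: x={right:.2f}, f={f(right):.2f}")
```

The claim in the comment is that email UIs rolled downhill into the Eudora valley and have sat there ever since; getting out would take a deliberate jump, not more incremental polish.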
Hottest it's been since the '60s (Score:3)
As someone with a 60-year view on things, I can say excitement about technology wanes and waxes with time, but currently we are in a waxing phase. Voice-activated AI is starting to permeate our lives, self-driving cars will be arriving soon, and space has become exciting again with SpaceX.
Here are the main upticks in general interest in technology as I see them, starting before I was born.
1920s Travel by Car
1930s Electricity delivered to the home (wide adoption).
1940s True air travel and widespread radio use.
1950s Atomic Age, Automation, TV
1960s Space Age, Mainframe Computers
1970s First Lull, though we did get VCRs and time-shifting
1980s Video Games, Personal Computer (OK both arrived in late 70’s, but this is wide adoption)
Early 1990s Another Lull, Personal Computing great for business – home use doesn’t live up to promise, Video Games cool a bit.
Late 1990s, Cell Phones, and the Internet (again wide adoption)
Early 2000s, Another Lull, though the Internet was continuing to pick up steam, computers are truly useful at home now.
Late 2000s, HDTV wide adoption (finally TV is improving at a rapid pace)
Early 2010s, Smart Phones wide adoption
Late 2010s, Voice recognition AI, AI in general is rapidly improving and being used widely in business, cusp of self-driving cars, Virtual Reality is now out of the lab, SpaceX has given us back the space-age.
Compared to previous decades I think this one is only getting hotter and hotter – Not quite yet at 1960’s Space Age general interest in technology, but a close second (and the decade isn't over yet).
Re: (Score:2)
Early 1990's - that was the hardware-accelerated SVGA graphics modes, plus audio multimedia like CD drives and MIDI soundcards, and (a little later) multimedia instruction sets like MMX,
Early 2000's - the first programmable GPUs, audio has been sorted, superscalar CPUs (500MHz+)
Hindsight makes everything look easy (Score:2)
The internal combustion engine is merely a steam engine with the heat-driven gas expansion happening inside the cylinder instead of in an external boiler.
A jet engine is simply an internal combustion engine, turbocharged, with the pistons and valves removed.
Looking back at these innovations, there isn't much difference between a steam engine from 1700 and a jet engine from the mid-1900s.
50's was more like the 80's than the 80's vs today (Score:2)
When technology Byte's . . . (Score:5, Interesting)
When I read the article, the first thing I thought of was a singular event that defined for me the transition between hands-on, brains-in enthusiasm for new technology and passive disinterest or boredom with it: Byte magazine. For those who never had the opportunity to read it, it was hardcore nerd stuff, printed from 1975 to 1998. In those burgeoning early days of ICs and PCs, it was a great way to learn about digital and computer technologies. It was not a technical journal; it was a general-interest magazine, but Byte stories got in-depth on processor architectures, fab methods, system building, application programming, peripheral interfacing, and so on. It was a great way to stay informed about what was then a genuinely innovative and exhilarating set of new technologies. Remember, this was the era of Space Invaders, Pac-Man, Tron, Tandy, Commodore, Lotus, VisiCalc, and the early PC and Mac. Then, in the early 1990s, the internet started to gain traction. General public enthusiasm bloomed with the dot-com era of the later decade, but in the early decade, the days of Netscape and Mosaic, Byte saw the future and decided to shift the magazine's focus. It went all in on the internet, changing its name to byte.com.
I remember reading the first new edition, where the publisher explained the shift in focus. However, instead of adhering to their admired focus on technology reporting, they reported on where the internet could take you. Imagine a world class automobile magazine that was exalted for its in-depth articles on engine design, torque and hp, engine machining, carburetor specs and tuning, tire manufacture, highway engineering, and traffic control systems. Then suddenly in the 1960's, suburban shopping malls start popping up, so Auto Magazine then switches its entire format to describing what you can buy in the stores at the mall, which of course the car will take you to. It reports solely on where and how to go to the mall to buy shoes and clothes for a Sunday jaunt, or tires or a battery at Sears Auto, or fuses at Radio Shack. You could even buy other stuff at the Mall. Wow! That is what Byte became, a guide to online places and experiences. Bye bye Byte.
It is easy to get beguiled by something solely because it is new. Back then, the internet was new, and it was exciting. But the underlying technology per se was perhaps too arcane or unseen for most people to care. The applied internet was what caught attention, those things that ordinary people could do with it, but not the physical infrastructure underneath. Even for the hobbyist or hacker, you couldn't just tap into an internet trunk on your own, so the technology itself became less tangible.
It depends on how you define technology. Wireless is a great new technology that has radically altered how we do things. But "wireless" is just radio, telephone, and PC all commingled, and each of those is an old technology. Are iterative improvements or logical machine hookups the same as fundamentally new technologies? It does seem that a lot of the new technologies of the past 30 years are iterative extensions or market-driven mashups of prior, genuinely novel advancements.
No way (Score:2)
I was there 30 years ago, and IT just keeps getting more and more interesting. It was cool geek fun back then, but now it's fucking awesome, and it seems to get better and better. Only problem is that it has attracted some real dicks to get involved; they need to go.
My 2c
This time it's different! (Score:2)
I think in the '90s, and definitely the time before that, there were physical/hardware limitations to a lot of ideas. A lot of the time people knew the value of this stuff, it's just that it had never been done before, and sufficiently powerful hardware didn't necessarily exist either. In other words, there were lots of situations where tech could be implemented to improve processes and just make it easier to do stuff.
Where we are at now, hardware has definitely stagnated. The last 3-5 years, I'd say
Re: (Score:3)
Tyler Cowen, an economist, wrote an interesting book, The Great Stagnation, about how all the low-hanging fruit of economic innovation has been picked, at least in the US. Women participate in the workforce, we've mostly maximized the productivity enhancements of technology (assembly lines, basic robotics, office automation), we've tapped into (and in some cases, nearly tapped out) cheap and simple energy sources, and so on.
It's not to say there aren't marginal improvements, but they come with
Yes. Yes, it was (Score:2)
Back then, each development brought something new, something you couldn't do before. Nowadays? Not so much: a decade ago my computer could render beautiful worlds in real time, or stream full-screen HD video from the internet, and today it still does that. Yes, the shadows are slightly softer and the hair is a bit more realistic, but those are refinements, not entirely new capabilities.
Thirty years ago, having sampled sound was magic. The ability to manipulate objects the size of the screen was magic. Havin
No, just more useful (Score:2)
Technological advances 30 years ago were for the benefit of their users. Technological advances today are for the benefit of their maker.
I prefer to be the user to being the product. That's why they were more exciting and actually something to look forward to. The question back then was "how do I get it?" The question today is "How do I turn it off?"
Sure (Score:2)
When I switched from a green text-only monitor to a Color Graphics Adapter, I thought I was living in the future.
Also, when I bought my first pair of Levi's jeans on CompuServe in 1989, I thought: this is the way to shop.
The first moving graphics were awesome as well.
By the time we started buying all our books via Mosaic (on books.com, if memory serves), we were already a bit blasé about it.
Everything was more exciting 30 years ago (Score:2)
Because I didn't have 30 years of learning how much bullshit is spouted by people with hidden money in the game.
This is closely related to why politicians keep wanting to reduce the voting age.
Yes, fewer fads (Score:2)
Fatigue, habituation (Score:2)
In other words, we're burned out. You can only have your mind blown for so long.
Yes, but there are still fun areas: Arduino, etc. (Score:2)
Modern computers don't allow for such work, it's all board swaps and e-waste disposal. I doubt many current geeks have ever used a soldering iron.
There are still Fun Things to Tinker Wi
The Space Shuttle. (Score:2)
Rise of RISC (Score:2)
30 years ago, RISC CPUs were just coming to the market.
SPARC, ARM, MIPS and PA-RISC were all released in the mid-to-late '80s, and it wouldn't be long until they were joined by the likes of Alpha.
If you could afford them or had access to them, these RISC based machines were a step change in computing power compared to contemporary CISC CPUs of the time, which would have been 286 or 386 (if you were lucky) or 68K. They destroyed even the mini-computers of the time, such as VAXen. It must have been an exci
It's like REM said... (Score:2)
It's like REM said... Standing on the shoulders of giants leaves me cold.
We're all standing on giants who were standing on giants who were standing on giants, and so on back to the stone age.
The question is pretty pointless. They may as well have just asked: "How much older do you feel today than you felt 30 years ago?"
Re:Yes (Score:5, Insightful)
Things definitely seemed to move faster. I was watching a video about the 8-bit ZX Spectrum today. It ended production in 1992, and by 1995 we had the PlayStation. In comparison, my current computers are mostly over 5 years old, and the latest models are not really noticeably better for most tasks.
Re: (Score:3)
Business interests have taken over. The next major development is underway now with qubits. The big problem is the storage cartels. They're conspiring to keep manufacturing limited, HD capacities relatively low, and prices inflated.
Re:Yes (Score:4, Informative)
The Spectrum was launched in 1982.
Re: (Score:2)
Sure, but you could still buy one new in 1992. The Amiga 1200 was brand new. The 16-bit console era was in full swing.
Re: (Score:3)
My point was, you can't point to Spectrums in 1992 as evidence of rapid progress. The Spectrums on sale in 1992 were the last gasp of a dying company selling 10-year-old hardware. There are 13 years between the introduction of the Spectrum and the PlayStation.
Re: (Score:3)
My point was that from the ordinary user's point of view, the kid who got a computer for their birthday or the adult looking to buy a machine for games or applications would have been considering machines like the Spectrum, Commodore 64 (discontinued 1994), Amiga, Archimedes and low end DOS/Win 3.11 PCs.
Computers back then had a much longer shelf life. The speed of developments seemed more rapid to people.
Having said that, the Spectrum was 1982, and in 1985 we had the Amiga, and later machines like the X680
Re: (Score:2)
The exciting time for a technology is when it transitions from being a novelty that demonstrates some potential to being good enough for widespread use. This transition took quite a long time for home computers - from about the mid '80s to the late '90s. It took a similar amount of time for smartphones, but there was a much sharper inflection point around the time of the original iPhone when large displays became cheap and there was a big jump in usability. Few people used the earlier smartphones and the
Re: (Score:2)
In comparison my current computers are mostly over 5 years old and the latest models are not really noticeably better for most tasks.
The problem is you define "better" purely by the speed of your tasks. That ignores a lot of the industry's focus over the years. You're not hugely faster at playing games, but then you ignore that you can now play them on battery power for a few hours at a time. You ignore the focus on intelligence and personal assistance; really, I greatly prefer the modern world of accurate mapping and traffic information to slightly better gaming graphics (something which has also advanced even in the past 5 years *If you have
Re: (Score:2)
Airplanes are actually a good example. Most of what we currently know as commercial airplanes were pretty much solidified by the late 1960's. Most changes since then feel very much like incremental optimizations.
Re: (Score:3)
If you told me 30 years ago that state of the art VR in 2018 is cardboarding your phone to your face I would have said "Fuck you."
Re: (Score:2)
But even without the internet, the computer has had a fairly profound impact in terms of disruption. I graduated a year before the WWW hit, and had been using the internet for nearly a decade before that, but mostly for sharing data and designs. But I was doing significant work on stand-alone computers (or more accurately mainframes) that would not have been possible before the computer.
Re: (Score:2)
A Raspberry Pi is a personal computer.
It's just not an IBM compatible PC.
Re: (Score:2)
I built my own 8-bit system with a Z-80A. Lots of us old codgers out there programmed with paper tape, CLOADs, and binary front-panel loads.
There are products, sure.
We went from a pair of wires for internal networks to coax, to twisted pairs, to fiber, to wireless, with branches for telco-supplied interconnect and carrier supplied interconnect. Carriers used CSU/DSUs along with telcos until networking protocols evolved, then fiber became practical. Some carriers still use DSL because they're cheap.
We w