Ask Slashdot: What Would Computing Look Like Today If the Amiga Had Survived? 221
dryriver writes: The Amiga was a remarkable machine at the time it was released in 1985. It had a multitasking-capable, GUI-driven OS and a mouse. It had a number of cleverly designed custom chips that gave the Amiga amazing graphics and sound capabilities far beyond the typical IBM/DOS PCs of its time. The Amiga was the multimedia beast of its day -- you could create animated and still 2D or 3D graphics on it, compose sophisticated electronic music, develop 2D or 3D 16-bit games, edit and process digital video (using the Video Toaster), and of course play some amazing games. And after the Amiga -- as well as the Atari ST, Archimedes and so on -- died, everybody pretty much had to migrate to either the PC or Mac platform. If Commodore and the Amiga had survived and thrived, there might have been four major desktop platforms in use today: Windows, OSX, AmigaOS and Linux. And who knows what the custom chips (ASICs? FPGAs?) of an Amiga in 2019 might have been capable of -- the Amiga could possibly have been the platform that made nearly life-like games and VR/AR a reality, and given Nvidia's and AMD's GPUs a run for their money.
What do you think the computing landscape in 2019 would have looked like if the Amiga and AmigaOS as a platform had survived? Would Macs be as popular with digital content creators as they are today? Would AAA games target Windows 7/8/10 by default or tilt more towards the Amiga? Could there have been an Amiga hardware-based game console? Might AmigaOS and Linux have had a symbiotic existence of sorts, with AmigaOS co-existing with Linux on many enthusiasts' Amigas, or even becoming compatible with each other over time?
Favorite Alternative History Site (Score:4, Informative)
This [stackexchange.com] is probably my favorite site for hypothetical futures.
What if Dinosaurs never went extinct? (Score:3)
Hardware designed specifically for software that was designed specifically for said hardware seems to be where it's at.
So much emphasis on compatibility in the dominant markets is quelling innovation.
You can't just build the best possible machine. You have to build a machine that is also compatible with a never-ending laundry list of protocols, standards, APIs, hardware, form factors, etc.
It is both unfortunate and apparently necessary.
amiga needed pci when pc's & bit later macs go (Score:2)
The Amiga needed PCI when PCs -- and a bit later Macs -- got it.
Re: (Score:3)
Amiga got PCI but real Amiga development was gone by then. The technology had been purchased by a third party and they basically just coasted the remaining value out.
Re: (Score:2)
The original IBM PC was a lot worse for that. At the time DOS didn't even have drivers, so you had to make your hardware register level compatible with IBM's. That rather limited innovation to say the least.
The Amiga could have been the dominant platform. It was expandable, there are APIs for hardware abstraction even in fairly early versions of the OS.
Re: (Score:3)
BIOS is your driver, very much the old CP/M-80 way of dealing with things. Anything not defined by BIOS has no abstraction, later option ROMs were possible on cards which is how we got SCSI and IDE to boot.
But writing a custom BIOS was too much of a pain for a little clone-maker shop when IBM gave the sources away for free. On top of that, the use of off-the-shelf components made cloning an IBM PC much easier than doing something new. Cloning a C64, Amiga, Atari ST, or Macintosh was harder because of the c
Re: (Score:2)
It is both unfortunate and apparently necessary.
Interoperability is one of the greatest success stories of the PC. Even in the old days pre-USB, pre-PCI, pre-plug and play the PC itself was painful enough to make work, let alone any talk of a special purpose device.
They have their place, but it's not in general purpose computing.
Re: (Score:2)
Are you old enough to understand the pain of device configuration that was pre-USB? Assigning IRQs by hand and such?
And how having one plug was great? Until we had a bunch... and then, by sheer fucking accident, politicians did something intelligent and mandated a single plug standard again.
And then we did this all over again.
Standards allow the rest of us who just want to Get Shit Done, to Get Shit Done, instead of, "oooh shiny!".
Hence the part where I included:
"It is both unfortunate and apparently necessary."
If I want to build a machine that will be marketable, the standards are necessary. If I really wanted to build ONE machine that was REALLY good at something, I'd have to ignore them.
Re: (Score:2)
If I really wanted to build ONE machine that was REALLY good at something, I'd have to ignore them.
That's why CRAY was so weird.
Re: (Score:3)
Politicians did what? Aside from dangerous stuff (mains power) the law generally doesn't regulate what sort of plugs you have to use for anything -- individual products do need to be FCC approved, etc., but there's no law mandating that your computer has to have USB or Thunderbolt or whatever. Manufacturers can make whatever they want -- it just made sense to standardize on something that all devices could use, and USB won out over Firewire (and continues to win out over Thunderbolt, hence Thunderbolt throw
Re: (Score:2)
Plug and Pray wasn't good enough for you huh?
Re: What if Dinosaurs never went extinct? (Score:3)
The problem wasn't having to set IRQs... the problem was that there were only 16 (until Intel expanded them to 24, sometime after it mattered), and 95% of cards only allowed you to choose between 2 or 4 out of 2/9, 3, 4, 5, or 7.
If all cards had been 16-bit & we'd had 32 or 64 IRQs (selectable via 5 or 6 jumpers to set an arbitrary binary value) to choose from, IRQs would have been little more than a bookkeeping nuisance.
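As a rough sketch of the jumper idea above (purely illustrative -- no real card worked with this many lines, and the function name is made up):

```python
# Hypothetical: 5 jumpers encoding an arbitrary binary value would let a
# card select any of 2**5 = 32 IRQ lines, instead of a choice of 2-4.
def irq_from_jumpers(jumpers):
    """jumpers: list of 0/1 values, most significant bit first."""
    irq = 0
    for bit in jumpers:
        irq = (irq << 1) | bit   # shift in each jumper setting
    return irq

# Jumpers set to 1-0-1-1-0 select line 0b10110:
print(irq_from_jumpers([1, 0, 1, 1, 0]))  # -> 22
```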
USB has its own pain points, especially for embedded development. At least with RS-23
IRQ numbers (Score:2)
IRQ numbers aren't supposed to be separate channels one per piece of hardware.
IRQ numbers are supposed to be priorities, i.e.: which should the CPU serve first when multiple come at the same time.
It's perfectly possible to share IRQs.
Multiple drivers register for the same IRQ; each handler, when called, checks whether its own piece of hardware is requesting attention (e.g., is the Sound Blaster signaling that its buffer is empty?), services it if necessary, and chains to the next handler in the chain (IRQ7 wasn't triggered
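The chained-handler scheme described above can be sketched in miniature (device names are hypothetical; Python stands in for driver code):

```python
# Sketch: sharing one IRQ line by chaining handlers. Each handler polls
# its own device's status flag; if the device raised the interrupt it
# services it, then passes control on so other devices on the same line
# also get checked.
class Device:
    def __init__(self, name):
        self.name = name
        self.pending = False   # stands in for a status-register bit
        self.serviced = 0

    def service(self):
        self.pending = False
        self.serviced += 1

def make_handler(device, next_handler=None):
    def handler():
        if device.pending:     # was it MY hardware that fired?
            device.service()
        if next_handler:       # chain to the next handler on the line
            next_handler()
    return handler

sound = Device("sound")
net = Device("net")
# Build the chain: sound's handler runs first, then net's.
irq7 = make_handler(sound, make_handler(net))

net.pending = True   # only the network card raised the line
irq7()
print(sound.serviced, net.serviced)  # -> 0 1
```

The point is that the IRQ number identifies the shared line, not an individual device; disambiguation happens in software.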
Re: (Score:2)
And the Amiga predates USB by many years, and never had to mess around with manual assigning of IRQs etc.
the same as today (Score:3)
Computers have become so immensely and ridiculously complex that there's no common ground left with machines from 30 years ago. You can call your desktop a "PC" but it has as little to do with a 25 MHz 386 as it does with the Amiga.
Supposing that Commodore hadn't been run by stupendous idiots, the natural evolution of computing hardware would have happened much the same. Carmack would have maybe made his games for the theoretical much better Amiga Commodore should have made and the market would then have started making more and more hardware for the new games. Either Commodore would have had no choice but to evolve along or it would have had to open up the Amiga architecture and allowed clones.
The shape of present computers is pretty much inevitable.
Re: (Score:2)
I'd go the other way and say everything is almost exactly the same, except that most apps are now built using a large number of bloated libraries. If you're writing an operating system, everything is about the same. Everything from 386 to now has been incremental changes, there have been no hard breaks in compatibility. Old software still works, and the new software would work fine wherever it would fit.
The Amiga didn't have special hardware. They just had a different package of price/performance tradeoffs.
Re:the same as today (Score:5, Insightful)
I think saying that it "didn't have special hardware" is a bit of a disservice to the original design team. They made some pretty amazing, highly-optimized designs that did more with less silicon. Yes it did involve tradeoffs, but there was also some serious artistry involved in the circuit design.
Many/most computers of the day used off-the-shelf parts cobbled together in a way that let them achieve their end goal via brute force. The Atari 2600/Atari 400/800 team that wound up defecting to create the Amiga brought a lot of their special talents with them to create something special, just like they did in the 1970's at Atari (seriously, the Atari 8-bit machines were 3+ years ahead of the curve in capability on release).
Which is funny as the Atari ST was very much a straightforward brute-force machine, nothing like its predecessors. Unfortunately the off-the-shelf hardware it used was mostly shared with the Amiga as well, so people wrote simple software for the ST and just ported it straight to the Amiga.
Of course later on as the microcomputer market exploded, economies of scale kicked in, and manufacturing brute force hardware got cheaper than funding the creation of bespoke designs -- until we started hitting walls in physics in the past 10 years or so and now silicon is having to get creative again, albeit only through large players that can pay for large production runs. Modern Intel/AMD/Nvidia CPUs/GPUs *are* Amiga-like works of art today, they're just hidden behind generic ABIs/APIs.
Re: (Score:2)
Re: (Score:3)
They did have different hardware -- at the time, the whole co-processor approach was new. You'd be surprised at how many things you think of as normal started with the Amiga.
There's a documentary called "From Bedrooms to Billions" [imdb.com] that describes the whole Amiga development -- from its hippy beginnings to its corporate death. It's not the best documentary ever, but the subject matter should be really interesting to you.
But for time -- I think Apple would have stayed bankrupt and disappeared, and Microsoft would have used A
Re: (Score:2)
Computers have become so immensely and ridiculously complex that there's no common ground left with machines from 30 years ago. You can call your desktop a "PC" but it has as little to do with a 25 MHz 386 as it does with the Amiga.
True, but not really. My PC is still mostly hardware-compatible with that 25 MHz 386.
Re: (Score:2)
True, but not really. My PC is still mostly hardware-compatible with that 25 MHz 386.
And yet you cannot directly address any of the hardware as you did back when your 386 was state of the art. You are purposefully abstracted from it, and a large part of the complexity of modern systems is the legacy compatibility.
As for hardware compatibility, I think you may have made a typo. There's nothing hardware-compatible with the 386. Your I/O cards do not share any means of communicating with your 386, as PCIe is fundamentally different from PCI, which itself was quite different from the ISA expansi
There was an Amiga based game console... (Score:2, Informative)
...the CD32 [wikipedia.org].
It sold well, for a while, but failed to save Commodore.
Re:There was an Amiga based game console... (Score:4, Interesting)
The CD32 sold for about 9 months and due to a trademark dispute they couldn't sell it in the United States.
Most games were ported from the A1200 (disk-based) but with a CD soundtrack or maybe some FMV, but the software was essentially the same.
So the CD32 never really had enough time to become its own system, and Commodore was already dead in the water at the launch time of the CD32 so publishers largely stayed away. The A1200 was really the last Amiga with its own software.
People into Amiga should really check out YouTube as there are a lot of interesting channels there like Kim Justice [youtube.com] with her history of the platform [youtube.com]...and Retro Recipes [youtube.com].
Guru meditation jokes (Score:2, Redundant)
instead of blue-screen-of-death jokes
not much different probably (Score:3)
Re: (Score:2)
Oh, I have MachTen, too.
I was thinking about it a few days ago but could not remember its name. I guess it is in the big box in the cellar.
Well, Linux/m68k should run on Amigas -- not sure, though. Oh, it does: http://www.linux-m68k.org/ [linux-m68k.org]
Many of my friends had Amigas or Ataris, I had a mac and an Archimedes.
A computer I'd like to get my hands on is Sinclair's 68k machine -- I forget the name; the QL, perhaps?
Or the TRS-80 Color Computer (Score:4, Insightful)
Its M6809 processor was more powerful, such that you didn't need as powerful a graphics chip.
Regardless of which system it is, and even though computers are technically thousands of times faster today, the 1980s computers were tremendously faster (in effect) and easier and more fun to work with.
Over all these years, my sense has been that while hardware improved, software just kept moving backwards. For example, a modern Node.js + React application is hugely more complicated and time-consuming to both make and maintain than COBOL. And frankly, the newer tools and languages really don't bring that much more to the table... Graphics are better... but that's really just praise for the hardware. Doing the same things in software as we used to do is much harder today than it used to be... be it graphics or data processing.
Re: (Score:2)
One of the best CoCo hardware hackers runs:
https://thezippsterzone.com/ [thezippsterzone.com]
Only correct answer: It Depends. (Score:2)
What else happens differently if there are two high-endish, tightly controlled platforms instead of one?
Let's say Commodore survives the 90s, but the MS OSes (DOS/Windows/etc.) become dominant as they did in the real world. Perhaps two of these platforms can't survive the Win95 onslaught, so macOS is dead, and Commodore slots right into the Apple Macintosh line. But this is boring so this scenario will be ignored for the remainder of this post.
If say both survive Windows is dominant, but less dominant. Tha
Re: (Score:3)
AmigaOS in 1985 was, from a kernel perspective, similar in feature set to Windows 95 -- albeit missing memory protection (something that was supposed to be included, but Commodore nixed it because they didn't want to pay for the MMU). Obviously Win95 included many more user-facing features (TCP/IP etc.) than 1985's AmigaOS did. Still, AmigaOS was really a full-fledged preemptive multitasking kernel, much closer to UNIX than Windows 3.1 or MacOS 9. In 1991 a POSIX shared library was re
People forget (Score:3)
My opinion. I could be wrong, but then so could you.
Re: (Score:2)
...and the reason PCs got cheap is that IBM created a device built from off-the-shelf parts that was easy to copy, with design IP that was impossible to protect. In one sense a massive IBM screwup; in another, an early, accidental example of the power of open source.
Also the start of decades of pain for the poor sods trying to use the not quite compatible mess that created!
Re: (Score:2)
It makes you wonder what the personal computing landscape would have looked like had IBM built a machine that couldn't be copied.
Do you end up with the various other "big" computer companies also releasing a personal computer? Circa '83/'84, I remember DEC had the Rainbow 100 and HP had something, too. Are they all linked by a CP/M and Intel CPUs, or does Zilog get the Z8000 off the ground and gain traction? What about the 68000?
Does Apple's release of the Macintosh in a fragmented, CP/M world make it mo
Re: (Score:2)
Strange that we still have hundreds of different car makes/brands/types.
You seem to mix up a few things about competition.
Most "old" systems went bankrupt, for what ever reason. Competition it was not.
The prices for all those systems were very similar to each other. Problem was compatibility ... and then applications and games got written for the platform most common. And: to chose a platform, Atari, Amiga, Archimedes, Sinclair: you needed to be a geek. No one really understood the implications of shifting
Seriously... (Score:3, Interesting)
The Amiga died because of Commodore. There was no path for survival under Commodore, full stop. What could have saved the Amiga? IMO only of course...
1. Clones. 1987 and there were PC 386 clones everywhere. The Amiga had the A500 and A2000. Just about a fair fight, especially on price. But there wasn't a business out there (except specialty niches like video) that would use an Amiga as a general purpose machine without access to commodity hardware from "names". Commodore was a games machine, not a business computer, because of this factor and the lack of software. This was true since 1982 and the release of the IBM PC.
2. Software. Businesses didn't care much about color graphics (until Windows 3.1, really), but they cared a lot about Lotus 1-2-3 and dBase II. They cared about DOS too -- mostly in the ability to train their users and exchange floppies. And if the other department was using DOS, you had to too. IT departments liked standardized hardware and software, and the Amiga was too different. Commodore made some half-assed attempts with bridgeboards, and there was CrossDOS, but none of that was going to make an entry into the commodity PC market.
3. Games. The Amiga was a games machine in the eyes of companies. So why didn't Commodore push that aspect harder and go all-in there? Well, first of all it was too expensive as a dedicated games machine (the A500 eventually came close enough) -- and second, had everyone forgotten the big video game crash of 1983? Most companies hadn't, and there was high reluctance to create a dedicated game machine.
4. Commodore. Commodore had no competence in marketing the Amiga or positioning it. The C64 practically sold itself, but the Amiga was another situation entirely. Yes, AmigaDOS and Intuition were very sophisticated, but that didn't really sell machines. Yup, the graphics were amazing and you had animation, but whizz-bang didn't sell machines into the business market where the real money was. The "Home Computer" era had passed, and people wanted machines like they had at work so they could use the same software and transfer data. The Amiga didn't have that. I come back to this again as I think it's the number one factor - software and data file compatibility.
The old Prevue Guide was Amiga. The Weather Star was not (Score:2)
The old Prevue Guide ran on an Amiga. The Weather Star did not.
The Weather Star 4000 was cool.
Maybe the Weather Star XL would have been an Amiga-based system and not an SGI O2.
Irving Gould and Mehdi Ali (Score:3)
Depends on how.. (Score:2)
If it was just Commodore pulling a huge monopoly of hell, it would be quite terrible, with specialized overpriced computers everywhere.
But if it was the base for the clone machines, like the PC was in this timeline, things probably would be better, as the Amiga was a better base.
Also Motorola would be freaking HUGE.
Maybe we wouldn't be stuck with x86-64 (Score:4, Insightful)
I wasn't under the impression that the Amiga ever really died -- there have been die-hards touting it since 1985, keeping old hardware running and bringing up new hardware to try to get people to recognize what they saw as the brilliance of the Amiga. Before you say I have no experience with an Amiga: I bought one around '88, and it never dazzled me as a workhorse, day-in, day-out computer; it was more like a series of demos put together in a box. Games were pretty good, but that was it. So for the Amiga to have flourished, it needed basic productivity apps as well as good software development tools (the SDK was crap at the time and really turned me off). Along with that, it needed to be better built than a Commodore 64 -- I had buttons coming off, and the fit of the case was pretty warped. Compare the lightly built Amiga to the heft of the IBM PS/2s, Apple Macs, Compaqs and other systems of the time, and you'd have a hard time convincing corporations to invest in it for their workers.
In terms of OSes, IBM was well on its way to defining OS/2, Microsoft was looking seriously at multi-tasking versions of Windows and Apple was moving ahead with the MacOS. So, I don't think OS capabilities would be that different from today if Amiga had become the standard.
As for graphics and music, those are things people have always wanted to do on PCs, and a number of companies have provided the hardware and software that let the machines we have today create, distribute and enjoy multimedia content. I don't think the Amiga would have driven that any further than where we are today.
If Commodore could have pulled a real product together out of the Amiga and figured out how to market it successfully as something that could do everything -- and not just be a gee-whiz demo machine -- then I think the biggest difference you would see is in processors: the 68k probably would have gotten a better foothold against its contemporary, the 80386, which might mean we would have more efficient processors today. That's the only difference that I can see.
I had a dream... (Score:2)
I think it was about two years ago. I woke up from one dream into another, and in that dream I owned an Amiga 8000, which looked a lot like the Amiga 1000 or Amiga 2000, with the LCD (replacing the CRT of old) standing on top of the cabinet.
Back in 1987 I got an Amiga 500 with Kickstart 1.2, and I didn't replace it with an x86 PC until around 1995. In the dream, the Amiga was THE go-to gaming system. The IBM compatibles had been focusing on the business segment, and didn't add stuff like VESA graphics
Re:I had a dream... (Score:5, Informative)
Commodore was *NEVER* run by anyone who understood computers or the computer market. Jack Tramiel founded the company, and he understood the business equipment market very well, but that went from typewriters to calculators -- where the function remained the same, only the technology behind it improved. A fancy modern calculator did the same things as an older one, it was just smaller, used less power, and had a better display. It was a business about incremental improvements in quality, not complete paradigm shifts.
Irving Gould knew nothing about even THAT; he just knew how to move money around in ways that made him money. After he kicked Tramiel out he tended to hire people who also did that -- eventually settling on Mehdi Ali, who was very much in his mold. In the end they were VERY successful at managing Commodore in a way that made them a ton of money -- they were by far the highest-paid executives in the entire industry. Mission Accomplished.
Succeeding as the company that would revolutionize computing was never the goal, and the wins they did have were accidental byproducts. Tramiel wanted to make the same thing over and over with incremental improvements, while making them cheaper so the masses could afford them -- because that was what worked through most of history. Tramiel was more of a Henry Ford than a Bill Gates -- the problem was a Bill Gates was needed because computing is like going from horses to model Ts to flying cars to spaceships over the span of 20 years. He just hired some very, VERY good engineers that managed to innovate against their instructions.
Gould and Ali wanted to line their pockets -- and in the end they made more money for themselves than most CEOs at IBM ever did.
Amiga was a little too early (Score:2)
The revolution in hardware processing didn't happen until a decade later in the 1990s, when DSPs (digital signal processors) dropped enough in price that it became cost-effective to use them in cheap consumer devices - MP3 players, HDTVs, digital cell
We're too far into the future (Score:2)
To preface, I owned an Amiga 1000 back in the late 80s, and was a major Amiga zealot and advocate. Having said that, we are now too far in the future since the days of the Amiga. Yes, the Amiga was amazingly far ahead of the competition back in the day, and it was nearly a decade later that Windows could finally produce a mouse pointer with motion nearly as fluid and smooth as that of the Amiga. But the fact of the matter is that Windows / IBM-PC hardware did eventually catch up to the Amiga, and of cour
Migrated to what, again? (Score:4, Interesting)
I migrated to Linux from the Amiga precisely because of the spectacle of Commodore's management -- Irving "The Ghoul" Gould and Mehdi Ali -- running an incredible platform into the ground, despite having engineers champing at the bit to greatly expand on its success. It's like they were paid to wreck it or something.
I mean, Dave Haynie's prototypes would have had levels of hardware and system bandwidth outside of the CPU that wouldn't be matched anywhere else in the personal computing market for some time. The Amiga did not fade for lack of excited users or brilliant engineers; it faded precisely because CEOs more capable of asshattery than any normal human could make up decided those users and engineers were a problem.
Linux may not have had the multimedia glitz, but it had real multitasking, and the open source license was assurance that the users and engineers would be in charge. I never wanted to invest my computing time in a let-down like Commodore again. Even so, I would have loved to see open source be the scene of so much creativity.
What could one do to bring Amiga-like computing back? Standardize on a well-designed open GPU and use it to its utmost? Accelerate more code on FPGAs? Migrate from 680x0 to LLVM bytecode and multiprocess on whichever processor is available, like TaOS? See, the Amiga had extremely capable multimedia acceleration, yet its CPU was mostly an orchestra conductor of many other processors, and that would have been amazing to see taken further. And it would have been amazing to hold the trend in favor of lean, quality code that wouldn't gobble megabytes or unheard-of gigabytes of RAM.
No self-respecting Amiga programmer would have ever written a beast like Slack to do IRC's job. I'll just leave it at that.
the impossible dream (Score:2)
As an Apple/Mac user, my interest was in creative possibilities. The Amiga was exciting! Any computer could do database, text, spreadsheet stuff; the cornerstone of dull business interests. Only the Amiga could do the sounds and graphics that pointed to the future.
I'd be a dedicated Amiga user if it had progressed as it seemed it would.
Meanwhile my friends were making money with dBase. Capitalizing on the most user unfriendly OS and database system on the market because only hackers could make it work for s
Video cards, and to a lesser extent sound, on the Mac (Score:2)
Video cards -- and to a lesser extent sound cards -- on the Mac took over.
Over time, the Amiga had video cards that did more than the custom chips.
The custom-chips idea would have been killed off by the fast-moving '90s anyway. Likewise, an OS in ROM that needed a ROM/EPROM swap to update would have had to give way to flash -- or go the route of the G4 Mac, with a small boot ROM in flash and the bulk of the OS loaded from disk.
amiga = custom chips with little flexibility (Score:3)
The Amiga did a lot with relatively few transistors. But it was all hacks that would not have scaled very well with more transistors. Recall that the Amiga couldn't run Doom worth a hoot, because the CPU wasn't that powerful. But it did great Mario 3 clones, etc. By the time of the 386/486 the Amiga was firmly behind the curve. It may have been the first affordable multimedia machine, but to expand the architecture in a serious way they would have had to more or less start from scratch. So if the Amiga were still around, I doubt anything would be different at all -- it would just be another nameplate on top of the x86 (ahem, amd64) hardware we all know and love/hate (does it even matter anymore?).
Primary source: The Future Was Here: The Commodore Amiga (a really fantastic book, BTW).
Re: (Score:3)
The Amiga wasn't 'hacks'. The essential hardware design was purpose built for gaming and later acquired to form the basis of the Amiga. It would have 'scaled' just fine in the sense of any other VLSI design, just as the two never released successors to AGA. It's better to conceptualise the hardware design in terms of a videogaming console than a computer. The Doom era was a brief period where CPUs became powerful, video framebuffers fairly rapid and the transition from scrolling/sprite-based setups that fe
Re: (Score:2)
The Amiga custom chips were very good at hardware-accelerated 2D scrolling, which is what most games used at the time...
Higher-end Amigas could actually run Doom just fine once it was open-sourced and ported, but most users -- especially gamers -- had the lower-end models. The custom chips didn't help accelerate games like Doom, so it was using just the brute strength of the CPU.
Many 2D games of the day ran just as well on a 7 MHz Amiga as they did on a 486, because the chipset was doing a lot of the w
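The "chipset does the work" trick for scrolling can be illustrated with a toy model: the display hardware simply fetches each frame from a different start offset, so no pixel data is ever copied (all names and sizes below are made up for illustration):

```python
# Toy model of hardware scrolling: a "world" bitmap wider than the
# screen, and a visible window defined purely by a start offset -- the
# software analogue of pointing the display chip at a different address.
WORLD_W, SCREEN_W, HEIGHT = 16, 8, 2
world = [[row * WORLD_W + col for col in range(WORLD_W)]
         for row in range(HEIGHT)]

def visible(offset):
    # The "chipset" starts fetching each row at a different column;
    # nothing is moved in memory, so scrolling costs (almost) nothing.
    return [row[offset:offset + SCREEN_W] for row in world]

frame0 = visible(0)
frame1 = visible(1)       # scrolled one column right, for free
print(frame1[0][0])       # -> 1
```

A CPU-only machine would instead have to copy (or redraw) every visible pixel each frame, which is where the 486's brute force came in.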
While we are dreaming... (Score:2)
While we are dreaming, let's suppose that Microsoft went bust in 1995 so that Linux could rule the world much sooner, including the desktop. We'd be almost a decade further along now.
If AmigaOS had survived.... (Score:3)
- Cross-application scripting support would have been pervasive. This is something I still miss: the ability to make multiple applications work together was phenomenally useful.
- We would have had assigns - the ability to shorten any path to any single word of your choice. How I miss this...
- Creative applications (for making music, drawing, rendering, movie editing, etc.) would have been everywhere, and available for all levels of skill (instead of only super-low-end and super-high-end).
- Shareware would have been alive and kicking.
- We would have had screens, a vastly superior way to organize your work. Workspaces are kind of the same, but without the same level of usability.
- We would have had a shell where you can type things like "list all files since yesterday", or "copy from ftp:aminet/tools/readme.txt [aminet] to speak:"
Just my two cents...
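For readers who never used assigns, here is a toy re-implementation of the idea (the paths and the assign names are invented for illustration; real AmigaOS did this in the filesystem layer, not in application code):

```python
# Sketch of AmigaOS-style "assigns": a logical volume name maps to any
# directory, and software only ever refers to the logical name.
assigns = {
    "FONTS:": "/data/system/fonts/",
    "WORK:":  "/home/me/projects/current/",
}

def resolve(path):
    """Expand a leading assign name; pass other paths through untouched."""
    for name, target in assigns.items():
        if path.upper().startswith(name):
            return target + path[len(name):]
    return path

print(resolve("WORK:notes.txt"))  # -> /home/me/projects/current/notes.txt
```

Windows drive letters and Unix symlinks get partway there, but an assign is a pure name-to-path indirection you can repoint at any time without touching the filesystem.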
Re: (Score:2)
The ARexx port was indeed useful, although Windows has COM, Unix systems are designed to chain command-line tools together, and many applications support Python scripting now.
Both Unix and Windows have an equivalent of assigns -- Windows lets you assign arbitrary drive letters, Unix has symlinks.
Creative applications are everywhere; there are plenty of open source alternatives to the expensive ones.
Shareware has pretty much lost ground to open source; software today is too big for a single author to de
It had already won in the '80s. (Score:2)
It came out in 1981, when I went to study electricity and electronics. This was still vocational/technical/high school, I don't know the equivalent in English, but it was a STEM direction.
I got my final bachelor's degree in electronics in 1989, and in that year one of my electronics teachers had already declared the IBM PC (XT/AT) the winner, because it had the best hardware extensibility of all, which meant it could be used across all possible domains.
Re: (Score:2)
I am clearly talking about the IBM PC of course.
When I went to work one year later (after my military service), that was also the point that prices started to drop real fast for PCs.
My first job was in a small company that also sold Mac computers, but the only people we sold that to were people in the real graphical industry. All the others bought PCs.
We'd probably not be where we are today (Score:5, Informative)
Take off those rose-tinted glasses and remember what the Amiga ACTUALLY was like. It's like the East German Trabant: back at its inception it was revolutionary, but it didn't evolve. It stayed where it was. The Amiga 500 was the mainstay model for pretty much its entire lifetime. Yes, the 2000 and 4000 eventually came along, but few people had them and fewer companies developed for them, because they were incompatible with the A500 for the most part. That led to people staying with the A500 and its pretty large program library, which led to developers staying with it, and the vicious cycle continued.
What felled the Amiga was, oddly enough, what had made it revolutionary in the first place: its custom chipset, and the fact that as a developer you could depend on a fixed hardware standard, much as you could with consoles. Chips that you could (ab)use like none before (or probably since), which led to a lot of, let's say, creative programming practices that made porting off those chips to newer sets without the same quirks nearly impossible, along with lazy practices that depended on a very specific processor clock and controller behaviour to work at all. Many programs refused to run because programmers tweaked their code to depend on a specific number of clock cycles when they should have depended on a specific amount of time. On a clock whose frequency you know, that's the same thing. Not so once the CPU, and hence the clock speed, changes.
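That cycle-counting trap is easy to sketch outside of 68k assembly. The snippet below is an illustrative analogy in Python, not Amiga code: the loop-calibrated delay is the fragile pattern described above, while the wall-clock version survives a faster machine unchanged.

```python
import time

def delay_by_cycles(iterations):
    """Fragile: the pause depends on how fast this machine runs the loop.
    On a CPU twice as fast, the 'same' delay is half as long -- analogous
    to cycle-calibrated software breaking on faster processors."""
    count = 0
    for _ in range(iterations):
        count += 1
    return count

def delay_by_time(seconds):
    """Robust: the pause is expressed in wall-clock time, so it is the
    same on any CPU. Returns the elapsed time actually measured."""
    start = time.monotonic()
    time.sleep(seconds)
    return time.monotonic() - start
```

`delay_by_time(0.1)` pauses roughly 100 ms on any machine; `delay_by_cycles(10_000_000)` pauses for however long this particular CPU happens to need, which is exactly why cycle-calibrated code stops working when the clock speed changes.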
The Amiga was a dead end, not because it was a badly designed machine. Quite the opposite. Its custom chipset was years ahead of its time. But it was set in stone, impossible to move on, which eventually meant that the rest of the computer world caught up with it and walked by.
I know what I would be doing (Score:2)
Still playing Marble Madness...
Predicting the past (Score:2)
Another article about stuff that doesn't matter.
Write a SF novel about it and don't bother us.
No, not really... (Score:3)
1: Commodore didn't just fail in making any quantum leaps forward in the Amiga line, it even actively stopped itself from doing so. The original creators of the architecture and the 1000 stuck around for some time, creating the first quantum leap internally called "Ranger". However Commodore really didn't want to spend any money tooling up for new models so they quietly pushed all of the original creators out of the company and cancelled the machine. When they finally realized that they really did need a true quantum leap ahead and not extremely iterative improvements the original creators were long gone and they instead had people like the guy who ran the IBM PC Jr. in charge of engineering, which obviously caused the AAA chipset to be a dud. Hence they had no choice but to push out the iterative improvement that was the AGA chipset and start over with the "Hombre" architecture, which didn't even get far before they went belly up.
2: The power of the Amiga was always in having its own custom-built chips made in Commodore's internal foundry, i.e. loads of vertical integration. Not only did this allow for some pretty novel solutions, it was also pretty cost-effective. But once generic chip vendors could pool their resources, economies of scale erased Commodore's cost advantage and allowed considerably higher design budgets for the parts PC makers used. Even if Commodore had been the biggest computer company out there, it still wouldn't have had as many machines over which to spread the development cost of each custom chip as third-party vendors selling to every PC clone maker.
3: Commodore itself was run pretty damn incompetently for most of the time after Jack Tramiel left the company. They did eventually get Thomas Rattigan to take the helm and whip the Amiga organization into shape, but he was fired pretty much as soon as the company returned to profitability, and when he sued for wrongful dismissal he actually won. So it's clear the company didn't have good management in place; any real leaps forward in the Amiga line would have come not because of management but in spite of it.
Very little would have changed (Score:2)
The difference is that consumers were able to buy Amigas.
The reason why Be is dead is because of poor management.
The reason SGI and NeXT are dead: NeXT was bought by Apple (and relabeled OS X, with lots of enhancements), and SGI died because we were able to commoditize 3D graphics hardware.
I was at Commodore for its last 5 years of existence. (Score:3)
Upper management killed this and made us come out with something reduced: the AA chipset that you know from the Amiga 600 and the like.
Who knows where we'd be today? Of course, Nvidia's awesome RTX 2070 / 2080 is nothing to be sneezed at (and I have the 2070). I think Commodore could be where Nvidia is now, and maybe a bit sooner, had it played its cards right.
Technology was never a problem for us. Marketing, on the other hand, was. Real marketing was simply nonexistent for Commodore, which is the largest cause of its downfall.
In other words... (Score:2)
startup (Score:2)
Microware OS-9 (Score:2)
Another Hypothetical Question (Score:2)
That's a big IF (Score:2)
Re: (Score:2)
True, true Everyone knows the Commodore 64 is where it's at!
Re: (Score:2)
+1, Concur. C64 forever!
Re: (Score:2)
Re: (Score:2)
Excuses, excuses.
Bottom line? It's OUT! Over. Done. It lost!!
Stop making excuses and just get over it (:
I don't care about the IBM Cell today. (Score:4, Insightful)
Re: (Score:2)
x64 dominates because of economies of scale and software compatibility. If the 68000/PowerPC line had "won", the world wouldn't look that much different unless you're one of the small sliver of the population that needs to know what's under the hood in order to use it fully.
The PPC G5 was too hot / power drain for laptops (Score:2)
The PPC G5 was too hot and too much of a power drain for laptops.
The higher-end G5s needed liquid cooling and lots of fans.
https://everymac.com/systems/a... [everymac.com]
Re: (Score:3)
The Pentium 4 was a space heater too, and didn't sink the x86 line. In this alternate history, inferior products would still slip through, how could they not? That's part of being "not substantially different". The details of implementation would all change, but overall capabilities would be pretty much the same, and we'd be sitting here asking "what if the x86 line had won?"
Apple's ARM variant came from Power (Score:3)
They did this by purchasing a (DEC) design team that had recently implemented a compelling Power implementation.
https://iphone.appleinsider.co... [appleinsider.com]
The lead designers of P.A. Semi had previously implemented StrongARM while at DEC.
Re: (Score:2)
"what if the x86 line had won?"
There is a difference. The two lines weren't equal. x86 chips were drastically inferior. People would ask but it would be along the lines of "how bad would it have been if...."
Re: (Score:2)
And maybe Apple has no reason to go to Intel chips. Maybe Commodore eats at least part of Apple's lunch, and they don't make it through the 1990s. Maybe a lot of things -- but none of them would seem weird to anyone living only that timeline.
Re: (Score:3)
It's not that PPC was "better", it's that Motorola and Apple had agreed to drop the 68K in favor of that variant of IBM's POWER architecture. The 68060 was decent, but it was running out of steam without a major re-design (ColdFire being a bit too radical), so Motorola scrapped both the 68000 and 88000 in favor of PPC. Then a few years later, Motorola only wanted to make low-power embedded chips, and IBM only wanted to make high-power server chips. So laptops were under-powered and had slow front-side buses.
Re: (Score:2)
The '060 was terrible. It dropped instructions, and ran far too hot. Even the '040 FPU dropped a lot of '881/'882 instructions, requiring software emulation. If Moto had put as much effort into it as Intel did with the Pentium Pro, maybe they could've made a next-gen 68k worth using, but they didn't. ColdFire/DragonBall/CPU32 all went in a different direction, removing even more addressing modes, going for the low-power SoC market (and eventually falling in the face of ARM).
Re: (Score:2)
Re: I don't care about the IBM Cell today. (Score:3)
> As for Cell, in the end it turned out to be little more than a meme,
Nah. Sony killed Cell development because too many people were buying PS3s (which they sold at a loss) to build HPC platforms.
Re: (Score:2)
Exception handling on 68K was a bit of a mess compared to PPC. The CISC on 68K was quite a bit more complicated to decode than x86, even though the 68K was much more assembler coder friendly.
Once you have a working kernel and some compilers these differences probably don't matter to most developers and users. Then things like price, performance, and backward compatibility tend to win out. Intel made fast cheap chips that could run old and new business software.
Re: (Score:3)
NVIDIA has a very strong showing in the most recent TOP500 list [top500.org]. But look at number 3 - Sunway MPP architecture, which is conceptually similar to Cell but using MIPS as the base architecture rather than PowerPC.
Re: I don't care about the IBM Cell today. (Score:3)
Programming for the PS3 was a nightmare. I know people that made good careers in the games industry purely as contractors to finish and optimize PS3 games. But everyone used the Xbox as their reference point because it was easy to get running. We could theoretically run circles around the Xbox with a good PS3 build, but why bother? The extra assets it took would never be paid back in sales. If you were doing multi-platform development, you weren't going to put in the extra time given how much work it was.
Re: (Score:2)
Yes and no. The Power architecture and cell were all IBM with Apple being the biggest driver. That is a tiny marketshare, tiny interest, and tiny amount of R&D relative to what has been dumped into improvement of x86/intel.
Re: (Score:2)
Nice in what way? How do think instruction set has any impact on your computing experience today?
It wasn't as though Intel had an insurmountable technical superiority over the 68k. The difference was that Intel continued to improve while Moto gave up.
Re: I don't care about the Amiga today (Score:2)
In a parallel universe, AMD secured a license to manufacture 680x0-compatible CPUs & rolled out its wildly-popular 68k80 in 1999, which blurred the distinction between CISC & RISC the same way the K5 did... treating the CPU's CISC machine language as the virtualized public face of its RISC microcode core.
Once we ran into the ~3GHz brick wall, the ability to sidestep clock speeds by packing more actual complex work into fewer clock cycles became a way to get more performance without the pain of parallelism.
AMD and Intel both had chips dueling in the late 8 (Score:2)
AMD and Intel both had chips dueling in the late '80s and '90s.
A second chance (Score:2)
What would have been nice is if the very clean Motorola MC68k architecture had survived and become dominant instead of the hacked-together zombie that Intel had.
Having done a lot of 68k assembly and just a little x86, I totally agree...
But at least a second chance is upon us, as more and more computing shifts to ARM which seems to have a cleaner design (I admit I've not actually done any assembly with it yet).
Re: (Score:2)
Intel always suffered backward compatibility issues which they brought upon themselves. M68k was a from scratch 32 bit processor design, while the 80386 was a 32 bit descendant of the 8 bit 8008 design. 68k was a cleaner design if you were programming bare metal in assembly, but it lacked the MMU of the 386, so from the perspective of designing an OS with process separation, there were advantages to the Intel architecture beyond backward compatibility. At the time, only OS/2 took advantage of that, and mo
Re:A second chance (Score:5, Insightful)
Intel always suffered backward compatibility issues which they brought upon themselves.
On the contrary. They won the CPU race exactly because of their insistence on backward compatibility, plus the brilliance of their design that allowed such compatibility in the first place.
Re: (Score:3)
The 68000 series had MMUs available: third-party ones in the case of the 68000 itself, and an official Motorola one for the 68020. From the 68030 onwards, the MMU was typically built in.
The 68020 actually came out before the 80386, but the MMU was an optional external component because most software of the day didn't use it. It was more prevalent on the various m68k machines intended to run Unix, such as older Sun workstations. Lower-end machines such as the Amiga, Mac and Atari ST didn't require an MMU.
Re: A second chance (Score:2)
ARM assembly sucks for programmers, especially if you were spoiled by the 680x0's convenience. It actually lacks direct single-opcode equivalents for instructions as fundamental as 'JSR' and 'RTS' -- you have to manually save the return address yourself. Sure, you could macro it... but the whole POINT of the 680x0 was to make its assembly language convenient & nice for HUMANS to use. Compared to m68k assembly language, ARM feels downright barbaric.
Re: (Score:2)
I actually liked both and cannot really follow your rant.
Both have an orthogonal register and instruction design.
What more do you want?
I considered ARM3 quite elegant ... never programmed for a modern ARM though in assembly.
Re: (Score:3)
I wonder what would have happened if Motorola had cared enough when IBM was making the original PC. The story as I was able to piece it together through the years and legends was that when making the PC, IBM looked into both Intel and Motorola for the CPU. From Motorola they wanted the 68008 (part of their cost objective being an 8-bit bus), but Motorola at the time was obsessed with selling the 68000 for high-priced workstations and wouldn't commit to IBM's deadline. IBM went with the 8088, even though Mot
Re:I don't care about the Amiga today (Score:5, Informative)
IBM went with the 8088 over the 68k for two major reasons:
1) Chip Availability.
While Motorola's official intro date for the 68000 was September 1979, the first allocated production didn't ship until February 1980, and mass deliveries didn't take place until November 1980.
IBM, on the other hand, was designing the PC in early 1980 for a manufacturing start in April 1981. IBM utterly refused to make its PC dependent on a chip that at the time of design wasn't even in real mass production; indeed, IBM corporate standards (which didn't fully apply to Project Chess) established that IBM wouldn't use a chip in a design unless it had actually been shipping in quantity for a year, to avoid early-production bugs.
And Motorola didn't ship the 68008 until 1982, which certainly wasn't in time to manufacture computers in April 1981. Maybe they could have if they had an IBM contract to push them, but they certainly couldn't make guarantees of that in early 1980, when they were not making 68000s in quantity. IBM would have been making a serious gamble if they used the straight 68000, much less the 68008.
2) Software Availability.
All the major personal computer software in 1980 was written in 8080 assembly code for CP/M. Porting it to M68k would be a real pain, particularly as CP/M 68K was nowhere near ready. Porting it to CP/M-86 or QDOS on 8086 was easy; Intel had an assembly translator for 8080-to-8086, and Seattle Computer Products (writers of QDOS/PC-DOS/MS-DOS) had a Z80-to-8086 assembly translator. Examples of then-major applications that were at least partly machine-translated from 8080 assembly code for the IBM PC include Microsoft BASIC, VisiCalc, dBase II, and WordStar; because IBM used the 8088, all could be and were available on the day the IBM PC was released.
If IBM had used the M68k, there might not even have been an operating system ready by October 1981, much less a full set of essential applications. As it was, IBM had the software to be a credible alternative to the then standard S-100 Z80 CP/M business microcomputer on day one, even if the translated software didn't run any better on the IBM PC than it did on a Z80.
Re: (Score:2)
Re: (Score:2)
It really did happen though, we *LITERALLY* had Amiga-grade Virtual Reality in the 1990s. Those old Virtuality headsets from back in the Lawnmower Man days ran on Amiga 3000s, so you could play Dactyl Nightmare at 15fps and throw up. Unfortunately they cost as much as a new car, so mostly they were only in high-end arcades.
Re:The PC/Mac Duopoly won (Score:4, Interesting)
Amiga users never paid for software - they pirated everything, so it wasn't worth developing software for Amiga. Home PC users pirated everything, but businesses paid. For some reason, Mac users tended to pay for more software than Amiga users. But really, the Amiga painted itself into a corner with all its special-purpose chips. PCs and Macs with linear framebuffers and lots of memory bandwidth overtook what the custom chips could do, and any major change to the Amiga architecture would break compatibility with everything.
Re: (Score:2)
Re: (Score:2)
We already have ARM Cortex with about 100 cores.