What if Dinosaurs never went extinct? (Score:3)
Hardware designed specifically for software that was designed specifically for said hardware seems to be where it's at.
So much emphasis on compatibility in the dominant markets is quelling innovation.
You can't just build the best possible machine. You have to build a machine that is also compatible with a never-ending laundry list of protocols, standards, APIs, hardware, form factors, etc.
It is both unfortunate and apparently necessary.
Re: (Score:1)
Are you old enough to understand the pain of device configuration that was pre-USB? Assigning IRQs by hand and such?
And how having one plug was great? Until we had a bunch... and then, by sheer fucking accident, politicians did something intelligent and mandated a single plug standard again.
And then we did this all over again.
Standards allow the rest of us who just want to Get Shit Done, to Get Shit Done, instead of, "oooh shiny!".
Re: (Score:2)
Are you old enough to understand the pain of device configuration that was pre-USB? Assigning IRQs by hand and such?
And how having one plug was great? Until we had a bunch... and then, by sheer fucking accident, politicians did something intelligent and mandated a single plug standard again.
And then we did this all over again.
Standards allow the rest of us who just want to Get Shit Done, to Get Shit Done, instead of, "oooh shiny!".
Hence the part where I included:
"It is both unfortunate and apparently necessary."
If I want to build a machine that will be marketable, the standards are necessary. If I really wanted to build ONE machine that was REALLY good at something, I'd have to ignore them.
Re: (Score:2)
If I really wanted to build ONE machine that was REALLY good at something, I'd have to ignore them.
That's why Cray was so weird.
Re: (Score:3)
Politicians did what? Aside from dangerous stuff (mains power) the law generally doesn't regulate what sort of plugs you have to use for anything -- individual products do need to be FCC approved, etc., but there's no law mandating that your computer has to have USB or Thunderbolt or whatever. Manufacturers can make whatever they want -- it just made sense to standardize on something that all devices could use, and USB won out over Firewire (and continues to win out over Thunderbolt, hence Thunderbolt throw
Re: (Score:1)
https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Plug and Pray wasn't good enough for you huh?
Re: What if Dinosaurs never went extinct? (Score:3)
The problem wasn't having to set IRQs... the problem was that there were only 16 (until Intel expanded them to 24, sometime after it mattered), and 95% of cards only allowed you to choose 2 or 4 options out of IRQ 2/9, 3, 4, 5, or 7.
If all cards had been 16-bit & we'd had 32 or 64 IRQs (selectable via 5 or 6 jumpers to set an arbitrary binary value) to choose from, IRQs would have been little more than a bookkeeping nuisance.
USB has its own pain points, especially for embedded development. At least with RS-23
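The jumper scheme proposed above is just binary encoding: each jumper is one bit, so 5 jumpers cover 32 IRQs and 6 cover 64. A minimal sketch (the function name is made up for illustration):

```python
def irq_from_jumpers(jumpers):
    """Read a hypothetical jumper block as a binary IRQ number.

    jumpers: list of booleans, most significant jumper first.
    """
    value = 0
    for j in jumpers:
        value = (value << 1) | int(j)  # shift in one bit per jumper
    return value

# 5 jumpers set to 1-0-1-1-0 select IRQ 22 (0b10110).
print(irq_from_jumpers([True, False, True, True, False]))  # 22
```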
IRQ numbers (Score:2)
IRQ numbers aren't supposed to be separate channels, one per piece of hardware.
IRQ numbers are supposed to be priorities, i.e.: which should the CPU serve first when multiple come at the same time.
It's perfectly possible to share IRQs.
Multiple drivers register the same IRQ; each IRQ handler, when called, checks whether its piece of hardware is requesting attention (e.g.: is the Sound Blaster signaling that its buffer is empty?), services it if necessary, and chains to the next handler in the chain (IRQ7 wasn't triggered
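The chaining described above can be sketched in a few lines. This is an illustrative model only: the device names and the `pending` flag stand in for real status-register reads, and nothing here is real driver code.

```python
class Device:
    """Toy stand-in for a card sharing one IRQ line."""
    def __init__(self, name, pending=False):
        self.name = name
        self.pending = pending  # on real hardware: a status-register read

def irq_chain(handlers):
    """Run every handler registered on one shared IRQ line.

    Each handler services its own device only if that device is actually
    requesting attention, then falls through to the next in the chain.
    """
    serviced = []
    for dev in handlers:
        if dev.pending:          # is *my* hardware signaling? (e.g. empty buffer)
            serviced.append(dev.name)
            dev.pending = False  # acknowledge / service it
        # otherwise: chain to the next handler sharing this IRQ
    return serviced

# The sound card raised the shared IRQ; the NIC stays quiet.
devices = [Device("sound-blaster", pending=True), Device("nic")]
print(irq_chain(devices))  # ['sound-blaster']
```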
Re: IRQ numbers (Score:2)
Maybe... but from what I recall, under DOS & Windows 95, 99.7% of drivers *were* crappy & required dedicated IRQs.
The dawn of PCI was incredibly painful, too... mostly because the number of BIOS options that nobody understood the implications of exploded... level/edge? INTA/B/C/D? Plug & Pray, or attempt to semi-blindly guess the right settings absent any real documentation by card vendors? I remember one where I was given two PnP configuration choices: "Dos/Win95" or "OS/2". I was trying to ins
Re: (Score:2)
At least with RS-232(-ish) serial, once you got the baudrate, parity, and stop bits right, it was pretty bulletproof.
It still is bulletproof!
We have ancient industrial equipment configured using a serial port and 16-bit (Windows 3.1 era) software. It will work natively on 32-bit versions of Windows. It will work with USB-serial adapters as long as they are assigned COM1-COM4. It will work on a 64-bit machine by using a 32-bit VM and passing the serial port through, which works better than trying to pass through USB devices, let alone a PCI or ISA card.
Even better are things that can be configured over serial with any t
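Part of why serial stays bulletproof is how little there is to configure: baud rate, data bits, parity, stop bits, and you're done. A small sketch that validates a classic "9600 8N1"-style mode string (the helper name and format are made up here, not from any real library):

```python
def parse_serial_mode(mode):
    """Parse e.g. '9600,8,N,1' -> (baud, data_bits, parity, stop_bits)."""
    baud, data, parity, stop = mode.split(",")
    if parity not in ("N", "E", "O"):  # none, even, odd
        raise ValueError("bad parity: " + parity)
    # stop bits are 1, 1.5, or 2, hence the float
    return int(baud), int(data), parity, float(stop)

print(parse_serial_mode("9600,8,N,1"))  # (9600, 8, 'N', 1.0)
```

Once those four values match on both ends, a link from 1985 talks to a link from today.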
Re: (Score:2)
And the Amiga predates USB by many years, and never had to mess around with manually assigning IRQs, etc.
Amiga needed PCI when PCs & a bit later Macs got it (Score:2)
Amiga needed PCI when PCs, and a bit later Macs, got it.
Re: (Score:3)
Amiga got PCI but real Amiga development was gone by then. The technology had been purchased by a third party and they basically just coasted the remaining value out.
Re: (Score:2)
The original IBM PC was a lot worse for that. At the time DOS didn't even have drivers, so you had to make your hardware register level compatible with IBM's. That rather limited innovation to say the least.
The Amiga could have been the dominant platform. It was expandable, there are APIs for hardware abstraction even in fairly early versions of the OS.
Re: (Score:3)
BIOS is your driver, very much the old CP/M-80 way of dealing with things. Anything not defined by BIOS has no abstraction, later option ROMs were possible on cards which is how we got SCSI and IDE to boot.
But writing a custom BIOS was too much of a pain for a little clone-maker shop when IBM gave the sources away for free. On top of that, the use of off-the-shelf components made cloning an IBM PC much easier than doing something new. Cloning a C64, Amiga, Atari ST, or Macintosh was harder because of the c
Re: (Score:2)
That is what is sad today. Businesses have mostly figured out how to avoid races to the bottom, which are the only way consumers win.
Re: (Score:2)
That's worked pretty damned well for Apple.
Re: (Score:2)
Apple nearly failed. They were in deep trouble in the 1990s. They struggled when Jobs left and rose when he returned. Commodore never recovered after Jack Tramiel left.
Interestingly the modern Mac is commodity PC hardware with tweaks.
Re: (Score:2)
No you didn't. You just had to be BIOS compatible. There were plenty of PCs that were MS-DOS compatible that were not fully IBM PC compatible. They could run MS-DOS programs just fine, apps like Lotus 1-2-3 and WordPerfect.
They were not IBM compatible, which means applications which decided to not follow
Re: (Score:2)
It is both unfortunate and apparently necessary.
Interoperability is one of the greatest success stories of the PC. Even in the old days, pre-USB, pre-PCI, pre-plug-and-play, the PC itself was painful enough to make work, let alone any talk of a special-purpose device.
Special-purpose machines have their place, but it's not in general-purpose computing.