
Unless I misunderstand you, I thought the exact opposite was true: The separation of source from binaries is what made UNIX a success. When C was designed, it allowed one to take the source from one machine to another, disregarding the underlying machine architecture.


What made UNIX a success was AT&T being prevented from charging money for it, thus making it available for free versus the alternatives.


I assume (presume, perhaps) they're a Windows user, and the -dev mention is more for the case where debug symbols and packages are separated from non-debug binaries.

I only know that Windows does something different, more akin to debug symbols in each binary by default, with chicanery to keep that from impacting things in the general case. But I'm not a Windows programmer; this is about as much as I know about Windows in that regard.


Never been a Windows user or programmer, but like you I've heard they do quite a lot of clever stuff under the hood. Probably a lot of VMS heritage, another system I've never had a chance to use but have heard good things about. Still, every version I've tried since Windows 3.1 has had an abysmal user experience, so I've never once considered switching.

And that leaves no current viable alternatives to the Unix-like systems. Sad, really.


It didn't make Unix a success; it made Unix a punch-card-and-tape batch-processing simulator, in the same way the Mac was a paper-pushing simulator.

What made Unix a success is a great mystery to me. Presumably it was the ability to run a time-sharing system on low-cost hardware, but then people started taking the tradeoffs it had to make as gospel, which is why we have this ridiculous situation of still wrangling processes on computers with 64-bit bloody address spaces.



