09-14-2014, 01:50 PM   #15
Katsunami
Grand Sorcerer
 
 
Posts: 6,111
Karma: 34000001
Join Date: Mar 2008
Device: KPW1, KA1
Quote:
Originally Posted by kovidgoyal View Post
Linux definitely has more problems than any other OS, when you consider the ratio of problems to number of users. Which is ironic, considering that I develop calibre on linux.

The problems with linux, from the perspective of someone writing software to run on it are:

1) Insane fragmentation. Once you go beyond the kernel (which is a very well managed project, where Linus tries to never break userspace compatibility) and a few system libraries, the rest of the "system" stack in linux is an utter disaster. No one cares about compatibility, technologies/APIs for integrating with the system keep on changing every few years and every individual linux distro feels the need to do even the most trivial things differently.

2) The crazy software distribution model. Having a centrally managed repository for all software made sense twenty years ago when there was relatively little software, with a much smaller dependency graph. Trying to centralize all software distribution, in a manner such that a few overburdened volunteers have to ensure that *all* software is compatible with *all* other software in a distribution is simply not scalable. And then this insanity is repeated a dozen times over in a dozen different distros. As it is now, I have no idea how a non-technical person is ever going to maintain a stable and fully functional linux system. The correct approach to software distribution is to distribute the "base" system, which should remain relatively stable and backwards compatible, centrally. End user application distribution should be done by the developers of the applications. However, considering that linux distros cannot even get their heads out of their asses long enough to agree on a common packaging format for binary distribution, I doubt this will ever happen.

Don't get me wrong, I love linux as a *technical* end user. But as a stable platform for which to create useful software for non-technical people, it is the absolute pits.
OMG...

This is exactly what I've been saying to everyone for the last (almost) 15 years. EXACTLY. I am now developing a distributed software project that runs entirely on Linux, and the only distribution that really allows me to do this is Debian Stable, because it doesn't change every... like... three days. I installed it (on multiple CPU architectures), compiled and installed the drivers that were not in the kernel, and now I only do "apt-get update && apt-get upgrade". Choosing almost any other distro, except maybe something like CentOS, would be completely unworkable, because of your point number 2.

That is my very biggest pet peeve. What IDIOT decided that it would be a good idea to make one huge repository in which ALL software, including the operating system, is linked together? That defeats modularity.

Now it can happen that updating Application A requires an updated Library X. But Application B would break if that happens, so it has to be updated as well, which I might not want. Maybe I want to update the operating system but keep my old applications; or the other way around, installing newer applications on an older operating system. This is not possible: with Linux, you're all-in. Update/upgrade everything, or nothing.
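A toy shell sketch of that lock-step (purely illustrative — the library and application names are made up, and this is not how apt actually resolves anything; a single shared variable just stands in for the single library version a central repo ships):

```shell
# Toy sketch (NOT apt's real solver; names are made up): in a central
# repository, one shared libX version serves every application, so
# upgrading it for Application A drags Application B along.
SYSTEM_LIBX="1.0"
echo "A runs against libX $SYSTEM_LIBX"   # A works with 1.0
echo "B runs against libX $SYSTEM_LIBX"   # B was only tested with 1.0

SYSTEM_LIBX="2.0"                         # A's new release needs libX 2.0
echo "A runs against libX $SYSTEM_LIBX"   # A: fine
echo "B runs against libX $SYSTEM_LIBX"   # B now gets 2.0, wanted or not

# With per-application bundling, each app pins its own copy instead:
A_LIBX="2.0"
B_LIBX="1.0"
echo "A bundles libX $A_LIBX, B bundles libX $B_LIBX"
```

The second half is the "all-in" escape hatch the rest of this post argues for: each application carries its own copy, so upgrading A never touches B.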

The solution is to look at how FreeBSD does it. They release a base system, which is the operating system itself; applications still go into repositories, however. Now, enter PC-BSD, based on FreeBSD: the ultimate open-source Unix, IMHO.

- Take the FreeBSD base system.
- Take X.org.
- Pick one desktop, in this case KDE, if I remember correctly.
- Merge that into an operating system: the ONLY PC-BSD operating system.
- Now, compile applications so that they are self-contained, and put everything into a PBI (PC-BSD Installer) file. This is the equivalent of Calibre's setup file on Windows.
- Now, when installing, the application puts EVERYTHING it needs into its own folder: libraries, interpreters such as Python or Perl, whatever it requires to run... and the settings go into /home/user, just like Calibre does on Windows.
- This takes the cake: the application is linked through App Cafe, which can then be used to update individual applications, or all of them at once. It would be as if every application on Windows hooked into Windows Update, which would then fetch the latest version of the selected application from the developer's website and install it. Sadly, Windows Update doesn't do this for anything except Microsoft software.

In the Linux world, people balk at the fact that several versions of one library might be installed. Who cares? Hard drives are 2 to 6 terabytes in size. Even 256GB SSDs are becoming cheap; the 128GB ones are almost free in the Netherlands. DDR3 memory basically costs nothing (€60 or so will get you 8GB; I remember paying €200 for 128MB). This is not the '80s or '90s anymore. Disk space and memory usage are almost irrelevant now.
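The self-contained install described in the list above can be imitated with a tiny hand-rolled bundle (hypothetical paths and app name; a real PBI adds metadata, icons, and an actual binary, of course). The key trick is a launcher that points the dynamic linker at the bundled libraries before anything system-wide:

```shell
# Hypothetical, minimal imitation of a PBI-style self-contained bundle
# (paths and the app name "myapp" are made up). Everything lives under
# one prefix, and a launcher prefers the bundled libs over system ones.
mkdir -p /tmp/myapp_bundle/bin /tmp/myapp_bundle/lib

cat > /tmp/myapp_bundle/bin/myapp <<'EOF'
#!/bin/sh
# Launcher: find the bundle root relative to this script, then put the
# bundled library directory first on the linker's search path.
HERE=$(cd "$(dirname "$0")/.." && pwd)
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "running with libs from $HERE/lib"  # a real bundle would exec here
EOF
chmod +x /tmp/myapp_bundle/bin/myapp

/tmp/myapp_bundle/bin/myapp   # prints: running with libs from /tmp/myapp_bundle/lib
```

Deleting `/tmp/myapp_bundle` removes the whole application, libraries and all, without touching any other program — exactly the modularity the repository model gives up.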

I also agree about point nr. 1, the fragmentation. Why would we want 10 different desktops, 19 toolkits, 17 window managers, and three display servers (Xorg, Wayland, Mir), and then split them across 687 distros that all need to be maintained? That's madness.

To be honest, I love the concept of open source software, especially for home computing, where I use a lot of (small) applications that would cost a fortune to buy. I love Unix, because of the power of the command line. The creators of Unix and C (Ken Thompson, Dennis Ritchie and Brian Kernighan) could be called personal heroes.

If you'd take a look at my computer, there's one thing you'd see:

A Unix/Linux system, running on Windows.

What? Yes. Exactly that.

I install Windows, configure it, and on top of that I install Cygwin to get a Unix command line and Bash scripting. (I know PowerShell has been able to do this since 2006, but having used Unix Services for Windows in the 90's as my primary command line, I'm much more used to Unix commands than to PowerShell.) All of my applications, except a certain few, are the same ones that would also run on Linux. Foobar2K, Chessbase Fritz11 (running the open source Stockfish engine), CDBurnerXP and Digital Editions 2.01 are the only exceptions.
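That portability is the whole point of the setup: a plain POSIX script runs unchanged under Cygwin's bash on Windows or under any Linux shell. A minimal sketch (the flavor strings are made up for illustration):

```shell
# One script, any Unix-ish host: branch on the kernel name that
# uname reports ("CYGWIN_NT-..." under Cygwin, "Linux" on Linux).
case "$(uname -s)" in
  CYGWIN*) FLAVOR="Unix userland on Windows" ;;
  Linux)   FLAVOR="native Linux" ;;
  *)       FLAVOR="some other Unix" ;;
esac
echo "Running on: $FLAVOR"
```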

The only difference in daily use between Linux and my Windows installation is the way of configuration (Control Panel / registry vs. /etc), and the fact that I can update/upgrade my operating system and each piece of software as separate entities.

For that distributed system, though, I actually need Linux, because I need something that runs on several versions of ARM and x86 CPUs, and I need it to be customizable (inserting scripts into the boot process, putting in custom splash screens), and it must not cost €120 for each computer in the system. Debian Stable serves its purpose, but if drivers had been available for the necessary hardware, I would have chosen FreeBSD.

PS: Maybe this should be split off into a different topic.

Last edited by Katsunami; 09-14-2014 at 01:57 PM.