iFags who keep rationalizing every move and decision by Apple never cease to amuse me, and neither do Linuxfags who, in a similar way, keep rationalizing Linux's OS design as superior to every competing OS in every aspect. Don't get me wrong, I'm a Linux user too, and not just a casual user — it's my main OS — but that doesn't stop me from criticizing it. The reason is that I want to see the OS improve. I know my criticism alone won't improve it directly, but by making it available to other people, the more people share my ideals, the more likely it is to reach the developers. Then it's up to them how to react: either arrogantly shoot down every criticism they receive, or take it into consideration.
It's safe to say that open criticism is like an indirect petition to the targeted person or party. It doesn't matter whether it's pointless or not, because every criticism you give, as long as it comes from your heart, is worth it. For the same reason I don't hesitate to criticize anime and other forms of art. Just so you know, criticizing something doesn't mean I have to be a master in that field. I don't have to be able to do better in order to criticize a crap work. Excellence in a given field is not required to have critical insight into that field. After all, crap does exist, and any attempt to deny that is useless. That being said, any negative response to this post would be a purely ad hominem argument.
And no, don't try to tell me I'm stubborn or arrogant just because I refuse to accept certain things. The fact is, calling me stubborn or arrogant is based on your own definition only. In other words, you see me as stubborn or arrogant because I refuse to do or see things the way you do. Let me tell you: just because I don't follow your way doesn't mean I'm stubborn or arrogant. Besides, seeing me like that is like wanting to force me to follow your way, to impose control on me. Oh, you ask why there's only negativity here? Let me tell you: that's the sole purpose of this blog. Of course I have my own all-praising, all-complimenting, all-buttkissing blog too, but I don't need to tell you everything here.
(98/99 people who think I’m arrogant are arrogant people themselves.)
My first experience with Linux was 10 years ago. Although I started computing earlier than that (I had attended computer classes since 1995), I remained a n00b because the beige boxes scared me. I only fell in love with computers and IT when my father bought our first family PC (powered by a Pentium III) in 2000. With our own PC I could freely tinker without worrying too much. The PC came pre-installed with Windows 98 SE, but an article in a local PC magazine piqued my curiosity about Linux, and I ended up installing Red Hat in a dual-boot setup. I admit I fell in love with Linux, but I loved Windows more for its ease of use, so I kept Windows as the default OS. Actually, nobody in my family knew there was Linux on the PC, because it was only bootable via a boot diskette. Despite using Windows most of the time, I kept using Linux occasionally out of curiosity and, without realizing it, started mastering it in the process.
A couple of years after my introduction to Linux, I learned about a special kind of software that would let me use Linux without setting my beige box up for dual-boot: the virtual machine. I installed Connectix Virtual PC and began experimenting with various Linux flavors, often more than two at a time. Sure, putting that kind of load on a Pentium III box with a maximum of 512 MB of RAM was painful, but for a geek it was a pain worth bearing. Still, it wasn't enough to make me a Linux convert, because I thought Linux was still too immature for a beginner's use. Around the same time I introduced Linux to my family, and nobody accepted it. Yes, the heavy reliance on the CLI freaked out my family members, and unlike Windows, which they could fix themselves, they were left dumbfounded whenever they ran into problems in Linux. That convinced me even more that Linux was still not a good beginner's OS. Well, perhaps I used the wrong distro, but how was I supposed to know that the one I was comfortable with might be too scary for others?
Fast forward a few years, and I had almost given up on being a Linux evangelist to my family. I knew it wouldn't succeed, because I still wasn't using Linux as my own main OS by then. It's not that I didn't want to, but virtualization software consumed too much of my limited system resources, even though my PC was among the most powerful of its time. Even if I had set my PC up as a dual-boot machine and dedicated all system resources to whichever OS I booted, it wouldn't have helped, because I wasn't happy with the hassle of rebooting the machine just to switch to the other OS. So I wondered: why don't the computer developers simplify this? My computing knowledge was pretty limited at the time. All I could think of for improvement was either to make virtualization less resource-hungry or something I described as "hardware-level virtualization". The former might be impossible, because no matter how small a footprint the virtualization software has, the overall system resources are still shared between the host OS and the guest OS. The latter I thought was ridiculous, until I read an article in another local PC magazine about so-called "hardware-assisted virtualization" in 2005, around the same time multi-core consumer CPUs emerged.
From what I understood, hardware-assisted virtualization is similar to my vision of hardware-level virtualization, in that system resources are partitioned at the hardware level instead of at the software level as in traditional software virtualization. The article mentioned both AMD's "Pacifica" and Intel's "Vanderpool". I thought the technology I'd been waiting for had arrived, but I was wrong. It was all lies. The article described a machine where we could boot both systems at once without installing any virtualization software, switch between the OSes in real time without rebooting (let's call it "double-boot" instead of dual-boot), or reload the same OS without restarting it. It sounded nice, because should the current working environment crash, the loaded copy of the OS would take over without the user even noticing. However, I still haven't seen my dream of a "double-boot" system come true, even though the technology is already available. The technology becomes useful only if virtualization software is installed, which means it still needs the host–guest relationship between the OSes, which I think defeats the purpose of having hardware-level virtualization. I am highly disappointed. There was one time at a local PC expo, though, where I saw an Apple representative demonstrate switching between Mac OS X and Windows XP in real time using a certain key combination. I asked him whether any virtualization software was installed, and he answered that the Mac only used Boot Camp. I'm not sure whether that was true or just some trick, because I never really had a chance to use Windows on a Mac, but whatever system it was, I only want to see the "double-boot" system become real.
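For what it's worth, you can at least check whether your own CPU advertises the extensions that "Pacifica" and "Vanderpool" introduced: on Linux, Intel's VT-x shows up as the `vmx` flag in /proc/cpuinfo and AMD-V as `svm`. Here's a minimal sketch, assuming the usual `flags : ...` layout of that file:

```python
# Minimal sketch: scan a /proc/cpuinfo dump for the hardware
# virtualization flags (Intel VT-x -> "vmx", AMD-V -> "svm").
# The "flags" line layout is the usual Linux one; adjust if yours differs.

def virtualization_flags(cpuinfo_text):
    """Return the set of virtualization-related CPU flags found."""
    found = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            _, _, value = line.partition(":")
            found |= {"vmx", "svm"} & set(value.split())
    return found

if __name__ == "__main__":
    with open("/proc/cpuinfo") as f:
        flags = virtualization_flags(f.read())
    print("hardware virtualization:", flags or "not advertised")
```

If the set comes back empty, the CPU (or the BIOS, which can disable the feature) isn't offering hardware-assisted virtualization at all.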
How often should I have major upgrades for my OS? Even if there are free annual major upgrades for my system, I won't be happy if they involve huge downloads every time. Yes, I'm referring to a certain free Linux distro whose maintaining company promises users a major OS upgrade every 6 months or so. Luckily, transferring user preferences and settings in Linux is not as tricky as in Windows, because no system registry is involved, which means things won't be too disastrous even if an upgrade doesn't run smoothly. The worst you'd have to do is probably a fresh install of the system, recreating your user account and manually transferring all your previous settings to the new installation. But that doesn't change the fact that I hate mandatory huge downloads like the system upgrades mentioned above. If huge downloads are inevitable, I'd rather get major upgrades only every 2 or 3 years than once or twice a year. I know major upgrades are always associated with major security concerns, but for something "as big as an OS", having to do it every year is excessive to the point of being annoying. Yeah, I might be able to skip it, but chances are newer apps will only be optimized for the upgraded system, so most users will feel "forced" to upgrade anyway. Besides, I think annual major upgrades for a free OS don't really make sense, since there's nothing to sell in the first place. In the Windows world, major upgrades come only every 2 years or so, but at least they make more sense, because those upgrades are something Microsoft is selling, something they make money from.
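The manual settings transfer I described is mostly a matter of copying dotfiles — the hidden files and directories in your home where Linux applications keep per-user settings. A hedged sketch of that chore, with the skip list and the no-clobber rule being my own assumptions about what's sensible:

```python
# Sketch of the "manually transferring all your previous settings" step:
# copy dotfiles from an old home directory into a fresh one after a
# clean reinstall. The skip list (e.g. ".cache") is an illustrative choice.
import shutil
from pathlib import Path

def migrate_dotfiles(old_home, new_home, skip=(".cache",)):
    """Copy hidden entries from old_home into new_home; return copied names."""
    copied = []
    new = Path(new_home)
    new.mkdir(parents=True, exist_ok=True)
    for entry in Path(old_home).iterdir():
        if not entry.name.startswith(".") or entry.name in skip:
            continue  # only dotfiles carry settings; caches are disposable
        target = new / entry.name
        if target.exists():
            continue  # never clobber settings the new system already created
        if entry.is_dir():
            shutil.copytree(entry, target)
        else:
            shutil.copy2(entry, target)
        copied.append(entry.name)
    return sorted(copied)
```

This is exactly the kind of tedium that makes twice-yearly fresh installs unattractive, registry or no registry.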
Linux should learn from Windows here too, i.e. use proper, corresponding icons for its executable files. Executable files are the programs that come with applications, together with their own set of icons. The problem is that the icons are only used in the applications menu and on the desktop — why not on the executable files as well? I remember having a hard time browsing the /usr/bin directory searching for the executable of Transmission (the BitTorrent client) by reading file names instead of just spotting the familiar Transmission icon. Yeah, I found it, but I could have done it much faster in Windows, thanks to Windows applications carrying their own icons. Besides, in Windows applications are usually sorted into their own program directories with proper naming (Program Files/App name) instead of all the cryptically named executables being dumped in one place. And why the heck does the system still use near-cryptic acronymic directory names like 'usr' instead of 'user', 'bin' instead of 'binary', or 'lib' instead of 'library'? (And strangely enough, four-letter words are fine for other directories like 'home' and 'root'?) And what the heck is 'sda', actually? (Yeah, I know what 'sda' is — that's just me exaggerating to show how cryptic naming in Linux can be.) Linux has been in development for decades, yet the developers still refuse to abandon UNIX conventions? C'mon, it's just a matter of a few extra letters, so it shouldn't be that much extra work. And isn't modern Linux supposed to be a hybrid, pragmatic standalone system instead of being heavily and strictly modeled after UNIX? It's been decades since the first version of Linux was coded, so it should be made less UNIX-like by now while still remaining POSIX-compliant. Heck, even Mac OS X and Windows NT* are POSIX-compliant yet manage to exist as completely different systems.
In other words, modern Linux should be partially modeled after other systems as well if it wants to 'seize' OS market domination from Windows.
*WinNT may or may not be fully POSIX-compliant, but I won't fuss over the details. Nobody really cares whether I'm correct, because there will always be people disagreeing with me whether I'm right or wrong.
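Incidentally, the chore of hunting through /usr/bin by file name is mechanical enough that the system itself automates it — that's all `which` does. A tiny sketch of the same idea, with the directory list passed in explicitly (the names below are illustrative, not a real layout):

```python
# A tiny `which`: locate an executable by name across a list of
# directories, the way one scans /usr/bin by hand. Directory arguments
# here are supplied by the caller, so nothing about the real FHS layout
# is assumed.
import os
from pathlib import Path

def find_executable(name, search_dirs):
    """Return the first matching executable path, or None if absent."""
    for d in search_dirs:
        candidate = Path(d) / name
        if candidate.is_file() and os.access(candidate, os.X_OK):
            return str(candidate)
    return None
```

In practice `shutil.which("transmission-gtk")` does the same thing against your $PATH — which helps, but it still forces you to know the (often cryptic) file name in advance, which is exactly my complaint.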
People always tell me that as Linux users we should take advantage of the CLI's power and not fear its user-unfriendliness. Well, I could say something similar to them: "you should take advantage of the GUI's user-friendliness and not fear its limitations". Fair enough.
I got my first Ubuntu live CD around 5 years ago; it was distributed free at a local PC expo. When I tried to run it on my laptop, I waited 15 minutes and nothing happened, so I was badly disappointed. By that time I already had 5 years of Linux experience, so I didn't mind installing Ubuntu at all, but having to run it as a live CD before being able to install it was too much of a hassle. Why didn't they make the CD start with two options, i.e. run as a live CD or install? If they had, I might have been using it from that day until today.
Why is it so hard for Linux developers to make a GUI available for all Linux tools and functions? Just replace the damn typing space (the terminal) with windows, and replace the damn textual commands with clickable buttons, with all configurable parameters visible and available to the user. Is that too much of a favor to ask?
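The thing is, this is exactly what GUI front-ends for CLI tools already do under the hood: every checkbox and text box in the dialog just gets translated back into the textual flags when you click OK. A minimal sketch of that mapping — the tool name and flags below are hypothetical, purely for illustration:

```python
# Sketch of what a GUI front-end does with a settings dialog: map
# form-field values (checkboxes, text boxes) onto a CLI argument list.
# "mydownloader" and its flags are made up for illustration.

def build_command(tool, options):
    """Turn GUI-style option values into an argv list for the tool."""
    argv = [tool]
    for flag, value in options.items():
        if value is True:                 # ticked checkbox -> bare flag
            argv.append(flag)
        elif value not in (False, None):  # filled-in field -> flag + value
            argv.extend([flag, str(value)])
    return argv

# e.g. a settings dialog for a hypothetical downloader:
cmd = build_command("mydownloader", {"--verbose": True,
                                     "--limit-rate": 500,
                                     "--resume": False})
# cmd is ["mydownloader", "--verbose", "--limit-rate", "500"]
```

So the "favor" is genuinely cheap in code terms — the hard part is someone sitting down and designing a sensible dialog for each of the hundreds of tools, which is presumably why it rarely happens.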