Category Archives: computer

When Windows feels better than Linux

In trying to be better than Windows, Linux apparently left out a few significant features, and their absence ends up making it feel inferior to the latter. They may not be that important, but it really is annoying to know they're missing, considering that those features help expedite many personal computing operations.

The first thing I noticed missing in Linux is "click-twice-to-rename". In Windows you don't have to right-click and select Rename in order to rename a file/folder. Simply clicking the icon twice (two separate clicks, not a double-click) activates renaming, saving a great deal of time.

Another feature I noticed missing is the automatic focus on a specific file when a file browser such as Nautilus is opened from within an application. For example, it is only natural to right-click a downloaded file in Firefox's download list and select "open in file manager", hoping the focus will land on that file, but no, it does not. It is quite time-consuming to hunt for that one file among the swarm of them in the file manager window. Wouldn't it be more convenient if it worked just like in Windows?

There is actually more I'd like to point out, but I'll save it for future posts for now. If you also think there are other things Linux could learn, please let me know by leaving your comments here. You may also write your own article about it and link it back to this post as a means of sharing.


Pumping words/text to my blog from my Androphone


As mentioned some time ago, this is my new blog, a successor to my old blog. Thank goodness 'almost' all entries from my old blog have been successfully imported to this new place. It has actually been my dream to have my own dotcom domain, and now that I finally have it, I'm gonna blog with more motivation than I used to.

And you know what? This post was composed on my new Androphone, and it has also been my dream to own an Androphone. The LG Optimus One may not sound like much compared to a certain popular brand like HTC, but at least it fulfilled some of my dreams.

I’m making my own slate/tablet (I’m not joking)


I'm tired of people talking about the tablet/slate being the future of personal computing. I mean, I've built my own PCs all this time, so why don't I start making my own slate if that really is the future?

Many people think the iPad is the best tablet so far. I beg to differ. It's probably the worst, since it has no USB port, it does not use a desktop OS, and it has no memory card slot. So you're asking why using a full-fledged desktop OS matters to me? Read on.

Before I go further, I'd like to tell you how important it is to have at least one USB port on a slate. Imagine this: we should already know by now that the iPad comes in 2 versions, the 3G and the non-3G. People with money would normally go for the best their money can buy. For a poorly paid public servant like me, it's only natural to opt for the cheaper one, in this case the non-3G. Practically speaking, it's cheaper to buy the non-3G, plug in my existing USB 3G modem (or even a 3G-enabled phone), and I'm good to go. Besides, even if the slate does not have a memory card reader, I can always plug in a USB card reader, should there be a need to do so. Not to mention the freedom to directly connect my other devices such as phones, digicams, mp3 players, etc. without needing a PC as a mediator. It would also allow me to expand the built-in memory by plugging in USB flash drives. See how cool it is to have a USB port? The presence of USB ports on some Windows slates makes them winners over the iPad.

Talking about connectivity, the non-3G iPad would only benefit home users, provided the house is equipped with wi-fi. Travelers and road warriors would be happy with the 3G version, but that's it; they're restricted to 3G only. Since many new ISPs have started rolling out WiMAX services, even the 3G version of the iPad is out of luck, as there is no way to plug a USB WiMAX modem into it. That's why I'd like to stress that having a USB port is as important as having the core of the tablet itself. Not only does it allow the slate to have extra features, it may also double the existing ones.

Now, on to the OS, as I promised earlier. I don't mind if people prefer to put a specially developed mobile OS on a tablet. However, I can't accept it when people think tablets must use a mobile OS instead of a desktop OS, citing issues like battery life, "overkill", etc. I'm not really sure about the "overkill" part, as I never thought a desktop OS was too much for a tablet; even netbooks can handle one. As for battery life, it is actually the goal of every desktop OS to become as power efficient as possible; that goal doesn't belong to mobile OSes alone. Besides, battery life is usually determined by the CPU. That means that even if the OS is power efficient, a power-hungry CPU will drain the battery juice of even a tablet with a power-optimized mobile OS in no time. For an analogy, imagine comparing a 3GHz Intel Pentium-powered PC with a 3GHz Intel Core-powered PC. Install Windows 7 Ultimate on both PCs and I'm pretty sure the latter would be the winner, consuming much less power despite using the same OS.

For me, no matter how hard people push the use of a mobile OS on a tablet, a tablet must be powerful enough to support at least Windows 7, should there be people who want to use it on a tablet. People should not think that "a slate is a smartphone+ and it should have battery life similar to a smartphone's". It is ridiculous to believe that. Sure, a tablet that can run as long as a smartphone does would be welcome, but that is still pretty much unrealistic, just wishful thinking. A tablet is good enough if it can run continuously (with 3G/wi-fi on) for 6-8 hours on a single full charge.

Actually, what you do with a tablet is all that matters, not the OS inside it. If the tablet really is the future of personal computing, then doesn't that mean it would need an OS of personal computing known to everybody (Windows/Linux/BSD/OSX/etc.)? Really, most people have fallen victim to the illusion made by Apple, so that they now believe the iPad is popular because of its OS. They're totally wrong. The iPad was popular because it was from the "new" Apple (Apple under Steve Jobs' management). As a matter of fact, everything that has come from Apple since the return of Steve Jobs sells, no matter how feature-poor it is.

I believe people would still buy the iPad even if it were loaded with OSX instead of iOS. Likewise, I don't think Fujitsu's tablets would become as popular as the iPad even if they had some mobile OS pre-installed. After all, reputation helps a lot here. Many people don't know the story behind the OS selection for the iPad. The initial plan was to use OSX, but they had a hard time re-scaling OSX's kernel for lower-powered mobile use. Therefore they went on to recreate/redesign the OS, which explains why the earlier versions of iOS were all "unfinished products", rolled out prematurely to meet their users with no copy-paste, no multitasking, etc.

Talking about reputation, since I'm an anime fan too, I'd like to touch a bit on anime as well. It was reputation and hype that helped popularize the Panty & Stocking anime a while back. People have known GAINAX for a long time, and people have known them for their good work. However, I bet P&S would not be as popular as we know it today if the exact same thing had been made by some American cartoon studio, or even by some Asian animation studio outside Japan. Instead it might have ended up known as yet another cartoon with Powerpuff Girls-style animation, and it wouldn't have made it to either Cartoon Network or Nickelodeon but would have aired in the midnight adult slot on some US TV channels due to its dirty jokes. In other words, people who have fallen for P&S are just like the people who have fallen victim to the illusion created by Apple, as I explained in an earlier paragraph.

The biggest lie the IT world has told me (resurfaced)



My first experience with Linux was 10 years ago. Although I started my computing experience earlier than that (I had attended computer classes since 1995), I remained a n00b because the beige boxes scared me. I only started falling in love with computers and IT stuff when my father purchased our first family PC (powered by a Pentium III) in the year 2000. Having our own PC meant I could freely tinker with it without worrying too much. Although the PC came pre-installed with Windows 98 SE, an article in a local PC magazine piqued my curiosity about Linux, and I ended up installing RedHat in a dual-boot environment. I admit that I fell in love with Linux, but I loved Windows more for its ease of use, so I set Windows as the default OS. Actually, nobody in my family knew there was Linux on the PC because it was only bootable via a boot diskette. Despite using Windows most of the time, I kept using Linux occasionally out of curiosity, and unknowingly started mastering it in the process.

A couple of years after my introduction to Linux, I learned of a special kind of software that would allow me to use Linux without having to set my beige box up as a dual-boot system. The software is known as a virtual machine. I installed Connectix Virtual PC and began experimenting with various Linux flavors, often more than 2 at a time. Sure, carrying such a load on a Pentium III box with a maximum of only 512MB RAM was a pain, but for a geek it was a pain worth bearing. Still, it was not enough to make me a Linux convert, because I thought Linux was still too immature for a beginner's use. It was around the same time that I introduced Linux to my family, and nobody accepted it. Yes, the heavy reliance on the CLI freaked out my family members; unlike Windows, which they could fix themselves, they'd be left dumbfounded should they face problems in Linux. And I became more convinced that Linux was still not good as a beginner's OS. Well, perhaps I used the wrong distro, but how was I to know that the one I'm comfortable with might be too scary for others?

Fast forward a few years and I had almost given up being a Linux evangelist to my family. Yeah, I knew it wouldn't succeed, because I still wasn't using Linux as my main OS up to that time. It's not that I didn't want to, but virtualization software consumed too much of my limited system resources, even though my PC was among the most powerful of its time. Even if I set my PC up as a dual-boot machine and dedicated all system resources to whichever OS I booted, it wouldn't help either, because I wasn't happy with the hassle of having to reboot the machine just to switch to the other OS. Then I thought, why don't the computer developers simplify this? My computing knowledge was pretty limited at that time. All I could think of for improvement was either to make virtualization less resource hungry, or something I described as "hardware-level virtualization". The former seemed impossible because no matter how small a footprint the virtualization software has, the overall system resources are still shared between the host OS and the guest OS. As for the latter, I thought it was ridiculous until I read an article in another local PC magazine about so-called "hardware-assisted virtualization" in 2005, around the same time multi-core consumer CPUs emerged.

From what I understood, hardware-assisted virtualization is similar to my vision of hardware-level virtualization, where system resources are partitioned at the hardware level instead of at the software level as in traditional software virtualization. In the article, both AMD's "Pacifica" and Intel's "Vanderpool" were covered well. I thought the technology I'd been waiting for had arrived, but I was wrong. It was all lies. The article talked about a machine where we could boot into both systems at the same time without installing any virtualization software, and switch between the OSes in real time without rebooting the system (let's call it "double-boot" instead of dual-boot), or reload the same OS without restarting it. That sounded nice, because should the current working environment crash, the loaded copy of the OS would take over, and this could happen without the user even noticing. However, I still haven't seen my dream of a "double-boot" system come true despite the technology being available. The technology becomes useful only if virtualization software is installed, which means it still needs the host-guest relationship between the OSes, which I think defeats the purpose of having hardware-level virtualization. I am highly disappointed.

There was, however, one time at a local PC expo where I saw an Apple representative demonstrate switching between Mac OS X and Windows XP in real time using a certain key combination. I asked him whether any virtualization software was installed, and he answered that the Mac only used Boot Camp. I'm not sure whether that was true or just some trick, because I never really had a chance to use Windows on Macs, but whatever system it was, I only want to see the "double-boot" system come true.
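For what it's worth, on Linux a CPU's support for these extensions shows up as flags in /proc/cpuinfo: vmx marks Intel's "Vanderpool" (VT-x) and svm marks AMD's "Pacifica" (AMD-V). A minimal Python sketch of checking for them, run here against shortened sample flag lines rather than a live machine:

```python
def detect_hw_virtualization(cpuinfo_text):
    """Return the hardware virtualization technology advertised in
    /proc/cpuinfo-style text, or None if no flag is present."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            if "vmx" in flags:   # Intel VT-x ("Vanderpool")
                return "Intel VT-x"
            if "svm" in flags:   # AMD-V ("Pacifica")
                return "AMD-V"
    return None

# Shortened sample flag lines (real ones list dozens of flags):
intel_sample = "flags\t\t: fpu mmx sse sse2 vmx est tm2"
amd_sample   = "flags\t\t: fpu mmx sse sse2 svm lahf_lm"
old_sample   = "flags\t\t: fpu mmx sse"  # e.g. a Pentium III, no HW virt

print(detect_hw_virtualization(intel_sample))  # Intel VT-x
print(detect_hw_virtualization(amd_sample))    # AMD-V
print(detect_hw_virtualization(old_sample))    # None
```

On a real machine you would feed it `open("/proc/cpuinfo").read()` instead of the samples.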

SPAM comments are funneh (Part II)


I receive on average 1 spam comment for every article I post here. Although the Akismet spam catcher does a good job of catching them all, it is still annoying to have to delete them manually. I know it is designed to work that way to reduce false positives, but I still wish for a setting to auto-delete those spam comments, because I don't really mind even if some of them are legitimate comments that the Akismet system thought were spam. In false-positive cases like this, I'd blame not the system but the person who sent comments that have spam characteristics. Actually, having your 'legitimate' comments treated as spam is not a big deal. If the comments you posted before never appear, just post them again with some differences. You know, when your comments get caught that way, there are usually only 3 reasons behind it: the website owner is not professional enough, took offense at your comments, and deleted them; or your comments just deserved to be deleted because you did nothing but troll; or they were caught by the anti-spam system (either because they were intended as spam or you didn't know that what you were doing was spamming). I believe I've seen this auto-delete spam feature in WordPress before (in the form of Akismet settings), but I can't find the setting now. They should really consider making the option available to users if they can't implement what I'm going to suggest below.

Now I think they should take the spam prevention system one step further. The existing system only catches spam after it has arrived in the comment inbox. I think spam prevention should happen even before comments can be sent. Some websites use CAPTCHA/reCAPTCHA or a similar system for this, which requires the commenter to prove him/herself human first. So if Akismet can scan comments once they arrive in the comment inbox, why can't it try scanning them even before they are posted? Alright, I understand this might be hard to implement regardless of which system is used (CAPTCHA or Akismet). For example, comments can also be posted via email, over which the system has no control at all. But it would still be better to have Akismet work before any spam is sent, to prevent more spam, rather than just working passively as it does right now, or at least to eliminate spam before it can reach the comment inbox.
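To illustrate what a pre-submission check could look like, here is a toy heuristic in Python. The blacklist, link limit, and function name are all made up for illustration; Akismet's real scoring is server-side and proprietary, and this sketch has nothing to do with it:

```python
import re

# Hypothetical tuning values, invented for this example.
BLACKLIST = {"viagra", "casino", "cheap pills"}
MAX_LINKS = 2

def looks_like_spam(comment):
    """Crude heuristic that could run before a comment is ever submitted:
    reject comments stuffed with links or containing blacklisted phrases."""
    text = comment.lower()
    links = len(re.findall(r"https?://", text))
    if links > MAX_LINKS:
        return True
    return any(phrase in text for phrase in BLACKLIST)

print(looks_like_spam("Great post, thanks for sharing!"))              # False
print(looks_like_spam("cheap pills here http://a http://b http://c"))  # True
```

A real pre-submission filter would of course need to be far more sophisticated to avoid exactly the false positives discussed above.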

I got my first Ubuntu live CD around 5 y…

I got my first Ubuntu live CD around 5 years ago; it was distributed free at a local PC expo. When I tried to run it on my laptop, I waited for 15 minutes and nothing happened, so I was badly disappointed. By that time I already had 5 years of Linux experience, so I didn't mind installing Ubuntu at all, but having to run it as a live CD before I could install it was too much of a hassle. Why didn't they make the CD start with 2 options, i.e. run as a live CD or install? If they had, I might have been using it from that time until today.

Gigapixel display, anyone?


In the past there was the vision of 3M computing (megabyte memory, megapixel display, and megahertz processing power). We achieved that a long time ago, but it is still quite impractical to have 3G* computing (>_<) (gigabyte memory, gigapixel display, and gigahertz processing power, all in tandem). While gigabyte memory and gigahertz CPU clock speeds have been around for several years, gigapixel imaging is still in its infancy. We only got our hands on it very recently, and it's still impossible to implement for daily/home/personal computing, yet.

So what about this "gigapixel display" I'm talking about? Let's take the HD resolution everybody is gaga over as a comparison. Full-HD resolution (1080p) is around 2 megapixels (1920 x 1080 = 2,073,600), while most high-end consumer DSLR cameras right now have around 10 megapixels. That may not sound like much, but for most people even going slightly beyond HD resolution standards is already overkill, so it's quite unimaginable to have a gigapixel monitor sitting in front of us.

Well, even if a gigapixel display were to emerge anyway, how big would it be? Let's do some simple calculation here. Assuming the imaginary gigapixel monitor has a 4:3 aspect ratio, a 1.2-gigapixel resolution is equivalent to 40,000 x 30,000 pixels! And if we measure it in inches, say we adopt 100 dpi (a normal monitor is somewhere between 96 dpi and 120 dpi), then the monitor will be 300" vertically and 400" horizontally, or 500" diagonally! (That's 25 feet vertically, almost the height of a 3-storey building, and bigger than most silver screens in cinemas!) Even the largest LCD/plasma TVs right now measure only around 100" diagonally, and even at that 'small' size they are already hard to fit on the walls of most houses in the world! Only if we could fit 1000 pixels into an inch could we make it much smaller, say 50" diagonally, which would be around the same size as most high-end consumer LCD/plasma TVs right now. However, 1000 dpi is quite unviable with our current technologies. Anyway, given the rate of development in imaging technologies, I think it would be no surprise to see the world's first working gigapixel display in ten years' time, or even less.
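The arithmetic above is easy to double-check in a few lines of Python:

```python
import math

# Verify the figures used above: Full-HD pixel count, and the physical
# size of a 4:3, 1.2-gigapixel panel at 100 dpi and at 1000 dpi.
assert 1920 * 1080 == 2_073_600                  # Full-HD ~= 2 megapixels

width_px, height_px = 40_000, 30_000             # 4:3 aspect ratio
assert width_px * height_px == 1_200_000_000     # 1.2 gigapixels

for dpi in (100, 1000):
    w, h = width_px / dpi, height_px / dpi
    diag = math.hypot(w, h)                      # a 3-4-5 triangle, so exact
    print(f'{dpi} dpi: {w:.0f}" x {h:.0f}" ({diag:.0f}" diagonal)')
# 100 dpi: 400" x 300" (500" diagonal)
# 1000 dpi: 40" x 30" (50" diagonal)
```

The 4:3 ratio is what makes the numbers so tidy: width, height, and diagonal form a 4:3:5 proportion, so the 500" diagonal falls straight out of the 400" x 300" panel.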

*not to be confused with 3G networking

ERROR: The last 'M' in 3M computing is supposed to be millions of instructions per second (MIPS), not megahertz processing power. Sorry for the mistake.

p/s: I’d love to watch my favorite anime series on gigapixel display someday (>_<)