Linux Migration Helps Maximize IT Investments

Oracle’s acquisition of Sun Microsystems at the beginning of 2010 shocked the IT world. Sun’s products form the backbone of many corporate IT operations, and the industry has had to make a choice: stick with Sun’s uncertain future, or start migrating from Sun servers to a technology with a more stable outlook?

Oracle’s Takeover of Sun Microsystems

When Sun Microsystems was bought by Oracle, many IT managers started to worry. Any time a company changes hands, there is concern about the future of its product lines. In this case, however, there was an additional problem. Sun is well known for many software solutions, but it is also the provider of the heavily used Solaris server operating system, and Oracle has little experience developing or supporting that kind of technology.

What would happen to Sun’s server products? Would Oracle abandon them, suddenly leaving some of the world’s largest companies running their networks on obsolete technology? Or would Oracle continue to offer Sun’s servers, but with escalating maintenance fees and substandard support?

Oracle has remained largely silent in the months since the acquisition, and many IT managers have stopped waiting and turned to other technologies; Solaris to Linux migration rates are soaring. Others hold on, reluctant to abandon reliable technology, but is waiting really better than migrating from Sun servers?

The Problems With Staying With Solaris

Managers have understandable fears about migrating from Sun servers: downtime, data loss, and lost productivity as employees get used to a new system. Many are therefore biding their time until Oracle gives some clear indication of its direction. Unfortunately, the decision to do nothing may come back to haunt them later.

What if Oracle announces that it plans to discontinue support for Solaris? Suddenly the Solaris to Linux migration has to be done in a hurry, and migration should never be done without careful planning. The alternative is to maintain a network on unsupported technology, which is a disaster in the making.

Or perhaps Solaris will still be supported, but further development will be halted. As other technologies advance, Sun’s technology falls further and further behind, and so do the businesses that use it. Again, migrating from Sun servers becomes inevitable, so why not start the process now rather than waiting until the decision is forced?

Benefits Of Solaris to Linux Migration

Most IT departments that are migrating from Sun servers appear to have opted for Linux over the other server technologies available. The Solaris to Linux migration path is so well traveled that it holds fewer surprises than other technology changes.

The advantage of migrating from Sun servers now rather than later is that a full migration is a long process. When done properly, there are few crises or service interruptions during the transition, and the entire operation is minimally disruptive to the company’s operation. But it takes time: months of planning lead up to a cutover that might be completed in a few hours, followed by months of post-migration monitoring to be sure the new system is meeting the organization’s needs.
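To make the post-migration monitoring phase concrete, here is a minimal sketch of the kind of smoke test an IT department might run after the cutover. The host names, ports, and directories below are placeholders, not part of any real inventory, and would be replaced with the organization’s own:

    #!/bin/sh
    # Hypothetical post-migration smoke test (all names are placeholders).
    HOSTS="app01 app02 db01"   # the servers that were migrated
    PORTS="22 80 443"          # services expected to be listening

    for h in $HOSTS; do
        for p in $PORTS; do
            # nc -z exits 0 if the port accepts a TCP connection
            if nc -z -w 5 "$h" "$p"; then
                echo "OK   $h:$p"
            else
                echo "FAIL $h:$p"
            fi
        done
    done

Run daily during the monitoring window, a script like this catches regressions long before users report them.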

The future stability of Linux also gives IT departments more flexibility in their software decisions. Current application purchases can be made with confidence, without worrying that operating system changes will make the software obsolete.

Don’t Take The Trip Alone

As mentioned, the Solaris to Linux migration path is well traveled, but so is the path up Mount Everest, and a mountaineer would still be foolish not to take a local guide on the climb.

Migrating from Sun servers can be done safely and with minimal interference to normal company operations, but only if every problem that might arise has been anticipated. Rather than taking on the migration process themselves, many IT departments have found it far more cost effective to hire experienced consultants who have overseen successful migrations for many clients. By doing so, businesses gain long-term benefits such as reduced costs, more control over the IT environment, and easier manageability.

Contracting with a third-party migration consultant may seem like the more expensive option, but in reality it is far cheaper than the alternative. A self-directed Solaris to Linux migration can end in a disaster that leaves the servers down for hours or days; how much will that cost the organization? Data corruption, confusion over new processes and responsibilities, and unforeseen compatibility problems are just a few of the expenses that can be avoided by partnering with a knowledgeable migration expert.

Linux vs. Windows

There is stiff competition in the operating system industry these days. As people become more tuned in to technology, they tend to gravitate toward one of three choices: Windows, OS X, and Linux, each fighting to pull more users into its user base. Add in the vast number of server technologies out there, plus virtualization, and there are a huge number of operating system options available, each with its own characteristics. Most hardcore users are dedicated to one platform, but many true professionals are starting to diversify by learning two or more of the available platforms. For the focus of this article I am going to disregard OS X; however, for completeness, I will briefly discuss its structure.

OS X runs atop Darwin, which is a sort of hybrid of BSD, the Mach 3 microkernel, and a device driver API (programming interface) called I/O Kit. Because OS X is built on BSD it is essentially a UNIX system, which differs from Linux in that Linux is open-source while UNIX is closed-source. One of the great advantages of this architecture is the Mach-O binary format, which allows a single executable file, even the kernel, to include code for multiple CPU architectures (so-called universal or “fat” binaries). With some ingenuity, almost any piece of software that can be made to work under Linux can be made to work under OS X. Apple does manufacture a very nice machine; I find that most people dedicated to Macs are graphic designers or musicians, because Apple’s system is really tailored around those kinds of applications. From hardware to software, everything is tweaked for optimal performance.
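As an aside, the universal binary support mentioned above is easy to verify on any OS X machine with the stock command-line tools; the path below is just an example, and the exact output varies by system:

    # Show which CPU architectures a Mach-O universal binary contains
    file /usr/bin/true        # e.g. "Mach-O universal binary with 2 architectures"
    lipo -info /usr/bin/true  # e.g. "Architectures in the fat file: ... i386 ppc"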

Windows, ah, Windows. Windows was introduced in 1985 as a second-rate add-on for DOS; it was Microsoft’s answer to the Mac. While the initial releases of Windows were pretty basic, for the time they were a decent alternative to running DOS as your sole operating system. Windows was initially riddled with problems: tedious driver installation, bogus hardware recognition, and a flawed system BIOS structure plagued the operating system, keeping users from adopting it as their sole base of operation. There was also steep competition at this point in the technology’s evolution, as companies were beginning to emerge as leaders in the industry. Personally, I found that operating under DOS, while tedious, gave you more control over what was happening, while Windows required you to relinquish some configuration options in exchange for ease of use. In its infancy, you still had the ability to control environment variables and several other settings from DOS before launching Windows. With the advent of Windows 95, Microsoft’s first attempt at a fully windowed OS, all that would change.

While you still had access to a DOS prompt through command.com, you were stripped of the pure DOS interface, as several files were loaded at boot by default to make the OS function properly. Beta builds of Windows 95 started hitting the net en masse in early 1995. It took me three years, until the late beta stages of Windows 98, before I tested the OS as a dedicated one. Everything was cumbersome, but the desire for a new platform overwhelmed me and I had to give it a try. It resulted in the re-installation of DOS, with Windows running on top of it, but it was interesting. Microsoft has stepped up its game throughout the years, and it now has one of the largest user bases in the world. Windows has been rehashed and re-released several times since then, making some notable transitions. From Windows 98 to Windows XP there was a massive shift, from the incorporation of plug-and-play technology, which added BIOS calls for identifying different hardware, to the addition of a new user environment. XP was a significant step for Microsoft, bringing the company closer to its vision of where computing was going. That vision was right on, and Microsoft cornered the operating system market at release. With the release of Windows Vista, and its successor, Windows 7, Microsoft has further demonstrated its ability to ease computer use for the average person.

Linux is emerging as one of the best options available for low-cost, high-end operations. Linux is the result of millions of computer users across the world pooling their efforts for the greater good. Linux is open-source, meaning the operating system’s components are released under the GPL: the code can be custom-tailored and redistributed in whole or in part, allowing people to release their own customizations and modifications. The speed that Linux delivers, even on mediocre machines, can be almost astounding compared with Windows running on the same hardware. While Linux is definitely for advanced users at the moment, it is fast approaching readiness for home users.

There are already several versions of Ubuntu available for the brave. Linux offers everything you need in an operating system; it just hasn’t all been made to work seamlessly yet. If you are an advanced user, Linux may be a viable option for you. If you have several years of computing experience, or have developed a knack for technology, Linux provides a ton of interesting options and can far outperform Windows. The ability to customize the kernel to your liking, total control over your user interface, and complete environmental monitoring are just some of the advantages of a Linux-based OS. The Ubuntu “flavor” of Linux is emerging as the leader in Linux for the home user: it offers easy installation, doesn’t require you to compile your own kernel, and has an ever-growing library of supported hardware. There are also several devices appearing on the market that ship with Linux built in.
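For readers curious what “customizing the kernel” actually involves, here is a rough sketch of the classic workflow. It assumes the kernel sources are already unpacked under /usr/src/linux, and the exact steps vary by distribution:

    cd /usr/src/linux      # assumed location of the kernel source tree
    make menuconfig        # interactively choose drivers and features
    make                   # compile the kernel image and modules
    make modules_install   # install modules under /lib/modules/<version>
    make install           # install the image and update the bootloader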

Generally, the most user-configurable environment is found within Linux. That configuration can, however, be hard to achieve; I have spent countless hours honing my Linux install to perfection. Any time you incorporate a new software package or kernel update, you may face difficulties. Getting graphics drivers working can be tedious, especially for high-end configurations. The work is worth it, though, if you have the skills, to see Linux working on a multi-desktop display. Another not-so-well-known piece of software is Compiz Fusion, which is window management in its greatest form: your desktop is transformed into four desktops, each with an independent window layout, and you can shift between them and work in individual workspaces at will.
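Arranging that multi-desktop display is usually a job for xrandr. A minimal sketch follows; output names such as VGA1 and DVI1 vary from machine to machine, so run xrandr with no arguments first to list yours:

    xrandr                                        # list connected outputs and modes
    xrandr --output VGA1 --auto \
           --output DVI1 --auto --right-of VGA1   # extend the desktop to the right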

While Compiz operates at the plugin level, remember that it can, and will, cause adverse effects on your system, so be careful with its settings. If your system freezes completely, hold the power button for ten seconds until the machine turns off, wait thirty more, then turn it back on and boot as normal; Compiz should come back with the settings stored before the freeze, since you only tested the new settings and never applied them. The cube setting is one of the most stunning graphical layouts for a desktop I have ever seen, especially when spread across three monitors: it turns the four workspaces Compiz provides into a three-dimensional cube that you can rotate with your mouse and zoom in and out of, and a companion plugin makes your windows pop off the desktop in three-dimensional casings. Conky is another very interesting desktop “widget” that offers almost endless customization options.
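Returning to the freeze scenario for a moment: there is a gentler recovery than the power button. On a GNOME system of that era, where Compiz conventionally stored its settings under the GConf path shown below (this may differ by distribution), something like the following, run from a text console (Ctrl+Alt+F1), swaps in a plain window manager and discards the stored settings:

    DISPLAY=:0 metacity --replace &             # swap Compiz out for Metacity
    gconftool-2 --recursive-unset /apps/compiz  # reset Compiz settings to defaults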

Windows vs. Linux Web Hosting

Choosing a web host is, no doubt, the first step in building an effective website for your company, and selecting the right hosting platform is one of the most confusing decisions, especially for those who are totally new to the hosting world. Windows and Linux are the two operating systems used by most hosting companies. Each technology has its own pros and cons, and an in-depth analysis is required when choosing between the two.

Many users already running Windows or Linux on their personal computers automatically assume that they need the same operating system to host their website. In reality, the system one uses on a personal computer may have nothing to do with the system one’s web host runs; the decision should be based on what the website needs. Rather than arguing which technology is better, this article focuses on the differences between the two and how they affect one’s hosting.

Windows Web Hosting:

Developed by Microsoft, the Windows web hosting platform is one of the most widely used hosting platforms. Its major advantages are its compatibility and user-friendly technology. The Windows platform also offers some unique features that other operating systems do not. For instance, a Windows programmer can easily incorporate COM objects into web applications, and MS SQL Server databases are easier to manage under Windows. However, for all its advantages, and unlike Linux, Windows comes with a substantial licensing fee. The additional licensing fees are justified, though, if a website relies on Windows-specific technologies like MSSQL or ASP.NET.

Technologies like MSSQL or ASP.NET can sometimes be made to work with a Linux server; however, as these technologies are designed to run on a Windows-based server, it is preferable to choose a Windows server when running systems developed by Microsoft.

Linux Web Hosting:

Evolved from Unix, Linux is another popular operating system on the web, acknowledged for its reliability and cost effectiveness. Its low price is Linux’s biggest advantage, and it is the reason most bloggers use the platform for their personal blogs. An easy-to-use control panel is another plus: Linux hosts typically offer cPanel or WHM, both popular with users for their simple operation, while Windows hosts use Plesk, which is comparatively less user friendly, especially for non-techies. The Linux operating system is also very secure and less prone to cyber attacks; NetFronts continuously makes alterations and updates to ensure that the platform stays secure for its users.

Linux hosting is best for those who use PHP, Perl, MySQL, WordPress, or other open-source software. All of these can work on Windows-based servers as well, but they perform better on the more stable Linux platform.
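As a concrete example, the open-source stack mentioned above can be provisioned on a Debian or Ubuntu server of that era with a few commands, run as root. Package names are release-dependent; php5 was current at the time:

    apt-get update
    apt-get install apache2 php5 libapache2-mod-php5 mysql-server php5-mysql
    # WordPress itself is then unpacked under the web root, e.g. /var/www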

Conclusion:

In terms of performance, both Windows and Linux are reliable. One’s choice between the two should depend only on one’s requirements for specific features that are unique to one of the platforms. A careful assessment of those requirements is the definitive determining factor in choosing between a Windows or Linux based platform.

Linux Boot Process

Many people have wondered: how does a Linux/Unix operating system actually boot, and how can you add a process to run during boot so that you do not have to start it manually every time? To answer that, one first has to understand the boot procedure that is already in place before adding anything to it.

On a typical Linux system (I prefer Debian), the bootloader loads the kernel, which in turn starts the init process; init then reads a file under /etc known as inittab. The interaction between the bootloader and the kernel is complex enough to deserve an article of its own.
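On a running SysV-style system you can watch this machinery directly; these commands assume the sysvinit tools that Debian shipped at the time:

    ps -p 1 -o pid,comm   # PID 1 is init, started directly by the kernel
    runlevel              # prints previous and current runlevel, e.g. "N 2"
    ls /etc/rc2.d         # the startup scripts for the default runlevel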

  1. The /etc/inittab file contains a set of instructions telling init which runlevel to use and which startup scripts to run. It is a plain configuration file rather than a shell script, and it looks something like this:

    # /etc/inittab: init(8) configuration.
    # $Id: inittab,v 1.91 2002/01/25 13:35:21 miquels Exp $

    # The default runlevel.
    id:2:initdefault:

    # Boot-time system configuration/initialization script.
    # This is run first except when booting in emergency (-b) mode.
    si::sysinit:/etc/init.d/rcS

  2. The system then calls the /etc/init.d/rcS shell script, which executes each script found in the /etc/rcS.d/ directory. When an application is added to the system, it places its startup script in either /etc/rcS.d/ or /etc/rc2.d/. The /etc/rcS.d/ scripts run first; they are system specific and should never be tampered with. After this set of scripts has spawned its processes, init runs the files under the /etc/rc2.d/ directory, which usually belong to application software rather than software the system itself requires. Each file in these directories is named with an S (for start) and a number, for example S43portmap. The number does not mean it will be the 43rd process started; it is used to order the scripts. Thus, if you wanted something to start before or after S43portmap, you would use a prefix such as S40 to run before it or S50 to run after it.
  3. If you want to add your own application to the boot process, you should add its script to the /etc/rc2.d/ directory, since runlevel 2 is the default runlevel on most Linux systems. (A sketch of such a script follows this list.)
  4. If you look inside the /etc/rc2.d/ directory, you may notice that each file is actually symlinked back to the /etc/init.d/ directory. It’s actually pretty neat if you think about it: you could create a symlink with a different number to start any of these scripts at a different time, which gives the user the flexibility to easily modify the boot process of his or her system.
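As promised above, here is a skeleton init script for a hypothetical service called myapp (the name and the daemon’s path are placeholders, not a real package), saved as /etc/init.d/myapp:

    #!/bin/sh
    # /etc/init.d/myapp -- skeleton init script for a hypothetical service
    case "$1" in
      start)
        echo "Starting myapp"
        /usr/local/bin/myapp &   # assumed location of the daemon
        ;;
      stop)
        echo "Stopping myapp"
        killall myapp
        ;;
      *)
        echo "Usage: $0 {start|stop}"
        exit 1
        ;;
    esac
    exit 0

Once the script is executable, a symlink hooks it into the default runlevel, exactly as described in point 4; on Debian, update-rc.d will create the links for you:

    chmod +x /etc/init.d/myapp
    ln -s /etc/init.d/myapp /etc/rc2.d/S80myapp   # start late in runlevel 2
    # or, equivalently on Debian: update-rc.d myapp defaults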