Wednesday, July 28, 2010

Fragmentation and the Virtual Revolution

It hasn’t been all that long since virtual servers came on the scene, allowing entire machines to operate as software applications. This technology meant that companies hanging onto only partially used hardware servers, and sweating over the expensive space and energy those servers consumed, could consolidate them as virtual servers and put each remaining machine to full use. Virtualization has certainly meant a revolution for data centers.

But as one might expect, the revolution has not stopped with servers. Now companies are looking at how they might virtualize the scores of desktops scattered throughout an enterprise, again reducing hardware and energy consumption while simplifying management as well. Users keep their monitors and keyboards, but these act much like terminals connected to virtual PCs that could be located anywhere, including the company’s main data center.

It might seem that since a machine is virtual, it would not suffer from a traditional issue such as fragmentation. After all, if a machine exists purely as data in memory, how can file fragmentation be a problem?

The answer is that the data used by a virtual machine is still saved on a hard drive. A single drive or set of drives on what is called the “host system” supports a number of virtual machines, called “guest systems,” and data from all of those machines is saved to that shared storage. File fragmentation, which drastically slows down performance on any drive, has an even worse effect in virtual environments. Note that fragmentation will occur no matter what is being virtualized: servers, PCs or, in the future, even networks.

Since a virtual machine issues its own I/O requests, which are then relayed to the host system, every file request generates multiple I/O requests: at minimum, one at the guest level and another at the host level. When files are split into hundreds or thousands of fragments (not at all uncommon), multiple I/O requests are generated for each fragment of every file. Multiply that by the number of virtual machines resident on a host server, and it is easy to see that the result is seriously degraded performance.
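
To make that math concrete, here is a minimal sketch in Python. The fragment count, layer count and VM count are illustrative assumptions, not measurements from any particular system:

    # Rough estimate of the I/O requests generated by reading one file
    # in a virtualized environment. All figures are illustrative.
    def io_requests(fragments_per_file, io_layers, vms):
        # Each fragment needs its own request, each request traverses
        # every layer (guest, then host), and each VM adds the same load.
        return fragments_per_file * io_layers * vms

    # A contiguous file on one physical machine: 1 fragment, 1 layer.
    print(io_requests(1, 1, 1))      # 1 request

    # The same file in 500 fragments, read through guest and host
    # layers, on a host running 10 similarly loaded virtual machines.
    print(io_requests(500, 2, 10))   # 10,000 requests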

Defrag is obviously crucial for virtual machines, but it must be the right defrag technology. A fully automatic solution keeps the files stored at the hardware layer consistently defragmented, so fragmentation never becomes an issue at all. Only idle resources are used for the work, which means users never experience a negative performance impact and scheduling is never required. Virtual machine performance and reliability are constantly maximized.
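
Exactly how any given product detects and uses idle resources is proprietary, but the concept can be sketched generically. The loop below is a hypothetical illustration using the third-party psutil package; it performs a small slice of defrag work only when the CPU is nearly idle, and defragment_one_file is a stand-in name, not a real API:

    import time
    import psutil  # third-party package: pip install psutil

    CPU_IDLE_THRESHOLD = 10.0  # percent; work only when the system is this quiet

    def defragment_one_file():
        # Placeholder for one small unit of defragmentation work.
        pass

    while True:
        # Sample CPU usage over one second; back off if the system is busy.
        if psutil.cpu_percent(interval=1) < CPU_IDLE_THRESHOLD:
            defragment_one_file()  # do a small slice of work, then re-check
        else:
            time.sleep(5)          # users are active; stay out of the way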

Virtualization means the sky is the limit for enterprises today. Don’t let fragmentation keep you tied to the ground.

Tuesday, July 20, 2010

Yes, It’s Free—but Will It Do the Job?

If you ask anyone who writes advertising copy for a living, you’ll find that the word readers respond to most in promotional literature of any kind is “free.” Not surprising: who doesn’t want something useful without paying anything? The problem is that the old adage, “If it sounds too good to be true, it probably is,” usually applies.

One version of this is the “free trial.” Some trialware is full-featured and only has a time limit. In other cases, however, it is feature-hobbled, and when you really need it to do the job, it won’t.

Another is simply “free software,” which has been common throughout the history of the web. Look at the functionality, however, and you may find many things missing. Take word processing as an example. A free “word processor” might be Notepad, found on all PCs. It’s fine if you’re simply writing text and don’t care about formatting, fonts or symbols, let alone spelling and grammar checking or the ability to embed graphics and photos. For that functionality you would need to do what many do: pay for and turn to Microsoft Word.

A free virus checker is another example. If it’s not trialware (which probably has limited functionality anyway), then it is most likely not updated with all the latest virus signatures, and it may or may not protect your computer. For real protection you’d need a robust professional application, along with a subscription that keeps it up to date and keeps your computer safe from malware attacks.

The bottom line: in responding to any offer of something free, check the functionality. In most cases, you’ll find that it won’t do the job you need it to do.

In terms of functionality, a great case in point would be a free defragmenter. Looking over the features, you would most likely find that it either must be run manually or, at best, scheduled. Most sites cannot afford to take a system down for maintenance to defragment its hard drives, as most systems must remain up and running nearly constantly. The end result is that the enterprise’s computers continue to suffer the performance-crippling effects of fragmentation, simply because defrag can be run so seldom.

The second problem is the utility’s ability to actually defragment your drives. Does it have the technology to truly do the job? Many do not, especially with today’s much larger drives, enormous file sizes and sheer numbers of files. A defrag utility not up to the task will simply grind away endlessly and never fully defragment. Once again, a company is saddled with fragmentation’s effects, despite the appearance of a “solution.”

The lesson to be learned is that “free” does not always mean “effective.” In fact, far from it. Before you decide a free utility is best, look closely at the features.

Wednesday, July 14, 2010

Maximum Optimization of Virtual Environments

Virtualization has revolutionized data centers in many ways. It has made it possible for servers dedicated to specific functions to be created within minutes, even by users. It has brought us the ability to substantially conserve hardware resources by running multiple servers on the same hardware platform. It has made it possible to greatly reduce the physical footprint of server farms.

Interestingly, however, there is a “missing link” in the optimization of virtual resources that, if not addressed, can drastically affect virtual performance.

First, there is the issue of file fragmentation. All hard drives suffer from this malady, but virtual systems actually suffer twice over: there is fragmentation at both the host and guest levels. Additionally, consolidating four or five servers into one forces a single storage device to work overtime, due to a four-to-five-fold increase in I/O traffic. The result is heavy processing bottlenecks.
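
A quick back-of-the-envelope sketch shows why. The per-server I/O rate and the fragmentation multiplier here are illustrative assumptions, not benchmarks:

    # I/O that was spread across several physical servers now converges
    # on one host's storage. All figures are illustrative.
    iops_per_server = 300        # assumed average I/O operations per second
    servers_consolidated = 5

    host_load = iops_per_server * servers_consolidated
    print(host_load)             # 1,500 IOPS aimed at a single device

    # Fragmentation at the guest and host levels multiplies the requests
    # needed for the same work (assumed 3x here for illustration).
    print(host_load * 3)         # 4,500 IOPS: a heavy bottleneck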

Second, multiple virtual machines share the same system resources, an activity that can become a drain on performance.

Third, when virtual hard disks are set to dynamically grow, they do not then shrink when users or applications remove data. This is a costly waste of space that could otherwise be allocated to other virtual systems.

For the first problem, fragmentation, there are of course defragmenters that can be implemented. The advanced technology of virtualization, however, requires a more advanced solution. Technology will soon be available that actually prevents the majority of fragmentation at both the guest and host levels, where it occurs. This makes fragmentation a thing of the past for virtual systems, allowing their innate performance potential to be realized.

The solution to the second problem, competition for shared resources, would be synchronization of the complex, ongoing activity between the host and its multiple guest operating systems. In addition to addressing file fragmentation, this would mean additional performance optimization for the entire virtual platform.

The third problem, the “bloating” of virtual hard disks, can be solved with tools that let system personnel monitor wasted resources and compact virtual disks when required, so that IT staff can allocate virtual storage efficiently.
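
Each virtual disk format (VHD, VMDK and so on) has its own compaction tooling, but the underlying bookkeeping is simple to sketch. The hypothetical report below compares each disk file’s size on the host with the space the guest actually uses; the names, figures and threshold are invented for illustration:

    # Flag dynamically expanding virtual disks worth compacting.
    # Inventory entries: (disk name, host-side size in GB, guest-used GB).
    def compaction_candidates(disks, waste_threshold_gb=10.0):
        for name, host_size_gb, guest_used_gb in disks:
            waste = host_size_gb - guest_used_gb
            if waste > waste_threshold_gb:
                yield name, waste

    inventory = [
        ("guest01.vhd", 80.0, 35.0),   # grew to 80 GB, guest uses 35 GB
        ("guest02.vhd", 40.0, 38.0),   # barely any slack; leave it alone
    ]
    for name, waste in compaction_candidates(inventory):
        print(f"{name}: roughly {waste:.0f} GB reclaimable by compacting")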

Taken together, these problems add up to an overall issue in virtual machine optimization. Solved together, they allow the maximum performance and reliability of high-traffic virtual environments to be fully realized. When implementing virtualization, as most enterprises are today, it is wise to take these issues, and their solutions, fully into account.