When a company goes to the considerable time and expense of purchasing and implementing a server, it is always for an important purpose. The server may host the company's web site, handle online sales transactions, or serve internally as a database, CRM, file, or email server. Whatever the use, that server will be expected to live up to its name and deliver top-notch service.
Wherever within a corporation that server is put to work, any slowdown in its delivery of data will impact the company's bottom line. If a sales prospect browsing the web site for a new purchase finds that pages take too long to load, there is a good chance the prospect will leave. The same applies when a sales representative, on the phone with a prospect, has to wait for product information from the database. Internally, if accounts receivable waits too long for invoices to be generated, billings will be late and so will income. Even slow email can delay vital orders or data, resulting in mistakes or a lack of coordination between departments.
Many enterprises today are attaching disks with capacities of 1 terabyte or larger to servers in order to increase capacity while lessening the data center footprint. Such a move also simplifies the storage model, with shorter routes to more data. But with terabyte drives, one other factor must be taken into account that is too often assumed to be a non-issue: file fragmentation.
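The cost of fragmentation can be sketched with a toy model: reading each discontiguous extent of a file incurs an extra disk seek on top of the transfer time, so a heavily fragmented file takes far longer to read than a contiguous one. The seek time and transfer rate below are illustrative assumptions, not measurements of any particular drive.

```python
# Toy model: estimated read time for a file split into N extents.
# Assumed figures: ~8 ms average seek per extent, ~150 MB/s transfer.
SEEK_MS = 8.0           # assumed average seek time per extent (ms)
TRANSFER_MS_PER_MB = 0.15  # assumed transfer time per MB (ms)

def read_time_ms(file_mb: float, fragments: int) -> float:
    """Estimated time to read a file stored in `fragments` extents."""
    return fragments * SEEK_MS + file_mb * TRANSFER_MS_PER_MB

# A 100 MB file: contiguous (1 extent) vs. fragmented into 500 pieces.
contiguous = read_time_ms(100, 1)
fragmented = read_time_ms(100, 500)
print(f"contiguous: {contiguous:.0f} ms, fragmented: {fragmented:.0f} ms")
```

Under these assumptions the contiguous read finishes in roughly 23 ms while the 500-fragment read takes over 4 seconds, almost all of it seek overhead; that overhead, multiplied across every file a busy server touches, is the slowdown described above.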
Traditional defragmenters in use at enterprises now adopting terabyte drives were only designed to handle a certain range of storage capacity. At 50 GB, they run just fine. At 100 GB, they begin to strain. At 500 GB, the runtimes are overly long, but they still might get the job done. But up in the 1 and then 2 TB range, these "one size fits all" defragmenters cannot cope and will simply run on endlessly, never actually defragmenting the drive.
Fortunately, some fragmentation solution developers saw this coming and have now released solutions containing special technology for large disks. These "engines" are designed to handle multi-terabyte capacities and can make it possible to fully defragment such drives in a matter of hours. Once defragmented, drives are kept that way, as the majority of fragmentation is prevented on the fly, with no impact on users and no scheduling required.
In the case of enterprise servers, speed of service is the key. Ensure that with any server you install, the fragmentation solution selected will stand up to the job and help guarantee that speed.