[NTLUG:Discuss] Is there a disk defragmenter for Linux
terry
linux at cowtown.net
Sun Dec 22 11:49:51 CST 2002
The short answer is yes.
Steve Baker wrote:
> Rev. wRy wrote:
>
>>> No - only crappy OS's need to have their disks defragged.
>>
>
>> Can you expand on this for a moment? It's a critical difference
>> between M$ and Linux, and I've yet to stumble across anything other
>> than one line answers
>> to the question that say any more than what you've written. Yes,
>> I've looked
>> at the LDP howto on filesystems. I've googled (although perhaps on
>> the wrong
>> subjects). I've taken various classes on *nix, but I've yet to see this
>> explained.
>
I've researched this issue myself and found that a disk defragmenter is
available for Linux, though it's probably not a very popular or active
project. (You'll find it on freshmeat.net if you're interested.) I
think, however, that defragmenting is a trivial concern on either
platform, and even more so on a Linux file system, because the layout
is simply more organized and efficient, especially when you have
multiple partitions. (That's probably one good reason to have separate
partitions or drives for /, /boot, /var, /usr, /home, etc.)
Under normal circumstances, fragmentation just doesn't occur, at least
not so much that it matters. Only if you regularly delete large blocks
of data will it become an issue at all, and even then, with the modern
hardware we have now, seek times are so fast that I doubt you'd notice
any performance loss even if you did have a considerable amount of
fragmentation. And "large blocks of data" is a relative term. Consider
the sizes of hard drives nowadays: some of us have 1 or 2 gig hard
drives, others have 6 or 10 gig drives, and still others now have 80 or
100 gig drives. So how much fragmentation would make a difference on a
100 gig hard drive? How much fragmentation would make a difference on a
10 gig drive?
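To put a rough number on "would it make a difference", here's a
back-of-envelope sketch in Python. The seek time and transfer rate are
assumed, ballpark figures for an IDE drive of this era, not
measurements from any particular disk:

```python
# Rough cost model: each extra fragment costs roughly one seek,
# and the rest of the read is cheap sequential transfer.
SEEK_MS = 9.0          # assumed average seek time (ms)
TRANSFER_MB_S = 30.0   # assumed sustained sequential read rate (MB/s)

def read_time_s(file_mb, fragments):
    """Estimated seconds to read a file split into `fragments` pieces."""
    return fragments * SEEK_MS / 1000.0 + file_mb / TRANSFER_MB_S

# A 100 MB file in one piece vs. shattered into 500 pieces:
print(read_time_s(100, 1))    # ~3.3 s
print(read_time_s(100, 500))  # ~7.8 s
```

So even badly shattered files only cost seconds on hardware like this,
and typical Linux fragmentation levels are nowhere near 500 fragments
per file.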
>
> Well, I confess that I don't understand the technical details - but as
> an observed
> fact, when you run fsck and it reports your fragmentation, it's never
> very far
> off 90% perfect.
>
>> So how exactly does *nix write data to a hard drive that eliminates
>> the need for a defrag? And why is there fsck if there is no need for
>> a defrag?
>
It writes data contiguously (consecutively) wherever it can, so a
file's blocks tend to end up next to each other on disk.
I think that if fragmentation were enough of an issue, the various
Linux distributors would include a defrag utility in the default
install, but it's not, so they don't.
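To illustrate the allocation difference, here's a toy sketch in Python.
This is not how any real filesystem is implemented, just the idea: an
allocator that grabs the first free blocks one at a time (FAT-style)
will split a file across holes, while one that hunts for a single free
run first keeps the file in one piece:

```python
def first_fit_blocks(disk, size, fid):
    # FAT-style: take the first free blocks one at a time,
    # so a file can end up scattered across several holes.
    placed = 0
    for i, b in enumerate(disk):
        if b is None:
            disk[i] = fid
            placed += 1
            if placed == size:
                return
    raise RuntimeError("disk full")

def contiguous_fit(disk, size, fid):
    # Sketch of smarter allocation: look for one free run big
    # enough to hold the whole file, and only scatter as a fallback.
    run = 0
    for i, b in enumerate(disk):
        run = run + 1 if b is None else 0
        if run == size:
            for j in range(i - size + 1, i + 1):
                disk[j] = fid
            return
    first_fit_blocks(disk, size, fid)  # no single run fits

def fragments(disk, fid):
    # Count the contiguous extents a file occupies.
    count, prev = 0, False
    for b in disk:
        cur = (b == fid)
        if cur and not prev:
            count += 1
        prev = cur
    return count

# Demo: create A, B, C; delete B; then create a file D bigger than B's hole.
disk = [None] * 12
for name, size in [("A", 3), ("B", 2), ("C", 3)]:
    first_fit_blocks(disk, size, name)
disk = [None if b == "B" else b for b in disk]  # delete B: 2-block hole

naive, smart = list(disk), list(disk)
first_fit_blocks(naive, 4, "D")  # D fills the hole, spills past C: 2 fragments
contiguous_fit(smart, 4, "D")    # D takes the free run after C: 1 fragment
print(fragments(naive, "D"), fragments(smart, "D"))  # 2 1
```

Real Linux filesystems use block groups and preallocation rather than
this naive scan, but the principle of keeping files in single runs is
the same.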
>
> fsck doesn't have anything to do with fragmentation - it checks the
> file system for consistency - for errors - and tries to repair them.
>
> In theory, there should be no filesystem errors - but if your machine
> crashes
> or you have hardware problems - then occasionally, something gets
> screwed up and
> on your next reboot, fsck repairs it.
>
> Nothing to do with fragmentation.
>
>> Obviously Linux requires a different way of thinking than does M$,
>> but I don't see how saying "Crap OS'es need a defrag, Linux doesn't"
>> explains what's going on under the
>> hood, and often times with Linux, knowing what goes on under the hood
>> is half the battle
>> won.
>
>
> Well, yes - but it's hard to be an expert about everything. Some
> things outside
> your own field, you just have to take on trust...or if it really bugs
> you - become
> an expert. After all, you can always read the source code for the
> file system and
> see what clever thing it does.
>
> Personally, I'm happy to take it on trust that fragmentation isn't an
> issue.
>
> Practical experience says it's not. Many Linux systems run for a year
> or more
> without being rebooted - much less defragged - and nobody notices any
> drop in
> performance in the way you do on a horribly fragmented Windoze system.
>
> As you use Linux longer, you see more and more places where your past
> experience
> with Microsoft's poor design and implementation have led you to
> believe that
> certain horrible parts of using a computer are somehow inevitable.
> Liberation
> from those problems sometimes seems too good to be true.
>
> These are considered 'good practice' for desktop systems running Windoze:
>
> * Defragging.
> * Rebooting once a day to flush out memory that Windoze doesn't free
> up.
> * Reinstalling the operating system periodically to "clean things up".
> * Rebooting after a major program crashes just in case it corrupted
> something.
> * Running regular virus scans.
> * Not opening attachments on email.
>
> ...none of them are needed under Linux - but that's not because there is
> special magic inside Linux - you don't have to do those things under BSD
> or IRIX or Solaris or HPUX either. They are all caused by inept
> design and implementation from our buddies at M$.
>
> Joe Public doesn't realise that computers can be any other way - so
> these things are never identified as "unnatural". People think that
> defragging is "a good thing" - when in fact they are just exercising
> a really lame workaround for something I'd consider to be a bug.
>
> So, welcome to the brave new world!
> ---------------------------- Steve Baker -------------------------
> HomeEmail: <sjbaker1 at airmail.net> WorkEmail: <sjbaker at link.com>
> HomePage : http://web2.airmail.net/sjbaker1
> Projects : http://plib.sf.net http://tuxaqfh.sf.net
> http://tuxkart.sf.net http://prettypoly.sf.net
>
>
> _______________________________________________
> https://ntlug.org/mailman/listinfo/discuss