The context of the question is a Windows computer (so the filesystem in question is NTFS) that is filling up with data that can probably be removed. But I don't know whether it's worth the time to weed through it, or whether we should just defrag and move on.
Basically, does the 'fullness' of the filesystem cause a decrease in performance, or is it just fragmentation that slows things down? And if it does, does it make a meaningful difference?
Many things can impact a server's file-serving performance. Fullness of the file-system is but one of many things that can contribute.
Some of these interrelate, and often it's multiple issues driving a performance problem. In general, NTFS filesystem fragmentation does have an impact. The impact is worst when doing large sequential reads from such a file-system, as happens during a backup. The impact on general file-serving performance is not as significant for typical office-server loads, since those are largely random I/O anyway; in some cases a fragmented volume can even outperform a fully defragged one.
For a file-server storing a lot of AutoCAD files, NTFS fragmentation will be perceptible to the end users. That user-generated I/O pattern is significantly sequential, and is therefore vulnerable to degradation by fragmentation. How much it's actually impacted depends on how much RAM the server has for caching, and how fast the underlying storage is at random I/O patterns. It could very well be that the underlying storage is fast enough that end users won't notice a volume with 60% fragmentation. Or it could cause I/O saturation at only 15% fragmentation.
For a file-server storing a lot of plain old office files, NTFS fragmentation will not be as perceptible to end users. That user I/O pattern is significantly random as it is, and is minimally impacted by fragmentation. Where the problems will emerge is in the backup process, as the time to back up each GB will increase as fragmentation increases.
Which brings me to my final point. The one I/O operation that is most affected by fragmentation is sequential I/O. Most servers undergo large-scale sequential I/O patterns as part of the backup process. If you're having trouble fitting your backup into your backup window, defragging can help make things go faster. Your underlying storage systems will determine how much of an impact fragmentation can have, and your fragmentation numbers will determine how much of an impact it actually has. Know your storage.
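If you want to put a number on it before deciding, Windows' built-in defrag.exe can analyze a volume without actually defragmenting it. Below is a minimal Python sketch that just wraps that analysis run; the drive letter is only an example, and it assumes you're running from an elevated prompt.

```python
# Rough sketch: ask Windows' built-in defrag.exe to analyze (not defrag) a
# volume, so you can see the reported fragmentation before scheduling a full
# defrag. Needs an elevated prompt; the drive letter is an example.
import subprocess

def analyze_fragmentation(drive: str = "C:") -> str:
    """Return the analysis report from 'defrag <drive> /A'."""
    result = subprocess.run(
        ["defrag", drive, "/A"],  # /A = analyze only, no defragmentation
        capture_output=True,
        text=True,
        check=True,               # raises if defrag exits non-zero (e.g. not elevated)
    )
    return result.stdout

if __name__ == "__main__":
    print(analyze_fragmentation("C:"))
```

The report includes the fragmentation percentage, which you can track over time alongside how your backup windows are trending.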
Fragmentation would lead to some slowness. Overall it probably won't be anything your users notice unless they're doing a lot of video work or working with huge files.
Actually, I think it would slow down if there are a ton of seek operations, such as thousands of tiny files that are hit a lot.
In most cases, with decent memory and only a handful of files in routine use, the OS will cache things in memory and you won't notice much difference. Only benchmarks will tell.
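Along those lines, here's a rough benchmark sketch in Python that compares large sequential reads against small random reads on a single test file. The file path, file size, and block sizes are arbitrary examples, and the OS cache will skew the numbers (especially right after writing the file), so treat it as a coarse comparison rather than a precise measurement.

```python
# Minimal benchmark sketch: sequential throughput vs. small random reads on one
# large test file. All paths and sizes are example values.
import os
import random
import time

TEST_FILE = "testfile.bin"           # hypothetical file on the volume you care about
FILE_SIZE = 512 * 1024 * 1024        # 512 MB test file
SEQ_BLOCK = 1024 * 1024              # 1 MB sequential reads
RAND_BLOCK = 4 * 1024                # 4 KB random reads
RAND_COUNT = 5000

def make_test_file():
    chunk = os.urandom(SEQ_BLOCK)
    with open(TEST_FILE, "wb") as f:
        remaining = FILE_SIZE
        while remaining > 0:
            f.write(chunk[:min(SEQ_BLOCK, remaining)])
            remaining -= SEQ_BLOCK

def sequential_read_mb_per_s():
    start = time.perf_counter()
    with open(TEST_FILE, "rb") as f:
        while f.read(SEQ_BLOCK):
            pass
    return FILE_SIZE / (time.perf_counter() - start) / (1024 * 1024)

def random_reads_per_s():
    start = time.perf_counter()
    with open(TEST_FILE, "rb") as f:
        for _ in range(RAND_COUNT):
            f.seek(random.randrange(0, FILE_SIZE - RAND_BLOCK))
            f.read(RAND_BLOCK)
    return RAND_COUNT / (time.perf_counter() - start)

if __name__ == "__main__":
    make_test_file()
    print(f"Sequential: {sequential_read_mb_per_s():.1f} MB/s")
    print(f"Random 4 KB: {random_reads_per_s():.0f} reads/s")
    os.remove(TEST_FILE)
```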
In the end...this is another "it depends" question. It depends on large files vs. small, usage patterns on the computer, just how fragmented "fragmented" is, and how sensitive your users are to a few seconds' difference in performance.
It won't hurt anything to run MyDefrag. It's freeware, and it also tries to "optimize" the layout by placing files in areas of the disk where access will be a bit faster.
Defrag and move on. It isn't worth the time just to save a few dozen GB. But to answer your question: the only advantage a fresh disk has is that all the files sit at the start of the disk, so seek times are lower. Once it has been used, files can end up anywhere, so defragging will help.
TL;DR: Not till you get more than 75% full.
For most intents and purposes, filling up a drive has no performance implications until you get over 75% full. This can be off a bit depending on usage, but for a typical workstation load this is true.
Fragmentation is minimized when all files have space to be placed. The only types of files that get fragmented on a largely empty NTFS partition are log files and directory metadata, because they are constantly expanding. If you frequently search through logs or have a high turnover of created and deleted files, regular defragmentation may be beneficial even when the drive is less full.
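If you want a quick way to spot volumes creeping past that threshold, a small standard-library Python script will do; the 75% figure below is just the rule of thumb from above, not a hard NTFS limit.

```python
# Quick sketch: report how full each present drive letter is and flag anything
# over the ~75% rule of thumb. Threshold and drive letters are examples.
import shutil
import string

THRESHOLD = 0.75  # rough rule of thumb, not an NTFS hard limit

def check_drives():
    for letter in string.ascii_uppercase:
        drive = f"{letter}:\\"
        try:
            usage = shutil.disk_usage(drive)
        except OSError:
            continue  # drive letter not present on this machine
        fraction = usage.used / usage.total
        flag = "  <-- getting full" if fraction > THRESHOLD else ""
        print(f"{drive} {fraction:6.1%} used "
              f"({usage.used / 2**30:.1f} / {usage.total / 2**30:.1f} GiB){flag}")

if __name__ == "__main__":
    check_drives()
```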
If you're under 80% usage or so, don't worry, just defrag.
When it starts to get close to 100%, any filesystem will start to slow down.
If you are using Windows Server 2012 or later, you can use the Data Deduplication feature to free up space taken by duplicate data that fills up your hard disk.