Filesystem Paths: How Long is Too Long?

hmmm. let’s make the maximum number of characters 256.

I read all your comments and they were fantastic.

Now, if Microsoft would stop acting like some top-down behemoth and write software we can use.

I actually do work for Microsoft, and the primary reason for the limit is indeed compatibility. There are a lot of old programs that have MAX_PATH (260) as a limit. If the Explorer shell "fixed" the limitation, all of those programs would buffer-overflow when a long path was passed in.

NTFS can support 32K paths. Lower-level Win32 APIs can support 32K paths. Explorer and the old cmd shell choose not to, for backward-compatibility reasons (and maybe others I am not aware of). I have no idea why the .NET Framework imposes the same restriction… that seems a little weird…
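To put numbers on the two limits mentioned above, here is a minimal Java sketch. The class and method names are made up for illustration; the facts encoded are that MAX_PATH is 260 characters (including the terminating null, so 259 usable) while the NTFS/extended-path ceiling is about 32,767:

```java
public class MaxPathCheck {
    // Classic Win32 limit: 260 characters including the terminating null.
    static final int MAX_PATH = 260;
    // Approximate ceiling for NTFS / extended-length Win32 paths.
    static final int EXTENDED_MAX = 32767;

    // True if the path string would overflow a MAX_PATH-sized buffer
    // (259 usable characters plus the null terminator).
    static boolean exceedsMaxPath(String path) {
        return path.length() >= MAX_PATH;
    }

    public static void main(String[] args) {
        String deep = "C:\\" + "a\\".repeat(130);  // 3 + 260 = 263 characters
        System.out.println(deep.length() + " chars, too long for MAX_PATH: "
                + exceedsMaxPath(deep)
                + ", fits in extended limit: " + (deep.length() < EXTENDED_MAX));
    }
}
```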

package removedirs;

import java.io.File;

public class RemoveDirs {

    File rootDir = null;

    public RemoveDirs(File rootDir) {
        this.rootDir = rootDir;
    }

    public void walk(File dir, int depth) {
        System.out.println("Depth:" + depth);
        if (dir.isFile()) {
            // plain files are left alone
        } else if (dir.isDirectory()) {
            File[] files = dir.listFiles();
            if (files.length == 0) {
                System.out.println("del: " + dir.getAbsolutePath());
                // dir.delete(); // uncomment to actually remove the empty directory
                if (depth > 1) {
                    if (dir.getParentFile().listFiles().length == 0) {
                        walk(dir.getParentFile(), depth - 1);
                    }
                }
            }
            for (File file : files) {
                walk(file, depth + 1);
            }
        }
    }

    public void walk() {
        walk(rootDir, 0);
    }

    public static void main(String[] args) {
        RemoveDirs rd = new RemoveDirs(new File("c:/cygwin/work/core"));
        rd.walk();
    }
}

// Run as admin (Vista); make absolutely sure it's the correct directory before running. The enhanced for loop requires Java 5 or higher.

I disagree. The maximum path length in windows is a serious - and seriously annoying - issue with all kinds of nasty side effects.

Maximum path lengths turn the file system into a leaky abstraction: by moving a directory into another directory, you can sidestep the path-length check and create files that can't be read by any program, administrator-privileged or not, without first modifying the directory structure.

The maximum path length isn't such an issue when designing a hierarchy up front, but it is when you combine existing hierarchies, something you'll inevitably do.

I wrote a slightly longer version of this argument on my website.

To the Mac/nix trolls:

Your OS doesn't have these "unbelievable" limitations because it doesn't even attempt to maintain any sort of backward compatibility. Try running a System 6/7 app on Tiger. Or a RedHat 6 binary on Fedora. Good luck. In fact, forget about backward compatibility, there's no attention paid to compatibility of any kind. Even a minor update can totally break an app or library.

But I forgot, we’ll just hand users the source code and get them to compile it, and that will eliminate all those nasty compatibility issues, right? Because every user knows how to compile source code and loves hunting around for the 38 dependencies and 9 kernel patches that they need. And every user knows how to fix a makefile that was never fully tested. And last but not least, every program worth having OBVIOUSLY comes as Open Source, except for MS Office and Adobe Photoshop and uh, pretty much every other popular app. And the drivers for most new hardware and gizmos.

Don’t kid yourselves, kids - for every “feature” of Linux (which I used daily for 3 years) I prefer over Windows, I can count 10 more that drive me nuts. Mac fanboys may have it a little easier, but that’s only due to the hard work of 3rd-party software vendors who, coincidentally, don’t share your philosophy and are only supporting you to make a few extra bucks.

I don't see what's so outrageous about having long file names. If the user wants to name files something meaningful to themselves, who's to say they shouldn't use their computer that way? For that matter, for small enough notes, who's to say the user can't just put ALL the data in the file name? I can hear the shrieks now, but short of outdated technical reasons is there any reason to prohibit users from doing this? Filename (and location) is just one of myriad pieces of metadata about a blob of data.

I tend to agree that hierarchical filesystems are going to start disappearing, because organizing everything along a single axis of taxonomy is inefficient (to humans) (same goes for symlinks, which are possibly worse), but while they are still around, brain-damaged restrictions from the 1980s shouldn't be restricting how users use their system.

In PowerShell:
dir -Path "C:\" -Recurse | where { $_.FullName.Length -gt 150 } | sort { $_.FullName.Length } -Descending | select @{n='Length';e={$_.FullName.Length}}, FullName -First 1

Not sure if that where clause helps performance; it might not be necessary.

Ah, I see KL provided the PowerShell answer a while ago. I like mine better, though - the where clause seems to make it run more than twice as fast on my box (160GB, dual 2.8, 3GB ram). My query took 4 minutes 47 seconds, KL’s is still running at 13 minutes.

I figured a baseline of 150 characters made sense since Jeff found that 152 was the longest path on a fresh install.

Here it is in a more terse format:
ls -r -fo C:\ | where {$_.FullName.Length -gt 150} | sort {$_.FullName.Length} | select -Last 1 | select FullName | fl
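For anyone who would rather run the same query from Java than PowerShell, here is a rough NIO-based equivalent. The method name is invented and the 150-character cutoff is just the one used above; note that Files.walk throws on the first unreadable directory, so a production version would want a FileVisitor that skips access-denied entries:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Comparator;
import java.util.Optional;
import java.util.stream.Stream;

public class LongestPath {
    // Walks the tree under root and returns the longest absolute path
    // whose length exceeds the cutoff, if any.
    static Optional<String> longestPathOver(Path root, int cutoff) throws IOException {
        try (Stream<Path> paths = Files.walk(root)) {
            return paths.map(p -> p.toAbsolutePath().toString())
                        .filter(s -> s.length() > cutoff)
                        .max(Comparator.comparingInt(String::length));
        }
    }

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        longestPathOver(root, 150)  // same 150-character baseline as above
            .ifPresent(s -> System.out.println(s.length() + ": " + s));
    }
}
```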


Yes I got it, from Jeff Atwood’s program.

I just hit this problem organising some MP3s. My hierarchy is only a few levels deep, but one or two albums had some very long names and track names. One track in particular took its path over the limit after the album was moved, and that left me with a file that I could not even rename or move from Windows Explorer (Explorer just refused to do anything with it).

The workaround was to rename the lower-level folders so they weren't so long, and that is how it should have been from the start. However, the fact remains that I managed to blow the 260-character limit just by moving a few folders around, and the results were pretty unpredictable. For example, I couldn't find the total running time for the album, and MS SyncToy simply skipped half the files it was meant to be backing up after hitting this file (which is how I managed to find the culprit).

– Jason

I think the critical point is that there is no actual limit on the paths that can be created, just a limit in some Win32 APIs. Those APIs have to stay the same for backwards-compatibility purposes. OK. So old programs won't be able to read files with long paths.

So the logical thing to do is create a new interface without the restriction that new code can use. There already is a way to handle paths up to the filesystem's limit (the \\?\ extended-length prefix), but it doesn't support things like relative paths. Microsoft doesn't promote its use and doesn't, in fact, use it in its own software.
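For the curious, the mechanism in question is the \\?\ extended-length prefix, which lets an absolute path run up to roughly 32,767 characters at the cost of disabling normalization (no relative paths, no "." or "..", no forward slashes). A small sketch of how a helper might build such a string; the class and method names are hypothetical, not any Windows API:

```java
public class ExtendedPath {
    // Prepends the Win32 extended-length prefix so the path may exceed
    // MAX_PATH. The input must already be absolute and use backslashes,
    // because the \\?\ form turns off path normalization.
    static String toExtendedLength(String absolutePath) {
        if (absolutePath.startsWith("\\\\?\\")) {
            return absolutePath;                      // already prefixed
        }
        if (absolutePath.startsWith("\\\\")) {
            // UNC path: \\server\share becomes \\?\UNC\server\share
            return "\\\\?\\UNC" + absolutePath.substring(1);
        }
        return "\\\\?\\" + absolutePath;              // C:\... becomes \\?\C:\...
    }

    public static void main(String[] args) {
        System.out.println(toExtendedLength("C:\\some\\very\\deep\\path"));
        System.out.println(toExtendedLength("\\\\server\\share\\file.txt"));
    }
}
```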

This problem could go away eventually if simply new and useful API functions without the path limit were created, used throughout Microsoft’s software, and promoted for use in all new software. If it had happened around the year 2000 most people would be blissfully unaware today (honestly, WinNT4 would have been a good time to do this, but it’s never too late). If it happens tomorrow maybe we’ll put this to rest by 2020. But it won’t happen tomorrow. When I turn 40 I’ll still be dealing with the path limit and I won’t even have a flying car. Some future.

All of these problems have one solution, and that is:

Path too long.

Error cannot delete file: cannot read from source file or disk

Cannot delete file: Access is denied

There has been a sharing violation.

Cannot delete file or folder The file name you specified
is not valid or too long. Specify a different file name.

The source or destination file may be in use.

The file is in use by another program or user.

Error Deleting File or Folder

Make sure the disk is not full or write-protected and that the file is not currently in use.

Path too deep.

I had a similar problem and finally I found a solution:

Silverlight definitely needs long file paths:

C:\Documents and Settings\reallylongerusername\Local Settings\Application Data\Microsoft\Silverlight\is\qzm2cwh3.bk0\xvzol3cm.540\1\s\wbt3ggyazzgf42xkvmzkumxhseqylr2brzmpatqodms5hg2z1jaaahaa\f\888b19e95-cf6b-4b0c-9588-1a8ea8ce8ca4-c4b19e95-cf6b-4b0c-9588-1a8ea8ce8ca4.db

You’re left with around 50 characters once that completely OTT IsolatedStorage path is created.

Without drifting through the massive amount of boring trolling by users of various operating systems…

The biggest problem is that a not-inconsiderable number of those 256 characters are 'taken' from you by default paths. For instance, with Visual Studio it's "C:\Users\\Documents\Visual Studio 2010\Projects", and THEN, if you have a sensible naming structure, you use something akin to the project namespace, so "", which is then used twice: once for the 'solution'-level folder and once for each project under it (whatever namespace they use).

Only THEN do you actually get to the content folders of your project.

To get anywhere near useful, you have to ignore all the Windows 'friendliness' anyway and create yourself short paths from the drive root, like 'C:\bill\projects'.


~256 characters might be okay for most people, but anyone working with Node.js and/or npm on Windows is in for some hell with the node_modules directory. Once you have a project eight directories deep and run npm install, you'll wish you were on Linux or a Mac. Workarounds exist, but having to create a build process for a build process to keep things from breaking is certainly something we shouldn't need to do.
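To see how fast node_modules nesting burns the budget, here is a back-of-the-envelope sketch; the base path and the package-name length are invented averages, not measurements:

```java
public class NodeModulesDepth {
    // Counts how many "\node_modules\<package>" levels it takes for a
    // path starting at base to pass the 260-character MAX_PATH limit.
    static int levelsUntilTooLong(String base, String segment) {
        StringBuilder path = new StringBuilder(base);
        int depth = 0;
        while (path.length() < 260) {
            path.append(segment);
            depth++;
        }
        return depth;
    }

    public static void main(String[] args) {
        // Hypothetical project root (30 chars) and a 19-character
        // package name, giving 33 characters per nesting level.
        int depth = levelsUntilTooLong(
                "C:\\Users\\someuser\\projects\\app",
                "\\node_modules\\some-package-name-x");
        System.out.println("Past 260 chars after " + depth + " levels of nesting");
    }
}
```

With those made-up but plausible numbers, just seven levels of nested dependencies are enough to cross the limit.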


From what I’m seeing, there seems to be a slow but certain push by Microsoft to address this problem in their core libraries:

