Flag - or Solve - a Windows Tree Problem
Posted: Sat Oct 04, 2014 2:05 am
Good Morning!
Thanks for the software! I was about to start something similar for myself, when I stumbled upon your program. Well worth every cent!
I'd like to propose an enhancement I believe would be very beneficial to all of us happy clients! Of course, I may live in such a black hole that I'm the only one interested in this idea. But I'm open to feedback from you and your community. (It never hurts to get cross-checks on reality!)
ENVIRONMENT: UVK ver. 6.7.0.0 under XP Pro, for all of the following.
I ran into a bit of a snag a couple of weeks ago and thought I'd suggest the following enhancement. (Sorry this isn't a bit more timely - some details are going to be a bit foggy....)
THE SYMPTOM(S):
About a terabyte of data/programs over multiple hard drives of various sizes. The discovered problem actually had multiple instances, but I'll just focus on the one example that allowed me to ID the issue.
Working on an XP Pro system with a clean report from the virus scan. I had a variety of permission and access errors to address, so I used the "unlock all files" feature. Needed to do a reboot, so I did: still had locked files. Same story after resetting all permissions to defaults. I then decided to delete the files in a temp dir in order to reduce processing times. Some files needed to be deleted on reboot, which seemed strange for a temp dir that had scanned clean. But, ever vigilant, I checked running processes, etc., and couldn't find anything that should affect the directory. Oh well, who knows what secrets lurk in the hearts of computers?

Rebooted. Took a look. No files deleted. Attempted this several times; same results every time. I repeatedly re-did the unlock and permission resets prior to each new attempt to delete the tree. Same results. Checked the Win XP pending-delete entries: yes, the list of files to delete was in the proper place(s). Reboot. No files deleted. Furthermore, they were no longer listed as pending deletion (anywhere). The commands always seemed to work: no error messages, and the event log(s) did not report anything. No matter how many times I tried, nothing was ever accomplished.
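For anyone who wants to double-check that step on their own system: the deletes Windows schedules for the next boot are recorded in the PendingFileRenameOperations value under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager. A rough sketch - Python, assuming it's available on the affected machine - that dumps whatever is currently queued:

[code]
# Sketch: list the file operations Windows has queued for the next boot.
# These are the entries MoveFileEx(..., MOVEFILE_DELAY_UNTIL_REBOOT) writes.
import winreg

KEY = r"SYSTEM\CurrentControlSet\Control\Session Manager"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
    try:
        value, _ = winreg.QueryValueEx(key, "PendingFileRenameOperations")
    except FileNotFoundError:
        value = []  # nothing queued at the moment

# The value is REG_MULTI_SZ, read back as a list of strings in
# (source, destination) pairs; an empty destination means
# "delete the source at boot".
for src, dst in zip(value[::2], value[1::2]):
    print(src, "->", dst or "<delete>")
[/code]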
THE PROBLEM & SOLUTION:
The temp directory was actually labeled as a temp backup location for files, the newest of which was several years old. There was a second copy in another location - presumably the original. Other files were in the temp backup as well, and they also had originals in other locations. All the files were related to the same subject, and all had been consolidated into a more cohesive structure in the backup dir. (Presumably an organizational step prior to moving the backup elsewhere.) I noted the entire tree was compressed, but this turned out to be a red herring. The real problem turned out to be that when the files were moved into the "new" tree structure, the max path name limit was exceeded! (I later confirmed that the various "sub-trees" had simply been copied from their original locations into an already deep branch of the temp backup dir.) I was forced down to the bottom of the tree in order to delete "segments" with short enough path lengths - one at a time, using del .\*.* /f /s. Then I'd move a ways up the tree and try again. If it worked, fine. If not, move back down the tree a ways. Repeat until the entire dir was finally deleted.
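For what it's worth, the manual crawl can sometimes be skipped entirely: the Win32 file APIs accept absolute paths prefixed with \\?\, which lifts the limit from the classic 260-character MAX_PATH to roughly 32,767 characters. A sketch of what that looks like from Python (C:\TempBackup is just a stand-in for the real offending dir):

[code]
# Sketch: remove an over-long tree by handing the APIs a \\?\-prefixed
# path, which bypasses the 260-character MAX_PATH limit that plain
# del / Explorer operations run into.
import os
import shutil

target = r"C:\TempBackup"  # hypothetical path; use the real dir here

# \\?\ requires an absolute path with backslashes.
shutil.rmtree("\\\\?\\" + os.path.abspath(target))
[/code]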
SUGGESTION:
It's been a long time since I've done any coding (before C++). But I still remember that not everything is as easy as you'd think! IF it isn't too much trouble, maybe the delete - and the other file/dir commands - could do a count on path length. If the max length is met or exceeded, at least throw up an error message on the display. That alone would be very helpful. Although I've seen this sequence happen a few times before, there had been enough other problems with this system that I just figured I'd missed something; the suggested message would have saved me a fair bit of time. I considered a stand-alone tool to do the check and report pass/fail, but that would only get used when I already suspected the problem - not as functional as having the command ID the problem for me.
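To make that lighter-weight half of the idea concrete, here's roughly the pre-flight check I have in mind. Python just for illustration, the directory is a made-up example, and 260 is the classic Win32 MAX_PATH figure:

[code]
# Sketch: walk a tree and warn about any path at or over MAX_PATH.
import os

MAX_PATH = 260  # classic Win32 limit, counting the terminating NUL

def find_long_paths(root):
    # Scan with the \\?\ prefix so the walk itself isn't stopped by
    # the very limit it is checking for; strip it again for reporting.
    prefix = "\\\\?\\"
    root = prefix + os.path.abspath(root)
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)[len(prefix):]
            if len(full) >= MAX_PATH:
                yield len(full), full

offenders = sorted(find_long_paths(r"C:\TempBackup"), reverse=True)
if offenders:
    print("WARNING: %d path(s) at/over %d chars; longest is %d"
          % (len(offenders), MAX_PATH, offenders[0][0]))
[/code]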
IF it isn't too much trouble, it would be even nicer if the command didn't just throw a message. Instead, when the limit is reached (or reasonably close), the command would work its way down the tree, keeping pointers to segments of paths shorter than the max, all the way to the bottom of the tree. Then it would perform its function starting at the bottom of the tree (the deepest point within limits) and work its way up, performing its action on each segment before moving up to the next block, repeating the process incrementally until the "top" was reached. At completion, a message about exceeding the max path length (with the total path length found?) would not only reinforce the impression of your tool's quality, but could serve as a flag for potentially similar problems elsewhere.
I know it's some bloat to code. But the size cost vs. functionality seems worth it to me (for whatever that's worth!). I kinda envision a small routine, called prior to the command, that checks the path lengths. If the max is not met/exceeded, perform as usual. If a path is too long, the routine would call a recursive sub-routine to pass back an array of pointers into the tree that do not exceed the limits. The routine would then call the command (recursively?), passing it segments of the tree, to process the array from the bottom up. It takes incrementally longer, but not as long as manually traveling to the bottom of the tree and manually working our way back up.
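Something like the following is what I'm picturing - a rough Python sketch, not UVK code. It reuses the \\?\ prefix from the earlier sketch so the walk itself doesn't trip over the limit, and os.walk's topdown=False gives exactly the bottom-first order described above:

[code]
# Sketch: delete a tree bottom-up - files first, then each directory
# once it's empty, deepest levels before their parents.
import os

def delete_bottom_up(root):
    root = "\\\\?\\" + os.path.abspath(root)  # extended-length prefix
    # topdown=False yields leaf directories before their parents,
    # i.e. "start at the bottom and work up".
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        for name in filenames:
            os.remove(os.path.join(dirpath, name))
        for name in dirnames:
            os.rmdir(os.path.join(dirpath, name))
    os.rmdir(root)  # finally remove the (now empty) top of the tree

delete_bottom_up(r"C:\TempBackup")  # hypothetical path
[/code]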
Please feel free to contact me if I can provide more info or whatever. About mid-week I'll be out of touch for about 7-10 days, so please accept my apology for being potentially non-communicative.
I'd like to once more say thanks for a nice quality tool. I'm looking forward to continuing to become more proficient with it. (It takes a while longer when you don't get to use it every day.)