Set Working Folder -> "being used by another process"


mlippert
Posts: 252
Joined: Wed Oct 06, 2004 10:49 am
Location: Cambridge, MA

Post by mlippert » Mon Oct 02, 2006 10:13 am

Beth,
I just don't understand why you think unbinding the solutions from Visual Studio needs to be done if Visual Studio isn't running? For 1 thing that requires that those solutions already be on the disk somewhere so that Visual Studio can load (and change) them, and this problem occurs when they don't exist on the local system.

As for the other, I understand how to turn on the client logging; I guess I was really asking about adding in the other options for more detail. What other options?

Since you haven't specified which classes to log I assume you want
enableLogging value set to "true" and
classesToLog value set to "all"
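For concreteness, those two settings would go in the appSettings section of the Vault client's .config file. This is a sketch based only on the key names mentioned above; the exact file name (e.g. VaultGUIClient.exe.config) and surrounding structure are assumptions:

```xml
<!-- Sketch: appSettings entries for client-side logging.
     Key names are from the post above; file name is an assumption. -->
<configuration>
  <appSettings>
    <add key="enableLogging" value="true" />
    <add key="classesToLog" value="all" />
  </appSettings>
</configuration>
```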

Mike

KentPitman
Posts: 10
Joined: Mon Aug 28, 2006 11:24 am
Location: Massachusetts

Post by KentPitman » Mon Oct 02, 2006 10:26 am

Beth wrote:Details on client-side logging can be found here: http://support.sourcegear.com/viewtopic.php?p=5375

Oh, and the unbinding from VS would only be happening on the one client machine, not with anyone else. VS and Vault can sometimes be looking in different places for their information, which is why I suggested getting it out of the picture for that person.
Getting Visual Studio out of the picture is not feasible for me in the near future. We're developing product on tight time schedules and I just can't justify the time; I've already spent way more time debugging this issue than I can justify. At some point, I'll have to declare failure on debugging this endlessly and just cut my losses: ask my MIS department to wipe my disk and reinstall things as if mine were a new machine, and laboriously reconstruct my entire set of environment options, other installed programs, etc. However, if it gets that far, it will represent a total failure of Vault as a reliable entity and should not be regarded as a "handy way of debugging this problem". It's ok to ask if this is a point of flexibility on my end, but the answer is no, and I hope that's an acceptable answer on your end and that you don't just stop working on the problem because I'm not able to be so flexible as to simply uninstall Studio.

I'll see what I can do in the way of debugging the issue of the subst device, however I find it a bit concerning that I haven't heard someone say "we've heavily tested this on shares or mapped drives" or "we know this never works on shares or mapped drives, and have documented this fact" but rather that the response has been to suggest I take that out of the picture, as if it were a rogue programming element. I emphasize that a mapping table that informed Vault of what the impossible-to-otherwise-determine relationships are (like that M: maps to D:\Trunk on machine x or that share \\Foo\ maps to C:\blah\Foo\ on machine y) would get around the "is this a file someone else is using or have I used the same file myself under two different (and opaque) names?" problem.
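The mapping table proposed above can be sketched in a few lines. This is purely illustrative of the idea, not anything Vault actually implements; the names (ALIASES, normalize_path) and the example mappings are the ones hypothesized in the post:

```python
# Hypothetical sketch of the proposed alias table: the client rewrites any
# known drive/share prefix to its canonical local path before comparing
# filenames, so two spellings of the same file are recognized as one.
ALIASES = {
    "M:\\": "D:\\Trunk\\",          # subst'd drive from the post above
    "\\\\Foo\\": "C:\\blah\\Foo\\", # share mapping from the post above
}

def normalize_path(path: str, aliases=ALIASES) -> str:
    """Rewrite a known alias prefix to its canonical form (case-insensitive)."""
    lowered = path.lower()
    for alias, canonical in aliases.items():
        if lowered.startswith(alias.lower()):
            return canonical + path[len(alias):]
    return path

# Both spellings now resolve to the same canonical name:
print(normalize_path("M:\\foo"))         # D:\Trunk\foo
print(normalize_path("D:\\Trunk\\foo"))  # D:\Trunk\foo
```

With such a table, the "is this a file someone else is using, or have I used the same file myself under two different names?" question reduces to comparing canonical paths.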

Furthermore, no one would be happier than me if there were a way to tell Vault "when Studio tries to contact you, deny access". I intend to do all my checkouts via the Vault GUI, the Windows Command Prompt, or Emacs invoking shell commands. I don't like it when Studio does things to Vault, because I never know what it does. If Vault can tell the difference between those various cases and deny Studio while allowing the other styles, that would be great.

I don't think I can de-integrate Vault from Studio, because I think Studio will interpret that as meaning go back to "Use SourceSafe" ... which has its own set of problems. And every time I try to set my project not to use the source control stuff, it wants to put that information back into the source-controlled files, which is also a problem: it either tries to override the project defaults and risks them being checked back in, or else it risks the checked-out version (which, by default, others in my group use with Studio) overriding my settings.

But if Studio thought it was to use Vault, and Vault was just mysteriously unavailable to Studio alone due to some setting I could set, that would, strangely enough, solve most of the problem that led to this, because I normally never mention M: to Vault interactively. Only Studio does that, because I compile out of M:. In Vault, I have tried to always use the underlying directories so that the right branch checks out to the right place. In Studio, I use M: so that I can switch which of those consistent branches I compile.

Beth
Posts: 8550
Joined: Wed Jun 21, 2006 8:24 pm
Location: SourceGear

Post by Beth » Mon Oct 02, 2006 11:29 am

mlippert wrote:The virus scan is a good idea, when the new user was created and the old profile mapped over, MIS also changed the virus software that we run from Norton to McAfee. They might have it set so that it is scanning those files.
I hadn't heard back on the virus scanning part or else I missed it. Could you try opening CacheMember_WorkingFolderAssignments in a binary editor and try saving it? That might be one way of seeing if another process has it locked. Another option would be to download Filemon and see if that can track down what is opening that file. Filemon is found at http://www.sysinternals.com/Utilities/Filemon.html.
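The "open it in a binary editor and save it" check can also be scripted. This is a small probe in the spirit of that suggestion, not a SourceGear tool; note that the Windows sharing violation it is meant to surface (another process holding the file open exclusively) raises an error at open/write time on Windows, while POSIX advisory locks would not block it:

```python
# Probe whether a file (e.g. CacheMember_WorkingFolderAssignments) can be
# opened read/write and re-saved. On Windows, a sharing violation from
# another process holding the file raises PermissionError.
import os

def probe_writable(path: str) -> bool:
    """Return True if the file can be opened read/write and re-saved."""
    try:
        with open(path, "r+b") as f:
            data = f.read()
            f.seek(0)
            f.write(data)   # write the same bytes back, like "save" in an editor
        return True
    except OSError:         # PermissionError, FileNotFoundError, etc.
        return False
```

If the probe fails while Vault is reporting the error, a tool like Filemon can then identify which process holds the file.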

KentPitman
Posts: 10
Joined: Mon Aug 28, 2006 11:24 am
Location: Massachusetts

Post by KentPitman » Mon Oct 02, 2006 11:47 am

Beth wrote:
The virus scan is a good idea, when the new user was created and the old profile mapped over, MIS also changed the virus software that we run from Norton to McAfee. They might have it set so that it is scanning those files.
I hadn't heard back on the virus scanning part or else I missed it. Could you try opening CacheMember_WorkingFolderAssignments in a binary editor and try saving it? That might be one way of seeing if another process has it locked. Another option would be to download Filemon and see if that can track down what is opening that file. Filemon is found at http://www.sysinternals.com/Utilities/Filemon.html.
We'll look into this, but a lot of the suggestions you're offering seem to take the error message at face value.

I think the error message is lying. I think it's getting an error that it is attributing to an open file but that is something else entirely.

Here are some examples of the kind of thing I think is actually happening. I have no data to support this other than 30 years of experience programming and seeing bugs of this kind. I'm not alleging that these problems in particular are happening; what I'm saying is that the text of the error message may be the thing making this hard to debug.

Scenario A (opaque filename confusion)

Vault made some notation about a file A under the name D:\Trunk\foo and later found that D:\Trunk\foo was not what it expected, even though it had no record of having changed it. In the interim, however, Vault had modified M:\foo thinking it was a different file. Vault assumes another person is in play; it doesn't realize its own actions are perturbing it, because it has no internal model of the idea that two different filenames could manipulate the same file.
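The aliasing in Scenario A is detectable at the filesystem level rather than by comparing name strings: os.path.samefile compares underlying file identity (device and inode on POSIX, file ID on Windows). A sketch, using a symlink as a stand-in for a subst'd drive:

```python
# Two names, one file: string comparison says "different", the
# filesystem says "same". A client that only compares names will
# wrongly conclude a second party is touching the file.
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    real = os.path.join(d, "foo")
    alias = os.path.join(d, "alias")
    open(real, "w").close()
    os.symlink(real, alias)                    # alias -> real

    names_equal = (real == alias)              # False: the strings differ
    same_file = os.path.samefile(real, alias)  # True: one underlying file
    print(names_equal, same_file)
```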

Scenario B (human name confusion)

I made some modification to Vault's persistent data under the name Mathsoft\kpitman, and then when Mathsoft was acquired by PTC and I became PTCNET\kpitman, I continued to use the Vault login name kpitman. Somehow Vault has noticed that some file changed earlier by Mathsoft\kpitman is now being changed by PTCNET\kpitman, and it doesn't realize those are the same person.

Scenario C (stale data)

Vault Server has some persistent information that is on the server and that we are re-requesting each time this happens, and we should be checking server data transfers, not log files and not local profile info, so we're looking in the wrong place.

- - - - - - - - - -

I don't think studio is implicated because it doesn't even have to be running when this happens. I normally don't run with Studio, but I get this error all the time.

I don't think virus protection is implicated, because I don't know why virus protection would single out this one file and no others. It's definitely not because of data in the file, since Vault re-creates the file even if we delete it.

I think it's possible that the M: device is implicated, but I don't see why removing it is the solution. Vault does not know about M: unless Visual Studio has told it about M:. The reason I'm running this command is exactly and only to eradicate the erroneous uses of it within Vault that Studio might have given it.

I don't think another program is using the file, because I keep restarting my machine and running these scenarios with nothing else running. I could get Filemon to check, but it seems silly as a first line of attack since no other programs are running and the file doesn't even exist until Vault creates it.

KentPitman
Posts: 10
Joined: Mon Aug 28, 2006 11:24 am
Location: Massachusetts

Post by KentPitman » Mon Oct 02, 2006 1:14 pm

mlippert wrote:The virus scan is a good idea, when the new user was created and the old profile mapped over, MIS also changed the virus software that we run from Norton to McAfee. They might have it set so that it is scanning those files.
Beth wrote:I hadn't heard back on the virus scanning part or else I missed it. Could you try opening CacheMember_WorkingFolderAssignments in a binary editor and try saving it? That might be one way of seeing if another process has it locked. Another option would be to download Filemon and see if that can track down what is opening that file. Filemon is found at http://www.sysinternals.com/Utilities/Filemon.html.
KentPitman wrote:We'll look into this, but a lot of these suggestions you're offering seem to actually believe the error message.

I think the error message is lying. I think it's getting an error that it is attributing to an open file but that is something else entirely. ...
Well, notwithstanding my worry to the contrary, I disabled McAfee's on-access virus scanning since it was simple to try, and indeed this seems to allow the write to complete. So I guess I'm convinced this was/is the problem. (And, fortunately, it exonerates the M: issue, at least for now.)

I was reluctant to just run without McAfee able to do that virus scanning in general, so in the On-Access options, I selected All Processes -> Detection -> Exclusions... and made an entry for C:\Documents and Settings\kpitman.PTCNET\Local Settings\Application Data\SourceGear\Vault_1\Client\ and its subfolders, saying only to exempt them from scan-on-write. Then I re-enabled McAfee's on-access scanning. With this exemption in place, it still seems to work to set the troublesome default directory.

It does leave me with one lingering concern: you don't have any files in that hierarchy that you execute, right? It's just non-executable binary data? It would be nice if there were an extension saying "this file contains binary data that is expressly not to be executed", where Windows would inquire if you tried to rename such a file to something else without a virus scan, to avoid Trojan horses. But other than wondering whether my creating that little hierarchy makes a safe haven for viruses to live and breed, I guess I'm all set.

I assume you guys will talk to McAfee about whether I should need to do this, since it doesn't seem like I should.

jclausius
Posts: 3702
Joined: Tue Dec 16, 2003 1:17 pm
Location: SourceGear

Post by jclausius » Mon Oct 02, 2006 2:19 pm

In order to prevent multiple Vault clients from overwriting each other's data, the Vault client will exclusively open its cache files.

What we've seen in the past is that virus scanners monitor the system for file changes; once a scanner sees that a file has been written to, it opens the file for a virus scan. When the Vault client comes back to open the same file, the virus scanner still has it open, causing the Vault client's open to fail.
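The race described above can be sketched as follows. Windows enforces exclusivity via share modes at open() time; this POSIX sketch approximates that with flock(), where a second non-blocking exclusive lock on the same file is refused while the first handle holds it. It illustrates the conflict, not Vault's actual implementation:

```python
# One handle (the "virus scanner") holds the cache file; a second open
# for exclusive use (the "Vault client") fails while it does.
import fcntl
import tempfile

with tempfile.NamedTemporaryFile() as cache:
    scanner = open(cache.name, "rb")         # scanner opens the file post-write
    fcntl.flock(scanner, fcntl.LOCK_EX)      # and still holds it...

    client = open(cache.name, "rb")          # ...when the Vault client returns
    try:
        fcntl.flock(client, fcntl.LOCK_EX | fcntl.LOCK_NB)
        conflict = False
    except OSError:                          # EWOULDBLOCK: lock refused
        conflict = True
    print(conflict)  # True

    scanner.close()
    client.close()
```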

The fix is to keep any virus tools from scanning the client-side cache files.
Jeff Clausius
SourceGear

mlippert
Posts: 252
Joined: Wed Oct 06, 2004 10:49 am
Location: Cambridge, MA

Post by mlippert » Mon Oct 02, 2006 2:28 pm

Thanks for your help figuring this out, Beth. And Jeff, thanks for the explanation of how the virus scanner is getting in the way.

Mike

KentPitman
Posts: 10
Joined: Mon Aug 28, 2006 11:24 am
Location: Massachusetts

Post by KentPitman » Tue Oct 03, 2006 10:24 am

mlippert wrote:Thanks for your help figuring this out Beth, and Jeff...
Yes, thanks.

And sorry for sounding frustrated. But as you can imagine, this has been festering for a while and was driving me nuts.

I appreciate your working through it.
