Recommendations for optimal Vault performance

A collection of information about Vault, including solutions to common problems.

Moderator: SourceGear

lbauer
Posts: 9736
Joined: Tue Dec 16, 2003 1:25 pm
Location: SourceGear


Post by lbauer » Tue Jul 05, 2005 1:04 pm

Recommendations for Optimal Vault performance

last updated 5/30/06

Many variables can affect Vault performance, including system hardware and software resources, network configuration, IIS configuration, SSL usage, the bandwidth of your network connection, the state of the SQL database, and your Vault configuration.

Here’s a checklist of things to consider when optimizing Vault performance:


1. Hardware: the Vault/SQL Server machine

The minimum system requirements for Vault are found at:

http://www.sourcegear.com/vault/documen ... ysreq.html


Beyond that, it depends on your own preferences for hardware, OS, etc, and how much money you want to invest in the system. Other factors that affect hardware requirements are how many files users will be checking out, what types of operations they'll be doing, and the size and complexity of your repositories.

More suggestions here:
viewtopic.php?p=38379

For best performance, we recommend that the server machine(s) running Vault and its SQL Server installation be dedicated to Vault. Using the Vault server machine as a file server, mail server, PDC, etc. can take system resources away from Vault.


Tips:

-- If possible, install Vault and MS SQL Server on different machines. Not only is this configuration more secure, but it provides more processing power to each component of the Vault service.

-- Fast disk I/O will improve performance. More memory, SSDs, dual processors and RAID are all desirable (that's where the money part comes in).

-- You must have adequate RAM to run SQL Server. When the server computer runs out of physical memory, SQL Server performance degrades dramatically.

-- Allow enough disk space for SQL Server. To determine an adequate amount of disk space, allow for double or triple the size of the tree you want to store in Vault and monitor the database size every few weeks.

(The size of the Vault database can vary greatly depending on the number and amount of changes in your source code, and the number of binary vs. text files. Vault stores files in a compressed format, and stores the changes between each version of the file (also compressed), so text files tend to require less storage space than binary files. By default Vault stores a full copy of each file every 50 versions for performance reasons.)
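One way to monitor that growth is the built-in sp_spaceused procedure, which reports the current size of a database. A minimal sketch, assuming the default sgvault database name:

```sql
-- Report the total size and unallocated space of the Vault database.
-- Run this every few weeks and compare the results to track growth.
USE sgvault;
EXEC sp_spaceused;
```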

-- Your system should have the flexibility to upgrade (add RAM, for instance.) Your database will grow over time as you add projects, branch, and accumulate history. Additionally, as you add new users to Vault, you will want a machine that can handle the increased load.

2. MS SQL Server 2000/MS SQL Server 2005/MSDE 2000/MS SQL Server Express 2005

How big can a Vault database be? Vault uses SQL Server for its storage, so the size of your database is limited by SQL Server limits. Of course you need adequate hardware as your database grows.

MSDE 2000 or MS SQL Server Express may be options for smaller teams.

NOTE: There are limitations with MSDE 2000 and MS SQL Server Express:

MSDE 2000 has a 2 GB total database size limitation. MSDE also contains code that intentionally slows it down when there are 5 concurrent users.

SQL Server Express supports only 1 CPU (it installs and runs on multiprocessor machines, but only a single CPU is used at any time), 1 GB of RAM (the engine will not use more than 1 GB of memory) and a 4 GB database size (there is no limit on the number of databases that can be attached to the server). Unlike MSDE, SQL Server Express does not have a workload governor.


Tips:
-- Anti-virus software can affect SQL Server performance:

http://support.microsoft.com/?kbid=309422

-- Regular database backup and maintenance is important. See this KB article for details:

viewtopic.php?t=2924

-- File fragmentation can greatly affect SQL Server performance. Database maintenance is more effective when the drives are not fragmented.

To defragment the hard drive on the SQL Server machine:

1) Back up the Vault database to another machine first.
2) Stop the SQL Server and the ASP.NET service so the files aren’t locked.
3) Run defrag on the drive.
4) Start the SQL Server and ASP.NET service.

-- Verify that your Vault databases are configured with "Auto Close" set to FALSE. By default, the Vault installers set Auto Close to FALSE, which gives better performance. This setting is in SQL Server Management Studio: right-click the sgvault database and select Properties->Options->Automatic. The same setting should also be applied to your other Vault databases: sgmaster, sgnotify, sgvaultindex and sgdragnet (Vault Professional).
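If Auto Close has been switched on at some point (after a database move, for instance), it can also be set back to FALSE with T-SQL. A sketch, assuming the default database names listed above:

```sql
-- Keep the Vault databases open between connections.
ALTER DATABASE sgvault SET AUTO_CLOSE OFF;
ALTER DATABASE sgmaster SET AUTO_CLOSE OFF;
ALTER DATABASE sgnotify SET AUTO_CLOSE OFF;
ALTER DATABASE sgvaultindex SET AUTO_CLOSE OFF;
-- Vault Professional installations only:
-- ALTER DATABASE sgdragnet SET AUTO_CLOSE OFF;
```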

-- A SQL Server setting for database statistics can affect performance, especially during branching. See this KB article for the recommended settings: viewtopic.php?f=13&t=22358.
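The linked article has the recommended values. For illustration only, on SQL Server 2005 and later you can inspect the current statistics settings like this (shown for sgvault; verify the recommended values in the KB article before changing anything):

```sql
-- Show the automatic-statistics settings for the Vault database.
SELECT name, is_auto_update_stats_on, is_auto_update_stats_async_on
FROM sys.databases
WHERE name = 'sgvault';

-- Statistics options are changed per database with ALTER DATABASE, e.g.:
-- ALTER DATABASE sgvault SET AUTO_UPDATE_STATISTICS ON;
```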

-- Check the settings for other SQL features such as mirroring and clustering. Misconfiguration of those type of features can affect performance. For example, branching may be slow if you have a synchronous mirror rather than an asynchronous mirror.

3. Vault configuration and usage

What performance should you expect from Vault?

Vault sends only deltas for file changes rather than entire files, resulting in less data being transmitted for changed files.

On a LAN, Vault typically takes < 1 second for a checkout operation. A commit of one file usually takes from 1 - 4 seconds depending on server load.

For remote users, this will vary depending on their available bandwidth and network latency.

Tips:

-- Users should only check out what they intend to modify. Extremely large checkout lists can affect performance, as the Vault server creates a checkout list before each operation. We've had reports of users on a system checking out 1500 - 2000 items per user. In a 20 - 30 user environment, this places a large strain on the checkout lists maintained by the client when working with 60,000+ total checkouts.

-- Do you have flexibility in how to structure your tree? Vault works faster when you have fewer nodes (folders) with more files, rather than many nodes with fewer files in each node.

-- If you don't need to share files across projects, put discrete projects in their own repositories. This helps keep the checkout list smaller. Two or three smaller repositories usually perform better than one large repository.

viewtopic.php?p=5183

-- Folder security can affect performance, as the Vault Server has to check access rights when building the tree. If you don’t need security on certain repositories, disable it.

-- Vault.config settings. Vault performance can be tuned with certain changes to Vault.config:

<DBBufferSizeKB>256</DBBufferSizeKB>
<TreeManagerSize>-1</TreeManagerSize>
<TreeDeltaCompressionThreshold>1000</TreeDeltaCompressionThreshold>

-- DBBufferSizeKB

This sets the size of the memory buffer used when moving files into and out of the database. The optimal size is about 256-384 KB. However, if you deal with many binary file changes, consider 1024 KB or more.

-- TreeManagerSize

Setting TreeManagerSize to -1 tells the Vault service to calculate how many trees to hold based on available memory. Sometimes this calculation results in too much memory usage. The TreeManager keeps these trees in a cache so that, when an out-of-date client connects, it can easily compare the current tree to one of the cached trees. If a client connects and the last tree it downloaded is no longer in the TreeManager cache, a rather expensive database call must be made to bring that client up to date with the latest tree.

Setting TreeManagerSize to a number (let's say 100) tells Vault to throw away any trees that are more than 100 versions out of date. If you can safely say that most of the people who connect will not be more than 100 versions out of date, then you will have no problems. If your tree goes through fewer than 100 versions per day, then 100 is a good value for TreeManagerSize.

In summary, set TreeManagerSize to more than the typical number of tree versions you see in a day, but less than 545. Start at 222 and see if that gives acceptable performance and memory consumption.

More discussion here:
viewtopic.php?t=363

-- Tree Delta compression

Sometimes performance is affected by insufficient server memory during certain operations, like when a large delta must be compressed by the server.

For information about adjusting Tree Delta compression, see this KB article:
viewtopic.php?t=1450

Note: any change to Vault.config requires reloading the settings on the Vault server (run iisreset.exe).


4. Network Considerations

-- Chunked Encoding may improve performance.

The Vault Client has the option to use chunked encoding to reduce memory usage during file upload. By default this is off, as some proxies and other network devices do not support chunked encoding. You can enable chunked encoding in the Vault GUI Client under Tools->Options-> Network settings.

-- SSL may decrease performance.

Using SSL can slow connectivity to Vault.

-- Not using Reverse DNS Look-up may improve performance.

The Vault Admin Tool has this option in the Server Options. This look-up could be slowed by either the DNS server or a router or firewall blocking or slowing that traffic. Users have reported an improvement in speed with this option unchecked.

-- Not using IPv6 may improve performance.

Even though Vault doesn't directly touch the network (it goes through IIS and .NET), it has been reported that some network configurations or equipment are not yet ready for IPv6. Disabling IPv6 on the server has shown performance improvements for some users.

5. IIS Configuration

-- Session timeouts (versions 3.0.4 and earlier)

If the Vault server is on IIS 6 (Windows 2003 Server) and operations frequently time out, you may need to adjust the application pool recycling settings in IIS. These settings can cause Vault to restart after certain periods of time or memory allocations. For details see:

viewtopic.php?t=1014

Each time the ASP.NET process is recycled, the Vault server side tree cache must be rebuilt, which is an expensive operation and can slow performance.

Notes: Because it is computationally expensive to build a repository tree from the database, the Vault Server will keep a cache of the most recently accessed trees. These trees are then used as server side objects to improve performance - avoiding a hit on the database. (For example, at SourceGear, on our 550 MHz PIII based test server, it will take about 15 seconds to build a tree within the server. There are about 4000 folders and 34000 files in this one repository.) However, if the machine is being rebooted, or ASP.NET is being recycled, the benefits of the cache are negated.

Versions later than 3.0.4: In later versions, the installer turns off auto-recycling in Windows 2003 for the VaultService, so there's less possibility for constant recycling.

-- Problems uploading large files (versions 3.0.4 and earlier)

Vault settings:
First check the IIS File Upload Limit in the Vault Admin Tool -> Server Options. The default is 102400. If this is smaller than the file you’re trying to upload, increase this value.

IIS 6 settings:
If the Vault Server is installed on a Windows 2003 Server with IIS 6, check the Connection Timeout under the website Properties->Connections.

The default for this setting is 120 seconds, which is too low when committing large files.

Versions later than 3.0.4: The installer raises this timeout to 1800 seconds, so this is no longer a concern.

(Also see the Chunked Encoding info below for a client-side configuration that can improve uploads.)

6. Sometimes Vault is Just Slow

Certain operations do take time in Vault:

-- VSS Import

The VSS import not only brings your source files over to the Vault database, but it meticulously recreates history, labels, shares, etc. We’d rather have the import be accurate than fast.

While an import is running, you should not have any regular clients connected to the Vault server. Their presence will slow down the import.

viewtopic.php?t=7

-- Obliterate

Obliterate is slow. We are working to improve this.

viewtopic.php?t=2967

Note: We don't recommend using Obliterate if you think you might want to export all or part of your repository someday. The Folder Export\Import tool does not work properly if portions of history have been deleted from the repository.


7. Speeding up Client-side operations

There are a few settings on the client side that can help you work faster:

-- Disable the splash screen
If you’re logging in and out often, this can save you a few seconds:
viewtopic.php?t=12

-- Cloak folders you don’t need.
The Cloak command allows the client to "ignore" a particular folder during recursive operations. For example, imagine a cloaked folder at $/a/b/. If the user executes a recursive "get latest" operation on $/a/, the info in $/a/b/ and any of its sub folders will be ignored. Cloak is a personal feature. When you cloak a folder, you hide it from yourself, but others can still see it.

-- Chunked Encoding may improve performance.
The Vault Client has the option to use chunked encoding to reduce memory usage during file upload. You can enable chunked encoding in the Vault GUI Client under Tools->Options-> Network settings. NOTE: By default, this is off, as some proxies and other network devices do not support chunked encoding.

-- Unsetting "Request Database Delta on Repository Cache Miss" improves performance.
The setting "Request Database Delta on Repository Cache Miss" is found in the Vault GUI client under Tools->Options-> Network settings. Uncheck the box next to that setting. By default, this is turned on.

viewtopic.php?t=78

-- If the Vault client is slow to load the tree, there are changes you can make mentioned in this KB article: viewtopic.php?f=13&t=22633

TROUBLESHOOTING:

If you think Vault's performance is less than it should be, don't suffer in silence; contact us.

For faster resolution, try these troubleshooting steps first, and let us know the results:

Run SQL Server maintenance
Check the database disk for fragmentation
Turn off Folder security – does performance improve?
See how many files are checked out (see note [1] below)
Try chunked/unchunked encoding
Check the log file for any errors
Check SQL Server log file


[1] How to tell how many files you have checked out:

You can run a SQL query to see how many files are checked out.

In SQL Query Analyzer, first run this command to determine your repository ID number:
select repid, name from sgvault.dbo.tblrepositories

Then, to determine how many checked out files are in a particular repository, run:
select count(*) from sgvault.dbo.ufngetlockedfiles(1)
(replace 1 with the actual repository ID number)

You can also get a list of checked out files with this statement:
select * from sgvault.dbo.ufngetlockedfiles(1)
Linda Bauer
SourceGear
Technical Support Manager
