
Vaulting Questions (PDMLink 8.0 M040)

borourke
5-Regular Member


We are setting up vaulting for the first time, and were wondering if the kind folks in WindchillLand had some dos and don'ts about vaulting. Some questions we have: is it possible to have new data move to the vault upon creation, or do you have to set up vaulting jobs to run periodically? And do most folks have separate vaults/folders for each product/library, or do you tend to lump them all together? Any and all help is most appreciated!

Bill,

Files are put directly into a cache vault upon creation. They are then
moved permanently to the vault defined in the vaulting rules for that
particular object. They are never stored in the database if you are using
external storage.

It's usually recommended to put all files in a single vault using the
wt.fv.forceContentToVault option in wt.properties. A single vault can have
multiple folders so you don't stress the OS by putting too many files in a
single OS folder. If you are using Windows, set disable8dot3 to true (1)
using the fsutil command (fsutil behavior set disable8dot3 1) for better
performance. If you're using Windchill 9.0, you can use Root Folders, which
simplify maintenance by automatically creating new folders as needed (used in
conjunction with a wt.properties threshold value - check
http://<yourwindchillserver>/Windchill/properties.html for details).
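If you go that route, the property can be set from the command line. This is only a sketch assuming a standard install layout, with xconfmanager run from the Windchill bin directory - double-check the target path for your release:

```shell
# Force new content into the file vault rather than database BLOBs.
# -s sets the property, -t names the target file, -p propagates the change.
xconfmanager -s wt.fv.forceContentToVault=true -t codebase/wt.properties -p
```

Method servers typically need a restart before a wt.properties change takes effect.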

Best Regards,
Bill Palm
Manager - RAPiDS Support Center
ProductSpace Solutions, Inc.


On Tue, Jan 6, 2009 at 4:24 PM, bill o'rourke <-> wrote:

> [original question quoted in full above]
ToddVotapka
4-Participant
(To:borourke)

Bill,

In addition, I found it very useful to use the FvLoader utility. The loader can be used to create and configure external vaults and vaulting rules. See WCSysAdminGuide Chap 5. With the utility you can load all of your vaulting rules from a file. We created separate folders for each object type we were vaulting.
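For what it's worth, an FvLoader invocation might look like the sketch below; the class name, flags, and rules-file name here are assumptions from memory, so confirm them against WCSysAdminGuide Chap 5 before running anything:

```shell
# Hypothetical sketch: bulk-load vault and rule definitions from a file.
# Verify the actual tool class name and options in the sys admin guide.
windchill wt.fv.tools.FvLoader -u wcadmin -p <password> -d vaultrules.xml
```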

Todd Votapka

BAE Systems IT


In Reply to Bill Palm:


We have a simple but not the simplest configuration: 3 vaults
(defaultcachevault, main, and librarydata). All our design data goes in
the main vault. When we added another site, we created a main vault and
library vault for it. If your file count grows large, you can create
additional mount folders for each vault. You would mark the current
folder as read-only and create a new folder; all new data would then flow
into the new folder. You do have to run periodic revaulting jobs,
replication jobs (if applicable), and remove-unreferenced-files jobs.
When an item is deleted, it does not disappear from the vaults; it is
just unreferenced.



If your system ever has a problem and cannot write to the vaults, it
will fail over to writing BLOBs in the database. It's nice that you do not
lose data, but you are never prompted that there was a problem. You
should check that the vaults and folders are still valid; if not,
revalidate them and run the revaulting jobs. Data will then flow out of the
BLOBs to the correct vault.



We did have success with our test server in that we were able to mount
both systems to the same vault locations. Since test was a clone, they
should be referencing the same files. We made the mounts to the
production vaults read-only so test could not write to them. On test, we
made the production folders read-only as well and created a new folder in
a place that test could write to. That way, any new files could be
added to the test server. The test server did not care if there were
newer files in the production vaults it was referencing, since it would
see them as unreferenced files.


borourke
5-Regular Member
(To:borourke)



Just a followup - I noticed that our defaultcache folder has some 44G of data while our Oracle database contains only 10G of data. Do I need to do a revaulting, after setting up my rules, before I run the job to delete unreferenced objects?
borourke
5-Regular Member
(To:borourke)

Another followup question about vaulting. If we decide to set up our vaults similar to Antonio's (default, main, etc.), is there a way to write some catch-all rules to send the data to these different vaults, or do we have to set up a vaulting rule for each context?

Send lawyers, guns, and money...the sh!t has hit the fan!

Each context. PITA
mlockwood
20-Turquoise
(To:borourke)

Not sure what PTC currently recommends, but we found it not necessary to use any vaulting rules. Simply set ForceContentToVault and set up to have a new folder every so many files.
sdrzewiczewski
5-Regular Member
(To:borourke)

PTC recommends using 1 vault and the ForceContentToVault property. It is
supposed to be better for performance because the system doesn't have to
consult the vaulting rules when storing a file.



If you do want to use multiple vaults and use vaulting rules to control
which vault a specific context is stored in, it's pretty easy to bulk-load
the vaulting rules.



Steve D.



borourke
5-Regular Member
(To:borourke)

Another PITA question about vaulting. We have set up our new folders on a recently added F drive against the defaultcachevault. When I went to update the wt.fv.forceContentToVault property to 'true', I noticed this property was already set to true. So I checked under the ptc\vaults directory and found we have 47,000 files at 49G in the defaultcachevault folder. I then ran the revaulting and pulled some 3,000 files (4G), I am assuming from my Oracle BLOBs, into my first folder on the new F drive. Is there a way to move the 47,000 files from my ptc\vaults directory over to my new folders on the F drive, and when can I run the Remove Unreferenced Objects command? Thanks!

Send lawyers, guns, and money...the sh!t has hit the fan!


I just would like to encourage everyone to use a single folder as the vault (at least on Windows) --- I don't think it's worth the headache to separate the objects into different folders.

We have a vault of 2,000,000 files and 600 GB in one folder without any problem.

NTFS can handle 4,294,967,295 files and 256 terabytes of data (the limit doesn't depend on the folders; it applies per volume).

http://technet.microsoft.com/en-us/library/cc781134.aspx



The only thing you have to do is:

If you have large numbers of files in an NTFS folder (300,000 or more), disable short-file-name generation for better performance, especially if the first six characters of the long file names are similar.
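The current setting can be checked and changed with fsutil from an elevated prompt. These are standard fsutil commands, though whether the setting is per-volume or system-wide varies by Windows version:

```shell
# Show the current 8.3 short-name generation setting.
fsutil behavior query disable8dot3

# Disable short-name generation (1 = disabled).
# Existing short names are kept; only newly created files are affected.
fsutil behavior set disable8dot3 1
```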

Herbert,

Windows may be able to handle large numbers of files in single folders,
but most backup software can't. Neither can sync software like Robocopy or
Xcopy. Setting up to auto-fill to 60,000 files per folder works great for us
and our backup routine as well. Good luck when you try to rehost.
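To illustrate the point, a typical Robocopy mirror of a vault mount (the paths here are hypothetical) has to enumerate the whole folder before copying, which is where huge single folders hurt:

```shell
# Mirror a vault folder to backup storage (hypothetical paths).
# /MIR mirrors the tree (including deletions); /R:3 /W:5 cap retry time.
robocopy D:\ptc\vaults\mainvault E:\backup\mainvault /MIR /R:3 /W:5
```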

Kory Krofft
Applications Engineer
Trimble Navigation Ltd.
5475 Kellenburger Rd.
Dayton, Oh 45424
Ph. 937.245.5356
kory_krofft@trimble.com