I need to clone our production system to a development environment that is located at a different site from the production system. Any suggestions on reducing the size of the export so that it's less hassle to move from one location to the other? In the past we have exported the entire database, which is quite large, put the export on a disk, and shipped it to the other location.
Hi Sandra,
I don't know of a way to get a smaller extract. But I can give you a few tricks I developed when I sent my Windchill DB to PTC for their upgrade team to develop their upgrade tools.
All this is based on UNIX though... So if you have a Windows-based infrastructure, you can get these tasks done by installing Cygwin on your PC and running the UNIX commands there.
First off, I ran an Oracle expdp to generate a dump (DMP) file. Two years back, that was about 150GB in size. Now it's 220GB! Time flies.
Then, to send the DMP to PTC, I went through the following steps:
1) Compress the file. Note that I use level 1 compression here for speed, since the size benefit of going to compression level 9 is not worth the extra time it takes to compress (if you want to check that tradeoff on your own data, see the quick benchmark sketch after the steps).
date ; gzip -1 -c EXPORT.DMP > EXPORT.gz ; date
2) Split the file into smaller parts for easier file transfers:
split -b 1500m EXPORT.gz EXPORT.gz_
3) FTP the files to the remote site.
4) On the receiving side, reconstruct the DMP from the split files.
cat EXPORT.gz_* > EXPORT.gz
gunzip -c EXPORT.gz > EXPORT.DMP
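About that benchmark I mentioned in step 1: timing a slice of the dump is enough to sanity-check the level 1 vs. level 9 tradeoff on your own data before committing. This is just a sketch; the ~1GB sample size is arbitrary, and it assumes the GNU tools you get on Linux or Cygwin:
# Carve off a ~1GB sample of the dump (sample size is arbitrary)
dd if=EXPORT.DMP of=SAMPLE.DMP bs=1M count=1024
# Time a few compression levels and compare the resulting sizes
for lvl in 1 6 9; do
    time gzip -$lvl -c SAMPLE.DMP > SAMPLE.$lvl.gz
done
ls -l SAMPLE.*.gz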
So although you'll still be sending all the data over the wire, managing a bunch of smaller files will be much more reliable than trying to send one huge file.
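One more trick for reliability: checksum the parts before sending, so the receiving side can spot a corrupted transfer before wasting hours on the import. A rough sketch (md5sum comes with the GNU tools; the host name and target path here are made up):
# On the sending side, checksum every part, then transfer
md5sum EXPORT.gz_* > EXPORT.md5
scp EXPORT.gz_* EXPORT.md5 dba@devhost:/u01/dumps/
# On the receiving side, verify every part before the cat/gunzip step
cd /u01/dumps && md5sum -c EXPORT.md5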
HTH
JL
BTW... If you're on Windows, you can probably work with the ZIP software there as most ZIP programs now have splitting capabilities.
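For example, Info-ZIP's command-line zip (version 3.0 and up) can compress and split in one pass; the archive name and split size below are just illustrative, and GUI tools like 7-Zip have an equivalent "split to volumes" option:
# Compress at level 1 and split into 1500MB volumes in one pass
zip -1 -s 1500m EXPORT_split.zip EXPORT.DMP
# On the receiving side, rejoin the volumes and extract
zip -s 0 EXPORT_split.zip --out EXPORT_joined.zip
unzip EXPORT_joined.zip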
How big is the DB dump?
I believe this is mentioned in the rehost guide, but the Data Pump method of export (the expdp command instead of the old exp command) should provide a faster way to do this. Keep in mind that if you export with expdp, you will have to import with the impdp command as well. expdp also takes a COMPRESSION parameter to shrink the dump as it is written (the old exp COMPRESS=Y argument is not the same thing; it only consolidates extents on import).
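Something along these lines; the schema name, credentials, and sizes below are placeholders rather than anything from Sandra's actual setup, and COMPRESSION=ALL needs Oracle 11g or later plus the Advanced Compression license (leave it off and gzip afterward if you don't have that):
# Data Pump export: FILESIZE splits the dump natively, %U numbers the pieces
expdp system/password schemas=WINDCHILL directory=DATA_PUMP_DIR \
  dumpfile=wc_%U.dmp filesize=1500M compression=ALL logfile=wc_exp.log
# Matching import on the development side
impdp system/password schemas=WINDCHILL directory=DATA_PUMP_DIR \
  dumpfile=wc_%U.dmp logfile=wc_imp.log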
Is the development environment's hard drive accessible from the production environment through the network, and if so, is it a reliable/stable connection? If so, you can export the DB directly to that location on the development server, and once the export is done it will already be there waiting for the import task.
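If it is mounted (over NFS, say), you can point a Data Pump directory object straight at it; the mount point and directory name below are invented for illustration:
# Create a directory object on the mounted dev share (run as a DBA)
sqlplus -s "/ as sysdba" <<'EOF'
CREATE OR REPLACE DIRECTORY DEV_DUMP_DIR AS '/mnt/devserver/dumps';
GRANT READ, WRITE ON DIRECTORY DEV_DUMP_DIR TO system;
EOF
# Then run expdp with directory=DEV_DUMP_DIR and the dump lands on the dev server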
Another thing to note: as long as your DB and LDAP dumps are taken at the same time, it doesn't really matter how long the transfer and rehost to the development system take, since development and production systems are rarely in sync anyway as far as the data's timestamps are concerned.
One last item: are you using file vaulting in your system? If not, content gets stored in your DB as BLOBs (Binary Large Objects), which are likely a major component of the overall DB size. Configuring vaults and revaulting will move that content out of the DB and onto a disk of your choosing per your vault setup.
If you have any other questions or would like additional assistance, feel free to contact me at robert.sindelar@eccellent.com or go to my company's website - www.eccellent.com - for more information.
Thanks guys! I'll talk more with our DBA about your suggestions. We are using file vaults, but have not moved all the content out of BLOBs into the vaults.
Sandra