Hi,
does anyone know what the most efficient way would be to migrate data from an old version of ThingWorx to the latest release?
We are using PostgreSQL as the persistence provider, running in cluster and HA mode.
The DB is on a different machine than the Tomcat server running ThingWorx.
We tried to do an in-place upgrade and had some problems with it. The suggestion was to do a migration instead:
new DB, installation scripts etc.
export extensions and entities from the old platform
We have tried data export via Composer (Export to ThingWorx Storage), but that failed, and we don't have a lot of space on our application server.
I would like to use pgAdmin or psql to copy content from ValueStream, Stream and Datatable tables (perhaps wiki and blog as well).
Can anyone suggest some concrete SQLs to do that?
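Something along these lines is what I have in mind, using psql's \copy to move one table at a time. Not tested yet, and the table names are just my guess at the ThingWorx schema (I would check the real names with \dt first); host, user and database names are placeholders:

# dump the data of one table from the old DB to a CSV file on the machine running psql
psql -h old-db-host -U twadmin -d thingworx -c "\copy value_stream TO '/tmp/value_stream.csv' WITH (FORMAT csv)"

# load it into the new DB (the table must already exist there with the same structure)
psql -h new-db-host -U twadmin -d thingworx -c "\copy value_stream FROM '/tmp/value_stream.csv' WITH (FORMAT csv)"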
Thanks a lot.
Tomas
Hello Tomas,
Just as an alternative idea that might work if you don't get any better answer -- you can try mounting a remote filesystem inside your exports directory (using NFS, for example). This way your exported data gets uploaded "automatically".
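Something like this on the ThingWorx application server, assuming the default ThingworxStorage location and an NFS share already exported from a machine with enough disk space (host name and paths are just an example, adjust them to your installation):

# mount the remote share over the folder ThingWorx exports into
sudo mount -t nfs storage-host:/exports/twx-migration /ThingworxStorage/exports

After that, everything ThingWorx writes into its exports folder actually lands on the remote machine.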
/ Constantine
Hi,
what do you mean by uploaded "automatically"?
I still need to import the data from Storage using Composer to get it into the database, don't I?
And our PROD environment fails to export data under normal operation, even when I do one stream at a time.
This can be overcome if we cut off all incoming traffic (edge things) and stop all Timers and Schedulers, but that is something we cannot do, it takes too long. We can only work with a 1-2 hour window.
My thinking was along the following sequence:
WARNING: I seriously don't recommend doing this directly in PROD before trying it on a similarly sized test environment at least a couple of times (what we call a "dry run").
On a side note, there are some basic rules for doing major upgrades right:
Violating any of those rules increases the risk of failure. But since you are already in production, most of those items should be familiar to you (e.g. I guess you already have a preprod environment), and the list is actually not as scary as it may look.
Hope it helps.
/ Constantine
That is a very extensive and elaborate answer :)
And yes we had all of that.
Except for point 5.
Our tests were not that extensive, since we cannot properly simulate all the incoming data (KepServerEX does not support sending the same data to several ThingWorx instances).
And we found no major problems.
Moreover, we did an in-place upgrade, so there was no need for data transfer.
I talked to our customer; a network disk will not be possible :/
I found this for DB data transfer that might work:
pg_dump -U <Username> -h <host> -a -t <TableToCopy> <SourceDatabase> | psql -h <host> -p <portNumber> -U <Username> -W <TargetDatabase>
I would need to verify if someone has done it and what the results were.
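Filled in with example values it would look something like this (host names, port, user and database are placeholders, and I have not verified that the table names really match the ThingWorx schema):

# copy only the data (-a) of one table straight from the old DB into the new one
pg_dump -U twadmin -h old-db-host -a -t stream thingworx | psql -h new-db-host -p 5432 -U twadmin -W thingworx

Since -a dumps data only, the table already has to exist in the target DB with the same structure, so I assume this only works if the old and new ThingWorx versions use the same table layout.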