
Migration ValueStream with Persistence Provider InfluxDB

dbologna
11-Garnet


Hi All,

I must migrate an IoT application

FROM ThingWorx 8.4 with persistence provider PostgreSQL and an additional persistence provider InfluxDB (InfluxDB 1.7) TO ThingWorx 9.2 with persistence provider PostgreSQL and an additional persistence provider InfluxDB (InfluxDB 1.8.4). In the source system (ThingWorx 8.4 with the InfluxDB persistence provider, DB version 1.7), the data is stored in one very large ValueStream.

The standard migration process (export/import from ThingWorx) takes a long time, because I must split the data using DataTimeStart and DataTimeEnd in order to avoid problems on the source system.

 

Would it be possible to migrate the data from InfluxDB 1.7 to InfluxDB 1.8.4 directly, and somehow recreate the ValueStream on ThingWorx 9.2 so that it points to the data in the InfluxDB 1.8.4 database?

Any suggestion is appreciated

BR

Dimitri

 


PaiChung
22-Sapphire I
(To:dbologna)

I recommend directly contacting customer support for this.

I recommend the same. The way I did this in the past, via the supported route, was to create a script that looped through the existing data, exporting in batches and then importing in batches, as far as I remember. Also note that ThingWorx 9.3 is out now; if you're planning to migrate mashups from that solution, that version includes a built-in style-migration utility which makes it far easier to move mashups to the newer widgets while keeping their styles.
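The batch approach described above can be sketched as a small helper that splits a large time range into consecutive export/import windows. This is a hypothetical illustration, not part of any ThingWorx API; the function name and the one-day batch size are my own choices.

```javascript
// Split a [start, end) time range into consecutive windows of at most
// batchMs milliseconds each. Each window can then serve as the
// startDate/endDate of one export call (and the matching import).
// Hypothetical helper -- not a built-in ThingWorx service.
function buildBatches(startDate, endDate, batchMs) {
    var batches = [];
    var cursor = startDate.getTime();
    var end = endDate.getTime();
    while (cursor < end) {
        var next = Math.min(cursor + batchMs, end);
        batches.push({ start: new Date(cursor), end: new Date(next) });
        cursor = next;
    }
    return batches;
}

// Example: one week of data in one-day windows.
var batches = buildBatches(
    new Date("2021-01-01T00:00:00Z"),
    new Date("2021-01-08T00:00:00Z"),
    24 * 60 * 60 * 1000
);
```

Inside a ThingWorx service the loop body would issue the actual export and import calls for each window.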

Hi VladimirRosu, many thanks for your answer.
I have a question: what does "exporting in batches, then importing in batches" mean?

Is it possible to use the export/import services outside Composer? How?
BR
Dimitri

Hi @dbologna ,

It is possible, as with any ThingWorx service, even for operations that are not exposed as Resource/Thing services, as is the case for exporting data.

Open the browser Debug Console, switch to the Network tab, and see what request is issued when you export a Stream, for example.

Then, in your own service, you can use the ContentLoader snippets to issue a call to that endpoint, similar to the one below.

var ExportURL = me.serverURL + "/DataExporter/Streams/" + MyStreamName +
    "?Accept=" + encodeURIComponent("application/octet-stream") +
    "&endDate=" + encodeURIComponent(endDate) +
    "&path=" + encodeURIComponent("/") +
    "&repositoryName=" + encodeURIComponent(fileRepository) +
    "&startDate=" + encodeURIComponent(startDate) +
    "&universal=";
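The URL construction from the post can be checked outside ThingWorx as a standalone function. The endpoint path is taken from the post above; the server URL, stream name, repository name, and date strings are placeholder values of my own, and the trailing empty `universal` parameter is left as in the original.

```javascript
// Build the DataExporter export URL as a pure function, so the
// query-string encoding can be verified without a ThingWorx server.
// All argument values below are placeholders.
function buildExportUrl(serverURL, streamName, startDate, endDate, repositoryName) {
    return serverURL + "/DataExporter/Streams/" + streamName +
        "?Accept=" + encodeURIComponent("application/octet-stream") +
        "&endDate=" + encodeURIComponent(endDate) +
        "&path=" + encodeURIComponent("/") +
        "&repositoryName=" + encodeURIComponent(repositoryName) +
        "&startDate=" + encodeURIComponent(startDate) +
        "&universal=";
}

var url = buildExportUrl(
    "https://twx.example.com/Thingworx",
    "MyValueStream",
    "2021-01-01 00:00:00",
    "2021-01-02 00:00:00",
    "SystemRepository"
);
```

In a ThingWorx service you would pass the resulting URL to a ContentLoader snippet (the `Resources["ContentLoaderFunctions"]` resource) together with appropriate credentials or an application key.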

I just remembered something else. ThingWorx 8.4.x does not expose the Influx timeouts as configurable settings, and by default the client library uses 10-second timeouts for connect and read (and, if I remember correctly, write). This means that if you request data from Influx and Influx spends more than 10 seconds reading internally, you'll get a timeout error in ThingWorx, the connection will close, and you won't get the data.

To work around this, supply adequate start and end parameters to the DataExporter service, finding a workable interval by trial and error. The smaller the interval, the more likely Influx will retrieve the data and send it back in under 10 seconds.
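The trial-and-error interval search above can also be automated: if an export window times out, halve it and export the two halves. A minimal sketch, where `exportFn` stands in for the real ContentLoader call to the DataExporter endpoint:

```javascript
// Hypothetical retry wrapper: if exporting [startMs, endMs) fails (e.g.
// the Influx read exceeds the 10-second timeout), split the window in two
// and export each half recursively, down to a minimum window size.
// exportFn is a placeholder for the actual export call.
function exportWithFallback(startMs, endMs, exportFn, minWindowMs) {
    try {
        exportFn(startMs, endMs);
    } catch (e) {
        var span = endMs - startMs;
        if (span <= minWindowMs) {
            throw e; // window is already minimal; give up
        }
        var mid = startMs + Math.floor(span / 2);
        exportWithFallback(startMs, mid, exportFn, minWindowMs);
        exportWithFallback(mid, endMs, exportFn, minWindowMs);
    }
}
```

This converges on window sizes that Influx can serve within the timeout, without having to guess the right interval up front.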

 

I hope this makes sense.
