Postgres to InfluxDB Migration

pshashipreetham
17-Peridot

We have a situation where DataTable data (currently in Postgres) needs to be migrated to InfluxDB, so that going forward InfluxDB streams will be used instead of DataTables. Are there any best practices provided by PTC?

@VladimirN @AnnaAn @paic @slangley @CarlesColl @c_lowy 

Thanks,
Shashi. 

Shashi Preetham
1 ACCEPTED SOLUTION

So,

As @slangley mentioned, there is no particular out-of-the-box way. We need to write a service that adds each row of the DataTable to the stream (created with the Influx persistence provider).

Example Code:

 

// 'values' is assumed to be an InfoTable created earlier with the
// stream's data shape (e.g. via CreateInfoTableFromDataShape),
// and 'result' is the service's output InfoTable.
for (var i = 0; i < FilteredAlarmsBasedOnTank.rows.length; i++) {
	var row = FilteredAlarmsBasedOnTank.rows[i];
	var data = {
		Alarm: row.Alarm,
		ThingName: row.ThingName,
		Facility: Thing.Facility     // 'Thing' refers to the source Thing object
	};
	values.AddRow(data);             // stage this row for the stream entry
	result.AddRow(data);             // also return the row in the service output
	Things["TankAlarm"].AddStreamEntry({
		timestamp: row.Date,         // preserve the original row timestamp
		source: Thing.name,
		sourceType: "Thing",
		values: values
	});
	values.RemoveAllRows();          // clear so the next entry holds exactly one row
}
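Note that this processes the whole DataTable result in one pass; for a very large table (timeouts were hit later in this thread) the work would have to be batched across several service calls. A minimal sketch of the batching arithmetic in plain JavaScript, with illustrative names (`makeBatches` is not a ThingWorx service):

```javascript
// Hypothetical helper: split a total row count into [start, end) batches
// so each migration call stays under the service timeout.
function makeBatches(totalRows, batchSize) {
    var batches = [];
    for (var start = 0; start < totalRows; start += batchSize) {
        batches.push({ start: start, end: Math.min(start + batchSize, totalRows) });
    }
    return batches;
}

// e.g. 100 rows in batches of 30 -> 4 batches, the last holding 10 rows
var plan = makeBatches(100, 30);
```

Each batch could then be fetched and pushed with the loop above; how the rows are fetched per batch depends on your model and query.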

 


Thanks,

Shashi Preetham


12 REPLIES

Hi @pshashipreetham.

 

Have you reviewed the information available in the ThingWorx Help Center?

 

Regards.

 

--Sharon

 

 

Hi @slangley ,

I've gone through all the docs in the ThingWorx Help Center and found nothing.

Thanks,
Shashi

Shashi Preetham
slangley
23-Emerald II

Hi @pshashipreetham.

 

Can you confirm your intent?  Data tables cannot be migrated to Influx.  However, streams and value streams can be migrated to Influx.  Here's an article to assist with that process.

 

Regards.

 

--Sharon

 

Hi @slangley ,

The present Postgres value stream holds around 100 million rows of transaction data. We are now switching from Postgres to Influx to store the Things' property data, so the previous data needs to be migrated from Postgres to Influx without anything being missed.

Looking for the best migration method.

Thanks,
Shashi.

Shashi Preetham
slangley
23-Emerald II

Hi @pshashipreetham.

 

Did you test the method in the article previously provided?  Are you running into an issue?

 

Regards.

 

--Sharon

 

Hi @slangley ,

We get a timeout error with the article's method when exporting everything at once (possibly the Postgres DB configuration is undersized).

Thanks
Shashi.

Shashi Preetham

Hi @psp316r.

 

Have you tried breaking it into smaller chunks?  For example, you can break it by start and end date.

 

Regards.

 

--Sharon
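The date-range chunking suggested above could be sketched as follows, in plain JavaScript with illustrative names (`dateWindows` is hypothetical, not a ThingWorx or PTC service):

```javascript
// Sketch: split a [startMs, endMs) time span into fixed-width windows
// that can each be exported and imported separately.
function dateWindows(startMs, endMs, windowMs) {
    var windows = [];
    for (var t = startMs; t < endMs; t += windowMs) {
        windows.push({
            from: new Date(t),
            to: new Date(Math.min(t + windowMs, endMs)) // clamp the last window
        });
    }
    return windows;
}

// e.g. one calendar year split into 30-day windows
var oneDay = 24 * 60 * 60 * 1000;
var wins = dateWindows(Date.parse("2022-01-01T00:00:00Z"),
                       Date.parse("2023-01-01T00:00:00Z"),
                       30 * oneDay);
```

Each window's `from`/`to` pair would become the start and end date of one export run; the window width is what you would tune against the timeout.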

Hi @slangley ,

Of course, smaller chunks work, but there is around 75 GB of data. How many chunks would that take? And the production downtime would grow accordingly, which is not advisable.

Thanks,
Shashi.

Shashi Preetham

Hi @psp316r

 

Unfortunately, we don't have another option to recommend here.  You'll have to test to determine the optimum file size.  

 

You'll need to set up a test environment with a copy of the prod DB to determine the time required, and then schedule the appropriate downtime for prod. However, the downtime will essentially only be needed for the export. Once you have it configured to use InfluxDB, you can turn ThingWorx back on while you're doing the imports. Keep in mind, though, that reporting, etc. may be impacted until the imports are complete.

 

Regards.

 

--Sharon

slangley
23-Emerald II

Hi @psp316r.

 

Do you have further information to provide on this post?

 

Regards.

 

--Sharon


Hi, please can you explain this line:

Facility: Thing.Facility

Second question: why do you add to the stream and then remove all rows from the values InfoTable?

values.RemoveAllRows();

 
