We have a situation where DataTable data (currently in Postgres) needs to be migrated to InfluxDB, so that going forward Influx-backed streams will be used in place of the DataTables. Are there any best practices provided by PTC?
@VladimirN @AnnaAn @paic @slangley @CarlesColl @c_lowy
Thanks,
Shashi.
Hi @pshashipreetham.
Have you reviewed the information available in the ThingWorx Help Center?
Regards.
--Sharon
Hi @slangley,
I've gone through all the docs in the ThingWorx Help Center and found nothing!
Thanks,
Shashi
Hi @pshashipreetham.
Can you confirm your intent? Data tables cannot be migrated to Influx. However, streams and value streams can be migrated to Influx. Here's an article to assist with that process.
Regards.
--Sharon
Hi @slangley ,
Our current Postgres-backed value stream holds around 100 million rows of transaction data. We are now switching from Postgres to Influx to store the Things' property data, so the existing data needs to be migrated from Postgres to Influx without anything being missed.
Looking for the best migration method.
Thanks,
Shashi.
Hi @pshashipreetham.
Did you test the method in the article previously provided? Are you running into an issue?
Regards.
--Sharon
Hi @slangley ,
I get a timeout error when I try the article's method and export everything at once (maybe the Postgres DB configuration is undersized).
Thanks
Shashi.
Hi @psp316r.
Have you tried breaking it into smaller chunks? For example, you can break it by start and end date.
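For example, here is a minimal sketch that copies stream entries window by window instead of in one export. The stream names OldStream and NewStream, the TankAlarmDataShape data shape, the start date, and the 30-day window size are all placeholders to adjust for your system:
// Migrate stream entries in 30-day windows to avoid the single-export timeout.
var DAY_MS = 24 * 60 * 60 * 1000;
var chunkDays = 30; // tune until individual queries stop timing out
var chunkStart = new Date(2018, 0, 1); // oldest data to migrate (placeholder)
var migrationEnd = new Date();

// Holds one entry's fields at a time; the data shape name is a placeholder.
var values = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "values",
    dataShapeName: "TankAlarmDataShape"
});

while (chunkStart < migrationEnd) {
    var chunkEnd = new Date(chunkStart.getTime() + chunkDays * DAY_MS);
    var entries = Things["OldStream"].QueryStreamEntriesWithData({
        oldestFirst: true,
        maxItems: 1000000,
        startDate: chunkStart,
        endDate: chunkEnd
    });
    for (var i = 0; i < entries.rows.length; i++) {
        var entry = entries.rows[i];
        values.AddRow(entry); // fields outside the data shape are ignored
        Things["NewStream"].AddStreamEntry({
            timestamp: entry.timestamp, // keep the original entry time
            source: entry.source,
            sourceType: entry.sourceType,
            values: values
        });
        values.RemoveAllRows();
    }
    chunkStart = chunkEnd;
}
Smaller windows mean more queries, but each one stays under the timeout.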
Regards.
--Sharon
Hi @slangley ,
Of course, smaller chunks work, but there is around 75 GB of data. How many chunks would that take? And the production downtime will be longer, which is not advisable.
Thanks,
Shashi.
Hi @psp316r
Unfortunately, we don't have another option to recommend here. You'll have to test to determine the optimum file size.
You'll need to set up a test environment with a copy of the prod db for testing to determine the time required and then schedule the appropriate downtime for Prod. However, the downtime will essentially only be needed for the export. Once you have it configured to use influxdb, you can turn ThingWorx back on while you're doing the imports. Keep in mind, though, that reporting, etc. may be impacted until the imports are complete.
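If the data being imported belongs to a value stream, one possible sketch for the import side is to replay each exported chunk through the platform's UpdatePropertyValues service once the Thing is already backed by Influx; it logs values against their original timestamps. The Thing name TankThing, the property name TankLevel, and the exportedRows infotable below are placeholders:
// Replay one exported chunk into the Influx-backed value stream.
// NamedVTQ is the built-in data shape (name, time, value, quality).
var values = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "values",
    dataShapeName: "NamedVTQ"
});

for (var i = 0; i < exportedRows.rows.length; i++) {
    var row = exportedRows.rows[i];
    values.AddRow({
        name: "TankLevel",      // placeholder: the logged property name
        time: row.timestamp,    // original timestamp from the export
        value: row.TankLevel,   // placeholder: the exported value column
        quality: "GOOD"
    });
}

// Writes all rows to the Thing's value stream in one call.
Things["TankThing"].UpdatePropertyValues({ values: values });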
Regards.
--Sharon
Hi @psp316r.
Do you have further information to provide on this post?
Regards.
--Sharon
So,
There is no particular way, as @slangley mentioned. We need to write a service that adds each row of the DataTable to the stream (created using the Influx persistence provider).
Example Code:
// "values" holds the fields for one stream entry at a time.
// The data shape name below is an assumption -- use the shape of your Influx-backed stream.
var values = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "values",
    dataShapeName: "TankAlarmDataShape"
});

for (var i = 0; i < FilteredAlarmsBasedOnTank.rows.length; i++) {
    var row = FilteredAlarmsBasedOnTank.rows[i];
    var data = {
        Alarm: row.Alarm,
        ThingName: row.ThingName,
        Facility: Thing.Facility // note: inside a Thing service the current Thing is normally referenced as "me"
    };
    values.AddRow(data);
    result.AddRow(data); // "result" is the service's declared infotable output
    Things["TankAlarm"].AddStreamEntry({
        timestamp: row.Date, // preserve the original row's timestamp
        source: Thing.name,
        sourceType: "Thing",
        values: values
    });
    values.RemoveAllRows(); // clear so the next entry carries only its own row
}
Thanks,
Hi, can you please explain this line:
Facility: Thing.Facility
Second question: why do you add the entry to the stream and then remove all rows from the values infotable?
values.RemoveAllRows();