Hi,
We have a large amount of data stored in a Thingworx Value Stream, and we display it in a Grid widget bound to QueryPropertyHistory. The problem is that after searching the Value Stream data from the Mashup a few times, Thingworx processing slows down dramatically. We also see the following errors in the Tomcat logs:
catalina-xx-xx.log:
java.lang.IllegalStateException: Illegal access: this web application instance has been stopped already. Could not load [org.neo4j.kernel.InternalAbstractGraphDatabase]. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access.
at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForResourceLoading(WebappClassLoaderBase.java:1354)
at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForClassLoading(WebappClassLoaderBase.java:1340)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1205)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1166)
at ch.qos.logback.classic.spi.PackagingDataCalculator.loadClass(PackagingDataCalculator.java:207)
at ch.qos.logback.classic.spi.PackagingDataCalculator.bestEffortLoadClass(PackagingDataCalculator.java:232)
at ch.qos.logback.classic.spi.PackagingDataCalculator.computeBySTEP(PackagingDataCalculator.java:138)
at ch.qos.logback.classic.spi.PackagingDataCalculator.populateUncommonFrames(PackagingDataCalculator.java:113)
at ch.qos.logback.classic.spi.PackagingDataCalculator.populateFrames(PackagingDataCalculator.java:105)
at ch.qos.logback.classic.spi.PackagingDataCalculator.calculate(PackagingDataCalculator.java:57)
at ch.qos.logback.classic.spi.ThrowableProxy.calculatePackagingData(ThrowableProxy.java:147)
at ch.qos.logback.classic.spi.LoggingEvent.<init>(LoggingEvent.java:124)
at ch.qos.logback.classic.Logger.buildLoggingEventAndAppend(Logger.java:440)
at ch.qos.logback.classic.Logger.filterAndLog_0_Or3Plus(Logger.java:396)
at ch.qos.logback.classic.Logger.error(Logger.java:559)
at com.thingworx.logging.LogUtilities.logExceptionDetails(LogUtilities.java:146)
at com.thingworx.logging.LogUtilities.logExceptionDetails(LogUtilities.java:99)
at com.thingworx.system.subsystems.streamprocessing.queuing.StreamEntryProcessor$StreamQueueChecker.run(StreamEntryProcessor.java:230)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.runAndReset(Unknown Source)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(Unknown Source)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
tomcat8-stdout-xx-xx.log:
Found in 'org.owasp.esapi.resources' directory: E:\ThingworxStorage\esapi\validation.properties
Loaded 'validation.properties' properties file
Unable to load value stream entry: javax.transaction.SystemException: Kernel has encountered some problem, please perform neccesary action (tx recovery/restart)
Unable to load value stream entry: javax.transaction.SystemException: Kernel has encountered some problem, please perform neccesary action (tx recovery/restart)
Unable to load value stream entry: javax.transaction.SystemException: Kernel has encountered some problem, please perform neccesary action (tx recovery/restart)
Unable to load value stream entry: javax.transaction.SystemException: Kernel has encountered some problem, please perform neccesary action (tx recovery/restart)
java.lang.RuntimeException: javax.transaction.SystemException: Kernel has encountered some problem, please perform neccesary action (tx recovery/restart)
at org.neo4j.kernel.impl.transaction.TxManager.getTransactionState(TxManager.java:1005)
at org.neo4j.kernel.impl.persistence.PersistenceManager.getResource(PersistenceManager.java:245)
at org.neo4j.kernel.impl.persistence.PersistenceManager.currentKernelTransactionForReading(PersistenceManager.java:235)
at org.neo4j.kernel.impl.core.ThreadToStatementContextBridge.instance(ThreadToStatementContextBridge.java:55)
at org.neo4j.kernel.impl.core.NodeProxy.getProperty(NodeProxy.java:327)
at com.thingworx.persistence.neo4j.factories.data.NeoValueStreamEntryDataProvider.getValueStreamEntry(NeoValueStreamEntryDataProvider.java:116)
at com.thingworx.persistence.neo4j.factories.data.NeoValueStreamEntryDataProvider.queryEntries(NeoValueStreamEntryDataProvider.java:393)
at com.thingworx.persistence.common.ValueStreamEngine.queryEntries(ValueStreamEngine.java:135)
at com.thingworx.valuestreams.ValueStreamThing.queryStreamEntries(ValueStreamThing.java:154)
at com.thingworx.things.Thing.QueryNamedPropertyHistory(Thing.java:4941)
at sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.thingworx.common.processors.ReflectionProcessor.processService(ReflectionProcessor.java:118)
at com.thingworx.handlers.ReflectionServiceHandler.processService(ReflectionServiceHandler.java:28)
at com.thingworx.things.Thing.processServiceRequestDirect(Thing.java:4471)
at com.thingworx.things.Thing.processServiceRequest(Thing.java:4394)
at com.thingworx.dsl.engine.adapters.VirtualFunction.call(VirtualFunction.java:130)
at org.mozilla.javascript.optimizer.OptRuntime.call1(OptRuntime.java:32)
at org.mozilla.javascript.gen.queryFirstRowLastRowAvailabilityGrid_239._c_script_0(queryFirstRowLastRowAvailabilityGrid:68)
at org.mozilla.javascript.gen.queryFirstRowLastRowAvailabilityGrid_239.call(queryFirstRowLastRowAvailabilityGrid)
at org.mozilla.javascript.ContextFactory.doTopCall(ContextFactory.java:394)
at org.mozilla.javascript.ScriptRuntime.doTopCall(ScriptRuntime.java:3090)
at org.mozilla.javascript.gen.queryFirstRowLastRowAvailabilityGrid_239.call(queryFirstRowLastRowAvailabilityGrid)
at org.mozilla.javascript.gen.queryFirstRowLastRowAvailabilityGrid_239.exec(queryFirstRowLastRowAvailabilityGrid)
at com.thingworx.dsl.engine.DSLProcessor.executeService(DSLProcessor.java:154)
at com.thingworx.dsl.DSLServiceHandler.processService(DSLServiceHandler.java:38)
at com.thingworx.things.Thing.processServiceRequestDirect(Thing.java:4471)
at com.thingworx.things.Thing.processAPIServiceRequest(Thing.java:4406)
at com.thingworx.webservices.BaseService.handleInvoke(BaseService.java:2459)
at com.thingworx.webservices.BaseService.service(BaseService.java:307)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.thingworx.security.contenttype.ContentTypeFilter.doFilter(ContentTypeFilter.java:75)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.thingworx.security.filter.ValidationFilter.doFilter(ValidationFilter.java:20)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at com.thingworx.security.authentication.AuthenticationFilter.propagateRequest(AuthenticationFilter.java:335)
at com.thingworx.security.authentication.AuthenticationFilter.doFilter(AuthenticationFilter.java:145)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
I would also like to know whether we should use the PostgreSQL version of Thingworx in this case (storing a large amount of history data)?
You should probably submit this at support.ptc.com.
You will also need to define what 'Large Data' means, as well as what hardware and OS you are running on and what your JVM settings are.
Hi Anna,
were you able to resolve the problem? Does it happen when the Grid widget has to show too much data, or does it already occur at the "QueryPropertyHistory" call?
Thank you in advance for your answer.
QueryPropertyHistory Service with Row Control/Limiting
Dear Anna,
Please take a look at this thread; it may be useful in resolving your 'Large Data' issue.
Dear Nash,
Thanks for sharing the code you use when large or frequently logged Value Stream history data is displayed in a Chart.
I read the code, and it does address the case where a customer wants to display time-series data in a Chart at runtime with a good user experience. I will keep it for future reference, as I believe other customers will face similar issues. Thanks again for sharing.
I would also like to share the use case of the customer for whom I posted this topic.
1. The customer logs about 86 properties to one Value Stream, and at that time they had logged about 12,500 records, so the data volume is relatively large.
2. The customer wants to display the data in a Grid (not a Chart in this case). They don't want to limit startDate and endDate; they only search from the mashup on a logged property called "lot", and the query should return all records matching the searched "lot" value.
3. The customer also wants the Grid to show every record matching the search, regardless of whether the logged value changed. In their case the "lot" property is not logged on "Value Change" but on "Always", so even if the old value is 4 and the newly pushed value is also 4, it is logged again. This is a specific requirement on the customer side.
In this scenario, searching all the data in the Value Stream and displaying it in a Grid at runtime becomes very slow. As we understand it, QueryPropertyHistory first applies the first-level search conditions such as startDate, endDate, and oldestFirst; if we don't define a time period, it fetches everything into memory and only then applies the second-level filtering from our query condition. So even if the final result set is small, all the data is still fetched, which costs a lot of time. Besides, QueryStringPropertyHistory and QueryPropertyHistory can return different results for the same property: QueryPropertyHistory records all the logged property changes, so there may be duplicated records for a property that did not actually change during that time, while QueryStringPropertyHistory only records the property's real logged values, with no redundancy. So as a workaround, the customer drafted a service instead of using QueryPropertyHistory directly (quite similar to your case here):
1. Call QueryStringPropertyHistory to get the startDate and endDate.
2. With the startDate and endDate obtained in step 1, configure the params and call QueryPropertyHistory.
Below is the code we used, for your reference:
// "lot" is a STRING input parameter of this service.
// Step 1: query only the "lot" property history to find the time window of the matching records.
var queryByLot = {
    "filters": {
        "fieldName": "value",
        "type": "EQ",
        "value": lot
    }
};
var params = {
    oldestFirst: undefined /* BOOLEAN */,
    maxItems: undefined /* NUMBER */,
    propertyName: "lot" /* STRING */,
    endDate: undefined /* DATETIME */,
    query: queryByLot /* QUERY */,
    startDate: undefined /* DATETIME */
};
var startDate;
var endDate;
// result: INFOTABLE dataShape: "StringValueStream"
var myInfoTable = me.QueryStringPropertyHistory(params);
var rowCount = myInfoTable.getRowCount();
logger.warn(myInfoTable.rows[0].timestamp);
logger.warn(myInfoTable.rows[rowCount - 1].timestamp);
// With oldestFirst left unset the rows come back newest first,
// so the last row is the oldest match and the first row is the newest.
startDate = myInfoTable.rows[rowCount - 1].timestamp;
endDate = myInfoTable.rows[0].timestamp;
// Step 2: query the full property history, but only within the time window found in step 1.
var historyParams = {
    oldestFirst: undefined /* BOOLEAN */,
    maxItems: undefined /* NUMBER */,
    endDate: endDate /* DATETIME */,
    query: { "filters": { "fieldName": "lot", "type": "EQ", "value": lot } } /* QUERY */,
    startDate: startDate /* DATETIME */
};
// result: INFOTABLE dataShape: "undefined"
var result = me.QueryPropertyHistory(historyParams);
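One caveat about this workaround: if QueryStringPropertyHistory returns no rows for the searched lot, myInfoTable.rows[0] is undefined and the service fails. A minimal guard, placed right after rowCount is computed, could look like this (a hypothetical addition for illustration, not part of the original service):
// Hypothetical guard, not in the original service: stop early when the lot filter
// matches nothing, instead of failing on myInfoTable.rows[0] being undefined.
if (rowCount === 0) {
    logger.warn("No Value Stream entries found for lot " + lot);
    throw "No Value Stream entries found for lot " + lot;
}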
Thanks,
Br,
Anna
Hi Anna.
You can set maxItems to a big number, e.g. 99999999, to show all data.
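For reference, a minimal sketch of that suggestion, reusing the "lot" filter from the workaround above; the parameter values here are illustrative only:
// Minimal sketch of the maxItems suggestion (values are illustrative only).
// QueryPropertyHistory caps the number of returned rows at maxItems,
// so a large value keeps matching records from being cut off in the Grid.
var params = {
    oldestFirst: false /* BOOLEAN */,
    maxItems: 99999999 /* NUMBER */,
    startDate: undefined /* DATETIME */,
    endDate: undefined /* DATETIME */,
    query: { "filters": { "fieldName": "lot", "type": "EQ", "value": lot } } /* QUERY */
};
var result = me.QueryPropertyHistory(params);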