
KepServerEx on a Windows Cluster

CM_9924263
1-Newbie


In my current scenario I have two OPC servers (KEPServerEX 6.9) collecting data through the Modbus driver and making this data available over OPC UA and DA.
In general, the client applications have no redundancy capability of their own, so only my OPC01 server provides data to this type of application.
By contrast, OSIsoft PI already includes a mechanism that allows it to switch between data sources (OPC01 or OPC02).
From the video https://www.youtube.com/watch?v=vaoKidtZ82w I gathered that it should be possible to create a Windows cluster to solve this problem and provide redundancy between the servers.
However, I could not find any documentation or guide on the Kepware website covering this type of implementation.
Is there any? Could you point me to it?

PS: I'm familiar with the "Redundancy Manager" solution, but I'd rather point the clients at a cluster than purchase additional licenses for each of the client applications.
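
For context, here is a rough sketch of the switching logic each client would otherwise need to carry itself, written with the python-opcua package; the endpoint URLs, the port, and the tag NodeId below are placeholders for my environment, not anything provided by Kepware:

    # Illustrative client-side failover between two KEPServerEX OPC UA endpoints.
    # Hostnames, port and NodeId are placeholders; adjust to the actual server project.
    from opcua import Client

    ENDPOINTS = [
        "opc.tcp://OPC01:49320",   # primary KEPServerEX
        "opc.tcp://OPC02:49320",   # secondary KEPServerEX
    ]
    TAG = "ns=2;s=Channel1.Device1.Tag1"  # example Modbus tag exposed by the project

    def read_tag():
        """Try each server in order and return the first successful read."""
        last_error = None
        for url in ENDPOINTS:
            client = Client(url)
            try:
                client.connect()
                try:
                    return url, client.get_node(TAG).get_value()
                finally:
                    client.disconnect()
            except Exception as exc:  # connection refused, timeout, etc.
                last_error = exc
        raise RuntimeError(f"No OPC UA server reachable: {last_error}")

    if __name__ == "__main__":
        source, value = read_tag()
        print(f"{TAG} = {value} (from {source})")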

1 REPLY

@CM_9924263


The failover would be handled in the Windows Admin Center. There are no changes that need to be made in the KEPServerEX applications to run in a clustered environment, so there is no additional documentation at this time. The biggest hurdle is licensing, since each computer in the cluster will need its own license.
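
As a rough sketch of what that means for the clients: assuming the cluster role exposes KEPServerEX behind a single clustered network name (the name "KEPCLUSTER" below is just a placeholder), client code simply targets that name instead of OPC01/OPC02 and needs no failover logic of its own, for example with the python-opcua package:

    # Minimal sketch: the client connects to the cluster name; the cluster moves the
    # role between nodes, so the same endpoint keeps working after a failover.
    from opcua import Client

    client = Client("opc.tcp://KEPCLUSTER:49320")  # cluster network name, placeholder
    client.connect()
    try:
        print(client.get_node("ns=2;s=Channel1.Device1.Tag1").get_value())
    finally:
        client.disconnect()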

Here is a link to the Kepware knowledge base article that covers licensing in a clustered environment:

https://www.ptc.com/en/support/article/CS286615


Thanks,

*Chris
