7-Bedrock
June 29, 2024
Question
Activation function in PMML does not match selection during training
When checking the exported PMML file of a model trained with just 1 neural net learner, configured to use ReLU, I'm noticing that all activation functions (apart from the output) are declared as logistic.
How can this be? Maybe I'm interpreting the PMML file wrong, but I've checked the whole file and there's no mention of ReLU. I've run a lot of tests and I can't get that parameter to change in the PMML file.
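One way to audit the export instead of eyeballing the raw XML is to parse the PMML and list every `activationFunction` attribute it declares. The sketch below is illustrative, not your actual file: the `SAMPLE_PMML` string is a hypothetical minimal PMML fragment reproducing the symptom you describe (hidden layers declared as `logistic`), and element names follow the PMML `NeuralNetwork` schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal PMML fragment illustrating the reported symptom:
# hidden NeuralLayer elements declare "logistic" despite ReLU being
# selected during training. Replace with the contents of your exported file.
SAMPLE_PMML = """<PMML xmlns="http://www.dmg.org/PMML-4_2" version="4.2">
  <NeuralNetwork functionName="classification" activationFunction="logistic">
    <NeuralLayer activationFunction="logistic"/>
    <NeuralLayer activationFunction="logistic"/>
    <NeuralLayer activationFunction="identity"/>
  </NeuralNetwork>
</PMML>"""

def list_activation_functions(pmml_text):
    """Return (element name, activation) for every element that
    carries an activationFunction attribute."""
    root = ET.fromstring(pmml_text)
    found = []
    for elem in root.iter():
        fn = elem.get("activationFunction")
        if fn is not None:
            # Strip the namespace prefix from the tag for readability.
            found.append((elem.tag.split("}")[-1], fn))
    return found

print(list_activation_functions(SAMPLE_PMML))
```

Running this against your real export would make it unambiguous which activation each layer actually declares. Note that older PMML versions (before 4.3) do not define a rectifier activation at all, which could explain a silent fallback to `logistic` in the exporter.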

