When using leave-one-out cross-validation with a neural network, do I have to fix the number of epochs for each trained model?
The test results of these models are averaged to report performance. So can I pick the best result for each model (so the epoch counts will differ), or do I have to fix the epoch count for each model (the epoch counts will be the same, but the results will not all be the best possible)?
I've never seen any constraint on the number of epochs, and I don't think fixing it makes much sense. You can instead set a desired training error e and a maximum number of epochs n: if n is reached before e is achieved, stop training and start testing, and report the lowest error reached as the training error of that model. (Note: training with 9 folds and testing on the remaining one describes 10-fold cross-validation; in leave-one-out, each test fold is a single sample and you train on all the rest.)
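A minimal sketch of this stopping rule inside a leave-one-out loop, using a toy linear model trained by gradient descent instead of a full neural network (the names `train_model`, `target_err`, and `max_epochs` are my own, not from any particular library). The stopping criterion is the same for every fold, but the number of epochs actually used can differ per fold:

```python
import numpy as np

def train_model(X, y, target_err=1e-4, max_epochs=500, lr=0.1):
    """Gradient descent on MSE; stop when training error <= target_err
    or after max_epochs, whichever comes first."""
    w = np.zeros(X.shape[1])
    err = np.inf
    epoch = 0
    for epoch in range(1, max_epochs + 1):
        pred = X @ w
        err = np.mean((pred - y) ** 2)
        if err <= target_err:
            break
        # gradient of MSE w.r.t. w
        w -= lr * 2.0 * X.T @ (pred - y) / len(y)
    return w, err, epoch

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))
y = X @ np.array([1.5, -0.5]) + 0.01 * rng.normal(size=10)

test_errors, epochs_used = [], []
for i in range(len(X)):
    # leave-one-out: sample i is the test set, all others are training data
    mask = np.arange(len(X)) != i
    w, train_err, n_ep = train_model(X[mask], y[mask])
    test_errors.append(float((X[i] @ w - y[i]) ** 2))
    epochs_used.append(n_ep)

# performance is the average of the per-fold test errors
print(len(test_errors), np.mean(test_errors))
```

The epoch counts in `epochs_used` need not be equal across folds; what stays fixed is the stopping *criterion* (target error plus epoch cap), which keeps the procedure comparable across folds without hand-picking the best epoch per model.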
I think you might be confusing training and test error. Training error is computed iteratively while the parameters are changing; test error is computed with fixed parameters, so there is no epoch number associated with the test error.
Either way, always state clearly how you did the training and testing when you report results!