Steve Ward’s tips on preventing over-optimization

Any time you use optimization, you can indeed over-optimize. That means the system works great on the period you optimized over, but falls apart afterward. The ways to prevent that include:

a. Don’t use too many of what statisticians call “free variables”. The more parameters and rules you have, the more chance there is that whatever the history looks like, it can be overfit. That is why overfitting is worse with full optimization. I rarely use it myself; I prefer to use rules I think make sense and then just optimize the parameters.
b. If you don’t optimize over very much historical data, overfitting is again a real possibility. Make sure you have plenty of data to optimize on. This is a balancing act with daily data, because you also don’t want to go so far back in time that the markets were acting very differently from the way they act today!
c. Consider stopping optimization before it goes too far, or limiting the amount of time optimization runs (you may need to turn on some options to see where to do this).
d. Consider limiting the range of the variables being optimized. Our defaults are not always the best, and your knowledge of trading should be used to set them. (The range of variables can be edited by clicking on the plus sign (+) just to the left of the rule in the Trading Strategy or to the left of the input in a neural net when optimization is turned on.)
e. Consider forcing variables in the long rules to optimize to the same values as the corresponding variables in the short rules. Forcing symmetry like this is usually more appealing to traders anyway, and although it makes the results less spectacular, they may be less overfit. (You will use the plus sign to do this too.)
f. Consider optimizing over all chart pages. This usually results in poorer individual results, but they will probably not be over-optimized. For this to work, make sure your variables are well normalized – don’t use raw prices or raw moving averages. Use percent changes, percent spreads, etc. (see the sketch after this list).
g. Last but not least, use the Paper trading feature in rel 5.0, which was designed to prevent over-optimization.
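
As an illustration of the normalization point in (f), here is a minimal, generic Python/pandas sketch (not NeuroShell syntax; the indicator lengths and the 1% threshold are purely illustrative assumptions) showing how percent changes and percent spreads make a rule’s inputs comparable across chart pages with very different price scales:

```python
import pandas as pd

def percent_change(series: pd.Series, bars: int = 10) -> pd.Series:
    """Percent change over the last `bars` bars - scale-free, unlike a raw price."""
    return series.pct_change(periods=bars) * 100

def percent_spread(fast: pd.Series, slow: pd.Series) -> pd.Series:
    """Spread between two moving averages, expressed as a percent of the slower one."""
    return (fast - slow) / slow * 100

# Example rule built on normalized inputs rather than raw prices:
# prices  = pd.Series(...)                    # closes for any symbol, any price scale
# fast_ma = prices.rolling(10).mean()
# slow_ma = prices.rolling(40).mean()
# long_signal = percent_spread(fast_ma, slow_ma) > 1.0  # threshold is in percent,
#                                                       # so it transfers across markets
```

Because the spread is expressed in percent, the same optimized threshold can apply to a $5 stock and a $500 stock, which is what lets one set of parameters work across chart pages.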

Neural nets can overfit even faster than rules because they have more free variables in them. To prevent neural net overfitting:

a. Keep the number of inputs low – I prefer no more than 5.
b. Make sure training sets are large – small sets promote overfitting. Again you have the balancing act with daily data, which is why I prefer intraday data, even if I’m not “daytrading”. Also try to get an equal amount of “bull” and “bear” in your training and optimization period, so the net sees both types of moves.
c. Consider stopping optimization before it goes too far, or limiting the amount of time optimization runs (you may need to turn on some options to see where to do this).
d. Consider limiting the range of the variables being optimized. Our defaults are not always the best, and your knowledge of trading should be used to set them. (The + again.)
e. Don’t use too many hidden neurons, which increases the number of free variables (see the sketch after this list).
f. Use the Paper trading feature in rel 5.0, which was designed to prevent over-optimization.
g. Consider optimizing over all chart pages. This usually results in poorer individual results, but they will probably not be over-optimized. Make sure your variables are well normalized – don’t use raw prices or raw moving averages. Use percent changes and percent spreads, etc.
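
To make points (b), (c) and (e) concrete, here is a small, generic scikit-learn sketch (again, not the product’s own trainer; the data is random and the specific settings are illustrative assumptions) showing few inputs, few hidden neurons, and early stopping on a held-out validation slice:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Random data stands in for a large set of well-normalized inputs (e.g. percent changes)
# and a target; in practice you would use your own indicator values and outputs.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 5))                       # no more than ~5 inputs (point a)
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(1000)

net = MLPRegressor(
    hidden_layer_sizes=(3,),    # few hidden neurons -> fewer free variables (point e)
    early_stopping=True,        # stop training when a held-out slice stops improving (point c)
    validation_fraction=0.2,
    max_iter=500,
    random_state=0,
)
net.fit(X, y)
```

The held-out validation slice here plays roughly the same role as stopping optimization early: training halts once out-of-sample performance stops improving rather than when the training fit is perfect.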

In addition to the ideas above, my personal belief is that “out-of-sample periods” (the periods after optimization, i.e. the paper trading periods and evaluation periods) should not be very long. It is unreasonable to expect any model to work well for very long without reoptimizing, because the market changes and adapts. I rarely use more than 3 to 6 months with daily data, and much less with intraday bars.
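
A minimal sketch of that re-optimization cycle, assuming hypothetical optimize() and evaluate() callbacks that stand in for whatever fitting and paper-trading steps you actually use:

```python
import pandas as pd

def walk_forward(data: pd.DataFrame, train_bars: int, oos_bars: int, optimize, evaluate):
    """Re-optimize on a rolling window, then trade only a short out-of-sample slice.

    `optimize` and `evaluate` are hypothetical callbacks: `optimize` fits the strategy's
    parameters on the training slice, `evaluate` paper-trades them on the next slice.
    """
    results = []
    start = 0
    while start + train_bars + oos_bars <= len(data):
        train = data.iloc[start : start + train_bars]
        oos = data.iloc[start + train_bars : start + train_bars + oos_bars]
        params = optimize(train)
        results.append(evaluate(params, oos))
        start += oos_bars          # roll forward and re-optimize for the next period
    return results

# With daily bars, an oos_bars of roughly 63-126 trading days corresponds to the
# 3 to 6 month out-of-sample periods preferred above.
```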

It is also a personal belief that stops should not be optimized. I let the optimizer build good models that do not rely on stops as a crutch. Then I apply stops myself as insurance when I trade – I know my pain limit and I don’t want the optimizer to change that.
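
A minimal sketch of applying such a trader-chosen stop outside the optimizer (the 2% default is purely illustrative; the point is that the number comes from your own pain limit, not from the fitting process):

```python
def apply_fixed_stop(entry_price: float, closes, stop_pct: float = 2.0):
    """Exit a long position when price closes more than `stop_pct` percent below entry.

    The stop level is chosen by the trader, not by the optimizer.
    """
    stop_level = entry_price * (1 - stop_pct / 100)
    for bar, close in enumerate(closes):
        if close <= stop_level:
            return bar, close      # bar index and price at which the stop fires
    return None                    # stop never hit; the model's own signal handles the exit
```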
