WHAT MORE IS NECESSARY FOR A GOOD NEURAL NET DESIGN?
x = input;
t = target;
1. [I, N] = size(x)
   [O, N] = size(t)
2. The default (AUTOMATIC & RANDOM) data-division
procedure is
a. Train (70%)
b. Validate (15%): occurs during training and stops
training if validation error increases for
6 (the default) consecutive epochs.
c. Test (15%)
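The division settings above can be sketched as follows. This assumes fitnet (patternnet behaves the same); the values shown are the defaults, written out explicitly for clarity.

```matlab
% Default data-division settings, made explicit (fitnet assumed).
net = fitnet(10);                        % one hidden layer, H = 10 (default)
net.divideFcn              = 'dividerand'; % automatic random division (default)
net.divideParam.trainRatio = 0.70;       % 70% training
net.divideParam.valRatio   = 0.15;       % 15% validation (early stopping)
net.divideParam.testRatio  = 0.15;       % 15% test
net.trainParam.max_fail    = 6;          % stop after 6 consecutive validation-error increases
```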
3. The default normalization (0 mean/unit variance) is sufficient.
4. a. The default configuration, which is usually
SUFFICIENT, contains only ONE hidden layer with
H = 10 nodes and (I+1)*H+(H+1)*O randomly
initialized weights.
b. If H is sufficiently large, ANY "reasonable"
input/output transformation can be approximated.
c. However, if H is too large, the phenomenon of
overfitting occurs and special steps have to be taken.
d. My approach is to find the minimum value of H that
yields an acceptable result.
e. Typically my training goal is
mse(target-output) <= 0.01*var(target,1)
NOTE: var(target,1) is the mean-squared error of the
naive constant model output = mean(target)
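The training goal in 4e can be set directly. This is a minimal sketch; t is the target matrix (O x N) from step 1, and the transpose/mean handles the multi-output case.

```matlab
% MSE of the naive constant model output = mean(t), averaged over outputs.
MSE00 = mean(var(t', 1));            % biased (1/N) variance, per Greg's convention
net = fitnet(10);
net.trainParam.goal = 0.01 * MSE00;  % stop when ~99% of target variance is modeled
```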
5. Weights are automatically initialized randomly. So,
typically, all you have to do is
a. Start using the MATLAB sample code with the default
H = 10
b. Use a loop to design a number of nets (e.g., 10 or
more) and keep the best result over the random
initial weights.
c. Search for the smallest value of H that will yield a
satisfactory solution.
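Steps 5a-5c can be sketched as a double loop: for each candidate H, train several nets from different random initial weights and keep the best; stop at the smallest H that meets the goal. Variable names here (x, t, Hmax, Ntrials) are illustrative.

```matlab
% Sketch of the H search in steps 5a-5c (fitnet assumed).
[I, N] = size(x);
[O, N] = size(t);
MSE00    = mean(var(t', 1));           % naive-model reference MSE
Ntrials  = 10;                         % random-initialization trials per H
bestNMSE = Inf;
for H = 1:Hmax                         % search for the smallest adequate H
    for k = 1:Ntrials
        net = fitnet(H);
        net = configure(net, x, t);    % re-randomize weights for each trial
        [net, tr] = train(net, x, t);
        y = net(x);
        NMSE = perform(net, t, y) / MSE00;  % normalized MSE
        if NMSE < bestNMSE
            bestNMSE = NMSE; bestnet = net; bestH = H;
        end
    end
    if bestNMSE <= 0.01, break, end    % goal from 4e: NMSE <= 0.01
end
```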
6. Any questions?