TrainingUpdateSchedule

TrainingUpdateSchedule is an option for NetTrain that specifies which arrays of the network can be updated at each step of the optimization process.

Details

  • With the default setting TrainingUpdateSchedule->Automatic, all arrays are updated at every step of the optimization process.
  • TrainingUpdateSchedule->{group1,group2,…,groupn} specifies which arrays are updated at each successive optimization step, with the schedule repeated cyclically until the end of training.
  • In TrainingUpdateSchedule->{group1,group2,…,groupn}, each of the groupi can have any of the following forms (a short usage sketch follows these details):
  • "layer"all arrays of a named layer or subnetwork
    nall arrays of the nth layer
    m;;nall arrays of layers m through n
    {layer,"array"}a particular array in a layer or subnetwork
    {part1,part2,}
  • arrays of a nested layer or subnetwork
  • spec1|spec2|any of the specified arrays
    _all arrays in the network
    specsrepeat the same specification for s consecutive steps
  • The hierarchical specification {part1,part2,…} used by TrainingUpdateSchedule to refer to a subpart of a net is equivalent to that used by NetExtract and NetReplacePart.
  • Specifications of a subnet (e.g. a nested NetChain or NetGraph) apply to all layers and arrays within that subnet.
  • Any group of parameters not specified in TrainingUpdateSchedule is held constant during training.
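
For example, the repeat and span forms can be combined in a single schedule. This is a minimal sketch only; the net, the data, and the choice of layer indices are assumptions made for illustration:

    (* assumes net and data are already defined; layer 1 is updated for two
       consecutive steps, then layers 2 through 3 for one step, cyclically *)
    NetTrain[net, data, TrainingUpdateSchedule -> {1 -> 2, 2 ;; 3}]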

Examples


Basic Examples  (1)

Train a NetGANOperator by alternating updates of the discriminator and updates of the generator:
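
A minimal sketch, assuming generator and discriminator nets and suitable trainingData have already been defined:

    (* one discriminator update alternates with one generator update; the names
       "Discriminator" and "Generator" refer to the two parts of the NetGANOperator *)
    gan = NetGANOperator[{generator, discriminator}];
    trained = NetTrain[gan, trainingData,
       TrainingUpdateSchedule -> {"Discriminator", "Generator"}]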

Scope  (2)

Create and initialize a net with three layers:
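
A minimal sketch; the layer sizes are illustrative:

    net = NetInitialize@NetChain[
        {LinearLayer[8], LinearLayer[8], LinearLayer[1]}, "Input" -> 2]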

Train this net by alternately updating the first and third layers, collecting the net arrays after each batch update:
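
A sketch with synthetic data; the data, the training length and the one-batch collection interval are assumptions:

    data = Table[RandomReal[1, 2] -> RandomReal[1, 1], 256];
    collected = {};
    trained = NetTrain[net, data,
       TrainingUpdateSchedule -> {1, 3},
       TrainingProgressFunction -> {
          (* record the weight arrays of layers 1 and 3 after every batch *)
          AppendTo[collected,
            {NetExtract[#Net, {1, "Weights"}], NetExtract[#Net, {3, "Weights"}]}] &,
          "Interval" -> Quantity[1, "Batches"]},
       MaxTrainingRounds -> 5];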

Show the evolution of the first array value through iterations and check that the arrays are alternately updated:
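
One way to visualize this from the values collected above (a sketch):

    (* first entry of each recorded weight matrix, plotted per batch update *)
    ListLinePlot[
      {Normal[#][[1, 1]] & /@ collected[[All, 1]],
       Normal[#][[1, 1]] & /@ collected[[All, 2]]},
      PlotLegends -> {"layer 1", "layer 3"},
      AxesLabel -> {"iteration", "weight value"}]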

Create and initialize a NetGraph with named layers:
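
A minimal sketch; the layer names "hidden", "relu" and "output" are hypothetical:

    graph = NetInitialize@NetGraph[
        <|"hidden" -> LinearLayer[8], "relu" -> ElementwiseLayer[Ramp],
          "output" -> LinearLayer[1]|>,
        {"hidden" -> "relu" -> "output"}, "Input" -> 2]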

Train the net, updating one subpart of the NetGraph 10 times more often than another:
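
A sketch using the spec->s repeat form from the Details section; per-batch collection of the weights is included so the update pattern can be inspected afterward:

    (* "hidden" is updated for 10 consecutive steps, then "output" for one step *)
    graphWeights = {};
    trainedGraph = NetTrain[graph, data,
       TrainingUpdateSchedule -> {"hidden" -> 10, "output"},
       TrainingProgressFunction -> {
          AppendTo[graphWeights,
            {NetExtract[#Net, {"hidden", "Weights"}],
             NetExtract[#Net, {"output", "Weights"}]}] &,
          "Interval" -> Quantity[1, "Batches"]},
       MaxTrainingRounds -> 5];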

Check how the arrays are updated:
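
For instance, count how many batch steps actually changed each collected weight matrix (a sketch based on the values gathered above):

    countChanges[mats_] :=
      Count[Partition[Normal /@ mats, 2, 1], {a_, b_} /; a =!= b];
    {countChanges[graphWeights[[All, 1]]], countChanges[graphWeights[[All, 2]]]}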

Train the same NetGraph by updating arrays separately, one by one:
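
A sketch using the {layer,"array"} form, assuming each linear layer exposes "Weights" and "Biases" arrays:

    (* each optimization step updates a single array *)
    NetTrain[graph, data,
      TrainingUpdateSchedule -> {
        {"hidden", "Weights"}, {"hidden", "Biases"},
        {"output", "Weights"}, {"output", "Biases"}},
      MaxTrainingRounds -> 5]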

Possible Issues  (1)

When a shared array occurs in several places in the network, a single training update schedule applies to all occurrences of the shared array.

Create a network with shared arrays:
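
A minimal sketch using NetSharedArray; the array names "w" and "b" and the layer sizes are illustrative:

    sharedNet = NetInitialize@NetChain[{
         LinearLayer[2, "Weights" -> NetSharedArray["w"], "Biases" -> NetSharedArray["b"]],
         ElementwiseLayer[Ramp],
         LinearLayer[2, "Weights" -> NetSharedArray["w"], "Biases" -> NetSharedArray["b"]]},
        "Input" -> 2]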

Train with a TrainingUpdateSchedule specifying that only the first layer should be updated, collecting the weights of the third layer after each update:
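
A sketch with synthetic data; the data and the per-batch collection interval are assumptions:

    sharedData = Table[RandomReal[1, 2] -> RandomReal[1, 2], 256];
    thirdWeights = {};
    trainedShared = NetTrain[sharedNet, sharedData,
       TrainingUpdateSchedule -> {1},
       TrainingProgressFunction -> {
          AppendTo[thirdWeights, NetExtract[#Net, {3, "Weights"}]] &,
          "Interval" -> Quantity[1, "Batches"]},
       MaxTrainingRounds -> 5];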

The shared weights are updated at each epoch:
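
This can be checked on the values collected above; more than one distinct value indicates that the third layer's shared weights were updated even though it was excluded from the schedule (a sketch):

    Length[DeleteDuplicates[Normal /@ thirdWeights]] > 1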

The results are the same as training without any update schedule:

Text

Wolfram Research (2020), TrainingUpdateSchedule, Wolfram Language function, https://reference.wolfram.com/language/ref/TrainingUpdateSchedule.html.

CMS

Wolfram Language. 2020. "TrainingUpdateSchedule." Wolfram Language & System Documentation Center. Wolfram Research. https://reference.wolfram.com/language/ref/TrainingUpdateSchedule.html.

APA

Wolfram Language. (2020). TrainingUpdateSchedule. Wolfram Language & System Documentation Center. Retrieved from https://reference.wolfram.com/language/ref/TrainingUpdateSchedule.html

BibTeX

@misc{reference.wolfram_2024_trainingupdateschedule, author="Wolfram Research", title="{TrainingUpdateSchedule}", year="2020", howpublished="\url{https://reference.wolfram.com/language/ref/TrainingUpdateSchedule.html}", note="[Accessed: 21-November-2024]"}

BibLaTeX

@online{reference.wolfram_2024_trainingupdateschedule, organization={Wolfram Research}, title={TrainingUpdateSchedule}, year={2020}, url={https://reference.wolfram.com/language/ref/TrainingUpdateSchedule.html}, note={[Accessed: 21-November-2024]}}