- Setting BatchSize->n specifies that n examples should be processed together.
- The default setting BatchSize->Automatic chooses the batch size automatically, based on factors such as the available GPU or system memory.
- BatchSize can be specified when evaluating a net by writing net[input,BatchSize->n]. This is especially important when GPU computation is specified via TargetDevice->"GPU", since GPU memory is typically more limited.
- For nets that contain dynamic dimensions (usually specified as "Varying"), the BatchSize is typically chosen automatically to be 16.
- The BatchSize used when training can be obtained from a NetTrainResultsObject via the "BatchSize" property.
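The training-time, evaluation-time, and results-object behaviors described above can be sketched together. The net and data below are illustrative assumptions, not part of the original documentation:

```wl
(* Illustrative sketch: a small net trained on synthetic scalar data *)
net = NetChain[{LinearLayer[]}, "Input" -> "Scalar", "Output" -> "Scalar"];
data = Table[x -> 2 x + 1, {x, RandomReal[1, 1000]}];

(* Requesting All returns a NetTrainResultsObject rather than just the trained net *)
results = NetTrain[net, data, All, BatchSize -> 64];

results["BatchSize"]          (* the batch size used during training *)

(* An explicit BatchSize can also be given at evaluation time *)
trained = results["TrainedNet"];
trained[RandomReal[1, 500], BatchSize -> 128]
```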
Examples
Basic Examples (1)
Define a single-layer neural network and train this network with a BatchSize of 300:
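A minimal sketch of such an example, assuming synthetic training data (the specific net and data are illustrative, not from the original page):

```wl
(* A single LinearLayer mapping a scalar input to a scalar output *)
net = NetChain[{LinearLayer[]}, "Input" -> "Scalar", "Output" -> "Scalar"];

(* Synthetic data: learn the mapping x -> 2x + 1 *)
data = Table[x -> 2 x + 1, {x, RandomReal[1, 3000]}];

(* Train with 300 examples processed together per batch *)
trained = NetTrain[net, data, BatchSize -> 300]
```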
Properties & Relations (1)
Introduced in 2016 (11.0) | Updated in 2018