BatchSize

BatchSize is an option for NetTrain and related functions that specifies the size of a batch of examples to process together.

Details

  • Setting BatchSize->n specifies that n examples should be processed together.
  • The default setting BatchSize->Automatic chooses the batch size automatically, based on factors such as the available GPU or system memory.
  • BatchSize can be specified when evaluating a net by writing net[input,BatchSize->n]. This can be important when GPU computation is also specified via TargetDevice->"GPU", as memory is typically more limited in this case.
  • For nets that contain dynamic dimensions (usually specified as "Varying"), the BatchSize is usually automatically chosen to be 16.
  • The BatchSize used when training can be obtained from a NetTrainResultsObject via the "BatchSize" property, as in the sketch following these notes.
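
The following sketch illustrates these usages under illustrative assumptions: a single LinearLayer trained on synthetic data drawn from the line y = 2x + 1, with the symbols net, data, results and trained chosen here as placeholders rather than documented names:

  net = LinearLayer[1, "Input" -> 1];
  data = Table[{x} -> {2 x + 1.}, {x, RandomReal[{-1, 1}, 500]}];
  (* requesting the All property returns a NetTrainResultsObject *)
  results = NetTrain[net, data, All, BatchSize -> 64];
  results["BatchSize"]  (* the batch size used during training, here 64 *)
  (* evaluate a list of new inputs, processing 32 examples per batch *)
  trained = results["TrainedNet"];
  trained[RandomReal[{-1, 1}, {10, 1}], BatchSize -> 32]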

Examples


Basic Examples  (1)

Define a single-layer neural network and train this network with a BatchSize of 300:

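A minimal sketch along these lines, assuming synthetic training pairs drawn from the line y = 2x + 1 and a single LinearLayer (the data, layer size, and number of examples are illustrative choices, not the original example):

  data = Table[{x} -> {2 x + 1.}, {x, RandomReal[{-1, 1}, 1000]}];
  (* train the single-layer network, processing 300 examples per batch *)
  trained = NetTrain[LinearLayer[1, "Input" -> 1], data, BatchSize -> 300]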

Predict the value of a new input:

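Continuing the sketch above, the trained net can be applied to a new input (the value 0.5 is an arbitrary choice):

  trained[{0.5}]  (* should be close to {2.} for a well-trained net *)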


See Also

NetTrain  TargetDevice  MaxTrainingRounds  TimeGoal  NetTrainResultsObject


Introduced in 2016 (11.0) | Updated in 2018 (11.3)