
Model Training

In this step, we will learn how to generate, train, and evaluate the machine learning model. After collecting data and designing the preprocessor, the next step is to train the model. Training requires generating a model and defining its layers. You can create multiple model architectures and compare the results to identify the best model. Use the Model Wizard to select the model family, classifier, model size, and learning rate; the wizard then generates candidate models based on these settings and accelerates development by focusing on the key design choices. The generated models vary in layer composition and configuration, which can produce different results. For advanced optimization, refine the Auto-ML wizard output by editing layers individually.
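
For orientation, a candidate from the Conv1D model family with a GlobalAveragePool classifier (the options used in the steps below) corresponds roughly to the kind of Keras layer stack sketched here. This is only an illustrative sketch: the window shape, filter counts, kernel sizes, and class count are hypothetical placeholders, and the actual candidate architectures are generated by the Model Wizard.

```python
# Illustrative sketch only: the real candidate architectures are generated by
# the Model Wizard. Window shape, filter counts, kernel size, and class count
# are hypothetical placeholders.
import tensorflow as tf
from tensorflow.keras import layers, models

def conv1d_gap_model(window_length=50, channels=3, num_classes=4):
    """Small-kernel Conv1D stack ending in a GlobalAveragePool classifier."""
    return models.Sequential([
        layers.Input(shape=(window_length, channels)),
        layers.Conv1D(16, kernel_size=3, padding="same", activation="relu"),
        layers.MaxPooling1D(pool_size=2),                 # Pooling: ON
        layers.Conv1D(32, kernel_size=3, padding="same", activation="relu"),
        layers.GlobalAveragePooling1D(),                  # Classifier: GlobalAveragePool
        layers.Dense(num_classes, activation="softmax"),  # one output per class
    ])

conv1d_gap_model().summary()
```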

Generating models for Button Detection and Gesture Detection

  1. Open the respective project file and navigate to the Training tab.

  2. Click the Generate Model List button to open the Model Wizard. The Auto-ML tab appears with pre-configured parameters. Set the following parameters to generate a different set of models.

    Parameters        Options
    Hardware Type     Generic
    Model Family      Conv1D
    Model Flavor      Smallkern
    Classifier        GlobalAveragePool
    Model Size        Medium
    Optimization      Balanced
    Downscale         OFF
    Pooling           ON
    Learn Rate        Mid
    Regularization    Low
    Append Models     Uncheck
  3. Click the Training tab and set the following parameters. A rough Keras equivalent of these settings is sketched after this list.

    Parameters        Options
    Epochs            100
    Batch Size        32
    Loss Function     Categorical Crossentropy
    Split Count       16
    Patience          20
  4. Click OK. The new set of models appears in the list.
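
The Training-tab settings above map onto a typical Keras training configuration roughly as shown below. This is only a sketch, not Studio's actual cloud training code: the data arrays, window shape, and class count are random placeholders, Split Count has no direct Keras equivalent here (validation_split is used as a stand-in), and the 1e-3 value assumed for the "Mid" learning rate is a guess.

```python
# Rough Keras equivalent of the Training-tab settings; Studio performs the real
# training in the cloud. Data, window shape, class count, and the learning-rate
# value are placeholders.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

num_classes = 4                                    # hypothetical class count
model = models.Sequential([
    layers.Input(shape=(50, 3)),                   # placeholder window shape
    layers.Conv1D(16, 3, padding="same", activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(num_classes, activation="softmax"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # Learn Rate: Mid (assumed value)
    loss="categorical_crossentropy",                          # Loss Function
    metrics=["accuracy"],
)

x = np.random.rand(256, 50, 3).astype("float32")              # placeholder windows
y = tf.keras.utils.to_categorical(np.random.randint(0, num_classes, 256), num_classes)

model.fit(
    x, y,
    epochs=100,                                    # Epochs
    batch_size=32,                                 # Batch Size
    validation_split=0.2,                          # stand-in for Split Count
    callbacks=[tf.keras.callbacks.EarlyStopping(
        patience=20,                               # Patience
        restore_best_weights=True)],
)
```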

Generating model for Slider Position

  1. Open the project file and navigate to the Training tab.

  2. Click the Generate Model List button to open the Model Wizard. The Auto-ML tab appears with pre-configured parameters. Set the following parameters to generate a different set of models.

    Parameters        Options
    Hardware Type     Generic
    Model Family      Conv1D
    Model Flavor      Smallkern
    Classifier        GlobalAveragePool
    Model Size        Medium
    Optimization      Balanced
    Downscale         OFF
    Pooling           ON
    Learn Rate        Mid
    Regularization    Low
    Append Models     Uncheck
  3. Click the Training tab and set the following parameters. Note that the Loss Function is Mean Squared Error because slider position is a regression problem; a rough Keras equivalent is sketched after this list.

    Parameters        Options
    Epochs            100
    Batch Size        32
    Loss Function     Mean Squared Error
    Split Count       16
    Patience          20
  4. Click OK. The new set of models appears in the list.
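
For the slider-position project the wizard settings are the same, but the Mean Squared Error loss corresponds to a regression head: a single linear output unit instead of a softmax classifier. A minimal sketch of that difference is shown below; as before, the window shape and filter count are hypothetical placeholders.

```python
# Illustrative regression variant; window shape and filter count are placeholders.
import tensorflow as tf
from tensorflow.keras import layers, models

regression_model = models.Sequential([
    layers.Input(shape=(50, 3)),                   # placeholder window shape
    layers.Conv1D(16, 3, padding="same", activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(1, activation="linear"),          # continuous slider position
])
regression_model.compile(optimizer="adam",
                         loss="mean_squared_error")  # Loss Function: Mean Squared Error
```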

Model Training

To start model training, click the Start New Training Job… button. This sends the training job to the Imagimob Cloud and begins training. You are then prompted to open the training job, where you can view its progress. Once the models are trained, they can be downloaded.

Refer to Starting Model Training and Tracking the training job for detailed instructions on how to start and track the training job, respectively.

After the models are trained in the Imagimob Cloud, detailed performance data is available for each model. You can download the model recommended by Studio. Refer to Download the model files for detailed steps.

Model Evaluation

You can evaluate the models by comparing the predictions from the original model with the predictions from the new model. Studio provides a much more detailed view of a model's performance by merging the model predictions as tracks into the sessions containing the original data.

For the classification projects, Touch button detection and Gesture detection, refer to Import predictions into the project for detailed instructions on how to import the predictions.

You can use advanced features such as the confusion matrix, Window Visualization, and Grad-CAM to evaluate the performance of the models. Refer to Model evaluation using Confusion Matrix and Model explainability using Window Visualization and Grad-CAM to learn more.
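
If you want to reproduce a confusion-matrix check outside Studio, a minimal scikit-learn sketch is shown below. It assumes you have exported the ground-truth labels and the model's predicted classes as integer arrays; the arrays here are placeholders.

```python
# Minimal confusion-matrix sketch using scikit-learn; label arrays are placeholders.
import numpy as np
from sklearn.metrics import confusion_matrix, classification_report

y_true = np.array([0, 1, 1, 2, 0, 2, 1])      # exported ground-truth class indices
y_pred = np.array([0, 1, 2, 2, 0, 2, 1])      # exported model predictions

print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred))
```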

For the regression project, Slider position, refer to Evaluating regression model for detailed steps.
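
Along the same lines, a minimal sketch of a regression check for the slider-position model is shown below, assuming you have exported the true and predicted positions as 1-D arrays (the values here are placeholders).

```python
# Minimal regression-metrics sketch; position arrays are placeholders.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

true_pos = np.array([0.10, 0.45, 0.80, 0.95])    # exported true slider positions
pred_pos = np.array([0.12, 0.40, 0.78, 0.99])    # exported model predictions

print("MAE:", mean_absolute_error(true_pos, pred_pos))
print("RMSE:", float(np.sqrt(mean_squared_error(true_pos, pred_pos))))
```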
