Model Compare

Model Compare is a feature used to compare the performance of Classification and Regression algorithms. You select the performance metrics on which the comparison is done, and based on the results, Rubiscape recommends the best-suited model.

With this feature, you can compare one Regression algorithm with another Regression algorithm, or one Classification algorithm with another Classification algorithm. Note that you cannot compare a Regression algorithm with a Classification algorithm.

This feature is located under rubiML in the task pane. Model Compare is also called Evaluation in the product.
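
Conceptually, Model Compare scores each candidate model on the same data and the same metrics, then ranks them. As a rough analogy only, the scikit-learn sketch below compares two regression models on RMSE, R Square, and Adjusted R Square; the dataset, the choice of models, and the library calls are illustrative assumptions, not Rubiscape's API.

    # Illustrative sketch only -- not Rubiscape code. Two regression models
    # are fit on the same data and compared on RMSE, R Square, and
    # Adjusted R Square; everything below is plain scikit-learn, assumed for demo.
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import mean_squared_error, r2_score
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for name, model in {
        "Linear Regression": LinearRegression(),
        "Random Forest": RandomForestRegressor(random_state=0),
    }.items():
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        rmse = mean_squared_error(y_test, pred) ** 0.5
        r2 = r2_score(y_test, pred)
        # Adjusted R Square penalizes R Square for the number of predictors p.
        n, p = X_test.shape
        adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
        print(f"{name}: RMSE={rmse:.2f}  R Square={r2:.3f}  Adjusted R Square={adj_r2:.3f}")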

Working with Model Compare

To start working with Model Compare, follow the steps given below.

  1. Go to the Home page and create a new workbook or open an existing workbook.
  2. Drag and drop the required dataset on the workbook canvas.
    Figure: Selecting Dataset
  3. In the Properties pane, select the Data Fields from the dataset.
    Figure: Selecting Properties of the Dataset
  4. Select the classification or regression algorithm that you want to compare and connect it to the dataset node.
  5. Select the algorithm node and select its respective Properties displayed in the Properties pane.
    Here, we have selected the Dependent and Independent variables and the Advanced Properties of the MLP Neural Network algorithm.
    Figure: Selecting Properties of the Model
  6. Select the algorithm node, click the vertical ellipsis, and then click Run.
  7. Repeat steps 4 to 6 for all the algorithms that you want to compare.

    Note:

    Make sure the models you compare are either all Classification models or all Regression models. You cannot compare a Classification model with a Regression model.


    Figure: Running the Models
  8. Drag and drop Model Compare onto the workbook canvas.
    Figure: Selecting Model Compare
  9. Connect the selected algorithms to Model Compare.
    Figure: Configuring Model Compare
  10. Select the Comparison Metrics for Model Compare.
    Figure: Selecting the Properties of Model Compare
    Note:

    For Regression models, the Comparison Metrics are:

    • RMSE
    • Adjusted R Square
    • R Square

    For Classification models, the Comparison Metrics are:

    • Accuracy
    • Specificity
    • Sensitivity
    • F-Score
    • AUC

    If cross validation is added as a predecessor to any of the algorithms, the Comparison Metrics are Mean Accuracy and Standard Deviation Accuracy. Sketches illustrating how these metrics are computed appear after these steps.

  11. Select the Model Compare node, click the vertical ellipsis corresponding to it, and then click Run.
    Figure: Running the Model Compare
    The node execution starts, and a confirmation is displayed after completion.
  12. After the Model Compare node execution is complete, select the node, click the vertical ellipsis, and then click Explore.
    Figure: Exploring Model Compare
    The result page is displayed.
    The result page displays the metrics of the compared models, sorted on the Comparison Metrics selected in step 10.
    Figure: Model Compare Result Page
    Scroll down the result page to view the recommended model based on the Comparison Metrics.
    Figure: Result Page – Recommended Model
    As seen in the figure above, the recommended model in this example is MLP Neural Network.
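
For reference, the sketch below illustrates, outside of Rubiscape, how the classification Comparison Metrics from step 10 can be computed and how a recommended model could be picked by sorting on one of them. The models, data, and ranking rule are assumptions for demonstration only, not the product's implementation.

    # Illustrative sketch only -- not Rubiscape code. It computes the five
    # classification Comparison Metrics for two models and recommends the one
    # with the best value of a chosen metric (Accuracy here, an assumption).
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                                 recall_score, roc_auc_score)
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=500, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    results = {}
    for name, model in {
        "Logistic Regression": LogisticRegression(max_iter=1000),
        "MLP Neural Network": MLPClassifier(max_iter=1000, random_state=0),
    }.items():
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        tn, fp, fn, tp = confusion_matrix(y_test, pred).ravel()
        results[name] = {
            "Accuracy": accuracy_score(y_test, pred),
            "Specificity": tn / (tn + fp),              # true negative rate
            "Sensitivity": recall_score(y_test, pred),  # true positive rate
            "F-Score": f1_score(y_test, pred),
            "AUC": roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]),
        }

    # Sort on the selected comparison metric and recommend the best model,
    # mirroring the sorted result page and recommendation described above.
    best = max(results, key=lambda m: results[m]["Accuracy"])
    print(results)
    print("Recommended model:", best)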
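
Similarly, here is a minimal sketch of the cross-validation case mentioned in the note under step 10, where the comparison reduces to Mean Accuracy and Standard Deviation Accuracy across folds. The five-fold split and the model below are arbitrary assumptions.

    # Illustrative sketch only: with cross validation as a predecessor, the
    # comparison uses Mean Accuracy and Standard Deviation Accuracy across folds.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=500, random_state=0)
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                             cv=5, scoring="accuracy")
    print(f"Mean Accuracy: {scores.mean():.3f}")
    print(f"Standard Deviation Accuracy: {scores.std():.3f}")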