Visualization
In the Visualization section, you can evaluate all of the models generated in the Build and Evaluate section.
Start with Select from available models; a list of all generated models will appear. Choose one, then go to Choose from the following metrics to evaluate your model. For Classification: auc, confusion_matrix, threshold, and precision_recall. For Regression: residuals, error, cooks, and learning. Click Plot; the output is a graph of the selected evaluation metric.
All output images will be saved under the file name evaluation_metric.png or .html (example: Residuals.png).
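The plot names above match the options of PyCaret's plot_model function, so, assuming a PyCaret-style backend (an assumption, not something this page confirms), a scripted equivalent of this step might look like the sketch below. The dataset, target column, and model choice are illustrative only.

```python
# Hedged sketch: producing the Classification / Regression plots in code,
# assuming a PyCaret-style plot_model backend (not confirmed by this page).
from pycaret.datasets import get_data
from pycaret.classification import setup, create_model, plot_model

data = get_data("juice")                         # example dataset, for illustration only
setup(data=data, target="Purchase", session_id=123)
model = create_model("lr")                       # stands in for any model from the Build step

# Each call saves the graph to disk (e.g. AUC.png); "pr" is PyCaret's name
# for the precision_recall plot listed above.
for metric in ["auc", "confusion_matrix", "threshold", "pr"]:
    plot_model(model, plot=metric, save=True)

# The Regression flow is identical with pycaret.regression, e.g.
#   plot_model(reg_model, plot="residuals", save=True)   # also: error, cooks, learning
```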
Select the appropriate model from:
Model trained with custom number of clusters
Model trained using Elbow method
Model trained and tuned with data containing labeled target
Then choose from the following metrics to visually evaluate your model. For Clustering: distribution, cluster, tsne, silhouette, and distance.
When plot = distribution, you need to Select feature to be evaluated. When the plot type is cluster or tsne, the feature column is used as a hover-over tooltip and/or label when Show data label is set to True; if the feature is None, the first column of the dataset is used. Show data label is the name of the column to be used as data labels and is ignored when the plot is not cluster or tsne.
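For reference, here is a minimal sketch of the clustering plots under the same PyCaret-backend assumption; the feature and label arguments below correspond to Select feature to be evaluated and Show data label, and the dataset and model are illustrative only.

```python
# Hedged sketch of the clustering plots, assuming a PyCaret-style backend.
from pycaret.datasets import get_data
from pycaret.clustering import setup, create_model, plot_model

data = get_data("jewellery")                     # example dataset, for illustration only
setup(data=data, session_id=123)
kmeans = create_model("kmeans", num_clusters=4)  # "Model trained with custom number of clusters"

# distribution requires a feature to evaluate
plot_model(kmeans, plot="distribution", feature="Age", save=True)

# cluster / tsne: feature is the hover-over tooltip/label; label=True shows data labels
plot_model(kmeans, plot="cluster", feature="Age", label=True, save=True)
plot_model(kmeans, plot="tsne", save=True)       # feature=None -> first column is used

# silhouette and distance ignore feature and label
plot_model(kmeans, plot="silhouette", save=True)
plot_model(kmeans, plot="distance", save=True)
```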
Select the appropriate model from:
Model trained with custom number of clusters
Model trained and tuned with data containing labeled target
Then choose from the following metrics to evaluate your model: tsne or umap.
Select feature to be evaluated is the feature used as a hover-over tooltip and/or label when Show data label is set to True; if the feature is None, the first column of the dataset is used. Show data label is the name of the column to be used as data labels. Click Plot; the output is a graph of the selected evaluation metric.
All output images will be saved under the file name evaluation_metric.png or .html (example: silhouette.png or UMAP Dimensionality.html).
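The tsne and umap options here match the plots offered by PyCaret's anomaly-detection plot_model; assuming that backend (again an assumption, not confirmed by this page), a scripted equivalent could look like this sketch, with an illustrative dataset and model.

```python
# Hedged sketch of the tsne / umap plots, assuming a PyCaret-style
# anomaly-detection backend (an assumption based on the plot names).
from pycaret.datasets import get_data
from pycaret.anomaly import setup, create_model, plot_model

data = get_data("anomaly")                       # example dataset, for illustration only
setup(data=data, session_id=123)
iforest = create_model("iforest")                # stands in for any trained/tuned model

# feature -> "Select feature to be evaluated", label -> "Show data label"
plot_model(iforest, plot="tsne", feature="Col1", label=True, save=True)

# umap requires the umap-learn package; the saved output is an interactive .html file
plot_model(iforest, plot="umap", save=True)
```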