Elastic Net Classification

This tool fits a linear classifier with elastic-net regularization, then selects the highest-weight OTUs or taxonomic groups using recursive feature elimination.

Last updated 4 years ago

Used For

  • Selecting OTUs or taxonomic groups (an "OTU signature") that differentiate between two or more sample groups, according to a generalized linear model with elastic-net regularization. Note that OTU abundances are normalized before the model is trained so that the fitted coefficients are comparable.
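Normalization is what makes the coefficients comparable across features: without it, a high-abundance OTU would get a smaller coefficient than a rare one for the same effect. A minimal sketch using standard scaling (one common choice; the source does not specify which normalization mian applies):

```python
# Standardize each OTU column to mean 0, unit variance, so that model
# coefficients are on a comparable scale across OTUs.
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[10., 2.], [5., 8.], [0., 4.]])  # 3 samples x 2 OTUs
X_scaled = StandardScaler().fit_transform(X)
print(np.abs(X_scaled.mean(axis=0)).round(6))  # [0. 0.] (columns centered)
```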

Feature Selection Parameters

Taxonomic Level

The taxonomic level to aggregate the OTUs at. The OTUs will be grouped together (by summing the OTU values) at the selected taxonomic level before the analysis is applied.
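For example, aggregating to the genus level sums the counts of all OTUs assigned to the same genus (a hypothetical illustration; the table layout and column names are made up):

```python
# Aggregate an OTU table to the genus level by summing counts.
import pandas as pd

otus = pd.DataFrame({
    "otu": ["OTU1", "OTU2", "OTU3"],
    "genus": ["Bacteroides", "Bacteroides", "Prevotella"],
    "sample_A": [10, 5, 2],
    "sample_B": [0, 7, 9],
})

# Sum OTU counts within each genus before the analysis is applied.
by_genus = otus.groupby("genus")[["sample_A", "sample_B"]].sum()
print(by_genus.loc["Bacteroides", "sample_A"])  # 15 (= 10 + 5)
```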

Categorical Variable

Create comparative sample groups based on categorical variables uploaded in the metadata file.

Optionally create a categorical variable from a quantitative variable by using the Quantile Range feature on the Projects home page.

Number of Features to Keep

Specify how many features to display in the output. Leave blank to skip filtering.

Loss Function

The type of loss to use when training the model.

Fix Training Set Between Changes

Indicate whether the training set should remain the same every time a parameter is changed and the model is retrained.

Training Proportion

Define the proportion of the data that should be randomly picked to form a training dataset.

You can set this value to be 1.0 if you don't plan on evaluating with a test dataset.
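A training proportion of 0.7, for instance, randomly assigns 70% of the samples to the training set and holds out the remaining 30% for evaluation (sketched here with scikit-learn's `train_test_split`; mian's internal splitting may differ):

```python
# Split 10 samples with a 0.7 training proportion.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(20).reshape(10, 2)  # 10 samples x 2 features
y = np.arange(10) % 2

X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.7, random_state=0)
print(len(X_train), len(X_test))  # 7 3
```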

L1 Regularization Ratio

The mixing ratio between the L1 and L2 regularization penalties: 0 corresponds to pure L2 (ridge) regularization and 1 to pure L1 (LASSO) regularization. A value of 0.5 is recommended.
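For reference, the elastic-net penalty mixes the two norms with the ratio ρ (this mirrors scikit-learn's parameterization, which mian may or may not follow exactly):

```latex
\alpha \left( \rho \,\lVert w \rVert_1 + \frac{1 - \rho}{2} \,\lVert w \rVert_2^2 \right)
```

where w is the vector of model coefficients, α the overall regularization strength, and ρ the L1 regularization ratio set by this parameter.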

Max Iterations

The maximum number of passes through the data during training.

Note that training may stop early if the training error stops decreasing for five consecutive epochs.

Interactive Elements

  • Link back to boxplots

Additional Features

  • Save Snapshot: Save the results to the experiment notebook

  • Download: Downloads the results as a CSV file

  • Share: Creates a shareable link that allows you to share the results with others

L1 (LASSO) regularization helps encourage sparsity within the selected features, which means that fewer features will be used to predict the experimental variable.