Installation | FLAML

Python

FLAML requires Python version >= 3.7. It can be installed via pip:

pip install flaml

or conda:

conda install flaml -c conda-forge
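
To confirm that the installation succeeded, you can import the package and print its version; a minimal check:

# Minimal installation check: import FLAML and print its version.
import flaml

print(flaml.__version__)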

Optional Dependencies

Autogen

pip install "flaml[autogen]"
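
As an illustration of what the [autogen] extra enables, here is a hedged sketch of a two-agent conversation with the flaml.autogen agents shipped in FLAML v2; the exact API may differ across versions, and it assumes your OpenAI credentials are configured separately (e.g. via an OAI_CONFIG_LIST file or environment variables).

# Hedged sketch of flaml.autogen usage; credentials are assumed to be configured.
from flaml import autogen

assistant = autogen.AssistantAgent("assistant")
user_proxy = autogen.UserProxyAgent("user_proxy")
# Start an automated conversation between the two agents to solve the task.
user_proxy.initiate_chat(
    assistant,
    message="Show me the YTD gain of the 10 largest technology companies as of today.",
)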

Task-oriented AutoML

pip install "flaml[automl]"

Other optional extras add, for example, OpenAI model support, the CatBoost learner, and time series forecasting:

pip install "flaml[openai]"
pip install "flaml[catboost]"
pip install "flaml[forecast]"
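
As a quick illustration of what the [automl] extra enables, here is a minimal sketch of the task-oriented AutoML API on a scikit-learn toy dataset; the 10-second time budget is just a placeholder.

# Minimal AutoML sketch (requires the [automl] extra); the budget is a placeholder value.
from sklearn.datasets import load_iris
from flaml import AutoML

X, y = load_iris(return_X_y=True)
automl = AutoML()
automl.fit(X, y, task="classification", time_budget=10)  # search for at most 10 seconds
print(automl.best_estimator)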

Notebook

To run the notebook examples, install flaml with the [notebook] option:

pip install "flaml[notebook]"

Distributed tuning

Spark support was added in v1.1.0.

pip install "flaml[spark]>=1.1.0"

Cloud platforms such as Azure Synapse provide Spark clusters, but you may need to install Spark manually when setting up your own environment. On a recent Ubuntu system, you can install the Spark 3.3.0 standalone version with the script below. For more details on installing Spark, please refer to the Spark documentation.

# Install a Java runtime, which Spark requires.
sudo apt-get update && sudo apt-get install -y --allow-downgrades --allow-change-held-packages --no-install-recommends \
    ca-certificates-java ca-certificates openjdk-17-jdk-headless \
    && sudo apt-get clean && sudo rm -rf /var/lib/apt/lists/*
# Download Spark 3.3.0, extract it to /tmp, and move it to /spark.
wget --progress=dot:giga "https://www.apache.org/dyn/closer.lua/spark/spark-3.3.0/spark-3.3.0-bin-hadoop2.tgz?action=download" \
    -O - | tar -xzC /tmp; archive=$(basename "spark-3.3.0/spark-3.3.0-bin-hadoop2.tgz") \
    bash -c "sudo mv -v /tmp/\${archive/%.tgz/} /spark"
# Point the environment at the Spark installation.
export SPARK_HOME=/spark
export PYTHONPATH=/spark/python/lib/py4j-0.10.9.5-src.zip:/spark/python
export PATH=$PATH:$SPARK_HOME/bin
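
Once Spark and the [spark] extra are in place, trials can run in parallel on the cluster. A minimal sketch, assuming the use_spark and n_concurrent_trials arguments introduced in v1.1.0; the dataset, time budget, and trial count are placeholders.

# Hedged sketch of Spark-parallel tuning (requires flaml[spark]>=1.1.0 and the [automl] extra).
from sklearn.datasets import load_iris
from flaml import AutoML

X, y = load_iris(return_X_y=True)
automl = AutoML()
automl.fit(
    X,
    y,
    task="classification",
    time_budget=30,          # placeholder budget in seconds
    use_spark=True,          # run trials on the Spark cluster
    n_concurrent_trials=2,   # number of trials to run in parallel
)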

To use the BlendSearch hyperparameter search algorithm, install the [blendsearch] option:

pip install "flaml[blendsearch]"

To install flaml in Azure Synapse and similar cloud platforms:

pip install "flaml[synapse]"

Test and Benchmark

pip install "flaml[benchmark]"

.NET

FLAML has a .NET implementation in ML.NET, an open-source, cross-platform machine learning framework for .NET.

You can use FLAML in .NET in the following ways:

Low-code, e.g. through Model Builder or the ML.NET CLI.

Code-first, through the ML.NET AutoML API.

To get started with the ML.NET API and AutoML, see the csharp-notebooks.