Quickstart Guide

This page will help you get started quickly with Instill Core, whether you want to deploy it on your local machine or are a Managed Instill Core user.

#🔮 Instill Core

Instill Core is a full-stack AI platform that allows you to create versatile data pipelines with Pipeline, orchestrate unstructured data via Artifact, and leverage Model to serve AI models on your local machine, on remote instances, or via our Managed Instill Core service.

#☁️ Managed Instill Core

If you are using Instill Core as a managed service, please click the button below and proceed through the login instructions.

INFO

To use Instill Core as a managed service, you will need to have been personally onboarded by our team. If you are interested in using Instill Core as a managed service, please sign up here.

#🐳 Self-Hosted Instill Core

The following instructions will guide you through the setup process using Docker Compose.

#Prerequisites

Before getting started, please ensure you meet the following prerequisites:

  • macOS or Linux - Instill Core is compatible with macOS and Linux.

  • Windows - Instill Core can be set up on Windows via Windows Subsystem for Linux (WSL2). To learn more about setting up Instill Core on Windows, please refer to our Docker Compose guide.

  • Docker and Docker Compose - Instill Core uses Docker Compose to manage services locally. See the official installation instructions and review the Docker Resource Requirements for optimal setup. You can verify your installation with the checks shown below.
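
To verify the Docker prerequisite, you can run the following quick checks (shown here for the Docker CLI with the Compose plugin; adjust the commands if you use a standalone docker-compose binary):


# Confirm the Docker Engine and Compose plugin are installed
docker --version
docker compose version

# Confirm the Docker daemon is running
docker info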

#Launch

To fire up Instill Core on your local machine or a remote instance, run the following commands:


git clone -b v0.50.4-beta https://github.com/instill-ai/instill-core.git && cd instill-core
make all

Once all services are up and running, the Console UI will be available at http://localhost:3000.
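
If you want to confirm the Console is reachable without opening a browser, a simple check (assuming the default port mapping) is:


# Returns an HTTP response once the Console is serving traffic
curl -I http://localhost:3000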

INFO

If you switch to a different version of Instill Core, you will need to rebuild the Docker images by running make build-latest instead of make all.
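
For example, checking out a different release tag and rebuilding might look like this (the tag placeholder below is illustrative; use the release you actually want):


git fetch --all --tags
git checkout <another-release-tag>
make build-latest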

#Shutdown

To tear down and clean up all Instill Core resources, run:


make down

Please refer to our Deployment Guide for further information on how to deploy Instill Core using Docker Compose, on Kubernetes with Helm, or via our Instill CLI.

#Next Steps

Now that you're set up with Instill Core, you're ready to dive deeper into the platform's capabilities.

#Explore Our Key Services

The three core services that underpin our full-stack AI solution are:

  1. Artifact - Easily parse and process your files and documents into a RAG-ready Catalog
  2. Pipeline - Create versatile AI-powered pipelines that seamlessly integrate with your data and applications
  3. Model - Serve, orchestrate and monitor AI models

#See Our Examples

On our Examples page, you can explore, test, modify, and draw inspiration from the diverse range of AI products you can build with our services, including:

  • Pipelines that are API-ready for external integrations
  • Servable models that are ready to be deployed on Model
  • Tutorials that give you step-by-step guidance on how to build your own AI applications
  • Instill AI Cookbooks that demonstrate how to solve real-world problems with our Python SDK

#Read Our Blog

Stay up-to-date with our latest product updates, AI insights, and tutorials by visiting our Blog.

#Support

Please see our Support page for more information on how to get help.