2. Quickstart
Before following this document to install Coze Studio, make sure that your hardware and software environment meets the following requirements:
Requirement | Description
---|---
CPU | 2 cores
RAM | 4 GiB
Docker | Pre-install Docker and Docker Compose, and start the Docker service. For detailed instructions, refer to the Docker documentation: * macOS: installing via Docker Desktop is recommended; see the Docker Desktop for Mac installation guide. * Linux: see the Docker installation guide and the Docker Compose installation guide. * Windows: installing via Docker Desktop is recommended; see the Docker Desktop for Windows installation guide.
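Before moving on, a quick way to confirm the Docker prerequisites are in place is to run the standard Docker commands below (these are general Docker checks, not a step from this guide):

```bash
# Confirm Docker and the Compose plugin are installed
docker --version
docker compose version

# Confirm the Docker daemon is running; this fails if the Docker service has not been started
docker info
```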
Run the following command locally to fetch the latest version of the Coze Studio source code.
# Clone the code
git clone https://github.com/coze-dev/coze-studio.git
Coze Studio is an AI app development platform based on large language models (LLMs). Before deploying and starting the open-source version of Coze Studio for the first time, you need to configure a model service in the Coze Studio project; otherwise, you will not be able to select a model when creating an agent or workflow. This document uses the Volcengine Ark model as an example to show how to configure a model service for Coze Studio. If you plan to use OpenAI or another online model service, refer to the model configuration documentation to fill out the configuration file correctly.
- Copy the template file of the doubao-seed-1.6 model from the template directory into the configuration file directory.
cd coze-studio

# Copy the model configuration template
cp backend/conf/model/template/model_template_ark_doubao-seed-1.6.yaml backend/conf/model/ark_doubao-seed-1.6.yaml
- Modify the template file in the configuration file directory:
- Enter the directory `backend/conf/model` and open the file `ark_doubao-seed-1.6.yaml`.
- Set the fields `id`, `meta.conn_config.api_key`, and `meta.conn_config.model`, then save the file (see the sketch after this list).
  - id: The model ID in Coze Studio, defined independently by the developer. It must be a non-zero integer and globally unique. Do not modify the model ID after the model goes online.
  - meta.conn_config.api_key: The API Key of the online model service; in this example, the Volcengine Ark API Key. Refer to Obtaining a Volcengine Ark API Key for the retrieval method.
  - meta.conn_config.model: The model ID of the online model service; in this example, the Endpoint ID of the Volcengine Ark doubao-seed-1.6 endpoint. Refer to Obtaining an Endpoint ID for the retrieval method.
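For orientation, here is a minimal sketch of the fields to edit in `ark_doubao-seed-1.6.yaml`. The field paths come from the steps above; the values are placeholders that you must replace with your own, and all other fields in the template should be left as provided:

```yaml
# backend/conf/model/ark_doubao-seed-1.6.yaml (excerpt -- other template fields omitted)
id: 100001                         # developer-defined, non-zero integer, globally unique
meta:
  conn_config:
    api_key: "your-ark-api-key"    # placeholder: your Volcengine Ark API Key
    model: "ep-xxxxxxxxxx-xxxxx"   # placeholder: Endpoint ID of your doubao-seed-1.6 endpoint
```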
Run the following commands to start the service. The first deployment and startup of Coze Studio needs to pull and build images locally, which may take a while; please be patient. During deployment you will see startup logs, and once the message "Container coze-server Started" appears, the Coze Studio service has started successfully.
# Start the service
cd docker
cp .env.example .env
docker compose --profile "*" up -d
After the service starts, it is normal for the coze-elasticsearch-setup, coze-minio-setup, coze-mysql-setup-init-sql, and coze-mysql-setup-schema containers to be in an exited state (exit 0).
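If you want to check the container states yourself, one option (assuming you are still in the docker directory) is the standard Compose status command:

```bash
# List all containers in the compose project, including the one-shot setup
# containers that are expected to have exited with code 0
docker compose --profile "*" ps -a
```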
After the service starts, open http://localhost:8888/ in your browser to access Coze Studio. Here, 8888 is the backend listening port.
At this point, you have successfully deployed Coze Studio. You can register an account and explore the various features and services of Coze Studio as guided on the page.
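If you later need to stop the local deployment, the usual Docker Compose commands apply (this is general Compose usage rather than a step from this guide); run them from the docker directory:

```bash
# Stop and remove the Coze Studio containers; data volumes are kept unless you add -v
docker compose --profile "*" down
```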
After successfully deploying Coze Studio, if you need to use functions such as plugins and knowledge bases, you also need to:
- Configure Plugins: Some official plugins require authentication through the keys of third-party services, such as Feishu Cloud Document series components. If the key is not configured, the plugin will be displayed as "Unauthorized".
- Configure Basic Components: The core components are as follows.
  - Knowledge Base: To use the knowledge base feature, you must configure the Embedding component; for image knowledge bases, you also need to set up the OCR component to recognize text in images.
  - Image Upload: To use the model's multimodal input, the upload component must be configured with a publicly reachable domain name or IP address; otherwise, the model cannot read uploaded images in the debug console or in conversations.
- Configure Models: Add model services as needed so that your agents, workflows or applications can use more models.