# ModelScope

## What is ModelScope?

> ModelScope is a new-generation open-source model-as-a-service (MaaS) sharing platform, dedicated to providing general AI developers with **flexible, easy-to-use, low-cost** one-stop model service solutions, making model application easier!
>
> Through **API-Inference service capabilities**, the platform standardizes open-source models into callable API interfaces, enabling developers to lightly and quickly integrate model capabilities into various AI applications, supporting innovative scenarios such as tool invocation and prototype development.

### Core advantages

* ✅ **Free quota**: provides **2,000 free API calls** per day ([Billing rules](#billing-and-quota-rules))
* ✅ **Rich model library**: covers 1,000+ open-source models in NLP, CV, speech, multimodal, and more
* ✅ **Ready to use**: no deployment required, call quickly via a RESTful API
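
The "ready to use" RESTful flow can be sketched as a plain HTTP request. This is a minimal sketch, not official sample code: the endpoint URL, token value, and model name below are illustrative assumptions, so substitute your own token and a model that actually appears under API-Inference.

```python
import json
import urllib.request

# Illustrative assumptions: replace with your own token and an
# API-Inference-enabled model from modelscope.cn.
API_URL = "https://api-inference.modelscope.cn/v1/chat/completions"
TOKEN = "ms-xxxxxxxx"
MODEL_ID = "Qwen/Qwen2.5-7B-Instruct"

def build_request(prompt: str) -> urllib.request.Request:
    """Assemble an OpenAI-style chat-completions request."""
    payload = {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Hello!")
# urllib.request.urlopen(req) would send the call; it is omitted here so
# the sketch stays runnable without a valid token.
```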

***

## Cherry Studio integration process

### Step 1: Get a ModelScope API token

1. **Log in to the platform**
   * Access the [ModelScope official website](https://modelscope.cn) → click **Log in** in the top right corner → choose an authentication method ![Login screen](/files/4198aee182fcf6d59893f3a0181472daecb559ec)
2. **Create an access token**

   * Go to [**Account settings → Access tokens**](https://modelscope.cn/my/myaccesstoken)

   * Click **`Create new token`** → fill in a description → **copy the generated token** (*see the page example in the figure below*) ![New token example](/files/5e70b0b80e52449a58cc494fa17f144e0910f510)

   > 🔑 **Important note**: keep the token secret; a leaked token compromises your account security!
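
One common way to honor that note is to keep the token out of source files entirely and read it from the environment. A minimal sketch, assuming you export the token yourself (the variable name `MODELSCOPE_API_TOKEN` is our own choice, not an official convention):

```python
import os

def get_token(var: str = "MODELSCOPE_API_TOKEN") -> str:
    """Read the ModelScope token from an environment variable instead of
    hardcoding it, so it never ends up in version control."""
    token = os.environ.get(var)
    if not token:
        raise RuntimeError(f"Set the {var} environment variable first")
    return token
```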

### Step 2: Configure Cherry Studio

* Open **Cherry Studio** → **Settings → Model Services → ModelScope**
* Paste the copied token into the `API Key` field ![Configuration screen](/files/06612431c538ff08621c251c4f43fbcfab579378)
* Click **`Save`** to complete authorization

### Step 3: Call the model API

1. **Find models that support API**

   * Access [ModelScope model library](https://modelscope.cn/models)

   * Filter conditions: **check `API-Inference`** (or look for the `API` icon on the model card) ![API model filter](/files/e3f0f60ca0bb82cf370f384ac2b173424d91c5e8)

   > The set of models covered by API-Inference is determined mainly by the attention a model receives in the ModelScope community (based on data such as likes and downloads). The list of supported models will therefore keep growing as more capable and more popular next-generation open-source models are released.
2. **Get the model ID**
   * Go to the target model's details page → copy the **Model ID** (format like `damo/nlp_structbert_sentiment-classification_chinese-base`) ![Copy the Model ID](/files/e0e1797e3e975ed5f89704c0264e00065455d5ce)
3. **Enter it in Cherry Studio**
   * On the model service configuration page, enter the ID in the `Model ID` field → choose the task type → complete the configuration ![Enter the model ID](/files/e5c41261f2c003a6ecb18ee7f88974fb6f6c203d)
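
Before pasting, it can help to sanity-check that what you copied actually looks like a Model ID, i.e. an `owner/name` pair. A small sketch of that check (the pattern is our own approximation, not an official grammar):

```python
import re

# Model IDs take the form "owner/name", e.g.
# damo/nlp_structbert_sentiment-classification_chinese-base
MODEL_ID_RE = re.compile(r"[\w.-]+/[\w.-]+")

def looks_like_model_id(text: str) -> bool:
    """Return True if text matches the owner/name Model ID pattern."""
    return MODEL_ID_RE.fullmatch(text.strip()) is not None
```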

***

## Billing and quota rules

### Important note

* 🎫 **Free quota**: each user gets **2,000 API calls per day** (subject to the latest rules on the official website)
* 🔁 **Quota reset**: resets automatically every day at 00:00 (UTC+8); unused calls **do not carry over across days, and the quota cannot be upgraded**
* 💡 **Over-quota handling**:
  * After reaching the daily limit, the API returns a `429` error
  * Solutions: switch to a backup account, use another platform, or reduce call frequency
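
For transient rate limiting, the 429 response can also be handled with exponential backoff. This sketch retries any callable when it signals a 429; `QuotaExceeded` is a hypothetical stand-in for however your HTTP client surfaces the status code:

```python
import time

class QuotaExceeded(Exception):
    """Stand-in for an HTTP 429 (Too Many Requests) response."""

def call_with_retry(call, max_retries: int = 3, base_delay: float = 1.0):
    """Invoke `call`, retrying with exponential backoff on QuotaExceeded."""
    for attempt in range(max_retries + 1):
        try:
            return call()
        except QuotaExceeded:
            if attempt == max_retries:
                raise  # out of retries: let the caller decide what to do
            time.sleep(base_delay * 2 ** attempt)
```

Note that once the daily quota is truly exhausted, backoff alone will not help; at that point the options above (another account, another platform, fewer calls) are the only way forward.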

### Check remaining quota

* Log in to ModelScope → click your **`username`** in the top right → **`API usage`** ![Where to check quota](/files/b04f597538f539cb3754b9d4ee83c038fb107f74)

> ⚠️ Note: API-Inference has a free quota of 2,000 calls per day. If you need more, consider a paid cloud service such as Alibaba Cloud Bailian.


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.cherry-ai.com/docs/en-us/pre-basic/providers/modelscope.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
