**README.md**

#### QAI AppBuilder
Quick AI Application Builder (this repository) is also referred to as *QAI AppBuilder* in the source and documentation. QAI AppBuilder is an extension for the Qualcomm® AI Runtime SDK; some libraries from the Qualcomm® AI Runtime SDK are needed in order to use QAI AppBuilder. <br>
QAI AppBuilder is designed to let developers use the Qualcomm® AI Runtime SDK to execute models on Windows on Snapdragon (WoS) and Linux platforms easily. We encapsulated the Qualcomm® AI Runtime SDK APIs into several simple APIs for loading models onto the CPU and HTP and executing inference.

#### Qualcomm® AI Runtime SDK
Developers can use QAI AppBuilder in both C++ and Python projects. <br>
• Faster for testing models. <br>
• Plenty of sample code. <br>

** Supports ARM64 Windows, Linux and Ubuntu (e.g.: X Elite Windows, QCS8550 Linux and QCM6490 Ubuntu).*
**Also supports using x64 Python to run QNN models on the WoS HTP; with this, all the Python extensions can be installed directly (refer to the sample code here for details: https://github.com/quic/ai-engine-direct-helper/tree/main/samples/python).* <br>

## Environment Setup
33
+
Refere to [python.md](docs/python.md) on how to setup Python environment for using QAI AppBuilder on Windows on Snapdragon (WoS) platforms.
34
+
35
+
## Samples
We have several [samples](samples/) which can be run directly: <br>
1. [Sample code](samples/python/README.md): Guide to running several [AI-Hub](https://aihub.qualcomm.com/compute/models) models through the sample code.
2. OpenAI Compatibility API Service (LLM Service): <br>
2.1 [Python based service](samples/genie/python/README.md): Guide to running OpenAI compatibility API services developed with Python. <br>
2.2 [C++ based service](samples/genie/c++/README.md): Guide to running OpenAI compatibility API services developed with C++. <br>
3. [WebUI samples](samples/webui/README.md): Guide to running several WebUI based AI applications.

## Components
There are two ways to use QAI AppBuilder:
### 1. Using the QAI AppBuilder C++ libraries to develop C++ based AI applications.
Download the prebuilt binary package *QAI_AppBuilder-win_arm64-{Qualcomm® AI Runtime SDK version}-Release.zip* from https://github.com/quic/ai-engine-direct-helper/releases to get these files:
**libappbuilder.dll {libappbuilder.lib, LibAppBuilder.hpp}** –– C++ projects can use this library to run models on the HTP.
**QAIAppSvc.exe** –– Due to an HTP limitation, only models smaller than 4GB could be loaded in a single process. This app helps load models in separate processes (multiple processes can be created) and run inference there to work around the restriction. [*Deprecated: the above limitation has been fixed.*]

### 2. Using the QAI AppBuilder Python binding extension to develop Python based AI applications.
Download the Python extension *qai_appbuilder-{version}-cp312-cp312-win_amd64.whl* and install it with the command below:
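Assuming the wheel file sits in the current directory, a typical install command looks like this (replace `{version}` with the version string in the downloaded file name):
```
pip install qai_appbuilder-{version}-cp312-cp312-win_amd64.whl
```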

**docs/python.md**

This guide helps developers set up the Python environment for using QAI AppBuilder on Windows on Snapdragon (WoS) platforms.

## Setting Up QAI AppBuilder Python Environment

### Step 1: Install Dependencies
Download and install [git](https://github.com/dennisameling/git/releases/download/v2.47.0.windows.2/Git-2.47.0.2-arm64.exe) and [x64 Python 3.12.8](https://www.python.org/ftp/python/3.12.8/python-3.12.8-amd64.exe).

*Make sure to check 'Add python.exe to PATH' while installing Python.*
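As an optional sanity check (not part of the original steps), open a new terminal and confirm both tools are reachable from PATH:
```
git --version
python --version
pip --version
```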

Using the Python extensions with ARM64 Python makes it easier for developers to build GUI apps for Windows on Snapdragon (WoS) platforms. The Python 3.12.6 ARM64 version supports the following modules: PyQt6, OpenCV, Numpy, PyTorch*, Torchvision*, ONNX*, ONNX Runtime*. Developers can design apps that benefit from the rich Python ecosystem. <br>

**PyTorch, Torchvision, ONNX, ONNX Runtime: need to be compiled from source code.* <br>

**samples/genie/python/README.md**

This sample helps developers use QAI AppBuilder + Python to build a Genie based OpenAI compatibility API service on Windows on Snapdragon (WoS) platforms.

## Setting Up Environment For Service:
### Step 1: Install basic dependencies
Refer to the following link to set up the basic dependencies: <br>

| Phi 3.5 mini * |[model files](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/phi_3_5_mini_instruct/v1/snapdragon_x_elite/models.zip)<br>[tokenizer.json](https://huggingface.co/microsoft/Phi-3.5-mini-instruct/resolve/main/tokenizer.json?download=true)|

*. For the Phi-3.5-Mini-Instruct model, to see appropriate spaces in the output, remove lines 193-196 (the Strip rule) in the tokenizer.json file.<br>
**. Refer to [setup Stable Diffusion v2.1 models](../../python/README.md) before running 'GenieAPIService.py' (our Python version of 'GenieAPIService.py' supports generating images; it depends on the Stable Diffusion v2.1 sample code).
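Once the service is up, any OpenAI-compatible client or a plain HTTP request can exercise it. The sketch below is illustrative only: the port, endpoint path and model name are assumptions rather than values documented here, so substitute whatever your service instance actually uses:
```
curl http://localhost:8910/v1/chat/completions ^
  -H "Content-Type: application/json" ^
  -d "{\"model\": \"Phi-3.5-mini\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello!\"}]}"
```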

**samples/python/README.md**

This guide helps developers set up the Python environment for using QAI AppBuilder to run the sample code on Windows on Snapdragon (WoS) platforms.

## Setting Up QAI AppBuilder Python Environment

### Step 1: Install Dependencies
Refer to [python.md](../../docs/python.md) on how to set up the x64 Python environment.

Before running a Stable Diffusion Python script, please download the Stable Diffusion models from the AI-Hub website and save them manually to the paths 'samples\python\stable_diffusion_v1_5\models' and 'samples\python\stable_diffusion_v2_1\models'.<br>
For other models, the sample Python scripts will download them automatically.

Run a sample with:
```
python <Python script for running model> <Parameter of Python script>
```
Where `<Python script for running model>` is the Python script you want to run. For example, to run `stable_diffusion_v2_1`, use the commands below:
```
cd ai-engine-direct-helper\samples
python python\stable_diffusion_v2_1\stable_diffusion_v2_1.py --prompt "spectacular view of northern lights from Alaska"
```
*. Before running the Stable Diffusion apps, please download the Stable Diffusion models from the AI-Hub website and save them to the paths samples\python\stable_diffusion_v1_5\models and samples\python\stable_diffusion_v2_1\models.<br>
There are 3 models to download for each Stable Diffusion version: TextEncoderQuantizable, UnetQuantizable and VaeDecoderQuantizable. <br>
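As a small illustration (an assumed layout, not an official requirement), the target folders can be created ahead of time and the three downloaded model files copied into them:
```
REM Run from the root of the ai-engine-direct-helper repository.
mkdir samples\python\stable_diffusion_v1_5\models
mkdir samples\python\stable_diffusion_v2_1\models
```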