`react-native-transformers` is a React Native library for running Large Language Models (LLMs) from Hugging Face locally in your mobile applications. It supports both iOS and Android, allowing you to leverage advanced AI models directly on your device without requiring an internet connection.
**Run Hugging Face transformer models directly on your React Native and Expo applications with on-device inference. No cloud service required!**
## Overview
`react-native-transformers` empowers your mobile applications with AI capabilities by running transformer models directly on the device. This means your app can generate text, answer questions, and process language without sending data to external servers, which enhances privacy, reduces latency, and enables offline functionality.
Built on top of ONNX Runtime, this library provides a streamlined API for integrating state-of-the-art language models into your React Native and Expo applications with minimal configuration.
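
To give a sense of what "minimal configuration" means in practice, here is a rough usage sketch. The `Pipeline.TextGeneration` name, the `init`/`generate` signatures, and the ONNX file path are illustrative assumptions, not confirmed API; check the library's documentation for the exact method names.

```tsx
import React from "react";
import { Text, View } from "react-native";
// Assumed import; verify the actual export names in the library docs.
import { Pipeline } from "react-native-transformers";

export default function App() {
  const [output, setOutput] = React.useState("");

  React.useEffect(() => {
    (async () => {
      // Assumed API: download an ONNX model from Hugging Face and initialize it on-device.
      await Pipeline.TextGeneration.init(
        "Felladrin/onnx-Llama-160M-Chat-v1",
        "onnx/decoder_model_merged.onnx",
      );
      // Assumed API: stream generated text into component state via a callback.
      Pipeline.TextGeneration.generate("Hello, ", (text) => setOutput(text));
    })();
  }, []);

  return (
    <View>
      <Text>{output}</Text>
    </View>
  );
}
```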
## Key Features

- **On-device inference**: Run AI models locally without requiring an internet connection
- **Text generation and embeddings**: Supports both text-generation and text-embedding pipelines
- **Privacy-focused**: Keep user data on the device without sending it to external servers
- **Optimized performance**: Leverages ONNX Runtime for efficient model execution on mobile CPUs
- **Hugging Face ONNX support**: Works with Hugging Face models in ONNX format
- **Simple API**: Easy-to-use interface for model loading and inference
- **TypeScript support**: Full type definitions included
- **Expo compatibility**: Works seamlessly with both Expo managed and bare workflows
## Installation
To use `react-native-transformers`, you need to install `onnxruntime-react-native` as a peer dependency. Follow the steps below:
### 1. Install peer dependencies
```sh
npm install onnxruntime-react-native
```
### 2. Install react-native-transformers
```sh
# React Native
npm install react-native-transformers

# Expo
npx expo install react-native-transformers
```
### 3. Platform Configuration
<details>
<summary><b>React Native CLI</b></summary>

Link the `onnxruntime-react-native` library:

```sh
npx react-native link onnxruntime-react-native
```
</details>
<details>
<summary><b>Expo</b></summary>

Add the Expo plugin configuration in `app.json` or `app.config.js`:

```json
{
  "expo": {
    "plugins": [
      "onnxruntime-react-native"
    ]
  }
}
```
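
If you use `app.config.js` instead of `app.json`, the equivalent configuration follows standard Expo config conventions. A minimal sketch:

```js
// app.config.js: equivalent of the app.json example above
module.exports = {
  expo: {
    plugins: ["onnxruntime-react-native"],
  },
};
```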
</details>
### 4. Babel Configuration
Add the `babel-plugin-transform-import-meta` plugin to your Babel configuration (e.g., `.babelrc` or `babel.config.js`):

```json
{
  "plugins": ["babel-plugin-transform-import-meta"]
}
```
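
If your project uses `babel.config.js` instead, the equivalent entry is a one-line addition to the `plugins` array. A minimal sketch, assuming an Expo project (the preset shown is an assumption; bare React Native projects use their own preset):

```js
// babel.config.js: illustrative sketch; adjust the preset to match your project
module.exports = {
  presets: ["babel-preset-expo"], // assumption: Expo project
  plugins: ["babel-plugin-transform-import-meta"],
};
```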
## Recommended Models

`react-native-transformers` works with ONNX-formatted models from Hugging Face. Here are some recommended models based on size and performance:

| Model | Type | Size | Description |
|-------|------|------|-------------|
| [Felladrin/onnx-Llama-160M-Chat-v1](https://huggingface.co/Felladrin/onnx-Llama-160M-Chat-v1) | Text Generation | ~300MB | Small Llama model (160M parameters) |
| [microsoft/Phi-3-mini-4k-instruct-onnx-web](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-onnx-web) | Text Generation | ~1.5GB | Microsoft's Phi-3-mini model |
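
To make the table concrete, loading the smaller model above might look like the sketch below. As in the earlier example, the `Pipeline.TextGeneration.init` call and the in-repo ONNX file path are illustrative assumptions; check the model card and the library docs for the actual names:

```tsx
import { Pipeline } from "react-native-transformers"; // assumed export

// Assumed API: downloads the ~300MB model and prepares it for on-device inference.
// The ONNX path inside the repository is illustrative; see the model card for the real file name.
async function loadSmallChatModel(): Promise<void> {
  await Pipeline.TextGeneration.init(
    "Felladrin/onnx-Llama-160M-Chat-v1",
    "onnx/decoder_model_merged.onnx",
  );
}
```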