Powered by RWKV + Flutter + C++ High-Performance Inference Engine
Private, Offline, Everywhere. We aim to bring the power of Large Language Models (LLMs) to consumer-grade devices—from mobile phones to laptops—without relying on the cloud. By leveraging the linear complexity of the RWKV architecture, we make AI accessible even on low-compute devices.
Our ecosystem consists of three key layers, ensuring high performance across platforms:
| Repository | Role | Tech Stack |
|---|---|---|
| 📱 RWKV_APP | [Frontend] The UI/UX layer. Manages model weights, the chat interface, and platform-specific logic. | Flutter (Dart) |
| 🔗 rwkv_mobile_flutter | [Bridge] The communication bridge. Uses Dart FFI to connect the Flutter UI with the C++ backend without blocking the UI thread. | Dart FFI |
| ⚙️ rwkv-mobile | [Engine] The high-performance C++ inference runtime. Supports CPU, GPU (Vulkan/Metal/WebGPU), NPU, and Apple MLX backends. | C++ |
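To make the layering concrete, the boundary between the bridge and the engine is a plain C ABI that Dart FFI can bind to. The sketch below shows roughly what such a surface looks like; every name in it (`RwkvContext`, `rwkv_create`, `rwkv_generate`, `rwkv_destroy`) is an illustrative assumption, not the actual rwkv-mobile API:

```cpp
#include <cstring>
#include <string>

// Hypothetical opaque handle the Dart side holds as a raw Pointer.
struct RwkvContext {
    std::string model_path;  // weights live on local storage (offline)
};

extern "C" {

// Create an inference context from a locally downloaded weights file.
RwkvContext* rwkv_create(const char* model_path) {
    return new RwkvContext{model_path};
}

// Write up to `cap - 1` bytes of generated text into `out`; returns the
// length written. A real engine would run the RWKV recurrence here —
// this stub only echoes the prompt to keep the ABI shape visible.
int rwkv_generate(RwkvContext* ctx, const char* prompt,
                  char* out, int cap) {
    (void)ctx;
    std::string reply = std::string("echo: ") + prompt;  // stub output
    int n = static_cast<int>(reply.size());
    if (n >= cap) n = cap - 1;
    std::memcpy(out, reply.data(), n);
    out[n] = '\0';
    return n;
}

void rwkv_destroy(RwkvContext* ctx) { delete ctx; }

}  // extern "C"
```

Keeping the boundary C-only (no C++ types cross it) is what lets the same engine binary serve Flutter on every platform without per-platform glue.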
- Offline Inference: Download once, run forever. No internet required.
- Multi-Modal: Chat, ASR (speech-to-text), TTS (voice cloning), and Vision (OCR/image description).
- Hardware Acceleration: Optimized for Qualcomm NPU, Apple MLX, and standard GPUs.
- Cross-Platform: Android, iOS, Windows, macOS, Linux.
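On the hardware-acceleration point, an engine that targets NPU, GPU, and CPU typically probes what the device offers and falls back in order of speed. The enum and preference order below are assumptions sketched for illustration, not rwkv-mobile's actual backend-selection code:

```cpp
#include <vector>

// Illustrative backend tiers; a real engine would name concrete
// backends (e.g. QNN for Qualcomm NPU, Metal/Vulkan for GPU).
enum class Backend { NPU, GPU, CPU };

// Prefer the fastest available tier, falling back to CPU, which is
// assumed to always work.
Backend pick_backend(const std::vector<Backend>& available) {
    for (Backend pref : {Backend::NPU, Backend::GPU, Backend::CPU}) {
        for (Backend b : available) {
            if (b == pref) return b;
        }
    }
    return Backend::CPU;  // unconditional CPU fallback
}
```

The CPU fallback is what makes "runs everywhere" credible: accelerators improve latency, but no device is excluded for lacking one.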
Experience the latest RWKV-7 models on your device today.
| Platform | Link |
|---|---|
| Android | Google Play • Pgyer (Beta) |
| iOS | App Store • TestFlight |
| Desktop | Join our QQ Group for Windows/macOS builds |
Special thanks to our core contributors: Molly Sophia (Core Engine), Ce Wang (App Arch), dengzi, chenqi, and the community.
| Channel | Link / Group ID |
|---|---|
| Discord | Join Server |
| QQ Group (Tech) | 325154699 |
| QQ Group (Beta) | 332381861 |