On-device demos for MiniCPM-V multimodal LLMs, running locally via llama.cpp. Supports the MiniCPM-V 2.6, 4.0, and 4.6 models on iOS, Android, and HarmonyOS NEXT. Built for developers who want to deploy vision-language models directly on mobile devices, with no cloud dependencies.