Does anyone have a recommendation for a #LlamaCPP alternative to run recent vision-language models on Apple Silicon? Llama.cpp doesn't support any of the recent #VLM architectures such as Qwen2-VL, Phi-3.5-vision, Idefics3, InternVL2, Yi-VL, Chameleon, CogVLM2, GLM-4v, etc.
MiniCPM-V 2.6 is the only recent model that has been added. Maybe it's time to move on. :( #LLM #multimodal #AppleSilicon #MacOS #ML #AI
in reply to Chi Kim

@ZBennoui I am reading that Core ML allows porting models to run on Apple Silicon. I have yet to find the time to go through the videos to better understand how it's done. Any idea? Here is a starter article: apple.github.io/coremltools/do…
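For context, the porting workflow the article describes goes roughly: trace a PyTorch model, then hand it to coremltools. A minimal sketch, assuming coremltools and torch are installed; the tiny model here is illustrative, not one of the VLMs from the thread:

```python
# Hedged sketch: converting a traced PyTorch model to Core ML with coremltools.
# TinyNet is a stand-in model; real VLMs often hit unsupported-op errors here,
# which is the limitation discussed in the replies below.
import torch
import coremltools as ct


class TinyNet(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x)


model = TinyNet().eval()
example = torch.rand(1, 3, 224, 224)

# Core ML conversion needs a traced (or scripted) model, not eager PyTorch.
traced = torch.jit.trace(model, example)

mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="input", shape=example.shape)],
    convert_to="mlprogram",  # ML Program backend, preferred on recent macOS
)
mlmodel.save("TinyNet.mlpackage")  # load later via Core ML on-device
```

Each unsupported layer or dynamic-shape trick in a new architecture has to be handled by the converter, which is why it lags behind fast-moving VLM releases.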
in reply to victor tsaran

@vick21 @ZBennoui Unfortunately, the Core ML converter was pretty limited when I tried it last time. I doubt it can catch up with all the new architectures, especially in this fast-paced space.
in reply to victor tsaran

@vick21 @ZBennoui Having said that, someone told me on Reddit that MLX-VL supports some newer VL models on Apple Silicon, like Idefics2, DeepSeek-VL, and Phi-3.5-vision. I haven't tried it yet.
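If "MLX-VL" refers to the mlx-vlm package (an assumption on my part), usage looks roughly like the sketch below. The exact `load`/`generate` signatures and the model repo name are assumptions that may differ between mlx-vlm versions:

```python
# Hedged sketch of running a vision-language model via mlx-vlm on Apple Silicon.
# Model repo name, image path, and exact API are illustrative assumptions.
from mlx_vlm import load, generate

# Download and load a quantized VLM from the Hugging Face Hub.
model, processor = load("mlx-community/Phi-3.5-vision-instruct-4bit")

# Ask the model to describe a local image.
output = generate(
    model,
    processor,
    prompt="Describe this image.",
    image="photo.jpg",
)
print(output)
```

Since MLX implements model architectures natively rather than converting them, new VLMs can land there without waiting on a converter to support every op.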