Raspberry Pi has launched the AI HAT+ 2, bringing generative AI capabilities to the Pi 5 via a Hailo-10H accelerator that delivers 40 TOPS of INT4 performance. The interesting addition in this updated model is the 8GB of dedicated on-board RAM, which lets you run large language models and vision-language models locally without eating into your Pi 5’s system memory. I imagine that RAM accounts for a big chunk of the price tag, though perhaps the chips were sourced early enough to dodge current memory prices!

For those keeping track, this is the third iteration of AI HATs from Raspberry Pi. The original AI Kit came with the Hailo-8L (13 TOPS), then we got the AI HAT+ in both 13 TOPS and 26 TOPS variants, and now this. The computer vision performance on the AI HAT+ 2 is comparable to the 26 TOPS variant thanks to that on-board RAM, though the focus here is clearly on running generative AI models.
What Can It Run?
At launch, you’ll be able to install models like DeepSeek-R1-Distill (1.5B parameters), Llama 3.2 (1B), and a few Qwen variants (1.5B). These are tiny compared to cloud-based LLMs like ChatGPT, which are estimated to run anywhere from 500 billion to 2 trillion parameters, but that’s the tradeoff for running everything locally. Raspberry Pi’s demo videos show the Qwen2 model handling basic questions, Qwen2.5-Coder tackling coding tasks, and even a VLM describing scenes from a camera stream.
The smaller parameter counts mean these models aren’t trying to match the knowledge of larger cloud models. Instead, they’re designed to work well within narrower, domain-specific datasets, which you can customize through LoRA-based fine-tuning using Hailo’s Dataflow Compiler.
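For context on why LoRA suits small on-device models: it keeps the pretrained weights frozen and learns only a small low-rank update on top of them, so far fewer parameters need training. Here’s a minimal NumPy sketch of the idea (purely illustrative — this is not Hailo’s actual toolchain, and the layer sizes are made up):

```python
import numpy as np

# Hypothetical layer sizes for illustration; real model layers are much larger.
d_out, d_in, rank = 64, 64, 4

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))  # frozen pretrained weight

# LoRA trains two small low-rank factors instead of updating W directly.
A = rng.standard_normal((rank, d_in)) * 0.01  # "down" projection, trainable
B = np.zeros((d_out, rank))                   # "up" projection, starts at zero
alpha = 8.0                                   # scaling hyperparameter

# Effective weight at inference time: W' = W + (alpha / rank) * B @ A
W_adapted = W + (alpha / rank) * (B @ A)

# Trainable parameter count drops from d_out*d_in to rank*(d_in + d_out).
full_params = d_out * d_in
lora_params = rank * (d_in + d_out)
print(f"full fine-tune: {full_params} params, LoRA: {lora_params} params")
```

Because B starts at zero, the adapted weight equals the original until training nudges the factors, and only the two small matrices ever get gradient updates — which is what makes fine-tuning feasible on constrained hardware.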

Pricing and Availability
The AI HAT+ 2 is available now at $130 (plus taxes/shipping depending on your reseller — for example, it’s £124.80 at The Pi Hut). It connects via the Pi 5’s PCIe interface and integrates with the camera software stack, so if you’re already using rpicam-apps, things should work without additional configuration.
The board ships with a 16mm stacking header, spacers, screws, and a heatsink (which you’ll want to install if you’re running intensive workloads to avoid thermal throttling).
Will I be testing one? Probably not. I’ve said it before with the previous AI HATs and I’ll say it again: I don’t have much personal interest in this side of things, so I’ll leave the tinkering to those who actually have use cases for it. If you do grab one though, I’d be curious to hear what you end up using it for. I’ll stick to my Claude subscription. For now.
You can find your local reseller on the Raspberry Pi AI HAT+ 2 product page, and Hailo’s GitHub repo has examples and demos for getting started with vision and generative AI applications.