ncnn

A high-performance neural network inference framework

ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. It was designed with deployment and use on mobile phones in mind from the start, has no third-party dependencies, is cross-platform, and runs faster than all known open-source frameworks on mobile-phone CPUs. Using ncnn's efficient implementation, developers can easily deploy deep-learning models to mobile platforms, build intelligent apps, and bring artificial intelligence to your fingertips. ncnn is currently used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu.

There is no official package available for openSUSE Leap 16.0

Distributions

openSUSE Tumbleweed

X11:Deepin Experimental
20250916
science Experimental
20260113

openSUSE Slowroll

openSUSE Leap 16.0

science Experimental
20250916

openSUSE Leap 15.6

science Experimental
20250503

openSUSE Factory RISCV

SLFO 1.2

openSUSE Backports for SLE 15 SP7

science Experimental
20250916
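The experimental builds listed above come from devel projects on the Open Build Service rather than the official distribution. As a sketch, the repository file for such a project typically follows the standard OBS URL pattern shown below (the exact project and distribution names here are assumptions; verify them against the listing for your system):

```shell
# Build the OBS repo-file URL for a given project/distribution pair.
# "science" and "openSUSE_Tumbleweed" are example values taken from the
# listing above; substitute your own project and distribution.
project="science"
distro="openSUSE_Tumbleweed"
echo "https://download.opensuse.org/repositories/${project}/${distro}/${project}.repo"
```

The resulting `.repo` file can then be added with `zypper addrepo <url>` followed by `zypper refresh` and `zypper install ncnn`. Experimental repositories are not officially supported, so review the packages before installing.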