ncnn

A high-performance neural network inference framework

ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. It has been designed with mobile deployment in mind from the start, has no third-party dependencies, is cross-platform, and runs faster than all known open-source frameworks on mobile phone CPUs. With ncnn, developers can easily deploy deep learning models to mobile platforms and create intelligent apps, bringing artificial intelligence to your fingertips. ncnn is currently used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu.
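As a sketch of what deploying a model with ncnn looks like, a minimal inference pass might be written as follows. The file names `model.param`/`model.bin` and the blob names `data` and `prob` are placeholders: they depend on the model you converted, so adjust them to your own network.

```cpp
#include <cstdio>
#include "net.h"  // ncnn main header

int main()
{
    ncnn::Net net;
    // "model.param" / "model.bin" are placeholder names for a converted model.
    if (net.load_param("model.param") || net.load_model("model.bin"))
    {
        fprintf(stderr, "failed to load model files\n");
        return 1;
    }

    // A dummy 227x227 BGR input filled with a constant value;
    // real code would build the input with ncnn::Mat::from_pixels().
    ncnn::Mat in(227, 227, 3);
    in.fill(0.5f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);    // input blob name depends on the model

    ncnn::Mat out;
    ex.extract("prob", out); // output blob name depends on the model

    printf("output width: %d\n", out.w);
    return 0;
}
```

Link against the ncnn library (e.g. via `find_package(ncnn)` in CMake) to build this.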

There is no official package for openSUSE Leap 16.0

Distributions

openSUSE Tumbleweed

X11:Deepin (Experimental) 20250916
science (Experimental) 20260113

openSUSE Slowroll

openSUSE Leap 16.0

science (Experimental) 20250916

openSUSE Leap 15.6

science (Experimental) 20250503

openSUSE Factory RISCV

SLFO 1.2

science (Experimental) 20250916

openSUSE Backports for SLE 15 SP7

science (Experimental) 20250916
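The experimental packages above can be installed by adding the corresponding OBS repository with zypper. The repository URL below follows the standard openSUSE Build Service pattern and is an assumption for the Tumbleweed "science" repo; replace the distribution path (e.g. `openSUSE_Tumbleweed`) to match your system, and note that experimental repositories are not officially supported.

```shell
# Add the experimental science repository (assumed URL pattern), refresh, install.
zypper addrepo https://download.opensuse.org/repositories/science/openSUSE_Tumbleweed/science.repo
zypper refresh
zypper install ncnn
```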