Self-hosted projects using CUDA
6 self-hosted projects using this technology

Open WebUI
Extensible, offline-capable web interface for LLM interactions
Feature-rich, self-hosted AI interface that integrates Ollama and OpenAI-compatible APIs and offers RAG, vector-DB support, image tools, RBAC, and observability.
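The Ollama integration is commonly deployed via Docker Compose with GPU access passed through to the model server. A minimal sketch, assuming the `ghcr.io/open-webui/open-webui:main` and `ollama/ollama` images and Compose's GPU device-reservation syntax; ports, the environment variable name, and volume layout are illustrative, not verified against current docs:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]   # expose a CUDA GPU to the container
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Assumed variable name; points Open WebUI at the Ollama API
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama:
```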

LocalAI
OpenAI-compatible local AI inference server and API
Run LLM, image, and audio models locally behind an OpenAI-compatible API, with optional GPU acceleration and a built-in web UI for managing and testing models.
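Because the API is OpenAI-compatible, any OpenAI-style client can target it by swapping the base URL. A minimal sketch using only the standard library; the port (8080 is LocalAI's default) and the model name are assumptions:

```python
import json
import urllib.request

# Assumed default LocalAI endpoint; adjust host/port to your deployment.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a LocalAI server."""
    payload = {
        "model": model,  # hypothetical model name, set to one you have loaded
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("llama-3.2-1b", "Hello!")
# urllib.request.urlopen(req)  # uncomment against a running LocalAI instance
```

The same request shape works against Open WebUI or any other OpenAI-compatible backend, which is what makes these services interchangeable behind one client.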

Jina
Cloud-native Python framework for serving multimodal AI services
Open-source Python framework to build, scale, and deploy multimodal AI services and pipelines with gRPC/HTTP/WebSocket support and Kubernetes/Docker integration.

Willow
Open-source, privacy-focused voice assistant platform
Self-hosted voice assistant platform for ESP32 devices with on-device wake-word and command recognition, Home Assistant integration, and an optional inference server for...

Viseron
Local-only NVR with AI computer vision for IP cameras
Self-hosted NVR and computer vision platform for RTSP/IP cameras with local object detection, motion detection, and face recognition.

Scriberr
Offline AI audio and video transcription with transcript chat
Scriberr is a self-hosted, privacy-focused AI transcription app for audio and video, with speaker diarization, word-level timestamps, summaries, and transcript chat.