LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Instant, controllable, local pre-trained AI models in Rust
TensorZero creates a feedback loop for optimizing LLM applications — turning production data into smarter, faster, and cheaper models.
Believes in AI democratization: llama for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; works locally on your laptop CPU. Supports llama/alpaca/gpt4all/vicuna/rwkv models.
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
llama.cpp Rust bindings
Production-Ready LLM Agent SDK for Every Developer
LLaMA 7B with CUDA acceleration, implemented in Rust. Minimal GPU memory needed!
A collection of serverless apps that show how Fermyon's Serverless AI (currently in private beta) works. Reference: https://developer.fermyon.com/spin/serverless-ai-tutorial
DuckDuckGo AI to OpenAI API
A generic persona (Ollama Discord bot), but in Rust.
Rust bindings for bitnet.cpp based on llama-cpp-4