Data Prep Kit accelerates unstructured data preparation for LLM app developers. Developers can use Data Prep Kit to cleanse, transform, and enrich use case-specific unstructured data to pre-train LLMs, fine-tune LLMs, instruct-tune LLMs, or build RAG applications.
Framework for serving and evaluating LLM routers - save LLM costs without compromising quality. Drop-in replacement for OpenAI’s client to route simpler queries to cheaper models.
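The core idea of an LLM router can be sketched in a few lines: classify each query and send the easy ones to a cheap model. The model names and the keyword heuristic below are illustrative assumptions, not the library's actual routing logic (real routers typically use a trained classifier).

```python
# Hypothetical LLM-router sketch: cheap model for simple queries,
# strong model for everything else. Names and heuristic are assumptions.
CHEAP_MODEL = "gpt-4o-mini"   # assumed cheap model name
STRONG_MODEL = "gpt-4o"       # assumed strong model name

def route(query: str, max_cheap_words: int = 20) -> str:
    """Pick a model via a crude complexity heuristic (for illustration only)."""
    words = {w.lower() for w in query.split()}
    reasoning_keywords = {"prove", "derive", "analyze", "compare", "explain"}
    if len(query.split()) <= max_cheap_words and not words & reasoning_keywords:
        return CHEAP_MODEL
    return STRONG_MODEL

print(route("What is the capital of France?"))
print(route("Derive the gradient of the softmax cross-entropy loss."))
```

A production router would replace the keyword check with a learned scorer, but the serving interface — query in, model name out — stays the same.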
A Library for Creating Semantic Cache for LLM Queries. Slash Your LLM API Costs by 10x 💰, Boost Speed by 100x. Fully integrated with LangChain and LlamaIndex.
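A semantic cache works by embedding each query and returning a stored answer when a new query is similar enough to a previous one. The sketch below uses a toy bag-of-words "embedding" and cosine similarity purely for illustration; real semantic caches use neural embedding models and vector indexes.

```python
# Minimal semantic-cache sketch (assumptions: toy bag-of-words embedding,
# linear scan; real caches use neural embeddings and a vector store).
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': token counts of the lowercased query."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.entries = []  # (embedding, answer) pairs

    def put(self, query: str, answer: str) -> None:
        self.entries.append((embed(query), answer))

    def get(self, query: str):
        q = embed(query)
        for emb, answer in self.entries:
            if cosine(q, emb) >= self.threshold:
                return answer  # cache hit: similar query seen before
        return None  # cache miss: call the LLM and put() the result

cache = SemanticCache(threshold=0.7)
cache.put("what is the capital of france", "Paris")
print(cache.get("what is the capital of france ?"))  # near-duplicate hits
```

The cost savings come from the miss path: only unmatched queries reach the paid LLM API, and their answers are cached for future near-duplicates.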
fastRAG is a research framework for efficient and optimized retrieval-augmented generative pipelines, incorporating state-of-the-art LLMs and Information Retrieval.
A web scraping Python library that uses LLMs and direct graph logic to create scraping pipelines for websites and local documents (XML, HTML, JSON, Markdown, etc.).
ChatArena is a library that provides multi-agent language game environments and facilitates research about autonomous LLM agents and their social interactions.
Open source LLM-Observability Platform for Developers. One-line integration for monitoring, metrics, evals, agent tracing, prompt management, playground, etc.
Python library for working with structured outputs from large language models (LLMs). Built on top of Pydantic, it provides a simple, transparent, and user-friendly API.
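The library itself builds on Pydantic; the stdlib-only sketch below just illustrates the underlying idea of validating an LLM's JSON output against a typed schema. The `User` schema and `parse_structured` helper are hypothetical names for illustration.

```python
# Sketch of structured-output validation (stdlib only; the real library
# uses Pydantic models). Schema and helper names are illustrative.
import json
from dataclasses import dataclass, fields

@dataclass
class User:
    name: str
    age: int

def parse_structured(raw: str, cls):
    """Validate a JSON string (e.g. an LLM response) against a dataclass schema."""
    data = json.loads(raw)
    for f in fields(cls):
        if f.name not in data:
            raise ValueError(f"missing field: {f.name}")
        if not isinstance(data[f.name], f.type):
            raise TypeError(f"field {f.name} is not {f.type.__name__}")
    return cls(**{f.name: data[f.name] for f in fields(cls)})

llm_response = '{"name": "Ada", "age": 36}'  # pretend this came from an LLM
user = parse_structured(llm_response, User)
print(user)
```

The payoff is that downstream code works with typed objects instead of raw strings, and malformed model output fails loudly at the parsing boundary rather than deep in application logic.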
A blazing fast inference solution for text embeddings models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5.
A modular and extensible Python framework, designed to aid in the creation of high-quality, unbiased datasets for building robust models for machine-generated text (MGT) tasks such as detection, attribution, and boundary detection.
This package integrates Large Language Models (LLMs) into spaCy, featuring a modular system for fast prototyping and prompting, and turning unstructured responses into robust outputs for various NLP tasks.
👨🏻💻 LLM Engineer Toolkit
This repository contains a curated list of 120+ LLM libraries, organized by category.
🚀 LLM Interview Questions and Answers Book
Crack modern LLM and Generative AI interviews with this comprehensive, interview-focused guide designed specifically for ML Engineers, AI Engineers, Data Scientists and Software Engineers.
This book features 100+ carefully curated LLM interview questions, each paired with clear answers and in-depth explanations so you truly understand the concepts interviewers care about. Get the book here.
Use the coupon code LLMQA25 for an exclusive 50% discount on the book (available for a limited time).
Related Repositories
Stay updated with Generative AI, LLMs, Agents and RAG.
Join the 🚀 AIxFunda free newsletter for the latest updates and tutorials on Generative AI, LLMs, Agents and RAG.
Quick links
LLM Training and Fine-Tuning
LLM Application Development
Frameworks
Data Preparation
Multi API Access
Routers
Memory
Interface
Low Code
Cache
LLM RAG
LLM Inference
LLM Serving
LLM Data Extraction
LLM Data Generation
LLM Agents
LLM Evaluation
LLM Monitoring
LLM Prompts
LLM Structured Outputs
LLM Safety and Security
LLM Embedding Models
Others
⭐️ Star History
If you find this repository useful, please consider giving it a star.