GitStar

vLLM

Organization

@vllm-project • Open-source projects from vllm-project. Use this page to separate flagship concentration from portfolio breadth before treating a publisher as broadly strong.

Portfolio concentration

53%

Top three share

Shows whether the organization is driven by one breakout repo or several visible projects.

Breadth

30 repos

Visible snapshot

24 repositories updated in the last 90 days.

Leading language

Python

Portfolio mix

Python (17), Unknown (5), Go (2)

Average size

830

Stars per repository

Useful for distinguishing one flagship-heavy publisher from a repeatable portfolio.

Updated: 2026-04-28 (1d ago) • GitHub API fallback • 30 repositories

Portfolio Shape

53%

of the visible star count comes from this organization's top three repositories.

Average Repository Size

830

stars per repository in this same snapshot.

Current Mix

Python

is the most common language here, with 24 repositories updated in the last 90 days.

Why this rank

This organization stands out because its public portfolio is relatively balanced across 30 repositories.

Balanced portfolio across 30 repos • Top 3 share 53%

Organization pages work best when you separate portfolio breadth from flagship concentration. In vLLM's case, the visible top three repositories account for about 53% of total stars in this snapshot, which helps explain whether the organization is known for one breakout project or for a broader repeatable portfolio.
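The concentration figure can be reproduced from the card values shown above. This is a rough check rather than an exact one, since both the per-repository star counts and the 830-star average are rounded:

```python
# Rough reconstruction of the snapshot's headline metrics.
# Star counts are the rounded values shown on this page, so the
# result is approximate rather than exact.
top3 = [4_800, 4_500, 3_900]  # aibrix, vllm-omni, semantic-router
avg_stars = 830               # "Average size" card
repo_count = 30               # visible repositories

total_stars = avg_stars * repo_count               # about 24,900 visible stars
top3_share = round(100 * sum(top3) / total_stars)  # top-three concentration
print(f"{top3_share}% of visible stars sit in the top three repos")  # prints 53%
```

The same arithmetic explains why a single breakout repo pushes this number toward 100% while a repeatable portfolio pulls it down.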

The dominant language mix here is Python (17), Unknown (5), Go (2). That makes this page useful not just for popularity checks, but also for seeing what technical shape an organization's public ecosystem actually has.

Source: GitHub API fallback. This is the same cache-first snapshot used by the organization ranking list, so the summary view and the detail view should stay aligned.
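A minimal sketch of what an API-fallback path like this might look like, using GitHub's public REST endpoint for listing an organization's repositories. The cache-first layer and pagination beyond 100 repositories are omitted, and the metric names are illustrative assumptions, not GitStar's actual code:

```python
import json
from urllib.request import urlopen

# Public GitHub REST endpoint; unauthenticated calls are rate-limited,
# which is one reason a cache-first snapshot is preferable in practice.
GITHUB_ORG_REPOS = "https://api.github.com/orgs/{org}/repos?per_page=100"

def fetch_org_repos(org: str) -> list[dict]:
    """Fetch one page of public repositories for an organization."""
    with urlopen(GITHUB_ORG_REPOS.format(org=org)) as resp:
        return json.load(resp)

def summarize(repos: list[dict]) -> dict:
    """Compute the portfolio metrics shown on this page from raw repo records."""
    stars = sorted((r["stargazers_count"] for r in repos), reverse=True)
    total = sum(stars)
    return {
        "repo_count": len(stars),
        "avg_stars": round(total / len(stars)),
        "top3_share_pct": round(100 * sum(stars[:3]) / total),
    }
```

For example, `summarize(fetch_org_repos("vllm-project"))` would yield a fresh version of the repo count, average size, and top-three share, which a cache layer could then persist so the list view and this detail view read the same snapshot.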

Top Repositories

1. vllm-project/aibrix (Go, 4.8K ⭐): Cost-efficient and pluggable Infrastructure components for GenAI inference
2. vllm-project/vllm-omni (Python, 4.5K ⭐): A framework for efficient model inference with omni-modality models
3. vllm-project/semantic-router (Go, 3.9K ⭐): System Level Intelligent Router for Mixture-of-Models at Cloud, Data Center and Edge
4. vllm-project/llm-compressor (Python, 3.2K ⭐): Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM
5. vllm-project/production-stack (Python, 2.3K ⭐): vLLM’s reference system for K8S-native cluster-wide deployment with community-driven performance optimization
6. vllm-project/vllm-ascend (Python, 2K ⭐): Community maintained hardware plugin for vLLM on Ascend
7. vllm-project/guidellm (Python, 1.1K ⭐): Evaluate and Enhance Your LLM Deployments for Real-World Inference Needs
8. vllm-project/vllm-metal (Python, 1K ⭐): Community maintained hardware plugin for vLLM on Apple Silicon
9. vllm-project/recipes (JavaScript, 762 ⭐): Common recipes to run vLLM
10. vllm-project/speculators (Python, 381 ⭐): A unified library for building, evaluating, and storing speculative decoding algorithms for LLM inference in vLLM
11. vllm-project/tpu-inference (Python, 306 ⭐): TPU inference for vLLM, with unified JAX and PyTorch support
12. vllm-project/router (Rust, 210 ⭐): A high-performance and light-weight router for vLLM large scale deployment
13. vllm-project/vllm-skills (Shell, 67 ⭐): Agent skills for vLLM
14. vllm-project/vllm-daily (50 ⭐): vLLM Daily Summarization of Merged PRs
15. vllm-project/vllm-openvino (Python, 48 ⭐)
16. vllm-project/vllm-xpu-kernels (C++, 38 ⭐): The vLLM XPU kernels for Intel GPU
17. vllm-project/vllm-gaudi (Python, 38 ⭐): Community maintained hardware plugin for vLLM on Intel Gaudi
18. vllm-project/vllm-neuron (Python, 29 ⭐): Community maintained hardware plugin for vLLM on AWS Neuron
19. vllm-project/agentic-api (Python, 24 ⭐): Stateful API logic for agentic applications using vLLM
20. vllm-project/vllm-nccl (Python, 18 ⭐): Manages vllm-nccl dependency
21. vllm-project/dllm-plugin (Python, 13 ⭐): vLLM plugin for block-based diffusion language model (dLLM) support
22. vllm-project/vLLM-in-PyTorch-Conference-2025 (12 ⭐)
23. vllm-project/FlashMLA (C++, 12 ⭐)
24. vllm-project/bart-plugin (Python, 11 ⭐): vLLM Model plugin for the encoder-decoder BART model
25. vllm-project/media-kit (8 ⭐): vLLM Logo Assets
26. vllm-project/perf-dashboard (Python, 1 ⭐): Performance dashboard for vLLM
27. vllm-project/rfcs (1 ⭐)
28. vllm-project/vllm-dashboard (TypeScript, 0 ⭐)
29. vllm-project/perf-eval (Python, 0 ⭐)
30. vllm-project/DeepGEMM (0 ⭐): DeepGEMM: clean and efficient FP8 GEMM kernels with fine-grained scaling

Next step after the organization read

Open a flagship repository, compare a couple of portfolio leaders, or return to the organization map for a broader view of concentration across publishers.

Learn and methodology

Trust-building context stays reachable here, but behind the first data read rather than ahead of it.

How to Read This Snapshot

Total stars are useful as a discovery signal, but they do not tell you whether a team maintains every repository equally. Pair this page with release cadence, maintainer activity, and the flagship concentration shown above before making adoption decisions.

For broader background on GitStar's ranking logic and editorial guidance, see Methodology & Editorial Standards.