
Conversation

@angelayi angelayi commented Jan 28, 2026

This PR adds a dedicated vllm-compile tlparse viewer, which looks like this (internal example):

[screenshot of the vllm-compile tlparse viewer]

When we encounter vLLM compilation artifacts (prefixed with vllm_), we use the vLLM parsers located under src/vllm, and we also display a different index.html.
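As a rough illustration of that routing, here is a minimal sketch; the helper below is invented for this example and is not the actual tlparse code:

```rust
/// Sketch only: vLLM compilation artifacts are recognized by their filename
/// prefix and handed to the parsers under src/vllm; everything else goes
/// through the default tlparse parsers.
fn is_vllm_artifact(filename: &str) -> bool {
    filename.starts_with("vllm_")
}

#[test]
fn routes_vllm_artifacts() {
    assert!(is_vllm_artifact("vllm_piecewise_compile_start"));
    assert!(!is_vllm_artifact("output_code"));
}
```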

In vLLM, there are multiple layers of compilation: first based on dynamic shapes (a shape range, and then specialized sizes), and then, if piecewise cudagraphs are used, each subgraph of the top-level model is compiled separately. So we want a hierarchical view like:

Inductor Compilation                                         
  ├── compile range [1, 16384]                                         
  │   ├── submod_0                                                                   
  │   │   └── inductor artifacts ...                     
  │   └── submod_2                                                              
  │       └── inductor artifacts ...                     
  └── size 8                                                   
      ├── submod_0                                             
      │   └── inductor artifacts ...                     
      └── submod_2  
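
To make the hierarchy concrete, here is one way it could be modeled as data. The type names below (CompileKey, CompileTree, etc.) are invented for illustration and are not the types this PR adds:

```rust
use std::collections::BTreeMap;

/// Either a dynamic-shape compile range or a specialized size.
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord)]
enum CompileKey {
    Range { min: u64, max: u64 }, // e.g. compile range [1, 16384]
    Size(u64),                    // e.g. size 8
}

/// submod name -> inductor artifact filenames collected for that subgraph.
type SubgraphArtifacts = BTreeMap<String, Vec<String>>;

/// Top level of the tree: one entry per compile range/size.
type CompileTree = BTreeMap<CompileKey, SubgraphArtifacts>;
```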

To do this, we create a VllmState that is passed around and tracks artifacts as we parse. In the vllm logs added in vllm-project/vllm#33213, vllm emits a vllm_piecewise_compile_start log at the start of each subgraph compilation, containing the submod name and compile range/size. As we parse subsequent inductor artifacts (post_grad_graph, output_code, etc.), we attach them to the current subgraph in VllmState. When we encounter the next vllm_piecewise_compile_start, we know the previous subgraph is complete and start collecting artifacts for the new one.
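A minimal sketch of that bookkeeping, assuming placeholder field and method names (the actual VllmState in this PR may look different):

```rust
#[derive(Default)]
struct VllmState {
    /// (compile range/size label, submod name) of the subgraph currently
    /// being compiled, set by the most recent vllm_piecewise_compile_start.
    current: Option<(String, String)>,
    /// Artifacts seen since the last vllm_piecewise_compile_start.
    pending: Vec<String>,
    /// Finished subgraphs: (compile label, submod) -> artifact filenames.
    collected: Vec<((String, String), Vec<String>)>,
}

impl VllmState {
    /// Called when a vllm_piecewise_compile_start log is parsed: close out
    /// the previous subgraph and start collecting for the new one.
    fn start_subgraph(&mut self, compile_label: String, submod: String) {
        if let Some(prev) = self.current.take() {
            self.collected.push((prev, std::mem::take(&mut self.pending)));
        }
        self.current = Some((compile_label, submod));
    }

    /// Called for subsequent inductor artifacts (post_grad_graph,
    /// output_code, ...): attach them to the currently open subgraph.
    fn add_artifact(&mut self, filename: &str) {
        if self.current.is_some() {
            self.pending.push(filename.to_string());
        }
    }
}
```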

@angelayi angelayi requested review from yushangdi and zou3519 January 28, 2026 06:30
@meta-cla meta-cla bot added the cla signed label Jan 28, 2026
@angelayi angelayi force-pushed the angelayi/tlparse_vllm branch from 8f44fdd to 0018307 on January 28, 2026 08:45
src/lib.rs Outdated
let filename_str = filename.to_string_lossy().to_string();

// Track artifact for vLLM summary
let artifact_name = filename
Contributor

Maybe I'm missing something: why will this code only run on vLLM artifacts? Do we need to check for the vllm file prefix and only add vLLM artifacts to vllm_state?

@angelayi angelayi Jan 29, 2026


Updated! I removed most of that logic since it was unnecessary, and moved the rest into the vllm_state.add_artifact function.

@angelayi angelayi force-pushed the angelayi/tlparse_vllm branch from 0018307 to 2ae9ed8 on January 29, 2026 08:01

// If vLLM artifacts are present, use vLLM summary as index.html and save
// traditional tlparse index as tlparse_index.html for reference
if vllm_state.has_artifacts() {
Contributor


Since this overrides the normal HTML, can we add a more detailed description of when vllm_state.has_artifacts() would be true? E.g., it's true when there exists at least one file that starts with vllm.

Contributor Author


Updated the comment!
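
For reference, a hedged sketch of what that documented check could look like, reusing the illustrative VllmState fields from the sketch above (not the actual code in the PR):

```rust
impl VllmState {
    /// True once at least one vLLM compilation artifact has been recorded
    /// (per the review discussion: at least one parsed file starting with
    /// vllm_); only then does the vLLM summary replace index.html.
    fn has_artifacts(&self) -> bool {
        self.current.is_some() || !self.collected.is_empty()
    }
}
```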

@angelayi angelayi force-pushed the angelayi/tlparse_vllm branch 2 times, most recently from e540338 to 5e8cc04 on January 29, 2026 22:14
@angelayi angelayi force-pushed the angelayi/tlparse_vllm branch from 5e8cc04 to a64185d on January 29, 2026 23:34
@angelayi angelayi merged commit b99d509 into main Jan 30, 2026
15 checks passed