
test: Add Torch AOTI Tests#8771

Open
whoisj wants to merge 7 commits into
mainfrom
jwyman/pt2-tests

Conversation

@whoisj
Contributor

@whoisj whoisj commented May 8, 2026

This change:

  • Creates a new L0_torch_aoti test suite.
  • Adds complex Torch AOTI model generation to qa/common/gen_qa_models.py.
  • Cleans up existing AOTI model generation in qa/common/gen_qa_models.py.
  • Enables torchvision AOTI model generation in qa/common/gen_qa_model_repository.
@whoisj whoisj requested review from mudit-eng, pskiran1 and yinggeh May 8, 2026 16:20
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py Fixed
whoisj and others added 2 commits May 8, 2026 12:32
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
This change:
- Creates a new L0_torch_aoti test suite.
- Adds complex Torch AOTI model generation to qa/common/gen_qa_models.py.
- Cleans up existing AOTI model generation in qa/common/gen_qa_models.py.
- Enables torchvision AOTI model generation in qa/common/gen_qa_model_repository.
Comment thread qa/L0_torch_aoti/.gitignore Outdated
@whoisj whoisj requested a review from yinggeh May 8, 2026 21:21
Contributor

Copilot AI left a comment


Pull request overview

Adds a new QA L0 suite to exercise PyTorch Torch AOTI models (including complex/nested I/O) and enables generation of the required QA model repository artifacts (including TorchVision AOTI ResNet50).

Changes:

  • Introduces qa/L0_torch_aoti test suite (shell harness + HTTP client unittest coverage for complex/simple/torchvision AOTI models).
  • Extends qa/common/gen_qa_models.py to generate complex Torch AOTI models/configs and updates Torch AOTI/TorchVision AOTI packaging/config generation.
  • Enables TorchVision AOTI model generation in qa/common/gen_qa_model_repository.

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 8 comments.

File Description:
  • qa/L0_torch_aoti/torch_aoti_infer_test.py: Adds HTTP-based unittest coverage for complex/named/index Torch AOTI models, simple dtype variants, and a TorchVision AOTI model.
  • qa/L0_torch_aoti/test.sh: Adds the L0 harness that stages models, starts Triton, runs the Python tests, and cleans up.
  • qa/common/gen_qa_models.py: Adds complex Torch AOTI generation/configs; refactors paths; updates TorchVision AOTI generation.
  • qa/common/gen_qa_model_repository: Enables TorchVision AOTI QA model repository generation.


Comment thread qa/L0_torch_aoti/test.sh Outdated
Comment thread qa/L0_torch_aoti/test.sh
Comment thread qa/common/gen_qa_models.py Outdated
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py Outdated
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py Outdated
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py Outdated
Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
@whoisj whoisj force-pushed the jwyman/pt2-tests branch from 0948d27 to 48a3375 Compare May 12, 2026 22:06
Comment thread qa/L0_torch_aoti/torch_aoti_infer_test.py Fixed
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Comment thread qa/common/gen_qa_models.py Outdated
@@ -1298,69 +1299,44 @@ def generate_sample_inputs(
input_shape = [abs(ips) for ips in input_shape]

if input_dtype == np.int8:
Contributor


Why not simply create a numpy to torch dtype map?

Contributor Author


What do you mean? This function generates sample inputs; it doesn't translate types. Sorry, just confused.

Contributor


Something like

np_to_torch_dtype_dict = {
    np.int8: torch.int8,
    ...
}

...

if input_dtype in np_to_torch_dtype_dict:
    input0 = torch.zeros(input_shape, dtype=np_to_torch_dtype_dict[input_dtype], device=device)
    input1 = torch.zeros(input_shape, dtype=np_to_torch_dtype_dict[input_dtype], device=device)
else:
    print(
        f"{_color_yellow}warning: dtype {input_dtype} is unsupported; falling back to torch.int32{_color_reset}"
    )
    input0 = torch.zeros(input_shape, dtype=torch.int32, device=device)
    input1 = torch.zeros(input_shape, dtype=torch.int32, device=device)
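The reviewer's sketch can be made concrete. As a minimal, runnable illustration (assuming a recent PyTorch; `np_to_torch_dtype` and `make_sample_pair` are hypothetical helper names, not functions from the PR), a zero-element numpy round trip can even derive the torch dtype without maintaining an explicit mapping table:

```python
import numpy as np
import torch


# Hypothetical helper (not in the PR): derive the torch dtype from a
# numpy dtype by round-tripping an empty array through torch.from_numpy,
# instead of hand-maintaining a numpy-to-torch dict.
def np_to_torch_dtype(np_dtype):
    return torch.from_numpy(np.empty(0, dtype=np_dtype)).dtype


def make_sample_pair(input_shape, input_dtype, device="cpu"):
    # Sample tensors only need the right shape and dtype for AOT
    # compilation, so zero-filled tensors are sufficient.
    dtype = np_to_torch_dtype(input_dtype)
    input0 = torch.zeros(input_shape, dtype=dtype, device=device)
    input1 = torch.zeros(input_shape, dtype=dtype, device=device)
    return input0, input1
```

Either approach (explicit dict or round trip) removes the long per-dtype `if`/`elif` chain the diff below shows.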

    if input_dtype == np.int8:
-       input0 = torch.randint(-128, 127, input_shape, dtype=torch.int8, device=device)
-       input1 = torch.randint(-128, 127, input_shape, dtype=torch.int8, device=device)
+       input0 = torch.zeros(input_shape, dtype=torch.int8, device=device)
Contributor


What's the reasoning behind this change? How do you tell backend is working correctly for a zero-value output tensor?

Contributor Author


These are just sample values used by PyTorch to compile the model; they aren't used for inference.

Comment thread qa/common/gen_qa_models.py
print(f"Created {label_path}")


def create_torch_aoti_complex_modelconfig(
Contributor


Suggested change:
- def create_torch_aoti_complex_modelconfig(
+ def create_torch_aoti_complex_model_config(

Contributor Author


All other similar functions are named modelconfig, not model_config; I'm just following the existing pattern.

Contributor


I see. "Model config" should really be two words.

Contributor Author


Agreed. I can change them all; what do you think?

Contributor


For now I am fine with only changing this new method.

Comment thread qa/L0_torch_aoti/test.sh
@whoisj whoisj requested a review from yinggeh May 14, 2026 17:54

4 participants