Conversation

@metascroy (Contributor)

Here we run experiments to understand why iOS 26 produces bad numerics (see #15833).

Export with:

 python export.py -n model2.pte -p $HOME/models/llama1b/params.json -c $HOME/models/llama1b/llama1b.pth --seq_length 16 --max_seq_length 512 --dtype fp16 --coreml-quantize custom &> log.txt

Run with:

python run.py -m model2.pte -t $HOME/models/llama1b/tokenizer.model --prompt "Once upon a time,"

What we've found:

  • Quantizing the embeddings seems to lead to bad results on iOS 26.
  • Using CoreMLRMSNorm instead of RMSNorm produces better numerics in FP16 on iOS 26.
  • Quantizing q, k, v, o, and lm_head on iOS 26 seems OK.
  • Quantizing w1, w2, or w3 on iOS 26 leads to bad numerics. It seems to be related to the 8096 matrix size. If we use in/out feature splitting on those linear layers, we can recover good numerics (a minimal sketch of the idea follows this list).
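
To make the last point concrete, here is a minimal PyTorch sketch of out-feature splitting for a large nn.Linear. This is not the splitting pass used by export.py; the `SplitLinear` class and `max_out_features` parameter are hypothetical names chosen for illustration, under the assumption that the goal is simply to keep each narrower matmul below the problematic width.

```python
# Minimal sketch (not export.py's implementation) of splitting a wide nn.Linear
# along its output features. `SplitLinear` / `max_out_features` are illustrative
# names, not APIs from this repo.
import torch
import torch.nn as nn


class SplitLinear(nn.Module):
    """Replaces Linear(in_f, out_f) with several narrower linears whose outputs
    are concatenated; in exact arithmetic this is equivalent to the original."""

    def __init__(self, linear: nn.Linear, max_out_features: int = 4096):
        super().__init__()
        out_f, in_f = linear.weight.shape
        self.parts = nn.ModuleList()
        start = 0
        while start < out_f:
            width = min(max_out_features, out_f - start)
            part = nn.Linear(in_f, width, bias=linear.bias is not None,
                             dtype=linear.weight.dtype)
            # Copy the corresponding rows of the original weight (and bias).
            part.weight.data.copy_(linear.weight.data[start:start + width])
            if linear.bias is not None:
                part.bias.data.copy_(linear.bias.data[start:start + width])
            self.parts.append(part)
            start += width

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.cat([p(x) for p in self.parts], dim=-1)


if __name__ == "__main__":
    # Example: split a w1-style projection (2048 -> 8192) into two 4096-wide chunks.
    w1 = nn.Linear(2048, 8192, bias=False)
    split = SplitLinear(w1, max_out_features=4096)
    x = torch.randn(1, 16, 2048)
    assert torch.allclose(w1(x), split(x), atol=1e-4)
```

Splitting along input features instead would require summing partial outputs rather than concatenating them; which axis to split depends on which dimension is the oversized one.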

Even after all of the above changes, CoreML schedules this model to run on the CPU on my Mac. I have yet to try it on an iPhone.

@metascroy requested a review from cccclai as a code owner on November 21, 2025.
@pytorch-bot bot commented Nov 21, 2025

See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/15936.

As of commit 4266924 with merge base 131d1f4: 1 new failure and 2 unrelated failures (one flaky, one already broken on trunk). Rebase onto the `viable/strict` branch to avoid the unrelated failures.
