[release/2.5] Enable bf16 with fp32 weights for MIOpen batchnorm #1672

Draft: wants to merge 2 commits into base: release/2.5
Commits on Nov 14, 2024

  1. Enable bfloat16 for batchnorm

     Do not go down the MIOpen path when both the input and the weight are bf16.
     Allow a bfloat16 input with fp32 weights; the weights must be fp32 for a
     bfloat16 input in the backward pass. (A sketch of this dispatch check
     appears after the commit list.)

     jithunnair-amd committed Nov 14, 2024 (commit 98a7d29)
  2. Add PYTORCH_MIOPEN_EXTRA_LOGGING logging to Normalization.cpp

     Add a PYTORCH_MIOPEN_EXTRA_LOGGING anchor to Normalization.cpp. (A sketch
     of such an env-gated logging hook appears after the commit list.)

     jithunnair-amd committed Nov 14, 2024 (commit 7a7d89f)
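
The first commit's dispatch change can be pictured with a minimal sketch. The helper name `can_use_miopen_batchnorm` is hypothetical and this is not the actual patch to aten/src/ATen/native/Normalization.cpp; it only illustrates the rule stated in the commit message: send batchnorm to MIOpen when a bfloat16 input is paired with fp32 weights, but not when both input and weight are bf16.

```cpp
// Sketch only: hypothetical helper illustrating the bf16 dispatch rule.
#include <ATen/ATen.h>

static bool can_use_miopen_batchnorm(const at::Tensor& input,
                                     const at::Tensor& weight) {
  const auto itype = input.scalar_type();
  const auto wtype = weight.defined() ? weight.scalar_type() : at::kFloat;

  // MIOpen batchnorm is considered for fp32, fp16, and (with this change) bf16.
  const bool supported_input =
      itype == at::kFloat || itype == at::kHalf || itype == at::kBFloat16;

  // A bf16 input is only sent down the MIOpen path with fp32 weights; the
  // bf16-input/bf16-weight combination falls back to the native kernel, and
  // the backward pass likewise expects fp32 weights for bf16 input.
  const bool bf16_combo_ok = itype != at::kBFloat16 || wtype == at::kFloat;

  // On ROCm builds, HIP tensors report is_cuda() == true.
  return input.is_cuda() && supported_input && bf16_combo_ok;
}
```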
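
The second commit's logging anchor can likewise be sketched as an environment-gated helper. The function names and log format here are assumptions for illustration; only the PYTORCH_MIOPEN_EXTRA_LOGGING variable name comes from the commit.

```cpp
// Sketch only: env-gated logging in the spirit of PYTORCH_MIOPEN_EXTRA_LOGGING.
#include <cstdlib>
#include <iostream>
#include <string>

// Read the environment variable once; any non-empty value enables logging.
static bool miopen_extra_logging_enabled() {
  static const bool enabled = [] {
    const char* v = std::getenv("PYTORCH_MIOPEN_EXTRA_LOGGING");
    return v != nullptr && *v != '\0';
  }();
  return enabled;
}

static void log_batchnorm_dispatch(const std::string& where,
                                   const std::string& detail) {
  if (miopen_extra_logging_enabled()) {
    std::cerr << "PYTORCH_MIOPEN_EXTRA_LOGGING: " << where << ": " << detail
              << std::endl;
  }
}

// Hypothetical call site just before the batchnorm backend decision:
//   log_batchnorm_dispatch("batch_norm", "input=bf16 weight=fp32 -> MIOpen");
```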