
Need for type casting? #284

Closed
wacky6 opened this issue Aug 26, 2022 · 3 comments
Comments

@wacky6

wacky6 commented Aug 26, 2022

This came up during review for softmax implementation https://chromium-review.googlesource.com/c/chromium/src/+/3856752/comments/267c926c_b14af87f

Softmax will likely only support floating-point types: #283

Currently, the spec provides no way to convert between data types, which might make it hard to use softmax in certain computation graphs.

This could impact models that mix integer and floating-point computation, e.g. image recognition models that take in an RGB24 image represented as uint8 and return softmax-ed predictions (e.g. one-hot encoding).

For example, the following graph:

   Input (uint8)
-> matmul (uint32)
-> softmax

Perhaps WebNN needs a type-casting function?
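To illustrate the gap, here is a minimal Python sketch (not the WebNN API) of why a cast is needed in the graph above: softmax divides exponentials, so it can only produce fractional probabilities from floating-point inputs, and integer matmul outputs must be converted first. The function and variable names here are illustrative, not part of any spec.

```python
import math

def softmax(logits):
    """Numerically stable softmax; requires floating-point math."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical uint32 matmul outputs. Without an explicit
# conversion ("cast") to float, softmax cannot yield the
# fractional probabilities a prediction head needs.
int_logits = [3, 1, 2]
float_logits = [float(x) for x in int_logits]  # the cast step
probs = softmax(float_logits)
```

The probabilities sum to 1 and preserve the ordering of the integer logits, which is exactly the behavior a `cast` op would unlock for the uint8 → matmul → softmax pipeline.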

In ML frameworks:

@fdwr
Collaborator

fdwr commented Sep 29, 2023

Definitely a gap, and included in v2 ops for transformers: #375 (comment)

partial interface MLGraphBuilder {
  MLOperand cast(MLOperand input, MLOperandDataType operandDataType);
}
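As a rough sketch of the numeric behavior such a `cast` implies (in Python rather than WebNN, so it can be run anywhere), converting a wide integer to float32 is a round-trip through IEEE-754 binary32 and can lose precision for values above 2**24:

```python
import struct

def cast_to_float32(x):
    """Round-trip a number through IEEE-754 binary32,
    mimicking what a cast to float32 does to the value."""
    return struct.unpack('<f', struct.pack('<f', float(x)))[0]

print(cast_to_float32(1000))      # 1000.0 — exactly representable
print(cast_to_float32(16777217))  # 16777216.0 — 2**24 + 1 rounds down
```

This is only an illustration of float32 rounding, not of the spec's conversion rules; precision-loss behavior for out-of-range or inexact casts is the kind of detail the `cast` definition has to pin down.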

@inexorabletash
Member

Now that cast() landed (PR #478) can this be closed?

@inexorabletash
Member

Closing this as complete.
