Operator support for F.one_hot
#29
Annotations
2 errors and 1 warning
auto-cc
Not Found
{
name: 'HttpError',
id: '8411245054',
status: 404,
response: {
url: 'https://api.github.com/repos/Lightning-AI/lightning-thunder/issues/1464',
status: 404,
headers: {
'access-control-allow-origin': '*',
'access-control-expose-headers': 'ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Used, X-RateLimit-Resource, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type, X-GitHub-SSO, X-GitHub-Request-Id, Deprecation, Sunset',
connection: 'close',
'content-encoding': 'gzip',
'content-security-policy': "default-src 'none'",
'content-type': 'application/json; charset=utf-8',
date: 'Sun, 24 Mar 2024 18:18:00 GMT',
'referrer-policy': 'origin-when-cross-origin, strict-origin-when-cross-origin',
server: 'GitHub.com',
'strict-transport-security': 'max-age=31536000; includeSubdomains; preload',
'transfer-encoding': 'chunked',
vary: 'Accept-Encoding, Accept, X-Requested-With',
'x-accepted-github-permissions': 'issues=read',
'x-content-type-options': 'nosniff',
'x-frame-options': 'deny',
'x-github-api-version-selected': '2022-11-28',
'x-github-media-type': 'github.v3; format=json',
'x-github-request-id': 'FFC0:214B9:742F80:C6EEE0:66006E58',
'x-ratelimit-limit': '15000',
'x-ratelimit-remaining': '14996',
'x-ratelimit-reset': '1711307880',
'x-ratelimit-resource': 'core',
'x-ratelimit-used': '4',
'x-xss-protection': '0'
},
data: {
message: 'Not Found',
documentation_url: 'https://docs.github.com/rest/issues/issues#get-an-issue'
}
},
request: {
method: 'GET',
url: 'https://api.github.com/repos/Lightning-AI/lightning-thunder/issues/1464',
headers: {
accept: 'application/vnd.github.v3+json',
'user-agent': 'probot/12.2.5 octokit-core.js/3.6.0 Node.js/16.20.2 (linux; x64)',
authorization: 'token [REDACTED]'
},
request: {}
},
event: {
id: '8411245054',
name: 'issues',
payload: {
action: 'labeled',
issue: {
active_lock_reason: null,
assignee: null,
assignees: [],
author_association: 'NONE',
body: '## 🐛 Bug\r\n' +
'\r\n' +
'`thunder` fails when attempting to compile a graph containing `torch.nn.functional.one_hot` within the forward pass.\r\n' +
'The error message indicates that the input to the method must be a `Tensor`, but a `TensorProxy` is received instead.\r\n' +
'\r\n' +
'### To Reproduce\r\n' +
'\r\n' +
'Steps to reproduce the behavior:\r\n' +
'\r\n' +
'- Define a PyTorch model class with a forward pass involving `F.one_hot` to convert the input tensor to a one-hot encoded representation.\r\n' +
'- Create an instance of the model and evaluate it on a random input tensor.\r\n' +
'- Compile the model using `thunder.jit`.\r\n' +
'- Call the compiled model with the same input tensor.\r\n' +
'\r\n' +
'#### Example\r\n' +
'\r\n' +
'```python\r\n' +
'import torch\r\n' +
'import torch.nn as nn\r\n' +
'import torch.nn.functional as F\r\n' +
'\r\n' +
'import thunder\r\n' +
'\r\n' +
'\r\n' +
'class MLP(nn.Module):\r\n' +
' def __init__(self, hidden_size=1024):\r\n' +
' super(MLP, self).__init__()\r\n' +
' self.hidden = nn.Linear(6 * 256, hidden_size, bias=False)\r\n' +
' self.head = nn.Linear(hidden_size, 32000, bias=False)\r\n' +
'\r\n' +
' def forward(self, inputs):\r\n' +
' x = F.one_hot(inputs, 6).reshape(-1, 6 * 256).float()\r\n' +
' x = self.hidden(x)\r\n' +
' logits = self.head(x)\r\n' +
' return logits\r\n' +
'\r\n' +
'\r\n' +
'x = torch.randint(0, 6, (1, 256))\r\n' +
'\r\n' +
'model = MLP(1024).eval()\r\n' +
'print(model(x))\r\n' +
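The `Tensor` vs. `TensorProxy` failure described in the issue body above can be illustrated with a toy example. Note that `FakeProxy` and the explicit `isinstance` check below are hypothetical stand-ins for illustration only, not thunder's actual `TensorProxy` or dispatch logic:

```python
import torch
import torch.nn.functional as F


# Hypothetical stand-in for a tracer's proxy object. thunder's real
# TensorProxy is far more elaborate, but the failure mode is the same:
# an operator that insists on a genuine torch.Tensor rejects the proxy.
class FakeProxy:
    def __init__(self, shape, dtype):
        self.shape = shape
        self.dtype = dtype


def one_hot_like(x, num_classes):
    # Mimics an operator guarded by a hard type check (illustrative).
    if not isinstance(x, torch.Tensor):
        raise TypeError(
            f"one_hot(): argument 'input' must be Tensor, not {type(x).__name__}"
        )
    return F.one_hot(x, num_classes)


# A real tensor passes through...
real = torch.randint(0, 6, (4,))
print(one_hot_like(real, 6).shape)  # torch.Size([4, 6])

# ...but a proxy object seen during tracing does not.
try:
    one_hot_like(FakeProxy((4,), torch.int64), 6)
except TypeError as e:
    print(e)
```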
auto-cc
HttpError: Not Found
at /home/runner/work/_actions/Lightning-AI/probot/v5/node_modules/@octokit/core/node_modules/@octokit/request/dist-node/index.js:86:21
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async Job.doExecute (/home/runner/work/_actions/Lightning-AI/probot/v5/node_modules/bottleneck/light.js:405:18)
{
name: 'AggregateError',
event: {
id: '8411245054',
name: 'issues',
payload: {
action: 'labeled',
issue: {
active_lock_reason: null,
assignee: null,
assignees: [],
author_association: 'NONE',
body: '## 🐛 Bug\r\n' +
'\r\n' +
'`thunder` fails when attempting to compile a graph containing `torch.nn.functional.one_hot` within the forward pass.\r\n' +
'The error message indicates that the input to the method must be a `Tensor`, but a `TensorProxy` is received instead.\r\n' +
'\r\n' +
'### To Reproduce\r\n' +
'\r\n' +
'Steps to reproduce the behavior:\r\n' +
'\r\n' +
'- Define a PyTorch model class with a forward pass involving `F.one_hot` to convert the input tensor to a one-hot encoded representation.\r\n' +
'- Create an instance of the model and evaluate it on a random input tensor.\r\n' +
'- Compile the model using `thunder.jit`.\r\n' +
'- Call the compiled model with the same input tensor.\r\n' +
'\r\n' +
'#### Example\r\n' +
'\r\n' +
'```python\r\n' +
'import torch\r\n' +
'import torch.nn as nn\r\n' +
'import torch.nn.functional as F\r\n' +
'\r\n' +
'import thunder\r\n' +
'\r\n' +
'\r\n' +
'class MLP(nn.Module):\r\n' +
' def __init__(self, hidden_size=1024):\r\n' +
' super(MLP, self).__init__()\r\n' +
' self.hidden = nn.Linear(6 * 256, hidden_size, bias=False)\r\n' +
' self.head = nn.Linear(hidden_size, 32000, bias=False)\r\n' +
'\r\n' +
' def forward(self, inputs):\r\n' +
' x = F.one_hot(inputs, 6).reshape(-1, 6 * 256).float()\r\n' +
' x = self.hidden(x)\r\n' +
' logits = self.head(x)\r\n' +
' return logits\r\n' +
'\r\n' +
'\r\n' +
'x = torch.randint(0, 6, (1, 256))\r\n' +
'\r\n' +
'model = MLP(1024).eval()\r\n' +
'print(model(x))\r\n' +
'\r\n' +
'model = thunder.jit(model)\r\n' +
'print(model(x))\r\n' +
'```\r\n' +
'\r\n' +
'<details>\r\n' +
'<summary>Output</summary>\r\n' +
'\r\n' +
'```\r\n' +
'tensor([[-0.1134, -0.0827, -0.0205, ..., 0.0757, 0.0066, 0.0974]],\r\n' +
' grad_fn=<MmBackward0>)\r\n' +
'---------------------------------------------------------------------------\r\n' +
'TypeError Traceback (most recent call last)\r\n' +
'[<ipython-input-6-6425e5faad6e>](https://localhost:8080/#) in <cell line: 23>()\r\n' +
' 21 \r\n' +
' 22 model = thunder.jit(model)\r\n' +
'---> 23 print(model(x))\r\n' +
'\r\n' +
'16 frames\r\n' +
'[/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py](https://localhost:8080/#) in _wrapped_call_impl(self, *args, **kwargs)\r\n' +
' 1509 # type ignore was added because at this point one knows that\r\n' +
' 1510 # torch.jit._trace._trace_module_map is not Optional and has type Dict[Any, Any]\r\n' +
'-> 1511 name = torch.jit._trace._trace_module_map[self] if self in torch.jit._trace._trace_module_map else None # type: ignore[index, operator] # noqa: B950\r\n' +
' 1512 if name:\r\n' +
' 1513 tracing_state.push_scope(name)\r\n' +
'\r\n' +
'[/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py](https://localhost:8080/#) in _call_impl(self, *args
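Until `F.one_hot` gains operator support, one possible workaround is to decompose it into primitives a tracer is more likely to handle. A minimal sketch, assuming `arange`, `unsqueeze`, and elementwise `==` are traceable (this is not an official thunder recipe):

```python
import torch
import torch.nn.functional as F


def one_hot_via_compare(indices: torch.Tensor, num_classes: int) -> torch.Tensor:
    # Broadcast-compare each index against [0, num_classes) and cast to
    # int64, matching F.one_hot's output shape and dtype.
    classes = torch.arange(num_classes, device=indices.device)
    return (indices.unsqueeze(-1) == classes).to(torch.int64)


# Same input shape as the repro above; the decomposition matches F.one_hot.
x = torch.randint(0, 6, (1, 256))
assert torch.equal(one_hot_via_compare(x, 6), F.one_hot(x, 6))
```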
auto-cc
Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: Lightning-AI/probot@v5. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.