
Conversation

@pereanub (Contributor) commented Dec 4, 2025

Details:

  • Tensor helper to get the data offset from ROI tensors
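
For context, a minimal sketch of what the data offset of an ROI tensor means for a plain host tensor (roi_byte_offset is an illustrative name, not the helper added by this PR): an ROI view shares its parent's buffer, so the byte offset is the distance between the two data pointers.

#include <cstddef>
#include <cstdint>
#include "openvino/runtime/tensor.hpp"

// Illustrative only: byte offset of an ROI view into its parent tensor's buffer.
std::ptrdiff_t roi_byte_offset(ov::Tensor& parent, ov::Tensor& roi) {
    return static_cast<std::uint8_t*>(roi.data()) - static_cast<std::uint8_t*>(parent.data());
}

// Example: for an f32 tensor of shape {1, 3, 4, 4}, an ROI starting at channel 1
// (begin {0, 1, 0, 0}, end {1, 3, 4, 4}) is offset by 1 * 4 * 4 * sizeof(float) = 64 bytes.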

Tickets:

Signed-off-by: Bogdan Pereanu <bogdan.pereanu@intel.com>
@pereanub requested review from a team as code owners December 4, 2025 12:17
@github-actions bot added the category: inference (OpenVINO Runtime library - Inference) and category: Core (OpenVINO Core, aka ngraph) labels Dec 4, 2025
@pereanub changed the title from "Get tensor data offset" to "Get ROI tensor data offset" Dec 4, 2025
@mlukasze requested a review from @praasz December 4, 2025 12:29
return ov::SoPtr<ov::ITensor>(tensor_impl, so);
}

size_t get_tensor_data_offset(const ov::SoPtr<ov::ITensor>& tensor) {
@olpipi (Contributor) commented Dec 4, 2025

I'd suggest taking ov::Tensor as a parameter instead of ov::SoPtr<ov::ITensor>, and moving the call to ov::get_tensor_impl(tensor) into the get_tensor_data_offset function.
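
A minimal sketch of the suggested variant, assuming the existing developer API ov::get_tensor_impl and delegating to the SoPtr-based helper from this PR (the overload shown here is illustrative, not code from this PR):

#include "openvino/runtime/itensor.hpp"
#include "openvino/runtime/make_tensor.hpp"  // ov::get_tensor_impl (dev API)
#include "openvino/runtime/tensor.hpp"

// Helper added by this PR (declaration only here).
size_t get_tensor_data_offset(const ov::SoPtr<ov::ITensor>& tensor);

// Suggested ov::Tensor-based signature: resolve the internal ITensor inside the
// helper instead of at every call site.
size_t get_tensor_data_offset(const ov::Tensor& tensor) {
    return get_tensor_data_offset(ov::get_tensor_impl(tensor));
}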

@pereanub (Contributor, Author) replied

We need it in the plugin as ov::ITensor, so taking ov::Tensor would mean two extra calls: one to create an ov::Tensor from the ov::ITensor and one to get the ov::ITensor back from the ov::Tensor. To be honest, that doesn't make sense to me.
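
For illustration, a hypothetical plugin-side call site if the helper only accepted ov::Tensor, assuming the developer API ov::make_tensor and ov::get_tensor_impl (offset_via_round_trip and the ov::Tensor-based declaration are illustrative names, not code from this PR):

#include "openvino/runtime/itensor.hpp"
#include "openvino/runtime/make_tensor.hpp"  // ov::make_tensor, ov::get_tensor_impl (dev API)
#include "openvino/runtime/tensor.hpp"

// Hypothetical ov::Tensor-only helper, as suggested above (declaration only).
size_t get_tensor_data_offset(const ov::Tensor& tensor);

// The round trip the author wants to avoid: the plugin already holds an
// ov::SoPtr<ov::ITensor>, so it would wrap it into an ov::Tensor only for the
// helper to unwrap it again via ov::get_tensor_impl.
size_t offset_via_round_trip(const ov::SoPtr<ov::ITensor>& plugin_tensor) {
    ov::Tensor wrapped = ov::make_tensor(plugin_tensor);  // extra call 1: ITensor -> Tensor
    return get_tensor_data_offset(wrapped);               // extra call 2 happens inside the helper
}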

Signed-off-by: Bogdan Pereanu <bogdan.pereanu@intel.com>
