
Conversation

@lvhg

@lvhg lvhg commented Jan 2, 2026

@lvhg lvhg requested review from acmorrow, bhaney and lia-viam January 12, 2026 19:57

@lia-viam lia-viam left a comment

various minor cleanup comments

build.zip Outdated

this may have been committed by accident

CMakeLists.txt Outdated
PRIVATE Threads::Threads
PRIVATE viam-cpp-sdk::viamsdk
PRIVATE tensorflow::tensorflowlite
add_executable(main

I think it would make sense for the executable target to be called tflite_cpu_module or something like that, and the add_library target could just be called tflite_cpu (no _library) because the generated library file will have lib prefixed to it anyway (libtflite_cpu)
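For illustration, a rough sketch of that naming in CMakeLists.txt (the source file lists are assumptions here, not copied from the PR):

    add_library(tflite_cpu
        src/tflite_cpu.cpp
    )

    add_executable(tflite_cpu_module
        src/main.cpp
    )

    # CMake prepends "lib" to library outputs, so the target named
    # tflite_cpu still produces libtflite_cpu.
    target_link_libraries(tflite_cpu_module PRIVATE tflite_cpu)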

whichever you decide, however, you will probably need to commit changes to meta.json I think

// struct state_ final : public tflite::ErrorReporter;

// A visitor that can populate a TFLiteTensor given a MLModelService::tensor_view.
class write_to_tflite_tensor_visitor_ : public boost::static_visitor<TfLiteStatus>

I think this class should just appear in the header as a forward declaration class write_to_tflite_tensor_visitor_; and then have the full definition in the cpp file, similar to how we did with struct state_

Collaborator

I'd go so far as to suggest that this could likely move to the unnamed namespace in the .cpp file.
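As a hedged sketch of that layout in tflite_cpu.cpp (the operator() body and the members are placeholders, not the PR's actual implementation; the trailing underscore is dropped on the assumption the type stops being a private member):

    namespace {

    // Private to this translation unit; the header no longer needs to
    // mention this type once its only users live in the .cpp file.
    class write_to_tflite_tensor_visitor : public boost::static_visitor<TfLiteStatus> {
       public:
        write_to_tflite_tensor_visitor(const std::string* name, TfLiteTensor* tflite_tensor)
            : name_(name), tflite_tensor_(tflite_tensor) {}

        template <typename T>
        TfLiteStatus operator()(const T& tensor_view) const {
            // Copy the view's contents into tflite_tensor_ here (elided).
            return kTfLiteOk;
        }

       private:
        const std::string* name_;
        TfLiteTensor* tflite_tensor_;
    };

    }  // namespace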

#include <tensorflow/lite/interpreter_builder.h>
#include <tensorflow/lite/kernels/register.h>

#include <viam/sdk/common/instance.hpp>

I believe the following includes can/should be moved to main.cpp:

  • instance.hpp
  • registry.hpp
  • server.hpp

#include <stdexcept>

#include <tensorflow/lite/c/c_api.h>
#include <tensorflow/lite/interpreter_builder.h>

the second two tensorflow includes can be moved to tflite_cpu.cpp I believe

Collaborator

I suspect all three can actually, because I think the only functions that reference the TF stuff in this header can actually be moved to free functions in the unnamed namespace in the .cpp file.

Collaborator

@acmorrow acmorrow left a comment

This is a good start.

I think the first and most important change is to get a .clang-format file in place and feed this back into the formatter. There is a lot of noise from formatting changes, and they are almost all formatting changes that we don't want. So it both makes it hard to review and isn't the end result we want. I'd prioritize that change.

I also think that several things that are currently member functions or class-scoped static functions / types can actually be pulled entirely out of the header / class and instead declared as free functions / types in the unnamed namespace in the .cpp file.

That in turn would allow removing the tflite headers from our header, which would mean the tflite-ness is fully encapsulated in the implementation: a nice outcome.
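If it helps unblock the reformat, even a very small .clang-format would stop the churn; the options below are an illustrative guess, not the C++ SDK's actual configuration (copying the SDK's file is the better route):

    # .clang-format (placeholder; prefer copying the file from the C++ SDK)
    BasedOnStyle: Google
    IndentWidth: 4
    ColumnLimit: 100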

src/main.cpp Outdated
Comment on lines 10 to 13
namespace
{

int serve(const std::string &socket_path)
Collaborator

It is a little weird. There is a bin/run-clang-format.sh script but no .clang-format file. I think what happened is that when this file was moved from its original home in the C++ SDK, the file got copied but the formatting rules didn't.

In any event, this definitely needs clang-format applied before merging.

src/main.cpp Outdated
@@ -0,0 +1,74 @@
#include <shared_mutex>
Collaborator

Is shared_mutex used in this file?

Comment on lines 39 to 40
namespace mlmodel_tflite
{
Collaborator

I'm not convinced this needs to be in a namespace at all. But, if you wanted to put it in one, I think I would put it in

namespace viam {

I was the one who suggested putting it in some namespace but I think you're right that since it's an application package, no namespace or namespace viam is fine
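For reference, the nesting under discussion would look roughly like this in the header (class contents elided):

    namespace viam {
    namespace mlmodel_tflite {

    class MLModelServiceTFLite;  // full declaration elided

    }  // namespace mlmodel_tflite
    }  // namespace viam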

Comment on lines 13 to 21
#include <viam/sdk/common/proto_value.hpp>
#include <viam/sdk/components/component.hpp>
#include <viam/sdk/config/resource.hpp>
#include <viam/sdk/module/service.hpp>
#include <viam/sdk/registry/registry.hpp>
#include <viam/sdk/resource/reconfigurable.hpp>
#include <viam/sdk/resource/stoppable.hpp>
#include <viam/sdk/rpc/server.hpp>
#include <viam/sdk/services/mlmodel.hpp>
Collaborator

I expect this list can also be reduced. I'd start with just including mlmodel.hpp, stoppable.hpp, and reconfigurable.hpp, and only adding back what you really need beyond that.

Comment on lines 74 to 77
// All of the meaningful internal state of the service is held in
// a separate state object to help ensure clean replacement of our
// internals during reconfiguration.
// struct state_ final : public tflite::ErrorReporter;
Collaborator

Delete

Comment on lines 93 to 98
MLModelService::tensor_views make_tensor_view_(const MLModelService::tensor_info &info,
const TfLiteTensor *const tflite_tensor);

template <typename T>
MLModelService::tensor_views make_tensor_view_t_(const MLModelService::tensor_info &info,
const TfLiteTensor *const tflite_tensor);
Collaborator

I think there is a decent possibility that these functions could be defined in an unnamed namespace in the .cpp file.

Author

I'm a bit confused on how that would work since they're functions of MLModelServiceTFLite which is in a named namespace (mlmodel_tflite)

Collaborator

I'm a bit confused on how that would work since they're functions of MLModelServiceTFLite which is in a named namespace (mlmodel_tflite)

Because I don't think they actually need to be members of class MLModelServiceTFLite. I don't think they actually access any member state.
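Concretely, a hedged sketch of what that could look like in tflite_cpu.cpp; the declarations mirror the header excerpt above (with the trailing underscores dropped), and the vsdk:: qualification assumes the namespace alias used elsewhere in the file:

    namespace {

    namespace vsdk = ::viam::sdk;

    // Neither function touches member state, so both can live here as
    // free functions private to the translation unit.
    vsdk::MLModelService::tensor_views make_tensor_view(
        const vsdk::MLModelService::tensor_info& info, const TfLiteTensor* tflite_tensor);

    template <typename T>
    vsdk::MLModelService::tensor_views make_tensor_view_t(
        const vsdk::MLModelService::tensor_info& info, const TfLiteTensor* tflite_tensor);

    }  // namespace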


~MLModelServiceTFLite() final {
MLModelServiceTFLite::~MLModelServiceTFLite()
{
Collaborator

The formatting changes are making it hard to see what really changed. Can we try to get a .clang-format file into this repo (probably just copy the one from the C++ SDK) and reformat?

@acmorrow
Collaborator

@lvhg - I think it would be helpful if you could share what sort of testing is envisioned. Splitting this up into a header and source file is a necessary condition to being able to have tests that share the code, but it may not be sufficient, depending on what sort of tests you are trying to port over. Once this review lands, what you will be able to do is to instantiate a full tflite_cpu MLModelService instance, but is that really the intended unit of test? You would be more or less limited to making calls to create an instance with a configuration, invoke its metadata and infer endpoints, etc. Maybe that is enough? Maybe it isn't?

@lvhg
Author

lvhg commented Jan 13, 2026

@lvhg - I think it would be helpful if you could share what sort of testing is envisioned. Splitting this up into a header and source file is a necessary condition to being able to have tests that share the code, but it may not be sufficient, depending on what sort of tests you are trying to port over. Once this review lands, what you will be able to do is to instantiate a full tflite_cpu MLModelService instance, but is that really the intended unit of test? You would be more or less limited to making calls to create an instance with a configuration, invoke its metadata and infer endpoints, etc. Maybe that is enough? Maybe it isn't?

@acmorrow https://github.com/viam-modules/mlmodel-tflite/blob/0.0.5/tflitecpu/tflite_cpu_test.go This is the Go test file that we are planning to eventually convert into C++.

Collaborator

@acmorrow acmorrow left a comment

This is looking good. There are a decent number of comments here in this round, but they are all pretty minor and amount to:

  • Naming nits, questions, and suggestions
  • Requests to reduce scope/visibility or move things into the .cpp file and out of the class declaration in the header.

.gitignore Outdated
build-conan/
build/

.DS_Store
Collaborator

.DS_Store probably belongs in your own ~/.gitignore, not the per project one.

Author

Not sure what you mean by this; this is the only gitignore in the repository

Collaborator

You can have a personal gitignore file in your home directory.
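For example (standard git behavior, nothing project-specific):

    # one-time setup: point git at a machine-wide ignore file,
    # after which .DS_Store is ignored in every repository you touch
    git config --global core.excludesFile ~/.gitignore
    echo '.DS_Store' >> ~/.gitignore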

CMakeLists.txt Outdated
src/main.cpp
)

target_link_libraries(tflite_cpu_module PRIVATE tflite_cpu)
Collaborator

I think you should do this the other way: leave the name of the executable unchanged, and find a different name for the library. Otherwise, you need to change meta.json. Maybe call the library tflite_cpu_service.

Collaborator

Let's update this per my suggestion, since Lia seems to agree.

CMakeLists.txt Outdated
PRIVATE tensorflow::tensorflowlite
PUBLIC Threads::Threads
PUBLIC viam-cpp-sdk::viamsdk
PUBLIC tensorflow::tensorflowlite
Collaborator

I suspect that tensorflow::tensorflowlite is not actually a PUBLIC dependency of the library (or, at least, it won't be after you push the functions I called out down into the .cpp file). In that case, it can and should be PRIVATE. I agree that the viamsdk is PUBLIC.
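A hedged sketch of the linkage being suggested, assuming the refactor removes all TFLite types from the public header (library target name per the naming discussion above):

    target_link_libraries(tflite_cpu
        PUBLIC  Threads::Threads
        PUBLIC  viam-cpp-sdk::viamsdk       # SDK types appear in the public header
        PRIVATE tensorflow::tensorflowlite  # only used by the .cpp once TF types leave the header
    )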

Author

I tried changing it to private, but it didn't work since some parameters referenced in the hpp have TF types

Collaborator

Which, not-really-coincidentally, are the functions I'm encouraging you to get out of the header and into the .cpp file.

Collaborator

I expect this change will work now.

src/main.cpp Outdated
} // namespace

int main(int argc, char* argv[]) {
const std::string usage = "usage: mlmodelservice_tflite /path/to/unix/socket";
Collaborator

Is this correct? I don't think mlmodelservice_tflite is the name of the binary anymore. In fact, the best way to get the name of the binary (or, at least, the name by which the binary was invoked) is out of argv[0], generally.

Collaborator

Ping

// automatically without needing to wait for anything more to
// drain.
}
// namespace vsdk = ::viam::sdk;
Collaborator

Remove if not needed

tensor_data, tensor_size_t, std::move(shape));
}

} // namespace mlmodel_tflite
Collaborator

File should end with a newline; please configure your editor to include it

.gitignore Outdated
build-conan/
build/

.DS_Store
Collaborator

File should end with a newline; please configure your editor to automatically add it.

Collaborator

@acmorrow acmorrow left a comment

Only a few little things left now; this looks to me like it can be ready with one more round of review.


// Converts from tflites type enumeration into the model service
// type enumeration or throws if there is no such conversion.
static MLModelService::tensor_info::data_types service_data_type_from_tflite_data_type_(
Collaborator

  • You don't need the static keyword here: that made sense when it was in a class, but you don't need it inside the unnamed namespace (which implies static).
  • There should not be a trailing _ on the name of this function because it is no longer a private class member.
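In other words, something along these lines (parameter list elided here, as in the excerpt above):

    // In the unnamed namespace of the .cpp file: `static` is redundant
    // (internal linkage is already implied), and the trailing underscore
    // can go because this is no longer a private class member.
    MLModelService::tensor_info::data_types service_data_type_from_tflite_data_type(/* ... */);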

.gitignore Outdated
build-conan/
build/

.DS_Store
Collaborator

Again, I think .DS_Store belongs in your personal gitignore, not the project one.

Collaborator

@acmorrow acmorrow left a comment

I took a thorough pass through, and while I did find a handful of things, I really believe these are the final issues before this can merge, and none of them will take more than a few minutes to change.

src/main.cpp Outdated
#include <viam/sdk/common/instance.hpp>
#include <viam/sdk/module/service.hpp>
#include <viam/sdk/registry/registry.hpp>
// TODO import tflite_cpu library
Collaborator

Please remove this commented line before merging, but leave a blank line between the C++ SDK includes and the tflite_cpu.hpp one, to group them.

namespace mlmodel_tflite {

namespace {
using namespace viam::sdk;
Collaborator

You should remove using namespace viam::sdk now that you have namespace vsdk = ::viam::sdk so you know you got everything.
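That is, keep just the alias and spell SDK names through it, e.g. (a minimal sketch):

    namespace {
    namespace vsdk = ::viam::sdk;  // keep the alias; drop the using-directive
    }  // namespace

    // ...and qualify usages explicitly, e.g. vsdk::MLModelService.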

PRIVATE tensorflow::tensorflowlite
)

install(
Collaborator

Hmm. What happened to the install target? Was this intentional? If so, why?

src/main.cpp Outdated

int main(int argc, char* argv[]) {
const std::string usage =
std::string("usage: ") + std::string(argv[0]) + std::string(" /path/to/unix/socket\n");
Collaborator

std::string has an operator+ overload that accepts const char*, so I don't think you need to wrap in std::string, and the type is definitely string so you don't need to name it on the LHS:

const auto usage =
        std::string("usage: ") + argv[0] + " /path/to/unix/socket";

You also don't need the terminating \n here if you have one on line 60, or vice versa. Probably better to retain the one on 60 and remove the one here.

}

// A visitor that can populate a TFLiteTensor given a MLModelService::tensor_view.
class write_to_tflite_tensor_visitor_ : public boost::static_visitor<TfLiteStatus> {
Collaborator

  • This type is no longer private, so you can remove the trailing _.
  • This type also belongs in the unnamed namespace with tensor_views_from_tflite_tensor and tensor_views_from_tflite_tensor[,_t]

Comment on lines 265 to 266
const std::string* name_;
TfLiteTensor* tflite_tensor_;
Collaborator

These fields were originally private to write_to_tflite_tensor_visitor_ (note the trailing _ on the names), but they have since become public. I think they should remain private:.

Collaborator

@acmorrow acmorrow left a comment

LGTM, with one small cleanup regarding usage of the vsdk:: namespace alias, and two places trailing newlines should be added/restored.

But, you don't need to post another round of review. Just fix those up before merging.

class write_to_tflite_tensor_visitor_;
namespace {
namespace vsdk = ::viam::sdk;
using namespace vsdk;
Collaborator

You should entirely remove the line using namespace vsdk and everything should keep working, because all C++ SDK usages in the .cpp file should now be vsdk:: qualified, as they originally were. If there are any missing, just add vsdk:: where needed.

}

} // namespace mlmodel_tflite
} // namespace viam
Collaborator

Please restore the trailing newlines and configure your editor to add them.

};

} // namespace mlmodel_tflite
} No newline at end of file
Collaborator

Trailing newline here too
