Description
Thanks for your amazing work!
I have two categories of questions:
About Unity Rendering:
Would it be possible to open-source your Unity project?
For people like me, who have limited experience with Unity, this would be a huge help in understanding and reproducing your pipeline.
Does the rendering pipeline include lens and CMOS sensor simulation?
According to your paper, "We render human motion from a large-scale internal MoCap dataset using the exact camera placement and intrinsics of the Quest 2 headset in the Unity game engine", it seems that only camera placement and intrinsics are modeled. Based on my limited understanding, the sim-to-real gap could be further reduced if lens and CMOS sensor characteristics were also taken into account.
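To make the question concrete, here is a minimal sketch of the kind of sensor simulation I have in mind: a toy CMOS model that adds signal-dependent shot noise and signal-independent read noise to a rendered frame. The function name and all parameter values are hypothetical illustrations, not from the paper.

```python
import numpy as np

def simulate_cmos_noise(image, read_noise_std=2.0, shot_noise_scale=0.05, seed=0):
    """Toy CMOS sensor model (hypothetical, for illustration only).

    Adds Poisson-like shot noise, whose standard deviation grows with the
    signal level, plus Gaussian read noise, which is signal-independent.
    """
    rng = np.random.default_rng(seed)
    img = image.astype(np.float32)
    # Shot noise: variance proportional to the signal level.
    shot = rng.normal(0.0, np.sqrt(np.maximum(img, 0.0) * shot_noise_scale))
    # Read noise: constant-variance Gaussian over the whole frame.
    read = rng.normal(0.0, read_noise_std, size=img.shape)
    noisy = img + shot + read
    return np.clip(noisy, 0, 255).astype(np.uint8)

frame = np.full((64, 64), 128, dtype=np.uint8)  # flat mid-gray test frame
noisy = simulate_cmos_noise(frame)
print(noisy.shape, noisy.dtype)
```

A lens model (radial distortion, vignetting, defocus blur) could be layered on top in the same spirit; I am mainly curious whether any step like this exists in your rendering pipeline.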
About RGB to Monochrome:
Is this process implemented with an OpenCV call such as cv2.cvtColor(source_image, cv2.COLOR_BGR2GRAY), or do you use another method?