Replies: 5 comments 1 reply
-
Hello there, thank you for opening an issue! 🙏🏻 The team was notified and they will get back to you asap.
-
Hi @zhmiao, do you want to read multiple files at once, or read the same file faster?
-
Hello @SkalskiP, thanks for responding. I am looking for faster single-video processing; currently it looks like a single for loop over all the frames. I wonder if there is any way to process the video with parallelism.
-
Hi, @zhmiao 👋🏻! Let me convert this issue into a discussion and move it into the Q&A section.
-
@zhmiao, in theory you could try to do something like this:

```python
import threading

import supervision as sv

# Split the video into four frame ranges, one generator per range
generators = [
    sv.get_video_frames_generator(source_path='walking.mp4', end=100),
    sv.get_video_frames_generator(source_path='walking.mp4', start=100, end=200),
    sv.get_video_frames_generator(source_path='walking.mp4', start=200, end=300),
    sv.get_video_frames_generator(source_path='walking.mp4', start=300)
]

def frame_reader(generator_id, frame_generator):
    for frame_id, frame in enumerate(frame_generator):
        print(f"Generator {generator_id}, Frame {frame_id}")

# Create and start one thread per generator
threads = []
for i, gen in enumerate(generators):
    t = threading.Thread(target=frame_reader, args=(i, gen))
    t.start()
    threads.append(t)

# Wait for all threads to complete
for t in threads:
    t.join()
```

But I'm not even sure you will notice any speed improvements.
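Since the question also mentions multiple workers: for CPU-bound per-frame work, processes sidestep the GIL in a way threads cannot, so a `multiprocessing` variant of the same chunking idea may parallelize better. Below is a minimal sketch of the pattern only; `process_chunk` is a hypothetical stand-in for real work (in practice each worker would open its own reader, e.g. its own `sv.get_video_frames_generator` over its range), and the frame count and chunk size are assumed values:

```python
from multiprocessing import Pool

TOTAL_FRAMES = 400   # assumed video length, in frames
CHUNK = 100          # frames handled by each worker

def process_chunk(bounds):
    start, end = bounds
    # Hypothetical stand-in for real per-frame work. In practice each worker
    # would create its own frame generator for [start, end) and run the model
    # on each frame. Here we just count the frames we would have processed.
    return sum(1 for _ in range(start, end))

def parallel_frame_count(total=TOTAL_FRAMES, chunk=CHUNK, workers=4):
    # Build [start, end) bounds covering the whole video, then fan the
    # chunks out to a pool of worker processes.
    bounds = [(s, min(s + chunk, total)) for s in range(0, total, chunk)]
    with Pool(workers) as pool:
        return sum(pool.map(process_chunk, bounds))

if __name__ == "__main__":
    print(parallel_frame_count())  # 400
```

Whether this actually beats a single loop depends on how heavy the per-frame work is relative to the cost of decoding each chunk independently.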
-
Question
Hello. Thank you for the great repo. I wonder if there is any way to optimize process_video with parallelism, using multiple workers or threading? Or are there any resources I can dig into for this purpose? Thank you very much!