180DA-Lab1
Lab1
Sources:
hand-gesture-recognition-mediapipe-main: https://github.com/kinivi/hand-gesture-recognition-mediapipe
Color.py: https://answers.opencv.org/question/200861/drawing-a-rectangle-around-a-color-as-shown/
Detect.py: https://onecompiler.com/python/3xdvdjyjk
Dominant.py: https://code.likeagirl.io/finding-dominant-colour-on-an-image-b4e075f98097
Dominantrect.py: https://stackoverflow.com/questions/73808864/get-most-dominant-colors-from-video-opencv-python
Dominantvideo.py: https://stackoverflow.com/questions/73808864/get-most-dominant-colors-from-video-opencv-python (see the dominant-colour sketch after this list)
Handmotion.py: https://www.youtube.com/watch?v=3xfOa4yeOb0
Image.py: https://docs.opencv.org/4.x/d7/d4d/tutorial_py_thresholding.html
Mediapipe_test.py: https://google.github.io/mediapipe/solutions/hands (see the MediaPipe Hands sketch after this list)
Test.py: From Lab1 PDF
Video.py: https://docs.opencv.org/3.0-beta/doc/py_tutorials/py_gui/py_video_display/py_video_display.html
Test.txt: From Lab1 PDF
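
For context, the dominant-colour scripts (Dominant.py, Dominantrect.py, Dominantvideo.py) follow the general approach from the linked Stack Overflow post: cluster a frame's pixels with k-means and report the largest cluster's centre. Below is a minimal sketch of that idea, not the lab code itself; the webcam index 0, the 160x120 downsample, and k = 3 are assumptions for illustration.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # assumed webcam index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Downsample so k-means on all pixels stays fast
    small = cv2.resize(frame, (160, 120))
    pixels = small.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
    _, labels, centers = cv2.kmeans(pixels, 3, None, criteria, 3, cv2.KMEANS_RANDOM_CENTERS)
    # Dominant colour = centre of the cluster with the most pixels (BGR)
    counts = np.bincount(labels.flatten())
    dominant = centers[np.argmax(counts)]
    # Show the dominant colour as a filled swatch in the corner of the frame
    cv2.rectangle(frame, (10, 10), (110, 110), tuple(int(c) for c in dominant), -1)
    cv2.imshow("dominant colour", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```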
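Similarly, Mediapipe_test.py is based on the MediaPipe Hands solution guide linked above. The sketch below shows the usual pattern from that guide (convert BGR to RGB, run the Hands model, draw the landmarks); the webcam index and confidence threshold are assumptions, and this is not the exact file in the repo.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # assumed webcam index
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for landmarks in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(frame, landmarks, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("hands", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break
cap.release()
cv2.destroyAllWindows()
```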
Colorful.jpg and messi5.jpeg are two test images found through Google Images.
180DA-Lab1 is a placeholder file I created just so there would be something to commit to this repo.