Videos contain a plethora of contextual information. A movie, for example, has fighting scenes, sentimental scenes, romantic scenes, and many others; a cricket match has wickets, sixes, fours, and so on. In this data-driven age, amateurs, researchers, and organisations alike need specific parts of this contextual information: maybe for creating a highlights reel of a sports match, or for mining data from movies for their machine learning models. This makes parts of certain types of videos very useful, and FabBits tries to automate finding them. Following are the things it will be able to detect -

Project repo - github.com/achie27/FabBits
Blog posts - medium.com/@achie27
Samples - Drive folder


You need the following to run FabBits -

  1. Python3
  2. OpenCV - Used for image and video processing
  3. Moviepy - Used for video editing and audio processing
  4. PyQt5 - Used to make the GUI
  5. Scipy - Used for audio processing
  6. Tesserocr - Used for, well, OCR
  7. Pillow - Used to preprocess images for OCR

The Python dependencies can be installed by running -
 pip3 install scipy
 pip3 install opencv-python 
 pip3 install moviepy
 pip3 install pyqt5
 pip3 install Pillow
 pip3 install tesserocr

or if you are the Anaconda kind -

 conda install -c conda-forge scipy
 conda install -c conda-forge opencv 
 conda install -c conda-forge moviepy 
 conda install -c anaconda pyqt
 conda install -c conda-forge pillow
 conda install -c simonflueckiger tesserocr
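Whichever route you take, it is worth sanity-checking the installs before launching the GUI. Below is a minimal sketch of such a check; the `REQUIRED` list uses the import names (which differ from some package names, e.g. `cv2` for opencv-python and `PIL` for Pillow), and note that pip-installed tesserocr also expects the Tesseract engine to be available on your system.

```python
import importlib.util

# Import names of the FabBits dependencies (not the PyPI package names)
REQUIRED = ["cv2", "moviepy", "PyQt5", "scipy", "tesserocr", "PIL"]

def missing_modules(names):
    """Return the subset of module names that cannot be found."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_modules(REQUIRED)
    if missing:
        print("Missing dependencies:", ", ".join(missing))
    else:
        print("All FabBits dependencies found.")
```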


Run the main GUI by -

python3 main.py

To find your FabBit of choice -

You can also run a use-case's file directly to get its FabBit, like -

 python3 goal_detector.py soccer_match.mp4
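For a rough idea of the shape of such a use-case script, here is a hypothetical sketch (it is not the actual goal_detector.py; the helper name and structure are illustrative): take the video path from the command line, walk the frames with OpenCV, and report event timestamps.

```python
import sys

def frame_to_timestamp(frame_idx, fps):
    """Convert a frame index to an hh:mm:ss timestamp string."""
    total = int(frame_idx / fps)
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

def main(path):
    import cv2  # imported lazily so the helper above works without OpenCV
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if metadata is missing
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # ... a detector would inspect `frame` here and, on a hit,
        # print(frame_to_timestamp(idx, fps)) ...
        idx += 1
    cap.release()

if __name__ == "__main__" and len(sys.argv) > 1:
    main(sys.argv[1])
```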


All the references can be found listed in the repository's readme.