Appendices

User Manual

  1. Upload a video from the home page of the webapp, or click the button to view the example analysis.
  2. Once the video has been analysed, you will be redirected to the analysis dashboard, which has a URL unique to this video analysis.
  3. Here you can view the full annotated video.
    • You can also view the clipped shots, filtered by shot type using the dropdown button.
    • You can also download the JSON analysis data or copy the link to this video analysis to share by clicking on the respective buttons. Note that your analysis data is stored on the server for 24 hours and is publicly available via this link during that time.
    • For each clipped shot, you can view its video clip and metrics, as well as its 3D reconstruction by clicking on the respective button. You can also download the 3D animation in .bvh format to import into an application such as Blender, Unity, or Unreal.
  4. In the 3D reconstruction view, you can cycle between shots as well as control the playback speed of the shot animation by toggling the ‘animation’ menu.

Analysis Pipeline API Documentation

extract_joint_frames:

Extracts the 3D coordinates of each joint in world space for every frame, measured in metres with the origin at the centre of the hips

    Parameters: 

        frames (list) -- list of NumPy image arrays representing the video frames 

    Returns: 

        joint_frames (dict) -- dictionary of 3D joint coordinate frames with joint names 
        as the keys 

        mp_landmarks (list) -- MediaPipe joint image landmarks

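A minimal usage sketch follows; it assumes the pipeline functions can be imported from a module in 'src' (the module name 'analysis' below is hypothetical) and uses OpenCV to read the video into frames:

    import cv2
    from analysis import extract_joint_frames  # hypothetical module name

    # Read every frame of the input video into a list of NumPy image arrays.
    cap = cv2.VideoCapture("rally.mp4")
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
    cap.release()

    # joint_frames maps each joint name to its per-frame 3D world coordinates
    # (metres, origin at the centre of the hips); mp_landmarks are the raw
    # MediaPipe image landmarks.
    joint_frames, mp_landmarks = extract_joint_frames(frames)
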
classify_shot:

Classifies a 3D pose animation as a 'backhand', 'forehand', 'service', or 'smash' shot

    Parameters: 

        joint_frames (dict) -- dictionary of 3D joint coordinate frames with joint 
        names as the keys 

    Returns: 

        classification (str) -- shot classification

        confidence (float) -- confidence score

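A hedged usage sketch, continuing from the joint_frames dictionary returned by extract_joint_frames (module name hypothetical):

    from analysis import classify_shot  # hypothetical module name

    # Classify the 3D pose animation and report the classifier's confidence.
    classification, confidence = classify_shot(joint_frames)
    print(f"Predicted shot: {classification} (confidence {confidence:.2f})")
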
analyse_shots:

Analyses all shots performed in the 3D pose animation

    Parameters: 

        joint_frames (dict) -- dictionary of 3D joint coordinate frames with joint 
        names as the keys 

    Returns: 

        shot_analysis (dict) -- dictionary of lists containing analytic entries for
        each shot. Keys are the analytic names: 'intervals', 'classifications',
        'joint_frames', 'speeds', 'hands'

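A sketch of consuming the returned dictionary, assuming each of its lists holds one entry per detected shot and that the lists are aligned by shot index (module name hypothetical):

    from analysis import analyse_shots  # hypothetical module name

    shot_analysis = analyse_shots(joint_frames)

    # Assumes index i of every list describes the i-th detected shot.
    for interval, label, speed, hand in zip(shot_analysis["intervals"],
                                            shot_analysis["classifications"],
                                            shot_analysis["speeds"],
                                            shot_analysis["hands"]):
        print(interval, label, speed, hand)
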
pose2bvh:

Converts joint frames given in world coordinates to .bvh format

    Parameters: 

        joint_frames (dict) -- dictionary of 3D joint coordinate frames with joint 
        names as the keys 

        fps (float) -- frames-per-second of the animation 

    Returns: 

        bvh (str) -- formatted .bvh text

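A minimal sketch of exporting a shot animation to a .bvh file for import into Blender, Unity, or Unreal, assuming a 30 fps source video (module name hypothetical):

    from analysis import pose2bvh  # hypothetical module name

    # Convert the world-space joint frames to .bvh text and write it to disk.
    bvh_text = pose2bvh(joint_frames, fps=30.0)
    with open("shot.bvh", "w") as f:
        f.write(bvh_text)
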
analyse_video:

Performs 3D shot analysis on a video file

    Parameters: 

        video_path -- path to the video file to be analysed

        out_dir -- path to the directory in which to write the analysis results
        (.json file, annotated video and clips, and .bvh files)

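The end-to-end entry point can be driven in a couple of lines; this sketch uses the documented parameter names with a hypothetical module name and example paths:

    from analysis import analyse_video  # hypothetical module name

    # Writes the .json analysis, annotated video and clips, and .bvh files
    # into the output directory.
    analyse_video(video_path="match.mp4", out_dir="analysis_results")
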
Deployment Manual

Dependencies

  • NumPy
  • OpenCV
  • MediaPipe
  • TensorFlow
  • SciPy
  • Flask

Deployment

  1. Ensure the above dependencies are installed and that you are running Python 3.8.
  2. cd into 'src' and run 'python app.py' to start the Flask server.
  3. The webapp can then be accessed by navigating to 'localhost:5000'.
  4. Analysis results are stored in the folder 'src/static/analysis_results'.
  5. If you wish to use the analysis pipeline API, please refer to the provided documentation.


Data Protection

The only data we collect from users are their uploaded videos. We process uploaded videos solely to extract the shot analysis data. The annotated videos and other analysis data are then stored on the server for 24 hours before being deleted. During this time, the videos and analysis data are accessed only to be displayed or downloaded on the frontend. They are publicly available for these 24 hours to anyone with the unique URL for the video analysis. By uploading a video, the user consents to the aforementioned usage and storage of their data.

Software Licence

Our project is licensed under the MIT Licence.

MIT Licence
Copyright (c) 2022 Prithvi Kohli, Morgane Ohlig, and Jin Feng

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.


Development Blog

Link to blog

Monthly Videos


Credit for Bootstrap template

https://github.com/startbootstrap/startbootstrap-clean-blog