Right now, our algorithm makes some assumptions:
- When the start button is pressed, it starts streaming frames.
- We take the frame numbers of the file and the webcam into consideration (a window size of 5-6) and then compute the similarity.
- If there is a sudden lag or an internet-related issue, the frame numbers drift apart, and our algorithm has no way to resume matching from that frame onwards.
An example:
Suppose both the file stream and the webcam stream start at frame number 0. A single matching window could look like this:
0 - 0
1 - 2 (may not be frame 1; frame 1 might have been skipped)
2 - 3
4 - 4 (frame 3 might get skipped due to some issue, e.g. no MediaPipe results found)
5 - 5
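The windowed matching described above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the function names, the angle-vector representation, and the mean-absolute-difference similarity are all assumptions.

```python
def frame_similarity(a, b):
    # Hypothetical similarity between two angle vectors:
    # negative mean absolute difference (higher = more similar).
    return -sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def best_match_in_window(web_angles, file_frames, start, window=6):
    # Compare one webcam frame's angles against the file frames in
    # [start, start + window) and return the best-matching file frame
    # number. Frames skipped on the file side are simply absent.
    candidates = {
        f: frame_similarity(web_angles, file_frames[f])
        for f in range(start, start + window)
        if f in file_frames
    }
    return max(candidates, key=candidates.get)

# Toy data: each file frame stores a 2-element angle vector.
file_frames = {f: [0.1 * f, 0.2 * f] for f in range(10)}
best_match_in_window([0.3, 0.6], file_frames, 0, window=6)  # → 3
```

As long as both counters stay within the same 5-6 frame window, this works; the failure mode described next is when they drift far apart.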
Now suppose we have some issue on the webcam side; the matching might then look something like this:
1000 - 850
1001 - 852
1002 - 853
1004 - 854
1005 - 855
And matching between those does not make any sense. Instead, we expect the file streamer to go back to earlier frames (starting from 840) and then resume matching.
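One way to trigger that rewind is to watch the gap between the two frame counters and reset the file pointer when it grows too large. This is only a sketch under assumptions: the function name, the drift threshold, and the 10-frame back-off margin (matching the 850 → 840 example above) are all hypothetical.

```python
def check_resync(file_frame, web_frame, max_drift=20, margin=10):
    # If the two frame counters have drifted further apart than
    # max_drift (e.g. 1000 vs 850 above), move the file pointer to
    # just before the current webcam frame so matching can resume.
    drift = file_frame - web_frame
    if abs(drift) > max_drift:
        return max(web_frame - margin, 0)
    # Within tolerance: keep streaming the file as-is.
    return file_frame

check_resync(1000, 850)  # → 840, rewind as in the example above
check_resync(5, 4)       # → 5, small drift is normal
```

The dictionary-based storage proposed below makes this rewind cheap, since any frame can be fetched directly by its number instead of seeking through a stream.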
Some possible solutions:
- Instead of streaming the file, store it as a dictionary, where the dict has a schema something like this:

```json
{
    "frame_number": {
        "keypoints": [
            [
                0.5179882645606995,
                0.28025951981544495,
                -0.05498553439974785
            ],
            ....
        ],
        "angle": [
            0.3522091266834007,
            0.01872792213619685,
            0.35149274749276493,
            0.3600268078852646,
            ....
            0.3197111974833256
        ]
    }
}
```

Such that we can access the frame data from the file just by accessing its key, something like this:

```python
data_at_frame_i = json_object[frame_number_from_web]
```

There might be some better ways to solve the same problem. Feel free to come up with a solution.
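A small sketch of that lookup, with two practical wrinkles worth noting: JSON object keys are always strings, so the integer frame number has to be converted with `str()`, and a skipped frame (like frame 3 in the first example) needs a nearest-key fallback. The data values and the `data_at_frame` helper are hypothetical, not part of the existing code.

```python
import json

# Hypothetical per-frame data following the schema proposed above
# (one keypoint triple and a short angle list per frame; frame 2
# is deliberately missing, as if MediaPipe returned no results).
frames = {
    0: {"keypoints": [[0.517, 0.280, -0.054]], "angle": [0.352, 0.018]},
    1: {"keypoints": [[0.520, 0.281, -0.050]], "angle": [0.360, 0.019]},
    3: {"keypoints": [[0.525, 0.283, -0.047]], "angle": [0.365, 0.021]},
}

# Round-tripping through JSON turns the integer keys into strings.
json_object = json.loads(json.dumps(frames))

def data_at_frame(json_object, frame_number):
    # Exact hit: the common case.
    key = str(frame_number)
    if key in json_object:
        return json_object[key]
    # Fallback: return the nearest recorded frame when this one
    # was skipped on the file side.
    nearest = min(json_object, key=lambda k: abs(int(k) - frame_number))
    return json_object[nearest]

data_at_frame_i = data_at_frame(json_object, 2)  # frame 2 missing, nearest wins
```

Loading the whole dict up front also makes the rewind case trivial: jumping back to frame 840 is just another key lookup, with no stream seeking involved.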