
Adding "BEST: BERT Pre-training for Sign Language Recognition with Coupling Tokenization" #61


Conversation

@cleong110 (Contributor) commented Jun 5, 2024

My notes on the process: cleong110#19

A few things:

  • No, I don't know what BEST stands for; it's not spelled out explicitly in the paper, but I believe it is "BERT pre-training for Sign language recognition with coupling Tokenization".
  • It might be good to double-check some of the details about coupling tokenization; I may not have understood it properly.
  • This paper is extremely similar in a number of ways to Adding SignBERT+ #54, to the point where I could not resist referencing it, and I think we need to merge that one first. Both take a similar approach and are evaluated on the same datasets with similar results. Even the table layouts are similar, with both calling their method "Ours +R" in the "from video" results. Neither offers any code.

TODO:

@cleong110 (Contributor, Author)

Merged the latest changes; working on rewrites.

@cleong110 (Contributor, Author)

All right, I did some rewrites. https://chatgpt.com/share/f42199e8-bd03-4ec3-9913-e5e832bb2885 had various suggestions, some of which I incorporated. It consistently struggles with citations that start with [dataset:, sometimes writes out the rendered citation instead, and has a few other issues, but it generally seems helpful.

@AmitMY AmitMY merged commit a901d72 into sign-language-processing:master Jun 12, 2024
1 check failed