Grouping resources #2
Hey, @AmitMY. I think your work is very meaningful. If possible, could you briefly walk me through the contribution process? Although I may not pursue subsequent research in the direction of sign language myself, this issue will always remain open, so that interested people can actively contribute to your work.
Thanks! Mainly, adding text involves editing this large markdown file that holds all of the content: https://github.com/sign-language-processing/sign-language-processing.github.io/blob/master/src/index.md?plain=1. You then open a pull request, and I review it.
I've started on this myself, and am taking notes on the process at cleong110/sign-language-processing.github.io#2. Regarding the dataset JSON format, I have some notes describing the main fields here, or one can just look at some of the existing files in the datasets folder Amit linked above. It's not too difficult: just things like the title, the vocabulary count of the dataset, how many samples/videos there are, and so on. Basically the things that show up in the table of datasets.
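To make that concrete, here is a minimal sketch of what such a dataset entry might look like. The field names below (`title`, `vocabulary_size`, `num_samples`, etc.) are illustrative assumptions, not the repository's actual schema; the authoritative format is whatever the existing files in the datasets folder use.

```python
import json

# Hypothetical dataset entry; field names are assumptions for
# illustration only -- check existing files in the datasets folder
# for the real schema.
entry = {
    "title": "Example Sign Language Dataset",
    "language": "American Sign Language",
    "features": ["video", "gloss"],
    "vocabulary_size": 1000,   # distinct signs in the dataset (assumed name)
    "num_samples": 25000,      # total videos/samples (assumed name)
}

# Serialize the entry the way a dataset JSON file would store it.
print(json.dumps(entry, indent=2))
```

The point is simply that each dataset file is a small, flat record of the statistics shown in the datasets table, so adding one is mostly a matter of filling in a handful of fields.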
Hi!
A while ago I made https://github.com/sign-language-processing/sign-language-processing.github.io
It is clearly incomplete, and could benefit from the inclusion of many of the papers you list in the README.
Personally, I find it more valuable when it is written as prose, with the papers connected to each other (A et al. did this thing with this method, B et al. took it and improved it, C et al. did something else entirely...).
If you feel like contributing in that repository, that would be very much appreciated.