nickgillian / grt

gesture recognition toolkit

How to become a contributor? #91

Open jingyi2811 opened 7 years ago

jingyi2811 commented 7 years ago

Hi all,

Perhaps it is wrong to ask this question here, but I would like to know how I can contribute to this group. I have been working on gesture recognition for quite some time, and I find this project suits me best.

I have worked for almost 10 years as a Java programmer. Even though I don't know anything about C++, I believe I can still contribute, perhaps by writing documentation at first.

Kindly advise.

Regards, Jimmy (Malaysia) 603-0122075069

leekingly commented 7 years ago

This is wonderful! I hope you can make more contributions to GRT.

nickgillian commented 7 years ago

Hi Jimmy,

Any and all contributions are welcome! There are lots of ways you can contribute, for example:

Any help across any of these areas would be greatly appreciated!

You might have noticed that over the last few weeks/months I've been migrating all the examples, the wiki, and the website across to GitHub. This should hopefully make contributions much easier, as you can now just make a pull request if you want to edit the wiki or have an idea that would improve the GRT website.

Thanks!

jingyi2811 commented 7 years ago

Hi nickgillian and leekingly,

Thanks for the warm greeting.

I guess the first step for me will be to improve the documentation. Before that, I still need some time to study all sorts of documents and code.

I am happy to join this big family. I am really serious about gesture recognition and machine learning.

Regards, Jimmy (This is what I look like: https://www.facebook.com/JimmyLee2811) :)

cyberluke commented 7 years ago

Hi Nick, I'm working on a configurable 3D UI in Unity (Win, Mac, Linux, Android, iOS) with the possibility to daisy-chain pipelines & load trained data, and in the future maybe also record your own. The main purpose is to make a Unity SDK or asset for our RealMagic NR (Natural Reality), which is a family of building blocks with motion sensors or EMG sensors, from one finger (a ring) to whole-body capture & recording.

We are currently in the process of making a lot of Unity samples (small games, simulations, a 3D MIDI environment for DJs/producers/VJs). Regarding the hardware, we are preparing preorders for next month, when we present this at Maker Faire Rome. This is just in case someone would be interested in a 3D UI (Unity 3D) or in some open, distributed, WiFi motion sensors with buttons & high precision. BTW: another advantage of a 3D UI is support for Google Cardboard or other AR/VR solutions.
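For anyone wanting to drive a front end like this from previously trained data, the usual pattern on the GRT side is to load a saved GestureRecognitionPipeline and call predict() on each incoming sample vector. Below is only a minimal sketch based on the public GRT examples; the file name, input dimension, and sample values are placeholders, and the exact save/load method names can differ between GRT versions.

```cpp
// Minimal sketch: load a previously trained GRT pipeline and classify one sample.
// "my_pipeline.grt" and the 3-dimensional input are hypothetical placeholders.
#include <GRT/GRT.h>
#include <iostream>
using namespace GRT;

int main() {
    GestureRecognitionPipeline pipeline;

    // Load a pipeline that was trained and saved earlier (e.g. from recorded sensor data)
    if( !pipeline.load("my_pipeline.grt") ){
        std::cout << "Failed to load pipeline!\n";
        return EXIT_FAILURE;
    }

    // One incoming sample, e.g. accelerometer x/y/z from a motion sensor
    VectorFloat sample(3);
    sample[0] = 0.1; sample[1] = -0.4; sample[2] = 0.9;

    // Run the sample through the pipeline and read back the prediction
    if( pipeline.predict( sample ) ){
        std::cout << "Predicted class: " << pipeline.getPredictedClassLabel()
                  << " likelihood: " << pipeline.getMaximumLikelihood() << "\n";
    }

    return EXIT_SUCCESS;
}
```

A wrapper like this (exposed over a socket or a native plugin) is one way a Unity scene could consume GRT predictions without reimplementing the classifiers in C#.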