ImageMonkey / imagemonkey-core

ImageMonkey is an attempt to create a free, public open source image dataset.
https://imagemonkey.io

Migrate away from travis-ci #286

Closed bbernhard closed 3 years ago

bbernhard commented 3 years ago

travis-ci seems to no longer be providing CI minutes for open source projects: https://news.ycombinator.com/item?id=25338983

As we are using travis-ci extensively, we should migrate away from travis-ci as fast as possible.

dobkeratops commented 3 years ago

Just curious if you intend to keep this running -

I think I remember you saying you didn't have as much time to put into this, so it was kind of on hold.

You're still paying for an instance to run it, I guess? If you don't get many volunteers making submissions and annotating, you might consider winding it down? Or do you intend to go back to it some day?

I just got hold of another PC and GPU (RTX 3080), giving me a bit more spare power to experiment with training. I've still got the RTX 2070S as well, and I still mostly use a 970 day to day (prior to this my spare GPUs were running in much older machines).

If you do end up winding this down I’d probably want to grab a dump of the data already here first.

With the lockdowns I haven't been anywhere to get any new photos and haven't bothered scraping more. I note there are still thousands of un-annotated examples (and I'd guess the majority of images are labelled but not yet annotated).

bbernhard commented 3 years ago

Hi,

Just curious if you intend to keep this running -

I think I remember you saying you didn't have as much time to put into this, so it was kind of on hold.

Yeah, at the moment I have quite a lot of paid projects going on, so I do not have that much time anymore to put into improving the platform. But I still try to reserve at least 5 hrs/week for improving ImageMonkey. As I do not have that much time, I mostly focus on small improvements, e.g.:

So there's still a bit going on in the background :)

I've noticed that with my limited time, big features are a bit cumbersome, as I often lose focus with all the context switching. So I have put those on hold for now.

If you do end up winding this down I’d probably want to grab a dump of the data already here first.

I've noticed that you are regularly contributing something, which is totally awesome. So, as long as somebody is using the platform and I do have enough "play money" for projects that I love, I'll keep the service up and running. The only thing that I might wind down soon is my GPU instance, which I rented for playing with neural nets. But I've mainly used that for experimenting, so it shouldn't affect the service.

If nobody is using it for months (and I still do not have more time to contribute), I'll maybe shut it down at some point. But if that happens, I'll definitely create a full database dump with some instructions in case someone wants to use the data. That reminds me that I should probably create another snapshot for the Internet Archive :)

I just got hold of another PC and GPU (RTX 3080), giving me a bit more spare power to experiment with training. I've still got the RTX 2070S as well, and I still mostly use a 970 day to day (prior to this my spare GPUs were running in much older machines).

Awesome! Out of curiosity: are there any projects you are currently using the GPUs for? I always love to read about people's experiments with neural nets.

dobkeratops commented 3 years ago

Right, I've tried to keep this idea of a community-maintained dataset alive; it's always there to kill a bit of time.

Currently the previous spare PC is unplugged (it's pretty old... 1st gen Core i7). I've bought this new one with the express intention of leaving it training. I figured a new PC was the only way to get an RTX 3080 with the current shortages.

Prior to that I had tinkered a little with PyTorch using CIFAR and various simple nets, but didn't really do anything serious (it was interesting seeing how to implement all the tricks like residuals and shortcuts, layer dropout etc. in PyTorch; it's very flexible with auto-differentiation).
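
(For illustration only: a minimal sketch of the kind of residual block with a shortcut connection and dropout mentioned above, assuming PyTorch. The class name, channel count, and dropout rate are made up for this example and not taken from any actual experiment.)

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions plus a shortcut (identity) connection."""
    def __init__(self, channels, p_drop=0.1):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.drop = nn.Dropout2d(p_drop)  # simple feature-map dropout; the rate is arbitrary here

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.drop(self.bn2(self.conv2(out)))
        return torch.relu(out + x)  # the shortcut: add the block's input back in
```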

In the long term my interests remain the same: looking for ways to use NNs to assist 3D graphics. Just plain recognition could be used to enhance photogrammetry scans.

Hard to compete with the experiments done by the tech giants with their big computing resources, but I hope it will be possible to bolt something onto a pretrained net (a lot of people just retrain them, and I think one can just peel back a couple of layers and add branches...).
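
(Again just a hedged sketch of the "peel back a couple of layers and add branches" idea, assuming a torchvision pretrained backbone; the backbone choice, head sizes, and class count are placeholders, and the `weights=` argument needs torchvision >= 0.13.)

```python
import torch.nn as nn
from torchvision import models

# Load a pretrained backbone and freeze its weights (older torchvision: pretrained=True).
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in backbone.parameters():
    param.requires_grad = False

# "Peel back" the final classification layer and bolt on a new branch/head.
num_features = backbone.fc.in_features
backbone.fc = nn.Sequential(
    nn.Linear(num_features, 256),   # hidden size is a placeholder
    nn.ReLU(),
    nn.Dropout(0.2),
    nn.Linear(256, 10),             # 10 = placeholder number of custom classes
)
# Only the new head's parameters remain trainable; the frozen backbone acts as a feature extractor.
```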

bbernhard commented 3 years ago

Awesome to hear!

Hard to compete with the experiments done by the tech giants with their big computing resources, but I hope it will be possible to bolt something onto a pretrained net (a lot of people just retrain them, and I think one can just peel back a couple of layers and add branches...).

Agreed! I think there's no chance to compete with the big ones in terms of size. I can't imagine the amount of data companies like Google have access to in order to train their neural nets. I think we are currently in a state where neural nets are good enough for playing a bit with them (and sometimes even getting impressive results), but for the majority of tasks they are still a bit too unreliable (at least when run unsupervised). But I am really hoping that this changes a bit in the future, as the technology matures and people come up with different training algorithms. And I think at that point it becomes more and more important to have neural nets that are trained on open source data.

I am really looking forward to hearing more about your NN experiments!

bbernhard commented 3 years ago

the CI is now working again :tada: