mdehollander opened this issue 6 years ago
Unfortunately, we don't have comprehensive documentation for deploying the OSF. We would like to, but we don't have the resources right now to support users' deployments.
That said, if your institute requires that data be stored in its region, your best bet is to use one of our hosted storage regions, or to connect your Amazon S3 or ownCloud account to your OSF projects.
I think the OSF code on GitHub is mainly intended for setting up a local development environment. Fortunately, I was able to build it on a single server, with a different domain name for each container service, and I am now investigating how to build a Kubernetes environment. However, I still have some difficulties, because our infrastructure sits behind a proxy, which often blocks access between containers. In addition, the Ember.js services such as Preprints, Registries, and My Quick Files still do not work in our environment. Fixing this will require a lot of investigation on my side, both on the technical details and on the OSF specifications, and probably substantial help from OSF experts. Based on this experience, I would like to contribute some fixes for building the environment as pull requests on GitHub someday.
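One thing that may help with the proxy problem is setting the standard proxy variables and exempting the container hostnames from the proxy. A rough sketch for `.docker-compose.env` (all values are examples, and the service names in `NO_PROXY` must match the services in your compose file):

```
# Example only: proxy settings so containers can reach the outside world
# while inter-container traffic bypasses the proxy.
HTTP_PROXY=http://proxy.example.org:3128
HTTPS_PROXY=http://proxy.example.org:3128
NO_PROXY=localhost,127.0.0.1,web,api,postgres,elasticsearch,wb
```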
@BRosenblatt and @HiroyukiNaito, thanks for your answers.
I understand that you need people to maintain documentation. It would be very helpful if that happened in the future. The OSF is a great open-source platform, and it would be great if it could be deployed across the globe; at the moment, good documentation for that is missing.
I got the main interface running, but did not look into other services like Preprints and Registries. Too bad you haven't gotten those up and running.
I am facing difficulties configuring the default OSFStorage. I managed to connect an ownCloud/Nextcloud addon and upload/download files. On the osf.io website the default storage provider is Amazon, I think. For a local installation I am not sure which values to change; I guess it is something in `addons/osfstorage/settings/local.py`. For testing purposes I would like to store the files on the local filesystem, but it would also be good to know which config values to set to connect it to Amazon.
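For reference, here is a minimal sketch of what such a `local.py` override might look like. The setting names are assumed to mirror `addons/osfstorage/settings/defaults.py` in your checkout, and all values are placeholders, so verify against that file before using any of this:

```python
# Hypothetical addons/osfstorage/settings/local.py (verify names against defaults.py).

# Local filesystem storage via WaterButler's filesystem provider (testing only).
WATERBUTLER_CREDENTIALS = {
    'storage': {},
}
WATERBUTLER_SETTINGS = {
    'storage': {
        'provider': 'filesystem',
        'folder': '/code/website/osfstoragecache',
    },
}
WATERBUTLER_RESOURCE = 'folder'

# Amazon S3 instead (untested sketch; key names follow WaterButler's S3 provider):
# WATERBUTLER_CREDENTIALS = {
#     'storage': {'access_key': 'AKIA...', 'secret_key': '...'},
# }
# WATERBUTLER_SETTINGS = {
#     'storage': {'provider': 's3', 'bucket': 'my-osf-bucket'},
# }
# WATERBUTLER_RESOURCE = 'bucket'
```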
I also had issues with network connectivity. I set my own local domain name in `/etc/hosts` and had to change `WATERBUTLER_URL` in `.docker-compose.env` to this domain in order to get the addons working.
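For anyone trying the same, the changes might look roughly like this (the hostname, IP address, and port are examples; use whatever your compose file maps for WaterButler):

```
# /etc/hosts (example values)
192.168.1.50    osf.example.local

# .docker-compose.env (example value)
WATERBUTLER_URL=http://osf.example.local:7777
```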
If we get things working, maybe we can use our experiences to start the documentation. That is my intention in opening this issue.
I would also like to use OSF on premise. We are not allowed to share files with the public, and not allowed to store files anywhere other than on our own servers. If we, as a community, could develop documentation on how to set up OSF on premise, that would be awesome.
Are there any shared resources yet?
I would just like to leave a +1 on this. Hosting OSF locally, and maybe even federating it with other installations would be a total game changer. Many institutions would like to manage their projects but are not willing or allowed to store anything (including metadata and project details for internal use) outside their premises.
Maybe other actors could take on contributing to the code, with a proper CI (suited to customizing an OSF installation) and federation, including the ability to publish to a global network of OSF nodes.
The documentation would be the first step to make such development possible.
I have a half-baked dream of spinning up software related to Dataverse (you can deposit data from OSF into Dataverse) using the Kubernetes config in https://github.com/IQSS/dataverse-kubernetes . I'm coming at this from the Dataverse developer perspective of wanting to ensure that integrations are tested regularly. Right now I think we rely on users to tell us if we broke something.
I got this idea of spinning up related software in our ecosystem in Kubernetes from @craig-willis, who created https://www.workbench.nationaldataservice.org which is described at http://www.nationaldataservice.org/platform/workbench.html and has "specs" for various software (including Dataverse, CKAN, Globus, Jupyter, etc.) at https://github.com/nds-org/ndslabs-specs
I have no idea if anyone has any time to work on any of this though. It's just a thought.
I agree that docs are crucial. Maybe someone from the OSF community could apply for https://developers.google.com/season-of-docs/
Hi, @RightInTwo!
I haven't tried it, but would using the osf helm-charts (https://github.com/CenterForOpenScience/helm-charts) make a custom installation easier?
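If anyone wants to experiment, installing from a local clone of that repo might look roughly like this. The chart directory name, release name, and values file are guesses, so check the repo's README for the actual chart names and required values:

```
git clone https://github.com/CenterForOpenScience/helm-charts.git
cd helm-charts
# The "osf" chart path and values file are assumptions; adjust to the repo layout.
helm install my-osf ./osf -f my-values.yaml
```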
I would also be interested in this feature. Let's at least collect our success stories of local installations somewhere.
Any news? My organisation would be interested in some pointers....
@CaptainSifff if your organization is interested in running its own OSF instance, I suggest they look at the implementation behind RDM-osf.io. That's essentially the only large fork of our project that is actively used and open source. If you'd like to use just a portion of our functionality, I'd recommend integrating with our services as-is, such as WaterButler and modular-file-renderer, or forking one of those services. I'd also strongly recommend using our new OAuth2 capabilities to make use of our file storage and REST API.
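To illustrate the REST API route, here is a minimal sketch using a personal access token against the public API (the token and node ID are placeholders; a self-hosted instance would substitute its own API domain):

```python
import requests

# Placeholders: use your own personal access token and node ID.
API_BASE = "https://api.osf.io/v2"
TOKEN = "YOUR_PERSONAL_ACCESS_TOKEN"
NODE_ID = "abcde"

headers = {"Authorization": f"Bearer {TOKEN}"}

# Check which user the token belongs to.
me = requests.get(f"{API_BASE}/users/me/", headers=headers)
me.raise_for_status()
print(me.json()["data"]["attributes"]["full_name"])

# List files stored in the node's osfstorage provider.
files = requests.get(f"{API_BASE}/nodes/{NODE_ID}/files/osfstorage/", headers=headers)
files.raise_for_status()
for item in files.json()["data"]:
    print(item["attributes"]["name"])
```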
We are somewhat cagey about writing precise instructions for setting up osf.io because the data model and external integrations we are writing change frequently, and we simply don't have the resources to walk people through the process individually. But if you are representing an institution and want more details, please contact us at support@osf.io and we can talk about what is best for your use case. Mention that you had a conversation with John Tordoff, name the organization you represent, and we will answer your questions to the best of our ability.
I think it would be very useful to have a document with best practices from people who have set up a local OSF instance at their home institute. In the issue list I noticed several people have worked on this in the past (https://github.com/CenterForOpenScience/osf.io/issues/6248, https://github.com/CenterForOpenScience/osf.io/issues/7219, https://github.com/CenterForOpenScience/osf.io/issues/6255, https://github.com/CenterForOpenScience/osf.io/issues/7347, https://github.com/CenterForOpenScience/osf.io/issues/7805, https://github.com/CenterForOpenScience/osf.io/issues/8493). The Docker Compose document is a good starting point for a local installation. However, my experience is that it does not cover specific application configuration, like when to create a `local.py` and which values to change. For me it is also not clear how to set up working with files on a local installation. I guess this goes via WaterButler, but any input or best practices on this topic would be highly appreciated. Also, which storage option is used as default (Amazon S3, or another object storage system like Swift, Ceph or MinIO)?
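One way to answer the default-storage question on a running local instance might be to inspect the osfstorage settings from the Django shell. A sketch, assuming the module path and setting names match the addon's settings package in your checkout:

```python
# Run inside the web container, e.g.:
#   docker-compose run --rm web python manage.py shell
# Module path and setting names are assumptions; check addons/osfstorage/settings/.
from addons.osfstorage import settings as osfstorage_settings

# Shows which WaterButler provider (filesystem, s3, ...) osfstorage is configured to use.
print(osfstorage_settings.WATERBUTLER_SETTINGS)
print(osfstorage_settings.WATERBUTLER_CREDENTIALS)
```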
@umardraz, @mfraezz, @yacchin1205, @sloria, @mattvw, @antonwats, @jpkeith4, @HiroyukiNaito, I would appreciate it if you could share your experiences with setting up a local OSF installation. Did you succeed in setting up an instance? What was the most difficult part? Is it still in use?