netlify / ask-netlify

A place to submit questions for Netlify to answer in tutorials, podcasts and blog posts
https://ask.netlify.com

Best practices for deploying prebuilt content #60

Open gwk opened 5 years ago

gwk commented 5 years ago

I am currently using Netlify to deploy sites that I generate locally. I understand that the deploy system is designed with static site generators in mind, and that Netlify expects to run a build process on the server side. However, for small / single-developer projects, it sometimes makes more sense for me to run the build on my local machine and then just push the results to Netlify.

Currently I am using a separate repository for the site content. This is basically 1990s web development, except using git instead of FTP. I have been tracking the deploy repository as a submodule of the source repository. I set it up this way so that I wouldn't bloat the source repository, but it is cumbersome and I would like to do away with the submodule setup.

For the moment, I can probably afford to check everything into the source repo. However, over time, images and large amounts of generated HTML are likely to cause git repo bloat. Do you have any suggestions?

One option would be to use Git LFS, but I have no experience with it yet. I wonder whether I could use Git LFS to track the entire build directory, not just certain file types. Are there pitfalls with using it that way?
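For what it's worth, Git LFS is configured through `.gitattributes` patterns, and a glob can cover a whole directory rather than particular extensions. A minimal sketch, where `public/` is a placeholder for the actual build output directory:

```gitattributes
# Track every file under the build output directory with Git LFS,
# regardless of file type. "public/" is a hypothetical directory name.
public/** filter=lfs diff=lfs merge=lfs -text
```

One caveat to verify: hosts that clone the repository may check out LFS-tracked files as small pointer files unless LFS support is explicitly enabled on their side, so it is worth confirming how Netlify's deploy pipeline handles LFS objects before committing to this layout.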

Side question: if I push a refactoring commit, presumably Netlify still has to run the build process. Is there a way for the build system to signal "no change"?
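One mechanism worth investigating for the "no change" case: Netlify supports an `ignore` command in `netlify.toml`, which cancels the build when the command exits with status 0. A hedged sketch, assuming the generated site lives under a directory named `site/`:

```toml
# netlify.toml — skip the build when nothing under site/ changed
# between the last deployed commit and the current one.
# $CACHED_COMMIT_REF and $COMMIT_REF are provided by Netlify's build
# environment; "site/" is a placeholder for the real output directory.
[build]
  ignore = "git diff --quiet $CACHED_COMMIT_REF $COMMIT_REF -- site/"
```

With this in place, a pure refactoring commit that leaves the deployed directory untouched would not trigger a rebuild.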

For context, I am developing an experimental build system for deep data-cleaning pipelines. The idea is that it can do end-to-end builds from original sources, through clean data intermediates, to a final website. Depending on the project, the build times can be very long, so it is left up to the user to manage when they save intermediate products and rebuild. In other words, "just let Netlify build it" does not really make sense when the inputs are gigabytes of raw data.

The project is here: https://github.com/gwk/muck/ The (old) project paper is here: https://www.cjr.org/tow_center_reports/muck-a-build-tool-for-data-journalists.php

The paper is now almost two years old, the documentation is out of date, and the tool is still undergoing major changes. I think Netlify could be a great match for this kind of work, but the deploy model does not quite make sense yet. In the future I could imagine adding some integration with Muck and Git LFS or even with the Netlify CLI if that made sense.

Thanks for reading! -George

j-f1 commented 5 years ago

Here are the docs for manual deploys on Netlify: https://www.netlify.com/docs/cli/#manual-deploys. TL;DR:

```shell
$ # run these once:
$ netlify login
$ netlify --telemetry-disable # optional
$ netlify link --name your-site-name
$ # run this to deploy:
$ netlify deploy --dir=build-dir/ --prod
```

gwk commented 5 years ago

Jed, thank you very much. Since posting this I realized that the community forum is a better place to continue this conversation.