Closed: michaell4438 closed this issue 11 months ago
Relates to #592
The easiest way to host the files locally is with `python -m http.server`, but I think it would be better to find some other way to temporarily host the website online.
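To make the local-hosting suggestion concrete, here is a minimal sketch of what `python -m http.server` does, driven programmatically from Python. The `build` directory and its `index.html` are faked here for illustration; in the real workflow this would be Docusaurus's `build/` output.

```python
# Sketch: serving a static "build" directory locally, as
# "python -m http.server" would, but self-contained for demonstration.
import functools
import http.server
import pathlib
import tempfile
import threading
import urllib.request

# Fake a build output directory with a single index.html.
build = pathlib.Path(tempfile.mkdtemp())
(build / "index.html").write_text("<h1>hello</h1>")

# Serve that directory on an OS-assigned port.
handler = functools.partial(http.server.SimpleHTTPRequestHandler,
                            directory=str(build))
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the page the same way a browser would.
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/index.html") as resp:
    status = resp.status
print(status)  # 200
server.shutdown()
```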
Well actually, I was looking at some of the files for myself, and it looks like all links originate from whatever is set in the `baseUrl` option in `docusaurus.config.js`, which is just `/`. There is also the `url` option, which is `https://robotics.xbhs.net`. My suspicion is that the actual files in the site do not care what our URL is, only that files can be accessed via `/<file-name>`; so if we are in the `file://` context, the browser tries to access `file:///<file-name>`.
Here's a screenshot from the browser console showing what I mean:
So we just need some way to convince the browser that the location of the site (say `/home/michael/Downloads/HelpPage`) can be reached via `file:///`. Using `theBook.pdf` as an example, the real location is `/home/michael/Downloads/HelpPage/theBook.pdf`, but we have to convince the browser that it is actually `file:///theBook.pdf`.
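The resolution behavior described above can be demonstrated with Python's `urllib.parse.urljoin`, which models how a browser resolves a link against the current page's URL. The paths are the ones from the example above.

```python
from urllib.parse import urljoin

# The page the browser has actually loaded from disk:
page = "file:///home/michael/Downloads/HelpPage/index.html"

# A root-relative link ("/theBook.pdf") jumps to the filesystem
# root, which is the wrong file:
broken = urljoin(page, "/theBook.pdf")
print(broken)  # file:///theBook.pdf

# A relative link ("theBook.pdf") resolves next to the page,
# which is the right file:
working = urljoin(page, "theBook.pdf")
print(working)  # file:///home/michael/Downloads/HelpPage/theBook.pdf
```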
Relative paths. Instead of `/assets/whatever.js`, use `assets/whatever.js`. Editing the paths manually works to make `index.html` load the main runtime JS, but I get webpack errors.
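The manual edit being described can be sketched as a one-line rewrite over the built HTML. The HTML fragment and asset names below are made up for illustration, and, as noted, this fix is incomplete: the webpack runtime still generates absolute URLs at run time.

```python
import re

# A made-up fragment of a built index.html using root-relative asset URLs.
html = ('<script src="/assets/js/runtime-main.js"></script>'
        '<link href="/assets/css/styles.css">')

# Drop the leading slash on src/href attributes pointing into /assets/,
# turning root-relative paths into relative ones.
relative = re.sub(r'(src|href)="/(assets/)', r'\1="\2', html)
print(relative)
```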
> My suspicion is the actual files in the site do not care what our URL is, only that files can be accessed via `/<file-name>` [...]
Because of this, we could do the same with a temporary host using something like `workers.dev`.
> Relative paths. Instead of `/assets/whatever.js`, use `assets/whatever.js`.
The problem is there is no way to tell Docusaurus to use relative paths.
My thought was to use Cloudflare Pages for the test builds because it keeps old deployments. For example, I created this just now: Production test and Preview test. It keeps all of the old builds like Preview, Production, old production 2, and old production 1.
We could have an Actions script that deploys to something like `xbhs-robotics-docs-dev.pages.dev`, and an automatic comment that links to that deployment with the hash.
And it looks like we could use something like https://github.com/cloudflare/pages-action inside Actions to give us full control over the upload (instead of the default Pages-GitHub integration that deploys on every push). I can start working on this using my account and then switch it over to the XBHS account if we decide to actually use this.
If you want to work on this, you are welcome to, but I have too much work on my hands right now.
I am already working on this. I'm currently refactoring the Pages build action to use multiple steps, which will be necessary to make sure this works. The main problem is that I don't have secrets access to the repo so I won't be able to test it out for real.
Strange bug: on https://robotics.xbhs.net/, clicking the buttons in the header goes to a 404 page until the page is refreshed.
It would probably be helpful if we could build the docs for branches other than master.
The problem is the docs expect to be hosted on robotics.xbhs.net, or to run under the development server. If you ship just the site contents, it won't work, because the page `index.html` expects to be located at `robotics.xbhs.net/index.html`, but in reality it is located at `file://..../index.html`. And we can't host multiple instances of the same site on the same domain, because that doesn't work.