bouncepaw / mycorrhiza

๐Ÿ„๐Ÿ“‘ Filesystem and git-based wiki engine for the independent web written in Go and using Mycomarkup as its primary markup language.
https://mycorrhiza.wiki
GNU Affero General Public License v3.0

Replace default static files #225

Open juev opened 5 months ago

juev commented 5 months ago

Currently, static files are served from a filesystem built from two sources:

  1. embedded files
  2. files in the static directory

The Open function searches the sources in that order and returns the first file it finds.

As a result, a file present in the embedded source cannot be overridden; there is no way to simply replace it when needed.

How about searching the sources starting with the static directory instead, falling back to the embedded files only when the requested file is not found there?

This would make it much easier to manage styles, scripts, and files like robots.txt.
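A minimal sketch in Go of the proposed lookup order (not Mycorrhiza's actual code: layeredFS is a made-up name, and fstest.MapFS stands in for the real embed.FS variable):

package main

import (
	"fmt"
	"io"
	"io/fs"
	"os"
	"testing/fstest"
)

// layeredFS tries the on-disk static directory first and falls back
// to the embedded files only on a miss, inverting the current order
// in which the embedded copy always wins.
type layeredFS struct {
	dir      fs.FS // files from the static directory
	embedded fs.FS // files compiled into the binary
}

func (l layeredFS) Open(name string) (fs.File, error) {
	if f, err := l.dir.Open(name); err == nil {
		return f, nil
	}
	return l.embedded.Open(name)
}

func main() {
	// Stand-in for the embedded source.
	embedded := fstest.MapFS{
		"robots.txt": {Data: []byte("User-agent: *\nDisallow: /\n")},
	}
	static := layeredFS{dir: os.DirFS("static"), embedded: embedded}

	f, err := static.Open("robots.txt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	defer f.Close()
	// Prints static/robots.txt when it exists, the embedded copy otherwise.
	io.Copy(os.Stdout, f)
}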

bouncepaw commented 5 months ago

For now you can redefine static files with a reverse proxy. Some people use nginx for that.
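A rough nginx sketch of that (the override path and the 127.0.0.1:1737 upstream are assumptions about your setup, not fixed values): inside your server block, serve your own copy for the one path you want to replace and proxy everything else to the wiki.

location = /robots.txt {
    alias /srv/wiki-overrides/robots.txt;
}

location / {
    proxy_pass http://127.0.0.1:1737;
}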

The question is, why would one want to replace these files? 🤨

juev commented 5 months ago

For example, to completely redefine the styles, or to redefine the robots.txt file.

That way the styles don't have to be built up as one large file that repeats every rule from the defaults just to override them.

Eragonfr commented 2 months ago

The default robots.txt disallows all user agents, but people probably want their wiki to be available on search engines. And personally, I don't want to run a separate server only to serve one file. So being able to replace it completely would be great.

For the styling, it's easy enough to overwrite the defaults with the custom.css file: set every value that exists in the default style. But the result can be a bit bloated.

bouncepaw commented 2 months ago

> For example, to completely redefine the styles,

You can do that, see the file structure docs. static/default.css is the file you need to create.
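In other words, assuming a wiki directory laid out as the file structure docs describe, the override is just a file dropped into the wiki's static directory:

wiki/
└── static/
    └── default.css    (served instead of the embedded default.css)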

> For the styling, it's easy enough to overwrite the defaults with the custom.css file.

See above.

> The default robots.txt disallows all user agents, but people probably want their wiki to be available on search engines.

This is not true: Mycorrhiza wikis do show up in search engines, DuckDuckGo for example. In the default robots.txt, you can see that only the following subset is allowed to be indexed:

Allow: /help/
Allow: /hypha/
Allow: /recent-changes
Allow: /list
# the rest is Disallow

I'm not a big fan of dozens of spiders triggering Git operations for outdated historical content, so I limited the indexed scope to the useful stuff only.

Moreover, it turns out robots.txt can already be redefined. I didn't remember that! I think I should make that more transparent and better known.

> As a result, a file present in the embedded source cannot be overridden; there is no way to simply replace it when needed.

I don't think that's true. Can you please provide reproduction instructions?

Eragonfr commented 2 months ago

> This is not true: Mycorrhiza wikis do show up in search engines.

Sorry, it was probably a problem with my browser cache.

> Moreover, it turns out robots.txt can already be redefined. I didn't remember that! I think I should make that more transparent and better known.

I just tried it to remove ChatGPT from my website and I can confirm that it works. Maybe all that is missing is some docs.
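For anyone landing here, a sketch of such an override, placed at static/robots.txt in the wiki directory (GPTBot is OpenAI's crawler; the Allow lines mirror the defaults quoted above):

User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /help/
Allow: /hypha/
Allow: /recent-changes
Allow: /list
Disallow: /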