shu8 opened 7 years ago
Unfortunately this is not implemented by Firefox. Please vote for this issue or leave a comment there: https://bugzilla.mozilla.org/show_bug.cgi?id=1266960
Thanks.
@derjanb thanks for the quick reply! That's really annoying. I've voted on the bug, but it seems like I'm the only one!? Have I voted in the right place? :/
@shu8 It's the only place I've found so far. You can also comment on that issue and explain why it is important to you.
On the other hand, opening a file or "blob" in an external editor is supported by Firefox, and that is the main reason I'd like to use @require file:// to begin with: as a workaround for editing in an external editor.
See e.g. It's All Text!, which opens an external editor.
Perhaps in the meantime, the documentation could be updated, e.g. in Make script editable in an external editor referenced by the FAQ, and on the Tampermonkey • Firefox page, which in fact specifically mentions this as possible:
Tampermonkey's editor is nice, but you do have your own, that is faster, better, ... The solution is to enable the file access and then you can @require the local copy of your script.
Besides Firefox, it seems that Edge is also affected, but I didn't find an existing issue.
Does one already exist, or am I wrong and it should work?
Update: With the release of Firefox 57 (Quantum) this week, only WebExtensions are supported from now on.
The WebExtensions API does not allow access to local files, and probably never will. I believe this issue can therefore be closed, as "use a local file" is no longer possible in any extension.
In FF57, extensions like It's All Text! also no longer work.
Quoting https://stackoverflow.com/a/44516256
I finally found the way to do this using the Fetch requests and FileReader APIs.
We're using this method in Stylus (WebExtension) for live-reload of usercss styles in FF57.
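For reference, the gist of that answer is a sketch like the following (assuming the extension context is allowed to fetch file:// URLs; the path is illustrative):
// Fetch the local file as a Blob, then read its text with FileReader.
fetch('file:///path/to/script.user.js')
    .then(resp => resp.blob())
    .then(blob => new Promise((resolve, reject) => {
        const reader = new FileReader();
        reader.onloadend = () => resolve(reader.result);
        reader.onerror = reject;
        reader.readAsText(blob);
    }))
    .then(text => {
        // "text" now holds the file contents
        console.log(text);
    });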
@tophf Thanks for the tip. TM beta 4.5.4619 uses it. We'll see how long it keeps working.
@all: Please note that you have to enable script file access on Tampermonkey's options page first.
How about having a small local file server and auto-fetching from it? The server would just need liberal CORS settings, and ideally be configured to serve only your Tampermonkey scripts.
It really adds to the pain of hacking on userscripts if I have to manually copy them out of my text editor all the time. Using a text editor inside a browser is just not OK. I tried the "edit-in-emacs" extension, but I could barely get it working even for GitHub comments, and never got it to work with Tampermonkey's editor.
Perhaps the existing update-script functionality could be used with a localhost URL? I've put down a TODO note to investigate this possibility.
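If that pans out, the metadata might look something like this (untested sketch; the localhost URL and file name are illustrative):
// ==UserScript==
// @name         my-script (dev)
// @version      0.1
// @updateURL    http://localhost:8000/my-script.user.js
// @downloadURL  http://localhost:8000/my-script.user.js
// ==/UserScript==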
A local web server worked for me on the initial load, though the script (included via @require) was cached and then didn't update when I made changes.
I'm assuming the caching is done by Tampermonkey. Maybe a non-caching version of @require might be possible?
FWIW, this is no longer necessary for me, because it turns out to be quite easy to convert a Tampermonkey script into a browser extension. Browser extensions support easy reloading from local files. Here's the commit where I converted my project to a browser extension: https://github.com/mgsloan/todoist-shortcuts/commit/6875896ff7a802f287bc37284131eac8f4a5171d
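For reference, the core of such a conversion is little more than a manifest that loads the same code as a content script (a minimal sketch; file names and the match pattern are illustrative):
{
    "manifest_version": 2,
    "name": "todoist-shortcuts (sketch)",
    "version": "1.0",
    "content_scripts": [
        {
            "matches": ["https://todoist.com/*"],
            "js": ["src/todoist-shortcuts.js"]
        }
    ]
}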
@Hidden50: About caching: I think standard browser caching is taking place; if the webserver doesn't send cache headers, the browser does the standard thing. I can't find a reference to it at the moment, but I seem to recall that "the standard caching thing" is to cache things for some constant * age_of_resource (this matches HTTP's heuristic freshness, typically some fraction of the time since Last-Modified). And my local webserver - twistd -n web -p 8000 --path . - does send the Last-Modified header from which age_of_resource can be derived.
So yeah, it is annoying and not perfect. What I've done is touch the files that I know will change, keep a separate browser tab where that resource is open, clear the cache, then refresh the browser. In the network tab of the debugger you can see whether the resource is cached or not. I don't think I've seen a situation where the network tab disagreed with Firefox's caching (I'm using Firefox). I'd be surprised if Chrome were any different.
Ideal would be a local webserver where you can control the caching headers, so the browser doesn't cache at all, but I didn't bother because this worked well enough.
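For example, a tiny Node.js server that disables caching outright (a sketch, no security hardening; serves whatever is in the current directory):
// dev-server.js - serve the current directory with caching disabled
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
    const file = path.join(process.cwd(), req.url.split('?')[0]);
    fs.readFile(file, (err, data) => {
        if (err) {
            res.writeHead(404);
            res.end('not found');
            return;
        }
        res.writeHead(200, {
            'Content-Type': 'text/javascript; charset=utf-8',
            'Cache-Control': 'no-store' // never cache; edits show up on the next refresh
        });
        res.end(data);
    });
}).listen(8000);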
Requiring local files still works fine in Firefox up to at least version 60.0. You just have to enable the corresponding config option. (This is needed so that scripts can't access local files without the user's permission.)
Note: requiring a local file may delay script execution a little bit. If this causes problems, you can require a file from a local webserver. For development you can set "Externals" -> "Update Interval" to "Always". (Please use this in development only!) In order to allow fast script injection, externals are updated every time after the script has executed.
Thanks. I completely missed the "update interval" setting. That solves the problem for me, I can work with this.
Local files still don't load in Firefox, even though I use the settings you describe (Firefox Quantum 60.0, "Allow scripts to access local files: Externals (@require and @resource)").
The same settings and @require path work in Chrome. I assume Firefox denies access, though the way the message is phrased would also fit a typo'd or otherwise invalid file path.
Indeed, it's not working on Windows 10 anymore, but still working on Ubuntu. 🙄
FYI: I've created another possible solution for script editing, based on Tampermonkey's recently added WebDAV support: Tampermonkey/tamperdav
@derjanb is there any update on this issue? I use FF 62.0.3 and local file access is not possible...
There is no news on this. FF 62.0.3 still allows local file access on Ubuntu. Other OSes need to wait for Firefox bug 1266960.
Thanks for the update. I'll watch this issue for progress.
@derjanb
For development you can set "Externals" -> "Update Interval" to "Always". (Please use this in development only!) In order to allow fast script injection externals are updated everytime after the script was executed.
It's not convenient for some use cases. May I ask about alternative solutions?
Or add some kind of remote control to TM, to allow an external (node.js?) script to send a command when it should update.
@johnd0e Please take a look at TamperDAV. It allows a script to be synced from disk every time it is modified.
Well, my userscripts are built from some kind of source templates. My current workflow is like this:
Is TamperDAV able to help me? (I've tried it, and it does its job, but not for my workflow.)
July 2020, still not implemented? Not working in latest Elementary OS
Okay...
I managed to configure my workflow on Firefox/Tampermonkey with Keyboard Maestro, a macOS automation app: https://www.keyboardmaestro.com/main/
First, prepare a bookmarklet in Firefox that pops up an editor window with a dedicated title.
(function(){
    const SUFFIX = ' [Local Script Updater]';
    const WIDTH = 320, HEIGHT = 640;
    // Find the visible CodeMirror textarea and the "Cmd-S" (save) menu shortcut
    const getTextarea = (d) => d.querySelector('[vis="true"] .CodeMirror-editor textarea');
    const getSaveButton = (d) => [...d.querySelectorAll('[vis="true"] ul.editormenu td.shortcut')].find(td => td.textContent === 'Cmd-S');
    // Open the current editor page in a small popup window
    const w = window.open(location.href, '_blank', `width=${WIDTH},height=${HEIGHT},resizable`);
    // Give the popup time to load, retitle it, and save on every change
    setTimeout(() => {
        const d = w.document, t = getTextarea(d), b = getSaveButton(d);
        d.title += SUFFIX;
        if (t && b) {
            t.focus();
            t.addEventListener('change', e => b.click());
        }
    }, 5000);
})();
Second, prepare a macro in Keyboard Maestro that copies the code and pastes it into the popped-up editor window. I bound the [F5] key to the macro for Sublime Text.
How to use:
My solution has been using a local nginx server. All my Tampermonkey scripts contain only the header metadata, like:
// ==UserScript==
// @name myscriptname
// @namespace http://tampermonkey.net/
// @version 1.007
// @description try to take over the world!
// @author You
// @match https://www.example.com/*
// @require http://localhost/userscripts/myscriptname/main.js
// @grant none
// ==/UserScript==
All the meat and potatoes are in the main.js script that gets @require'd in. Then I have to manually bump the version every time I update the script to break the cache, and I think sometimes reload the "example.com" page in question twice. It's not ideal. I made a scaffolding tool that helps set it up and a deploy script to copy it from my dev folder into nginx, but the deploy script can't update the version above, so I have to bump it manually. Maybe there's a way to host the above metadata from my nginx instance as well. Then my deploy script could auto-increment the version and I could update everything in one go, assuming the metadata caching wasn't a dealbreaker.
Anyway, there's probably a better way.
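If the metadata were hosted from nginx as well, the auto-increment could be a few lines of Node (a hypothetical helper; the regex only handles a plain numeric // @version line):
// bump-version.js - increment the last component of the // @version line
const fs = require('fs');

const file = process.argv[2];
const src = fs.readFileSync(file, 'utf8');

const bumped = src.replace(/^(\/\/ @version\s+)(\d+(?:\.\d+)*)$/m, (m, prefix, version) => {
    const parts = version.split('.');
    parts[parts.length - 1] = String(Number(parts[parts.length - 1]) + 1);
    return prefix + parts.join('.');
});

fs.writeFileSync(file, bumped);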
@xdhmoore Nice approach! You should be able to disable caching in nginx. Alternatively, you could have the userscript generate a script tag with a random query parameter.
I don't think I've tried fiddling with nginx cache headers. I'll have to try that. I don't think the random query parameter would help, because it would need to go in the // @require statement of the metadata, which is stored in Tampermonkey and, AFAIK, not modifiable by script. Otherwise I would just bump the // @version there.
Is it possible to @resource a non-JavaScript (non-.js) file from a file:// URL?
Here's an idea, in case someone needs it. This way you don't need to change any Tampermonkey settings.
You can create a new Tampermonkey script like this:
// ==UserScript==
// @name New Userscript
// @namespace http://tampermonkey.net/
// @version 0.1
// @description try to take over the world!
// @author You
// @match http://www.website.com/*
// @grant none
// ==/UserScript==
function addScript(url) {
    var script = document.createElement('script');
    script.setAttribute('type', 'text/javascript');
    script.setAttribute('src', url);
    document.getElementsByTagName('head')[0].appendChild(script);
}

// Append a timestamp so the browser never serves a cached copy
addScript('http://localhost:6868/index.js?t=' + (new Date()).getTime())
On every refresh, the page will load the script from http://localhost:6868/index.js.
Create a simple local HTTP server with Node.js or nginx etc., catch the URL, and output your code.
Here is JS code to create a simple HTTP server with Node.js:
// http.js
const http = require('http')
const { readFileSync } = require('fs')

http.createServer((req, res) => {
    if (req.url.startsWith('/index.js')) {
        const content = readFileSync('./index.js', 'utf-8')
        res.writeHead(200, {
            'Content-Type': 'text/javascript; charset=utf-8'
        })
        res.end(content)
    } else {
        res.writeHead(404, {
            'Content-Type': 'text/plain; charset=utf-8'
        })
        res.end('error access')
    }
}).listen(6868)
Run the command node path/to/http.js and create index.js in the same directory as http.js.
Now you can edit index.js with any editor. Enjoy~
@sausager I was excited when I modified your example for my own use. However, I hit a problem today with a site that has a Content Security Policy defined. While I think Tampermonkey has some plumbing to help with that for scripts it injects directly, when I load my own script like this it gets blocked by the cross-origin CSP.
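One workaround, which a later comment in this thread fleshes out, is to fetch the code with GM_xmlhttpRequest (which is not subject to the page's CSP) and evaluate the text directly (a sketch; requires // @grant GM_xmlhttpRequest, and the localhost URL is illustrative):
GM_xmlhttpRequest({
    method: 'GET',
    url: 'http://localhost:6868/index.js?t=' + Date.now(),
    // The request is made from the extension context, so the page's CSP doesn't block it
    onload: res => eval(res.responseText)
});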
After 6 years in 2022... and still nothing! 👴
Unfortunately this is not implemented by Firefox. Please vote for this issue or leave a comment there: https://bugzilla.mozilla.org/show_bug.cgi?id=1266960
Thanks.
@NabiKAZ I assume you mean that it hasn't been fixed in Firefox yet. 😔
I wrote this code for my work:
// ==UserScript==
// @resource accounts_chrome file:///C:/Nabi/accounts.js
// @resource accounts_firefox http://127.0.0.1:12345/accounts.js?111
// @grant GM_getResourceText
// ==/UserScript==
//NOTE:
// Chrome > Extensions > TamperMonkey > Allow access to file URLs
// Firefox > php -S 0.0.0.0:12345
(function() {
    'use strict';
    var accounts = [];
    var accounts_chrome = GM_getResourceText("accounts_chrome");
    var accounts_firefox = GM_getResourceText("accounts_firefox");
    if (accounts_chrome) {
        accounts = JSON.parse(accounts_chrome);
    } else if (accounts_firefox) {
        accounts = JSON.parse(accounts_firefox);
    }
    alert(accounts);
})();
The content of accounts.js is JSON. The 111 at the end of the accounts_firefox path should be changed to prevent caching.
For Chrome, enable this: Extensions > TamperMonkey > Allow access to file URLs
For Firefox, in the C:\Nabi\ path, run this: php -S 0.0.0.0:12345
@NabiKAZ there's another ticket for the caching issue: https://github.com/Tampermonkey/tampermonkey/issues/723
https://github.com/Tampermonkey/tampermonkey/issues/347#issuecomment-706614606
This method worked for my case.
#347 works very well, but on pages like Instagram it gives "Content Security Policy" errors:
Loading failed for the <script> with source “http://localhost/userscripts/config.user.js?t=1660712589420”.
Content Security Policy: The page’s settings blocked the loading of a resource at http://localhost/userscripts/config.user.js?t=1660712589420 (“script-src”). line 14 > eval:19:45
Content Security Policy: The page’s settings observed the loading of a resource at http://localhost/userscripts/config.user.js?t=1660712589420 (“script-src”). A CSP report is being sent.
I'm still experimenting with this new technique, and it seems to be working (at least in my case). I make changes, refresh, and Firefox picks up the changes on the first reload.
I created a Node.js server that runs with nodemon like this:
cd "GRobe Scripts/GRobe/GRobeNode"
fnm use v19.9.0
nodemon app.js
In this app.js I have contents like this:
const express = require('express')
const cors = require('cors');
const fs = require('fs');
const path = require('path');
var process = require('process');
process.chdir('/home/USERNAME/GRobe Scripts/');
const util = require('util');
const readFile = util.promisify(fs.readFile);

const app = express()

// CORS without limitations
app.use(cors());

// Make an endpoint for a script bundle that sends the combined script text
app.get('/GRobeGeneralBundle', async (req, res) => {
    const fileUrls = [
        'GRobe/Shared/smallFunctions.js',
        'GRobe/Shared/mousetrap.js',
        'GRobe/GRobeGeneral/GRobeGeneral.js',
    ];
    combineStringSend(fileUrls, res)
});

// Read each file in order and concatenate the contents
async function combineFilesSequentially(fileUrls) {
    let combinedScript = '';
    for (const fileUrl of fileUrls) {
        const fileData = await readFile(fileUrl, 'utf8');
        combinedScript += fileData;
    }
    return combinedScript;
}

async function combineStringSend(fileUrls, res) {
    try {
        const combinedScript = await combineFilesSequentially(fileUrls);
        res.send(combinedScript);
    } catch (error) {
        console.error('Error combining script files:', error);
        res.status(500).send('Internal Server Error');
    }
}

app.listen(3000, () => {
    console.log('Server is running on localhost:3000')
})
And in the Tampermonkey script I have it like this:
// ==UserScript==
// @name GRobe™ General
// @namespace GRobe
// @version 0.3
// @description GRobe™
// @author You
// @match *://*/*
// @run-at document-start
// @grant GM_xmlhttpRequest
// ==/UserScript==
(()=>{const e="http://127.0.0.1:3000/GRobeGeneralBundle";const t=e=>{if(200===e.status&&e.responseText){eval(e.responseText)}else{console.error("Error loading Node.JS Script:",e.statusText)}};const n=()=>{GM_xmlhttpRequest({method:"GET",url:e,onload:t,onerror:t})};document.addEventListener("DOMContentLoaded",n)})();
The minified code basically makes a request to the endpoint, gets the combined script text, and evaluates it after the page has loaded. Unminified:
(function () {
    var nodeJSEndPoint = 'http://127.0.0.1:3000/GRobeGeneralBundle';

    // Function to evaluate the fetched script
    function evaluateScript(scriptContent) {
        eval(scriptContent);
    }

    // Function to handle the GM_xmlhttpRequest response
    function handleResponse(response) {
        if (response.status === 200 && response.responseText) {
            evaluateScript(response.responseText);
        } else {
            console.error('Error loading Node.JS Script:', response.statusText);
        }
    }

    // Function to make the GM_xmlhttpRequest
    function makeRequest() {
        GM_xmlhttpRequest({
            method: 'GET',
            url: nodeJSEndPoint,
            onload: handleResponse,
            onerror: handleResponse
        });
    }

    // Wait for the DOMContentLoaded event before making the request
    document.addEventListener('DOMContentLoaded', makeRequest);
})();
You can now vote for https://bugzilla.mozilla.org/show_bug.cgi?id=1807608 as well. :-)
I can't manage to @require a local file in Firefox. For example, if I use this script, with test.js being only alert('test');, nothing happens. However, the exact same script works in Chrome.
Is there a setting I need to change somewhere to allow local files, like I did in Chrome?