Open · TheBeachMaster opened 5 years ago
I have even switched to express and still no change:
```js
import compression from 'compression';
import * as sapper from '@sapper/server';
import express from 'express'; // was: const express = require('express'); mixing require with ESM imports breaks the build

const app = express();

app.use(
  express.static(__dirname + '/static'),
  compression({ threshold: 0 }),
  sapper.middleware()
);

const appPort = 3056;
app.listen(process.env.PORT || appPort, () => {
  console.info('We are up!');
});
```
I've had this happen to me occasionally as well, generally if the app takes a long time to boot up. There's some timeout that stops waiting for the server to be ready, and displays this message. Increasing the timeout doesn't really seem like the proper solution.
I'm not sure how long we should wait for the server to be ready, but it does seem to make sense to start up the live reload server regardless. Or maybe we could just keep polling to see whether the server has started. There's definitely room for some improvement here.
Experiencing the same when using .env (dotenv) to set the PORT env var to a different port number.
If anyone has a workaround, it would be much appreciated. Hitting the same issue.
I ended up starting the server while setting the PORT env var explicitly, as it seems server.js doesn't load .env:

```sh
$ PORT=3003 yarn run dev
```

Hope it helps!
@benoror, Thanks for the suggestion. Unfortunately, this did not work for me. :(
Question: why doesn't sapper just use something like get-port to avoid collisions and ensure that it starts on an available port?
Edit: I've opened an issue for it.
I have a colleague with this issue: the app is available but livereload does not work. The build also takes longer than on my computer. It might have to do with storing the project on a network drive.
Still investigating.
EDIT: the network drive was the issue, it seems that 5 seconds is not enough to write the app to a network drive.
I've just run into this too, not sure why; my internet isn't too bad (60 Mbps). It seems to happen on all Sapper projects too.
I run into this issue too. Any fix?
There seems to be an open PR: #985.
@Conduitry any chance this will be looked at? It seems like tons of people are having this issue 🤔
I run into this bug 100% of the time I try to use vscode's devcontainer to run a debug session.
I am getting this very regularly. Has anyone got a workaround?
Same for me, especially when Sapper keeps retrying to start the server after an error. The only workaround that avoids restarting my PC is to toggle flight mode on and off.
I just got this when I imported rollup-plugin-node-globals as a rollup plugin.

Edit: Don't use rollup-plugin-node-globals. Use https://www.npmjs.com/package/rollup-plugin-node-polyfills instead.
I have a use case for switching ports: my work app is on 3000, but I'd like to sometimes work on my blog during breaks. It's a PITA because even though I've set up my Svelte/Sapper app on 4000, something still opens it on 3000 too. I can literally kill the PID for 3000, fire up my Sapper app configured for 4000, run `lsof -i :3000`, and see it again.

The result is that it's tedious to switch back to the work project, because I pretty much have to nuke the service worker even though I've stopped the Sapper server.
I was having this issue all of a sudden, and it appears it was because I accidentally enabled two network interfaces simultaneously (wifi on and ethernet plugged in at same time). Disabling one or the other solved my problem. Not sure if that's helpful or not, but thought I'd share.
I got this issue when I bumped rollup-plugin-svelte to 6.1.1 to solve a separate issue described here: https://github.com/Samuel-Martineau/generator-svelte/issues/7. I'm also unsure if this is helpful or not.
Having this issue trying to run Sapper on my Raspberry Pi Zero W. It's a bit weird, as I'm able to see the default output with `curl localhost:3000`, but my ngrok tunnel is giving an NSURLErrorDomain -1017, i.e. not receiving valid data.
Got this issue with express@4.16.1. I figured out it was caused by using app.listen() in app.js and "node /bin/www" in package.json at the same time, since the default /bin/www creates an HTTP server on 3000 anyway, even if app.listen() has already created one. Removing app.listen() from app.js and changing "node /bin/www" to "PORT=${port} node /bin/www" fixed the issue. Hope it helps :p
> I run into this bug 100% of the time I try to use vscode's devcontainer to run a debug session.

I also started to suffer from this problem after I configured the devcontainer in vscode.
I was experiencing the same problem here. In my case, I was initially running the server with WSL on Windows 11. I also tried changing the port several times and nothing worked. After that, I tried to run it without WSL and finally got this alert: it clearly states that "Windows Defender Firewall has blocked some features of this app". Here, the "app" is "Node.js: Server-side JavaScript" for the currently used Node version (in my case, v14.16.1). So, I just clicked the "Allow access" button, and then it worked!

So, if you are using Windows, maybe you can try manually allowing the connections in Windows Firewall. Hope it helps!
This is sort of similar to this issue, this issue, and this issue.

This is on a fresh setup, as detailed here.
Environment

Node version: v8.16.0
OS: Arch Linux
npm version: 6.9.0

Setting up (server.js)

The default template comes with the following:
Then running:

```sh
npm install
```

Running:

```sh
export HOST=127.0.0.1 PORT=3000
npm run build
npm run dev
```

you get the following error. I went and checked if there was any process listening on port 3000:

```sh
lsof -i :3000 | grep LISTEN
```

only to find that a node process was ... as seen here when I opened the browser ...
The problem
This results in live-reloading failing.
Note that in my case, as opposed to this comment, I did not have any process on port 3000 other than the app.
Attempts at solving
I went ahead and set PORT=3080 and rebuilt the app, only to get the same error message, but now on port 3080 ... but the app is running on port 3080 on localhost, with no process listening on port 3000 (as expected).