archiewood closed this issue 2 months ago
Is there a confirmation on this and/or some potential workaround (like: knowing what version to downgrade to?)
@csjh can you take a look at this one this week?
It sounds like the underlying issue is not the all-queries.json specifically, but rather the "too small to be Parquet" error
@archiewood I doubt it; I have this problem on pages that do not have the "too small to be Parquet" error as well, though the all-queries error is there. The root cause seems to be some misalignment of the hashes (the data is there, just in other folders with other hash values).
@andrejohansson do you have, or could you make, a repro project that reproduces this error that you can share?
We are struggling to reproduce this consistently internally!
It seems that the 404 all-queries errors come whenever there is a markdown page which has no queries (I guess it makes sense). So this yields a missing all-queries file and the 404 error. But this does not seem to relate to the dropdowns not working; I will continue my efforts in locating the issue.
Here is a sample that reproduces the behaviour (no update on filter change):

```shell
npm install
npm run sources
```

Start the dev server:

```shell
npm run dev
```

Build the static page:

```shell
npm run build
```

Start a web server hosting the static pages; I am using https://github.com/cortesi/devd:

```shell
cd build
devd .
```
```
Uncaught (in promise) Error: Invalid Input Error: File 'csv_data_ingestion_system_type_statistics.parquet' too small to be a Parquet file
```
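For context on this error (a sketch, not Evidence's or DuckDB's actual code): a valid Parquet file both starts and ends with the 4-byte magic `PAR1`, plus a 4-byte footer length, so any file under 12 bytes cannot be valid. A check along these lines can spot truncated or empty `.parquet` files in the build output:

```python
import os

def looks_like_parquet(path: str) -> bool:
    """Cheap sanity check in the spirit of DuckDB's 'too small to be a
    Parquet file' error: the file must be at least 12 bytes
    (magic + footer length + magic) and bracketed by the PAR1 magic."""
    if os.path.getsize(path) < 12:
        return False
    with open(path, "rb") as f:
        head = f.read(4)
        f.seek(-4, os.SEEK_END)
        tail = f.read(4)
    return head == b"PAR1" and tail == b"PAR1"
```

Running this over the files in `build/` should flag anything DuckDB would refuse to open.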
My folder name is called `csv` and the source name in connection.yaml is called `csv_data`; renaming the folder to `csv_data` did not help. (I also made sure the dates are in YYYY-MM-DD format.) Trying to convert the csv data to a sqlite file also gave me some trouble.
The SQLite documentation is unclear/wrong:

> SQLite is a local file-based database. It should be stored in the root of your Evidence project.

This does not seem to be true. I must place the sqlite file under `sources\<mydatasourcename>` and also add a `connection.yaml` file similar to:

```yaml
name: agentdata
type: sqlite
options:
  filename: agentdata.sqlite
```

This works to get the data, but generates a warning when I run `npm run sources`, as it tries to process all files in the source folder (including the sqlite file itself) and not only the `.sql` files.
In order to exclude any date issues, I created a duckdb data source of the same data instead of csv.
Here is a variant using duckdb:
evidence-issue-1566-using-duckdb.zip
But the problem persists: there is no interactivity when filtering with the dropdown after building with `npm run build`. Filtering only works using the dev server.
Side note: I am using Windows 11.
Thank you for the detailed repro
My folder name is called csv and the source name in connection.yaml is called csv_data. Do these need to be in sync?
No - your setup is fine
Thank you, I have opened a PR to fix this
Confirm bar chart does not update anymore
With your csv example, I am unable to replicate this issue on my local machine.
I have also deployed this repository on netlify, and the filter appears to work. Does this also work for you on windows?
Here is a screencast of my behaviour; interestingly enough, I can still return to the "all" filter.
https://github.com/evidence-dev/evidence/assets/2828428/64674699-eef7-40b0-b037-a60e21ac6485
On local I tried both the built-in vite preview (`npm run preview`) and http-server.
My best current guess on this is a platform specific issue to windows.
What is your current hosting setup where this is failing?
Like I wrote above, I just `npm run build` to the build folder and then start devd in it (something similar to http-server).
I noticed that when I use the filters, only very small datasets are returned (7 rows each). But when I use the % operator to show all, then I get 1k+ rows and then the charts and everything work.
Could there be some limit that is wrong and prevents really small datasets?
```
Uncaught (in promise) Error: Invalid Input Error: File 'duckdata_measurements.parquet' too small to be a Parquet file
    VennDiagram.svelte_svelte_type_style_lang.96597554.js:4:56898
```
Okay, I have now downloaded devd and replicated the issue.
While I don't want to exclude the possibility of errors on our side (which we'll investigate), perhaps as a workaround using a post-1.0 webserver will resolve your issue?
`npm run preview` (the vite preview) works well for me too.
I also confirmed that caddy works with the following command, so indeed it is some server configuration issue:

```shell
caddy file-server --listen :8000
```
In our production docker image we use NGINX Unit, which shows the same behaviour, and that is a brand new webserver.

Do you have any idea what settings may be required or could affect this? I'm thinking something like MIME types, or allowing/disallowing something:

- The devd webserver responds with `application/octet-stream` (which seems right) but is not working.
- The caddy server responds with XML (which seems wrong) but is working.
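One possible reason the servers disagree: `.parquet` is not a registered extension in common MIME tables, so each server picks its own fallback. This can be illustrated with Python's stdlib `mimetypes` (using a fresh table so system-wide `mime.types` overrides are ignored):

```python
import mimetypes

# A fresh table contains only the built-in defaults, which is roughly
# what a typical web server knows out of the box.
table = mimetypes.MimeTypes(filenames=())

# .parquet has no registered MIME type, so each server invents a
# fallback (application/octet-stream, text/xml, ...).
print(table.guess_type("measurements.parquet"))  # (None, None)

# Well-known extensions resolve consistently.
print(table.guess_type("all-queries.json"))      # ('application/json', None)
```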
@ItsMeBrianD might have some better ideas here, as he was recently deploying evidence using nginx.
I believe there are certain file types that are not served correctly by nginx by default
> Do you have any idea what settings may be required or could affect this?
I've not heard of NGINX Unit before, but it looks very interesting. The setup I was running was an out-of-the-box nginx docker container. I did need to tweak the file permissions of the .parquet files (they were getting spit out as 0700, owned by root).
This is the full configuration that I used:

```nginx
server {
    listen 80;
    listen [::]:80;
    server_name localhost;

    gzip on;
    gzip_types text/plain application/xml application/octet-stream application/json application/javascript;

    #access_log /var/log/nginx/host.access.log main;

    location / {
        root /usr/share/nginx/html;
        index index.html index.htm;
    }

    location ~* \.(parquet|arrow|js|json)$ {
        root /usr/share/nginx/html;
        add_header Cache-Control "private, max-age=3600";
    }

    #error_page 404 /404.html;

    # redirect server error pages to the static page /50x.html
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }

    # proxy the PHP scripts to Apache listening on 127.0.0.1:80
    #location ~ \.php$ {
    #    proxy_pass http://127.0.0.1;
    #}

    # pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
    #location ~ \.php$ {
    #    root html;
    #    fastcgi_pass 127.0.0.1:9000;
    #    fastcgi_index index.php;
    #    fastcgi_param SCRIPT_FILENAME /scripts$fastcgi_script_name;
    #    include fastcgi_params;
    #}

    # deny access to .htaccess files, if Apache's document root
    # concurs with nginx's one
    #location ~ /\.ht {
    #    deny all;
    #}
}
```
Something that we encountered in another spot was this error appearing when duckdb-wasm runs out of memory. Is this happening on pages that have a lot of data (or after navigating around the site for a bit)?
Thank you both for your efforts.
No this sample is very small, 7 rows of data when filtered. 1k rows when unfiltered.
Switching webserver to caddy helped locally, I'll see if I can get either caddy into docker or use your sample nginx config @ItsMeBrianD
There is an official caddy docker image if that's helpful
I can confirm to you that the issue was resolved for us when switching to caddy as webserver instead of using nginx and nginx unit.
Here is our simple Dockerfile:

```dockerfile
FROM caddy:2.7.6

# Copy web server config
# NOTE: Use a compatible web server configuration, see https://github.com/evidence-dev/evidence/issues/1566
COPY ./Caddyfile /etc/caddy/Caddyfile

# Copy app from build stage into default www folder
COPY --from=builder /app/build /var/www/html

EXPOSE 80
```
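For completeness, the `Caddyfile` referenced in the Dockerfile is not shown in the thread; a minimal static file-server config for this layout could look like the sketch below (the `/var/www/html` root is taken from the Dockerfile above, everything else is an assumption):

```
:80 {
	root * /var/www/html
	file_server
}
```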
Excellent, closing for now
Appears to have been a duckdb-based error, as upgrading to duckdb-wasm 1.28.0 fixes it, so reopening pending that version bump.
See Slack thread.
Steps To Reproduce

```shell
npm run sources
npm run build
```
Environment
Expected Behavior
Actual Behaviour
Most things look and act normal, but on one page we use dropdowns to filter data. That page stops working (either no change when changing the filter, or no data at all).
When running `npm run dev` directly in the same folder, this works: I can use the dropdowns and the components referring to the data are updated.