Open faassen opened 2 years ago
Some of the additional conversation we had on this:
I had implemented file uploading in the past, in JS / Express / React apps, and you don't really want it to go through your server: that is a lot of traffic you'd rather avoid, since it saturates your server and reduces its capacity to do other, more important work, like answering user HTTP requests. Instead, what you ideally want is to upload the file directly from the client! So instead of sending it from client to server and then from server to file storage, you send it directly to file storage and skip the server hop. The server still plays a role, but that role becomes authentication / granting the client permission to upload to file storage, instead of doing the upload itself.
I guess the first question is: where do you want to store your files? Ideally you would use some file storage, like S3, or a similar offering from another hosting provider. Then you ideally want to find a nice React library that does most of this work for you and also explains all the details around it. Such a library will probably also advise you on how to set up your S3 bucket, and what you need to do on the server. On the server, you will most likely need to implement a route or two that authenticate with S3 and then provide those tokens to the client (React).

Related: Allow defining custom API routes (http) · Issue #268 · wasp-lang/wa...
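To make the server's role in that flow concrete, here is a minimal sketch of what such a route might compute before asking S3 to sign an upload URL: a collision-free object key and a short expiry. The function name and shape are illustrative, not from any library:

```typescript
// Illustrative helper (not from any SDK): prepare the parameters the server
// would pass to S3's URL-signing call. The actual signing is done by the AWS
// SDK and is not shown here.
import { randomUUID } from "crypto";

function makeUploadParams(filename: string, userId: string) {
  // Derive the file extension; fall back to "bin" when there is none.
  const ext = filename.includes(".") ? filename.split(".").pop()! : "bin";
  // Namespace keys per user and randomize them to avoid collisions/overwrites.
  const key = `${userId}/${randomUUID()}.${ext}`;
  // Keep signed URLs short-lived; the client uploads immediately after.
  return { key, expiresInSeconds: 300 };
}
```

The point of the random key is that clients never get to choose where in the bucket their bytes land, which is part of what makes the "server only grants permission" model safe.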
As for file uploading libraries:
Going directly from the client to the storage provider works for most cases, but not all: what if you are generating files on the server (reports, images, etc.) and need to store them?
What I did to get it working for my use case, using the `@aws-sdk/client-s3` library (the `R2_*` env vars suggest this targets Cloudflare R2, which is S3-compatible):

```ts
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const S3 = new S3Client({
  region: "auto",
  endpoint: process.env.R2_BUCKET_URL,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_TOKEN!,
    secretAccessKey: process.env.R2_SECRET_TOKEN!,
  },
});

const upload = await S3.send(
  new PutObjectCommand({
    // ...the original message was cut off here:
    Bucket: "
```
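On that note (server-generated files, mentioned above): whatever the server generates has to be serialized into bytes before it can be passed as the `Body` of a `PutObjectCommand`. A minimal sketch for a JSON report, with an illustrative helper name:

```typescript
// Illustrative helper: serialize a server-generated JSON report into bytes
// suitable for the Body of a PutObjectCommand, plus the matching ContentType.
function reportToBody(report: object): { body: Buffer; contentType: string } {
  const body = Buffer.from(JSON.stringify(report, null, 2), "utf8");
  return { body, contentType: "application/json" };
}
```

The same pattern applies to generated images or PDFs; only the serialization step and the `ContentType` change.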
entjustdoit shared some great info on how he did it on our Discord: https://discord.com/channels/686873244791210014/1062935979951923280/1088518391494619226.
Here it goes:
Sure! First, go to S3 and set up a bucket and an IAM user, then add these entries to your .env.server file:
```env
AWS_S3_IAM_SECRET_KEY=
AWS_S3_FILES_BUCKET=
AWS_S3_REGION=
```
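One thing that's easy to get wrong here: if any of these entries is left empty, the SDK only fails much later with a confusing error. A small guard (illustrative, not something Wasp provides) that fails fast at server startup could look like:

```typescript
// Illustrative startup guard: read a set of required env vars and throw a
// clear error immediately if any of them is missing or empty.
function requireEnv(
  env: Record<string, string | undefined>,
  keys: string[]
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const k of keys) {
    const v = env[k];
    if (!v) throw new Error(`Missing required env var: ${k}`);
    out[k] = v;
  }
  return out;
}
```

You would call it once at boot, e.g. `requireEnv(process.env, ["AWS_S3_IAM_SECRET_KEY", "AWS_S3_FILES_BUCKET", "AWS_S3_REGION"])`.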
Then add the aws sdk to your list of dependencies:

```wasp
dependencies: [
  ...
  ("aws-sdk", "^2.1294.0"),
  ...
]
```
In your front end, set up the functions for downloading and uploading files:

```js
const handleUploadFile = async () => {
  // ...
  let data = await getUploadFileSignedURL(...);
  // key is the identifier of the file in S3
  const { uploadUrl, key } = data;
  // upload the actual file using the signed URL; newFile here is the file selected in the form
  await axios.put(uploadUrl, newFile);
  // store the key as a field of a file entity for later retrieval
  // ...
};
```
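Something the upload handler above elides: it's worth validating the selected file on the client before even requesting a signed URL (and again on the server — client-side checks are only for UX, not security). The size limit and type list below are arbitrary examples, and `validateFile` is a made-up name:

```typescript
// Illustrative pre-upload check: reject files that are too large or of a
// disallowed type before requesting a signed URL. Limits are example values.
const MAX_BYTES = 5 * 1024 * 1024; // 5 MB
const ALLOWED_TYPES = ["image/png", "image/jpeg", "application/pdf"];

function validateFile(file: { size: number; type: string }): string | null {
  if (file.size > MAX_BYTES) return "File is too large";
  if (!ALLOWED_TYPES.includes(file.type)) return "File type not allowed";
  return null; // null means the file is OK to upload
}
```

A browser `File` object already has `size` and `type` fields, so this can be called directly on the file selected in the form.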
```js
const handleDownloadFile = async (file) => {
  // ...
  let downloadUrl = await getDownloadFileSignedURL(...);
  // ignore my ugly code below, it's a workaround I had to do due to how I had set up my UI
  const link = document.createElement("a");
  link.download = file.filename;
  link.href = downloadUrl;
  document.body.appendChild(link);
  link.click();
  document.body.removeChild(link);
  // ...
};
```
I think this is a great candidate for potential Full Stack Modules.
User asking for this: https://discord.com/channels/686873244791210014/1276501590936911933/1276501590936911933
Wasp currently has no way to let users download dynamic binary assets, nor upload them. This is a fairly common use case, so it should be supported. Where these assets are stored is a related topic (database, S3, etc.).