We found a helpful past issue that points us in the right direction. Snippet from that issue:

> Currently, the contentType I am passing to AWS is octet-stream. Octet-stream seems to be the content type provided if the type of the file cannot be discerned by mime. This is as expected because I am defining the content type with the obscured file name that has no extension. If I were to pass in the file's original name, such as myimage.png, mime would return the content type image/png and the link I get back would open instead of download.

Link to issue: https://github.com/ga-wdi-boston/team-project/issues/359
We grabbed the mimetype from the request and passed it into AWSUpload, where we used it to set the ContentType on the S3 upload params. This returned a URL that serves the file.
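For context, here is a minimal sketch of why an extensionless name falls back to octet-stream. It assumes the `mime` package's 1.x API (`mime.lookup`; newer releases renamed this to `mime.getType`):

```js
const mime = require('mime')

// With a real extension, mime can infer the content type.
mime.lookup('myimage.png')       // 'image/png'

// With an obscured, extensionless name there is nothing to infer,
// so mime falls back to its default type.
mime.lookup('a1b2c3d4e5f67890')  // 'application/octet-stream'
```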
**Uploads Controller**
```js
const create = (req, res, next) => {
  const upload = {
    file: req.file.path,
    name: req.body.file.name,
    ext: req.file.originalname,
    mimetype: req.file.mimetype
  }
  AWSUpload(upload, req)
    .then(upload =>
      res.status(201)
        .json({
          upload: upload.toJSON({ virtuals: true, user: req.user })
        }))
    .catch(next)
}
```
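Note that this controller assumes a multipart parser such as multer has already run and populated `req.file` with `path`, `originalname`, and `mimetype`. A minimal sketch of that wiring, with hypothetical route and field names:

```js
const express = require('express')
const multer = require('multer')

const router = express.Router()
// Store incoming files on disk so req.file.path, req.file.originalname,
// and req.file.mimetype are available to the controller.
const upload = multer({ dest: 'uploads/' })

// 'file[file]' is a hypothetical field name; use whatever your form sends.
// `create` is the controller defined above.
router.post('/uploads', upload.single('file[file]'), create)
```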
**S3-Uploads Function**
```js
'use strict'

// dotenv loads configuration from the .env file
require('dotenv').config()
// wires up the mongoose connection
require('app/middleware/mongoose')
// fs is Node's file system module, used here to read the uploaded file
const fs = require('fs')
// crypto generates the random string used in the AWS object name
const crypto = require('crypto')
// path helps extract the file's extension
const path = require('path')
// the AWS SDK, used to interact with S3
const AWS = require('aws-sdk')
const s3 = new AWS.S3()
// the schema for storing an upload record
const models = require('app/models')
const Upload = models.upload

// generates a random 16-byte (32-character hex) string for the AWS object name
const promisifyRandomBytes = () => {
  return new Promise((resolve, reject) => {
    crypto.randomBytes(16, (error, buf) => {
      if (error) {
        return reject(error)
      }
      resolve(buf.toString('hex'))
    })
  })
}

// uploads to AWS S3
const s3Upload = (params) => {
  return new Promise((resolve, reject) => {
    s3.upload(params, (error, data) => {
      if (error) {
        return reject(error)
      }
      resolve(data)
    })
  })
}

// AWSUpload drives the whole process: upload the file to S3, then store a record in MongoDB
const AWSUpload = (options, req) => {
  // the extension of the original file name
  const ext = path.extname(options.ext)
  // the mimetype grabbed from the request (the key change)
  const mimeType = options.mimetype
  // today's date, used as the S3 folder name
  const today = new Date().toISOString().split('T')[0]
  // options.file is the path to the uploaded file on disk
  const stream = fs.createReadStream(options.file)
  console.log('this is ext', ext)
  console.log('this is mimetype', options.mimetype)
  // params for the S3 upload
  const params = {
    ACL: 'public-read',
    Body: stream,
    Bucket: process.env['AWS_S3_BUCKET_NAME'],
    // setting ContentType from the request's mimetype is what makes the
    // returned URL open in the browser instead of downloading
    ContentType: mimeType
  }
  return promisifyRandomBytes()
    .then((randomString) => {
      // sets the name of the file in AWS: <date>/<random><ext>
      params.Key = `${today}/${randomString}${ext}`
      return params
    })
    // uploads to AWS
    .then(s3Upload)
    // stores the result in Mongo
    .then((s3Response) => {
      const uploadParams = {
        name: options.name,
        url: s3Response.Location,
        // _owner is set to the requesting user's id
        _owner: req.user._id,
        tag: req.body.file.tag,
        description: req.body.file.description
      }
      return Upload.create(uploadParams)
    })
}

module.exports = AWSUpload
```
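As an aside, the two hand-rolled promise wrappers can be shortened with Node's `util.promisify`, and aws-sdk v2 requests expose a `.promise()` method. A minimal, behavior-equivalent sketch:

```js
const util = require('util')
const crypto = require('crypto')

// equivalent to promisifyRandomBytes above
const randomBytes = util.promisify(crypto.randomBytes)
const randomString = () => randomBytes(16).then(buf => buf.toString('hex'))

// equivalent to s3Upload above (aws-sdk v2); assumes the same `s3` client
const s3Upload = (params) => s3.upload(params).promise()
```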
Our team is running into an issue with the URLs of files hosted on AWS: the URL downloads the file, but we want the URL to open the file in the browser instead.
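For anyone hitting the same issue: whether a browser displays or downloads an S3 object is controlled by the Content-Type (and optionally Content-Disposition) headers stored with the object at upload time. A minimal sketch of the relevant params, assuming the same upload code as above:

```js
const params = {
  ACL: 'public-read',
  Body: stream,
  Bucket: process.env['AWS_S3_BUCKET_NAME'],
  // a browser-renderable type (e.g. image/png) lets the URL display inline;
  // application/octet-stream forces a download prompt
  ContentType: mimeType,
  // optional: explicitly ask the browser to render the file in the page
  ContentDisposition: 'inline'
}
```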