pionl / laravel-chunk-upload

A basic implementation of chunked uploads with support for multiple providers such as jQuery-file-upload, Plupload, DropZone and resumable.js

Error with images over 2M #67

Closed · kylemilloy closed this issue 5 years ago

kylemilloy commented 5 years ago

I'm getting a really weird bug here using Dropzone and Laravel Chunk Upload (pionl/laravel-chunk-upload).

My php.ini settings are set to 128M for both post_max_size and upload_max_filesize.

My upload controller looks like:

    /**
     * Handles the file upload.
     *
     * @param \Illuminate\Http\Request $request
     *
     * @return \Illuminate\Http\Response
     */
    public function store(Request $request)
    {
        $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

        if (false === $receiver->isUploaded()) {
            throw new UploadMissingFileException();
        }

        $save = $receiver->receive();

        if ($save->isFinished()) {
            return $this->saveFileToCloud($save->getFile());
        }

        $handler = $save->handler();

        return response()->json([
            'done' => $handler->getPercentageDone(),
            'status' => true,
        ]);
    }

    protected function saveFile(UploadedFile $file, string $folder = 'uploads')
    {
        $path = Storage::disk('public')->putFile($folder, $file);
        $url = Storage::disk('public')->url($path);
        $name = basename($path);

        return response()->json(compact('url', 'name'));
    }

    /**
     * Saves the file to S3 server.
     *
     * @param \Illuminate\Http\UploadedFile $file
     *
     * @return \Illuminate\Http\Response
     */
    protected function saveFileToCloud(UploadedFile $file, string $folder = 'uploads')
    {
        $path = Storage::disk('s3')->putFile($folder, $file);
        $url = Storage::disk('s3')->url($path);
        $name = basename($path);

        dump($url, $name);

        return response()->json(compact('url', 'name'));
    }

I'm also using Dropzone on the front end to handle the chunking.

I'm wiring Dropzone up through a bit of Vue, like so:

    // when DZ inits
    this.dropzone.on("success", this.onDropzoneSuccess.bind(this));

    // in methods
    onDropzoneSuccess(file, response) {
      console.log("success", file, response);
      this.$emit("success", response);
    },
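
The actual Dropzone init isn't shown above, but it's roughly along these lines (the selector, URL, and exact option values here are placeholders rather than my real config). Worth noting: Dropzone's default chunkSize is 2,000,000 bytes, which lines up with the 2M threshold below: anything larger takes the chunked upload path, while smaller files go up as a single request.

    // Rough sketch of the Dropzone setup inside the Vue component; the
    // selector, URL and option values are placeholders.
    import Dropzone from "dropzone";

    export default {
      mounted() {
        this.dropzone = new Dropzone("#upload", {
          url: "/upload",
          chunking: true,     // split large uploads into chunks
          chunkSize: 2000000, // Dropzone's default: ~2 MB per chunk
          retryChunks: true,  // retry a chunk if it fails
        });

        // when DZ inits
        this.dropzone.on("success", this.onDropzoneSuccess.bind(this));
      },

      methods: {
        onDropzoneSuccess(file, response) {
          console.log("success", file, response);
          this.$emit("success", response);
        },
      },
    };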

So here's the issue. For files smaller than 2M everything works: the file uploads, gets saved to S3, and the controller responds with the file name and S3 URL. The console.log spits back something like:

    { url: "https://my-bucket.s3.my-region.amazonaws.com/uploads/my-image-name.jpeg", name: "my-image-name.jpeg" }

However, when the image is larger than 2M it still uploads successfully and saves to S3, and the dump() call in my controller shows the proper URL and name. The problem is that when onDropzoneSuccess is called it doesn't receive any response from the server, just an empty string instead of the URL/name that I know the server is returning. Dropzone's error callbacks don't fire, nothing gets logged to the httpd logs, PHP logs, or Laravel logs, and I'm at a loss as to where exactly this is falling apart.

kylemilloy commented 5 years ago

This looks to be a Dropzone issue. The success callback is being invoked, but with an empty response. The network tab shows the proper response coming back, yet Dropzone turns it into an empty string. Has anyone else seen this issue?

kylemilloy commented 5 years ago

^^ With that in mind, I ended up here:

https://gitlab.com/meno/dropzone/issues/51#note_47553173
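
For anyone who hits the same thing: with chunking enabled, the response argument Dropzone hands to the success event can come through as an empty string, while the final chunk's raw XHR should still be attached to the file object. A rough workaround sketch for the onDropzoneSuccess handler above (assuming Dropzone 5.x and the JSON response the controller returns; this is an illustration, not a snippet from the linked thread):

    // Workaround sketch: if the success payload is empty (chunked upload),
    // fall back to the raw XHR that Dropzone keeps on the file object.
    onDropzoneSuccess(file, response) {
      let payload = response;

      // For chunked uploads the response argument may arrive empty, while the
      // last chunk's response body is still available on file.xhr.
      if (!payload && file.xhr && file.xhr.responseText) {
        payload = JSON.parse(file.xhr.responseText);
      }

      console.log("success", file, payload);
      this.$emit("success", payload);
    },

Dropzone also exposes a chunksUploaded(file, done) option that fires after the last chunk completes, which is another place a fallback like this could live.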