laminas / laminas-diactoros

PSR HTTP Message implementations
https://docs.laminas.dev/laminas-diactoros/
BSD 3-Clause "New" or "Revised" License

Low performance of TextResponse and other related responses #58

Open codercms opened 3 years ago

codercms commented 3 years ago

Feature Request

Q           | A
New Feature | no
RFC         | no
BC Break    | no

Summary

Currently TextResponse always allocates a new buffer for the string output (see https://github.com/laminas/laminas-diactoros/blob/2.6.x/src/Response/TextResponse.php#L75). This approach is really slow; my suggestion is to implement a pure in-RAM string stream. Here is what I mean: https://github.com/makise-co/framework/blob/master/src/Http/FakeStream.php (this approach is ~30% more performant).
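
For reference, the body construction in question looks roughly like this (a paraphrase of the linked createBody; exact details may differ): a native php://temp stream is allocated and the whole string is copied into it.

use Laminas\Diactoros\Stream;

// Roughly what TextResponse does today: allocate a native PHP stream
// and copy the already-existing string into it.
// $text stands for the string passed to the TextResponse constructor.
$body = new Stream('php://temp', 'wb+');
$body->write($text);
$body->rewind();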

lcobucci commented 3 years ago

@codercms are you always experiencing that low performance, or does it depend on the text length? Does anything change if /maxmemory:NN gets added to php://temp? The filesystem should only be used when the text is big (the default limit is 2 MB).

Ocramius commented 3 years ago

my suggestion is to implement pure RAM string stream.

Overall I agree; php://memory could suffice here, but we are also not sure whether the provided string|StreamInterface will fit in memory, hence the more conservative php://temp was used here.

I suggest sending a patch, and then we can discuss the implications there: it is also acceptable to tell end users that TextResponse is not intended for large payloads (which is in fact the original design anyway).

codercms commented 3 years ago

@lcobucci there are two cases:

  1. I'm experiencing low performance because TextResponse does an excess allocation of a native PHP stream (allocation time + string copying time)
  2. With larger text, the performance difference is even bigger

Patched TextResponse:

$size = 10 * 1024 * 1024;
$body = new Stream("php://temp/maxmemory:{$size}", 'wb+');

It is still slower than the approach without native PHP stream allocation: 320 RPS without the native PHP stream vs. 260 RPS with it.

codercms commented 3 years ago

@Ocramius I guess this patch will not affect end users, because the native PHP stream is only allocated when a string is passed to the constructor - which means the memory for the response body is already allocated.

Ocramius commented 3 years ago

Looking back at this, I think that for text, we can use php://memory, and let consumers with large text/plain payloads implement their own streamed response perhaps?

Unsure if this should be considered a BC break.
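
As a rough sketch of that direction (not a committed change), only the stream wrapper would need to be swapped in the body construction; php://temp spills to a temporary file once its memory limit is exceeded, while php://memory always stays in RAM:

use Laminas\Diactoros\Stream;

// php://memory keeps the whole payload in RAM, unlike php://temp,
// which falls back to a temporary file for large payloads.
$body = new Stream('php://memory', 'wb+');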

Ocramius commented 3 years ago

Similar for html, which is the more common response object.

codercms commented 3 years ago

@Ocramius I think there is one more approach (it can be implemented in two different ways):

  1. Create a “fake” stream object that just simulates stream behavior (under the hood it's simply a string wrapper that implements StreamInterface; see the sketch below) - the end user has to create the stream object themselves
  2. Create a true plain string response which creates the “fake” stream object under the hood - no additional action is required from the end user

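A minimal sketch of such a string-backed stream, assuming PSR-7's StreamInterface (the class name and details are hypothetical; the linked FakeStream may differ):

use Psr\Http\Message\StreamInterface;

// Hypothetical read-only stream backed by the string it wraps: the
// payload stays where it already is, and only a read offset is tracked.
final class StringStream implements StreamInterface
{
    private $contents;
    private $position = 0;

    public function __construct(string $contents)
    {
        $this->contents = $contents;
    }

    public function __toString(): string { return $this->contents; }
    public function close(): void {}
    public function detach() { return null; }
    public function getSize(): ?int { return strlen($this->contents); }
    public function tell(): int { return $this->position; }
    public function eof(): bool { return $this->position >= strlen($this->contents); }
    public function isSeekable(): bool { return true; }

    public function seek($offset, $whence = SEEK_SET): void
    {
        if ($whence === SEEK_SET) {
            $this->position = $offset;
        } elseif ($whence === SEEK_CUR) {
            $this->position += $offset;
        } else { // SEEK_END
            $this->position = strlen($this->contents) + $offset;
        }
    }

    public function rewind(): void { $this->position = 0; }

    public function isWritable(): bool { return false; }
    public function write($string): int { throw new \RuntimeException('StringStream is read-only'); }
    public function isReadable(): bool { return true; }

    public function read($length): string
    {
        $chunk = (string) substr($this->contents, $this->position, $length);
        $this->position += strlen($chunk);
        return $chunk;
    }

    public function getContents(): string
    {
        $chunk = (string) substr($this->contents, $this->position);
        $this->position = strlen($this->contents);
        return $chunk;
    }

    public function getMetadata($key = null) { return $key === null ? [] : null; }
}

Because reads and seeks only move an integer offset over the existing string, no native PHP stream resource is allocated and the payload is never copied.
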
Ocramius commented 3 years ago

I didn't really understand the difference between those suggested approaches :thinking:

codercms commented 3 years ago

@Ocramius actually it's only one approach, which can be implemented in two manners

Ocramius commented 3 years ago

But basically, if I understand it correctly, the idea is to avoid using a stream at all and to keep a string variable instead (since html/text responses are generally well within the memory limits)

codercms commented 3 years ago

@Ocramius yes, exactly. The performance problem occurs when the Response object allocates a stream (a native PHP stream, that is) for the string, even though there is no need to allocate that memory a second time.
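
For illustration, such a string payload could then be passed straight to the existing Response class, which already accepts any StreamInterface as its body (this uses the hypothetical StringStream sketched above, not an existing Diactoros class):

use Laminas\Diactoros\Response;

// The payload string is wrapped, not copied into a native PHP stream,
// so no second allocation takes place.
$response = new Response(
    new StringStream('Hello, world!'),
    200,
    ['Content-Type' => 'text/plain; charset=utf-8']
);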