w3c / ServiceWorker

Service Workers
https://w3c.github.io/ServiceWorker/

Replaying POST requests #693

Open karolklp opened 9 years ago

karolklp commented 9 years ago

Let's consider an offline-first survey application.

When the user performs GET requests, the SW will respond with cached content. But when the user performs a POST request, the situation gets complicated. I want to:

  1. Perform request by application
  2. Fetch the request by SW
  3. SW to check if it can send that request to the server
  4. If yes - respond with 200 to application
  5. If not - cache the request (store in IDB / cache) and retry it later (ex. using Background Sync)

How, in your opinion, should replaying requests be performed? I would like the ServiceWorker to act as a programmable network proxy and handle the whole process - the application should just send one request and not have to care any further. But sending a POST request is meant to change something on the server - we have to make the user (application) aware that the request failed but was successfully cached.
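
For illustration, a minimal sketch of that proxy-style flow in the SW, assuming a hypothetical queueRequest() helper that persists the request (e.g. into IDB) and a 202 response as the "queued" signal to the page:

    self.addEventListener('fetch', (event) => {
      if (event.request.method !== 'POST') return;
      event.respondWith(
        // Try the network first; the clone keeps event.request readable for queueing.
        fetch(event.request.clone()).catch(async () => {
          await queueRequest(event.request); // hypothetical: store method/headers/body for later replay
          // Tell the application the request was accepted but not yet delivered.
          return new Response(null, { status: 202, statusText: 'Queued' });
        })
      );
    });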

Jeff's response to this issue is OK, but he also suggests that the application should be responsible for replaying the requests - and I believe it'd be better to separate these layers (for Background Sync).

wanderview commented 9 years ago

cache the request (store in IDB / cache)

FWIW, the Cache API does not currently allow POST requests to be stored. You will get a TypeError.
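
For example, this rejects (the cache and request names are just placeholders):

    const cache = await caches.open('post-queue');
    const postRequest = new Request('/survey', { method: 'POST', body: 'answer=42' });
    // Rejects with a TypeError: the Cache API only stores GET requests.
    await cache.put(postRequest, new Response('ok'));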

jakearchibald commented 9 years ago

In browsers that support background sync, you probably want them to handle the POST for the first attempt. With your current steps, data will be lost if the user closes the tab between steps 1-4.

I think it should be (see the sketch after this list):

  1. Page: User clicks "send"
  2. Page: "sending" UI (spinner or notification)
  3. Page: Save data to IDB "outbox"
  4. Page: Register for background sync
  5. SW: On sync, empty outbox, communicate success/failure to clients
  6. Page: Show success/failure UI (failure can reassure the user the send will be attempted later)

However, I agree that Request/Response should be able to go into IDB, as long as bodyUsed is false. Getting a request/response from IDB should always return an un-drained copy, as the cache API does.

wanderview commented 9 years ago

However, I agree that Request/Response should be able to go into IDB, as long as bodyUsed is false. Getting a request/response from IDB should always return an un-drained copy, as the cache API does.

I think we need the fetch body stream to be fully spec'd, and something like whatwg/streams#276, before this could be solved. If the streamed body could be transferred, then in theory a structured clone for Request/Response could be defined.

karolklp commented 9 years ago

@jakearchibald @wanderview thanks! I wasn't aware of bodyUsed, that makes sense :)

jakearchibald commented 8 years ago

Once streams are fully specced we can start to think about how request/response could go into IDB.

rektide commented 7 years ago

Mozilla's Service Worker Cookbook example simply stores requests into localForage (which presumably picks its IDB backend): https://serviceworke.rs/request-deferrer_service-worker_doc.html

I'd be afraid of adding nasty complexity trying to wrangle POST into Caching.

rahulbm commented 6 years ago

I have come across the scenarios below:

  1. I want to cache POST calls the same way the ServiceWorker does for GET calls, so the PWA works offline, but that is not supported right now. The exact issue is raised here
  2. I have a form and the user is about to post, but the internet is unavailable. The form data should be stored locally and synced when the internet is available. The exact issue is raised here

I am not sure why both issues are considered the same. If they are not, please help me find an alternative way to do 2.

jakearchibald commented 6 years ago

You should be able to store the body of the request into IDB or similar.

wanderview commented 6 years ago

If we wanted to provide a convenience for this use case, maybe we could add a RequestQueue concept or something. Something like:

    let c = await caches.open("foo");

    // add something to a queue
    await c.enqueueRequest(fetchEvent.request);

    // pop the next item off the queue
    let req = await c.dequeueRequest(url);

    // drain the entire queue in one go
    let reqList = await c.drainRequests(url);

This would avoid the matching problem. All the posts match the same URL/vary combination. We just don't overwrite. Instead we queue the requests and don't have a response associated at all.
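
Purely hypothetical usage of that proposed API from a sync handler (none of these queue methods exist today, and '/api/survey' is a placeholder endpoint):

    self.addEventListener('sync', (event) => {
      if (event.tag !== 'outbox') return;
      event.waitUntil((async () => {
        const c = await caches.open('foo');
        // drainRequests() is the proposed method above, not a real API.
        const reqList = await c.drainRequests('/api/survey');
        for (const req of reqList) {
          await fetch(req);
        }
      })());
    });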

loganpowell commented 5 years ago

POST would be handy for graphql users

jakearchibald commented 5 years ago

Pre TPAC thoughts:

simondrabble commented 3 years ago

Any progress on this? The value of service workers is limited if they are restricted to GETs. A modern web app is rarely used just to retrieve information, so POST, PATCH, PUT and DELETE should be something such an app can handle even while offline.

jakearchibald commented 3 years ago

@simondrabble sorry for the delay in picking this up. Can you talk a little bit about how you'd use this feature, and why you can't store the blob in IDB along with the header metadata?

jakearchibald commented 3 years ago

Fwiw, you can do:

    const url = request.url;
    const method = request.method;
    const headers = [...request.headers];
    const body = await request.blob();
    const idbData = { url, method, headers, body };

…and you can now store idbData in IDB, and reconstruct the request with:

    const { url, method, headers, body } = idbData;
    const request = new Request(url, { method, headers, body });

That should cover most cases.
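
A minimal sketch of persisting those idbData objects with plain IndexedDB (database and store names are arbitrary):

    function openOutbox() {
      return new Promise((resolve, reject) => {
        const open = indexedDB.open('outbox-db', 1);
        open.onupgradeneeded = () =>
          open.result.createObjectStore('outbox', { autoIncrement: true });
        open.onsuccess = () => resolve(open.result);
        open.onerror = () => reject(open.error);
      });
    }

    async function saveToOutbox(idbData) {
      const db = await openOutbox();
      return new Promise((resolve, reject) => {
        const tx = db.transaction('outbox', 'readwrite');
        tx.objectStore('outbox').add(idbData); // Blobs structured-clone fine into IDB
        tx.oncomplete = () => resolve();
        tx.onerror = () => reject(tx.error);
      });
    }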

lucas42 commented 3 years ago

@jakearchibald one problem with using IDB is that it's much more complicated than CacheStorage's simple API. Also, having to deconstruct request objects and reconstruct them later is very fiddly and error-prone.
(For example, someone could easily mix up the body and url when reconstructing the request object).

A simpler (yet very hacky) approach I found was to store all my POST requests in a separate cache storage instance and just change the method on the way in and out. So adding a POST request looks like:

    const cache = await caches.open(ACTION_CACHE);
    await cache.put(new Request(request, {method: "GET"}), new Response());

and retrieving them all for processing later looks like:

    const cache = await caches.open(ACTION_CACHE);
    const actionQueue = await cache.keys();
    // cache.keys doesn't return requests in date order, so need to sort them
    actionQueue.sort((a,b) => {
        return  getRequestTime(a) - getRequestTime(b);
    });
    return actionQueue.map(request => new Request(request, {method: "POST"}));

I've not tested this extensively, but it works for my particular use case (my requests are all very simple and don't have custom bodies or headers). Hope someone finds that useful.

I know this is abusing CacheStorage for something it's not made for. But CacheStorage is the closest thing I've found to a persistent FIFO queue of native Request objects.
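
For completeness, a rough replay loop over that queue, reusing ACTION_CACHE and the author's getRequestTime() helper from the snippets above (only suitable for body-less requests, as noted):

    async function replayActions() {
      const cache = await caches.open(ACTION_CACHE);
      const actionQueue = await cache.keys();
      actionQueue.sort((a, b) => getRequestTime(a) - getRequestTime(b));
      for (const storedRequest of actionQueue) {
        // Re-send as POST; drop the stored GET entry once the server accepts it.
        const response = await fetch(new Request(storedRequest, { method: 'POST' }));
        if (response.ok) await cache.delete(storedRequest);
      }
    }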

wanderview commented 3 years ago

If we wanted something more convenient than IDB I think a new, separate "request queue" API would be more appropriate than using cache_storage.

BTW, your snippet above will not preserve request bodies.

jakearchibald commented 3 years ago

BTW, your snippet above will not preserve request bodies.

I guess you could store that in the response body in the cache, but it all feels very hacky compared to using IDB.

@lucas42 Have you seen https://www.npmjs.com/package/idb for making IDB easier?
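
For reference, roughly what the idbData approach looks like with that library (idb v4+ style API; database and store names are placeholders):

    import { openDB } from 'idb'; // needs a bundler or a module-type service worker

    const dbPromise = openDB('outbox-db', 1, {
      upgrade(db) {
        db.createObjectStore('outbox', { autoIncrement: true });
      },
    });

    async function saveToOutbox(idbData) {
      await (await dbPromise).add('outbox', idbData);
    }

    async function readOutbox() {
      return (await dbPromise).getAll('outbox');
    }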

lucas42 commented 3 years ago

@lucas42 Have you seen https://www.npmjs.com/package/idb for making IDB easier?

I had not, thanks! Any library with Douglas Adams references in its examples is definitely worth investigating 😄 For now, my very hacky approach has covered my very niche usecase. (Which is only a personal project, so I'm taking pride in the level of hackiness). But if I come across a similar problem in future, I will try the IDB way.

simondrabble commented 3 years ago

@jakearchibald I ended up going a slightly different route and breaking apart the request to store in idb. Not super clean by my standards but it works.

How I'd use the feature is pretty much analogous to GET requests, but it's been a few months since I've worked on that section of code and I don't recall much of the specifics of how I was trying to use ServiceWorker.