Consider a common scenario: a user has a blog with hundreds of posts. She makes a small change to a draft, creates a new draft, or updates a published post. When she returns to the post list, we upload the local changes and then download and re-save every single post to the database in one gigantic request, because the post list's ETag has changed. This is horrendously inefficient: in one casual test, this scenario (one small change to a single post) on a blog with 100 fairly long posts used 250-350 KB of data. Now imagine a blog with 5x as many posts and a user making lots of small edits across several of them.
These should help:
[x] Temporary workaround: limit the number of posts downloaded (d96fd90)
[ ] Implement pagination in post list (#81)
[x] Use E-Tags for all API calls to reduce data usage (#104)
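The idea behind the per-call ETag item is the standard HTTP conditional-request pattern: store the ETag for each post, send it back as `If-None-Match`, and let the server answer `304 Not Modified` (no body) for anything unchanged, so only edited posts cost bandwidth. A minimal sketch of that flow, with a fake in-memory server standing in for the real API (all names here, `SERVER`, `server_get`, `PostCache`, are illustrative, not the app's actual code):

```python
from dataclasses import dataclass, field

# Stand-in for the remote API: post id -> (current ETag, body).
SERVER = {
    1: ("etag-a", "post one body"),
    2: ("etag-b", "post two body"),
    3: ("etag-c", "post three body"),
}

def server_get(post_id, if_none_match=None):
    """Simulated HTTP GET with If-None-Match; returns (status, etag, body)."""
    etag, body = SERVER[post_id]
    if if_none_match == etag:
        return 304, etag, None          # Not Modified: no body transferred
    return 200, etag, body

@dataclass
class PostCache:
    etags: dict = field(default_factory=dict)    # post id -> last seen ETag
    bodies: dict = field(default_factory=dict)

    def sync(self):
        """Fetch each post conditionally; return ids that actually downloaded."""
        downloaded = []
        for post_id in SERVER:
            status, etag, body = server_get(post_id, self.etags.get(post_id))
            if status == 200:           # only new/changed posts cost bandwidth
                self.etags[post_id] = etag
                self.bodies[post_id] = body
                downloaded.append(post_id)
        return downloaded

cache = PostCache()
print(cache.sync())     # first sync downloads everything: [1, 2, 3]
SERVER[3] = ("etag-c2", "post three body, edited")
print(cache.sync())     # only the edited post re-downloads: [3]
```

With per-post ETags, the worst case of the scenario above shrinks from "re-download all 100 posts" to "re-download the one post that changed"; pagination (#81) then bounds how many conditional checks happen per screen.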