OlegWock / anori

Customizable new tab extension for Chrome, Firefox and Safari
https://anori.app
GNU Affero General Public License v3.0

Cloud sync of settings #47

Open karma1428 opened 1 year ago

OlegWock commented 1 year ago

Hello @karma1428. You can use export/import (in the settings modal) to make this process a bit easier.

Making full-fledged sync involves quite a lot of work and additional costs, so it's not in the near-term plans.

mileshc1 commented 1 year ago

I would love to see this cloud sync feature, as adding a to-do list on one pc and then remembering to export and import onto my laptop is not manageable. Would be awesome to have some sort of cloud sync.

kiznick commented 1 year ago

What if we give a list of free database services (some providers give a free tier and unlimited time. I think it is enough to save a to-do list.) and we add an option to link the database?

OlegWock commented 1 year ago

Yeah, that could be a possible solution. But it would be a kind of ad-hoc solution, and I'm afraid it might be quite confusing/complicated for non-tech users.

I have a few features in mind that would benefit from having a full-fledged back-end (like Gmail or Spotify integration), so I'm leaning in that direction. But this kinda depends on how successful my attempts at marketing will be: if the extension stays at about the same audience as now, I'll probably go with your proposed solution of using a third-party database provider and asking users to register there. But if everything goes smoothly, we'll go with a companion back-end to provide sync and other features.

OlegWock commented 1 year ago

I removed the 'Pull requests are welcome' label as this feature is currently in limbo; it will be clearer how to proceed in a month or two.

sansmoraxz commented 10 months ago

FYI, storage.sync can help, provided you're not storing a large payload.

OlegWock commented 10 months ago

I considered this, but our payloads are larger than its limits, and we update storage quite often.

sansmoraxz commented 10 months ago

I wonder, will it really grow that much?

I have 6 different pages. My total backup size of exported configs is hardly even 12 kb.

Granted, I am just starting out and was just looking for a nightab alternative when I stumbled upon this project.

sansmoraxz commented 10 months ago

Maybe add a button to sync manually.

Checking the size of the compressed payload and showing an error message if the limit is exceeded could be a good balance, IMO.
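A size check like that could be sketched as follows. This is illustrative only, assuming Node's built-in zlib for compression; the function names are hypothetical, and the 100 KB figure matches chrome.storage.sync's documented total quota (QUOTA_BYTES = 102400):

```typescript
import { gzipSync } from "node:zlib";

// chrome.storage.sync's documented total quota is 102,400 bytes
const SYNC_TOTAL_LIMIT = 100 * 1024;

// Measure how large the exported config would be after gzip compression
function compressedSize(config: string): number {
  return gzipSync(Buffer.from(config, "utf8")).length;
}

// Reject the payload up front instead of letting the storage write fail
function checkSyncable(config: string, limit: number = SYNC_TOTAL_LIMIT): void {
  const size = compressedSize(config);
  if (size > limit) {
    throw new Error(
      `Config is ${size} bytes compressed, which exceeds the ${limit} byte sync limit`
    );
  }
}
```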

OlegWock commented 10 months ago

That could work for some (lean) setups, but it won't work for other users. My setup, for example, is 162 KB compressed. Even if we work around the limitations of chrome.sync (8 KB per item, 100 KB total), it won't be easy to code and maintain. I'm on the 'good solution or nothing' side in this debate.
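For context, the workaround mentioned above would mean splitting one serialized config across multiple items to stay under the per-item quota. A minimal sketch, assuming Chrome's documented quotas (QUOTA_BYTES_PER_ITEM = 8192); the key names and helpers are hypothetical, and slicing by string length only approximates the byte-based quota:

```typescript
// Leave headroom under the 8,192-byte per-item quota, since the key name
// and JSON quoting of the value also count toward it
const CHUNK_SIZE = 7000;

// Split a serialized config into items small enough for storage.sync
function chunkForSync(serialized: string, chunkSize: number = CHUNK_SIZE): Record<string, string> {
  const items: Record<string, string> = {};
  const count = Math.ceil(serialized.length / chunkSize) || 1;
  for (let i = 0; i < count; i++) {
    items[`config_${i}`] = serialized.slice(i * chunkSize, (i + 1) * chunkSize);
  }
  // Store the chunk count so the reader knows how many pieces to reassemble
  items["config_meta"] = JSON.stringify({ count });
  return items;
}

// Reassemble the original string from the stored chunks
function joinFromSync(items: Record<string, string>): string {
  const { count } = JSON.parse(items["config_meta"]);
  let out = "";
  for (let i = 0; i < count; i++) out += items[`config_${i}`];
  return out;
}
```

Even with chunking, the 100 KB total quota still caps the whole setup, which is why this doesn't help for larger configs like the one described above.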

We might not need to code all the sync logic from scratch and could incorporate projects like electric-sql, TinyBase, or cr-sqlite, but even in that case we'd need some kind of central server with user management, which is a big chunk of work and maintenance.

sansmoraxz commented 10 months ago

Hey @OlegWock, I gave this some thought. Truth be told, I'm not really a fan of having my extension configs synced somewhere I can't directly control.

What do you think of using GitHub to store the configs? Namely, just sync the user-generated PATs and repo URLs, and then directly update the files through GitHub's REST API.

OlegWock commented 10 months ago

@sansmoraxz If I were to implement sync, I'd prefer a solution that works for the majority of users (most of them aren't very tech-savvy), so a solution with GitHub isn't ideal.

However, if you (or anyone else) are willing to implement sync with GitHub (or any other service the user can control, like their Dropbox or something) and submit a PR, I'll merge it (likely hidden under 'Advanced settings' or something like that).

From my side I'll be happy to answer any questions about code and guide you if you decide to give it a try.

sansmoraxz commented 7 months ago

I tried working on this, but the workflow is too damn slow. Not sure if it's specific to extension development or just the huge webpack config. The dev server doesn't even work.

sansmoraxz commented 7 months ago

Anyway, for anyone trying to take a stab at this with Octokit:

I made a small API server implementation.

For viewing the contents of a file:

import { Octokit } from "octokit";

export async function POST(request: Request) {
  // path is the expected property in the request body
  const owner = process.env.GITHUB_OWNER || "octocat";
  const repo = process.env.GITHUB_REPO || "hello-world";

  console.log("request: ", request);
  const { path } = await request.json();

  const octokit = new Octokit({
    auth: process.env.GITHUB_TOKEN,
  });

  try {
    const x = await octokit.rest.repos.getContent({
      owner: owner,
      repo: repo,
      path: path,
      headers: {
        "X-GitHub-Api-Version": "2022-11-28",
      },
    });

    if (x.status !== 200) {
      throw new Error("Failed to fetch file");
    }

    // getContent returns an array for directory paths; for a single file,
    // the sha lives directly on the data object
    const sha = Array.isArray(x.data) ? null : x.data.sha;

    return new Response(
      JSON.stringify({
        status: 200,
        sha: sha,
      }),
      { status: 200 }
    );
  } catch (e) {
    console.error(e);
    return new Response(
      JSON.stringify({
        status: 500,
        sha: null,
      }),
      { status: 500 }
    );
  }
}

For updating the file you need to validate against the existing sha of the file, which is available from calling the read API above. Sample here:

import { Octokit } from "octokit";

export async function POST(request: Request) {
  // data, sha, and path are the expected properties in the request body
  const owner = process.env.GITHUB_OWNER || "octocat";
  const repo = process.env.GITHUB_REPO || "hello-world";

  console.log("request: ", request);
  const { data, sha, path } = await request.json();

  const octokit = new Octokit({
    auth: process.env.GITHUB_TOKEN,
  });

  try {
    const x = await octokit.rest.repos.createOrUpdateFileContents({
      owner: owner,
      repo: repo,
      path: path,
      sha: sha,
      message: "update",
      committer: {
        name: "github-actions",
        email: "41898282+github-actions[bot]@users.noreply.github.com",
      },
      content: Buffer.from(data).toString("base64"),
      headers: {
        "X-GitHub-Api-Version": "2022-11-28",
      },
    });

    if (x.status !== 200 && x.status !== 201) {
      throw new Error("Failed to update file");
    }

    return new Response(
      JSON.stringify({
        status: 200,
        sha: x.data.content?.sha,
      }),
      { status: 200 }
    );
  } catch (e) {
    console.error(e);
    return new Response(
      JSON.stringify({
        status: 500,
        sha: null,
      }),
      { status: 500 }
    );
  }
}
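The sha in the snippets above effectively acts as an optimistic-concurrency token: GitHub rejects an update whose sha doesn't match the current file, which is how conflicting writes from two machines get caught. A minimal sketch of the client-side decision logic such a sync would need (the function and type names are hypothetical, not from the extension's code):

```typescript
// What the client should do, given the sha it last saw, the sha the remote
// currently reports, and whether there are unsynced local changes
type SyncAction = "push" | "pull" | "noop";

function decideSync(
  localSha: string | null,
  remoteSha: string | null,
  hasLocalChanges: boolean
): SyncAction {
  // No remote copy yet: upload if there is anything to upload
  if (remoteSha === null) return hasLocalChanges ? "push" : "noop";
  // Shas match: remote hasn't moved, so a push is safe
  if (localSha === remoteSha) return hasLocalChanges ? "push" : "noop";
  // Remote moved ahead: fetch first rather than overwrite blindly
  return "pull";
}
```

This doesn't resolve conflicts by itself (a pull followed by local edits still needs a merge strategy), which is part of why a "good solution or nothing" stance was taken earlier in the thread.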

OlegWock commented 6 months ago

Didn't notice that this was closed. I'm going to reopen it and keep it as the 'main' issue for all cloud sync requests, as this one has the most actually useful info. If you don't want to get notifications, please unsubscribe from this issue by clicking the button to the right.