Compare commits

...

6 Commits

Author SHA1 Message Date
vorotamoroz
c3464a4e9c New feature:
- Boot-up sequence prevention implemented.

Touched up the docs.
2021-12-28 11:30:19 +09:00
vorotamoroz
55545da45f Fixed:
- Fixed problems with saving and deleting files in the local database.
- Disabled the version-up warning.
- Fixed an error on folder renaming.
- Merge dialogs are now shown one at a time.
- Fixed the icons of queued files.
- Handled the folder-to-file sync issue.
- Fixed the messages in the settings dialog.
- Fixed a deadlock.
2021-12-24 17:05:57 +09:00
vorotamoroz
96165b4f9b Fixed and implemented:
- New configuration to solve synchronization failures on large vaults.
- Fixed preset misconfigurations.
- Fixed occasional hangs during replication.
- Wrote documentation.
2021-12-23 13:22:46 +09:00
vorotamoroz
abe613539b Fixes:
- Allow "'" and ban "#" in filenames. #27
- Fixed a misspelling (one of #28)
2021-12-22 19:38:00 +09:00
vorotamoroz
fc210de58b bumped 2021-12-16 19:08:15 +09:00
vorotamoroz
1b2f9dd171 Refactored and fixed:
- Refactored, linted, fixed potential problems, and enabled 'use strict'

Fixed:
- Added an "Enable plugin synchronization" option
(previously, plugin and settings synchronization always ran)

Implemented:
- Sync presets implemented.
- "Check integrity on saving" implemented.
- "Sanity check" implemented (mainly for debugging).
2021-12-16 19:06:42 +09:00
23 changed files with 8893 additions and 4114 deletions

3
.eslintignore Normal file

@@ -0,0 +1,3 @@
npm node_modules
build
.eslintrc.js.bak

19
.eslintrc Normal file

@@ -0,0 +1,19 @@
{
"root": true,
"parser": "@typescript-eslint/parser",
"plugins": ["@typescript-eslint"],
"extends": ["eslint:recommended", "plugin:@typescript-eslint/eslint-recommended", "plugin:@typescript-eslint/recommended"],
"parserOptions": {
"sourceType": "module"
},
"rules": {
"no-unused-vars": "off",
"@typescript-eslint/no-unused-vars": ["error", { "args": "none" }],
"@typescript-eslint/ban-ts-comment": "off",
"no-prototype-builtins": "off",
"@typescript-eslint/no-empty-function": "off",
"require-await": "warn",
"no-async-promise-executor": "off",
"@typescript-eslint/no-explicit-any": "off"
}
}

README.md

@@ -81,6 +81,7 @@ Synchronization status is shown in the status bar.
- When synchronizing, files are compared by their modification times and the newer one overwrites the older. Then the plugin checks for conflicts and, if a merge is needed, opens a dialog.
- Rarely, a file in the database may become corrupted. The plugin will not write a file to storage when it looks corrupted, so an older copy should remain on your device. If you edit the file, the entry will be repaired. But if the file does not exist on any device, it cannot be rescued; you can delete such items from the settings dialog.
- If your database looks corrupted, try "Drop History". It is usually the easiest fix.
- To stop the boot-up sequence so you can fix database problems, put `redflag.md` at the top of your vault (see the sketch after this list).
- Q: The database keeps growing; how can I shrink it?
A: Each document is saved along with its last 100 revisions, which are used to detect and resolve conflicts. Picture a device that has been offline for a while and then rejoins: it has to reconcile its notes with the remotely saved ones. If the device's note exists in the remote revision history, even though it differs slightly from the latest one, it can be merged safely. Even if it does not, only the differences after the most recent revision that both devices have in common need to be checked. This is like Git's conflict-resolution method. So, as with an enlarged Git repository, we have to rebuild the database to solve the root of the problem.
- More technical information is in the [Technical Information](docs/tech_info.md)
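A minimal sketch of how this flag could be checked at startup, assuming an Obsidian plugin context; FLAGMD_REDFLAG mirrors the constant in src/types.ts, and isRedFlagRaised is a hypothetical helper name:

// Hypothetical sketch: skip the boot-up sequence when redflag.md exists in the vault root.
import { App } from "obsidian";

const FLAGMD_REDFLAG = "redflag.md"; // same value as the constant in src/types.ts

function isRedFlagRaised(app: App): boolean {
    return app.vault.getAbstractFileByPath(FLAGMD_REDFLAG) != null;
}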

README_ja.md

@@ -83,6 +83,7 @@ I also made a WebClipper for Self-hosted LiveSync. It is available from the Chrome Web Store
- After synchronization, files are compared by timestamp and the newer one overwrites the older. After that, a merge is performed depending on whether a conflict occurred.
- Rarely, a file may become corrupted. Corrupted files are never written to disk, so a slightly older copy usually remains on the device in use. If that file is updated again, the database is refreshed and the problem often disappears. If the file no longer exists on any device, you can delete the entry from the settings screen.
- If the database seems broken, reinitialize it with Drop History's "apply and send". That fixes it in most cases.
- If the local database cannot be repaired properly, for example after restarting in the middle of a database recovery, place a file named `redflag.md` at the top of the vault. The boot-up sequence will then be skipped.
- Q: The database keeps growing; can it be shrunk? A: Each note is stored together with its last 100 revisions. Imagine, for example, a device that has been offline for a while and then synchronizes again; it probably holds a revision slightly different from the latest one. Even then, if that revision exists in the remote revision history, the notes can be merged safely. If it does not, the differences to check can still be narrowed to those after the most recent revision both sides have in common. Conflicts are thus resolved in a Git-like way. Consequently, just as with resolving a bloated repository, truly shrinking the database requires rebuilding it.
- Other technical details are described in [Technical Information](docs/tech_info_ja.md).

docs/settings.md

@@ -23,7 +23,7 @@ The Database name to synchronize.
## Local Database Configurations
"Local Database" is the database created inside your Obsidian.
### Batch database update (beta)
### Batch database update
Delays the database update until replication starts, another file is opened, window visibility changes, or any file event other than modification occurs.
This option cannot be used together with LiveSync.
@@ -78,6 +78,27 @@ When running these operations, every synchronization setting is disabled.
**Note that the passphrase is not checked until the plugin actually decrypts. So if you set a wrong passphrase and run "Apply and Receive", you will get a large number of decryption errors. This is by design.**
### minimum chunk size and LongLine threshold
The configuration of chunk splitting.
Self-hosted LiveSync splits each note into chunks for efficient synchronization. Each chunk must be at least "Minimum chunk size" characters long.
Specifically, the length of a chunk is determined in the following order (see the sketch after this list).
1. Find the nearest newline character; if it is farther than the LongLine Threshold, this piece becomes an independent chunk.
2. Otherwise, find the nearest of the following:
1. Newline character
2. Empty line (Windows style)
3. Empty line (non-Windows style)
3. Compare the farthest of these three positions with the next "\[newline\]#" position, and pick the shorter piece as the chunk.
This rule was made empirically from my dataset. If it behaves badly on your data, please send me the information.
You can dump the saved note structure with `Dump informations of this doc`. When sending it to me, replace every character except newlines and "#" with x.
The default values are 20 and 250 characters.
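A minimal TypeScript sketch of this heuristic, assuming the rules above are applied literally (the real implementation lives in the suppressed main.ts diff and may differ in detail):

// Sketch of the chunk-splitting heuristic described above.
function splitIntoChunks(note: string, minimumChunkSize = 20, longLineThreshold = 250): string[] {
    const chunks: string[] = [];
    let pos = 0;
    while (pos < note.length) {
        const from = pos + minimumChunkSize; // every chunk is at least this long
        const nl = note.indexOf("\n", from);
        let end: number;
        if (nl < 0) {
            end = note.length; // no newline left: the rest becomes one chunk
        } else if (nl - pos > longLineThreshold) {
            end = nl; // rule 1: a long line becomes an independent chunk
        } else {
            // rule 2: nearest newline, Windows empty line, or non-Windows empty line
            const candidates = [nl, note.indexOf("\r\n\r\n", from), note.indexOf("\n\n", from)].filter((e) => e >= 0);
            const farthest = Math.max(...candidates);
            // rule 3: compare with the next "[newline]#" position and pick the shorter piece
            const heading = note.indexOf("\n#", from);
            end = (heading >= 0 ? Math.min(farthest, heading) : farthest) + 1; // include the delimiter
        }
        chunks.push(note.substring(pos, end));
        pos = end;
    }
    return chunks;
}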
## General Settings
### Do not show low-priority log
@@ -122,32 +143,58 @@ Self-hosted LiveSync will delete the folder when the folder becomes empty. If th
### Use newer file if conflicted (beta)
When a conflict occurs, always resolve it by overwriting with the newer file.
### minimum chunk size and LongLine threshold
The configuration of chunk splitting.
### Advanced settings
Self-hosted LiveSync uses PouchDB and synchronizes with the remote via [this protocol](https://docs.couchdb.org/en/stable/replication/protocol.html).
So it splits every entry into chunks that the database can accept within its limited payload and document sizes.
Self-hosted LiveSync splits each note into chunks for efficient synchronization. Each chunk must be at least "Minimum chunk size" characters long.
However, that alone was not enough.
According to [2.4.2.5.2. Upload Batch of Changed Documents](https://docs.couchdb.org/en/stable/replication/protocol.html#upload-batch-of-changed-documents) in [Replicate Changes](https://docs.couchdb.org/en/stable/replication/protocol.html#replicate-changes), the request might still become quite large.
Specifically, the length of a chunk is determined in the following order.
Unfortunately, there is no way to adjust this automatically by size for every request.
Therefore, I made it configurable.
1. Find the nearest newline character; if it is farther than the LongLine Threshold, this piece becomes an independent chunk.
Note: if you set these values lower, the number of requests will increase.
Therefore, if you are far from the server, total throughput will drop and traffic will increase.
2. Otherwise, find the nearest of the following:
1. Newline character
2. Empty line (Windows style)
3. Empty line (non-Windows style)
3. Compare the farthest of these three positions with the next "\[newline\]#" position, and pick the shorter piece as the chunk.
### Batch size
Number of change feed items to process at a time. Defaults to 250.
This rule was made empirically from my dataset. If it behaves badly on your data, please send me the information.
You can dump the saved note structure with `Dump informations of this doc`. When sending it to me, replace every character except newlines and "#" with x.
The default values are 20 and 250 characters.
### Batch limit
Number of batches to process at a time. Defaults to 40. This, together with batch size, controls how many documents are kept in memory at a time.
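"Batch size" and "Batch limit" correspond to PouchDB's batch_size and batches_limit replication options; a minimal sketch with placeholder database names:

// Sketch: passing the two batch settings to PouchDB replication.
import PouchDB from "pouchdb-browser";

const local = new PouchDB("vault-local");
const remote = new PouchDB("https://couch.example.com/vault"); // placeholder URL

local.sync(remote, {
    live: true, // continuous, LiveSync-style replication
    retry: true,
    batch_size: 250, // change feed items per batch ("Batch size")
    batches_limit: 40, // batches processed at a time ("Batch limit")
});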
## Miscellaneous
### Show status inside editor
Show information inside the editor pane.
This is useful on mobile.
### Check integrity on saving
Checks that all chunks are saved correctly when saving.
### Presets
You can set the synchronization method all at once using these patterns (a sketch of how presets could be represented follows the list):
- LiveSync
- LiveSync : enabled
- Batch database update : disabled
- Periodic Sync : disabled
- Sync on Save : disabled
- Sync on File Open : disabled
- Sync on Start : disabled
- Periodic w/ batch
- LiveSync : disabled
- Batch database update : enabled
- Periodic Sync : enabled
- Sync on Save : disabled
- Sync on File Open : enabled
- Sync on Start : enabled
- Disable all sync
- LiveSync : disabled
- Batch database update : disabled
- Periodic Sync : disabled
- Sync on Save : disabled
- Sync on File Open : disabled
- Sync on Start : disabled
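A sketch of how these presets could be represented against the ObsidianLiveSyncSettings fields defined in src/types.ts; the PRESETS table and applyPreset helper are hypothetical names:

// Hypothetical sketch: each preset is a partial settings object applied over the current settings.
import { ObsidianLiveSyncSettings } from "./types";

const PRESETS: Record<string, Partial<ObsidianLiveSyncSettings>> = {
    "LiveSync": { liveSync: true, batchSave: false, periodicReplication: false, syncOnSave: false, syncOnFileOpen: false, syncOnStart: false },
    "Periodic w/ batch": { liveSync: false, batchSave: true, periodicReplication: true, syncOnSave: false, syncOnFileOpen: true, syncOnStart: true },
    "Disable all sync": { liveSync: false, batchSave: false, periodicReplication: false, syncOnSave: false, syncOnFileOpen: false, syncOnStart: false },
};

function applyPreset(settings: ObsidianLiveSyncSettings, name: keyof typeof PRESETS): ObsidianLiveSyncSettings {
    return { ...settings, ...PRESETS[name] };
}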
## Hatch
From here, everything is under the hood. Please handle it with care.
@@ -165,8 +212,11 @@ The remote database indicates that it has been unlocked by Pattern 1.
When you mark all devices as resolved, you can unlock the database.
But there is no problem even if you leave it as it is.
### Reread all files
Rereads all files in the vault and updates them in the database where there is a difference or the file could not be read from the database.
### Verify and repair all files
Reads all files in the vault and updates them in the database where there is a difference or the file could not be read from the database.
### Sanity check
Checks that every file in the local database has all of its chunks (see the sketch below).
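A minimal sketch of such a check, using the NewEntry/PlainEntry shapes from src/types.ts; the localDB handle and the iteration strategy are assumptions:

// Hypothetical sketch: verify that every chunked note still has all of its leaf chunks.
import { EntryDoc, NewEntry, PlainEntry } from "./types";

async function sanityCheck(localDB: PouchDB.Database<EntryDoc>): Promise<string[]> {
    const broken: string[] = [];
    const all = await localDB.allDocs({ include_docs: true });
    for (const row of all.rows) {
        const doc = row.doc as NewEntry | PlainEntry | undefined;
        if (!doc || (doc.type != "newnote" && doc.type != "plain")) continue;
        for (const leafId of doc.children) {
            try {
                await localDB.get(leafId); // throws a 404 if the chunk is missing
            } catch {
                broken.push(doc._id);
                break;
            }
        }
    }
    return broken;
}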
### Drop history
Drops all history in the local and remote databases and initializes them. Try this when synchronizing to a new device or a new vault takes too long, or when the database has become much larger.

docs/settings_ja.md

@@ -25,7 +25,7 @@ Enter the CouchDB URI. For Cloudant, use the "External Endpoint(prefe
## Local Database Configurations
Settings for the database created inside your device.
### Batch database update (beta)
### Batch database update
Delays database updates until one of the following occurs:
- Replication starts
- Another file is opened
@@ -78,6 +78,24 @@ For end-to-end encryption, data encrypted with a different passphrase
Running either operation disables all synchronization settings.
**Also, the passphrase is not checked until decryption actually happens. So if you set a wrong passphrase and synchronize with "Apply and receive", a large number of errors will occur. This is by design.**
### minimum chunk size and LongLine threshold
Settings for chunk splitting.
Self-hosted LiveSync splits notes into chunks of at least "minimum chunk size" characters so that synchronization can be as efficient as possible.
This was implemented to avoid a problem with fixed-length splitting: editing near the beginning of a file shifted every subsequent split position, so nearly the whole file ended up being transferred again.
Specifically, the plugin scans from the beginning for the nearest of the following positions and takes the longest resulting piece as one chunk.
1. Find the next newline; if it is farther than the LongLine Threshold, that piece is fixed as one chunk.
2. Otherwise, look for the nearest of the following, in order:
1. A newline
2. An empty line (Windows style)
3. An empty line (non-Windows style)
3. Compare the farthest of those three positions with the next position where a line starts with "#" after a newline, and take the shorter piece as the chunk.
This rule was made empirically, since my real-world data is biased. If it behaves unexpectedly, please run `Dump informations of this doc` from the command palette and send me the information.
The algorithm still works even if every character except newlines and "#" is replaced with ●.
The defaults are 20 characters and 250 characters.
## General Settings
General settings.
@@ -124,30 +142,34 @@ Normally, Self-hosted LiveSync deletes a folder once every file in it has been deleted
### Use newer file if conflicted (beta)
When a conflict occurs, it is automatically resolved by always using the newer file.
### minimum chunk size and LongLine threshold
Settings for chunk splitting.
Self-hosted LiveSync splits notes into chunks of at least "minimum chunk size" characters so that synchronization can be as efficient as possible.
This was implemented to avoid a problem with fixed-length splitting: editing near the beginning of a file shifted every subsequent split position, so nearly the whole file ended up being transferred again.
Specifically, the plugin scans from the beginning for the nearest of the following positions and takes the longest resulting piece as one chunk.
### Advanced settings
Self-hosted LiveSync uses PouchDB and synchronizes with the remote via [this protocol](https://docs.couchdb.org/en/stable/replication/protocol.html).
Therefore, every note is split into chunks that fit the payload and document sizes the database accepts.
1. Find the next newline; if it is farther than the LongLine Threshold, that piece is fixed as one chunk.
However, that alone was not sufficient in some cases; according to [2.4.2.5.2. Upload Batch of Changed Documents](https://docs.couchdb.org/en/stable/replication/protocol.html#upload-batch-of-changed-documents) in [Replicate Changes](https://docs.couchdb.org/en/stable/replication/protocol.html#replicate-changes), this request could become huge.
2. Otherwise, look for the nearest of the following, in order:
1. A newline
2. An empty line (Windows style)
3. An empty line (non-Windows style)
3. Compare the farthest of those three positions with the next position where a line starts with "#" after a newline, and take the shorter piece as the chunk.
Unfortunately, there is no way to adjust this size automatically per request.
Therefore, I added the ability to change these settings.
This rule was made empirically, since my real-world data is biased. If it behaves unexpectedly, please run `Dump informations of this doc` from the command palette and send me the information.
The algorithm still works even if every character except newlines and "#" is replaced with ●.
The defaults are 20 characters and 250 characters.
Note: if you set these values lower, the number of requests increases.
If you are far from the server, total throughput drops and traffic increases.
### Batch size
The number of change feed items processed at a time. The default is 250.
### Batch limit
The number of batches processed at a time. The default is 40.
## Miscellaneous
Other settings.
### Show status inside editor
Shows synchronization information inside the editor.
Useful on mobile.
### Check integrity on saving
Checks on save that all data has been saved completely.
## Hatch
Everything from here on is under the hood, for when you are in trouble. Use with care.
@@ -166,9 +188,12 @@ Self-hosted LiveSync reserves at least "minimum chunk size" characters per chunk
Once the lock has been released on all of your devices, you can unlock the database.
However, there is no problem in leaving it as it is.
### Reread all files
### Verify and repair all files
Rereads every file in the vault and, where there is a difference or the file could not be read correctly from the database, writes it to the database.
### Sanity check
Confirms that every file stored in the local database has all of its chunks.
### Drop history
Deletes the history recorded in the database and initializes it.
Use this when synchronizing to a new device or a new vault takes unusually long, or when the database has become bloated.

4054
main.ts

File diff suppressed because it is too large.

manifest.json

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.1.26",
"version": "0.4.0",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

4156
package-lock.json generated

File diff suppressed because it is too large.

package.json

@@ -1,11 +1,12 @@
{
"name": "obsidian-livesync",
"version": "0.1.26",
"version": "0.4.0",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"scripts": {
"dev": "rollup --config rollup.config.js -w",
"build": "rollup --config rollup.config.js --environment BUILD:production"
"build": "rollup --config rollup.config.js --environment BUILD:production",
"lint": "eslint src"
},
"keywords": [],
"author": "vorotamoroz",
@@ -16,7 +17,12 @@
"@rollup/plugin-typescript": "^8.2.1",
"@types/diff-match-patch": "^1.0.32",
"@types/pouchdb-browser": "^6.1.3",
"obsidian": "^0.12.0",
"@typescript-eslint/eslint-plugin": "^5.7.0",
"@typescript-eslint/parser": "^5.0.0",
"eslint": "^7.32.0",
"eslint-config-airbnb-base": "^14.2.1",
"eslint-plugin-import": "^2.25.2",
"obsidian": "^0.13.11",
"rollup": "^2.32.1",
"tslib": "^2.2.0",
"typescript": "^4.2.4"

rollup.config.js

@@ -11,7 +11,7 @@ if you want to view the source visit the plugins github repository
`;
export default {
input: "main.ts",
input: "./src/main.ts",
output: {
dir: ".",
sourcemap: "inline",

src/ConflictResolveModal.ts Normal file

@@ -0,0 +1,81 @@
import { App, Modal } from "obsidian";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
import { diff_result } from "./types";
import { escapeStringToHTML } from "./utils";
export class ConflictResolveModal extends Modal {
// result: Array<[number, string]>;
result: diff_result;
callback: (remove_rev: string) => Promise<void>;
constructor(app: App, diff: diff_result, callback: (remove_rev: string) => Promise<void>) {
super(app);
this.result = diff;
this.callback = callback;
}
onOpen() {
const { contentEl } = this;
contentEl.empty();
contentEl.createEl("h2", { text: "This document has conflicted changes." });
const div = contentEl.createDiv("");
div.addClass("op-scrollable");
let diff = "";
for (const v of this.result.diff) {
const x1 = v[0];
const x2 = v[1];
if (x1 == DIFF_DELETE) {
diff += "<span class='deleted'>" + escapeStringToHTML(x2) + "</span>";
} else if (x1 == DIFF_EQUAL) {
diff += "<span class='normal'>" + escapeStringToHTML(x2) + "</span>";
} else if (x1 == DIFF_INSERT) {
diff += "<span class='added'>" + escapeStringToHTML(x2) + "</span>";
}
}
diff = diff.replace(/\n/g, "<br>");
div.innerHTML = diff;
const div2 = contentEl.createDiv("");
const date1 = new Date(this.result.left.mtime).toLocaleString();
const date2 = new Date(this.result.right.mtime).toLocaleString();
div2.innerHTML = `
<span class='deleted'>A:${date1}</span><br /><span class='added'>B:${date2}</span><br>
`;
contentEl.createEl("button", { text: "Keep A" }, (e) => {
e.addEventListener("click", async () => {
await this.callback(this.result.right.rev);
this.callback = null;
this.close();
});
});
contentEl.createEl("button", { text: "Keep B" }, (e) => {
e.addEventListener("click", async () => {
await this.callback(this.result.left.rev);
this.callback = null;
this.close();
});
});
contentEl.createEl("button", { text: "Concat both" }, (e) => {
e.addEventListener("click", async () => {
await this.callback("");
this.callback = null;
this.close();
});
});
contentEl.createEl("button", { text: "Not now" }, (e) => {
e.addEventListener("click", () => {
this.close();
});
});
}
onClose() {
const { contentEl } = this;
contentEl.empty();
if (this.callback != null) {
this.callback(null);
}
}
}

1234
src/LocalPouchDB.ts Normal file

File diff suppressed because it is too large.

37
src/LogDisplayModal.ts Normal file

@@ -0,0 +1,37 @@
import { App, Modal } from "obsidian";
import { escapeStringToHTML } from "./utils";
import ObsidianLiveSyncPlugin from "./main";
export class LogDisplayModal extends Modal {
plugin: ObsidianLiveSyncPlugin;
logEl: HTMLDivElement;
constructor(app: App, plugin: ObsidianLiveSyncPlugin) {
super(app);
this.plugin = plugin;
}
updateLog() {
let msg = "";
for (const v of this.plugin.logMessage) {
msg += escapeStringToHTML(v) + "<br>";
}
this.logEl.innerHTML = msg;
}
onOpen() {
const { contentEl } = this;
contentEl.empty();
contentEl.createEl("h2", { text: "Sync Status" });
const div = contentEl.createDiv("");
div.addClass("op-scrollable");
div.addClass("op-pre");
this.logEl = div;
this.updateLog = this.updateLog.bind(this);
this.plugin.addLogHook = this.updateLog;
this.updateLog();
}
onClose() {
const { contentEl } = this;
contentEl.empty();
this.plugin.addLogHook = null;
}
}

File diff suppressed because it is too large.

168
src/e2ee.ts Normal file

@@ -0,0 +1,168 @@
import { Logger } from "./logger";
import { LOG_LEVEL } from "./types";
export type encodedData = [encryptedData: string, iv: string, salt: string];
export type KeyBuffer = {
index: string;
key: CryptoKey;
salt: Uint8Array;
};
const KeyBuffs: KeyBuffer[] = [];
const decKeyBuffs: KeyBuffer[] = [];
const KEY_RECYCLE_COUNT = 100;
let recycleCount = KEY_RECYCLE_COUNT;
let semiStaticFieldBuffer: Uint8Array = null;
const nonceBuffer: Uint32Array = new Uint32Array(1);
export async function getKeyForEncrypt(passphrase: string): Promise<[CryptoKey, Uint8Array]> {
// For performance, the plugin reuses the key KEY_RECYCLE_COUNT times.
const f = KeyBuffs.find((e) => e.index == passphrase);
if (f) {
recycleCount--;
if (recycleCount > 0) {
return [f.key, f.salt];
}
KeyBuffs.remove(f);
recycleCount = KEY_RECYCLE_COUNT;
}
const xpassphrase = new TextEncoder().encode(passphrase);
const digest = await crypto.subtle.digest({ name: "SHA-256" }, xpassphrase);
const keyMaterial = await crypto.subtle.importKey("raw", digest, { name: "PBKDF2" }, false, ["deriveKey"]);
const salt = crypto.getRandomValues(new Uint8Array(16));
const key = await crypto.subtle.deriveKey(
{
name: "PBKDF2",
salt,
iterations: 100000,
hash: "SHA-256",
},
keyMaterial,
{ name: "AES-GCM", length: 256 },
false,
["encrypt"]
);
KeyBuffs.push({
index: passphrase,
key,
salt,
});
while (KeyBuffs.length > 50) {
KeyBuffs.shift();
}
return [key, salt];
}
export async function getKeyForDecryption(passphrase: string, salt: Uint8Array): Promise<[CryptoKey, Uint8Array]> {
const bufKey = passphrase + uint8ArrayToHexString(salt);
const f = decKeyBuffs.find((e) => e.index == bufKey);
if (f) {
return [f.key, f.salt];
}
const xpassphrase = new TextEncoder().encode(passphrase);
const digest = await crypto.subtle.digest({ name: "SHA-256" }, xpassphrase);
const keyMaterial = await crypto.subtle.importKey("raw", digest, { name: "PBKDF2" }, false, ["deriveKey"]);
const key = await crypto.subtle.deriveKey(
{
name: "PBKDF2",
salt,
iterations: 100000,
hash: "SHA-256",
},
keyMaterial,
{ name: "AES-GCM", length: 256 },
false,
["decrypt"]
);
decKeyBuffs.push({
index: bufKey,
key,
salt,
});
while (decKeyBuffs.length > 50) {
decKeyBuffs.shift();
}
return [key, salt];
}
function getSemiStaticField(reset?: boolean) {
// Return the fixed field of the IV.
if (semiStaticFieldBuffer != null && !reset) {
return semiStaticFieldBuffer;
}
semiStaticFieldBuffer = crypto.getRandomValues(new Uint8Array(12));
return semiStaticFieldBuffer;
}
function getNonce() {
// This is a nonce; the same value must never be sent twice.
nonceBuffer[0]++;
if (nonceBuffer[0] > 10000) {
// reset semi-static field.
getSemiStaticField(true);
}
return nonceBuffer;
}
function uint8ArrayToHexString(src: Uint8Array): string {
return Array.from(src)
.map((e: number): string => `00${e.toString(16)}`.slice(-2))
.join("");
}
function hexStringToUint8Array(src: string): Uint8Array {
const srcArr = [...src];
const arr = srcArr.reduce((acc, _, i) => (i % 2 ? acc : [...acc, srcArr.slice(i, i + 2).join("")]), []).map((e) => parseInt(e, 16));
return Uint8Array.from(arr);
}
export async function encrypt(input: string, passphrase: string) {
const [key, salt] = await getKeyForEncrypt(passphrase);
// Create the initialization vector from a semi-fixed part and an incremental part.
// I think it's not robust against related-key attacks.
const fixedPart = getSemiStaticField();
const invocationPart = getNonce();
const iv = Uint8Array.from([...fixedPart, ...new Uint8Array(invocationPart.buffer)]);
const plainStringified: string = JSON.stringify(input);
const plainStringBuffer: Uint8Array = new TextEncoder().encode(plainStringified);
const encryptedDataArrayBuffer = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plainStringBuffer);
const encryptedData = window.btoa(Array.from(new Uint8Array(encryptedDataArrayBuffer), (char) => String.fromCharCode(char)).join(""));
// Return the data together with the IV and salt.
const response: encodedData = [encryptedData, uint8ArrayToHexString(iv), uint8ArrayToHexString(salt)];
const ret = JSON.stringify(response);
return ret;
}
export async function decrypt(encryptedResult: string, passphrase: string): Promise<string> {
try {
const [encryptedData, ivString, salt]: encodedData = JSON.parse(encryptedResult);
const [key] = await getKeyForDecryption(passphrase, hexStringToUint8Array(salt));
const iv = hexStringToUint8Array(ivString);
// Decode base64; this should be fast, and the data stays within MAX_DOC_SIZE_BIN, so it won't OOM.
const encryptedDataBin = window.atob(encryptedData);
const encryptedDataArrayBuffer = Uint8Array.from(encryptedDataBin.split(""), (char) => char.charCodeAt(0));
const plainStringBuffer: ArrayBuffer = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, encryptedDataArrayBuffer);
const plainStringified = new TextDecoder().decode(plainStringBuffer);
const plain = JSON.parse(plainStringified);
return plain;
} catch (ex) {
Logger("Couldn't decode! You should wrong the passphrases", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
throw ex;
}
}
export async function testCrypt() {
const src = "supercalifragilisticexpialidocious";
const encoded = await encrypt(src, "passwordTest");
const decrypted = await decrypt(encoded, "passwordTest");
if (src != decrypted) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL.VERBOSE);
return false;
} else {
Logger("CRYPT LOGIC OK", LOG_LEVEL.VERBOSE);
return true;
}
}
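A minimal usage sketch for this module, assuming a browser-like environment where WebCrypto is available:

// Sketch: round-trip a string through encrypt()/decrypt() with the same passphrase.
import { encrypt, decrypt } from "./e2ee";

async function demo() {
    const payload = await encrypt("hello vault", "my-passphrase");
    // `payload` is a JSON string of [encryptedData, iv, salt].
    const plain = await decrypt(payload, "my-passphrase");
    console.log(plain); // "hello vault"
}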

13
src/logger.ts Normal file

@@ -0,0 +1,13 @@
import { LOG_LEVEL } from "./types";
// eslint-disable-next-line require-await
export let Logger: (message: any, level?: LOG_LEVEL) => Promise<void> = async (message, _) => {
const timestamp = new Date().toLocaleString();
const messagecontent = typeof message == "string" ? message : message instanceof Error ? `${message.name}:${message.message}` : JSON.stringify(message, null, 2);
const newmessage = timestamp + "->" + messagecontent;
console.log(newmessage);
};
export function setLogger(loggerFun: (message: any, level?: LOG_LEVEL) => Promise<void>) {
Logger = loggerFun;
}

1412
src/main.ts Normal file

File diff suppressed because it is too large.

230
src/types.ts Normal file

@@ -0,0 +1,230 @@
// Docs should be encoded as base64, so 1 char -> 1 byte.
// Cloudant's limitation is 1 MB, so we use 900 KB.
import { PluginManifest } from "obsidian";
export const MAX_DOC_SIZE = 1000; // for .md files; if delimiters exist, split there first.
export const MAX_DOC_SIZE_BIN = 102400; // 100kb
export const VER = 10;
export const RECENT_MOFIDIED_DOCS_QTY = 30;
export const LEAF_WAIT_TIMEOUT = 90000; // timeout (ms) for waiting for missing leaves during synchronization.
export const LOG_LEVEL = {
VERBOSE: 1,
INFO: 10,
NOTICE: 100,
URGENT: 1000,
} as const;
export type LOG_LEVEL = typeof LOG_LEVEL[keyof typeof LOG_LEVEL];
export const VERSIONINFO_DOCID = "obsydian_livesync_version";
export const MILSTONE_DOCID = "_local/obsydian_livesync_milestone";
export const NODEINFO_DOCID = "_local/obsydian_livesync_nodeinfo";
export interface ObsidianLiveSyncSettings {
couchDB_URI: string;
couchDB_USER: string;
couchDB_PASSWORD: string;
couchDB_DBNAME: string;
liveSync: boolean;
syncOnSave: boolean;
syncOnStart: boolean;
syncOnFileOpen: boolean;
savingDelay: number;
lessInformationInLog: boolean;
gcDelay: number;
versionUpFlash: string;
minimumChunkSize: number;
longLineThreshold: number;
showVerboseLog: boolean;
suspendFileWatching: boolean;
trashInsteadDelete: boolean;
periodicReplication: boolean;
periodicReplicationInterval: number;
encrypt: boolean;
passphrase: string;
workingEncrypt: boolean;
workingPassphrase: string;
doNotDeleteFolder: boolean;
resolveConflictsByNewerFile: boolean;
batchSave: boolean;
deviceAndVaultName: string;
usePluginSettings: boolean;
showOwnPlugins: boolean;
showStatusOnEditor: boolean;
usePluginSync: boolean;
autoSweepPlugins: boolean;
autoSweepPluginsPeriodic: boolean;
notifyPluginOrSettingUpdated: boolean;
checkIntegrityOnSave: boolean;
batch_size: number;
batches_limit: number;
}
export const DEFAULT_SETTINGS: ObsidianLiveSyncSettings = {
couchDB_URI: "",
couchDB_USER: "",
couchDB_PASSWORD: "",
couchDB_DBNAME: "",
liveSync: false,
syncOnSave: false,
syncOnStart: false,
savingDelay: 200,
lessInformationInLog: false,
gcDelay: 300,
versionUpFlash: "",
minimumChunkSize: 20,
longLineThreshold: 250,
showVerboseLog: false,
suspendFileWatching: false,
trashInsteadDelete: true,
periodicReplication: false,
periodicReplicationInterval: 60,
syncOnFileOpen: false,
encrypt: false,
passphrase: "",
workingEncrypt: false,
workingPassphrase: "",
doNotDeleteFolder: false,
resolveConflictsByNewerFile: false,
batchSave: false,
deviceAndVaultName: "",
usePluginSettings: false,
showOwnPlugins: false,
showStatusOnEditor: false,
usePluginSync: false,
autoSweepPlugins: false,
autoSweepPluginsPeriodic: false,
notifyPluginOrSettingUpdated: false,
checkIntegrityOnSave: false,
batch_size: 250,
batches_limit: 40,
};
export const PERIODIC_PLUGIN_SWEEP = 60;
export interface Entry {
_id: string;
data: string;
_rev?: string;
ctime: number;
mtime: number;
size: number;
_deleted?: boolean;
_conflicts?: string[];
type?: "notes";
}
export interface NewEntry {
_id: string;
children: string[];
_rev?: string;
ctime: number;
mtime: number;
size: number;
_deleted?: boolean;
_conflicts?: string[];
NewNote: true;
type: "newnote";
}
export interface PlainEntry {
_id: string;
children: string[];
_rev?: string;
ctime: number;
mtime: number;
size: number;
_deleted?: boolean;
NewNote: true;
_conflicts?: string[];
type: "plain";
}
export type LoadedEntry = Entry & {
children: string[];
datatype: "plain" | "newnote";
};
export interface PluginDataEntry {
_id: string;
deviceVaultName: string;
mtime: number;
manifest: PluginManifest;
mainJs: string;
manifestJson: string;
styleCss?: string;
// it must be encrypted.
dataJson?: string;
_rev?: string;
_deleted?: boolean;
_conflicts?: string[];
type: "plugin";
}
export interface EntryLeaf {
_id: string;
data: string;
_deleted?: boolean;
type: "leaf";
_rev?: string;
}
export interface EntryVersionInfo {
_id: typeof VERSIONINFO_DOCID;
_rev?: string;
type: "versioninfo";
version: number;
_deleted?: boolean;
}
export interface EntryMilestoneInfo {
_id: typeof MILSTONE_DOCID;
_rev?: string;
type: "milestoneinfo";
_deleted?: boolean;
created: number;
accepted_nodes: string[];
locked: boolean;
}
export interface EntryNodeInfo {
_id: typeof NODEINFO_DOCID;
_rev?: string;
_deleted?: boolean;
type: "nodeinfo";
nodeid: string;
}
export type EntryBody = Entry | NewEntry | PlainEntry;
export type EntryDoc = EntryBody | LoadedEntry | EntryLeaf | EntryVersionInfo | EntryMilestoneInfo | EntryNodeInfo;
export type diff_result_leaf = {
rev: string;
data: string;
ctime: number;
mtime: number;
};
export type dmp_result = Array<[number, string]>;
export type diff_result = {
left: diff_result_leaf;
right: diff_result_leaf;
diff: dmp_result;
};
export type diff_check_result = boolean | diff_result;
export type Credential = {
username: string;
password: string;
};
export type EntryDocResponse = EntryDoc & PouchDB.Core.IdMeta & PouchDB.Core.GetMeta;
export type DatabaseConnectingStatus = "STARTED" | "NOT_CONNECTED" | "PAUSED" | "CONNECTED" | "COMPLETED" | "CLOSED" | "ERRORED";
export interface PluginList {
[key: string]: PluginDataEntry[];
}
export interface DevicePluginList {
[key: string]: PluginDataEntry;
}
export const FLAGMD_REDFLAG = "redflag.md";
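To illustrate the data model above, a sketch of a chunked note: a NewEntry whose children reference EntryLeaf documents. All IDs and values here are made up:

// Hypothetical example of the chunked-note shape defined above.
const exampleLeaf: EntryLeaf = { _id: "h:abc123", data: "# My note\n...", type: "leaf" };
const exampleNote: NewEntry = {
    _id: "notes/mynote.md",
    children: ["h:abc123"], // ordered chunk IDs; concatenating their data reconstructs the note
    ctime: 1640000000000,
    mtime: 1640000000000,
    size: 13,
    NewNote: true,
    type: "newnote",
};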

209
src/utils.ts Normal file

@@ -0,0 +1,209 @@
import { normalizePath } from "obsidian";
import { Logger } from "./logger";
import { FLAGMD_REDFLAG, LOG_LEVEL } from "./types";
export function arrayBufferToBase64(buffer: ArrayBuffer): Promise<string> {
return new Promise((res) => {
const blob = new Blob([buffer], { type: "application/octet-binary" });
const reader = new FileReader();
reader.onload = function (evt) {
const dataurl = evt.target.result.toString();
res(dataurl.substr(dataurl.indexOf(",") + 1));
};
reader.readAsDataURL(blob);
});
}
export function base64ToString(base64: string): string {
try {
const binary_string = window.atob(base64);
const len = binary_string.length;
const bytes = new Uint8Array(len);
for (let i = 0; i < len; i++) {
bytes[i] = binary_string.charCodeAt(i);
}
return new TextDecoder().decode(bytes);
} catch (ex) {
return base64;
}
}
export function base64ToArrayBuffer(base64: string): ArrayBuffer {
try {
const binary_string = window.atob(base64);
const len = binary_string.length;
const bytes = new Uint8Array(len);
for (let i = 0; i < len; i++) {
bytes[i] = binary_string.charCodeAt(i);
}
return bytes.buffer;
} catch (ex) {
try {
return new Uint16Array(
[].map.call(base64, function (c: string) {
return c.charCodeAt(0);
})
).buffer;
} catch (ex2) {
return null;
}
}
}
export const escapeStringToHTML = (str: string) => {
if (!str) return "";
return str.replace(/[<>&"'`]/g, (match) => {
const escape: any = {
"<": "&lt;",
">": "&gt;",
"&": "&amp;",
'"': "&quot;",
"'": "&#39;",
"`": "&#x60;",
};
return escape[match];
});
};
export function resolveWithIgnoreKnownError<T>(p: Promise<T>, def: T): Promise<T> {
return new Promise((res, rej) => {
p.then(res).catch((ex) => (ex.status && ex.status == 404 ? res(def) : rej(ex)));
});
}
export function isValidPath(filename: string): boolean {
// eslint-disable-next-line no-control-regex
const regex = /[\u0000-\u001f]|[\\":?<>|*#]/g;
let x = filename.replace(regex, "_");
const win = /(\\|\/)(COM\d|LPT\d|CON|PRN|AUX|NUL|CLOCK$)($|\.)/gi;
const sx = (x = x.replace(win, "/_"));
return sx == filename;
}
export function shouldBeIgnored(filename: string): boolean {
if (filename == FLAGMD_REDFLAG) {
return true;
}
return false;
}
export function versionNumberString2Number(version: string): number {
return version // "1.23.45"
.split(".") // 1 23 45
.reverse() // 45 23 1
.map((e, i) => ((e as any) / 1) * 1000 ** i) // 45 23000 1000000
.reduce((prev, current) => prev + current, 0); // 1023045
}
export const delay = (ms: number): Promise<void> => {
return new Promise((res) => {
setTimeout(() => {
res();
}, ms);
});
};
// For backward compatibility, the path is used to determine the ID.
// Only IDs that CouchDB cannot accept (those starting with an underscore) are prefixed with "/".
// The first slash is removed when the path is normalized.
export function path2id(filename: string): string {
let x = normalizePath(filename);
if (x.startsWith("_")) x = "/" + x;
return x;
}
export function id2path(filename: string): string {
return normalizePath(filename);
}
const runningProcs: string[] = [];
const pendingProcs: { [key: string]: (() => Promise<void>)[] } = {};
function objectToKey(key: any): string {
if (typeof key === "string") return key;
const keys = Object.keys(key).sort((a, b) => a.localeCompare(b));
return keys.map((e) => e + objectToKey(key[e])).join(":");
}
// Run async procedures serially per key, like transaction ISOLATION SERIALIZABLE.
export function runWithLock<T>(key: unknown, ignoreWhenRunning: boolean, proc: () => Promise<T>): Promise<T> {
// Logger(`Lock:${key}:enter`, LOG_LEVEL.VERBOSE);
const lockKey = typeof key === "string" ? key : objectToKey(key);
const handleNextProcs = () => {
if (typeof pendingProcs[lockKey] === "undefined") {
//simply unlock
runningProcs.remove(lockKey);
// Logger(`Lock:${lockKey}:released`, LOG_LEVEL.VERBOSE);
} else {
Logger(`Lock:${lockKey}:left ${pendingProcs[lockKey].length}`, LOG_LEVEL.VERBOSE);
let nextProc = null;
nextProc = pendingProcs[lockKey].shift();
if (nextProc) {
// left some
nextProc()
.then()
.catch((err) => {
Logger(err);
})
.finally(() => {
if (pendingProcs && lockKey in pendingProcs && pendingProcs[lockKey].length == 0) {
delete pendingProcs[lockKey];
}
queueMicrotask(() => {
handleNextProcs();
});
});
} else {
if (pendingProcs && lockKey in pendingProcs && pendingProcs[lockKey].length == 0) {
delete pendingProcs[lockKey];
}
}
}
};
if (runningProcs.contains(lockKey)) {
if (ignoreWhenRunning) {
return null;
}
if (typeof pendingProcs[lockKey] === "undefined") {
pendingProcs[lockKey] = [];
}
let responderRes: (value: T | PromiseLike<T>) => void;
let responderRej: (reason?: unknown) => void;
const responder = new Promise<T>((res, rej) => {
responderRes = res;
responderRej = rej;
//wait for subproc resolved
});
const subproc = () =>
new Promise<void>((res, rej) => {
proc()
.then((v) => {
// Logger(`Lock:${key}:processed`, LOG_LEVEL.VERBOSE);
handleNextProcs();
responderRes(v);
res();
})
.catch((reason) => {
Logger(`Lock:${key}:rejected`, LOG_LEVEL.VERBOSE);
handleNextProcs();
rej(reason);
responderRej(reason);
});
});
pendingProcs[lockKey].push(subproc);
// Logger(`Lock:${lockKey}:queued:left ${pendingProcs[lockKey].length}`, LOG_LEVEL.VERBOSE);
return responder;
} else {
runningProcs.push(lockKey);
// Logger(`Lock:${lockKey}:acquired`, LOG_LEVEL.VERBOSE);
return new Promise((res, rej) => {
proc()
.then((v) => {
handleNextProcs();
res(v);
})
.catch((reason) => {
handleNextProcs();
rej(reason);
});
});
}
}
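A usage sketch for runWithLock: calls sharing a key run strictly one after another, and passing true as ignoreWhenRunning makes a caller skip instead of queueing; writeToDatabase is a hypothetical persistence call:

// Sketch: serialize writes to the same note with runWithLock.
import { runWithLock } from "./utils";

declare function writeToDatabase(id: string, body: string): Promise<void>; // hypothetical

async function saveNote(id: string, body: string): Promise<boolean> {
    return runWithLock(`note:${id}`, false, async () => {
        // Only one critical section runs at a time for this key.
        await writeToDatabase(id, body);
        return true;
    });
}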

70
src/utils_couchdb.ts Normal file

@@ -0,0 +1,70 @@
import { Logger } from "./logger";
import { LOG_LEVEL, VER, VERSIONINFO_DOCID, EntryVersionInfo, EntryDoc } from "./types";
import { resolveWithIgnoreKnownError } from "./utils";
import { PouchDB } from "../pouchdb-browser-webpack/dist/pouchdb-browser.js";
export const isValidRemoteCouchDBURI = (uri: string): boolean => {
if (uri.startsWith("https://")) return true;
if (uri.startsWith("http://")) return true;
return false;
};
export const connectRemoteCouchDB = async (uri: string, auth: { username: string; password: string }): Promise<string | { db: PouchDB.Database<EntryDoc>; info: PouchDB.Core.DatabaseInfo }> => {
if (!isValidRemoteCouchDBURI(uri)) return "Remote URI is not valid";
const db: PouchDB.Database<EntryDoc> = new PouchDB<EntryDoc>(uri, {
auth,
});
try {
const info = await db.info();
return { db: db, info: info };
} catch (ex) {
let msg = `${ex.name}:${ex.message}`;
if (ex.name == "TypeError" && ex.message == "Failed to fetch") {
msg += "\n**Note** This error can have many causes. The only sure thing is that the server was never actually reached.\nFor details, open the inspector.";
}
Logger(ex, LOG_LEVEL.VERBOSE);
return msg;
}
};
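A usage sketch showing the string-or-object return convention of connectRemoteCouchDB; the URL and credentials are placeholders:

// Sketch: a failed connection yields an error-message string, a successful one yields { db, info }.
async function tryConnect() {
    const result = await connectRemoteCouchDB("https://couch.example.com/vault", { username: "user", password: "pass" });
    if (typeof result === "string") {
        Logger(`Connection failed: ${result}`, LOG_LEVEL.NOTICE);
    } else {
        Logger(`Connected to ${result.info.db_name}`, LOG_LEVEL.INFO);
    }
}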
// Check the version of the remote database.
// If the remote version is higher than the current (or specified) version, return false.
export const checkRemoteVersion = async (db: PouchDB.Database, migrate: (from: number, to: number) => Promise<boolean>, barrier: number = VER): Promise<boolean> => {
try {
const versionInfo = (await db.get(VERSIONINFO_DOCID)) as EntryVersionInfo;
if (versionInfo.type != "versioninfo") {
return false;
}
const version = versionInfo.version;
if (version < barrier) {
const versionUpResult = await migrate(version, barrier);
if (versionUpResult) {
await bumpRemoteVersion(db);
return true;
}
}
if (version == barrier) return true;
return false;
} catch (ex) {
if (ex.status && ex.status == 404) {
if (await bumpRemoteVersion(db)) {
return true;
}
return false;
}
throw ex;
}
};
export const bumpRemoteVersion = async (db: PouchDB.Database, barrier: number = VER): Promise<boolean> => {
const vi: EntryVersionInfo = {
_id: VERSIONINFO_DOCID,
version: barrier,
type: "versioninfo",
};
const versionInfo = (await resolveWithIgnoreKnownError(db.get(VERSIONINFO_DOCID), vi)) as EntryVersionInfo;
if (versionInfo.type != "versioninfo") {
return false;
}
vi._rev = versionInfo._rev;
await db.put(vi);
return true;
};

styles.css

@@ -136,3 +136,7 @@ div.sls-setting-menu-btn {
flex-grow: 0;
padding: 6px 10px;
}
.sls-plugins-tbl-device-head {
background-color: var(--background-secondary-alt);
color: var(--text-accent);
}

tsconfig.json

@@ -9,9 +9,13 @@
"noImplicitAny": true,
"moduleResolution": "node",
"importHelpers": true,
"lib": ["dom", "es5", "scripthost", "es2015"]
"noImplicitReturns": true,
"noImplicitThis": true,
"strictFunctionTypes": true,
"alwaysStrict": true,
"lib": ["dom", "es5", "ES6", "ES7", "es2020"]
},
"include": ["**/*.ts"],
"files": ["./main.ts"],
"include": ["./src/*.ts"],
// "files": ["./src/main.ts"],
"exclude": ["pouchdb-browser-webpack"]
}