Compare commits
39 Commits
| SHA1 |
|---|
| e76e7ae8ea |
| f7fbe85d65 |
| 0313443b29 |
| 755c30f468 |
| b00b0cc5e5 |
| d7985a6b41 |
| 486e816902 |
| ef9b19c24b |
| 4ed9494176 |
| fcd56d59d5 |
| 1cabfcfd19 |
| 37a18dbfef |
| e7edf88713 |
| 90ff75ab35 |
| bff1d661f5 |
| 6b59c14774 |
| 8249274eac |
| 3c6dae7814 |
| 60cf8fe640 |
| 3d89b3863f |
| ee9364310d |
| 86b9695bc2 |
| e05f8771b9 |
| 65619c2478 |
| 1552fa9d9e |
| 767f12b52f |
| 4071ba120e |
| 2c0e3ba01c |
| 90adf06830 |
| cf8e7ff6ca |
| 95c3ff5043 |
| 7ea3515801 |
| f866981a8a |
| 8f36d6f893 |
| 6dd86e9392 |
| d22716bef0 |
| 5d9baec5e4 |
| 27d71ca2fb |
| c024ed13d3 |
README.md (94)
@@ -1,62 +1,59 @@
|
||||
<!-- For translation: 20240227r0 -->
|
||||
# Self-hosted LiveSync
|
||||
|
||||
[Japanese docs](./README_ja.md) - [Chinese docs](./README_cn.md).
|
||||
|
||||
Self-hosted LiveSync is a community-implemented synchronization plugin.
|
||||
A self-hosted or purchased CouchDB acts as the intermediate server. Available on every obsidian-compatible platform.
|
||||
|
||||
Note: It has no compatibility with the official "Obsidian Sync".
|
||||
Self-hosted LiveSync is a community-implemented synchronization plugin, available on every obsidian-compatible platform and using CouchDB as the server.
|
||||
|
||||

|
||||
|
||||
Before installing or upgrading LiveSync, please back your vault up.
|
||||
Note: This plugin cannot synchronise with the official "Obsidian Sync".
|
||||
|
||||
## Features
|
||||
|
||||
- Visual conflict resolver included.
|
||||
- Bidirectional synchronization between devices nearly in real-time
|
||||
- You can use CouchDB or its compatibles like IBM Cloudant.
|
||||
- End-to-End encryption is supported.
|
||||
- Plugin synchronization(Beta)
|
||||
- Receive WebClip from [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf) (End-to-End encryption will not be applicable.)
|
||||
- Synchronize vaults very efficiently with less traffic.
|
||||
- Handles conflicting modifications well.
|
||||
- Automatic merging for simple conflicts.
|
||||
- Using OSS solution for the server.
|
||||
- Compatible solutions can be used.
|
||||
- Supporting End-to-end encryption.
|
||||
- Synchronisation of settings, snippets, themes, and plug-ins, via [Customization sync(Beta)](#customization-sync) or [Hidden File Sync](#hiddenfilesync)
|
||||
- WebClip from [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)
|
||||
|
||||
Useful for researchers, engineers and developers with a need to keep their notes fully self-hosted for security reasons. Or just anyone who would like the peace of mind of knowing that their notes are fully private.
|
||||
This plug-in might be useful for researchers, engineers, and developers with a need to keep their notes fully self-hosted for security reasons. Or just anyone who would like the peace of mind of knowing that their notes are fully private.
|
||||
|
||||
## IMPORTANT NOTICE
|
||||
|
||||
- Do not enable this plugin with another synchronization solution at the same time (including iCloud and Obsidian Sync). Before enabling this plugin, make sure to disable all the other synchronization methods to avoid content corruption or duplication. If you want to synchronize to two or more services, do them one by one and never enable two synchronization methods at the same time.
|
||||
This includes not putting your vault inside a cloud-synchronized folder (eg. an iCloud folder or Dropbox folder)
|
||||
- This is a synchronization plugin. Not a backup solution. Do not rely on this for backup.
|
||||
- If the device's storage runs out, database corruption may happen.
|
||||
- Hidden files and any other invisible files will not be kept in the database, and thus will not be synchronized (**and may also get deleted**).
|
||||
>[!IMPORTANT]
|
||||
> - Before installing or upgrading this plug-in, please back your vault up.
|
||||
> - Do not enable this plugin with another synchronization solution at the same time (including iCloud and Obsidian Sync).
|
||||
> - This is a synchronization plugin. Not a backup solution. Do not rely on this for backup.
|
||||
|
||||
## How to use
|
||||
|
||||
### Get your database ready.
|
||||
### 3-minute setup - CouchDB on fly.io
|
||||
|
||||
First, get your database ready. fly.io is preferred for testing, or you can use your own server with CouchDB. For more information, refer to the following:
|
||||
1. [Setup fly.io](docs/setup_flyio.md)
|
||||
2. [Setup IBM Cloudant](docs/setup_cloudant.md)
|
||||
3. [Setup your CouchDB](docs/setup_own_server.md)
|
||||
**Recommended for beginners**
|
||||
|
||||
### Configure the plugin
|
||||
[](https://www.youtube.com/watch?v=7sa_I1832Xc)
|
||||
|
||||
See the [Quick setup guide](docs/quick_setup.md)
|
||||
1. [Setup CouchDB on fly.io](docs/setup_flyio.md)
|
||||
2. Configure plug-in in [Quick Setup](docs/quick_setup.md)
|
||||
|
||||
## Something looks corrupted...
|
||||
### Manually Setup
|
||||
|
||||
Please open the configuration link again and answer as below:
- If your local database looks corrupted (in other words, when Obsidian behaves oddly even when used standalone)
|
||||
- Answer `No` to `Keep local DB?`
|
||||
- If your remote database looks corrupted (in other words, when something happens while replicating)
|
||||
- Answer `No` to `Keep remote DB?`
|
||||
1. Setup the server
|
||||
1. [Setup CouchDB on fly.io](docs/setup_flyio.md)
|
||||
2. [Setup your CouchDB](docs/setup_own_server.md)
|
||||
2. Configure plug-in in [Quick Setup](docs/quick_setup.md)
|
||||
|
||||
> [!TIP]
|
||||
> IBM Cloudant can still be used. However, it is no longer recommended nowadays for several reasons. Here is [Setup IBM Cloudant](docs/setup_cloudant.md).
|
||||
|
||||
If you answered `No` to both, your databases will be rebuilt from the content on your device, and the remote database will lock out the other devices. You will have to synchronize all your devices again. (At this point, almost all of your files should be synchronized with their timestamps, so you can keep using an existing vault.)
|
||||
|
||||
## Information in StatusBar
|
||||
|
||||
Synchronization status is shown in the status bar.
|
||||
Synchronization status is shown in the status bar with the following icons.
|
||||
|
||||
- Activity Indicator
|
||||
- 📲 Network request
|
||||
- Status
|
||||
- ⏹️ Stopped
|
||||
- 💤 LiveSync enabled. Waiting for changes
|
||||
@@ -73,32 +70,15 @@ Synchronization status is shown in statusbar.
|
||||
- 🛫 Pending read storage processes
|
||||
- ⚙️ Working or pending storage processes of hidden files
|
||||
- 🧩 Waiting chunks
|
||||
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
|
||||
- 🔌 Working Customisation items (Configuration, snippets, and plug-ins)
|
||||
|
||||
To prevent file and database corruption, please wait until all progress indicators have disappeared, especially if you have deleted or renamed files.
To prevent file and database corruption, please wait until all progress indicators have disappeared before closing Obsidian, as far as possible (the plugin will also try to resume, though). This is especially important if you have deleted or renamed files.
|
||||
|
||||
|
||||
## Hints
|
||||
- If a folder becomes empty after a replication, it will be deleted by default. But you can toggle this behaviour. Check the [Settings](docs/settings.md).
|
||||
- LiveSync mode drains the battery faster on mobile devices. Periodic sync combined with some automatic sync triggers is recommended.
|
||||
- Mobile Obsidian can not connect to non-secure (HTTP) or locally-signed servers, even if the root certificate is installed on the device.
|
||||
- There is no 'exclude_folders'-like configuration.
|
||||
- While synchronizing, files are compared by their modification time and the older ones will be overwritten by the newer ones. The plugin then checks for conflicts and, if a merge is needed, a dialog will open.
|
||||
- Rarely, a file in the database could be corrupted. The plugin will not write to local storage when a file looks corrupted. If a local version of the file is on your device, the corruption could be fixed by editing the local file and synchronizing it. But if the file does not exist on any of your devices, then it can not be rescued. In this case, you can delete these items from the settings dialog.
|
||||
- To stop the boot-up sequence (e.g. for fixing problems on databases), you can put a `redflag.md` file (or directory) at the root of your vault.
  Tip for iOS: a redflag directory can be created at the root of the vault using the Files app.
- Also, with `redflag2.md` placed, the plugin will automatically rebuild both the local and the remote databases during the boot-up sequence. With `redflag3.md`, it will discard only the local database and fetch from the remote again. (See the sketch after this list.)
|
||||
- Q: The database is growing, how can I shrink it down?
|
||||
A: Each document is saved along with its past 100 revisions for detecting and resolving conflicts. Imagine that one device has been offline for a while and then comes online again. The device has to compare its notes with the remotely saved ones. If there is a historic revision in which the note used to be identical, it can be updated safely (like a git fast-forward). Even if there is not, we only have to check the differences made after the most recent revision that both devices have in common. This is like git's conflict-resolution method. So, to solve the root of the problem, we have to rebuild the database, just as we would with an enlarged git repository.
|
||||
- And more technical Information is in the [Technical Information](docs/tech_info.md)
|
||||
- If you want to synchronize files without obsidian, you can use [filesystem-livesync](https://github.com/vrtmrz/filesystem-livesync).
|
||||
- WebClipper is also available on Chrome Web Store:[obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)
|
||||
|
||||
Repo is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). (Docs are a work in progress.)
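As a rough sketch of the redflag trick mentioned in the hints above (the vault path is a placeholder, not from the original docs; the file names are the ones the hints describe):

```bash
# Sketch only: create a flag file in the vault root before launching Obsidian.
# /path/to/your/vault is an assumed placeholder for your actual vault location.
cd /path/to/your/vault
touch redflag.md     # skip the boot-up sequence (e.g. while fixing database problems)
# touch redflag2.md  # rebuild both the local and the remote databases at boot
# touch redflag3.md  # discard only the local database and fetch from the remote again
```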
|
||||
|
||||
## Troubleshooting
|
||||
If you are having problems getting the plugin working see: [Troubleshooting](docs/troubleshooting.md)
|
||||
## Tips and Troubleshooting
|
||||
If you are having problems getting the plugin working see: [Tips and Troubleshooting](docs/troubleshooting.md)
|
||||
|
||||
## License
|
||||
|
||||
The source code is licensed under the MIT License.
|
||||
Licensed under the MIT License.
|
||||
|
||||
README_ja.md (153)
@@ -1,84 +1,85 @@
|
||||
<!-- For translation: 20240227r0 -->
|
||||
# Self-hosted LiveSync
|
||||
[英語版ドキュメント](./README.md) - [中国語版ドキュメント](./README_cn.md).
|
||||
|
||||
**旧): obsidian-livesync**
|
||||
Obsidianで利用可能なすべてのプラットフォームで使える、CouchDBをサーバに使用する、コミュニティ版の同期プラグイン
|
||||
|
||||
セルフホストしたデータベースを使って、双方向のライブシンクするObsidianのプラグイン。
|
||||
**公式のSyncとは互換性はありません**
|
||||

|
||||
|
||||
**インストールする前に、Vaultのバックアップを確実に取得してください**
|
||||
|
||||
[英語版](./README.md)
|
||||
|
||||
## こんなことができるプラグインです。
|
||||
- Windows, Mac, iPad, iPhone, Android, Chromebookで動く
|
||||
- セルフホストしたデータベースに同期して
|
||||
- 複数端末で同時にその変更をほぼリアルタイムで配信し
|
||||
- さらに、他の端末での変更も別の端末に配信する、双方向リアルタイムなLiveSyncを実現でき、
|
||||
- 発生した変更の衝突はその場で解決できます。
|
||||
- 同期先のホストにはCouchDBまたはその互換DBaaSのIBM Cloudantをサーバーに使用できます。あなたのデータは、あなたのものです。
|
||||
- もちろんLiveではない同期もできます。
|
||||
- 万が一のために、サーバーに送る内容を暗号化できます(betaです)。
|
||||
- [Webクリッパー](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf) もあります(End-to-End暗号化対象外です)
|
||||
|
||||
NDAや類似の契約や義務、倫理を守る必要のある、研究者、設計者、開発者のような方に特にオススメです。
|
||||
特にエンタープライズでは、たとえEnd to Endの暗号化が行われていても、管理下にあるサーバーにのみデータを格納することが求められる場合があります。
|
||||
|
||||
# 重要なお知らせ
|
||||
|
||||
- ❌ファイルの重複や破損を避けるため、複数の同期手段を同時に使用しないでください。
|
||||
これは、Vaultをクラウド管理下のフォルダに置くことも含みます。(例えば、iCloudの管理フォルダ内に入れたり)。
|
||||
- ⚠️このプラグインは、端末間でのノートの反映を目的として作成されました。バックアップ等が目的ではありません。そのため、バックアップは必ず別のソリューションで行うようにしてください。
|
||||
- ストレージの空き容量が枯渇した場合、データベースが破損することがあります。
|
||||
|
||||
# このプラグインの使い方
|
||||
|
||||
1. Community Pluginsから、Self-hosted LiveSyncと検索しインストールするか、このリポジトリのReleasesから`main.js`, `manifest.json`, `style.css` をダウンロードしvaultの中の`.obsidian/plugins/obsidian-livesync`に入れて、Obsidianを再起動してください。
|
||||
2. サーバーをセットアップします。IBM Cloudantがお手軽かつ堅牢で便利です。完全にセルフホストする際にはお持ちのサーバーにCouchDBをインストールする必要があります。詳しくは下記を参照してください
|
||||
1. [IBM Cloudantのセットアップ](docs/setup_cloudant_ja.md)
|
||||
2. [独自のCouchDBのセットアップ](docs/setup_own_server_ja.md)
|
||||
|
||||
備考: IBM Cloudantのアカウント登録が出来ないケースがあるようです。代替を探していて、今 [using fly.io](https://github.com/vrtmrz/obsidian-livesync/discussions/85)を検討しています。
|
||||
|
||||
1. [Quick setup](docs/quick_setup_ja.md)から、セットアップウィザード使ってセットアップしてください。
|
||||
|
||||
# テストサーバー
|
||||
|
||||
もし、CouchDBをインストールしたり、Cloudantのインスタンスをセットアップしたりするのに気が引ける場合、[Self-hosted LiveSyncのテストサーバー](https://olstaste.vrtmrz.net/)を作りましたので、使ってみてください。
|
||||
|
||||
備考: 制限事項をよく確認して使用してください。くれぐれも、本当に使用している自分のVaultを同期しないようにしてください。
|
||||
|
||||
# WebClipperあります
|
||||
Self-hosted LiveSync用にWebClipperも作りました。Chrome Web Storeからダウンロードできます。
|
||||
|
||||
[obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)
|
||||
|
||||
リポジトリはこちらです: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip)。
|
||||
|
||||
相変わらずドキュメントは間に合っていません。
|
||||
|
||||
# ステータスバーの情報
|
||||
右下のステータスバーに、同期の状態が表示されます
|
||||
|
||||
- 同期状態
|
||||
- ⏹️ 同期は停止しています
|
||||
- 💤 同期はLiveSync中で、なにか起こるのを待っています
|
||||
- ⚡️ 同期中です
|
||||
- ⚠ エラーが発生しています
|
||||
- ↑ 送信したデータ数
|
||||
- ↓ 受信したデータ数
|
||||
- ⏳ 保留している処理の数です
|
||||
ファイルを削除したりリネームした場合、この表示が消えるまでお待ちください。
|
||||
|
||||
# さらなる補足
|
||||
- ファイルは同期された後、タイムスタンプを比較して新しければいったん新しい方で上書きされます。その後、衝突が発生したかによって、マージが行われます。
|
||||
- まれにファイルが破損することがあります。破損したファイルに関してはディスクへの反映を試みないため、実際には使用しているデバイスには少し古いファイルが残っていることが多いです。そのファイルを再度更新してもらうと、データベースが更新されて問題なくなるケースがあります。ファイルがどの端末にも存在しない場合は、設定画面から、削除できます。
|
||||
- データベースの復旧中に再起動した場合など、うまくローカルデータベースを修正できない際には、Vaultのトップに`redflag.md`というファイルを置いてください。起動時のシーケンスがスキップされます。
|
||||
- データベースが大きくなってきてるんだけど、小さくできる?→各ノートは、それぞれの古い100リビジョンとともに保存されています。例えば、しばらくオフラインだったあるデバイスが、久しぶりに同期したと想定してみてください。そのとき、そのデバイスは最新とは少し異なるリビジョンを持ってるはずです。その場合でも、リモートのリビジョン履歴にリモートのものが存在した場合、安全にマージできます。もしリビジョン履歴に存在しなかった場合、確認しなければいけない差分も、対象を存在して持っている共通のリビジョン以降のみに絞れます。ちょうどGitのような方法で、衝突を解決している形になるのです。そのため、肥大化したリポジトリの解消と同様に、本質的にデータベースを小さくしたい場合は、データベースの作り直しが必要です。
|
||||
- その他の技術的なお話は、[技術的な内容](docs/tech_info_ja.md)に書いてあります。
|
||||
※公式のSyncと同期することはできません。
|
||||
|
||||
|
||||
# ライセンス
|
||||
## 機能
|
||||
- 高効率・低トラフィックでVault同士を同期
|
||||
- 競合解決がいい感じ
|
||||
- 単純な競合なら自動マージします
|
||||
- OSSソリューションを同期サーバに使用
|
||||
- 互換ソリューションも使用可能です
|
||||
- End-to-End暗号化実装済み
|
||||
- 設定・スニペット・テーマ、プラグインの同期が可能
|
||||
- [Webクリッパー](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf) もあります
|
||||
|
||||
The source code is licensed under the MIT License.
|
||||
|
||||
NDAや類似の契約や義務、倫理を守る必要のある、研究者、設計者、開発者のような方に特にオススメです。
|
||||
|
||||
|
||||
>[!IMPORTANT]
|
||||
> - インストール・アップデート前には必ずVaultをバックアップしてください
|
||||
> - 複数の同期ソリューションを同時に有効にしないでください(これはiCloudや公式のSyncも含みます)
|
||||
> - このプラグインは同期プラグインです。バックアップとして使用しないでください
|
||||
|
||||
|
||||
## このプラグインの使い方
|
||||
|
||||
### 3分セットアップ - CouchDB on fly.io
|
||||
|
||||
**はじめての方におすすめ**
|
||||
|
||||
[](https://www.youtube.com/watch?v=7sa_I1832Xc)
|
||||
|
||||
1. [Fly.ioにCouchDBをセットアップする](docs/setup_flyio.md)
|
||||
2. [Quick Setup](docs/quick_setup_ja.md)でプラグインを設定する
|
||||
|
||||
|
||||
### Manually Setup
|
||||
|
||||
1. サーバのセットアップ
|
||||
1. [Fly.ioにCouchDBをセットアップする](docs/setup_flyio.md)
|
||||
2. [CouchDBをセットアップする](docs/setup_own_server_ja.md)
|
||||
2. [Quick Setup](docs/quick_setup_ja.md)でプラグインを設定する
|
||||
|
||||
> [!TIP]
|
||||
> IBM Cloudantもまだ使用できますが、いくつかの理由で現在はおすすめしていません。[IBM Cloudantのセットアップ](docs/setup_cloudant_ja.md)はまだあります。
|
||||
|
||||
## ステータスバーの説明
|
||||
|
||||
同期ステータスはステータスバーに、下記のアイコンとともに表示されます
|
||||
|
||||
- アクティビティー
|
||||
- 📲 ネットワーク接続中
|
||||
- 同期ステータス
|
||||
- ⏹️ 停止中
|
||||
- 💤 変更待ち(LiveSync中)
|
||||
- ⚡️ 同期の進行中
|
||||
- ⚠ エラー
|
||||
- 統計情報
|
||||
- ↑ アップロードしたチャンクとメタデータ数
|
||||
- ↓ ダウンロードしたチャンクとメタデータ数
|
||||
- 進捗情報
|
||||
- 📥 転送後、未処理の項目数
|
||||
- 📄 稼働中データベース操作数
|
||||
- 💾 稼働中のストレージ書き込み操作数
- ⏳ 稼働中のストレージ読み込み操作数
- 🛫 待機中のストレージ読み込み操作数
|
||||
- ⚙️ 隠しファイルの操作数(待機・稼働中合計)
|
||||
- 🧩 取得待ちを行っているチャンク数
|
||||
- 🔌 設定同期関連の操作数
|
||||
|
||||
データベースやファイルの破損を避けるため、Obsidianの終了は進捗情報が表示されなくなるまで待ってください(プラグインも復帰を試みますが)。特にファイルを削除やリネームした場合は気をつけてください。
|
||||
|
||||
|
||||
## Tips and Troubleshooting
|
||||
何かこまったら、[Tips and Troubleshooting](docs/troubleshooting.md)をご参照ください。
|
||||
|
||||
## License
|
||||
|
||||
Licensed under the MIT License.
|
||||
deploy_couchdb_to_flyio_v2_with_swap.ipynb
@@ -1,318 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"colab_type": "text",
|
||||
"id": "view-in-github"
|
||||
},
|
||||
"source": [
|
||||
"<a href=\"https://colab.research.google.com/gist/vrtmrz/37c3efd7842e49947aaaa7f665e5020a/deploy_couchdb_to_flyio_v2_with_swap.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"id": "HiRV7G8Gk1Rs"
|
||||
},
|
||||
"source": [
|
||||
"History:\n",
|
||||
"- 18, May, 2023: Initial.\n",
|
||||
"- 19, Jun., 2023: Patched for enabling swap.\n",
|
||||
"- 22, Aug., 2023: Generating Setup-URI implemented.\n",
|
||||
"- 7, Nov., 2023: Fixed the issue of TOML editing."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "2Vh0mEQEZuAK"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Configurations\n",
|
||||
"import os\n",
|
||||
"os.environ['region']=\"nrt\"\n",
|
||||
"os.environ['couchUser']=\"alkcsa93\"\n",
|
||||
"os.environ['couchPwd']=\"c349usdfnv48fsasd\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "SPmbB0jZauQ1"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Delete once (Do not care about `cannot remove './fly.toml': No such file or directory`)\n",
|
||||
"!rm ./fly.toml"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "Nze7QoxLZ7Yx"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Installation\n",
|
||||
"# You have to set up your account in here.\n",
|
||||
"!curl -L https://fly.io/install.sh | sh\n",
|
||||
"!/root/.fly/bin/flyctl auth signup"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "MVJwsIYrbgtx"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Generate server\n",
|
||||
"!/root/.fly/bin/flyctl launch --auto-confirm --generate-name --detach --no-deploy --region ${region}\n",
|
||||
"!/root/.fly/bin/fly volumes create --region ${region} couchdata --size 2 --yes"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "2RSoO9o-i2TT"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Check the toml once.\n",
|
||||
"!cat fly.toml"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "zUtPZLVnbvdQ"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Modify the TOML and generate Dockerfile\n",
|
||||
"!pip install mergedeep\n",
|
||||
"from mergedeep import merge\n",
|
||||
"import toml\n",
|
||||
"fly = toml.load('fly.toml')\n",
|
||||
"override = {\n",
|
||||
" \"http_service\":{\n",
|
||||
" \"internal_port\":5984\n",
|
||||
" },\n",
|
||||
" \"build\":{\n",
|
||||
" \"dockerfile\":\"./Dockerfile\"\n",
|
||||
" },\n",
|
||||
" \"mounts\":{\n",
|
||||
" \"source\":\"couchdata\",\n",
|
||||
" \"destination\":\"/opt/couchdb/data\"\n",
|
||||
" },\n",
|
||||
" \"env\":{\n",
|
||||
" \"COUCHDB_USER\":os.environ['couchUser'],\n",
|
||||
" \"ERL_FLAGS\":\"-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini\",\n",
|
||||
" }\n",
|
||||
"}\n",
|
||||
"out = merge(fly,override)\n",
|
||||
"with open('fly.toml', 'wt') as fp:\n",
|
||||
" toml.dump(out, fp)\n",
|
||||
" fp.close()\n",
|
||||
"\n",
|
||||
"# Make the Dockerfile to modify the permission of the ini file. If you want to use a specific version, you should change `latest` here.\n",
|
||||
"dockerfile = '''FROM couchdb:latest\n",
|
||||
"RUN sed -i '2itouch /opt/couchdb/data/persistence.ini && chmod +w /opt/couchdb/data/persistence.ini && fallocate -l 512M /swapfile && chmod 0600 /swapfile && mkswap /swapfile && echo 10 > /proc/sys/vm/swappiness && swapon /swapfile && echo 1 > /proc/sys/vm/overcommit_memory' /docker-entrypoint.sh\n",
|
||||
"'''\n",
|
||||
"with open(\"./Dockerfile\",\"wt\") as fp:\n",
|
||||
" fp.write(dockerfile)\n",
|
||||
" fp.close()\n",
|
||||
"\n",
|
||||
"!echo ------\n",
|
||||
"!cat fly.toml\n",
|
||||
"!echo ------\n",
|
||||
"!cat Dockerfile"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "xWdsTCI6bzk2"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Configure password\n",
|
||||
"!/root/.fly/bin/flyctl secrets set COUCHDB_PASSWORD=${couchPwd}"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "k0WIQlShcXGa"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Deploy server\n",
|
||||
"# Be sure to shutdown after the test.\n",
|
||||
"!/root/.fly/bin/flyctl deploy --detach --remote-only\n",
|
||||
"!/root/.fly/bin/flyctl status"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "0ySggkdlfq7M"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import subprocess, json\n",
|
||||
"result = subprocess.run([\"/root/.fly/bin/flyctl\",\"status\",\"-j\"], capture_output=True, text=True)\n",
|
||||
"if result.returncode==0:\n",
|
||||
" hostname = json.loads(result.stdout)[\"Hostname\"]\n",
|
||||
" os.environ['couchHost']=\"https://%s\" % (hostname)\n",
|
||||
" print(\"Your couchDB server is https://%s/\" % (hostname))\n",
|
||||
"else:\n",
|
||||
" print(\"Something occured.\")\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "cGlSzVqlQG_z"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Finish setting up the CouchDB\n",
|
||||
"# Please repeat until the request is completed without error messages\n",
|
||||
"# i.e., You have to redo this block while \"curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to xxxx\" is showing.\n",
|
||||
"#\n",
|
||||
"# Note: A few minutes might be required to be booted.\n",
|
||||
"!curl -X POST \"${couchHost}/_cluster_setup\" -H \"Content-Type: application/json\" -d \"{\\\"action\\\":\\\"enable_single_node\\\",\\\"username\\\":\\\"${couchUser}\\\",\\\"password\\\":\\\"${couchPwd}\\\",\\\"bind_address\\\":\\\"0.0.0.0\\\",\\\"port\\\":5984,\\\"singlenode\\\":true}\" --user \"${couchUser}:${couchPwd}\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "JePzrsHypY18"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Please repeat until all lines are completed without error messages\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/httpd/WWW-Authenticate\" -H \"Content-Type: application/json\" -d '\"Basic realm=\\\"couchdb\\\"\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/httpd/enable_cors\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/enable_cors\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/max_http_request_size\" -H \"Content-Type: application/json\" -d '\"4294967296\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/couchdb/max_document_size\" -H \"Content-Type: application/json\" -d '\"50000000\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/credentials\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
|
||||
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/origins\" -H \"Content-Type: application/json\" -d '\"app://obsidian.md,capacitor://localhost,http://localhost\"' --user \"${couchUser}:${couchPwd}\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"id": "YfSOomsoXbGS"
|
||||
},
|
||||
"source": [
|
||||
"Now, our CouchDB has been surely installed and configured. Cheers!\n",
|
||||
"\n",
|
||||
"In the steps that follow, create a setup-URI.\n",
|
||||
"\n",
|
||||
"This URI could be imported directly into Self-hosted LiveSync, to configure the use of the CouchDB which we configured now."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "416YncOqXdNn"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Database config\n",
|
||||
"import random, string\n",
|
||||
"\n",
|
||||
"def randomname(n):\n",
|
||||
" return ''.join(random.choices(string.ascii_letters + string.digits, k=n))\n",
|
||||
"\n",
|
||||
"# The database name\n",
|
||||
"os.environ['database']=\"obsidiannote\"\n",
|
||||
"# The passphrase to E2EE\n",
|
||||
"os.environ['passphrase']=randomname(20)\n",
|
||||
"\n",
|
||||
"print(\"Your database:\"+os.environ['database'])\n",
|
||||
"print(\"Your passphrase:\"+os.environ['passphrase'])"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "C4d7C0HAXgsr"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Install deno for make setup uri\n",
|
||||
"!curl -fsSL https://deno.land/x/install/install.sh | sh"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "hQL_Dx-PXise"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Fetch module for encrypting a Setup URI\n",
|
||||
"!curl -o encrypt.ts https://gist.githubusercontent.com/vrtmrz/f9d1d95ee2ca3afa1a924a2c6759b854/raw/d7a070d864a6f61403d8dc74208238d5741aeb5a/encrypt.ts"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "o0gX_thFXlIZ"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Make buttons!\n",
|
||||
"from IPython.display import HTML\n",
|
||||
"result = subprocess.run([\"/root/.deno/bin/deno\",\"run\",\"-A\",\"encrypt.ts\"], capture_output=True, text=True)\n",
|
||||
"text=\"\"\n",
|
||||
"if result.returncode==0:\n",
|
||||
" text = result.stdout.strip()\n",
|
||||
" result = HTML(f\"<button onclick=navigator.clipboard.writeText('{text}')>Copy setup uri</button><br>Importing passphrase is `welcome`. <br>If you want to synchronise in live mode, please apply a preset after setup.)\")\n",
|
||||
"else:\n",
|
||||
" result = \"Failed to encrypt the setup URI\"\n",
|
||||
"result"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"colab": {
|
||||
"include_colab_link": true,
|
||||
"private_outputs": true,
|
||||
"provenance": []
|
||||
},
|
||||
"gpuClass": "standard",
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3",
|
||||
"name": "python3"
|
||||
},
|
||||
"language_info": {
|
||||
"name": "python"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 0
|
||||
}
|
||||
docs/quick_setup.md
@@ -2,34 +2,51 @@
|
||||
|
||||
[Japanese docs](./quick_setup_ja.md) - [Chinese docs](./quick_setup_cn.md).
|
||||
|
||||
The plugin has so many configuration options to deal with different circumstances. However, there are not so many settings that are actually used. Therefore, `The Setup wizard` has been implemented to simplify the initial setup.
|
||||
|
||||
Note: Subsequent devices are recommended to be set up using the `Copy setup URI` and `Open setup URI`.
|
||||
|
||||
## The Setup wizard
|
||||
Open the `🧙♂️ Setup wizard` in the settings dialogue. If the plugin has not been configured before, it should already be open.
|
||||
The plugin has so many configuration options to deal with different circumstances. However, only a few settings are required in the normal cases. Therefore, `The Setup wizard` has been implemented to simplify the setup.
|
||||
|
||||

|
||||
|
||||
- Discard the existing configuration and set up
|
||||
If you have changed any settings, this button allows you to discard all changes before setting up.
|
||||
There are three methods to set up Self-hosted LiveSync.
|
||||
|
||||
- Do not discard the existing configuration and set up
|
||||
Simply reconfigure. Be careful. In wizard mode, you cannot see all configuration items, even if they have been configured.
|
||||
1. [Using setup URIs](#1-using-setup-uris) *(Recommended)*
|
||||
2. [Minimal setup](#2-minimal-setup)
|
||||
3. [Fully manual setup and enabling on this dialogue](#3-manually-setup)
|
||||
|
||||
Pressing `Next` on one of the above options will put the configuration dialog into wizard mode.
|
||||
## At the first device
|
||||
|
||||
### Wizard mode
|
||||
### 1. Using setup URIs
|
||||
|
||||
> [!TIP] What is the setup URI? Why is it required?
|
||||
> The setup URI is an encrypted representation of a Self-hosted LiveSync configuration as a URI. It starts with `obsidian://setuplivesync?settings=`. It is encrypted with a passphrase so that it can be shared relatively securely between devices. It is a bit long, but it is one line. It allows a whole series of settings to be applied at once without any inconsistencies.
|
||||
>
|
||||
> If you have configured the remote database by [Automated setup on Fly.io](./setup_flyio.md#a-very-automated-setup) or [set up your server with the tool](./setup_own_server.md#1-generate-the-setup-uri-on-a-desktop-device-or-server), **you should have one of them**
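Purely as an illustration of its shape (the payload below is made up for this example and will not import anything):

```
obsidian://setuplivesync?settings=%5B%22xV0p...TRUNCATED...9aQ%3D%3D%22%5D
```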
|
||||
|
||||
For this procedure, [this video](https://youtu.be/7sa_I1832Xc?t=146) may help.
|
||||
|
||||
1. Click `Use` button (Or launch `Use the copied setup URI` from Command palette).
|
||||
2. Paste the Setup URI into the dialogue
|
||||
3. Type the passphrase of the Setup URI
|
||||
4. Answer `yes` for `Importing LiveSync's conf, OK?`.
|
||||
5. Answer `Set it up as secondary or subsequent device` for `How would you like to set it up?`.
|
||||
6. Initialisation will begin, please hold a while.
|
||||
7. You will be asked about hidden file synchronisation; answer as you like.
|
||||
1. If you are new to Self-hosted LiveSync, it can be configured later, so leave it as it is for now.
|
||||
8. Synchronisation has been started! `Reload app without saving` is recommended after the indicators of Self-hosted LiveSync disappear.
|
||||
|
||||
OK, we can proceed to the [next step](#).
|
||||
|
||||
### 2. Minimal setup
|
||||
|
||||
If you do not have any setup URI, press the `start` button. The settings dialogue turns into wizard mode and will display only the minimal items.
|
||||
|
||||
>[!TIP]
|
||||
> We can generate the setup URI with the tool at any time. Please use [this tool](./setup_own_server.md#1-generate-the-setup-uri-on-a-desktop-device-or-server).
|
||||
|
||||

|
||||
|
||||
Let's see how to use it step-by-step.
|
||||
#### Remote database configuration
|
||||
|
||||
## Remote Database configuration
|
||||
|
||||
### Remote database configuration
|
||||
|
||||
Enter the information for the database we have set up.
|
||||
1. Enter the information for the database we have set up.
|
||||
|
||||

|
||||
|
||||
@@ -37,13 +54,9 @@ Enter the information for the database we have set up.
|
||||
#### Test database connection and Check database configuration
|
||||
|
||||
We can check the connectivity to the database, and the database settings.
|
||||
|
||||

|
||||
|
||||
#### Test Database Connection
|
||||
Check whether we can connect to the database. If it fails, there are several possible reasons, but first attempt the `Check database configuration` check to see if it fails there too.
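As a quick sanity check outside Obsidian (a sketch, using the placeholder hostname and credentials from the server setup documents), the CouchDB root endpoint can be queried directly:

```bash
# A reachable, healthy CouchDB answers with a small JSON document such as {"couchdb":"Welcome",...}
curl "https://billowing-dawn-6619.fly.dev/" --user "campanella:dfusiuada9suy"
```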
|
||||
|
||||
#### Check database configuration
|
||||
#### Check and Fix database configuration
|
||||
|
||||
Check the database settings and fix any problems on the spot.
|
||||
|
||||
@@ -52,40 +65,43 @@ Check the database settings and fix any problems on the spot.
|
||||
This item may vary depending on the connection. In the above case, press all three Fix buttons.
|
||||
If the Fix buttons disappear and all become check marks, we are done.
|
||||
|
||||
|
||||
### Confidentiality configuration
|
||||
#### Confidentiality configuration (Optional but very preferred)
|
||||
|
||||

|
||||
|
||||
Encrypt your database in case of unintended database exposure; enable End to End encryption and the contents of your notes will be encrypted at the moment it leaves the device. We strongly recommend enabling it. And `Path Obfuscation` also obfuscates filenames. Now stable and recommended.
|
||||
Encryption is based on 256-bit AES-GCM.
|
||||
Enable End-to-end encryption and the contents of your notes will be encrypted the moment they leave the device. We strongly recommend enabling it. `Path Obfuscation` also obfuscates filenames; it is now stable and recommended.
|
||||
|
||||
These settings can be disabled if you are inside a closed network and it is clear that the database will not be accessed by third parties.
|
||||
|
||||

|
||||
> [!TIP]
|
||||
> Encryption is based on 256-bit AES-GCM.
|
||||
|
||||
#### Next
|
||||
Go to the Sync Settings.
|
||||
We should proceed to the Next step.
|
||||
|
||||
#### Discard existing database and proceed
|
||||
Discard the contents of the Remote database and go to the Sync Settings.
|
||||
|
||||
### Sync Settings
|
||||
#### Sync Settings
|
||||
Finally, finish the wizard by selecting a preset for synchronisation.
|
||||
|
||||

|
||||
|
||||
Select any synchronisation methods we want to use and `Apply` to initialise and build the local and remote databases as required. If `All done!` is displayed, we are done. Automatically, `Copy setup URI` will open and we will be asked for a passphrase to encrypt the `Setup URI`.
|
||||
Select any synchronisation methods we want to use and `Apply`. If database initialisation is required, it will be performed at this time. When `All done!` is displayed, we are ready to synchronise.
|
||||
|
||||
The dialogue of `Copy settings as a new setup URI` will be open automatically. Please input a passphrase to encrypt the new `Setup URI`. (This passphrase is to encrypt the setup URI, not the vault).
|
||||
|
||||

|
||||
|
||||
Set the passphrase as you like.
|
||||
The Setup URI will be copied to the clipboard, which you can then transfer to the second and subsequent devices in some way.
|
||||
The Setup URI will be copied to the clipboard; please make a note of it (not in Obsidian).
|
||||
|
||||
# How to set up the second and subsequent units
|
||||
After installing Self-hosted LiveSync on the first device, select `Open setup URI` from the command palette and enter the setup URI you transferred. Afterwards, enter your passphrase and a setup wizard will open.
|
||||
Answer the following.
|
||||
>[!TIP]
|
||||
We can copy this at any time via `Copy current settings as a new setup URI`.
|
||||
|
||||
- `Yes` to `Importing LiveSync's conf, OK?`
|
||||
- `Set it up as secondary or subsequent device` to `How would you like to set it up?`.
|
||||
### 3. Manually setup
|
||||
|
||||
Then, the configuration will take effect and replication will start. Your files will be synchronised soon! You may need to close the settings dialog and reopen it to see the settings fields populated properly, but they will be set.
|
||||
It is strongly recommended to perform a "minimal set-up" first and configure the remaining items after making sure everything has been synchronised.
|
||||
|
||||
However, if you have some specific reasons to configure it manually, please click the `Enable` button of `Enable LiveSync on this device as the set-up was completed manually`.
|
||||
Then, please copy the setup URI via `Copy current settings as a new setup URI` and make a note of it (not in Obsidian).
|
||||
|
||||
## At the subsequent device
|
||||
After installing Self-hosted LiveSync on the first device, we should have a setup URI. **The first choice is to use it**. Please share it with the device you want to set up.
|
||||
|
||||
It is exactly the same as [Using setup URIs on the first device](#1-using-setup-uris). Please refer to it.
|
||||
docs/setup_flyio.md
@@ -1,24 +1,53 @@
|
||||
<!-- For translation: 20240209r0 -->
|
||||
# Setup CouchDB on fly.io
|
||||
|
||||
In some cases, the use of IBM Cloudant was found to be difficult. We looked for alternatives, but there were no suitable services available. Therefore, we have to build our own server, which is quite a challenge. In this situation, fly.io simplifies most of the process, and only CouchDB needs to be configured.
|
||||
|
||||
This is how to configure fly.io and CouchDB on it for Self-hosted LiveSync.
|
||||
|
||||
It generally falls within the `Free Allowances` range in most cases.
|
||||
> [!WARNING]
|
||||
> It is **your** instance. In Obsidian, we have the files locally. Hence, do not hesitate to destroy the remote database if you feel something has gone wrong. We can launch and switch to a new CouchDB instance at any time[^1].
|
||||
>
|
||||
[^1]: Actually, I am always creating databases like this to reproduce reported issues.
|
||||
|
||||
**[Automatic setup using Colaboratory](#automatic-setup-using-colaboratory) is recommended, after reading this document through. It is reproducible and hard to fail.**
|
||||
> [!NOTE]
|
||||
> **What and why is the Fly.io?**
|
||||
> At some point, we started to experience problems related to our IBM Cloudant account. At the same time, Self-hosted LiveSync started to improve its functionality, requiring CouchDB in a more natural state to use all its features.
|
||||
>
|
||||
> Then we found Fly.io. Fly.io is a PaaS platform which can be used for a very reasonable price. It generally falls within the `Free Allowances` range in most cases.
|
||||
|
||||
## Required materials
|
||||
|
||||
- A valid credit or debit card.
|
||||
|
||||
## Warning
|
||||
## Setup CouchDB instance
|
||||
|
||||
- It will be `your` instance. Check the log regularly.
|
||||
### A. Very automated setup
|
||||
|
||||
## Prerequisites
|
||||
[](https://www.youtube.com/watch?v=7sa_I1832Xc)
|
||||
|
||||
For simplicity, the following description assumes that the settings are as shown in the table below. Please read it in accordance with the actual settings you wish to make.
|
||||
1. Open [setup-flyio-on-the-fly-v2.ipynb](../setup-flyio-on-the-fly-v2.ipynb).
|
||||
2. Press the `Open in Colab` button.
|
||||
3. Choose a region and run all blocks (Refer to video).
|
||||
1. If you do not have an account yet, the sign-up page will be shown; please follow the instructions. The [official document is here](https://fly.io/docs/hands-on/sign-up/).
|
||||
4. Copy the Setup-URI and use it in Obsidian.
|
||||
5. Your vault has been synchronised. Use the Setup-URI on subsequent devices.
|
||||
|
||||
Steps 4 and 5 are detailed in the [Quick Setup](./quick_setup.md#1-using-setup-uris).
|
||||
|
||||
> [!NOTE]
|
||||
> Your automatically generated configuration will be shown in the Colab notebook output like below, and **it will not be saved**. Please make a note of it somewhere.
|
||||
> ```
|
||||
> -- YOUR CONFIGURATION --
|
||||
> URL : https://billowing-dawn-6619.fly.dev
|
||||
> username: billowing-cherry-22580
|
||||
> password: misty-dew-13571
|
||||
> region : nrt
|
||||
> ```
|
||||
|
||||
### B. Scripted Setup
|
||||
|
||||
Please refer to the document of [deploy-server.sh](../utils/readme.md#deploy-serversh).
|
||||
|
||||
### C. Manual Setup
|
||||
|
||||
| Used in the text | Meaning and where to use | Memo |
|
||||
| ---------------- | --------------------------- | ------------------------------------------------------------------------ |
|
||||
@@ -26,11 +55,7 @@ For simplicity, the following description assumes that the settings are as shown
|
||||
| dfusiuada9suy | Password | |
|
||||
| nrt | Region to make the instance | We can use any [region](https://fly.io/docs/reference/regions/) near us. |
|
||||
|
||||
## Steps with your computer
|
||||
|
||||
If you want to avoid installing anything, please skip to [Automatic setup using Colaboratory](#automatic-setup-using-colaboratory).
|
||||
|
||||
### 1. Install flyctl
|
||||
#### 1. Install flyctl
|
||||
|
||||
- Mac or Linux
|
||||
|
||||
@@ -44,7 +69,7 @@ $ curl -L https://fly.io/install.sh | sh
|
||||
$ iwr https://fly.io/install.ps1 -useb | iex
|
||||
```
|
||||
|
||||
### 2. Sign up or Sign in to fly.io
|
||||
#### 2. Sign up or Sign in to fly.io
|
||||
|
||||
- Sign up
|
||||
|
||||
@@ -58,148 +83,90 @@ $ fly auth signup
|
||||
$ fly auth login
|
||||
```
|
||||
|
||||
For more information, please refer [Sign up](https://fly.io/docs/hands-on/sign-up/) and [Sign in](https://fly.io/docs/hands-on/sign-in/).
|
||||
For more information, please refer to [Sign up](https://fly.io/docs/hands-on/sign-up/) and [Sign in](https://fly.io/docs/hands-on/sign-in/).
|
||||
|
||||
### 3. Make configuration files
|
||||
#### 3. Make a configuration file
|
||||
|
||||
Please note that `nrt` is the region near Japan. Please use your preferred region.
|
||||
1. Make `fly.toml` from template `fly.template.toml`.
|
||||
We can simply copy and rename the file; the template is at [utils/flyio/fly.template.toml](../utils/flyio/fly.template.toml) (a copy-command sketch follows the commands below).
|
||||
2. Decide the instance name, initialize the App, and set credentials.
|
||||
|
||||
1. Make fly.toml
|
||||
>[!TIP]
|
||||
> - The name `billowing-dawn-6619` is a randomly decided name, and it will be part of the CouchDB URL. It should be globally unique, so it is recommended to use something random for this name.
> - Explicit naming is very good for humans. However, we do not often get the chance to actually enter this manually (it has been designed so). This database may contain important information for you; the needle should be hidden in the haystack.
|
||||
|
||||
|
||||
```bash
|
||||
$ fly launch --name=billowing-dawn-6619 --env="COUCHDB_USER=campanella" --copy-config=true --detach --no-deploy --region nrt --yes
|
||||
$ fly secrets set COUCHDB_PASSWORD=dfusiuada9suy
|
||||
```
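For step 1 above, copying the template could look like this (a sketch assuming the repository has been cloned locally; adjust the path to wherever your clone lives):

```bash
# Copy the bundled template as the working fly.toml for this app
cp /path/to/obsidian-livesync/utils/flyio/fly.template.toml ./fly.toml
```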
|
||||
|
||||
#### 4. Deploy
|
||||
|
||||
```
|
||||
$ flyctl launch --generate-name --detach --no-deploy --region nrt
|
||||
Creating app in /home/vrtmrz/dev/fly/demo
|
||||
Scanning source code
|
||||
Could not find a Dockerfile, nor detect a runtime or framework from source code. Continuing with a blank app.
|
||||
automatically selected personal organization: vorotamoroz
|
||||
App will use 'nrt' region as primary
|
||||
$ flyctl deploy
|
||||
An existing fly.toml file was found
|
||||
Using build strategies '[the "couchdb:latest" docker image]'. Remove [build] from fly.toml to force a rescan
|
||||
Creating app in /home/vorotamoroz/dev/obsidian-livesync/utils/flyio
|
||||
We're about to launch your app on Fly.io. Here's what you're getting:
|
||||
|
||||
Organization: vorotamoroz (fly launch defaults to the personal org)
|
||||
Name: billowing-dawn-6619 (specified on the command line)
|
||||
Region: Tokyo, Japan (specified on the command line)
|
||||
App Machines: shared-cpu-1x, 256MB RAM (specified on the command line)
|
||||
Postgres: <none> (not requested)
|
||||
Redis: <none> (not requested)
|
||||
|
||||
Created app 'billowing-dawn-6619' in organization 'personal'
|
||||
Admin URL: https://fly.io/apps/billowing-dawn-6619
|
||||
Hostname: billowing-dawn-6619.fly.dev
|
||||
Wrote config file fly.toml
|
||||
```
|
||||
|
||||
`billowing-dawn-6619` is an automatically generated name. It is used as the hostname. Please make a note of it somewhere.
Note: we can specify a name instead of using `--generate-name`, but this is not recommended during the trial phase.
|
||||
|
||||
2. Make volume
|
||||
|
||||
```
|
||||
$ flyctl volumes create --region nrt couchdata --size 2 --yes
|
||||
ID: vol_g67340kxgmmvydxw
|
||||
Name: couchdata
|
||||
App: billowing-dawn-6619
|
||||
Region: nrt
|
||||
Zone: 35b7
|
||||
Size GB: 2
|
||||
Encrypted: true
|
||||
Created at: 02 Jun 23 01:19 UTC
|
||||
```
|
||||
|
||||
3. Edit fly.toml
|
||||
Changes:
|
||||
- Change exposing port from `8080` to `5984`
|
||||
- Mounting the volume `couchdata` created in step 2 under `/opt/couchdb/data`
|
||||
- Set `campanella` for the administrator of CouchDB
|
||||
- Customise CouchDB to use a persistent ini file, which is located under the data directory.
|
||||
- To use Dockerfile
|
||||
|
||||
```diff
|
||||
# fly.toml app configuration file generated for billowing-dawn-6619 on 2023-06-02T10:18:59+09:00
|
||||
#
|
||||
# See https://fly.io/docs/reference/configuration/ for information about how to use this file.
|
||||
#
|
||||
|
||||
app = "billowing-dawn-6619"
|
||||
primary_region = "nrt"
|
||||
|
||||
[http_service]
|
||||
- internal_port = 8080
|
||||
+ internal_port = 5984
|
||||
force_https = true
|
||||
auto_stop_machines = true
|
||||
auto_start_machines = true
|
||||
min_machines_running = 0
|
||||
|
||||
+[mounts]
|
||||
+ source="couchdata"
|
||||
+ destination="/opt/couchdb/data"
|
||||
+
|
||||
+[env]
|
||||
+ COUCHDB_USER = "campanella"
|
||||
+ ERL_FLAGS="-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini"
|
||||
+
|
||||
+[build]
|
||||
+ dockerfile = "./Dockerfile"
|
||||
```
|
||||
|
||||
4. Make `Dockerfile`
|
||||
Create a Dockerfile that patches the start-up script to fix ini file permissions.
|
||||
|
||||
```dockerfile
|
||||
FROM couchdb:latest
|
||||
RUN sed -i '2itouch /opt/couchdb/data/persistence.ini && chmod +w /opt/couchdb/data/persistence.ini' /docker-entrypoint.sh
|
||||
```
|
||||
|
||||
5. Set credential
|
||||
|
||||
```
|
||||
flyctl secrets set COUCHDB_PASSWORD=dfusiuada9suy
|
||||
```
|
||||
|
||||
### 4. Deploy
|
||||
|
||||
```
|
||||
$ flyctl deploy --detach --remote-only
|
||||
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
|
||||
Platform: machines
|
||||
✓ Configuration is valid
|
||||
Your app is ready! Deploy with `flyctl deploy`
|
||||
Secrets are staged for the first deployment
|
||||
==> Verifying app config
|
||||
Validating /home/vrtmrz/dev/fly/demo/fly.toml
|
||||
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
|
||||
Platform: machines
|
||||
✓ Configuration is valid
|
||||
--> Verified app config
|
||||
==> Building image
|
||||
Remote builder fly-builder-bold-sky-4515 ready
|
||||
==> Creating build context
|
||||
--> Creating build context done
|
||||
==> Building image with Docker
|
||||
--> docker host: 20.10.12 linux x86_64
|
||||
Searching for image 'couchdb:latest' remotely...
|
||||
image found: img_ox20prk63084j1zq
|
||||
|
||||
-------------:SNIPPED:-------------
|
||||
|
||||
Watch your app at https://fly.io/apps/billowing-dawn-6619/monitoring
|
||||
Watch your deployment at https://fly.io/apps/billowing-dawn-6619/monitoring
|
||||
|
||||
Provisioning ips for billowing-dawn-6619
|
||||
Dedicated ipv6: 2a09:8280:1::2d:240f
|
||||
Shared ipv4: 66.241.125.213
|
||||
Dedicated ipv6: 2a09:8280:1::37:fde9
|
||||
Shared ipv4: 66.241.124.163
|
||||
Add a dedicated ipv4 with: fly ips allocate-v4
|
||||
|
||||
Creating a 1 GB volume named 'couchdata' for process group 'app'. Use 'fly vol extend' to increase its size
|
||||
This deployment will:
|
||||
* create 1 "app" machine
|
||||
|
||||
No machines in group app, launching a new machine
|
||||
Machine e7845d1f297183 [app] has state: started
|
||||
|
||||
WARNING The app is not listening on the expected address and will not be reachable by fly-proxy.
|
||||
You can fix this by configuring your app to listen on the following addresses:
|
||||
- 0.0.0.0:5984
|
||||
Found these processes inside the machine with open listening sockets:
|
||||
PROCESS | ADDRESSES
|
||||
-----------------*---------------------------------------
|
||||
/.fly/hallpass | [fdaa:0:73b9:a7b:22e:3851:7f28:2]:22
|
||||
|
||||
Finished launching new machines
|
||||
|
||||
NOTE: The machines for [app] have services with 'auto_stop_machines = true' that will be stopped when idling
|
||||
|
||||
-------
|
||||
Checking DNS configuration for billowing-dawn-6619.fly.dev
|
||||
|
||||
Visit your newly deployed app at https://billowing-dawn-6619.fly.dev/
|
||||
```
|
||||
|
||||
Now your CouchDB has been launched. (Do not forget to delete it if you no longer need it.)
If it fails, please run a check with `flyctl doctor`. A failure of the remote build may be resolved by `flyctl wireguard reset` or similar.
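For reference, the diagnostics just mentioned would be run like this (a sketch; both commands are the ones named in the paragraph above):

```bash
# Check the local flyctl environment and connectivity to fly.io
flyctl doctor
# If the remote build keeps failing, resetting the WireGuard peer sometimes helps
flyctl wireguard reset
```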
|
||||
|
||||
```
|
||||
$ flyctl status
|
||||
App
|
||||
Name = billowing-dawn-6619
|
||||
Owner = personal
|
||||
Hostname = billowing-dawn-6619.fly.dev
|
||||
Image = billowing-dawn-6619:deployment-01H1WWB3CK5Z9ZX71KHBSDGHF1
|
||||
Platform = machines
|
||||
|
||||
Machines
|
||||
PROCESS ID VERSION REGION STATE CHECKS LAST UPDATED
|
||||
app e7845d1f297183 1 nrt started 2023-06-02T01:43:34Z
|
||||
```
|
||||
|
||||
### 5. Apply CouchDB configuration
|
||||
#### 5. Apply CouchDB configuration
|
||||
|
||||
After the initial setup, CouchDB needs some more customisation before it can be used from Self-hosted LiveSync. It can be configured in a browser or via HTTP REST APIs.
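For example, a single setting can be applied through CouchDB's node-level configuration API like this (a sketch using the placeholder hostname and credentials from earlier in this document; the other keys follow the same pattern):

```bash
# Enable CORS on the chttpd layer; repeat the same PUT pattern for the other configuration keys
curl -X PUT "https://billowing-dawn-6619.fly.dev/_node/nonode@nohost/_config/chttpd/enable_cors" \
  -H "Content-Type: application/json" -d '"true"' --user "campanella:dfusiuada9suy"
```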
|
||||
|
||||
@@ -273,15 +240,13 @@ iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8
|
||||
|
||||
Note: Each of these should also be repeated until it finishes with a 200 response.
|
||||
|
||||
### 6. Use it from Self-hosted LiveSync
|
||||
#### 6. Use it from Self-hosted LiveSync
|
||||
|
||||
Now the CouchDB is ready to use from Self-hosted LiveSync. We can use `https://billowing-dawn-6619.fly.dev` in URI, `campanella` in `Username` and `dfusiuada9suy` in `Password` on Self-hosted LiveSync. `Database name` could be anything you want.
|
||||
`Enhance chunk size` could be up to around `100`.
|
||||
Now the CouchDB is ready to use from Self-hosted LiveSync. We can use `https://billowing-dawn-6619.fly.dev` in URI, `campanella` in `Username` and `dfusiuada9suy` in `Password` on Self-hosted LiveSync. The `Database name` could be anything you want.
|
||||
Please refer to the [Minimal Setup of the Quick Setup](./quick_setup.md#2-minimal-setup).
|
||||
|
||||
## Automatic setup using Colaboratory
|
||||
## Delete the Instance
|
||||
|
||||
We can perform all these steps by using [this Colaboratory notebook](/deploy_couchdb_to_flyio_v2_with_swap.ipynb) without installing anything.
|
||||
If you want to delete the CouchDB instance, you can do that in [fly.io Dashboard](https://fly.io/dashboard/personal)
|
||||
|
||||
## After testing / before creating a new instance
|
||||
|
||||
**Be sure to delete the instance. We can check instances on the [Dashboard](https://fly.io/dashboard/personal)**
|
||||
If you went with [B. Scripted Setup](#b-scripted-setup), we can use [delete-server.sh](../utils/readme.md#delete-serversh).
|
||||
docs/setup_own_server.md
@@ -1,138 +1,126 @@
|
||||
# Setup a CouchDB server
|
||||
|
||||
## Table of Contents
|
||||
- [Configure](#configure)
|
||||
- [Run](#run)
|
||||
- [Docker CLI](#docker-cli)
|
||||
- [Docker Compose](#docker-compose)
|
||||
- [Access from a mobile device](#access-from-a-mobile-device)
|
||||
- [Testing from a mobile](#testing-from-a-mobile)
|
||||
- [Setting up your domain](#setting-up-your-domain)
|
||||
- [Reverse Proxies](#reverse-proxies)
|
||||
- [Traefik](#traefik)
|
||||
|
||||
- [Setup a CouchDB server](#setup-a-couchdb-server)
|
||||
- [Table of Contents](#table-of-contents)
|
||||
- [1. Prepare CouchDB](#1-prepare-couchdb)
|
||||
- [A. Using Docker container](#a-using-docker-container)
|
||||
- [1. Prepare](#1-prepare)
|
||||
- [2. Run docker container](#2-run-docker-container)
|
||||
- [B. Install CouchDB directly](#b-install-couchdb-directly)
|
||||
- [2. Run couchdb-init.sh for initialise](#2-run-couchdb-initsh-for-initialise)
|
||||
- [3. Expose CouchDB to the Internet](#3-expose-couchdb-to-the-internet)
|
||||
- [4. Client Setup](#4-client-setup)
|
||||
- [1. Generate the setup URI on a desktop device or server](#1-generate-the-setup-uri-on-a-desktop-device-or-server)
|
||||
- [2. Setup Self-hosted LiveSync to Obsidian](#2-setup-self-hosted-livesync-to-obsidian)
|
||||
- [Manual setup information](#manual-setup-information)
|
||||
- [Setting up your domain](#setting-up-your-domain)
|
||||
- [Reverse Proxies](#reverse-proxies)
|
||||
- [Traefik](#traefik)
|
||||
---
|
||||
|
||||
## Configure
|
||||
## 1. Prepare CouchDB
|
||||
### A. Using Docker container
|
||||
|
||||
The easiest way to set up a CouchDB instance is using the official [docker image](https://hub.docker.com/_/couchdb).
|
||||
#### 1. Prepare
|
||||
```bash
|
||||
|
||||
Some initial configuration is required. Create a `local.ini` to use Self-hosted LiveSync as follows ([CouchDB has to be version 3.2 or higher](https://docs.couchdb.org/en/latest/config/http.html#chttpd/enable_cors), if lower `enable_cors = true` has to be under section `[httpd]` ):
|
||||
# Prepare environment variables.
|
||||
export hostname=localhost:5984
|
||||
export username=goojdasjdas #Please change as you like.
|
||||
export password=kpkdasdosakpdsa #Please change as you like
|
||||
|
||||
```ini
|
||||
[couchdb]
|
||||
single_node=true
|
||||
max_document_size = 50000000
|
||||
|
||||
[chttpd]
|
||||
require_valid_user = true
|
||||
max_http_request_size = 4294967296
|
||||
enable_cors = true
|
||||
|
||||
[chttpd_auth]
|
||||
require_valid_user = true
|
||||
authentication_redirect = /_utils/session.html
|
||||
|
||||
[httpd]
|
||||
WWW-Authenticate = Basic realm="couchdb"
|
||||
bind_address = 0.0.0.0
|
||||
|
||||
[cors]
|
||||
origins = app://obsidian.md, capacitor://localhost, http://localhost
|
||||
credentials = true
|
||||
headers = accept, authorization, content-type, origin, referer
|
||||
methods = GET,PUT,POST,HEAD,DELETE
|
||||
max_age = 3600
|
||||
# Prepare directories for saving data and configurations.
|
||||
mkdir couchdb-data
|
||||
mkdir couchdb-etc
|
||||
```
|
||||
|
||||
## Run
|
||||
#### 2. Run docker container
|
||||
|
||||
### Docker CLI
|
||||
1. Boot Check.
|
||||
```
|
||||
$ docker run --name couchdb-for-ols --rm -it -e COUCHDB_USER=${username} -e COUCHDB_PASSWORD=${password} -v ${PWD}/couchdb-data:/opt/couchdb/data -v ${PWD}/couchdb-etc:/opt/couchdb/etc/local.d -p 5984:5984 couchdb
|
||||
```
|
||||
If your container has exited, please check the permissions of couchdb-data and couchdb-etc.
Once CouchDB has run, these directories will be owned by uid `5984`. Please chown them back to your user again.
|
||||
|
||||
You can launch CouchDB using your `local.ini` like this:
|
||||
2. Enable it in background
|
||||
```
|
||||
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
|
||||
$ docker run --name couchdb-for-ols -d --restart always -e COUCHDB_USER=${username} -e COUCHDB_PASSWORD=${password} -v ${PWD}/couchdb-data:/opt/couchdb/data -v ${PWD}/couchdb-etc:/opt/couchdb/etc/local.d -p 5984:5984 couchdb
|
||||
```
|
||||
*Remember to replace the path with the path to your local.ini*
|
||||
### B. Install CouchDB directly
|
||||
Please refer to the [official document](https://docs.couchdb.org/en/stable/install/index.html). However, we do not have to configure it fully; only the administrator needs to be configured.
|
||||
|
||||
Run in detached mode:
|
||||
## 2. Run couchdb-init.sh for initialise
|
||||
```
|
||||
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
|
||||
```
|
||||
*Remember to replace the path with the path to your local.ini*
|
||||
|
||||
##### Docker Compose

Create a directory, place your `local.ini` within it, and create a `docker-compose.yml` alongside it. Make sure that you still have write permissions for `local.ini` and for the `data` folder that will be created once the container starts. The directory structure should look similar to this:

```
obsidian-livesync
├── docker-compose.yml
└── local.ini
```

A good place to start for `docker-compose.yml`:

```yaml
version: "2.1"
services:
  couchdb:
    image: couchdb
    container_name: obsidian-livesync
    user: 1000:1000
    environment:
      - COUCHDB_USER=admin
      - COUCHDB_PASSWORD=password
    volumes:
      - ./data:/opt/couchdb/data
      - ./local.ini:/opt/couchdb/etc/local.ini
    ports:
      - 5984:5984
    restart: unless-stopped
```
And finally, launch the container:

```
# -d will launch detached so the container runs in background
docker compose up -d
```

## 2. Run couchdb-init.sh for initialisation

Some initial configuration of the database is still required. The easiest way is to run the `couchdb-init.sh` script:

```
$ curl -s https://raw.githubusercontent.com/vrtmrz/obsidian-livesync/main/utils/couchdb/couchdb-init.sh | bash
```

If the result looks like the following:

```
-- Configuring CouchDB by REST APIs... -->
{"ok":true}
""
""
""
""
""
""
""
""
""
<-- Configuring CouchDB by REST APIs Done!
```

your CouchDB has been initialised successfully. If you want to do this manually, please read the script.
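As a rough idea of what the script does (a sketch only; the exact settings it applies are in the script itself), each value is written through CouchDB's configuration API, for example:

```bash
# Example: enable CORS on the local node via the REST configuration API.
curl -s -X PUT -u "${username}:${password}" \
  "http://${hostname}/_node/_local/_config/chttpd/enable_cors" \
  -H "Content-Type: application/json" -d '"true"'
```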
## 3. Expose CouchDB to the Internet

- You can skip this step if you use CouchDB only on your intranet and only with desktop devices.
- For mobile devices, Obsidian requires a valid SSL certificate, which usually means exposing the server to the internet.

Any solution can be used. For simplicity, the following sample uses Cloudflare Zero Trust for testing.

```
$ cloudflared tunnel --url http://localhost:5984
2024-02-14T10:35:25Z INF Thank you for trying Cloudflare Tunnel. Doing so, without a Cloudflare account, is a quick way to experiment and try it out. However, be aware that these account-less Tunnels have no uptime guarantee. If you intend to use Tunnels in production you should use a pre-created named tunnel by following: https://developers.cloudflare.com/cloudflare-one/connections/connect-apps
2024-02-14T10:35:25Z INF Requesting new quick Tunnel on trycloudflare.com...
2024-02-14T10:35:26Z INF +--------------------------------------------------------------------------------------------+
2024-02-14T10:35:26Z INF | Your quick Tunnel has been created! Visit it at (it may take some time to be reachable): |
2024-02-14T10:35:26Z INF | https://tiles-photograph-routine-groundwater.trycloudflare.com |
2024-02-14T10:35:26Z INF +--------------------------------------------------------------------------------------------+
:
:
:
```

Now `https://tiles-photograph-routine-groundwater.trycloudflare.com` is our server. Please keep this process running in the background for now.
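One minimal way to keep the quick tunnel running in the background (just one option; a process manager or a named tunnel works equally well):

```bash
# Run the tunnel detached from the terminal and keep its output in a log file.
nohup cloudflared tunnel --url http://localhost:5984 > cloudflared.log 2>&1 &
```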
## 4. Client Setup

> [!TIP]
> Manual configuration is not recommended nowadays, for several reasons. If you still want to set things up by hand, please use the `Setup wizard`; the recommended extra configurations will also be set.

### 1. Generate the setup URI on a desktop device or server

```bash
$ export hostname=https://tiles-photograph-routine-groundwater.trycloudflare.com #Point to your server
$ export database=obsidiannotes #Please change as you like
$ export passphrase=dfsapkdjaskdjasdas #Please change as you like
$ deno run -A https://raw.githubusercontent.com/vrtmrz/obsidian-livesync/main/utils/flyio/generate_setupuri.ts
obsidian://setuplivesync?settings=%5B%22tm2DpsOE74nJAryprZO2M93wF%2Fvg.......4b26ed33230729%22%5D
```

Note: if you want to access Self-hosted LiveSync from mobile devices, you need a valid SSL certificate.

### 2. Setup Self-hosted LiveSync to Obsidian

[This video](https://youtu.be/7sa_I1832Xc?t=146) may help us.

1. Install Self-hosted LiveSync.
2. Choose `Use the copied setup URI` from the command palette and paste the setup URI (obsidian://setuplivesync?settings=.....).
3. Type `welcome` as the setup-URI passphrase.
4. Answer `yes` and `Set it up...`, and finish the first dialogue with `Keep them disabled`.
5. `Reload app without save` once.
### Testing from a mobile

In the testing phase, [localhost.run](https://localhost.run/) or similar services are very useful.

Example using localhost.run:

```
$ ssh -R 80:localhost:5984 nokey@localhost.run
Warning: Permanently added the RSA host key for IP address '35.171.254.69' to the list of known hosts.

===============================================================================
Welcome to localhost.run!

Follow your favourite reverse tunnel at [https://twitter.com/localhost_run].

**You need a SSH key to access this service.**
If you get a permission denied follow Gitlab's most excellent howto:
https://docs.gitlab.com/ee/ssh/
*Only rsa and ed25519 keys are supported*

To set up and manage custom domains go to https://admin.localhost.run/

More details on custom domains (and how to enable subdomains of your custom
domain) at https://localhost.run/docs/custom-domains

To explore using localhost.run visit the documentation site:
https://localhost.run/docs/

===============================================================================

** your connection id is xxxxxxxxxxxxxxxxxxxxxxxxxxxx, please mention it if you send me a message about an issue. **

xxxxxxxx.localhost.run tunneled with tls termination, https://xxxxxxxx.localhost.run
Connection to localhost.run closed by remote host.
Connection to localhost.run closed.
```

`https://xxxxxxxx.localhost.run` is the temporary server address.
---

## Manual setup information

### Setting up your domain
@@ -91,7 +91,7 @@ services:

Finally, create and start the container:

```
# -d will launch detached so the container runs in background
docker compose up -d
# or, with the legacy standalone binary:
docker-compose up -d
```

## Create the database
@@ -1,12 +1,117 @@
<!-- 2024-02-15 -->
# Tips and Troubleshooting

In this document, some tips for solving issues are given.
- [Tips and Troubleshooting](#tips-and-troubleshooting)
  - [Issue with syncing with a mobile device](#issue-with-syncing-with-a-mobile-device)
  - [Notable bugs and fixes](#notable-bugs-and-fixes)
    - [Binary files get bigger on iOS](#binary-files-get-bigger-on-ios)
    - [Some setting names have been changed](#some-setting-names-have-been-changed)
  - [FAQ](#faq)
    - [Why is `Use an old adapter for compatibility` somehow enabled in my vault?](#why-is-use-an-old-adapter-for-compatibility-somehow-enabled-in-my-vault)
    - [ZIP (or any extensions) files were not synchronised. Why?](#zip-or-any-extensions-files-were-not-synchronised-why)
    - [I would like to report an issue, but you said you need a `Report`. How do I make it?](#i-would-like-to-report-an-issue-but-you-said-you-need-a-report-how-do-i-make-it)
    - [Where can I check the log?](#where-can-i-check-the-log)
    - [Why are the logs volatile and ephemeral?](#why-are-the-logs-volatile-and-ephemeral)
    - [Some network logs are not written into the file.](#some-network-logs-are-not-written-into-the-file)
    - [If a file were deleted or trimmed, the capacity of the database should be reduced, right?](#if-a-file-were-deleted-or-trimmed-the-capacity-of-the-database-should-be-reduced-right)
  - [Troubleshooting](#troubleshooting)
    - [On the mobile device, cannot synchronise on the local network!](#on-the-mobile-device-cannot-synchronise-on-the-local-network)
    - [I think that something bad happening on the vault...](#i-think-that-something-bad-happening-on-the-vault)
  - [Tips](#tips)
    - [Old tips](#old-tips)

<!-- - -->

## Issue with syncing with a mobile device

If you are getting errors such as "could not sync files" during the initial sync, try a lower batch size:

- Open Settings.
- Open the Self-hosted LiveSync settings.
- Go to the sync settings.
- Go down to the advanced settings.
- Lower "Batch size" and "Batch limit" and try to sync again.
- If you keep getting the error, keep lowering these until you find the sweet spot.
## Notable bugs and fixes

### Binary files get bigger on iOS

- Reported at: v0.20.x
- Fixed at: v0.21.2 (fixed but not reviewed)
- Required action: Larger files will not be fixed automatically; please perform `Verify and repair all files`. If the local database and the storage do not match, we will be asked which one to apply.
### Some setting names have been changed

- Fixed at: v0.22.6

| Previous name                | New name                                 |
| ---------------------------- | ---------------------------------------- |
| Open setup URI               | Use the copied setup URI                 |
| Copy setup URI               | Copy current settings as a new setup URI |
| Setup Wizard                 | Minimal Setup                            |
| Check database configuration | Check and Fix database configuration     |
## FAQ

### Why is `Use an old adapter for compatibility` somehow enabled in my vault?

Because you are a compassionate and experienced user. Before v0.17.16, we used an old adapter for the local database; at that time, the current default adapter was not yet stable.
The new adapter has better performance and new features such as purging, so we should use it, and it is now the default.

However, switching from the old adapter to the new one requires converting or rebuilding the local database, which takes some time. It was a long time ago now, but we once inconvenienced everyone in a hurry when we changed the format of the database.
For these reasons, this toggle is automatically enabled if the vault was upgraded from one which used the old adapter.

When you rebuild everything or fetch from the remote again, you will be asked to switch this.

Therefore, experienced users (especially those stable enough never to have had to rebuild the database) may have this toggle enabled in their vault.
Please disable it when you have enough time.
### ZIP (or any extensions) files were not synchronised. Why?

It depends on what Obsidian detects. Toggling `Detect all extensions` under `Files and links` (an Obsidian setting) may help.

### I would like to report an issue, but you said you need a `Report`. How do I make it?

We can copy the report to the clipboard by pressing the `Make report` button on the `Hatch` pane.

![](../images/hatch.png)
### Where can I check the log?

We can launch the log pane with `Show log` on the command palette.
If you run into trouble, please enable `Verbose Log` on the `General Settings` pane.

However, the logs are not kept for long and are cleared on restart. If you want to keep the logs, please enable `Write logs into the file` temporarily.

![](../images/write_logs_into_the_file.png)

> [!IMPORTANT]
> - Writing logs into the file will impact performance.
> - Please make sure that you have erased all of your confidential information before reporting an issue.

### Why are the logs volatile and ephemeral?

To avoid unexpected exposure of confidential information.

### Some network logs are not written into the file.

In particular, CORS errors are reported to the plug-in only as a generic error for security reasons, so we cannot detect and log them.
### If a file were deleted or trimmed, the capacity of the database should be reduced, right?

No. Even if files are deleted, their chunks are not deleted.
Self-hosted LiveSync splits files into multiple chunks and transfers only the newly created ones. This behaviour keeps traffic low, and chunks are shared between files to reduce the total usage of the database.

One more thing: we can handle conflicts on any device, even if they happened on other devices. This means conflicts may surface "in the past", after the point up to which we have already synchronised. Hence we cannot collect and delete unused chunks, even if they are not currently referenced.

To shrink the database size, only `Rebuild everything` is reliable and effective. But do not worry: if we have synchronised well, we still have the actual, real files. It only takes a bit of time and traffic.
<!-- Add here -->

## Troubleshooting
<!-- Add here -->

### On the mobile device, cannot synchronise on the local network!

Obsidian mobile cannot connect to non-secure endpoints, i.e. URIs starting with `http://`. Please double-check the URI of your CouchDB. A self-signed certificate cannot be used either.

### I think that something bad happening on the vault...

Place a `redflag.md` at the top of the vault and restart Obsidian. The simplest way is to create a new note and rename it to `redflag`. Of course, we can also put it there without Obsidian.

If `redflag.md` exists, Self-hosted LiveSync suspends all database and storage processes.
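For example, from a terminal on a desktop device (the vault path below is just a placeholder):

```bash
# Create the flag file at the root of the vault, then restart Obsidian.
touch "/path/to/your/vault/redflag.md"
```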
## Tips
<!-- Add here -->

### Old tips

- Rarely, a file in the database could be corrupted. The plugin will not write to local storage when a file looks corrupted. If a local version of the file is on your device, the corruption can be fixed by editing the local file and synchronizing it. But if the file does not exist on any of your devices, then it cannot be rescued. In this case, you can delete these items from the settings dialog.
- To stop the boot-up sequence (e.g. for fixing problems on databases), you can put a `redflag.md` file (or directory) at the root of your vault.
  Tip for iOS: a redflag directory can be created at the root of the vault using the Files app.
- Also, with `redflag2.md` placed, we can automatically rebuild both the local and the remote databases during the boot-up sequence. With `redflag3.md`, we can discard only the local database and fetch from the remote again.
- Q: The database is growing; how can I shrink it down?
  A: Each doc is saved with its past 100 revisions for detecting and resolving conflicts. Picture one device being offline for a while and then coming online again: the device has to compare its notes with the remotely saved ones. If there is a historic revision in which the note used to be identical, it can be updated safely (like a git fast-forward). Even if there is no such revision, we only have to check the differences after the revision that both devices have in common. This is like git's conflict-resolving method. So, much like an enlarged git repository, we have to make the database again if you want to solve the root of the problem.
- More technical information is in the [Technical Information](tech_info.md).
- If you want to synchronize files without Obsidian, you can use [filesystem-livesync](https://github.com/vrtmrz/filesystem-livesync).
- A WebClipper is also available on the Chrome Web Store: [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)

  Repo is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). (Docs are a work in progress.)
BIN
images/hatch.png
Normal file
|
After Width: | Height: | Size: 13 KiB |
|
Before Width: | Height: | Size: 25 KiB After Width: | Height: | Size: 20 KiB |
|
Before Width: | Height: | Size: 36 KiB After Width: | Height: | Size: 5.7 KiB |
|
Before Width: | Height: | Size: 10 KiB |
BIN
images/write_logs_into_the_file.png
Normal file
|
After Width: | Height: | Size: 17 KiB |
@@ -1,10 +1,10 @@
|
||||
{
|
||||
"id": "obsidian-livesync",
|
||||
"name": "Self-hosted LiveSync",
|
||||
"version": "0.22.4",
|
||||
"version": "0.22.14",
|
||||
"minAppVersion": "0.9.12",
|
||||
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
|
||||
"author": "vorotamoroz",
|
||||
"authorUrl": "https://github.com/vrtmrz",
|
||||
"isDesktopOnly": false
|
||||
}
|
||||
}
|
||||
|
||||
2674
package-lock.json
generated
54
package.json
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "obsidian-livesync",
|
||||
"version": "0.22.4",
|
||||
"version": "0.22.14",
|
||||
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
|
||||
"main": "main.js",
|
||||
"type": "module",
|
||||
@@ -13,29 +13,29 @@
|
||||
"author": "vorotamoroz",
|
||||
"license": "MIT",
|
||||
"devDependencies": {
|
||||
"@tsconfig/svelte": "^5.0.0",
|
||||
"@types/diff-match-patch": "^1.0.32",
|
||||
"@types/node": "^20.2.5",
|
||||
"@types/pouchdb": "^6.4.0",
|
||||
"@types/pouchdb-adapter-http": "^6.1.3",
|
||||
"@types/pouchdb-adapter-idb": "^6.1.4",
|
||||
"@types/pouchdb-browser": "^6.1.3",
|
||||
"@types/pouchdb-core": "^7.0.11",
|
||||
"@types/pouchdb-mapreduce": "^6.1.7",
|
||||
"@types/pouchdb-replication": "^6.4.4",
|
||||
"@types/transform-pouch": "^1.0.2",
|
||||
"@typescript-eslint/eslint-plugin": "^6.2.1",
|
||||
"@typescript-eslint/parser": "^6.2.1",
|
||||
"@tsconfig/svelte": "^5.0.2",
|
||||
"@types/diff-match-patch": "^1.0.36",
|
||||
"@types/node": "^20.11.28",
|
||||
"@types/pouchdb": "^6.4.2",
|
||||
"@types/pouchdb-adapter-http": "^6.1.6",
|
||||
"@types/pouchdb-adapter-idb": "^6.1.7",
|
||||
"@types/pouchdb-browser": "^6.1.5",
|
||||
"@types/pouchdb-core": "^7.0.14",
|
||||
"@types/pouchdb-mapreduce": "^6.1.10",
|
||||
"@types/pouchdb-replication": "^6.4.7",
|
||||
"@types/transform-pouch": "^1.0.6",
|
||||
"@typescript-eslint/eslint-plugin": "^7.2.0",
|
||||
"@typescript-eslint/parser": "^7.2.0",
|
||||
"builtin-modules": "^3.3.0",
|
||||
"esbuild": "0.18.17",
|
||||
"esbuild-svelte": "^0.7.4",
|
||||
"eslint": "^8.46.0",
|
||||
"esbuild": "0.20.2",
|
||||
"esbuild-svelte": "^0.8.0",
|
||||
"eslint": "^8.57.0",
|
||||
"eslint-config-airbnb-base": "^15.0.0",
|
||||
"eslint-plugin-import": "^2.28.0",
|
||||
"eslint-plugin-import": "^2.29.1",
|
||||
"events": "^3.3.0",
|
||||
"obsidian": "^1.4.11",
|
||||
"postcss": "^8.4.27",
|
||||
"postcss-load-config": "^4.0.1",
|
||||
"obsidian": "^1.5.7",
|
||||
"postcss": "^8.4.35",
|
||||
"postcss-load-config": "^5.0.3",
|
||||
"pouchdb-adapter-http": "^8.0.1",
|
||||
"pouchdb-adapter-idb": "^8.0.1",
|
||||
"pouchdb-adapter-indexeddb": "^8.0.1",
|
||||
@@ -46,16 +46,16 @@
|
||||
"pouchdb-merge": "^8.0.1",
|
||||
"pouchdb-replication": "^8.0.1",
|
||||
"pouchdb-utils": "^8.0.1",
|
||||
"svelte": "^4.1.2",
|
||||
"svelte-preprocess": "^5.0.4",
|
||||
"terser": "^5.19.2",
|
||||
"svelte": "^4.2.12",
|
||||
"svelte-preprocess": "^5.1.3",
|
||||
"terser": "^5.29.2",
|
||||
"transform-pouch": "^2.0.0",
|
||||
"tslib": "^2.6.1",
|
||||
"typescript": "^5.1.6"
|
||||
"tslib": "^2.6.2",
|
||||
"typescript": "^5.4.2"
|
||||
},
|
||||
"dependencies": {
|
||||
"diff-match-patch": "^1.0.5",
|
||||
"idb": "^7.1.1",
|
||||
"idb": "^8.0.0",
|
||||
"minimatch": "^9.0.3",
|
||||
"xxhash-wasm": "0.4.2",
|
||||
"xxhash-wasm-102": "npm:xxhash-wasm@^1.0.2"
|
||||
|
||||
151
setup-flyio-on-the-fly-v2.ipynb
Normal file
@@ -0,0 +1,151 @@
|
||||
{
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 0,
|
||||
"metadata": {
|
||||
"colab": {
|
||||
"provenance": [],
|
||||
"private_outputs": true,
|
||||
"authorship_tag": "ABX9TyMexQ5pErH5LBG2tENtEVWf",
|
||||
"include_colab_link": true
|
||||
},
|
||||
"kernelspec": {
|
||||
"name": "python3",
|
||||
"display_name": "Python 3"
|
||||
},
|
||||
"language_info": {
|
||||
"name": "python"
|
||||
}
|
||||
},
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"id": "view-in-github",
|
||||
"colab_type": "text"
|
||||
},
|
||||
"source": [
|
||||
"<a href=\"https://colab.research.google.com/gist/vrtmrz/9402b101746e08e969b1a4f5f0deb465/setup-flyio-on-the-fly-v2.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"- Initial version 7th Feb. 2024"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "AzLlAcLFRO5A"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "z1x8DQpa9opC"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Install prerequesties\n",
|
||||
"!curl -L https://fly.io/install.sh | sh\n",
|
||||
"!curl -fsSL https://deno.land/x/install/install.sh | sh\n",
|
||||
"!apt update && apt -y install jq\n",
|
||||
"import os\n",
|
||||
"%env PATH=/root/.fly/bin:/root/.deno/bin/:{os.environ[\"PATH\"]}\n",
|
||||
"!git clone --recursive https://github.com/vrtmrz/obsidian-livesync"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"# Login up sign up\n",
|
||||
"!flyctl auth signup"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "mGN08BaFDviy"
|
||||
},
|
||||
"execution_count": null,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"Select a region and execute the block."
|
||||
],
|
||||
"metadata": {
|
||||
"id": "BBFTFOP6vA8m"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"# see https://fly.io/docs/reference/regions/\n",
|
||||
"region = \"nrt/Tokyo, Japan\" #@param [\"ams/Amsterdam, Netherlands\",\"arn/Stockholm, Sweden\",\"atl/Atlanta, Georgia (US)\",\"bog/Bogotá, Colombia\",\"bos/Boston, Massachusetts (US)\",\"cdg/Paris, France\",\"den/Denver, Colorado (US)\",\"dfw/Dallas, Texas (US)\",\"ewr/Secaucus, NJ (US)\",\"eze/Ezeiza, Argentina\",\"gdl/Guadalajara, Mexico\",\"gig/Rio de Janeiro, Brazil\",\"gru/Sao Paulo, Brazil\",\"hkg/Hong Kong, Hong Kong\",\"iad/Ashburn, Virginia (US)\",\"jnb/Johannesburg, South Africa\",\"lax/Los Angeles, California (US)\",\"lhr/London, United Kingdom\",\"mad/Madrid, Spain\",\"mia/Miami, Florida (US)\",\"nrt/Tokyo, Japan\",\"ord/Chicago, Illinois (US)\",\"otp/Bucharest, Romania\",\"phx/Phoenix, Arizona (US)\",\"qro/Querétaro, Mexico\",\"scl/Santiago, Chile\",\"sea/Seattle, Washington (US)\",\"sin/Singapore, Singapore\",\"sjc/San Jose, California (US)\",\"syd/Sydney, Australia\",\"waw/Warsaw, Poland\",\"yul/Montreal, Canada\",\"yyz/Toronto, Canada\" ] {allow-input: true}\n",
|
||||
"%env region={region.split(\"/\")[0]}\n",
|
||||
"#%env appame=\n",
|
||||
"#%env username=\n",
|
||||
"#%env password=\n",
|
||||
"#%env database=\n",
|
||||
"#%env passphrase=\n",
|
||||
"\n",
|
||||
"# automatic setup leave it -->\n",
|
||||
"%cd obsidian-livesync/utils/flyio\n",
|
||||
"!./deploy-server.sh | tee deploy-result.txt\n",
|
||||
"\n",
|
||||
"## Show result button\n",
|
||||
"from IPython.display import HTML\n",
|
||||
"last_line=\"\"\n",
|
||||
"with open('deploy-result.txt', 'r') as f:\n",
|
||||
" last_line = f.readlines()[-1]\n",
|
||||
" last_line = str.strip(last_line)\n",
|
||||
"\n",
|
||||
"if last_line.startswith(\"obsidian://\"):\n",
|
||||
" result = HTML(f\"Copy your setup-URI with this button! -> <button onclick=\\\"navigator.clipboard.writeText('{last_line}')\\\">Copy setup uri</button><br>Importing passphrase is `welcome`. <br>If you want to synchronise in live mode, please apply a preset after ensuring the imported configuration works.\")\n",
|
||||
"else:\n",
|
||||
" result = \"Failed to encrypt the setup URI\"\n",
|
||||
"result"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "TNl0A603EF9E"
|
||||
},
|
||||
"execution_count": null,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"If you see the `Copy setup URI` button, Congratulations! Your CouchDB is ready to use! Please click the button. And open this on Obsidian.\n",
|
||||
"\n",
|
||||
"And, you should keep the output to your secret memo.\n",
|
||||
"\n"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "oeIzExnEKhFp"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"If you want to delete this CouchDB instance, you can do it by executing next cell. \n",
|
||||
"If your fly.toml has been gone, access https://fly.io/dashboard and check the existing app."
|
||||
],
|
||||
"metadata": {
|
||||
"id": "sdQrqOjERN3K"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"!./delete-server.sh"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "7JMSkNvVIIfg"
|
||||
},
|
||||
"execution_count": null,
|
||||
"outputs": []
|
||||
}
|
||||
]
|
||||
}
|
||||
@@ -1,13 +1,12 @@
|
||||
import { writable } from 'svelte/store';
|
||||
import { Notice, type PluginManifest, parseYaml, normalizePath } from "./deps";
|
||||
import { Notice, type PluginManifest, parseYaml, normalizePath, type ListedFiles } from "./deps";
|
||||
|
||||
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "./lib/src/types";
|
||||
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "./lib/src/types";
|
||||
import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
|
||||
import { createTextBlob, delay, getDocData, sendSignal, waitForSignal } from "./lib/src/utils";
|
||||
import { createSavingEntryFromLoadedEntry, createTextBlob, delay, fireAndForget, getDocData, isDocContentSame, sendSignal, waitForSignal } from "./lib/src/utils";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { WrappedNotice } from "./lib/src/wrapper";
|
||||
import { readString, decodeBinary, arrayBufferToBase64, sha1 } from "./lib/src/strbin";
|
||||
import { readString, decodeBinary, arrayBufferToBase64, digestHash } from "./lib/src/strbin";
|
||||
import { serialized } from "./lib/src/lock";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
import { stripAllPrefixes } from "./lib/src/path";
|
||||
@@ -31,7 +30,8 @@ function serialize(data: PluginDataEx): string {
|
||||
ret += data.mtime + d2;
|
||||
for (const file of data.files) {
|
||||
ret += file.filename + d + (file.displayName ?? "") + d + (file.version ?? "") + d2;
|
||||
ret += file.mtime + d + file.size + d2;
|
||||
const hash = digestHash((file.data ?? []).join());
|
||||
ret += file.mtime + d + file.size + d + hash + d2;
|
||||
for (const data of file.data ?? []) {
|
||||
ret += data + d
|
||||
}
|
||||
@@ -95,6 +95,7 @@ function deserialize2(str: string): PluginDataEx {
|
||||
tokens.nextLine();
|
||||
const mtime = Number(tokens.next());
|
||||
const size = Number(tokens.next());
|
||||
const hash = tokens.next();
|
||||
tokens.nextLine();
|
||||
const data = [] as string[];
|
||||
let piece = "";
|
||||
@@ -110,7 +111,8 @@ function deserialize2(str: string): PluginDataEx {
|
||||
version,
|
||||
mtime,
|
||||
size,
|
||||
data
|
||||
data,
|
||||
hash
|
||||
}
|
||||
)
|
||||
tokens.nextLine();
|
||||
@@ -137,10 +139,11 @@ export const pluginIsEnumerating = writable(false);
|
||||
|
||||
export type PluginDataExFile = {
|
||||
filename: string,
|
||||
data?: string[],
|
||||
data: string[],
|
||||
mtime: number,
|
||||
size: number,
|
||||
version?: string,
|
||||
hash?: string,
|
||||
displayName?: string,
|
||||
}
|
||||
export type PluginDataExDisplay = {
|
||||
@@ -169,19 +172,16 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
pluginScanningCount.onChanged((e) => {
|
||||
const total = e.value;
|
||||
pluginIsEnumerating.set(total != 0);
|
||||
if (total == 0) {
|
||||
Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
|
||||
}
|
||||
// if (total == 0) {
|
||||
// Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
|
||||
// }
|
||||
})
|
||||
}
|
||||
confirmPopup: WrappedNotice = null;
|
||||
get kvDB() {
|
||||
return this.plugin.kvDB;
|
||||
}
|
||||
ensureDirectoryEx(fullPath: string) {
|
||||
return this.plugin.ensureDirectoryEx(fullPath);
|
||||
}
|
||||
pluginDialog: PluginDialogModal = null;
|
||||
|
||||
pluginDialog?: PluginDialogModal = undefined;
|
||||
periodicPluginSweepProcessor = new PeriodicProcessor(this.plugin, async () => await this.scanAllConfigFiles(false));
|
||||
|
||||
pluginList: PluginDataExDisplay[] = [];
|
||||
@@ -189,7 +189,7 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
if (!this.settings.usePluginSync) {
|
||||
return;
|
||||
}
|
||||
if (this.pluginDialog != null) {
|
||||
if (this.pluginDialog) {
|
||||
this.pluginDialog.open();
|
||||
} else {
|
||||
this.pluginDialog = new PluginDialogModal(this.app, this.plugin);
|
||||
@@ -200,7 +200,7 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
hidePluginSyncModal() {
|
||||
if (this.pluginDialog != null) {
|
||||
this.pluginDialog.close();
|
||||
this.pluginDialog = null;
|
||||
this.pluginDialog = undefined;
|
||||
}
|
||||
}
|
||||
onunload() {
|
||||
@@ -275,16 +275,28 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
await this.updatePluginList(showMessage);
|
||||
}
|
||||
async loadPluginData(path: FilePathWithPrefix): Promise<PluginDataExDisplay | false> {
|
||||
const wx = await this.localDatabase.getDBEntry(path, null, false, false);
|
||||
const wx = await this.localDatabase.getDBEntry(path, undefined, false, false);
|
||||
if (wx) {
|
||||
const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
|
||||
const xFiles = [] as PluginDataExFile[];
|
||||
let missingHash = false;
|
||||
for (const file of data.files) {
|
||||
const work = { ...file };
|
||||
const tempStr = getDocData(work.data);
|
||||
work.data = [await sha1(tempStr)];
|
||||
const work = { ...file, data: [] as string[] };
|
||||
if (!file.hash) {
|
||||
// debugger;
|
||||
const tempStr = getDocData(work.data);
|
||||
const hash = digestHash(tempStr);
|
||||
file.hash = hash;
|
||||
missingHash = true;
|
||||
}
|
||||
work.data = [file.hash];
|
||||
xFiles.push(work);
|
||||
}
|
||||
if (missingHash) {
|
||||
Logger(`Digest created for ${path} to improve checking`, LOG_LEVEL_VERBOSE);
|
||||
wx.data = serialize(data);
|
||||
fireAndForget(() => this.localDatabase.putDBEntry(createSavingEntryFromLoadedEntry(wx)));
|
||||
}
|
||||
return ({
|
||||
...data,
|
||||
documentPath: this.getPath(wx),
|
||||
@@ -319,21 +331,21 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
const plugin = v[0];
|
||||
const path = plugin.path || this.getPath(plugin);
|
||||
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
|
||||
if (oldEntry && oldEntry.mtime == plugin.mtime) return;
|
||||
if (oldEntry && oldEntry.mtime == plugin.mtime) return [];
|
||||
try {
|
||||
const pluginData = await this.loadPluginData(path);
|
||||
if (pluginData) {
|
||||
return [pluginData];
|
||||
}
|
||||
// Failed to load
|
||||
return;
|
||||
return [];
|
||||
|
||||
} catch (ex) {
|
||||
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
return;
|
||||
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 300, yieldThreshold: 10 }).pipeTo(
|
||||
return [];
|
||||
}, { suspended: false, batchSize: 1, concurrentLimit: 5, delay: 100, yieldThreshold: 10, maintainDelay: false }).pipeTo(
|
||||
new QueueProcessor(
|
||||
async (pluginDataList) => {
|
||||
// Concurrency is two, therefore, we can unlock the previous awaiting.
|
||||
@@ -351,8 +363,8 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
}
|
||||
return;
|
||||
}
|
||||
, { suspended: true, batchSize: 10, concurrentLimit: 2, delay: 250, yieldThreshold: 25, totalRemainingReactiveSource: pluginScanningCount })).startPipeline().root.onIdle(() => {
|
||||
Logger(`All files enumerated`, LOG_LEVEL_INFO, "get-plugins");
|
||||
, { suspended: false, batchSize: 10, concurrentLimit: 2, delay: 100, yieldThreshold: 25, totalRemainingReactiveSource: pluginScanningCount, maintainDelay: false })).startPipeline().root.onIdle(() => {
|
||||
// Logger(`All files enumerated`, LOG_LEVEL_INFO, "get-plugins");
|
||||
this.createMissingConfigurationEntry();
|
||||
});
|
||||
|
||||
@@ -433,7 +445,7 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
try {
|
||||
// console.dir(f);
|
||||
const path = `${baseDir}/${f.filename}`;
|
||||
await this.ensureDirectoryEx(path);
|
||||
await this.vaultAccess.ensureDirectory(path);
|
||||
if (!content) {
|
||||
const dt = decodeBinary(f.data);
|
||||
await this.vaultAccess.adapterWrite(path, dt);
|
||||
@@ -504,9 +516,9 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
if (this.plugin.settings.usePluginSync && this.plugin.settings.notifyPluginOrSettingUpdated) {
|
||||
if (!this.pluginDialog || (this.pluginDialog && !this.pluginDialog.isOpened())) {
|
||||
const fragment = createFragment((doc) => {
|
||||
doc.createEl("span", null, (a) => {
|
||||
doc.createEl("span", undefined, (a) => {
|
||||
a.appendText(`Some configuration has been arrived, Press `);
|
||||
a.appendChild(a.createEl("a", null, (anchor) => {
|
||||
a.appendChild(a.createEl("a", undefined, (anchor) => {
|
||||
anchor.text = "HERE";
|
||||
anchor.addEventListener("click", () => {
|
||||
this.showPluginSyncModal();
|
||||
@@ -672,7 +684,7 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
|
||||
const content = createTextBlob(serialize(dt));
|
||||
try {
|
||||
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false);
|
||||
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, false);
|
||||
let saveData: SavingEntry;
|
||||
if (old === false) {
|
||||
saveData = {
|
||||
@@ -689,9 +701,21 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
};
|
||||
} else {
|
||||
if (old.mtime == mtime) {
|
||||
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
|
||||
// Logger(`STORAGE --> DB:${prefixedFileName}: (config) Skipped (Same time)`, LOG_LEVEL_VERBOSE);
|
||||
return true;
|
||||
}
|
||||
const oldC = await this.localDatabase.getDBEntryFromMeta(old, {}, false, false);
|
||||
if (oldC) {
|
||||
const d = await deserialize(getDocData(oldC.data), {}) as PluginDataEx;
|
||||
const diffs = (d.files.map(previous => ({ prev: previous, curr: dt.files.find(e => e.filename == previous.filename) })).map(async e => {
|
||||
try { return await isDocContentSame(e.curr?.data ?? [], e.prev.data) } catch (_) { return false }
|
||||
}))
|
||||
const isSame = (await Promise.all(diffs)).every(e => e == true);
|
||||
if (isSame) {
|
||||
Logger(`STORAGE --> DB:${prefixedFileName}: (config) Skipped (Same content)`, LOG_LEVEL_VERBOSE);
|
||||
return true;
|
||||
}
|
||||
}
|
||||
saveData =
|
||||
{
|
||||
...old,
|
||||
@@ -757,7 +781,11 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICXHeader + "", endkey: `${ICXHeader}\u{10ffff}`, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
|
||||
let deleteCandidate = filesOnDB.map(e => this.getPath(e)).filter(e => e.startsWith(`${ICXHeader}${term}/`));
|
||||
for (const vp of virtualPathsOfLocalFiles) {
|
||||
const p = files.find(e => e.key == vp).file;
|
||||
const p = files.find(e => e.key == vp)?.file;
|
||||
if (!p) {
|
||||
Logger(`scanAllConfigFiles - File not found: ${vp}`, LOG_LEVEL_VERBOSE);
|
||||
continue;
|
||||
}
|
||||
await this.storeCustomizationFiles(p);
|
||||
deleteCandidate = deleteCandidate.filter(e => e != vp);
|
||||
}
|
||||
@@ -772,10 +800,11 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
const mtime = new Date().getTime();
|
||||
await serialized("file-x-" + prefixedFileName, async () => {
|
||||
try {
|
||||
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false) as InternalFileEntry | false;
|
||||
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, false) as InternalFileEntry | false;
|
||||
let saveData: InternalFileEntry;
|
||||
if (old === false) {
|
||||
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) already deleted (Not found on database)`);
|
||||
return;
|
||||
} else {
|
||||
if (old.deleted) {
|
||||
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) already deleted`);
|
||||
@@ -814,7 +843,14 @@ export class ConfigSync extends LiveSyncCommands {
|
||||
lastDepth: number
|
||||
) {
|
||||
if (lastDepth == -1) return [];
|
||||
const w = await this.app.vault.adapter.list(path);
|
||||
let w: ListedFiles;
|
||||
try {
|
||||
w = await this.app.vault.adapter.list(path);
|
||||
} catch (ex) {
|
||||
Logger(`Could not traverse(ConfigSync):${path}`, LOG_LEVEL_INFO);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return [];
|
||||
}
|
||||
let files = [
|
||||
...w.files
|
||||
];
|
||||
|
||||
@@ -1,12 +1,10 @@
|
||||
import { normalizePath, type PluginManifest } from "./deps";
|
||||
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry } from "./lib/src/types";
|
||||
import { normalizePath, type PluginManifest, type ListedFiles } from "./deps";
|
||||
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry, type DocumentID } from "./lib/src/types";
|
||||
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
|
||||
import { createBinaryBlob, isDocContentSame, sendSignal } from "./lib/src/utils";
|
||||
import { readAsBlob, isDocContentSame, sendSignal, readContent, createBlob } from "./lib/src/utils";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { PouchDB } from "./lib/src/pouchdb-browser.js";
|
||||
import { isInternalMetadata, PeriodicProcessor } from "./utils";
|
||||
import { WrappedNotice } from "./lib/src/wrapper";
|
||||
import { decodeBinary, encodeBinary } from "./lib/src/strbin";
|
||||
import { serialized } from "./lib/src/lock";
|
||||
import { JsonResolveModal } from "./JsonResolveModal";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
@@ -16,13 +14,10 @@ import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "./lib/src/sto
|
||||
|
||||
export class HiddenFileSync extends LiveSyncCommands {
|
||||
periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
|
||||
confirmPopup: WrappedNotice = null;
|
||||
|
||||
get kvDB() {
|
||||
return this.plugin.kvDB;
|
||||
}
|
||||
ensureDirectoryEx(fullPath: string) {
|
||||
return this.plugin.ensureDirectoryEx(fullPath);
|
||||
}
|
||||
getConflictedDoc(path: FilePathWithPrefix, rev: string) {
|
||||
return this.plugin.getConflictedDoc(path, rev);
|
||||
}
|
||||
@@ -70,11 +65,11 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
realizeSettingSyncMode(): Promise<void> {
|
||||
this.periodicInternalFileScanProcessor?.disable();
|
||||
if (this.plugin.suspended)
|
||||
return;
|
||||
return Promise.resolve();
|
||||
if (!this.plugin.isReady)
|
||||
return;
|
||||
return Promise.resolve();
|
||||
this.periodicInternalFileScanProcessor.enable(this.settings.syncInternalFiles && this.settings.syncInternalFilesInterval ? (this.settings.syncInternalFilesInterval * 1000) : 0);
|
||||
return;
|
||||
return Promise.resolve();
|
||||
}
|
||||
|
||||
procInternalFile(filename: string) {
|
||||
@@ -102,11 +97,13 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
}
|
||||
const stat = await this.vaultAccess.adapterStat(path);
|
||||
// sometimes folder is coming.
|
||||
if (stat && stat.type != "file")
|
||||
if (stat != null && stat.type != "file") {
|
||||
return;
|
||||
const storageMTime = ~~((stat && stat.mtime || 0) / 1000);
|
||||
}
|
||||
const mtime = stat == null ? 0 : stat?.mtime ?? 0;
|
||||
const storageMTime = ~~((mtime) / 1000);
|
||||
const key = `${path}-${storageMTime}`;
|
||||
if (this.recentProcessedInternalFiles.contains(key)) {
|
||||
if (mtime != 0 && this.recentProcessedInternalFiles.contains(key)) {
|
||||
//If recently processed, it may caused by self.
|
||||
return;
|
||||
}
|
||||
@@ -126,7 +123,7 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
if (storageMTime == 0) {
|
||||
await this.deleteInternalFileOnDatabase(path);
|
||||
} else {
|
||||
await this.storeInternalFileToDatabase({ path: path, ...stat });
|
||||
await this.storeInternalFileToDatabase({ path: path, mtime, ctime: stat?.ctime ?? mtime, size: stat?.size ?? 0 });
|
||||
}
|
||||
|
||||
}
|
||||
@@ -150,6 +147,27 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
await this.conflictResolutionProcessor.startPipeline().waitForPipeline();
|
||||
}
|
||||
|
||||
async resolveByNewerEntry(id: DocumentID, path: FilePathWithPrefix, currentDoc: EntryDoc, currentRev: string, conflictedRev: string) {
|
||||
const conflictedDoc = await this.localDatabase.getRaw(id, { rev: conflictedRev });
|
||||
// determine which revision should been deleted.
|
||||
// simply check modified time
|
||||
const mtimeCurrent = ("mtime" in currentDoc && currentDoc.mtime) || 0;
|
||||
const mtimeConflicted = ("mtime" in conflictedDoc && conflictedDoc.mtime) || 0;
|
||||
// Logger(`Revisions:${new Date(mtimeA).toLocaleString} and ${new Date(mtimeB).toLocaleString}`);
|
||||
// console.log(`mtime:${mtimeA} - ${mtimeB}`);
|
||||
const delRev = mtimeCurrent < mtimeConflicted ? currentRev : conflictedRev;
|
||||
// delete older one.
|
||||
await this.localDatabase.removeRevision(id, delRev);
|
||||
Logger(`Older one has been deleted:${path}`);
|
||||
const cc = await this.localDatabase.getRaw(id, { conflicts: true });
|
||||
if (cc._conflicts?.length === 0) {
|
||||
await this.extractInternalFileFromDatabase(stripAllPrefixes(path))
|
||||
} else {
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
}
|
||||
// check the file again
|
||||
|
||||
}
|
||||
conflictResolutionProcessor = new QueueProcessor(async (paths: FilePathWithPrefix[]) => {
|
||||
const path = paths[0];
|
||||
sendSignal(`cancel-internal-conflict:${path}`);
|
||||
@@ -157,11 +175,12 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
// Retrieve data
|
||||
const id = await this.path2id(path, ICHeader);
|
||||
const doc = await this.localDatabase.getRaw(id, { conflicts: true });
|
||||
// If there is no conflict, return with false.
|
||||
if (!("_conflicts" in doc))
|
||||
return;
|
||||
// if (!("_conflicts" in doc)){
|
||||
// return [];
|
||||
// }
|
||||
if (doc._conflicts === undefined) return [];
|
||||
if (doc._conflicts.length == 0)
|
||||
return;
|
||||
return [];
|
||||
Logger(`Hidden file conflicted:${path}`);
|
||||
const conflicts = doc._conflicts.sort((a, b) => Number(a.split("-")[0]) - Number(b.split("-")[0]));
|
||||
const revA = doc._rev;
|
||||
@@ -172,50 +191,42 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
const conflictedRevNo = Number(conflictedRev.split("-")[0]);
|
||||
//Search
|
||||
const revFrom = (await this.localDatabase.getRaw<EntryDoc>(id, { revs_info: true }));
|
||||
const commonBase = revFrom._revs_info.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
|
||||
const commonBase = revFrom._revs_info?.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
|
||||
const result = await this.plugin.mergeObject(path, commonBase, doc._rev, conflictedRev);
|
||||
if (result) {
|
||||
Logger(`Object merge:${path}`, LOG_LEVEL_INFO);
|
||||
const filename = stripAllPrefixes(path);
|
||||
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
|
||||
if (!isExists) {
|
||||
await this.ensureDirectoryEx(filename);
|
||||
await this.vaultAccess.ensureDirectory(filename);
|
||||
}
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, result);
|
||||
const stat = await this.vaultAccess.adapterStat(filename);
|
||||
if (!stat) {
|
||||
throw new Error(`conflictResolutionProcessor: Failed to stat file ${filename}`);
|
||||
}
|
||||
await this.storeInternalFileToDatabase({ path: filename, ...stat });
|
||||
await this.extractInternalFileFromDatabase(filename);
|
||||
await this.localDatabase.removeRaw(id, revB);
|
||||
await this.localDatabase.removeRevision(id, revB);
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
return;
|
||||
return [];
|
||||
} else {
|
||||
Logger(`Object merge is not applicable.`, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
return [{ path, revA, revB }];
|
||||
return [{ path, revA, revB, id, doc }];
|
||||
}
|
||||
const revBDoc = await this.localDatabase.getRaw(id, { rev: revB });
|
||||
// determine which revision should been deleted.
|
||||
// simply check modified time
|
||||
const mtimeA = ("mtime" in doc && doc.mtime) || 0;
|
||||
const mtimeB = ("mtime" in revBDoc && revBDoc.mtime) || 0;
|
||||
// Logger(`Revisions:${new Date(mtimeA).toLocaleString} and ${new Date(mtimeB).toLocaleString}`);
|
||||
// console.log(`mtime:${mtimeA} - ${mtimeB}`);
|
||||
const delRev = mtimeA < mtimeB ? revA : revB;
|
||||
// delete older one.
|
||||
await this.localDatabase.removeRaw(id, delRev);
|
||||
Logger(`Older one has been deleted:${path}`);
|
||||
// check the file again
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
return;
|
||||
// When not JSON file, resolve conflicts by choosing a newer one.
|
||||
await this.resolveByNewerEntry(id, path, doc, revA, revB);
|
||||
return [];
|
||||
} catch (ex) {
|
||||
Logger(`Failed to resolve conflict (Hidden): ${path}`);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return;
|
||||
return [];
|
||||
}
|
||||
}, {
|
||||
suspended: false, batchSize: 1, concurrentLimit: 5, delay: 10, keepResultUntilDownstreamConnected: true, yieldThreshold: 10,
|
||||
pipeTo: new QueueProcessor(async (results) => {
|
||||
const { path, revA, revB } = results[0]
|
||||
const { id, doc, path, revA, revB } = results[0];
|
||||
const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
|
||||
const docBMerge = await this.localDatabase.getDBEntry(path, { rev: revB });
|
||||
if (docAMerge != false && docBMerge != false) {
|
||||
@@ -224,6 +235,9 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
}
|
||||
return;
|
||||
} else {
|
||||
// If either revision could not read, force resolving by the newer one.
|
||||
await this.resolveByNewerEntry(id, path, doc, revA, revB);
|
||||
}
|
||||
}, { suspended: false, batchSize: 1, concurrentLimit: 1, delay: 10, keepResultUntilDownstreamConnected: false, yieldThreshold: 10 })
|
||||
})
|
||||
@@ -300,11 +314,11 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
if (processed % 100 == 0) {
|
||||
Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
|
||||
}
|
||||
if (!filename) return;
|
||||
if (!filename) return [];
|
||||
if (ignorePatterns.some(e => filename.match(e)))
|
||||
return;
|
||||
return [];
|
||||
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
|
||||
return;
|
||||
return [];
|
||||
}
|
||||
|
||||
const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
|
||||
@@ -361,7 +375,11 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
}
|
||||
}
|
||||
} else if (xFileOnStorage && !xFileOnDatabase) {
|
||||
await this.storeInternalFileToDatabase(xFileOnStorage);
|
||||
if (direction == "push" || direction == "pushForce" || direction == "safe") {
|
||||
await this.storeInternalFileToDatabase(xFileOnStorage);
|
||||
} else {
|
||||
await this.extractInternalFileFromDatabase(xFileOnStorage.path);
|
||||
}
|
||||
} else {
|
||||
throw new Error("Invalid state on hidden file sync");
|
||||
// Something corrupted?
|
||||
@@ -387,7 +405,7 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
|
||||
const enabledPluginManifests = manifests.filter(e => enabledPlugins.has(e.id));
|
||||
for (const manifest of enabledPluginManifests) {
|
||||
if (manifest.dir in updatedFolders) {
|
||||
if (manifest.dir && manifest.dir in updatedFolders) {
|
||||
// If notified about plug-ins, reloading Obsidian may not be necessary.
|
||||
updatedCount -= updatedFolders[manifest.dir];
|
||||
const updatePluginId = manifest.id;
|
||||
@@ -435,19 +453,11 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
|
||||
const id = await this.path2id(file.path, ICHeader);
|
||||
const prefixedFileName = addPrefix(file.path, ICHeader);
|
||||
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(file.path);
|
||||
let content: Blob;
|
||||
try {
|
||||
content = createBinaryBlob(contentBin);
|
||||
} catch (ex) {
|
||||
Logger(`The file ${file.path} could not be encoded`);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return false;
|
||||
}
|
||||
const content = createBlob(await this.plugin.vaultAccess.adapterReadAuto(file.path));
|
||||
const mtime = file.mtime;
|
||||
return await serialized("file-" + prefixedFileName, async () => {
|
||||
try {
|
||||
const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
|
||||
const old = await this.localDatabase.getDBEntry(prefixedFileName, undefined, false, false);
|
||||
let saveData: SavingEntry;
|
||||
if (old === false) {
|
||||
saveData = {
|
||||
@@ -463,7 +473,7 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
type: "newnote",
|
||||
};
|
||||
} else {
|
||||
if (await isDocContentSame(old.data, content) && !forceWrite) {
|
||||
if (await isDocContentSame(readAsBlob(old), content) && !forceWrite) {
|
||||
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
|
||||
return;
|
||||
}
|
||||
@@ -473,13 +483,13 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
data: content,
|
||||
mtime,
|
||||
size: file.size,
|
||||
datatype: "newnote",
|
||||
datatype: old.datatype,
|
||||
children: [],
|
||||
deleted: false,
|
||||
type: "newnote",
|
||||
type: old.datatype,
|
||||
};
|
||||
}
|
||||
const ret = await this.localDatabase.putDBEntry(saveData, true);
|
||||
const ret = await this.localDatabase.putDBEntry(saveData);
|
||||
Logger(`STORAGE --> DB:${file.path}: (hidden) Done`);
|
||||
return ret;
|
||||
} catch (ex) {
|
||||
@@ -499,7 +509,7 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
}
|
||||
await serialized("file-" + prefixedFileName, async () => {
|
||||
try {
|
||||
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, true) as InternalFileEntry | false;
|
||||
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, true) as InternalFileEntry | false;
|
||||
let saveData: InternalFileEntry;
|
||||
if (old === false) {
|
||||
saveData = {
|
||||
@@ -513,6 +523,14 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
type: "newnote",
|
||||
};
|
||||
} else {
|
||||
// Remove all conflicted before deleting.
|
||||
const conflicts = await this.localDatabase.getRaw(old._id, { conflicts: true });
|
||||
if (conflicts._conflicts !== undefined) {
|
||||
for (const conflictRev of conflicts._conflicts) {
|
||||
await this.localDatabase.removeRevision(old._id, conflictRev);
|
||||
Logger(`STORAGE -x> DB:${filename}: (hidden) conflict removed ${old._rev} => ${conflictRev}`, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
}
|
||||
if (old.deleted) {
|
||||
Logger(`STORAGE -x> DB:${filename}: (hidden) already deleted`);
|
||||
return;
|
||||
@@ -546,8 +564,7 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
return await serialized("file-" + prefixedFileName, async () => {
|
||||
try {
|
||||
// Check conflicted status
|
||||
//TODO option
|
||||
const fileOnDB = await this.localDatabase.getDBEntry(prefixedFileName, { conflicts: true }, false, true);
|
||||
const fileOnDB = await this.localDatabase.getDBEntry(prefixedFileName, { conflicts: true }, false, true, true);
|
||||
if (fileOnDB === false)
|
||||
throw new Error(`File not found on database.:${filename}`);
|
||||
// Prevent overwrite for Prevent overwriting while some conflicted revision exists.
|
||||
@@ -555,10 +572,10 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
Logger(`Hidden file ${filename} has conflicted revisions, to keep in safe, writing to storage has been prevented`, LOG_LEVEL_INFO);
|
||||
return;
|
||||
}
|
||||
const deleted = "deleted" in fileOnDB ? fileOnDB.deleted : false;
|
||||
const deleted = fileOnDB.deleted || fileOnDB._deleted || false;
|
||||
if (deleted) {
|
||||
if (!isExists) {
|
||||
Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
|
||||
Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
|
||||
} else {
|
||||
Logger(`STORAGE <x- DB:${filename}: deleted (hidden).`);
|
||||
await this.plugin.vaultAccess.adapterRemove(filename);
|
||||
@@ -573,8 +590,8 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
return true;
|
||||
}
|
||||
if (!isExists) {
|
||||
await this.ensureDirectoryEx(filename);
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
|
||||
await this.vaultAccess.ensureDirectory(filename);
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, readContent(fileOnDB), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
|
||||
try {
|
||||
//@ts-ignore internalAPI
|
||||
await this.app.vault.adapter.reconcileInternalFile(filename);
|
||||
@@ -585,13 +602,13 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
Logger(`STORAGE <-- DB:${filename}: written (hidden,new${force ? ", force" : ""})`);
|
||||
return true;
|
||||
} else {
|
||||
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(filename);
|
||||
const content = await encodeBinary(contentBin);
|
||||
if (await isDocContentSame(content, fileOnDB.data) && !force) {
|
||||
const content = await this.plugin.vaultAccess.adapterReadAuto(filename);
|
||||
const docContent = readContent(fileOnDB);
|
||||
if (await isDocContentSame(content, docContent) && !force) {
|
||||
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
|
||||
return true;
|
||||
}
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, docContent, { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
|
||||
try {
|
||||
//@ts-ignore internalAPI
|
||||
await this.app.vault.adapter.reconcileInternalFile(filename);
|
||||
@@ -642,11 +659,15 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
if (!keep && result) {
|
||||
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
|
||||
if (!isExists) {
|
||||
await this.ensureDirectoryEx(filename);
|
||||
await this.vaultAccess.ensureDirectory(filename);
|
||||
}
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, result);
|
||||
const stat = await this.plugin.vaultAccess.adapterStat(filename);
|
||||
await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
|
||||
if (!stat) {
|
||||
throw new Error("Stat failed");
|
||||
}
|
||||
const mtime = stat?.mtime ?? 0;
|
||||
await this.storeInternalFileToDatabase({ path: filename, mtime, ctime: stat?.ctime ?? mtime, size: stat?.size ?? 0 }, true);
|
||||
try {
|
||||
//@ts-ignore internalAPI
|
||||
await this.app.vault.adapter.reconcileInternalFile(filename);
|
||||
@@ -680,7 +701,7 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
const root = this.app.vault.getRoot();
|
||||
const findRoot = root.path;
|
||||
|
||||
const filenames = (await this.getFiles(findRoot, [], null, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
|
||||
const filenames = (await this.getFiles(findRoot, [], undefined, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
|
||||
const files = filenames.filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile))).map(async (e) => {
|
||||
return {
|
||||
path: e as FilePath,
|
||||
@@ -693,9 +714,12 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
if (await this.plugin.isIgnoredByIgnoreFiles(w.path)) {
|
||||
continue
|
||||
}
|
||||
const mtime = w.stat?.mtime ?? 0
|
||||
const ctime = w.stat?.ctime ?? mtime;
|
||||
const size = w.stat?.size ?? 0;
|
||||
result.push({
|
||||
...w,
|
||||
...w.stat
|
||||
mtime, ctime, size
|
||||
});
|
||||
}
|
||||
return result;
|
||||
@@ -706,11 +730,17 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
async getFiles(
|
||||
path: string,
|
||||
ignoreList: string[],
|
||||
filter: RegExp[],
|
||||
ignoreFilter: RegExp[]
|
||||
filter?: RegExp[],
|
||||
ignoreFilter?: RegExp[]
|
||||
) {
|
||||
|
||||
const w = await this.app.vault.adapter.list(path);
|
||||
let w: ListedFiles;
|
||||
try {
|
||||
w = await this.app.vault.adapter.list(path);
|
||||
} catch (ex) {
|
||||
Logger(`Could not traverse(HiddenSync):${path}`, LOG_LEVEL_INFO);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return [];
|
||||
}
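// If a directory cannot be listed (for example, it disappeared mid-scan or the platform
// refuses access), the scan simply skips that subtree and returns no entries for it,
// instead of aborting the whole hidden-file enumeration.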
const filesSrc = [
|
||||
...w.files
|
||||
.filter((e) => !ignoreList.some((ee) => e.endsWith(ee)))
|
||||
|
||||
@@ -5,7 +5,7 @@ import { PouchDB } from "./lib/src/pouchdb-browser.js";
|
||||
import { askSelectString, askYesNo, askString } from "./utils";
|
||||
import { decrypt, encrypt } from "./lib/src/e2ee_v2";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
import { delay } from "./lib/src/utils";
|
||||
import { delay, fireAndForget } from "./lib/src/utils";
|
||||
import { confirmWithMessage } from "./dialogs";
|
||||
import { Platform } from "./deps";
|
||||
import { fetchAllUsedChunks } from "./lib/src/utils_couchdb";
|
||||
@@ -17,25 +17,25 @@ export class SetupLiveSync extends LiveSyncCommands {
|
||||
|
||||
this.plugin.addCommand({
|
||||
id: "livesync-copysetupuri",
|
||||
name: "Copy the setup URI",
|
||||
callback: this.command_copySetupURI.bind(this),
|
||||
name: "Copy settings as a new setup URI",
|
||||
callback: () => fireAndForget(this.command_copySetupURI()),
|
||||
});
|
||||
this.plugin.addCommand({
|
||||
id: "livesync-copysetupuri-short",
|
||||
name: "Copy the setup URI (With customization sync)",
|
||||
callback: this.command_copySetupURIWithSync.bind(this),
|
||||
name: "Copy settings as a new setup URI (With customization sync)",
|
||||
callback: () => fireAndForget(this.command_copySetupURIWithSync()),
|
||||
});
|
||||
|
||||
this.plugin.addCommand({
|
||||
id: "livesync-copysetupurifull",
|
||||
name: "Copy the setup URI (Full)",
|
||||
callback: this.command_copySetupURIFull.bind(this),
|
||||
name: "Copy settings as a new setup URI (Full)",
|
||||
callback: () => fireAndForget(this.command_copySetupURIFull()),
|
||||
});
|
||||
|
||||
this.plugin.addCommand({
|
||||
id: "livesync-opensetupuri",
|
||||
name: "Open the setup URI",
|
||||
callback: this.command_openSetupURI.bind(this),
|
||||
name: "Use the copied setup URI (Formerly Open setup URI)",
|
||||
callback: () => fireAndForget(this.command_openSetupURI()),
|
||||
});
|
||||
}
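// The command callbacks above hand their async handlers to `fireAndForget`, so Obsidian
// receives a plain synchronous callback and no promise is left floating (assuming
// `fireAndForget` simply runs the promise and reports any rejection through the logger).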
onInitializeDatabase(showNotice: boolean) { }
|
||||
@@ -76,7 +76,7 @@ export class SetupLiveSync extends LiveSyncCommands {
|
||||
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
|
||||
}
|
||||
async command_copySetupURIWithSync() {
|
||||
this.command_copySetupURI(false);
|
||||
await this.command_copySetupURI(false);
|
||||
}
|
||||
async command_openSetupURI() {
|
||||
const setupURI = await askString(this.app, "Easy setup", "Set up URI", `${configURIBase}aaaaa`);
|
||||
@@ -108,6 +108,7 @@ export class SetupLiveSync extends LiveSyncCommands {
|
||||
newSettingW.configPassphraseStore = "";
|
||||
newSettingW.encryptedPassphrase = "";
|
||||
newSettingW.encryptedCouchDBConnection = "";
|
||||
newSettingW.additionalSuffixOfDatabaseName = `${("appId" in this.app ? this.app.appId : "")}`
|
||||
const setupJustImport = "Just import setting";
|
||||
const setupAsNew = "Set it up as secondary or subsequent device";
|
||||
const setupAsMerge = "Secondary device but try keeping local changes";
|
||||
@@ -115,6 +116,7 @@ export class SetupLiveSync extends LiveSyncCommands {
|
||||
const setupManually = "Leave everything to me";
|
||||
newSettingW.syncInternalFiles = false;
|
||||
newSettingW.usePluginSync = false;
|
||||
newSettingW.isConfigured = true;
|
||||
// Migrate completely obsoleted configuration.
|
||||
if (!newSettingW.useIndexedDBAdapter) {
|
||||
newSettingW.useIndexedDBAdapter = true;
|
||||
@@ -132,7 +134,7 @@ export class SetupLiveSync extends LiveSyncCommands {
|
||||
} else if (setupType == setupAsMerge) {
|
||||
this.plugin.settings = newSettingW;
|
||||
this.plugin.usedPassphrase = "";
|
||||
await this.fetchLocalWithKeepLocal();
|
||||
await this.fetchLocalWithRebuild();
|
||||
} else if (setupType == setupAgain) {
|
||||
const confirm = "I know this operation will rebuild all my databases with files on this device, and files that are on the remote database and I didn't synchronize to any other devices will be lost and want to proceed indeed.";
|
||||
if (await askSelectString(this.app, "Do you really want to do this?", ["Cancel", confirm]) != confirm) {
|
||||
@@ -333,10 +335,18 @@ Of course, we are able to disable these features.`
|
||||
|
||||
const ret = await confirmWithMessage(this.plugin, "Database adapter", message, choices, CHOICE_YES, 10);
|
||||
if (ret == CHOICE_YES) {
|
||||
this.plugin.settings.useIndexedDBAdapter = false;
|
||||
this.plugin.settings.useIndexedDBAdapter = true;
|
||||
}
|
||||
}
|
||||
}
|
||||
async resetLocalDatabase() {
|
||||
if (this.plugin.settings.isConfigured && this.plugin.settings.additionalSuffixOfDatabaseName == "") {
|
||||
// Discard the non-suffixed database
|
||||
await this.plugin.resetLocalDatabase();
|
||||
}
|
||||
this.plugin.settings.additionalSuffixOfDatabaseName = `${("appId" in this.app ? this.app.appId : "")}`
|
||||
await this.plugin.resetLocalDatabase();
|
||||
}
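// A sketch of what the wrapper above does: if the plug-in was configured without a
// database suffix, the non-suffixed database is discarded first, then the suffix is
// switched to this vault's appId and the (now suffixed) local database is reset, so
// each vault ends up with its own uniquely named local database.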
async fetchRemoteChunks() {
|
||||
if (!this.plugin.settings.doNotSuspendOnFetching && this.plugin.settings.readChunksOnline) {
|
||||
Logger(`Fetching chunks`, LOG_LEVEL_NOTICE);
|
||||
@@ -349,45 +359,36 @@ Of course, we are able to disable these features.`
|
||||
Logger(`Fetching chunks done`, LOG_LEVEL_NOTICE);
|
||||
}
|
||||
}
|
||||
async fetchLocal() {
|
||||
async fetchLocal(makeLocalChunkBeforeSync?: boolean) {
|
||||
this.suspendExtraSync();
|
||||
this.askUseNewAdapter();
|
||||
await this.askUseNewAdapter();
|
||||
this.plugin.settings.isConfigured = true;
|
||||
await this.suspendReflectingDatabase();
|
||||
await this.plugin.realizeSettingSyncMode();
|
||||
await this.plugin.resetLocalDatabase();
|
||||
await this.resetLocalDatabase();
|
||||
await delay(1000);
|
||||
await this.plugin.markRemoteResolved();
|
||||
await this.plugin.openDatabase();
|
||||
this.plugin.isReady = true;
|
||||
if (makeLocalChunkBeforeSync) {
|
||||
await this.plugin.initializeDatabase(true);
|
||||
}
|
||||
await this.plugin.markRemoteResolved();
|
||||
await delay(500);
|
||||
await this.plugin.replicateAllFromServer(true);
|
||||
await delay(1000);
|
||||
await this.plugin.replicateAllFromServer(true);
|
||||
await this.fetchRemoteChunks();
|
||||
// if (!tryLessFetching) {
|
||||
// await this.fetchRemoteChunks();
|
||||
// }
|
||||
await this.resumeReflectingDatabase();
|
||||
await this.askHiddenFileConfiguration({ enableFetch: true });
|
||||
}
|
||||
async fetchLocalWithKeepLocal() {
|
||||
this.suspendExtraSync();
|
||||
this.askUseNewAdapter();
|
||||
await this.suspendReflectingDatabase();
|
||||
await this.plugin.realizeSettingSyncMode();
|
||||
await this.plugin.resetLocalDatabase();
|
||||
await delay(1000);
|
||||
await this.plugin.initializeDatabase(true);
|
||||
await this.plugin.markRemoteResolved();
|
||||
await this.plugin.openDatabase();
|
||||
this.plugin.isReady = true;
|
||||
await delay(500);
|
||||
await this.plugin.replicateAllFromServer(true);
|
||||
await delay(1000);
|
||||
await this.plugin.replicateAllFromServer(true);
|
||||
await this.fetchRemoteChunks();
|
||||
await this.resumeReflectingDatabase();
|
||||
await this.askHiddenFileConfiguration({ enableFetch: true });
|
||||
async fetchLocalWithRebuild() {
|
||||
return await this.fetchLocal(true);
|
||||
}
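// `fetchLocalWithRebuild` is the same fetch sequence with `makeLocalChunkBeforeSync`
// enabled: chunks are generated from the files already on this device before pulling
// from the remote, which is what "Secondary device but try keeping local changes"
// now relies on instead of the former `fetchLocalWithKeepLocal`.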
async rebuildRemote() {
|
||||
this.suspendExtraSync();
|
||||
this.plugin.settings.isConfigured = true;
|
||||
await this.plugin.realizeSettingSyncMode();
|
||||
await this.plugin.markRemoteLocked();
|
||||
await this.plugin.tryResetRemoteDatabase();
|
||||
@@ -401,9 +402,10 @@ Of course, we are able to disable these features.`
|
||||
}
|
||||
async rebuildEverything() {
|
||||
this.suspendExtraSync();
|
||||
this.askUseNewAdapter();
|
||||
await this.askUseNewAdapter();
|
||||
this.plugin.settings.isConfigured = true;
|
||||
await this.plugin.realizeSettingSyncMode();
|
||||
await this.plugin.resetLocalDatabase();
|
||||
await this.resetLocalDatabase();
|
||||
await delay(1000);
|
||||
await this.plugin.initializeDatabase(true);
|
||||
await this.plugin.markRemoteLocked();
|
||||
|
||||
@@ -5,7 +5,7 @@ import ObsidianLiveSyncPlugin from "./main";
|
||||
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
|
||||
import { getDocData } from "./lib/src/utils";
|
||||
import { getDocData, readContent } from "./lib/src/utils";
|
||||
import { isPlainText, stripPrefix } from "./lib/src/path";
|
||||
|
||||
function isImage(path: string) {
|
||||
@@ -21,6 +21,7 @@ function isComparableTextDecode(path: string) {
|
||||
return ["json"].includes(ext)
|
||||
}
|
||||
function readDocument(w: LoadedEntry) {
|
||||
if (w.data.length == 0) return "";
|
||||
if (isImage(w.path)) {
|
||||
return new Uint8Array(decodeBinary(w.data));
|
||||
}
|
||||
@@ -65,13 +66,13 @@ export class DocumentHistoryModal extends Modal {
|
||||
}
|
||||
}
|
||||
|
||||
async loadFile(initialRev: string) {
|
||||
async loadFile(initialRev?: string) {
|
||||
if (!this.id) {
|
||||
this.id = await this.plugin.path2id(this.file);
|
||||
}
|
||||
const db = this.plugin.localDatabase;
|
||||
try {
|
||||
const w = await db.localDatabase.get(this.id, { revs_info: true });
|
||||
const w = await db.getRaw(this.id, { revs_info: true });
|
||||
this.revs_info = w._revs_info?.filter((e) => e?.status == "available") ?? [];
|
||||
this.range.max = `${Math.max(this.revs_info.length - 1, 0)}`;
|
||||
this.range.value = this.range.max;
|
||||
@@ -108,7 +109,7 @@ export class DocumentHistoryModal extends Modal {
|
||||
if (v) {
|
||||
URL.revokeObjectURL(v);
|
||||
}
|
||||
this.BlobURLs.set(key, undefined);
|
||||
this.BlobURLs.delete(key);
|
||||
}
|
||||
generateBlobURL(key: string, data: Uint8Array) {
|
||||
this.revokeURL(key);
|
||||
@@ -246,12 +247,10 @@ export class DocumentHistoryModal extends Modal {
|
||||
Logger(`Old content copied to clipboard`, LOG_LEVEL_NOTICE);
|
||||
});
|
||||
});
|
||||
async function focusFile(path: string) {
|
||||
const targetFile = app.vault
|
||||
.getFiles()
|
||||
.find((f) => f.path === path);
|
||||
const focusFile = async (path: string) => {
|
||||
const targetFile = this.plugin.app.vault.getFileByPath(path);
|
||||
if (targetFile) {
|
||||
const leaf = app.workspace.getLeaf(false);
|
||||
const leaf = this.plugin.app.workspace.getLeaf(false);
|
||||
await leaf.openFile(targetFile);
|
||||
} else {
|
||||
Logger("The file could not view on the editor", LOG_LEVEL_NOTICE)
|
||||
@@ -264,19 +263,16 @@ export class DocumentHistoryModal extends Modal {
|
||||
const pathToWrite = stripPrefix(this.file);
|
||||
if (!isValidPath(pathToWrite)) {
|
||||
Logger("Path is not valid to write content.", LOG_LEVEL_INFO);
|
||||
return;
|
||||
}
|
||||
if (this.currentDoc?.datatype == "plain") {
|
||||
await this.plugin.vaultAccess.adapterWrite(pathToWrite, getDocData(this.currentDoc.data));
|
||||
await focusFile(pathToWrite);
|
||||
this.close();
|
||||
} else if (this.currentDoc?.datatype == "newnote") {
|
||||
await this.plugin.vaultAccess.adapterWrite(pathToWrite, decodeBinary(this.currentDoc.data));
|
||||
await focusFile(pathToWrite);
|
||||
this.close();
|
||||
} else {
|
||||
|
||||
Logger(`Could not parse entry`, LOG_LEVEL_NOTICE);
|
||||
if (!this.currentDoc) {
|
||||
Logger("No active file loaded.", LOG_LEVEL_INFO);
|
||||
return;
|
||||
}
|
||||
const d = readContent(this.currentDoc);
|
||||
await this.plugin.vaultAccess.adapterWrite(pathToWrite, d);
|
||||
await focusFile(pathToWrite);
|
||||
this.close();
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
@@ -2,7 +2,7 @@
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import { onDestroy, onMount } from "svelte";
|
||||
import type { AnyEntry, FilePathWithPrefix } from "./lib/src/types";
|
||||
import { createBinaryBlob, getDocData, isDocContentSame } from "./lib/src/utils";
|
||||
import { getDocData, isDocContentSame, readAsBlob } from "./lib/src/utils";
|
||||
import { diff_match_patch } from "./deps";
|
||||
import { DocumentHistoryModal } from "./DocumentHistoryModal";
|
||||
import { isPlainText, stripAllPrefixes } from "./lib/src/path";
|
||||
@@ -106,15 +106,9 @@
|
||||
if (checkStorageDiff) {
|
||||
const abs = plugin.vaultAccess.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
|
||||
if (abs instanceof TFile) {
|
||||
let result = false;
|
||||
if (isPlainText(docA.path)) {
|
||||
const data = await plugin.vaultAccess.adapterRead(abs);
|
||||
result = await isDocContentSame(data, doc.data);
|
||||
} else {
|
||||
const data = await plugin.vaultAccess.adapterReadBinary(abs);
|
||||
const dataEEncoded = createBinaryBlob(data);
|
||||
result = await isDocContentSame(dataEEncoded, doc.data);
|
||||
}
|
||||
const data = await plugin.vaultAccess.adapterReadAuto(abs);
|
||||
const d = readAsBlob(doc);
|
||||
const result = await isDocContentSame(data, d);
|
||||
if (result) {
|
||||
diffDetail += " ⚖️";
|
||||
} else {
|
||||
|
||||
@@ -1,14 +1,14 @@
|
||||
import { App, PluginSettingTab, Setting, sanitizeHTMLToDom, TextAreaComponent, MarkdownRenderer, stringifyYaml } from "./deps";
|
||||
import { DEFAULT_SETTINGS, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings, type FilePathWithPrefix, type HashAlgorithm, type DocumentID, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, LOG_LEVEL_INFO } from "./lib/src/types";
|
||||
import { delay } from "./lib/src/utils";
|
||||
import { Semaphore } from "./lib/src/semaphore";
|
||||
import { DEFAULT_SETTINGS, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings, type FilePathWithPrefix, type HashAlgorithm, type DocumentID, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, LOG_LEVEL_INFO, type LoadedEntry, PREFERRED_SETTING_CLOUDANT, PREFERRED_SETTING_SELF_HOSTED } from "./lib/src/types";
|
||||
import { createBlob, delay, isDocContentSame, readAsBlob } from "./lib/src/utils";
|
||||
import { versionNumberString2Number } from "./lib/src/strbin";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb";
|
||||
import { testCrypt } from "./lib/src/e2ee_v2";
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "./utils";
|
||||
import type { ButtonComponent } from "obsidian";
|
||||
import { request, type ButtonComponent, TFile } from "obsidian";
|
||||
import { shouldBeIgnored } from "./lib/src/path";
|
||||
|
||||
|
||||
export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
@@ -27,6 +27,18 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
}
|
||||
this.plugin.addLog(`Connected to ${db.info.db_name}`, LOG_LEVEL_NOTICE);
|
||||
}
|
||||
askReload() {
|
||||
scheduleTask("configReload", 250, async () => {
|
||||
if (await askYesNo(this.app, "Do you want to restart and reload Obsidian now?") == "yes") {
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("app:reload")
|
||||
}
|
||||
})
|
||||
}
|
||||
closeSetting() {
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
}
|
||||
display(): void {
|
||||
const { containerEl } = this;
|
||||
let encrypt = this.plugin.settings.encrypt;
|
||||
@@ -74,11 +86,11 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
}
|
||||
w.querySelectorAll(`.sls-setting-label`).forEach((element) => {
|
||||
element.removeClass("selected");
|
||||
(element.querySelector<HTMLInputElement>("input[type=radio]")).checked = false;
|
||||
(element.querySelector<HTMLInputElement>("input[type=radio]"))!.checked = false;
|
||||
});
|
||||
w.querySelectorAll(`.sls-setting-label.c-${screen}`).forEach((element) => {
|
||||
element.addClass("selected");
|
||||
(element.querySelector<HTMLInputElement>("input[type=radio]")).checked = true;
|
||||
(element.querySelector<HTMLInputElement>("input[type=radio]"))!.checked = true;
|
||||
});
|
||||
this.selectedScreen = screen;
|
||||
};
|
||||
@@ -108,7 +120,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
tmpDiv.innerHTML = `<button> OK, I read all. </button>`;
|
||||
if (lastVersion > this.plugin.settings.lastReadUpdates) {
|
||||
const informationButtonDiv = h3El.appendChild(tmpDiv);
|
||||
informationButtonDiv.querySelector("button").addEventListener("click", async () => {
|
||||
informationButtonDiv.querySelector("button")?.addEventListener("click", async () => {
|
||||
this.plugin.settings.lastReadUpdates = lastVersion;
|
||||
await this.plugin.saveSettings();
|
||||
informationButtonDiv.remove();
|
||||
@@ -137,29 +149,27 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
const setupWizardEl = containerEl.createDiv();
|
||||
setupWizardEl.createEl("h3", { text: "Setup wizard" });
|
||||
new Setting(setupWizardEl)
|
||||
.setName("Discard the existing configuration and set up")
|
||||
.setName("Use the copied setup URI")
|
||||
.setDesc("To setup Self-hosted LiveSync, this method is the most preferred one.")
|
||||
.addButton((text) => {
|
||||
// eslint-disable-next-line require-await
|
||||
text.setButtonText("Next").onClick(async () => {
|
||||
if (JSON.stringify(this.plugin.settings) != JSON.stringify(DEFAULT_SETTINGS)) {
|
||||
this.plugin.replicator.closeReplication();
|
||||
this.plugin.settings = { ...DEFAULT_SETTINGS };
|
||||
this.plugin.saveSettings();
|
||||
Logger("Configuration has been flushed, please open it again", LOG_LEVEL_NOTICE)
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
} else {
|
||||
containerEl.addClass("isWizard");
|
||||
applyDisplayEnabled();
|
||||
inWizard = true;
|
||||
changeDisplay("0")
|
||||
}
|
||||
text.setButtonText("Use").onClick(async () => {
|
||||
this.closeSetting();
|
||||
await this.plugin.addOnSetup.command_openSetupURI();
|
||||
})
|
||||
})
|
||||
if (this.plugin.settings.isConfigured) {
|
||||
new Setting(setupWizardEl)
|
||||
.setName("Copy current settings as a new setup URI")
|
||||
.addButton((text) => {
|
||||
text.setButtonText("Copy").onClick(async () => {
|
||||
await this.plugin.addOnSetup.command_copySetupURI();
|
||||
})
|
||||
})
|
||||
}
|
||||
new Setting(setupWizardEl)
|
||||
.setName("Do not discard the existing configuration and set up again")
|
||||
.setName("Minimal setup")
|
||||
.addButton((text) => {
|
||||
text.setButtonText("Next").onClick(async () => {
|
||||
text.setButtonText("Start").onClick(async () => {
|
||||
this.plugin.settings.liveSync = false;
|
||||
this.plugin.settings.periodicReplication = false;
|
||||
this.plugin.settings.syncOnSave = false;
|
||||
@@ -173,33 +183,100 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
applyDisplayEnabled();
|
||||
inWizard = true;
|
||||
changeDisplay("0")
|
||||
|
||||
})
|
||||
})
|
||||
const infoWarnForSubsequent = setupWizardEl.createEl("div", { text: `To set up second or subsequent device, please use 'Copy setup URI' and 'Open setup URI'` });
|
||||
infoWarnForSubsequent.addClass("op-warn-info");
|
||||
new Setting(setupWizardEl)
|
||||
.setName("Copy setup URI")
|
||||
.addButton((text) => {
|
||||
text.setButtonText("Copy setup URI").onClick(() => {
|
||||
// @ts-ignore
|
||||
this.plugin.app.commands.executeCommandById("obsidian-livesync:livesync-copysetupuri")
|
||||
if (!this.plugin.settings.isConfigured) {
|
||||
new Setting(setupWizardEl)
|
||||
.setName("Enable LiveSync on this device as the set-up was completed manually")
|
||||
.addButton((text) => {
|
||||
text.setButtonText("Enable").onClick(async () => {
|
||||
this.plugin.settings.isConfigured = true;
|
||||
await this.plugin.saveSettings();
|
||||
this.askReload();
|
||||
})
|
||||
})
|
||||
}
|
||||
if (this.plugin.settings.isConfigured) {
|
||||
new Setting(setupWizardEl)
|
||||
.setName("Discard exist settings and databases")
|
||||
.addButton((text) => {
|
||||
text.setButtonText("Discard").onClick(async () => {
|
||||
if (await askYesNo(this.plugin.app, "Do you really want to discard existing settings and databases?") == "yes") {
|
||||
this.plugin.settings = { ...DEFAULT_SETTINGS };
|
||||
await this.plugin.saveSettingData();
|
||||
await this.plugin.resetLocalDatabase();
|
||||
// await this.plugin.initializeDatabase();
|
||||
this.askReload();
|
||||
}
|
||||
}).setWarning()
|
||||
})
|
||||
}
|
||||
setupWizardEl.createEl("h3", { text: "Online Tips" });
|
||||
const repo = "vrtmrz/obsidian-livesync";
|
||||
const topPath = "/docs/troubleshooting.md";
|
||||
const rawRepoURI = `https://raw.githubusercontent.com/${repo}/main`;
|
||||
setupWizardEl.createEl("div", "", el => el.innerHTML = `<a href='https://github.com/${repo}/blob/main${topPath}' target="_blank">Open in browser</a>`);
|
||||
const troubleShootEl = setupWizardEl.createEl("div", { text: "", cls: "sls-troubleshoot-preview" });
|
||||
const loadMarkdownPage = async (pathAll: string, basePathParam: string = "") => {
|
||||
troubleShootEl.style.minHeight = troubleShootEl.clientHeight + "px";
|
||||
troubleShootEl.empty();
|
||||
const fullPath = pathAll.startsWith("/") ? pathAll : `${basePathParam}/${pathAll}`;
|
||||
|
||||
const directoryArr = fullPath.split("/");
|
||||
const filename = directoryArr.pop();
|
||||
const directly = directoryArr.join("/");
|
||||
const basePath = directly;
|
||||
|
||||
let remoteTroubleShootMDSrc = "";
|
||||
try {
|
||||
remoteTroubleShootMDSrc = await request(`${rawRepoURI}${basePath}/${filename}`);
|
||||
} catch (ex: any) {
|
||||
remoteTroubleShootMDSrc = "Error Occurred!!\n" + ex.toString();
|
||||
}
|
||||
const remoteTroubleShootMD = remoteTroubleShootMDSrc.replace(/\((.*?(.png)|(.jpg))\)/g, `(${rawRepoURI}${basePath}/$1)`)
|
||||
// Render markdown
|
||||
await MarkdownRenderer.render(this.plugin.app, `<a class='sls-troubleshoot-anchor'></a> [Tips and Troubleshooting](${topPath}) [PageTop](${filename})\n\n${remoteTroubleShootMD}`, troubleShootEl, `${rawRepoURI}`, this.plugin);
|
||||
// Menu
|
||||
troubleShootEl.querySelector<HTMLAnchorElement>(".sls-troubleshoot-anchor")?.parentElement?.setCssStyles({ position: "sticky", top: "-1em", backgroundColor: "var(--modal-background)" });
|
||||
// Trap internal links.
|
||||
troubleShootEl.querySelectorAll<HTMLAnchorElement>("a.internal-link").forEach((anchorEl) => {
|
||||
anchorEl.addEventListener("click", async (evt) => {
|
||||
const uri = anchorEl.getAttr("data-href");
|
||||
if (!uri) return;
|
||||
if (uri.startsWith("#")) {
|
||||
evt.preventDefault();
|
||||
const elements = Array.from(troubleShootEl.querySelectorAll<HTMLHeadingElement>("[data-heading]"))
|
||||
const p = elements.find(e => e.getAttr("data-heading")?.toLowerCase().split(" ").join("-") == uri.substring(1).toLowerCase());
|
||||
if (p) {
|
||||
p.setCssStyles({ scrollMargin: "3em" });
|
||||
p.scrollIntoView({ behavior: "instant", block: "start" });
|
||||
}
|
||||
} else {
|
||||
evt.preventDefault();
|
||||
await loadMarkdownPage(uri, basePath);
|
||||
troubleShootEl.setCssStyles({ scrollMargin: "1em" });
|
||||
troubleShootEl.scrollIntoView({ behavior: "instant", block: "start" });
|
||||
}
|
||||
})
|
||||
})
|
||||
.addButton((text) => {
|
||||
text.setButtonText("Open setup URI").onClick(() => {
|
||||
// @ts-ignore
|
||||
this.plugin.app.commands.executeCommandById("obsidian-livesync:livesync-opensetupuri")
|
||||
|
||||
})
|
||||
})
|
||||
|
||||
troubleShootEl.style.minHeight = "";
|
||||
}
|
||||
loadMarkdownPage(topPath);
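// A worked example of the image-path rewrite above, using a hypothetical page and image:
//   fetched page : /docs/troubleshooting.md
//   source text  : ![screenshot](images/example.png)
//   after rewrite: ![screenshot](https://raw.githubusercontent.com/vrtmrz/obsidian-livesync/main/docs/images/example.png)
// Relative .png/.jpg references therefore render inside the settings pane, while links to
// other pages are handled by the internal-link trap registered just above.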
addScreenElement("110", setupWizardEl);
|
||||
|
||||
const containerRemoteDatabaseEl = containerEl.createDiv();
|
||||
containerRemoteDatabaseEl.createEl("h3", { text: "Remote Database configuration" });
|
||||
const syncWarn = containerRemoteDatabaseEl.createEl("div", { text: `These settings are kept locked while any synchronization options are enabled. Disable these options in the "Sync Settings" tab to unlock.` });
|
||||
if (this.plugin.settings.couchDB_URI.startsWith("http://")) {
|
||||
if (this.plugin.isMobile) {
|
||||
containerRemoteDatabaseEl.createEl("div", { text: `Configured as using plain HTTP. We cannot connect to the remote. Please set up the credentials and use HTTPS for the remote URI.` })
|
||||
.addClass("op-warn");
|
||||
} else {
|
||||
containerRemoteDatabaseEl.createEl("div", { text: `Configured as using plain HTTP. We might fail on mobile devices.` })
|
||||
.addClass("op-warn-info");
|
||||
}
|
||||
}
|
||||
|
||||
syncWarn.addClass("op-warn-info");
|
||||
syncWarn.addClass("sls-hidden");
|
||||
|
||||
@@ -289,6 +366,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
|
||||
new Setting(containerRemoteDatabaseEl)
|
||||
.setName("Test Database Connection")
|
||||
.setClass("wizardHidden")
|
||||
.setDesc("Open database connection. If the remote database is not found and you have the privilege to create a database, the database will be created.")
|
||||
.addButton((button) =>
|
||||
button
|
||||
@@ -300,8 +378,8 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
);
|
||||
|
||||
new Setting(containerRemoteDatabaseEl)
|
||||
.setName("Check database configuration")
|
||||
// .setDesc("Open database connection. If the remote database is not found and you have the privilege to create a database, the database will be created.")
|
||||
.setName("Check and Fix database configuration")
|
||||
.setDesc("Check the database configuration, and fix if there are any problems.")
|
||||
.addButton((button) =>
|
||||
button
|
||||
.setButtonText("Check")
|
||||
@@ -336,7 +414,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
tmpDiv.addClass("ob-btn-config-fix");
|
||||
tmpDiv.innerHTML = `<label>${title}</label><button>Fix</button>`;
|
||||
const x = checkResultDiv.appendChild(tmpDiv);
|
||||
x.querySelector("button").addEventListener("click", async () => {
|
||||
x.querySelector("button")?.addEventListener("click", async () => {
|
||||
Logger(`CouchDB Configuration: ${title} -> Set ${key} to ${value}`)
|
||||
const res = await requestToCouchDB(this.plugin.settings.couchDB_URI, this.plugin.settings.couchDB_USER, this.plugin.settings.couchDB_PASSWORD, undefined, key, value);
|
||||
if (res.status == 200) {
|
||||
@@ -432,15 +510,11 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
const origins = ["app://obsidian.md", "capacitor://localhost", "http://localhost"];
|
||||
for (const org of origins) {
|
||||
const rr = await requestToCouchDB(this.plugin.settings.couchDB_URI, this.plugin.settings.couchDB_USER, this.plugin.settings.couchDB_PASSWORD, org);
|
||||
const responseHeaders = Object.entries(rr.headers)
|
||||
const responseHeaders = Object.fromEntries(Object.entries(rr.headers)
|
||||
.map((e) => {
|
||||
e[0] = (e[0] + "").toLowerCase();
|
||||
e[0] = `${e[0]}`.toLowerCase();
|
||||
return e;
|
||||
})
|
||||
.reduce((obj, [key, val]) => {
|
||||
obj[key] = val;
|
||||
return obj;
|
||||
}, {} as { [key: string]: string });
|
||||
}));
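// Header names are lower-cased before the checks below, so the CORS verification works
// whether the reverse proxy reports "Access-Control-Allow-Credentials" or the
// all-lowercase form.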
addResult(`Origin check:${org}`);
|
||||
if (responseHeaders["access-control-allow-credentials"] != "true") {
|
||||
addResult("❗ CORS is not allowing credential");
|
||||
@@ -456,7 +530,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
addResult("--Done--", ["ob-btn-config-head"]);
|
||||
addResult("If you have some trouble with Connection-check even though all Config-check has been passed, Please check your reverse proxy's configuration.", ["ob-btn-config-info"]);
|
||||
Logger(`Checking configuration done`, LOG_LEVEL_INFO);
|
||||
} catch (ex) {
|
||||
} catch (ex: any) {
|
||||
if (ex?.status == 401) {
|
||||
addResult(`❗ Access forbidden.`);
|
||||
addResult(`We could not continue the test.`);
|
||||
@@ -508,18 +582,18 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
passphraseSetting?.controlEl.toggleClass("sls-item-dirty", passphrase != this.plugin.settings.passphrase);
|
||||
dynamicIteration?.controlEl.toggleClass("sls-item-dirty", useDynamicIterationCount != this.plugin.settings.useDynamicIterationCount);
|
||||
usePathObfuscationEl?.controlEl.toggleClass("sls-item-dirty", usePathObfuscation != this.plugin.settings.usePathObfuscation);
|
||||
if (encrypt != this.plugin.settings.encrypt ||
|
||||
passphrase != this.plugin.settings.passphrase ||
|
||||
useDynamicIterationCount != this.plugin.settings.useDynamicIterationCount ||
|
||||
usePathObfuscation != this.plugin.settings.usePathObfuscation) {
|
||||
applyE2EButtons.settingEl.removeClass("sls-setting-hidden");
|
||||
} else {
|
||||
applyE2EButtons.settingEl.addClass("sls-setting-hidden");
|
||||
}
|
||||
|
||||
} else {
|
||||
passphraseSetting.settingEl.addClass("sls-setting-hidden");
|
||||
dynamicIteration.settingEl.addClass("sls-setting-hidden");
|
||||
usePathObfuscationEl.settingEl.addClass("sls-setting-hidden");
|
||||
}
|
||||
if (encrypt != this.plugin.settings.encrypt ||
|
||||
passphrase != this.plugin.settings.passphrase ||
|
||||
useDynamicIterationCount != this.plugin.settings.useDynamicIterationCount ||
|
||||
usePathObfuscation != this.plugin.settings.usePathObfuscation) {
|
||||
applyE2EButtons.settingEl.removeClass("sls-setting-hidden");
|
||||
} else {
|
||||
applyE2EButtons.settingEl.addClass("sls-setting-hidden");
|
||||
}
|
||||
}
|
||||
@@ -654,6 +728,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
this.plugin.settings.passphrase = passphrase;
|
||||
this.plugin.settings.useDynamicIterationCount = useDynamicIterationCount;
|
||||
this.plugin.settings.usePathObfuscation = usePathObfuscation;
|
||||
this.plugin.settings.isConfigured = true;
|
||||
await this.plugin.saveSettings();
|
||||
updateE2EControls();
|
||||
if (sendToServer) {
|
||||
@@ -664,7 +739,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
}
|
||||
};
|
||||
|
||||
const rebuildDB = async (method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice") => {
|
||||
const rebuildDB = async (method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice" | "localOnlyWithChunks") => {
|
||||
if (encrypt && passphrase == "") {
|
||||
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
@@ -682,61 +757,39 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
this.plugin.settings.passphrase = passphrase;
|
||||
this.plugin.settings.useDynamicIterationCount = useDynamicIterationCount;
|
||||
this.plugin.settings.usePathObfuscation = usePathObfuscation;
|
||||
this.plugin.settings.isConfigured = true;
|
||||
Logger("All synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
|
||||
await this.plugin.saveSettings();
|
||||
updateE2EControls();
|
||||
applyDisplayEnabled();
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close();
|
||||
this.closeSetting();
|
||||
await delay(2000);
|
||||
await performRebuildDB(this.plugin, method);
|
||||
}
|
||||
|
||||
|
||||
let rebuildRemote = false;
|
||||
|
||||
new Setting(containerRemoteDatabaseEl)
|
||||
.setName("")
|
||||
.setClass("wizardOnly")
|
||||
.addButton((button) =>
|
||||
button
|
||||
.setButtonText("Next")
|
||||
.setClass("mod-cta")
|
||||
.setCta()
|
||||
.setDisabled(false)
|
||||
.onClick(() => {
|
||||
if (!this.plugin.settings.encrypt) {
|
||||
this.plugin.settings.passphrase = "";
|
||||
}
|
||||
if (isCloudantURI(this.plugin.settings.couchDB_URI)) {
|
||||
this.plugin.settings.customChunkSize = 0;
|
||||
// this.plugin.settings.customChunkSize = 0;
|
||||
this.plugin.settings = { ...this.plugin.settings, ...PREFERRED_SETTING_CLOUDANT };
|
||||
} else {
|
||||
this.plugin.settings.customChunkSize = 100;
|
||||
this.plugin.settings = { ...this.plugin.settings, ...PREFERRED_SETTING_SELF_HOSTED };
|
||||
}
|
||||
rebuildRemote = false;
|
||||
changeDisplay("30")
|
||||
})
|
||||
);
|
||||
new Setting(containerRemoteDatabaseEl)
|
||||
.setName("")
|
||||
.setClass("wizardOnly")
|
||||
.addButton((button) =>
|
||||
button
|
||||
.setButtonText("Discard existing database and proceed")
|
||||
.setDisabled(false)
|
||||
.setWarning()
|
||||
.onClick(() => {
|
||||
if (!this.plugin.settings.encrypt) {
|
||||
this.plugin.settings.passphrase = "";
|
||||
}
|
||||
if (isCloudantURI(this.plugin.settings.couchDB_URI)) {
|
||||
this.plugin.settings.customChunkSize = 0;
|
||||
} else {
|
||||
this.plugin.settings.customChunkSize = 100;
|
||||
}
|
||||
rebuildRemote = true;
|
||||
changeDisplay("30")
|
||||
})
|
||||
);
|
||||
)
|
||||
;
|
||||
|
||||
addScreenElement("0", containerRemoteDatabaseEl);
|
||||
|
||||
const containerGeneralSettingsEl = containerEl.createDiv();
|
||||
@@ -746,7 +799,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
|
||||
new Setting(containerGeneralSettingsEl)
|
||||
.setName("Show status inside the editor")
|
||||
.setDesc("")
|
||||
.setDesc("Reflected after reboot")
|
||||
.addToggle((toggle) =>
|
||||
toggle.setValue(this.plugin.settings.showStatusOnEditor).onChange(async (value) => {
|
||||
this.plugin.settings.showStatusOnEditor = value;
|
||||
@@ -765,7 +818,16 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
})
|
||||
);
|
||||
}
|
||||
|
||||
new Setting(containerGeneralSettingsEl)
|
||||
.setName("Show status on the status bar")
|
||||
.setDesc("Reflected after reboot.")
|
||||
.addToggle((toggle) =>
|
||||
toggle.setValue(this.plugin.settings.showStatusOnStatusbar).onChange(async (value) => {
|
||||
this.plugin.settings.showStatusOnStatusbar = value;
|
||||
await this.plugin.saveSettings();
|
||||
this.display();
|
||||
})
|
||||
);
|
||||
containerGeneralSettingsEl.createEl("h4", { text: "Logging" });
|
||||
new Setting(containerGeneralSettingsEl)
|
||||
.setName("Show only notifications")
|
||||
@@ -1000,25 +1062,20 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
...presetAllDisabled
|
||||
}
|
||||
}
|
||||
this.plugin.saveSettings();
|
||||
await this.plugin.saveSettings();
|
||||
this.display();
|
||||
await this.plugin.realizeSettingSyncMode();
|
||||
if (inWizard) {
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
await this.plugin.resetLocalDatabase();
|
||||
await this.plugin.initializeDatabase(true);
|
||||
if (rebuildRemote) {
|
||||
await this.plugin.markRemoteLocked();
|
||||
await this.plugin.tryResetRemoteDatabase();
|
||||
await this.plugin.markRemoteLocked();
|
||||
await this.plugin.markRemoteResolved();
|
||||
this.closeSetting();
|
||||
if (!this.plugin.settings.isConfigured) {
|
||||
this.plugin.settings.isConfigured = true;
|
||||
await this.plugin.saveSettings();
|
||||
await rebuildDB("localOnly");
|
||||
Logger("All done! Please set up subsequent devices with 'Copy current settings as a new setup URI' and 'Use the copied setup URI'.", LOG_LEVEL_NOTICE);
|
||||
await this.plugin.addOnSetup.command_copySetupURI();
|
||||
} else {
|
||||
this.askReload();
|
||||
}
|
||||
await this.plugin.replicate(true);
|
||||
|
||||
Logger("All done! Please set up subsequent devices with 'Copy setup URI' and 'Open setup URI'.", LOG_LEVEL_NOTICE);
|
||||
// @ts-ignore
|
||||
this.plugin.app.commands.executeCommandById("obsidian-livesync:livesync-copysetupuri")
|
||||
}
|
||||
})
|
||||
);
|
||||
@@ -1237,24 +1294,21 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
.addButton((button) => {
|
||||
button.setButtonText("Merge")
|
||||
.onClick(async () => {
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
this.closeSetting()
|
||||
await this.plugin.addOnSetup.configureHiddenFileSync("MERGE");
|
||||
})
|
||||
})
|
||||
.addButton((button) => {
|
||||
button.setButtonText("Fetch")
|
||||
.onClick(async () => {
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
this.closeSetting()
|
||||
await this.plugin.addOnSetup.configureHiddenFileSync("FETCH");
|
||||
})
|
||||
})
|
||||
.addButton((button) => {
|
||||
button.setButtonText("Overwrite")
|
||||
.onClick(async () => {
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
this.closeSetting()
|
||||
await this.plugin.addOnSetup.configureHiddenFileSync("OVERWRITE");
|
||||
})
|
||||
});
|
||||
@@ -1288,7 +1342,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
});
|
||||
text.inputEl.setAttribute("type", "number");
|
||||
});
|
||||
let skipPatternTextArea: TextAreaComponent = null;
|
||||
let skipPatternTextArea: TextAreaComponent;
|
||||
const defaultSkipPattern = "\\/node_modules\\/, \\/\\.git\\/, \\/obsidian-livesync\\/";
|
||||
const defaultSkipPatternXPlat = defaultSkipPattern + ",\\/workspace$ ,\\/workspace.json$,\\/workspace-mobile.json$";
|
||||
new Setting(containerSyncSettingEl)
|
||||
@@ -1598,7 +1652,8 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
|
||||
const pluginConfig = JSON.parse(JSON.stringify(this.plugin.settings)) as ObsidianLiveSyncSettings;
|
||||
pluginConfig.couchDB_DBNAME = REDACTED;
|
||||
pluginConfig.couchDB_PASSWORD = REDACTED;
|
||||
pluginConfig.couchDB_URI = isCloudantURI(pluginConfig.couchDB_URI) ? "cloudant" : "self-hosted";
|
||||
const scheme = pluginConfig.couchDB_URI.startsWith("http:") ? "(HTTP)" : (pluginConfig.couchDB_URI.startsWith("https:")) ? "(HTTPS)" : ""
|
||||
pluginConfig.couchDB_URI = isCloudantURI(pluginConfig.couchDB_URI) ? "cloudant" : `self-hosted${scheme}`;
|
||||
pluginConfig.couchDB_USER = REDACTED;
|
||||
pluginConfig.passphrase = REDACTED;
|
||||
pluginConfig.encryptedPassphrase = REDACTED;
|
||||
@@ -1645,43 +1700,113 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
c.addClass("op-warn");
|
||||
}
|
||||
}
|
||||
|
||||
new Setting(containerHatchEl)
|
||||
.setName("Back to non-configured")
|
||||
.addButton((button) =>
|
||||
button
|
||||
.setButtonText("Back")
|
||||
.setDisabled(false)
|
||||
.onClick(async () => {
|
||||
this.plugin.settings.isConfigured = false;
|
||||
await this.plugin.saveSettings();
|
||||
this.askReload();
|
||||
}));
|
||||
const hatchWarn = containerHatchEl.createEl("div", { text: `To stop the boot-up sequence so that you can fix problems in the databases, put redflag.md at the top of your vault (restarting Obsidian is required).` });
|
||||
hatchWarn.addClass("op-warn-info");
|
||||
|
||||
|
||||
|
||||
const addResult = (path: string, file: TFile | false, fileOnDB: LoadedEntry | false) => {
|
||||
resultArea.appendChild(resultArea.createEl("div", {}, el => {
|
||||
el.appendChild(el.createEl("h6", { text: path }));
|
||||
el.appendChild(el.createEl("div", {}, infoGroupEl => {
|
||||
infoGroupEl.appendChild(infoGroupEl.createEl("div", { text: `Storage : Modified: ${!file ? `Missing:` : `${new Date(file.stat.mtime).toLocaleString()}, Size:${file.stat.size}`}` }))
|
||||
infoGroupEl.appendChild(infoGroupEl.createEl("div", { text: `Database: Modified: ${!fileOnDB ? `Missing:` : `${new Date(fileOnDB.mtime).toLocaleString()}, Size:${fileOnDB.size}`}` }))
|
||||
}));
|
||||
if (fileOnDB && file) {
|
||||
el.appendChild(el.createEl("button", { text: "Show history" }, buttonEl => {
|
||||
buttonEl.onClickEvent(() => {
|
||||
this.plugin.showHistory(file, fileOnDB._id);
|
||||
})
|
||||
}))
|
||||
}
|
||||
if (file) {
|
||||
el.appendChild(el.createEl("button", { text: "Storage -> Database" }, buttonEl => {
|
||||
buttonEl.onClickEvent(() => {
|
||||
this.plugin.updateIntoDB(file, undefined, true);
|
||||
el.remove();
|
||||
})
|
||||
}))
|
||||
}
|
||||
if (fileOnDB) {
|
||||
el.appendChild(el.createEl("button", { text: "Database -> Storage" }, buttonEl => {
|
||||
buttonEl.onClickEvent(() => {
|
||||
this.plugin.pullFile(this.plugin.getPath(fileOnDB), [], true, undefined, false);
|
||||
el.remove();
|
||||
})
|
||||
}))
|
||||
}
|
||||
return el;
|
||||
}))
|
||||
}
|
||||
|
||||
const checkBetweenStorageAndDatabase = async (file: TFile, fileOnDB: LoadedEntry) => {
|
||||
const dataContent = readAsBlob(fileOnDB);
|
||||
const content = createBlob(await this.plugin.vaultAccess.vaultReadAuto(file))
|
||||
if (await isDocContentSame(content, dataContent)) {
|
||||
Logger(`Compare: SAME: ${file.path}`)
|
||||
} else {
|
||||
Logger(`Compare: CONTENT IS NOT MATCHED! ${file.path}`, LOG_LEVEL_NOTICE);
|
||||
addResult(file.path, file, fileOnDB)
|
||||
}
|
||||
}
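// A minimal usage sketch for the comparison above, assuming a file that exists on both
// sides (this mirrors what the "Verify all" button does for every path):
//   const entry = await this.plugin.localDatabase.getDBEntry(file.path as FilePathWithPrefix);
//   if (entry) await checkBetweenStorageAndDatabase(file, entry);
// Matching content is only logged; a mismatch is listed in the result area with
// "Storage -> Database" and "Database -> Storage" buttons to pick which side to keep.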
new Setting(containerHatchEl)
|
||||
.setName("Verify and repair all files")
|
||||
.setDesc("Verify and repair all files and update database without restoring")
|
||||
.setDesc("Compare the content of files between on local database and storage. If not matched, you will asked which one want to keep.")
|
||||
.addButton((button) =>
|
||||
button
|
||||
.setButtonText("Verify and repair")
|
||||
.setButtonText("Verify all")
|
||||
.setDisabled(false)
|
||||
.setWarning()
|
||||
.onClick(async () => {
|
||||
const semaphore = Semaphore(10);
|
||||
const files = this.app.vault.getFiles();
|
||||
const documents = [] as FilePathWithPrefix[];
|
||||
|
||||
const adn = this.plugin.localDatabase.findAllNormalDocs()
|
||||
for await (const i of adn) documents.push(this.plugin.getPath(i));
|
||||
const allPaths = [...new Set([...documents, ...files.map(e => e.path as FilePathWithPrefix)])];
|
||||
let i = 0;
|
||||
const processes = files.map(e => (async (file) => {
|
||||
const releaser = await semaphore.acquire(1, "verifyAndRepair");
|
||||
for (const path of allPaths) {
|
||||
i++;
|
||||
Logger(`${i}/${allPaths.length}\n${path}`, LOG_LEVEL_NOTICE, "verify");
|
||||
if (shouldBeIgnored(path)) continue;
|
||||
const abstractFile = this.plugin.vaultAccess.getAbstractFileByPath(path);
|
||||
const fileOnStorage = abstractFile instanceof TFile ? abstractFile : false;
|
||||
if (!await this.plugin.isTargetFile(path)) continue;
|
||||
|
||||
try {
|
||||
Logger(`UPDATE DATABASE ${file.path}`);
|
||||
await this.plugin.updateIntoDB(file, false, null, true);
|
||||
i++;
|
||||
Logger(`${i}/${files.length}\n${file.path}`, LOG_LEVEL_NOTICE, "verify");
|
||||
if (fileOnStorage && this.plugin.isFileSizeExceeded(fileOnStorage.stat.size)) continue;
|
||||
const fileOnDB = await this.plugin.localDatabase.getDBEntry(path);
|
||||
if (fileOnDB && this.plugin.isFileSizeExceeded(fileOnDB.size)) continue;
|
||||
|
||||
} catch (ex) {
|
||||
i++;
|
||||
Logger(`Error while verifyAndRepair`, LOG_LEVEL_NOTICE);
|
||||
Logger(ex);
|
||||
} finally {
|
||||
releaser();
|
||||
if (!fileOnDB && fileOnStorage) {
|
||||
Logger(`Compare: Not found on the local database: ${path}`, LOG_LEVEL_NOTICE);
|
||||
addResult(path, fileOnStorage, false)
|
||||
continue;
|
||||
}
|
||||
if (fileOnDB && !fileOnStorage) {
|
||||
Logger(`Compare: Not found on the storage: ${path}`, LOG_LEVEL_NOTICE);
|
||||
addResult(path, false, fileOnDB)
|
||||
continue;
|
||||
}
|
||||
if (fileOnStorage && fileOnDB) {
|
||||
await checkBetweenStorageAndDatabase(fileOnStorage, fileOnDB)
|
||||
}
|
||||
}
|
||||
)(e));
|
||||
await Promise.all(processes);
|
||||
Logger("done", LOG_LEVEL_NOTICE, "verify");
|
||||
})
|
||||
);
|
||||
const resultArea = containerHatchEl.createDiv({ text: "" });
|
||||
new Setting(containerHatchEl)
|
||||
.setName("Check and convert non-path-obfuscated files")
|
||||
.setDesc("")
|
||||
@@ -1704,6 +1829,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
//Prepare converted data
|
||||
newDoc._id = idEncoded;
|
||||
newDoc.path = docName as FilePathWithPrefix;
|
||||
// @ts-ignore
|
||||
delete newDoc._rev;
|
||||
try {
|
||||
const obfuscatedDoc = await this.plugin.localDatabase.getRaw(idEncoded, { revs_info: true });
|
||||
@@ -1729,7 +1855,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
Logger(`Converting ${docName} Failed!`, LOG_LEVEL_NOTICE);
|
||||
Logger(ret, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
} catch (ex) {
|
||||
} catch (ex: any) {
|
||||
if (ex?.status == 404) {
|
||||
// We can perform this safely
|
||||
if ((await this.plugin.localDatabase.putRaw(newDoc)).ok) {
|
||||
@@ -1772,12 +1898,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
toggle.setValue(this.plugin.settings.suspendFileWatching).onChange(async (value) => {
|
||||
this.plugin.settings.suspendFileWatching = value;
|
||||
await this.plugin.saveSettings();
|
||||
scheduleTask("configReload", 250, async () => {
|
||||
if (await askYesNo(this.app, "Do you want to restart and reload Obsidian now?") == "yes") {
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("app:reload")
|
||||
}
|
||||
})
|
||||
this.askReload();
|
||||
})
|
||||
);
|
||||
new Setting(containerHatchEl)
|
||||
@@ -1787,12 +1908,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
toggle.setValue(this.plugin.settings.suspendParseReplicationResult).onChange(async (value) => {
|
||||
this.plugin.settings.suspendParseReplicationResult = value;
|
||||
await this.plugin.saveSettings();
|
||||
scheduleTask("configReload", 250, async () => {
|
||||
if (await askYesNo(this.app, "Do you want to restart and reload Obsidian now?") == "yes") {
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("app:reload")
|
||||
}
|
||||
})
|
||||
this.askReload();
|
||||
})
|
||||
);
|
||||
new Setting(containerHatchEl)
|
||||
@@ -1805,15 +1921,15 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
})
|
||||
);
|
||||
|
||||
new Setting(containerHatchEl)
|
||||
.setName("Do not pace synchronization")
|
||||
.setDesc("If this toggle enabled, synchronisation will not be paced by queued entries. If synchronisation has been deadlocked, please make this enabled once.")
|
||||
.addToggle((toggle) =>
|
||||
toggle.setValue(this.plugin.settings.doNotPaceReplication).onChange(async (value) => {
|
||||
this.plugin.settings.doNotPaceReplication = value;
|
||||
await this.plugin.saveSettings();
|
||||
})
|
||||
);
|
||||
// new Setting(containerHatchEl)
|
||||
// .setName("Do not pace synchronization")
|
||||
// .setDesc("If this toggle enabled, synchronisation will not be paced by queued entries. If synchronisation has been deadlocked, please make this enabled once.")
|
||||
// .addToggle((toggle) =>
|
||||
// toggle.setValue(this.plugin.settings.doNotPaceReplication).onChange(async (value) => {
|
||||
// this.plugin.settings.doNotPaceReplication = value;
|
||||
// await this.plugin.saveSettings();
|
||||
// })
|
||||
// );
|
||||
containerHatchEl.createEl("h4", {
|
||||
text: sanitizeHTMLToDom(`Compatibility`),
|
||||
cls: "wizardHidden"
|
||||
@@ -1854,7 +1970,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
|
||||
new Setting(containerHatchEl)
|
||||
.setName("Use an old adapter for compatibility")
|
||||
.setDesc("This option is not compatible with a database made by older versions. Changing this configuration will fetch the remote database again.")
|
||||
.setDesc("Before v0.17.16, we used an old adapter for the local database. Now the new adapter is preferred. However, it needs local database rebuilding. Please disable this toggle when you have enough time. If leave it enabled, also while fetching from the remote database, you will be asked to disable this.")
|
||||
.setClass("wizardHidden")
|
||||
.addToggle((toggle) =>
|
||||
toggle.setValue(!this.plugin.settings.useIndexedDBAdapter).onChange(async (value) => {
|
||||
@@ -1877,7 +1993,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
let newDatabaseName = this.plugin.settings.additionalSuffixOfDatabaseName + "";
|
||||
new Setting(containerHatchEl)
|
||||
.setName("Database suffix")
|
||||
.setDesc("LiveSync could not treat multiple vaults which have same name, please add some suffix from here.")
|
||||
.setDesc("LiveSync could not handle multiple vaults which have same name without different prefix, This should be automatically configured.")
|
||||
.addText((text) => {
|
||||
text.setPlaceholder("")
|
||||
.setValue(newDatabaseName)
|
||||
@@ -2062,6 +2178,19 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
})
|
||||
)
|
||||
|
||||
new Setting(containerMaintenanceEl)
|
||||
.setName("Fetch rebuilt DB (Save local documents before)")
|
||||
.setDesc("Restore or reconstruct local database from remote database but use local chunks.")
|
||||
.addButton((button) =>
|
||||
button
|
||||
.setButtonText("Save and Fetch")
|
||||
.setWarning()
|
||||
.setDisabled(false)
|
||||
.onClick(async () => {
|
||||
await rebuildDB("localOnlyWithChunks");
|
||||
})
|
||||
)
|
||||
|
||||
new Setting(containerMaintenanceEl)
|
||||
.setName("Discard local database to reset or uninstall Self-hosted LiveSync")
|
||||
.addButton((button) =>
|
||||
@@ -2091,8 +2220,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
.setDisabled(false)
|
||||
.setWarning()
|
||||
.onClick(async () => {
|
||||
// @ts-ignore
|
||||
this.plugin.app.setting.close()
|
||||
this.closeSetting()
|
||||
await this.plugin.dbGC();
|
||||
})
|
||||
);
|
||||
@@ -2114,7 +2242,7 @@ ${stringifyYaml(pluginConfig)}`;
|
||||
applyDisplayEnabled();
|
||||
if (this.selectedScreen == "") {
|
||||
if (lastVersion != this.plugin.settings.lastReadUpdates) {
|
||||
if (JSON.stringify(this.plugin.settings) != JSON.stringify(DEFAULT_SETTINGS)) {
|
||||
if (this.plugin.settings.isConfigured) {
|
||||
changeDisplay("100");
|
||||
} else {
|
||||
changeDisplay("110")
|
||||
|
||||
@@ -1,5 +1,7 @@
|
||||
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "./deps";
|
||||
import { serialized } from "./lib/src/lock";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { isPlainText } from "./lib/src/path";
|
||||
import type { FilePath } from "./lib/src/types";
|
||||
import { createBinaryBlob, isDocContentSame } from "./lib/src/utils";
|
||||
import type { InternalFileInfo } from "./types";
|
||||
@@ -55,6 +57,12 @@ export class SerializedFileAccess {
|
||||
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
|
||||
}
|
||||
|
||||
async adapterReadAuto(file: TFile | string) {
|
||||
const path = file instanceof TFile ? file.path : file;
|
||||
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.adapter.read(path));
|
||||
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
|
||||
}
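// A minimal sketch of the intended behaviour, assuming `isPlainText` keys off the file
// extension: plain-text files come back as a string, everything else as an ArrayBuffer.
//   const text = await vaultAccess.adapterReadAuto("note.md");   // string
//   const bin  = await vaultAccess.adapterReadAuto("image.png"); // ArrayBuffer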
|
||||
async adapterWrite(file: TFile | string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
|
||||
const path = file instanceof TFile ? file.path : file;
|
||||
if (typeof (data) === "string") {
|
||||
@@ -76,12 +84,19 @@ export class SerializedFileAccess {
|
||||
return await processReadFile(file, () => this.app.vault.readBinary(file));
|
||||
}
|
||||
|
||||
async vaultReadAuto(file: TFile) {
|
||||
const path = file.path;
|
||||
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.read(file));
|
||||
return await processReadFile(file, () => this.app.vault.readBinary(file));
|
||||
}
|
||||
|
||||
|
||||
async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
|
||||
if (typeof (data) === "string") {
|
||||
return await processWriteFile(file, async () => {
|
||||
const oldData = await this.app.vault.read(file);
|
||||
if (data === oldData) {
|
||||
markChangesAreSame(file, file.stat.mtime, options.mtime);
|
||||
if (options && options.mtime) markChangesAreSame(file, file.stat.mtime, options.mtime);
|
||||
return false
|
||||
}
|
||||
await this.app.vault.modify(file, data, options)
|
||||
@@ -92,7 +107,7 @@ export class SerializedFileAccess {
|
||||
return await processWriteFile(file, async () => {
|
||||
const oldData = await this.app.vault.readBinary(file);
|
||||
if (await isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
|
||||
markChangesAreSame(file, file.stat.mtime, options.mtime);
|
||||
if (options && options.mtime) markChangesAreSame(file, file.stat.mtime, options.mtime);
|
||||
return false;
|
||||
}
|
||||
await this.app.vault.modifyBinary(file, toArrayBuffer(data), options)
|
||||
@@ -107,6 +122,15 @@ export class SerializedFileAccess {
|
||||
return await processWriteFile(path, () => this.app.vault.createBinary(path, toArrayBuffer(data), options));
|
||||
}
|
||||
}
|
||||
|
||||
trigger(name: string, ...data: any[]) {
|
||||
return this.app.vault.trigger(name, ...data);
|
||||
}
|
||||
|
||||
async adapterAppend(normalizedPath: string, data: string, options?: DataWriteOptions) {
|
||||
return await this.app.vault.adapter.append(normalizedPath, data, options)
|
||||
}
|
||||
|
||||
async delete(file: TFile | TFolder, force = false) {
|
||||
return await processWriteFile(file, () => this.app.vault.delete(file, force));
|
||||
}
|
||||
@@ -127,6 +151,30 @@ export class SerializedFileAccess {
|
||||
// }
|
||||
}
|
||||
|
||||
getFiles() {
|
||||
return this.app.vault.getFiles();
|
||||
}
|
||||
|
||||
async ensureDirectory(fullPath: string) {
|
||||
const pathElements = fullPath.split("/");
|
||||
pathElements.pop();
|
||||
let c = "";
|
||||
for (const v of pathElements) {
|
||||
c += v;
|
||||
try {
|
||||
await this.app.vault.adapter.mkdir(c);
|
||||
} catch (ex: any) {
|
||||
if (ex?.message == "Folder already exists.") {
|
||||
// Skip if already exists.
|
||||
} else {
|
||||
Logger("Folder Create Error");
|
||||
Logger(ex);
|
||||
}
|
||||
}
|
||||
c += "/";
|
||||
}
|
||||
}
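// A minimal usage sketch for the helper above, assuming a hidden file path: every missing
// parent folder is created in turn and "Folder already exists." is silently skipped, so
// repeated calls are safe.
//   await this.vaultAccess.ensureDirectory(".obsidian/plugins/obsidian-livesync/data.json");
//   // -> mkdir(".obsidian"), then ".obsidian/plugins", then ".obsidian/plugins/obsidian-livesync"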
|
||||
|
||||
touchedFiles: string[] = [];
|
||||
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
import type { SerializedFileAccess } from "./SerializedFileAccess";
|
||||
import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
|
||||
import { shouldBeIgnored } from "./lib/src/path";
|
||||
import type { KeyedQueueProcessor } from "./lib/src/processor";
|
||||
import { LOG_LEVEL_NOTICE, type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
|
||||
import { delay } from "./lib/src/utils";
|
||||
@@ -91,6 +91,8 @@ export class StorageEventManagerObsidian extends StorageEventManager {
|
||||
}
|
||||
// Cache the file and wait until it can be processed.
|
||||
async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
|
||||
if (!this.plugin.settings.isConfigured) return;
|
||||
if (this.plugin.settings.suspendFileWatching) return;
|
||||
for (const param of params) {
|
||||
if (shouldBeIgnored(param.file.path)) {
|
||||
continue;
|
||||
@@ -106,9 +108,9 @@ export class StorageEventManagerObsidian extends StorageEventManager {
|
||||
}
|
||||
if (file instanceof TFolder) continue;
|
||||
if (!await this.plugin.isTargetFile(file.path)) continue;
|
||||
if (this.plugin.settings.suspendFileWatching) continue;
|
||||
|
||||
let cache: null | string | ArrayBuffer;
|
||||
// Stop using the cache to prevent corruption.
|
||||
// let cache: null | string | ArrayBuffer;
|
||||
// new file or something changed, cache the changes.
|
||||
if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
|
||||
// Wait a little while to let the writer mark the file as `touched`.
|
||||
@@ -116,12 +118,13 @@ export class StorageEventManagerObsidian extends StorageEventManager {
|
||||
if (this.plugin.vaultAccess.recentlyTouched(file)) {
|
||||
continue;
|
||||
}
|
||||
if (!isPlainText(file.name)) {
|
||||
cache = await this.plugin.vaultAccess.vaultReadBinary(file);
|
||||
} else {
|
||||
cache = await this.plugin.vaultAccess.vaultCacheRead(file);
|
||||
if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
|
||||
}
|
||||
// cache = await this.plugin.vaultAccess.vaultReadAuto(file);
|
||||
// if (!isPlainText(file.name)) {
|
||||
// cache = await this.plugin.vaultAccess.vaultReadBinary(file);
|
||||
// } else {
|
||||
// cache = await this.plugin.vaultAccess.vaultCacheRead(file);
|
||||
// if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
|
||||
// }
|
||||
}
|
||||
const fileInfo = file instanceof TFile ? {
|
||||
ctime: file.stat.ctime,
|
||||
@@ -136,7 +139,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
|
||||
args: {
|
||||
file: fileInfo,
|
||||
oldPath,
|
||||
cache,
|
||||
// cache,
|
||||
ctx
|
||||
},
|
||||
key: atomicKey
|
||||
|
||||
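The `appendWatchEvent` changes above add two early exits (`isConfigured` and `suspendFileWatching`) in front of the per-file filtering, and stop caching file contents in the event itself. A condensed sketch of that guard chain, with the surrounding plugin wiring elided; `plugin` here is a hypothetical object exposing the same members that appear in the hunk above:

```ts
// Sketch of the filtering order applied before a storage event is queued.
async function shouldQueueEvent(plugin: any, type: string, file: any): Promise<boolean> {
    if (!plugin.settings.isConfigured) return false;         // Not set up yet: ignore storage events.
    if (plugin.settings.suspendFileWatching) return false;   // Watching explicitly suspended.
    if (shouldBeIgnored(file.path)) return false;             // Internal or ignored paths.
    if (file instanceof TFolder) return false;                // Folders are not queued as entries.
    if (!await plugin.isTargetFile(file.path)) return false;  // Outside the configured targets.
    if (type == "CREATE" || type == "CHANGED") {
        // Changes written by the plugin itself are marked as "touched" and skipped.
        if (plugin.vaultAccess.recentlyTouched(file)) return false;
    }
    return true;
}
```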
@@ -4,7 +4,7 @@ export {
addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
parseYaml, ItemView, WorkspaceLeaf
} from "obsidian";
export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse, MarkdownFileInfo } from "obsidian";
export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse, MarkdownFileInfo, ListedFiles } from "obsidian";
import {
normalizePath as normalizePath_
} from "obsidian";
@@ -7,10 +7,9 @@ import PluginPane from "./PluginPane.svelte";

export class PluginDialogModal extends Modal {
plugin: ObsidianLiveSyncPlugin;
logEl: HTMLDivElement;
component: PluginPane = null;
component: PluginPane | undefined;
isOpened() {
return this.component != null;
return this.component != undefined;
}

constructor(app: App, plugin: ObsidianLiveSyncPlugin) {
@@ -21,7 +20,7 @@ export class PluginDialogModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Customization Sync (Beta2)")
if (this.component == null) {
if (!this.component) {
this.component = new PluginPane({
target: contentEl,
props: { plugin: this.plugin },
@@ -30,9 +29,9 @@ export class PluginDialogModal extends Modal {
}

onClose() {
if (this.component != null) {
if (this.component) {
this.component.$destroy();
this.component = null;
this.component = undefined;
}
}
}
@@ -94,13 +93,13 @@ export class InputStringDialog extends Modal {
}
export class PopoverSelectString extends FuzzySuggestModal<string> {
app: App;
callback: (e: string) => void = () => { };
callback: ((e: string) => void) | undefined = () => { };
getItemsFun: () => string[] = () => {
return ["yes", "no"];

}

constructor(app: App, note: string, placeholder: string | null, getItemsFun: () => string[], callback: (e: string) => void) {
constructor(app: App, note: string, placeholder: string | undefined, getItemsFun: (() => string[]) | undefined, callback: (e: string) => void) {
super(app);
this.app = app;
this.setPlaceholder((placeholder ?? "y/n) ") + note);
@@ -118,13 +117,14 @@ export class PopoverSelectString extends FuzzySuggestModal<string> {

onChooseItem(item: string, evt: MouseEvent | KeyboardEvent): void {
// debugger;
this.callback(item);
this.callback = null;
this.callback?.(item);
this.callback = undefined;
}
onClose(): void {
setTimeout(() => {
if (this.callback != null) {
if (this.callback) {
this.callback("");
this.callback = undefined;
}
}, 100);
}
@@ -136,16 +136,16 @@ export class MessageBox extends Modal {
title: string;
contentMd: string;
buttons: string[];
result: string;
result: string | false = false;
isManuallyClosed = false;
defaultAction: string | undefined;
timeout: number | undefined;
timer: ReturnType<typeof setInterval> = undefined;
timer: ReturnType<typeof setInterval> | undefined = undefined;
defaultButtonComponent: ButtonComponent | undefined;

onSubmit: (result: string | false) => void;

constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number, onSubmit: (result: (typeof buttons)[number] | false) => void) {
constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number | undefined, onSubmit: (result: (typeof buttons)[number] | false) => void) {
super(plugin.app);
this.plugin = plugin;
this.title = title;
@@ -156,6 +156,7 @@ export class MessageBox extends Modal {
this.timeout = timeout;
if (this.timeout) {
this.timer = setInterval(() => {
if (this.timeout === undefined) return;
this.timeout--;
if (this.timeout < 0) {
if (this.timer) {
@@ -166,7 +167,7 @@ export class MessageBox extends Modal {
this.isManuallyClosed = true;
this.close();
} else {
this.defaultButtonComponent.setButtonText(`( ${this.timeout} ) ${defaultAction}`);
this.defaultButtonComponent?.setButtonText(`( ${this.timeout} ) ${defaultAction}`);
}
}, 1000);
}
@@ -223,7 +224,7 @@ export class MessageBox extends Modal {
}


export function confirmWithMessage(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction?: (typeof buttons)[number], timeout?: number): Promise<(typeof buttons)[number] | false> {
export function confirmWithMessage(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout?: number): Promise<(typeof buttons)[number] | false> {
return new Promise((res) => {
const dialog = new MessageBox(plugin, title, contentMd, buttons, defaultAction, timeout, (result) => res(result));
dialog.open();
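For reference, `confirmWithMessage` wraps the `MessageBox` modal in a Promise: it resolves with the chosen button label, or with `false` when the dialog is dismissed, and the optional timeout counts down on the default button (which is now a required argument). A usage sketch; the title, body, and button labels below are illustrative, not taken from the plugin:

```ts
// Ask the user, defaulting to "Keep" after 30 seconds of inactivity.
const choice = await confirmWithMessage(
    plugin,                                             // the running Plugin instance
    "Conflicted file",                                  // dialog title
    "A conflict was found. How should it be handled?",  // markdown body
    ["Merge", "Keep", "Discard"],                       // buttons
    "Keep",                                             // default action, auto-selected on timeout
    30                                                  // timeout in seconds; omit for no countdown
);
if (choice === false) {
    // Dialog was closed without an explicit choice.
} else {
    // `choice` is one of the button labels above.
}
```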
2 src/lib
484 src/main.ts
@@ -4,7 +4,7 @@ import { type Diff, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch, stri
import { Notice, Plugin, TFile, addIcon, TFolder, normalizePath, TAbstractFile, Editor, MarkdownView, type RequestUrlParam, type RequestUrlResponse, requestUrl, type MarkdownFileInfo } from "./deps";
import { type EntryDoc, type LoadedEntry, type ObsidianLiveSyncSettings, type diff_check_result, type diff_result_leaf, type EntryBody, LOG_LEVEL, VER, DEFAULT_SETTINGS, type diff_result, FLAGMD_REDFLAG, SYNCINFO_ID, SALT_OF_PASSPHRASE, type ConfigPassphraseStore, type CouchDBConnection, FLAGMD_REDFLAG2, FLAGMD_REDFLAG3, PREFIXMD_LOGFILE, type DatabaseConnectingStatus, type EntryHasPath, type DocumentID, type FilePathWithPrefix, type FilePath, type AnyEntry, LOG_LEVEL_DEBUG, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_URGENT, LOG_LEVEL_VERBOSE, type SavingEntry, MISSING_OR_ERROR, NOT_CONFLICTED, AUTO_MERGED, CANCELLED, LEAVE_TO_SUBSEQUENT, FLAGMD_REDFLAG2_HR, FLAGMD_REDFLAG3_HR, } from "./lib/src/types";
import { type InternalFileInfo, type CacheData, type FileEventItem, FileWatchEventQueueMax } from "./types";
import { arrayToChunkedArray, createBinaryBlob, createTextBlob, fireAndForget, getDocData, isDocContentSame, isObjectDifferent, sendValue } from "./lib/src/utils";
import { arrayToChunkedArray, createBlob, fireAndForget, getDocData, isDocContentSame, isObjectDifferent, readContent, sendValue } from "./lib/src/utils";
import { Logger, setGlobalLogFunction } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { ConflictResolveModal } from "./ConflictResolveModal";
@@ -76,9 +76,14 @@ export default class ObsidianLiveSyncPlugin extends Plugin
replicator!: LiveSyncDBReplicator;

statusBar?: HTMLElement;
suspended = false;
_suspended = false;
get suspended() {
return this._suspended || !this.settings?.isConfigured;
}
set suspended(value: boolean) {
this._suspended = value;
}
deviceAndVaultName = "";
isMobile = false;
isReady = false;
packageVersion = "";
manifestVersion = "";
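One behavioural change worth calling out: `suspended` is now a derived property, so every consumer that checks `this.suspended` automatically treats an unconfigured vault as suspended, without each call site re-checking `settings.isConfigured`. A reduced sketch of the pattern, restating only what the hunk above shows:

```ts
// Minimal sketch of the derived-flag pattern used above.
class SyncStateExample {
    settings?: { isConfigured: boolean };
    private _suspended = false;

    // Reads as "explicitly suspended OR not configured yet".
    get suspended(): boolean {
        return this._suspended || !this.settings?.isConfigured;
    }
    set suspended(value: boolean) {
        this._suspended = value;
    }
}

const state = new SyncStateExample();
state.suspended = false;
console.log(state.suspended); // stays true until settings.isConfigured becomes true
```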
@@ -111,6 +116,8 @@ export default class ObsidianLiveSyncPlugin extends Plugin
return this.isMobile;
}

requestCount = reactiveSource(0);
responseCount = reactiveSource(0);
processReplication = (e: PouchDB.Core.ExistingDocument<EntryDoc>[]) => this.parseReplicationResult(e);
async connectRemoteCouchDB(uri: string, auth: { username: string; password: string }, disableRequestURI: boolean, passphrase: string | false, useDynamicIterationCount: boolean, performSetup: boolean, skipInfo: boolean): Promise<string | { db: PouchDB.Database<EntryDoc>; info: PouchDB.Core.DatabaseInfo }> {
if (!isValidRemoteCouchDBURI(uri)) return "Remote URI is not valid";
@@ -166,6 +173,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
};

try {
this.requestCount.value = this.requestCount.value + 1;
const r = await fetchByAPI(requestParam);
if (method == "POST" || method == "PUT") {
this.last_successful_post = r.status - (r.status % 100) == 200;
@@ -187,12 +195,15 @@ export default class ObsidianLiveSyncPlugin extends Plugin
}
Logger(ex);
throw ex;
} finally {
this.responseCount.value = this.responseCount.value + 1;
}
}

// -old implementation

try {
this.requestCount.value = this.requestCount.value + 1;
const response: Response = await fetch(url, opts);
if (method == "POST" || method == "PUT") {
this.last_successful_post = response.ok;
@@ -200,6 +211,15 @@ export default class ObsidianLiveSyncPlugin extends Plugin
this.last_successful_post = true;
}
Logger(`HTTP:${method}${size} to:${localURL} -> ${response.status}`, LOG_LEVEL_DEBUG);
if (Math.floor(response.status / 100) !== 2) {
const r = response.clone();
Logger(`The request may have failed. The reason sent by the server: ${r.status}: ${r.statusText}`);
try {
Logger(await (await r.blob()).text(), LOG_LEVEL_VERBOSE);
} catch (_) {
Logger("Cloud not parse response", LOG_LEVEL_VERBOSE);
}
}
return response;
} catch (ex) {
Logger(`HTTP:${method}${size} to:${localURL} -> failed`, LOG_LEVEL_VERBOSE);
@@ -209,6 +229,8 @@ export default class ObsidianLiveSyncPlugin extends Plugin
}
Logger(ex);
throw ex;
} finally {
this.responseCount.value = this.responseCount.value + 1;
}
// return await fetch(url, opts);
},
@@ -234,6 +256,22 @@ export default class ObsidianLiveSyncPlugin extends Plugin
}
}

get isMobile() {
// @ts-ignore: internal API
return this.app.isMobile
}

get vaultName() {
return this.app.vault.getName()
}
getActiveFile() {
return this.app.workspace.getActiveFile();
}

get appId() {
return `${("appId" in this.app ? this.app.appId : "")}`;
}

id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
const tempId = id2path(id, entry);
if (stripPrefix && isInternalMetadata(tempId)) {
@@ -295,13 +333,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
// end interfaces

getVaultName(): string {
return this.app.vault.getName() + (this.settings?.additionalSuffixOfDatabaseName ? ("-" + this.settings.additionalSuffixOfDatabaseName) : "");
}

setInterval(handler: () => any, timeout?: number): number {
const timer = window.setInterval(handler, timeout);
this.registerInterval(timer);
return timer;
return this.vaultName + (this.settings?.additionalSuffixOfDatabaseName ? ("-" + this.settings.additionalSuffixOfDatabaseName) : "");
}

isFlagFileExist(path: string) {
@@ -347,7 +379,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
}
notes.sort((a, b) => b.mtime - a.mtime);
const notesList = notes.map(e => e.dispPath);
const target = await askSelectString(this.app, "File to view History", notesList);
const target = await this.askSelectString("File to view History", notesList);
if (target) {
const targetId = notes.find(e => e.dispPath == target);
this.showHistory(targetId.path, targetId.id);
@@ -365,7 +397,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
Logger("There are no conflicted documents", LOG_LEVEL_NOTICE);
return false;
}
const target = await askSelectString(this.app, "File to resolve conflict", notesList);
const target = await this.askSelectString("File to resolve conflict", notesList);
if (target) {
const targetItem = notes.find(e => e.dispPath == target);
this.resolveConflicted(targetItem.path);
@@ -403,7 +435,6 @@ export default class ObsidianLiveSyncPlugin extends Plugin
if (notes.length == 0) {
Logger("There are no old documents");
Logger(`Checking expired file history done`);

return;
}
for (const v of notes) {
@@ -420,6 +451,40 @@ export default class ObsidianLiveSyncPlugin extends Plugin
Logger(`Something went wrong! The local database is not ready`, LOG_LEVEL_NOTICE);
return;
}
if (!this.settings.isConfigured) {
const message = `Hello and welcome to Self-hosted LiveSync.

Your device seems to **not be configured yet**. Please finish the setup and synchronise your vaults!

Click anywhere to stop counting down.

## At the first device
- With Setup URI -> Use \`Use the copied setup URI\`.
If you have configured it automatically, you should have one.
- Without Setup URI -> Use \`Setup wizard\` in setting dialogue. **\`Minimal setup\` is recommended**.
- What is the Setup URI? -> Do not worry! We have [some docs](https://github.com/vrtmrz/obsidian-livesync/blob/main/README.md#how-to-use) now. Please refer to them once.

## At the subsequent device
- With Setup URI -> Use \`Use the copied setup URI\`.
If you do not have it yet, you can copy it on the first device.
- Without Setup URI -> Use \`Setup wizard\` in setting dialogue, but **strongly recommends using setup URI**.
`
const OPEN_SETUP = "Open setting dialog";
const USE_SETUP = "Use the copied setup URI";
const DISMISS = "Dismiss";

const ret = await confirmWithMessage(this, "Welcome to Self-hosted LiveSync", message, [USE_SETUP, OPEN_SETUP, DISMISS], DISMISS, 40);
if (ret === OPEN_SETUP) {
try {
this.openSetting();
} catch (ex) {
Logger("Something went wrong on opening setting dialog, please open it manually", LOG_LEVEL_NOTICE);
}
} else if (ret == USE_SETUP) {
fireAndForget(this.addOnSetup.command_openSetupURI());
}
return;
}

try {
|
||||
if (this.isRedFlagRaised() || this.isRedFlag2Raised() || this.isRedFlag3Raised()) {
|
||||
@@ -432,22 +497,20 @@ export default class ObsidianLiveSyncPlugin extends Plugin
|
||||
Logger(`${FLAGMD_REDFLAG2} or ${FLAGMD_REDFLAG2_HR} has been detected! Self-hosted LiveSync suspends all sync and rebuild everything.`, LOG_LEVEL_NOTICE);
|
||||
await this.addOnSetup.rebuildEverything();
|
||||
await this.deleteRedFlag2();
|
||||
if (await askYesNo(this.app, "Do you want to disable Suspend file watching and restart obsidian now?") == "yes") {
|
||||
if (await this.askYesNo("Do you want to disable Suspend file watching and restart obsidian now?") == "yes") {
|
||||
this.settings.suspendFileWatching = false;
|
||||
await this.saveSettings();
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("app:reload")
|
||||
this.performAppReload();
|
||||
}
|
||||
} else if (this.isRedFlag3Raised()) {
|
||||
Logger(`${FLAGMD_REDFLAG3} or ${FLAGMD_REDFLAG3_HR} has been detected! Self-hosted LiveSync will discard the local database and fetch everything from the remote once again.`, LOG_LEVEL_NOTICE);
|
||||
await this.addOnSetup.fetchLocal();
|
||||
await this.deleteRedFlag3();
|
||||
if (this.settings.suspendFileWatching) {
|
||||
if (await askYesNo(this.app, "Do you want to disable Suspend file watching and restart obsidian now?") == "yes") {
|
||||
if (await this.askYesNo("Do you want to disable Suspend file watching and restart obsidian now?") == "yes") {
|
||||
this.settings.suspendFileWatching = false;
|
||||
await this.saveSettings();
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("app:reload")
|
||||
this.performAppReload();
|
||||
}
|
||||
}
|
||||
} else {
|
||||
@@ -496,8 +559,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
|
||||
this.askInPopup(`conflicting-detected-on-safety`, `Some files have been left conflicted! Press {HERE} to resolve them, or you can do it later by "Pick a file to resolve conflict`, (anchor) => {
|
||||
anchor.text = "HERE";
|
||||
anchor.addEventListener("click", () => {
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("obsidian-livesync:livesync-all-conflictcheck");
|
||||
this.performCommand("obsidian-livesync:livesync-all-conflictcheck");
|
||||
});
|
||||
}
|
||||
);
|
||||
@@ -579,7 +641,7 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
// this.localDatabase.getDBEntry(getPathFromTFile(file), {}, true, false);
|
||||
// },
|
||||
callback: () => {
|
||||
const file = this.app.workspace.getActiveFile();
|
||||
const file = this.getActiveFile();
|
||||
if (!file) return;
|
||||
this.localDatabase.getDBEntry(getPathFromTFile(file), {}, true, false);
|
||||
},
|
||||
@@ -628,7 +690,7 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
id: "livesync-history",
|
||||
name: "Show history",
|
||||
callback: () => {
|
||||
const file = this.app.workspace.getActiveFile();
|
||||
const file = this.getActiveFile();
|
||||
if (file) this.showHistory(file, null);
|
||||
}
|
||||
});
|
||||
@@ -733,16 +795,17 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
const lsKey = "obsidian-live-sync-ver" + this.getVaultName();
|
||||
const last_version = localStorage.getItem(lsKey);
|
||||
this.observeForLogs();
|
||||
this.statusBar = this.addStatusBarItem();
|
||||
this.statusBar.addClass("syncstatusbar");
|
||||
if (this.settings.showStatusOnStatusbar) {
|
||||
this.statusBar = this.addStatusBarItem();
|
||||
this.statusBar.addClass("syncstatusbar");
|
||||
}
|
||||
const lastVersion = ~~(versionNumberString2Number(manifestVersion) / 1000);
|
||||
if (lastVersion > this.settings.lastReadUpdates) {
|
||||
if (lastVersion > this.settings.lastReadUpdates && this.settings.isConfigured) {
|
||||
Logger("Self-hosted LiveSync has undergone a major upgrade. Please open the setting dialog, and check the information pane.", LOG_LEVEL_NOTICE);
|
||||
}
|
||||
|
||||
//@ts-ignore
|
||||
if (this.app.isMobile) {
|
||||
this.isMobile = true;
|
||||
if (this.isMobile) {
|
||||
this.settings.disableRequestURI = true;
|
||||
}
|
||||
if (last_version && Number(last_version) < VER) {
|
||||
@@ -819,8 +882,6 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
}
|
||||
const vaultName = this.getVaultName();
|
||||
Logger("Waiting for ready...");
|
||||
//@ts-ignore
|
||||
this.isMobile = this.app.isMobile;
|
||||
this.localDatabase = new LiveSyncLocalDB(vaultName, this);
|
||||
initializeStores(vaultName);
|
||||
return await this.localDatabase.initializeDatabase();
|
||||
@@ -877,6 +938,16 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
|
||||
async loadSettings() {
|
||||
const settings = Object.assign({}, DEFAULT_SETTINGS, await this.loadData()) as ObsidianLiveSyncSettings;
|
||||
|
||||
if (typeof settings.isConfigured == "undefined") {
|
||||
// If migrated, mark true
|
||||
if (JSON.stringify(settings) !== JSON.stringify(DEFAULT_SETTINGS)) {
|
||||
settings.isConfigured = true;
|
||||
} else {
|
||||
settings.additionalSuffixOfDatabaseName = this.appId;
|
||||
settings.isConfigured = false;
|
||||
}
|
||||
}
|
||||
const passphrase = await this.getPassphrase(settings);
|
||||
if (passphrase === false) {
|
||||
Logger("Could not determine passphrase for reading data.json! DO NOT synchronize with the remote before making sure your configuration is!", LOG_LEVEL_URGENT);
|
||||
@@ -997,7 +1068,7 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
}
|
||||
|
||||
async parseSettingFromMarkdown(filename: string, data?: string) {
|
||||
const file = this.app.vault.getAbstractFileByPath(filename);
|
||||
const file = this.vaultAccess.getAbstractFileByPath(filename);
|
||||
if (!(file instanceof TFile)) return {
|
||||
preamble: "",
|
||||
body: "",
|
||||
@@ -1006,7 +1077,7 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
if (data) {
|
||||
return this.extractSettingFromWholeText(data);
|
||||
}
|
||||
const parseData = data ?? await this.app.vault.read(file);
|
||||
const parseData = data ?? await this.vaultAccess.vaultRead(file);
|
||||
return this.extractSettingFromWholeText(parseData);
|
||||
}
|
||||
|
||||
@@ -1055,7 +1126,7 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
const APPLY_AND_REBUILD = "Apply settings and restart obsidian with red_flag_rebuild.md";
|
||||
const APPLY_AND_FETCH = "Apply settings and restart obsidian with red_flag_fetch.md";
|
||||
const CANCEL = "Cancel";
|
||||
const result = await askSelectString(this.app, "Ready for apply the setting.", [APPLY_AND_RESTART, APPLY_ONLY, APPLY_AND_FETCH, APPLY_AND_REBUILD, CANCEL]);
|
||||
const result = await this.askSelectString("Ready for apply the setting.", [APPLY_AND_RESTART, APPLY_ONLY, APPLY_AND_FETCH, APPLY_AND_REBUILD, CANCEL]);
|
||||
if (result == APPLY_ONLY || result == APPLY_AND_RESTART || result == APPLY_AND_REBUILD || result == APPLY_AND_FETCH) {
|
||||
this.settings = settingToApply;
|
||||
await this.saveSettingData();
|
||||
@@ -1064,13 +1135,12 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
return;
|
||||
}
|
||||
if (result == APPLY_AND_REBUILD) {
|
||||
await this.app.vault.create(FLAGMD_REDFLAG2_HR, "");
|
||||
await this.vaultAccess.vaultCreate(FLAGMD_REDFLAG2_HR, "");
|
||||
}
|
||||
if (result == APPLY_AND_FETCH) {
|
||||
await this.app.vault.create(FLAGMD_REDFLAG3_HR, "");
|
||||
await this.vaultAccess.vaultCreate(FLAGMD_REDFLAG3_HR, "");
|
||||
}
|
||||
// @ts-ignore
|
||||
this.app.commands.executeCommandById("app:reload");
|
||||
this.performAppReload();
|
||||
}
|
||||
}
|
||||
)
|
||||
@@ -1090,11 +1160,11 @@ Note: We can always able to read V1 format. It will be progressively converted.
|
||||
|
||||
async saveSettingToMarkdown(filename: string) {
|
||||
const saveData = this.generateSettingForMarkdown();
|
||||
let file = this.app.vault.getAbstractFileByPath(filename);
|
||||
let file = this.vaultAccess.getAbstractFileByPath(filename);
|
||||
|
||||
|
||||
if (!file) {
|
||||
await this.ensureDirectoryEx(filename);
|
||||
await this.vaultAccess.ensureDirectory(filename);
|
||||
const initialContent = `This file contains Self-hosted LiveSync settings as YAML.
|
||||
Except for the \`livesync-setting\` code block, we can add a note for free.
|
||||
|
||||
@@ -1107,21 +1177,21 @@ We can perform a command in this file.
|
||||
|
||||
|
||||
`
|
||||
file = await this.app.vault.create(filename, initialContent + SETTING_HEADER + "\n" + SETTING_FOOTER);
|
||||
file = await this.vaultAccess.vaultCreate(filename, initialContent + SETTING_HEADER + "\n" + SETTING_FOOTER);
|
||||
}
|
||||
if (!(file instanceof TFile)) {
|
||||
Logger(`Markdown Setting: ${filename} already exists as a folder`, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
}
|
||||
|
||||
const data = await this.app.vault.read(file);
|
||||
const data = await this.vaultAccess.vaultRead(file);
|
||||
const { preamble, body, postscript } = this.extractSettingFromWholeText(data);
|
||||
const newBody = stringifyYaml(saveData);
|
||||
|
||||
if (newBody == body) {
|
||||
Logger("Markdown setting: Nothing had been changed", LOG_LEVEL_VERBOSE);
|
||||
} else {
|
||||
await this.app.vault.modify(file, preamble + SETTING_HEADER + newBody + SETTING_FOOTER + postscript);
|
||||
await this.vaultAccess.vaultModify(file, preamble + SETTING_HEADER + newBody + SETTING_FOOTER + postscript);
|
||||
Logger(`Markdown setting: ${filename} has been updated!`, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
}
|
||||
@@ -1164,8 +1234,7 @@ We can perform a command in this file.
|
||||
const _this = this;
|
||||
//@ts-ignore
|
||||
window.CodeMirrorAdapter.commands.save = () => {
|
||||
//@ts-ignore
|
||||
_this.app.commands.executeCommandById('editor:save-file');
|
||||
_this.performCommand('editor:save-file');
|
||||
};
|
||||
}
|
||||
registerWatchEvents() {
|
||||
@@ -1175,7 +1244,6 @@ We can perform a command in this file.
|
||||
this.registerDomEvent(window, "offline", this.watchOnline);
|
||||
}
|
||||
|
||||
|
||||
watchOnline() {
|
||||
scheduleTask("watch-online", 500, () => fireAndForget(() => this.watchOnlineAsync()));
|
||||
}
|
||||
@@ -1193,8 +1261,8 @@ We can perform a command in this file.
|
||||
|
||||
async watchWindowVisibilityAsync() {
|
||||
if (this.settings.suspendFileWatching) return;
|
||||
if (!this.settings.isConfigured) return;
|
||||
if (!this.isReady) return;
|
||||
// if (this.suspended) return;
|
||||
const isHidden = document.hidden;
|
||||
await this.applyBatchChange();
|
||||
if (isHidden) {
|
||||
@@ -1272,12 +1340,12 @@ We can perform a command in this file.
|
||||
return;
|
||||
}
|
||||
|
||||
const cache = queue.args.cache;
|
||||
// const cache = queue.args.cache;
|
||||
if (queue.type == "CREATE" || queue.type == "CHANGED") {
|
||||
fireAndForget(() => this.checkAndApplySettingFromMarkdown(queue.args.file.path, true));
|
||||
const keyD1 = `file-last-proc-DELETED-${file.path}`;
|
||||
await this.kvDB.set(keyD1, mtime);
|
||||
if (!await this.updateIntoDB(targetFile, false, cache)) {
|
||||
if (!await this.updateIntoDB(targetFile, undefined)) {
|
||||
Logger(`STORAGE -> DB: failed, cancel the relative operations: ${targetFile.path}`, LOG_LEVEL_INFO);
|
||||
// cancel running queues and remove one of atomic operation
|
||||
this.cancelRelativeEvent(queue);
|
||||
@@ -1307,6 +1375,7 @@ We can perform a command in this file.
|
||||
|
||||
watchWorkspaceOpen(file: TFile | null) {
|
||||
if (this.settings.suspendFileWatching) return;
|
||||
if (!this.settings.isConfigured) return;
|
||||
if (!this.isReady) return;
|
||||
if (!file) return;
|
||||
scheduleTask("watch-workspace-open", 500, () => fireAndForget(() => this.watchWorkspaceOpenAsync(file)));
|
||||
@@ -1314,6 +1383,7 @@ We can perform a command in this file.
|
||||
|
||||
async watchWorkspaceOpenAsync(file: TFile) {
|
||||
if (this.settings.suspendFileWatching) return;
|
||||
if (!this.settings.isConfigured) return;
|
||||
if (!this.isReady) return;
|
||||
await this.applyBatchChange();
|
||||
if (file == null) {
|
||||
@@ -1348,7 +1418,7 @@ We can perform a command in this file.
|
||||
if (file instanceof TFile) {
|
||||
try {
|
||||
// Logger(`RENAMING.. ${file.path} into db`);
|
||||
if (await this.updateIntoDB(file, false, cache)) {
|
||||
if (await this.updateIntoDB(file, cache)) {
|
||||
// Logger(`deleted ${oldFile} from db`);
|
||||
await this.deleteFromDBbyPath(oldFile);
|
||||
} else {
|
||||
@@ -1394,9 +1464,9 @@ We can perform a command in this file.
|
||||
const logDate = `${PREFIXMD_LOGFILE}${time}.md`;
|
||||
const file = this.vaultAccess.getAbstractFileByPath(normalizePath(logDate));
|
||||
if (!file) {
|
||||
this.app.vault.adapter.append(normalizePath(logDate), "```\n");
|
||||
this.vaultAccess.adapterAppend(normalizePath(logDate), "```\n");
|
||||
}
|
||||
this.app.vault.adapter.append(normalizePath(logDate), vaultName + ":" + newMessage + "\n");
|
||||
this.vaultAccess.adapterAppend(normalizePath(logDate), vaultName + ":" + newMessage + "\n");
|
||||
}
|
||||
recentLogProcessor.enqueue(newMessage);
|
||||
|
||||
@@ -1434,28 +1504,6 @@ We can perform a command in this file.
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
async ensureDirectory(fullPath: string) {
|
||||
const pathElements = fullPath.split("/");
|
||||
pathElements.pop();
|
||||
let c = "";
|
||||
for (const v of pathElements) {
|
||||
c += v;
|
||||
try {
|
||||
await this.app.vault.createFolder(c);
|
||||
} catch (ex) {
|
||||
// basically skip exceptions.
|
||||
if (ex.message && ex.message == "Folder already exists.") {
|
||||
// especially this message is.
|
||||
} else {
|
||||
Logger("Folder Create Error");
|
||||
Logger(ex);
|
||||
}
|
||||
}
|
||||
c += "/";
|
||||
}
|
||||
}
|
||||
|
||||
async processEntryDoc(docEntry: EntryBody, file: TFile | undefined, force?: boolean) {
|
||||
const mode = file == undefined ? "create" : "modify";
|
||||
|
||||
@@ -1514,8 +1562,8 @@ We can perform a command in this file.
|
||||
Logger(msg + "ERROR, invalid path: " + path, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
}
|
||||
const writeData = doc.datatype == "newnote" ? decodeBinary(doc.data) : getDocData(doc.data);
|
||||
await this.ensureDirectoryEx(path);
|
||||
const writeData = readContent(doc);
|
||||
await this.vaultAccess.ensureDirectory(path);
|
||||
try {
|
||||
let outFile;
|
||||
let isChanged = true;
|
||||
@@ -1530,7 +1578,7 @@ We can perform a command in this file.
|
||||
if (isChanged) {
|
||||
Logger(msg + path);
|
||||
this.vaultAccess.touch(outFile);
|
||||
this.app.vault.trigger(mode, outFile);
|
||||
this.vaultAccess.trigger(mode, outFile);
|
||||
} else {
|
||||
Logger(msg + "Skipped, the file is the same: " + path, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
@@ -1564,7 +1612,7 @@ We can perform a command in this file.
|
||||
queueConflictCheck(file: FilePathWithPrefix | TFile) {
|
||||
const path = file instanceof TFile ? getPathFromTFile(file) : file;
|
||||
if (this.settings.checkConflictOnlyOnOpen) {
|
||||
const af = this.app.workspace.getActiveFile();
|
||||
const af = this.getActiveFile();
|
||||
if (af && af.path != path) {
|
||||
Logger(`${file} is conflicted, merging process has been postponed.`, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
@@ -1579,21 +1627,22 @@ We can perform a command in this file.
|
||||
localStorage.setItem(lsKey, saveData);
|
||||
}
|
||||
async loadQueuedFiles() {
|
||||
if (!this.settings.suspendParseReplicationResult) {
|
||||
const lsKey = "obsidian-livesync-queuefiles-" + this.getVaultName();
|
||||
const ids = [...new Set(JSON.parse(localStorage.getItem(lsKey) || "[]"))] as string[];
|
||||
const batchSize = 100;
|
||||
const chunkedIds = arrayToChunkedArray(ids, batchSize);
|
||||
for await (const idsBatch of chunkedIds) {
|
||||
const ret = await this.localDatabase.allDocsRaw<EntryDoc>({ keys: idsBatch, include_docs: true, limit: 100 });
|
||||
this.replicationResultProcessor.enqueueAll(ret.rows.map(doc => doc.doc));
|
||||
await this.replicationResultProcessor.waitForPipeline();
|
||||
}
|
||||
if (this.settings.suspendParseReplicationResult) return;
|
||||
if (!this.settings.isConfigured) return;
|
||||
const lsKey = "obsidian-livesync-queuefiles-" + this.getVaultName();
|
||||
const ids = [...new Set(JSON.parse(localStorage.getItem(lsKey) || "[]"))] as string[];
|
||||
const batchSize = 100;
|
||||
const chunkedIds = arrayToChunkedArray(ids, batchSize);
|
||||
for await (const idsBatch of chunkedIds) {
|
||||
const ret = await this.localDatabase.allDocsRaw<EntryDoc>({ keys: idsBatch, include_docs: true, limit: 100 });
|
||||
this.replicationResultProcessor.enqueueAll(ret.rows.map(doc => doc.doc));
|
||||
await this.replicationResultProcessor.waitForPipeline();
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
databaseQueueCount = reactiveSource(0);
|
||||
databaseQueuedProcessor = new KeyedQueueProcessor(async (docs: EntryBody[]) => {
|
||||
databaseQueuedProcessor = new QueueProcessor(async (docs: EntryBody[]) => {
|
||||
const dbDoc = docs[0];
|
||||
const path = this.getPath(dbDoc);
|
||||
// If `Read chunks online` is disabled, chunks should be transferred before here.
|
||||
@@ -1620,13 +1669,13 @@ We can perform a command in this file.
|
||||
storageApplyingProcessor = new KeyedQueueProcessor(async (docs: LoadedEntry[]) => {
|
||||
const entry = docs[0];
|
||||
const path = this.getPath(entry);
|
||||
Logger(`Processing ${path} (${entry._id.substring(0, 8)}: ${entry._rev?.substring(0, 5)}) change...`, LOG_LEVEL_VERBOSE);
|
||||
Logger(`Processing ${path} (${entry._id.substring(0, 8)}: ${entry._rev?.substring(0, 5)}) :Started...`, LOG_LEVEL_VERBOSE);
|
||||
const targetFile = this.vaultAccess.getAbstractFileByPath(this.getPathWithoutPrefix(entry));
|
||||
if (targetFile instanceof TFolder) {
|
||||
Logger(`${this.getPath(entry)} is already exist as the folder`);
|
||||
} else {
|
||||
await this.processEntryDoc(entry, targetFile instanceof TFile ? targetFile : undefined);
|
||||
Logger(`Processing ${path} (${entry._id.substring(0, 8)}:${entry._rev?.substring(0, 5)}) `);
|
||||
Logger(`Processing ${path} (${entry._id.substring(0, 8)} :${entry._rev?.substring(0, 5)}) : Done`);
|
||||
}
|
||||
|
||||
return;
|
||||
@@ -1670,7 +1719,7 @@ We can perform a command in this file.
|
||||
Logger(`Processing ${change.path} has been skipped due to file size exceeding the limit`, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
}
|
||||
this.databaseQueuedProcessor.enqueueWithKey(change.path, change);
|
||||
this.databaseQueuedProcessor.enqueue(change);
|
||||
}
|
||||
return;
|
||||
}, { batchSize: 1, suspended: true, concurrentLimit: 100, delay: 0, totalRemainingReactiveSource: this.replicationResultCount }).startPipeline().onUpdateProgress(() => {
|
||||
@@ -1730,6 +1779,10 @@ We can perform a command in this file.
|
||||
const labelConflictProcessCount = conflictProcessCount ? `🔩${conflictProcessCount} ` : "";
|
||||
return `${labelReplication}${labelDBCount}${labelStorageCount}${labelChunkCount}${labelPluginScanCount}${labelHiddenFilesCount}${labelConflictProcessCount}`;
|
||||
})
|
||||
const requestingStatLabel = reactive(() => {
|
||||
const diff = this.requestCount.value - this.responseCount.value;
|
||||
return diff != 0 ? "📲 " : "";
|
||||
})
|
||||
|
||||
const replicationStatLabel = reactive(() => {
|
||||
const e = this.replicationStat.value;
|
||||
@@ -1779,8 +1832,9 @@ We can perform a command in this file.
|
||||
const { w, sent, pushLast, arrived, pullLast } = replicationStatLabel.value;
|
||||
const queued = queueCountLabel.value;
|
||||
const waiting = waitingLabel.value;
|
||||
const networkActivity = requestingStatLabel.value;
|
||||
return {
|
||||
message: `Sync: ${w} ↑${sent}${pushLast} ↓${arrived}${pullLast}${waiting} ${queued}`,
|
||||
message: `${networkActivity}Sync: ${w} ↑${sent}${pushLast} ↓${arrived}${pullLast}${waiting} ${queued}`,
|
||||
};
|
||||
})
|
||||
const statusBarLabels = reactive(() => {
|
||||
@@ -1810,22 +1864,22 @@ We can perform a command in this file.
|
||||
const newLog = log;
|
||||
// scheduleTask("update-display", 50, () => {
|
||||
this.statusBar?.setText(newMsg.split("\n")[0]);
|
||||
const selector = `.CodeMirror-wrap,` +
|
||||
`.markdown-preview-view.cm-s-obsidian,` +
|
||||
`.markdown-source-view.cm-s-obsidian,` +
|
||||
`.canvas-wrapper,` +
|
||||
`.empty-state`
|
||||
;
|
||||
// const selector = `.CodeMirror-wrap,` +
|
||||
// `.markdown-preview-view.cm-s-obsidian,` +
|
||||
// `.markdown-source-view.cm-s-obsidian,` +
|
||||
// `.canvas-wrapper,` +
|
||||
// `.empty-state`
|
||||
// ;
|
||||
if (this.settings.showStatusOnEditor) {
|
||||
const root = activeDocument.documentElement;
|
||||
const q = root.querySelectorAll(selector);
|
||||
q.forEach(e => e.setAttr("data-log", '' + (newMsg + "\n" + newLog) + ''))
|
||||
root.style.setProperty("--sls-log-text", "'" + (newMsg + "\\A " + newLog) + "'");
|
||||
} else {
|
||||
const root = activeDocument.documentElement;
|
||||
const q = root.querySelectorAll(selector);
|
||||
q.forEach(e => e.setAttr("data-log", ''))
|
||||
// const root = activeDocument.documentElement;
|
||||
// root.style.setProperty("--log-text", "'" + (newMsg + "\\A " + newLog) + "'");
|
||||
}
|
||||
// }, true);
|
||||
|
||||
|
||||
scheduleTask("log-hide", 3000, () => { this.statusLog.value = "" });
|
||||
}
|
||||
|
||||
@@ -1953,7 +2007,12 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
|
||||
async syncAllFiles(showingNotice?: boolean) {
|
||||
// synchronize all files between database and storage.
|
||||
let initialScan = false;
|
||||
if (!this.settings.isConfigured) {
|
||||
if (showingNotice) {
|
||||
Logger("LiveSync is not configured yet. Synchronising between the storage and the local database is now prevented.", LOG_LEVEL_NOTICE, "syncAll");
|
||||
}
|
||||
return;
|
||||
}
|
||||
if (showingNotice) {
|
||||
Logger("Initializing", LOG_LEVEL_NOTICE, "syncAll");
|
||||
}
|
||||
@@ -1963,7 +2022,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
await this.collectDeletedFiles();
|
||||
|
||||
Logger("Collecting local files on the storage", LOG_LEVEL_VERBOSE);
|
||||
const filesStorageSrc = this.app.vault.getFiles();
|
||||
const filesStorageSrc = this.vaultAccess.getFiles();
|
||||
|
||||
const filesStorage = [] as typeof filesStorageSrc;
|
||||
for (const f of filesStorageSrc) {
|
||||
@@ -1986,11 +2045,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
}
|
||||
Logger("Opening the key-value database", LOG_LEVEL_VERBOSE);
|
||||
const isInitialized = await (this.kvDB.get<boolean>("initialized")) || false;
|
||||
// Make chunk bigger if it is the initial scan. There must be non-active docs.
|
||||
if (filesDatabase.length == 0 && !isInitialized) {
|
||||
initialScan = true;
|
||||
Logger("Database looks empty, save files as initial sync data");
|
||||
}
|
||||
|
||||
const onlyInStorage = filesStorage.filter((e) => filesDatabase.indexOf(getPathFromTFile(e)) == -1);
|
||||
const onlyInDatabase = filesDatabase.filter((e) => filesStorageName.indexOf(e) == -1);
|
||||
|
||||
@@ -2031,69 +2086,62 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
}
|
||||
initProcess.push(runAll("UPDATE DATABASE", onlyInStorage, async (e) => {
|
||||
if (!this.isFileSizeExceeded(e.stat.size)) {
|
||||
await this.updateIntoDB(e, initialScan);
|
||||
await this.updateIntoDB(e);
|
||||
fireAndForget(() => this.checkAndApplySettingFromMarkdown(e.path, true));
|
||||
} else {
|
||||
Logger(`UPDATE DATABASE: ${e.path} has been skipped due to file size exceeding the limit`, logLevel);
|
||||
}
|
||||
}));
|
||||
if (!initialScan) {
|
||||
initProcess.push(runAll("UPDATE STORAGE", onlyInDatabase, async (e) => {
|
||||
const w = await this.localDatabase.getDBEntryMeta(e, {}, true);
|
||||
if (w && !(w.deleted || w._deleted)) {
|
||||
if (!this.isFileSizeExceeded(w.size)) {
|
||||
await this.pullFile(e, filesStorage, false, null, false);
|
||||
fireAndForget(() => this.checkAndApplySettingFromMarkdown(e, true));
|
||||
Logger(`Check or pull from db:${e} OK`);
|
||||
} else {
|
||||
Logger(`UPDATE STORAGE: ${e} has been skipped due to file size exceeding the limit`, logLevel);
|
||||
}
|
||||
} else if (w) {
|
||||
Logger(`Deletion history skipped: ${e}`, LOG_LEVEL_VERBOSE);
|
||||
initProcess.push(runAll("UPDATE STORAGE", onlyInDatabase, async (e) => {
|
||||
const w = await this.localDatabase.getDBEntryMeta(e, {}, true);
|
||||
if (w && !(w.deleted || w._deleted)) {
|
||||
if (!this.isFileSizeExceeded(w.size)) {
|
||||
await this.pullFile(e, filesStorage, false, null, false);
|
||||
fireAndForget(() => this.checkAndApplySettingFromMarkdown(e, true));
|
||||
Logger(`Check or pull from db:${e} OK`);
|
||||
} else {
|
||||
Logger(`entry not found: ${e}`);
|
||||
Logger(`UPDATE STORAGE: ${e} has been skipped due to file size exceeding the limit`, logLevel);
|
||||
}
|
||||
}));
|
||||
}
|
||||
if (!initialScan) {
|
||||
// let caches: { [key: string]: { storageMtime: number; docMtime: number } } = {};
|
||||
// caches = await this.kvDB.get<{ [key: string]: { storageMtime: number; docMtime: number } }>("diff-caches") || {};
|
||||
type FileDocPair = { file: TFile, id: DocumentID };
|
||||
} else if (w) {
|
||||
Logger(`Deletion history skipped: ${e}`, LOG_LEVEL_VERBOSE);
|
||||
} else {
|
||||
Logger(`entry not found: ${e}`);
|
||||
}
|
||||
}));
|
||||
type FileDocPair = { file: TFile, id: DocumentID };
|
||||
|
||||
const processPrepareSyncFile = new QueueProcessor(
|
||||
async (files) => {
|
||||
const file = files[0];
|
||||
const id = await this.path2id(getPathFromTFile(file));
|
||||
const pair: FileDocPair = { file, id };
|
||||
return [pair];
|
||||
// processSyncFile.enqueue(pair);
|
||||
}
|
||||
, { batchSize: 1, concurrentLimit: 10, delay: 0, suspended: true }, syncFiles);
|
||||
processPrepareSyncFile
|
||||
.pipeTo(
|
||||
new QueueProcessor(
|
||||
async (pairs) => {
|
||||
const docs = await this.localDatabase.allDocsRaw<EntryDoc>({ keys: pairs.map(e => e.id), include_docs: true });
|
||||
const docsMap = docs.rows.reduce((p, c) => ({ ...p, [c.id]: c.doc }), {} as Record<DocumentID, EntryDoc>);
|
||||
const syncFilesToSync = pairs.map((e) => ({ file: e.file, doc: docsMap[e.id] as LoadedEntry }));
|
||||
return syncFilesToSync;
|
||||
}
|
||||
, { batchSize: 10, concurrentLimit: 5, delay: 10, suspended: false }))
|
||||
.pipeTo(
|
||||
new QueueProcessor(
|
||||
async (loadedPairs) => {
|
||||
const e = loadedPairs[0];
|
||||
await this.syncFileBetweenDBandStorage(e.file, e.doc, initialScan);
|
||||
return;
|
||||
}, { batchSize: 1, concurrentLimit: 5, delay: 10, suspended: false }
|
||||
))
|
||||
const processPrepareSyncFile = new QueueProcessor(
|
||||
async (files) => {
|
||||
const file = files[0];
|
||||
const id = await this.path2id(getPathFromTFile(file));
|
||||
const pair: FileDocPair = { file, id };
|
||||
return [pair];
|
||||
// processSyncFile.enqueue(pair);
|
||||
}
|
||||
, { batchSize: 1, concurrentLimit: 10, delay: 0, suspended: true }, syncFiles);
|
||||
processPrepareSyncFile
|
||||
.pipeTo(
|
||||
new QueueProcessor(
|
||||
async (pairs) => {
|
||||
const docs = await this.localDatabase.allDocsRaw<EntryDoc>({ keys: pairs.map(e => e.id), include_docs: true });
|
||||
const docsMap = Object.fromEntries(docs.rows.map(e => [e.id, e.doc]));
|
||||
const syncFilesToSync = pairs.map((e) => ({ file: e.file, doc: docsMap[e.id] as LoadedEntry }));
|
||||
return syncFilesToSync;
|
||||
}
|
||||
, { batchSize: 10, concurrentLimit: 5, delay: 10, suspended: false }))
|
||||
.pipeTo(
|
||||
new QueueProcessor(
|
||||
async (loadedPairs) => {
|
||||
const e = loadedPairs[0];
|
||||
await this.syncFileBetweenDBandStorage(e.file, e.doc);
|
||||
return;
|
||||
}, { batchSize: 1, concurrentLimit: 5, delay: 10, suspended: false }
|
||||
))
|
||||
|
||||
processPrepareSyncFile.startPipeline();
|
||||
initProcess.push(async () => {
|
||||
await processPrepareSyncFile.waitForPipeline();
|
||||
// await this.kvDB.set("diff-caches", caches);
|
||||
})
|
||||
}
|
||||
processPrepareSyncFile.startPipeline();
|
||||
initProcess.push(async () => {
|
||||
await processPrepareSyncFile.waitForPipeline();
|
||||
})
|
||||
await Promise.all(initProcess);
|
||||
|
||||
// this.setStatusBarText(`NOW TRACKING!`);
|
||||
@@ -2472,7 +2520,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
return;
|
||||
}
|
||||
if (this.settings.showMergeDialogOnlyOnActive) {
|
||||
const af = this.app.workspace.getActiveFile();
|
||||
const af = this.getActiveFile();
|
||||
if (af && af.path != filename) {
|
||||
Logger(`${filename} is conflicted. Merging process has been postponed to the file have got opened.`, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
@@ -2586,7 +2634,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
//when to opened file;
|
||||
}
|
||||
|
||||
async syncFileBetweenDBandStorage(file: TFile, doc: LoadedEntry, initialScan: boolean) {
|
||||
async syncFileBetweenDBandStorage(file: TFile, doc: LoadedEntry) {
|
||||
if (!doc) {
|
||||
throw new Error(`Missing doc:${(file as any).path}`)
|
||||
}
|
||||
@@ -2604,7 +2652,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
case BASE_IS_NEW:
|
||||
if (!this.isFileSizeExceeded(file.stat.size)) {
|
||||
Logger("STORAGE -> DB :" + file.path);
|
||||
await this.updateIntoDB(file, initialScan);
|
||||
await this.updateIntoDB(file);
|
||||
fireAndForget(() => this.checkAndApplySettingFromMarkdown(file.path, true));
|
||||
} else {
|
||||
Logger(`STORAGE -> DB : ${file.path} has been skipped due to file size exceeding the limit`, LOG_LEVEL_NOTICE);
|
||||
@@ -2633,46 +2681,29 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
|
||||
}
|
||||
|
||||
async updateIntoDB(file: TFile, initialScan?: boolean, cache?: CacheData, force?: boolean) {
|
||||
async updateIntoDB(file: TFile, cache?: CacheData, force?: boolean) {
|
||||
if (!await this.isTargetFile(file)) return true;
|
||||
if (shouldBeIgnored(file.path)) {
|
||||
return true;
|
||||
}
|
||||
let content: Blob;
|
||||
let datatype: "plain" | "newnote" = "newnote";
|
||||
if (!cache) {
|
||||
if (!isPlainText(file.name)) {
|
||||
Logger(`Reading : ${file.path}`, LOG_LEVEL_VERBOSE);
|
||||
const contentBin = await this.vaultAccess.vaultReadBinary(file);
|
||||
Logger(`Processing: ${file.path}`, LOG_LEVEL_VERBOSE);
|
||||
try {
|
||||
content = createBinaryBlob(contentBin);
|
||||
} catch (ex) {
|
||||
Logger(`The file ${file.path} could not be encoded`);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return false;
|
||||
}
|
||||
datatype = "newnote";
|
||||
} else {
|
||||
content = createTextBlob(await this.vaultAccess.vaultRead(file));
|
||||
datatype = "plain";
|
||||
}
|
||||
} else {
|
||||
if (cache instanceof ArrayBuffer) {
|
||||
Logger(`Cache Processing: ${file.path}`, LOG_LEVEL_VERBOSE);
|
||||
try {
|
||||
content = createBinaryBlob(cache);
|
||||
} catch (ex) {
|
||||
Logger(`The file ${file.path} could not be encoded`);
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return false;
|
||||
}
|
||||
datatype = "newnote"
|
||||
} else {
|
||||
content = createTextBlob(cache);
|
||||
datatype = "plain";
|
||||
}
|
||||
}
|
||||
// let content: Blob;
|
||||
// let datatype: "plain" | "newnote" = "newnote";
|
||||
const isPlain = isPlainText(file.name);
|
||||
const possiblyLarge = !isPlain;
|
||||
// if (!cache) {
|
||||
if (possiblyLarge) Logger(`Reading : ${file.path}`, LOG_LEVEL_VERBOSE);
|
||||
const content = createBlob(await this.vaultAccess.vaultReadAuto(file));
|
||||
const datatype = isPlain ? "plain" : "newnote";
|
||||
// }
|
||||
// else if (cache instanceof ArrayBuffer) {
|
||||
// Logger(`Cache Reading: ${file.path}`, LOG_LEVEL_VERBOSE);
|
||||
// content = createBinaryBlob(cache);
|
||||
// datatype = "newnote"
|
||||
// } else {
|
||||
// content = createTextBlob(cache);
|
||||
// datatype = "plain";
|
||||
// }
|
||||
if (possiblyLarge) Logger(`Processing: ${file.path}`, LOG_LEVEL_VERBOSE);
|
||||
const fullPath = getPathFromTFile(file);
|
||||
const id = await this.path2id(fullPath);
|
||||
const d: SavingEntry = {
|
||||
@@ -2718,7 +2749,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
Logger(msg + " Skip " + fullPath, LOG_LEVEL_VERBOSE);
|
||||
return true;
|
||||
}
|
||||
const ret = await this.localDatabase.putDBEntry(d, initialScan);
|
||||
const ret = await this.localDatabase.putDBEntry(d);
|
||||
if (ret !== false) {
|
||||
Logger(msg + fullPath);
|
||||
if (this.settings.syncOnSave && !this.suspended) {
|
||||
@@ -2758,27 +2789,6 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
await this.replicator.tryCreateRemoteDatabase(this.settings);
|
||||
}
|
||||
|
||||
async ensureDirectoryEx(fullPath: string) {
|
||||
const pathElements = fullPath.split("/");
|
||||
pathElements.pop();
|
||||
let c = "";
|
||||
for (const v of pathElements) {
|
||||
c += v;
|
||||
try {
|
||||
await this.app.vault.adapter.mkdir(c);
|
||||
} catch (ex) {
|
||||
// basically skip exceptions.
|
||||
if (ex.message && ex.message == "Folder already exists.") {
|
||||
// especially this message is.
|
||||
} else {
|
||||
Logger("Folder Create Error");
|
||||
Logger(ex);
|
||||
}
|
||||
}
|
||||
c += "/";
|
||||
}
|
||||
}
|
||||
|
||||
filterTargetFiles(files: InternalFileInfo[], targetFiles: string[] | false = false) {
|
||||
const ignorePatterns = this.settings.syncInternalFilesIgnorePatterns
|
||||
.replace(/\n| /g, "")
|
||||
@@ -2803,7 +2813,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
const mtimeB = ("mtime" in revBDoc && revBDoc.mtime) || 0;
|
||||
const delRev = mtimeA < mtimeB ? revA : revB;
|
||||
// delete older one.
|
||||
await this.localDatabase.removeRaw(id, delRev);
|
||||
await this.localDatabase.removeRevision(id, delRev);
|
||||
Logger(`Older one has been deleted:${this.getPath(doc)}`);
|
||||
return true;
|
||||
}
|
||||
@@ -2887,6 +2897,12 @@ Or if you are sure know what had been happened, we can unlock the database from
});
}

askYesNo(message: string): Promise<"yes" | "no"> {
return askYesNo(this.app, message);
}
askSelectString(message: string, items: string[]): Promise<string> {
return askSelectString(this.app, message, items);
}

askInPopup(key: string, dialogText: string, anchorCallback: (anchor: HTMLAnchorElement) => void) {

@@ -2905,7 +2921,6 @@ Or if you are sure know what had been happened, we can unlock the database from
const popupKey = "popup-" + key;
scheduleTask(popupKey, 1000, async () => {
const popup = await memoIfNotExist(popupKey, () => new Notice(fragment, 0));
//@ts-ignore
const isShown = popup?.noticeEl?.isShown();
if (!isShown) {
memoObject(popupKey, new Notice(fragment, 0));
@@ -2914,7 +2929,6 @@ Or if you are sure know what had been happened, we can unlock the database from
const popup = retrieveMemoObject<Notice>(popupKey);
if (!popup)
return;
//@ts-ignore
if (popup?.noticeEl?.isShown()) {
popup.hide();
}
@@ -2922,5 +2936,19 @@ Or if you are sure know what had been happened, we can unlock the database from
});
});
}
openSetting() {
//@ts-ignore: undocumented api
this.app.setting.open();
//@ts-ignore: undocumented api
this.app.setting.openTabById("obsidian-livesync");
}

performAppReload() {
this.performCommand("app:reload");
}
performCommand(id: string) {
// @ts-ignore
this.app.commands.executeCommandById(id)
}
}

24 src/utils.ts
@@ -242,14 +242,11 @@ export function mergeObject(
ret[key] = v;
}
}
const retSorted = Object.fromEntries(Object.entries(ret).sort((a, b) => a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0));
if (Array.isArray(objA) && Array.isArray(objB)) {
return Object.values(Object.entries(ret)
.sort()
.reduce((p, [key, value]) => ({ ...p, [key]: value }), {}));
return Object.values(retSorted);
}
return Object.entries(ret)
.sort()
.reduce((p, [key, value]) => ({ ...p, [key]: value }), {});
return retSorted;
}

export function flattenObject(obj: Record<string | number | symbol, any>, path: string[] = []): [string, any][] {
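The `mergeObject` change above replaces the two `reduce`-based re-orderings with a single `retSorted` built via `Object.fromEntries`; the result is the same key-sorted object (or, for two arrays, its values). A small illustration of the idiom in isolation, not of the full merge logic:

```ts
// Key-sorted copy of an object, as used for `retSorted` above.
function sortObjectByKey<T>(obj: Record<string, T>): Record<string, T> {
    return Object.fromEntries(
        Object.entries(obj).sort((a, b) => (a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0))
    );
}

sortObjectByKey({ b: 2, a: 1, c: 3 });                 // { a: 1, b: 2, c: 3 }
Object.values(sortObjectByKey({ 1: "x", 0: "y" }));    // ["y", "x"], the array-merge branch
```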
@@ -313,7 +310,7 @@ export function isCustomisationSyncMetadata(str: string): boolean {

export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
return new Promise((res) => {
const popover = new PopoverSelectString(app, message, null, null, (result) => res(result as "yes" | "no"));
const popover = new PopoverSelectString(app, message, undefined, undefined, (result) => res(result as "yes" | "no"));
popover.open();
});
};
@@ -327,7 +324,7 @@ export const askSelectString = (app: App, message: string, items: string[]): Pro
};


export const askString = (app: App, title: string, key: string, placeholder: string, isPassword?: boolean): Promise<string | false> => {
export const askString = (app: App, title: string, key: string, placeholder: string, isPassword: boolean = false): Promise<string | false> => {
return new Promise((res) => {
const dialog = new InputStringDialog(app, title, key, placeholder, isPassword, (result) => res(result));
dialog.open();
@@ -400,15 +397,18 @@ export const _requestToCouchDB = async (baseUri: string, username: string, passw
};
return await requestUrl(requestParam);
}
export const requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, key?: string, body?: string, method?: string) => {
export const requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string = "", key?: string, body?: string, method?: string) => {
const uri = `_node/_local/_config${key ? "/" + key : ""}`;
return await _requestToCouchDB(baseUri, username, password, origin, uri, body, method);
};

export async function performRebuildDB(plugin: ObsidianLiveSyncPlugin, method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice") {
export async function performRebuildDB(plugin: ObsidianLiveSyncPlugin, method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice" | "localOnlyWithChunks") {
if (method == "localOnly") {
await plugin.addOnSetup.fetchLocal();
}
if (method == "localOnlyWithChunks") {
await plugin.addOnSetup.fetchLocal(true);
}
if (method == "remoteOnly") {
await plugin.addOnSetup.rebuildRemote();
}
@@ -437,7 +437,7 @@ export function compareMTime(baseMTime: number, targetMTime: number): typeof BAS
export function markChangesAreSame(file: TFile | AnyEntry | string, mtime1: number, mtime2: number) {
if (mtime1 === mtime2) return true;
const key = typeof file == "string" ? file : file instanceof TFile ? file.path : file.path ?? file._id;
const pairs = sameChangePairs.get(key, []);
const pairs = sameChangePairs.get(key, []) || [];
if (pairs.some(e => e == mtime1 || e == mtime2)) {
sameChangePairs.set(key, [...new Set([...pairs, mtime1, mtime2])]);
} else {
@@ -446,7 +446,7 @@ export function markChangesAreSame(file: TFile | AnyEntry | string, mtime1: numb
}
export function isMarkedAsSameChanges(file: TFile | AnyEntry | string, mtimes: number[]) {
const key = typeof file == "string" ? file : file instanceof TFile ? file.path : file.path ?? file._id;
const pairs = sameChangePairs.get(key, []);
const pairs = sameChangePairs.get(key, []) || [];
if (mtimes.every(e => pairs.indexOf(e) !== -1)) {
return EVEN;
}

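The defensive change above (`sameChangePairs.get(key, []) || []`) guards against the pair cache returning nothing. The pair cache itself is what lets timestamp-only differences be treated as equal content. A reduced sketch of that idea using a plain `Map` instead of the plugin's `sameChangePairs` store; the reset branch at the end of `markChangesAreSameSketch` is an assumption, since the hunk does not show the original `else` body:

```ts
// Sketch: remember groups of mtimes that are known to describe identical content.
const knownSamePairs = new Map<string, number[]>();

function markChangesAreSameSketch(key: string, mtime1: number, mtime2: number) {
    if (mtime1 === mtime2) return;
    const pairs = knownSamePairs.get(key) ?? [];
    if (pairs.some((e) => e == mtime1 || e == mtime2)) {
        // Extend the existing group with the new timestamps.
        knownSamePairs.set(key, [...new Set([...pairs, mtime1, mtime2])]);
    } else {
        // Assumed behaviour: start a fresh group for this file.
        knownSamePairs.set(key, [mtime1, mtime2]);
    }
}

function isMarkedAsSameSketch(key: string, mtimes: number[]): boolean {
    const pairs = knownSamePairs.get(key) ?? [];
    return mtimes.every((e) => pairs.indexOf(e) !== -1);
}
```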
20 styles.css
@@ -77,14 +77,6 @@
border-top: 1px solid var(--background-modifier-border);
}

/* .sls-table-head{
width:50%;
}
.sls-table-tail{
width:50%;

} */

.sls-header-button {
margin-left: 2em;
}
@@ -94,7 +86,15 @@
}

:root {
--slsmessage: "";
--sls-log-text: "";
}

.sls-troubleshoot-preview {
max-width: max-content;
}

.sls-troubleshoot-preview img {
max-width: 100%;
}

.CodeMirror-wrap::before,
@@ -102,7 +102,7 @@
.markdown-source-view.cm-s-obsidian::before,
.canvas-wrapper::before,
.empty-state::before {
content: attr(data-log);
content: var(--sls-log-text, "");
text-align: right;
white-space: pre-wrap;
position: absolute;
98
updates.md
@@ -10,75 +10,43 @@ Note: we got a very performance improvement.
|
||||
Note at 0.22.2: **To rescue mobile devices, the maximum file size is now set to 50 (MB) by default**. Please configure the limit as you need. If you do not want to limit file sizes, please set it to zero manually.
|
||||
|
||||
#### Version history
|
||||
- 0.23.4
|
||||
- Fixed:
|
||||
- The result of conflict resolution is now reliably written into the storage.
|
||||
- Deleted files can be handled correctly again in the history dialogue and conflict dialogue.
|
||||
- Some wrong log messages were fixed.
|
||||
- Change handling has now become more stable.
|
||||
- Some event handling has been made safer.
|
||||
- 0.22.14:
|
||||
- New feature:
|
||||
- We can disable the status bar in the setting dialogue.
|
||||
- Improved:
|
||||
- Some files are now handled as the correct data type.
|
||||
- Customisation sync now uses the digest of each file for better performance.
|
||||
- The in-editor status now updates more efficiently.
|
||||
- Refactored:
|
||||
- Common functions have been prepared and the codebase has been organised.
|
||||
- Stricter type checking following TypeScript updates.
|
||||
- Removed an old iOS workaround for simplicity and performance.
|
||||
- 0.22.13:
|
||||
- Improved:
|
||||
- Dumping document information shows conflicts and revisions.
|
||||
- Timestamp-only differences are now reliably cached.
|
||||
- Timestamp difference detection can be rounded by two seconds.
|
||||
- Using HTTP for the remote database URI now raises an error (on mobile) or a notice (on desktop).
|
||||
- Refactored:
|
||||
- A bit of reorganisation to make writing tests easier.
|
||||
- 0.22.3
|
||||
- Fixed:
|
||||
- No longer detects storage changes which have been caused by Self-hosted LiveSync itself.
|
||||
- The settings sync file is now detected only if it has been configured.
|
||||
- Its log is shown only while verbose logging is enabled.
|
||||
- Customisation file enumeration has become less noisy.
|
||||
- Deletion of files is now reliably synchronised.
|
||||
- Fixed and improved:
|
||||
- In-editor-status is now shown in the following areas:
|
||||
- Note editing pane (Source mode and live-preview mode).
|
||||
- New tab pane.
|
||||
- Canvas pane.
|
||||
- 0.22.2
|
||||
- Fixed:
|
||||
- The results of resolving conflicts are now reliably synchronised.
|
||||
- Modified:
|
||||
- Some setting items have been given clearer names (`Sync Settings` -> `Targets`).
|
||||
- New feature:
|
||||
- We can now limit which files are synchronised by their size (`Sync Settings` -> `Targets` -> `Maximum file size`).
|
||||
- The limit is applied to the size of the newer revision.
|
||||
- At Obsidian 1.5.3 on mobile, we should set this to around 50MB to avoid restarting Obsidian.
|
||||
- Settings can now be stored in a specific markdown file so they can be synchronised or switched (`General Setting` -> `Share settings via markdown`).
|
||||
- [Screwdriver](https://github.com/vrtmrz/obsidian-screwdriver) is quite good, but mostly we only need this.
|
||||
- Customisations of an obsolete device can now be deleted at once.
|
||||
- To do this, we have to enable maintenance mode in the Customisation sync dialogue.
|
||||
- 0.22.1
|
||||
- New feature:
|
||||
- We can perform automatic conflict resolution for inactive files, and postpone only manual ones by `Postpone manual resolution of inactive files`.
|
||||
- Now we can see the image in the document history dialogue.
|
||||
- We can also see the differences between image versions in the document history dialogue.
|
||||
- Differences can also be highlighted.
|
||||
- Dependencies have been polished.
|
||||
- 0.22.12:
|
||||
- Changed:
|
||||
- The default settings have been changed.
|
||||
- Improved:
|
||||
- Hidden file sync has been stabilised.
|
||||
- Now automatically reloads the conflict-resolution dialogue when new conflicted revisions have arrived.
|
||||
- Default and preferred settings are applied on completion of the wizard.
|
||||
- Fixed:
|
||||
- Periodic processes no longer run after unloading the plug-in.
|
||||
- Modifications of binary files are now reliably stored in the storage.
|
||||
- 0.22.0
|
||||
- Refined:
|
||||
- Task scheduling logic has been rewritten.
|
||||
- Screen updates are also more efficient now.
|
||||
- Many bugs and much fragile behaviour have probably been fixed.
|
||||
- Status updates and logging have been thinned out before being displayed.
|
||||
- Now Initialisation `Fetch` will be performed smoothly and there will be fewer conflicts.
|
||||
- No longer gets stuck while handling transferred or initialised documents.
|
||||
- 0.22.11:
|
||||
- Fixed:
|
||||
- Remote chunk fetching now keeps request intervals.
|
||||
- `Verify and repair all files` is no longer broken.
|
||||
- New feature:
|
||||
- We can show only the icons in the editor.
|
||||
- Progress indicators have been more meaningful:
|
||||
- 📥 Unprocessed transferred items
|
||||
- 📄 Working database operation
|
||||
- 💾 Working write storage processes
|
||||
- ⏳ Working read storage processes
|
||||
- 🛫 Pending read storage processes
|
||||
- ⚙️ Working or pending storage processes of hidden files
|
||||
- 🧩 Waiting chunks
|
||||
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
|
||||
|
||||
|
||||
- Now `Verify and repair all files` is able to...
|
||||
- Restore a file if it exists only in the local database.
|
||||
- Show the history.
|
||||
- Improved:
|
||||
- Performance improved.
|
||||
- 0.22.10
|
||||
- Fixed:
|
||||
- Unchanged hidden files and customisations are no longer saved and transferred.
|
||||
- The file integrity shown in the vault history now indicates integrity correctly.
|
||||
- Improved:
|
||||
- In the report, the scheme of the remote database URI is now printed.
|
||||
... To continue on to `updates_old.md`.
|
||||
138
updates_old.md
@@ -1,3 +1,141 @@
|
||||
### 0.22.0
|
||||
A few years have passed since Self-hosted LiveSync was born, and our codebase had become very complicated. It was tolerable for the moment, but it would become a tremendous burden.
|
||||
Therefore, at v0.22.0, I completely refined the task scheduling logic for future maintainability.
|
||||
|
||||
Of course, I think this may cause some pain in some cases. However, I would love to ask for your cooperation and contribution.
|
||||
|
||||
Sorry for being absent for so long. And thank you for your patience!
|
||||
|
||||
Note: we achieved a significant performance improvement.
|
||||
Note at 0.22.2: **To rescue mobile devices, the maximum file size is now set to 50 (MB) by default**. Please configure the limit as you need. If you do not want to limit file sizes, please set it to zero manually.
|
||||
|
||||
#### Version history
|
||||
- 0.22.9
|
||||
- Fixed:
|
||||
- Fixed a bug where `fetch chunks on demand` could not actually fetch chunks on demand.
|
||||
- Improved:
|
||||
- `fetch chunks on demand` works more smoothly.
|
||||
- Initialisation `Fetch` is now more efficient.
|
||||
- Tidied:
|
||||
- Removed some meaningless code.
|
||||
- 0.22.8
|
||||
- Fixed:
|
||||
- Fetching and unlocking a locked remote database works well again.
|
||||
- No longer crashes on symbolic links inside hidden folders.
|
||||
- Improved:
|
||||
- Chunks are now created more efficiently.
|
||||
- Old notes are now split into larger chunks.
|
||||
- Better performance in saving notes.
|
||||
- Network activities are indicated as an icon.
|
||||
- Less memory used for binary processing.
|
||||
- Tidied:
|
||||
- Cleaned unused functions up.
|
||||
- Sorted out code that no longer made sense.
|
||||
- Changed:
|
||||
- `Fetch chunks on demand` no longer needs `Pacing replication`.
|
||||
- The setting `Do not pace synchronization` has been deleted.
|
||||
- 0.22.7
|
||||
- Fixed:
|
||||
- Deleted hidden files are no longer ignored.
|
||||
- The document history dialogue is now able to process the deleted revisions.
|
||||
- Deletion of a hidden file is now reliably performed even if the file is already conflicted.
|
||||
- 0.22.6
|
||||
- Fixed:
|
||||
- Fixed a problem with synchronisation taking a long time to start in some cases.
|
||||
- The first synchronisation after update might take a bit longer.
|
||||
- Now we can disable E2EE encryption.
|
||||
- Improved:
|
||||
- `Setup Wizard` is now more clear.
|
||||
- `Minimal Setup` is now more simple.
|
||||
- Self-hosted LiveSync can now be used even if there are vaults with the same name.
|
||||
- A database suffix will be added automatically.
|
||||
- Now Self-hosted LiveSync waits until set-up is complete.
|
||||
- Reload prompts are shown when a reload is recommended while changing settings.
|
||||
- New feature:
|
||||
- A guidance dialogue prompting for settings will be shown after the installation.
|
||||
- Changed
|
||||
- `Open setup URI` is now `Use the copied setup URI`
|
||||
- `Copy setup URI` is now `Copy current settings as a new setup URI`
|
||||
- `Setup Wizard` is now `Minimal Setup`
|
||||
- `Check database configuration` is now `Check and Fix database configuration`
|
||||
- 0.22.5
|
||||
- Fixed:
|
||||
- Some descriptions of settings have been refined.
|
||||
- New feature:
|
||||
- TroubleShooting is now shown in the setting dialogue.
|
||||
- 0.22.4
|
||||
- Fixed:
|
||||
- The result of conflict resolution is now reliably written into the storage.
|
||||
- Deleted files can be handled correctly again in the history dialogue and conflict dialogue.
|
||||
- Some wrong log messages were fixed.
|
||||
- Change handling has now become more stable.
|
||||
- Some event handling has been made safer.
|
||||
- Improved:
|
||||
- Dumping document information shows conflicts and revisions.
|
||||
- Timestamp-only differences are now reliably cached.
|
||||
- Timestamp difference detection can be rounded by two seconds.
|
||||
- Refactored:
|
||||
- A bit of reorganisation to make writing tests easier.
|
||||
- 0.22.3
|
||||
- Fixed:
|
||||
- No longer detects storage changes which have been caused by Self-hosted LiveSync itself.
|
||||
- The settings sync file is now detected only if it has been configured.
|
||||
- Its log is shown only while verbose logging is enabled.
|
||||
- Customisation file enumeration has become less noisy.
|
||||
- Deletion of files is now reliably synchronised.
|
||||
- Fixed and improved:
|
||||
- In-editor-status is now shown in the following areas:
|
||||
- Note editing pane (Source mode and live-preview mode).
|
||||
- New tab pane.
|
||||
- Canvas pane.
|
||||
- 0.22.2
|
||||
- Fixed:
|
||||
- The results of resolving conflicts are now reliably synchronised.
|
||||
- Modified:
|
||||
- Some setting items have been given clearer names (`Sync Settings` -> `Targets`).
|
||||
- New feature:
|
||||
- We can now limit which files are synchronised by their size (`Sync Settings` -> `Targets` -> `Maximum file size`).
|
||||
- The limit is applied to the size of the newer revision.
|
||||
- At Obsidian 1.5.3 on mobile, we should set this to around 50MB to avoid restarting Obsidian.
|
||||
- Settings can now be stored in a specific markdown file so they can be synchronised or switched (`General Setting` -> `Share settings via markdown`).
|
||||
- [Screwdriver](https://github.com/vrtmrz/obsidian-screwdriver) is quite good, but mostly we only need this.
|
||||
- Customisations of an obsolete device can now be deleted at once.
|
||||
- To do this, we have to enable maintenance mode in the Customisation sync dialogue.
|
||||
- 0.22.1
|
||||
- New feature:
|
||||
- We can perform automatic conflict resolution for inactive files, and postpone only manual ones by `Postpone manual resolution of inactive files`.
|
||||
- Now we can see the image in the document history dialogue.
|
||||
- We can also see the differences between image versions in the document history dialogue.
|
||||
- Differences can also be highlighted.
|
||||
- Improved:
|
||||
- Hidden file sync has been stabilised.
|
||||
- Now automatically reloads the conflict-resolution dialogue when new conflicted revisions have arrived.
|
||||
- Fixed:
|
||||
- Periodic processes no longer run after unloading the plug-in.
|
||||
- Modifications of binary files are now reliably stored in the storage.
|
||||
- 0.22.0
|
||||
- Refined:
|
||||
- Task scheduling logic has been rewritten.
|
||||
- Screen updates are also more efficient now.
|
||||
- Many bugs and much fragile behaviour have probably been fixed.
|
||||
- Status updates and logging have been thinned out before being displayed.
|
||||
- Fixed:
|
||||
- Remote chunk fetching now keeps request intervals.
|
||||
- New feature:
|
||||
- We can show only the icons in the editor.
|
||||
- Progress indicators have been more meaningful:
|
||||
- 📥 Unprocessed transferred items
|
||||
- 📄 Working database operation
|
||||
- 💾 Working write storage processes
|
||||
- ⏳ Working read storage processes
|
||||
- 🛫 Pending read storage processes
|
||||
- ⚙️ Working or pending storage processes of hidden files
|
||||
- 🧩 Waiting chunks
|
||||
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
|
||||
|
||||
|
||||
... To continue on to `updates_old.md`.
|
||||
|
||||
### 0.21.0
|
||||
The E2EE encryption V2 format has been reverted. That was probably the cause of the glitch.
|
||||
Instead, to maintain efficiency, files are treated with Blob until just before saving. Along with this, the old-fashioned encryption format has also been discontinued.
|
||||
|
||||
1
utils/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
|
||||
fly.toml
|
||||
29
utils/couchdb/couchdb-init.sh
Executable file
@@ -0,0 +1,29 @@
|
||||
#!/bin/bash
|
||||
if [[ -z "$hostname" ]]; then
|
||||
echo "ERROR: Hostname missing"
|
||||
exit 1
|
||||
fi
|
||||
if [[ -z "$username" ]]; then
|
||||
echo "ERROR: Username missing"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
if [[ -z "$password" ]]; then
|
||||
echo "ERROR: Password missing"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "-- Configuring CouchDB by REST APIs... -->"
|
||||
|
||||
until (curl -X POST "${hostname}/_cluster_setup" -H "Content-Type: application/json" -d "{\"action\":\"enable_single_node\",\"username\":\"${username}\",\"password\":\"${password}\",\"bind_address\":\"0.0.0.0\",\"port\":5984,\"singlenode\":true}" --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd/require_valid_user" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/httpd/WWW-Authenticate" -H "Content-Type: application/json" -d '"Basic realm=\"couchdb\""' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/httpd/enable_cors" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd/enable_cors" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd/max_http_request_size" -H "Content-Type: application/json" -d '"4294967296"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/couchdb/max_document_size" -H "Content-Type: application/json" -d '"50000000"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/cors/credentials" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
|
||||
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/cors/origins" -H "Content-Type: application/json" -d '"app://obsidian.md,capacitor://localhost,http://localhost"' --user "${username}:${password}"); do sleep 5; done
|
||||
|
||||
echo "<-- Configuring CouchDB by REST APIs Done!"
|
||||
4
utils/flyio/delete-server.sh
Executable file
@@ -0,0 +1,4 @@
|
||||
#!/bin/bash
|
||||
set -e
|
||||
fly scale count 0 -y
|
||||
fly apps destroy $(fly status -j | jq -r .Name) -y
|
||||
43
utils/flyio/deploy-server.sh
Executable file
@@ -0,0 +1,43 @@
|
||||
#!/bin/bash
|
||||
## Script for deploy and automatic setup CouchDB onto fly.io.
|
||||
## We need Deno for generating the Setup-URI.
|
||||
|
||||
source setenv.sh $@
|
||||
|
||||
export hostname="https://$appname.fly.dev"
|
||||
|
||||
echo "-- YOUR CONFIGURATION --"
|
||||
echo "URL : $hostname"
|
||||
echo "username: $username"
|
||||
echo "password: $password"
|
||||
echo "region : $region"
|
||||
echo ""
|
||||
echo "-- START DEPLOYING --> "
|
||||
|
||||
set -e
|
||||
fly launch --name=$appname --env="COUCHDB_USER=$username" --copy-config=true --detach --no-deploy --region ${region} --yes
|
||||
fly secrets set COUCHDB_PASSWORD=$password
|
||||
fly deploy
|
||||
|
||||
set +e
|
||||
../couchdb/couchdb-init.sh
|
||||
# flyctl deploy
|
||||
echo "OK!"
|
||||
|
||||
if command -v deno >/dev/null 2>&1; then
|
||||
echo "Setup finished! Also, we can set up Self-hosted LiveSync instantly, by the following setup uri."
|
||||
echo "Passphrase of setup-uri is \`welcome\`".
|
||||
echo "--- configured ---"
|
||||
echo "database : ${database}"
|
||||
echo "E2EE passphrase: ${passphrase}"
|
||||
echo "--- setup uri ---"
|
||||
deno run -A generate_setupuri.ts
|
||||
else
|
||||
echo "Setup finished! Here is the configured values (reprise)!"
|
||||
echo "-- YOUR CONFIGURATION --"
|
||||
echo "URL : $hostname"
|
||||
echo "username: $username"
|
||||
echo "password: $password"
|
||||
echo "-- YOUR CONFIGURATION --"
|
||||
echo "If we had Deno, we would got the setup uri directly!"
|
||||
fi
|
||||
40
utils/flyio/fly.template.toml
Normal file
@@ -0,0 +1,40 @@
|
||||
## CouchDB for fly.io image
|
||||
|
||||
app = ''
|
||||
primary_region = 'nrt'
|
||||
swap_size_mb = 512
|
||||
|
||||
[build]
|
||||
image = "couchdb:latest"
|
||||
|
||||
[mounts]
|
||||
source = "couchdata"
|
||||
destination = "/opt/couchdb/data"
|
||||
initial_size = "1GB"
|
||||
auto_extend_size_threshold = 90
|
||||
auto_extend_size_increment = "1GB"
|
||||
auto_extend_size_limit = "2GB"
|
||||
|
||||
[env]
|
||||
COUCHDB_USER = ""
|
||||
ERL_FLAGS = "-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini"
|
||||
|
||||
[http_service]
|
||||
internal_port = 5984
|
||||
force_https = true
|
||||
auto_stop_machines = true
|
||||
auto_start_machines = true
|
||||
min_machines_running = 0
|
||||
processes = ['app']
|
||||
|
||||
[[vm]]
|
||||
cpu_kind = 'shared'
|
||||
cpus = 1
|
||||
memory_mb = 256
|
||||
|
||||
[[files]]
|
||||
guest_path = "/docker-entrypoint2.sh"
|
||||
raw_value = "#!/bin/bash\ntouch /opt/couchdb/data/persistence.ini\nchmod +w /opt/couchdb/data/persistence.ini\n/docker-entrypoint.sh $@"
|
||||
|
||||
[experimental]
|
||||
entrypoint = ["tini", "--", "/docker-entrypoint2.sh"]
|
||||
180
utils/flyio/generate_setupuri.ts
Normal file
@@ -0,0 +1,180 @@
|
||||
import { webcrypto } from "node:crypto";
|
||||
|
||||
const KEY_RECYCLE_COUNT = 100;
|
||||
type KeyBuffer = {
|
||||
key: CryptoKey;
|
||||
salt: Uint8Array;
|
||||
count: number;
|
||||
};
|
||||
|
||||
let semiStaticFieldBuffer: Uint8Array;
|
||||
const nonceBuffer: Uint32Array = new Uint32Array(1);
|
||||
const writeString = (string: string) => {
|
||||
// Prepare enough buffer.
|
||||
const buffer = new Uint8Array(string.length * 4);
|
||||
const length = string.length;
|
||||
let index = 0;
|
||||
let chr = 0;
|
||||
let idx = 0;
|
||||
while (idx < length) {
|
||||
chr = string.charCodeAt(idx++);
|
||||
if (chr < 128) {
|
||||
buffer[index++] = chr;
|
||||
} else if (chr < 0x800) {
|
||||
// 2 bytes
|
||||
buffer[index++] = 0xC0 | (chr >>> 6);
|
||||
buffer[index++] = 0x80 | (chr & 0x3F);
|
||||
} else if (chr < 0xD800 || chr > 0xDFFF) {
|
||||
// 3 bytes
|
||||
buffer[index++] = 0xE0 | (chr >>> 12);
|
||||
buffer[index++] = 0x80 | ((chr >>> 6) & 0x3F);
|
||||
buffer[index++] = 0x80 | (chr & 0x3F);
|
||||
} else {
|
||||
// 4 bytes - surrogate pair
|
||||
chr = (((chr - 0xD800) << 10) | (string.charCodeAt(idx++) - 0xDC00)) + 0x10000;
|
||||
buffer[index++] = 0xF0 | (chr >>> 18);
|
||||
buffer[index++] = 0x80 | ((chr >>> 12) & 0x3F);
|
||||
buffer[index++] = 0x80 | ((chr >>> 6) & 0x3F);
|
||||
buffer[index++] = 0x80 | (chr & 0x3F);
|
||||
}
|
||||
}
|
||||
return buffer.slice(0, index);
|
||||
};
|
||||
const KeyBuffs = new Map<string, KeyBuffer>();
|
||||
async function getKeyForEncrypt(passphrase: string, autoCalculateIterations: boolean): Promise<[CryptoKey, Uint8Array]> {
|
||||
// For performance, the plugin reuses the key KEY_RECYCLE_COUNT times.
|
||||
const buffKey = `${passphrase}-${autoCalculateIterations}`;
|
||||
const f = KeyBuffs.get(buffKey);
|
||||
if (f) {
|
||||
f.count--;
|
||||
if (f.count > 0) {
|
||||
return [f.key, f.salt];
|
||||
}
|
||||
f.count--;
|
||||
}
|
||||
const passphraseLen = 15 - passphrase.length;
|
||||
const iteration = autoCalculateIterations ? ((passphraseLen > 0 ? passphraseLen : 0) * 1000) + 121 - passphraseLen : 100000;
|
||||
const passphraseBin = new TextEncoder().encode(passphrase);
|
||||
const digest = await webcrypto.subtle.digest({ name: "SHA-256" }, passphraseBin);
|
||||
const keyMaterial = await webcrypto.subtle.importKey("raw", digest, { name: "PBKDF2" }, false, ["deriveKey"]);
|
||||
const salt = webcrypto.getRandomValues(new Uint8Array(16));
|
||||
const key = await webcrypto.subtle.deriveKey(
|
||||
{
|
||||
name: "PBKDF2",
|
||||
salt,
|
||||
iterations: iteration,
|
||||
hash: "SHA-256",
|
||||
},
|
||||
keyMaterial,
|
||||
{ name: "AES-GCM", length: 256 },
|
||||
false,
|
||||
["encrypt"]
|
||||
);
|
||||
KeyBuffs.set(buffKey, {
|
||||
key,
|
||||
salt,
|
||||
count: KEY_RECYCLE_COUNT,
|
||||
});
|
||||
return [key, salt];
|
||||
}
|
||||
|
||||
function getSemiStaticField(reset?: boolean) {
|
||||
// return fixed field of iv.
|
||||
if (semiStaticFieldBuffer != null && !reset) {
|
||||
return semiStaticFieldBuffer;
|
||||
}
|
||||
semiStaticFieldBuffer = webcrypto.getRandomValues(new Uint8Array(12));
|
||||
return semiStaticFieldBuffer;
|
||||
}
|
||||
|
||||
function getNonce() {
|
||||
// This is nonce, so do not send same thing.
|
||||
nonceBuffer[0]++;
|
||||
if (nonceBuffer[0] > 10000) {
|
||||
// reset semi-static field.
|
||||
getSemiStaticField(true);
|
||||
}
|
||||
return nonceBuffer;
|
||||
}
|
||||
function arrayBufferToBase64internalBrowser(buffer: DataView | Uint8Array): Promise<string> {
|
||||
return new Promise((res, rej) => {
|
||||
const blob = new Blob([buffer], { type: "application/octet-binary" });
|
||||
const reader = new FileReader();
|
||||
reader.onload = function (evt) {
|
||||
const dataURI = evt.target?.result?.toString() || "";
|
||||
if (buffer.byteLength != 0 && (dataURI == "" || dataURI == "data:")) return rej(new TypeError("Could not parse the encoded string"));
|
||||
const result = dataURI.substring(dataURI.indexOf(",") + 1);
|
||||
res(result);
|
||||
};
|
||||
reader.readAsDataURL(blob);
|
||||
});
|
||||
}
|
||||
|
||||
// Map for converting hexString
|
||||
const revMap: { [key: string]: number } = {};
|
||||
const numMap: { [key: number]: string } = {};
|
||||
for (let i = 0; i < 256; i++) {
|
||||
revMap[(`00${i.toString(16)}`.slice(-2))] = i;
|
||||
numMap[i] = (`00${i.toString(16)}`.slice(-2));
|
||||
}
|
||||
|
||||
|
||||
function uint8ArrayToHexString(src: Uint8Array): string {
|
||||
return [...src].map(e => numMap[e]).join("");
|
||||
}
|
||||
|
||||
const QUANTUM = 32768;
|
||||
async function arrayBufferToBase64Single(buffer: ArrayBuffer): Promise<string> {
|
||||
const buf = buffer instanceof Uint8Array ? buffer : new Uint8Array(buffer);
|
||||
if (buf.byteLength < QUANTUM) return btoa(String.fromCharCode.apply(null, [...buf]));
|
||||
return await arrayBufferToBase64internalBrowser(buf);
|
||||
}
|
||||
|
||||
|
||||
export async function encrypt(input: string, passphrase: string, autoCalculateIterations: boolean) {
|
||||
const [key, salt] = await getKeyForEncrypt(passphrase, autoCalculateIterations);
|
||||
// Create initial vector with semi-fixed part and incremental part
|
||||
// I think it's not good against related-key attacks.
|
||||
const fixedPart = getSemiStaticField();
|
||||
const invocationPart = getNonce();
|
||||
const iv = new Uint8Array([...fixedPart, ...new Uint8Array(invocationPart.buffer)]);
|
||||
const plainStringified = JSON.stringify(input);
|
||||
|
||||
// const plainStringBuffer: Uint8Array = tex.encode(plainStringified)
|
||||
const plainStringBuffer: Uint8Array = writeString(plainStringified);
|
||||
const encryptedDataArrayBuffer = await webcrypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plainStringBuffer);
|
||||
const encryptedData2 = (await arrayBufferToBase64Single(encryptedDataArrayBuffer));
|
||||
//return data with iv and salt.
|
||||
const ret = `["${encryptedData2}","${uint8ArrayToHexString(iv)}","${uint8ArrayToHexString(salt)}"]`;
|
||||
return ret;
|
||||
}
|
||||
|
||||
const URIBASE = "obsidian://setuplivesync?settings=";
|
||||
async function main() {
|
||||
const conf = {
|
||||
"couchDB_URI": `${Deno.env.get("hostname")}`,
|
||||
"couchDB_USER": `${Deno.env.get("username")}`,
|
||||
"couchDB_PASSWORD": `${Deno.env.get("password")}`,
|
||||
"couchDB_DBNAME": `${Deno.env.get("database")}`,
|
||||
"syncOnStart": true,
|
||||
"gcDelay": 0,
|
||||
"periodicReplication": true,
|
||||
"syncOnFileOpen": true,
|
||||
"encrypt": true,
|
||||
"passphrase": `${Deno.env.get("passphrase")}`,
|
||||
"usePathObfuscation": true,
|
||||
"batchSave": true,
|
||||
"batch_size": 50,
|
||||
"batches_limit": 50,
|
||||
"useHistory": true,
|
||||
"disableRequestURI": true,
|
||||
"customChunkSize": 50,
|
||||
"syncAfterMerge": false,
|
||||
"concurrencyOfReadChunksOnline": 100,
|
||||
"minimumIntervalOfReadChunksOnline": 100,
|
||||
}
|
||||
const encryptedConf = encodeURIComponent(await encrypt(JSON.stringify(conf), "welcome", false));
|
||||
const theURI = `${URIBASE}${encryptedConf}`;
|
||||
console.log(theURI);
|
||||
}
|
||||
await main();
|
||||
30
utils/flyio/setenv.sh
Executable file
@@ -0,0 +1,30 @@
|
||||
random_num() {
|
||||
echo $RANDOM
|
||||
}
|
||||
random_noun() {
|
||||
nouns=("waterfall" "river" "breeze" "moon" "rain" "wind" "sea" "morning" "snow" "lake" "sunset" "pine" "shadow" "leaf" "dawn" "glitter" "forest" "hill" "cloud" "meadow" "sun" "glade" "bird" "brook" "butterfly" "bush" "dew" "dust" "field" "fire" "flower" "firefly" "feather" "grass" "haze" "mountain" "night" "pond" "darkness" "snowflake" "silence" "sound" "sky" "shape" "surf" "thunder" "violet" "water" "wildflower" "wave" "water" "resonance" "sun" "log" "dream" "cherry" "tree" "fog" "frost" "voice" "paper" "frog" "smoke" "star")
|
||||
echo ${nouns[$(($RANDOM % ${#nouns[*]}))]}
|
||||
}
|
||||
|
||||
random_adjective() {
|
||||
adjectives=("autumn" "hidden" "bitter" "misty" "silent" "empty" "dry" "dark" "summer" "icy" "delicate" "quiet" "white" "cool" "spring" "winter" "patient" "twilight" "dawn" "crimson" "wispy" "weathered" "blue" "billowing" "broken" "cold" "damp" "falling" "frosty" "green" "long" "late" "lingering" "bold" "little" "morning" "muddy" "old" "red" "rough" "still" "small" "sparkling" "thrumming" "shy" "wandering" "withered" "wild" "black" "young" "holy" "solitary" "fragrant" "aged" "snowy" "proud" "floral" "restless" "divine" "polished" "ancient" "purple" "lively" "nameless")
|
||||
echo ${adjectives[$(($RANDOM % ${#adjectives[*]}))]}
|
||||
}
|
||||
|
||||
cp ./fly.template.toml ./fly.toml
|
||||
|
||||
if [ "$1" = "renew" ]; then
|
||||
unset appname
|
||||
unset username
|
||||
unset password
|
||||
unset database
|
||||
unset passphrase
|
||||
unset region
|
||||
fi
|
||||
|
||||
[ -z $appname ] && export appname=$(random_adjective)-$(random_noun)-$(random_num)
|
||||
[ -z $username ] && export username=$(random_adjective)-$(random_noun)-$(random_num)
|
||||
[ -z $password ] && export password=$(random_adjective)-$(random_noun)-$(random_num)
|
||||
[ -z $database ] && export database="obsidiannotes"
|
||||
[ -z $passphrase ] && export passphrase=$(random_adjective)-$(random_noun)-$(random_num)
|
||||
[ -z $region ] && export region="nrt"
|
||||
164
utils/readme.md
Normal file
@@ -0,0 +1,164 @@
|
||||
<!-- For translation: 20240206r0 -->
|
||||
# Utilities
|
||||
Here are some useful utilities.
|
||||
|
||||
## couchdb
|
||||
|
||||
### couchdb-init.sh
|
||||
This script configures CouchDB with the necessary settings via its REST APIs.
|
||||
|
||||
#### Materials
|
||||
- Mandatory: curl
|
||||
|
||||
#### Usage
|
||||
|
||||
```sh
|
||||
export hostname=http://localhost:5984/
|
||||
export username=couchdb-admin-username
|
||||
export password=couchdb-admin-password
|
||||
./couchdb-init.sh
|
||||
```
|
||||
|
||||
The curl results will be shown; however, all of them can be ignored as long as the script runs to completion.
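
If you want to double-check that a setting has been applied, the same configuration endpoints the script writes to can also be read back. A minimal sketch (the key below is one of those set by `couchdb-init.sh`; adjust as needed):

```sh
# Read back one of the values written by couchdb-init.sh,
# using the same environment variables as above.
curl --user "${username}:${password}" \
  "${hostname}/_node/nonode@nohost/_config/chttpd/require_valid_user"
# Expected output: "true"
```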
|
||||
|
||||
## fly.io
|
||||
|
||||
### deploy-server.sh
|
||||
|
||||
A fully automated CouchDB deployment script. We can deploy CouchDB onto fly.io; the only thing we need is an account there.
|
||||
|
||||
All omitted configuration values will be determined at random (and this is the preferred way). The region is configured to `nrt`.
|
||||
If Japan is not close to you, please choose a region closer to you. However, the deployed database will still work even if you leave it as it is.
|
||||
|
||||
#### Materials
|
||||
- Mandatory: curl, flyctl
|
||||
- Recommended: deno
|
||||
|
||||
#### Usage
|
||||
```sh
|
||||
#export appname=
|
||||
#export username=
|
||||
#export password=
|
||||
#export database=
|
||||
#export passphrase=
|
||||
export region=nrt #pick your nearest location
|
||||
./deploy-server.sh
|
||||
```
|
||||
|
||||
The result of this command is as follows.
|
||||
|
||||
```
|
||||
-- YOUR CONFIGURATION --
|
||||
URL : https://young-darkness-25342.fly.dev
|
||||
username: billowing-cherry-22580
|
||||
password: misty-dew-13571
|
||||
region : nrt
|
||||
|
||||
-- START DEPLOYING -->
|
||||
An existing fly.toml file was found
|
||||
Using build strategies '[the "couchdb:latest" docker image]'. Remove [build] from fly.toml to force a rescan
|
||||
Creating app in /home/vorotamoroz/dev/obsidian-livesync/utils/flyio
|
||||
We're about to launch your app on Fly.io. Here's what you're getting:
|
||||
|
||||
Organization: vorotamoroz (fly launch defaults to the personal org)
|
||||
Name: young-darkness-25342 (specified on the command line)
|
||||
Region: Tokyo, Japan (specified on the command line)
|
||||
App Machines: shared-cpu-1x, 256MB RAM (specified on the command line)
|
||||
Postgres: <none> (not requested)
|
||||
Redis: <none> (not requested)
|
||||
|
||||
Created app 'young-darkness-25342' in organization 'personal'
|
||||
Admin URL: https://fly.io/apps/young-darkness-25342
|
||||
Hostname: young-darkness-25342.fly.dev
|
||||
Wrote config file fly.toml
|
||||
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
|
||||
Platform: machines
|
||||
✓ Configuration is valid
|
||||
Your app is ready! Deploy with `flyctl deploy`
|
||||
Secrets are staged for the first deployment
|
||||
==> Verifying app config
|
||||
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
|
||||
Platform: machines
|
||||
✓ Configuration is valid
|
||||
--> Verified app config
|
||||
==> Building image
|
||||
Searching for image 'couchdb:latest' remotely...
|
||||
image found: img_ox20prk63084j1zq
|
||||
|
||||
Watch your deployment at https://fly.io/apps/young-darkness-25342/monitoring
|
||||
|
||||
Provisioning ips for young-darkness-25342
|
||||
Dedicated ipv6: 2a09:8280:1::37:fde9
|
||||
Shared ipv4: 66.241.124.163
|
||||
Add a dedicated ipv4 with: fly ips allocate-v4
|
||||
|
||||
Creating a 1 GB volume named 'couchdata' for process group 'app'. Use 'fly vol extend' to increase its size
|
||||
This deployment will:
|
||||
* create 1 "app" machine
|
||||
|
||||
No machines in group app, launching a new machine
|
||||
|
||||
WARNING The app is not listening on the expected address and will not be reachable by fly-proxy.
|
||||
You can fix this by configuring your app to listen on the following addresses:
|
||||
- 0.0.0.0:5984
|
||||
Found these processes inside the machine with open listening sockets:
|
||||
PROCESS | ADDRESSES
|
||||
-----------------*---------------------------------------
|
||||
/.fly/hallpass | [fdaa:0:73b9:a7b:22e:3851:7f28:2]:22
|
||||
|
||||
Finished launching new machines
|
||||
|
||||
NOTE: The machines for [app] have services with 'auto_stop_machines = true' that will be stopped when idling
|
||||
|
||||
-------
|
||||
Checking DNS configuration for young-darkness-25342.fly.dev
|
||||
|
||||
Visit your newly deployed app at https://young-darkness-25342.fly.dev/
|
||||
-- Configuring CouchDB by REST APIs... -->
|
||||
curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to young-darkness-25342.fly.dev:443
|
||||
{"ok":true}
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
<-- Configuring CouchDB by REST APIs Done!
|
||||
OK!
|
||||
Setup finished! Also, we can set up Self-hosted LiveSync instantly, by the following setup uri.
|
||||
Passphrase of setup-uri is `welcome`.
|
||||
--- configured ---
|
||||
database : obsidiannotes
|
||||
E2EE passphrase: dark-wildflower-26467
|
||||
--- setup uri ---
|
||||
obsidian://setuplivesync?settings=%5B%22gZkBwjFbLqxbdSIbJymU%2FmTPBPAKUiHVGDRKYiNnKhW0auQeBgJOfvnxexZtMCn8sNiIUTAlxNaMGF2t%2BCEhpJoeCP%2FO%2BrwfN5LaNDQyky1Uf7E%2B64A5UWyjOYvZDOgq4iCKSdBAXp9oO%2BwKh4MQjUZ78vIVvJp8Mo6NWHfm5fkiWoAoddki1xBMvi%2BmmN%2FhZatQGcslVb9oyYWpZocduTl0a5Dv%2FQviGwlYQ%2F4NY0dVDIoOdvaYS%2FX4GhNAnLzyJKMXhPEJHo9FvR%2FEOBuwyfMdftV1SQUZ8YDCuiR3T7fh7Kn1c6OFgaFMpFm%2BWgIJ%2FZpmAyhZFpEcjpd7ty%2BN9kfd9gQsZM4%2BYyU9OwDd2DahVMBWkqoV12QIJ8OlJScHHdcUfMW5ex%2F4UZTWKNEHJsigITXBrtq11qGk3rBfHys8O0vY6sz%2FaYNM3iAOsR1aoZGyvwZm4O6VwtzK8edg0T15TL4O%2B7UajQgtCGxgKNYxb8EMOGeskv7NifYhjCWcveeTYOJzBhnIDyRbYaWbkAXQgHPBxzJRkkG%2FpBPfBBoJarj7wgjMvhLJ9xtL4FbP6sBNlr8jtAUCoq4L7LJcRNF4hlgvjJpL2BpFZMzkRNtUBcsRYR5J%2BM1X2buWi2BHncbSiRRDKEwNOQkc%2FmhMJjbAn%2F8eNKRuIICOLD5OvxD7FZNCJ0R%2BWzgrzcNV%22%2C%22ec7edc900516b4fcedb4c7cc01000000%22%2C%22fceb5fe54f6619ee266ed9a887634e07%22%5D
|
||||
```
|
||||
|
||||
All we have to do is copy the setup-URI (`obsidian://...`) and open it from Self-hosted LiveSync on Obsidian.
|
||||
|
||||
If Deno is not installed, the configuration values will be printed again instead of the setup-URI. In this case, we have to configure the plug-in manually.
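
If you install Deno afterwards, the setup-URI can be regenerated without redeploying anything. A minimal sketch, assuming the configuration values printed above are re-exported as environment variables (the values below are the hypothetical ones from the example output):

```sh
# Run inside utils/flyio. generate_setupuri.ts reads these variables via Deno.env.
export hostname="https://young-darkness-25342.fly.dev"
export username="billowing-cherry-22580"
export password="misty-dew-13571"
export database="obsidiannotes"
export passphrase="dark-wildflower-26467"
# Prints an obsidian://setuplivesync?settings=... URI, encrypted with the passphrase `welcome`.
deno run -A generate_setupuri.ts
```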
|
||||
|
||||
### delete-server.sh
|
||||
|
||||
The counterpart of `deploy-server.sh`. We can delete the deployed server with this script, using the generated fly.toml (see the note after the example output below).
|
||||
|
||||
#### Materials
|
||||
|
||||
- Mandatory: flyctl, jq
|
||||
- Recommended: none
|
||||
|
||||
#### Usage
|
||||
```sh
|
||||
./delete-server.sh
|
||||
```
|
||||
|
||||
```
|
||||
App 'young-darkness-25342 is going to be scaled according to this plan:
|
||||
-1 machines for group 'app' on region 'nrt' of size 'shared-cpu-1x'
|
||||
Executing scale plan
|
||||
Destroyed e28667eec57158 group:app region:nrt size:shared-cpu-1x
|
||||
Destroyed app young-darkness-25342
|
||||
```
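
Before running it, it may be worth confirming which app will be destroyed; the script derives the app name from the local fly.toml in exactly this way (a small sketch using the flyctl and jq commands listed above):

```sh
# Show the app name that delete-server.sh will pass to `fly apps destroy`.
fly status -j | jq -r .Name
```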
|
||||