Compare commits

194 commits (the compare view's author and date columns were not captured; SHA1s only):

172b08dbb3 d518a3fc1b c6ed867498 4f4923e977 a5ebf29b3d ee465184c8 d7d4f1e6f2 cbf5023593
3925052f92 1934418258 2ae018b2bd 8474497985 b5714cc83b 133f5a7109 daa3feebf1 7b5f7d0fbf
29532193cb 5b4309c09d 16ef582453 3e22f70c7a 0a8dbe097e 2c0fcf74d0 a1ab1efd5d c8fcf2d0d5
c384e2f7fb 99c1c7dc1a 84adec4b1a f0b202bd91 d54b7e2d93 6952ef37f5 9630bcbae8 c3f925ab9a
034dc0538f b6136df836 24aacdc2a1 f91109b1ad e76e7ae8ea f7fbe85d65 0313443b29 755c30f468
b00b0cc5e5 d7985a6b41 486e816902 ef9b19c24b 4ed9494176 fcd56d59d5 1cabfcfd19 37a18dbfef
e7edf88713 90ff75ab35 bff1d661f5 6b59c14774 8249274eac 3c6dae7814 60cf8fe640 3d89b3863f
ee9364310d 86b9695bc2 e05f8771b9 65619c2478 1552fa9d9e 767f12b52f 4071ba120e 2c0e3ba01c
90adf06830 cf8e7ff6ca 95c3ff5043 7ea3515801 f866981a8a 8f36d6f893 6dd86e9392 d22716bef0
5d9baec5e4 27d71ca2fb c024ed13d3 b9527ccab0 fa3aa2702c 93e7cbb133 716ae32e02 d6d8cbcf5a
efd348b266 8969b1800a 2c8e026e29 a6c27eab3d 9b5c57d540 c251c596e8 61188cfaef 97d944fd75
d3dc1e7328 45304af369 7f422d58f2 c2491fdfad 06a6e391e8 f99475f6b7 109fc00b9d c071d822e1
d2de5b4710 cf5ecd8922 b337a05b5a 9ea6bee9d1 9747c26d50 bb4b764586 279b4b41e5 b644fb791d
5802ed31be ac9428e96b 280d9e1dd9 f7209e566c 4a9ab2d1de cb74b5ee93 60eecd7001 4bd7b54bcd
8923c73d1b 11e64b13e2 983d9248ed 7240e84328 0d55ae2532 dbd284f5dd c000a02f4a 79754f48d6
dd7a40630b 14406f8213 3bbd9c048d d91c4f50b4 395b7fbc42 3773e57429 4835fce62a ff814be4a0
b271b63efa 23419e476a b9bd1f17b8 bcce277c36 5acbbe479e c9f9d511e0 b8cb94c498 52c736f6b9
ebd1cb7777 10decb7909 e0aab8d69d 618600c753 d1aba87e37 db889f635e dd80e634f5 bec6fc1a74
5c96c7f99b 7b9724f713 4cd12c85ed 90651540f9 9e504d5002 faaa94423c a7c179fc86 ed1a670b9b
6c3c265bd6 9d68025f2a e70972c8f9 7607be7729 db9d428ab4 0a2caea3c7 b1d1ba0e6b 5e844372cb
99c6911e96 dc880d7d4e c157fef76c 2b2011dc49 ae451e005e 8a75d41cbb 5252cc0372 5022155317
d36f925c65 3ae33e0500 13e442a0c7 6288716966 47d2cf9733 ae6a9ecee4 2289bea8d9 cda90259c5
432a211f80 eaf8c4998e 55601f7910 13e70475d9 2572177879 e82a2560e4 09146591eb 69c6e57df3
5e181a8ec4 4354cc3054 0664427c63 49c4736d69 f0ce8f0e05 0a70afc5a3 431239a736 1ceb671683
ea40e5918c 64681729ff
```diff
@@ -1,4 +1,5 @@
 node_modules
 build
 .eslintrc.js.bak
 src/lib/src/patches/pouchdb-utils
+esbuild.config.mjs
```
78  .github/ISSUE_TEMPLATE/issue-report.md  (vendored, new file)

@@ -0,0 +1,78 @@

---
name: Issue report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''

---

Thank you for taking the time to report this issue!
To streamline the process, please provide the information below in advance.

All instructions, examples, and empty entries can be deleted.
For reference, a [filled example](https://docs.vrtmrz.net/LiveSync/hintandtrivia/Issue+example) is also available.

## Abstract

The synchronisation hung up immediately after connecting.

## Expected behaviour

- Synchronisation ends with the message `Replication completed`
- Everything synchronised

## Actually happened

- Synchronisation has been cancelled with the message `TypeError ... ` (captured in the attached log, around LL.10-LL.12)
- No files synchronised

## Reproducing procedure

1. Configure LiveSync as in the attached material.
2. Click the replication button on the ribbon.
3. Synchronising has begun.
4. About two or three seconds later, we got the error `TypeError ... `.
5. Replication has been stopped. No files synchronised.

Note: If you cannot identify a reproducing procedure, please describe the frequency and any warning signs instead.

## Report materials

If some information is not available, do not hesitate to report without it. You can of course omit anything you consider unnecessary; if it turns out to be needed, I will ask.

### Report from the LiveSync

For more information, please refer to [Making the report](https://docs.vrtmrz.net/LiveSync/hintandtrivia/Making+the+report).
<details>
<summary>Report from hatch</summary>

```
<!-- paste here -->
```
</details>

### Obsidian debug info

<details>
<summary>Debug info</summary>

```
<!-- paste here -->
```
</details>

### Plug-in log

The log can be seen by tapping the Document box icon. If you noticed something suspicious, please let me know.
Note: **Please enable `Verbose Log`**. For details, please refer to [Logging](https://docs.vrtmrz.net/LiveSync/hintandtrivia/Logging).

<details>
<summary>Plug-in log</summary>

```
<!-- paste here -->
```
</details>

### Network log

Network logs displayed in DevTools may help with connection-related issues. To capture them, please refer to [DevTools](https://docs.vrtmrz.net/LiveSync/hintandtrivia/DevTools).

### Screenshots

If applicable, please add screenshots to help explain your problem.

### Other information, insights, and intuition

Please provide any additional context or information about the problem.
2  .github/workflows/release.yml  (vendored)

```diff
@@ -17,7 +17,7 @@ jobs:
       - name: Use Node.js
         uses: actions/setup-node@v1
         with:
-          node-version: '14.x' # You might need to adjust this value to your own version
+          node-version: '18.x' # You might need to adjust this value to your own version
       # Get the version number and put it in a variable
       - name: Get Version
         id: version
```
1  .gitignore  (vendored)

```diff
@@ -8,6 +8,7 @@ package-lock.json

 # build
 main.js
+main_org.js
 *.js.map

 # obsidian
```
119  README.md

@@ -1,99 +1,84 @@

<!-- For translation: 20240227r0 -->
# Self-hosted LiveSync

[Japanese docs](./README_ja.md) - [Chinese docs](./README_cn.md).

Self-hosted LiveSync is a community-implemented synchronization plugin, available on every Obsidian-compatible platform, which uses CouchDB or Object Storage (e.g., MinIO, S3, R2) as the server.



Note: This plugin cannot synchronise with the official "Obsidian Sync".

## Features

- Synchronize vaults very efficiently, with less traffic.
- Good at handling conflicting modifications.
  - Automatic merging for simple conflicts.
- Uses an OSS solution for the server.
  - Compatible solutions can be used.
- Supports End-to-End encryption.
- Synchronisation of settings, snippets, themes, and plug-ins, via [Customization sync (Beta)](#customization-sync) or [Hidden File Sync](#hiddenfilesync).
- WebClip from [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf).

This plug-in might be useful for researchers, engineers, and developers who need to keep their notes fully self-hosted for security reasons. Or just anyone who would like the peace of mind of knowing that their notes are fully private.

## IMPORTANT NOTICE

>[!IMPORTANT]
> - Before installing or upgrading this plug-in, please back your vault up.
> - Do not enable this plugin together with another synchronization solution at the same time (including iCloud and Obsidian Sync). This also means not putting your vault inside a cloud-synchronized folder (e.g., an iCloud or Dropbox folder).
> - This is a synchronization plugin, not a backup solution. Do not rely on it for backup.
> - If the device's storage runs out, database corruption may happen.
> - Hidden files or any other invisible files would not be kept in the database, and thus will not be synchronized (**and may also get deleted**).

## How to use

### 3-minute setup - CouchDB on fly.io

**Recommended for beginners**

[](https://www.youtube.com/watch?v=7sa_I1832Xc)

1. [Setup CouchDB on fly.io](docs/setup_flyio.md)
2. Configure the plug-in in [Quick Setup](docs/quick_setup.md)

### Manually Setup

1. Set up the server
   1. [Setup CouchDB on fly.io](docs/setup_flyio.md)
   2. [Setup your CouchDB](docs/setup_own_server.md)
2. Configure the plug-in in [Quick Setup](docs/quick_setup.md)

> [!TIP]
> fly.io is no longer free. Fortunately, even though there are some issues, we are still able to use IBM Cloudant; here is [Setup IBM Cloudant](docs/setup_cloudant.md). It will be updated soon!

## Something looks corrupted...

Please open the configuration link again and answer as follows:

- If your local database looks corrupted (in other words, when Obsidian is getting weird even standalone), answer `No` to `Keep local DB?`
- If your remote database looks corrupted (in other words, when something happens while replicating), answer `No` to `Keep remote DB?`

If you answered `No` to both, your databases will be rebuilt from the content on your device, and the remote database will lock out the other devices. You will have to synchronize all your devices again. (By then, almost all of your files should be synchronized with their timestamps, so you can keep using an existing vault.)

## Test Server

~~Setting up an instance of Cloudant or a local CouchDB is a little complicated, so I set up a [Tasting server for self-hosted-livesync](https://olstaste.vrtmrz.net/). Try it out for free!~~
Currently (30 May 2023) suspended while the server is being transferred.
Note: Please read the "Limitations" carefully. Do not send your private vault.

## Information in StatusBar

Synchronization status is shown in the status bar with the following icons.

- Activity Indicator
  - 📲 Network request
- Status
  - ⏹️ Stopped
  - 💤 LiveSync enabled. Waiting for changes
  - ⚡️ Synchronization in progress
  - ⚠ An error occurred
- Statistical indicator
  - ↑ Uploaded chunks and metadata
  - ↓ Downloaded chunks and metadata
- Progress indicator
  - 📥 Unprocessed transferred items
  - 📄 Working database operations
  - 💾 Working storage-write processes
  - ⏳ Working storage-read processes
  - 🛫 Pending storage-read processes
  - ⚙️ Working or pending storage processes for hidden files
  - 🧩 Waiting chunks
  - 🔌 Working customisation items (configuration, snippets, and plug-ins)

To prevent file and database corruption, please wait until all progress indicators have disappeared before closing Obsidian, as far as possible (the plugin will try to resume regardless). This is especially important if you have deleted or renamed files.

## Hints

- If a folder becomes empty after a replication, it will be deleted by default. You can toggle this behaviour; check the [Settings](docs/settings.md).
- LiveSync mode drains the battery more on mobile devices. Periodic sync combined with some automatic sync is recommended.
- Mobile Obsidian cannot connect to non-secure (HTTP) or locally-signed servers, even if the root certificate is installed on the device.
- There is no `exclude_folders`-like configuration.
- While synchronizing, files are compared by their modification time and the older ones are overwritten by the newer ones. The plugin then checks for conflicts, and if a merge is needed, a dialog will open.
- Rarely, a file in the database could become corrupted. The plugin will not write a file to local storage when it looks corrupted. If a local version of the file is on your device, the corruption can be fixed by editing the local file and synchronizing it. But if the file does not exist on any of your devices, it cannot be rescued; in this case, you can delete these items from the settings dialog.
- To stop the boot-up sequence (e.g., for fixing problems with the databases), you can put a `redflag.md` file (or directory) at the root of your vault.
  Tip for iOS: a redflag directory can be created at the root of the vault using the Files application.
- Also, with `redflag2.md` placed, we can automatically rebuild both the local and the remote databases during the boot-up sequence. With `redflag3.md`, we can discard only the local database and fetch from the remote again.
- Q: The database is growing; how can I shrink it down?
  A: Each document is saved with its past 100 revisions, which are used for detecting and resolving conflicts. Imagine a device that has been offline for a while coming back online: it has to compare its notes with the remotely saved ones. If there is a historic revision in which a note used to be identical, it can be updated safely (like a git fast-forward). Even when there is not, only the differences after the most recent revision both devices have in common need to be checked, much like git's conflict resolution. So, as with an enlarged git repository, if you want to solve the root of the problem, you have to rebuild the database.
- More technical information is in the [Technical Information](docs/tech_info.md).
- If you want to synchronize files without Obsidian, you can use [filesystem-livesync](https://github.com/vrtmrz/filesystem-livesync).
- A WebClipper is also available on the Chrome Web Store: [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)
  The repo is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). (Docs are a work in progress.)

## Tips and Troubleshooting

If you are having problems getting the plugin working, see [Tips and Troubleshooting](docs/troubleshooting.md).

## License

Licensed under the MIT License.
153  README_ja.md

@@ -1,84 +1,85 @@

<!-- For translation: 20240227r0 -->
# Self-hosted LiveSync

[English docs](./README.md) - [Chinese docs](./README_cn.md).

**Formerly: obsidian-livesync**



## Features

- Synchronises vaults with each other very efficiently, with low traffic.
- Good at conflict resolution.
  - Simple conflicts are merged automatically.
- Uses an OSS solution as the synchronisation server.
  - Compatible solutions can also be used.
- End-to-End encryption implemented.
- Settings, snippets, themes, and plug-ins can be synchronised.
- There is also a [web clipper](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf).

Especially recommended for researchers, designers, and developers who must comply with NDAs, similar contracts and obligations, or professional ethics.
In enterprise settings in particular, it is sometimes required that data be stored only on servers under one's own management, even when End-to-End encryption is applied.

>[!IMPORTANT]
> - Always back up your vault before installing or updating.
> - Do not enable multiple synchronisation solutions at the same time (this includes iCloud and the official Sync).
> - This plugin is a synchronisation plugin. Do not use it as a backup.

## How to use this plugin

To install, search for "Self-hosted LiveSync" in Community Plugins, or download `main.js`, `manifest.json`, and `style.css` from this repository's Releases, put them into `.obsidian/plugins/obsidian-livesync` inside the vault, and restart Obsidian.

### 3-minute setup - CouchDB on fly.io

**Recommended for first-time users**

[](https://www.youtube.com/watch?v=7sa_I1832Xc)

1. [Set up CouchDB on Fly.io](docs/setup_flyio.md)
2. Configure the plug-in in [Quick Setup](docs/quick_setup_ja.md)

### Manually Setup

1. Set up the server
   1. [Set up CouchDB on Fly.io](docs/setup_flyio.md)
   2. [Set up your CouchDB](docs/setup_own_server_ja.md)
2. Configure the plug-in in [Quick Setup](docs/quick_setup_ja.md)

> [!TIP]
> IBM Cloudant can still be used, but for several reasons it is no longer recommended. [Setup IBM Cloudant](docs/setup_cloudant_ja.md) is still available.

## Test server

If you feel reluctant to install CouchDB or set up a Cloudant instance, try the [test server for Self-hosted LiveSync](https://olstaste.vrtmrz.net/).
Note: Check the limitations carefully before use. Never synchronise the vault you actually use.

## WebClipper

A WebClipper for Self-hosted LiveSync is also available on the Chrome Web Store:

[obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)

The repository is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). As ever, the documentation is lagging behind.

## Status bar information

The synchronisation status is shown in the status bar with the following icons.

- Activity
  - 📲 Network request in progress
- Synchronisation status
  - ⏹️ Stopped
  - 💤 Waiting for changes (during LiveSync)
  - ⚡️ Synchronisation in progress
  - ⚠ An error occurred
- Statistics
  - ↑ Number of uploaded chunks and metadata
  - ↓ Number of downloaded chunks and metadata
- Progress
  - 📥 Unprocessed transferred items
  - 📄 Running database operations
  - 💾 Running storage-write operations
  - ⏳ Running storage-read operations
  - 🛫 Pending storage-read operations
  - ⚙️ Hidden-file operations (pending and running, in total)
  - 🧩 Chunks waiting to be fetched
  - 🔌 Customisation-sync operations (configuration, snippets, and plug-ins)

To avoid database and file corruption, wait until the progress information disappears before quitting Obsidian (the plugin will also try to recover). Be especially careful when you have deleted or renamed files.

## Further notes

- After files are synchronised, timestamps are compared and the newer file overwrites the older one; merging then happens depending on whether a conflict occurred.
- In rare cases, files may become corrupted. The plugin does not try to write a corrupted file to disk, so a slightly older version usually remains on the device in use. If that file is updated again, the database is refreshed and the problem may resolve itself. If the file does not exist on any device, it can be deleted from the settings screen.
- If the local database cannot be repaired properly, for example after restarting during a database recovery, put a file named `redflag.md` at the top of your vault; the boot-up sequence will be skipped.
- "The database is growing; can I make it smaller?" Each note is stored together with its past 100 revisions. Suppose a device that has been offline for a while synchronises again: it will hold revisions slightly different from the latest. Even then, if one of them exists in the remote revision history, the data can be merged safely. If not, the differences to check can still be narrowed down to those after the most recent revision both sides have in common. Conflicts are resolved in a git-like way. Hence, as with a bloated git repository, truly shrinking the database requires rebuilding it.
- Other technical topics are covered in [Technical information](docs/tech_info_ja.md).
- It is not possible to synchronise with the official Sync.

## Tips and Troubleshooting

If you run into trouble, see [Tips and Troubleshooting](docs/troubleshooting.md).

## License

Licensed under the MIT License.
46  docker-compose.traefik.yml  (new file)

@@ -0,0 +1,46 @@

```yaml
# For details and other explanations about this file refer to:
# https://github.com/vrtmrz/obsidian-livesync/blob/main/docs/setup_own_server.md#traefik

version: "2.1"
services:
  couchdb:
    image: couchdb:latest
    container_name: obsidian-livesync
    user: 1000:1000
    environment:
      - COUCHDB_USER=username
      - COUCHDB_PASSWORD=password
    volumes:
      - ./data:/opt/couchdb/data
      - ./local.ini:/opt/couchdb/etc/local.ini
    # Ports not needed when already passed to Traefik
    #ports:
    #  - 5984:5984
    restart: unless-stopped
    networks:
      - proxy
    labels:
      - "traefik.enable=true"
      # The Traefik Network
      - "traefik.docker.network=proxy"
      # Don't forget to replace 'obsidian-livesync.example.org' with your own domain
      - "traefik.http.routers.obsidian-livesync.rule=Host(`obsidian-livesync.example.org`)"
      # The 'websecure' entryPoint is basically your HTTPS entrypoint. Check the next code snippet only if you are encountering problems; if this is not the first container you are reverse proxying, you probably already have a working Traefik configuration.
      - "traefik.http.routers.obsidian-livesync.entrypoints=websecure"
      - "traefik.http.routers.obsidian-livesync.service=obsidian-livesync"
      - "traefik.http.services.obsidian-livesync.loadbalancer.server.port=5984"
      - "traefik.http.routers.obsidian-livesync.tls=true"
      # Replace the string 'letsencrypt' with your own certificate resolver
      - "traefik.http.routers.obsidian-livesync.tls.certresolver=letsencrypt"
      - "traefik.http.routers.obsidian-livesync.middlewares=obsidiancors"
      # The part needed for CORS to work on Traefik 2.x starts here
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowmethods=GET,PUT,POST,HEAD,DELETE"
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowheaders=accept,authorization,content-type,origin,referer"
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolalloworiginlist=app://obsidian.md,capacitor://localhost,http://localhost"
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolmaxage=3600"
      - "traefik.http.middlewares.obsidiancors.headers.addvaryheader=true"
      - "traefik.http.middlewares.obsidiancors.headers.accessControlAllowCredentials=true"

networks:
  proxy:
    external: true
```
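
Not part of the compose file, but useful once it is up: a minimal sketch of how one might confirm that the proxied CouchDB answers and that the credentials work. It assumes Node 18+ (for the global `fetch`) and reuses the placeholder hostname and credentials from the file above; the `/` greeting and `/_session` endpoints are standard CouchDB APIs.

```typescript
// Minimal reachability check for the proxied CouchDB instance (sketch only).
// Replace the placeholder hostname and credentials with your own values.
const BASE = "https://obsidian-livesync.example.org";

async function checkCouchDB(): Promise<void> {
  // The CouchDB root endpoint answers with a small JSON greeting.
  const root = await fetch(`${BASE}/`);
  console.log(root.status, await root.json()); // e.g. 200 { couchdb: "Welcome", ... }

  // An authenticated request against _session verifies the credentials.
  const auth = Buffer.from("username:password").toString("base64");
  const session = await fetch(`${BASE}/_session`, {
    headers: { Authorization: `Basic ${auth}` },
  });
  console.log(session.status, await session.json());
}

checkCouchDB().catch(console.error);
```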
50  docs/design_docs_of_journalsync.md  (new file)

@@ -0,0 +1,50 @@

## The design document of the journal sync

Original title: Synchronise without CouchDB

### Goal

- Synchronise vaults without CouchDB

### Motivation

- Serving CouchDB is not that easy.
- A full-spec DBaaS (paid IBM Cloudant) is somewhat expensive, and alternatives are lacking.
- Securing alternatives, rather than depending on a single protocol.

### Prerequisite

- We should have multiple implementations of the server software.
- We should also be able to use SaaS, with a choice of options.
- They should come at a reasonable cost, ideally free of charge for trials.
- We should be able to serve an instance of the server software ourselves, as OSS: with transparency, the availability of auditing, and the fact that audits have actually taken place.

### Methods and implementations

Ordinarily, the local PouchDB and the remote CouchDB are synchronised by sending each missing document through several conversations in their replication protocol. However, to achieve this plan, we cannot rely on CouchDB and its protocols. This limitation is harsh, but overcoming it means gaining new possibilities. After some trials, it was concluded that synchronisation could be completed even if the available actions were limited to uploading, downloading, and retrieving a file list. This means we can use any old-fashioned WebDAV server, or sophisticated "object storage" such as self-hosted MinIO, S3, R2, or anything we like. This is realised by each client sharing and complementing the differences of the journal. The focus is therefore on how to identify the differences and send them without dynamic communication.

All clients manage their data in PouchDB which, as is probably well known, keeps its own journal.

First, each client records the point in the journal up to which it last sent. When sending, the client packs everything from that point up to the latest entry, and updates its record. The pack is uploaded to the server under a name that starts with the timestamp of its creation. This is the send operation.

Conversely, when receiving, the packs on the server that have not yet been received are fetched in order. This is easy, as their names sort in date order. When the process completes successfully, the names of the received files are recorded. The journal entries from each pack are then reflected into the client's own database. Conflict resolution is left to PouchDB, so the client only has to apply the differences. And here is the key: the client records the ID and revision of every document that arrived in the journal and was applied.

This key comes into play when creating a pack: the client omits the documents recorded as received and applied, because "received and applied" means the document was already sent by another client and exists on the server. This ensures that unnecessary transmissions do not take place.

Synchronisation is then always started by receiving. This is a little trick to avoid including unnecessary documents in the pack.

These behaviours allow clients to voluntarily send and receive only the missing parts of the journal that are not yet stored on the server, without having to communicate with each other, and still keep a single, consistent journal on the server.

The source code implementing this has already been committed to the repository; a sketch of the cycle follows.
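
A minimal sketch, in TypeScript, of the receive-then-send cycle described above. The `RemoteStorage` interface, the `JournalSync` class, and the shape of the local database are illustrative assumptions made for this sketch, not the plugin's actual API.

```typescript
// Illustrative sketch of the journal-pack cycle; names are hypothetical.
type DocEntry = { id: string; rev: string; body: unknown };
type Pack = { createdAt: string; entries: DocEntry[] };

interface RemoteStorage {
  list(): Promise<string[]>;              // list pack names (date-ordered names)
  download(name: string): Promise<Pack>;
  upload(name: string, pack: Pack): Promise<void>;
}

interface LocalJournal {
  changesSince(seq: number): DocEntry[];
  apply(entry: DocEntry): void;           // conflict handling is PouchDB's job
  seq(): number;
}

class JournalSync {
  private sentUpTo = 0;                    // journal point last sent
  private received = new Set<string>();    // pack names already applied
  private applied = new Set<string>();     // "id@rev" pairs seen via packs

  constructor(private local: LocalJournal, private remote: RemoteStorage) {}

  // Receive: fetch unseen packs in name (= date) order and apply them.
  async receive(): Promise<void> {
    for (const name of (await this.remote.list()).sort()) {
      if (this.received.has(name)) continue;
      const pack = await this.remote.download(name);
      for (const e of pack.entries) {
        this.local.apply(e);
        this.applied.add(`${e.id}@${e.rev}`); // already exists on the server
      }
      this.received.add(name);
    }
  }

  // Send: pack everything after the last sent point, omitting entries that
  // were received from the server (they are known to exist there already).
  async send(): Promise<void> {
    const entries = this.local
      .changesSince(this.sentUpTo)
      .filter((e) => !this.applied.has(`${e.id}@${e.rev}`));
    if (entries.length > 0) {
      const createdAt = new Date().toISOString();
      await this.remote.upload(`${createdAt}.pack`, { createdAt, entries });
    }
    this.sentUpTo = this.local.seq();
  }

  // Synchronisation always starts by receiving (see above).
  async sync(): Promise<void> {
    await this.receive();
    await this.send();
  }
}
```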

### Test strategy

This implementation replaces the synchronisation performed by CouchDB. Therefore, testing was simply done by applying the same changes to the same vault and comparing the result of replication via CouchDB with the result of this implementation.

### Documentation strategy

- At least the quick setup should be documented.
  - As several server implementations can be selected, specific configuration values are left undocumented.
  - A MinIO set-up guide might be nice to have, but it is not considered essential.
- It would be a good opportunity to also publish these design documents.

### Consideration and Conclusion

This design offers a novel approach to journal synchronisation without relying on CouchDB. It builds on PouchDB's journaling capabilities and uses simple server-side storage for efficient data exchange. Hence, the new design could be said to have gained a broader outlook.
81  docs/design_docs_of_keep_newborn_chunks.md  (new file)

@@ -0,0 +1,81 @@

# Keep newborn chunks in Eden

NOTE: This is the design document of a planned feature. It is planned, but not yet implemented (as of v0.23.3). It has not reached design freeze and will be added to from time to time.

## Goal

Reduce the number of volatile chunks, and reduce the storage usage of the remote database over the middle to long term.

## Motivation

- In the current implementation, Self-hosted LiveSync splits documents into metadata and multiple chunks. In particular, chunks are split so that they do not exceed a certain length.
  - This is done to optimise the transfer and take advantage of the properties of CouchDB. It also complies with IBM Cloudant's restriction on the size of a single document.
- However, creating chunks halfway through each editing operation increases the number of unnecessary chunks.
- Chunks are shared by several documents. For this reason, it is not clear whether a chunk is needed or not unless all revisions of all documents are checked. This makes it difficult to remove unnecessary data.
- On the other hand, chunks are split at units that divide neatly as Markdown, to ensure relatively accurate de-duplication even when they are created simultaneously on multiple devices. Therefore, it is unlikely that data from a half-finished edit will ever be reused.
- For this reason, we have made features such as Batch save available, but they are not a fundamental solution.
- As a result, there is a large amount of data that cannot be erased and is probably unused. Therefore, `Fetch chunks on demand` is currently performed for optimal communication.
  - If the generation of unnecessary chunks is sufficiently reduced, this function will become unnecessary.
- The problem is that this unnecessary chunking slows down both local and remote operations.

## Prerequisite

- The implementation must be able to control the size of the document appropriately so that it does not become non-transferable (1).
- The implementation must be such that data corruption can be avoided even if forward compatibility is not maintained; due to the nature of Self-hosted LiveSync, connections from older versions are expected.
- Viewed as a feature:
  - This feature should be disabled for migrating users.
  - This feature should be enabled for new users, and after rebuilds by migrated users.
- Therefore, back in the implementation view, the implementation should ideally be such that data recovery can be achieved by upgrading immediately after replication.

## Outlined methods and implementation plans

### Abstract

To store and transfer only stable chunks independently and share them between multiple documents after stabilisation, new chunks (i.e., chunks considered non-stable) are instead stored inside the document and transferred with it. In doing so, care must be taken not to violate prerequisite (1).

If this is achieved, non-leaf documents will not be transferred, and even where they are, their chunks are stored inside the document, so the size can be reduced by compaction.

Details are given below.

1. The document will henceforth have the property `eden`.
   ```typescript
   // Partial type
   type EntryWithEden = {
     eden: {
       [key: DocumentID]: {
         data: string,
         epoch: number, // The document revision in which this chunk was born.
       }
     }
   }
   ```
2. The following configuration items are added.
   Note: These configurations should be shared as `Tweak values` between clients.
   - useEden: boolean
   - Max chunks in eden: number
   - Max total chunk lengths in eden: number
   - Max age while in eden: number
3. In the document-saving operation, chunks are added to the Eden of each document, carrying the revision number of the existing document. If some chunks in Eden are not used by the operating revision, they are removed.
   After that, some chunks are chosen to graduate as independent `chunk` documents under the following rules, and they leave Eden (see the sketch after this list):
   - Those that have already been confirmed to exist as independent chunks.
     - This existence check may ideally be a fast first-order test, e.g., a Bloom filter.
   - Those whose length exceeds the configured maximum length.
   - Those that have aged beyond the configured value, counted from their epoch to the operating revision.
   - Those beyond the point at which, with the chunks arranged in reverse order of the revision in which they were generated, the running total length exceeds the configured maximum; or those past the configured maximum number of items.
4. In the document-loading operation, chunks are first read from Eden.
5. With End-to-End encryption, the `eden` property of documents will also be encrypted.
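
A minimal sketch of the graduation rules in step 3, in TypeScript. All names, the shape of the configuration, and the per-chunk length cap are assumptions made for this sketch, not the plugin's actual API.

```typescript
// Illustrative sketch of the Eden "graduation" rules; names are hypothetical.
type EdenChunk = { id: string; data: string; epoch: number };

type EdenConfig = {
  maxChunkLength: number;       // assumed per-chunk length cap
  maxChunksInEden: number;      // "Max chunks in eden"
  maxTotalLengthInEden: number; // "Max total chunk lengths in eden"
  maxAgeInEden: number;         // "Max age while in eden", in revisions
};

function selectGraduates(
  eden: EdenChunk[],
  currentRev: number,
  conf: EdenConfig,
  // Ideally a fast first-order existence test, e.g. backed by a Bloom filter.
  existsAsIndependentChunk: (id: string) => boolean,
): EdenChunk[] {
  const graduates = new Set<EdenChunk>();
  for (const chunk of eden) {
    if (existsAsIndependentChunk(chunk.id)) graduates.add(chunk); // rule 1
    else if (chunk.data.length > conf.maxChunkLength) graduates.add(chunk); // rule 2
    else if (currentRev - chunk.epoch > conf.maxAgeInEden) graduates.add(chunk); // rule 3
  }
  // Rule 4: walk the remainder from newest to oldest; anything past the
  // total-length or item-count budget graduates as well.
  const rest = eden
    .filter((c) => !graduates.has(c))
    .sort((a, b) => b.epoch - a.epoch);
  let total = 0;
  rest.forEach((chunk, index) => {
    total += chunk.data.length;
    if (total > conf.maxTotalLengthInEden || index >= conf.maxChunksInEden) {
      graduates.add(chunk);
    }
  });
  return [...graduates];
}
```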

### Note

- When this feature is enabled, forward compatibility is temporarily lost. However, the situation is detected as missing chunks, and the affected data is simply not written to storage by older versions. Therefore, no data loss will occur.

## Test strategy

1. Confirm that synchronisation with the previous version is possible with this feature disabled.
2. With this feature enabled, connect from the previous version and confirm that errors are detected in the previous version but the files are not corrupted.
3. Ensure that two clients with this feature enabled can withstand normal use.

## Documentation strategy

- This document is published and will be referred to from the release notes.
- Indeed, we lack a complete configuration table. Efforts will be made, and if one can be produced, this document will be referenced from it. This is not required while the feature is experimental or in beta.
  - However, this might become an essential feature. Further efforts are desired.

### Consideration and Conclusion

To be described after this has been implemented, tested, and released.
55  docs/design_docs_of_sharing_tweak_value.md  (new file)

@@ -0,0 +1,55 @@

# Sharing `Tweak values`

NOTE: This is the design document of a planned feature. It is planned, but not yet implemented (as of v0.23.3). It has not reached design freeze and will be added to from time to time.

## Goal

Share `Tweak values` between clients to match the chunk lengths, and to match per-server configurations for better performance.

## Motivation

- In the current implementation, Self-hosted LiveSync splits documents into metadata and multiple chunks. In particular, chunks are split so that they do not exceed a certain length.
  - This is done to optimise the transfer and take advantage of the properties of CouchDB. It also complies with IBM Cloudant's restriction on the size of a single document.
- The length of a chunk is adjusted according to a configured factor. Therefore, if this factor is inconsistent between clients, de-duplication will not work: the chunks point to the same content in total, but are split in different places. This results in unnecessary transfers and storage consumption.
  - The same applies to hash algorithms.
- There are more configurations that are `preferred to be matched`, even if matching is not strictly required: for example, the maximum size of files to be handled and the interval between requests to the remote database, unless there are specific circumstances.
- To avoid the tragedy of "too many toggles", "unexpected transfer amounts", or "poor performance" all at once, the plug-in should know about these problems, or potential problems, and be able to let us know.

## Prerequisite

- We must be informed of any discrepancy in a configured value that is required to be absolutely consistent, and be able to make a decision on the spot.
- We should be able to see in the configuration dialogue that there is a discrepancy between configured values that should be matched, and it should be possible to adjust them to one specific set (or to the default).
- We must not be exposed to the unexpected, such as leaking credentials or other secrets.

## Outlined methods and implementation plans

### Abstract

- In the current implementation, each client checks the remote database for the existence of its node information, to detect whether the remote database accepts it.
  - This is what 'Lock' is all about.
- To achieve this feature, the client will also send its configuration values. However, the configuration contains credentials and/or secret values, so we cannot send all of them.
- On a favourable prediction, Self-hosted LiveSync will keep growing in features. Each time this happens, the number of configuration values to be kept secret will also increase. Therefore, they must be handled via an allow-list.
  - The allow-listed configuration values are the `Tweak values` (see the sketch after this list).
- If the plug-in detects mismatched `Tweak values` when checking the remote database, it will ask us to decide which wins (mine, or theirs).
- Node information is an ordinary document. Therefore, it will be replicated and saved locally. While showing the dialogue, a notice is shown for each `Match preferred` configuration.
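
A minimal sketch of the allow-list approach in TypeScript. The field names used here are hypothetical examples, not the plugin's real configuration keys.

```typescript
// Illustrative sketch of the allow-list; key names are hypothetical.
type Settings = Record<string, string | number | boolean>;

// Only explicitly allow-listed keys may leave the device; everything else
// (credentials, passphrases, endpoints) is withheld by default.
const TWEAK_VALUE_KEYS = [
  "customChunkSize",  // must match, or de-duplication breaks
  "hashAlgorithm",    // must match, for the same reason
  "maxFileSize",      // preferred to match
  "requestInterval",  // preferred to match
] as const;

function extractTweakValues(settings: Settings): Settings {
  const out: Settings = {};
  for (const key of TWEAK_VALUE_KEYS) {
    if (key in settings) out[key] = settings[key];
  }
  return out;
}

// On connecting, compare the local tweak values with those stored in the
// remote node information and surface any mismatches to the user.
function findMismatches(mine: Settings, theirs: Settings): string[] {
  return TWEAK_VALUE_KEYS.filter((k) => k in theirs && theirs[k] !== mine[k]);
}
```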

## Note

This feature should be mostly harmless. It will not be possible to disable it.

## Test strategy

A: During synchronisation.
1. No message shall be displayed when all settings match.
2. A message shall be displayed when there are mismatched, required-match items.
   1. The setting values can be changed according to the message.
   2. The message can be ignored.
3. No message shall be displayed even if there are mismatched items that are only recommended to match.

B: In the settings dialogue.
1. All mismatched items shall be highlighted in some way.

## Documentation strategy

- This document is published and will be referred to from the release notes.
- Indeed, we lack a complete configuration table. Efforts will be made, and if one can be produced, this document will be referenced from it. This is not required while the feature is experimental or in beta.
  - However, this might become an essential feature. Further efforts are desired.

### Consideration and Conclusion

To be described after this has been implemented, tested, and released.
docs/quick_setup.md

@@ -1,97 +1,129 @@

# Quick setup

[Japanese docs](./quick_setup_ja.md) - [Chinese docs](./quick_setup_cn.md).

The plugin has a great many configuration options to deal with different circumstances. However, only a few settings are required in normal cases. Therefore, the `Setup wizard` has been implemented to simplify the setup.

There are three methods to set up Self-hosted LiveSync:

1. [Using setup URIs](#1-using-setup-uris) *(Recommended)*
2. [Minimal setup](#2-minimal-setup)
3. [Fully manual setup, enabled on this dialogue](#3-manually-setup)

## At the first device

### 1. Using setup URIs

> [!TIP]
> What is the setup URI? Why is it required?
> The setup URI is the encrypted representation of a Self-hosted LiveSync configuration as a URI, starting with `obsidian://setuplivesync?settings=`. It is encrypted with a passphrase, so it can be shared relatively securely between devices. It is a bit long, but it is one line. It allows a whole series of settings to be applied at once, without any inconsistencies.
>
> If you have configured the remote database with the [Automated setup on Fly.io](./setup_flyio.md#a-very-automated-setup) or have [set up your server with the tool](./setup_own_server.md#1-generate-the-setup-uri-on-a-desktop-device-or-server), **you should already have one of them.**
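
Conceptually, applying a setup URI amounts to the following. This is a sketch only: the helper names and the `decryptWithPassphrase` function are hypothetical, and the plugin's real URIs are produced by its own encryption scheme, not constructed like this.

```typescript
// Conceptual sketch of what "applying a setup URI" does; not the real code.
const PREFIX = "obsidian://setuplivesync?settings=";

// Hypothetical: some passphrase-based decryption, provided elsewhere.
declare function decryptWithPassphrase(cipherText: string, passphrase: string): string;

function applySetupUri(uri: string, passphrase: string): Record<string, unknown> {
  if (!uri.startsWith(PREFIX)) throw new Error("Not a setup URI");
  const encrypted = decodeURIComponent(uri.slice(PREFIX.length));
  // One passphrase-protected blob carries the whole configuration, which is
  // why all settings arrive on the new device as one consistent piece.
  return JSON.parse(decryptWithPassphrase(encrypted, passphrase));
}
```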

In this procedure, [this video](https://youtu.be/7sa_I1832Xc?t=146) may help us.

1. Click the `Use` button (or launch `Use the copied setup URI` from the Command palette).
2. Paste the Setup URI into the dialogue.
3. Type the passphrase of the Setup URI.
4. Answer `yes` to `Importing LiveSync's conf, OK?`.
5. Answer `Set it up as secondary or subsequent device` to `How would you like to set it up?`.
6. Initialisation will begin; please hold on a while.
7. You will be asked about hidden file synchronisation; answer as you like.
   1. If you are new to Self-hosted LiveSync, leave it for now; we can configure it later.
8. Synchronisation starts! `Reload app without saving` is recommended after the Self-hosted LiveSync indicators disappear.

OK, we can proceed to the [next step](#).

### 2. Minimal setup

If you do not have any setup URI, press the `start` button. The settings dialogue turns into wizard mode and displays only the minimal items.

>[!TIP]
> We can generate the setup URI with the tool at any time. Please use [this tool](./setup_own_server.md#1-generate-the-setup-uri-on-a-desktop-device-or-server).



Let's see how to use it step by step.

#### Select the remote type

1. Select the Remote Type from the dropdown list.
   We now have a choice between CouchDB (and its compatibles) and Object Storage (MinIO, S3, R2). CouchDB is the first choice and is also recommended; Object Storage support is an experimental feature.

#### Remote configuration

##### CouchDB

Enter the information for the database we have set up.



##### Object Storage



1. Enter the information for the S3 API and bucket.
   Note 1: if you use S3, you can leave the Endpoint URL empty.
   Note 2: if your Object Storage cannot be configured fully for CORS, you may be able to connect to the server by enabling the `Use Custom HTTP Handler` toggle.
2. Press `Test` under `Test Connection` once and ensure you can connect to the Object Storage.

#### Only CouchDB: Test database connection and Check database configuration

We can check the connectivity to the database, and the database settings.



#### Only CouchDB: Check and Fix database configuration

Check the database settings and fix any problems on the spot.



The items shown may vary depending on the connection. In the case above, press all three Fix buttons.
If the Fix buttons disappear and everything becomes a check mark, we are done.
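
The Fix buttons drive CouchDB's standard configuration API (`PUT /_node/{node}/_config/{section}/{key}`), so the same changes can also be made by hand. A sketch follows, assuming Node 18+ for `fetch`; the specific keys shown are typical examples from CouchDB server setups for this plugin, and the exact set the wizard checks may differ between versions.

```typescript
// Sketch: setting CouchDB configuration values by hand (illustrative keys).
const COUCH = "https://your.couchdb.example"; // placeholder
const AUTH = "Basic " + Buffer.from("username:password").toString("base64");

async function setConfig(section: string, key: string, value: string) {
  // "_local" addresses the node we are connected to.
  const res = await fetch(`${COUCH}/_node/_local/_config/${section}/${key}`, {
    method: "PUT",
    headers: { Authorization: AUTH, "Content-Type": "application/json" },
    body: JSON.stringify(value), // CouchDB stores config values as strings
  });
  console.log(section, key, res.status, await res.json()); // returns the old value
}

async function main() {
  await setConfig("chttpd", "require_valid_user", "true");
  await setConfig("chttpd_auth", "require_valid_user", "true");
  await setConfig("cors", "origins", "app://obsidian.md,capacitor://localhost,http://localhost");
}

main().catch(console.error);
```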

#### Confidentiality configuration (optional but strongly preferred)



Enable End-to-End encryption and the contents of your notes will be encrypted the moment they leave the device. We strongly recommend enabling it. `Path Obfuscation` additionally obfuscates filenames; it is now stable and also recommended.

These settings can be disabled if you are inside a closed network and it is clear that you will not be accessed by third parties.

> [!TIP]
> Encryption is based on 256-bit AES-GCM.

We should proceed to the next step.

#### Sync Settings

Finally, finish the wizard by selecting a preset for synchronisation.

Note: If you are going to use Object Storage, you cannot select `LiveSync`.



Select the synchronisation method we want to use and `Apply`. If database initialisation is required, it will be performed at this time. When `All done!` is displayed, we are ready to synchronise.



The `Copy settings as a new setup URI` dialogue will open automatically. Please input a passphrase to encrypt the new `Setup URI`. (This passphrase encrypts the setup URI, not the vault.)



The Setup URI will be copied to the clipboard; please make a note of it (not in Obsidian).

>[!TIP]
> We can copy this at any time with `Copy current settings as a new setup URI`.

### 3. Manually setup

It is strongly recommended to perform a minimal set-up first and configure the other items after making sure everything has been synchronised.

However, if you have specific reasons to configure it manually, please click the `Enable` button of `Enable LiveSync on this device as the set-up was completed manually`.
Then, please copy the setup URI with `Copy current settings as a new setup URI` and make a note of it (not in Obsidian).

## At the subsequent device

After setting up the first device, we should have a setup URI. **The first choice is to use it.** Please share it with the device you want to set up.

It is exactly the same as [Using setup URIs](#1-using-setup-uris) on the first device. Please refer to it.
93  docs/quick_setup_cn.md  (new file)

@@ -0,0 +1,93 @@

# Quick setup

The plugin has many configuration options to deal with different situations. However, only a few settings are actually used. Therefore, "The Setup wizard" has been implemented to simplify the initial setup.

Note: Subsequent devices are best set up using `Copy setup URI` and `Open setup URI`.

## The Setup wizard

Open the `🧙‍♂️ Setup wizard` in the settings dialogue. If the plugin has not been configured before, it opens automatically.



- Discard the existing configuration and set up
  If you have made any settings before, this button allows you to discard all changes before setting up.

- Keep the existing configuration and set up
  Quickly reconfigure. Note that in wizard mode you cannot see all the configuration items, even those already configured.

Pressing `Next` on one of the options above puts the configuration dialogue into wizard mode.

### Wizard mode



The following describes how to use wizard mode step by step.

## Remote database configuration

### Begin remote database configuration

Enter the information for the database that has been deployed.



#### Test database connection and check database configuration

We can check the connectivity to the database and the database settings.



#### Test database connection

Checks whether the database can be reached. If the connection fails, there can be several reasons; first click `Check database configuration` to see whether that fails too.

#### Check database configuration

Checks the database settings and fixes problems.



The contents of the config check may vary between connections. In the case shown above, press all three Fix buttons.
When the Fix buttons disappear and everything becomes a check mark, the fix is complete.

### Encryption configuration



Encrypt your database in case it is accidentally exposed: with End-to-End encryption enabled, note contents are encrypted as soon as they leave the device. We strongly recommend enabling it. `Path Obfuscation` also obfuscates filenames; it is now stable and recommended.
Encryption is based on 256-bit AES-GCM.
These settings can be disabled if you are inside a closed network and it is clear that third parties will not access your files.



#### Next

Go to the synchronisation settings.

#### Discard the existing database and proceed

Clear the contents of the remote database, then go to the synchronisation settings.

### Synchronisation settings

Finally, complete the wizard by selecting a synchronisation preset.



Select whichever synchronisation method we want to use, then `Apply` to initialise and build the local and remote databases as required. When `All done!` is displayed, we are finished. `Copy setup URI` will open automatically and ask for a passphrase to encrypt the `Setup URI`.



Set a passphrase as you like.
The Setup URI will be copied to the clipboard; you can then transfer it to the second and subsequent devices in some way.

## How to set up the second and subsequent devices

After installing Self-hosted LiveSync on the new device, select `Open setup URI` from the command palette and enter the Setup URI you transferred. Then enter the passphrase, and a setup wizard will open.
Answer the prompts as follows:

- `Importing LiveSync's conf, OK?`: choose `Yes`
- `How would you like to set it up?`: choose `Set it up as secondary or subsequent device`

The configuration will then take effect and replication will start. Your files will be synchronised soon! You may need to close the settings dialogue and reopen it to see the fields populated properly, but they will be set.
docs/quick_setup_ja.md

@@ -1,10 +1,10 @@

# Quick setup

This plugin has a great many configuration options for dealing with various situations. However, not many of them are actually needed in practice, so a "Setup wizard" has been implemented to simplify the initial setup.
Note: From the second device onward, please set up using `Copy setup URI` and `Open setup URI`.

## How to use the wizard

Open it from the `🧙‍♂️ Setup wizard`. If nothing has been set up yet, or no synchronisation setting is enabled, it is open by default.



@@ -32,20 +32,12 @@

This is the information you decided on when setting up the database.

### Test database connection and Check database configuration

Here we check the connection to the database and the database settings.



#### Test Database Connection

Checks whether the database can be reached at all. If it fails, there are several possible reasons; run Check database configuration once and see whether that also fails.

#### Check database configuration

Checks the database settings and fixes any deficiencies on the spot.

@@ -55,6 +47,15 @@

The items shown may vary depending on the connection target. In the case above, press all three Fix buttons in order.
When the Fix buttons disappear and everything becomes a check mark, you are done.

### Confidentiality settings



To guard against unintended database exposure, enable End-to-End encryption. When this item is enabled, note contents are encrypted the moment they leave the device. Enabling `Path Obfuscation` also obfuscates file names; it is now stable and likewise recommended.
Encryption uses 256-bit AES-GCM.
These settings can be disabled if you are inside a closed network and it is clear that no third party can access it.



### Next

@@ -63,20 +64,13 @@

### Discard existing database and proceed

If a remote database already exists, its contents are discarded before proceeding.

## Sync Settings

Finally, configure the synchronisation method.



From Presets, choose one of the synchronisation methods and `Apply`; the local and remote databases will be initialised and built as required.
When `All done!` is displayed, you are finished. `Copy setup URI` opens automatically and asks for a passphrase to encrypt the `Setup URI`.


252
docs/setup_flyio.md
Normal file
@@ -0,0 +1,252 @@
|
||||
<!-- For translation: 20240209r0 -->
|
||||
# Setup CouchDB on fly.io
|
||||
|
||||
This is how to configure fly.io and CouchDB on it for Self-hosted LiveSync.
|
||||
|
||||
> [!WARNING]
> It is **your** instance. In Obsidian, we have the files locally. Hence, do not hesitate to destroy the remote database if you feel something has gone wrong. We can launch and switch to a new CouchDB instance at any time[^1].
>
[^1]: Actually, I am always building databases like this to reproduce issues.
|
||||
|
||||
> [!NOTE]
> **What is Fly.io, and why use it?**
> At some point, we started to experience problems related to our IBM Cloudant account. At the same time, Self-hosted LiveSync started to improve its functionality, requiring CouchDB in a more natural state to use all its features.
>
> Then we found Fly.io. Fly.io is a PaaS platform which can be used for a very reasonable price. In most cases, usage falls within the `Free Allowances` range.
|
||||
|
||||
## Required materials
|
||||
|
||||
- A valid credit or debit card.
|
||||
|
||||
## Setup CouchDB instance
|
||||
|
||||
### A. Very automated setup
|
||||
|
||||
[](https://www.youtube.com/watch?v=7sa_I1832Xc)
|
||||
|
||||
1. Open [setup-flyio-on-the-fly-v2.ipynb](../setup-flyio-on-the-fly-v2.ipynb).
2. Press the `Open in Colab` button.
3. Choose a region and run all blocks (refer to the video).
    1. If you do not have an account yet, the sign-up page will be shown; please follow the instructions. The [official document is here](https://fly.io/docs/hands-on/sign-up/).
4. Copy the setup URI and use it in Obsidian.
5. Your vault is now synchronised. Use the setup URI on subsequent devices.

Steps 4 and 5 are detailed in the [Quick Setup](./quick_setup.md#1-using-setup-uris).
|
||||
|
||||
> [!NOTE]
> Your automatically configured settings will be shown in the result of the Colab notebook as below, and **they will not be saved**. Please make a note of them somewhere.
|
||||
> ```
|
||||
> -- YOUR CONFIGURATION --
|
||||
> URL : https://billowing-dawn-6619.fly.dev
|
||||
> username: billowing-cherry-22580
|
||||
> password: misty-dew-13571
|
||||
> region : nrt
|
||||
> ```
|
||||
|
||||
### B. Scripted Setup
|
||||
|
||||
Please refer to the document of [deploy-server.sh](../utils/readme.md#deploy-serversh).
|
||||
|
||||
### C. Manual Setup
|
||||
|
||||
| Used in the text | Meaning and where to use | Memo |
|
||||
| ---------------- | --------------------------- | ------------------------------------------------------------------------ |
|
||||
| campanella | Username | It is less likely to fail if it consists only of letters and numbers. |
|
||||
| dfusiuada9suy | Password | |
|
||||
| nrt | Region to make the instance | We can use any [region](https://fly.io/docs/reference/regions/) near us. |
|
||||
|
||||
#### 1. Install flyctl
|
||||
|
||||
- Mac or Linux
|
||||
|
||||
```sh
|
||||
$ curl -L https://fly.io/install.sh | sh
|
||||
```
|
||||
|
||||
- Windows
|
||||
|
||||
```powershell
|
||||
$ iwr https://fly.io/install.ps1 -useb | iex
|
||||
```
|
||||
|
||||
#### 2. Sign up or Sign in to fly.io
|
||||
|
||||
- Sign up
|
||||
|
||||
```bash
|
||||
$ fly auth signup
|
||||
```
|
||||
|
||||
- Sign in
|
||||
|
||||
```bash
|
||||
$ fly auth login
|
||||
```
|
||||
|
||||
For more information, please refer to [Sign up](https://fly.io/docs/hands-on/sign-up/) and [Sign in](https://fly.io/docs/hands-on/sign-in/).
|
||||
|
||||
#### 3. Make a configuration file
|
||||
|
||||
1. Make `fly.toml` from the template `fly.template.toml`.
We can simply copy and rename the file. The template is at [utils/flyio/fly.template.toml](../utils/flyio/fly.template.toml).
2. Decide the instance name, initialise the App, and set the credentials.

>[!TIP]
> - The name `billowing-dawn-6619` is a randomly chosen name, and it will be part of the CouchDB URL. It should be globally unique, so it is recommended to use something random for this name.
> - Explicit naming is very good for humans. However, we do not often get the chance to actually enter this manually (it has been designed so). This database may contain important information for you. The needle should be hidden in the haystack.
|
||||
|
||||
|
||||
```bash
|
||||
$ fly launch --name=billowing-dawn-6619 --env="COUCHDB_USER=campanella" --copy-config=true --detach --no-deploy --region nrt --yes
|
||||
$ fly secrets set COUCHDB_PASSWORD=dfusiuada9suy
|
||||
```
|
||||
|
||||
#### 4. Deploy
|
||||
|
||||
```
|
||||
$ flyctl deploy
|
||||
An existing fly.toml file was found
|
||||
Using build strategies '[the "couchdb:latest" docker image]'. Remove [build] from fly.toml to force a rescan
|
||||
Creating app in /home/vorotamoroz/dev/obsidian-livesync/utils/flyio
|
||||
We're about to launch your app on Fly.io. Here's what you're getting:
|
||||
|
||||
Organization: vorotamoroz (fly launch defaults to the personal org)
|
||||
Name: billowing-dawn-6619 (specified on the command line)
|
||||
Region: Tokyo, Japan (specified on the command line)
|
||||
App Machines: shared-cpu-1x, 256MB RAM (specified on the command line)
|
||||
Postgres: <none> (not requested)
|
||||
Redis: <none> (not requested)
|
||||
|
||||
Created app 'billowing-dawn-6619' in organization 'personal'
|
||||
Admin URL: https://fly.io/apps/billowing-dawn-6619
|
||||
Hostname: billowing-dawn-6619.fly.dev
|
||||
Wrote config file fly.toml
|
||||
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
|
||||
Platform: machines
|
||||
✓ Configuration is valid
|
||||
Your app is ready! Deploy with `flyctl deploy`
|
||||
Secrets are staged for the first deployment
|
||||
==> Verifying app config
|
||||
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
|
||||
Platform: machines
|
||||
✓ Configuration is valid
|
||||
--> Verified app config
|
||||
==> Building image
|
||||
Searching for image 'couchdb:latest' remotely...
|
||||
image found: img_ox20prk63084j1zq
|
||||
|
||||
Watch your deployment at https://fly.io/apps/billowing-dawn-6619/monitoring
|
||||
|
||||
Provisioning ips for billowing-dawn-6619
|
||||
Dedicated ipv6: 2a09:8280:1::37:fde9
|
||||
Shared ipv4: 66.241.124.163
|
||||
Add a dedicated ipv4 with: fly ips allocate-v4
|
||||
|
||||
Creating a 1 GB volume named 'couchdata' for process group 'app'. Use 'fly vol extend' to increase its size
|
||||
This deployment will:
|
||||
* create 1 "app" machine
|
||||
|
||||
No machines in group app, launching a new machine
|
||||
|
||||
WARNING The app is not listening on the expected address and will not be reachable by fly-proxy.
|
||||
You can fix this by configuring your app to listen on the following addresses:
|
||||
- 0.0.0.0:5984
|
||||
Found these processes inside the machine with open listening sockets:
|
||||
PROCESS | ADDRESSES
|
||||
-----------------*---------------------------------------
|
||||
/.fly/hallpass | [fdaa:0:73b9:a7b:22e:3851:7f28:2]:22
|
||||
|
||||
Finished launching new machines
|
||||
|
||||
NOTE: The machines for [app] have services with 'auto_stop_machines = true' that will be stopped when idling
|
||||
|
||||
-------
|
||||
Checking DNS configuration for billowing-dawn-6619.fly.dev
|
||||
|
||||
Visit your newly deployed app at https://billowing-dawn-6619.fly.dev/
|
||||
```
|
||||
|
||||
#### 5. Apply CouchDB configuration
|
||||
|
||||
After the initial setup, CouchDB needs a few more customisations before it can be used from Self-hosted LiveSync. It can be configured in a browser or via HTTP REST APIs.

This section performs the setup using the REST API.
|
||||
|
||||
1. Prepare environment variables.
|
||||
|
||||
- Mac or Linux:
|
||||
|
||||
```bash
|
||||
export couchHost=https://billowing-dawn-6619.fly.dev
|
||||
export couchUser=campanella
|
||||
export couchPwd=dfusiuada9suy
|
||||
```
|
||||
|
||||
- Windows
|
||||
|
||||
```powershell
|
||||
set couchHost https://billowing-dawn-6619.fly.dev
|
||||
set couchUser campanella
|
||||
set couchPwd dfusiuada9suy
|
||||
$creds = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes("${couchUser}:${couchPwd}"))
|
||||
```
|
||||
|
||||
2. Perform cluster setup
|
||||
|
||||
- Mac or Linux
|
||||
|
||||
```bash
|
||||
curl -X POST "${couchHost}/_cluster_setup" -H "Content-Type: application/json" -d "{\"action\":\"enable_single_node\",\"username\":\"${couchUser}\",\"password\":\"${couchPwd}\",\"bind_address\":\"0.0.0.0\",\"port\":5984,\"singlenode\":true}" --user "${couchUser}:${couchPwd}"
|
||||
```
|
||||
|
||||
- Windows
|
||||
|
||||
```powershell
|
||||
iwr -UseBasicParsing -Method 'POST' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_cluster_setup" -Body "{""action"":""enable_single_node"",""username"":""${couchUser}"",""password"":""${couchPwd}"",""bind_address"":""0.0.0.0"",""port"":5984,""singlenode"":true}"
|
||||
```
|
||||
|
||||
Note: if the response code is not 200, we have to retry the request.
If you run the request several times and it still does not result in 200, something is wrong. Please report it.
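A minimal sketch of such a retry, assuming a Mac or Linux shell and the environment variables prepared above (the attempt count and sleep interval are arbitrary choices, not part of the official instructions):

```bash
# Repeat the cluster-setup request until it returns HTTP 200 (up to 5 attempts).
# Assumes couchHost, couchUser, and couchPwd are exported as above.
for attempt in 1 2 3 4 5; do
  code=$(curl -s -o /dev/null -w "%{http_code}" -X POST "${couchHost}/_cluster_setup" \
    -H "Content-Type: application/json" \
    -d "{\"action\":\"enable_single_node\",\"username\":\"${couchUser}\",\"password\":\"${couchPwd}\",\"bind_address\":\"0.0.0.0\",\"port\":5984,\"singlenode\":true}" \
    --user "${couchUser}:${couchPwd}")
  echo "attempt ${attempt}: HTTP ${code}"
  if [ "${code}" = "200" ]; then break; fi
  sleep 2
done
```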
|
||||
3. Configure parameters
|
||||
|
||||
- Mac or Linux
|
||||
|
||||
```bash
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/chttpd/require_valid_user" -H "Content-Type: application/json" -d '"true"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user" -H "Content-Type: application/json" -d '"true"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/httpd/WWW-Authenticate" -H "Content-Type: application/json" -d '"Basic realm=\"couchdb\""' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/httpd/enable_cors" -H "Content-Type: application/json" -d '"true"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/chttpd/enable_cors" -H "Content-Type: application/json" -d '"true"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/chttpd/max_http_request_size" -H "Content-Type: application/json" -d '"4294967296"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/couchdb/max_document_size" -H "Content-Type: application/json" -d '"50000000"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/cors/credentials" -H "Content-Type: application/json" -d '"true"' --user "${couchUser}:${couchPwd}"
|
||||
curl -X PUT "${couchHost}/_node/nonode@nohost/_config/cors/origins" -H "Content-Type: application/json" -d '"app://obsidian.md,capacitor://localhost,http://localhost"' --user "${couchUser}:${couchPwd}"
|
||||
```
|
||||
|
||||
- Windows
|
||||
|
||||
```powershell
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/chttpd/require_valid_user" -Body '"true"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user" -Body '"true"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/httpd/WWW-Authenticate" -Body '"Basic realm=\"couchdb\""'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/httpd/enable_cors" -Body '"true"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/chttpd/enable_cors" -Body '"true"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/chttpd/max_http_request_size" -Body '"4294967296"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/couchdb/max_document_size" -Body '"50000000"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/cors/credentials" -Body '"true"'
|
||||
iwr -UseBasicParsing -Method 'PUT' -ContentType 'application/json; charset=utf-8' -Headers @{ 'Authorization' = 'Basic ' + $creds } "${couchHost}/_node/nonode@nohost/_config/cors/origins" -Body '"app://obsidian.md,capacitor://localhost,http://localhost"'
|
||||
```
|
||||
|
||||
Note: each of these requests should likewise be repeated until it finishes with 200 (see the retry sketch above).
|
||||
|
||||
#### 6. Use it from Self-hosted LiveSync
|
||||
|
||||
Now CouchDB is ready to be used from Self-hosted LiveSync. We can use `https://billowing-dawn-6619.fly.dev` as the URI, `campanella` as the `Username`, and `dfusiuada9suy` as the `Password` in Self-hosted LiveSync. The `Database name` can be anything you want.
|
||||
Please refer to the [Minimal Setup of the Quick Setup](./quick_setup.md#2-minimal-setup).
|
||||
|
||||
## Delete the Instance
|
||||
|
||||
If you want to delete the CouchDB instance, you can do that in the [fly.io Dashboard](https://fly.io/dashboard/personal).
|
||||
|
||||
If you set things up with [B. Scripted Setup](#b-scripted-setup), you can use [delete-server.sh](../utils/readme.md#delete-serversh).
|
||||
@@ -1,92 +1,126 @@
|
||||
# Setup CouchDB to your server
|
||||
# Setup a CouchDB server
|
||||
|
||||
## Table of Contents
|
||||
|
||||
## Install CouchDB and access from a PC or Mac
|
||||
- [Setup a CouchDB server](#setup-a-couchdb-server)
|
||||
- [Table of Contents](#table-of-contents)
|
||||
- [1. Prepare CouchDB](#1-prepare-couchdb)
|
||||
- [A. Using Docker container](#a-using-docker-container)
|
||||
- [1. Prepare](#1-prepare)
|
||||
- [2. Run docker container](#2-run-docker-container)
|
||||
- [B. Install CouchDB directly](#b-install-couchdb-directly)
|
||||
- [2. Run couchdb-init.sh to initialise](#2-run-couchdb-initsh-to-initialise)
|
||||
- [3. Expose CouchDB to the Internet](#3-expose-couchdb-to-the-internet)
|
||||
- [4. Client Setup](#4-client-setup)
|
||||
- [1. Generate the setup URI on a desktop device or server](#1-generate-the-setup-uri-on-a-desktop-device-or-server)
|
||||
- [2. Setup Self-hosted LiveSync to Obsidian](#2-setup-self-hosted-livesync-to-obsidian)
|
||||
- [Manual setup information](#manual-setup-information)
|
||||
- [Setting up your domain](#setting-up-your-domain)
|
||||
- [Reverse Proxies](#reverse-proxies)
|
||||
- [Traefik](#traefik)
|
||||
---
|
||||
|
||||
The easiest way to set up CouchDB is to use the [docker image](https://hub.docker.com/_/couchdb).
|
||||
## 1. Prepare CouchDB
|
||||
### A. Using Docker container
|
||||
|
||||
However, some additional configuration is required in `local.ini` to use it from Self-hosted LiveSync, as below:
|
||||
#### 1. Prepare
|
||||
```bash
|
||||
|
||||
```
|
||||
[couchdb]
|
||||
single_node=true
|
||||
max_document_size = 50000000
|
||||
# Prepare environment variables.
|
||||
export hostname=localhost:5984
|
||||
export username=goojdasjdas #Please change as you like.
|
||||
export password=kpkdasdosakpdsa #Please change as you like
|
||||
|
||||
[chttpd]
|
||||
require_valid_user = true
|
||||
max_http_request_size = 4294967296
|
||||
|
||||
[chttpd_auth]
|
||||
require_valid_user = true
|
||||
authentication_redirect = /_utils/session.html
|
||||
|
||||
[httpd]
|
||||
WWW-Authenticate = Basic realm="couchdb"
|
||||
enable_cors = true
|
||||
|
||||
[cors]
|
||||
origins = app://obsidian.md,capacitor://localhost,http://localhost
|
||||
credentials = true
|
||||
headers = accept, authorization, content-type, origin, referer
|
||||
methods = GET, PUT, POST, HEAD, DELETE
|
||||
max_age = 3600
|
||||
# Prepare directories for saving data and configurations.
|
||||
mkdir couchdb-data
|
||||
mkdir couchdb-etc
|
||||
```
|
||||
|
||||
Make `local.ini` and run docker like this to launch CouchDB.
|
||||
#### 2. Run docker container
|
||||
|
||||
1. Boot Check.
|
||||
```
|
||||
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
|
||||
$ docker run --name couchdb-for-ols --rm -it -e COUCHDB_USER=${username} -e COUCHDB_PASSWORD=${password} -v ${PWD}/couchdb-data:/opt/couchdb/data -v ${PWD}/couchdb-etc:/opt/couchdb/etc/local.d -p 5984:5984 couchdb
|
||||
```
|
||||
*Remember to replace the path with the path to your local.ini*
|
||||
Note: At this time, the file owner of local.ini becomes 5984:5984. This is a limitation of the docker image; please change the owner before editing local.ini again.
If your container has exited, please check the permissions of couchdb-data and couchdb-etc.
Once CouchDB has run, these directories will be owned by uid `5984`. Please chown them back to yourself.
|
||||
|
||||
Once you have confirmed that Self-hosted LiveSync can sync with the server, launch the docker image in the background as you like.
|
||||
|
||||
Example to run docker in detached mode:
|
||||
2. Enable it in background
|
||||
```
|
||||
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
|
||||
$ docker run --name couchdb-for-ols -d --restart always -e COUCHDB_USER=${username} -e COUCHDB_PASSWORD=${password} -v ${PWD}/couchdb-data:/opt/couchdb/data -v ${PWD}/couchdb-etc:/opt/couchdb/etc/local.d -p 5984:5984 couchdb
|
||||
```
|
||||
*Remember to replace the path with the path to your local.ini*
|
||||
### B. Install CouchDB directly
|
||||
Please refer to the [official document](https://docs.couchdb.org/en/stable/install/index.html). However, we do not have to configure it fully; only the administrator needs to be configured.
|
||||
|
||||
## Access from a mobile device
|
||||
If you want to access Self-hosted LiveSync from mobile devices, you need a valid SSL certificate.
|
||||
|
||||
### Testing from a mobile
|
||||
In the testing phase, [localhost.run](http://localhost.run/) or similar services are very useful.
|
||||
|
||||
Example using localhost.run:
|
||||
## 2. Run couchdb-init.sh to initialise
|
||||
```
|
||||
$ ssh -R 80:localhost:5984 nokey@localhost.run
|
||||
Warning: Permanently added the RSA host key for IP address '35.171.254.69' to the list of known hosts.
|
||||
|
||||
===============================================================================
|
||||
Welcome to localhost.run!
|
||||
|
||||
Follow your favourite reverse tunnel at [https://twitter.com/localhost_run].
|
||||
|
||||
**You need a SSH key to access this service.**
|
||||
If you get a permission denied follow Gitlab's most excellent howto:
|
||||
https://docs.gitlab.com/ee/ssh/
|
||||
*Only rsa and ed25519 keys are supported*
|
||||
|
||||
To set up and manage custom domains go to https://admin.localhost.run/
|
||||
|
||||
More details on custom domains (and how to enable subdomains of your custom
|
||||
domain) at https://localhost.run/docs/custom-domains
|
||||
|
||||
To explore using localhost.run visit the documentation site:
|
||||
https://localhost.run/docs/
|
||||
|
||||
===============================================================================
|
||||
|
||||
|
||||
** your connection id is xxxxxxxxxxxxxxxxxxxxxxxxxxxx, please mention it if you send me a message about an issue. **
|
||||
|
||||
xxxxxxxx.localhost.run tunneled with tls termination, https://xxxxxxxx.localhost.run
|
||||
Connection to localhost.run closed by remote host.
|
||||
Connection to localhost.run closed.
|
||||
$ curl -s https://raw.githubusercontent.com/vrtmrz/obsidian-livesync/main/utils/couchdb/couchdb-init.sh | bash
|
||||
```
|
||||
|
||||
https://xxxxxxxx.localhost.run is the temporary server address.
|
||||
If the result looks like the following:
|
||||
```
|
||||
-- Configuring CouchDB by REST APIs... -->
|
||||
{"ok":true}
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
""
|
||||
<-- Configuring CouchDB by REST APIs Done!
|
||||
```
|
||||
|
||||
Your CouchDB has been initialised successfully. If you want to do this manually, please read the script.
|
||||
|
||||
## 3. Expose CouchDB to the Internet
|
||||
|
||||
- You can skip this step if you are using CouchDB only on an intranet and only with desktop devices.
- For mobile devices, Obsidian requires a valid SSL certificate. Usually, this means exposing the server to the internet.

We can use whatever solution works. For simplicity, the following sample uses Cloudflare Zero Trust for testing.
|
||||
|
||||
```
|
||||
$ cloudflared tunnel --url http://localhost:5984
|
||||
2024-02-14T10:35:25Z INF Thank you for trying Cloudflare Tunnel. Doing so, without a Cloudflare account, is a quick way to experiment and try it out. However, be aware that these account-less Tunnels have no uptime guarantee. If you intend to use Tunnels in production you should use a pre-created named tunnel by following: https://developers.cloudflare.com/cloudflare-one/connections/connect-apps
|
||||
2024-02-14T10:35:25Z INF Requesting new quick Tunnel on trycloudflare.com...
|
||||
2024-02-14T10:35:26Z INF +--------------------------------------------------------------------------------------------+
|
||||
2024-02-14T10:35:26Z INF | Your quick Tunnel has been created! Visit it at (it may take some time to be reachable): |
|
||||
2024-02-14T10:35:26Z INF | https://tiles-photograph-routine-groundwater.trycloudflare.com |
|
||||
2024-02-14T10:35:26Z INF +--------------------------------------------------------------------------------------------+
|
||||
:
|
||||
:
|
||||
:
|
||||
```
|
||||
Now `https://tiles-photograph-routine-groundwater.trycloudflare.com` is our server. Please push it to the background for now.
|
||||
|
||||
|
||||
## 4. Client Setup
|
||||
> [!TIP]
|
||||
> Manual configuration is now not recommended, for several reasons. However, if you want to do so, please use the `Setup wizard`; the recommended extra configurations will also be set.
|
||||
|
||||
### 1. Generate the setup URI on a desktop device or server
|
||||
```bash
|
||||
$ export hostname=https://tiles-photograph-routine-groundwater.trycloudflare.com #Point to your vault
|
||||
$ export database=obsidiannotes #Please change as you like
|
||||
$ export passphrase=dfsapkdjaskdjasdas #Please change as you like
|
||||
$ deno run -A https://raw.githubusercontent.com/vrtmrz/obsidian-livesync/main/utils/flyio/generate_setupuri.ts
|
||||
obsidian://setuplivesync?settings=%5B%22tm2DpsOE74nJAryprZO2M93wF%2Fvg.......4b26ed33230729%22%5D
|
||||
```
|
||||
|
||||
### 2. Setup Self-hosted LiveSync to Obsidian
|
||||
[This video](https://youtu.be/7sa_I1832Xc?t=146) may help us.
|
||||
1. Install Self-hosted LiveSync
|
||||
2. Choose `Use the copied setup URI` from the command palette and paste the setup URI. (obsidian://setuplivesync?settings=.....).
|
||||
3. Type `welcome` as the setup URI passphrase.
|
||||
4. Answer `yes` and `Set it up...`, and finish the first dialogue with `Keep them disabled`.
|
||||
5. `Reload app without save` once.
|
||||
|
||||
---
|
||||
|
||||
## Manual setup information
|
||||
|
||||
### Setting up your domain
|
||||
|
||||
@@ -94,6 +128,77 @@ Set the A record of your domain to point to your server, and host reverse proxy
|
||||
Note: Mounting CouchDB on the top directory is not recommended.
Using Caddy is a handy way to serve the server with SSL automatically.
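As a minimal sketch (not the published configuration): assuming Caddy v2 is installed and a DNS record already points the placeholder domain `couchdb.example.org` at this host, a reverse-proxy site can be written and served with automatic HTTPS like this:

```bash
# Write a Caddyfile that proxies the placeholder domain to local CouchDB;
# Caddy obtains and renews the TLS certificate automatically.
cat > Caddyfile <<'EOF'
couchdb.example.org {
    reverse_proxy localhost:5984
}
EOF
caddy run --config ./Caddyfile
```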
|
||||
I have published [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) that launch Caddy and CouchDB at once. Please try it out.
|
||||
I have published [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) that launch Caddy and CouchDB at once. If you are using Traefik, you can check the [Reverse Proxies](#reverse-proxies) section below.
|
||||
|
||||
Also, be sure to check the server logs and be careful of malicious access.
|
||||
|
||||
|
||||
## Reverse Proxies
|
||||
|
||||
### Traefik
|
||||
|
||||
If you are using Traefik, this [docker-compose.yml](https://github.com/vrtmrz/obsidian-livesync/blob/main/docker-compose.traefik.yml) file (also pasted below) has all the right CORS parameters set. It assumes you have an external network called `proxy`.
|
||||
|
||||
```yaml
|
||||
version: "2.1"
services:
  couchdb:
    image: couchdb:latest
    container_name: obsidian-livesync
    user: 1000:1000
    environment:
      - COUCHDB_USER=username
      - COUCHDB_PASSWORD=password
    volumes:
      - ./data:/opt/couchdb/data
      - ./local.ini:/opt/couchdb/etc/local.ini
    # Ports not needed when already passed to Traefik
    #ports:
    #  - 5984:5984
    restart: unless-stopped
    networks:
      - proxy
    labels:
      - "traefik.enable=true"
      # The Traefik Network
      - "traefik.docker.network=proxy"
      # Don't forget to replace 'obsidian-livesync.example.org' with your own domain
      - "traefik.http.routers.obsidian-livesync.rule=Host(`obsidian-livesync.example.org`)"
      # The 'websecure' entryPoint is basically your HTTPS entrypoint. Check the next code snippet only if you are encountering problems; you probably have a working Traefik configuration if this is not the first container you are reverse proxying.
      - "traefik.http.routers.obsidian-livesync.entrypoints=websecure"
      - "traefik.http.routers.obsidian-livesync.service=obsidian-livesync"
      - "traefik.http.services.obsidian-livesync.loadbalancer.server.port=5984"
      - "traefik.http.routers.obsidian-livesync.tls=true"
      # Replace the string 'letsencrypt' with your own certificate resolver
      - "traefik.http.routers.obsidian-livesync.tls.certresolver=letsencrypt"
      - "traefik.http.routers.obsidian-livesync.middlewares=obsidiancors"
      # The part needed for CORS to work on Traefik 2.x starts here
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowmethods=GET,PUT,POST,HEAD,DELETE"
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowheaders=accept,authorization,content-type,origin,referer"
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolalloworiginlist=app://obsidian.md,capacitor://localhost,http://localhost"
      - "traefik.http.middlewares.obsidiancors.headers.accesscontrolmaxage=3600"
      - "traefik.http.middlewares.obsidiancors.headers.addvaryheader=true"
      - "traefik.http.middlewares.obsidiancors.headers.accessControlAllowCredentials=true"

networks:
  proxy:
    external: true
|
||||
```
|
||||
|
||||
Partial `traefik.yml` config file mentioned above:
|
||||
```yml
|
||||
...

entryPoints:
  web:
    address: ":80"
    http:
      redirections:
        entryPoint:
          to: "websecure"
          scheme: "https"
  websecure:
    address: ":443"

...
|
||||
```
|
||||
@@ -1,8 +1,19 @@
|
||||
# Setup CouchDB on your own server

## Table of Contents
- [Configure CouchDB](#configure-couchdb)
- [Run CouchDB](#run-couchdb)
- [Docker CLI](#docker-cli)
- [Docker Compose](#docker-compose)
- [Create a database](#create-a-database)
- [Access from a mobile device](#access-from-a-mobile-device)
- [Testing from a mobile device](#testing-from-a-mobile-device)
- [Setting up your domain](#setting-up-your-domain)
---

> Note: [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) are provided which launch Caddy and CouchDB at once. Using that docker-compose configuration directly is recommended. (If you use it, please consult the documentation in that link rather than this document.)

## Install CouchDB and access it from a PC or Mac
## Configure CouchDB

The easiest way to set up CouchDB is to use the [CouchDB docker image](https://hub.docker.com/_/couchdb).
@@ -33,17 +44,62 @@ methods = GET, PUT, POST, HEAD, DELETE
|
||||
max_age = 3600
```

Create `local.ini` and start CouchDB with the following command:
```
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v .local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
Note: at this point the file owner of local.ini becomes 5984:5984. This is a limitation of the docker image; please change the file owner before editing local.ini again.
## Run CouchDB

After confirming that Self-hosted LiveSync can synchronise with the server, you can start the docker image in the background:
### Docker CLI

You can run CouchDB with the `local.ini` configuration specified:

```
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v .local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
*Remember to replace the local.ini mount path in the command above with its actual location*

Run in the background:
```
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
*Remember to replace the local.ini mount path in the command above with its actual location*

### Docker Compose
Create a folder, put your `local.ini` in it, and create `docker-compose.yml` inside the folder. Make sure `local.ini` is readable and writable, and that the `data` folder can be created once the container runs. The folder structure looks roughly like this:
```
obsidian-livesync
├── docker-compose.yml
└── local.ini
```

Edit `docker-compose.yml` with reference to the following:
```yaml
version: "2.1"
services:
  couchdb:
    image: couchdb
    container_name: obsidian-livesync
    user: 1000:1000
    environment:
      - COUCHDB_USER=admin
      - COUCHDB_PASSWORD=password
    volumes:
      - ./data:/opt/couchdb/data
      - ./local.ini:/opt/couchdb/etc/local.ini
    ports:
      - 5984:5984
    restart: unless-stopped
```

Finally, create and start the container:
```
# -d will launch detached so the container runs in background
docker-compose up -d
```

## Create a database

After CouchDB has been deployed successfully, a database needs to be created manually so that the plugin can connect and synchronise.

1. Visit `http://localhost:5984/_utils` and enter the username and password to open the admin console
2. Click Create Database and create a database named as you like
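If the admin console is inconvenient, the same thing can be done over CouchDB's REST API; a minimal sketch, where the credentials and the database name `obsidiannotes` are placeholders:

```bash
# Create a database via the REST API; a response of {"ok":true} indicates success.
curl -X PUT "http://localhost:5984/obsidiannotes" --user "admin:password"
```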
## Access from a mobile device
If you want to access Self-hosted LiveSync from a mobile device, you need a valid SSL certificate.
|
||||
|
||||
@@ -12,10 +12,10 @@ max_document_size = 50000000
|
||||
|
||||
[chttpd]
|
||||
require_valid_user = true
|
||||
max_http_request_size = 4294967296
|
||||
|
||||
[chttpd_auth]
|
||||
require_valid_user = true
|
||||
max_http_request_size = 4294967296
|
||||
authentication_redirect = /_utils/session.html
|
||||
|
||||
[httpd]
|
||||
|
||||
117
docs/troubleshooting.md
Normal file
@@ -0,0 +1,117 @@
|
||||
<!-- 2024-02-15 -->
|
||||
# Tips and Troubleshooting
|
||||
|
||||
|
||||
- [Tips and Troubleshooting](#tips-and-troubleshooting)
|
||||
- [Notable bugs and fixes](#notable-bugs-and-fixes)
|
||||
- [Binary files get bigger on iOS](#binary-files-get-bigger-on-ios)
|
||||
- [Some setting names have been changed](#some-setting-names-have-been-changed)
|
||||
- [FAQ](#faq)
|
||||
- [Why is `Use an old adapter for compatibility` enabled in my vault?](#why-is-use-an-old-adapter-for-compatibility-enabled-in-my-vault)
|
||||
- [ZIP (or other extension) files are not synchronised. Why?](#zip-or-other-extension-files-are-not-synchronised-why)
|
||||
- [I want to report an issue, and you said you need a `Report`. How do I make one?](#i-want-to-report-an-issue-and-you-said-you-need-a-report-how-do-i-make-one)
|
||||
- [Where can I check the log?](#where-can-i-check-the-log)
|
||||
- [Why are the logs volatile and ephemeral?](#why-are-the-logs-volatile-and-ephemeral)
|
||||
- [Some network logs are not written into the file.](#some-network-logs-are-not-written-into-the-file)
|
||||
- [If a file is deleted or trimmed, the database size should be reduced, right?](#if-a-file-is-deleted-or-trimmed-the-database-size-should-be-reduced-right)
|
||||
- [Troubleshooting](#troubleshooting)
|
||||
- [On a mobile device, I cannot synchronise over the local network!](#on-a-mobile-device-i-cannot-synchronise-over-the-local-network)
|
||||
- [I think something bad is happening in the vault...](#i-think-something-bad-is-happening-in-the-vault)
|
||||
- [Tips](#tips)
|
||||
- [Old tips](#old-tips)
|
||||
|
||||
<!-- - -->
|
||||
|
||||
|
||||
## Notable bugs and fixes
|
||||
### Binary files get bigger on iOS
|
||||
- Reported at: v0.20.x
|
||||
- Fixed at: v0.21.2 (fixed but not reviewed)
- Required action: larger files will not be fixed automatically; please perform `Verify and repair all files`. If the local database and storage do not match, we will be asked which one to apply.
|
||||
|
||||
### Some setting names have been changed
|
||||
- Fixed at: v0.22.6
|
||||
|
||||
| Previous name | New name |
|
||||
| ---------------------------- | ---------------------------------------- |
|
||||
| Open setup URI | Use the copied setup URI |
|
||||
| Copy setup URI | Copy current settings as a new setup URI |
|
||||
| Setup Wizard | Minimal Setup |
|
||||
| Check database configuration | Check and Fix database configuration |
|
||||
|
||||
## FAQ
|
||||
|
||||
### Why is `Use an old adapter for compatibility` enabled in my vault?
|
||||
|
||||
Because you are a compassionate and experienced user. Before v0.17.16, we used an old adapter for the local database. At that time, the current default adapter was not yet stable.
The new adapter has better performance and new features such as purging. Therefore, we should use the new adapter, and it is now the default.
|
||||
|
||||
However, switching from the old adapter to the new one requires converting or rebuilding the local database, which takes some time. It was a long time ago now, but we once inconvenienced everyone in a hurry when we changed the format of our database.
For these reasons, this toggle is automatically enabled if we have upgraded from a vault which used the old adapter.
|
||||
|
||||
When you rebuild everything or fetch from the remote again, you will be asked to switch this.
|
||||
|
||||
Therefore, experienced users (especially those whose setups are stable enough not to require rebuilding the database) may have this toggle enabled in their vault.
Please disable it when you have enough time.
|
||||
|
||||
### ZIP (or other extension) files are not synchronised. Why?
It depends on what Obsidian detects. Toggling `Detect all extensions` under `Files and links` (an Obsidian setting) may help.
|
||||
|
||||
### I want to report an issue, and you said you need a `Report`. How do I make one?
We can copy the report to the clipboard by pressing the `Make report` button on the `Hatch` pane.
|
||||

|
||||
|
||||
### Where can I check the log?
|
||||
We can open the log pane via `Show log` in the command palette.
If something has been troubling you, please enable `Verbose Log` in the `General Settings` pane.
|
||||
|
||||
However, the logs are not kept for long and are cleared on restart. If you want to inspect the logs, please enable `Write logs into the file` temporarily.
|
||||
|
||||

|
||||
|
||||
> [!IMPORTANT]
|
||||
> - Writing logs into the file will impact the performance.
|
||||
> - Please make sure that you have erased all your confidential information before reporting an issue.
|
||||
|
||||
### Why are the logs volatile and ephemeral?
|
||||
To avoid unexpected exposure of our confidential information.
|
||||
|
||||
### Some network logs are not written into the file.
|
||||
In particular, CORS errors are reported to the plug-in as general errors for security reasons, so we cannot detect and log them.
|
||||
|
||||
### If a file is deleted or trimmed, the database size should be reduced, right?
No. Even if files are deleted, their chunks are not deleted.
Self-hosted LiveSync splits files into multiple chunks and transfers only newly created ones. This behaviour keeps traffic low, and chunks are shared between files to reduce the total usage of the database.
|
||||
|
||||
One more thing: we can resolve conflicts on any device, even if they happened on other devices. This means that conflicts can surface from the past, after the point at which we have synchronised. Hence we cannot collect and delete unused chunks, even if they are not currently referenced.
|
||||
|
||||
To shrink the database size, only `Rebuild everything` is reliable and effective. But do not worry: if we have synchronised well, we still have the actual, real files. It only takes a bit of time and traffic.
|
||||
|
||||
<!-- Add here -->
|
||||
|
||||
## Troubleshooting
|
||||
<!-- Add here -->
|
||||
|
||||
### On a mobile device, I cannot synchronise over the local network!
Obsidian on mobile cannot connect to non-secure endpoints, such as those starting with `http://`. Double-check your CouchDB URI. A self-signed certificate also cannot be used.

### I think something bad is happening in the vault...
Place `redflag.md` at the top of the vault and restart Obsidian. The simplest way is to create a new note and rename it to `redflag`. Of course, we can also create the file outside Obsidian.

If there is a `redflag.md`, Self-hosted LiveSync suspends all database and storage processes.
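For example, a minimal sketch of creating it from a shell, outside Obsidian (the vault path is a placeholder):

```bash
# Place redflag.md at the root of the vault; Self-hosted LiveSync will
# suspend database and storage processing on the next launch.
touch "/path/to/your/vault/redflag.md"
```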
|
||||
## Tips
|
||||
<!-- Add here -->
|
||||
|
||||
### Old tips
|
||||
- Rarely, a file in the database could be corrupted. The plugin will not write to local storage when a file looks corrupted. If a local version of the file is on your device, the corruption can be fixed by editing the local file and synchronising it. But if the file does not exist on any of your devices, then it cannot be rescued. In this case, you can delete these items from the settings dialog.
|
||||
- To stop the boot-up sequence (eg. for fixing problems on databases), you can put a `redflag.md` file (or directory) at the root of your vault.
|
||||
Tip for iOS: a redflag directory can be created at the root of the vault using the File application.
|
||||
- Also, with `redflag2.md` placed, we can automatically rebuild both the local and the remote databases during the boot-up sequence. With `redflag3.md`, we can discard only the local database and fetch from the remote again.
|
||||
- Q: The database is growing, how can I shrink it down?
|
||||
A: Each document is saved together with its past 100 revisions for detecting and resolving conflicts. Picture one device being offline for a while and then coming online again: the device has to compare its notes with the remotely saved ones. If there is a historic revision in which the note was identical, it can be updated safely (like a git fast-forward). Even if there is not, we only have to check the differences after the latest revision that both devices have in common. This is like git's conflict-resolving method. So, we have to rebuild the database, as with an overgrown git repo, if you want to solve the root of the problem.
|
||||
- More technical information is in the [Technical Information](tech_info.md)
|
||||
- If you want to synchronize files without obsidian, you can use [filesystem-livesync](https://github.com/vrtmrz/filesystem-livesync).
|
||||
- A WebClipper is also available on the Chrome Web Store: [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)
|
||||
|
||||
Repo is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). (Docs are a work in progress.)
|
||||
@@ -1,46 +1,165 @@
|
||||
//@ts-check
|
||||
|
||||
import esbuild from "esbuild";
|
||||
import process from "process";
|
||||
import builtins from "builtin-modules";
|
||||
import sveltePlugin from "esbuild-svelte";
|
||||
import sveltePreprocess from "svelte-preprocess";
|
||||
import fs from "node:fs";
|
||||
// import terser from "terser";
|
||||
import { minify } from "terser";
|
||||
const banner = `/*
|
||||
THIS IS A GENERATED/BUNDLED FILE BY ESBUILD
|
||||
THIS IS A GENERATED/BUNDLED FILE BY ESBUILD AND TERSER
|
||||
if you want to view the source, please visit the github repository of this plugin
|
||||
*/
|
||||
`;
|
||||
|
||||
|
||||
const prod = process.argv[2] === "production";
|
||||
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json"));
|
||||
const packageJson = JSON.parse(fs.readFileSync("./package.json"));
|
||||
|
||||
const terserOpt = {
|
||||
sourceMap: (!prod ? {
|
||||
url: "inline"
|
||||
} : {}),
|
||||
format: {
|
||||
indent_level: 2,
|
||||
beautify: true,
|
||||
comments: "some",
|
||||
ecma: 2018,
|
||||
preamble: banner,
|
||||
webkit: true
|
||||
},
|
||||
parse: {
|
||||
// parse options
|
||||
},
|
||||
compress: {
|
||||
// compress options
|
||||
defaults: false,
|
||||
evaluate: true,
|
||||
inline: 3,
|
||||
join_vars: true,
|
||||
loops: true,
|
||||
passes: prod ? 4 : 1,
|
||||
reduce_vars: true,
|
||||
reduce_funcs: true,
|
||||
arrows: true,
|
||||
collapse_vars: true,
|
||||
comparisons: true,
|
||||
lhs_constants: true,
|
||||
hoist_props: true,
|
||||
side_effects: true,
|
||||
if_return: true,
|
||||
ecma: 2018,
|
||||
unused: true,
|
||||
},
|
||||
// mangle: {
|
||||
// // mangle options
|
||||
// keep_classnames: true,
|
||||
// keep_fnames: true,
|
||||
|
||||
// properties: {
|
||||
// // mangle property options
|
||||
|
||||
// }
|
||||
// },
|
||||
|
||||
ecma: 2018, // specify one of: 5, 2015, 2016, etc.
|
||||
enclose: false, // or specify true, or "args:values"
|
||||
keep_classnames: true,
|
||||
keep_fnames: true,
|
||||
ie8: false,
|
||||
module: false,
|
||||
// nameCache: null, // or specify a name cache object
|
||||
safari10: false,
|
||||
toplevel: false
|
||||
}
|
||||
|
||||
|
||||
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json") + "");
|
||||
const packageJson = JSON.parse(fs.readFileSync("./package.json") + "");
|
||||
const updateInfo = JSON.stringify(fs.readFileSync("./updates.md") + "");
|
||||
esbuild
|
||||
.build({
|
||||
banner: {
|
||||
js: banner,
|
||||
},
|
||||
entryPoints: ["src/main.ts"],
|
||||
bundle: true,
|
||||
define: {
|
||||
"MANIFEST_VERSION": `"${manifestJson.version}"`,
|
||||
"PACKAGE_VERSION": `"${packageJson.version}"`,
|
||||
"UPDATE_INFO": `${updateInfo}`,
|
||||
"global":"window",
|
||||
},
|
||||
external: ["obsidian", "electron", "crypto"],
|
||||
format: "cjs",
|
||||
watch: !prod,
|
||||
target: "es2018",
|
||||
logLevel: "info",
|
||||
sourcemap: prod ? false : "inline",
|
||||
treeShaking: true,
|
||||
platform: "browser",
|
||||
plugins: [
|
||||
sveltePlugin({
|
||||
preprocess: sveltePreprocess(),
|
||||
compilerOptions: { css: true },
|
||||
}),
|
||||
],
|
||||
outfile: "main.js",
|
||||
})
|
||||
.catch(() => process.exit(1));
|
||||
|
||||
/** @type esbuild.Plugin[] */
|
||||
const plugins = [{
|
||||
name: 'my-plugin',
|
||||
setup(build) {
|
||||
let count = 0;
|
||||
build.onEnd(async result => {
|
||||
if (count++ === 0) {
|
||||
console.log('first build:', result);
|
||||
|
||||
} else {
|
||||
console.log('subsequent build:');
|
||||
}
|
||||
if (prod) {
|
||||
console.log("Performing terser");
|
||||
const src = fs.readFileSync("./main_org.js").toString();
|
||||
// @ts-ignore
|
||||
const ret = await minify(src, terserOpt);
|
||||
if (ret && ret.code) {
|
||||
fs.writeFileSync("./main.js", ret.code);
|
||||
}
|
||||
console.log("Finished terser");
|
||||
} else {
|
||||
fs.copyFileSync("./main_org.js", "./main.js");
|
||||
}
|
||||
});
|
||||
},
|
||||
}];
|
||||
|
||||
const context = await esbuild.context({
|
||||
banner: {
|
||||
js: banner,
|
||||
},
|
||||
entryPoints: ["src/main.ts"],
|
||||
bundle: true,
|
||||
define: {
|
||||
"MANIFEST_VERSION": `"${manifestJson.version}"`,
|
||||
"PACKAGE_VERSION": `"${packageJson.version}"`,
|
||||
"UPDATE_INFO": `${updateInfo}`,
|
||||
"global": "window",
|
||||
},
|
||||
external: [
|
||||
"obsidian",
|
||||
"electron",
|
||||
"crypto",
|
||||
"@codemirror/autocomplete",
|
||||
"@codemirror/collab",
|
||||
"@codemirror/commands",
|
||||
"@codemirror/language",
|
||||
"@codemirror/lint",
|
||||
"@codemirror/search",
|
||||
"@codemirror/state",
|
||||
"@codemirror/view",
|
||||
"@lezer/common",
|
||||
"@lezer/highlight",
|
||||
"@lezer/lr"],
|
||||
// minifyWhitespace: true,
|
||||
format: "cjs",
|
||||
target: "es2018",
|
||||
logLevel: "info",
|
||||
platform: "browser",
|
||||
sourcemap: prod ? false : "inline",
|
||||
treeShaking: true,
|
||||
outfile: "main_org.js",
|
||||
minifyWhitespace: false,
|
||||
minifySyntax: false,
|
||||
minifyIdentifiers: false,
|
||||
minify: false,
|
||||
|
||||
// keepNames: true,
|
||||
plugins: [
|
||||
sveltePlugin({
|
||||
preprocess: sveltePreprocess(),
|
||||
compilerOptions: { css: true, preserveComments: true },
|
||||
}),
|
||||
...plugins
|
||||
],
|
||||
})
|
||||
|
||||
if (prod) {
|
||||
await context.rebuild();
|
||||
process.exit(0);
|
||||
} else {
|
||||
await context.watch();
|
||||
}
|
||||
|
||||
BIN
images/hatch.png
Normal file
|
After Width: | Height: | Size: 13 KiB |
|
Before Width: | Height: | Size: 39 KiB After Width: | Height: | Size: 20 KiB |
|
Before Width: | Height: | Size: 16 KiB After Width: | Height: | Size: 6.2 KiB |
BIN
images/quick_setup_3b.png
Normal file
|
After Width: | Height: | Size: 74 KiB |
|
Before Width: | Height: | Size: 31 KiB After Width: | Height: | Size: 20 KiB |
|
Before Width: | Height: | Size: 36 KiB After Width: | Height: | Size: 5.7 KiB |
|
Before Width: | Height: | Size: 10 KiB |
|
Before Width: | Height: | Size: 35 KiB After Width: | Height: | Size: 6.4 KiB |
BIN
images/write_logs_into_the_file.png
Normal file
|
After Width: | Height: | Size: 17 KiB |
@@ -1,10 +1,10 @@
|
||||
{
|
||||
"id": "obsidian-livesync",
|
||||
"name": "Self-hosted LiveSync",
|
||||
"version": "0.19.4",
|
||||
"version": "0.23.6",
|
||||
"minAppVersion": "0.9.12",
|
||||
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
|
||||
"author": "vorotamoroz",
|
||||
"authorUrl": "https://github.com/vrtmrz",
|
||||
"isDesktopOnly": false
|
||||
}
|
||||
}
|
||||
|
||||
7348
package-lock.json
generated
60
package.json
@@ -1,6 +1,6 @@
|
||||
{
|
||||
"name": "obsidian-livesync",
|
||||
"version": "0.19.4",
|
||||
"version": "0.23.6",
|
||||
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
|
||||
"main": "main.js",
|
||||
"type": "module",
|
||||
@@ -13,40 +13,56 @@
|
||||
"author": "vorotamoroz",
|
||||
"license": "MIT",
|
||||
"devDependencies": {
|
||||
"@types/diff-match-patch": "^1.0.32",
|
||||
"@types/pouchdb": "^6.4.0",
|
||||
"@types/pouchdb-browser": "^6.1.3",
|
||||
"@typescript-eslint/eslint-plugin": "^5.54.0",
|
||||
"@typescript-eslint/parser": "^5.54.0",
|
||||
"@tsconfig/svelte": "^5.0.2",
|
||||
"@types/diff-match-patch": "^1.0.36",
|
||||
"@types/node": "^20.11.28",
|
||||
"@types/pouchdb": "^6.4.2",
|
||||
"@types/pouchdb-adapter-http": "^6.1.6",
|
||||
"@types/pouchdb-adapter-idb": "^6.1.7",
|
||||
"@types/pouchdb-browser": "^6.1.5",
|
||||
"@types/pouchdb-core": "^7.0.14",
|
||||
"@types/pouchdb-mapreduce": "^6.1.10",
|
||||
"@types/pouchdb-replication": "^6.4.7",
|
||||
"@types/transform-pouch": "^1.0.6",
|
||||
"@typescript-eslint/eslint-plugin": "^7.2.0",
|
||||
"@typescript-eslint/parser": "^7.2.0",
|
||||
"builtin-modules": "^3.3.0",
|
||||
"esbuild": "0.15.15",
|
||||
"esbuild-svelte": "^0.7.3",
|
||||
"eslint": "^8.35.0",
|
||||
"esbuild": "0.20.2",
|
||||
"esbuild-svelte": "^0.8.0",
|
||||
"eslint": "^8.57.0",
|
||||
"eslint-config-airbnb-base": "^15.0.0",
|
||||
"eslint-plugin-import": "^2.27.5",
|
||||
"eslint-plugin-import": "^2.29.1",
|
||||
"events": "^3.3.0",
|
||||
"obsidian": "^1.1.1",
|
||||
"postcss": "^8.4.21",
|
||||
"postcss-load-config": "^4.0.1",
|
||||
"obsidian": "^1.5.7",
|
||||
"postcss": "^8.4.35",
|
||||
"postcss-load-config": "^5.0.3",
|
||||
"pouchdb-adapter-http": "^8.0.1",
|
||||
"pouchdb-adapter-idb": "^8.0.1",
|
||||
"pouchdb-adapter-indexeddb": "^8.0.1",
|
||||
"pouchdb-core": "^8.0.1",
|
||||
"pouchdb-errors": "^8.0.1",
|
||||
"pouchdb-find": "^8.0.1",
|
||||
"pouchdb-mapreduce": "^8.0.1",
|
||||
"pouchdb-merge": "^8.0.1",
|
||||
"pouchdb-replication": "^8.0.1",
|
||||
"pouchdb-utils": "^8.0.1",
|
||||
"svelte": "^3.55.1",
|
||||
"svelte-preprocess": "^5.0.1",
|
||||
"svelte": "^4.2.12",
|
||||
"svelte-preprocess": "^5.1.3",
|
||||
"terser": "^5.29.2",
|
||||
"transform-pouch": "^2.0.0",
|
||||
"tslib": "^2.5.0",
|
||||
"typescript": "^4.9.5"
|
||||
"tslib": "^2.6.2",
|
||||
"typescript": "^5.4.2"
|
||||
},
|
||||
"dependencies": {
|
||||
"@aws-sdk/client-s3": "^3.556.0",
|
||||
"@smithy/fetch-http-handler": "^2.5.0",
|
||||
"@smithy/protocol-http": "^3.3.0",
|
||||
"@smithy/querystring-builder": "^2.2.0",
|
||||
"diff-match-patch": "^1.0.5",
|
||||
"esbuild": "0.15.15",
|
||||
"esbuild-svelte": "^0.7.3",
|
||||
"idb": "^7.1.1",
|
||||
"xxhash-wasm": "^0.4.2"
|
||||
"fflate": "^0.8.2",
|
||||
"idb": "^8.0.0",
|
||||
"minimatch": "^9.0.3",
|
||||
"xxhash-wasm": "0.4.2",
|
||||
"xxhash-wasm-102": "npm:xxhash-wasm@^1.0.2"
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
151
setup-flyio-on-the-fly-v2.ipynb
Normal file
@@ -0,0 +1,151 @@
|
||||
{
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 0,
|
||||
"metadata": {
|
||||
"colab": {
|
||||
"provenance": [],
|
||||
"private_outputs": true,
|
||||
"authorship_tag": "ABX9TyMexQ5pErH5LBG2tENtEVWf",
|
||||
"include_colab_link": true
|
||||
},
|
||||
"kernelspec": {
|
||||
"name": "python3",
|
||||
"display_name": "Python 3"
|
||||
},
|
||||
"language_info": {
|
||||
"name": "python"
|
||||
}
|
||||
},
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {
|
||||
"id": "view-in-github",
|
||||
"colab_type": "text"
|
||||
},
|
||||
"source": [
|
||||
"<a href=\"https://colab.research.google.com/gist/vrtmrz/9402b101746e08e969b1a4f5f0deb465/setup-flyio-on-the-fly-v2.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"- Initial version 7th Feb. 2024"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "AzLlAcLFRO5A"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"id": "z1x8DQpa9opC"
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Install prerequesties\n",
|
||||
"!curl -L https://fly.io/install.sh | sh\n",
|
||||
"!curl -fsSL https://deno.land/x/install/install.sh | sh\n",
|
||||
"!apt update && apt -y install jq\n",
|
||||
"import os\n",
|
||||
"%env PATH=/root/.fly/bin:/root/.deno/bin/:{os.environ[\"PATH\"]}\n",
|
||||
"!git clone --recursive https://github.com/vrtmrz/obsidian-livesync"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"# Login up sign up\n",
|
||||
"!flyctl auth signup"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "mGN08BaFDviy"
|
||||
},
|
||||
"execution_count": null,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"Select a region and execute the block."
|
||||
],
|
||||
"metadata": {
|
||||
"id": "BBFTFOP6vA8m"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"# see https://fly.io/docs/reference/regions/\n",
|
||||
"region = \"nrt/Tokyo, Japan\" #@param [\"ams/Amsterdam, Netherlands\",\"arn/Stockholm, Sweden\",\"atl/Atlanta, Georgia (US)\",\"bog/Bogotá, Colombia\",\"bos/Boston, Massachusetts (US)\",\"cdg/Paris, France\",\"den/Denver, Colorado (US)\",\"dfw/Dallas, Texas (US)\",\"ewr/Secaucus, NJ (US)\",\"eze/Ezeiza, Argentina\",\"gdl/Guadalajara, Mexico\",\"gig/Rio de Janeiro, Brazil\",\"gru/Sao Paulo, Brazil\",\"hkg/Hong Kong, Hong Kong\",\"iad/Ashburn, Virginia (US)\",\"jnb/Johannesburg, South Africa\",\"lax/Los Angeles, California (US)\",\"lhr/London, United Kingdom\",\"mad/Madrid, Spain\",\"mia/Miami, Florida (US)\",\"nrt/Tokyo, Japan\",\"ord/Chicago, Illinois (US)\",\"otp/Bucharest, Romania\",\"phx/Phoenix, Arizona (US)\",\"qro/Querétaro, Mexico\",\"scl/Santiago, Chile\",\"sea/Seattle, Washington (US)\",\"sin/Singapore, Singapore\",\"sjc/San Jose, California (US)\",\"syd/Sydney, Australia\",\"waw/Warsaw, Poland\",\"yul/Montreal, Canada\",\"yyz/Toronto, Canada\" ] {allow-input: true}\n",
|
||||
"%env region={region.split(\"/\")[0]}\n",
|
||||
"#%env appame=\n",
|
||||
"#%env username=\n",
|
||||
"#%env password=\n",
|
||||
"#%env database=\n",
|
||||
"#%env passphrase=\n",
|
||||
"\n",
|
||||
"# automatic setup leave it -->\n",
|
||||
"%cd obsidian-livesync/utils/flyio\n",
|
||||
"!./deploy-server.sh | tee deploy-result.txt\n",
|
||||
"\n",
|
||||
"## Show result button\n",
|
||||
"from IPython.display import HTML\n",
|
||||
"last_line=\"\"\n",
|
||||
"with open('deploy-result.txt', 'r') as f:\n",
|
||||
" last_line = f.readlines()[-1]\n",
|
||||
" last_line = str.strip(last_line)\n",
|
||||
"\n",
|
||||
"if last_line.startswith(\"obsidian://\"):\n",
|
||||
" result = HTML(f\"Copy your setup-URI with this button! -> <button onclick=\\\"navigator.clipboard.writeText('{last_line}')\\\">Copy setup uri</button><br>Importing passphrase is `welcome`. <br>If you want to synchronise in live mode, please apply a preset after ensuring the imported configuration works.\")\n",
|
||||
"else:\n",
|
||||
" result = \"Failed to encrypt the setup URI\"\n",
|
||||
"result"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "TNl0A603EF9E"
|
||||
},
|
||||
"execution_count": null,
|
||||
"outputs": []
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"If you see the `Copy setup URI` button, Congratulations! Your CouchDB is ready to use! Please click the button. And open this on Obsidian.\n",
|
||||
"\n",
|
||||
"And, you should keep the output to your secret memo.\n",
|
||||
"\n"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "oeIzExnEKhFp"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"source": [
|
||||
"\n",
|
||||
"\n",
|
||||
"---\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"If you want to delete this CouchDB instance, you can do it by executing next cell. \n",
|
||||
"If your fly.toml has been gone, access https://fly.io/dashboard and check the existing app."
|
||||
],
|
||||
"metadata": {
|
||||
"id": "sdQrqOjERN3K"
|
||||
}
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"source": [
|
||||
"!./delete-server.sh"
|
||||
],
|
||||
"metadata": {
|
||||
"id": "7JMSkNvVIIfg"
|
||||
},
|
||||
"execution_count": null,
|
||||
"outputs": []
|
||||
}
|
||||
]
|
||||
}
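
The result cell above reduces to: take the last line of deploy-result.txt and treat it as the setup URI only when it starts with obsidian://. A minimal TypeScript sketch of the same check, for reference; the file name and the obsidian:// scheme come from the cell above, while the Node runtime and helper name are illustrative assumptions:

// Sketch only: mirrors the notebook's result cell logic in TypeScript/Node.
import { readFileSync } from "node:fs";

function extractSetupUri(logPath: string): string | undefined {
    // Take the last non-empty line of the deploy log.
    const lines = readFileSync(logPath, "utf8").trimEnd().split("\n");
    const lastLine = lines[lines.length - 1]?.trim() ?? "";
    // Only a line carrying the expected scheme counts as a setup URI.
    return lastLine.startsWith("obsidian://") ? lastLine : undefined;
}

console.log(extractSetupUri("deploy-result.txt") ?? "Failed to encrypt the setup URI");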
@@ -1,315 +0,0 @@
import { normalizePath, PluginManifest } from "./deps";
import { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry, LOG_LEVEL } from "./lib/src/types";
import { PluginDataEntry, PERIODIC_PLUGIN_SWEEP, PluginList, DevicePluginList, PSCHeader, PSCHeaderEnd } from "./types";
import { getDocData, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { isPluginMetadata, PeriodicProcessor } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { NewNotice } from "./lib/src/wrapper";
import { versionNumberString2Number } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";

export class PluginAndTheirSettings extends LiveSyncCommands {

    get deviceAndVaultName() {
        return this.plugin.deviceAndVaultName;
    }
    pluginDialog: PluginDialogModal = null;
    periodicPluginSweepProcessor = new PeriodicProcessor(this.plugin, async () => await this.sweepPlugin(false));

    showPluginSyncModal() {
        if (this.pluginDialog != null) {
            this.pluginDialog.open();
        } else {
            this.pluginDialog = new PluginDialogModal(this.app, this.plugin);
            this.pluginDialog.open();
        }
    }

    hidePluginSyncModal() {
        if (this.pluginDialog != null) {
            this.pluginDialog.close();
            this.pluginDialog = null;
        }
    }
    onload(): void | Promise<void> {
        this.plugin.addCommand({
            id: "livesync-plugin-dialog",
            name: "Show Plugins and their settings",
            callback: () => {
                this.showPluginSyncModal();
            },
        });
        this.showPluginSyncModal();
    }
    onunload() {
        this.hidePluginSyncModal();
        this.periodicPluginSweepProcessor?.disable();
    }
    parseReplicationResultItem(doc: PouchDB.Core.ExistingDocument<EntryDoc>) {
        if (isPluginMetadata(doc._id)) {
            if (this.settings.notifyPluginOrSettingUpdated) {
                this.triggerCheckPluginUpdate();
                return true;
            }
        }
        return false;
    }
    async beforeReplicate(showMessage: boolean) {
        if (this.settings.autoSweepPlugins) {
            await this.sweepPlugin(showMessage);
        }
    }
    async onResume() {
        if (this.plugin.suspended)
            return;
        if (this.settings.autoSweepPlugins) {
            await this.sweepPlugin(false);
        }
        this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
    }
    async onInitializeDatabase(showNotice: boolean) {
        if (this.settings.usePluginSync) {
            try {
                Logger("Scanning plugins...");
                await this.sweepPlugin(showNotice);
                Logger("Scanning plugins done");
            } catch (ex) {
                Logger("Scanning plugins failed");
                Logger(ex, LOG_LEVEL.VERBOSE);
            }

        }
    }

    async realizeSettingSyncMode() {
        this.periodicPluginSweepProcessor?.disable();
        if (this.plugin.suspended)
            return;
        if (this.settings.autoSweepPlugins) {
            await this.sweepPlugin(false);
        }
        this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
    }

    triggerCheckPluginUpdate() {
        (async () => await this.checkPluginUpdate())();
    }


    async getPluginList(): Promise<{ plugins: PluginList; allPlugins: DevicePluginList; thisDevicePlugins: DevicePluginList; }> {
        const docList = await this.localDatabase.allDocsRaw<PluginDataEntry>({ startkey: PSCHeader, endkey: PSCHeaderEnd, include_docs: false });
        const oldDocs: PluginDataEntry[] = ((await Promise.all(docList.rows.map(async (e) => await this.localDatabase.getDBEntry(e.id as FilePathWithPrefix /* WARN!! THIS SHOULD BE WRAPPED */)))).filter((e) => e !== false) as LoadedEntry[]).map((e) => JSON.parse(getDocData(e.data)));
        const plugins: { [key: string]: PluginDataEntry[]; } = {};
        const allPlugins: { [key: string]: PluginDataEntry; } = {};
        const thisDevicePlugins: { [key: string]: PluginDataEntry; } = {};
        for (const v of oldDocs) {
            if (typeof plugins[v.deviceVaultName] === "undefined") {
                plugins[v.deviceVaultName] = [];
            }
            plugins[v.deviceVaultName].push(v);
            allPlugins[v._id] = v;
            if (v.deviceVaultName == this.deviceAndVaultName) {
                thisDevicePlugins[v.manifest.id] = v;
            }
        }
        return { plugins, allPlugins, thisDevicePlugins };
    }

    async checkPluginUpdate() {
        if (!this.plugin.settings.usePluginSync)
            return;
        await this.sweepPlugin(false);
        const { allPlugins, thisDevicePlugins } = await this.getPluginList();
        const arrPlugins = Object.values(allPlugins);
        let updateFound = false;
        for (const plugin of arrPlugins) {
            const ownPlugin = thisDevicePlugins[plugin.manifest.id];
            if (ownPlugin) {
                const remoteVersion = versionNumberString2Number(plugin.manifest.version);
                const ownVersion = versionNumberString2Number(ownPlugin.manifest.version);
                if (remoteVersion > ownVersion) {
                    updateFound = true;
                }
                if (((plugin.mtime / 1000) | 0) > ((ownPlugin.mtime / 1000) | 0) && (plugin.dataJson ?? "") != (ownPlugin.dataJson ?? "")) {
                    updateFound = true;
                }
            }
        }
        if (updateFound) {
            const fragment = createFragment((doc) => {
                doc.createEl("a", null, (a) => {
                    a.text = "There're some new plugins or their settings";
                    a.addEventListener("click", () => this.showPluginSyncModal());
                });
            });
            NewNotice(fragment, 10000);
        } else {
            Logger("Everything is up to date.", LOG_LEVEL.NOTICE);
        }
    }

    async sweepPlugin(showMessage = false, specificPluginPath = "") {
        if (!this.settings.usePluginSync)
            return;
        if (!this.localDatabase.isReady)
            return;
        // @ts-ignore
        const pl = this.app.plugins;
        const manifests: PluginManifest[] = Object.values(pl.manifests);
        let specificPlugin = "";
        if (specificPluginPath != "") {
            specificPlugin = manifests.find(e => e.dir.endsWith("/" + specificPluginPath))?.id ?? "";
        }
        await runWithLock("sweepplugin", true, async () => {
            const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
            if (!this.deviceAndVaultName) {
                Logger("You have to set your device name.", LOG_LEVEL.NOTICE);
                return;
            }
            Logger("Scanning plugins", logLevel);
            const oldDocs = await this.localDatabase.allDocsRaw<EntryDoc>({
                startkey: `ps:${this.deviceAndVaultName}-${specificPlugin}`,
                endkey: `ps:${this.deviceAndVaultName}-${specificPlugin}\u{10ffff}`,
                include_docs: true,
            });
            // Logger("OLD DOCS.", LOG_LEVEL.VERBOSE);
            // sweep current plugin.
            const procs = manifests.map(async (m) => {
                const pluginDataEntryID = `ps:${this.deviceAndVaultName}-${m.id}` as DocumentID;
                try {
                    if (specificPlugin && m.id != specificPlugin) {
                        return;
                    }
                    Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
                    const path = normalizePath(m.dir) + "/";
                    const adapter = this.app.vault.adapter;
                    const files = ["manifest.json", "main.js", "styles.css", "data.json"];
                    const pluginData: { [key: string]: string; } = {};
                    for (const file of files) {
                        const thePath = path + file;
                        if (await adapter.exists(thePath)) {
                            pluginData[file] = await adapter.read(thePath);
                        }
                    }
                    let mtime = 0;
                    if (await adapter.exists(path + "/data.json")) {
                        mtime = (await adapter.stat(path + "/data.json")).mtime;
                    }

                    const p: PluginDataEntry = {
                        _id: pluginDataEntryID,
                        dataJson: pluginData["data.json"],
                        deviceVaultName: this.deviceAndVaultName,
                        mainJs: pluginData["main.js"],
                        styleCss: pluginData["styles.css"],
                        manifest: m,
                        manifestJson: pluginData["manifest.json"],
                        mtime: mtime,
                        type: "plugin",
                    };
                    const d: LoadedEntry = {
                        _id: p._id,
                        path: p._id as string as FilePathWithPrefix,
                        data: JSON.stringify(p),
                        ctime: mtime,
                        mtime: mtime,
                        size: 0,
                        children: [],
                        datatype: "plain",
                        type: "plain"
                    };
                    Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
                    await runWithLock("plugin-" + m.id, false, async () => {
                        const old = await this.localDatabase.getDBEntry(p._id as string as FilePathWithPrefix /* This also should be explained */, null, false, false);
                        if (old !== false) {
                            const oldData = { data: old.data, deleted: old._deleted };
                            const newData = { data: d.data, deleted: d._deleted };
                            if (isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
                                Logger(`Nothing changed:${m.name}`);
                                return;
                            }
                        }
                        await this.localDatabase.putDBEntry(d);
                        Logger(`Plugin saved:${m.name}`, logLevel);
                    });
                } catch (ex) {
                    Logger(`Plugin save failed:${m.name}`, LOG_LEVEL.NOTICE);
                } finally {
                    oldDocs.rows = oldDocs.rows.filter((e) => e.id != pluginDataEntryID);
                }
                //remove saved plugin data.
            }
            );

            await Promise.all(procs);

            const delDocs = oldDocs.rows.map((e) => {
                // e.doc._deleted = true;
                if (e.doc.type == "newnote" || e.doc.type == "plain") {
                    e.doc.deleted = true;
                    if (this.settings.deleteMetadataOfDeletedFiles) {
                        e.doc._deleted = true;
                    }
                } else {
                    e.doc._deleted = true;
                }
                return e.doc;
            });
            Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL.VERBOSE);
            await this.localDatabase.bulkDocsRaw(delDocs);
            Logger(`Scan plugin done.`, logLevel);
        });
    }

    async applyPluginData(plugin: PluginDataEntry) {
        await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
            const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
            const adapter = this.app.vault.adapter;
            // @ts-ignore
            const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
            if (stat) {
                // @ts-ignore
                await this.app.plugins.unloadPlugin(plugin.manifest.id);
                Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }
            if (plugin.dataJson)
                await adapter.write(pluginTargetFolderPath + "data.json", plugin.dataJson);
            Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL.NOTICE);
            if (stat) {
                // @ts-ignore
                await this.app.plugins.loadPlugin(plugin.manifest.id);
                Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }
        });
    }

    async applyPlugin(plugin: PluginDataEntry) {
        await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
            // @ts-ignore
            const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
            if (stat) {
                // @ts-ignore
                await this.app.plugins.unloadPlugin(plugin.manifest.id);
                Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }

            const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
            const adapter = this.app.vault.adapter;
            if ((await adapter.exists(pluginTargetFolderPath)) === false) {
                await adapter.mkdir(pluginTargetFolderPath);
            }
            await adapter.write(pluginTargetFolderPath + "main.js", plugin.mainJs);
            await adapter.write(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
            if (plugin.styleCss)
                await adapter.write(pluginTargetFolderPath + "styles.css", plugin.styleCss);
            if (stat) {
                // @ts-ignore
                await this.app.plugins.loadPlugin(plugin.manifest.id);
                Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }
        });
    }
}
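
checkPluginUpdate() above flags an update when the numericised remote manifest version exceeds the local one. versionNumberString2Number lives in lib/src/strbin and is not shown in this diff; the sketch below is one plausible packing, an assumption rather than the actual implementation:

// Hedged sketch: one plausible way to turn "major.minor.patch" into a
// single comparable number. The real helper may use different factors.
function versionToNumber(version: string): number {
    const [major = 0, minor = 0, patch = 0] =
        version.split(".").map((part) => parseInt(part, 10) || 0);
    return major * 1_000_000 + minor * 1_000 + patch;
}

// The update rule then reduces to a plain numeric comparison:
console.log(versionToNumber("0.19.5") > versionToNumber("0.19.4")); // true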
@@ -1,84 +0,0 @@
import { App, Modal } from "./deps";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
import { diff_result } from "./lib/src/types";
import { escapeStringToHTML } from "./lib/src/strbin";

export class ConflictResolveModal extends Modal {
    // result: Array<[number, string]>;
    result: diff_result;
    filename: string;
    callback: (remove_rev: string) => Promise<void>;

    constructor(app: App, filename: string, diff: diff_result, callback: (remove_rev: string) => Promise<void>) {
        super(app);
        this.result = diff;
        this.callback = callback;
        this.filename = filename;
    }

    onOpen() {
        const { contentEl } = this;

        contentEl.empty();

        contentEl.createEl("h2", { text: "This document has conflicted changes." });
        contentEl.createEl("span", { text: this.filename });
        const div = contentEl.createDiv("");
        div.addClass("op-scrollable");
        let diff = "";
        for (const v of this.result.diff) {
            const x1 = v[0];
            const x2 = v[1];
            if (x1 == DIFF_DELETE) {
                diff += "<span class='deleted'>" + escapeStringToHTML(x2) + "</span>";
            } else if (x1 == DIFF_EQUAL) {
                diff += "<span class='normal'>" + escapeStringToHTML(x2) + "</span>";
            } else if (x1 == DIFF_INSERT) {
                diff += "<span class='added'>" + escapeStringToHTML(x2) + "</span>";
            }
        }

        diff = diff.replace(/\n/g, "<br>");
        div.innerHTML = diff;
        const div2 = contentEl.createDiv("");
        const date1 = new Date(this.result.left.mtime).toLocaleString() + (this.result.left.deleted ? " (Deleted)" : "");
        const date2 = new Date(this.result.right.mtime).toLocaleString() + (this.result.right.deleted ? " (Deleted)" : "");
        div2.innerHTML = `
<span class='deleted'>A:${date1}</span><br /><span class='added'>B:${date2}</span><br>
`;
        contentEl.createEl("button", { text: "Keep A" }, (e) => {
            e.addEventListener("click", async () => {
                await this.callback(this.result.right.rev);
                this.callback = null;
                this.close();
            });
        });
        contentEl.createEl("button", { text: "Keep B" }, (e) => {
            e.addEventListener("click", async () => {
                await this.callback(this.result.left.rev);
                this.callback = null;
                this.close();
            });
        });
        contentEl.createEl("button", { text: "Concat both" }, (e) => {
            e.addEventListener("click", async () => {
                await this.callback("");
                this.callback = null;
                this.close();
            });
        });
        contentEl.createEl("button", { text: "Not now" }, (e) => {
            e.addEventListener("click", () => {
                this.close();
            });
        });
    }

    onClose() {
        const { contentEl } = this;
        contentEl.empty();
        if (this.callback != null) {
            this.callback(null);
        }
    }
}
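
The rendering loop in onOpen() is the standard diff-match-patch pattern; extracted here as a standalone sketch for clarity. escapeHtml stands in for the repo's escapeStringToHTML, and the CSS class names mirror the modal above:

import { diff_match_patch, DIFF_DELETE, DIFF_INSERT } from "diff-match-patch";

function renderDiffHtml(left: string, right: string, escapeHtml: (s: string) => string): string {
    const dmp = new diff_match_patch();
    const diffs = dmp.diff_main(left, right);
    dmp.diff_cleanupSemantic(diffs); // merge trivial edits into readable runs
    return diffs
        .map(([op, text]) => {
            const cls = op === DIFF_DELETE ? "deleted" : op === DIFF_INSERT ? "added" : "normal";
            return `<span class='${cls}'>${escapeHtml(text)}</span>`;
        })
        .join("")
        .replace(/\n/g, "<br>");
}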
@@ -1,212 +0,0 @@
import { TFile, Modal, App } from "./deps";
import { getPathFromTFile, isValidPath } from "./utils";
import { base64ToArrayBuffer, base64ToString, escapeStringToHTML } from "./lib/src/strbin";
import ObsidianLiveSyncPlugin from "./main";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "diff-match-patch";
import { DocumentID, FilePathWithPrefix, LoadedEntry, LOG_LEVEL } from "./lib/src/types";
import { Logger } from "./lib/src/logger";
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
import { getDocData } from "./lib/src/utils";
import { stripPrefix } from "./lib/src/path";

export class DocumentHistoryModal extends Modal {
    plugin: ObsidianLiveSyncPlugin;
    range: HTMLInputElement;
    contentView: HTMLDivElement;
    info: HTMLDivElement;
    fileInfo: HTMLDivElement;
    showDiff = false;
    id: DocumentID;

    file: FilePathWithPrefix;

    revs_info: PouchDB.Core.RevisionInfo[] = [];
    currentDoc: LoadedEntry;
    currentText = "";
    currentDeleted = false;

    constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id: DocumentID) {
        super(app);
        this.plugin = plugin;
        this.file = (file instanceof TFile) ? getPathFromTFile(file) : file;
        this.id = id;
        if (!file) {
            this.file = this.plugin.id2path(id, null);
        }
        if (localStorage.getItem("ols-history-highlightdiff") == "1") {
            this.showDiff = true;
        }
    }

    async loadFile() {
        if (!this.id) {
            this.id = await this.plugin.path2id(this.file);
        }
        const db = this.plugin.localDatabase;
        try {
            const w = await db.localDatabase.get(this.id, { revs_info: true });
            this.revs_info = w._revs_info.filter((e) => e?.status == "available");
            this.range.max = `${this.revs_info.length - 1}`;
            this.range.value = this.range.max;
            this.fileInfo.setText(`${this.file} / ${this.revs_info.length} revisions`);
            await this.loadRevs();
        } catch (ex) {
            if (isErrorOfMissingDoc(ex)) {
                this.range.max = "0";
                this.range.value = "";
                this.range.disabled = true;
                this.showDiff
                this.contentView.setText(`History of this file was not recorded.`);
            } else {
                this.contentView.setText(`Error occurred.`);
                Logger(ex, LOG_LEVEL.VERBOSE);
            }
        }
    }
    async loadRevs() {
        if (this.revs_info.length == 0) return;
        const db = this.plugin.localDatabase;
        const index = this.revs_info.length - 1 - (this.range.value as any) / 1;
        const rev = this.revs_info[index];
        const w = await db.getDBEntry(this.file, { rev: rev.rev }, false, false, true);
        this.currentText = "";
        this.currentDeleted = false;
        if (w === false) {
            this.currentDeleted = true;
            this.info.innerHTML = "";
            this.contentView.innerHTML = `Could not read this revision<br>(${rev.rev})`;
        } else {
            this.currentDoc = w;
            this.info.innerHTML = `Modified:${new Date(w.mtime).toLocaleString()}`;
            let result = "";
            const w1data = w.datatype == "plain" ? getDocData(w.data) : base64ToString(w.data);
            this.currentDeleted = w.deleted;
            this.currentText = w1data;
            if (this.showDiff) {
                const prevRevIdx = this.revs_info.length - 1 - ((this.range.value as any) / 1 - 1);
                if (prevRevIdx >= 0 && prevRevIdx < this.revs_info.length) {
                    const oldRev = this.revs_info[prevRevIdx].rev;
                    const w2 = await db.getDBEntry(this.file, { rev: oldRev }, false, false, true);
                    if (w2 != false) {
                        const dmp = new diff_match_patch();
                        const w2data = w2.datatype == "plain" ? getDocData(w2.data) : base64ToString(w2.data);
                        const diff = dmp.diff_main(w2data, w1data);
                        dmp.diff_cleanupSemantic(diff);
                        for (const v of diff) {
                            const x1 = v[0];
                            const x2 = v[1];
                            if (x1 == DIFF_DELETE) {
                                result += "<span class='history-deleted'>" + escapeStringToHTML(x2) + "</span>";
                            } else if (x1 == DIFF_EQUAL) {
                                result += "<span class='history-normal'>" + escapeStringToHTML(x2) + "</span>";
                            } else if (x1 == DIFF_INSERT) {
                                result += "<span class='history-added'>" + escapeStringToHTML(x2) + "</span>";
                            }
                        }

                        result = result.replace(/\n/g, "<br>");
                    } else {
                        result = escapeStringToHTML(w1data);
                    }
                } else {
                    result = escapeStringToHTML(w1data);
                }
            } else {
                result = escapeStringToHTML(w1data);
            }
            this.contentView.innerHTML = (this.currentDeleted ? "(At this revision, the file has been deleted)\n" : "") + result;
        }
    }

    onOpen() {
        const { contentEl } = this;

        contentEl.empty();
        contentEl.createEl("h2", { text: "Document History" });
        this.fileInfo = contentEl.createDiv("");
        this.fileInfo.addClass("op-info");
        const divView = contentEl.createDiv("");
        divView.addClass("op-flex");

        divView.createEl("input", { type: "range" }, (e) => {
            this.range = e;
            e.addEventListener("change", (e) => {
                this.loadRevs();
            });
            e.addEventListener("input", (e) => {
                this.loadRevs();
            });
        });
        contentEl
            .createDiv("", (e) => {
                e.createEl("label", {}, (label) => {
                    label.appendChild(
                        createEl("input", { type: "checkbox" }, (checkbox) => {
                            if (this.showDiff) {
                                checkbox.checked = true;
                            }
                            checkbox.addEventListener("input", (evt: any) => {
                                this.showDiff = checkbox.checked;
                                localStorage.setItem("ols-history-highlightdiff", this.showDiff == true ? "1" : "");
                                this.loadRevs();
                            });
                        })
                    );
                    label.appendText("Highlight diff");
                });
            })
            .addClass("op-info");
        this.info = contentEl.createDiv("");
        this.info.addClass("op-info");
        this.loadFile();
        const div = contentEl.createDiv({ text: "Loading old revisions..." });
        this.contentView = div;
        div.addClass("op-scrollable");
        div.addClass("op-pre");
        const buttons = contentEl.createDiv("");
        buttons.createEl("button", { text: "Copy to clipboard" }, (e) => {
            e.addClass("mod-cta");
            e.addEventListener("click", async () => {
                await navigator.clipboard.writeText(this.currentText);
                Logger(`Old content copied to clipboard`, LOG_LEVEL.NOTICE);
            });
        });
        async function focusFile(path: string) {
            const targetFile = app.vault
                .getFiles()
                .find((f) => f.path === path);
            if (targetFile) {
                const leaf = app.workspace.getLeaf(false);
                await leaf.openFile(targetFile);
            } else {
                Logger("The file could not view on the editor", LOG_LEVEL.NOTICE)
            }
        }
        buttons.createEl("button", { text: "Back to this revision" }, (e) => {
            e.addClass("mod-cta");
            e.addEventListener("click", async () => {
                // const pathToWrite = this.plugin.id2path(this.id, true);
                const pathToWrite = stripPrefix(this.file);
                if (!isValidPath(pathToWrite)) {
                    Logger("Path is not valid to write content.", LOG_LEVEL.INFO);
                }
                if (this.currentDoc?.datatype == "plain") {
                    await this.app.vault.adapter.write(pathToWrite, getDocData(this.currentDoc.data));
                    await focusFile(pathToWrite);
                    this.close();
                } else if (this.currentDoc?.datatype == "newnote") {
                    await this.app.vault.adapter.writeBinary(pathToWrite, base64ToArrayBuffer(this.currentDoc.data));
                    await focusFile(pathToWrite);
                    this.close();
                } else {

                    Logger(`Could not parse entry`, LOG_LEVEL.NOTICE);
                }
            });
        });
    }
    onClose() {
        const { contentEl } = this;
        contentEl.empty();
    }
}
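
loadRevs() maps the slider so that its maximum position is the newest revision: revs_info is ordered newest-first, so index = revs_info.length - 1 - value. A tiny sketch of that mapping, assuming PouchDB's newest-first ordering:

// Slider value 0 selects the oldest revision; the maximum value selects
// revs_info[0], the newest.
function sliderValueToRevIndex(sliderValue: number, revCount: number): number {
    return revCount - 1 - sliderValue;
}

console.log(sliderValueToRevIndex(4, 5)); // 0, the newest of 5 revisions
console.log(sliderValueToRevIndex(0, 5)); // 4, the oldest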
@@ -1,230 +0,0 @@
<script lang="ts">
    import { onMount } from "svelte";
    import ObsidianLiveSyncPlugin from "./main";
    import { PluginDataExDisplay, pluginIsEnumerating, pluginList } from "./CmdConfigSync";
    import PluginCombo from "./PluginCombo.svelte";
    export let plugin: ObsidianLiveSyncPlugin;

    $: hideNotApplicable = true;
    $: thisTerm = plugin.deviceAndVaultName;

    const addOn = plugin.addOnConfigSync;

    let list: PluginDataExDisplay[] = [];

    let selectNewestPulse = 0;
    let hideEven = true;
    let loading = false;
    let applyAllPluse = 0;
    let isMaintenanceMode = false;
    async function requestUpdate() {
        await addOn.updatePluginList(true);
    }
    async function requestReload() {
        await addOn.reloadPluginList(true);
    }
    pluginList.subscribe((e) => {
        list = e;
    });
    pluginIsEnumerating.subscribe((e) => {
        loading = e;
    });
    onMount(async () => {
        requestUpdate();
    });

    function filterList(list: PluginDataExDisplay[], categories: string[]) {
        const w = list.filter((e) => categories.indexOf(e.category) !== -1);
        return w.sort((a, b) => `${a.category}-${a.name}`.localeCompare(`${b.category}-${b.name}`));
    }

    function groupBy(items: PluginDataExDisplay[], key: string) {
        let ret = {} as Record<string, PluginDataExDisplay[]>;
        for (const v of items) {
            //@ts-ignore
            const k = (key in v ? v[key] : "") as string;
            ret[k] = ret[k] || [];
            ret[k].push(v);
        }
        for (const k in ret) {
            ret[k] = ret[k].sort((a, b) => `${a.category}-${a.name}`.localeCompare(`${b.category}-${b.name}`));
        }
        const w = Object.entries(ret);
        return w.sort(([a], [b]) => `${a}`.localeCompare(`${b}`));
    }

    const displays = {
        CONFIG: "Configuration",
        THEME: "Themes",
        SNIPPET: "Snippets",
    };
    async function scanAgain() {
        await addOn.scanAllConfigFiles(true);
        await requestUpdate();
    }
    async function replicate() {
        await plugin.replicate(true);
    }
    function selectAllNewest() {
        selectNewestPulse++;
    }
    function applyAll() {
        applyAllPluse++;
    }
    async function applyData(data: PluginDataExDisplay): Promise<boolean> {
        return await addOn.applyData(data);
    }
    async function compareData(docA: PluginDataExDisplay, docB: PluginDataExDisplay): Promise<boolean> {
        return await addOn.compareUsingDisplayData(docA, docB);
    }
    async function deleteData(data: PluginDataExDisplay): Promise<boolean> {
        return await addOn.deleteData(data);
    }

    $: options = {
        thisTerm,
        hideNotApplicable,
        selectNewest: selectNewestPulse,
        applyAllPluse,
        applyData,
        compareData,
        deleteData,
        plugin,
        isMaintenanceMode,
    };
</script>

<div>
    <div>
        <h1>Customization sync</h1>
        <div class="buttons">
            <button on:click={() => scanAgain()}>Scan changes</button>
            <button on:click={() => replicate()}>Sync once</button>
            <button on:click={() => requestUpdate()}>Refresh</button>
            {#if isMaintenanceMode}
                <button on:click={() => requestReload()}>Reload</button>
            {/if}
            <button on:click={() => selectAllNewest()}>Select All Shiny</button>
        </div>
        <div class="buttons">
            <button on:click={() => applyAll()}>Apply All</button>
        </div>
    </div>
    {#if loading}
        <div>
            <span>Updating list...</span>
        </div>
    {/if}
    <div class="list">
        {#if list.length == 0}
            <div class="center">No Items.</div>
        {:else}
            {#each Object.entries(displays) as [key, label]}
                <div>
                    <h3>{label}</h3>
                    {#each groupBy(filterList(list, [key]), "name") as [name, listX]}
                        <div class="labelrow {hideEven ? 'hideeven' : ''}">
                            <div class="title">
                                {name}
                            </div>
                            <PluginCombo {...options} list={listX} hidden={false} />
                        </div>
                    {/each}
                </div>
            {/each}
            <div>
                <h3>Plugins</h3>
                {#each groupBy(filterList(list, ["PLUGIN_MAIN", "PLUGIN_DATA", "PLUGIN_ETC"]), "name") as [name, listX]}
                    <div class="labelrow {hideEven ? 'hideeven' : ''}">
                        <div class="title">
                            {name}
                        </div>
                        <PluginCombo {...options} list={listX} hidden={true} />
                    </div>
                    <div class="filerow {hideEven ? 'hideeven' : ''}">
                        <div class="filetitle">Main</div>
                        <PluginCombo {...options} list={filterList(listX, ["PLUGIN_MAIN"])} hidden={false} />
                    </div>
                    <div class="filerow {hideEven ? 'hideeven' : ''}">
                        <div class="filetitle">Data</div>
                        <PluginCombo {...options} list={filterList(listX, ["PLUGIN_DATA"])} hidden={false} />
                    </div>
                {/each}
            </div>
        {/if}
    </div>
    <div class="buttons">
        <label><span>Hide not applicable items</span><input type="checkbox" bind:checked={hideEven} /></label>
    </div>
    <div class="buttons">
        <label><span>Maintenance mode</span><input type="checkbox" bind:checked={isMaintenanceMode} /></label>
    </div>
</div>

<style>
    .labelrow {
        margin-left: 0.4em;
        display: flex;
        justify-content: flex-start;
        align-items: center;
        border-top: 1px solid var(--background-modifier-border);
        padding: 4px;
        flex-wrap: wrap;
    }
    .filerow {
        margin-left: 1.25em;
        display: flex;
        justify-content: flex-start;
        align-items: center;
        padding-right: 4px;
        flex-wrap: wrap;
    }
    .filerow.hideeven:has(.even),
    .labelrow.hideeven:has(.even) {
        display: none;
    }

    .title {
        color: var(--text-normal);
        font-size: var(--font-ui-medium);
        line-height: var(--line-height-tight);
        margin-right: auto;
    }
    .filetitle {
        color: var(--text-normal);
        font-size: var(--font-ui-medium);
        line-height: var(--line-height-tight);
        margin-right: auto;
    }
    .buttons {
        display: flex;
        flex-direction: row;
        justify-content: flex-end;
        margin-top: 8px;
        flex-wrap: wrap;
    }
    .buttons > button {
        margin-left: 4px;
        width: auto;
    }

    label {
        display: flex;
        justify-content: center;
        align-items: center;
    }
    label > span {
        margin-right: 0.25em;
    }
    :global(.is-mobile) .title,
    :global(.is-mobile) .filetitle {
        width: 100%;
    }

    .center {
        display: flex;
        justify-content: center;
        align-items: center;
        min-height: 3em;
    }
</style>
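
The pane's list pipeline is filterList (category filter plus sort) followed by groupBy (bucket by name, then sort the buckets). A reduced TypeScript sketch of the same shaping, trimmed to the two fields the template actually reads:

type Item = { category: string; name: string };

function groupByName(items: Item[]): [string, Item[]][] {
    const buckets = new Map<string, Item[]>();
    for (const item of items) {
        const bucket = buckets.get(item.name) ?? [];
        bucket.push(item);
        buckets.set(item.name, bucket);
    }
    // Sorted [name, items] pairs, as the {#each} blocks above consume them.
    return [...buckets.entries()].sort(([a], [b]) => a.localeCompare(b));
}

console.log(groupByName([
    { category: "PLUGIN_MAIN", name: "example-plugin" },
    { category: "PLUGIN_DATA", name: "example-plugin" },
])); // [["example-plugin", [both entries]]]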
@@ -1,172 +0,0 @@
import { Plugin_2, TAbstractFile, TFile, TFolder } from "./deps";
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
import { getGlobalStore } from "./lib/src/store";
import { FilePath, ObsidianLiveSyncSettings } from "./lib/src/types";
import { FileEventItem, FileEventType, FileInfo, InternalFileInfo, queueItem } from "./types";
import { recentlyTouched } from "./utils";


export abstract class StorageEventManager {
    abstract fetchEvent(): FileEventItem | false;
    abstract cancelRelativeEvent(item: FileEventItem): void;
    abstract getQueueLength(): number;
}

type LiveSyncForStorageEventManager = Plugin_2 &
    {
        settings: ObsidianLiveSyncSettings
    } & {
        isTargetFile: (file: string | TAbstractFile) => boolean,
        procFileEvent: (applyBatch?: boolean) => Promise<boolean>
    };


export class StorageEventManagerObsidian extends StorageEventManager {
    plugin: LiveSyncForStorageEventManager;
    queuedFilesStore = getGlobalStore("queuedFiles", { queuedItems: [] as queueItem[], fileEventItems: [] as FileEventItem[] });

    watchedFileEventQueue = [] as FileEventItem[];

    constructor(plugin: LiveSyncForStorageEventManager) {
        super();
        this.plugin = plugin;
        this.watchVaultChange = this.watchVaultChange.bind(this);
        this.watchVaultCreate = this.watchVaultCreate.bind(this);
        this.watchVaultDelete = this.watchVaultDelete.bind(this);
        this.watchVaultRename = this.watchVaultRename.bind(this);
        this.watchVaultRawEvents = this.watchVaultRawEvents.bind(this);
        plugin.registerEvent(app.vault.on("modify", this.watchVaultChange));
        plugin.registerEvent(app.vault.on("delete", this.watchVaultDelete));
        plugin.registerEvent(app.vault.on("rename", this.watchVaultRename));
        plugin.registerEvent(app.vault.on("create", this.watchVaultCreate));
        //@ts-ignore : Internal API
        plugin.registerEvent(app.vault.on("raw", this.watchVaultRawEvents));
    }

    watchVaultCreate(file: TAbstractFile, ctx?: any) {
        this.appendWatchEvent([{ type: "CREATE", file }], ctx);
    }

    watchVaultChange(file: TAbstractFile, ctx?: any) {
        this.appendWatchEvent([{ type: "CHANGED", file }], ctx);
    }

    watchVaultDelete(file: TAbstractFile, ctx?: any) {
        this.appendWatchEvent([{ type: "DELETE", file }], ctx);
    }
    watchVaultRename(file: TAbstractFile, oldFile: string, ctx?: any) {
        if (file instanceof TFile) {
            this.appendWatchEvent([
                { type: "DELETE", file: { path: oldFile as FilePath, mtime: file.stat.mtime, ctime: file.stat.ctime, size: file.stat.size, deleted: true } },
                { type: "CREATE", file },
            ], ctx);
        }
    }
    // Watch raw events (Internal API)
    watchVaultRawEvents(path: FilePath) {
        if (!this.plugin.settings.syncInternalFiles && !this.plugin.settings.usePluginSync) return;
        if (!this.plugin.settings.watchInternalFileChanges) return;
        if (!path.startsWith(app.vault.configDir)) return;
        const ignorePatterns = this.plugin.settings.syncInternalFilesIgnorePatterns
            .replace(/\n| /g, "")
            .split(",").filter(e => e).map(e => new RegExp(e, "i"));
        if (ignorePatterns.some(e => path.match(e))) return;
        this.appendWatchEvent(
            [{
                type: "INTERNAL",
                file: { path, mtime: 0, ctime: 0, size: 0 }
            }], null);
    }

    // Cache the file and wait until it can be processed.
    async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
        let forcePerform = false;
        for (const param of params) {
            if (shouldBeIgnored(param.file.path)) {
                continue;
            }
            const atomicKey = [0, 0, 0, 0, 0, 0].map(e => `${Math.floor(Math.random() * 100000)}`).join("-");
            const type = param.type;
            const file = param.file;
            const oldPath = param.oldPath;
            if (file instanceof TFolder) continue;
            if (!this.plugin.isTargetFile(file.path)) continue;
            if (this.plugin.settings.suspendFileWatching) continue;

            let cache: null | string | ArrayBuffer;
            // new file or something changed, cache the changes.
            if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
                if (recentlyTouched(file)) {
                    continue;
                }
                if (!isPlainText(file.name)) {
                    cache = await app.vault.readBinary(file);
                } else {
                    // cache = await this.app.vault.read(file);
                    cache = await app.vault.cachedRead(file);
                    if (!cache) cache = await app.vault.read(file);
                }
            }
            if (type == "DELETE" || type == "RENAME") {
                forcePerform = true;
            }


            if (this.plugin.settings.batchSave) {
                // if the latest event is the same type, omit that
                // a.md MODIFY <- this should be cancelled when a.md MODIFIED
                // b.md MODIFY <- this should be cancelled when b.md MODIFIED
                // a.md MODIFY
                // a.md CREATE
                // :
                let i = this.watchedFileEventQueue.length;
                L1:
                while (i >= 0) {
                    i--;
                    if (i < 0) break L1;
                    if (this.watchedFileEventQueue[i].args.file.path != file.path) {
                        continue L1;
                    }
                    if (this.watchedFileEventQueue[i].type != type) break L1;
                    this.watchedFileEventQueue.remove(this.watchedFileEventQueue[i]);
                    //this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
                    this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
                }
            }

            const fileInfo = file instanceof TFile ? {
                ctime: file.stat.ctime,
                mtime: file.stat.mtime,
                file: file,
                path: file.path,
                size: file.stat.size
            } as FileInfo : file as InternalFileInfo;
            this.watchedFileEventQueue.push({
                type,
                args: {
                    file: fileInfo,
                    oldPath,
                    cache,
                    ctx
                },
                key: atomicKey
            })
        }
        // this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
        this.plugin.procFileEvent(forcePerform);
    }
    fetchEvent(): FileEventItem | false {
        if (this.watchedFileEventQueue.length == 0) return false;
        const item = this.watchedFileEventQueue.shift();
        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
        return item;
    }
    cancelRelativeEvent(item: FileEventItem) {
        this.watchedFileEventQueue = [...this.watchedFileEventQueue].filter(e => e.key != item.key);
        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
    }
    getQueueLength() {
        return this.watchedFileEventQueue.length;
    }
}
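
The batch-save branch above prunes the queue from the tail: queued events with the same path and the same type are dropped so only the newest survives, while the first event of a differing type for that path stops the scan to preserve ordering. As a standalone sketch with simplified types:

type QueuedEvent = { type: string; path: string; key: string };

function pruneForBatch(queue: QueuedEvent[], next: QueuedEvent): QueuedEvent[] {
    const pruned = [...queue];
    for (let i = pruned.length - 1; i >= 0; i--) {
        if (pruned[i].path !== next.path) continue; // other files: skip over
        if (pruned[i].type !== next.type) break;    // ordering boundary: stop
        pruned.splice(i, 1);                        // same type: superseded
    }
    pruned.push(next);
    return pruned;
}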
@@ -1,12 +1,12 @@
import { deleteDB, IDBPDatabase, openDB } from "idb";
import { deleteDB, type IDBPDatabase, openDB } from "idb";
export interface KeyValueDatabase {
    get<T>(key: string): Promise<T>;
    set<T>(key: string, value: T): Promise<IDBValidKey>;
    del(key: string): Promise<void>;
    get<T>(key: IDBValidKey): Promise<T>;
    set<T>(key: IDBValidKey, value: T): Promise<IDBValidKey>;
    del(key: IDBValidKey): Promise<void>;
    clear(): Promise<void>;
    keys(query?: IDBValidKey | IDBKeyRange, count?: number): Promise<IDBValidKey[]>;
    close(): void;
    destroy(): void;
    destroy(): Promise<void>;
}
const databaseCache: { [key: string]: IDBPDatabase<any> } = {};
export const OpenKeyValueDatabase = async (dbKey: string): Promise<KeyValueDatabase> => {
@@ -20,24 +20,23 @@ export const OpenKeyValueDatabase = async (dbKey: string): Promise<KeyValueDatab
            db.createObjectStore(storeKey);
        },
    });
    let db: IDBPDatabase<any> = null;
    db = await dbPromise;
    const db = await dbPromise;
    databaseCache[dbKey] = db;
    return {
        get<T>(key: string): Promise<T> {
            return db.get(storeKey, key);
        async get<T>(key: IDBValidKey): Promise<T> {
            return await db.get(storeKey, key);
        },
        set<T>(key: string, value: T) {
            return db.put(storeKey, value, key);
        async set<T>(key: IDBValidKey, value: T) {
            return await db.put(storeKey, value, key);
        },
        del(key: string) {
            return db.delete(storeKey, key);
        async del(key: IDBValidKey) {
            return await db.delete(storeKey, key);
        },
        clear() {
            return db.clear(storeKey);
        async clear() {
            return await db.clear(storeKey);
        },
        keys(query?: IDBValidKey | IDBKeyRange, count?: number) {
            return db.getAllKeys(storeKey, query, count);
        async keys(query?: IDBValidKey | IDBKeyRange, count?: number) {
            return await db.getAllKeys(storeKey, query, count);
        },
        close() {
            delete databaseCache[dbKey];
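
For orientation, usage of the revised wrapper: keys widen from string to IDBValidKey and every method is now explicitly async. The import path and database name below are illustrative assumptions; the diff does not show this file's name.

import { OpenKeyValueDatabase } from "./KeyValueDB"; // assumed module path

async function demo() {
    const db = await OpenKeyValueDatabase("livesync-example");
    await db.set("lastSync", Date.now());
    const lastSync = await db.get<number>("lastSync");
    console.log(lastSync);
    db.close(); // evicts the cached connection
}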
src/common/ObsHttpHandler.ts (new file, 133 lines)
@@ -0,0 +1,133 @@
// This file is based on a file that was published by the @remotely-save, under the Apache 2 License.
// I would love to express my deepest gratitude to the original authors for their hard work and dedication. Without their contributions, this project would not have been possible.
//
// Original Implementation is here: https://github.com/remotely-save/remotely-save/blob/28b99557a864ef59c19d2ad96101196e401718f0/src/remoteForS3.ts

import {
    FetchHttpHandler,
    type FetchHttpHandlerOptions,
} from "@smithy/fetch-http-handler";
import { HttpRequest, HttpResponse, type HttpHandlerOptions } from "@smithy/protocol-http";
//@ts-ignore
import { requestTimeout } from "@smithy/fetch-http-handler/dist-es/request-timeout";
import { buildQueryString } from "@smithy/querystring-builder";
import { requestUrl, type RequestUrlParam } from "../deps.ts";
////////////////////////////////////////////////////////////////////////////////
// special handler using Obsidian requestUrl
////////////////////////////////////////////////////////////////////////////////

/**
 * This is close to origin implementation of FetchHttpHandler
 * https://github.com/aws/aws-sdk-js-v3/blob/main/packages/fetch-http-handler/src/fetch-http-handler.ts
 * that is released under Apache 2 License.
 * But this uses Obsidian requestUrl instead.
 */
export class ObsHttpHandler extends FetchHttpHandler {
    requestTimeoutInMs: number | undefined;
    reverseProxyNoSignUrl: string | undefined;
    constructor(
        options?: FetchHttpHandlerOptions,
        reverseProxyNoSignUrl?: string
    ) {
        super(options);
        this.requestTimeoutInMs =
            options === undefined ? undefined : options.requestTimeout;
        this.reverseProxyNoSignUrl = reverseProxyNoSignUrl;
    }
    async handle(
        request: HttpRequest,
        { abortSignal }: HttpHandlerOptions = {}
    ): Promise<{ response: HttpResponse }> {
        if (abortSignal?.aborted) {
            const abortError = new Error("Request aborted");
            abortError.name = "AbortError";
            return Promise.reject(abortError);
        }

        let path = request.path;
        if (request.query) {
            const queryString = buildQueryString(request.query);
            if (queryString) {
                path += `?${queryString}`;
            }
        }

        const { port, method } = request;
        let url = `${request.protocol}//${request.hostname}${port ? `:${port}` : ""
            }${path}`;
        if (
            this.reverseProxyNoSignUrl !== undefined &&
            this.reverseProxyNoSignUrl !== ""
        ) {
            const urlObj = new URL(url);
            urlObj.host = this.reverseProxyNoSignUrl;
            url = urlObj.href;
        }
        const body =
            method === "GET" || method === "HEAD" ? undefined : request.body;

        const transformedHeaders: Record<string, string> = {};
        for (const key of Object.keys(request.headers)) {
            const keyLower = key.toLowerCase();
            if (keyLower === "host" || keyLower === "content-length") {
                continue;
            }
            transformedHeaders[keyLower] = request.headers[key];
        }

        let contentType: string | undefined = undefined;
        if (transformedHeaders["content-type"] !== undefined) {
            contentType = transformedHeaders["content-type"];
        }

        let transformedBody: any = body;
        if (ArrayBuffer.isView(body)) {
            transformedBody = new Uint8Array(body.buffer).buffer;
        }

        const param: RequestUrlParam = {
            body: transformedBody,
            headers: transformedHeaders,
            method: method,
            url: url,
            contentType: contentType,
        };

        const raceOfPromises = [
            requestUrl(param).then((rsp) => {
                const headers = rsp.headers;
                const headersLower: Record<string, string> = {};
                for (const key of Object.keys(headers)) {
                    headersLower[key.toLowerCase()] = headers[key];
                }
                const stream = new ReadableStream<Uint8Array>({
                    start(controller) {
                        controller.enqueue(new Uint8Array(rsp.arrayBuffer));
                        controller.close();
                    },
                });
                return {
                    response: new HttpResponse({
                        headers: headersLower,
                        statusCode: rsp.status,
                        body: stream,
                    }),
                };
            }),
            requestTimeout(this.requestTimeoutInMs),
        ];

        if (abortSignal) {
            raceOfPromises.push(
                new Promise<never>((resolve, reject) => {
                    abortSignal.onabort = () => {
                        const abortError = new Error("Request aborted");
                        abortError.name = "AbortError";
                        reject(abortError);
                    };
                })
            );
        }
        return Promise.race(raceOfPromises);
    }
}
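
A handler like this is wired in through the AWS SDK v3 client configuration, which accepts a custom requestHandler. A hedged sketch of that wiring; the endpoint and credentials are placeholders, and this registration code is not part of the diff itself:

import { S3Client } from "@aws-sdk/client-s3";
import { ObsHttpHandler } from "./common/ObsHttpHandler";

const client = new S3Client({
    region: "us-east-1",                    // placeholder
    endpoint: "https://s3.example.invalid", // placeholder
    credentials: { accessKeyId: "KEY", secretAccessKey: "SECRET" }, // placeholders
    // Route all SDK traffic through Obsidian's requestUrl:
    requestHandler: new ObsHttpHandler({ requestTimeout: 60_000 }),
});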
@@ -1,16 +1,15 @@
import { ButtonComponent } from "obsidian";
import { App, FuzzySuggestModal, MarkdownRenderer, Modal, Plugin, Setting } from "./deps";
import ObsidianLiveSyncPlugin from "./main";
import { App, FuzzySuggestModal, MarkdownRenderer, Modal, Plugin, Setting } from "../deps.ts";
import ObsidianLiveSyncPlugin from "../main.ts";

//@ts-ignore
import PluginPane from "./PluginPane.svelte";
import PluginPane from "../ui/PluginPane.svelte";

export class PluginDialogModal extends Modal {
    plugin: ObsidianLiveSyncPlugin;
    logEl: HTMLDivElement;
    component: PluginPane = null;
    component: PluginPane | undefined;
    isOpened() {
        return this.component != null;
        return this.component != undefined;
    }

    constructor(app: App, plugin: ObsidianLiveSyncPlugin) {
@@ -20,7 +19,8 @@ export class PluginDialogModal extends Modal {

    onOpen() {
        const { contentEl } = this;
        if (this.component == null) {
        this.titleEl.setText("Customization Sync (Beta2)")
        if (!this.component) {
            this.component = new PluginPane({
                target: contentEl,
                props: { plugin: this.plugin },
@@ -29,36 +29,36 @@ export class PluginDialogModal extends Modal {
    }

    onClose() {
        if (this.component != null) {
        if (this.component) {
            this.component.$destroy();
            this.component = null;
            this.component = undefined;
        }
    }
}

export class InputStringDialog extends Modal {
    result: string | false = false;
    onSubmit: (result: string | boolean) => void;
    onSubmit: (result: string | false) => void;
    title: string;
    key: string;
    placeholder: string;
    isManuallyClosed = false;
    isPassword = false;

    constructor(app: App, title: string, key: string, placeholder: string, onSubmit: (result: string | false) => void) {
    constructor(app: App, title: string, key: string, placeholder: string, isPassword: boolean, onSubmit: (result: string | false) => void) {
        super(app);
        this.onSubmit = onSubmit;
        this.title = title;
        this.placeholder = placeholder;
        this.key = key;
        this.isPassword = isPassword;
    }

    onOpen() {
        const { contentEl } = this;

        contentEl.createEl("h1", { text: this.title });
        // For enter to submit
        const formEl = contentEl.createEl("form");
        new Setting(formEl).setName(this.key).addText((text) =>
        this.titleEl.setText(this.title);
        const formEl = contentEl.createDiv();
        new Setting(formEl).setName(this.key).setClass(this.isPassword ? "password-input" : "normal-input").addText((text) =>
            text.onChange((value) => {
                this.result = value;
            })
@@ -93,13 +93,13 @@ export class InputStringDialog extends Modal {
}
export class PopoverSelectString extends FuzzySuggestModal<string> {
    app: App;
    callback: (e: string) => void = () => { };
    callback: ((e: string) => void) | undefined = () => { };
    getItemsFun: () => string[] = () => {
        return ["yes", "no"];

    }

    constructor(app: App, note: string, placeholder: string | null, getItemsFun: () => string[], callback: (e: string) => void) {
    constructor(app: App, note: string, placeholder: string | undefined, getItemsFun: (() => string[]) | undefined, callback: (e: string) => void) {
        super(app);
        this.app = app;
        this.setPlaceholder((placeholder ?? "y/n) ") + note);
@@ -117,13 +117,14 @@ export class PopoverSelectString extends FuzzySuggestModal<string> {

    onChooseItem(item: string, evt: MouseEvent | KeyboardEvent): void {
        // debugger;
        this.callback(item);
        this.callback = null;
        this.callback?.(item);
        this.callback = undefined;
    }
    onClose(): void {
        setTimeout(() => {
            if (this.callback != null) {
            if (this.callback) {
                this.callback("");
                this.callback = undefined;
            }
        }, 100);
    }
@@ -135,16 +136,16 @@ export class MessageBox extends Modal {
    title: string;
    contentMd: string;
    buttons: string[];
    result: string;
    result: string | false = false;
    isManuallyClosed = false;
    defaultAction: string | undefined;
    timeout: number | undefined;
    timer: ReturnType<typeof setInterval> = undefined;
    timer: ReturnType<typeof setInterval> | undefined = undefined;
    defaultButtonComponent: ButtonComponent | undefined;

    onSubmit: (result: string | boolean) => void;
    onSubmit: (result: string | false) => void;

    constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number, onSubmit: (result: (typeof buttons)[number] | false) => void) {
    constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number | undefined, onSubmit: (result: (typeof buttons)[number] | false) => void) {
        super(plugin.app);
        this.plugin = plugin;
        this.title = title;
@@ -155,6 +156,7 @@ export class MessageBox extends Modal {
        this.timeout = timeout;
        if (this.timeout) {
            this.timer = setInterval(() => {
                if (this.timeout === undefined) return;
                this.timeout--;
                if (this.timeout < 0) {
                    if (this.timer) {
@@ -165,7 +167,7 @@ export class MessageBox extends Modal {
                    this.isManuallyClosed = true;
                    this.close();
                } else {
                    this.defaultButtonComponent.setButtonText(`( ${this.timeout} ) ${defaultAction}`);
                    this.defaultButtonComponent?.setButtonText(`( ${this.timeout} ) ${defaultAction}`);
                }
            }, 1000);
        }
@@ -173,16 +175,17 @@ export class MessageBox extends Modal {

    onOpen() {
        const { contentEl } = this;
        this.titleEl.setText(this.title);
        contentEl.addEventListener("click", () => {
            if (this.timer) {
                clearInterval(this.timer);
                this.timer = undefined;
            }
        })
        contentEl.createEl("h1", { text: this.title });
        const div = contentEl.createDiv();
        MarkdownRenderer.renderMarkdown(this.contentMd, div, "/", null);
        MarkdownRenderer.render(this.plugin.app, this.contentMd, div, "/", this.plugin);
        const buttonSetting = new Setting(contentEl);
        buttonSetting.controlEl.style.flexWrap = "wrap";
        for (const button of this.buttons) {
            buttonSetting.addButton((btn) => {
                btn
@@ -221,7 +224,7 @@ export class MessageBox extends Modal {
    }


export function confirmWithMessage(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction?: (typeof buttons)[number], timeout?: number): Promise<(typeof buttons)[number] | false> {
export function confirmWithMessage(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout?: number): Promise<(typeof buttons)[number] | false> {
    return new Promise((res) => {
        const dialog = new MessageBox(plugin, title, contentMd, buttons, defaultAction, timeout, (result) => res(result));
        dialog.open();
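
With the revised signature, defaultAction is now mandatory and timeout stays optional; when the timer lapses, the default button fires automatically. A usage sketch, where the titles, labels and 30-second timeout are illustrative:

import { Plugin } from "obsidian";

async function askOverwrite(plugin: Plugin): Promise<boolean> {
    const answer = await confirmWithMessage(
        plugin,
        "Overwrite?",                         // dialog title
        "A newer copy exists. Overwrite it?", // markdown body
        ["Overwrite", "Skip"],                // buttons
        "Skip",                               // defaultAction, chosen on timeout
        30                                    // timeout in seconds
    );
    return answer === "Overwrite";
}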
src/common/stores.ts (new file, 7 lines)
@@ -0,0 +1,7 @@
import { PersistentMap } from "../lib/src/dataobject/PersistentMap.ts";

export let sameChangePairs: PersistentMap<number[]>;

export function initializeStores(vaultName: string) {
    sameChangePairs = new PersistentMap<number[]>(`ls-persist-same-changes-${vaultName}`);
}
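
sameChangePairs starts out undefined, so initializeStores must run once, early, before any module touches the map; the vault name scopes the persisted data. A sketch of the call site; vault.getName() is the standard Obsidian accessor, while the wrapper function itself is hypothetical:

import { App } from "obsidian";
import { initializeStores } from "./common/stores";

// Call once, e.g. from the plugin's onload, so that importers of
// sameChangePairs never observe it undefined.
export function setupStores(app: App) {
    initializeStores(app.vault.getName());
}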
@@ -1,5 +1,5 @@
import { PluginManifest, TFile } from "./deps";
import { DatabaseEntry, EntryBody, FilePath } from "./lib/src/types";
import { type PluginManifest, TFile } from "../deps.ts";
import { type DatabaseEntry, type EntryBody, type FilePath } from "../lib/src/common/types.ts";

export interface PluginDataEntry extends DatabaseEntry {
    deviceVaultName: string;
src/common/utils.ts (new file, 467 lines)
@@ -0,0 +1,467 @@
|
||||
import { normalizePath, Platform, TAbstractFile, App, type RequestUrlParam, requestUrl, TFile } from "../deps.ts";
|
||||
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "../lib/src/string_and_binary/path.ts";
|
||||
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { LOG_LEVEL_VERBOSE, type AnyEntry, type DocumentID, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "../lib/src/common/types.ts";
|
||||
import { CHeader, ICHeader, ICHeaderLength, ICXHeader, PSCHeader } from "./types.ts";
|
||||
import { InputStringDialog, PopoverSelectString } from "./dialogs.ts";
|
||||
import type ObsidianLiveSyncPlugin from "../main.ts";
|
||||
import { writeString } from "../lib/src/string_and_binary/strbin.ts";
|
||||
import { fireAndForget } from "../lib/src/common/utils.ts";
|
||||
import { sameChangePairs } from "./stores.ts";
|
||||
|
||||
export { scheduleTask, setPeriodicTask, cancelTask, cancelAllTasks, cancelPeriodicTask, cancelAllPeriodicTask, } from "../lib/src/concurrency/task.ts";
|
||||
|
||||
// For backward compatibility, using the path for determining id.
|
||||
// Only CouchDB unacceptable ID (that starts with an underscore) has been prefixed with "/".
|
||||
// The first slash will be deleted when the path is normalized.
|
||||
export async function path2id(filename: FilePathWithPrefix | FilePath, obfuscatePassphrase: string | false): Promise<DocumentID> {
|
||||
const temp = filename.split(":");
|
||||
const path = temp.pop();
|
||||
const normalizedPath = normalizePath(path as FilePath);
|
||||
temp.push(normalizedPath);
|
||||
const fixedPath = temp.join(":") as FilePathWithPrefix;
|
||||
|
||||
const out = await path2id_base(fixedPath, obfuscatePassphrase);
|
||||
return out;
|
||||
}
|
||||
export function id2path(id: DocumentID, entry?: EntryHasPath): FilePathWithPrefix {
|
||||
const filename = id2path_base(id, entry);
|
||||
const temp = filename.split(":");
|
||||
const path = temp.pop();
|
||||
const normalizedPath = normalizePath(path as FilePath);
|
||||
temp.push(normalizedPath);
|
||||
const fixedPath = temp.join(":") as FilePathWithPrefix;
|
||||
return fixedPath;
|
||||
}
|
||||
export function getPath(entry: AnyEntry) {
|
||||
return id2path(entry._id, entry);
|
||||
|
||||
}
|
||||
export function getPathWithoutPrefix(entry: AnyEntry) {
|
||||
const f = getPath(entry);
|
||||
return stripAllPrefixes(f);
|
||||
}
|
||||
|
||||
export function getPathFromTFile(file: TAbstractFile) {
|
||||
return file.path as FilePath;
|
||||
}

const memos: { [key: string]: any } = {};
export function memoObject<T>(key: string, obj: T): T {
    memos[key] = obj;
    return memos[key] as T;
}
export async function memoIfNotExist<T>(key: string, func: () => T | Promise<T>): Promise<T> {
    if (!(key in memos)) {
        const w = func();
        const v = w instanceof Promise ? (await w) : w;
        memos[key] = v;
    }
    return memos[key] as T;
}
export function retrieveMemoObject<T>(key: string): T | false {
    if (key in memos) {
        return memos[key];
    } else {
        return false;
    }
}
export function disposeMemoObject(key: string) {
    delete memos[key];
}
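
// --- Editor's sketch (illustrative, not part of the commit) ---
// The memo helpers cache arbitrary objects per key; memoIfNotExist runs the
// loader only on a cache miss. The key and loader below are invented.
async function exampleMemo() {
    const v1 = await memoIfNotExist("expensive", () => Promise.resolve(Date.now()));
    const v2 = await memoIfNotExist("expensive", () => Promise.resolve(-1)); // cached: equals v1
    const peeked = retrieveMemoObject<number>("expensive"); // the cached value, or false when absent
    disposeMemoObject("expensive"); // the next memoIfNotExist recomputes
    console.log(v1 === v2, peeked);
}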

export function isSensibleMargeApplicable(path: string) {
    if (path.endsWith(".md")) return true;
    return false;
}
export function isObjectMargeApplicable(path: string) {
    if (path.endsWith(".canvas")) return true;
    if (path.endsWith(".json")) return true;
    return false;
}

export function tryParseJSON(str: string, fallbackValue?: any) {
    try {
        return JSON.parse(str);
    } catch (ex) {
        return fallbackValue;
    }
}

const MARK_OPERATOR = `\u{0001}`;
const MARK_DELETED = `${MARK_OPERATOR}__DELETED`;
const MARK_ISARRAY = `${MARK_OPERATOR}__ARRAY`;
const MARK_SWAPPED = `${MARK_OPERATOR}__SWAP`;

function unorderedArrayToObject(obj: Array<any>) {
    return obj.map(e => ({ [e.id as string]: e })).reduce((p, c) => ({ ...p, ...c }), {});
}
function objectToUnorderedArray(obj: object) {
    const entries = Object.entries(obj);
    if (entries.some(e => e[0] != e[1]?.id)) throw new Error("Item does not look like an unordered array");
    return entries.map(e => e[1]);
}
function generatePatchUnorderedArray(from: Array<any>, to: Array<any>) {
    if (from.every(e => typeof (e) == "object" && ("id" in e)) && to.every(e => typeof (e) == "object" && ("id" in e))) {
        const fObj = unorderedArrayToObject(from);
        const tObj = unorderedArrayToObject(to);
        const diff = generatePatchObj(fObj, tObj);
        if (Object.keys(diff).length > 0) {
            return { [MARK_ISARRAY]: diff };
        } else {
            return {};
        }
    }
    return { [MARK_SWAPPED]: to };
}

export function generatePatchObj(from: Record<string | number | symbol, any>, to: Record<string | number | symbol, any>) {
    const entries = Object.entries(from);
    const tempMap = new Map<string | number | symbol, any>(entries);
    const ret = {} as Record<string | number | symbol, any>;
    const newEntries = Object.entries(to);
    for (const [key, value] of newEntries) {
        if (!tempMap.has(key)) {
            // New entry
            ret[key] = value;
            tempMap.delete(key);
        } else {
            // Already exists
            const v = tempMap.get(key);
            if (typeof (v) !== typeof (value) || (Array.isArray(v) !== Array.isArray(value))) {
                // If the types do not match, replace completely.
                ret[key] = { [MARK_SWAPPED]: value };
            } else {
                if (typeof (v) == "object" && typeof (value) == "object" && !Array.isArray(v) && !Array.isArray(value)) {
                    const wk = generatePatchObj(v, value);
                    if (Object.keys(wk).length > 0) ret[key] = wk;
                } else if (typeof (v) == "object" && typeof (value) == "object" && Array.isArray(v) && Array.isArray(value)) {
                    const wk = generatePatchUnorderedArray(v, value);
                    if (Object.keys(wk).length > 0) ret[key] = wk;
                } else if (typeof (v) != "object" && typeof (value) != "object") {
                    if (JSON.stringify(tempMap.get(key)) !== JSON.stringify(value)) {
                        ret[key] = value;
                    }
                } else {
                    if (JSON.stringify(tempMap.get(key)) !== JSON.stringify(value)) {
                        ret[key] = { [MARK_SWAPPED]: value };
                    }
                }
            }
            tempMap.delete(key);
        }
    }
    // Items left in the map were not used, which means they have been deleted.
    for (const [key,] of tempMap) {
        ret[key] = MARK_DELETED;
    }
    return ret;
}

export function applyPatch(from: Record<string | number | symbol, any>, patch: Record<string | number | symbol, any>) {
    const ret = from;
    const patches = Object.entries(patch);
    for (const [key, value] of patches) {
        if (value == MARK_DELETED) {
            delete ret[key];
            continue;
        }
        if (typeof (value) == "object") {
            if (MARK_SWAPPED in value) {
                ret[key] = value[MARK_SWAPPED];
                continue;
            }
            if (MARK_ISARRAY in value) {
                if (!(key in ret)) ret[key] = [];
                if (!Array.isArray(ret[key])) {
                    throw new Error("Patch target type is mismatched (array to something)");
                }
                const orgArrayObject = unorderedArrayToObject(ret[key]);
                const appliedObject = applyPatch(orgArrayObject, value[MARK_ISARRAY]);
                const appliedArray = objectToUnorderedArray(appliedObject);
                ret[key] = [...appliedArray];
            } else {
                if (!(key in ret)) {
                    ret[key] = value;
                    continue;
                }
                ret[key] = applyPatch(ret[key], value);
            }
        } else {
            ret[key] = value;
        }
    }
    return ret;
}
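
// --- Editor's sketch (illustrative, not part of the commit) ---
// A patch round-trip over JSON-like data (for example a .canvas file). Arrays of
// objects carrying an `id` are diffed as unordered sets (the MARK_ISARRAY case);
// note that applyPatch mutates its first argument. Sample data is invented.
function examplePatchRoundTrip() {
    const before = { title: "note", nodes: [{ id: "a", x: 0 }, { id: "b", x: 1 }] };
    const after = { title: "note!", nodes: [{ id: "a", x: 2 }, { id: "b", x: 1 }] };
    const patch = generatePatchObj(before, after);
    // patch == { title: "note!", nodes: { [MARK_ISARRAY]: { a: { x: 2 } } } }
    const applied = applyPatch(before, patch);
    console.log(JSON.stringify(applied) === JSON.stringify(after)); // true
}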

export function mergeObject(
    objA: Record<string | number | symbol, any> | [any],
    objB: Record<string | number | symbol, any> | [any]
) {
    const newEntries = Object.entries(objB);
    const ret: any = { ...objA };
    if (
        typeof objA !== typeof objB ||
        Array.isArray(objA) !== Array.isArray(objB)
    ) {
        return objB;
    }

    for (const [key, v] of newEntries) {
        if (key in ret) {
            const value = ret[key];
            if (
                typeof v !== typeof value ||
                Array.isArray(v) !== Array.isArray(value)
            ) {
                // If the types do not match, replace completely.
                ret[key] = v;
            } else {
                if (
                    typeof v == "object" &&
                    typeof value == "object" &&
                    !Array.isArray(v) &&
                    !Array.isArray(value)
                ) {
                    ret[key] = mergeObject(v, value);
                } else if (
                    typeof v == "object" &&
                    typeof value == "object" &&
                    Array.isArray(v) &&
                    Array.isArray(value)
                ) {
                    ret[key] = [...new Set([...v, ...value])];
                } else {
                    ret[key] = v;
                }
            }
        } else {
            ret[key] = v;
        }
    }
    const retSorted = Object.fromEntries(Object.entries(ret).sort((a, b) => a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0));
    if (Array.isArray(objA) && Array.isArray(objB)) {
        return Object.values(retSorted);
    }
    return retSorted;
}
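
// --- Editor's sketch (illustrative, not part of the commit) ---
// mergeObject merges keys recursively, unions plain arrays via Set, and returns
// the result with keys sorted. The sample settings objects are invented.
function exampleMerge() {
    const a = { tags: ["daily"], view: { mode: "source" } };
    const b = { tags: ["work"], view: { line: 10 } };
    const merged = mergeObject(a, b);
    // => { tags: ["daily", "work"], view: { line: 10, mode: "source" } }
    console.log(merged);
}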

export function flattenObject(obj: Record<string | number | symbol, any>, path: string[] = []): [string, any][] {
    if (typeof (obj) != "object") return [[path.join("."), obj]];
    if (Array.isArray(obj)) return [[path.join("."), JSON.stringify(obj)]];
    const e = Object.entries(obj);
    const ret = [] as [string, any][];
    for (const [key, value] of e) {
        const p = flattenObject(value, [...path, key]);
        ret.push(...p);
    }
    return ret;
}
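
// --- Editor's sketch (illustrative, not part of the commit) ---
// flattenObject turns nested keys into dotted paths; arrays are kept whole as
// JSON strings. The sample object is invented.
function exampleFlatten() {
    const entries = flattenObject({ editor: { spellcheck: true }, plugins: ["a", "b"] });
    // => [["editor.spellcheck", true], ["plugins", "[\"a\",\"b\"]"]]
    console.log(entries);
}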

export function isValidPath(filename: string) {
    if (Platform.isDesktop) {
        // if (Platform.isMacOS) return isValidFilenameInDarwin(filename);
        if (process.platform == "darwin") return isValidFilenameInDarwin(filename);
        if (process.platform == "linux") return isValidFilenameInLinux(filename);
        return isValidFilenameInWidows(filename);
    }
    if (Platform.isAndroidApp) return isValidFilenameInAndroid(filename);
    if (Platform.isIosApp) return isValidFilenameInDarwin(filename);
    // Fallback
    Logger("Could not determine platform for checking filename", LOG_LEVEL_VERBOSE);
    return isValidFilenameInWidows(filename);
}

export function trimPrefix(target: string, prefix: string) {
    return target.startsWith(prefix) ? target.substring(prefix.length) : target;
}

/**
 * Checks whether the ID points to an internal (hidden) file entry.
 * @param id the ID to inspect
 * @returns true when the ID carries the internal-file prefix
 */
export function isInternalMetadata(id: FilePath | FilePathWithPrefix | DocumentID): boolean {
    return id.startsWith(ICHeader);
}
export function stripInternalMetadataPrefix<T extends FilePath | FilePathWithPrefix | DocumentID>(id: T): T {
    return id.substring(ICHeaderLength) as T;
}
export function id2InternalMetadataId(id: DocumentID): DocumentID {
    return ICHeader + id as DocumentID;
}

// const CHeaderLength = CHeader.length;
export function isChunk(str: string): boolean {
    return str.startsWith(CHeader);
}

export function isPluginMetadata(str: string): boolean {
    return str.startsWith(PSCHeader);
}
export function isCustomisationSyncMetadata(str: string): boolean {
    return str.startsWith(ICXHeader);
}

export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
    return new Promise((res) => {
        const popover = new PopoverSelectString(app, message, undefined, undefined, (result) => res(result as "yes" | "no"));
        popover.open();
    });
};

export const askSelectString = (app: App, message: string, items: string[]): Promise<string> => {
    const getItemsFun = () => items;
    return new Promise((res) => {
        const popover = new PopoverSelectString(app, message, "", getItemsFun, (result) => res(result));
        popover.open();
    });
};

export const askString = (app: App, title: string, key: string, placeholder: string, isPassword: boolean = false): Promise<string | false> => {
    return new Promise((res) => {
        const dialog = new InputStringDialog(app, title, key, placeholder, isPassword, (result) => res(result));
        dialog.open();
    });
};

export class PeriodicProcessor {
    _process: () => Promise<any>;
    _timer?: number;
    _plugin: ObsidianLiveSyncPlugin;
    constructor(plugin: ObsidianLiveSyncPlugin, process: () => Promise<any>) {
        this._plugin = plugin;
        this._process = process;
    }
    async process() {
        try {
            await this._process();
        } catch (ex) {
            Logger(ex);
        }
    }
    enable(interval: number) {
        this.disable();
        if (interval == 0) return;
        this._timer = window.setInterval(() => fireAndForget(async () => {
            await this.process();
            if (this._plugin._unloaded) {
                this.disable();
            }
        }), interval);
        this._plugin.registerInterval(this._timer);
    }
    disable() {
        if (this._timer !== undefined) {
            window.clearInterval(this._timer);
            this._timer = undefined;
        }
    }
}
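
// --- Editor's sketch (illustrative, not part of the commit) ---
// PeriodicProcessor runs a task on an interval, swallows and logs errors, and
// stops itself once the plugin unloads; enable(0) merely cancels. The interval
// below is an arbitrary example value.
function examplePeriodic(plugin: ObsidianLiveSyncPlugin) {
    const proc = new PeriodicProcessor(plugin, async () => {
        Logger("periodic tick", LOG_LEVEL_VERBOSE);
    });
    proc.enable(60 * 1000); // run every minute
    // later: proc.disable();
}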

export const _requestToCouchDBFetch = async (baseUri: string, username: string, password: string, path?: string, body?: string | any, method?: string) => {
    const utf8str = String.fromCharCode.apply(null, [...writeString(`${username}:${password}`)]);
    const encoded = window.btoa(utf8str);
    const authHeader = "Basic " + encoded;
    const transformedHeaders: Record<string, string> = { authorization: authHeader, "content-type": "application/json" };
    const uri = `${baseUri}/${path}`;
    const requestParam = {
        url: uri,
        method: method || (body ? "PUT" : "GET"),
        headers: new Headers(transformedHeaders),
        contentType: "application/json",
        body: JSON.stringify(body),
    };
    return await fetch(uri, requestParam);
};

export const _requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, path?: string, body?: any, method?: string) => {
    const utf8str = String.fromCharCode.apply(null, [...writeString(`${username}:${password}`)]);
    const encoded = window.btoa(utf8str);
    const authHeader = "Basic " + encoded;
    const transformedHeaders: Record<string, string> = { authorization: authHeader, origin: origin };
    const uri = `${baseUri}/${path}`;
    const requestParam: RequestUrlParam = {
        url: uri,
        method: method || (body ? "PUT" : "GET"),
        headers: transformedHeaders,
        contentType: "application/json",
        body: body ? JSON.stringify(body) : undefined,
    };
    return await requestUrl(requestParam);
};
export const requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string = "", key?: string, body?: string, method?: string) => {
    const uri = `_node/_local/_config${key ? "/" + key : ""}`;
    return await _requestToCouchDB(baseUri, username, password, origin, uri, body, method);
};
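
// --- Editor's sketch (illustrative, not part of the commit) ---
// requestToCouchDB always targets the node-local `_node/_local/_config` endpoint;
// without a body it issues a GET, with one a PUT. The server URI, credentials,
// and configuration key below are placeholders.
async function exampleCouchConfig() {
    const res = await requestToCouchDB("https://couchdb.example.com", "admin", "password", "", "chttpd/require_valid_user");
    console.log(res.status, res.text); // e.g. 200 and the current setting
}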

export async function performRebuildDB(plugin: ObsidianLiveSyncPlugin, method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice" | "localOnlyWithChunks") {
    if (method == "localOnly") {
        await plugin.addOnSetup.fetchLocal();
    }
    if (method == "localOnlyWithChunks") {
        await plugin.addOnSetup.fetchLocal(true);
    }
    if (method == "remoteOnly") {
        await plugin.addOnSetup.rebuildRemote();
    }
    if (method == "rebuildBothByThisDevice") {
        await plugin.addOnSetup.rebuildEverything();
    }
}

export const BASE_IS_NEW = Symbol("base");
export const TARGET_IS_NEW = Symbol("target");
export const EVEN = Symbol("even");

// Why 2000? The ZIP file format does not have enough timestamp resolution (2-second granularity).
const resolution = 2000;
export function compareMTime(baseMTime: number, targetMTime: number): typeof BASE_IS_NEW | typeof TARGET_IS_NEW | typeof EVEN {
    const truncatedBaseMTime = (~~(baseMTime / resolution)) * resolution;
    const truncatedTargetMTime = (~~(targetMTime / resolution)) * resolution;
    // Logger(`Resolution MTime ${truncatedBaseMTime} and ${truncatedTargetMTime} `, LOG_LEVEL_VERBOSE);
    if (truncatedBaseMTime == truncatedTargetMTime) return EVEN;
    if (truncatedBaseMTime > truncatedTargetMTime) return BASE_IS_NEW;
    if (truncatedBaseMTime < truncatedTargetMTime) return TARGET_IS_NEW;
    throw new Error("Unexpected error");
}
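
// --- Editor's sketch (illustrative, not part of the commit) ---
// Because both mtimes are truncated to the same 2000 ms bucket before comparing,
// timestamps differing by less than the ZIP resolution count as EVEN. Sample
// epoch values are invented.
function exampleCompareMTime() {
    console.log(compareMTime(1700000000500, 1700000001900) === EVEN);        // same 2 s bucket
    console.log(compareMTime(1700000002100, 1700000001900) === BASE_IS_NEW); // next bucket
}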

export function markChangesAreSame(file: TFile | AnyEntry | string, mtime1: number, mtime2: number) {
    if (mtime1 === mtime2) return true;
    const key = typeof file == "string" ? file : file instanceof TFile ? file.path : file.path ?? file._id;
    const pairs = sameChangePairs.get(key, []) || [];
    if (pairs.some(e => e == mtime1 || e == mtime2)) {
        sameChangePairs.set(key, [...new Set([...pairs, mtime1, mtime2])]);
    } else {
        sameChangePairs.set(key, [mtime1, mtime2]);
    }
}
export function isMarkedAsSameChanges(file: TFile | AnyEntry | string, mtimes: number[]) {
    const key = typeof file == "string" ? file : file instanceof TFile ? file.path : file.path ?? file._id;
    const pairs = sameChangePairs.get(key, []) || [];
    if (mtimes.every(e => pairs.indexOf(e) !== -1)) {
        return EVEN;
    }
}
export function compareFileFreshness(baseFile: TFile | AnyEntry | undefined, checkTarget: TFile | AnyEntry | undefined): typeof BASE_IS_NEW | typeof TARGET_IS_NEW | typeof EVEN {
    if (baseFile === undefined && checkTarget == undefined) return EVEN;
    if (baseFile == undefined) return TARGET_IS_NEW;
    if (checkTarget == undefined) return BASE_IS_NEW;

    const modifiedBase = baseFile instanceof TFile ? baseFile?.stat?.mtime ?? 0 : baseFile?.mtime ?? 0;
    const modifiedTarget = checkTarget instanceof TFile ? checkTarget?.stat?.mtime ?? 0 : checkTarget?.mtime ?? 0;

    if (modifiedBase && modifiedTarget && isMarkedAsSameChanges(baseFile, [modifiedBase, modifiedTarget])) {
        return EVEN;
    }
    return compareMTime(modifiedBase, modifiedTarget);
}
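
// --- Editor's sketch (illustrative, not part of the commit) ---
// Once a storage/database pair of mtimes has been marked as "the same change",
// compareFileFreshness reports EVEN for them instead of triggering a re-sync.
function exampleFreshness(storageFile: TFile, dbEntry: AnyEntry) {
    markChangesAreSame(storageFile, storageFile.stat.mtime, dbEntry.mtime);
    console.log(compareFileFreshness(storageFile, dbEntry) === EVEN); // true once marked
}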

@@ -1,12 +1,13 @@
-import { FilePath } from "./lib/src/types";
+import { type FilePath } from "./lib/src/common/types.ts";

 export {
-    addIcon, App, DataWriteOptions, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginManifest,
-    PluginSettingTab, Plugin_2, requestUrl, RequestUrlParam, RequestUrlResponse, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
-    parseYaml
+    addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
+    parseYaml, ItemView, WorkspaceLeaf
 } from "obsidian";
+export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse, MarkdownFileInfo, ListedFiles } from "obsidian";
 import {
     normalizePath as normalizePath_
 } from "obsidian";
 const normalizePath = normalizePath_ as <T extends string | FilePath>(from: T) => T;
 export { normalizePath }
 export { type Diff, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "diff-match-patch";

@@ -1,39 +1,149 @@
 import { writable } from 'svelte/store';
-import { Notice, PluginManifest, stringifyYaml, parseYaml } from "./deps";
+import { Notice, type PluginManifest, parseYaml, normalizePath, type ListedFiles } from "../deps.ts";

-import { EntryDoc, LoadedEntry, LOG_LEVEL, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID } from "./lib/src/types";
-import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
-import { delay, getDocData } from "./lib/src/utils";
-import { Logger } from "./lib/src/logger";
-import { PouchDB } from "./lib/src/pouchdb-browser.js";
-import { WrappedNotice } from "./lib/src/wrapper";
-import { base64ToArrayBuffer, arrayBufferToBase64, readString, writeString, uint8ArrayToHexString } from "./lib/src/strbin";
-import { runWithLock } from "./lib/src/lock";
-import { LiveSyncCommands } from "./LiveSyncCommands";
-import { stripAllPrefixes } from "./lib/src/path";
-import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "./utils";
-import { Semaphore } from "./lib/src/semaphore";
-import { PluginDialogModal } from "./dialogs";
-import { JsonResolveModal } from "./JsonResolveModal";
+import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "../lib/src/common/types.ts";
+import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "../lib/src/common/types.ts";
+import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "../common/types.ts";
+import { createSavingEntryFromLoadedEntry, createTextBlob, delay, fireAndForget, getDocData, isDocContentSame, throttle } from "../lib/src/common/utils.ts";
+import { Logger } from "../lib/src/common/logger.ts";
+import { readString, decodeBinary, arrayBufferToBase64, digestHash } from "../lib/src/string_and_binary/strbin.ts";
+import { serialized } from "../lib/src/concurrency/lock.ts";
+import { LiveSyncCommands } from "./LiveSyncCommands.ts";
+import { stripAllPrefixes } from "../lib/src/string_and_binary/path.ts";
+import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "../common/utils.ts";
+import { PluginDialogModal } from "../common/dialogs.ts";
+import { JsonResolveModal } from "../ui/JsonResolveModal.ts";
+import { QueueProcessor } from '../lib/src/concurrency/processor.ts';
+import { pluginScanningCount } from '../lib/src/mock_and_interop/stores.ts';
+import type ObsidianLiveSyncPlugin from '../main.ts';

const d = "\u200b";
const d2 = "\n";

function serialize(data: PluginDataEx): string {
    // For higher performance, plug-in data is serialised into a custom string format instead of YAML.
    // Self-hosted LiveSync splits chunks on `\n`, so grouping fields with similar entropy together packs nicely.
    let ret = "";
    ret += ":";
    ret += data.category + d + data.name + d + data.term + d2;
    ret += (data.version ?? "") + d2;
    ret += data.mtime + d2;
    for (const file of data.files) {
        ret += file.filename + d + (file.displayName ?? "") + d + (file.version ?? "") + d2;
        const hash = digestHash((file.data ?? []).join());
        ret += file.mtime + d + file.size + d + hash + d2;
        for (const data of file.data ?? []) {
            ret += data + d;
        }
        ret += d2;
    }
    return ret;
}
function fetchToken(source: string, from: number): [next: number, token: string] {
    const limitIdx = source.indexOf(d2, from);
    const limit = limitIdx == -1 ? source.length : limitIdx;
    const delimiterIdx = source.indexOf(d, from);
    const delimiter = delimiterIdx == -1 ? source.length : delimiterIdx;
    const tokenEnd = Math.min(limit, delimiter);
    let next = tokenEnd;
    if (limit < delimiter) {
        next = tokenEnd;
    } else {
        next = tokenEnd + 1;
    }
    return [next, source.substring(from, tokenEnd)];
}
function getTokenizer(source: string) {
    const t = {
        pos: 1,
        next() {
            const [next, token] = fetchToken(source, this.pos);
            this.pos = next;
            return token;
        },
        nextLine() {
            const nextPos = source.indexOf(d2, this.pos);
            if (nextPos == -1) {
                this.pos = source.length;
            } else {
                this.pos = nextPos + 1;
            }
        }
    };
    return t;
}
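
// --- Editor's sketch (illustrative, not part of the commit) ---
// The custom format joins fields with U+200B (`d`) and lines with "\n" (`d2`);
// getTokenizer starts at pos 1 to skip the leading ":" sentinel. The sample
// payload fragment is invented.
function exampleTokenizer() {
    const t = getTokenizer(":CONFIG\u200bapp\u200bdevice-1\n1.0.0\n");
    console.log(t.next()); // "CONFIG"   (category)
    console.log(t.next()); // "app"      (name)
    console.log(t.next()); // "device-1" (term)
    t.nextLine();
    console.log(t.next()); // "1.0.0"    (version)
}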

function deserialize2(str: string): PluginDataEx {
    const tokens = getTokenizer(str);
    const ret = {} as PluginDataEx;
    const category = tokens.next();
    const name = tokens.next();
    const term = tokens.next();
    tokens.nextLine();
    const version = tokens.next();
    tokens.nextLine();
    const mtime = Number(tokens.next());
    tokens.nextLine();
    const result: PluginDataEx = Object.assign(ret,
        { category, name, term, version, mtime, files: [] as PluginDataExFile[] });
    let filename = "";
    do {
        filename = tokens.next();
        if (!filename) break;
        const displayName = tokens.next();
        const version = tokens.next();
        tokens.nextLine();
        const mtime = Number(tokens.next());
        const size = Number(tokens.next());
        const hash = tokens.next();
        tokens.nextLine();
        const data = [] as string[];
        let piece = "";
        do {
            piece = tokens.next();
            if (piece == "") break;
            data.push(piece);
        } while (piece != "");
        result.files.push(
            {
                filename,
                displayName,
                version,
                mtime,
                size,
                data,
                hash
            }
        );
        tokens.nextLine();
    } while (filename);
    return result;
}

function deserialize<T>(str: string, def: T) {
    try {
        if (str[0] == ":") return deserialize2(str);
        return JSON.parse(str) as T;
    } catch (ex) {
        try {
            return parseYaml(str);
        } catch (ex) {
            return def;
        }
    }
}

 export const pluginList = writable([] as PluginDataExDisplay[]);
 export const pluginIsEnumerating = writable(false);

-const hashString = (async (key: string) => {
-    const buff = writeString(key);
-    const digest = await crypto.subtle.digest('SHA-256', buff);
-    return uint8ArrayToHexString(new Uint8Array(digest));
-})

 export type PluginDataExFile = {
     filename: string,
-    data: string[],
+    data?: string[],
     mtime: number,
     size: number,
     version?: string,
+    hash?: string,
     displayName?: string,
 }
 export type PluginDataExDisplay = {
@@ -57,14 +167,21 @@ export type PluginDataEx = {
     mtime: number,
 };
 export class ConfigSync extends LiveSyncCommands {
-    confirmPopup: WrappedNotice = null;
+    constructor(plugin: ObsidianLiveSyncPlugin) {
+        super(plugin);
+        pluginScanningCount.onChanged((e) => {
+            const total = e.value;
+            pluginIsEnumerating.set(total != 0);
+            // if (total == 0) {
+            //     Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
+            // }
+        })
+    }
     get kvDB() {
         return this.plugin.kvDB;
     }
     ensureDirectoryEx(fullPath: string) {
         return this.plugin.ensureDirectoryEx(fullPath);
     }
-    pluginDialog: PluginDialogModal = null;
+    pluginDialog?: PluginDialogModal = undefined;
     periodicPluginSweepProcessor = new PeriodicProcessor(this.plugin, async () => await this.scanAllConfigFiles(false));

     pluginList: PluginDataExDisplay[] = [];
@@ -72,7 +189,7 @@ export class ConfigSync extends LiveSyncCommands {
         if (!this.settings.usePluginSync) {
             return;
         }
-        if (this.pluginDialog != null) {
+        if (this.pluginDialog) {
             this.pluginDialog.open();
         } else {
             this.pluginDialog = new PluginDialogModal(this.app, this.plugin);
@@ -83,7 +200,7 @@ export class ConfigSync extends LiveSyncCommands {
     hidePluginSyncModal() {
         if (this.pluginDialog != null) {
             this.pluginDialog.close();
-            this.pluginDialog = null;
+            this.pluginDialog = undefined;
         }
     }
     onunload() {
@@ -99,6 +216,7 @@ export class ConfigSync extends LiveSyncCommands {
             },
         });
     }

     getFileCategory(filePath: string): "CONFIG" | "THEME" | "SNIPPET" | "PLUGIN_MAIN" | "PLUGIN_ETC" | "PLUGIN_DATA" | "" {
         if (filePath.split("/").length == 2 && filePath.endsWith(".json")) return "CONFIG";
         if (filePath.split("/").length == 4 && filePath.startsWith(`${this.app.vault.configDir}/themes/`)) return "THEME";
@@ -131,7 +249,7 @@ export class ConfigSync extends LiveSyncCommands {
             Logger("Scanning customizations : done");
         } catch (ex) {
             Logger("Scanning customizations : failed");
-            Logger(ex, LOG_LEVEL.VERBOSE);
+            Logger(ex, LOG_LEVEL_VERBOSE);
         }

     }
@@ -156,84 +274,109 @@ export class ConfigSync extends LiveSyncCommands {
         pluginList.set(this.pluginList)
         await this.updatePluginList(showMessage);
     }
     async loadPluginData(path: FilePathWithPrefix): Promise<PluginDataExDisplay | false> {
         const wx = await this.localDatabase.getDBEntry(path, undefined, false, false);
         if (wx) {
             const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
             const xFiles = [] as PluginDataExFile[];
             let missingHash = false;
             for (const file of data.files) {
                 const work = { ...file, data: [] as string[] };
                 if (!file.hash) {
                     // debugger;
                     const tempStr = getDocData(work.data);
                     const hash = digestHash(tempStr);
                     file.hash = hash;
                     missingHash = true;
                 }
                 work.data = [file.hash];
                 xFiles.push(work);
             }
             if (missingHash) {
                 Logger(`Digest created for ${path} to improve checking`, LOG_LEVEL_VERBOSE);
                 wx.data = serialize(data);
                 fireAndForget(() => this.localDatabase.putDBEntry(createSavingEntryFromLoadedEntry(wx)));
             }
             return ({
                 ...data,
                 documentPath: this.getPath(wx),
                 files: xFiles
             }) as PluginDataExDisplay;
         }
         return false;
     }
     createMissingConfigurationEntry = throttle(() => this._createMissingConfigurationEntry(), 1000);
     _createMissingConfigurationEntry() {
         let saveRequired = false;
         for (const v of this.pluginList) {
             const key = `${v.category}/${v.name}`;
             if (!(key in this.plugin.settings.pluginSyncExtendedSetting)) {
                 this.plugin.settings.pluginSyncExtendedSetting[key] = {
                     key,
                     mode: MODE_SELECTIVE,
                     files: []
                 }
             }
             if (this.plugin.settings.pluginSyncExtendedSetting[key].files.sort().join(",").toLowerCase() !=
                 v.files.map(e => e.filename).sort().join(",").toLowerCase()) {
                 this.plugin.settings.pluginSyncExtendedSetting[key].files = v.files.map(e => e.filename).sort();
                 saveRequired = true;
             }
         }
         if (saveRequired) {
             this.plugin.saveSettingData();
         }
     }

     pluginScanProcessor = new QueueProcessor(async (v: AnyEntry[]) => {
         const plugin = v[0];
         const path = plugin.path || this.getPath(plugin);
         const oldEntry = (this.pluginList.find(e => e.documentPath == path));
         if (oldEntry && oldEntry.mtime == plugin.mtime) return [];
         try {
             const pluginData = await this.loadPluginData(path);
             if (pluginData) {
                 let newList = [...this.pluginList];
                 newList = newList.filter(x => x.documentPath != pluginData.documentPath);
                 newList.push(pluginData);
                 this.pluginList = newList;
                 pluginList.set(newList);
             }
             // Failed to load
             return [];

         } catch (ex) {
             Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
             Logger(ex, LOG_LEVEL_VERBOSE);
         }
         return [];
     }, { suspended: false, batchSize: 1, concurrentLimit: 10, delay: 100, yieldThreshold: 10, maintainDelay: false, totalRemainingReactiveSource: pluginScanningCount }).startPipeline().root.onUpdateProgress(() => {
         this.createMissingConfigurationEntry();
     });

     async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
-        const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
         // pluginList.set([]);
+        if (!this.settings.usePluginSync) {
+            this.pluginScanProcessor.clearQueue();
+            this.pluginList = [];
+            pluginList.set(this.pluginList)
+            return;
+        }
-        await runWithLock("update-plugin-list", false, async () => {
-            // if (updatedDocumentPath != "") pluginList.update(e => e.filter(ee => ee.documentPath != updatedDocumentPath));
-            // const work: Record<string, Record<string, Record<string, Record<string, PluginDataEntryEx>>>> = {};
-            const entries = [] as PluginDataExDisplay[]
-            const plugins = this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
-            const semaphore = Semaphore(4);
-            const processes = [] as Promise<void>[];
-            let count = 0;
-            pluginIsEnumerating.set(true);
-            let processed = false;
-            try {
-                for await (const plugin of plugins) {
-                    const path = plugin.path || this.getPath(plugin);
-                    if (updatedDocumentPath && updatedDocumentPath != path) {
-                        continue;
-                    }
-                    processed = true;
-                    const oldEntry = (this.pluginList.find(e => e.documentPath == path));
-                    if (oldEntry && oldEntry.mtime == plugin.mtime) continue;
-                    processes.push((async (v) => {
-                        const release = await semaphore.acquire(1);
-                        try {
-                            Logger(`Enumerating files... ${count++}`, logLevel, "get-plugins");
-                            Logger(`plugin-${path}`, LOG_LEVEL.VERBOSE);
-                            const wx = await this.localDatabase.getDBEntry(path, null, false, false);
-                            if (wx) {
-                                const data = parseYaml(getDocData(wx.data)) as PluginDataEx;
-                                const xFiles = [] as PluginDataExFile[];
-                                for (const file of data.files) {
-                                    const work = { ...file };
-                                    const tempStr = getDocData(work.data);
-                                    work.data = [await hashString(tempStr)];
-                                    xFiles.push(work);
-                                }
-                                entries.push({
-                                    ...data,
-                                    documentPath: this.getPath(wx),
-                                    files: xFiles
-                                });
-                            }
-                        } catch (ex) {
-                            //TODO
-                            Logger(`Something happened at enumerating customization :${v.path}`, LOG_LEVEL.NOTICE);
-                            console.warn(ex);
-                        } finally {
-                            release();
-                        }
-                    })(plugin));
-                }
-                await Promise.all(processes);
-                let newList = [...this.pluginList];
-                for (const item of entries) {
-                    newList = newList.filter(x => x.documentPath != item.documentPath);
-                    newList.push(item)
-                }
-                if (updatedDocumentPath != "" && !processed) newList = newList.filter(e => e.documentPath != updatedDocumentPath);
-                this.pluginList = newList;
-                pluginList.set(newList);
-                Logger(`All files enumerated`, logLevel, "get-plugins");
-            } finally {
-                pluginIsEnumerating.set(false);
-            }
-        });
+        try {
+            const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
+            const plugins = updatedDocumentPath ?
+                this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
+                this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
+            for await (const v of plugins) {
+                const path = v.path || this.getPath(v);
+                if (updatedDocumentPath && updatedDocumentPath != path) continue;
+                this.pluginScanProcessor.enqueue(v);
+            }
+        } finally {
+            pluginIsEnumerating.set(false);
+        }
+        pluginIsEnumerating.set(false);
         // return entries;
     }
     async compareUsingDisplayData(dataA: PluginDataExDisplay, dataB: PluginDataExDisplay) {
@@ -241,9 +384,9 @@ export class ConfigSync extends LiveSyncCommands {
         const docB = await this.localDatabase.getDBEntry(dataB.documentPath);

         if (docA && docB) {
-            const pluginDataA = parseYaml(getDocData(docA.data)) as PluginDataEx;
+            const pluginDataA = deserialize(getDocData(docA.data), {}) as PluginDataEx;
             pluginDataA.documentPath = dataA.documentPath;
-            const pluginDataB = parseYaml(getDocData(docB.data)) as PluginDataEx;
+            const pluginDataB = deserialize(getDocData(docB.data), {}) as PluginDataEx;
             pluginDataB.documentPath = dataB.documentPath;

             // Use outer structure to wrap each data.
@@ -255,9 +398,9 @@ export class ConfigSync extends LiveSyncCommands {
     showJSONMergeDialogAndMerge(docA: LoadedEntry, docB: LoadedEntry, pluginDataA: PluginDataEx, pluginDataB: PluginDataEx): Promise<boolean> {
         const fileA = { ...pluginDataA.files[0], ctime: pluginDataA.files[0].mtime, _id: `${pluginDataA.documentPath}` as DocumentID };
         const fileB = pluginDataB.files[0];
-        const docAx = { ...docA, ...fileA } as LoadedEntry, docBx = { ...docB, ...fileB } as LoadedEntry
-        return runWithLock("config:merge-data", false, () => new Promise((res) => {
-            Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
+        const docAx = { ...docA, ...fileA, datatype: "newnote" } as LoadedEntry, docBx = { ...docB, ...fileB, datatype: "newnote" } as LoadedEntry
+        return serialized("config:merge-data", () => new Promise((res) => {
+            Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
             // const docs = [docA, docB];
             const path = stripAllPrefixes(docAx.path.split("/").slice(-1).join("/") as FilePath);
             const modal = new JsonResolveModal(this.app, path, [docAx, docBx], async (keep, result) => {
@@ -266,7 +409,7 @@ export class ConfigSync extends LiveSyncCommands {
                     res(await this.applyData(pluginDataA, result));
                 } catch (ex) {
                     Logger("Could not apply merged file");
-                    Logger(ex, LOG_LEVEL.VERBOSE);
+                    Logger(ex, LOG_LEVEL_VERBOSE);
                     res(false);
                 }
             }, "📡", "🛰️", "B");
@@ -282,24 +425,24 @@ export class ConfigSync extends LiveSyncCommands {
             if (dx == false) {
                 throw "Not found on database"
             }
-            const loadedData = parseYaml(getDocData(dx.data)) as PluginDataEx;
+            const loadedData = deserialize(getDocData(dx.data), {}) as PluginDataEx;
             for (const f of loadedData.files) {
                 Logger(`Applying ${f.filename} of ${data.displayName || data.name}..`);
                 try {
                     // console.dir(f);
                     const path = `${baseDir}/${f.filename}`;
-                    await this.ensureDirectoryEx(path);
+                    await this.vaultAccess.ensureDirectory(path);
                     if (!content) {
-                        const dt = base64ToArrayBuffer(f.data);
-                        await this.app.vault.adapter.writeBinary(path, dt);
+                        const dt = decodeBinary(f.data);
+                        await this.vaultAccess.adapterWrite(path, dt);
                     } else {
-                        await this.app.vault.adapter.write(path, content);
+                        await this.vaultAccess.adapterWrite(path, content);
                     }
                     Logger(`Applying ${f.filename} of ${data.displayName || data.name}.. Done`);
                 } catch (ex) {
                     Logger(`Applying ${f.filename} of ${data.displayName || data.name}.. Failed`);
-                    Logger(ex, LOG_LEVEL.VERBOSE);
+                    Logger(ex, LOG_LEVEL_VERBOSE);
                 }
             }
@@ -307,7 +450,7 @@ export class ConfigSync extends LiveSyncCommands {
             await this.storeCustomizationFiles(uPath);
             await this.updatePluginList(true, uPath);
             await delay(100);
-            Logger(`Config ${data.displayName || data.name} has been applied`, LOG_LEVEL.NOTICE);
+            Logger(`Config ${data.displayName || data.name} has been applied`, LOG_LEVEL_NOTICE);
             if (data.category == "PLUGIN_DATA" || data.category == "PLUGIN_MAIN") {
                 //@ts-ignore
                 const manifests = Object.values(this.app.plugins.manifests) as any as PluginManifest[];
@@ -315,12 +458,12 @@ export class ConfigSync extends LiveSyncCommands {
                 const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
                 const pluginManifest = manifests.find((manifest) => enabledPlugins.has(manifest.id) && manifest.dir == `${baseDir}/plugins/${data.name}`);
                 if (pluginManifest) {
-                    Logger(`Unloading plugin: ${pluginManifest.name}`, LOG_LEVEL.NOTICE, "plugin-reload-" + pluginManifest.id);
+                    Logger(`Unloading plugin: ${pluginManifest.name}`, LOG_LEVEL_NOTICE, "plugin-reload-" + pluginManifest.id);
                     // @ts-ignore
                     await this.app.plugins.unloadPlugin(pluginManifest.id);
                     // @ts-ignore
                     await this.app.plugins.loadPlugin(pluginManifest.id);
-                    Logger(`Plugin reloaded: ${pluginManifest.name}`, LOG_LEVEL.NOTICE, "plugin-reload-" + pluginManifest.id);
+                    Logger(`Plugin reloaded: ${pluginManifest.name}`, LOG_LEVEL_NOTICE, "plugin-reload-" + pluginManifest.id);
                 }
             } else if (data.category == "CONFIG") {
                 scheduleTask("configReload", 250, async () => {
@@ -333,7 +476,7 @@ export class ConfigSync extends LiveSyncCommands {
             return true;
         } catch (ex) {
             Logger(`Applying ${data.displayName || data.name}.. Failed`);
-            Logger(ex, LOG_LEVEL.VERBOSE);
+            Logger(ex, LOG_LEVEL_VERBOSE);
             return false;
         }
     }
@@ -342,11 +485,11 @@ export class ConfigSync extends LiveSyncCommands {
             if (data.documentPath) {
                 await this.deleteConfigOnDatabase(data.documentPath);
                 await this.updatePluginList(false, data.documentPath);
-                Logger(`Delete: ${data.documentPath}`, LOG_LEVEL.NOTICE);
+                Logger(`Delete: ${data.documentPath}`, LOG_LEVEL_NOTICE);
             }
             return true;
         } catch (ex) {
-            Logger(`Failed to delete: ${data.documentPath}`, LOG_LEVEL.NOTICE);
+            Logger(`Failed to delete: ${data.documentPath}`, LOG_LEVEL_NOTICE);
             return false;
         }
@@ -354,14 +497,14 @@ export class ConfigSync extends LiveSyncCommands {
     async parseReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>) {
         if (docs._id.startsWith(ICXHeader)) {
             if (this.plugin.settings.usePluginSync) {
-                await this.updatePluginList(false, docs.path ? docs.path : this.getPath(docs));
+                await this.updatePluginList(false, (docs as AnyEntry).path ? (docs as AnyEntry).path : this.getPath((docs as AnyEntry)));
             }
             if (this.plugin.settings.usePluginSync && this.plugin.settings.notifyPluginOrSettingUpdated) {
                 if (!this.pluginDialog || (this.pluginDialog && !this.pluginDialog.isOpened())) {
                     const fragment = createFragment((doc) => {
-                        doc.createEl("span", null, (a) => {
+                        doc.createEl("span", undefined, (a) => {
                             a.appendText(`Some configuration has been arrived, Press `);
-                            a.appendChild(a.createEl("a", null, (anchor) => {
+                            a.appendChild(a.createEl("a", undefined, (anchor) => {
                                 anchor.text = "HERE";
                                 anchor.addEventListener("click", () => {
                                     this.showPluginSyncModal();
@@ -410,15 +553,16 @@ export class ConfigSync extends LiveSyncCommands {
         this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
         return;
     }

     recentProcessedInternalFiles = [] as string[];
     async makeEntryFromFile(path: FilePath): Promise<false | PluginDataExFile> {
-        const stat = await this.app.vault.adapter.stat(path);
+        const stat = await this.vaultAccess.adapterStat(path);
         let version: string | undefined;
         let displayName: string | undefined;
         if (!stat) {
             return false;
         }
-        const contentBin = await this.app.vault.adapter.readBinary(path);
+        const contentBin = await this.vaultAccess.adapterReadBinary(path);
         let content: string[];
         try {
             content = await arrayBufferToBase64(contentBin);
@@ -433,12 +577,12 @@ export class ConfigSync extends LiveSyncCommands {
                     displayName = `${json.name}`;
                 }
             } catch (ex) {
-                Logger(`Configuration sync data: ${path} looks like manifest, but could not read the version`, LOG_LEVEL.INFO);
+                Logger(`Configuration sync data: ${path} looks like manifest, but could not read the version`, LOG_LEVEL_INFO);
             }
         }
         } catch (ex) {
             Logger(`The file ${path} could not be encoded`);
-            Logger(ex, LOG_LEVEL.VERBOSE);
+            Logger(ex, LOG_LEVEL_VERBOSE);
             return false;
         }
         const mtime = stat.mtime;
@@ -464,8 +608,12 @@
     }
     async storeCustomizationFiles(path: FilePath, termOverRide?: string) {
         const term = termOverRide || this.plugin.deviceAndVaultName;
+        if (term == "") {
+            Logger("We have to configure the device name", LOG_LEVEL_NOTICE);
+            return;
+        }
         const vf = this.filenameToUnifiedKey(path, term);
-        return await runWithLock(`plugin-${vf}`, false, async () => {
+        return await serialized(`plugin-${vf}`, async () => {
             const category = this.getFileCategory(path);
             let mtime = 0;
             let fileTargets = [] as FilePath[];
@@ -497,7 +645,7 @@
             for (const target of fileTargets) {
                 const data = await this.makeEntryFromFile(target);
                 if (data == false) {
-                    Logger(`Config: skipped: ${target} `, LOG_LEVEL.VERBOSE);
+                    // Logger(`Config: skipped: ${target} `, LOG_LEVEL_VERBOSE);
                     continue;
                 }
                 if (data.version) {
@@ -520,10 +668,10 @@
                 return
             }

-            const content = stringifyYaml(dt);
+            const content = createTextBlob(serialize(dt));
             try {
-                const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false);
-                let saveData: LoadedEntry;
+                const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, false);
+                let saveData: SavingEntry;
                 if (old === false) {
                     saveData = {
                         _id: id,
@@ -532,22 +680,35 @@
                         mtime,
                         ctime: mtime,
                         datatype: "newnote",
-                        size: content.length,
+                        size: content.size,
                         children: [],
                         deleted: false,
                         type: "newnote",
                         eden: {}
                     };
                 } else {
                     if (old.mtime == mtime) {
-                        // Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
+                        // Logger(`STORAGE --> DB:${prefixedFileName}: (config) Skipped (Same time)`, LOG_LEVEL_VERBOSE);
                         return true;
                     }
+                    const oldC = await this.localDatabase.getDBEntryFromMeta(old, {}, false, false);
+                    if (oldC) {
+                        const d = await deserialize(getDocData(oldC.data), {}) as PluginDataEx;
+                        const diffs = (d.files.map(previous => ({ prev: previous, curr: dt.files.find(e => e.filename == previous.filename) })).map(async e => {
+                            try { return await isDocContentSame(e.curr?.data ?? [], e.prev.data) } catch (_) { return false }
+                        }))
+                        const isSame = (await Promise.all(diffs)).every(e => e == true);
+                        if (isSame) {
+                            Logger(`STORAGE --> DB:${prefixedFileName}: (config) Skipped (Same content)`, LOG_LEVEL_VERBOSE);
+                            return true;
+                        }
+                    }
                     saveData =
                     {
                         ...old,
                         data: content,
                         mtime,
-                        size: content.length,
+                        size: content.size,
                         datatype: "newnote",
                         children: [],
                         deleted: false,
@@ -560,7 +721,7 @@
             return ret;
         } catch (ex) {
             Logger(`STORAGE --> DB:${prefixedFileName}: (config) Failed`);
-            Logger(ex, LOG_LEVEL.VERBOSE);
+            Logger(ex, LOG_LEVEL_VERBOSE);
             return false;
         }
     })
@@ -569,10 +730,17 @@
     async watchVaultRawEventsAsync(path: FilePath) {
         if (!this.settings.usePluginSync) return false;
         if (!this.isTargetPath(path)) return false;
-        const stat = await this.app.vault.adapter.stat(path);
+        const stat = await this.vaultAccess.adapterStat(path);
         // Make sure that target is a file.
         if (stat && stat.type != "file")
             return false;
+        const configDir = normalizePath(this.app.vault.configDir);
+        const synchronisedInConfigSync = Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode != MODE_SELECTIVE).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
+        if (synchronisedInConfigSync.some(e => e.startsWith(path.toLowerCase()))) {
+            Logger(`Customization file skipped: ${path}`, LOG_LEVEL_VERBOSE);
+            return;
+        }
         const storageMTime = ~~((stat && stat.mtime || 0) / 1000);
         const key = `${path}-${storageMTime}`;
         if (this.recentProcessedInternalFiles.contains(key)) {
@@ -587,11 +755,11 @@

     async scanAllConfigFiles(showMessage: boolean) {
-        const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
+        const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
         Logger("Scanning customizing files.", logLevel, "scan-all-config");
         const term = this.plugin.deviceAndVaultName;
         if (term == "") {
-            Logger("We have to configure the device name", LOG_LEVEL.NOTICE);
+            Logger("We have to configure the device name", LOG_LEVEL_NOTICE);
             return;
         }
         const filesAll = await this.scanInternalFiles();
@@ -600,7 +768,11 @@
         const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICXHeader + "", endkey: `${ICXHeader}\u{10ffff}`, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
         let deleteCandidate = filesOnDB.map(e => this.getPath(e)).filter(e => e.startsWith(`${ICXHeader}${term}/`));
         for (const vp of virtualPathsOfLocalFiles) {
-            const p = files.find(e => e.key == vp).file;
+            const p = files.find(e => e.key == vp)?.file;
+            if (!p) {
+                Logger(`scanAllConfigFiles - File not found: ${vp}`, LOG_LEVEL_VERBOSE);
+                continue;
+            }
             await this.storeCustomizationFiles(p);
             deleteCandidate = deleteCandidate.filter(e => e != vp);
         }
@@ -613,12 +785,13 @@

         // const id = await this.path2id(prefixedFileName);
         const mtime = new Date().getTime();
-        await runWithLock("file-x-" + prefixedFileName, false, async () => {
+        await serialized("file-x-" + prefixedFileName, async () => {
             try {
-                const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false) as InternalFileEntry | false;
+                const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, false) as InternalFileEntry | false;
                 let saveData: InternalFileEntry;
                 if (old === false) {
                     Logger(`STORAGE -x> DB:${prefixedFileName}: (config) already deleted (Not found on database)`);
                     return;
                 } else {
                     if (old.deleted) {
                         Logger(`STORAGE -x> DB:${prefixedFileName}: (config) already deleted`);
@@ -639,7 +812,7 @@
                 Logger(`STORAGE -x> DB:${prefixedFileName}: (config) Done`);
             } catch (ex) {
                 Logger(`STORAGE -x> DB:${prefixedFileName}: (config) Failed`);
-                Logger(ex, LOG_LEVEL.VERBOSE);
+                Logger(ex, LOG_LEVEL_VERBOSE);
                 return false;
             }
         });
@@ -657,7 +830,14 @@
         lastDepth: number
     ) {
         if (lastDepth == -1) return [];
-        const w = await this.app.vault.adapter.list(path);
+        let w: ListedFiles;
+        try {
+            w = await this.app.vault.adapter.list(path);
+        } catch (ex) {
+            Logger(`Could not traverse(ConfigSync):${path}`, LOG_LEVEL_INFO);
+            Logger(ex, LOG_LEVEL_VERBOSE);
+            return [];
+        }
         let files = [
             ...w.files
         ];
@@ -1,27 +1,23 @@
-import { Notice, normalizePath, PluginManifest } from "./deps";
-import { EntryDoc, LoadedEntry, LOG_LEVEL, InternalFileEntry, FilePathWithPrefix, FilePath } from "./lib/src/types";
-import { InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
-import { delay, isDocContentSame } from "./lib/src/utils";
-import { Logger } from "./lib/src/logger";
-import { PouchDB } from "./lib/src/pouchdb-browser.js";
-import { disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask, isInternalMetadata, PeriodicProcessor } from "./utils";
-import { WrappedNotice } from "./lib/src/wrapper";
-import { base64ToArrayBuffer, arrayBufferToBase64 } from "./lib/src/strbin";
-import { runWithLock } from "./lib/src/lock";
-import { Semaphore } from "./lib/src/semaphore";
-import { JsonResolveModal } from "./JsonResolveModal";
-import { LiveSyncCommands } from "./LiveSyncCommands";
-import { addPrefix, stripAllPrefixes } from "./lib/src/path";
+import { normalizePath, type PluginManifest, type ListedFiles } from "../deps.ts";
+import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry, type DocumentID } from "../lib/src/common/types.ts";
+import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "../common/types.ts";
+import { readAsBlob, isDocContentSame, sendSignal, readContent, createBlob } from "../lib/src/common/utils.ts";
+import { Logger } from "../lib/src/common/logger.ts";
+import { PouchDB } from "../lib/src/pouchdb/pouchdb-browser.js";
+import { isInternalMetadata, PeriodicProcessor } from "../common/utils.ts";
+import { serialized } from "../lib/src/concurrency/lock.ts";
+import { JsonResolveModal } from "../ui/JsonResolveModal.ts";
+import { LiveSyncCommands } from "./LiveSyncCommands.ts";
+import { addPrefix, stripAllPrefixes } from "../lib/src/string_and_binary/path.ts";
+import { QueueProcessor } from "../lib/src/concurrency/processor.ts";
+import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "../lib/src/mock_and_interop/stores.ts";

 export class HiddenFileSync extends LiveSyncCommands {
     periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
-    confirmPopup: WrappedNotice = null;

     get kvDB() {
         return this.plugin.kvDB;
     }
     ensureDirectoryEx(fullPath: string) {
         return this.plugin.ensureDirectoryEx(fullPath);
     }
     getConflictedDoc(path: FilePathWithPrefix, rev: string) {
         return this.plugin.getConflictedDoc(path, rev);
     }
@@ -45,7 +41,7 @@ export class HiddenFileSync extends LiveSyncCommands {
             Logger("Synchronizing hidden files done");
         } catch (ex) {
             Logger("Synchronizing hidden files failed");
-            Logger(ex, LOG_LEVEL.VERBOSE);
+            Logger(ex, LOG_LEVEL_VERBOSE);
             }
         }
     }
@@ -69,40 +65,45 @@ export class HiddenFileSync extends LiveSyncCommands {
     realizeSettingSyncMode(): Promise<void> {
         this.periodicInternalFileScanProcessor?.disable();
         if (this.plugin.suspended)
-            return;
+            return Promise.resolve();
         if (!this.plugin.isReady)
-            return;
+            return Promise.resolve();
         this.periodicInternalFileScanProcessor.enable(this.settings.syncInternalFiles && this.settings.syncInternalFilesInterval ? (this.settings.syncInternalFilesInterval * 1000) : 0);
-        return;
+        return Promise.resolve();
     }

-    procInternalFiles: string[] = [];
-    async execInternalFile() {
-        await runWithLock("execInternal", false, async () => {
-            const w = [...this.procInternalFiles];
-            this.procInternalFiles = [];
-            Logger(`Applying hidden ${w.length} files change...`);
-            await this.syncInternalFilesAndDatabase("pull", false, false, w);
-            Logger(`Applying hidden ${w.length} files changed`);
-        });
-    }
     procInternalFile(filename: string) {
-        this.procInternalFiles.push(filename);
-        scheduleTask("procInternal", 500, async () => {
-            await this.execInternalFile();
-        });
+        this.internalFileProcessor.enqueue(filename);
     }
+    internalFileProcessor = new QueueProcessor<string, any>(
+        async (filenames) => {
+            Logger(`START :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
+            await this.syncInternalFilesAndDatabase("pull", false, false, filenames);
+            Logger(`DONE :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
+            return;
+        }, { batchSize: 100, concurrentLimit: 1, delay: 10, yieldThreshold: 100, suspended: false, totalRemainingReactiveSource: hiddenFilesEventCount }
+    );

     recentProcessedInternalFiles = [] as string[];
     async watchVaultRawEventsAsync(path: FilePath) {
         if (!this.settings.syncInternalFiles) return;
-        const stat = await this.app.vault.adapter.stat(path);
-        // sometimes folder is coming.
-        if (stat && stat.type != "file")
-        const storageMTime = ~~((stat && stat.mtime || 0) / 1000);
+        // Exclude files handled by customization sync
+        const configDir = normalizePath(this.app.vault.configDir);
+        const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
+        if (synchronisedInConfigSync.some(e => e.startsWith(path.toLowerCase()))) {
+            Logger(`Hidden file skipped: ${path} is synchronized in customization sync.`, LOG_LEVEL_VERBOSE);
+            return;
+        }
+        const stat = await this.vaultAccess.adapterStat(path);
+        // sometimes folder is coming.
+        if (stat != null && stat.type != "file") {
+            return;
+        }
+        const mtime = stat == null ? 0 : stat?.mtime ?? 0;
+        const storageMTime = ~~((mtime) / 1000);
         const key = `${path}-${storageMTime}`;
-        if (this.recentProcessedInternalFiles.contains(key)) {
+        if (mtime != 0 && this.recentProcessedInternalFiles.contains(key)) {
             // If recently processed, it may have been caused by ourselves.
             return;
         }
@@ -122,7 +123,7 @@ export class HiddenFileSync extends LiveSyncCommands {
         if (storageMTime == 0) {
             await this.deleteInternalFileOnDatabase(path);
         } else {
-            await this.storeInternalFileToDatabase({ path: path, ...stat });
+            await this.storeInternalFileToDatabase({ path: path, mtime, ctime: stat?.ctime ?? mtime, size: stat?.size ?? 0 });
         }

     }
@@ -130,25 +131,56 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
async resolveConflictOnInternalFiles() {
|
||||
// Scan all conflicted internal files
|
||||
const conflicted = this.localDatabase.findEntries(ICHeader, ICHeaderEnd, { conflicts: true });
|
||||
for await (const doc of conflicted) {
|
||||
if (!("_conflicts" in doc))
|
||||
continue;
|
||||
if (isInternalMetadata(doc._id)) {
|
||||
await this.resolveConflictOnInternalFile(doc.path);
|
||||
this.conflictResolutionProcessor.suspend();
|
||||
try {
|
||||
for await (const doc of conflicted) {
|
||||
if (!("_conflicts" in doc))
|
||||
continue;
|
||||
if (isInternalMetadata(doc._id)) {
|
||||
this.conflictResolutionProcessor.enqueue(doc.path);
|
||||
}
|
||||
}
|
||||
} catch (ex) {
|
||||
Logger("something went wrong on resolving all conflicted internal files");
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
await this.conflictResolutionProcessor.startPipeline().waitForPipeline();
|
||||
}
|
||||
|
||||
async resolveConflictOnInternalFile(path: FilePathWithPrefix): Promise<boolean> {
|
||||
async resolveByNewerEntry(id: DocumentID, path: FilePathWithPrefix, currentDoc: EntryDoc, currentRev: string, conflictedRev: string) {
|
||||
const conflictedDoc = await this.localDatabase.getRaw(id, { rev: conflictedRev });
|
||||
// determine which revision should been deleted.
|
||||
// simply check modified time
|
||||
const mtimeCurrent = ("mtime" in currentDoc && currentDoc.mtime) || 0;
|
||||
const mtimeConflicted = ("mtime" in conflictedDoc && conflictedDoc.mtime) || 0;
|
||||
// Logger(`Revisions:${new Date(mtimeA).toLocaleString} and ${new Date(mtimeB).toLocaleString}`);
|
||||
// console.log(`mtime:${mtimeA} - ${mtimeB}`);
|
||||
const delRev = mtimeCurrent < mtimeConflicted ? currentRev : conflictedRev;
|
||||
// delete older one.
|
||||
await this.localDatabase.removeRevision(id, delRev);
|
||||
Logger(`Older one has been deleted:${path}`);
|
||||
const cc = await this.localDatabase.getRaw(id, { conflicts: true });
|
||||
if (cc._conflicts?.length === 0) {
|
||||
await this.extractInternalFileFromDatabase(stripAllPrefixes(path))
|
||||
} else {
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
}
|
||||
// check the file again
|
||||
|
||||
}
|
||||
conflictResolutionProcessor = new QueueProcessor(async (paths: FilePathWithPrefix[]) => {
|
||||
const path = paths[0];
|
||||
sendSignal(`cancel-internal-conflict:${path}`);
|
||||
try {
|
||||
// Retrieve data
|
||||
const id = await this.path2id(path, ICHeader);
|
||||
const doc = await this.localDatabase.getRaw(id, { conflicts: true });
|
||||
// If there is no conflict, return with false.
|
||||
if (!("_conflicts" in doc))
|
||||
return false;
|
||||
// if (!("_conflicts" in doc)){
|
||||
// return [];
|
||||
// }
|
||||
if (doc._conflicts === undefined) return [];
|
||||
if (doc._conflicts.length == 0)
|
||||
return false;
|
||||
return [];
|
||||
Logger(`Hidden file conflicted:${path}`);
|
||||
const conflicts = doc._conflicts.sort((a, b) => Number(a.split("-")[0]) - Number(b.split("-")[0]));
|
||||
const revA = doc._rev;
|
||||
@@ -159,69 +191,80 @@ export class HiddenFileSync extends LiveSyncCommands {
|
||||
const conflictedRevNo = Number(conflictedRev.split("-")[0]);
|
||||
//Search
|
||||
const revFrom = (await this.localDatabase.getRaw<EntryDoc>(id, { revs_info: true }));
|
||||
const commonBase = revFrom._revs_info.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
|
||||
const commonBase = revFrom._revs_info?.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
|
||||
const result = await this.plugin.mergeObject(path, commonBase, doc._rev, conflictedRev);
|
||||
if (result) {
|
||||
Logger(`Object merge:${path}`, LOG_LEVEL.INFO);
|
||||
Logger(`Object merge:${path}`, LOG_LEVEL_INFO);
|
||||
const filename = stripAllPrefixes(path);
|
||||
const isExists = await this.app.vault.adapter.exists(filename);
|
||||
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
|
||||
if (!isExists) {
|
||||
await this.ensureDirectoryEx(filename);
|
||||
await this.vaultAccess.ensureDirectory(filename);
|
||||
}
|
||||
await this.plugin.vaultAccess.adapterWrite(filename, result);
|
||||
const stat = await this.vaultAccess.adapterStat(filename);
|
||||
if (!stat) {
|
||||
throw new Error(`conflictResolutionProcessor: Failed to stat file ${filename}`);
|
||||
}
|
||||
await this.app.vault.adapter.write(filename, result);
|
||||
const stat = await this.app.vault.adapter.stat(filename);
|
||||
await this.storeInternalFileToDatabase({ path: filename, ...stat });
|
||||
await this.extractInternalFileFromDatabase(filename);
|
||||
await this.localDatabase.removeRaw(id, revB);
|
||||
return this.resolveConflictOnInternalFile(path);
|
||||
await this.localDatabase.removeRevision(id, revB);
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
return [];
|
||||
} else {
|
||||
Logger(`Object merge is not applicable.`, LOG_LEVEL.VERBOSE);
|
||||
}
|
||||
|
||||
const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
|
||||
const docBMerge = await this.localDatabase.getDBEntry(path, { rev: revB });
|
||||
if (docAMerge != false && docBMerge != false) {
|
||||
if (await this.showJSONMergeDialogAndMerge(docAMerge, docBMerge)) {
|
||||
await delay(200);
|
||||
// Again for other conflicted revisions.
|
||||
return this.resolveConflictOnInternalFile(path);
|
||||
}
|
||||
return false;
|
||||
Logger(`Object merge is not applicable.`, LOG_LEVEL_VERBOSE);
|
||||
}
|
||||
return [{ path, revA, revB, id, doc }];
|
||||
}
|
||||
const revBDoc = await this.localDatabase.getRaw(id, { rev: revB });
|
||||
// determine which revision should been deleted.
|
||||
// simply check modified time
|
||||
const mtimeA = ("mtime" in doc && doc.mtime) || 0;
|
||||
const mtimeB = ("mtime" in revBDoc && revBDoc.mtime) || 0;
|
||||
// Logger(`Revisions:${new Date(mtimeA).toLocaleString} and ${new Date(mtimeB).toLocaleString}`);
|
||||
// console.log(`mtime:${mtimeA} - ${mtimeB}`);
|
||||
const delRev = mtimeA < mtimeB ? revA : revB;
|
||||
// delete older one.
|
||||
await this.localDatabase.removeRaw(id, delRev);
|
||||
Logger(`Older one has been deleted:${path}`);
|
||||
// check the file again
|
||||
return this.resolveConflictOnInternalFile(path);
|
||||
// When not JSON file, resolve conflicts by choosing a newer one.
|
||||
await this.resolveByNewerEntry(id, path, doc, revA, revB);
|
||||
return [];
|
||||
} catch (ex) {
|
||||
Logger(`Failed to resolve conflict (Hidden): ${path}`);
|
||||
Logger(ex, LOG_LEVEL.VERBOSE);
|
||||
return false;
|
||||
Logger(ex, LOG_LEVEL_VERBOSE);
|
||||
return [];
|
||||
}
|
||||
}, {
|
||||
suspended: false, batchSize: 1, concurrentLimit: 5, delay: 10, keepResultUntilDownstreamConnected: true, yieldThreshold: 10,
|
||||
pipeTo: new QueueProcessor(async (results) => {
|
||||
const { id, doc, path, revA, revB } = results[0];
|
||||
const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
|
||||
const docBMerge = await this.localDatabase.getDBEntry(path, { rev: revB });
|
||||
if (docAMerge != false && docBMerge != false) {
|
||||
if (await this.showJSONMergeDialogAndMerge(docAMerge, docBMerge)) {
|
||||
// Again for other conflicted revisions.
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
}
|
||||
return;
|
||||
} else {
|
||||
// If either revision could not read, force resolving by the newer one.
|
||||
await this.resolveByNewerEntry(id, path, doc, revA, revB);
|
||||
}
|
||||
}, { suspended: false, batchSize: 1, concurrentLimit: 1, delay: 10, keepResultUntilDownstreamConnected: false, yieldThreshold: 10 })
|
||||
})
|
||||
|
||||
queueConflictCheck(path: FilePathWithPrefix) {
|
||||
this.conflictResolutionProcessor.enqueue(path);
|
||||
}
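The rewritten `resolveConflictOnInternalFiles` above no longer awaits each file inline; it suspends the processor, enqueues every conflicted path, then runs the pipeline to completion. A rough sketch of that suspend/enqueue/drain life cycle follows; the semantics are assumed and simplified to a serial loop, unlike the real batched, concurrency-limited QueueProcessor.

```ts
// Sketch of the suspend -> enqueue -> startPipeline -> waitForPipeline flow.
class MiniPipeline<T> {
    private items: T[] = [];
    constructor(private proc: (item: T) => Promise<void>) { }
    suspend() { /* collect items without processing them yet */ return this; }
    enqueue(item: T) { this.items.push(item); }
    async startPipelineAndWait() {
        // Drain until empty; proc may enqueue follow-up work (as the
        // conflict resolver does when more conflicted revisions remain).
        while (this.items.length > 0) {
            const item = this.items.shift()!;
            await this.proc(item);
        }
    }
}
```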

//TODO: Tidy up. Even though it is experimental feature, So dirty...
async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, files: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, filesAll: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
await this.resolveConflictOnInternalFiles();
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
Logger("Scanning hidden files.", logLevel, "sync_internal");
const ignorePatterns = this.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
if (!files)
files = await this.scanInternalFiles();

const configDir = normalizePath(this.app.vault.configDir);
let files: InternalFileInfo[] =
filesAll ? filesAll : (await this.scanInternalFiles())

const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
files = files.filter(file => synchronisedInConfigSync.every(filterFile => !file.path.toLowerCase().startsWith(filterFile)))

const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICHeader, endkey: ICHeaderEnd, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
const allFileNamesSrc = [...new Set([...files.map(e => normalizePath(e.path)), ...filesOnDB.map(e => stripAllPrefixes(this.getPath(e)))])];
const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1));
const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1)).filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile)))
function compareMTime(a: number, b: number) {
const wa = ~~(a / 1000);
const wb = ~~(b / 1000);
@@ -254,38 +297,49 @@ export class HiddenFileSync extends LiveSyncCommands {
c = pieces.shift();
}
};
const p = [] as Promise<void>[];
const semaphore = Semaphore(10);
// Cache update time information for files which have already been processed (mainly for files that were skipped due to the same content)
let caches: { [key: string]: { storageMtime: number; docMtime: number; }; } = {};
caches = await this.kvDB.get<{ [key: string]: { storageMtime: number; docMtime: number; }; }>("diff-caches-internal") || {};
for (const filename of allFileNames) {
if (!filename) continue;
const filesMap = files.reduce((acc, cur) => {
acc[cur.path] = cur;
return acc;
}, {} as { [key: string]: InternalFileInfo; });
const filesOnDBMap = filesOnDB.reduce((acc, cur) => {
acc[stripAllPrefixes(this.getPath(cur))] = cur;
return acc;
}, {} as { [key: string]: InternalFileEntry; });
await new QueueProcessor(async (filenames: FilePath[]) => {
const filename = filenames[0];
processed++;
if (processed % 100 == 0)
if (processed % 100 == 0) {
Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
}
if (!filename) return [];
if (ignorePatterns.some(e => filename.match(e)))
continue;
return [];
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return [];
}

const fileOnStorage = files.find(e => e.path == filename);
const fileOnDatabase = filesOnDB.find(e => stripAllPrefixes(this.getPath(e)) == filename);
const addProc = async (p: () => Promise<void>): Promise<void> => {
const releaser = await semaphore.acquire(1);
try {
return p();
} catch (ex) {
Logger("Some process failed", logLevel);
Logger(ex);
} finally {
releaser();
}
};
const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
const fileOnDatabase = filename in filesOnDBMap ? filesOnDBMap[filename] : undefined;

p.push(addProc(async () => {
const xFileOnStorage = fileOnStorage;
const xFileOnDatabase = fileOnDatabase;
return [{
filename,
fileOnStorage,
fileOnDatabase,
}]

}, { suspended: true, batchSize: 1, concurrentLimit: 10, delay: 0, totalRemainingReactiveSource: hiddenFilesProcessingCount })
.pipeTo(new QueueProcessor(async (params) => {
const
{
filename,
fileOnStorage: xFileOnStorage,
fileOnDatabase: xFileOnDatabase
} = params[0];
if (xFileOnStorage && xFileOnDatabase) {
const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
// Both => Synchronize
if ((direction != "pullForce" && direction != "pushForce") && xFileOnDatabase.mtime == cache.docMtime && xFileOnStorage.mtime == cache.storageMtime) {
return;
@@ -321,19 +375,25 @@ export class HiddenFileSync extends LiveSyncCommands {
}
}
} else if (xFileOnStorage && !xFileOnDatabase) {
await this.storeInternalFileToDatabase(xFileOnStorage);
if (direction == "push" || direction == "pushForce" || direction == "safe") {
await this.storeInternalFileToDatabase(xFileOnStorage);
} else {
await this.extractInternalFileFromDatabase(xFileOnStorage.path);
}
} else {
throw new Error("Invalid state on hidden file sync");
// Something corrupted?
}
}));
}
await Promise.all(p);
return;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 0 }))
.root
.enqueueAll(allFileNames)
.startPipeline().waitForPipeline();

await this.kvDB.set("diff-caches-internal", caches);

// When files has been retrieved from the database. they must be reloaded.
if ((direction == "pull" || direction == "pullForce") && filesChanged != 0) {
const configDir = normalizePath(this.app.vault.configDir);
// Show notification to restart obsidian when something has been changed in configDir.
if (configDir in updatedFolders) {
// Numbers of updated files that is below of configDir.
@@ -345,83 +405,38 @@ export class HiddenFileSync extends LiveSyncCommands {
const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
const enabledPluginManifests = manifests.filter(e => enabledPlugins.has(e.id));
for (const manifest of enabledPluginManifests) {
if (manifest.dir in updatedFolders) {
if (manifest.dir && manifest.dir in updatedFolders) {
// If notified about plug-ins, reloading Obsidian may not be necessary.
updatedCount -= updatedFolders[manifest.dir];
const updatePluginId = manifest.id;
const updatePluginName = manifest.name;
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
a.appendText(`Files in ${updatePluginName} has been updated, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", async () => {
Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
// @ts-ignore
await this.app.plugins.unloadPlugin(updatePluginId);
// @ts-ignore
await this.app.plugins.loadPlugin(updatePluginId);
Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
});
}));

a.appendText(` to reload ${updatePluginName}, or press elsewhere to dismiss this message.`);
this.plugin.askInPopup(`updated-${updatePluginId}`, `Files in ${updatePluginName} has been updated, Press {HERE} to reload ${updatePluginName}, or press elsewhere to dismiss this message.`, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", async () => {
Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL_NOTICE, "plugin-reload-" + updatePluginId);
// @ts-ignore
await this.app.plugins.unloadPlugin(updatePluginId);
// @ts-ignore
await this.app.plugins.loadPlugin(updatePluginId);
Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL_NOTICE, "plugin-reload-" + updatePluginId);
});
});

const updatedPluginKey = "popupUpdated-" + updatePluginId;
scheduleTask(updatedPluginKey, 1000, async () => {
const popup = await memoIfNotExist(updatedPluginKey, () => new Notice(fragment, 0));
//@ts-ignore
const isShown = popup?.noticeEl?.isShown();
if (!isShown) {
memoObject(updatedPluginKey, new Notice(fragment, 0));
}
scheduleTask(updatedPluginKey + "-close", 20000, () => {
const popup = retrieveMemoObject<Notice>(updatedPluginKey);
if (!popup)
return;
//@ts-ignore
if (popup?.noticeEl?.isShown()) {
popup.hide();
}
disposeMemoObject(updatedPluginKey);
});
});
}
);
}
}
} catch (ex) {
Logger("Error on checking plugin status.");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);

}

// If something changes left, notify for reloading Obsidian.
if (updatedCount != 0) {
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
a.appendText(`Hidden files have been synchronized, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
// @ts-ignore
this.app.commands.executeCommandById("app:reload");
});
}));

a.appendText(` to reload obsidian, or press elsewhere to dismiss this message.`);
});
});

scheduleTask("popupUpdated-" + configDir, 1000, () => {
//@ts-ignore
const isShown = this.confirmPopup?.noticeEl?.isShown();
if (!isShown) {
this.confirmPopup = new Notice(fragment, 0);
}
scheduleTask("popupClose" + configDir, 20000, () => {
this.confirmPopup?.hide();
this.confirmPopup = null;
this.plugin.askInPopup(`updated-any-hidden`, `Hidden files have been synchronized, Press {HERE} to reload Obsidian, or press elsewhere to dismiss this message.`, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
// @ts-ignore
this.app.commands.executeCommandById("app:reload");
});
});
}
@@ -432,22 +447,18 @@ export class HiddenFileSync extends LiveSyncCommands {
}

async storeInternalFileToDatabase(file: InternalFileInfo, forceWrite = false) {
if (await this.plugin.isIgnoredByIgnoreFiles(file.path)) {
return
}

const id = await this.path2id(file.path, ICHeader);
const prefixedFileName = addPrefix(file.path, ICHeader);
const contentBin = await this.app.vault.adapter.readBinary(file.path);
let content: string[];
try {
content = await arrayBufferToBase64(contentBin);
} catch (ex) {
Logger(`The file ${file.path} could not be encoded`);
Logger(ex, LOG_LEVEL.VERBOSE);
return false;
}
const content = createBlob(await this.plugin.vaultAccess.adapterReadAuto(file.path));
const mtime = file.mtime;
return await runWithLock("file-" + prefixedFileName, false, async () => {
return await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
let saveData: LoadedEntry;
const old = await this.localDatabase.getDBEntry(prefixedFileName, undefined, false, false);
let saveData: SavingEntry;
if (old === false) {
saveData = {
_id: id,
@@ -462,8 +473,8 @@ export class HiddenFileSync extends LiveSyncCommands {
type: "newnote",
};
} else {
if (isDocContentSame(old.data, content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
if (await isDocContentSame(readAsBlob(old), content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return;
}
saveData =
@@ -472,18 +483,18 @@ export class HiddenFileSync extends LiveSyncCommands {
data: content,
mtime,
size: file.size,
datatype: "newnote",
datatype: old.datatype,
children: [],
deleted: false,
type: "newnote",
type: old.datatype,
};
}
const ret = await this.localDatabase.putDBEntry(saveData, true);
const ret = await this.localDatabase.putDBEntry(saveData);
Logger(`STORAGE --> DB:${file.path}: (hidden) Done`);
return ret;
} catch (ex) {
Logger(`STORAGE --> DB:${file.path}: (hidden) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
@@ -493,9 +504,12 @@ export class HiddenFileSync extends LiveSyncCommands {
const id = await this.path2id(filename, ICHeader);
const prefixedFileName = addPrefix(filename, ICHeader);
const mtime = new Date().getTime();
await runWithLock("file-" + prefixedFileName, false, async () => {
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return
}
await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, true) as InternalFileEntry | false;
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, true) as InternalFileEntry | false;
let saveData: InternalFileEntry;
if (old === false) {
saveData = {
@@ -509,6 +523,14 @@ export class HiddenFileSync extends LiveSyncCommands {
type: "newnote",
};
} else {
// Remove all conflicted before deleting.
const conflicts = await this.localDatabase.getRaw(old._id, { conflicts: true });
if (conflicts._conflicts !== undefined) {
for (const conflictRev of conflicts._conflicts) {
await this.localDatabase.removeRevision(old._id, conflictRev);
Logger(`STORAGE -x> DB:${filename}: (hidden) conflict removed ${old._rev} => ${conflictRev}`, LOG_LEVEL_VERBOSE);
}
}
if (old.deleted) {
Logger(`STORAGE -x> DB:${filename}: (hidden) already deleted`);
return;
@@ -527,71 +549,72 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger(`STORAGE -x> DB:${filename}: (hidden) Done`);
} catch (ex) {
Logger(`STORAGE -x> DB:${filename}: (hidden) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
}

async extractInternalFileFromDatabase(filename: FilePath, force = false) {
const isExists = await this.app.vault.adapter.exists(filename);
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
const prefixedFileName = addPrefix(filename, ICHeader);

return await runWithLock("file-" + prefixedFileName, false, async () => {
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return;
}
return await serialized("file-" + prefixedFileName, async () => {
try {
// Check conflicted status
//TODO option
const fileOnDB = await this.localDatabase.getDBEntry(prefixedFileName, { conflicts: true }, false, true);
const fileOnDB = await this.localDatabase.getDBEntry(prefixedFileName, { conflicts: true }, false, true, true);
if (fileOnDB === false)
throw new Error(`File not found on database.:${filename}`);
// Prevent overwrite for Prevent overwriting while some conflicted revision exists.
if (fileOnDB?._conflicts?.length) {
Logger(`Hidden file ${filename} has conflicted revisions, to keep in safe, writing to storage has been prevented`, LOG_LEVEL.INFO);
Logger(`Hidden file ${filename} has conflicted revisions, to keep in safe, writing to storage has been prevented`, LOG_LEVEL_INFO);
return;
}
const deleted = "deleted" in fileOnDB ? fileOnDB.deleted : false;
const deleted = fileOnDB.deleted || fileOnDB._deleted || false;
if (deleted) {
if (!isExists) {
Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
} else {
Logger(`STORAGE <x- DB:${filename}: deleted (hidden).`);
await this.app.vault.adapter.remove(filename);
await this.plugin.vaultAccess.adapterRemove(filename);
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
return true;
}
if (!isExists) {
await this.ensureDirectoryEx(filename);
await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.vaultAccess.ensureDirectory(filename);
await this.plugin.vaultAccess.adapterWrite(filename, readContent(fileOnDB), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden,new${force ? ", force" : ""})`);
return true;
} else {
const contentBin = await this.app.vault.adapter.readBinary(filename);
const content = await arrayBufferToBase64(contentBin);
if (content == fileOnDB.data && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL.VERBOSE);
const content = await this.plugin.vaultAccess.adapterReadAuto(filename);
const docContent = readContent(fileOnDB);
if (await isDocContentSame(content, docContent) && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.plugin.vaultAccess.adapterWrite(filename, docContent, { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""})`);
return true;
@@ -599,7 +622,7 @@ export class HiddenFileSync extends LiveSyncCommands {
}
} catch (ex) {
Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""}) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
@@ -608,8 +631,8 @@ export class HiddenFileSync extends LiveSyncCommands {


showJSONMergeDialogAndMerge(docA: LoadedEntry, docB: LoadedEntry): Promise<boolean> {
return runWithLock("conflict:merge-data", false, () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
return new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
const docs = [docA, docB];
const path = stripAllPrefixes(docA.path);
const modal = new JsonResolveModal(this.app, path, [docA, docB], async (keep, result) => {
@@ -634,19 +657,23 @@ export class HiddenFileSync extends LiveSyncCommands {
}
}
if (!keep && result) {
const isExists = await this.app.vault.adapter.exists(filename);
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
if (!isExists) {
await this.ensureDirectoryEx(filename);
await this.vaultAccess.ensureDirectory(filename);
}
await this.app.vault.adapter.write(filename, result);
const stat = await this.app.vault.adapter.stat(filename);
await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
await this.plugin.vaultAccess.adapterWrite(filename, result);
const stat = await this.plugin.vaultAccess.adapterStat(filename);
if (!stat) {
throw new Error("Stat failed");
}
const mtime = stat?.mtime ?? 0;
await this.storeInternalFileToDatabase({ path: filename, mtime, ctime: stat?.ctime ?? mtime, size: stat?.size ?? 0 }, true);
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden,merged)`);
}
@@ -657,33 +684,42 @@ export class HiddenFileSync extends LiveSyncCommands {
res(true);
} catch (ex) {
Logger("Could not merge conflicted json");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
res(false);
}
});
modal.open();
}));
});
}

async scanInternalFiles(): Promise<InternalFileInfo[]> {
const configDir = normalizePath(this.app.vault.configDir);
const ignoreFilter = this.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
const root = this.app.vault.getRoot();
const findRoot = root.path;
const filenames = (await this.getFiles(findRoot, [], null, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
const files = filenames.map(async (e) => {

const filenames = (await this.getFiles(findRoot, [], undefined, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
const files = filenames.filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile))).map(async (e) => {
return {
path: e as FilePath,
stat: await this.app.vault.adapter.stat(e)
stat: await this.plugin.vaultAccess.adapterStat(e)
};
});
const result: InternalFileInfo[] = [];
for (const f of files) {
const w = await f;
if (await this.plugin.isIgnoredByIgnoreFiles(w.path)) {
continue
}
const mtime = w.stat?.mtime ?? 0
const ctime = w.stat?.ctime ?? mtime;
const size = w.stat?.size ?? 0;
result.push({
...w,
...w.stat
mtime, ctime, size
});
}
return result;
@@ -694,17 +730,29 @@ export class HiddenFileSync extends LiveSyncCommands {
async getFiles(
path: string,
ignoreList: string[],
filter: RegExp[],
ignoreFilter: RegExp[]
filter?: RegExp[],
ignoreFilter?: RegExp[]
) {

const w = await this.app.vault.adapter.list(path);
let files = [
let w: ListedFiles;
try {
w = await this.app.vault.adapter.list(path);
} catch (ex) {
Logger(`Could not traverse(HiddenSync):${path}`, LOG_LEVEL_INFO);
Logger(ex, LOG_LEVEL_VERBOSE);
return [];
}
const filesSrc = [
...w.files
.filter((e) => !ignoreList.some((ee) => e.endsWith(ee)))
.filter((e) => !filter || filter.some((ee) => e.match(ee)))
.filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee))),
.filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee)))
];
let files = [] as string[];
for (const file of filesSrc) {
if (!await this.plugin.isIgnoredByIgnoreFiles(file)) {
files.push(file);
}
}

L1: for (const v of w.folders) {
for (const ignore of ignoreList) {
@@ -715,6 +763,9 @@ export class HiddenFileSync extends LiveSyncCommands {
if (ignoreFilter && ignoreFilter.some(e => v.match(e))) {
continue L1;
}
if (await this.plugin.isIgnoredByIgnoreFiles(v)) {
continue L1;
}
files = files.concat(await this.getFiles(v, ignoreList, filter, ignoreFilter));
}
return files;
@@ -1,13 +1,15 @@
import { EntryDoc, ObsidianLiveSyncSettings, LOG_LEVEL, DEFAULT_SETTINGS } from "./lib/src/types";
import { configURIBase } from "./types";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { askSelectString, askYesNo, askString } from "./utils";
import { decrypt, encrypt } from "./lib/src/e2ee_v2";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { delay } from "./lib/src/utils";
import { confirmWithMessage } from "./dialogs";
import { Platform } from "./deps";
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE, REMOTE_COUCHDB, REMOTE_MINIO } from "../lib/src/common/types.ts";
import { configURIBase } from "../common/types.ts";
import { Logger } from "../lib/src/common/logger.ts";
import { PouchDB } from "../lib/src/pouchdb/pouchdb-browser.js";
import { askSelectString, askYesNo, askString } from "../common/utils.ts";
import { decrypt, encrypt } from "../lib/src/encryption/e2ee_v2.ts";
import { LiveSyncCommands } from "./LiveSyncCommands.ts";
import { delay, fireAndForget } from "../lib/src/common/utils.ts";
import { confirmWithMessage } from "../common/dialogs.ts";
import { Platform } from "../deps.ts";
import { fetchAllUsedChunks } from "../lib/src/pouchdb/utils_couchdb.ts";
import type { LiveSyncCouchDBReplicator } from "../lib/src/replication/couchdb/LiveSyncReplicator.js";

export class SetupLiveSync extends LiveSyncCommands {
onunload() { }
@@ -16,20 +18,25 @@ export class SetupLiveSync extends LiveSyncCommands {

this.plugin.addCommand({
id: "livesync-copysetupuri",
name: "Copy the setup URI",
callback: this.command_copySetupURI.bind(this),
name: "Copy settings as a new setup URI",
callback: () => fireAndForget(this.command_copySetupURI()),
});
this.plugin.addCommand({
id: "livesync-copysetupuri-short",
name: "Copy settings as a new setup URI (With customization sync)",
callback: () => fireAndForget(this.command_copySetupURIWithSync()),
});

this.plugin.addCommand({
id: "livesync-copysetupurifull",
name: "Copy the setup URI (Full)",
callback: this.command_copySetupURIFull.bind(this),
name: "Copy settings as a new setup URI (Full)",
callback: () => fireAndForget(this.command_copySetupURIFull()),
});

this.plugin.addCommand({
id: "livesync-opensetupuri",
name: "Open the setup URI",
callback: this.command_openSetupURI.bind(this),
name: "Use the copied setup URI (Formerly Open setup URI)",
callback: () => fireAndForget(this.command_openSetupURI()),
});
}
onInitializeDatabase(showNotice: boolean) { }
@@ -40,11 +47,14 @@ export class SetupLiveSync extends LiveSyncCommands {
}
async realizeSettingSyncMode() { }

async command_copySetupURI() {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "");
async command_copySetupURI(stripExtra = true) {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
if (encryptingPassphrase === false)
return;
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" } as Partial<ObsidianLiveSyncSettings>;
if (stripExtra) {
delete setting.pluginSyncExtendedSetting;
}
const keys = Object.keys(setting) as (keyof ObsidianLiveSyncSettings)[];
for (const k of keys) {
if (JSON.stringify(k in setting ? setting[k] : "") == JSON.stringify(k in DEFAULT_SETTINGS ? DEFAULT_SETTINGS[k] : "*")) {
@@ -54,24 +64,27 @@ export class SetupLiveSync extends LiveSyncCommands {
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
const uri = `${configURIBase}${encryptedSetting}`;
await navigator.clipboard.writeText(uri);
Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIFull() {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "");
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
if (encryptingPassphrase === false)
return;
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
const uri = `${configURIBase}${encryptedSetting}`;
await navigator.clipboard.writeText(uri);
Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIWithSync() {
await this.command_copySetupURI(false);
}
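For reference, the whole setup-URI feature used by these commands boils down to an encrypt-then-URI-encode round trip. A sketch follows, under the assumption that `encrypt`/`decrypt` behave like the e2ee_v2 helpers imported above and that `configURIBase` is the plugin's URI prefix; the function names here are illustrative.

```ts
// Sketch of the setup-URI round trip (assumed helper signatures).
type Crypt = (text: string, passphrase: string, useV1: boolean) => Promise<string>;

async function settingsToSetupURI(settings: object, passphrase: string, configURIBase: string, encrypt: Crypt) {
    // Serialize, encrypt with the user's passphrase, then make it URI-safe.
    const payload = encodeURIComponent(await encrypt(JSON.stringify(settings), passphrase, false));
    return `${configURIBase}${payload}`;
}

async function settingsFromSetupURI(uri: string, passphrase: string, configURIBase: string, decrypt: Crypt) {
    if (!uri.startsWith(configURIBase)) throw new Error("Set up URI looks wrong.");
    const payload = decodeURIComponent(uri.substring(configURIBase.length));
    return JSON.parse(await decrypt(payload, passphrase, false));
}
```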
async command_openSetupURI() {
const setupURI = await askString(this.app, "Easy setup", "Set up URI", `${configURIBase}aaaaa`);
if (setupURI === false)
return;
if (!setupURI.startsWith(`${configURIBase}`)) {
Logger("Set up URI looks wrong.", LOG_LEVEL.NOTICE);
Logger("Set up URI looks wrong.", LOG_LEVEL_NOTICE);
return;
}
const config = decodeURIComponent(setupURI.substring(configURIBase.length));
@@ -81,7 +94,7 @@ export class SetupLiveSync extends LiveSyncCommands {
async setupWizard(confString: string) {
try {
const oldConf = JSON.parse(JSON.stringify(this.settings));
const encryptingPassphrase = await askString(this.app, "Passphrase", "The passphrase to decrypt your setup URI", "");
const encryptingPassphrase = await askString(this.app, "Passphrase", "The passphrase to decrypt your setup URI", "", true);
if (encryptingPassphrase === false)
return;
const newConf = await JSON.parse(await decrypt(confString, encryptingPassphrase, false));
@@ -96,13 +109,21 @@ export class SetupLiveSync extends LiveSyncCommands {
newSettingW.configPassphraseStore = "";
newSettingW.encryptedPassphrase = "";
newSettingW.encryptedCouchDBConnection = "";
newSettingW.additionalSuffixOfDatabaseName = `${("appId" in this.app ? this.app.appId : "")}`
const setupJustImport = "Just import setting";
const setupAsNew = "Set it up as secondary or subsequent device";
const setupAsMerge = "Secondary device but try keeping local changes";
const setupAgain = "Reconfigure and reconstitute the data";
const setupManually = "Leave everything to me";
newSettingW.syncInternalFiles = false;
newSettingW.usePluginSync = false;
const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupJustImport, setupManually]);
newSettingW.isConfigured = true;
// Migrate completely obsoleted configuration.
if (!newSettingW.useIndexedDBAdapter) {
newSettingW.useIndexedDBAdapter = true;
}

const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupAsMerge, setupJustImport, setupManually]);
if (setupType == setupJustImport) {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
@@ -111,6 +132,10 @@ export class SetupLiveSync extends LiveSyncCommands {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
await this.fetchLocal();
} else if (setupType == setupAsMerge) {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
await this.fetchLocalWithRebuild();
} else if (setupType == setupAgain) {
const confirm = "I know this operation will rebuild all my databases with files on this device, and files that are on the remote database and I didn't synchronize to any other devices will be lost and want to proceed indeed.";
if (await askSelectString(this.app, "Do you really want to do this?", ["Cancel", confirm]) != confirm) {
@@ -134,13 +159,13 @@ export class SetupLiveSync extends LiveSyncCommands {
await this.plugin.replicate(true);
await this.plugin.markRemoteUnlocked();
}
Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
Logger("Configuration loaded.", LOG_LEVEL_NOTICE);
return;
}
if (keepLocalDB == "no" && keepRemoteDB == "no") {
const reset = await askYesNo(this.app, "Drop everything?");
if (reset != "yes") {
Logger("Cancelled", LOG_LEVEL.NOTICE);
Logger("Cancelled", LOG_LEVEL_NOTICE);
this.plugin.settings = oldConf;
return;
}
@@ -175,17 +200,17 @@ export class SetupLiveSync extends LiveSyncCommands {
}
}

Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
Logger("Configuration loaded.", LOG_LEVEL_NOTICE);
} else {
Logger("Cancelled.", LOG_LEVEL.NOTICE);
Logger("Cancelled.", LOG_LEVEL_NOTICE);
}
} catch (ex) {
Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL.NOTICE);
Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL_NOTICE);
}
}

suspendExtraSync() {
Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL.NOTICE)
Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
this.plugin.settings.syncInternalFiles = false;
this.plugin.settings.usePluginSync = false;
this.plugin.settings.autoSweepPlugins = false;
@@ -230,7 +255,7 @@ Of course, we are able to disable these features.`
return;
}
if (mode != "CUSTOMIZE") {
Logger("Gathering files for enabling Hidden File Sync", LOG_LEVEL.NOTICE);
Logger("Gathering files for enabling Hidden File Sync", LOG_LEVEL_NOTICE);
if (mode == "FETCH") {
await this.plugin.addOnHiddenFileSync.syncInternalFilesAndDatabase("pullForce", true);
} else if (mode == "OVERWRITE") {
@@ -240,7 +265,7 @@ Of course, we are able to disable these features.`
}
this.plugin.settings.syncInternalFiles = true;
await this.plugin.saveSettings();
Logger(`Done! Restarting the app is strongly recommended!`, LOG_LEVEL.NOTICE);
Logger(`Done! Restarting the app is strongly recommended!`, LOG_LEVEL_NOTICE);
} else if (mode == "CUSTOMIZE") {
if (!this.plugin.deviceAndVaultName) {
let name = await askString(this.app, "Device name", "Please set this device name", `desktop`);
@@ -279,27 +304,92 @@ Of course, we are able to disable these features.`
this.plugin.settings.liveSync = false;
this.plugin.settings.periodicReplication = false;
this.plugin.settings.syncOnSave = false;
this.plugin.settings.syncOnEditorSave = false;
this.plugin.settings.syncOnStart = false;
this.plugin.settings.syncOnFileOpen = false;
this.plugin.settings.syncAfterMerge = false;
//this.suspendExtraSync();
}
async fetchLocal() {
this.suspendExtraSync();
await this.plugin.realizeSettingSyncMode();
async suspendReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
if (this.plugin.settings.remoteType == REMOTE_MINIO) return;
Logger(`Suspending reflection: Database and storage changes will not be reflected in each other until completely finished the fetching.`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = true;
this.plugin.settings.suspendFileWatching = true;
await this.plugin.saveSettings();
}
async resumeReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
if (this.plugin.settings.remoteType == REMOTE_MINIO) return;
Logger(`Database and storage reflection has been resumed!`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = false;
this.plugin.settings.suspendFileWatching = false;
await this.plugin.syncAllFiles(true);
await this.plugin.loadQueuedFiles();
await this.plugin.saveSettings();

}
async askUseNewAdapter() {
if (!this.plugin.settings.useIndexedDBAdapter) {
const message = `Now this plugin has been configured to use the old database adapter for keeping compatibility. Do you want to deactivate it?`;
const CHOICE_YES = "Yes, disable and use latest";
const CHOICE_NO = "No, keep compatibility";
const choices = [CHOICE_YES, CHOICE_NO];

const ret = await confirmWithMessage(this.plugin, "Database adapter", message, choices, CHOICE_YES, 10);
if (ret == CHOICE_YES) {
this.plugin.settings.useIndexedDBAdapter = true;
}
}
}
async resetLocalDatabase() {
if (this.plugin.settings.isConfigured && this.plugin.settings.additionalSuffixOfDatabaseName == "") {
// Discard the non-suffixed database
await this.plugin.resetLocalDatabase();
}
this.plugin.settings.additionalSuffixOfDatabaseName = `${("appId" in this.app ? this.app.appId : "")}`
await this.plugin.resetLocalDatabase();
}
async fetchRemoteChunks() {
if (!this.plugin.settings.doNotSuspendOnFetching && this.plugin.settings.readChunksOnline && this.plugin.settings.remoteType == REMOTE_COUCHDB) {
Logger(`Fetching chunks`, LOG_LEVEL_NOTICE);
const replicator = this.plugin.getReplicator() as LiveSyncCouchDBReplicator;
const remoteDB = await replicator.connectRemoteCouchDBWithSetting(this.settings, this.plugin.getIsMobile(), true);
if (typeof remoteDB == "string") {
Logger(remoteDB, LOG_LEVEL_NOTICE);
} else {
await fetchAllUsedChunks(this.localDatabase.localDatabase, remoteDB.db);
}
Logger(`Fetching chunks done`, LOG_LEVEL_NOTICE);
}
}
async fetchLocal(makeLocalChunkBeforeSync?: boolean) {
this.suspendExtraSync();
await this.askUseNewAdapter();
this.plugin.settings.isConfigured = true;
await this.suspendReflectingDatabase();
await this.plugin.realizeSettingSyncMode();
await this.resetLocalDatabase();
await delay(1000);
await this.plugin.markRemoteResolved();
await this.plugin.openDatabase();
this.plugin.isReady = true;
if (makeLocalChunkBeforeSync) {
await this.plugin.initializeDatabase(true);
}
await this.plugin.markRemoteResolved();
await delay(500);
await this.plugin.replicateAllFromServer(true);
await delay(1000);
await this.plugin.replicateAllFromServer(true);
await this.resumeReflectingDatabase();
await this.askHiddenFileConfiguration({ enableFetch: true });
}
async fetchLocalWithRebuild() {
return await this.fetchLocal(true);
}
async rebuildRemote() {
this.suspendExtraSync();
this.plugin.settings.isConfigured = true;
await this.plugin.realizeSettingSyncMode();
await this.plugin.markRemoteLocked();
await this.plugin.tryResetRemoteDatabase();
@@ -313,8 +403,10 @@ Of course, we are able to disable these features.`
}
async rebuildEverything() {
this.suspendExtraSync();
await this.askUseNewAdapter();
this.plugin.settings.isConfigured = true;
await this.plugin.realizeSettingSyncMode();
await this.plugin.resetLocalDatabase();
await this.resetLocalDatabase();
await delay(1000);
await this.plugin.initializeDatabase(true);
await this.plugin.markRemoteLocked();
@@ -1,6 +1,6 @@
import { AnyEntry, DocumentID, EntryDoc, EntryHasPath, FilePath, FilePathWithPrefix } from "./lib/src/types";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import type ObsidianLiveSyncPlugin from "./main";
import { type AnyEntry, type DocumentID, type EntryDoc, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "../lib/src/common/types.ts";
import { PouchDB } from "../lib/src/pouchdb/pouchdb-browser.js";
import type ObsidianLiveSyncPlugin from "../main.ts";


export abstract class LiveSyncCommands {
@@ -14,6 +14,9 @@ export abstract class LiveSyncCommands {
get localDatabase() {
return this.plugin.localDatabase;
}
get vaultAccess() {
return this.plugin.vaultAccess;
}
id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
return this.plugin.id2path(id, entry, stripPrefix);
}
src/lib: 2 changes
src/main.ts: 2886 changes
src/storages/SerializedFileAccess.ts: 196 lines (new file)
@@ -0,0 +1,196 @@
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "../deps.ts";
import { serialized } from "../lib/src/concurrency/lock.ts";
import { Logger } from "../lib/src/common/logger.ts";
import { isPlainText } from "../lib/src/string_and_binary/path.ts";
import type { FilePath } from "../lib/src/common/types.ts";
import { createBinaryBlob, isDocContentSame } from "../lib/src/common/utils.ts";
import type { InternalFileInfo } from "../common/types.ts";
import { markChangesAreSame } from "../common/utils.ts";

function getFileLockKey(file: TFile | TFolder | string) {
return `fl:${typeof (file) == "string" ? file : file.path}`;
}
function toArrayBuffer(arr: Uint8Array | ArrayBuffer | DataView): ArrayBufferLike {
if (arr instanceof Uint8Array) {
return arr.buffer;
}
if (arr instanceof DataView) {
return arr.buffer;
}
return arr;
}


async function processReadFile<T>(file: TFile | TFolder | string, proc: () => Promise<T>) {
const ret = await serialized(getFileLockKey(file), () => proc());
return ret;
}
async function processWriteFile<T>(file: TFile | TFolder | string, proc: () => Promise<T>) {
const ret = await serialized(getFileLockKey(file), () => proc());
return ret;
}
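Both wrappers funnel every operation through `serialized`, keyed by `fl:<path>`, so reads and writes on the same file are strictly ordered while different files proceed in parallel. A minimal sketch of what `serialized` is assumed to do follows; the real implementation lives in lib/src/concurrency/lock.ts and will differ in detail.

```ts
// Sketch: per-key serialization by chaining onto the key's last promise.
// Omits the cleanup/bookkeeping a production lock would need.
const tails = new Map<string, Promise<unknown>>();

function serializedSketch<T>(key: string, proc: () => Promise<T>): Promise<T> {
    const prev = tails.get(key) ?? Promise.resolve();
    const next = prev.then(proc, proc); // run after the predecessor, even if it failed
    tails.set(key, next.catch(() => undefined)); // keep the chain alive on rejection
    return next;
}
```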
export class SerializedFileAccess {
app: App
constructor(app: App) {
this.app = app;
}

async adapterStat(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.stat(path));
}
async adapterExists(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.exists(path));
}
async adapterRemove(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.remove(path));
}

async adapterRead(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.read(path));
}
async adapterReadBinary(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
}

async adapterReadAuto(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.adapter.read(path));
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
}

async adapterWrite(file: TFile | string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
const path = file instanceof TFile ? file.path : file;
if (typeof (data) === "string") {
return await processWriteFile(file, () => this.app.vault.adapter.write(path, data, options));
} else {
return await processWriteFile(file, () => this.app.vault.adapter.writeBinary(path, toArrayBuffer(data), options));
}
}

async vaultCacheRead(file: TFile) {
return await processReadFile(file, () => this.app.vault.cachedRead(file));
}

async vaultRead(file: TFile) {
return await processReadFile(file, () => this.app.vault.read(file));
}

async vaultReadBinary(file: TFile) {
return await processReadFile(file, () => this.app.vault.readBinary(file));
}

async vaultReadAuto(file: TFile) {
const path = file.path;
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.read(file));
return await processReadFile(file, () => this.app.vault.readBinary(file));
}


async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
if (typeof (data) === "string") {
return await processWriteFile(file, async () => {
const oldData = await this.app.vault.read(file);
if (data === oldData) {
if (options && options.mtime) markChangesAreSame(file, file.stat.mtime, options.mtime);
return false
}
await this.app.vault.modify(file, data, options)
return true;
}
);
} else {
return await processWriteFile(file, async () => {
const oldData = await this.app.vault.readBinary(file);
if (await isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
if (options && options.mtime) markChangesAreSame(file, file.stat.mtime, options.mtime);
return false;
}
await this.app.vault.modifyBinary(file, toArrayBuffer(data), options)
return true;
});
}
}
|
||||
async vaultCreate(path: string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions): Promise<TFile> {
|
||||
if (typeof (data) === "string") {
|
||||
return await processWriteFile(path, () => this.app.vault.create(path, data, options));
|
||||
} else {
|
||||
return await processWriteFile(path, () => this.app.vault.createBinary(path, toArrayBuffer(data), options));
|
||||
}
|
||||
}
|
||||
|
||||
trigger(name: string, ...data: any[]) {
|
||||
return this.app.vault.trigger(name, ...data);
|
||||
}
|
||||
|
||||
async adapterAppend(normalizedPath: string, data: string, options?: DataWriteOptions) {
|
||||
return await this.app.vault.adapter.append(normalizedPath, data, options)
|
||||
}
|
||||
|
||||
async delete(file: TFile | TFolder, force = false) {
|
||||
return await processWriteFile(file, () => this.app.vault.delete(file, force));
|
||||
}
|
||||
async trash(file: TFile | TFolder, force = false) {
|
||||
return await processWriteFile(file, () => this.app.vault.trash(file, force));
|
||||
}
|
||||
|
||||
getAbstractFileByPath(path: FilePath | string): TAbstractFile | null {
|
||||
// Disabled temporary.
|
||||
return this.app.vault.getAbstractFileByPath(path);
|
||||
// // Hidden API but so useful.
|
||||
// // @ts-ignore
|
||||
// if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
|
||||
// // @ts-ignore
|
||||
// return app.vault.getAbstractFileByPathInsensitive(path);
|
||||
// } else {
|
||||
// return app.vault.getAbstractFileByPath(path);
|
||||
// }
|
||||
}
|
||||
|
||||
getFiles() {
|
||||
return this.app.vault.getFiles();
|
||||
}
|
||||
|
||||
async ensureDirectory(fullPath: string) {
|
||||
const pathElements = fullPath.split("/");
|
||||
pathElements.pop();
|
||||
let c = "";
|
||||
for (const v of pathElements) {
|
||||
c += v;
|
||||
try {
|
||||
await this.app.vault.adapter.mkdir(c);
|
||||
} catch (ex: any) {
|
||||
if (ex?.message == "Folder already exists.") {
|
||||
// Skip if already exists.
|
||||
} else {
|
||||
Logger("Folder Create Error");
|
||||
Logger(ex);
|
||||
}
|
||||
}
|
||||
c += "/";
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
touchedFiles: string[] = [];
|
||||
|
||||
|
||||
touch(file: TFile | FilePath) {
|
||||
const f = file instanceof TFile ? file : this.getAbstractFileByPath(file) as TFile;
|
||||
const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
|
||||
this.touchedFiles.unshift(key);
|
||||
this.touchedFiles = this.touchedFiles.slice(0, 100);
|
||||
}
|
||||
recentlyTouched(file: TFile | InternalFileInfo) {
|
||||
const key = file instanceof TFile ? `${file.path}-${file.stat.mtime}-${file.stat.size}` : `${file.path}-${file.mtime}-${file.size}`;
|
||||
if (this.touchedFiles.indexOf(key) == -1) return false;
|
||||
return true;
|
||||
}
|
||||
clearTouched() {
|
||||
this.touchedFiles = [];
|
||||
}
|
||||
}
|
||||
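Every `adapter*`/`vault*` method above funnels through `processReadFile` or `processWriteFile`, so operations sharing a lock key (`fl:` plus the file path) run strictly one at a time, while operations on different paths stay concurrent. A minimal usage sketch (hypothetical paths and caller, not part of this changeset):

// Both writes share the lock key "fl:notes/a.md", so they are serialized;
// the read of "notes/b.md" uses a different key and can interleave freely.
const access = new SerializedFileAccess(app);
await Promise.all([
    access.adapterWrite("notes/a.md", "first"),
    access.adapterWrite("notes/a.md", "second"),
    access.adapterRead("notes/b.md"),
]);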
148
src/storages/StorageEventManager.ts
Normal file
@@ -0,0 +1,148 @@
import type { SerializedFileAccess } from "./SerializedFileAccess.ts";
import { Plugin, TAbstractFile, TFile, TFolder } from "../deps.ts";
import { Logger } from "../lib/src/common/logger.ts";
import { shouldBeIgnored } from "../lib/src/string_and_binary/path.ts";
import type { QueueProcessor } from "../lib/src/concurrency/processor.ts";
import { LOG_LEVEL_NOTICE, type FilePath, type ObsidianLiveSyncSettings } from "../lib/src/common/types.ts";
import { delay } from "../lib/src/common/utils.ts";
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "../common/types.ts";

export abstract class StorageEventManager {
    abstract beginWatch(): void;
}

type LiveSyncForStorageEventManager = Plugin &
    {
        settings: ObsidianLiveSyncSettings
        ignoreFiles: string[],
        vaultAccess: SerializedFileAccess
    } & {
        isTargetFile: (file: string | TAbstractFile) => Promise<boolean>,
        fileEventQueue: QueueProcessor<FileEventItem, any>,
        isFileSizeExceeded: (size: number) => boolean;
    };

export class StorageEventManagerObsidian extends StorageEventManager {
    plugin: LiveSyncForStorageEventManager;
    constructor(plugin: LiveSyncForStorageEventManager) {
        super();
        this.plugin = plugin;
    }
    beginWatch() {
        const plugin = this.plugin;
        this.watchVaultChange = this.watchVaultChange.bind(this);
        this.watchVaultCreate = this.watchVaultCreate.bind(this);
        this.watchVaultDelete = this.watchVaultDelete.bind(this);
        this.watchVaultRename = this.watchVaultRename.bind(this);
        this.watchVaultRawEvents = this.watchVaultRawEvents.bind(this);
        plugin.registerEvent(plugin.app.vault.on("modify", this.watchVaultChange));
        plugin.registerEvent(plugin.app.vault.on("delete", this.watchVaultDelete));
        plugin.registerEvent(plugin.app.vault.on("rename", this.watchVaultRename));
        plugin.registerEvent(plugin.app.vault.on("create", this.watchVaultCreate));
        //@ts-ignore : Internal API
        plugin.registerEvent(plugin.app.vault.on("raw", this.watchVaultRawEvents));
        plugin.fileEventQueue.startPipeline();
    }

    watchVaultCreate(file: TAbstractFile, ctx?: any) {
        this.appendWatchEvent([{ type: "CREATE", file }], ctx);
    }

    watchVaultChange(file: TAbstractFile, ctx?: any) {
        this.appendWatchEvent([{ type: "CHANGED", file }], ctx);
    }

    watchVaultDelete(file: TAbstractFile, ctx?: any) {
        this.appendWatchEvent([{ type: "DELETE", file }], ctx);
    }
    watchVaultRename(file: TAbstractFile, oldFile: string, ctx?: any) {
        if (file instanceof TFile) {
            this.appendWatchEvent([
                { type: "DELETE", file: { path: oldFile as FilePath, mtime: file.stat.mtime, ctime: file.stat.ctime, size: file.stat.size, deleted: true } },
                { type: "CREATE", file },
            ], ctx);
        }
    }
    // Watch raw events (Internal API)
    watchVaultRawEvents(path: FilePath) {
        if (this.plugin.settings.useIgnoreFiles && this.plugin.ignoreFiles.some(e => path.endsWith(e.trim()))) {
            // If the file is one of the ignore files, refresh the cached entry first.
            this.plugin.isTargetFile(path).then(() => this._watchVaultRawEvents(path));
        } else {
            this._watchVaultRawEvents(path);
        }
    }

    _watchVaultRawEvents(path: FilePath) {
        if (!this.plugin.settings.syncInternalFiles && !this.plugin.settings.usePluginSync) return;
        if (!this.plugin.settings.watchInternalFileChanges) return;
        if (!path.startsWith(this.plugin.app.vault.configDir)) return;
        const ignorePatterns = this.plugin.settings.syncInternalFilesIgnorePatterns
            .replace(/\n| /g, "")
            .split(",").filter(e => e).map(e => new RegExp(e, "i"));
        if (ignorePatterns.some(e => path.match(e))) return;
        this.appendWatchEvent(
            [{
                type: "INTERNAL",
                file: { path, mtime: 0, ctime: 0, size: 0 }
            }], null);
    }
    // Queue the file event and let it wait until it can be processed.
    async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
        if (!this.plugin.settings.isConfigured) return;
        if (this.plugin.settings.suspendFileWatching) return;
        for (const param of params) {
            if (shouldBeIgnored(param.file.path)) {
                continue;
            }
            const atomicKey = [0, 0, 0, 0, 0, 0].map(e => `${Math.floor(Math.random() * 100000)}`).join("-");
            const type = param.type;
            const file = param.file;
            const oldPath = param.oldPath;
            const size = file instanceof TFile ? file.stat.size : (file as InternalFileInfo)?.size ?? 0;
            if (this.plugin.isFileSizeExceeded(size) && (type == "CREATE" || type == "CHANGED")) {
                Logger(`The storage file has been changed but exceeds the maximum size. Skipping: ${param.file.path}`, LOG_LEVEL_NOTICE);
                continue;
            }
            if (file instanceof TFolder) continue;
            if (!await this.plugin.isTargetFile(file.path)) continue;

            // Caching the content has been disabled to prevent corruption.
            // let cache: null | string | ArrayBuffer;
            // If the file is new or has been changed, cache the changes.
            if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
                // Wait a moment to let the writer mark the file as `touched`.
                await delay(10);
                if (this.plugin.vaultAccess.recentlyTouched(file)) {
                    continue;
                }
                // cache = await this.plugin.vaultAccess.vaultReadAuto(file);
                // if (!isPlainText(file.name)) {
                //     cache = await this.plugin.vaultAccess.vaultReadBinary(file);
                // } else {
                //     cache = await this.plugin.vaultAccess.vaultCacheRead(file);
                //     if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
                // }
            }
            const fileInfo = file instanceof TFile ? {
                ctime: file.stat.ctime,
                mtime: file.stat.mtime,
                file: file,
                path: file.path,
                size: file.stat.size
            } as FileInfo : file as InternalFileInfo;
            this.plugin.fileEventQueue.enqueue({
                type,
                args: {
                    file: fileInfo,
                    oldPath,
                    // cache,
                    ctx
                },
                key: atomicKey
            });
        }
    }
}
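`_watchVaultRawEvents` recompiles the comma-separated `syncInternalFilesIgnorePatterns` setting into case-insensitive regular expressions on every raw event. A small worked example of that parsing (the setting value here is hypothetical):

// Whitespace and newlines are stripped, then each comma-separated entry
// becomes a case-insensitive RegExp.
const syncInternalFilesIgnorePatterns = "\\/node_modules\\/, \\/\\.git\\/";
const ignorePatterns = syncInternalFilesIgnorePatterns
    .replace(/\n| /g, "")
    .split(",").filter(e => e).map(e => new RegExp(e, "i"));
// ".obsidian/plugins/x/.git/config" matches "\/\.git\/", so its event would be dropped.
console.log(ignorePatterns.some(e => ".obsidian/plugins/x/.git/config".match(e))); // true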
93
src/ui/ConflictResolveModal.ts
Normal file
@@ -0,0 +1,93 @@
import { App, Modal } from "../deps.ts";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
import { CANCELLED, LEAVE_TO_SUBSEQUENT, RESULT_TIMED_OUT, type diff_result } from "../lib/src/common/types.ts";
import { escapeStringToHTML } from "../lib/src/string_and_binary/strbin.ts";
import { delay, sendValue, waitForValue } from "../lib/src/common/utils.ts";

export type MergeDialogResult = typeof LEAVE_TO_SUBSEQUENT | typeof CANCELLED | string;
export class ConflictResolveModal extends Modal {
    result: diff_result;
    filename: string;

    response: MergeDialogResult = CANCELLED;
    isClosed = false;
    consumed = false;

    constructor(app: App, filename: string, diff: diff_result) {
        super(app);
        this.result = diff;
        this.filename = filename;
        // Send a cancel signal to any previous merge dialogue;
        // if none is open, it is simply ignored.
        // sendValue("close-resolve-conflict:" + this.filename, false);
        sendValue("cancel-resolve-conflict:" + this.filename, true);
    }

    onOpen() {
        const { contentEl } = this;
        // Send a cancel signal to any previous merge dialogue;
        // if none is open, it is simply ignored.
        sendValue("cancel-resolve-conflict:" + this.filename, true);
        setTimeout(async () => {
            const forceClose = await waitForValue("cancel-resolve-conflict:" + this.filename);
            // debugger;
            if (forceClose) {
                this.sendResponse(CANCELLED);
            }
        }, 10);
        // sendValue("close-resolve-conflict:" + this.filename, false);
        this.titleEl.setText("Conflicting changes");
        contentEl.empty();
        contentEl.createEl("span", { text: this.filename });
        const div = contentEl.createDiv("");
        div.addClass("op-scrollable");
        let diff = "";
        for (const v of this.result.diff) {
            const x1 = v[0];
            const x2 = v[1];
            if (x1 == DIFF_DELETE) {
                diff += "<span class='deleted'>" + escapeStringToHTML(x2).replace(/\n/g, "<span class='ls-mark-cr'></span>\n") + "</span>";
            } else if (x1 == DIFF_EQUAL) {
                diff += "<span class='normal'>" + escapeStringToHTML(x2).replace(/\n/g, "<span class='ls-mark-cr'></span>\n") + "</span>";
            } else if (x1 == DIFF_INSERT) {
                diff += "<span class='added'>" + escapeStringToHTML(x2).replace(/\n/g, "<span class='ls-mark-cr'></span>\n") + "</span>";
            }
        }

        diff = diff.replace(/\n/g, "<br>");
        div.innerHTML = diff;
        const div2 = contentEl.createDiv("");
        const date1 = new Date(this.result.left.mtime).toLocaleString() + (this.result.left.deleted ? " (Deleted)" : "");
        const date2 = new Date(this.result.right.mtime).toLocaleString() + (this.result.right.deleted ? " (Deleted)" : "");
        div2.innerHTML = `
<span class='deleted'>A:${date1}</span><br /><span class='added'>B:${date2}</span><br>
`;
        contentEl.createEl("button", { text: "Keep A" }, (e) => e.addEventListener("click", () => this.sendResponse(this.result.right.rev)));
        contentEl.createEl("button", { text: "Keep B" }, (e) => e.addEventListener("click", () => this.sendResponse(this.result.left.rev)));
        contentEl.createEl("button", { text: "Concat both" }, (e) => e.addEventListener("click", () => this.sendResponse(LEAVE_TO_SUBSEQUENT)));
        contentEl.createEl("button", { text: "Not now" }, (e) => e.addEventListener("click", () => this.sendResponse(CANCELLED)));
    }

    sendResponse(result: MergeDialogResult) {
        this.response = result;
        this.close();
    }

    onClose() {
        const { contentEl } = this;
        contentEl.empty();
        if (this.consumed) {
            return;
        }
        this.consumed = true;
        sendValue("close-resolve-conflict:" + this.filename, this.response);
        sendValue("cancel-resolve-conflict:" + this.filename, false);
    }

    async waitForResult(): Promise<MergeDialogResult> {
        await delay(100);
        const r = await waitForValue<MergeDialogResult>("close-resolve-conflict:" + this.filename);
        if (r === RESULT_TIMED_OUT) return CANCELLED;
        return r;
    }
}
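The modal reports its outcome over the `sendValue`/`waitForValue` channel rather than through a callback, so a caller opens it and then awaits `waitForResult()`. A hypothetical caller sketch (assumed shapes, not shown in this changeset):

// diffResult: a diff_result computed elsewhere for the conflicted file.
const modal = new ConflictResolveModal(app, filename, diffResult);
modal.open();
const decision = await modal.waitForResult();
// decision is a revision string to resolve with, LEAVE_TO_SUBSEQUENT ("Concat both"),
// or CANCELLED ("Not now", or when a newer dialogue superseded this one).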
287
src/ui/DocumentHistoryModal.ts
Normal file
@@ -0,0 +1,287 @@
import { TFile, Modal, App, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "../deps.ts";
import { getPathFromTFile, isValidPath } from "../common/utils.ts";
import { decodeBinary, escapeStringToHTML, readString } from "../lib/src/string_and_binary/strbin.ts";
import ObsidianLiveSyncPlugin from "../main.ts";
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "../lib/src/common/types.ts";
import { Logger } from "../lib/src/common/logger.ts";
import { isErrorOfMissingDoc } from "../lib/src/pouchdb/utils_couchdb.ts";
import { getDocData, readContent } from "../lib/src/common/utils.ts";
import { isPlainText, stripPrefix } from "../lib/src/string_and_binary/path.ts";

function isImage(path: string) {
    const ext = path.split(".").splice(-1)[0].toLowerCase();
    return ["png", "jpg", "jpeg", "gif", "bmp", "webp"].includes(ext);
}
function isComparableText(path: string) {
    const ext = path.split(".").splice(-1)[0].toLowerCase();
    return isPlainText(path) || ["md", "mdx", "txt", "json"].includes(ext);
}
function isComparableTextDecode(path: string) {
    const ext = path.split(".").splice(-1)[0].toLowerCase();
    return ["json"].includes(ext);
}
function readDocument(w: LoadedEntry) {
    if (w.data.length == 0) return "";
    if (isImage(w.path)) {
        return new Uint8Array(decodeBinary(w.data));
    }
    if (w.type == "plain" || w.datatype == "plain") return getDocData(w.data);
    if (isComparableTextDecode(w.path)) return readString(new Uint8Array(decodeBinary(w.data)));
    if (isComparableText(w.path)) return getDocData(w.data);
    try {
        return readString(new Uint8Array(decodeBinary(w.data)));
    } catch (ex) {
        // NO OP.
    }
    return getDocData(w.data);
}
export class DocumentHistoryModal extends Modal {
    plugin: ObsidianLiveSyncPlugin;
    range!: HTMLInputElement;
    contentView!: HTMLDivElement;
    info!: HTMLDivElement;
    fileInfo!: HTMLDivElement;
    showDiff = false;
    id?: DocumentID;

    file: FilePathWithPrefix;

    revs_info: PouchDB.Core.RevisionInfo[] = [];
    currentDoc?: LoadedEntry;
    currentText = "";
    currentDeleted = false;
    initialRev?: string;

    constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id?: DocumentID, revision?: string) {
        super(app);
        this.plugin = plugin;
        this.file = (file instanceof TFile) ? getPathFromTFile(file) : file;
        this.id = id;
        this.initialRev = revision;
        if (!file && id) {
            this.file = this.plugin.id2path(id);
        }
        if (localStorage.getItem("ols-history-highlightdiff") == "1") {
            this.showDiff = true;
        }
    }

    async loadFile(initialRev?: string) {
        if (!this.id) {
            this.id = await this.plugin.path2id(this.file);
        }
        const db = this.plugin.localDatabase;
        try {
            const w = await db.getRaw(this.id, { revs_info: true });
            this.revs_info = w._revs_info?.filter((e) => e?.status == "available") ?? [];
            this.range.max = `${Math.max(this.revs_info.length - 1, 0)}`;
            this.range.value = this.range.max;
            this.fileInfo.setText(`${this.file} / ${this.revs_info.length} revisions`);
            await this.loadRevs(initialRev);
        } catch (ex) {
            if (isErrorOfMissingDoc(ex)) {
                this.range.max = "0";
                this.range.value = "";
                this.range.disabled = true;
                this.contentView.setText(`History of this file was not recorded.`);
            } else {
                this.contentView.setText(`An error occurred.`);
                Logger(ex, LOG_LEVEL_VERBOSE);
            }
        }
    }
    async loadRevs(initialRev?: string) {
        if (this.revs_info.length == 0) return;
        if (initialRev) {
            const rIndex = this.revs_info.findIndex(e => e.rev == initialRev);
            if (rIndex >= 0) {
                this.range.value = `${this.revs_info.length - 1 - rIndex}`;
            }
        }
        const index = this.revs_info.length - 1 - (this.range.value as any) / 1;
        const rev = this.revs_info[index];
        await this.showExactRev(rev.rev);
    }
    BlobURLs = new Map<string, string>();

    revokeURL(key: string) {
        const v = this.BlobURLs.get(key);
        if (v) {
            URL.revokeObjectURL(v);
        }
        this.BlobURLs.delete(key);
    }
    generateBlobURL(key: string, data: Uint8Array) {
        this.revokeURL(key);
        const v = URL.createObjectURL(new Blob([data], { endings: "transparent", type: "application/octet-stream" }));
        this.BlobURLs.set(key, v);
        return v;
    }

    async showExactRev(rev: string) {
        const db = this.plugin.localDatabase;
        const w = await db.getDBEntry(this.file, { rev: rev }, false, false, true);
        this.currentText = "";
        this.currentDeleted = false;
        if (w === false) {
            this.currentDeleted = true;
            this.info.innerHTML = "";
            this.contentView.innerHTML = `Could not read this revision<br>(${rev})`;
        } else {
            this.currentDoc = w;
            this.info.innerHTML = `Modified:${new Date(w.mtime).toLocaleString()}`;
            let result = undefined;
            const w1data = readDocument(w);
            this.currentDeleted = !!w.deleted;
            // this.currentText = w1data;
            if (this.showDiff) {
                const prevRevIdx = this.revs_info.length - 1 - ((this.range.value as any) / 1 - 1);
                if (prevRevIdx >= 0 && prevRevIdx < this.revs_info.length) {
                    const oldRev = this.revs_info[prevRevIdx].rev;
                    const w2 = await db.getDBEntry(this.file, { rev: oldRev }, false, false, true);
                    if (w2 != false) {
                        if (typeof w1data == "string") {
                            result = "";
                            const dmp = new diff_match_patch();
                            const w2data = readDocument(w2) as string;
                            const diff = dmp.diff_main(w2data, w1data);
                            dmp.diff_cleanupSemantic(diff);
                            for (const v of diff) {
                                const x1 = v[0];
                                const x2 = v[1];
                                if (x1 == DIFF_DELETE) {
                                    result += "<span class='history-deleted'>" + escapeStringToHTML(x2) + "</span>";
                                } else if (x1 == DIFF_EQUAL) {
                                    result += "<span class='history-normal'>" + escapeStringToHTML(x2) + "</span>";
                                } else if (x1 == DIFF_INSERT) {
                                    result += "<span class='history-added'>" + escapeStringToHTML(x2) + "</span>";
                                }
                            }
                            result = result.replace(/\n/g, "<br>");
                        } else if (isImage(this.file)) {
                            const src = this.generateBlobURL("base", w1data);
                            const overlay = this.generateBlobURL("overlay", readDocument(w2) as Uint8Array);
                            result =
                                `<div class='ls-imgdiff-wrap'>
<div class='overlay'>
<img class='img-base' src="${src}">
<img class='img-overlay' src='${overlay}'>
</div>
</div>`;
                            this.contentView.removeClass("op-pre");
                        }
                    }
                }
            }
            if (result == undefined) {
                if (typeof w1data != "string") {
                    if (isImage(this.file)) {
                        const src = this.generateBlobURL("base", w1data);
                        result =
                            `<div class='ls-imgdiff-wrap'>
<div class='overlay'>
<img class='img-base' src="${src}">
</div>
</div>`;
                        this.contentView.removeClass("op-pre");
                    }
                } else {
                    result = escapeStringToHTML(w1data);
                }
            }
            if (result == undefined) result = typeof w1data == "string" ? escapeStringToHTML(w1data) : "Binary file";
            this.contentView.innerHTML = (this.currentDeleted ? "(At this revision, the file has been deleted)\n" : "") + result;
        }
    }

    onOpen() {
        const { contentEl } = this;
        this.titleEl.setText("Document History");
        contentEl.empty();
        this.fileInfo = contentEl.createDiv("");
        this.fileInfo.addClass("op-info");
        const divView = contentEl.createDiv("");
        divView.addClass("op-flex");

        divView.createEl("input", { type: "range" }, (e) => {
            this.range = e;
            e.addEventListener("change", (e) => {
                this.loadRevs();
            });
            e.addEventListener("input", (e) => {
                this.loadRevs();
            });
        });
        contentEl
            .createDiv("", (e) => {
                e.createEl("label", {}, (label) => {
                    label.appendChild(
                        createEl("input", { type: "checkbox" }, (checkbox) => {
                            if (this.showDiff) {
                                checkbox.checked = true;
                            }
                            checkbox.addEventListener("input", (evt: any) => {
                                this.showDiff = checkbox.checked;
                                localStorage.setItem("ols-history-highlightdiff", this.showDiff == true ? "1" : "");
                                this.loadRevs();
                            });
                        })
                    );
                    label.appendText("Highlight diff");
                });
            })
            .addClass("op-info");
        this.info = contentEl.createDiv("");
        this.info.addClass("op-info");
        this.loadFile(this.initialRev);
        const div = contentEl.createDiv({ text: "Loading old revisions..." });
        this.contentView = div;
        div.addClass("op-scrollable");
        div.addClass("op-pre");
        const buttons = contentEl.createDiv("");
        buttons.createEl("button", { text: "Copy to clipboard" }, (e) => {
            e.addClass("mod-cta");
            e.addEventListener("click", async () => {
                await navigator.clipboard.writeText(this.currentText);
                Logger(`Old content copied to clipboard`, LOG_LEVEL_NOTICE);
            });
        });
        const focusFile = async (path: string) => {
            const targetFile = this.plugin.app.vault.getFileByPath(path);
            if (targetFile) {
                const leaf = this.plugin.app.workspace.getLeaf(false);
                await leaf.openFile(targetFile);
            } else {
                Logger("The file could not be shown in the editor", LOG_LEVEL_NOTICE);
            }
        };
        buttons.createEl("button", { text: "Back to this revision" }, (e) => {
            e.addClass("mod-cta");
            e.addEventListener("click", async () => {
                // const pathToWrite = this.plugin.id2path(this.id, true);
                const pathToWrite = stripPrefix(this.file);
                if (!isValidPath(pathToWrite)) {
                    Logger("Path is not valid to write content.", LOG_LEVEL_INFO);
                    return;
                }
                if (!this.currentDoc) {
                    Logger("No active file loaded.", LOG_LEVEL_INFO);
                    return;
                }
                const d = readContent(this.currentDoc);
                await this.plugin.vaultAccess.adapterWrite(pathToWrite, d);
                await focusFile(pathToWrite);
                this.close();
            });
        });
    }
    onClose() {
        const { contentEl } = this;
        contentEl.empty();
        this.BlobURLs.forEach(value => {
            console.log(value);
            if (value) URL.revokeObjectURL(value);
        });
    }
}
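The revision slider in `loadRevs` is inverted against `revs_info`, which PouchDB returns newest-first; `length - 1 - value` therefore puts the newest revision at the right edge of the slider. A worked example with illustrative values:

// revs_info = [newest, r2, r3, oldest]  (length 4), so range.max = "3".
// range.value "3" (right edge) -> index = 4 - 1 - 3 = 0 -> the newest revision.
// range.value "0" (left edge)  -> index = 4 - 1 - 0 = 3 -> the oldest revision.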
318
src/ui/GlobalHistory.svelte
Normal file
@@ -0,0 +1,318 @@
<script lang="ts">
    import ObsidianLiveSyncPlugin from "../main";
    import { onDestroy, onMount } from "svelte";
    import type { AnyEntry, FilePathWithPrefix } from "../lib/src/common/types";
    import { getDocData, isAnyNote, isDocContentSame, readAsBlob } from "../lib/src/common/utils";
    import { diff_match_patch } from "../deps";
    import { DocumentHistoryModal } from "./DocumentHistoryModal";
    import { isPlainText, stripAllPrefixes } from "../lib/src/string_and_binary/path";
    import { TFile } from "../deps";
    export let plugin: ObsidianLiveSyncPlugin;

    let showDiffInfo = false;
    let showChunkCorrected = false;
    let checkStorageDiff = false;

    let range_from_epoch = Date.now() - 3600000 * 24 * 7;
    let range_to_epoch = Date.now() + 3600000 * 24 * 2;
    // getTimezoneOffset() returns minutes; convert to milliseconds for epoch arithmetic.
    const timezoneOffset = new Date().getTimezoneOffset() * 60000;
    let dispDateFrom = new Date(range_from_epoch - timezoneOffset).toISOString().split("T")[0];
    let dispDateTo = new Date(range_to_epoch - timezoneOffset).toISOString().split("T")[0];
    $: {
        range_from_epoch = new Date(dispDateFrom).getTime() + timezoneOffset;
        range_to_epoch = new Date(dispDateTo).getTime() + timezoneOffset;

        getHistory(showDiffInfo, showChunkCorrected, checkStorageDiff);
    }
    function mtimeToDate(mtime: number) {
        return new Date(mtime).toLocaleString();
    }

    type HistoryData = {
        id: string;
        rev?: string;
        path: string;
        dirname: string;
        filename: string;
        mtime: number;
        mtimeDisp: string;
        isDeleted: boolean;
        size: number;
        changes: string;
        chunks: string;
        isPlain: boolean;
    };
    let history = [] as HistoryData[];
    let loading = false;

    async function fetchChanges(): Promise<HistoryData[]> {
        try {
            const db = plugin.localDatabase;
            let result = [] as typeof history;
            for await (const docA of db.findAllNormalDocs()) {
                if (docA.mtime < range_from_epoch) {
                    continue;
                }
                if (!isAnyNote(docA)) continue;
                const path = plugin.getPath(docA as AnyEntry);
                const isPlain = isPlainText(docA.path);
                const revs = await db.getRaw(docA._id, { revs_info: true });
                let p: string | undefined = undefined;
                const reversedRevs = (revs._revs_info ?? []).reverse();
                const DIFF_DELETE = -1;
                const DIFF_EQUAL = 0;
                const DIFF_INSERT = 1;

                for (const revInfo of reversedRevs) {
                    if (revInfo.status == "available") {
                        const doc = (!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev) ? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true) : await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
                        if (doc === false) continue;
                        const rev = revInfo.rev;

                        const mtime = "mtime" in doc ? doc.mtime : 0;
                        if (range_from_epoch > mtime) {
                            continue;
                        }
                        if (range_to_epoch < mtime) {
                            continue;
                        }

                        let diffDetail = "";
                        if (showDiffInfo && !isPlain) {
                            const data = getDocData(doc.data);
                            if (p === undefined) {
                                p = data;
                            }
                            if (p != data) {
                                const dmp = new diff_match_patch();
                                const diff = dmp.diff_main(p, data);
                                dmp.diff_cleanupSemantic(diff);
                                p = data;
                                const pxinit = {
                                    [DIFF_DELETE]: 0,
                                    [DIFF_EQUAL]: 0,
                                    [DIFF_INSERT]: 0,
                                } as { [key: number]: number };
                                const px = diff.reduce((p, c) => ({ ...p, [c[0]]: (p[c[0]] ?? 0) + c[1].length }), pxinit);
                                diffDetail = `-${px[DIFF_DELETE]}, +${px[DIFF_INSERT]}`;
                            }
                        }
                        const isDeleted = doc._deleted || (doc as any)?.deleted || false;
                        if (isDeleted) {
                            diffDetail += " 🗑️";
                        }
                        if (rev == docA._rev) {
                            if (checkStorageDiff) {
                                const abs = plugin.vaultAccess.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
                                if (abs instanceof TFile) {
                                    const data = await plugin.vaultAccess.adapterReadAuto(abs);
                                    const d = readAsBlob(doc);
                                    const result = await isDocContentSame(data, d);
                                    if (result) {
                                        diffDetail += " ⚖️";
                                    } else {
                                        diffDetail += " ⚠️";
                                    }
                                }
                            }
                        }
                        const docPath = plugin.getPath(doc as AnyEntry);
                        const [filename, ...pathItems] = docPath.split("/").reverse();

                        let chunksStatus = "";
                        if (showChunkCorrected) {
                            const chunks = (doc as any)?.children ?? [];
                            const loadedChunks = await db.allDocsRaw({ keys: [...chunks] });
                            const totalCount = loadedChunks.rows.length;
                            const errorCount = loadedChunks.rows.filter((e) => "error" in e).length;
                            if (errorCount == 0) {
                                chunksStatus = `✅ ${totalCount}`;
                            } else {
                                chunksStatus = `🔎 ${errorCount} ✅ ${totalCount}`;
                            }
                        }

                        result.push({
                            id: doc._id,
                            rev: doc._rev,
                            path: docPath,
                            dirname: pathItems.reverse().join("/"),
                            filename: filename,
                            mtime: mtime,
                            mtimeDisp: mtimeToDate(mtime),
                            size: (doc as any)?.size ?? 0,
                            isDeleted: isDeleted,
                            changes: diffDetail,
                            chunks: chunksStatus,
                            isPlain: isPlain,
                        });
                    }
                }
            }

            return [...result].sort((a, b) => b.mtime - a.mtime);
        } finally {
            loading = false;
        }
    }
    async function getHistory(showDiffInfo: boolean, showChunkCorrected: boolean, checkStorageDiff: boolean) {
        loading = true;
        const newDisplay = [];
        const page = await fetchChanges();
        newDisplay.push(...page);
        history = [...newDisplay];
    }

    function nextWeek() {
        dispDateTo = new Date(range_to_epoch - timezoneOffset + 3600 * 1000 * 24 * 7).toISOString().split("T")[0];
    }
    function prevWeek() {
        dispDateFrom = new Date(range_from_epoch - timezoneOffset - 3600 * 1000 * 24 * 7).toISOString().split("T")[0];
    }

    onMount(async () => {
        await getHistory(showDiffInfo, showChunkCorrected, checkStorageDiff);
    });
    onDestroy(() => {});

    function showHistory(file: string, rev: string) {
        new DocumentHistoryModal(plugin.app, plugin, file as unknown as FilePathWithPrefix, undefined, rev).open();
    }
    function openFile(file: string) {
        plugin.app.workspace.openLinkText(file, file);
    }
</script>

<div class="globalhistory">
    <h1>Vault history</h1>
    <div class="control">
        <div class="row"><label for="">From:</label><input type="date" bind:value={dispDateFrom} disabled={loading} /></div>
        <div class="row"><label for="">To:</label><input type="date" bind:value={dispDateTo} disabled={loading} /></div>
        <div class="row">
            <label for="">Info:</label>
            <label><input type="checkbox" bind:checked={showDiffInfo} disabled={loading} /><span>Diff</span></label>
            <label><input type="checkbox" bind:checked={showChunkCorrected} disabled={loading} /><span>Chunks</span></label>
            <label><input type="checkbox" bind:checked={checkStorageDiff} disabled={loading} /><span>File integrity</span></label>
        </div>
    </div>
    {#if loading}
        <div class="">Gathering information...</div>
    {/if}
    <table>
        <tr>
            <th> Date </th>
            <th> Path </th>
            <th> Rev </th>
            <th> Stat </th>
            {#if showChunkCorrected}
                <th> Chunks </th>
            {/if}
        </tr>
        <tr>
            <td colspan="5" class="more">
                {#if loading}
                    <div class="" />
                {:else}
                    <div><button on:click={() => nextWeek()}>+1 week</button></div>
                {/if}
            </td>
        </tr>
        {#each history as entry}
            <tr>
                <td class="mtime">
                    {entry.mtimeDisp}
                </td>
                <td class="path">
                    <div class="filenames">
                        <span class="path">/{entry.dirname.split("/").join(`/`)}</span>
                        <span class="filename"><a on:click={() => openFile(entry.path)}>{entry.filename}</a></span>
                    </div>
                </td>
                <td>
                    <span class="rev">
                        {#if entry.isPlain}
                            <a on:click={() => showHistory(entry.path, entry?.rev || "")}>{entry.rev}</a>
                        {:else}
                            {entry.rev}
                        {/if}
                    </span>
                </td>
                <td>
                    {entry.changes}
                </td>
                {#if showChunkCorrected}
                    <td>
                        {entry.chunks}
                    </td>
                {/if}
            </tr>
        {/each}
        <tr>
            <td colspan="5" class="more">
                {#if loading}
                    <div class="" />
                {:else}
                    <div><button on:click={() => prevWeek()}>+1 week</button></div>
                {/if}
            </td>
        </tr>
    </table>
</div>

<style>
    * {
        box-sizing: border-box;
    }
    .globalhistory {
        margin-bottom: 2em;
    }
    table {
        width: 100%;
    }
    .more > div {
        display: flex;
    }
    .more > div > button {
        flex-grow: 1;
    }
    th {
        position: sticky;
        top: 0;
        backdrop-filter: blur(10px);
    }
    td.mtime {
        white-space: break-spaces;
    }
    td.path {
        word-break: break-word;
    }
    .row {
        display: flex;
        flex-direction: row;
        flex-wrap: wrap;
    }
    .row > label {
        display: flex;
        align-items: center;
        min-width: 5em;
    }
    .row > input {
        flex-grow: 1;
    }

    .filenames {
        display: flex;
        flex-direction: column;
    }
    .filenames > .path {
        font-size: 70%;
    }
    .rev {
        text-overflow: ellipsis;
        max-width: 3em;
        display: inline-block;
        overflow: hidden;
        white-space: nowrap;
    }
</style>
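The "Stat" column is produced by the `diff.reduce(...)` above, which tallies characters per diff operation. A standalone worked example of that reduction (illustrative strings only):

import { diff_match_patch, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";

const dmp = new diff_match_patch();
const diff = dmp.diff_main("hello world", "hello brave world");
dmp.diff_cleanupSemantic(diff);
// Tally the number of characters per operation (-1 delete, 0 equal, 1 insert).
const pxinit = { [DIFF_DELETE]: 0, [DIFF_EQUAL]: 0, [DIFF_INSERT]: 0 } as { [key: number]: number };
const px = diff.reduce((p, c) => ({ ...p, [c[0]]: (p[c[0]] ?? 0) + c[1].length }), pxinit);
console.log(`-${px[DIFF_DELETE]}, +${px[DIFF_INSERT]}`); // "-0, +6" ("brave " was inserted)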
49
src/ui/GlobalHistoryView.ts
Normal file
@@ -0,0 +1,49 @@
import {
    ItemView,
    WorkspaceLeaf
} from "../deps.ts";
import GlobalHistoryComponent from "./GlobalHistory.svelte";
import type ObsidianLiveSyncPlugin from "../main.ts";

export const VIEW_TYPE_GLOBAL_HISTORY = "global-history";
export class GlobalHistoryView extends ItemView {

    component: GlobalHistoryComponent;
    plugin: ObsidianLiveSyncPlugin;
    icon: "clock";
    title: string;
    navigation: true;

    getIcon(): string {
        return "clock";
    }

    constructor(leaf: WorkspaceLeaf, plugin: ObsidianLiveSyncPlugin) {
        super(leaf);
        this.plugin = plugin;
    }

    getViewType() {
        return VIEW_TYPE_GLOBAL_HISTORY;
    }

    getDisplayText() {
        return "Vault history";
    }

    // eslint-disable-next-line require-await
    async onOpen() {
        this.component = new GlobalHistoryComponent({
            target: this.contentEl,
            props: {
                plugin: this.plugin,
            },
        });
    }

    // eslint-disable-next-line require-await
    async onClose() {
        this.component.$destroy();
    }
}
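`GlobalHistoryView` only adapts the Svelte component to Obsidian's workspace-leaf lifecycle; the plugin still has to register and reveal it. A sketch using the standard Obsidian API (the actual registration code is not part of this changeset):

// Inside ObsidianLiveSyncPlugin.onload():
this.registerView(VIEW_TYPE_GLOBAL_HISTORY, (leaf) => new GlobalHistoryView(leaf, this));
// Later, to show the pane:
const leaf = this.app.workspace.getLeaf(true);
await leaf.setViewState({ type: VIEW_TYPE_GLOBAL_HISTORY, active: true });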
@@ -1,6 +1,7 @@
-import { App, Modal } from "./deps";
-import { FilePath, LoadedEntry } from "./lib/src/types";
+import { App, Modal } from "../deps.ts";
+import { type FilePath, type LoadedEntry } from "../lib/src/common/types.ts";
 import JsonResolvePane from "./JsonResolvePane.svelte";
+import { waitForSignal } from "../lib/src/common/utils.ts";

 export class JsonResolveModal extends Modal {
     // result: Array<[number, string]>;
@@ -20,6 +21,7 @@ export class JsonResolveModal extends Modal {
         this.nameA = nameA;
         this.nameB = nameB;
         this.defaultSelect = defaultSelect;
+        waitForSignal(`cancel-internal-conflict:${filename}`).then(() => this.close());
     }
     async UICallback(keepRev: string, mergedStr?: string) {
         this.close();
@@ -29,7 +31,7 @@ export class JsonResolveModal extends Modal {

     onOpen() {
         const { contentEl } = this;
-
+        this.titleEl.setText("Conflicted Setting");
         contentEl.empty();

         if (this.component == null) {
@@ -41,7 +43,7 @@ export class JsonResolveModal extends Modal {
                     nameA: this.nameA,
                     nameB: this.nameB,
                     defaultSelect: this.defaultSelect,
-                    callback: (keepRev, mergedStr) => this.UICallback(keepRev, mergedStr),
+                    callback: (keepRev: string, mergedStr: string) => this.UICallback(keepRev, mergedStr),
                 },
             });
         }
@@ -1,20 +1,20 @@
 <script lang="ts">
-    import { Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "diff-match-patch";
-    import type { FilePath, LoadedEntry } from "./lib/src/types";
-    import { base64ToString } from "./lib/src/strbin";
-    import { getDocData } from "./lib/src/utils";
-    import { mergeObject } from "./utils";
+    import { type Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "../deps";
+    import type { FilePath, LoadedEntry } from "../lib/src/common/types";
+    import { decodeBinary, readString } from "../lib/src/string_and_binary/strbin";
+    import { getDocData } from "../lib/src/common/utils";
+    import { mergeObject } from "../common/utils";

     export let docs: LoadedEntry[] = [];
-    export let callback: (keepRev: string, mergedStr?: string) => Promise<void> = async (_, __) => {
+    export let callback: (keepRev?: string, mergedStr?: string) => Promise<void> = async (_, __) => {
         Promise.resolve();
     };
     export let filename: FilePath = "" as FilePath;
     export let nameA: string = "A";
     export let nameB: string = "B";
     export let defaultSelect: string = "";
-    let docA: LoadedEntry = undefined;
-    let docB: LoadedEntry = undefined;
+    let docA: LoadedEntry;
+    let docB: LoadedEntry;
     let docAContent = "";
     let docBContent = "";
     let objA: any = {};
@@ -26,9 +26,10 @@
     let mode: SelectModes = defaultSelect as SelectModes;

     function docToString(doc: LoadedEntry) {
-        return doc.datatype == "plain" ? getDocData(doc.data) : base64ToString(doc.data);
+        return doc.datatype == "plain" ? getDocData(doc.data) : readString(new Uint8Array(decodeBinary(doc.data)));
     }
-    function revStringToRevNumber(rev: string) {
+    function revStringToRevNumber(rev?: string) {
+        if (!rev) return "";
         return rev.split("-")[0];
     }

@@ -44,15 +45,15 @@
     }
     function apply() {
         if (docA._id == docB._id) {
-            if (mode == "A") return callback(docA._rev, null);
-            if (mode == "B") return callback(docB._rev, null);
+            if (mode == "A") return callback(docA._rev!, undefined);
+            if (mode == "B") return callback(docB._rev!, undefined);
         } else {
-            if (mode == "A") return callback(null, docToString(docA));
-            if (mode == "B") return callback(null, docToString(docB));
+            if (mode == "A") return callback(undefined, docToString(docA));
+            if (mode == "B") return callback(undefined, docToString(docB));
         }
-        if (mode == "BA") return callback(null, JSON.stringify(objBA, null, 2));
-        if (mode == "AB") return callback(null, JSON.stringify(objAB, null, 2));
-        callback(null, null);
+        if (mode == "BA") return callback(undefined, JSON.stringify(objBA, null, 2));
+        if (mode == "AB") return callback(undefined, JSON.stringify(objAB, null, 2));
+        callback(undefined, undefined);
     }
     $: {
         if (docs && docs.length >= 1) {
@@ -104,7 +105,6 @@
     ] as ["" | "A" | "B" | "AB" | "BA", string][];
 </script>

-<h1>Conflicted settings</h1>
 <h2>{filename}</h2>
 {#if !docA || !docB}
     <div class="message">Just for a minute, please!</div>
@@ -134,13 +134,17 @@
 {/if}
 <div>
     {nameA}
-    {#if docA._id == docB._id} Rev:{revStringToRevNumber(docA._rev)} {/if} ,{new Date(docA.mtime).toLocaleString()}
+    {#if docA._id == docB._id}
+        Rev:{revStringToRevNumber(docA._rev)}
+    {/if} ,{new Date(docA.mtime).toLocaleString()}
     {docAContent.length} letters
 </div>

 <div>
     {nameB}
-    {#if docA._id == docB._id} Rev:{revStringToRevNumber(docB._rev)} {/if} ,{new Date(docB.mtime).toLocaleString()}
+    {#if docA._id == docB._id}
+        Rev:{revStringToRevNumber(docB._rev)}
+    {/if} ,{new Date(docB.mtime).toLocaleString()}
     {docBContent.length} letters
 </div>

@@ -1,7 +1,8 @@
-import { App, Modal } from "./deps";
-import { logMessageStore } from "./lib/src/stores";
-import { escapeStringToHTML } from "./lib/src/strbin";
-import ObsidianLiveSyncPlugin from "./main";
+import { App, Modal } from "../deps.ts";
+import type { ReactiveInstance, } from "../lib/src/dataobject/reactive.ts";
+import { logMessages } from "../lib/src/mock_and_interop/stores.ts";
+import { escapeStringToHTML } from "../lib/src/string_and_binary/strbin.ts";
+import ObsidianLiveSyncPlugin from "../main.ts";

 export class LogDisplayModal extends Modal {
     plugin: ObsidianLiveSyncPlugin;
@@ -14,21 +15,23 @@ export class LogDisplayModal extends Modal {

     onOpen() {
         const { contentEl } = this;
+        this.titleEl.setText("Sync status");

         contentEl.empty();
-        contentEl.createEl("h2", { text: "Sync Status" });
         const div = contentEl.createDiv("");
         div.addClass("op-scrollable");
         div.addClass("op-pre");
         this.logEl = div;
-        this.unsubscribe = logMessageStore.observe((e) => {
+        function updateLog(logs: ReactiveInstance<string[]>) {
+            const e = logs.value;
             let msg = "";
             for (const v of e) {
                 msg += escapeStringToHTML(v) + "<br>";
             }
             this.logEl.innerHTML = msg;
-        })
-        logMessageStore.invalidate();
+        }
+        logMessages.onChanged(updateLog);
+        this.unsubscribe = () => logMessages.offChanged(updateLog);
     }
     onClose() {
         const { contentEl } = this;
82
src/ui/LogPane.svelte
Normal file
@@ -0,0 +1,82 @@
<script lang="ts">
    import { onDestroy, onMount } from "svelte";
    import { logMessages } from "../lib/src/mock_and_interop/stores";
    import type { ReactiveInstance } from "../lib/src/dataobject/reactive";
    import { Logger } from "../lib/src/common/logger";

    let unsubscribe: () => void;
    let messages = [] as string[];
    let wrapRight = false;
    let autoScroll = true;
    let suspended = false;
    function updateLog(logs: ReactiveInstance<string[]>) {
        const e = logs.value;
        if (!suspended) {
            messages = [...e];
            setTimeout(() => {
                // Keep the view pinned to the newest entry while "Auto scroll" is enabled.
                if (autoScroll && scroll) scroll.scrollTop = scroll.scrollHeight;
            }, 10);
        }
    }
    onMount(async () => {
        logMessages.onChanged(updateLog);
        Logger("Log window opened");
        unsubscribe = () => logMessages.offChanged(updateLog);
    });
    onDestroy(() => {
        if (unsubscribe) unsubscribe();
    });
    let scroll: HTMLDivElement;
</script>

<div class="logpane">
    <!-- <h1>Self-hosted LiveSync Log</h1> -->
    <div class="control">
        <div class="row">
            <label><input type="checkbox" bind:checked={wrapRight} /><span>Wrap</span></label>
            <label><input type="checkbox" bind:checked={autoScroll} /><span>Auto scroll</span></label>
            <label><input type="checkbox" bind:checked={suspended} /><span>Pause</span></label>
        </div>
    </div>
    <div class="log" bind:this={scroll}>
        {#each messages as line}
            <pre class:wrap-right={wrapRight}>{line}</pre>
        {/each}
    </div>
</div>

<style>
    * {
        box-sizing: border-box;
    }
    .logpane {
        display: flex;
        height: 100%;
        flex-direction: column;
    }
    .log {
        overflow-y: scroll;
        user-select: text;
        padding-bottom: 2em;
    }
    .log > pre {
        margin: 0;
    }
    .log > pre.wrap-right {
        word-break: break-all;
        max-width: 100%;
        width: 100%;
        white-space: normal;
    }
    .row {
        display: flex;
        flex-direction: row;
        justify-content: flex-end;
    }
    .row > label {
        display: flex;
        align-items: center;
        min-width: 5em;
        margin-right: 1em;
    }
</style>
48
src/ui/LogPaneView.ts
Normal file
@@ -0,0 +1,48 @@
import {
    ItemView,
    WorkspaceLeaf
} from "obsidian";
import LogPaneComponent from "./LogPane.svelte";
import type ObsidianLiveSyncPlugin from "../main.ts";
export const VIEW_TYPE_LOG = "log-log";
// Log view
export class LogPaneView extends ItemView {

    component: LogPaneComponent;
    plugin: ObsidianLiveSyncPlugin;
    icon: "view-log";
    title: string;
    navigation: true;

    getIcon(): string {
        return "view-log";
    }

    constructor(leaf: WorkspaceLeaf, plugin: ObsidianLiveSyncPlugin) {
        super(leaf);
        this.plugin = plugin;
    }

    getViewType() {
        return VIEW_TYPE_LOG;
    }

    getDisplayText() {
        return "Self-hosted LiveSync Log";
    }

    // eslint-disable-next-line require-await
    async onOpen() {
        this.component = new LogPaneComponent({
            target: this.contentEl,
            props: {
            },
        });
    }

    // eslint-disable-next-line require-await
    async onClose() {
        this.component.$destroy();
    }
}
2673
src/ui/ObsidianLiveSyncSettingTab.ts
Normal file
470
src/ui/PluginPane.svelte
Normal file
@@ -0,0 +1,470 @@
|
||||
<script lang="ts">
|
||||
import { onMount } from "svelte";
|
||||
import ObsidianLiveSyncPlugin from "../main";
|
||||
import { type PluginDataExDisplay, pluginIsEnumerating, pluginList } from "../features/CmdConfigSync";
|
||||
import PluginCombo from "./components/PluginCombo.svelte";
|
||||
import { Menu } from "obsidian";
|
||||
import { unique } from "../lib/src/common/utils";
|
||||
import { MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED, type SYNC_MODE, type PluginSyncSettingEntry } from "../lib/src/common/types";
|
||||
import { normalizePath } from "../deps";
|
||||
export let plugin: ObsidianLiveSyncPlugin;
|
||||
|
||||
$: hideNotApplicable = false;
|
||||
$: thisTerm = plugin.deviceAndVaultName;
|
||||
|
||||
const addOn = plugin.addOnConfigSync;
|
||||
|
||||
let list: PluginDataExDisplay[] = [];
|
||||
|
||||
let selectNewestPulse = 0;
|
||||
let hideEven = false;
|
||||
let loading = false;
|
||||
let applyAllPluse = 0;
|
||||
let isMaintenanceMode = false;
|
||||
async function requestUpdate() {
|
||||
await addOn.updatePluginList(true);
|
||||
}
|
||||
async function requestReload() {
|
||||
await addOn.reloadPluginList(true);
|
||||
}
|
||||
let allTerms = [] as string[];
|
||||
pluginList.subscribe((e) => {
|
||||
list = e;
|
||||
allTerms = unique(list.map((e) => e.term));
|
||||
});
|
||||
pluginIsEnumerating.subscribe((e) => {
|
||||
loading = e;
|
||||
});
|
||||
onMount(async () => {
|
||||
requestUpdate();
|
||||
});
|
||||
|
||||
function filterList(list: PluginDataExDisplay[], categories: string[]) {
|
||||
const w = list.filter((e) => categories.indexOf(e.category) !== -1);
|
||||
return w.sort((a, b) => `${a.category}-${a.name}`.localeCompare(`${b.category}-${b.name}`));
|
||||
}
|
||||
|
||||
function groupBy(items: PluginDataExDisplay[], key: string) {
|
||||
let ret = {} as Record<string, PluginDataExDisplay[]>;
|
||||
for (const v of items) {
|
||||
//@ts-ignore
|
||||
const k = (key in v ? v[key] : "") as string;
|
||||
ret[k] = ret[k] || [];
|
||||
ret[k].push(v);
|
||||
}
|
||||
for (const k in ret) {
|
||||
ret[k] = ret[k].sort((a, b) => `${a.category}-${a.name}`.localeCompare(`${b.category}-${b.name}`));
|
||||
}
|
||||
const w = Object.entries(ret);
|
||||
return w.sort(([a], [b]) => `${a}`.localeCompare(`${b}`));
|
||||
}
|
||||
|
||||
const displays = {
|
||||
CONFIG: "Configuration",
|
||||
THEME: "Themes",
|
||||
SNIPPET: "Snippets",
|
||||
};
|
||||
async function scanAgain() {
|
||||
await addOn.scanAllConfigFiles(true);
|
||||
await requestUpdate();
|
||||
}
|
||||
async function replicate() {
|
||||
await plugin.replicate(true);
|
||||
}
|
||||
function selectAllNewest() {
|
||||
selectNewestPulse++;
|
||||
}
|
||||
function applyAll() {
|
||||
applyAllPluse++;
|
||||
}
|
||||
async function applyData(data: PluginDataExDisplay): Promise<boolean> {
|
||||
return await addOn.applyData(data);
|
||||
}
|
||||
async function compareData(docA: PluginDataExDisplay, docB: PluginDataExDisplay): Promise<boolean> {
|
||||
return await addOn.compareUsingDisplayData(docA, docB);
|
||||
}
|
||||
async function deleteData(data: PluginDataExDisplay): Promise<boolean> {
|
||||
return await addOn.deleteData(data);
|
||||
}
|
||||
function askMode(evt: MouseEvent, title: string, key: string) {
|
||||
const menu = new Menu();
|
||||
menu.addItem((item) => item.setTitle(title).setIsLabel(true));
|
||||
menu.addSeparator();
|
||||
const prevMode = automaticList.get(key) ?? MODE_SELECTIVE;
|
||||
for (const mode of [MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED]) {
|
||||
menu.addItem((item) => {
|
||||
item.setTitle(`${getIcon(mode as SYNC_MODE)}:${TITLES[mode]}`)
|
||||
.onClick((e) => {
|
||||
if (mode === MODE_AUTOMATIC) {
|
||||
askOverwriteModeForAutomatic(evt, key);
|
||||
} else {
|
||||
setMode(key, mode as SYNC_MODE);
|
||||
}
|
||||
})
|
||||
.setChecked(prevMode == mode)
|
||||
.setDisabled(prevMode == mode);
|
||||
});
|
||||
}
|
||||
menu.showAtMouseEvent(evt);
|
||||
}
|
||||
function applyAutomaticSync(key: string, direction: "pushForce" | "pullForce" | "safe") {
|
||||
setMode(key, MODE_AUTOMATIC);
|
||||
const configDir = normalizePath(plugin.app.vault.configDir);
|
||||
const files = (plugin.settings.pluginSyncExtendedSetting[key]?.files ?? []).map((e) => `${configDir}/${e}`);
|
||||
plugin.addOnHiddenFileSync.syncInternalFilesAndDatabase(direction, true, false, files);
|
||||
}
|
||||
function askOverwriteModeForAutomatic(evt: MouseEvent, key: string) {
|
||||
const menu = new Menu();
|
||||
menu.addItem((item) => item.setTitle("Initial Action").setIsLabel(true));
|
||||
menu.addSeparator();
|
||||
menu.addItem((item) => {
|
||||
item.setTitle(`↑: Overwrite Remote`).onClick((e) => {
|
||||
applyAutomaticSync(key, "pushForce");
|
||||
});
|
||||
})
|
||||
.addItem((item) => {
|
||||
item.setTitle(`↓: Overwrite Local`).onClick((e) => {
|
||||
applyAutomaticSync(key, "pullForce");
|
||||
});
|
||||
})
|
||||
.addItem((item) => {
|
||||
item.setTitle(`⇅: Use newer`).onClick((e) => {
|
||||
applyAutomaticSync(key, "safe");
|
||||
});
|
||||
});
|
||||
menu.showAtMouseEvent(evt);
|
||||
}
|
||||
|
||||
$: options = {
|
||||
thisTerm,
|
||||
hideNotApplicable,
|
||||
selectNewest: selectNewestPulse,
|
||||
applyAllPluse,
|
||||
applyData,
|
||||
compareData,
|
||||
deleteData,
|
||||
plugin,
|
||||
isMaintenanceMode,
|
||||
};
|
||||
|
||||
const ICON_EMOJI_PAUSED = `⛔`;
|
||||
const ICON_EMOJI_AUTOMATIC = `✨`;
|
||||
const ICON_EMOJI_SELECTIVE = `🔀`;
|
||||
|
||||
const ICONS: { [key: number]: string } = {
|
||||
[MODE_SELECTIVE]: ICON_EMOJI_SELECTIVE,
|
||||
[MODE_PAUSED]: ICON_EMOJI_PAUSED,
|
||||
[MODE_AUTOMATIC]: ICON_EMOJI_AUTOMATIC,
|
||||
};
|
||||
const TITLES: { [key: number]: string } = {
|
||||
[MODE_SELECTIVE]: "Selective",
|
||||
[MODE_PAUSED]: "Ignore",
|
||||
[MODE_AUTOMATIC]: "Automatic",
|
||||
};
|
||||
const PREFIX_PLUGIN_ALL = "PLUGIN_ALL";
|
||||
const PREFIX_PLUGIN_DATA = "PLUGIN_DATA";
|
||||
const PREFIX_PLUGIN_MAIN = "PLUGIN_MAIN";
|
||||
function setMode(key: string, mode: SYNC_MODE) {
|
||||
if (key.startsWith(PREFIX_PLUGIN_ALL + "/")) {
|
||||
setMode(PREFIX_PLUGIN_DATA + key.substring(PREFIX_PLUGIN_ALL.length), mode);
|
||||
setMode(PREFIX_PLUGIN_MAIN + key.substring(PREFIX_PLUGIN_ALL.length), mode);
|
||||
}
|
||||
const files = unique(
|
||||
list
|
||||
.filter((e) => `${e.category}/${e.name}` == key)
|
||||
.map((e) => e.files)
|
||||
.flat()
|
||||
.map((e) => e.filename),
|
||||
);
|
||||
automaticList.set(key, mode);
|
||||
automaticListDisp = automaticList;
|
||||
if (!(key in plugin.settings.pluginSyncExtendedSetting)) {
|
||||
plugin.settings.pluginSyncExtendedSetting[key] = {
|
||||
key,
|
||||
mode,
|
||||
files: [],
|
||||
};
|
||||
}
|
||||
plugin.settings.pluginSyncExtendedSetting[key].files = files;
|
||||
plugin.settings.pluginSyncExtendedSetting[key].mode = mode;
|
||||
plugin.saveSettingData();
|
||||
}
|
||||
function getIcon(mode: SYNC_MODE) {
|
||||
if (mode in ICONS) {
|
||||
return ICONS[mode];
|
||||
} else {
|
||||
("");
|
||||
}
|
||||
}
|
||||
let automaticList = new Map<string, SYNC_MODE>();
|
||||
let automaticListDisp = new Map<string, SYNC_MODE>();
|
||||
|
||||
// apply current configuration to the dialogue
|
||||
for (const { key, mode } of Object.values(plugin.settings.pluginSyncExtendedSetting)) {
|
||||
automaticList.set(key, mode);
|
||||
}
|
||||
|
||||
automaticListDisp = automaticList;
|
||||
|
||||
let displayKeys: Record<string, string[]> = {};
    $: {
        const extraKeys = Object.keys(plugin.settings.pluginSyncExtendedSetting);
        displayKeys = [
            ...list,
            ...extraKeys
                .map((e) => `${e}///`.split("/"))
                .filter((e) => e[0] && e[1])
                .map((e) => ({ category: e[0], name: e[1], displayName: e[1] })),
        ]
            .sort((a, b) => (a.displayName ?? a.name).localeCompare(b.displayName ?? b.name))
            .reduce((p, c) => ({ ...p, [c.category]: unique(c.category in p ? [...p[c.category], c.displayName ?? c.name] : [c.displayName ?? c.name]) }), {} as Record<string, string[]>);
    }

    let deleteTerm = "";

    async function deleteAllItems(term: string) {
        const deleteItems = list.filter((e) => e.term == term);
        for (const item of deleteItems) {
            await deleteData(item);
        }
        addOn.reloadPluginList(true);
    }
</script>

<div>
    <div>
        <div class="buttons">
            <button on:click={() => scanAgain()}>Scan changes</button>
            <button on:click={() => replicate()}>Sync once</button>
            <button on:click={() => requestUpdate()}>Refresh</button>
            {#if isMaintenanceMode}
                <button on:click={() => requestReload()}>Reload</button>
            {/if}
            <button on:click={() => selectAllNewest()}>Select All Shiny</button>
        </div>
        <div class="buttons">
            <button on:click={() => applyAll()}>Apply All</button>
        </div>
    </div>
    {#if loading}
        <div>
            <span>Updating list...</span>
        </div>
    {/if}
    <div class="list">
        {#if list.length == 0}
            <div class="center">No Items.</div>
        {:else}
            {#each Object.entries(displays).filter(([key, _]) => key in displayKeys) as [key, label]}
                <div>
                    <h3>{label}</h3>
                    {#each displayKeys[key] as name}
                        {@const bindKey = `${key}/${name}`}
                        {@const mode = automaticListDisp.get(bindKey) ?? MODE_SELECTIVE}
                        <div class="labelrow {hideEven ? 'hideeven' : ''}">
                            <div class="title">
                                <button class="status" on:click={(evt) => askMode(evt, `${key}/${name}`, bindKey)}>
                                    {getIcon(mode)}
                                </button>
                                <span class="name">{name}</span>
                            </div>
                            {#if mode == MODE_SELECTIVE}
                                <PluginCombo {...options} list={list.filter((e) => e.category == key && e.name == name)} hidden={false} />
                            {:else}
                                <div class="statusnote">{TITLES[mode]}</div>
                            {/if}
                        </div>
                    {/each}
                </div>
            {/each}
            <div>
                <h3>Plugins</h3>
                {#each groupBy(filterList(list, ["PLUGIN_MAIN", "PLUGIN_DATA", "PLUGIN_ETC"]), "name") as [name, listX]}
                    {@const bindKeyAll = `${PREFIX_PLUGIN_ALL}/${name}`}
                    {@const modeAll = automaticListDisp.get(bindKeyAll) ?? MODE_SELECTIVE}
                    {@const bindKeyMain = `${PREFIX_PLUGIN_MAIN}/${name}`}
                    {@const modeMain = automaticListDisp.get(bindKeyMain) ?? MODE_SELECTIVE}
                    {@const bindKeyData = `${PREFIX_PLUGIN_DATA}/${name}`}
                    {@const modeData = automaticListDisp.get(bindKeyData) ?? MODE_SELECTIVE}
                    <div class="labelrow {hideEven ? 'hideeven' : ''}">
                        <div class="title">
                            <button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_ALL}/${name}`, bindKeyAll)}>
                                {getIcon(modeAll)}
                            </button>
                            <span class="name">{name}</span>
                        </div>
                        {#if modeAll == MODE_SELECTIVE}
                            <PluginCombo {...options} list={listX} hidden={true} />
                        {/if}
                    </div>
                    {#if modeAll == MODE_SELECTIVE}
                        <div class="filerow {hideEven ? 'hideeven' : ''}">
                            <div class="filetitle">
                                <button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_MAIN}/${name}/MAIN`, bindKeyMain)}>
                                    {getIcon(modeMain)}
                                </button>
                                <span class="name">MAIN</span>
                            </div>
                            {#if modeMain == MODE_SELECTIVE}
                                <PluginCombo {...options} list={filterList(listX, ["PLUGIN_MAIN"])} hidden={false} />
                            {:else}
                                <div class="statusnote">{TITLES[modeMain]}</div>
                            {/if}
                        </div>
                        <div class="filerow {hideEven ? 'hideeven' : ''}">
                            <div class="filetitle">
                                <button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_DATA}/${name}`, bindKeyData)}>
                                    {getIcon(modeData)}
                                </button>
                                <span class="name">DATA</span>
                            </div>
                            {#if modeData == MODE_SELECTIVE}
                                <PluginCombo {...options} list={filterList(listX, ["PLUGIN_DATA"])} hidden={false} />
                            {:else}
                                <div class="statusnote">{TITLES[modeData]}</div>
                            {/if}
                        </div>
                    {:else}
                        <div class="noterow">
                            <div class="statusnote">{TITLES[modeAll]}</div>
                        </div>
                    {/if}
                {/each}
            </div>
        {/if}
    </div>
    {#if isMaintenanceMode}
        <div class="list">
            <div>
                <h3>Maintenance Commands</h3>
                <div class="maintenancerow">
                    <label for="">Delete All of </label>
                    <select bind:value={deleteTerm}>
                        {#each allTerms as term}
                            <option value={term}>{term}</option>
                        {/each}
                    </select>
                    <button
                        class="status"
                        on:click={(evt) => {
                            deleteAllItems(deleteTerm);
                        }}
                    >
                        🗑️
                    </button>
                </div>
            </div>
        </div>
    {/if}
    <div class="buttons">
        <label><span>Hide not applicable items</span><input type="checkbox" bind:checked={hideEven} /></label>
    </div>
    <div class="buttons">
        <label><span>Maintenance mode</span><input type="checkbox" bind:checked={isMaintenanceMode} /></label>
    </div>
</div>

<style>
    span.spacer {
        min-width: 1px;
        flex-grow: 1;
    }
    h3 {
        position: sticky;
        top: 0;
        background-color: var(--modal-background);
    }
    .labelrow {
        margin-left: 0.4em;
        display: flex;
        justify-content: flex-start;
        align-items: center;
        border-top: 1px solid var(--background-modifier-border);
        padding: 4px;
        flex-wrap: wrap;
    }
    .filerow {
        margin-left: 1.25em;
        display: flex;
        justify-content: flex-start;
        align-items: center;
        padding-right: 4px;
        flex-wrap: wrap;
    }
    .filerow.hideeven:has(.even),
    .labelrow.hideeven:has(.even) {
        display: none;
    }
    .noterow {
        min-height: 2em;
        display: flex;
    }
    button.status {
        flex-grow: 0;
        margin: 2px 4px;
        min-width: 3em;
        max-width: 4em;
    }
    .statusnote {
        display: flex;
        justify-content: flex-end;
        padding-right: var(--size-4-12);
        align-items: center;
        min-width: 10em;
        flex-grow: 1;
    }

    .title {
        color: var(--text-normal);
        font-size: var(--font-ui-medium);
        line-height: var(--line-height-tight);
        margin-right: auto;
    }
    .filetitle {
        color: var(--text-normal);
        font-size: var(--font-ui-medium);
        line-height: var(--line-height-tight);
        margin-right: auto;
    }
    .buttons {
        display: flex;
        flex-direction: row;
        justify-content: flex-end;
        margin-top: 8px;
        flex-wrap: wrap;
    }
    .buttons > button {
        margin-left: 4px;
        width: auto;
    }

    label {
        display: flex;
        justify-content: center;
        align-items: center;
    }
    label > span {
        margin-right: 0.25em;
    }
    :global(.is-mobile) .title,
    :global(.is-mobile) .filetitle {
        width: 100%;
    }

    .center {
        display: flex;
        justify-content: center;
        align-items: center;
        min-height: 3em;
    }
    .maintenancerow {
        display: flex;
        justify-content: flex-end;
        align-items: center;
    }
    .maintenancerow label {
        margin-right: 0.5em;
        margin-left: 0.5em;
    }
</style>
83
src/ui/components/MultipleRegExpControl.svelte
Normal file
@@ -0,0 +1,83 @@
<script lang="ts">
    export let patterns = [] as string[];
    export let originals = [] as string[];

    export let apply: (args: string[]) => Promise<void> = (_: string[]) => Promise.resolve();
    function revert() {
        patterns = [...originals];
    }
    const CHECK_OK = "✔";
    const CHECK_NG = "⚠";
    const MARK_MODIFIED = "✏ ";
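    // Validate by construction: creating a RegExp throws on an invalid pattern.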
    function checkRegExp(pattern: string) {
        if (pattern.trim() == "") return "";
        try {
            const _ = new RegExp(pattern);
            return CHECK_OK;
        } catch (ex) {
            return CHECK_NG;
        }
    }
    $: status = patterns.map((e) => checkRegExp(e));
    $: modified = patterns.map((e, i) => (e != (originals?.[i] ?? "") ? MARK_MODIFIED : ""));

    function remove(idx: number) {
        patterns[idx] = "";
    }
    function add() {
        patterns = [...patterns, ""];
    }
</script>

<ul>
    {#each patterns as pattern, idx}
        <li><label>{modified[idx]}{status[idx]}</label><input type="text" bind:value={pattern} class={modified[idx]} /><button class="iconbutton" on:click={() => remove(idx)}>🗑</button></li>
    {/each}
    <li>
        <label><button on:click={() => add()}>Add</button></label>
    </li>
    <li class="buttons">
        <button on:click={() => apply(patterns)} disabled={status.some((e) => e == CHECK_NG) || modified.every((e) => e == "")}>Apply</button>
        <button on:click={() => revert()} disabled={status.some((e) => e == CHECK_NG) || modified.every((e) => e == "")}>Revert</button>
    </li>
</ul>

<style>
    label {
        min-width: 4em;
        width: 4em;
        display: inline-flex;
        flex-direction: row;
        justify-content: flex-end;
    }
    ul {
        flex-grow: 1;
        display: inline-flex;
        flex-direction: column;
        list-style-type: none;
        margin-block-start: 0;
        margin-block-end: 0;
        margin-inline-start: 0px;
        margin-inline-end: 0px;
        padding-inline-start: 0;
    }
    li {
        padding: var(--size-2-1) var(--size-4-1);
        display: inline-flex;
        flex-grow: 1;
        align-items: center;
        justify-content: flex-end;
        gap: var(--size-4-2);
    }
    li input {
        min-width: 10em;
    }
    li.buttons {
    }
    button.iconbutton {
        max-width: 4em;
    }
    span.spacer {
        flex-grow: 1;
    }
</style>
@@ -1,11 +1,11 @@
 <script lang="ts">
-    import type { PluginDataExDisplay } from "./CmdConfigSync";
-    import { Logger } from "./lib/src/logger";
-    import { versionNumberString2Number } from "./lib/src/strbin";
-    import { FilePath, LOG_LEVEL } from "./lib/src/types";
-    import { getDocData } from "./lib/src/utils";
-    import type ObsidianLiveSyncPlugin from "./main";
-    import { askString, scheduleTask } from "./utils";
+    import type { PluginDataExDisplay } from "../../features/CmdConfigSync";
+    import { Logger } from "../../lib/src/common/logger";
+    import { versionNumberString2Number } from "../../lib/src/string_and_binary/strbin";
+    import { type FilePath, LOG_LEVEL_NOTICE } from "../../lib/src/common/types";
+    import { getDocData } from "../../lib/src/common/utils";
+    import type ObsidianLiveSyncPlugin from "../../main";
+    import { askString, scheduleTask } from "../../common/utils";
 
     export let list: PluginDataExDisplay[] = [];
     export let thisTerm = "";
@@ -108,7 +108,7 @@
                 }
             }
         })
-        .reduce((p, c) => p | c, 0);
+        .reduce((p, c) => p | (c as number), 0 as number);
     if (matchingStatus == 0b0000100) {
         equivalency = "⚖️ Same";
         canApply = false;
@@ -207,21 +207,21 @@
     const local = list.find((e) => e.term == thisTerm);
     const selectedItem = list.find((e) => e.term == selected);
     if (selectedItem && (await applyData(selectedItem))) {
-        scheduleTask("update-plugin-list", 250, () => addOn.updatePluginList(true, local.documentPath));
+        addOn.updatePluginList(true, local?.documentPath);
     }
 }
 async function compareSelected() {
     const local = list.find((e) => e.term == thisTerm);
     const selectedItem = list.find((e) => e.term == selected);
     if (local && selectedItem && (await compareData(local, selectedItem))) {
-        scheduleTask("update-plugin-list", 250, () => addOn.updatePluginList(true, local.documentPath));
+        addOn.updatePluginList(true, local.documentPath);
     }
 }
 async function deleteSelected() {
     const selectedItem = list.find((e) => e.term == selected);
     // const deletedPath = selectedItem.documentPath;
     if (selectedItem && (await deleteData(selectedItem))) {
-        scheduleTask("update-plugin-list", 250, () => addOn.reloadPluginList(true));
+        addOn.reloadPluginList(true);
     }
 }
 async function duplicateItem() {
@@ -229,7 +229,7 @@
     const duplicateTermName = await askString(plugin.app, "Duplicate", "device name", "");
     if (duplicateTermName) {
         if (duplicateTermName.contains("/")) {
-            Logger(`We can not use "/" to the device name`, LOG_LEVEL.NOTICE);
+            Logger(`We cannot use "/" in the device name`, LOG_LEVEL_NOTICE);
             return;
         }
         const key = `${plugin.app.vault.configDir}/${local.files[0].filename}`;
@@ -274,7 +274,7 @@
         {/if}
     {:else}
         <span class="spacer" />
-        <span class="message even">All devices are even</span>
+        <span class="message even">All the same or non-existent</span>
         <button disabled />
         <button disabled />
     {/if}
740
src/utils.ts
@@ -1,740 +0,0 @@
import { DataWriteOptions, normalizePath, TFile, Platform, TAbstractFile, App, Plugin_2, RequestUrlParam, requestUrl } from "./deps";
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";

import { Logger } from "./lib/src/logger";
import { AnyEntry, DocumentID, EntryDoc, EntryHasPath, FilePath, FilePathWithPrefix, LOG_LEVEL, NewEntry } from "./lib/src/types";
import { CHeader, ICHeader, ICHeaderLength, PSCHeader } from "./types";
import { InputStringDialog, PopoverSelectString } from "./dialogs";
import ObsidianLiveSyncPlugin from "./main";
import { runWithLock } from "./lib/src/lock";

// For backward compatibility, the path is used for determining the id.
// Only IDs unacceptable to CouchDB (those starting with an underscore) are prefixed with "/".
// The first slash will be deleted when the path is normalised.
export async function path2id(filename: FilePathWithPrefix | FilePath, obfuscatePassphrase: string | false): Promise<DocumentID> {
    const temp = filename.split(":");
    const path = temp.pop();
    const normalizedPath = normalizePath(path as FilePath);
    temp.push(normalizedPath);
    const fixedPath = temp.join(":") as FilePathWithPrefix;

    const out = await path2id_base(fixedPath, obfuscatePassphrase);
    return out;
}
export function id2path(id: DocumentID, entry?: EntryHasPath): FilePathWithPrefix {
    const filename = id2path_base(id, entry);
    const temp = filename.split(":");
    const path = temp.pop();
    const normalizedPath = normalizePath(path as FilePath);
    temp.push(normalizedPath);
    const fixedPath = temp.join(":") as FilePathWithPrefix;
    return fixedPath;
}
export function getPath(entry: AnyEntry) {
    return id2path(entry._id, entry);
}
export function getPathWithoutPrefix(entry: AnyEntry) {
    const f = getPath(entry);
    return stripAllPrefixes(f);
}

export function getPathFromTFile(file: TAbstractFile) {
    return file.path as FilePath;
}

const tasks: { [key: string]: ReturnType<typeof setTimeout> } = {};
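// Debounce helper: schedules `proc` to run once after `timeout` ms; scheduling
// again under the same key resets the timer unless `skipIfTaskExist` is set.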
export function scheduleTask(key: string, timeout: number, proc: (() => Promise<any> | void), skipIfTaskExist?: boolean) {
    if (skipIfTaskExist && key in tasks) {
        return;
    }
    cancelTask(key);
    tasks[key] = setTimeout(async () => {
        delete tasks[key];
        await proc();
    }, timeout);
}
export function cancelTask(key: string) {
    if (key in tasks) {
        clearTimeout(tasks[key]);
        delete tasks[key];
    }
}
export function cancelAllTasks() {
    for (const v in tasks) {
        clearTimeout(tasks[v]);
        delete tasks[v];
    }
}
const intervals: { [key: string]: ReturnType<typeof setInterval> } = {};
export function setPeriodicTask(key: string, timeout: number, proc: (() => Promise<any> | void)) {
    cancelPeriodicTask(key);
    intervals[key] = setInterval(async () => {
        delete intervals[key];
        await proc();
    }, timeout);
}
export function cancelPeriodicTask(key: string) {
    if (key in intervals) {
        clearInterval(intervals[key]);
        delete intervals[key];
    }
}
export function cancelAllPeriodicTask() {
    for (const v in intervals) {
        clearInterval(intervals[v]);
        delete intervals[v];
    }
}

const memos: { [key: string]: any } = {};
export function memoObject<T>(key: string, obj: T): T {
    memos[key] = obj;
    return memos[key] as T;
}
export async function memoIfNotExist<T>(key: string, func: () => T | Promise<T>): Promise<T> {
    if (!(key in memos)) {
        const w = func();
        const v = w instanceof Promise ? (await w) : w;
        memos[key] = v;
    }
    return memos[key] as T;
}
export function retrieveMemoObject<T>(key: string): T | false {
    if (key in memos) {
        return memos[key];
    } else {
        return false;
    }
}
export function disposeMemoObject(key: string) {
    delete memos[key];
}

export function isSensibleMargeApplicable(path: string) {
    if (path.endsWith(".md")) return true;
    return false;
}
export function isObjectMargeApplicable(path: string) {
    if (path.endsWith(".canvas")) return true;
    if (path.endsWith(".json")) return true;
    return false;
}

export function tryParseJSON(str: string, fallbackValue?: any) {
    try {
        return JSON.parse(str);
    } catch (ex) {
        return fallbackValue;
    }
}

const MARK_OPERATOR = `\u{0001}`;
const MARK_DELETED = `${MARK_OPERATOR}__DELETED`;
const MARK_ISARRAY = `${MARK_OPERATOR}__ARRAY`;
const MARK_SWAPPED = `${MARK_OPERATOR}__SWAP`;
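// Patch vocabulary: MARK_DELETED marks removed keys, MARK_SWAPPED wraps values
// that are replaced wholesale, and MARK_ISARRAY wraps a diff of an id-keyed
// array that has been converted into an object keyed by each item's `id`.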

function unorderedArrayToObject(obj: Array<any>) {
    return obj.map(e => ({ [e.id as string]: e })).reduce((p, c) => ({ ...p, ...c }), {});
}
function objectToUnorderedArray(obj: object) {
    const entries = Object.entries(obj);
    if (entries.some(e => e[0] != e[1]?.id)) throw new Error("Item does not look like an unordered array");
    return entries.map(e => e[1]);
}
function generatePatchUnorderedArray(from: Array<any>, to: Array<any>) {
    if (from.every(e => typeof (e) == "object" && ("id" in e)) && to.every(e => typeof (e) == "object" && ("id" in e))) {
        const fObj = unorderedArrayToObject(from);
        const tObj = unorderedArrayToObject(to);
        const diff = generatePatchObj(fObj, tObj);
        if (Object.keys(diff).length > 0) {
            return { [MARK_ISARRAY]: diff };
        } else {
            return {};
        }
    }
    return { [MARK_SWAPPED]: to };
}

export function generatePatchObj(from: Record<string | number | symbol, any>, to: Record<string | number | symbol, any>) {
    const entries = Object.entries(from);
    const tempMap = new Map<string | number | symbol, any>(entries);
    const ret = {} as Record<string | number | symbol, any>;
    const newEntries = Object.entries(to);
    for (const [key, value] of newEntries) {
        if (!tempMap.has(key)) {
            // New key
            ret[key] = value;
            tempMap.delete(key);
        } else {
            // Existing key
            const v = tempMap.get(key);
            if (typeof (v) !== typeof (value) || (Array.isArray(v) !== Array.isArray(value))) {
                // If the type does not match, replace completely.
                ret[key] = { [MARK_SWAPPED]: value };
            } else {
                if (typeof (v) == "object" && typeof (value) == "object" && !Array.isArray(v) && !Array.isArray(value)) {
                    const wk = generatePatchObj(v, value);
                    if (Object.keys(wk).length > 0) ret[key] = wk;
                } else if (typeof (v) == "object" && typeof (value) == "object" && Array.isArray(v) && Array.isArray(value)) {
                    const wk = generatePatchUnorderedArray(v, value);
                    if (Object.keys(wk).length > 0) ret[key] = wk;
                } else if (typeof (v) != "object" && typeof (value) != "object") {
                    if (JSON.stringify(tempMap.get(key)) !== JSON.stringify(value)) {
                        ret[key] = value;
                    }
                } else {
                    if (JSON.stringify(tempMap.get(key)) !== JSON.stringify(value)) {
                        ret[key] = { [MARK_SWAPPED]: value };
                    }
                }
            }
            tempMap.delete(key);
        }
    }
    // Keys left unused in tempMap are the deleted ones.
    for (const [key,] of tempMap) {
        ret[key] = MARK_DELETED;
    }
    return ret;
}
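// Example (illustrative): generatePatchObj({ a: 1, b: 2 }, { a: 1, b: 3, c: 4 })
// yields { b: 3, c: 4 }, while generatePatchObj({ a: 1, b: 2 }, { a: 1 })
// yields { b: MARK_DELETED }.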

export function applyPatch(from: Record<string | number | symbol, any>, patch: Record<string | number | symbol, any>) {
    const ret = from;
    const patches = Object.entries(patch);
    for (const [key, value] of patches) {
        if (value == MARK_DELETED) {
            delete ret[key];
            continue;
        }
        if (typeof (value) == "object") {
            if (MARK_SWAPPED in value) {
                ret[key] = value[MARK_SWAPPED];
                continue;
            }
            if (MARK_ISARRAY in value) {
                if (!(key in ret)) ret[key] = [];
                if (!Array.isArray(ret[key])) {
                    throw new Error("Patch target type is mismatched (array to something)");
                }
                const orgArrayObject = unorderedArrayToObject(ret[key]);
                const appliedObject = applyPatch(orgArrayObject, value[MARK_ISARRAY]);
                const appliedArray = objectToUnorderedArray(appliedObject);
                ret[key] = [...appliedArray];
            } else {
                if (!(key in ret)) {
                    ret[key] = value;
                    continue;
                }
                ret[key] = applyPatch(ret[key], value);
            }
        } else {
            ret[key] = value;
        }
    }
    return ret;
}
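// Intended round-trip: applyPatch(from, generatePatchObj(from, to)) should
// rebuild `to` for JSON-style data; this pair backs the object merging of
// .json and .canvas files.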

export function mergeObject(
    objA: Record<string | number | symbol, any> | [any],
    objB: Record<string | number | symbol, any> | [any]
) {
    const newEntries = Object.entries(objB);
    const ret: any = { ...objA };
    if (
        typeof objA !== typeof objB ||
        Array.isArray(objA) !== Array.isArray(objB)
    ) {
        return objB;
    }

    for (const [key, v] of newEntries) {
        if (key in ret) {
            const value = ret[key];
            if (
                typeof v !== typeof value ||
                Array.isArray(v) !== Array.isArray(value)
            ) {
                // If the type does not match, replace completely.
                ret[key] = v;
            } else {
                if (
                    typeof v == "object" &&
                    typeof value == "object" &&
                    !Array.isArray(v) &&
                    !Array.isArray(value)
                ) {
                    ret[key] = mergeObject(v, value);
                } else if (
                    typeof v == "object" &&
                    typeof value == "object" &&
                    Array.isArray(v) &&
                    Array.isArray(value)
                ) {
                    ret[key] = [...new Set([...v, ...value])];
                } else {
                    ret[key] = v;
                }
            }
        } else {
            ret[key] = v;
        }
    }
    if (Array.isArray(objA) && Array.isArray(objB)) {
        return Object.values(Object.entries(ret)
            .sort()
            .reduce((p, [key, value]) => ({ ...p, [key]: value }), {}));
    }
    return Object.entries(ret)
        .sort()
        .reduce((p, [key, value]) => ({ ...p, [key]: value }), {});
}
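// Note: entries are sorted by key before the result is rebuilt, so merged
// results are deterministic regardless of the original insertion order.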

export function flattenObject(obj: Record<string | number | symbol, any>, path: string[] = []): [string, any][] {
    if (typeof (obj) != "object") return [[path.join("."), obj]];
    if (Array.isArray(obj)) return [[path.join("."), JSON.stringify(obj)]];
    const e = Object.entries(obj);
    const ret = [];
    for (const [key, value] of e) {
        const p = flattenObject(value, [...path, key]);
        ret.push(...p);
    }
    return ret;
}
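// Example (illustrative): flattenObject({ a: { b: 1 }, c: [2] }) yields
// [["a.b", 1], ["c", "[2]"]]; arrays are kept as their JSON string.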

export function modifyFile(file: TFile, data: string | ArrayBuffer, options?: DataWriteOptions) {
    if (typeof (data) === "string") {
        return app.vault.modify(file, data, options);
    } else {
        return app.vault.modifyBinary(file, data, options);
    }
}
export function createFile(path: string, data: string | ArrayBuffer, options?: DataWriteOptions): Promise<TFile> {
    if (typeof (data) === "string") {
        return app.vault.create(path, data, options);
    } else {
        return app.vault.createBinary(path, data, options);
    }
}

export function isValidPath(filename: string) {
    if (Platform.isDesktop) {
        // if(Platform.isMacOS) return isValidFilenameInDarwin(filename);
        if (process.platform == "darwin") return isValidFilenameInDarwin(filename);
        if (process.platform == "linux") return isValidFilenameInLinux(filename);
        return isValidFilenameInWidows(filename);
    }
    if (Platform.isAndroidApp) return isValidFilenameInAndroid(filename);
    if (Platform.isIosApp) return isValidFilenameInDarwin(filename);
    // Fallback
    Logger("Could not determine platform for checking filename", LOG_LEVEL.VERBOSE);
    return isValidFilenameInWidows(filename);
}

let touchedFiles: string[] = [];

export function getAbstractFileByPath(path: FilePath): TAbstractFile | null {
    // Disabled temporarily.
    return app.vault.getAbstractFileByPath(path);
    // // Hidden API but so useful.
    // // @ts-ignore
    // if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
    //     // @ts-ignore
    //     return app.vault.getAbstractFileByPathInsensitive(path);
    // } else {
    //     return app.vault.getAbstractFileByPath(path);
    // }
}
export function trimPrefix(target: string, prefix: string) {
    return target.startsWith(prefix) ? target.substring(prefix.length) : target;
}

export function touch(file: TFile | FilePath) {
    const f = file instanceof TFile ? file : getAbstractFileByPath(file) as TFile;
    const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
    touchedFiles.unshift(key);
    touchedFiles = touchedFiles.slice(0, 100);
}
export function recentlyTouched(file: TFile) {
    const key = `${file.path}-${file.stat.mtime}-${file.stat.size}`;
    if (touchedFiles.indexOf(key) == -1) return false;
    return true;
}
export function clearTouched() {
    touchedFiles = [];
}

/**
 * Returns whether the given ID points at internal (hidden file) metadata.
 * @param id The ID to check.
 */
export function isInternalMetadata(id: FilePath | FilePathWithPrefix | DocumentID): boolean {
    return id.startsWith(ICHeader);
}
export function stripInternalMetadataPrefix<T extends FilePath | FilePathWithPrefix | DocumentID>(id: T): T {
    return id.substring(ICHeaderLength) as T;
}
export function id2InternalMetadataId(id: DocumentID): DocumentID {
    return ICHeader + id as DocumentID;
}

// const CHeaderLength = CHeader.length;
export function isChunk(str: string): boolean {
    return str.startsWith(CHeader);
}

export function isPluginMetadata(str: string): boolean {
    return str.startsWith(PSCHeader);
}

export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
    return new Promise((res) => {
        const popover = new PopoverSelectString(app, message, null, null, (result) => res(result as "yes" | "no"));
        popover.open();
    });
};

export const askSelectString = (app: App, message: string, items: string[]): Promise<string> => {
    const getItemsFun = () => items;
    return new Promise((res) => {
        const popover = new PopoverSelectString(app, message, "", getItemsFun, (result) => res(result));
        popover.open();
    });
};

export const askString = (app: App, title: string, key: string, placeholder: string): Promise<string | false> => {
    return new Promise((res) => {
        const dialog = new InputStringDialog(app, title, key, placeholder, (result) => res(result));
        dialog.open();
    });
};
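// Periodic job wrapper: the interval is registered with the plugin so that
// Obsidian clears it on unload; process() logs exceptions instead of throwing.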
export class PeriodicProcessor {
    _process: () => Promise<any>;
    _timer?: number;
    _plugin: Plugin_2;
    constructor(plugin: Plugin_2, process: () => Promise<any>) {
        this._plugin = plugin;
        this._process = process;
    }
    async process() {
        try {
            await this._process();
        } catch (ex) {
            Logger(ex);
        }
    }
    enable(interval: number) {
        this.disable();
        if (interval == 0) return;
        this._timer = window.setInterval(() => this._process().then(() => { }), interval);
        this._plugin.registerInterval(this._timer);
    }
    disable() {
        if (this._timer) clearInterval(this._timer);
    }
}

function sizeToHumanReadable(size: number | undefined) {
    if (!size) return "-";
    const i = Math.floor(Math.log(size) / Math.log(1024));
    // parseFloat keeps the decimals (parseInt would truncate "1.50" to 1).
    return Number.parseFloat((size / Math.pow(1024, i)).toFixed(2)) + ' ' + ['B', 'kB', 'MB', 'GB', 'TB'][i];
}

export const _requestToCouchDBFetch = async (baseUri: string, username: string, password: string, path?: string, body?: string | any, method?: string) => {
    const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${username}:${password}`));
    const encoded = window.btoa(utf8str);
    const authHeader = "Basic " + encoded;
    const transformedHeaders: Record<string, string> = { authorization: authHeader, "content-type": "application/json" };
    const uri = `${baseUri}/${path}`;
    const requestParam = {
        url: uri,
        method: method || (body ? "PUT" : "GET"),
        headers: new Headers(transformedHeaders),
        contentType: "application/json",
        body: JSON.stringify(body),
    };
    return await fetch(uri, requestParam);
}

export const _requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, path?: string, body?: any, method?: string) => {
    const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${username}:${password}`));
    const encoded = window.btoa(utf8str);
    const authHeader = "Basic " + encoded;
    const transformedHeaders: Record<string, string> = { authorization: authHeader, origin: origin };
    const uri = `${baseUri}/${path}`;
    const requestParam: RequestUrlParam = {
        url: uri,
        method: method || (body ? "PUT" : "GET"),
        headers: transformedHeaders,
        contentType: "application/json",
        body: body ? JSON.stringify(body) : undefined,
    };
    return await requestUrl(requestParam);
}
export const requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, key?: string, body?: string, method?: string) => {
    const uri = `_node/_local/_config${key ? "/" + key : ""}`;
    return await _requestToCouchDB(baseUri, username, password, origin, uri, body, method);
};

export async function performRebuildDB(plugin: ObsidianLiveSyncPlugin, method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice") {
    if (method == "localOnly") {
        await plugin.addOnSetup.fetchLocal();
    }
    if (method == "remoteOnly") {
        await plugin.addOnSetup.rebuildRemote();
    }
    if (method == "rebuildBothByThisDevice") {
        await plugin.addOnSetup.rebuildEverything();
    }
}

export const gatherChunkUsage = async (db: PouchDB.Database<EntryDoc>) => {
    const used = new Map();
    const unreferenced = new Map();
    const removed = new Map();
    const missing = new Map();
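    // Chunk documents use the "h:" id prefix. Every chunk found is assumed
    // unreferenced until some document's `children` list claims it; chunks
    // claimed but never found are recorded as missing, per document.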
    const xx = await db.allDocs({ startkey: "h:", endkey: `h:\u{10ffff}` });
    for (const xxd of xx.rows) {
        const chunk = xxd.id;
        unreferenced.set(chunk, xxd.value.rev);
    }

    const x = await db.find({ limit: 999999999, selector: { children: { $exists: true, $type: "array" } }, fields: ["_id", "path", "mtime", "children"] });
    for (const temp of x.docs) {
        for (const chunk of (temp as NewEntry).children) {
            used.set(chunk, (used.has(chunk) ? used.get(chunk) : 0) + 1);
            if (unreferenced.has(chunk)) {
                removed.set(chunk, unreferenced.get(chunk));
                unreferenced.delete(chunk);
            } else {
                if (!removed.has(chunk)) {
                    if (!missing.has(temp._id)) {
                        missing.set(temp._id, []);
                    }
                    missing.get(temp._id).push(chunk);
                }
            }
        }
    }

    return { used, unreferenced, missing };
}

export const localDatabaseCleanUp = async (plugin: ObsidianLiveSyncPlugin, force: boolean, dryRun: boolean) => {
    await runWithLock("clean-up:local", true, async () => {
        const db = plugin.localDatabase.localDatabase;
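        // Purging requires the new (indexeddb) adapter. Without it, the only
        // clean-up path is to fetch the database again from the remote.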
        if ((db as any)?.adapter != "indexeddb") {
            if (force && !dryRun) {
                Logger("Fetch from the remote database", LOG_LEVEL.NOTICE, "clean-up-db");
                await performRebuildDB(plugin, "localOnly");
                return;
            } else {
                Logger("This feature requires enabling `Use new adapter`. Please enable it", LOG_LEVEL.NOTICE, "clean-up-db");
                return;
            }
        }
        Logger(`The remote database has been locked for garbage collection`, LOG_LEVEL.NOTICE, "clean-up-db");
        Logger(`Gathering chunk usage information`, LOG_LEVEL.NOTICE, "clean-up-db");

        const { unreferenced, missing } = await gatherChunkUsage(db);
        if (missing.size != 0) {
            Logger(`Some chunks are not found! We have to rescue them`, LOG_LEVEL.NOTICE);
            Logger(missing, LOG_LEVEL.VERBOSE);
        } else {
            Logger(`All chunks are OK`, LOG_LEVEL.NOTICE);
        }
        const payload = {} as Record<string, string[]>;
        for (const [id, rev] of unreferenced) {
            payload[id] = [rev];
        }
        const removeItems = Object.keys(payload).length;
        if (removeItems == 0) {
            Logger(`No unreferenced chunks found (Local)`, LOG_LEVEL.NOTICE);
            await plugin.markRemoteResolved();
        }
        if (dryRun) {
            Logger(`There are ${removeItems} unreferenced chunks (Local)`, LOG_LEVEL.NOTICE);
            return;
        }
        Logger(`Deleting unreferenced chunks: ${removeItems}`, LOG_LEVEL.NOTICE, "clean-up-db");
        for (const [id, rev] of unreferenced) {
            //@ts-ignore
            const ret = await db.purge(id, rev);
            Logger(ret, LOG_LEVEL.VERBOSE);
        }
        plugin.localDatabase.refreshSettings();
        Logger(`Compacting local database...`, LOG_LEVEL.NOTICE, "clean-up-db");
        await db.compact();
        await plugin.markRemoteResolved();
        Logger("Done!", LOG_LEVEL.NOTICE, "clean-up-db");
    })
}

export const balanceChunks = async (plugin: ObsidianLiveSyncPlugin, dryRun: boolean) => {
    await runWithLock("clean-up:balance", true, async () => {
        const localDB = plugin.localDatabase.localDatabase;
        Logger(`Gathering chunk usage information`, LOG_LEVEL.NOTICE, "clean-up-db");

        const ret = await plugin.replicator.connectRemoteCouchDBWithSetting(plugin.settings, plugin.isMobile);
        if (typeof ret === "string") {
            Logger(`Connect error: ${ret}`, LOG_LEVEL.NOTICE, "clean-up-db");
            return;
        }
        const localChunks = new Map<string, string>();
        const xx = await localDB.allDocs({ startkey: "h:", endkey: `h:\u{10ffff}` });
        for (const xxd of xx.rows) {
            const chunk = xxd.id;
            localChunks.set(chunk, xxd.value.rev);
        }
        // const info = ret.info;
        const remoteDB = ret.db;
        const remoteChunks = new Map<string, string>();
        const xxr = await remoteDB.allDocs({ startkey: "h:", endkey: `h:\u{10ffff}` });
        for (const xxd of xxr.rows) {
            const chunk = xxd.id;
            remoteChunks.set(chunk, xxd.value.rev);
        }
        const localToRemote = new Map<string, string>([...localChunks]);
        const remoteToLocal = new Map<string, string>([...remoteChunks]);
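        // Compute the set differences: chunks present only locally must be
        // sent, and chunks present only remotely must be retrieved.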
        for (const id of new Set([...localChunks.keys(), ...remoteChunks.keys()])) {
            if (remoteChunks.has(id)) {
                localToRemote.delete(id);
            }
            if (localChunks.has(id)) {
                remoteToLocal.delete(id);
            }
        }

        function arrayToChunkedArray<T>(src: T[], size = 25) {
            const ret = [] as T[][];
            let i = 0;
            while (i < src.length) {
                ret.push(src.slice(i, i += size));
            }
            return ret;
        }

        if (localToRemote.size == 0) {
            Logger(`No chunks need to be sent`, LOG_LEVEL.NOTICE);
        } else {
            Logger(`${localToRemote.size} chunks need to be sent`, LOG_LEVEL.NOTICE);
            if (!dryRun) {
                const w = arrayToChunkedArray([...localToRemote]);
                for (const chunk of w) {
                    for (const [id,] of chunk) {
                        const queryRet = await localDB.allDocs({ keys: [id], include_docs: true });
                        const docs = queryRet.rows.filter(e => !("error" in e)).map(x => x.doc);

                        const ret = await remoteDB.bulkDocs(docs, { new_edits: false });
                        Logger(ret, LOG_LEVEL.VERBOSE);
                    }
                }
                Logger(`Done! ${localToRemote.size} chunks have been sent`, LOG_LEVEL.NOTICE);
            }
        }
        if (remoteToLocal.size == 0) {
            Logger(`No chunks need to be retrieved`, LOG_LEVEL.NOTICE);
        } else {
            Logger(`${remoteToLocal.size} chunks need to be retrieved`, LOG_LEVEL.NOTICE);
            if (!dryRun) {
                const w = arrayToChunkedArray([...remoteToLocal]);
                for (const chunk of w) {
                    for (const [id,] of chunk) {
                        const queryRet = await remoteDB.allDocs({ keys: [id], include_docs: true });
                        const docs = queryRet.rows.filter(e => !("error" in e)).map(x => x.doc);

                        const ret = await localDB.bulkDocs(docs, { new_edits: false });
                        Logger(ret, LOG_LEVEL.VERBOSE);
                    }
                }
                Logger(`Done! ${remoteToLocal.size} chunks have been retrieved`, LOG_LEVEL.NOTICE);
            }
        }
    })
}

export const remoteDatabaseCleanup = async (plugin: ObsidianLiveSyncPlugin, dryRun: boolean) => {
    const getSize = function (info: PouchDB.Core.DatabaseInfo, key: "active" | "external" | "file") {
        return Number.parseInt((info as any)?.sizes?.[key] ?? 0);
    }
    await runWithLock("clean-up:remote", true, async () => {
        const CHUNK_SIZE = 100;
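        // Purge requests go to CouchDB's _purge endpoint in batches of
        // CHUNK_SIZE ids to keep each request reasonably small.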
        function makeChunkedArrayFromArray<T>(items: T[]): T[][] {
            const chunked = [];
            for (let i = 0; i < items.length; i += CHUNK_SIZE) {
                chunked.push(items.slice(i, i + CHUNK_SIZE));
            }
            return chunked;
        }
        try {
            const ret = await plugin.replicator.connectRemoteCouchDBWithSetting(plugin.settings, plugin.isMobile);
            if (typeof ret === "string") {
                Logger(`Connect error: ${ret}`, LOG_LEVEL.NOTICE, "clean-up-db");
                return;
            }
            const info = ret.info;
            Logger(JSON.stringify(info), LOG_LEVEL.VERBOSE, "clean-up-db");
            Logger(`Database active-size: ${sizeToHumanReadable(getSize(info, "active"))}, external-size: ${sizeToHumanReadable(getSize(info, "external"))}, file-size: ${sizeToHumanReadable(getSize(info, "file"))}`, LOG_LEVEL.NOTICE);

            if (!dryRun) {
                Logger(`The remote database has been locked for garbage collection`, LOG_LEVEL.NOTICE, "clean-up-db");
                await plugin.markRemoteLocked(true);
            }
            Logger(`Gathering chunk usage information`, LOG_LEVEL.NOTICE, "clean-up-db");
            const db = ret.db;

            const { unreferenced, missing } = await gatherChunkUsage(db);
            if (missing.size != 0) {
                Logger(`Some chunks are not found! We have to rescue them`, LOG_LEVEL.NOTICE);
                Logger(missing, LOG_LEVEL.VERBOSE);
            } else {
                Logger(`All chunks are OK`, LOG_LEVEL.NOTICE);
            }
            const payload = {} as Record<string, string[]>;
            for (const [id, rev] of unreferenced) {
                payload[id] = [rev];
            }
            const removeItems = Object.keys(payload).length;
            if (removeItems == 0) {
                Logger(`No unreferenced chunks found (Remote)`, LOG_LEVEL.NOTICE);
                return;
            }
            if (dryRun) {
                Logger(`There are ${removeItems} unreferenced chunks (Remote)`, LOG_LEVEL.NOTICE);
                return;
            }
            Logger(`Deleting unreferenced chunks: ${removeItems}`, LOG_LEVEL.NOTICE, "clean-up-db");
            const buffer = makeChunkedArrayFromArray(Object.entries(payload));
            for (const chunkedPayload of buffer) {
                const rets = await _requestToCouchDBFetch(
                    `${plugin.settings.couchDB_URI}/${plugin.settings.couchDB_DBNAME}`,
                    plugin.settings.couchDB_USER,
                    plugin.settings.couchDB_PASSWORD,
                    "_purge",
                    chunkedPayload.reduce((p, c) => ({ ...p, [c[0]]: c[1] }), {}), "POST");
                // const result = await rets();
                Logger(JSON.stringify(await rets.json()), LOG_LEVEL.VERBOSE);
            }
            Logger(`Compacting database...`, LOG_LEVEL.NOTICE, "clean-up-db");
            await db.compact();
            const endInfo = await db.info();

            Logger(`Processed database active-size: ${sizeToHumanReadable(getSize(endInfo, "active"))}, external-size: ${sizeToHumanReadable(getSize(endInfo, "external"))}, file-size: ${sizeToHumanReadable(getSize(endInfo, "file"))}`, LOG_LEVEL.NOTICE);
            Logger(`Reduced sizes: active-size: ${sizeToHumanReadable(getSize(info, "active") - getSize(endInfo, "active"))}, external-size: ${sizeToHumanReadable(getSize(info, "external") - getSize(endInfo, "external"))}, file-size: ${sizeToHumanReadable(getSize(info, "file") - getSize(endInfo, "file"))}`, LOG_LEVEL.NOTICE);
            Logger(JSON.stringify(endInfo), LOG_LEVEL.VERBOSE, "clean-up-db");
            Logger(`Cleaning up the local database...`);
            await localDatabaseCleanUp(plugin, true, false);
        } catch (ex) {
            Logger("Failed to clean up db.");
            Logger(ex, LOG_LEVEL.VERBOSE);
        }
    });
}
105
styles.css
@@ -77,15 +77,6 @@
 border-top: 1px solid var(--background-modifier-border);
 }
 
-/* .sls-table-head{
-    width:50%;
-}
-.sls-table-tail{
-    width:50%;
-
-} */
-
-
 .sls-header-button {
     margin-left: 2em;
 }
@@ -95,13 +86,26 @@
 }
 
 :root {
     --slsmessage: "";
+    --sls-log-text: "";
 }
 
+.sls-troubleshoot-preview {
+    max-width: max-content;
+}
+
+.sls-troubleshoot-preview img {
+    max-width: 100%;
+}
+
-.CodeMirror-wrap::before,
-.cm-s-obsidian>.cm-editor::before,
-.canvas-wrapper::before {
-    content: attr(data-log);
+.markdown-preview-view.cm-s-obsidian::before,
+.markdown-source-view.cm-s-obsidian::before,
+.canvas-wrapper::before,
+.empty-state::before {
+    content: var(--sls-log-text, "");
+    font-variant-numeric: tabular-nums;
+    font-variant-emoji: emoji;
+    tab-size: 4;
     text-align: right;
     white-space: pre-wrap;
     position: absolute;
@@ -116,6 +120,19 @@
     filter: grayscale(100%);
 }
 
+.empty-state::before,
+.markdown-preview-view.cm-s-obsidian::before,
+.markdown-source-view.cm-s-obsidian::before {
+    top: var(--header-height);
+    right: 1em;
+}
+
+.is-mobile .empty-state::before,
+.is-mobile .markdown-preview-view.cm-s-obsidian::before,
+.is-mobile .markdown-source-view.cm-s-obsidian::before {
+    top: var(--view-header-height);
+    right: 1em;
+}
 .canvas-wrapper::before {
     right: 48px;
 }
@@ -124,7 +141,7 @@
     right: 0px;
 }
 
-.cm-s-obsidian>.cm-editor::before {
+.cm-s-obsidian > .cm-editor::before {
     right: 16px;
 }
 
@@ -153,8 +170,8 @@ div.sls-setting-menu-btn {
     /* width: 100%; */
 }
 
-.sls-setting-tab:hover~div.sls-setting-menu-btn,
-.sls-setting-label.selected .sls-setting-tab:checked~div.sls-setting-menu-btn {
+.sls-setting-tab:hover ~ div.sls-setting-menu-btn,
+.sls-setting-label.selected .sls-setting-tab:checked ~ div.sls-setting-menu-btn {
     background-color: var(--interactive-accent);
     color: var(--text-on-accent);
 }
@@ -251,4 +268,58 @@
 
 .sls-item-dirty::before {
     content: "✏";
 }
 }
+
+.sls-setting-hidden {
+    display: none;
+}
+
+.password-input > .setting-item-control > input {
+    -webkit-text-security: disc;
+}
+
+span.ls-mark-cr::after {
+    user-select: none;
+    content: "↲";
+    color: var(--text-muted);
+    font-size: 0.8em;
+}
+
+.deleted span.ls-mark-cr::after {
+    color: var(--text-on-accent);
+}
+
+.ls-imgdiff-wrap {
+    display: flex;
+    justify-content: center;
+    align-items: center;
+}
+
+.ls-imgdiff-wrap .overlay {
+    position: relative;
+}
+
+.ls-imgdiff-wrap .overlay .img-base {
+    position: relative;
+    top: 0;
+    left: 0;
+}
+.ls-imgdiff-wrap .overlay .img-overlay {
+    -webkit-filter: invert(100%) opacity(50%);
+    filter: invert(100%) opacity(50%);
+    position: absolute;
+    top: 0;
+    left: 0;
+    animation: ls-blink-diff 0.5s cubic-bezier(0.4, 0, 1, 1) infinite alternate;
+}
+@keyframes ls-blink-diff {
+    0% {
+        opacity: 0;
+    }
+    50% {
+        opacity: 0;
+    }
+    100% {
+        opacity: 1;
+    }
+}
@@ -1,4 +1,6 @@
{
    "extends": "@tsconfig/svelte/tsconfig.json",
    "inlineSourceMap": true,
    "compilerOptions": {
        "baseUrl": ".",
        "module": "ESNext",
@@ -6,16 +8,23 @@
        "allowJs": true,
        "noImplicitAny": true,
        "moduleResolution": "node",
        "types": [
            "svelte",
            "node"
        ],
        // "importsNotUsedAsValues": "error",
        "importHelpers": false,
        "alwaysStrict": true,
        "allowImportingTsExtensions": true,
        "noEmit": true,
        "lib": [
            "es2018",
            "DOM",
            "ES5",
            "ES6",
            "ES7",
            "es2019.array"
            "es2019.array",
            "ES2020.BigInt",
        ]
    },
    "include": [

88
updates.md
@@ -1,39 +1,67 @@
### 0.19.0
### 0.23.0
Incredible new features!

#### Customization sync
Now we can use object storage (MinIO, S3, R2, or anything you like) for synchronising! Moreover, we can still use all the features as if we were using CouchDB.
Note: As this is a pretty experimental feature, we have some limitations.
- It is built on an append-only architecture. It will not shrink the used storage unless we perform a rebuild.
- It is a bit fragile. However, our version x.yy.0 always is.
- On the first synchronisation, the entire history to date is transferred, so it is preferable to do this on a Wi-Fi network.
- Do not worry; from the second synchronisation onwards, only differences are transferred.

Since `Plugin and their settings` had been broken, I tried to fix it; not just fix it, but fix it the way it should be.
I hope this feature empowers users to maintain independence and self-host their data, offering an alternative for those who prefer to manage their own storage solutions and avoid being stuck on the wrong side of a sudden change in business model.

Now, we have `Customization sync`.
Of course, I use self-hosted MinIO for testing and recommend it, for the same reason as using CouchDB: open, controllable, auditable, and indeed already audited by numerous eyes.

It is a real shame that the compatibility between these features has been broken. However, this new feature is surely useful, and I believe it is worth getting over the pain.
We can use the new feature with the same configuration. Only the menu in the command palette has been changed. The dialogue can be opened by `Show customization sync dialog`.
Let me write one more acknowledgement.

I hope you will give it a try.
I have a lot of respect for remotely-save, even though it is sometimes treated as if it were a competitor. I think it is a great architecture that embodies an approach different from mine of recreating history. This time, with all due respect, I have used some of its code as a reference.
Hooray for open source, generous licences, and the sharing of knowledge by experts.

#### Minors

- 0.19.1
    - Fixed: Fixed hidden file handling on Linux.
    - Improved: Customization sync now works more smoothly.
- 0.19.2
#### Version history
- 0.23.6:
    - Fixed:
        - Fixed a garbage-collection error that occurred when many unreferenced chunks exist.
        - Fixed filename validation on Linux.
    - Improved:
        - Status display updates are now thinned out for performance.
        - Enhanced caching while collecting chunks.
- 0.19.3
    - Improved:
        - Replication is now paced by chunk collection. If synchronisation has been deadlocked, please enable `Do not pace synchronization` once.
- 0.19.4
    - Improved:
        - Reduced remote database checking to improve speed and reduce bandwidth.
        - Remote chunks can now be decrypted even while using `Incubate chunks in Document` (the note of 0.23.6 has been fixed).
        - Chunk retrieval with `Incubate chunks in document` is now more efficient.
        - The task processor no longer misses completed tasks.
        - Replication is no longer started automatically on changes in window visibility (e.g., task switching on the desktop) while off-focused.
- 0.23.5:
    - New feature:
        - We can now check for configuration mismatches between clients before synchronisation.
            - Default: enabled / Preferred: enabled / We can disable this with the `Do not check configuration mismatch before replication` toggle in the `Hatch` pane.
            - It detects configuration mismatches and prevents synchronisation failures and wasted storage.
        - We can now perform remote database compaction from the `Maintenance` pane.
    - Fixed:
        - Chunks which were previously misinterpreted are now interpreted correctly.
        - No more chunks that can never be found, unless they are actually missing.
        - Deleted-file detection in hidden file synchronisation now works fine.
        - Customisation sync is now surely quiet while it is disabled.
        - We can now detect when the bucket is not reachable.
    - Note:
        - Known inexplicable behaviour: recently (maybe while enabling `Incubate chunks in Document` and `Fetch chunks on demand`, or some more toggles), our customisation sync data is sometimes corrupted. This will be addressed in the next release.
- 0.23.4
    - Fixed:
        - Experimental configuration is no longer shown in the Minimal Setup.
    - New feature:
        - We can now use `Incubate Chunks in Document` to reduce non-well-formed chunks.
            - Default: disabled / Preferred: enabled on all devices.
            - When we enable this toggle, newly created chunks are temporarily kept within the document, and graduate to become independent chunks once stabilised.
            - The [design document](https://github.com/vrtmrz/obsidian-livesync/blob/3925052f9290b3579e45a4b716b3679c833d8ca0/docs/design_docs_of_keep_newborn_chunks.md) is also available.
- 0.23.3
    - Fixed: No more unwanted `\f` in journal sync.
- 0.23.2
    - Sorry for all the fixes to experimental features (these things were also critical for dogfooding). The next release will bring the main fixes! Thank you for your patience and understanding!
    - Fixed:
        - Journal Sync no longer hangs up during big replications, especially the initial one.
        - All changes which have been replicated while rebuilding are no longer postponed (restoring the previous behaviour).
    - Improved:
        - Journal Sync now works efficiently in downloading and parsing, and in packing and uploading.
        - Less server storage and faster packing/unpacking thanks to the new chunk format.
- 0.23.1
    - Fixed:
        - Journal synchronisation now tracks untransferred entries separately for sent and received.
        - Journal sync now handles retrying.
        - Journal synchronisation no longer treats the synchronisation of chunks as revision updates (they are simply ignored).
        - Journal sync now splits the journal pack to prevent mobile devices from rebooting.
        - Maintenance menus which had been in the command palette are now back in the maintenance pane of the settings dialogue.
    - Improved:
        - All changes which have been replicated while rebuilding are now postponed.

... To continue on to `updates_old.md`.
- 0.23.0
    - New feature:
        - Now we can use Object Storage.
522
updates_old.md
@@ -1,3 +1,525 @@
|
||||
### 0.22.0
|
||||
A few years passed since Self-hosted LiveSync was born, and our codebase had been very complicated. This could be patient now, but it should be a tremendous hurt.
|
||||
Therefore at v0.22.0, for future maintainability, I refined task scheduling logic totally.
|
||||
|
||||
Of course, I think this would be our suffering in some cases. However, I would love to ask you for your cooperation and contribution.
|
||||
|
||||
Sorry for being absent so much long. And thank you for your patience!
|
||||
|
||||
Note: we got a very performance improvement.
|
||||
Note at 0.22.2: **Now, to rescue mobile devices, Maximum file size is set to 50 by default**. Please configure the limit as you need. If you do not want to limit the sizes, set zero manually, please.
|
||||
|
||||
#### Version history
|
||||
- 0.22.19
|
||||
- Fixed:
|
||||
- No longer data corrupting due to false BASE64 detections.
|
||||
- Improved:
|
||||
- A bit more efficient in Automatic data compression.
|
||||
- 0.22.18
|
||||
- New feature (Very Experimental):
|
||||
- Now we can use `Automatic data compression` to reduce amount of traffic and the usage of remote database.
|
||||
- Please make sure all devices are updated to v0.22.18 before trying this feature.
|
||||
- If you are using some other utilities which connected to your vault, please make sure that they have compatibilities.
|
||||
- Note: Setting `File Compression` on the remote database works for shrink the size of remote database. Please refer the [Doc](https://docs.couchdb.org/en/stable/config/couchdb.html#couchdb/file_compression).
|
||||
- 0.22.17:
|
||||
- Fixed:
|
||||
- Error handling on booting now works fine.
|
||||
- Replication is now started automatically in LiveSync mode.
|
||||
- Batch database update is now disabled in LiveSync mode.
|
||||
- No longer automatically reconnection while off-focused.
|
||||
- Status saves are thinned out.
|
||||
- Now Self-hosted LiveSync waits for all files between the local database and storage to be surely checked.
|
||||
- Improved:
|
||||
- The job scheduler is now more robust and stable.
|
||||
- The status indicator no longer flickers and keeps zero for a while.
|
||||
- No longer meaningless frequent updates of status indicators.
|
||||
- Now we can configure regular expression filters in handy UI. Thank you so much, @eth-p!
|
||||
- `Fetch` or `Rebuild everything` is now more safely performed.
|
||||
- Minor things
|
||||
- Some utility function has been added.
|
||||
- Customisation sync now less wrong messages.
|
||||
- Digging the weeds for eradication of type errors.
|
||||
- 0.22.16:
|
||||
- Fixed:
|
||||
- Fixed the issue that binary files were sometimes corrupted.
|
||||
- Fixed customisation sync data could be corrupted.
|
||||
- Improved:
|
||||
- Now the remote database costs lower memory.
|
||||
- This release requires a brief wait on the first synchronisation, to track the latest changeset again.
|
||||
- Description added for the `Device name`.
|
||||
- Refactored:
|
||||
- Many type-errors have been resolved.
|
||||
- Obsolete file has been deleted.
|
||||
- 0.22.15:
|
||||
- Improved:
|
||||
- Faster start-up by removing too many logs which indicates normality
|
||||
- By streamlined scanning of customised synchronisation extra phases have been deleted.
|
||||
... To continue on to `updates_old.md`.
|
||||
- 0.22.14:
  - New feature:
    - We can disable the status bar in the setting dialogue.
  - Improved:
    - Some files are now handled as the correct data type.
    - Customisation sync now uses the digest of each file for better performance.
    - The status in the editor now performs better.
  - Refactored:
    - Common functions have been prepared and the codebase has been organised.
    - Stricter type checking following TypeScript updates.
    - Removed an old iOS workaround for simplicity and performance.
- 0.22.13:
  - Improved:
    - Using HTTP for the remote database URI now raises an error (on mobile) or a notice (on desktop).
  - Refactored:
    - Dependencies have been polished.
- 0.22.12:
  - Changed:
    - The default settings have been changed.
  - Improved:
    - Default and preferred settings are applied on completion of the wizard.
  - Fixed:
    - The initialisation `Fetch` is now performed smoothly and there are fewer conflicts.
    - No longer gets stuck while handling transferred or initialised documents.
- 0.22.11:
  - Fixed:
    - `Verify and repair all files` is no longer broken.
  - New feature:
    - Now `Verify and repair all files` is able to...
      - Restore files that exist only in the local database.
      - Show the history.
  - Improved:
    - Performance improved.
- 0.22.10
  - Fixed:
    - Unchanged hidden files and customisations are no longer saved and transferred.
    - File integrity of vault history now indicates the integrity correctly.
  - Improved:
    - In the report, the scheme of the remote database URI is now printed.
- 0.22.9
  - Fixed:
    - Fixed a bug where `fetch chunks on demand` could not fetch the chunks on demand.
  - Improved:
    - `fetch chunks on demand` works more smoothly.
    - The initialisation `Fetch` is now more efficient.
  - Tidied:
    - Removed some meaningless code.
- 0.22.8
  - Fixed:
    - Fetching and unlocking a locked remote database now works again.
    - No longer crashes on symbolic links inside hidden folders.
  - Improved:
    - Chunks are now created more efficiently.
      - Old notes are split into larger chunks.
    - Better performance in saving notes.
    - Network activity is now indicated by an icon.
    - Less memory is used for binary processing.
  - Tidied:
    - Cleaned up unused functions.
    - Sorted out code that had become nonsensical.
  - Changed:
    - `Fetch chunks on demand` no longer needs `Pacing replication`.
      - The setting `Do not pace synchronization` has been deleted.
- 0.22.7
  - Fixed:
    - Deleted hidden files are no longer ignored.
    - The document history dialogue is now able to process the deleted revisions.
    - Deletion of a hidden file is now reliably performed even if the file is already conflicted.
- 0.22.6
  - Fixed:
    - Fixed a problem with synchronisation taking a long time to start in some cases.
      - The first synchronisation after the update might take a bit longer.
    - Now we can disable E2EE.
  - Improved:
    - `Setup Wizard` is now clearer.
    - `Minimal Setup` is now simpler.
    - Self-hosted LiveSync can now be used even if there are vaults with the same name.
      - A database suffix will be added automatically.
    - Now Self-hosted LiveSync waits until set-up is complete.
    - Reload prompts are shown during configuration whenever a reload is recommended.
  - New feature:
    - A guidance dialogue prompting for settings will be shown after installation.
  - Changed
    - `Open setup URI` is now `Use the copied setup URI`
    - `Copy setup URI` is now `Copy current settings as a new setup URI`
    - `Setup Wizard` is now `Minimal Setup`
    - `Check database configuration` is now `Check and Fix database configuration`
- 0.22.5
  - Fixed:
    - Some setting descriptions have been refined.
  - New feature:
    - Troubleshooting is now shown in the setting dialogue.
- 0.22.4
  - Fixed:
    - The result of conflict resolution is now reliably written into the storage.
    - Deleted files can be handled correctly again in the history dialogue and conflict dialogue.
    - Some wrong log messages were fixed.
    - Change handling has become more stable.
    - Some event handling is now safer.
  - Improved:
    - Dumping document information now shows conflicts and revisions.
    - Timestamp-only differences are now reliably cached.
    - Timestamp difference detection can be rounded to two seconds.
  - Refactored:
    - A bit of organisation to make writing tests easier.
- 0.22.3
  - Fixed:
    - No longer detects storage changes which have been caused by Self-hosted LiveSync itself.
    - The setting sync file is now detected only if it has been configured.
      - Its log is also shown only while the verbose log is enabled.
    - Customisation file enumeration has become less noisy.
    - Deletion of files is now reliably synchronised.
  - Fixed and improved:
    - The in-editor status is now shown in the following areas:
      - Note editing pane (source mode and live-preview mode).
      - New tab pane.
      - Canvas pane.
- 0.22.2
  - Fixed:
    - The results of resolving conflicts are now reliably synchronised.
  - Modified:
    - Some setting items got new, clearer names. (`Sync Settings` -> `Targets`).
  - New feature:
    - We can limit the files to be synchronised by their size. (`Sync Settings` -> `Targets` -> `Maximum file size`).
      - The decision depends on the size of the newer one.
      - On Obsidian 1.5.3 on mobile, we should set this to around 50MB to avoid restarting Obsidian.
    - The settings can now be stored in a specific markdown file to synchronise or switch them (`General Setting` -> `Share settings via markdown`).
      - [Screwdriver](https://github.com/vrtmrz/obsidian-screwdriver) is quite good, but mostly we only need this.
    - Customisations of obsolete devices can now be deleted at once.
      - We have to enter maintenance mode in the Customisation sync dialogue to do so.
- 0.22.1
  - New feature:
    - We can perform automatic conflict resolution for inactive files, and postpone only manual ones, by `Postpone manual resolution of inactive files`.
    - Now we can see images in the document history dialogue.
      - We can see the differences between images in the document history dialogue.
        - And we can also highlight the differences.
  - Improved:
    - Hidden file sync has been stabilised.
    - The conflict-resolution dialogue now reloads automatically when new conflicted revisions arrive.
  - Fixed:
    - Periodic processes no longer run after the plug-in is unloaded.
    - Modifications of binary files are now reliably stored in the storage.
- 0.22.0
  - Refined:
    - Task scheduling logic has been rewritten.
    - Screen updates are also now efficient.
    - Many bugs and fragile behaviours have presumably been fixed.
    - Status updates and log display have been thinned out.
  - Fixed:
    - Remote chunk fetching now keeps request intervals.
  - New feature:
    - We can show only the icons in the editor.
    - Progress indicators have become more meaningful:
      - 📥 Unprocessed transferred items
      - 📄 Working database operations
      - 💾 Working write-storage processes
      - ⏳ Working read-storage processes
      - 🛫 Pending read-storage processes
      - ⚙️ Working or pending storage processes of hidden files
      - 🧩 Waiting chunks
      - 🔌 Working customisation items (configuration, snippets and plug-ins)
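As mentioned in the 0.22.18 note above, the remote database's file compression is a server-side CouchDB setting. Here is a minimal sketch of changing it through the `_config` REST API (the same API `utils/couchdb/couchdb-init.sh` below uses), assuming admin credentials in `$hostname`, `$username` and `$password`, and the example database name `obsidiannotes` used by these utilities:

```sh
# Switch CouchDB's file compression; valid values include "none", "snappy",
# and "deflate_1" .. "deflate_9" (see the CouchDB documentation linked above).
curl -X PUT "${hostname}/_node/nonode@nohost/_config/couchdb/file_compression" \
    -H "Content-Type: application/json" -d '"deflate_6"' \
    --user "${username}:${password}"

# The new setting takes effect as files are compacted; compaction can be
# requested per database:
curl -X POST "${hostname}/obsidiannotes/_compact" \
    -H "Content-Type: application/json" --user "${username}:${password}"
```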
... To continue on to `updates_old.md`.
### 0.21.0

The E2EE encryption V2 format has been reverted; it was probably the cause of the glitch.

Instead, to maintain efficiency, files are treated as Blobs until just before saving. Along with this, the old-fashioned encryption format has also been discontinued.

Both forward and backward compatibility with recent versions are maintained. However, unfortunately, we lost compatibility with filesystem-livesync and some other tools.

This will be addressed soon. Please be patient if you are using filesystem-livesync with E2EE.
- 0.21.5
  - Improved:
    - All revisions are now shown only by their first few letters.
    - Document IDs are now shown in the log by their first 8 letters.
  - Fixed:
    - A check before modifying files has been implemented.
    - Content change detection has been improved.
- 0.21.4
  - This release has been skipped.
- 0.21.3
  - Implemented:
    - Now we can use SHA-1 as a fallback hash function.
- 0.21.2
  - IMPORTANT NOTICE: **0.21.1 CONTAINS A BUG WHILE REBUILDING THE DATABASE. IF YOU HAVE REBUILT, PLEASE MAKE SURE THAT ALL FILES ARE SANE.**
    - This has been fixed in this version.
  - Fixed:
    - Files are no longer broken while rebuilding.
    - Large binary files can now be written correctly on mobile platforms.
    - Any decoding error now results in a zero-byte file.
  - Modified:
    - All files are now processed sequentially.
- 0.21.1
  - Fixed:
    - No more infinite loops on larger files.
    - A message is shown on decode errors.
  - Refactored:
    - Fixed to avoid obsolete global variables.
- 0.21.0
  - Changes and performance improvements:
    - Files are now processed as Blobs when saving.
    - The V2 format has been reverted.
    - The new encoding format has been enabled by default.
    - WARNING: Since this version, compatibility with older filesystem-livesync has been lost.
## 0.20.0

At 0.20.0, Self-hosted LiveSync has changed the binary file format and the encryption format for efficient synchronisation.

A dialogue will be shown asking us to decide whether to keep v1 or use v2. Once we have enabled v2, all subsequent edits will be saved in v2. Therefore, devices running 0.19 or below cannot understand it and may report decryption errors. Please update all devices.

In return, we get impressive performance.

Of course, these are very impactful changes. If you have any questions or run into trouble, please feel free to open an issue and mention me.

Note: if you want to roll back to v1, please enable `Use binary and encryption version 1` on the `Hatch` pane and perform `rebuild everything` once.

Extra but notable information:

This format change gives us the ability to detect some `marks` in binary files, just as in text files. Therefore, we can split binary files, and some specific sorts of them (i.e., PDF files), at specific characters. This means that edits in the middle of a file can be detected via those marks.

Now only a few chunks are transferred, even if we add a comment to a PDF or put new files into a ZIP archive.
- 0.20.7
  - Fixed
    - For better replication, path obfuscation is now deterministic even with E2EE.
      Note: Compatible with previous databases without any conversion. Only new files will be obfuscated deterministically.
- 0.20.6
  - Fixed
    - Empty files can now be decoded.
    - Local files are no longer pre-saved before fetching from a remote database.
    - No longer deadlocks while applying customisation sync.
    - Configurations consisting of multiple files are now applied correctly.
    - Deleted-folder propagation now works without enabling the use of a trash bin.
- 0.20.5
  - Fixed
    - Files whose paths contain digit or character prefixes are no longer ignored.
- 0.20.4
  - Fixed
    - The text-input dialogue is no longer broken.
      - Finally, we can use the Setup URI again on mobile.
- 0.20.3
  - New feature:
    - We can launch Customization sync from the Ribbon if it is enabled.
  - Fixed:
    - The Setup URI is back to the previous spec; it is encrypted with V1 again.
      - This may avoid the trouble with iOS 17.
    - The Settings dialogue is now registered at the beginning of the start-up process.
      - We can change the configuration even if LiveSync could not be launched normally.
  - Improved:
    - Enumerating documents is now faster.
- 0.20.2
  - New feature:
    - We can delete all customization sync data from `Delete all customization sync data` on the `Hatch` pane.
  - Fixed:
    - Prevented repeated restarts on iOS by yielding microtasks.
- 0.20.1
  - Fixed:
    - No more UI freezes and repeated restarts on iOS.
    - Diffs of non-markdown documents are now shown correctly.
  - Improved:
    - Performance has been improved a bit.
    - Customization sync has become faster.
      - However, we lost forward compatibility again (only for this feature). Please update all devices.
  - Misc
    - The Terser configuration has become more aggressive.
- 0.20.0
  - Improved:
    - New binary file handling has been implemented
    - A new encrypted format has been implemented
    - Chunk sizes are now adjusted for efficient sync
  - Fixed:
    - Exception levels in some logs have been fixed
  - Tidied:
    - Some lint warnings have been suppressed.
### 0.19.0

#### Customization sync

Since `Plugins and their settings` had been broken, I tried to fix it; not just fix it, but fix it the way it should be.

Now, we have `Customization sync`.

It is a real shame that compatibility between these features has been broken. However, this new feature is surely useful, and I believe it is worth getting over the pain.
We can use the new feature with the same configuration. Only the menu in the command palette has been changed. The dialogue can be opened by `Show customization sync dialog`.

I hope you will give it a try.

#### Minors
- 0.19.1
  - Fixed: Fixed hidden file handling on Linux
  - Improved: Customization sync now works more smoothly.
- 0.19.2
  - Fixed:
    - Fixed a garbage-collection error that occurred when many unreferenced chunks exist.
    - Fixed filename validation on Linux.
  - Improved:
    - Status display is now thinned out for performance.
    - Enhanced caching while collecting chunks.
- 0.19.3
  - Improved:
    - Replication is now paced while collecting chunks. If synchronisation deadlocks, please enable `Do not pace synchronization` once.
- 0.19.4
  - Improved:
    - Reduced remote database checking to improve speed and reduce bandwidth.
  - Fixed:
    - Chunks which were previously misinterpreted are now interpreted correctly.
    - No more chunks that can never be found, unless they are actually missing.
    - Deleted-file detection in hidden file synchronisation now works fine.
    - Customisation sync is now reliably quiet while it is disabled.
- 0.19.5
  - Fixed:
    - Hidden file synchronisation no longer hangs, even when a great many files exist.
  - Improved:
    - Customisation sync works more smoothly.
  - Note: Concurrent processing has been rolled back to the original implementation. As a result, the total number of processes is no longer shown next to the hourglass icon; only the processes that are running concurrently are shown.
- 0.19.6
  - Fixed:
    - Logging has been tweaked.
      - No more too many planes and rockets.
    - The batch database update now reliably works only in non-live mode.
  - Internal things:
    - Some frameworks have been upgraded.
    - Import declarations have been fixed.
  - Improved:
    - The plug-in now asks to enable the new adapter when rebuilding, if it is not enabled yet.
    - The setting dialogue has been refined.
      - Configurations for compatibility have been moved under the hatch.
      - It is made clear that disabled is the default.
      - Ambiguously named configurations have been renamed.
      - Items that have no meaning under the current settings are no longer displayed.
      - Some items have been reordered for clarity.
      - Each configuration has been grouped.
- 0.19.7
  - Fixed:
    - The initial pane of the Settings dialogue has been changed to General Settings.
    - The Setup Wizard can now flush existing settings and enter the wizard mode again.
- 0.19.8
  - New feature:
    - Vault history: a tab has been implemented to give a bird's-eye view of the changes that have occurred in the vault.
  - Improved:
    - Passphrases in the dialogue are now masked out. Thank you @antoKeinanen!
    - The log dialogue is now shown as one of the tabs.
  - Fixed:
    - Some minor issues have been fixed.
- 0.19.9
  - New feature (for fixing a problem):
    - We can now fix databases in which obfuscated and plain paths have been mixed up.
  - Improvements
    - Customisation sync performance has been improved.
- 0.19.10
  - Fixed
    - Fixed an issue with fixing the database.
- 0.19.11
  - Improvements:
    - ChunkID hashing has been improved.
    - The log now keeps 400 lines.
  - Refactored:
    - Import statements for types have been fixed.
- 0.19.12
  - Improved:
    - Boot-up performance has been improved.
    - Customisation sync performance has been improved.
    - Synchronisation performance has been improved.
- 0.19.13
  - Implemented:
    - Database clean-up is now in beta 2!
      We can shrink the remote database by deleting unused chunks, while keeping history.
      Note: The local database is not cleaned up completely; we have to `Fetch` again to finish the job.
      **Note 2**: Still in beta. Please back up your vault before anything else.
  - Fixed:
    - Log updates are no longer thinned out.
- 0.19.14
  - Fixed:
    - Internal documents are now ignored.
    - The merge dialogue now responds immediately to button presses.
    - Periodic processing now works fine.
    - The interval for checking for conflicts has been shortened.
    - Replication is now cancelled while cleaning up.
    - The database lock taken by the clean-up is now carefully unlocked.
    - The missing-chunks message is now reported correctly.
  - New feature:
    - Suspending database reflection has been implemented.
      - This can be disabled by `Fetch database with previous behaviour`.
      - `Fetch` now temporarily suspends reflecting database and storage changes to improve performance.
    - We can choose the action to take when the remote database has been cleaned up.
    - The merge dialogue now shows `↲` before each new line.
  - Improved:
    - Progress is now reported during the clean-up and fetch processes.
    - Cancelled replication is now detected.
- 0.19.15
  - Fixed:
    - Storing files after clean-up now works correctly.
  - Improved:
    - Cleaning up the local database has become incredibly fast.
      We can now clean up instead of fetching again when synchronising with a remote that has been cleaned up.
- 0.19.16
  - Many upgrades in this release. I have tried to prevent breakage, but if something gets corrupted, please feel free to notify me.
  - New feature:
    - (Beta) ignore-file handling
      We can use `.gitignore`, `.dockerignore`, and anything we like to filter the files to be synchronised (a small sample follows the minor-version list below).
  - Fixed:
    - Buttons on the lock-detected dialogue can now be shown on narrow-width devices.
  - Improved:
    - Some constants have been flattened for evaluation.
    - The usage of deprecated Obsidian APIs has been reduced.
    - The IndexedDB adapter is now enabled while importing configurations.
  - Misc:
    - The compiler, framework, and dependencies have been upgraded.
    - To withstand the impact of these upgrades (especially in esbuild and Svelte), Terser has been introduced.
      Feel free to share your opinion with me! I do not like obfuscating the code either.
- 0.19.17
  - Fixed:
    - Nested ignore files can now be parsed correctly.
    - The unexpected deletion of hidden files in some cases has been corrected.
    - Hidden file changes are no longer reflected back onto the device which made the change itself.
  - Behaviour changed:
    - From this version, files which have `:` in their names are ignored, even on Linux devices.
- 0.19.18
  - Fixed:
    - Empty (or deleted) files can now be conflict-resolved.
- 0.19.19
  - Fixed:
    - Resolving conflicted revisions has become more robust.
    - LiveSync now tries to keep local changes when fetching from a rebuilt remote database.
      Local changes are kept as a revision, and fetched content becomes new revisions.
    - All files are now restored immediately after performing `fetch`.
- 0.19.20
  - New feature:
    - `Sync on Editor save` has been implemented.
      - We can start synchronisation when we explicitly save in Obsidian.
    - `Hidden file sync` and `Customization sync` can now be used cooperatively.
      - We can exclude files from `Hidden file sync` that are already handled by Customization sync.
      - We can ignore specific plugins in Customization sync.
    - The message about leftover conflicted files now accepts our click.
      - We can open `Resolve all conflicted files` in an instant.
  - Refactored:
    - Parallelism functions have been made more explicit.
    - Type errors have been reduced.
  - Fixed:
    - Documents are no longer overwritten when they are conflicted.
      They are saved as a new conflicted revision instead.
    - Some error messages have been fixed.
    - Missing dialogue titles are now shown.
    - We can now click close buttons on mobile.
    - Conflicted customisation sync files are now resolved automatically by their modified time.
- 0.19.21
  - Fixed:
    - Hidden files are no longer handled in the initial replication.
    - The report from `Making report` has been fixed:
      - It no longer contains customisation sync information.
      - The LiveSync version has been added.
- 0.19.22
  - Fixed:
    - Synchronisation now begins without our interaction.
    - The configuration of the remote database is no longer written to the log while checking the configuration.
    - Some outdated description notes have been removed.
    - Options rendered meaningless by other configured settings are now hidden:
      - Scan for hidden files before replication
      - Scan customization periodically
- 0.19.23
  - Improved:
    - We can now open the log pane from the command palette as well.
    - The hidden-file scanning interval can now be set to 0.
    - `Check database configuration` now points out when we do not have administrator permission.
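As promised under the 0.19.16 entry above, here is a minimal sketch of a vault-side ignore file for the (beta) ignore-file handling. The patterns are hypothetical examples in ordinary `.gitignore` syntax, written here via a shell heredoc:

```sh
# Hypothetical filters for the (beta) ignore-file handling; adjust to taste.
cat > .gitignore <<'EOF'
# Do not synchronise caches or bulky attachments
.trash/
*.tmp
attachments/videos/
EOF
```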
### 0.18.0

1
utils/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
fly.toml
29
utils/couchdb/couchdb-init.sh
Executable file
@@ -0,0 +1,29 @@
#!/bin/bash
# Configure a fresh CouchDB instance for Self-hosted LiveSync via its REST API.
# Requires: curl. Expects $hostname, $username and $password to be exported.
if [[ -z "$hostname" ]]; then
    echo "ERROR: Hostname missing"
    exit 1
fi
if [[ -z "$username" ]]; then
    echo "ERROR: Username missing"
    exit 1
fi

if [[ -z "$password" ]]; then
    echo "ERROR: Password missing"
    exit 1
fi

echo "-- Configuring CouchDB by REST APIs... -->"

# Each request is retried every 5 seconds until it succeeds, so this script can
# be started while CouchDB is still booting.
until (curl -X POST "${hostname}/_cluster_setup" -H "Content-Type: application/json" -d "{\"action\":\"enable_single_node\",\"username\":\"${username}\",\"password\":\"${password}\",\"bind_address\":\"0.0.0.0\",\"port\":5984,\"singlenode\":true}" --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd/require_valid_user" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/httpd/WWW-Authenticate" -H "Content-Type: application/json" -d '"Basic realm=\"couchdb\""' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/httpd/enable_cors" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd/enable_cors" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/chttpd/max_http_request_size" -H "Content-Type: application/json" -d '"4294967296"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/couchdb/max_document_size" -H "Content-Type: application/json" -d '"50000000"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/cors/credentials" -H "Content-Type: application/json" -d '"true"' --user "${username}:${password}"); do sleep 5; done
until (curl -X PUT "${hostname}/_node/nonode@nohost/_config/cors/origins" -H "Content-Type: application/json" -d '"app://obsidian.md,capacitor://localhost,http://localhost"' --user "${username}:${password}"); do sleep 5; done

echo "<-- Configuring CouchDB by REST APIs Done!"
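Not part of the script itself, but a handy way to confirm that the configuration stuck is to read it back through the same `_config` API; a minimal sketch, assuming the same environment variables as above:

```sh
# Read back one section of the node-local configuration (valid credentials are
# required, since require_valid_user has just been enabled).
curl --user "${username}:${password}" "${hostname}/_node/nonode@nohost/_config/chttpd"

# A single key can be fetched the same way:
curl --user "${username}:${password}" "${hostname}/_node/nonode@nohost/_config/couchdb/max_document_size"
```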
4
utils/flyio/delete-server.sh
Executable file
@@ -0,0 +1,4 @@
#!/bin/bash
# Tear down the CouchDB app deployed by deploy-server.sh.
# Requires flyctl and jq; run it in the directory that contains fly.toml.
set -e
fly scale count 0 -y
fly apps destroy $(fly status -j | jq -r .Name) -y
43
utils/flyio/deploy-server.sh
Executable file
@@ -0,0 +1,43 @@
#!/bin/bash
## Script to deploy and automatically set up CouchDB on fly.io.
## We need Deno for generating the Setup-URI.

source setenv.sh $@

export hostname="https://$appname.fly.dev"

echo "-- YOUR CONFIGURATION --"
echo "URL : $hostname"
echo "username: $username"
echo "password: $password"
echo "region : $region"
echo ""
echo "-- START DEPLOYING --> "

set -e
fly launch --name=$appname --env="COUCHDB_USER=$username" --copy-config=true --detach --no-deploy --region ${region} --yes
fly secrets set COUCHDB_PASSWORD=$password
fly deploy

set +e
../couchdb/couchdb-init.sh
# flyctl deploy
echo "OK!"

if command -v deno >/dev/null 2>&1; then
    echo "Setup finished! Also, we can set up Self-hosted LiveSync instantly, by the following setup uri."
    echo "Passphrase of setup-uri is \`welcome\`".
    echo "--- configured ---"
    echo "database : ${database}"
    echo "E2EE passphrase: ${passphrase}"
    echo "--- setup uri ---"
    deno run -A generate_setupuri.ts
else
    echo "Setup finished! Here are the configured values (reprise)!"
    echo "-- YOUR CONFIGURATION --"
    echo "URL : $hostname"
    echo "username: $username"
    echo "password: $password"
    echo "-- YOUR CONFIGURATION --"
    echo "If we had Deno, we would have got the setup URI directly!"
fi
40
utils/flyio/fly.template.toml
Normal file
@@ -0,0 +1,40 @@
## CouchDB for fly.io image

app = ''
primary_region = 'nrt'
swap_size_mb = 512

[build]
  image = "couchdb:latest"

[mounts]
  source = "couchdata"
  destination = "/opt/couchdb/data"
  initial_size = "1GB"
  auto_extend_size_threshold = 90
  auto_extend_size_increment = "1GB"
  auto_extend_size_limit = "2GB"

[env]
  COUCHDB_USER = ""
  ERL_FLAGS = "-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini"

[http_service]
  internal_port = 5984
  force_https = true
  auto_stop_machines = true
  auto_start_machines = true
  min_machines_running = 0
  processes = ['app']

[[vm]]
  cpu_kind = 'shared'
  cpus = 1
  memory_mb = 256

[[files]]
  guest_path = "/docker-entrypoint2.sh"
  raw_value = "#!/bin/bash\ntouch /opt/couchdb/data/persistence.ini\nchmod +w /opt/couchdb/data/persistence.ini\n/docker-entrypoint.sh $@"

[experimental]
  entrypoint = ["tini", "--", "/docker-entrypoint2.sh"]
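The `[[files]]` entry above packs a small wrapper entrypoint into a single `raw_value` string. Unescaped, the script it places at `/docker-entrypoint2.sh` reads:

```sh
#!/bin/bash
# Ensure persistence.ini exists and is writable before starting CouchDB.
touch /opt/couchdb/data/persistence.ini
chmod +w /opt/couchdb/data/persistence.ini
/docker-entrypoint.sh $@
```

Since `persistence.ini` lives on the mounted volume and is the last file in the `-couch_ini` list set via `ERL_FLAGS`, the settings written by `couchdb-init.sh` should land there and survive machine restarts.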
180
utils/flyio/generate_setupuri.ts
Normal file
@@ -0,0 +1,180 @@
import { webcrypto } from "node:crypto";

const KEY_RECYCLE_COUNT = 100;
type KeyBuffer = {
    key: CryptoKey;
    salt: Uint8Array;
    count: number;
};

let semiStaticFieldBuffer: Uint8Array;
const nonceBuffer: Uint32Array = new Uint32Array(1);
const writeString = (string: string) => {
    // Prepare enough buffer.
    const buffer = new Uint8Array(string.length * 4);
    const length = string.length;
    let index = 0;
    let chr = 0;
    let idx = 0;
    while (idx < length) {
        chr = string.charCodeAt(idx++);
        if (chr < 128) {
            buffer[index++] = chr;
        } else if (chr < 0x800) {
            // 2 bytes
            buffer[index++] = 0xC0 | (chr >>> 6);
            buffer[index++] = 0x80 | (chr & 0x3F);
        } else if (chr < 0xD800 || chr > 0xDFFF) {
            // 3 bytes
            buffer[index++] = 0xE0 | (chr >>> 12);
            buffer[index++] = 0x80 | ((chr >>> 6) & 0x3F);
            buffer[index++] = 0x80 | (chr & 0x3F);
        } else {
            // 4 bytes - surrogate pair
            chr = (((chr - 0xD800) << 10) | (string.charCodeAt(idx++) - 0xDC00)) + 0x10000;
            buffer[index++] = 0xF0 | (chr >>> 18);
            buffer[index++] = 0x80 | ((chr >>> 12) & 0x3F);
            buffer[index++] = 0x80 | ((chr >>> 6) & 0x3F);
            buffer[index++] = 0x80 | (chr & 0x3F);
        }
    }
    return buffer.slice(0, index);
};
const KeyBuffs = new Map<string, KeyBuffer>();
async function getKeyForEncrypt(passphrase: string, autoCalculateIterations: boolean): Promise<[CryptoKey, Uint8Array]> {
    // For performance, the plugin reuses the key KEY_RECYCLE_COUNT times.
    const buffKey = `${passphrase}-${autoCalculateIterations}`;
    const f = KeyBuffs.get(buffKey);
    if (f) {
        f.count--;
        if (f.count > 0) {
            return [f.key, f.salt];
        }
        f.count--;
    }
    const passphraseLen = 15 - passphrase.length;
    const iteration = autoCalculateIterations ? ((passphraseLen > 0 ? passphraseLen : 0) * 1000) + 121 - passphraseLen : 100000;
    const passphraseBin = new TextEncoder().encode(passphrase);
    const digest = await webcrypto.subtle.digest({ name: "SHA-256" }, passphraseBin);
    const keyMaterial = await webcrypto.subtle.importKey("raw", digest, { name: "PBKDF2" }, false, ["deriveKey"]);
    const salt = webcrypto.getRandomValues(new Uint8Array(16));
    const key = await webcrypto.subtle.deriveKey(
        {
            name: "PBKDF2",
            salt,
            iterations: iteration,
            hash: "SHA-256",
        },
        keyMaterial,
        { name: "AES-GCM", length: 256 },
        false,
        ["encrypt"]
    );
    KeyBuffs.set(buffKey, {
        key,
        salt,
        count: KEY_RECYCLE_COUNT,
    });
    return [key, salt];
}

function getSemiStaticField(reset?: boolean) {
    // return fixed field of iv.
    if (semiStaticFieldBuffer != null && !reset) {
        return semiStaticFieldBuffer;
    }
    semiStaticFieldBuffer = webcrypto.getRandomValues(new Uint8Array(12));
    return semiStaticFieldBuffer;
}

function getNonce() {
    // This is nonce, so do not send same thing.
    nonceBuffer[0]++;
    if (nonceBuffer[0] > 10000) {
        // reset semi-static field.
        getSemiStaticField(true);
    }
    return nonceBuffer;
}
function arrayBufferToBase64internalBrowser(buffer: DataView | Uint8Array): Promise<string> {
    return new Promise((res, rej) => {
        const blob = new Blob([buffer], { type: "application/octet-binary" });
        const reader = new FileReader();
        reader.onload = function (evt) {
            const dataURI = evt.target?.result?.toString() || "";
            if (buffer.byteLength != 0 && (dataURI == "" || dataURI == "data:")) return rej(new TypeError("Could not parse the encoded string"));
            const result = dataURI.substring(dataURI.indexOf(",") + 1);
            res(result);
        };
        reader.readAsDataURL(blob);
    });
}

// Map for converting hexString
const revMap: { [key: string]: number } = {};
const numMap: { [key: number]: string } = {};
for (let i = 0; i < 256; i++) {
    revMap[(`00${i.toString(16)}`.slice(-2))] = i;
    numMap[i] = (`00${i.toString(16)}`.slice(-2));
}

function uint8ArrayToHexString(src: Uint8Array): string {
    return [...src].map(e => numMap[e]).join("");
}

const QUANTUM = 32768;
async function arrayBufferToBase64Single(buffer: ArrayBuffer): Promise<string> {
    const buf = buffer instanceof Uint8Array ? buffer : new Uint8Array(buffer);
    if (buf.byteLength < QUANTUM) return btoa(String.fromCharCode.apply(null, [...buf]));
    return await arrayBufferToBase64internalBrowser(buf);
}

export async function encrypt(input: string, passphrase: string, autoCalculateIterations: boolean) {
    const [key, salt] = await getKeyForEncrypt(passphrase, autoCalculateIterations);
    // Create initial vector with semi-fixed part and incremental part
    // I think it's not good against related-key attacks.
    const fixedPart = getSemiStaticField();
    const invocationPart = getNonce();
    const iv = new Uint8Array([...fixedPart, ...new Uint8Array(invocationPart.buffer)]);
    const plainStringified = JSON.stringify(input);

    // const plainStringBuffer: Uint8Array = tex.encode(plainStringified)
    const plainStringBuffer: Uint8Array = writeString(plainStringified);
    const encryptedDataArrayBuffer = await webcrypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plainStringBuffer);
    const encryptedData2 = (await arrayBufferToBase64Single(encryptedDataArrayBuffer));
    //return data with iv and salt.
    const ret = `["${encryptedData2}","${uint8ArrayToHexString(iv)}","${uint8ArrayToHexString(salt)}"]`;
    return ret;
}

const URIBASE = "obsidian://setuplivesync?settings=";
async function main() {
    const conf = {
        "couchDB_URI": `${Deno.env.get("hostname")}`,
        "couchDB_USER": `${Deno.env.get("username")}`,
        "couchDB_PASSWORD": `${Deno.env.get("password")}`,
        "couchDB_DBNAME": `${Deno.env.get("database")}`,
        "syncOnStart": true,
        "gcDelay": 0,
        "periodicReplication": true,
        "syncOnFileOpen": true,
        "encrypt": true,
        "passphrase": `${Deno.env.get("passphrase")}`,
        "usePathObfuscation": true,
        "batchSave": true,
        "batch_size": 50,
        "batches_limit": 50,
        "useHistory": true,
        "disableRequestURI": true,
        "customChunkSize": 50,
        "syncAfterMerge": false,
        "concurrencyOfReadChunksOnline": 100,
        "minimumIntervalOfReadChunksOnline": 100,
    }
    const encryptedConf = encodeURIComponent(await encrypt(JSON.stringify(conf), "welcome", false));
    const theURI = `${URIBASE}${encryptedConf}`;
    console.log(theURI);
}
await main();
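For reference, `deploy-server.sh` runs this script with the connection details already exported. A minimal standalone sketch with placeholder values, assuming Deno is installed and you are in `utils/flyio`:

```sh
# Generate a setup URI for an existing CouchDB, outside of deploy-server.sh.
# All values below are placeholders; the script reads them from the environment.
export hostname=https://example-couchdb.fly.dev
export username=couchdb-admin-username
export password=couchdb-admin-password
export database=obsidiannotes
export passphrase=my-e2ee-passphrase
deno run -A generate_setupuri.ts   # prints obsidian://setuplivesync?settings=...
```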
30
utils/flyio/setenv.sh
Executable file
@@ -0,0 +1,30 @@
# Sourced by deploy-server.sh: fills any unset configuration value with a
# random adjective-noun-number name, and prepares fly.toml from the template.
random_num() {
    echo $RANDOM
}
random_noun() {
    nouns=("waterfall" "river" "breeze" "moon" "rain" "wind" "sea" "morning" "snow" "lake" "sunset" "pine" "shadow" "leaf" "dawn" "glitter" "forest" "hill" "cloud" "meadow" "sun" "glade" "bird" "brook" "butterfly" "bush" "dew" "dust" "field" "fire" "flower" "firefly" "feather" "grass" "haze" "mountain" "night" "pond" "darkness" "snowflake" "silence" "sound" "sky" "shape" "surf" "thunder" "violet" "water" "wildflower" "wave" "water" "resonance" "sun" "log" "dream" "cherry" "tree" "fog" "frost" "voice" "paper" "frog" "smoke" "star")
    echo ${nouns[$(($RANDOM % ${#nouns[*]}))]}
}

random_adjective() {
    adjectives=("autumn" "hidden" "bitter" "misty" "silent" "empty" "dry" "dark" "summer" "icy" "delicate" "quiet" "white" "cool" "spring" "winter" "patient" "twilight" "dawn" "crimson" "wispy" "weathered" "blue" "billowing" "broken" "cold" "damp" "falling" "frosty" "green" "long" "late" "lingering" "bold" "little" "morning" "muddy" "old" "red" "rough" "still" "small" "sparkling" "thrumming" "shy" "wandering" "withered" "wild" "black" "young" "holy" "solitary" "fragrant" "aged" "snowy" "proud" "floral" "restless" "divine" "polished" "ancient" "purple" "lively" "nameless")
    echo ${adjectives[$(($RANDOM % ${#adjectives[*]}))]}
}

cp ./fly.template.toml ./fly.toml

if [ "$1" = "renew" ]; then
    unset appname
    unset username
    unset password
    unset database
    unset passphrase
    unset region
fi

[ -z $appname ] && export appname=$(random_adjective)-$(random_noun)-$(random_num)
[ -z $username ] && export username=$(random_adjective)-$(random_noun)-$(random_num)
[ -z $password ] && export password=$(random_adjective)-$(random_noun)-$(random_num)
[ -z $database ] && export database="obsidiannotes"
[ -z $passphrase ] && export passphrase=$(random_adjective)-$(random_noun)-$(random_num)
[ -z $region ] && export region="nrt"
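As a side note, `setenv.sh` can also be sourced on its own to preview the values it would generate; a minimal sketch, assuming `fly.template.toml` is in the current directory (the script copies it to `fly.toml` as a side effect):

```sh
# Preview the generated configuration without deploying anything.
source ./setenv.sh
echo "$appname / $username / $password / $database / $passphrase / $region"

# Force a fresh set of values even if some are already exported:
source ./setenv.sh renew
```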
164
utils/readme.md
Normal file
@@ -0,0 +1,164 @@
<!-- For translation: 20240206r0 -->
# Utilities
Here are some useful things.

## couchdb

### couchdb-init.sh
This script configures CouchDB with the necessary settings via REST APIs.

#### Materials
- Mandatory: curl

#### Usage

```sh
export hostname=http://localhost:5984/
export username=couchdb-admin-username
export password=couchdb-admin-password
./couchdb-init.sh
```

The curl results will be shown; however, all of them can be ignored as long as the script runs to completion.
## fly.io

### deploy-server.sh

A fully automated CouchDB deployment script for fly.io. The only thing we need is an account there.

All omitted configuration values will be chosen at random (and that is preferred). The region is configured to `nrt`.
If Japan is not close to you, please choose a region closer to you. However, the deployed database will work even if you leave it as is.

#### Materials
- Mandatory: curl, flyctl
- Recommended: deno

#### Usage
```sh
#export appname=
#export username=
#export password=
#export database=
#export passphrase=
export region=nrt #pick your nearest location
./deploy-server.sh
```
The result of this command is as follows.

```
-- YOUR CONFIGURATION --
URL : https://young-darkness-25342.fly.dev
username: billowing-cherry-22580
password: misty-dew-13571
region : nrt

-- START DEPLOYING -->
An existing fly.toml file was found
Using build strategies '[the "couchdb:latest" docker image]'. Remove [build] from fly.toml to force a rescan
Creating app in /home/vorotamoroz/dev/obsidian-livesync/utils/flyio
We're about to launch your app on Fly.io. Here's what you're getting:

Organization: vorotamoroz (fly launch defaults to the personal org)
Name: young-darkness-25342 (specified on the command line)
Region: Tokyo, Japan (specified on the command line)
App Machines: shared-cpu-1x, 256MB RAM (specified on the command line)
Postgres: <none> (not requested)
Redis: <none> (not requested)

Created app 'young-darkness-25342' in organization 'personal'
Admin URL: https://fly.io/apps/young-darkness-25342
Hostname: young-darkness-25342.fly.dev
Wrote config file fly.toml
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
Platform: machines
✓ Configuration is valid
Your app is ready! Deploy with `flyctl deploy`
Secrets are staged for the first deployment
==> Verifying app config
Validating /home/vorotamoroz/dev/obsidian-livesync/utils/flyio/fly.toml
Platform: machines
✓ Configuration is valid
--> Verified app config
==> Building image
Searching for image 'couchdb:latest' remotely...
image found: img_ox20prk63084j1zq

Watch your deployment at https://fly.io/apps/young-darkness-25342/monitoring

Provisioning ips for young-darkness-25342
Dedicated ipv6: 2a09:8280:1::37:fde9
Shared ipv4: 66.241.124.163
Add a dedicated ipv4 with: fly ips allocate-v4

Creating a 1 GB volume named 'couchdata' for process group 'app'. Use 'fly vol extend' to increase its size
This deployment will:
* create 1 "app" machine

No machines in group app, launching a new machine

WARNING The app is not listening on the expected address and will not be reachable by fly-proxy.
You can fix this by configuring your app to listen on the following addresses:
- 0.0.0.0:5984
Found these processes inside the machine with open listening sockets:
PROCESS | ADDRESSES
-----------------*---------------------------------------
/.fly/hallpass | [fdaa:0:73b9:a7b:22e:3851:7f28:2]:22

Finished launching new machines

NOTE: The machines for [app] have services with 'auto_stop_machines = true' that will be stopped when idling

-------
Checking DNS configuration for young-darkness-25342.fly.dev

Visit your newly deployed app at https://young-darkness-25342.fly.dev/
-- Configuring CouchDB by REST APIs... -->
curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to young-darkness-25342.fly.dev:443
{"ok":true}
""
""
""
""
""
""
""
""
""
<-- Configuring CouchDB by REST APIs Done!
OK!
Setup finished! Also, we can set up Self-hosted LiveSync instantly, by the following setup uri.
Passphrase of setup-uri is `welcome`.
--- configured ---
database : obsidiannotes
E2EE passphrase: dark-wildflower-26467
--- setup uri ---
obsidian://setuplivesync?settings=%5B%22gZkBwjFbLqxbdSIbJymU%2FmTPBPAKUiHVGDRKYiNnKhW0auQeBgJOfvnxexZtMCn8sNiIUTAlxNaMGF2t%2BCEhpJoeCP%2FO%2BrwfN5LaNDQyky1Uf7E%2B64A5UWyjOYvZDOgq4iCKSdBAXp9oO%2BwKh4MQjUZ78vIVvJp8Mo6NWHfm5fkiWoAoddki1xBMvi%2BmmN%2FhZatQGcslVb9oyYWpZocduTl0a5Dv%2FQviGwlYQ%2F4NY0dVDIoOdvaYS%2FX4GhNAnLzyJKMXhPEJHo9FvR%2FEOBuwyfMdftV1SQUZ8YDCuiR3T7fh7Kn1c6OFgaFMpFm%2BWgIJ%2FZpmAyhZFpEcjpd7ty%2BN9kfd9gQsZM4%2BYyU9OwDd2DahVMBWkqoV12QIJ8OlJScHHdcUfMW5ex%2F4UZTWKNEHJsigITXBrtq11qGk3rBfHys8O0vY6sz%2FaYNM3iAOsR1aoZGyvwZm4O6VwtzK8edg0T15TL4O%2B7UajQgtCGxgKNYxb8EMOGeskv7NifYhjCWcveeTYOJzBhnIDyRbYaWbkAXQgHPBxzJRkkG%2FpBPfBBoJarj7wgjMvhLJ9xtL4FbP6sBNlr8jtAUCoq4L7LJcRNF4hlgvjJpL2BpFZMzkRNtUBcsRYR5J%2BM1X2buWi2BHncbSiRRDKEwNOQkc%2FmhMJjbAn%2F8eNKRuIICOLD5OvxD7FZNCJ0R%2BWzgrzcNV%22%2C%22ec7edc900516b4fcedb4c7cc01000000%22%2C%22fceb5fe54f6619ee266ed9a887634e07%22%5D
```

All we have to do is copy the setup-URI (`obsidian://...`) and open it from Self-hosted LiveSync on Obsidian.

If you have not installed Deno, the configured values will be printed again instead of the setup-URI. In that case, we should configure the plug-in manually.
### delete-server.sh

The companion script to `deploy-server.sh`. With the generated fly.toml in place, we can delete the deployed server with this script.

#### Materials

- Mandatory: flyctl, jq
- Recommended: none

#### Usage
```sh
./delete-server.sh
```

```
App 'young-darkness-25342 is going to be scaled according to this plan:
-1 machines for group 'app' on region 'nrt' of size 'shared-cpu-1x'
Executing scale plan
Destroyed e28667eec57158 group:app region:nrt size:shared-cpu-1x
Destroyed app young-darkness-25342
```