Compare commits

...

68 Commits

Author SHA1 Message Date
vorotamoroz
61188cfaef bump 2024-01-16 08:36:37 +00:00
vorotamoroz
97d944fd75 New feature:
- We can perform automatic conflict resolution for inactive files, and postpone only manual ones, via `Postpone manual resolution of inactive files`.
- Images can now be shown in the document history dialogue.
  - We can also see the differences between image revisions there.
  - These differences can be highlighted.

Improved:
- Hidden file sync has been stabilised.
- The conflict-resolution dialogue now reloads automatically when new conflicted revisions arrive.

Fixed:
- Periodic processes no longer run after unloading the plug-in.
- Modifications to binary files are now reliably stored in the storage.
2024-01-16 08:32:43 +00:00
vorotamoroz
d3dc1e7328 Minor fix and refine the readme 2024-01-12 10:29:18 +00:00
vorotamoroz
45304af369 bump 2024-01-12 09:38:57 +00:00
vorotamoroz
7f422d58f2 - Refined:
  - Task scheduling logic has been rewritten.
    - Many bugs and fragile behaviours have probably been fixed along the way.
- Fixed:
  - Remote chunk fetching now keeps an interval between requests (a sketch follows this entry).
- New feature:
  - We can show only the icons in the editor.
2024-01-12 09:36:49 +00:00
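The "Fixed" item above (remote chunk fetching keeping request intervals) can be illustrated with a small, hedged TypeScript sketch: requests are serialised through a promise chain and a minimum gap is enforced between them. `MIN_INTERVAL_MS` and `fetchWithInterval` are illustrative names, not the plugin's actual API.

```ts
// Hedged sketch: serialise remote chunk requests and keep a minimum interval between them.
const MIN_INTERVAL_MS = 100;
let queue: Promise<unknown> = Promise.resolve();

function fetchWithInterval<T>(doFetch: () => Promise<T>): Promise<T> {
    const task = queue.then(async () => {
        const result = await doFetch();
        // Wait before the next queued request is allowed to start.
        await new Promise((resolve) => setTimeout(resolve, MIN_INTERVAL_MS));
        return result;
    });
    // Keep the chain alive even if this request fails.
    queue = task.catch(() => undefined);
    return task;
}
```

Callers would then wrap each remote request, for example `await fetchWithInterval(() => fetch(chunkUrl))`.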
vorotamoroz
c2491fdfad bump 2023-12-11 12:55:01 +09:00
vorotamoroz
06a6e391e8 Fixed a change detection bug. 2023-12-11 12:53:50 +09:00
vorotamoroz
f99475f6b7 bump 2023-12-11 12:46:23 +09:00
vorotamoroz
109fc00b9d Fixed
- Document IDs are now shown in the log by their first 8 letters.
2023-12-11 12:45:40 +09:00
vorotamoroz
c071d822e1 - Improved:
  - All revisions are now shown only by their first few letters.
- Fixed:
  - A check before modifying files has been implemented.
  - Content change detection has been improved.
2023-12-11 12:22:17 +09:00
vorotamoroz
d2de5b4710 bump 2023-12-04 19:39:47 +09:00
vorotamoroz
cf5ecd8922 Implemented:
- We can now use SHA-1 as a fallback hash function (a sketch follows this entry).
2023-12-04 19:39:04 +09:00
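As a hedged illustration of the SHA-1 fallback mentioned above (the plugin's own `sha1` helper in `lib/src/strbin`, visible in the diff further below, may be implemented differently), a browser-environment sketch using Web Crypto could look like this:

```ts
// Hedged sketch: SHA-1 via Web Crypto, used only as a fallback when the primary
// hash function (xxhash-wasm in the dependency list) is unavailable.
async function sha1Fallback(input: string): Promise<string> {
    const data = new TextEncoder().encode(input);
    const digest = await crypto.subtle.digest("SHA-1", data);
    // Render the digest as a lowercase hex string.
    return Array.from(new Uint8Array(digest))
        .map((byte) => byte.toString(16).padStart(2, "0"))
        .join("");
}
```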
vorotamoroz
b337a05b5a bump 2023-11-27 07:13:15 +00:00
vorotamoroz
9ea6bee9d1 - Fixed:
  - Files are no longer broken while rebuilding.
  - Large binary files can now be written correctly on mobile platforms.
  - Decoding errors now result in zero-byte files.
- Modified:
  - All files are processed sequentially, one by one.
2023-11-27 06:55:55 +00:00
vorotamoroz
9747c26d50 bump 2023-11-25 02:22:26 +09:00
vorotamoroz
bb4b764586 - Fixed:
  - No more infinite loops on larger files.
  - A message is now shown on decode errors.
- Refactored:
  - Reworked to avoid obsolete global variables.
2023-11-25 02:21:44 +09:00
vorotamoroz
279b4b41e5 bump 2023-11-24 10:32:46 +00:00
vorotamoroz
b644fb791d - Changes and performance improvements:
  - Files being saved are now processed as Blobs (a sketch follows this entry).
  - The V2 format has been reverted.
  - The new encoding format is now enabled by default.
  - WARNING: Since this version, compatibility with older Filesystem LiveSync has been lost.
2023-11-24 10:31:58 +00:00
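The first item of the entry above concerns handling saved content as Blobs. The diff of `CmdConfigSync.ts` further below shows `createTextBlob(...)` being stored with `size: content.size`; a plausible, hedged sketch of such a helper (not necessarily the plugin's exact implementation) is:

```ts
// Hedged sketch of a Blob-based text wrapper; the real helper may differ.
function createTextBlob(text: string | string[]): Blob {
    const parts = Array.isArray(text) ? text : [text];
    return new Blob(parts, { type: "text/plain" });
}

// Blob.size is the byte length, which is what gets recorded as the entry size.
const content = createTextBlob("Hello, LiveSync!");
console.log(content.size);
```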
vorotamoroz
ac9428e96b Fixed
- For better replication, path obfuscation is now deterministic even with E2EE enabled (a sketch follows this entry).
2023-11-15 08:44:03 +00:00
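As an illustration only (not the plugin's actual algorithm) of deterministic path obfuscation under E2EE: derive the obfuscated identifier from an HMAC of the path keyed by the E2EE passphrase, so every device with the same passphrase computes the same identifier for the same path and replication stays consistent. The function name and the `f:` prefix below are assumptions.

```ts
// Hedged sketch only: deterministic path obfuscation keyed by the E2EE passphrase.
async function obfuscatePath(path: string, passphrase: string): Promise<string> {
    const encoder = new TextEncoder();
    const key = await crypto.subtle.importKey(
        "raw",
        encoder.encode(passphrase),
        { name: "HMAC", hash: "SHA-256" },
        false,
        ["sign"]
    );
    const mac = await crypto.subtle.sign("HMAC", key, encoder.encode(path));
    const hex = Array.from(new Uint8Array(mac))
        .map((byte) => byte.toString(16).padStart(2, "0"))
        .join("");
    return "f:" + hex; // illustrative prefix for obfuscated identifiers
}
```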
vorotamoroz
280d9e1dd9 Fixed: Fixed the issue of TOML editing. 2023-11-07 01:07:58 +00:00
vorotamoroz
f7209e566c bump 2023-10-24 10:07:29 +01:00
vorotamoroz
4a9ab2d1de Fixed:
- Enumerating file names is no longer broken.
2023-10-24 10:07:17 +01:00
vorotamoroz
cb74b5ee93 - Fixed
  - Empty files can now be decoded.
  - Local files are no longer pre-saved before fetching from a remote database.
  - No more deadlocks while applying customisation sync.
  - Configurations consisting of multiple files can now be applied correctly.
  - Folder deletion propagation now works without enabling the use of a trash bin.
2023-10-24 09:54:56 +01:00
vorotamoroz
60eecd7001 bump 2023-10-17 12:00:59 +09:00
vorotamoroz
4bd7b54bcd Fixed:
- Files whose paths have digit or character prefixes are no longer ignored.
2023-10-17 12:00:19 +09:00
vorotamoroz
8923c73d1b bump 2023-10-14 23:08:34 +09:00
vorotamoroz
11e64b13e2 The text-input-dialogue is no longer broken. 2023-10-14 23:07:51 +09:00
vorotamoroz
983d9248ed bump 2023-10-13 04:23:59 +01:00
vorotamoroz
7240e84328 - New feature:
  - We can launch Customization sync from the Ribbon if we enable it.
- Fixed:
  - The Setup URI is back to the previous spec; it is encrypted with V1.
  - The Settings dialogue is now registered at the beginning of the start-up process.
- Improved:
  - Enumerating documents is now faster.
2023-10-13 04:22:24 +01:00
vorotamoroz
0d55ae2532 Merge pull request #298 from LiamSwayne/patch-1
Grammar fix
2023-10-04 13:07:42 +09:00
Liam Swayne
dbd284f5dd grammar fix 2023-10-03 22:26:30 -04:00
vorotamoroz
c000a02f4a bump 2023-10-02 10:39:54 +01:00
vorotamoroz
79754f48d6 New feature:
- We can delete all customization sync data via `Delete all customization sync data` on the `Hatch` pane.
Fixed:
- Repeated restarts on iOS are now prevented by yielding microtasks (a sketch follows this entry).
2023-10-02 10:38:54 +01:00
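A minimal sketch of the microtask-yielding idea mentioned in the entry above: hand control back periodically inside long-running loops so the engine on iOS is not starved. The function name and batch size are illustrative, not the plugin's API.

```ts
// Hedged sketch: yield every few iterations so long loops do not monopolise the engine.
async function processAllWithYield<T>(
    items: T[],
    handler: (item: T) => Promise<void>
): Promise<void> {
    let processed = 0;
    for (const item of items) {
        await handler(item);
        // Yield a microtask so other queued work gets a chance to run.
        if (++processed % 25 === 0) {
            await Promise.resolve();
        }
    }
}
```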
vorotamoroz
dd7a40630b bump 2023-10-02 09:54:56 +01:00
vorotamoroz
14406f8213 - Fixed:
  - No more UI freezes and repeated restarts on iOS.
  - Diffs of non-markdown documents are now shown correctly.
- Improved:
  - Performance has been improved a bit.
  - Customization sync has become faster.
    - However, we lost forward compatibility again (only for this feature). Please update all devices.
- Misc
  - The Terser configuration has become more aggressive.
2023-10-02 09:53:41 +01:00
vorotamoroz
3bbd9c048d bump 2023-09-29 18:57:54 +09:00
vorotamoroz
d91c4f50b4 - Improved:
  - New binary file handling has been implemented.
  - A new encrypted format has been implemented.
  - Chunk sizes are now adjusted for efficient sync (a sketch follows this entry).
- Fixed:
  - Exception levels in some logs have been fixed.
- Tidied:
  - Some lint warnings have been suppressed.
2023-09-29 18:55:46 +09:00
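As a hedged illustration of "chunk sizes adjusted for efficient sync" (the plugin's real splitter is content-aware and more elaborate), one sizing idea is to scale the piece size with the content length so large files do not explode into thousands of tiny chunks:

```ts
// Hedged sketch: scale the chunk size with the content length, within fixed bounds.
function splitIntoChunks(data: string, minSize = 100, maxSize = 100_000): string[] {
    const target = Math.min(maxSize, Math.max(minSize, Math.floor(data.length / 25)));
    const chunks: string[] = [];
    for (let position = 0; position < data.length; position += target) {
        chunks.push(data.substring(position, position + target));
    }
    return chunks;
}
```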
vorotamoroz
395b7fbc42 bump 2023-09-22 18:03:09 +09:00
vorotamoroz
3773e57429 Improved:
- The log pane can now also be opened from the command palette.
- The hidden file scanning interval can now be set to 0.
- `Check database configuration` now points out when we do not have administrator permission.
2023-09-22 17:59:32 +09:00
vorotamoroz
4835fce62a bump 2023-09-21 09:44:50 +01:00
vorotamoroz
ff814be4a0 Fixed:
- Synchronisation now begins without our interaction.
- The remote database configuration is no longer written to the log while checking the configuration.
- Some outdated description notes have been removed.
- Options that are meaningless given other configured settings are now hidden.
2023-09-21 09:44:07 +01:00
vorotamoroz
b271b63efa bump 2023-09-20 07:05:18 +01:00
vorotamoroz
23419e476a - Fixed:
  - Hidden files are no longer handled in the initial replication.
  - The report from `Making report` has been fixed.
2023-09-20 07:04:31 +01:00
vorotamoroz
b9bd1f17b8 bump 2023-09-19 09:58:04 +01:00
vorotamoroz
bcce277c36 New feature:
- `Sync on Editor save` has been implemented.
- We can now use `Hidden file sync` and `Customization sync` cooperatively.
- We can ignore specific plugins in Customization sync.
- The message about leftover conflicted files now accepts our click.

Refactored:
- Parallelism functions have been made more explicit.
- Type errors have been reduced.

Fixed:
- Documents are no longer overwritten if they are conflicted.
- Some error messages have been fixed.
- Missing dialogue titles are now shown.
2023-09-19 09:53:48 +01:00
vorotamoroz
5acbbe479e Merge pull request #276 from cgcel/main
Add Quick setup Chinese translation.
2023-09-13 17:13:55 +09:00
vorotamoroz
c9f9d511e0 bump 2023-09-05 09:16:49 +01:00
vorotamoroz
b8cb94c498 Fixed:
- Resolving conflicted revisions has become more robust.
- LiveSync now tries to keep local changes when fetching from a rebuilt remote database.
- All files are now restored immediately after performing `fetch`.
2023-09-05 09:16:11 +01:00
cgcel
52c736f6b9 update: format markdown titles. 2023-09-01 11:01:06 +08:00
cgcel
ebd1cb7777 add: Quick setup Chinese translation. 2023-09-01 10:58:07 +08:00
vorotamoroz
10decb7909 Merge pull request #259 from jt-wang/main
Upload colab notebook into the repo, and update setup_flyio.md to reference it
2023-08-24 13:04:02 +09:00
vorotamoroz
e0aab8d69d bump 2023-08-24 04:03:03 +01:00
vorotamoroz
618600c753 Fixed:
- Empty (or deleted) files can now be conflict-resolved.
2023-08-24 04:00:56 +01:00
Jingtao Wang
d1aba87e37 Updated with latest changes 2023-08-23 19:22:07 -07:00
Jingtao Wang
db889f635e Merge pull request #1 from jt-wang/move-colab-script-into-repo
Upload deploy_couchdb_to_flyio_v2_with_swap.ipynb into the repo, and reference it from fly.io setup doc.
2023-08-15 14:55:37 -07:00
Jingtao Wang
dd80e634f5 Update setup_flyio.md to reference colab notebook within the repo 2023-08-15 14:54:43 -07:00
Jingtao Wang
bec6fc1a74 Upload deploy_couchdb_to_flyio_v2_with_swap.ipynb
Upload https://gist.github.com/vrtmrz/37c3efd7842e49947aaaa7f665e5020a into the repo for unified management.
2023-08-15 14:49:24 -07:00
vorotamoroz
5c96c7f99b bump 2023-08-08 10:45:50 +01:00
vorotamoroz
7b9724f713 Fixed:
- Nested ignore files can now be parsed correctly (a sketch follows this entry).
- The unexpected deletion of hidden files in some cases has been corrected.
- Hidden file changes are no longer reflected back onto the device that made the change itself.
2023-08-08 10:42:07 +01:00
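The nested ignore-file parsing mentioned in the entry above can be sketched as walking from the file's folder up to the vault root and testing the relative path against the patterns of each ignore file found along the way. `minimatch` appears in the `package.json` diff below; `loadPatterns` is a hypothetical helper, and the real implementation may differ.

```ts
import { minimatch } from "minimatch";

// Hedged sketch only: a file is ignored if any ancestor folder's ignore file contains a
// pattern matching the path relative to that folder.
async function isIgnored(
    path: string,
    loadPatterns: (dir: string) => Promise<string[]>
): Promise<boolean> {
    const segments = path.split("/").slice(0, -1);
    for (let depth = segments.length; depth >= 0; depth--) {
        const dir = segments.slice(0, depth).join("/");
        const relative = dir.length === 0 ? path : path.substring(dir.length + 1);
        for (const pattern of await loadPatterns(dir)) {
            if (minimatch(relative, pattern, { dot: true })) return true;
        }
    }
    return false;
}
```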
vorotamoroz
4cd12c85ed update the changelog. 2023-08-04 10:03:51 +01:00
vorotamoroz
90651540f9 Update release.yml 2023-08-04 17:52:13 +09:00
vorotamoroz
9e504d5002 bump 2023-08-04 09:45:14 +01:00
vorotamoroz
faaa94423c New feature:
- (Beta) Handling of ignore files

Fixed:
- Buttons on the lock-detected dialogue can now be shown on narrow-width devices.

Improved:
- Some constants have been flattened so they can be evaluated.
- Usage of Obsidian's deprecated APIs has been reduced.
- The IndexedDB adapter is now enabled while importing configuration.

Misc:
- Compiler, framework, and dependencies have been upgraded.
- To absorb the impact of these upgrades (especially esbuild and Svelte), Terser has been introduced.
  Feel free to share your opinion with me! I do not like obfuscating the code either.
2023-08-04 09:45:04 +01:00
vorotamoroz
a7c179fc86 Merge pull request #247 from cgcel/patch-1
Update setup_own_server_cn.md
2023-07-31 12:49:50 +09:00
vorotamoroz
ed1a670b9b bump 2023-07-26 17:35:06 +09:00
vorotamoroz
6c3c265bd6 Dependency updates 2023-07-26 17:32:07 +09:00
vorotamoroz
9d68025f2a Fixed:
- Storing files after a clean-up now works correctly.

Improved:
- Cleaning up the local database has become dramatically faster.
2023-07-26 17:31:55 +09:00
GC Chen
e70972c8f9 Update setup_own_server_cn.md 2023-07-25 22:31:17 +08:00
43 changed files with 5414 additions and 2940 deletions

View File

@@ -1,4 +1,5 @@
node_modules
build
.eslintrc.js.bak
src/lib/src/patches/pouchdb-utils
src/lib/src/patches/pouchdb-utils
esbuild.config.mjs

View File

@@ -17,7 +17,7 @@ jobs:
- name: Use Node.js
uses: actions/setup-node@v1
with:
node-version: '14.x' # You might need to adjust this value to your own version
node-version: '18.x' # You might need to adjust this value to your own version
# Get the version number and put it in a variable
- name: Get Version
id: version

1
.gitignore vendored
View File

@@ -8,6 +8,7 @@ package-lock.json
# build
main.js
main_org.js
*.js.map
# obsidian

View File

@@ -59,14 +59,23 @@ Synchronization status is shown in statusbar.
- Status
- ⏹️ Stopped
- 💤 LiveSync enabled. Waiting for changes.
- ⚡️ Synchronization in progress.
- ⚠ An error occurred.
- ↑ Uploaded chunks and metadata
- ↓ Downloaded chunks and metadata
- ⏳ Number of pending processes
- 🧩 Number of files waiting for their chunks.
If you have deleted or renamed files, please wait until ⏳ icon disappeared.
- 💤 LiveSync enabled. Waiting for changes
- ⚡️ Synchronization in progress
- ⚠ An error occurred
- Statistical indicator
- ↑ Uploaded chunks and metadata
- ↓ Downloaded chunks and metadata
- Progress indicator
- 📥 Unprocessed transferred items
- 📄 Working database operation
- 💾 Working write storage processes
- ⏳ Working read storage processes
- 🛫 Pending read storage processes
- ⚙️ Working or pending storage processes of hidden files
- 🧩 Waiting chunks
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
To prevent file and database corruption, please wait until all progress indicators have disappeared, especially if you have deleted or renamed files.
## Hints

View File

@@ -0,0 +1,318 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "view-in-github"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/vrtmrz/37c3efd7842e49947aaaa7f665e5020a/deploy_couchdb_to_flyio_v2_with_swap.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "HiRV7G8Gk1Rs"
},
"source": [
"History:\n",
"- 18, May, 2023: Initial.\n",
"- 19, Jun., 2023: Patched for enabling swap.\n",
"- 22, Aug., 2023: Generating Setup-URI implemented.\n",
"- 7, Nov., 2023: Fixed the issue of TOML editing."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "2Vh0mEQEZuAK"
},
"outputs": [],
"source": [
"# Configurations\n",
"import os\n",
"os.environ['region']=\"nrt\"\n",
"os.environ['couchUser']=\"alkcsa93\"\n",
"os.environ['couchPwd']=\"c349usdfnv48fsasd\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "SPmbB0jZauQ1"
},
"outputs": [],
"source": [
"# Delete once (Do not care about `cannot remove './fly.toml': No such file or directory`)\n",
"!rm ./fly.toml"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Nze7QoxLZ7Yx"
},
"outputs": [],
"source": [
"# Installation\n",
"# You have to set up your account in here.\n",
"!curl -L https://fly.io/install.sh | sh\n",
"!/root/.fly/bin/flyctl auth signup"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "MVJwsIYrbgtx"
},
"outputs": [],
"source": [
"# Generate server\n",
"!/root/.fly/bin/flyctl launch --auto-confirm --generate-name --detach --no-deploy --region ${region}\n",
"!/root/.fly/bin/fly volumes create --region ${region} couchdata --size 2 --yes"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "2RSoO9o-i2TT"
},
"outputs": [],
"source": [
"# Check the toml once.\n",
"!cat fly.toml"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "zUtPZLVnbvdQ"
},
"outputs": [],
"source": [
"# Modify the TOML and generate Dockerfile\n",
"!pip install mergedeep\n",
"from mergedeep import merge\n",
"import toml\n",
"fly = toml.load('fly.toml')\n",
"override = {\n",
" \"http_service\":{\n",
" \"internal_port\":5984\n",
" },\n",
" \"build\":{\n",
" \"dockerfile\":\"./Dockerfile\"\n",
" },\n",
" \"mounts\":{\n",
" \"source\":\"couchdata\",\n",
" \"destination\":\"/opt/couchdb/data\"\n",
" },\n",
" \"env\":{\n",
" \"COUCHDB_USER\":os.environ['couchUser'],\n",
" \"ERL_FLAGS\":\"-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini\",\n",
" }\n",
"}\n",
"out = merge(fly,override)\n",
"with open('fly.toml', 'wt') as fp:\n",
" toml.dump(out, fp)\n",
" fp.close()\n",
"\n",
"# Make the Dockerfile to modify the permission of the ini file. If you want to use a specific version, you should change `latest` here.\n",
"dockerfile = '''FROM couchdb:latest\n",
"RUN sed -i '2itouch /opt/couchdb/data/persistence.ini && chmod +w /opt/couchdb/data/persistence.ini && fallocate -l 512M /swapfile && chmod 0600 /swapfile && mkswap /swapfile && echo 10 > /proc/sys/vm/swappiness && swapon /swapfile && echo 1 > /proc/sys/vm/overcommit_memory' /docker-entrypoint.sh\n",
"'''\n",
"with open(\"./Dockerfile\",\"wt\") as fp:\n",
" fp.write(dockerfile)\n",
" fp.close()\n",
"\n",
"!echo ------\n",
"!cat fly.toml\n",
"!echo ------\n",
"!cat Dockerfile"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "xWdsTCI6bzk2"
},
"outputs": [],
"source": [
"# Configure password\n",
"!/root/.fly/bin/flyctl secrets set COUCHDB_PASSWORD=${couchPwd}"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "k0WIQlShcXGa"
},
"outputs": [],
"source": [
"# Deploy server\n",
"# Be sure to shutdown after the test.\n",
"!/root/.fly/bin/flyctl deploy --detach --remote-only\n",
"!/root/.fly/bin/flyctl status"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "0ySggkdlfq7M"
},
"outputs": [],
"source": [
"import subprocess, json\n",
"result = subprocess.run([\"/root/.fly/bin/flyctl\",\"status\",\"-j\"], capture_output=True, text=True)\n",
"if result.returncode==0:\n",
" hostname = json.loads(result.stdout)[\"Hostname\"]\n",
" os.environ['couchHost']=\"https://%s\" % (hostname)\n",
" print(\"Your couchDB server is https://%s/\" % (hostname))\n",
"else:\n",
" print(\"Something occured.\")\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "cGlSzVqlQG_z"
},
"outputs": [],
"source": [
"# Finish setting up the CouchDB\n",
"# Please repeat until the request is completed without error messages\n",
"# i.e., You have to redo this block while \"curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to xxxx\" is showing.\n",
"#\n",
"# Note: A few minutes might be required to be booted.\n",
"!curl -X POST \"${couchHost}/_cluster_setup\" -H \"Content-Type: application/json\" -d \"{\\\"action\\\":\\\"enable_single_node\\\",\\\"username\\\":\\\"${couchUser}\\\",\\\"password\\\":\\\"${couchPwd}\\\",\\\"bind_address\\\":\\\"0.0.0.0\\\",\\\"port\\\":5984,\\\"singlenode\\\":true}\" --user \"${couchUser}:${couchPwd}\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "JePzrsHypY18"
},
"outputs": [],
"source": [
"# Please repeat until all lines are completed without error messages\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/httpd/WWW-Authenticate\" -H \"Content-Type: application/json\" -d '\"Basic realm=\\\"couchdb\\\"\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/httpd/enable_cors\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/enable_cors\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/max_http_request_size\" -H \"Content-Type: application/json\" -d '\"4294967296\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/couchdb/max_document_size\" -H \"Content-Type: application/json\" -d '\"50000000\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/credentials\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/origins\" -H \"Content-Type: application/json\" -d '\"app://obsidian.md,capacitor://localhost,http://localhost\"' --user \"${couchUser}:${couchPwd}\""
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YfSOomsoXbGS"
},
"source": [
"Now, our CouchDB has been surely installed and configured. Cheers!\n",
"\n",
"In the steps that follow, create a setup-URI.\n",
"\n",
"This URI could be imported directly into Self-hosted LiveSync, to configure the use of the CouchDB which we configured now."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "416YncOqXdNn"
},
"outputs": [],
"source": [
"# Database config\n",
"import random, string\n",
"\n",
"def randomname(n):\n",
" return ''.join(random.choices(string.ascii_letters + string.digits, k=n))\n",
"\n",
"# The database name\n",
"os.environ['database']=\"obsidiannote\"\n",
"# The passphrase to E2EE\n",
"os.environ['passphrase']=randomname(20)\n",
"\n",
"print(\"Your database:\"+os.environ['database'])\n",
"print(\"Your passphrase:\"+os.environ['passphrase'])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "C4d7C0HAXgsr"
},
"outputs": [],
"source": [
"# Install deno for make setup uri\n",
"!curl -fsSL https://deno.land/x/install/install.sh | sh"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "hQL_Dx-PXise"
},
"outputs": [],
"source": [
"# Fetch module for encrypting a Setup URI\n",
"!curl -o encrypt.ts https://gist.githubusercontent.com/vrtmrz/f9d1d95ee2ca3afa1a924a2c6759b854/raw/d7a070d864a6f61403d8dc74208238d5741aeb5a/encrypt.ts"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "o0gX_thFXlIZ"
},
"outputs": [],
"source": [
"# Make buttons!\n",
"from IPython.display import HTML\n",
"result = subprocess.run([\"/root/.deno/bin/deno\",\"run\",\"-A\",\"encrypt.ts\"], capture_output=True, text=True)\n",
"text=\"\"\n",
"if result.returncode==0:\n",
" text = result.stdout.strip()\n",
" result = HTML(f\"<button onclick=navigator.clipboard.writeText('{text}')>Copy setup uri</button><br>Importing passphrase is `welcome`. <br>If you want to synchronise in live mode, please apply a preset after setup.)\")\n",
"else:\n",
" result = \"Failed to encrypt the setup URI\"\n",
"result"
]
}
],
"metadata": {
"colab": {
"include_colab_link": true,
"private_outputs": true,
"provenance": []
},
"gpuClass": "standard",
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 0
}

View File

@@ -1,4 +1,7 @@
# Quick setup
[Japanese docs](./quick_setup_ja.md) - [Chinese docs](./quick_setup_cn.md).
The plugin has so many configuration options to deal with different circumstances. However, there are not so many settings that are actually used. Therefore, `The Setup wizard` has been implemented to simplify the initial setup.
Note: Subsequent devices are recommended to be set up using the `Copy setup URI` and `Open setup URI`.

93
docs/quick_setup_cn.md Normal file
View File

@@ -0,0 +1,93 @@
# 快速配置 (Quick setup)
该插件有较多配置项, 可以应对不同的情况. 不过, 实际使用的设置并不多. 因此, 我们采用了 "设置向导 (The Setup wizard)" 来简化初始设置.
Note: 建议使用 `Copy setup URI` and `Open setup URI` 来设置后续设备.
## 设置向导 (The Setup wizard)
在设置对话框中打开 `🧙‍♂️ Setup wizard`. 如果之前未配置插件, 则会自动打开该页面.
![quick_setup_1](../images/quick_setup_1.png)
- 放弃现有配置并进行设置
如果您先前有过任何设置, 此按钮允许您在设置前放弃所有更改.
- 保留现有配置和设置
快速重新配置. 请注意, 在向导模式下, 您无法看到所有已经配置过的配置项.
在上述选项中按下 `Next`, 配置对话框将进入向导模式 (wizard mode).
### 向导模式 (Wizard mode)
![quick_setup_2](../images/quick_setup_2.png)
接下来将介绍如何逐步使用向导模式.
## 配置远程数据库
### 开始配置远程数据库
输入已部署好的数据库的信息.
![quick_setup_3](../images/quick_setup_3.png)
#### 测试数据库连接并检查数据库配置
我们可以检查数据库的连接性和数据库设置.
![quick_setup_5](../images/quick_setup_5.png)
#### 测试数据库连接
检查是否能成功连接数据库. 如果连接失败, 可能是多种原因导致的, 但请先点击 `Check database configuration` 来检查数据库配置是否有问题.
#### 检查数据库配置
检查数据库设置并修复问题.
![quick_setup_6](../images/quick_setup_6.png)
Config check 的显示内容可能因不同连接而异. 在上图情况下, 按下所有三个修复按钮.
如果修复按钮消失, 全部变为复选标记, 则表示修复完成.
### 加密配置
![quick_setup_4](../images/quick_setup_4.png)
为您的数据库加密, 以防数据库意外曝光; 启用端到端加密后, 笔记内容在离开设备时就会被加密. 我们强烈建议启用该功能. `路径混淆 (Path Obfuscation)` 还能混淆文件名. 现已稳定并推荐使用.
加密基于 256 位 AES-GCM.
如果你在一个封闭的网络中, 而且很明显第三方不会访问你的文件, 则可以禁用这些设置.
![quick_setup_7](../images/quick_setup_7.png)
#### Next
转到同步设置.
#### 放弃现有数据库并继续
清除远程数据库的内容, 然后转到同步设置.
### 同步设置
最后, 选择一个同步预设完成向导.
![quick_setup_9_1](../images/quick_setup_9_1.png)
选择我们要使用的任何同步方法, 然后 `Apply` 初始化并按要求建立本地和远程数据库. 如果显示 `All done!`, 我们就完成了. `Copy setup URI` 将自动打开,并要求我们输入密码以加密 `Setup URI`.
![quick_setup_10](../images/quick_setup_10.png)
根据需要设置密码。
设置 URI (Setup URI) 将被复制到剪贴板, 然后您可以通过某种方式将其传输到第二个及后续设备.
## 如何设置第二单元和后续单元 (the second and subsequent units)
在第一台设备上安装 Self-hosted LiveSync 后, 从命令面板上选择 `Open setup URI`, 然后输入您传输的设置 URI (Setup URI). 然后输入密码,安装向导就会打开.
在弹窗中选择以下内容.
- `Importing LiveSync's conf, OK?` 选择 `Yes`
- `How would you like to set it up?`. 选择 `Set it up as secondary or subsequent device`
然后, 配置将生效并开始复制. 您的文件很快就会同步! 您可能需要关闭设置对话框并重新打开, 才能看到设置字段正确填充, 但它们都将设置好.

View File

@@ -280,7 +280,7 @@ Now the CouchDB is ready to use from Self-hosted LiveSync. We can use `https://b
## Automatic setup using Colaboratory
We can perform all these steps by using [this Colaboratory notebook](https://gist.github.com/vrtmrz/b437a539af25ef191bd452aae369242f) without installing anything.
We can perform all these steps by using [this Colaboratory notebook](/deploy_couchdb_to_flyio_v2_with_swap.ipynb) without installing anything.
## After testing / before creating a new instance

View File

@@ -1,8 +1,19 @@
# 在你自己的服务器上设置 CouchDB
## 目录
- [配置 CouchDB](#配置-CouchDB)
- [运行 CouchDB](#运行-CouchDB)
- [Docker CLI](#docker-cli)
- [Docker Compose](#docker-compose)
- [创建数据库](#创建数据库)
- [从移动设备访问](#从移动设备访问)
- [移动设备测试](#移动设备测试)
- [设置你的域名](#设置你的域名)
---
> 注:提供了 [docker-compose.yml 和 ini 文件](https://github.com/vrtmrz/self-hosted-livesync-server) 可以同时启动 Caddy 和 CouchDB。推荐直接使用该 docker-compose 配置进行搭建。(若使用,请查阅链接中的文档,而不是这个文档)
## 安装 CouchDB 并从 PC 或 Mac 上访问
## 配置 CouchDB
设置 CouchDB 的最简单方法是使用 [CouchDB docker image](https://hub.docker.com/_/couchdb).
@@ -33,17 +44,62 @@ methods = GET, PUT, POST, HEAD, DELETE
max_age = 3600
```
创建 `local.ini` 并用如下指令启动 CouchDB
```
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v .local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
Note: 此时 local.ini 的文件所有者会变成 5984:5984。这是 docker 镜像的限制,请修改文件所有者后再编辑 local.ini。
## 运行 CouchDB
在确定 Self-hosted LiveSync 可以和服务器同步后,可以后台启动 docker 镜像:
### Docker CLI
你可以通过指定 `local.ini` 配置运行 CouchDB:
```
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v .local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
*记得将上述命令中的 local.ini 挂载路径替换成实际的存放路径*
后台运行:
```
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
*记得将上述命令中的 local.ini 挂载路径替换成实际的存放路径*
### Docker Compose
创建一个文件夹, 将你的 `local.ini` 放在文件夹内, 然后在文件夹内创建 `docker-compose.yml`. 请确保对 `local.ini` 有读写权限并且确保在容器运行后能创建 `data` 文件夹. 文件夹结构大概如下:
```
obsidian-livesync
├── docker-compose.yml
└── local.ini
```
可以参照以下内容编辑 `docker-compose.yml`:
```yaml
version: "2.1"
services:
couchdb:
image: couchdb
container_name: obsidian-livesync
user: 1000:1000
environment:
- COUCHDB_USER=admin
- COUCHDB_PASSWORD=password
volumes:
- ./data:/opt/couchdb/data
- ./local.ini:/opt/couchdb/etc/local.ini
ports:
- 5984:5984
restart: unless-stopped
```
最后, 创建并启动容器:
```
# -d will launch detached so the container runs in background
docker compose up -d
```
## 创建数据库
CouchDB 部署成功后, 需要手动创建一个数据库, 方便插件连接并同步.
1. 访问 `http://localhost:5984/_utils`, 输入帐号密码后进入管理页面
2. 点击 Create Database, 然后根据个人喜好创建数据库
## 从移动设备访问
如果你想要从移动设备访问 Self-hosted LiveSync，你需要一个合法的 SSL 证书。

View File

@@ -1,46 +1,165 @@
//@ts-check
import esbuild from "esbuild";
import process from "process";
import builtins from "builtin-modules";
import sveltePlugin from "esbuild-svelte";
import sveltePreprocess from "svelte-preprocess";
import fs from "node:fs";
// import terser from "terser";
import { minify } from "terser";
const banner = `/*
THIS IS A GENERATED/BUNDLED FILE BY ESBUILD
THIS IS A GENERATED/BUNDLED FILE BY ESBUILD AND TERSER
if you want to view the source, please visit the github repository of this plugin
*/
`;
const prod = process.argv[2] === "production";
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json"));
const packageJson = JSON.parse(fs.readFileSync("./package.json"));
const terserOpt = {
sourceMap: (!prod ? {
url: "inline"
} : {}),
format: {
indent_level: 2,
beautify: true,
comments: "some",
ecma: 2018,
preamble: banner,
webkit: true
},
parse: {
// parse options
},
compress: {
// compress options
defaults: false,
evaluate: true,
inline: 3,
join_vars: true,
loops: true,
passes: prod ? 4 : 1,
reduce_vars: true,
reduce_funcs: true,
arrows: true,
collapse_vars: true,
comparisons: true,
lhs_constants: true,
hoist_props: true,
side_effects: true,
if_return: true,
ecma: 2018,
unused: true,
},
// mangle: {
// // mangle options
// keep_classnames: true,
// keep_fnames: true,
// properties: {
// // mangle property options
// }
// },
ecma: 2018, // specify one of: 5, 2015, 2016, etc.
enclose: false, // or specify true, or "args:values"
keep_classnames: true,
keep_fnames: true,
ie8: false,
module: false,
// nameCache: null, // or specify a name cache object
safari10: false,
toplevel: false
}
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json") + "");
const packageJson = JSON.parse(fs.readFileSync("./package.json") + "");
const updateInfo = JSON.stringify(fs.readFileSync("./updates.md") + "");
esbuild
.build({
banner: {
js: banner,
},
entryPoints: ["src/main.ts"],
bundle: true,
define: {
"MANIFEST_VERSION": `"${manifestJson.version}"`,
"PACKAGE_VERSION": `"${packageJson.version}"`,
"UPDATE_INFO": `${updateInfo}`,
"global":"window",
},
external: ["obsidian", "electron", "crypto"],
format: "cjs",
watch: !prod,
target: "es2018",
logLevel: "info",
sourcemap: prod ? false : "inline",
treeShaking: true,
platform: "browser",
plugins: [
sveltePlugin({
preprocess: sveltePreprocess(),
compilerOptions: { css: true },
}),
],
outfile: "main.js",
})
.catch(() => process.exit(1));
/** @type esbuild.Plugin[] */
const plugins = [{
name: 'my-plugin',
setup(build) {
let count = 0;
build.onEnd(async result => {
if (count++ === 0) {
console.log('first build:', result);
} else {
console.log('subsequent build:');
}
if (prod) {
console.log("Performing terser");
const src = fs.readFileSync("./main_org.js").toString();
// @ts-ignore
const ret = await minify(src, terserOpt);
if (ret && ret.code) {
fs.writeFileSync("./main.js", ret.code);
}
console.log("Finished terser");
} else {
fs.copyFileSync("./main_org.js", "./main.js");
}
});
},
}];
const context = await esbuild.context({
banner: {
js: banner,
},
entryPoints: ["src/main.ts"],
bundle: true,
define: {
"MANIFEST_VERSION": `"${manifestJson.version}"`,
"PACKAGE_VERSION": `"${packageJson.version}"`,
"UPDATE_INFO": `${updateInfo}`,
"global": "window",
},
external: [
"obsidian",
"electron",
"crypto",
"@codemirror/autocomplete",
"@codemirror/collab",
"@codemirror/commands",
"@codemirror/language",
"@codemirror/lint",
"@codemirror/search",
"@codemirror/state",
"@codemirror/view",
"@lezer/common",
"@lezer/highlight",
"@lezer/lr"],
// minifyWhitespace: true,
format: "cjs",
target: "es2018",
logLevel: "info",
platform: "browser",
sourcemap: prod ? false : "inline",
treeShaking: true,
outfile: "main_org.js",
minifyWhitespace: false,
minifySyntax: false,
minifyIdentifiers: false,
minify: false,
// keepNames: true,
plugins: [
sveltePlugin({
preprocess: sveltePreprocess(),
compilerOptions: { css: true, preserveComments: true },
}),
...plugins
],
})
if (prod) {
await context.rebuild();
process.exit(0);
} else {
await context.watch();
}

View File

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.19.14",
"version": "0.22.1",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

3338
package-lock.json generated

File diff suppressed because it is too large

View File

@@ -1,6 +1,6 @@
{
"name": "obsidian-livesync",
"version": "0.19.14",
"version": "0.22.1",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"type": "module",
@@ -13,41 +13,51 @@
"author": "vorotamoroz",
"license": "MIT",
"devDependencies": {
"@tsconfig/svelte": "^4.0.1",
"@tsconfig/svelte": "^5.0.0",
"@types/diff-match-patch": "^1.0.32",
"@types/node": "^20.2.5",
"@types/pouchdb": "^6.4.0",
"@types/pouchdb-adapter-http": "^6.1.3",
"@types/pouchdb-adapter-idb": "^6.1.4",
"@types/pouchdb-browser": "^6.1.3",
"@typescript-eslint/eslint-plugin": "^5.54.0",
"@typescript-eslint/parser": "^5.54.0",
"@types/pouchdb-core": "^7.0.11",
"@types/pouchdb-mapreduce": "^6.1.7",
"@types/pouchdb-replication": "^6.4.4",
"@types/transform-pouch": "^1.0.2",
"@typescript-eslint/eslint-plugin": "^6.2.1",
"@typescript-eslint/parser": "^6.2.1",
"builtin-modules": "^3.3.0",
"esbuild": "0.15.15",
"esbuild-svelte": "^0.7.3",
"eslint": "^8.35.0",
"esbuild": "0.18.17",
"esbuild-svelte": "^0.7.4",
"eslint": "^8.46.0",
"eslint-config-airbnb-base": "^15.0.0",
"eslint-plugin-import": "^2.27.5",
"eslint-plugin-import": "^2.28.0",
"events": "^3.3.0",
"obsidian": "^1.1.1",
"postcss": "^8.4.21",
"obsidian": "^1.4.11",
"postcss": "^8.4.27",
"postcss-load-config": "^4.0.1",
"pouchdb-adapter-http": "^8.0.1",
"pouchdb-adapter-idb": "^8.0.1",
"pouchdb-adapter-indexeddb": "^8.0.1",
"pouchdb-core": "^8.0.1",
"pouchdb-errors": "^8.0.1",
"pouchdb-find": "^8.0.1",
"pouchdb-mapreduce": "^8.0.1",
"pouchdb-merge": "^8.0.1",
"pouchdb-replication": "^8.0.1",
"pouchdb-utils": "^8.0.1",
"svelte": "^3.59.1",
"svelte-preprocess": "^5.0.3",
"svelte": "^4.1.2",
"svelte-preprocess": "^5.0.4",
"terser": "^5.19.2",
"transform-pouch": "^2.0.0",
"tslib": "^2.5.0",
"typescript": "^5.0.4"
"tslib": "^2.6.1",
"typescript": "^5.1.6"
},
"dependencies": {
"diff-match-patch": "^1.0.5",
"idb": "^7.1.1",
"xxhash-wasm": "^0.4.2",
"minimatch": "^9.0.3",
"xxhash-wasm": "0.4.2",
"xxhash-wasm-102": "npm:xxhash-wasm@^1.0.2"
}
}

View File

@@ -1,27 +1,126 @@
import { writable } from 'svelte/store';
import { Notice, type PluginManifest, parseYaml } from "./deps";
import { Notice, type PluginManifest, parseYaml, normalizePath } from "./deps";
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry } from "./lib/src/types";
import { LOG_LEVEL } from "./lib/src/types";
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "./lib/src/types";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "./lib/src/types";
import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
import { delay, getDocData } from "./lib/src/utils";
import { createTextBlob, delay, getDocData } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { WrappedNotice } from "./lib/src/wrapper";
import { base64ToArrayBuffer, arrayBufferToBase64, readString, crc32CKHash } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { readString, decodeBinary, arrayBufferToBase64, sha1 } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { stripAllPrefixes } from "./lib/src/path";
import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { JsonResolveModal } from "./JsonResolveModal";
import { pipeGeneratorToGenerator, processAllGeneratorTasksWithConcurrencyLimit } from './lib/src/task';
import { QueueProcessor } from './lib/src/processor';
import { pluginScanningCount } from './lib/src/stores';
import type ObsidianLiveSyncPlugin from './main';
const d = "\u200b";
const d2 = "\n";
function serialize<T>(obj: T): string {
return JSON.stringify(obj, null, 1);
function serialize(data: PluginDataEx): string {
// For higher performance, create custom plug-in data strings.
// Self-hosted LiveSync uses `\n` to split chunks. Therefore, grouping together those with similar entropy would work nicely.
let ret = "";
ret += ":";
ret += data.category + d + data.name + d + data.term + d2;
ret += (data.version ?? "") + d2;
ret += data.mtime + d2;
for (const file of data.files) {
ret += file.filename + d + (file.displayName ?? "") + d + (file.version ?? "") + d2;
ret += file.mtime + d + file.size + d2;
for (const data of file.data ?? []) {
ret += data + d
}
ret += d2;
}
return ret;
}
function fetchToken(source: string, from: number): [next: number, token: string] {
const limitIdx = source.indexOf(d2, from);
const limit = limitIdx == -1 ? source.length : limitIdx;
const delimiterIdx = source.indexOf(d, from);
const delimiter = delimiterIdx == -1 ? source.length : delimiterIdx;
const tokenEnd = Math.min(limit, delimiter);
let next = tokenEnd;
if (limit < delimiter) {
next = tokenEnd;
} else {
next = tokenEnd + 1
}
return [next, source.substring(from, tokenEnd)];
}
function getTokenizer(source: string) {
const t = {
pos: 1,
next() {
const [next, token] = fetchToken(source, this.pos);
this.pos = next;
return token;
},
nextLine() {
const nextPos = source.indexOf(d2, this.pos);
if (nextPos == -1) {
this.pos = source.length;
} else {
this.pos = nextPos + 1;
}
}
}
return t;
}
function deserialize2(str: string): PluginDataEx {
const tokens = getTokenizer(str);
const ret = {} as PluginDataEx;
const category = tokens.next();
const name = tokens.next();
const term = tokens.next();
tokens.nextLine();
const version = tokens.next();
tokens.nextLine();
const mtime = Number(tokens.next());
tokens.nextLine();
const result: PluginDataEx = Object.assign(ret,
{ category, name, term, version, mtime, files: [] as PluginDataExFile[] })
let filename = "";
do {
filename = tokens.next();
if (!filename) break;
const displayName = tokens.next();
const version = tokens.next();
tokens.nextLine();
const mtime = Number(tokens.next());
const size = Number(tokens.next());
tokens.nextLine();
const data = [] as string[];
let piece = "";
do {
piece = tokens.next();
if (piece == "") break;
data.push(piece);
} while (piece != "");
result.files.push(
{
filename,
displayName,
version,
mtime,
size,
data
}
)
tokens.nextLine();
} while (filename);
return result;
}
function deserialize<T>(str: string, def: T) {
try {
if (str[0] == ":") return deserialize2(str);
return JSON.parse(str) as T;
} catch (ex) {
try {
@@ -65,6 +164,16 @@ export type PluginDataEx = {
mtime: number,
};
export class ConfigSync extends LiveSyncCommands {
constructor(plugin: ObsidianLiveSyncPlugin) {
super(plugin);
pluginScanningCount.onChanged((e) => {
const total = e.value;
pluginIsEnumerating.set(total != 0);
if (total == 0) {
Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
}
})
}
confirmPopup: WrappedNotice = null;
get kvDB() {
return this.plugin.kvDB;
@@ -107,6 +216,7 @@ export class ConfigSync extends LiveSyncCommands {
},
});
}
getFileCategory(filePath: string): "CONFIG" | "THEME" | "SNIPPET" | "PLUGIN_MAIN" | "PLUGIN_ETC" | "PLUGIN_DATA" | "" {
if (filePath.split("/").length == 2 && filePath.endsWith(".json")) return "CONFIG";
if (filePath.split("/").length == 4 && filePath.startsWith(`${this.app.vault.configDir}/themes/`)) return "THEME";
@@ -139,7 +249,7 @@ export class ConfigSync extends LiveSyncCommands {
Logger("Scanning customizations : done");
} catch (ex) {
Logger("Scanning customizations : failed");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
@@ -164,76 +274,105 @@ export class ConfigSync extends LiveSyncCommands {
pluginList.set(this.pluginList)
await this.updatePluginList(showMessage);
}
async loadPluginData(path: FilePathWithPrefix): Promise<PluginDataExDisplay | false> {
const wx = await this.localDatabase.getDBEntry(path, null, false, false);
if (wx) {
const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
const xFiles = [] as PluginDataExFile[];
for (const file of data.files) {
const work = { ...file };
const tempStr = getDocData(work.data);
work.data = [await sha1(tempStr)];
xFiles.push(work);
}
return ({
...data,
documentPath: this.getPath(wx),
files: xFiles
}) as PluginDataExDisplay;
}
return false;
}
createMissingConfigurationEntry() {
let saveRequired = false;
for (const v of this.pluginList) {
const key = `${v.category}/${v.name}`;
if (!(key in this.plugin.settings.pluginSyncExtendedSetting)) {
this.plugin.settings.pluginSyncExtendedSetting[key] = {
key,
mode: MODE_SELECTIVE,
files: []
}
}
if (this.plugin.settings.pluginSyncExtendedSetting[key].files.sort().join(",").toLowerCase() !=
v.files.map(e => e.filename).sort().join(",").toLowerCase()) {
this.plugin.settings.pluginSyncExtendedSetting[key].files = v.files.map(e => e.filename).sort();
saveRequired = true;
}
}
if (saveRequired) {
this.plugin.saveSettingData();
}
}
pluginScanProcessor = new QueueProcessor(async (v: AnyEntry[]) => {
const plugin = v[0];
const path = plugin.path || this.getPath(plugin);
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) return;
try {
const pluginData = await this.loadPluginData(path);
if (pluginData) {
return [pluginData];
}
// Failed to load
return;
} catch (ex) {
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
return;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 300, yieldThreshold: 10 }).pipeTo(
new QueueProcessor(
(pluginDataList) => {
let newList = [...this.pluginList];
for (const item of pluginDataList) {
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
}
this.pluginList = newList;
pluginList.set(newList);
return;
}
, { suspended: true, batchSize: 1000, concurrentLimit: 10, delay: 200, yieldThreshold: 25, totalRemainingReactiveSource: pluginScanningCount })).startPipeline().root.onIdle(() => {
Logger(`All files enumerated`, LOG_LEVEL_INFO, "get-plugins");
this.createMissingConfigurationEntry();
});
async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
// pluginList.set([]);
if (!this.settings.usePluginSync) {
this.pluginScanProcessor.clearQueue();
this.pluginList = [];
pluginList.set(this.pluginList)
return;
}
scheduleTask("update-plugin-list-task", 200, async () => {
await runWithLock("update-plugin-list", false, async () => {
try {
const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
const plugins = updatedDocumentPath ?
this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
let count = 0;
pluginIsEnumerating.set(true);
for await (const v of processAllGeneratorTasksWithConcurrencyLimit(20, pipeGeneratorToGenerator(plugins, async plugin => {
const path = plugin.path || this.getPath(plugin);
if (updatedDocumentPath && updatedDocumentPath != path) {
return false;
}
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) return false;
try {
count++;
if (count % 10 == 0) Logger(`Enumerating files... ${count}`, logLevel, "get-plugins");
Logger(`plugin-${path}`, LOG_LEVEL.VERBOSE);
const wx = await this.localDatabase.getDBEntry(path, null, false, false);
if (wx) {
const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
const xFiles = [] as PluginDataExFile[];
for (const file of data.files) {
const work = { ...file };
const tempStr = getDocData(work.data);
work.data = [crc32CKHash(tempStr)];
xFiles.push(work);
}
return ({
...data,
documentPath: this.getPath(wx),
files: xFiles
});
}
// return entries;
} catch (ex) {
//TODO
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL.NOTICE);
console.warn(ex);
}
return false;
}))) {
if ("ok" in v) {
if (v.ok != false) {
let newList = [...this.pluginList];
const item = v.ok;
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
if (updatedDocumentPath != "") newList = newList.filter(e => e.documentPath != updatedDocumentPath);
this.pluginList = newList;
pluginList.set(newList);
}
}
}
Logger(`All files enumerated`, logLevel, "get-plugins");
} finally {
pluginIsEnumerating.set(false);
}
});
});
try {
const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
const plugins = updatedDocumentPath ?
this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
for await (const v of plugins) {
const path = v.path || this.getPath(v);
if (updatedDocumentPath && updatedDocumentPath != path) continue;
this.pluginScanProcessor.enqueue(v);
}
} finally {
pluginIsEnumerating.set(false);
}
pluginIsEnumerating.set(false);
// return entries;
}
async compareUsingDisplayData(dataA: PluginDataExDisplay, dataB: PluginDataExDisplay) {
@@ -256,8 +395,8 @@ export class ConfigSync extends LiveSyncCommands {
const fileA = { ...pluginDataA.files[0], ctime: pluginDataA.files[0].mtime, _id: `${pluginDataA.documentPath}` as DocumentID };
const fileB = pluginDataB.files[0];
const docAx = { ...docA, ...fileA } as LoadedEntry, docBx = { ...docB, ...fileB } as LoadedEntry
return runWithLock("config:merge-data", false, () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
return serialized("config:merge-data", () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
// const docs = [docA, docB];
const path = stripAllPrefixes(docAx.path.split("/").slice(-1).join("/") as FilePath);
const modal = new JsonResolveModal(this.app, path, [docAx, docBx], async (keep, result) => {
@@ -266,7 +405,7 @@ export class ConfigSync extends LiveSyncCommands {
res(await this.applyData(pluginDataA, result));
} catch (ex) {
Logger("Could not apply merged file");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
res(false);
}
}, "📡", "🛰️", "B");
@@ -290,16 +429,16 @@ export class ConfigSync extends LiveSyncCommands {
const path = `${baseDir}/${f.filename}`;
await this.ensureDirectoryEx(path);
if (!content) {
const dt = base64ToArrayBuffer(f.data);
await this.app.vault.adapter.writeBinary(path, dt);
const dt = decodeBinary(f.data);
await this.vaultAccess.adapterWrite(path, dt);
} else {
await this.app.vault.adapter.write(path, content);
await this.vaultAccess.adapterWrite(path, content);
}
Logger(`Applying ${f.filename} of ${data.displayName || data.name}.. Done`);
} catch (ex) {
Logger(`Applying ${f.filename} of ${data.displayName || data.name}.. Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
@@ -307,7 +446,7 @@ export class ConfigSync extends LiveSyncCommands {
await this.storeCustomizationFiles(uPath);
await this.updatePluginList(true, uPath);
await delay(100);
Logger(`Config ${data.displayName || data.name} has been applied`, LOG_LEVEL.NOTICE);
Logger(`Config ${data.displayName || data.name} has been applied`, LOG_LEVEL_NOTICE);
if (data.category == "PLUGIN_DATA" || data.category == "PLUGIN_MAIN") {
//@ts-ignore
const manifests = Object.values(this.app.plugins.manifests) as any as PluginManifest[];
@@ -315,12 +454,12 @@ export class ConfigSync extends LiveSyncCommands {
const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
const pluginManifest = manifests.find((manifest) => enabledPlugins.has(manifest.id) && manifest.dir == `${baseDir}/plugins/${data.name}`);
if (pluginManifest) {
Logger(`Unloading plugin: ${pluginManifest.name}`, LOG_LEVEL.NOTICE, "plugin-reload-" + pluginManifest.id);
Logger(`Unloading plugin: ${pluginManifest.name}`, LOG_LEVEL_NOTICE, "plugin-reload-" + pluginManifest.id);
// @ts-ignore
await this.app.plugins.unloadPlugin(pluginManifest.id);
// @ts-ignore
await this.app.plugins.loadPlugin(pluginManifest.id);
Logger(`Plugin reloaded: ${pluginManifest.name}`, LOG_LEVEL.NOTICE, "plugin-reload-" + pluginManifest.id);
Logger(`Plugin reloaded: ${pluginManifest.name}`, LOG_LEVEL_NOTICE, "plugin-reload-" + pluginManifest.id);
}
} else if (data.category == "CONFIG") {
scheduleTask("configReload", 250, async () => {
@@ -333,7 +472,7 @@ export class ConfigSync extends LiveSyncCommands {
return true;
} catch (ex) {
Logger(`Applying ${data.displayName || data.name}.. Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
}
@@ -342,11 +481,11 @@ export class ConfigSync extends LiveSyncCommands {
if (data.documentPath) {
await this.deleteConfigOnDatabase(data.documentPath);
await this.updatePluginList(false, data.documentPath);
Logger(`Delete: ${data.documentPath}`, LOG_LEVEL.NOTICE);
Logger(`Delete: ${data.documentPath}`, LOG_LEVEL_NOTICE);
}
return true;
} catch (ex) {
Logger(`Failed to delete: ${data.documentPath}`, LOG_LEVEL.NOTICE);
Logger(`Failed to delete: ${data.documentPath}`, LOG_LEVEL_NOTICE);
return false;
}
@@ -410,15 +549,16 @@ export class ConfigSync extends LiveSyncCommands {
this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
return;
}
recentProcessedInternalFiles = [] as string[];
async makeEntryFromFile(path: FilePath): Promise<false | PluginDataExFile> {
const stat = await this.app.vault.adapter.stat(path);
const stat = await this.vaultAccess.adapterStat(path);
let version: string | undefined;
let displayName: string | undefined;
if (!stat) {
return false;
}
const contentBin = await this.app.vault.adapter.readBinary(path);
const contentBin = await this.vaultAccess.adapterReadBinary(path);
let content: string[];
try {
content = await arrayBufferToBase64(contentBin);
@@ -433,12 +573,12 @@ export class ConfigSync extends LiveSyncCommands {
displayName = `${json.name}`;
}
} catch (ex) {
Logger(`Configuration sync data: ${path} looks like manifest, but could not read the version`, LOG_LEVEL.INFO);
Logger(`Configuration sync data: ${path} looks like manifest, but could not read the version`, LOG_LEVEL_INFO);
}
}
} catch (ex) {
Logger(`The file ${path} could not be encoded`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
const mtime = stat.mtime;
@@ -465,11 +605,11 @@ export class ConfigSync extends LiveSyncCommands {
async storeCustomizationFiles(path: FilePath, termOverRide?: string) {
const term = termOverRide || this.plugin.deviceAndVaultName;
if (term == "") {
Logger("We have to configure the device name", LOG_LEVEL.NOTICE);
Logger("We have to configure the device name", LOG_LEVEL_NOTICE);
return;
}
const vf = this.filenameToUnifiedKey(path, term);
return await runWithLock(`plugin-${vf}`, false, async () => {
return await serialized(`plugin-${vf}`, async () => {
const category = this.getFileCategory(path);
let mtime = 0;
let fileTargets = [] as FilePath[];
@@ -501,7 +641,7 @@ export class ConfigSync extends LiveSyncCommands {
for (const target of fileTargets) {
const data = await this.makeEntryFromFile(target);
if (data == false) {
// Logger(`Config: skipped: ${target} `, LOG_LEVEL.VERBOSE);
// Logger(`Config: skipped: ${target} `, LOG_LEVEL_VERBOSE);
continue;
}
if (data.version) {
@@ -524,10 +664,10 @@ export class ConfigSync extends LiveSyncCommands {
return
}
const content = serialize(dt);
const content = createTextBlob(serialize(dt));
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false);
let saveData: LoadedEntry;
let saveData: SavingEntry;
if (old === false) {
saveData = {
_id: id,
@@ -536,14 +676,14 @@ export class ConfigSync extends LiveSyncCommands {
mtime,
ctime: mtime,
datatype: "newnote",
size: content.length,
size: content.size,
children: [],
deleted: false,
type: "newnote",
};
} else {
if (old.mtime == mtime) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
saveData =
@@ -551,7 +691,7 @@ export class ConfigSync extends LiveSyncCommands {
...old,
data: content,
mtime,
size: content.length,
size: content.size,
datatype: "newnote",
children: [],
deleted: false,
@@ -564,7 +704,7 @@ export class ConfigSync extends LiveSyncCommands {
return ret;
} catch (ex) {
Logger(`STORAGE --> DB:${prefixedFileName}: (config) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
})
@@ -573,10 +713,17 @@ export class ConfigSync extends LiveSyncCommands {
async watchVaultRawEventsAsync(path: FilePath) {
if (!this.settings.usePluginSync) return false;
if (!this.isTargetPath(path)) return false;
const stat = await this.app.vault.adapter.stat(path);
const stat = await this.vaultAccess.adapterStat(path);
// Make sure that target is a file.
if (stat && stat.type != "file")
return false;
const configDir = normalizePath(this.app.vault.configDir);
const synchronisedInConfigSync = Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode != MODE_SELECTIVE).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
if (synchronisedInConfigSync.some(e => e.startsWith(path.toLowerCase()))) {
Logger(`Customization file skipped: ${path}`, LOG_LEVEL_VERBOSE);
return;
}
const storageMTime = ~~((stat && stat.mtime || 0) / 1000);
const key = `${path}-${storageMTime}`;
if (this.recentProcessedInternalFiles.contains(key)) {
@@ -591,11 +738,11 @@ export class ConfigSync extends LiveSyncCommands {
async scanAllConfigFiles(showMessage: boolean) {
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
Logger("Scanning customizing files.", logLevel, "scan-all-config");
const term = this.plugin.deviceAndVaultName;
if (term == "") {
Logger("We have to configure the device name", LOG_LEVEL.NOTICE);
Logger("We have to configure the device name", LOG_LEVEL_NOTICE);
return;
}
const filesAll = await this.scanInternalFiles();
@@ -617,7 +764,7 @@ export class ConfigSync extends LiveSyncCommands {
// const id = await this.path2id(prefixedFileName);
const mtime = new Date().getTime();
await runWithLock("file-x-" + prefixedFileName, false, async () => {
await serialized("file-x-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false) as InternalFileEntry | false;
let saveData: InternalFileEntry;
@@ -643,7 +790,7 @@ export class ConfigSync extends LiveSyncCommands {
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) Done`);
} catch (ex) {
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});

View File

@@ -1,16 +1,18 @@
import { Notice, normalizePath, type PluginManifest } from "./deps";
import { type EntryDoc, type LoadedEntry, LOG_LEVEL, type InternalFileEntry, type FilePathWithPrefix, type FilePath } from "./lib/src/types";
import { normalizePath, type PluginManifest } from "./deps";
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry } from "./lib/src/types";
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
import { Parallels, delay, isDocContentSame } from "./lib/src/utils";
import { createBinaryBlob, isDocContentSame, sendSignal } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask, isInternalMetadata, PeriodicProcessor } from "./utils";
import { isInternalMetadata, PeriodicProcessor } from "./utils";
import { WrappedNotice } from "./lib/src/wrapper";
import { base64ToArrayBuffer, arrayBufferToBase64 } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { decodeBinary, encodeBinary } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { JsonResolveModal } from "./JsonResolveModal";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { addPrefix, stripAllPrefixes } from "./lib/src/path";
import { KeyedQueueProcessor, QueueProcessor } from "./lib/src/processor";
import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "./lib/src/stores";
export class HiddenFileSync extends LiveSyncCommands {
periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
@@ -44,7 +46,7 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger("Synchronizing hidden files done");
} catch (ex) {
Logger("Synchronizing hidden files failed");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
}
@@ -75,27 +77,30 @@ export class HiddenFileSync extends LiveSyncCommands {
return;
}
procInternalFiles: string[] = [];
async execInternalFile() {
await runWithLock("execInternal", false, async () => {
const w = [...this.procInternalFiles];
this.procInternalFiles = [];
Logger(`Applying hidden ${w.length} files change...`);
await this.syncInternalFilesAndDatabase("pull", false, false, w);
Logger(`Applying hidden ${w.length} files changed`);
});
}
procInternalFile(filename: string) {
this.procInternalFiles.push(filename);
scheduleTask("procInternal", 500, async () => {
await this.execInternalFile();
});
this.internalFileProcessor.enqueueWithKey(filename, filename);
}
internalFileProcessor = new KeyedQueueProcessor<string, any>(
async (filenames) => {
Logger(`START: Applying ${filenames.length} hidden file changes`, LOG_LEVEL_VERBOSE);
await this.syncInternalFilesAndDatabase("pull", false, false, filenames);
Logger(`DONE: Applied ${filenames.length} hidden file changes`, LOG_LEVEL_VERBOSE);
return;
}, { batchSize: 100, concurrentLimit: 1, delay: 10, yieldThreshold: 10, suspended: false, totalRemainingReactiveSource: hiddenFilesEventCount }
);
recentProcessedInternalFiles = [] as string[];
async watchVaultRawEventsAsync(path: FilePath) {
if (!this.settings.syncInternalFiles) return;
const stat = await this.app.vault.adapter.stat(path);
// Exclude files handled by customization sync
const configDir = normalizePath(this.app.vault.configDir);
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
if (synchronisedInConfigSync.some(e => e.startsWith(path.toLowerCase()))) {
Logger(`Hidden file skipped: ${path} is synchronized in customization sync.`, LOG_LEVEL_VERBOSE);
return;
}
const stat = await this.vaultAccess.adapterStat(path);
// Sometimes a folder arrives here; skip anything that is not a file.
if (stat && stat.type != "file")
return;
@@ -129,25 +134,34 @@ export class HiddenFileSync extends LiveSyncCommands {
async resolveConflictOnInternalFiles() {
// Scan all conflicted internal files
const conflicted = this.localDatabase.findEntries(ICHeader, ICHeaderEnd, { conflicts: true });
for await (const doc of conflicted) {
if (!("_conflicts" in doc))
continue;
if (isInternalMetadata(doc._id)) {
await this.resolveConflictOnInternalFile(doc.path);
this.conflictResolutionProcessor.suspend();
try {
for await (const doc of conflicted) {
if (!("_conflicts" in doc))
continue;
if (isInternalMetadata(doc._id)) {
this.conflictResolutionProcessor.enqueue(doc.path);
}
}
} catch (ex) {
Logger("something went wrong on resolving all conflicted internal files");
Logger(ex, LOG_LEVEL_VERBOSE);
}
await this.conflictResolutionProcessor.startPipeline().waitForPipeline();
}
async resolveConflictOnInternalFile(path: FilePathWithPrefix): Promise<boolean> {
conflictResolutionProcessor = new QueueProcessor(async (paths: FilePathWithPrefix[]) => {
const path = paths[0];
sendSignal(`cancel-internal-conflict:${path}`);
try {
// Retrieve data
const id = await this.path2id(path, ICHeader);
const doc = await this.localDatabase.getRaw(id, { conflicts: true });
// If there is no conflict, return early.
if (!("_conflicts" in doc))
return false;
return;
if (doc._conflicts.length == 0)
return false;
return;
Logger(`Hidden file conflicted:${path}`);
const conflicts = doc._conflicts.sort((a, b) => Number(a.split("-")[0]) - Number(b.split("-")[0]));
const revA = doc._rev;
@@ -161,32 +175,23 @@ export class HiddenFileSync extends LiveSyncCommands {
const commonBase = revFrom._revs_info.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
const result = await this.plugin.mergeObject(path, commonBase, doc._rev, conflictedRev);
if (result) {
Logger(`Object merge:${path}`, LOG_LEVEL.INFO);
Logger(`Object merge:${path}`, LOG_LEVEL_INFO);
const filename = stripAllPrefixes(path);
const isExists = await this.app.vault.adapter.exists(filename);
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
if (!isExists) {
await this.ensureDirectoryEx(filename);
}
await this.app.vault.adapter.write(filename, result);
const stat = await this.app.vault.adapter.stat(filename);
await this.plugin.vaultAccess.adapterWrite(filename, result);
const stat = await this.vaultAccess.adapterStat(filename);
await this.storeInternalFileToDatabase({ path: filename, ...stat });
await this.extractInternalFileFromDatabase(filename);
await this.localDatabase.removeRaw(id, revB);
return this.resolveConflictOnInternalFile(path);
this.conflictResolutionProcessor.enqueue(path);
return;
} else {
Logger(`Object merge is not applicable.`, LOG_LEVEL.VERBOSE);
}
const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
const docBMerge = await this.localDatabase.getDBEntry(path, { rev: revB });
if (docAMerge != false && docBMerge != false) {
if (await this.showJSONMergeDialogAndMerge(docAMerge, docBMerge)) {
await delay(200);
// Again for other conflicted revisions.
return this.resolveConflictOnInternalFile(path);
}
return false;
Logger(`Object merge is not applicable.`, LOG_LEVEL_VERBOSE);
}
return [{ path, revA, revB }];
}
const revBDoc = await this.localDatabase.getRaw(id, { rev: revB });
// Determine which revision should be deleted.
@@ -200,27 +205,52 @@ export class HiddenFileSync extends LiveSyncCommands {
await this.localDatabase.removeRaw(id, delRev);
Logger(`Older one has been deleted:${path}`);
// check the file again
return this.resolveConflictOnInternalFile(path);
this.conflictResolutionProcessor.enqueue(path);
return;
} catch (ex) {
Logger(`Failed to resolve conflict (Hidden): ${path}`);
Logger(ex, LOG_LEVEL.VERBOSE);
return false;
Logger(ex, LOG_LEVEL_VERBOSE);
return;
}
}, {
suspended: false, batchSize: 1, concurrentLimit: 5, delay: 10, keepResultUntilDownstreamConnected: true, yieldThreshold: 10,
pipeTo: new QueueProcessor(async (results) => {
const { path, revA, revB } = results[0]
const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
const docBMerge = await this.localDatabase.getDBEntry(path, { rev: revB });
if (docAMerge != false && docBMerge != false) {
if (await this.showJSONMergeDialogAndMerge(docAMerge, docBMerge)) {
// Again for other conflicted revisions.
this.conflictResolutionProcessor.enqueue(path);
}
return;
}
}, { suspended: false, batchSize: 1, concurrentLimit: 1, delay: 10, keepResultUntilDownstreamConnected: false, yieldThreshold: 10 })
})
queueConflictCheck(path: FilePathWithPrefix) {
this.conflictResolutionProcessor.enqueue(path);
}
// TODO: Tidy up. Even though this is an experimental feature, it is still messy.
async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, files: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, filesAll: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
await this.resolveConflictOnInternalFiles();
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
Logger("Scanning hidden files.", logLevel, "sync_internal");
const ignorePatterns = this.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
if (!files)
files = await this.scanInternalFiles();
const configDir = normalizePath(this.app.vault.configDir);
let files: InternalFileInfo[] =
filesAll ? filesAll : (await this.scanInternalFiles())
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
files = files.filter(file => synchronisedInConfigSync.every(filterFile => !file.path.toLowerCase().startsWith(filterFile)))
const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICHeader, endkey: ICHeaderEnd, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
const allFileNamesSrc = [...new Set([...files.map(e => normalizePath(e.path)), ...filesOnDB.map(e => stripAllPrefixes(this.getPath(e)))])];
const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1));
const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1)).filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile)))
function compareMTime(a: number, b: number) {
const wa = ~~(a / 1000);
const wb = ~~(b / 1000);
@@ -264,25 +294,38 @@ export class HiddenFileSync extends LiveSyncCommands {
acc[stripAllPrefixes(this.getPath(cur))] = cur;
return acc;
}, {} as { [key: string]: InternalFileEntry; });
const para = Parallels();
for (const filename of allFileNames) {
await new QueueProcessor(async (filenames: FilePath[]) => {
const filename = filenames[0];
processed++;
if (processed % 100 == 0) {
Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
}
if (!filename) continue;
if (!filename) return;
if (ignorePatterns.some(e => filename.match(e)))
continue;
return;
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return;
}
const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
const fileOnDatabase = filename in filesOnDBMap ? filesOnDBMap[filename] : undefined;
const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
await para.wait(5);
const proc = (async (xFileOnStorage: InternalFileInfo, xFileOnDatabase: InternalFileEntry) => {
return [{
filename,
fileOnStorage,
fileOnDatabase,
}]
}, { suspended: true, batchSize: 1, concurrentLimit: 10, delay: 0, totalRemainingReactiveSource: hiddenFilesProcessingCount })
.pipeTo(new QueueProcessor(async (params) => {
const
{
filename,
fileOnStorage: xFileOnStorage,
fileOnDatabase: xFileOnDatabase
} = params[0];
if (xFileOnStorage && xFileOnDatabase) {
const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
// Both => Synchronize
if ((direction != "pullForce" && direction != "pushForce") && xFileOnDatabase.mtime == cache.docMtime && xFileOnStorage.mtime == cache.storageMtime) {
return;
@@ -323,16 +366,16 @@ export class HiddenFileSync extends LiveSyncCommands {
throw new Error("Invalid state on hidden file sync");
// Something corrupted?
}
return;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 0 }))
.root
.enqueueAll(allFileNames)
.startPipeline().waitForPipeline();
});
para.add(proc(fileOnStorage, fileOnDatabase))
}
await para.all();
await this.kvDB.set("diff-caches-internal", caches);
// When files have been retrieved from the database, they must be reloaded.
if ((direction == "pull" || direction == "pullForce") && filesChanged != 0) {
const configDir = normalizePath(this.app.vault.configDir);
// Show notification to restart obsidian when something has been changed in configDir.
if (configDir in updatedFolders) {
// Number of updated files under configDir.
@@ -349,78 +392,33 @@ export class HiddenFileSync extends LiveSyncCommands {
updatedCount -= updatedFolders[manifest.dir];
const updatePluginId = manifest.id;
const updatePluginName = manifest.name;
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
a.appendText(`Files in ${updatePluginName} has been updated, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", async () => {
Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
// @ts-ignore
await this.app.plugins.unloadPlugin(updatePluginId);
// @ts-ignore
await this.app.plugins.loadPlugin(updatePluginId);
Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
});
}));
a.appendText(` to reload ${updatePluginName}, or press elsewhere to dismiss this message.`);
this.plugin.askInPopup(`updated-${updatePluginId}`, `Files in ${updatePluginName} have been updated. Press {HERE} to reload ${updatePluginName}, or press elsewhere to dismiss this message.`, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", async () => {
Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL_NOTICE, "plugin-reload-" + updatePluginId);
// @ts-ignore
await this.app.plugins.unloadPlugin(updatePluginId);
// @ts-ignore
await this.app.plugins.loadPlugin(updatePluginId);
Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL_NOTICE, "plugin-reload-" + updatePluginId);
});
});
const updatedPluginKey = "popupUpdated-" + updatePluginId;
scheduleTask(updatedPluginKey, 1000, async () => {
const popup = await memoIfNotExist(updatedPluginKey, () => new Notice(fragment, 0));
//@ts-ignore
const isShown = popup?.noticeEl?.isShown();
if (!isShown) {
memoObject(updatedPluginKey, new Notice(fragment, 0));
}
scheduleTask(updatedPluginKey + "-close", 20000, () => {
const popup = retrieveMemoObject<Notice>(updatedPluginKey);
if (!popup)
return;
//@ts-ignore
if (popup?.noticeEl?.isShown()) {
popup.hide();
}
disposeMemoObject(updatedPluginKey);
});
});
}
);
}
}
} catch (ex) {
Logger("Error on checking plugin status.");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
// If any changes remain, suggest reloading Obsidian.
if (updatedCount != 0) {
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
a.appendText(`Hidden files have been synchronized, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
// @ts-ignore
this.app.commands.executeCommandById("app:reload");
});
}));
a.appendText(` to reload obsidian, or press elsewhere to dismiss this message.`);
});
});
scheduleTask("popupUpdated-" + configDir, 1000, () => {
//@ts-ignore
const isShown = this.confirmPopup?.noticeEl?.isShown();
if (!isShown) {
this.confirmPopup = new Notice(fragment, 0);
}
scheduleTask("popupClose" + configDir, 20000, () => {
this.confirmPopup?.hide();
this.confirmPopup = null;
this.plugin.askInPopup(`updated-any-hidden`, `Hidden files have been synchronized. Press {HERE} to reload Obsidian, or press elsewhere to dismiss this message.`, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
// @ts-ignore
this.app.commands.executeCommandById("app:reload");
});
});
}
@@ -431,22 +429,26 @@ export class HiddenFileSync extends LiveSyncCommands {
}
async storeInternalFileToDatabase(file: InternalFileInfo, forceWrite = false) {
if (await this.plugin.isIgnoredByIgnoreFiles(file.path)) {
return
}
const id = await this.path2id(file.path, ICHeader);
const prefixedFileName = addPrefix(file.path, ICHeader);
const contentBin = await this.app.vault.adapter.readBinary(file.path);
let content: string[];
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(file.path);
let content: Blob;
try {
content = await arrayBufferToBase64(contentBin);
content = createBinaryBlob(contentBin);
} catch (ex) {
Logger(`The file ${file.path} could not be encoded`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
const mtime = file.mtime;
return await runWithLock("file-" + prefixedFileName, false, async () => {
return await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
let saveData: LoadedEntry;
let saveData: SavingEntry;
if (old === false) {
saveData = {
_id: id,
@@ -461,8 +463,8 @@ export class HiddenFileSync extends LiveSyncCommands {
type: "newnote",
};
} else {
if (isDocContentSame(old.data, content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
if (await isDocContentSame(old.data, content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return;
}
saveData =
@@ -482,7 +484,7 @@ export class HiddenFileSync extends LiveSyncCommands {
return ret;
} catch (ex) {
Logger(`STORAGE --> DB:${file.path}: (hidden) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
@@ -492,7 +494,10 @@ export class HiddenFileSync extends LiveSyncCommands {
const id = await this.path2id(filename, ICHeader);
const prefixedFileName = addPrefix(filename, ICHeader);
const mtime = new Date().getTime();
await runWithLock("file-" + prefixedFileName, false, async () => {
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return
}
await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, true) as InternalFileEntry | false;
let saveData: InternalFileEntry;
@@ -526,17 +531,19 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger(`STORAGE -x> DB:${filename}: (hidden) Done`);
} catch (ex) {
Logger(`STORAGE -x> DB:${filename}: (hidden) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
}
async extractInternalFileFromDatabase(filename: FilePath, force = false) {
const isExists = await this.app.vault.adapter.exists(filename);
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
const prefixedFileName = addPrefix(filename, ICHeader);
return await runWithLock("file-" + prefixedFileName, false, async () => {
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return;
}
return await serialized("file-" + prefixedFileName, async () => {
try {
// Check conflicted status
//TODO option
@@ -545,7 +552,7 @@ export class HiddenFileSync extends LiveSyncCommands {
throw new Error(`File not found on database.:${filename}`);
// Prevent overwriting while conflicted revisions exist.
if (fileOnDB?._conflicts?.length) {
Logger(`Hidden file ${filename} has conflicted revisions, to keep in safe, writing to storage has been prevented`, LOG_LEVEL.INFO);
Logger(`Hidden file ${filename} has conflicted revisions; to be safe, writing to storage has been prevented`, LOG_LEVEL_INFO);
return;
}
const deleted = "deleted" in fileOnDB ? fileOnDB.deleted : false;
@@ -554,43 +561,43 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger(`STORAGE <x- DB:${filename}: deleted (hidden). Deleted on DB, but the file is already missing from storage.`);
} else {
Logger(`STORAGE <x- DB:${filename}: deleted (hidden).`);
await this.app.vault.adapter.remove(filename);
await this.plugin.vaultAccess.adapterRemove(filename);
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
return true;
}
if (!isExists) {
await this.ensureDirectoryEx(filename);
await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden,new${force ? ", force" : ""})`);
return true;
} else {
const contentBin = await this.app.vault.adapter.readBinary(filename);
const content = await arrayBufferToBase64(contentBin);
if (content == fileOnDB.data && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL.VERBOSE);
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(filename);
const content = await encodeBinary(contentBin);
if (await isDocContentSame(content, fileOnDB.data) && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""})`);
return true;
@@ -598,7 +605,7 @@ export class HiddenFileSync extends LiveSyncCommands {
}
} catch (ex) {
Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""}) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
@@ -607,8 +614,8 @@ export class HiddenFileSync extends LiveSyncCommands {
showJSONMergeDialogAndMerge(docA: LoadedEntry, docB: LoadedEntry): Promise<boolean> {
return runWithLock("conflict:merge-data", false, () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
return new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
const docs = [docA, docB];
const path = stripAllPrefixes(docA.path);
const modal = new JsonResolveModal(this.app, path, [docA, docB], async (keep, result) => {
@@ -633,19 +640,19 @@ export class HiddenFileSync extends LiveSyncCommands {
}
}
if (!keep && result) {
const isExists = await this.app.vault.adapter.exists(filename);
const isExists = await this.plugin.vaultAccess.adapterExists(filename);
if (!isExists) {
await this.ensureDirectoryEx(filename);
}
await this.app.vault.adapter.write(filename, result);
const stat = await this.app.vault.adapter.stat(filename);
await this.plugin.vaultAccess.adapterWrite(filename, result);
const stat = await this.plugin.vaultAccess.adapterStat(filename);
await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden,merged)`);
}
@@ -656,30 +663,36 @@ export class HiddenFileSync extends LiveSyncCommands {
res(true);
} catch (ex) {
Logger("Could not merge conflicted json");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
res(false);
}
});
modal.open();
}));
});
}
async scanInternalFiles(): Promise<InternalFileInfo[]> {
const configDir = normalizePath(this.app.vault.configDir);
const ignoreFilter = this.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
const root = this.app.vault.getRoot();
const findRoot = root.path;
const filenames = (await this.getFiles(findRoot, [], null, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
const files = filenames.map(async (e) => {
const files = filenames.filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile))).map(async (e) => {
return {
path: e as FilePath,
stat: await this.app.vault.adapter.stat(e)
stat: await this.plugin.vaultAccess.adapterStat(e)
};
});
const result: InternalFileInfo[] = [];
for (const f of files) {
const w = await f;
if (await this.plugin.isIgnoredByIgnoreFiles(w.path)) {
continue
}
result.push({
...w,
...w.stat
@@ -698,12 +711,18 @@ export class HiddenFileSync extends LiveSyncCommands {
) {
const w = await this.app.vault.adapter.list(path);
let files = [
const filesSrc = [
...w.files
.filter((e) => !ignoreList.some((ee) => e.endsWith(ee)))
.filter((e) => !filter || filter.some((ee) => e.match(ee)))
.filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee))),
.filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee)))
];
let files = [] as string[];
for (const file of filesSrc) {
if (!await this.plugin.isIgnoredByIgnoreFiles(file)) {
files.push(file);
}
}
L1: for (const v of w.folders) {
for (const ignore of ignoreList) {
@@ -714,6 +733,9 @@ export class HiddenFileSync extends LiveSyncCommands {
if (ignoreFilter && ignoreFilter.some(e => v.match(e))) {
continue L1;
}
if (await this.plugin.isIgnoredByIgnoreFiles(v)) {
continue L1;
}
files = files.concat(await this.getFiles(v, ignoreList, filter, ignoreFilter));
}
return files;
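The rewritten syncInternalFilesAndDatabase above feeds file names through a two-stage QueueProcessor pipeline instead of the old Parallels helper, and raw vault events are funnelled through a KeyedQueueProcessor via enqueueWithKey. A minimal sketch of the pipeline call shape, assuming only the methods shown in this diff (pipeTo, root, enqueueAll, startPipeline, waitForPipeline); the stage bodies and option values are placeholders:

import { QueueProcessor } from "./lib/src/processor";

// Stage 1 classifies each name; stage 2 does the heavier work on stage 1's results.
async function processAll(allFileNames: string[]) {
    await new QueueProcessor(async (names: string[]) => {
        const name = names[0];
        return [{ name, upper: name.toUpperCase() }];   // results flow to the piped stage
    }, { suspended: true, batchSize: 1, concurrentLimit: 10, delay: 0 })
        .pipeTo(new QueueProcessor(async (items) => {
            console.log(`${items[0].name} -> ${items[0].upper}`);
        }, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 0 }))
        .root                     // enqueue into the head of the pipeline
        .enqueueAll(allFileNames)
        .startPipeline()
        .waitForPipeline();
}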


@@ -1,15 +1,15 @@
import { normalizePath, type PluginManifest } from "./deps";
import type { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry } from "./lib/src/types";
import { LOG_LEVEL } from "./lib/src/types";
import type { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry, SavingEntry } from "./lib/src/types";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
import { type PluginDataEntry, PERIODIC_PLUGIN_SWEEP, type PluginList, type DevicePluginList, PSCHeader, PSCHeaderEnd } from "./types";
import { getDocData, isDocContentSame } from "./lib/src/utils";
import { createTextBlob, getDocData, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { isPluginMetadata, PeriodicProcessor } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { NewNotice } from "./lib/src/wrapper";
import { versionNumberString2Number } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { serialized, skipIfDuplicated } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
export class PluginAndTheirSettings extends LiveSyncCommands {
@@ -79,7 +79,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
Logger("Scanning plugins done");
} catch (ex) {
Logger("Scanning plugins failed");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
@@ -148,7 +148,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
});
NewNotice(fragment, 10000);
} else {
Logger("Everything is up to date.", LOG_LEVEL.NOTICE);
Logger("Everything is up to date.", LOG_LEVEL_NOTICE);
}
}
@@ -164,10 +164,10 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (specificPluginPath != "") {
specificPlugin = manifests.find(e => e.dir.endsWith("/" + specificPluginPath))?.id ?? "";
}
await runWithLock("sweepplugin", true, async () => {
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
await skipIfDuplicated("sweepplugin", async () => {
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
if (!this.deviceAndVaultName) {
Logger("You have to set your device name.", LOG_LEVEL.NOTICE);
Logger("You have to set your device name.", LOG_LEVEL_NOTICE);
return;
}
Logger("Scanning plugins", logLevel);
@@ -176,7 +176,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
endkey: `ps:${this.deviceAndVaultName}-${specificPlugin}\u{10ffff}`,
include_docs: true,
});
// Logger("OLD DOCS.", LOG_LEVEL.VERBOSE);
// Logger("OLD DOCS.", LOG_LEVEL_VERBOSE);
// sweep current plugin.
const procs = manifests.map(async (m) => {
const pluginDataEntryID = `ps:${this.deviceAndVaultName}-${m.id}` as DocumentID;
@@ -184,20 +184,19 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (specificPlugin && m.id != specificPlugin) {
return;
}
Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
const path = normalizePath(m.dir) + "/";
const adapter = this.app.vault.adapter;
const files = ["manifest.json", "main.js", "styles.css", "data.json"];
const pluginData: { [key: string]: string; } = {};
for (const file of files) {
const thePath = path + file;
if (await adapter.exists(thePath)) {
pluginData[file] = await adapter.read(thePath);
if (await this.plugin.vaultAccess.adapterExists(thePath)) {
pluginData[file] = await this.plugin.vaultAccess.adapterRead(thePath);
}
}
let mtime = 0;
if (await adapter.exists(path + "/data.json")) {
mtime = (await adapter.stat(path + "/data.json")).mtime;
if (await this.plugin.vaultAccess.adapterExists(path + "/data.json")) {
mtime = (await this.plugin.vaultAccess.adapterStat(path + "/data.json")).mtime;
}
const p: PluginDataEntry = {
@@ -211,24 +210,25 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
mtime: mtime,
type: "plugin",
};
const d: LoadedEntry = {
const blob = createTextBlob(JSON.stringify(p));
const d: SavingEntry = {
_id: p._id,
path: p._id as string as FilePathWithPrefix,
data: JSON.stringify(p),
data: blob,
ctime: mtime,
mtime: mtime,
size: 0,
size: blob.size,
children: [],
datatype: "plain",
type: "plain"
};
Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
await runWithLock("plugin-" + m.id, false, async () => {
Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
await serialized("plugin-" + m.id, async () => {
const old = await this.localDatabase.getDBEntry(p._id as string as FilePathWithPrefix /* This also should be explained */, null, false, false);
if (old !== false) {
const oldData = { data: old.data, deleted: old._deleted };
const newData = { data: d.data, deleted: d._deleted };
if (isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
if (await isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
Logger(`Nothing changed:${m.name}`);
return;
}
@@ -237,7 +237,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
Logger(`Plugin saved:${m.name}`, logLevel);
});
} catch (ex) {
Logger(`Plugin save failed:${m.name}`, LOG_LEVEL.NOTICE);
Logger(`Plugin save failed:${m.name}`, LOG_LEVEL_NOTICE);
} finally {
oldDocs.rows = oldDocs.rows.filter((e) => e.id != pluginDataEntryID);
}
@@ -259,57 +259,55 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
}
return e.doc;
});
Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL.VERBOSE);
Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL_VERBOSE);
await this.localDatabase.bulkDocsRaw(delDocs);
Logger(`Scan plugin done.`, logLevel);
});
}
async applyPluginData(plugin: PluginDataEntry) {
await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
await serialized("plugin-" + plugin.manifest.id, async () => {
const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
const adapter = this.app.vault.adapter;
// @ts-ignore
const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
if (stat) {
// @ts-ignore
await this.app.plugins.unloadPlugin(plugin.manifest.id);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
if (plugin.dataJson)
await adapter.write(pluginTargetFolderPath + "data.json", plugin.dataJson);
Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL.NOTICE);
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "data.json", plugin.dataJson);
Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL_NOTICE);
if (stat) {
// @ts-ignore
await this.app.plugins.loadPlugin(plugin.manifest.id);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
});
}
async applyPlugin(plugin: PluginDataEntry) {
await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
await serialized("plugin-" + plugin.manifest.id, async () => {
// @ts-ignore
const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
if (stat) {
// @ts-ignore
await this.app.plugins.unloadPlugin(plugin.manifest.id);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
const adapter = this.app.vault.adapter;
if ((await adapter.exists(pluginTargetFolderPath)) === false) {
await adapter.mkdir(pluginTargetFolderPath);
if ((await this.plugin.vaultAccess.adapterExists(pluginTargetFolderPath)) === false) {
await this.app.vault.adapter.mkdir(pluginTargetFolderPath);
}
await adapter.write(pluginTargetFolderPath + "main.js", plugin.mainJs);
await adapter.write(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "main.js", plugin.mainJs);
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
if (plugin.styleCss)
await adapter.write(pluginTargetFolderPath + "styles.css", plugin.styleCss);
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "styles.css", plugin.styleCss);
if (stat) {
// @ts-ignore
await this.app.plugins.loadPlugin(plugin.manifest.id);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
});
}
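The sweep code above now builds a Blob with createTextBlob, stores a SavingEntry whose size comes from that Blob, and awaits the now-asynchronous isDocContentSame. A minimal sketch of that change-detection step, assuming only the names imported in this file; the entry fields are abridged and the parameter types of isDocContentSame are not shown in the diff:

import { createTextBlob, isDocContentSame } from "./lib/src/utils";

// Build the Blob-backed fields for a JSON payload and skip saving when nothing changed.
// The payload and oldData values are illustrative.
async function makeSavingFields(payload: object, mtime: number, oldData?: Blob) {
    const blob = createTextBlob(JSON.stringify(payload));
    // isDocContentSame is awaited in this revision.
    if (oldData !== undefined && await isDocContentSame(oldData, blob)) {
        return undefined;                    // unchanged: nothing to write
    }
    return {
        data: blob,
        ctime: mtime,
        mtime,
        size: blob.size,                     // size now comes from the Blob, not a constant 0
        datatype: "plain" as const,
        type: "plain" as const,
    };
}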


@@ -1,4 +1,4 @@
import { type EntryDoc, type ObsidianLiveSyncSettings, LOG_LEVEL, DEFAULT_SETTINGS } from "./lib/src/types";
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE } from "./lib/src/types";
import { configURIBase } from "./types";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
@@ -20,6 +20,11 @@ export class SetupLiveSync extends LiveSyncCommands {
name: "Copy the setup URI",
callback: this.command_copySetupURI.bind(this),
});
this.plugin.addCommand({
id: "livesync-copysetupuri-short",
name: "Copy the setup URI (With customization sync)",
callback: this.command_copySetupURIWithSync.bind(this),
});
this.plugin.addCommand({
id: "livesync-copysetupurifull",
@@ -41,11 +46,14 @@ export class SetupLiveSync extends LiveSyncCommands {
}
async realizeSettingSyncMode() { }
async command_copySetupURI() {
async command_copySetupURI(stripExtra = true) {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
if (encryptingPassphrase === false)
return;
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
if (stripExtra) {
delete setting.pluginSyncExtendedSetting;
}
const keys = Object.keys(setting) as (keyof ObsidianLiveSyncSettings)[];
for (const k of keys) {
if (JSON.stringify(k in setting ? setting[k] : "") == JSON.stringify(k in DEFAULT_SETTINGS ? DEFAULT_SETTINGS[k] : "*")) {
@@ -55,7 +63,7 @@ export class SetupLiveSync extends LiveSyncCommands {
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
const uri = `${configURIBase}${encryptedSetting}`;
await navigator.clipboard.writeText(uri);
Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIFull() {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
@@ -65,14 +73,17 @@ export class SetupLiveSync extends LiveSyncCommands {
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
const uri = `${configURIBase}${encryptedSetting}`;
await navigator.clipboard.writeText(uri);
Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIWithSync() {
this.command_copySetupURI(false);
}
async command_openSetupURI() {
const setupURI = await askString(this.app, "Easy setup", "Set up URI", `${configURIBase}aaaaa`);
if (setupURI === false)
return;
if (!setupURI.startsWith(`${configURIBase}`)) {
Logger("Set up URI looks wrong.", LOG_LEVEL.NOTICE);
Logger("Set up URI looks wrong.", LOG_LEVEL_NOTICE);
return;
}
const config = decodeURIComponent(setupURI.substring(configURIBase.length));
@@ -99,11 +110,17 @@ export class SetupLiveSync extends LiveSyncCommands {
newSettingW.encryptedCouchDBConnection = "";
const setupJustImport = "Just import setting";
const setupAsNew = "Set it up as secondary or subsequent device";
const setupAsMerge = "Secondary device but try keeping local changes";
const setupAgain = "Reconfigure and reconstitute the data";
const setupManually = "Leave everything to me";
newSettingW.syncInternalFiles = false;
newSettingW.usePluginSync = false;
const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupJustImport, setupManually]);
// Migrate the completely obsolete configuration.
if (!newSettingW.useIndexedDBAdapter) {
newSettingW.useIndexedDBAdapter = true;
}
const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupAsMerge, setupJustImport, setupManually]);
if (setupType == setupJustImport) {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
@@ -112,6 +129,10 @@ export class SetupLiveSync extends LiveSyncCommands {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
await this.fetchLocal();
} else if (setupType == setupAsMerge) {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
await this.fetchLocalWithKeepLocal();
} else if (setupType == setupAgain) {
const confirm = "I know this operation will rebuild all my databases with files on this device, and files that are on the remote database and I didn't synchronize to any other devices will be lost and want to proceed indeed.";
if (await askSelectString(this.app, "Do you really want to do this?", ["Cancel", confirm]) != confirm) {
@@ -135,13 +156,13 @@ export class SetupLiveSync extends LiveSyncCommands {
await this.plugin.replicate(true);
await this.plugin.markRemoteUnlocked();
}
Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
Logger("Configuration loaded.", LOG_LEVEL_NOTICE);
return;
}
if (keepLocalDB == "no" && keepRemoteDB == "no") {
const reset = await askYesNo(this.app, "Drop everything?");
if (reset != "yes") {
Logger("Cancelled", LOG_LEVEL.NOTICE);
Logger("Cancelled", LOG_LEVEL_NOTICE);
this.plugin.settings = oldConf;
return;
}
@@ -176,17 +197,17 @@ export class SetupLiveSync extends LiveSyncCommands {
}
}
Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
Logger("Configuration loaded.", LOG_LEVEL_NOTICE);
} else {
Logger("Cancelled.", LOG_LEVEL.NOTICE);
Logger("Cancelled.", LOG_LEVEL_NOTICE);
}
} catch (ex) {
Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL.NOTICE);
Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL_NOTICE);
}
}
suspendExtraSync() {
Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL.NOTICE)
Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
this.plugin.settings.syncInternalFiles = false;
this.plugin.settings.usePluginSync = false;
this.plugin.settings.autoSweepPlugins = false;
@@ -231,7 +252,7 @@ Of course, we are able to disable these features.`
return;
}
if (mode != "CUSTOMIZE") {
Logger("Gathering files for enabling Hidden File Sync", LOG_LEVEL.NOTICE);
Logger("Gathering files for enabling Hidden File Sync", LOG_LEVEL_NOTICE);
if (mode == "FETCH") {
await this.plugin.addOnHiddenFileSync.syncInternalFilesAndDatabase("pullForce", true);
} else if (mode == "OVERWRITE") {
@@ -241,7 +262,7 @@ Of course, we are able to disable these features.`
}
this.plugin.settings.syncInternalFiles = true;
await this.plugin.saveSettings();
Logger(`Done! Restarting the app is strongly recommended!`, LOG_LEVEL.NOTICE);
Logger(`Done! Restarting the app is strongly recommended!`, LOG_LEVEL_NOTICE);
} else if (mode == "CUSTOMIZE") {
if (!this.plugin.deviceAndVaultName) {
let name = await askString(this.app, "Device name", "Please set this device name", `desktop`);
@@ -280,6 +301,7 @@ Of course, we are able to disable these features.`
this.plugin.settings.liveSync = false;
this.plugin.settings.periodicReplication = false;
this.plugin.settings.syncOnSave = false;
this.plugin.settings.syncOnEditorSave = false;
this.plugin.settings.syncOnStart = false;
this.plugin.settings.syncOnFileOpen = false;
this.plugin.settings.syncAfterMerge = false;
@@ -287,23 +309,20 @@ Of course, we are able to disable these features.`
}
async suspendReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
Logger(`Suspending reflection: Database and storage changes will not be reflected in each other until completely finished the fetching.`, LOG_LEVEL.NOTICE);
Logger(`Suspending reflection: Database and storage changes will not be reflected in each other until the fetching is completely finished.`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = true;
this.plugin.settings.suspendFileWatching = true;
await this.plugin.saveSettings();
}
async resumeReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
Logger(`Database and storage reflection has been resumed!`, LOG_LEVEL.NOTICE);
Logger(`Database and storage reflection has been resumed!`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = false;
this.plugin.settings.suspendFileWatching = false;
await this.plugin.syncAllFiles(true);
await this.plugin.loadQueuedFiles();
await this.plugin.saveSettings();
if (this.plugin.settings.readChunksOnline) {
await this.plugin.syncAllFiles(true);
await this.plugin.loadQueuedFiles();
// Start processing
this.plugin.procQueuedFiles();
}
}
async askUseNewAdapter() {
if (!this.plugin.settings.useIndexedDBAdapter) {
@@ -320,14 +339,14 @@ Of course, we are able to disable these features.`
}
async fetchRemoteChunks() {
if (!this.plugin.settings.doNotSuspendOnFetching && this.plugin.settings.readChunksOnline) {
Logger(`Fetching chunks`, LOG_LEVEL.NOTICE);
Logger(`Fetching chunks`, LOG_LEVEL_NOTICE);
const remoteDB = await this.plugin.getReplicator().connectRemoteCouchDBWithSetting(this.settings, this.plugin.getIsMobile(), true);
if (typeof remoteDB == "string") {
Logger(remoteDB, LOG_LEVEL.NOTICE);
Logger(remoteDB, LOG_LEVEL_NOTICE);
} else {
await fetchAllUsedChunks(this.localDatabase.localDatabase, remoteDB.db);
}
Logger(`Fetching chunks done`, LOG_LEVEL.NOTICE);
Logger(`Fetching chunks done`, LOG_LEVEL_NOTICE);
}
}
async fetchLocal() {
@@ -348,6 +367,25 @@ Of course, we are able to disable these features.`
await this.resumeReflectingDatabase();
await this.askHiddenFileConfiguration({ enableFetch: true });
}
async fetchLocalWithKeepLocal() {
this.suspendExtraSync();
this.askUseNewAdapter();
await this.suspendReflectingDatabase();
await this.plugin.realizeSettingSyncMode();
await this.plugin.resetLocalDatabase();
await delay(1000);
await this.plugin.initializeDatabase(true);
await this.plugin.markRemoteResolved();
await this.plugin.openDatabase();
this.plugin.isReady = true;
await delay(500);
await this.plugin.replicateAllFromServer(true);
await delay(1000);
await this.plugin.replicateAllFromServer(true);
await this.fetchRemoteChunks();
await this.resumeReflectingDatabase();
await this.askHiddenFileConfiguration({ enableFetch: true });
}
async rebuildRemote() {
this.suspendExtraSync();
await this.plugin.realizeSettingSyncMode();


@@ -1,27 +1,43 @@
import { App, Modal } from "./deps";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
import { diff_result } from "./lib/src/types";
import { CANCELLED, LEAVE_TO_SUBSEQUENT, RESULT_TIMED_OUT, type diff_result } from "./lib/src/types";
import { escapeStringToHTML } from "./lib/src/strbin";
import { delay, sendValue, waitForValue } from "./lib/src/utils";
export type MergeDialogResult = typeof LEAVE_TO_SUBSEQUENT | typeof CANCELLED | string;
export class ConflictResolveModal extends Modal {
// result: Array<[number, string]>;
result: diff_result;
filename: string;
callback: (remove_rev: string) => Promise<void>;
constructor(app: App, filename: string, diff: diff_result, callback: (remove_rev: string) => Promise<void>) {
response: MergeDialogResult = CANCELLED;
isClosed = false;
consumed = false;
constructor(app: App, filename: string, diff: diff_result) {
super(app);
this.result = diff;
this.callback = callback;
this.filename = filename;
// Send a cancel signal to any previous merge dialogue;
// if there is none, the signal is simply ignored.
// sendValue("close-resolve-conflict:" + this.filename, false);
sendValue("cancel-resolve-conflict:" + this.filename, true);
}
onOpen() {
const { contentEl } = this;
// Send a cancel signal to any previous merge dialogue;
// if there is none, the signal is simply ignored.
sendValue("cancel-resolve-conflict:" + this.filename, true);
setTimeout(async () => {
const forceClose = await waitForValue("cancel-resolve-conflict:" + this.filename);
// debugger;
if (forceClose) {
this.sendResponse(CANCELLED);
}
}, 10)
// sendValue("close-resolve-conflict:" + this.filename, false);
this.titleEl.setText("Conflicting changes");
contentEl.empty();
contentEl.createEl("h2", { text: "This document has conflicted changes." });
contentEl.createEl("span", { text: this.filename });
const div = contentEl.createDiv("");
div.addClass("op-scrollable");
@@ -46,42 +62,32 @@ export class ConflictResolveModal extends Modal {
div2.innerHTML = `
<span class='deleted'>A:${date1}</span><br /><span class='added'>B:${date2}</span><br>
`;
contentEl.createEl("button", { text: "Keep A" }, (e) => {
e.addEventListener("click", async () => {
const callback = this.callback;
this.callback = null;
this.close();
await callback(this.result.right.rev);
});
});
contentEl.createEl("button", { text: "Keep B" }, (e) => {
e.addEventListener("click", async () => {
const callback = this.callback;
this.callback = null;
this.close();
await callback(this.result.left.rev);
});
});
contentEl.createEl("button", { text: "Concat both" }, (e) => {
e.addEventListener("click", async () => {
const callback = this.callback;
this.callback = null;
this.close();
await callback("");
});
});
contentEl.createEl("button", { text: "Not now" }, (e) => {
e.addEventListener("click", () => {
this.close();
});
});
contentEl.createEl("button", { text: "Keep A" }, (e) => e.addEventListener("click", () => this.sendResponse(this.result.right.rev)));
contentEl.createEl("button", { text: "Keep B" }, (e) => e.addEventListener("click", () => this.sendResponse(this.result.left.rev)));
contentEl.createEl("button", { text: "Concat both" }, (e) => e.addEventListener("click", () => this.sendResponse(LEAVE_TO_SUBSEQUENT)));
contentEl.createEl("button", { text: "Not now" }, (e) => e.addEventListener("click", () => this.sendResponse(CANCELLED)));
}
sendResponse(result: MergeDialogResult) {
this.response = result;
this.close();
}
onClose() {
const { contentEl } = this;
contentEl.empty();
if (this.callback != null) {
this.callback(null);
if (this.consumed) {
return;
}
this.consumed = true;
sendValue("close-resolve-conflict:" + this.filename, this.response);
sendValue("cancel-resolve-conflict:" + this.filename, false);
}
}
async waitForResult(): Promise<MergeDialogResult> {
await delay(100);
const r = await waitForValue<MergeDialogResult>("close-resolve-conflict:" + this.filename);
if (r === RESULT_TIMED_OUT) return CANCELLED;
return r;
}
}
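ConflictResolveModal above no longer takes a callback: the chosen button records this.response, onClose() publishes it with sendValue("close-resolve-conflict:" + filename, ...), and waitForResult() awaits it through waitForValue. A minimal sketch of how a caller might consume that signal, assuming only the constants and helpers imported in this file; the wrapper functions are illustrative:

import { sendValue, waitForValue } from "./lib/src/utils";
import { CANCELLED, RESULT_TIMED_OUT } from "./lib/src/types";

// Await the dialogue outcome that the modal publishes on close.
// The key prefix mirrors the one used in the hunks above.
async function awaitConflictChoice(filename: string) {
    const r = await waitForValue<string>("close-resolve-conflict:" + filename);
    // waitForValue resolves with RESULT_TIMED_OUT when no answer arrives; treat it as a cancel.
    return r === RESULT_TIMED_OUT ? CANCELLED : r;
}

// A newer dialogue for the same file can pre-empt an older one with the cancel key.
function cancelPreviousDialogue(filename: string) {
    sendValue("cancel-resolve-conflict:" + filename, true);
}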


@@ -1,38 +1,64 @@
import { TFile, Modal, App, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "./deps";
import { getPathFromTFile, isValidPath } from "./utils";
import { base64ToArrayBuffer, base64ToString, escapeStringToHTML } from "./lib/src/strbin";
import { decodeBinary, escapeStringToHTML, readString } from "./lib/src/strbin";
import ObsidianLiveSyncPlugin from "./main";
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL } from "./lib/src/types";
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
import { Logger } from "./lib/src/logger";
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
import { getDocData } from "./lib/src/utils";
import { stripPrefix } from "./lib/src/path";
import { isPlainText, stripPrefix } from "./lib/src/path";
function isImage(path: string) {
const ext = path.split(".").splice(-1)[0].toLowerCase();
return ["png", "jpg", "jpeg", "gif", "bmp", "webp"].includes(ext);
}
function isComparableText(path: string) {
const ext = path.split(".").splice(-1)[0].toLowerCase();
return isPlainText(path) || ["md", "mdx", "txt", "json"].includes(ext);
}
function isComparableTextDecode(path: string) {
const ext = path.split(".").splice(-1)[0].toLowerCase();
return ["json"].includes(ext)
}
function readDocument(w: LoadedEntry) {
if (isImage(w.path)) {
return new Uint8Array(decodeBinary(w.data));
}
if (w.data == "plain") return getDocData(w.data);
if (isComparableTextDecode(w.path)) return readString(new Uint8Array(decodeBinary(w.data)));
if (isComparableText(w.path)) return getDocData(w.data);
try {
return readString(new Uint8Array(decodeBinary(w.data)));
} catch (ex) {
// NO OP.
}
return getDocData(w.data);
}
export class DocumentHistoryModal extends Modal {
plugin: ObsidianLiveSyncPlugin;
range: HTMLInputElement;
contentView: HTMLDivElement;
info: HTMLDivElement;
fileInfo: HTMLDivElement;
range!: HTMLInputElement;
contentView!: HTMLDivElement;
info!: HTMLDivElement;
fileInfo!: HTMLDivElement;
showDiff = false;
id: DocumentID;
id?: DocumentID;
file: FilePathWithPrefix;
revs_info: PouchDB.Core.RevisionInfo[] = [];
currentDoc: LoadedEntry;
currentDoc?: LoadedEntry;
currentText = "";
currentDeleted = false;
initialRev: string;
initialRev?: string;
constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id: DocumentID, revision?: string) {
constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id?: DocumentID, revision?: string) {
super(app);
this.plugin = plugin;
this.file = (file instanceof TFile) ? getPathFromTFile(file) : file;
this.id = id;
this.initialRev = revision;
if (!file) {
this.file = this.plugin.id2path(id, null);
if (!file && id) {
this.file = this.plugin.id2path(id);
}
if (localStorage.getItem("ols-history-highlightdiff") == "1") {
this.showDiff = true;
@@ -46,8 +72,8 @@ export class DocumentHistoryModal extends Modal {
const db = this.plugin.localDatabase;
try {
const w = await db.localDatabase.get(this.id, { revs_info: true });
this.revs_info = w._revs_info.filter((e) => e?.status == "available");
this.range.max = `${this.revs_info.length - 1}`;
this.revs_info = w._revs_info?.filter((e) => e?.status == "available") ?? [];
this.range.max = `${Math.max(this.revs_info.length - 1, 0)}`;
this.range.value = this.range.max;
this.fileInfo.setText(`${this.file} / ${this.revs_info.length} revisions`);
await this.loadRevs(initialRev);
@@ -56,11 +82,10 @@ export class DocumentHistoryModal extends Modal {
this.range.max = "0";
this.range.value = "";
this.range.disabled = true;
this.showDiff
this.contentView.setText(`History of this file was not recorded.`);
} else {
this.contentView.setText(`Error occurred.`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
}
@@ -76,6 +101,22 @@ export class DocumentHistoryModal extends Modal {
const rev = this.revs_info[index];
await this.showExactRev(rev.rev);
}
BlobURLs = new Map<string, string>();
revokeURL(key: string) {
const v = this.BlobURLs.get(key);
if (v) {
URL.revokeObjectURL(v);
}
this.BlobURLs.set(key, undefined);
}
generateBlobURL(key: string, data: Uint8Array) {
this.revokeURL(key);
const v = URL.createObjectURL(new Blob([data], { endings: "transparent", type: "application/octet-stream" }));
this.BlobURLs.set(key, v);
return v;
}
async showExactRev(rev: string) {
const db = this.plugin.localDatabase;
const w = await db.getDBEntry(this.file, { rev: rev }, false, false, true);
@@ -88,51 +129,75 @@ export class DocumentHistoryModal extends Modal {
} else {
this.currentDoc = w;
this.info.innerHTML = `Modified:${new Date(w.mtime).toLocaleString()}`;
let result = "";
const w1data = w.datatype == "plain" ? getDocData(w.data) : base64ToString(w.data);
this.currentDeleted = w.deleted;
this.currentText = w1data;
let result = undefined;
const w1data = readDocument(w);
this.currentDeleted = !!w.deleted;
// this.currentText = w1data;
if (this.showDiff) {
const prevRevIdx = this.revs_info.length - 1 - ((this.range.value as any) / 1 - 1);
if (prevRevIdx >= 0 && prevRevIdx < this.revs_info.length) {
const oldRev = this.revs_info[prevRevIdx].rev;
const w2 = await db.getDBEntry(this.file, { rev: oldRev }, false, false, true);
if (w2 != false) {
const dmp = new diff_match_patch();
const w2data = w2.datatype == "plain" ? getDocData(w2.data) : base64ToString(w2.data);
const diff = dmp.diff_main(w2data, w1data);
dmp.diff_cleanupSemantic(diff);
for (const v of diff) {
const x1 = v[0];
const x2 = v[1];
if (x1 == DIFF_DELETE) {
result += "<span class='history-deleted'>" + escapeStringToHTML(x2) + "</span>";
} else if (x1 == DIFF_EQUAL) {
result += "<span class='history-normal'>" + escapeStringToHTML(x2) + "</span>";
} else if (x1 == DIFF_INSERT) {
result += "<span class='history-added'>" + escapeStringToHTML(x2) + "</span>";
if (typeof w1data == "string") {
result = "";
const dmp = new diff_match_patch();
const w2data = readDocument(w2) as string;
const diff = dmp.diff_main(w2data, w1data);
dmp.diff_cleanupSemantic(diff);
for (const v of diff) {
const x1 = v[0];
const x2 = v[1];
if (x1 == DIFF_DELETE) {
result += "<span class='history-deleted'>" + escapeStringToHTML(x2) + "</span>";
} else if (x1 == DIFF_EQUAL) {
result += "<span class='history-normal'>" + escapeStringToHTML(x2) + "</span>";
} else if (x1 == DIFF_INSERT) {
result += "<span class='history-added'>" + escapeStringToHTML(x2) + "</span>";
}
}
result = result.replace(/\n/g, "<br>");
} else if (isImage(this.file)) {
const src = this.generateBlobURL("base", w1data);
const overlay = this.generateBlobURL("overlay", readDocument(w2) as Uint8Array);
result =
`<div class='ls-imgdiff-wrap'>
<div class='overlay'>
<img class='img-base' src="${src}">
<img class='img-overlay' src='${overlay}'>
</div>
</div>`;
this.contentView.removeClass("op-pre");
}
}
}
result = result.replace(/\n/g, "<br>");
} else {
result = escapeStringToHTML(w1data);
}
if (result == undefined) {
if (typeof w1data != "string") {
if (isImage(this.file)) {
const src = this.generateBlobURL("base", w1data);
result =
`<div class='ls-imgdiff-wrap'>
<div class='overlay'>
<img class='img-base' src="${src}">
</div>
</div>`;
this.contentView.removeClass("op-pre");
}
} else {
result = escapeStringToHTML(w1data);
}
} else {
result = escapeStringToHTML(w1data);
}
if (result == undefined) result = typeof w1data == "string" ? escapeStringToHTML(w1data) : "Binary file";
this.contentView.innerHTML = (this.currentDeleted ? "(At this revision, the file has been deleted)\n" : "") + result;
}
}
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Document History");
contentEl.empty();
contentEl.createEl("h2", { text: "Document History" });
this.fileInfo = contentEl.createDiv("");
this.fileInfo.addClass("op-info");
const divView = contentEl.createDiv("");
@@ -178,7 +243,7 @@ export class DocumentHistoryModal extends Modal {
e.addClass("mod-cta");
e.addEventListener("click", async () => {
await navigator.clipboard.writeText(this.currentText);
Logger(`Old content copied to clipboard`, LOG_LEVEL.NOTICE);
Logger(`Old content copied to clipboard`, LOG_LEVEL_NOTICE);
});
});
async function focusFile(path: string) {
@@ -189,7 +254,7 @@ export class DocumentHistoryModal extends Modal {
const leaf = app.workspace.getLeaf(false);
await leaf.openFile(targetFile);
} else {
Logger("The file could not view on the editor", LOG_LEVEL.NOTICE)
Logger("The file could not view on the editor", LOG_LEVEL_NOTICE)
}
}
buttons.createEl("button", { text: "Back to this revision" }, (e) => {
@@ -198,19 +263,19 @@ export class DocumentHistoryModal extends Modal {
// const pathToWrite = this.plugin.id2path(this.id, true);
const pathToWrite = stripPrefix(this.file);
if (!isValidPath(pathToWrite)) {
Logger("Path is not valid to write content.", LOG_LEVEL.INFO);
Logger("Path is not valid to write content.", LOG_LEVEL_INFO);
}
if (this.currentDoc?.datatype == "plain") {
await this.app.vault.adapter.write(pathToWrite, getDocData(this.currentDoc.data));
await this.plugin.vaultAccess.adapterWrite(pathToWrite, getDocData(this.currentDoc.data));
await focusFile(pathToWrite);
this.close();
} else if (this.currentDoc?.datatype == "newnote") {
await this.app.vault.adapter.writeBinary(pathToWrite, base64ToArrayBuffer(this.currentDoc.data));
await this.plugin.vaultAccess.adapterWrite(pathToWrite, decodeBinary(this.currentDoc.data));
await focusFile(pathToWrite);
this.close();
} else {
Logger(`Could not parse entry`, LOG_LEVEL.NOTICE);
Logger(`Could not parse entry`, LOG_LEVEL_NOTICE);
}
});
});
@@ -218,5 +283,9 @@ export class DocumentHistoryModal extends Modal {
onClose() {
const { contentEl } = this;
contentEl.empty();
this.BlobURLs.forEach(value => {
console.log(value);
if (value) URL.revokeObjectURL(value);
})
}
}
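
The image history view creates object URLs for binary revisions and revokes them in `onClose`. A small standalone sketch of that lifecycle, with illustrative names rather than the plugin's actual fields:

```ts
// Sketch: track created object URLs and revoke them when the view closes.
const blobURLs = new Map<string, string>();

function generateBlobURL(key: string, data: Uint8Array): string {
    // Revoke a previous URL for the same slot so it does not leak.
    const old = blobURLs.get(key);
    if (old) URL.revokeObjectURL(old);
    const url = URL.createObjectURL(new Blob([data], { type: "application/octet-stream" }));
    blobURLs.set(key, url);
    return url;
}

function releaseAllBlobURLs(): void {
    for (const url of blobURLs.values()) URL.revokeObjectURL(url);
    blobURLs.clear();
}
```
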

View File

@@ -2,12 +2,11 @@
import ObsidianLiveSyncPlugin from "./main";
import { onDestroy, onMount } from "svelte";
import type { AnyEntry, FilePathWithPrefix } from "./lib/src/types";
import { getDocData, isDocContentSame } from "./lib/src/utils";
import { createBinaryBlob, getDocData, isDocContentSame } from "./lib/src/utils";
import { diff_match_patch } from "./deps";
import { DocumentHistoryModal } from "./DocumentHistoryModal";
import { isPlainText, stripAllPrefixes } from "./lib/src/path";
import { TFile } from "./deps";
import { arrayBufferToBase64 } from "./lib/src/strbin";
export let plugin: ObsidianLiveSyncPlugin;
let showDiffInfo = false;
@@ -67,10 +66,7 @@
for (const revInfo of reversedRevs) {
if (revInfo.status == "available") {
const doc =
(!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev)
? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true)
: await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
const doc = (!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev) ? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true) : await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
if (doc === false) continue;
const rev = revInfo.rev;
@@ -108,16 +104,16 @@
}
if (rev == docA._rev) {
if (checkStorageDiff) {
const abs = plugin.app.vault.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
const abs = plugin.vaultAccess.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
if (abs instanceof TFile) {
let result = false;
if (isPlainText(docA.path)) {
const data = await plugin.app.vault.read(abs);
result = isDocContentSame(data, doc.data);
const data = await plugin.vaultAccess.adapterRead(abs);
result = await isDocContentSame(data, doc.data);
} else {
const data = await plugin.app.vault.readBinary(abs);
const dataEEncoded = await arrayBufferToBase64(data);
result = isDocContentSame(dataEEncoded, doc.data);
const data = await plugin.vaultAccess.adapterReadBinary(abs);
const dataEEncoded = createBinaryBlob(data);
result = await isDocContentSame(dataEEncoded, doc.data);
}
if (result) {
diffDetail += " ⚖️";

View File

@@ -32,6 +32,7 @@ export class GlobalHistoryView extends ItemView {
return "Vault history";
}
// eslint-disable-next-line require-await
async onOpen() {
this.component = new GlobalHistoryComponent({
target: this.contentEl,
@@ -41,6 +42,7 @@ export class GlobalHistoryView extends ItemView {
});
}
// eslint-disable-next-line require-await
async onClose() {
this.component.$destroy();
}

View File

@@ -1,6 +1,7 @@
import { App, Modal } from "./deps";
import { FilePath, LoadedEntry } from "./lib/src/types";
import { type FilePath, type LoadedEntry } from "./lib/src/types";
import JsonResolvePane from "./JsonResolvePane.svelte";
import { waitForSignal } from "./lib/src/utils";
export class JsonResolveModal extends Modal {
// result: Array<[number, string]>;
@@ -20,6 +21,7 @@ export class JsonResolveModal extends Modal {
this.nameA = nameA;
this.nameB = nameB;
this.defaultSelect = defaultSelect;
waitForSignal(`cancel-internal-conflict:${filename}`).then(() => this.close());
}
async UICallback(keepRev: string, mergedStr?: string) {
this.close();
@@ -29,7 +31,7 @@ export class JsonResolveModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Conflicted Setting");
contentEl.empty();
if (this.component == null) {
@@ -41,7 +43,7 @@ export class JsonResolveModal extends Modal {
nameA: this.nameA,
nameB: this.nameB,
defaultSelect: this.defaultSelect,
callback: (keepRev, mergedStr) => this.UICallback(keepRev, mergedStr),
callback: (keepRev: string, mergedStr: string) => this.UICallback(keepRev, mergedStr),
},
});
}

View File

@@ -1,7 +1,7 @@
<script lang="ts">
import { type Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "./deps";
import type { FilePath, LoadedEntry } from "./lib/src/types";
import { base64ToString } from "./lib/src/strbin";
import { decodeBinary, readString } from "./lib/src/strbin";
import { getDocData } from "./lib/src/utils";
import { mergeObject } from "./utils";
@@ -26,7 +26,7 @@
let mode: SelectModes = defaultSelect as SelectModes;
function docToString(doc: LoadedEntry) {
return doc.datatype == "plain" ? getDocData(doc.data) : base64ToString(doc.data);
return doc.datatype == "plain" ? getDocData(doc.data) : readString(new Uint8Array(decodeBinary(doc.data)));
}
function revStringToRevNumber(rev: string) {
return rev.split("-")[0];
@@ -104,7 +104,6 @@
] as ["" | "A" | "B" | "AB" | "BA", string][];
</script>
<h1>Conflicted settings</h1>
<h2>{filename}</h2>
{#if !docA || !docB}
<div class="message">Just for a minute, please!</div>

View File

@@ -1,4 +1,4 @@
import { deleteDB, IDBPDatabase, openDB } from "idb";
import { deleteDB, type IDBPDatabase, openDB } from "idb";
export interface KeyValueDatabase {
get<T>(key: string): Promise<T>;
set<T>(key: string, value: T): Promise<IDBValidKey>;
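
The `KeyValueDatabase` interface is implemented on top of the `idb` wrapper. A hedged sketch of such a store built with `openDB`; the object-store name here is a placeholder, not necessarily what the plugin uses:

```ts
import { openDB, type IDBPDatabase } from "idb";

const STORE = "kv"; // placeholder object-store name

export async function openKeyValueDB(name: string): Promise<{
    get<T>(key: string): Promise<T>;
    set<T>(key: string, value: T): Promise<IDBValidKey>;
    del(key: string): Promise<void>;
    close(): void;
}> {
    const db: IDBPDatabase = await openDB(name, 1, {
        upgrade(db) {
            // Create the store on first open / version bump.
            if (!db.objectStoreNames.contains(STORE)) db.createObjectStore(STORE);
        },
    });
    return {
        get: <T>(key: string) => db.get(STORE, key) as Promise<T>,
        set: <T>(key: string, value: T) => db.put(STORE, value, key),
        del: (key: string) => db.delete(STORE, key),
        close: () => db.close(),
    };
}
```
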

View File

@@ -1,4 +1,4 @@
import { AnyEntry, DocumentID, EntryDoc, EntryHasPath, FilePath, FilePathWithPrefix } from "./lib/src/types";
import { type AnyEntry, type DocumentID, type EntryDoc, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "./lib/src/types";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import type ObsidianLiveSyncPlugin from "./main";
@@ -14,6 +14,9 @@ export abstract class LiveSyncCommands {
get localDatabase() {
return this.plugin.localDatabase;
}
get vaultAccess() {
return this.plugin.vaultAccess;
}
id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
return this.plugin.id2path(id, entry, stripPrefix);
}

View File

@@ -1,5 +1,6 @@
import { App, Modal } from "./deps";
import { logMessageStore } from "./lib/src/stores";
import type { ReactiveInstance, } from "./lib/src/reactive";
import { logMessages } from "./lib/src/stores";
import { escapeStringToHTML } from "./lib/src/strbin";
import ObsidianLiveSyncPlugin from "./main";
@@ -14,21 +15,23 @@ export class LogDisplayModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Sync status");
contentEl.empty();
contentEl.createEl("h2", { text: "Sync Status" });
const div = contentEl.createDiv("");
div.addClass("op-scrollable");
div.addClass("op-pre");
this.logEl = div;
this.unsubscribe = logMessageStore.observe((e) => {
function updateLog(logs: ReactiveInstance<string[]>) {
const e = logs.value;
let msg = "";
for (const v of e) {
msg += escapeStringToHTML(v) + "<br>";
}
this.logEl.innerHTML = msg;
})
logMessageStore.invalidate();
}
logMessages.onChanged(updateLog);
this.unsubscribe = () => logMessages.offChanged(updateLog);
}
onClose() {
const { contentEl } = this;

View File

@@ -1,26 +1,27 @@
<script lang="ts">
import { onDestroy, onMount } from "svelte";
import { logMessageStore } from "./lib/src/stores";
import { logMessages } from "./lib/src/stores";
import type { ReactiveInstance } from "./lib/src/reactive";
import { Logger } from "./lib/src/logger";
let unsubscribe: () => void;
let messages = [] as string[];
let wrapRight = false;
let autoScroll = true;
let suspended = false;
function updateLog(logs: ReactiveInstance<string[]>) {
const e = logs.value;
if (!suspended) {
messages = [...e];
setTimeout(() => {
if (scroll) scroll.scrollTop = scroll.scrollHeight;
}, 10);
}
}
onMount(async () => {
unsubscribe = logMessageStore.observe((e) => {
if (!suspended) {
messages = [...e];
if (autoScroll) {
if (scroll) scroll.scrollTop = scroll.scrollHeight;
}
}
});
logMessageStore.invalidate();
setTimeout(() => {
if (scroll) scroll.scrollTop = scroll.scrollHeight;
}, 100);
logMessages.onChanged(updateLog);
Logger("Log window opened");
unsubscribe = () => logMessages.offChanged(updateLog);
});
onDestroy(() => {
if (unsubscribe) unsubscribe();
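
Both the log modal and this pane now share the same subscribe/unsubscribe shape against the new `logMessages` reactive source. A sketch of that pattern, modelling only the `onChanged`/`offChanged`/`value` surface visible in the diff:

```ts
// Sketch of the subscribe/unsubscribe pattern used above.
// `ReactiveLike` only models the surface shown in the diff (onChanged/offChanged/value).
type ReactiveLike<T> = {
    value: T;
    onChanged(handler: (instance: ReactiveLike<T>) => void): void;
    offChanged(handler: (instance: ReactiveLike<T>) => void): void;
};

function observeLogs(logMessages: ReactiveLike<string[]>, render: (lines: string[]) => void): () => void {
    const handler = (instance: ReactiveLike<string[]>) => render(instance.value);
    logMessages.onChanged(handler);
    render(logMessages.value); // paint the current buffer once on mount
    // Return the disposer to be called from onClose / onDestroy.
    return () => logMessages.offChanged(handler);
}
```
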

View File

@@ -5,7 +5,7 @@ import {
import LogPaneComponent from "./LogPane.svelte";
import type ObsidianLiveSyncPlugin from "./main";
export const VIEW_TYPE_LOG = "log-log";
// Show notes as like scroll.
//Log view
export class LogPaneView extends ItemView {
component: LogPaneComponent;
@@ -32,6 +32,7 @@ export class LogPaneView extends ItemView {
return "Self-hosted LiveSync Log";
}
// eslint-disable-next-line require-await
async onOpen() {
this.component = new LogPaneComponent({
target: this.contentEl,
@@ -40,6 +41,7 @@ export class LogPaneView extends ItemView {
});
}
// eslint-disable-next-line require-await
async onClose() {
this.component.$destroy();
}

View File

@@ -1,10 +1,10 @@
import { App, PluginSettingTab, Setting, sanitizeHTMLToDom, TextAreaComponent, MarkdownRenderer, stringifyYaml } from "./deps";
import { DEFAULT_SETTINGS, LOG_LEVEL, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings, type FilePathWithPrefix, type HashAlgorithm, type DocumentID } from "./lib/src/types";
import { DEFAULT_SETTINGS, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings, type FilePathWithPrefix, type HashAlgorithm, type DocumentID, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, LOG_LEVEL_INFO } from "./lib/src/types";
import { delay } from "./lib/src/utils";
import { Semaphore } from "./lib/src/semaphore";
import { versionNumberString2Number } from "./lib/src/strbin";
import { Logger } from "./lib/src/logger";
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb.js";
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb";
import { testCrypt } from "./lib/src/e2ee_v2";
import ObsidianLiveSyncPlugin from "./main";
import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "./utils";
@@ -21,10 +21,10 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
async testConnection(): Promise<void> {
const db = await this.plugin.replicator.connectRemoteCouchDBWithSetting(this.plugin.settings, this.plugin.isMobile, true);
if (typeof db === "string") {
this.plugin.addLog(`could not connect to ${this.plugin.settings.couchDB_URI} : ${this.plugin.settings.couchDB_DBNAME} \n(${db})`, LOG_LEVEL.NOTICE);
this.plugin.addLog(`could not connect to ${this.plugin.settings.couchDB_URI} : ${this.plugin.settings.couchDB_DBNAME} \n(${db})`, LOG_LEVEL_NOTICE);
return;
}
this.plugin.addLog(`Connected to ${db.info.db_name}`, LOG_LEVEL.NOTICE);
this.plugin.addLog(`Connected to ${db.info.db_name}`, LOG_LEVEL_NOTICE);
}
display(): void {
const { containerEl } = this;
@@ -111,7 +111,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
MarkdownRenderer.renderMarkdown(updateInformation, informationDivEl, "/", this.plugin);
MarkdownRenderer.render(this.plugin.app, updateInformation, informationDivEl, "/", this.plugin);
addScreenElement("100", containerInformationEl);
@@ -120,6 +120,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
if (this.plugin.settings.periodicReplication) return true;
if (this.plugin.settings.syncOnFileOpen) return true;
if (this.plugin.settings.syncOnSave) return true;
if (this.plugin.settings.syncOnEditorSave) return true;
if (this.plugin.settings.syncOnStart) return true;
if (this.plugin.settings.syncAfterMerge) return true;
if (this.plugin.replicator.syncStatus == "CONNECTED") return true;
@@ -133,12 +134,13 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
new Setting(setupWizardEl)
.setName("Discard the existing configuration and set up")
.addButton((text) => {
// eslint-disable-next-line require-await
text.setButtonText("Next").onClick(async () => {
if (JSON.stringify(this.plugin.settings) != JSON.stringify(DEFAULT_SETTINGS)) {
this.plugin.replicator.closeReplication();
this.plugin.settings = { ...DEFAULT_SETTINGS };
this.plugin.saveSettings();
Logger("Configuration has been flushed, please open it again", LOG_LEVEL.NOTICE)
Logger("Configuration has been flushed, please open it again", LOG_LEVEL_NOTICE)
// @ts-ignore
this.plugin.app.setting.close()
} else {
@@ -156,6 +158,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
this.plugin.settings.liveSync = false;
this.plugin.settings.periodicReplication = false;
this.plugin.settings.syncOnSave = false;
this.plugin.settings.syncOnEditorSave = false;
this.plugin.settings.syncOnStart = false;
this.plugin.settings.syncOnFileOpen = false;
this.plugin.settings.syncAfterMerge = false;
@@ -215,7 +218,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
syncLive.forEach((e) => {
e.setDisabled(false).setTooltip("");
});
} else if (this.plugin.settings.syncOnFileOpen || this.plugin.settings.syncOnSave || this.plugin.settings.syncOnStart || this.plugin.settings.periodicReplication || this.plugin.settings.syncAfterMerge) {
} else if (this.plugin.settings.syncOnFileOpen || this.plugin.settings.syncOnSave || this.plugin.settings.syncOnEditorSave || this.plugin.settings.syncOnStart || this.plugin.settings.periodicReplication || this.plugin.settings.syncAfterMerge) {
syncNonLive.forEach((e) => {
e.setDisabled(false).setTooltip("");
});
@@ -300,46 +303,44 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
.setDisabled(false)
.onClick(async () => {
const checkConfig = async () => {
Logger(`Checking database configuration`, LOG_LEVEL_INFO);
const emptyDiv = createDiv();
emptyDiv.innerHTML = "<span></span>";
checkResultDiv.replaceChildren(...[emptyDiv]);
const addResult = (msg: string, classes?: string[]) => {
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
if (classes) {
tmpDiv.addClasses(classes);
}
tmpDiv.innerHTML = `${msg}`;
checkResultDiv.appendChild(tmpDiv);
};
try {
if (isCloudantURI(this.plugin.settings.couchDB_URI)) {
Logger("This feature cannot be used with IBM Cloudant.", LOG_LEVEL.NOTICE);
Logger("This feature cannot be used with IBM Cloudant.", LOG_LEVEL_NOTICE);
return;
}
const r = await requestToCouchDB(this.plugin.settings.couchDB_URI, this.plugin.settings.couchDB_USER, this.plugin.settings.couchDB_PASSWORD, window.origin);
Logger(JSON.stringify(r.json, null, 2));
const responseConfig = r.json;
const emptyDiv = createDiv();
emptyDiv.innerHTML = "<span></span>";
checkResultDiv.replaceChildren(...[emptyDiv]);
const addResult = (msg: string, classes?: string[]) => {
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
if (classes) {
tmpDiv.addClasses(classes);
}
tmpDiv.innerHTML = `${msg}`;
checkResultDiv.appendChild(tmpDiv);
};
const addConfigFixButton = (title: string, key: string, value: string) => {
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
tmpDiv.innerHTML = `<label>${title}</label><button>Fix</button>`;
const x = checkResultDiv.appendChild(tmpDiv);
x.querySelector("button").addEventListener("click", async () => {
console.dir({ key, value });
Logger(`CouchDB Configuration: ${title} -> Set ${key} to ${value}`)
const res = await requestToCouchDB(this.plugin.settings.couchDB_URI, this.plugin.settings.couchDB_USER, this.plugin.settings.couchDB_PASSWORD, undefined, key, value);
console.dir(res);
if (res.status == 200) {
Logger(`${title} successfully updated`, LOG_LEVEL.NOTICE);
Logger(`CouchDB Configuration: ${title} successfully updated`, LOG_LEVEL_NOTICE);
checkResultDiv.removeChild(x);
checkConfig();
} else {
Logger(`${title} failed`, LOG_LEVEL.NOTICE);
Logger(res.text);
Logger(`CouchDB Configuration: ${title} failed`, LOG_LEVEL_NOTICE);
Logger(res.text, LOG_LEVEL_VERBOSE);
}
});
};
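
Each `Fix` button above delegates to `requestToCouchDB` with a configuration key and value. Under the hood this maps to CouchDB's node-configuration endpoint; the following standalone sketch uses plain `fetch`, and the `_local` node alias plus the `section/key` layout are assumptions about CouchDB, not code taken from the plugin:

```ts
// Sketch: set one CouchDB configuration value, e.g. key = "chttpd/max_http_request_size".
// Assumes CouchDB 2.x or later, where `_local` resolves to the local node name.
async function setCouchDBConfig(baseURI: string, user: string, password: string, key: string, value: string): Promise<boolean> {
    const auth = "Basic " + btoa(`${user}:${password}`);
    const url = `${baseURI.replace(/\/$/, "")}/_node/_local/_config/${key}`;
    const res = await fetch(url, {
        method: "PUT",
        headers: { Authorization: auth, "Content-Type": "application/json" },
        // CouchDB expects the value itself to be a JSON-encoded string.
        body: JSON.stringify(value),
    });
    return res.status === 200;
}
```
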
@@ -349,7 +350,6 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
["ob-btn-config-info"]
);
addResult("Your configuration is dumped to Log", ["ob-btn-config-info"]);
addResult("--Config check--", ["ob-btn-config-head"]);
// Admin check
@@ -450,9 +450,16 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
addResult("--Done--", ["ob-btn-config-head"]);
addResult("If you have some trouble with Connection-check even though all Config-check has been passed, Please check your reverse proxy's configuration.", ["ob-btn-config-info"]);
Logger(`Checking configuration done`, LOG_LEVEL_INFO);
} catch (ex) {
Logger(`Checking configuration failed`, LOG_LEVEL.NOTICE);
Logger(ex);
if (ex?.status == 401) {
addResult(`❗ Access forbidden.`);
addResult(`We could not continue the test.`);
Logger(`Checking configuration done`, LOG_LEVEL_INFO);
} else {
Logger(`Checking configuration failed`, LOG_LEVEL_NOTICE);
Logger(ex);
}
}
};
await checkConfig();
@@ -609,25 +616,25 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
console.dir(settingForCheck);
const db = await this.plugin.replicator.connectRemoteCouchDBWithSetting(settingForCheck, this.plugin.isMobile, true);
if (typeof db === "string") {
Logger("Could not connect to the database.", LOG_LEVEL.NOTICE);
Logger("Could not connect to the database.", LOG_LEVEL_NOTICE);
return false;
} else {
if (await checkSyncInfo(db.db)) {
// Logger("Database connected", LOG_LEVEL.NOTICE);
// Logger("Database connected", LOG_LEVEL_NOTICE);
return true;
} else {
Logger("Failed to read remote database", LOG_LEVEL.NOTICE);
Logger("Failed to read remote database", LOG_LEVEL_NOTICE);
return false;
}
}
};
const applyEncryption = async (sendToServer: boolean) => {
if (encrypt && passphrase == "") {
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL.NOTICE);
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL_NOTICE);
return;
}
if (encrypt && !(await testCrypt())) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL.NOTICE);
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL_NOTICE);
return;
}
if (!(await checkWorkingPassphrase()) && !sendToServer) {
@@ -654,11 +661,11 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const rebuildDB = async (method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice") => {
if (encrypt && passphrase == "") {
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL.NOTICE);
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL_NOTICE);
return;
}
if (encrypt && !(await testCrypt())) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL.NOTICE);
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL_NOTICE);
return;
}
if (!encrypt) {
@@ -670,7 +677,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
this.plugin.settings.passphrase = passphrase;
this.plugin.settings.useDynamicIterationCount = useDynamicIterationCount;
this.plugin.settings.usePathObfuscation = usePathObfuscation;
Logger("All synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL.NOTICE)
Logger("All synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
await this.plugin.saveSettings();
updateE2EControls();
applyDisplayEnabled();
@@ -739,8 +746,20 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
toggle.setValue(this.plugin.settings.showStatusOnEditor).onChange(async (value) => {
this.plugin.settings.showStatusOnEditor = value;
await this.plugin.saveSettings();
this.display();
})
);
if (this.plugin.settings.showStatusOnEditor) {
new Setting(containerGeneralSettingsEl)
.setName("Show status as icons only")
.setDesc("")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.showOnlyIconsOnEditor).onChange(async (value) => {
this.plugin.settings.showOnlyIconsOnEditor = value;
await this.plugin.saveSettings();
})
);
}
containerGeneralSettingsEl.createEl("h4", { text: "Logging" });
new Setting(containerGeneralSettingsEl)
@@ -882,7 +901,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
.setCta()
.onClick(async () => {
if (currentPreset == "") {
Logger("Select any preset.", LOG_LEVEL.NOTICE);
Logger("Select any preset.", LOG_LEVEL_NOTICE);
return;
}
const presetAllDisabled = {
@@ -890,6 +909,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
liveSync: false,
periodicReplication: false,
syncOnSave: false,
syncOnEditorSave: false,
syncOnStart: false,
syncOnFileOpen: false,
syncAfterMerge: false,
@@ -903,6 +923,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
batchSave: true,
periodicReplication: true,
syncOnSave: false,
syncOnEditorSave: false,
syncOnStart: true,
syncOnFileOpen: true,
syncAfterMerge: true,
@@ -913,15 +934,15 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
...this.plugin.settings,
...presetLiveSync
}
Logger("Synchronization setting configured as LiveSync.", LOG_LEVEL.NOTICE);
Logger("Synchronization setting configured as LiveSync.", LOG_LEVEL_NOTICE);
} else if (currentPreset == "PERIODIC") {
this.plugin.settings = {
...this.plugin.settings,
...presetPeriodic
}
Logger("Synchronization setting configured as Periodic sync with batch database update.", LOG_LEVEL.NOTICE);
Logger("Synchronization setting configured as Periodic sync with batch database update.", LOG_LEVEL_NOTICE);
} else {
Logger("All synchronization disabled.", LOG_LEVEL.NOTICE);
Logger("All synchronization disabled.", LOG_LEVEL_NOTICE);
this.plugin.settings = {
...this.plugin.settings,
...presetAllDisabled
@@ -943,7 +964,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
await this.plugin.replicate(true);
Logger("All done! Please set up subsequent devices with 'Copy setup URI' and 'Open setup URI'.", LOG_LEVEL.NOTICE);
Logger("All done! Please set up subsequent devices with 'Copy setup URI' and 'Open setup URI'.", LOG_LEVEL_NOTICE);
// @ts-ignore
this.plugin.app.commands.executeCommandById("obsidian-livesync:livesync-copysetupuri")
}
@@ -1013,6 +1034,17 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
applyDisplayEnabled();
})
)
new Setting(containerSyncSettingEl)
.setName("Sync on Editor Save")
.setDesc("When you save file on the editor, sync automatically")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncOnEditorSave).onChange(async (value) => {
this.plugin.settings.syncOnEditorSave = value;
await this.plugin.saveSettings();
applyDisplayEnabled();
})
)
new Setting(containerSyncSettingEl)
.setName("Sync on File Open")
.setDesc("When you open file, sync automatically")
@@ -1084,7 +1116,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
);
new Setting(containerSyncSettingEl)
.setName("Postpone resolution of unopened files")
.setName("Postpone resolution of inactive files")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.checkConflictOnlyOnOpen).onChange(async (value) => {
@@ -1092,6 +1124,15 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
await this.plugin.saveSettings();
})
);
new Setting(containerSyncSettingEl)
.setName("Postpone manual resolution of inactive files")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.showMergeDialogOnlyOnActive).onChange(async (value) => {
this.plugin.settings.showMergeDialogOnlyOnActive = value;
await this.plugin.saveSettings();
})
);
containerSyncSettingEl.createEl("h4", { text: "Compatibility" }).addClass("wizardHidden");
new Setting(containerSyncSettingEl)
.setName("Always resolve conflict manually")
@@ -1167,26 +1208,27 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
});
}
new Setting(containerSyncSettingEl)
.setName("Scan for hidden files before replication")
.setDesc("This configuration will be ignored if monitoring changes is enabled.")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncInternalFilesBeforeReplication).onChange(async (value) => {
this.plugin.settings.syncInternalFilesBeforeReplication = value;
await this.plugin.saveSettings();
})
);
if (!this.plugin.settings.watchInternalFileChanges) {
new Setting(containerSyncSettingEl)
.setName("Scan for hidden files before replication")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncInternalFilesBeforeReplication).onChange(async (value) => {
this.plugin.settings.syncInternalFilesBeforeReplication = value;
await this.plugin.saveSettings();
})
);
}
new Setting(containerSyncSettingEl)
.setName("Scan hidden files periodically")
.setDesc("Seconds, 0 to disable. This configuration will be ignored if monitoring changes is enabled.")
.setDesc("Seconds, 0 to disable")
.setClass("wizardHidden")
.addText((text) => {
text.setPlaceholder("")
.setValue(this.plugin.settings.syncInternalFilesInterval + "")
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 10) {
if (v !== 0 && (isNaN(v) || v < 10)) {
v = 10;
}
this.plugin.settings.syncInternalFilesInterval = v;
@@ -1198,7 +1240,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const defaultSkipPattern = "\\/node_modules\\/, \\/\\.git\\/, \\/obsidian-livesync\\/";
const defaultSkipPatternXPlat = defaultSkipPattern + ",\\/workspace$ ,\\/workspace.json$";
new Setting(containerSyncSettingEl)
.setName("Skip patterns")
.setName("Folders and files to ignore")
.setDesc(
"Regular expression, If you use hidden file sync between desktop and mobile, adding `workspace$` is recommended."
)
@@ -1249,7 +1291,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
new Setting(containerSyncSettingEl)
.setName("Enhance chunk size")
.setDesc("Enhance chunk size for binary files (0.1MBytes). This cannot be increased when using IBM Cloudant.")
.setDesc("Enhance chunk size for binary files (Ratio). This cannot be increased when using IBM Cloudant.")
.setClass("wizardHidden")
.addText((text) => {
text.setPlaceholder("")
@@ -1257,7 +1299,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 1) {
v = 1;
v = 0;
}
this.plugin.settings.customChunkSize = v;
await this.plugin.saveSettings();
@@ -1330,7 +1372,38 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
return text;
}
);
new Setting(containerSyncSettingEl)
.setName("(Beta) Use ignore files")
.setDesc("If this is set, changes to local files which are matched by the ignore files will be skipped. Remote changes are determined using local ignore files.")
.setClass("wizardHidden")
.addToggle((toggle) => {
toggle
.setValue(this.plugin.settings.useIgnoreFiles)
.onChange(async (value) => {
this.plugin.settings.useIgnoreFiles = value;
await this.plugin.saveSettings();
this.display();
})
return toggle;
}
);
if (this.plugin.settings.useIgnoreFiles) {
new Setting(containerSyncSettingEl)
.setName("Ignore files")
.setDesc("We can use multiple ignore files, e.g.) `.gitignore, .dockerignore`")
.setClass("wizardHidden")
.addTextArea((text) => {
text
.setValue(this.plugin.settings.ignoreFiles)
.setPlaceholder(".gitignore, .dockerignore")
.onChange(async (value) => {
this.plugin.settings.ignoreFiles = value;
await this.plugin.saveSettings();
})
return text;
}
);
}
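
The `(Beta) Use ignore files` option skips local changes matched by `.gitignore`-style files listed in `Ignore files`. A hedged sketch of the general matching idea using the widely used `ignore` npm package, which may not be what the plugin itself relies on:

```ts
import ignore from "ignore"; // assumption: the well-known `ignore` package, not necessarily the plugin's matcher

// Build a matcher from the contents of one or more ignore files (e.g. .gitignore, .dockerignore).
function buildIgnoreMatcher(ignoreFileContents: string[]) {
    const ig = ignore();
    for (const content of ignoreFileContents) {
        // Feed each non-comment line as a pattern.
        ig.add(content.split(/\r?\n/).filter((line) => line && !line.startsWith("#")));
    }
    return (vaultRelativePath: string) => ig.ignores(vaultRelativePath);
}

// Usage sketch:
// const isIgnored = buildIgnoreMatcher([gitignoreText, dockerignoreText]);
// if (isIgnored("node_modules/foo/index.js")) { /* skip syncing this change */ }
```
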
containerSyncSettingEl.createEl("h4", {
text: sanitizeHTMLToDom(`Advanced settings`),
}).addClass("wizardHidden");
@@ -1461,14 +1534,16 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
pluginConfig.passphrase = REDACTED;
pluginConfig.encryptedPassphrase = REDACTED;
pluginConfig.encryptedCouchDBConnection = REDACTED;
pluginConfig.pluginSyncExtendedSetting = {};
const msgConfig = `----remote config----
${stringifyYaml(responseConfig)}
---- Plug-in config ---
version:${manifestVersion}
${stringifyYaml(pluginConfig)}`;
console.log(msgConfig);
await navigator.clipboard.writeText(msgConfig);
Logger(`Information has been copied to clipboard`, LOG_LEVEL.NOTICE);
Logger(`Information has been copied to clipboard`, LOG_LEVEL_NOTICE);
})
);
@@ -1521,11 +1596,11 @@ ${stringifyYaml(pluginConfig)}`;
Logger(`UPDATE DATABASE ${file.path}`);
await this.plugin.updateIntoDB(file, false, null, true);
i++;
Logger(`${i}/${files.length}\n${file.path}`, LOG_LEVEL.NOTICE, "verify");
Logger(`${i}/${files.length}\n${file.path}`, LOG_LEVEL_NOTICE, "verify");
} catch (ex) {
i++;
Logger(`Error while verifyAndRepair`, LOG_LEVEL.NOTICE);
Logger(`Error while verifyAndRepair`, LOG_LEVEL_NOTICE);
Logger(ex);
} finally {
releaser();
@@ -1533,7 +1608,7 @@ ${stringifyYaml(pluginConfig)}`;
}
)(e));
await Promise.all(processes);
Logger("done", LOG_LEVEL.NOTICE, "verify");
Logger("done", LOG_LEVEL_NOTICE, "verify");
})
);
new Setting(containerHatchEl)
@@ -1573,36 +1648,52 @@ ${stringifyYaml(pluginConfig)}`;
}
const ret = await this.plugin.localDatabase.putRaw(newDoc, { force: true });
if (ret.ok) {
Logger(`${docName} has been converted as conflicted document`, LOG_LEVEL.NOTICE);
Logger(`${docName} has been converted as conflicted document`, LOG_LEVEL_NOTICE);
doc._deleted = true;
if ((await this.plugin.localDatabase.putRaw(doc)).ok) {
Logger(`Old ${docName} has been deleted`, LOG_LEVEL.NOTICE);
Logger(`Old ${docName} has been deleted`, LOG_LEVEL_NOTICE);
}
await this.plugin.showIfConflicted(docName as FilePathWithPrefix);
await this.plugin.queueConflictCheck(docName as FilePathWithPrefix);
} else {
Logger(`Converting ${docName} Failed!`, LOG_LEVEL.NOTICE);
Logger(ret, LOG_LEVEL.VERBOSE);
Logger(`Converting ${docName} Failed!`, LOG_LEVEL_NOTICE);
Logger(ret, LOG_LEVEL_VERBOSE);
}
} catch (ex) {
if (ex?.status == 404) {
// We can perform this safely
if ((await this.plugin.localDatabase.putRaw(newDoc)).ok) {
Logger(`${docName} has been converted`, LOG_LEVEL.NOTICE);
Logger(`${docName} has been converted`, LOG_LEVEL_NOTICE);
doc._deleted = true;
if ((await this.plugin.localDatabase.putRaw(doc)).ok) {
Logger(`Old ${docName} has been deleted`, LOG_LEVEL.NOTICE);
Logger(`Old ${docName} has been deleted`, LOG_LEVEL_NOTICE);
}
}
} else {
Logger(`Something went wrong on converting ${docName}`, LOG_LEVEL.NOTICE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(`Something went wrong on converting ${docName}`, LOG_LEVEL_NOTICE);
Logger(ex, LOG_LEVEL_VERBOSE);
// Something wrong.
}
}
}
}
Logger(`Converting finished`, LOG_LEVEL.NOTICE);
Logger(`Converting finished`, LOG_LEVEL_NOTICE);
}));
new Setting(containerHatchEl)
.setName("Delete all customization sync data")
.addButton((button) =>
button
.setButtonText("Delete")
.setDisabled(false)
.setWarning()
.onClick(async () => {
Logger(`Deleting customization sync data`, LOG_LEVEL_NOTICE);
const entriesToDelete = (await this.plugin.localDatabase.allDocsRaw({ startkey: "ix:", endkey: "ix:\u{10ffff}", include_docs: true }));
const newData = entriesToDelete.rows.map(e => ({ ...e.doc, _deleted: true }));
const r = await this.plugin.localDatabase.bulkDocsRaw(newData as any[]);
// Do not care about the result.
Logger(`${r.length} items have been removed, to confirm how many items are left, please perform it again.`, LOG_LEVEL_NOTICE);
}))
new Setting(containerHatchEl)
.setName("Suspend file watching")
.setDesc("Stop watching for file change.")
@@ -1727,12 +1818,12 @@ ${stringifyYaml(pluginConfig)}`;
button.setButtonText("Change")
.onClick(async () => {
if (this.plugin.settings.additionalSuffixOfDatabaseName == newDatabaseName) {
Logger("Suffix was not changed.", LOG_LEVEL.NOTICE);
Logger("Suffix was not changed.", LOG_LEVEL_NOTICE);
return;
}
this.plugin.settings.additionalSuffixOfDatabaseName = newDatabaseName;
await this.plugin.saveSettings();
Logger("Suffix has been changed. Reopening database...", LOG_LEVEL.NOTICE);
Logger("Suffix has been changed. Reopening database...", LOG_LEVEL_NOTICE);
await this.plugin.initializeDatabase();
})
})
@@ -1743,10 +1834,10 @@ ${stringifyYaml(pluginConfig)}`;
.setClass("wizardHidden")
.addDropdown((dropdown) =>
dropdown
.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)" } as Record<HashAlgorithm, string>)
.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)", "sha1": "Fallback (Without WebAssembly)" } as Record<HashAlgorithm, string>)
.setValue(this.plugin.settings.hashAlg)
.onChange(async (value: HashAlgorithm) => {
this.plugin.settings.hashAlg = value;
.onChange(async (value) => {
this.plugin.settings.hashAlg = value as HashAlgorithm;
await this.plugin.saveSettings();
})
)
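
The new `sha1` option is offered as a fallback for environments without WebAssembly, where the xxhash variants are unavailable. A hedged sketch of such a fallback using the Web Crypto API (not the plugin's actual hashing code); `hashWithXxhash` in the usage note is hypothetical:

```ts
// Sketch: hash a chunk of text with SHA-1 via Web Crypto, for platforms without WebAssembly.
async function sha1Hex(text: string): Promise<string> {
    const data = new TextEncoder().encode(text);
    const digest = await crypto.subtle.digest("SHA-1", data);
    return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
}

// Usage sketch: pick the hasher based on the configured algorithm.
// const hash = settings.hashAlg === "sha1" ? await sha1Hex(chunk) : hashWithXxhash(chunk);
```
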
@@ -1786,11 +1877,11 @@ ${stringifyYaml(pluginConfig)}`;
vaultName.setDisabled(this.plugin.settings.usePluginSync);
// vaultName.setTooltip(this.plugin.settings.autoSweepPlugins || this.plugin.settings.autoSweepPluginsPeriodic ? "You could not change when you enabling auto scan." : "");
};
updateDisabledOfDeviceAndVaultName
updateDisabledOfDeviceAndVaultName();
new Setting(containerPluginSettings).setName("Enable customization sync").addToggle((toggle) =>
toggle.setValue(this.plugin.settings.usePluginSync).onChange(async (value) => {
if (value && this.plugin.deviceAndVaultName.trim() == "") {
Logger("We have to configure `Device name` to use this feature.", LOG_LEVEL.NOTICE);
Logger("We have to configure `Device name` to use this feature.", LOG_LEVEL_NOTICE);
toggle.setValue(false);
return false;
}
@@ -1812,16 +1903,18 @@ ${stringifyYaml(pluginConfig)}`;
})
);
new Setting(containerPluginSettings)
.setName("Scan customization periodically")
.setDesc("Scan customization every 1 minute. This configuration will be ignored if monitoring changes of hidden files has been enabled.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.autoSweepPluginsPeriodic).onChange(async (value) => {
this.plugin.settings.autoSweepPluginsPeriodic = value;
updateDisabledOfDeviceAndVaultName();
await this.plugin.saveSettings();
})
);
if (!this.plugin.settings.watchInternalFileChanges) {
new Setting(containerPluginSettings)
.setName("Scan customization periodically")
.setDesc("Scan customization every 1 minute.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.autoSweepPluginsPeriodic).onChange(async (value) => {
this.plugin.settings.autoSweepPluginsPeriodic = value;
updateDisabledOfDeviceAndVaultName();
await this.plugin.saveSettings();
})
);
}
new Setting(containerPluginSettings)
.setName("Notify customized")

View File

@@ -2,7 +2,7 @@
import type { PluginDataExDisplay } from "./CmdConfigSync";
import { Logger } from "./lib/src/logger";
import { versionNumberString2Number } from "./lib/src/strbin";
import { type FilePath, LOG_LEVEL } from "./lib/src/types";
import { type FilePath, LOG_LEVEL_NOTICE } from "./lib/src/types";
import { getDocData } from "./lib/src/utils";
import type ObsidianLiveSyncPlugin from "./main";
import { askString, scheduleTask } from "./utils";
@@ -108,7 +108,7 @@
}
}
})
.reduce((p, c) => p | c, 0);
.reduce((p, c) => p | (c as number), 0 as number);
if (matchingStatus == 0b0000100) {
equivalency = "⚖️ Same";
canApply = false;
@@ -207,21 +207,21 @@
const local = list.find((e) => e.term == thisTerm);
const selectedItem = list.find((e) => e.term == selected);
if (selectedItem && (await applyData(selectedItem))) {
scheduleTask("update-plugin-list", 250, () => addOn.updatePluginList(true, local.documentPath));
addOn.updatePluginList(true, local?.documentPath);
}
}
async function compareSelected() {
const local = list.find((e) => e.term == thisTerm);
const selectedItem = list.find((e) => e.term == selected);
if (local && selectedItem && (await compareData(local, selectedItem))) {
scheduleTask("update-plugin-list", 250, () => addOn.updatePluginList(true, local.documentPath));
addOn.updatePluginList(true, local.documentPath);
}
}
async function deleteSelected() {
const selectedItem = list.find((e) => e.term == selected);
// const deletedPath = selectedItem.documentPath;
if (selectedItem && (await deleteData(selectedItem))) {
scheduleTask("update-plugin-list", 250, () => addOn.reloadPluginList(true));
addOn.reloadPluginList(true);
}
}
async function duplicateItem() {
@@ -229,7 +229,7 @@
const duplicateTermName = await askString(plugin.app, "Duplicate", "device name", "");
if (duplicateTermName) {
if (duplicateTermName.contains("/")) {
Logger(`We can not use "/" to the device name`, LOG_LEVEL.NOTICE);
Logger(`We can not use "/" to the device name`, LOG_LEVEL_NOTICE);
return;
}
const key = `${plugin.app.vault.configDir}/${local.files[0].filename}`;
@@ -274,7 +274,7 @@
{/if}
{:else}
<span class="spacer" />
<span class="message even">All devices are even</span>
<span class="message even">All the same or non-existent</span>
<button disabled />
<button disabled />
{/if}

View File

@@ -3,9 +3,13 @@
import ObsidianLiveSyncPlugin from "./main";
import { type PluginDataExDisplay, pluginIsEnumerating, pluginList } from "./CmdConfigSync";
import PluginCombo from "./PluginCombo.svelte";
import { Menu } from "obsidian";
import { unique } from "./lib/src/utils";
import { MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED, type SYNC_MODE, type PluginSyncSettingEntry } from "./lib/src/types";
import { normalizePath } from "./deps";
export let plugin: ObsidianLiveSyncPlugin;
$: hideNotApplicable = true;
$: hideNotApplicable = false;
$: thisTerm = plugin.deviceAndVaultName;
const addOn = plugin.addOnConfigSync;
@@ -13,7 +17,7 @@
let list: PluginDataExDisplay[] = [];
let selectNewestPulse = 0;
let hideEven = true;
let hideEven = false;
let loading = false;
let applyAllPluse = 0;
let isMaintenanceMode = false;
@@ -80,6 +84,54 @@
async function deleteData(data: PluginDataExDisplay): Promise<boolean> {
return await addOn.deleteData(data);
}
function askMode(evt: MouseEvent, title: string, key: string) {
const menu = new Menu();
menu.addItem((item) => item.setTitle(title).setIsLabel(true));
menu.addSeparator();
const prevMode = automaticList.get(key) ?? MODE_SELECTIVE;
for (const mode of [MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED]) {
menu.addItem((item) => {
item.setTitle(`${getIcon(mode as SYNC_MODE)}:${TITLES[mode]}`)
.onClick((e) => {
if (mode === MODE_AUTOMATIC) {
askOverwriteModeForAutomatic(evt, key);
} else {
setMode(key, mode as SYNC_MODE);
}
})
.setChecked(prevMode == mode)
.setDisabled(prevMode == mode);
});
}
menu.showAtMouseEvent(evt);
}
function applyAutomaticSync(key: string, direction: "pushForce" | "pullForce" | "safe") {
setMode(key, MODE_AUTOMATIC);
const configDir = normalizePath(plugin.app.vault.configDir);
const files = (plugin.settings.pluginSyncExtendedSetting[key]?.files ?? []).map((e) => `${configDir}/${e}`);
plugin.addOnHiddenFileSync.syncInternalFilesAndDatabase(direction, true, false, files);
}
function askOverwriteModeForAutomatic(evt: MouseEvent, key: string) {
const menu = new Menu();
menu.addItem((item) => item.setTitle("Initial Action").setIsLabel(true));
menu.addSeparator();
menu.addItem((item) => {
item.setTitle(`↑: Overwrite Remote`).onClick((e) => {
applyAutomaticSync(key, "pushForce");
});
})
.addItem((item) => {
item.setTitle(`↓: Overwrite Local`).onClick((e) => {
applyAutomaticSync(key, "pullForce");
});
})
.addItem((item) => {
item.setTitle(`⇅: Use newer`).onClick((e) => {
applyAutomaticSync(key, "safe");
});
});
menu.showAtMouseEvent(evt);
}
$: options = {
thisTerm,
@@ -92,11 +144,84 @@
plugin,
isMaintenanceMode,
};
const ICON_EMOJI_PAUSED = `⛔`;
const ICON_EMOJI_AUTOMATIC = `✨`;
const ICON_EMOJI_SELECTIVE = `🔀`;
const ICONS: { [key: number]: string } = {
[MODE_SELECTIVE]: ICON_EMOJI_SELECTIVE,
[MODE_PAUSED]: ICON_EMOJI_PAUSED,
[MODE_AUTOMATIC]: ICON_EMOJI_AUTOMATIC,
};
const TITLES: { [key: number]: string } = {
[MODE_SELECTIVE]: "Selective",
[MODE_PAUSED]: "Ignore",
[MODE_AUTOMATIC]: "Automatic",
};
const PREFIX_PLUGIN_ALL = "PLUGIN_ALL";
const PREFIX_PLUGIN_DATA = "PLUGIN_DATA";
const PREFIX_PLUGIN_MAIN = "PLUGIN_MAIN";
function setMode(key: string, mode: SYNC_MODE) {
if (key.startsWith(PREFIX_PLUGIN_ALL + "/")) {
setMode(PREFIX_PLUGIN_DATA + key.substring(PREFIX_PLUGIN_ALL.length), mode);
setMode(PREFIX_PLUGIN_MAIN + key.substring(PREFIX_PLUGIN_ALL.length), mode);
}
const files = unique(
list
.filter((e) => `${e.category}/${e.name}` == key)
.map((e) => e.files)
.flat()
.map((e) => e.filename)
);
automaticList.set(key, mode);
automaticListDisp = automaticList;
if (!(key in plugin.settings.pluginSyncExtendedSetting)) {
plugin.settings.pluginSyncExtendedSetting[key] = {
key,
mode,
files: [],
};
}
plugin.settings.pluginSyncExtendedSetting[key].files = files;
plugin.settings.pluginSyncExtendedSetting[key].mode = mode;
plugin.saveSettingData();
}
function getIcon(mode: SYNC_MODE) {
if (mode in ICONS) {
return ICONS[mode];
} else {
("");
}
}
let automaticList = new Map<string, SYNC_MODE>();
let automaticListDisp = new Map<string, SYNC_MODE>();
// apply current configuration to the dialogue
for (const { key, mode } of Object.values(plugin.settings.pluginSyncExtendedSetting)) {
automaticList.set(key, mode);
}
automaticListDisp = automaticList;
let displayKeys: Record<string, string[]> = {};
$: {
const extraKeys = Object.keys(plugin.settings.pluginSyncExtendedSetting);
displayKeys = [
...list,
...extraKeys
.map((e) => `${e}///`.split("/"))
.filter((e) => e[0] && e[1])
.map((e) => ({ category: e[0], name: e[1], displayName: e[1] })),
]
.sort((a, b) => (a.displayName ?? a.name).localeCompare(b.displayName ?? b.name))
.reduce((p, c) => ({ ...p, [c.category]: unique(c.category in p ? [...p[c.category], c.displayName ?? c.name] : [c.displayName ?? c.name]) }), {} as Record<string, string[]>);
}
</script>
<div>
<div>
<h1>Customization sync</h1>
<div class="buttons">
<button on:click={() => scanAgain()}>Scan changes</button>
<button on:click={() => replicate()}>Sync once</button>
@@ -119,15 +244,24 @@
{#if list.length == 0}
<div class="center">No Items.</div>
{:else}
{#each Object.entries(displays) as [key, label]}
{#each Object.entries(displays).filter(([key, _]) => key in displayKeys) as [key, label]}
<div>
<h3>{label}</h3>
{#each groupBy(filterList(list, [key]), "name") as [name, listX]}
{#each displayKeys[key] as name}
{@const bindKey = `${key}/${name}`}
{@const mode = automaticListDisp.get(bindKey) ?? MODE_SELECTIVE}
<div class="labelrow {hideEven ? 'hideeven' : ''}">
<div class="title">
{name}
<button class="status" on:click={(evt) => askMode(evt, `${key}/${name}`, bindKey)}>
{getIcon(mode)}
</button>
<span class="name">{name}</span>
</div>
<PluginCombo {...options} list={listX} hidden={false} />
{#if mode == MODE_SELECTIVE}
<PluginCombo {...options} list={list.filter((e) => e.category == key && e.name == name)} hidden={false} />
{:else}
<div class="statusnote">{TITLES[mode]}</div>
{/if}
</div>
{/each}
</div>
@@ -135,20 +269,55 @@
<div>
<h3>Plugins</h3>
{#each groupBy(filterList(list, ["PLUGIN_MAIN", "PLUGIN_DATA", "PLUGIN_ETC"]), "name") as [name, listX]}
{@const bindKeyAll = `${PREFIX_PLUGIN_ALL}/${name}`}
{@const modeAll = automaticListDisp.get(bindKeyAll) ?? MODE_SELECTIVE}
{@const bindKeyMain = `${PREFIX_PLUGIN_MAIN}/${name}`}
{@const modeMain = automaticListDisp.get(bindKeyMain) ?? MODE_SELECTIVE}
{@const bindKeyData = `${PREFIX_PLUGIN_DATA}/${name}`}
{@const modeData = automaticListDisp.get(bindKeyData) ?? MODE_SELECTIVE}
<div class="labelrow {hideEven ? 'hideeven' : ''}">
<div class="title">
{name}
<button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_ALL}/${name}`, bindKeyAll)}>
{getIcon(modeAll)}
</button>
<span class="name">{name}</span>
</div>
<PluginCombo {...options} list={listX} hidden={true} />
</div>
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">Main</div>
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_MAIN"])} hidden={false} />
</div>
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">Data</div>
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_DATA"])} hidden={false} />
{#if modeAll == MODE_SELECTIVE}
<PluginCombo {...options} list={listX} hidden={true} />
{/if}
</div>
{#if modeAll == MODE_SELECTIVE}
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">
<button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_MAIN}/${name}/MAIN`, bindKeyMain)}>
{getIcon(modeMain)}
</button>
<span class="name">MAIN</span>
</div>
{#if modeMain == MODE_SELECTIVE}
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_MAIN"])} hidden={false} />
{:else}
<div class="statusnote">{TITLES[modeMain]}</div>
{/if}
</div>
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">
<button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_DATA}/${name}`, bindKeyData)}>
{getIcon(modeData)}
</button>
<span class="name">DATA</span>
</div>
{#if modeData == MODE_SELECTIVE}
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_DATA"])} hidden={false} />
{:else}
<div class="statusnote">{TITLES[modeData]}</div>
{/if}
</div>
{:else}
<div class="noterow">
<div class="statusnote">{TITLES[modeAll]}</div>
</div>
{/if}
{/each}
</div>
{/if}
@@ -162,6 +331,15 @@
</div>
<style>
span.spacer {
min-width: 1px;
flex-grow: 1;
}
h3 {
position: sticky;
top: 0;
background-color: var(--modal-background);
}
.labelrow {
margin-left: 0.4em;
display: flex;
@@ -183,6 +361,24 @@
.labelrow.hideeven:has(.even) {
display: none;
}
.noterow {
min-height: 2em;
display: flex;
}
button.status {
flex-grow: 0;
margin: 2px 4px;
min-width: 3em;
max-width: 4em;
}
.statusnote {
display: flex;
justify-content: flex-end;
padding-right: var(--size-4-12);
align-items: center;
min-width: 10em;
flex-grow: 1;
}
.title {
color: var(--text-normal);

src/SerializedFileAccess.ts (new file, 132 lines)

View File

@@ -0,0 +1,132 @@
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "./deps";
import { serialized } from "./lib/src/lock";
import type { FilePath } from "./lib/src/types";
import { createBinaryBlob, isDocContentSame } from "./lib/src/utils";
function getFileLockKey(file: TFile | TFolder | string) {
return `fl:${typeof (file) == "string" ? file : file.path}`;
}
function toArrayBuffer(arr: Uint8Array | ArrayBuffer | DataView): ArrayBufferLike {
if (arr instanceof Uint8Array) {
return arr.buffer;
}
if (arr instanceof DataView) {
return arr.buffer;
}
return arr;
}
export class SerializedFileAccess {
app: App
constructor(app: App) {
this.app = app;
}
async adapterStat(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.stat(path));
}
async adapterExists(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.exists(path));
}
async adapterRemove(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.remove(path));
}
async adapterRead(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.read(path));
}
async adapterReadBinary(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.readBinary(path));
}
async adapterWrite(file: TFile | string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
const path = file instanceof TFile ? file.path : file;
if (typeof (data) === "string") {
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.write(path, data, options));
} else {
return await serialized(getFileLockKey(path), () => this.app.vault.adapter.writeBinary(path, toArrayBuffer(data), options));
}
}
async vaultCacheRead(file: TFile) {
return await serialized(getFileLockKey(file), () => this.app.vault.cachedRead(file));
}
async vaultRead(file: TFile) {
return await serialized(getFileLockKey(file), () => this.app.vault.read(file));
}
async vaultReadBinary(file: TFile) {
return await serialized(getFileLockKey(file), () => this.app.vault.readBinary(file));
}
async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
if (typeof (data) === "string") {
return await serialized(getFileLockKey(file), async () => {
const oldData = await this.app.vault.read(file);
if (data === oldData) return false
await this.app.vault.modify(file, data, options)
return true;
}
);
} else {
return await serialized(getFileLockKey(file), async () => {
const oldData = await this.app.vault.readBinary(file);
if (await isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
return false;
}
await this.app.vault.modifyBinary(file, toArrayBuffer(data), options)
return true;
});
}
}
async vaultCreate(path: string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions): Promise<TFile> {
if (typeof (data) === "string") {
return await serialized(getFileLockKey(path), () => this.app.vault.create(path, data, options));
} else {
return await serialized(getFileLockKey(path), () => this.app.vault.createBinary(path, toArrayBuffer(data), options));
}
}
async delete(file: TFile | TFolder, force = false) {
return await serialized(getFileLockKey(file), () => this.app.vault.delete(file, force));
}
async trash(file: TFile | TFolder, force = false) {
return await serialized(getFileLockKey(file), () => this.app.vault.trash(file, force));
}
getAbstractFileByPath(path: FilePath | string): TAbstractFile | null {
// Disabled temporary.
return this.app.vault.getAbstractFileByPath(path);
// // Hidden API but so useful.
// // @ts-ignore
// if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
// // @ts-ignore
// return app.vault.getAbstractFileByPathInsensitive(path);
// } else {
// return app.vault.getAbstractFileByPath(path);
// }
}
touchedFiles: string[] = [];
touch(file: TFile | FilePath) {
const f = file instanceof TFile ? file : this.getAbstractFileByPath(file) as TFile;
const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
this.touchedFiles.unshift(key);
this.touchedFiles = this.touchedFiles.slice(0, 100);
}
recentlyTouched(file: TFile) {
const key = `${file.path}-${file.stat.mtime}-${file.stat.size}`;
if (this.touchedFiles.indexOf(key) == -1) return false;
return true;
}
clearTouched() {
this.touchedFiles = [];
}
}
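
`SerializedFileAccess` funnels every storage operation for a given path through `serialized(getFileLockKey(...), ...)`. A minimal sketch of what a per-key serialisation primitive can look like; the plugin's real `serialized` lives in `lib/src/lock` and may differ:

```ts
// Sketch: run tasks for the same key strictly one after another; different keys run independently.
const tails = new Map<string, Promise<unknown>>();

export function serializedSketch<T>(key: string, task: () => Promise<T>): Promise<T> {
    const prev = tails.get(key) ?? Promise.resolve();
    // Chain after the previous task for this key, even if it failed.
    const next = prev.catch(() => undefined).then(() => task());
    // Remember the new tail, and forget it once it settles if nothing newer was queued behind it.
    const tail = next.then(() => undefined, () => undefined);
    tails.set(key, tail);
    void tail.finally(() => {
        if (tails.get(key) === tail) tails.delete(key);
    });
    return next;
}

// Usage sketch, mirroring adapterWrite above (path is a vault-relative file path):
// await serializedSketch(`fl:${path}`, () => app.vault.adapter.write(path, data));
```
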

View File

@@ -1,46 +1,46 @@
import { Plugin_2, TAbstractFile, TFile, TFolder } from "./deps";
import type { SerializedFileAccess } from "./SerializedFileAccess";
import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
import { getGlobalStore } from "./lib/src/store";
import { FilePath, ObsidianLiveSyncSettings } from "./lib/src/types";
import { FileEventItem, FileEventType, FileInfo, InternalFileInfo, queueItem } from "./types";
import { recentlyTouched } from "./utils";
import type { KeyedQueueProcessor } from "./lib/src/processor";
import { type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "./types";
export abstract class StorageEventManager {
abstract fetchEvent(): FileEventItem | false;
abstract cancelRelativeEvent(item: FileEventItem): void;
abstract getQueueLength(): number;
abstract beginWatch(): void;
}
type LiveSyncForStorageEventManager = Plugin_2 &
type LiveSyncForStorageEventManager = Plugin &
{
settings: ObsidianLiveSyncSettings
ignoreFiles: string[],
vaultAccess: SerializedFileAccess
} & {
isTargetFile: (file: string | TAbstractFile) => boolean,
procFileEvent: (applyBatch?: boolean) => Promise<boolean>
isTargetFile: (file: string | TAbstractFile) => Promise<boolean>,
fileEventQueue: KeyedQueueProcessor<FileEventItem, any>
};
export class StorageEventManagerObsidian extends StorageEventManager {
plugin: LiveSyncForStorageEventManager;
queuedFilesStore = getGlobalStore("queuedFiles", { queuedItems: [] as queueItem[], fileEventItems: [] as FileEventItem[] });
watchedFileEventQueue = [] as FileEventItem[];
constructor(plugin: LiveSyncForStorageEventManager) {
super();
this.plugin = plugin;
}
beginWatch() {
const plugin = this.plugin;
this.watchVaultChange = this.watchVaultChange.bind(this);
this.watchVaultCreate = this.watchVaultCreate.bind(this);
this.watchVaultDelete = this.watchVaultDelete.bind(this);
this.watchVaultRename = this.watchVaultRename.bind(this);
this.watchVaultRawEvents = this.watchVaultRawEvents.bind(this);
plugin.registerEvent(app.vault.on("modify", this.watchVaultChange));
plugin.registerEvent(app.vault.on("delete", this.watchVaultDelete));
plugin.registerEvent(app.vault.on("rename", this.watchVaultRename));
plugin.registerEvent(app.vault.on("create", this.watchVaultCreate));
plugin.registerEvent(plugin.app.vault.on("modify", this.watchVaultChange));
plugin.registerEvent(plugin.app.vault.on("delete", this.watchVaultDelete));
plugin.registerEvent(plugin.app.vault.on("rename", this.watchVaultRename));
plugin.registerEvent(plugin.app.vault.on("create", this.watchVaultCreate));
//@ts-ignore : Internal API
plugin.registerEvent(app.vault.on("raw", this.watchVaultRawEvents));
plugin.registerEvent(plugin.app.vault.on("raw", this.watchVaultRawEvents));
plugin.fileEventQueue.startPipeline();
}
watchVaultCreate(file: TAbstractFile, ctx?: any) {
@@ -64,9 +64,18 @@ export class StorageEventManagerObsidian extends StorageEventManager {
}
// Watch raw events (Internal API)
watchVaultRawEvents(path: FilePath) {
if (this.plugin.settings.useIgnoreFiles && this.plugin.ignoreFiles.some(e => path.endsWith(e.trim()))) {
// If it is one of ignore files, refresh the cached one.
this.plugin.isTargetFile(path).then(() => this._watchVaultRawEvents(path));
} else {
this._watchVaultRawEvents(path);
}
}
_watchVaultRawEvents(path: FilePath) {
if (!this.plugin.settings.syncInternalFiles && !this.plugin.settings.usePluginSync) return;
if (!this.plugin.settings.watchInternalFileChanges) return;
if (!path.startsWith(app.vault.configDir)) return;
if (!path.startsWith(this.plugin.app.vault.configDir)) return;
const ignorePatterns = this.plugin.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
@@ -77,10 +86,8 @@ export class StorageEventManagerObsidian extends StorageEventManager {
file: { path, mtime: 0, ctime: 0, size: 0 }
}], null);
}
// Cache file and waiting to can be proceed.
async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
let forcePerform = false;
for (const param of params) {
if (shouldBeIgnored(param.file.path)) {
continue;
@@ -90,50 +97,22 @@ export class StorageEventManagerObsidian extends StorageEventManager {
const file = param.file;
const oldPath = param.oldPath;
if (file instanceof TFolder) continue;
if (!this.plugin.isTargetFile(file.path)) continue;
if (!await this.plugin.isTargetFile(file.path)) continue;
if (this.plugin.settings.suspendFileWatching) continue;
let cache: null | string | ArrayBuffer;
// new file or something changed, cache the changes.
if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
if (recentlyTouched(file)) {
if (this.plugin.vaultAccess.recentlyTouched(file)) {
continue;
}
if (!isPlainText(file.name)) {
cache = await app.vault.readBinary(file);
cache = await this.plugin.vaultAccess.vaultReadBinary(file);
} else {
// cache = await this.app.vault.read(file);
cache = await app.vault.cachedRead(file);
if (!cache) cache = await app.vault.read(file);
cache = await this.plugin.vaultAccess.vaultCacheRead(file);
if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
}
}
if (type == "DELETE" || type == "RENAME") {
forcePerform = true;
}
if (this.plugin.settings.batchSave && !this.plugin.settings.liveSync) {
// If an older queued event for the same file has the same type, drop the older one.
// a.md MODIFY <- this should be cancelled when a.md is MODIFIED again
// b.md MODIFY <- this should be cancelled when b.md is MODIFIED again
// a.md MODIFY
// a.md CREATE
// :
let i = this.watchedFileEventQueue.length;
L1:
while (i >= 0) {
i--;
if (i < 0) break L1;
if (this.watchedFileEventQueue[i].args.file.path != file.path) {
continue L1;
}
if (this.watchedFileEventQueue[i].type != type) break L1;
this.watchedFileEventQueue.remove(this.watchedFileEventQueue[i]);
//this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
}
}
const fileInfo = file instanceof TFile ? {
ctime: file.stat.ctime,
mtime: file.stat.mtime,
@@ -141,7 +120,8 @@ export class StorageEventManagerObsidian extends StorageEventManager {
path: file.path,
size: file.stat.size
} as FileInfo : file as InternalFileInfo;
this.watchedFileEventQueue.push({
this.plugin.fileEventQueue.enqueueWithKey(`file-${fileInfo.path}`, {
type,
args: {
file: fileInfo,
@@ -152,21 +132,5 @@ export class StorageEventManagerObsidian extends StorageEventManager {
key: atomicKey
})
}
// this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
this.plugin.procFileEvent(forcePerform);
}
fetchEvent(): FileEventItem | false {
if (this.watchedFileEventQueue.length == 0) return false;
const item = this.watchedFileEventQueue.shift();
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
return item;
}
cancelRelativeEvent(item: FileEventItem) {
this.watchedFileEventQueue = [...this.watchedFileEventQueue].filter(e => e.key != item.key);
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
}
getQueueLength() {
return this.watchedFileEventQueue.length;
}
}


@@ -1,10 +1,10 @@
import { type FilePath } from "./lib/src/types";
export {
addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, Plugin_2, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
parseYaml, ItemView, WorkspaceLeaf
} from "obsidian";
export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse } from "obsidian";
export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse, MarkdownFileInfo } from "obsidian";
import {
normalizePath as normalizePath_
} from "obsidian";


@@ -20,6 +20,7 @@ export class PluginDialogModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Customization Sync (Beta2)")
if (this.component == null) {
this.component = new PluginPane({
target: contentEl,
@@ -38,7 +39,7 @@ export class PluginDialogModal extends Modal {
export class InputStringDialog extends Modal {
result: string | false = false;
onSubmit: (result: string | boolean) => void;
onSubmit: (result: string | false) => void;
title: string;
key: string;
placeholder: string;
@@ -56,10 +57,8 @@ export class InputStringDialog extends Modal {
onOpen() {
const { contentEl } = this;
contentEl.createEl("h1", { text: this.title });
// For enter to submit
const formEl = contentEl.createEl("form");
this.titleEl.setText(this.title);
const formEl = contentEl.createDiv();
new Setting(formEl).setName(this.key).setClass(this.isPassword ? "password-input" : "normal-input").addText((text) =>
text.onChange((value) => {
this.result = value;
@@ -144,7 +143,7 @@ export class MessageBox extends Modal {
timer: ReturnType<typeof setInterval> = undefined;
defaultButtonComponent: ButtonComponent | undefined;
onSubmit: (result: string | boolean) => void;
onSubmit: (result: string | false) => void;
constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number, onSubmit: (result: (typeof buttons)[number] | false) => void) {
super(plugin.app);
@@ -175,16 +174,17 @@ export class MessageBox extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText(this.title);
contentEl.addEventListener("click", () => {
if (this.timer) {
clearInterval(this.timer);
this.timer = undefined;
}
})
contentEl.createEl("h1", { text: this.title });
const div = contentEl.createDiv();
MarkdownRenderer.renderMarkdown(this.contentMd, div, "/", this.plugin);
MarkdownRenderer.render(this.plugin.app, this.contentMd, div, "/", this.plugin);
const buttonSetting = new Setting(contentEl);
buttonSetting.controlEl.style.flexWrap = "wrap";
for (const button of this.buttons) {
buttonSetting.addButton((btn) => {
btn

Submodule src/lib updated: ca61c5a64b...57e663b6dc

File diff suppressed because it is too large


@@ -1,5 +1,5 @@
import { PluginManifest, TFile } from "./deps";
import { DatabaseEntry, EntryBody, FilePath } from "./lib/src/types";
import { type PluginManifest, TFile } from "./deps";
import { type DatabaseEntry, type EntryBody, type FilePath } from "./lib/src/types";
export interface PluginDataEntry extends DatabaseEntry {
deviceVaultName: string;


@@ -1,11 +1,15 @@
import { type DataWriteOptions, normalizePath, TFile, Platform, TAbstractFile, App, Plugin_2, type RequestUrlParam, requestUrl } from "./deps";
import { normalizePath, Platform, TAbstractFile, App, type RequestUrlParam, requestUrl } from "./deps";
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";
import { Logger } from "./lib/src/logger";
import { type AnyEntry, type DocumentID, type EntryHasPath, type FilePath, type FilePathWithPrefix, LOG_LEVEL } from "./lib/src/types";
import { CHeader, ICHeader, ICHeaderLength, PSCHeader } from "./types";
import { LOG_LEVEL_VERBOSE, type AnyEntry, type DocumentID, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "./lib/src/types";
import { CHeader, ICHeader, ICHeaderLength, ICXHeader, PSCHeader } from "./types";
import { InputStringDialog, PopoverSelectString } from "./dialogs";
import ObsidianLiveSyncPlugin from "./main";
import type ObsidianLiveSyncPlugin from "./main";
import { writeString } from "./lib/src/strbin";
import { fireAndForget } from "./lib/src/utils";
export { scheduleTask, setPeriodicTask, cancelTask, cancelAllTasks, cancelPeriodicTask, cancelAllPeriodicTask, } from "./lib/src/task";
// For backward compatibility, using the path for determining id.
// Only CouchDB unacceptable ID (that starts with an underscore) has been prefixed with "/".
@@ -42,49 +46,6 @@ export function getPathFromTFile(file: TAbstractFile) {
return file.path as FilePath;
}
const tasks: { [key: string]: ReturnType<typeof setTimeout> } = {};
export function scheduleTask(key: string, timeout: number, proc: (() => Promise<any> | void), skipIfTaskExist?: boolean) {
if (skipIfTaskExist && key in tasks) {
return;
}
cancelTask(key);
tasks[key] = setTimeout(async () => {
delete tasks[key];
await proc();
}, timeout);
}
export function cancelTask(key: string) {
if (key in tasks) {
clearTimeout(tasks[key]);
delete tasks[key];
}
}
export function cancelAllTasks() {
for (const v in tasks) {
clearTimeout(tasks[v]);
delete tasks[v];
}
}
const intervals: { [key: string]: ReturnType<typeof setInterval> } = {};
export function setPeriodicTask(key: string, timeout: number, proc: (() => Promise<any> | void)) {
cancelPeriodicTask(key);
intervals[key] = setInterval(async () => {
delete intervals[key];
await proc();
}, timeout);
}
export function cancelPeriodicTask(key: string) {
if (key in intervals) {
clearInterval(intervals[key]);
delete intervals[key];
}
}
export function cancelAllPeriodicTask() {
for (const v in intervals) {
clearInterval(intervals[v]);
delete intervals[v];
}
}
const memos: { [key: string]: any } = {};
export function memoObject<T>(key: string, obj: T): T {
@@ -302,20 +263,6 @@ export function flattenObject(obj: Record<string | number | symbol, any>, path:
return ret;
}
export function modifyFile(file: TFile, data: string | ArrayBuffer, options?: DataWriteOptions) {
if (typeof (data) === "string") {
return app.vault.modify(file, data, options);
} else {
return app.vault.modifyBinary(file, data, options);
}
}
export function createFile(path: string, data: string | ArrayBuffer, options?: DataWriteOptions): Promise<TFile> {
if (typeof (data) === "string") {
return app.vault.create(path, data, options);
} else {
return app.vault.createBinary(path, data, options);
}
}
export function isValidPath(filename: string) {
if (Platform.isDesktop) {
@@ -327,42 +274,14 @@ export function isValidPath(filename: string) {
if (Platform.isAndroidApp) return isValidFilenameInAndroid(filename);
if (Platform.isIosApp) return isValidFilenameInDarwin(filename);
//Fallback
Logger("Could not determine platform for checking filename", LOG_LEVEL.VERBOSE);
Logger("Could not determine platform for checking filename", LOG_LEVEL_VERBOSE);
return isValidFilenameInWidows(filename);
}
let touchedFiles: string[] = [];
export function getAbstractFileByPath(path: FilePath): TAbstractFile | null {
// Disabled temporarily.
return app.vault.getAbstractFileByPath(path);
// // Hidden API but so useful.
// // @ts-ignore
// if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
// // @ts-ignore
// return app.vault.getAbstractFileByPathInsensitive(path);
// } else {
// return app.vault.getAbstractFileByPath(path);
// }
}
export function trimPrefix(target: string, prefix: string) {
return target.startsWith(prefix) ? target.substring(prefix.length) : target;
}
export function touch(file: TFile | FilePath) {
const f = file instanceof TFile ? file : getAbstractFileByPath(file) as TFile;
const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
touchedFiles.unshift(key);
touchedFiles = touchedFiles.slice(0, 100);
}
export function recentlyTouched(file: TFile) {
const key = `${file.path}-${file.stat.mtime}-${file.stat.size}`;
if (touchedFiles.indexOf(key) == -1) return false;
return true;
}
export function clearTouched() {
touchedFiles = [];
}
/**
 * Returns whether the ID is an internal chunk of a file
@@ -387,6 +306,9 @@ export function isChunk(str: string): boolean {
export function isPluginMetadata(str: string): boolean {
return str.startsWith(PSCHeader);
}
export function isCustomisationSyncMetadata(str: string): boolean {
return str.startsWith(ICXHeader);
}
export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
return new Promise((res) => {
@@ -415,8 +337,8 @@ export const askString = (app: App, title: string, key: string, placeholder: str
export class PeriodicProcessor {
_process: () => Promise<any>;
_timer?: number;
_plugin: Plugin_2;
constructor(plugin: Plugin_2, process: () => Promise<any>) {
_plugin: ObsidianLiveSyncPlugin;
constructor(plugin: ObsidianLiveSyncPlugin, process: () => Promise<any>) {
this._plugin = plugin;
this._process = process;
}
@@ -430,17 +352,24 @@ export class PeriodicProcessor {
enable(interval: number) {
this.disable();
if (interval == 0) return;
this._timer = window.setInterval(() => this.process().then(() => { }), interval);
this._timer = window.setInterval(() => fireAndForget(async () => {
await this.process();
if (this._plugin._unloaded) {
this.disable();
}
}), interval);
this._plugin.registerInterval(this._timer);
}
disable() {
if (this._timer !== undefined) window.clearInterval(this._timer);
this._timer = undefined;
if (this._timer !== undefined) {
window.clearInterval(this._timer);
this._timer = undefined;
}
}
}
export const _requestToCouchDBFetch = async (baseUri: string, username: string, password: string, path?: string, body?: string | any, method?: string) => {
const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${username}:${password}`));
const utf8str = String.fromCharCode.apply(null, [...writeString(`${username}:${password}`)]);
const encoded = window.btoa(utf8str);
const authHeader = "Basic " + encoded;
const transformedHeaders: Record<string, string> = { authorization: authHeader, "content-type": "application/json" };
@@ -456,7 +385,7 @@ export const _requestToCouchDBFetch = async (baseUri: string, username: string,
}
export const _requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, path?: string, body?: any, method?: string) => {
const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${username}:${password}`));
const utf8str = String.fromCharCode.apply(null, [...writeString(`${username}:${password}`)]);
const encoded = window.btoa(utf8str);
const authHeader = "Basic " + encoded;
const transformedHeaders: Record<string, string> = { authorization: authHeader, origin: origin };


@@ -85,7 +85,6 @@
} */
.sls-header-button {
margin-left: 2em;
}
@@ -99,7 +98,7 @@
}
.CodeMirror-wrap::before,
.cm-s-obsidian>.cm-editor::before,
.cm-s-obsidian > .cm-editor::before,
.canvas-wrapper::before {
content: attr(data-log);
text-align: right;
@@ -124,7 +123,7 @@
right: 0px;
}
.cm-s-obsidian>.cm-editor::before {
.cm-s-obsidian > .cm-editor::before {
right: 16px;
}
@@ -153,8 +152,8 @@ div.sls-setting-menu-btn {
/* width: 100%; */
}
.sls-setting-tab:hover~div.sls-setting-menu-btn,
.sls-setting-label.selected .sls-setting-tab:checked~div.sls-setting-menu-btn {
.sls-setting-tab:hover ~ div.sls-setting-menu-btn,
.sls-setting-label.selected .sls-setting-tab:checked ~ div.sls-setting-menu-btn {
background-color: var(--interactive-accent);
color: var(--text-on-accent);
}
@@ -257,8 +256,8 @@ div.sls-setting-menu-btn {
display: none;
}
.password-input > .setting-item-control >input {
-webkit-text-security: disc;
.password-input > .setting-item-control > input {
-webkit-text-security: disc;
}
span.ls-mark-cr::after {
@@ -270,4 +269,39 @@ span.ls-mark-cr::after {
.deleted span.ls-mark-cr::after {
color: var(--text-on-accent);
}
}
.ls-imgdiff-wrap {
display: flex;
justify-content: center;
align-items: center;
}
.ls-imgdiff-wrap .overlay {
position: relative;
}
.ls-imgdiff-wrap .overlay .img-base {
position: relative;
top: 0;
left: 0;
}
.ls-imgdiff-wrap .overlay .img-overlay {
-webkit-filter: invert(100%) opacity(50%);
filter: invert(100%) opacity(50%);
position: absolute;
top: 0;
left: 0;
animation: ls-blink-diff 0.5s cubic-bezier(0.4, 0, 1, 1) infinite alternate;
}
@keyframes ls-blink-diff {
0% {
opacity: 0;
}
50% {
opacity: 0;
}
100% {
opacity: 1;
}
}


@@ -15,6 +15,8 @@
// "importsNotUsedAsValues": "error",
"importHelpers": false,
"alwaysStrict": true,
"allowImportingTsExtensions": true,
"noEmit": true,
"lib": [
"es2018",
"DOM",


@@ -1,50 +1,45 @@
### 0.19.0
### 0.22.0
A few years have passed since Self-hosted LiveSync was born, and our codebase has become very complicated. It has been tolerable so far, but it would soon become a tremendous burden.
Therefore, at v0.22.0, I completely refined the task-scheduling logic for the sake of future maintainability.
#### Customization sync
Of course, I think this may cause some pain in some cases. However, I would love to ask for your cooperation and contribution.
Since `Plugin and their settings` had been broken, I tried not just to fix it, but to fix it the way it should be.
Sorry for being absent for so long. And thank you for your patience!
Now, we have `Customization sync`.
Note: we got a significant performance improvement.
It is a real shame that compatibility between these features has been broken. However, this new feature is surely useful, and I believe it is worth getting over the pain.
We can use the new feature with the same configuration. Only the menu on the command palette has been changed. The dialog can be opened by `Show customization sync dialog`.
I hope you will give it a try.
#### Minors
- 0.19.1 to 0.19.11 have been moved into `updates_old.md`
- 0.19.12
- Improved:
- Boot-up performance has been improved.
- Customisation sync performance has been improved.
- Synchronising performance has been improved.
- 0.19.13
- Implemented:
- Database clean-up is now in beta 2!
We can shrink the remote database by deleting unused chunks while keeping history.
Note: The local database is not cleaned up completely. We have to `Fetch` again to finish it.
**Note2**: Still in beta. Please back up your vault before using it.
- Fixed:
- The log updates are not thinned out now.
- 0.19.14
- Fixed:
- Internal documents are now ignored.
- Merge dialogue now responds immediately to button presses.
- Periodic processing now works fine.
- The interval for checking for conflicts has been shortened.
- Replication is now cancelled while cleaning up.
- The database lock taken during clean-up is now carefully released.
- The missing-chunks message is now correctly reported.
#### Version history
- 0.22.1
- New feature:
- Suspending database reflection has been implemented.
- This can be disabled by `Fetch database with previous behaviour`.
- Fetching now temporarily suspends reflecting changes to the database and storage to improve performance.
- We can choose the action to take when the remote database has been cleaned up.
- Merge dialogue now shows `↲` before the new line.
- We can perform automatic conflict resolution for inactive files, and postpone only manual ones by `Postpone manual resolution of inactive files`.
- Now we can see the image in the document history dialogue.
- We can see the differences between image revisions in the document history dialogue.
- We can also highlight the differences.
- Improved:
- Progress is now reported during the clean-up and fetch processes.
- Cancelled replication is now detected.
- Hidden file sync has been stabilised.
- The conflict-resolution dialogue now reloads automatically when new conflicted revisions arrive.
- Fixed:
- Periodic processes no longer run after the plug-in has been unloaded.
- Modifications of binary files are now reliably stored in the storage.
- 0.22.0
- Refined:
- Task-scheduling logic has been rewritten.
- Screen updates are also now more efficient.
- Possibly many bugs and fragile behaviours have been fixed.
- Status updates and logging have been thinned out for display.
- Fixed:
- Remote chunk fetching now keeps request intervals.
- New feature:
- We can show only the icons in the editor.
- Progress indicators have been more meaningful:
- 📥 Unprocessed transferred items
- 📄 Working database operation
- 💾 Working write storage processes
- ⏳ Working read storage processes
- 🛫 Pending read storage processes
- ⚙️ Working or pending storage processes of hidden files
- 🧩 Waiting chunks
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
... To continue on to `updates_old.md`.
... To continue on to `updates_old.md`.


@@ -1,3 +1,112 @@
### 0.21.0
The E2EE encryption V2 format has been reverted; it was probably the cause of the glitch.
Instead, to maintain efficiency, files are handled as Blobs until just before saving. Along with this, the old-fashioned encryption format has also been discontinued.
There is both forward and backward compatibility with recent versions. Unfortunately, however, we lost compatibility with filesystem-livesync and some other tools.
It will be addressed soon. Please be patient if you are using filesystem-livesync with E2EE.
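The gist of the Blob-based handling, as a minimal sketch (the function and parameter names below are illustrative, not the plugin's actual API): content is kept as a `Blob` while the pieces are assembled, and a single `ArrayBuffer` is materialised only immediately before writing to storage.

```ts
// Sketch only: keep content as a Blob through the pipeline and convert
// to an ArrayBuffer only just before writing. `writeBinary` stands in
// for the real storage call and is an assumption, not the plugin's API.
async function saveThroughBlob(
    pieces: (string | ArrayBuffer)[],
    path: string,
    writeBinary: (path: string, data: ArrayBuffer) => Promise<void>
): Promise<void> {
    // Blob composition is cheap; browsers typically defer copying the parts.
    const blob = new Blob(pieces, { type: "application/octet-stream" });
    // Materialise a single contiguous buffer only at the last moment.
    const buffer = await blob.arrayBuffer();
    await writeBinary(path, buffer);
}
```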
- 0.21.5
- Improved:
- Now all revisions are shown with only their first few letters.
- Document IDs are now shown in the log with only their first 8 letters.
- Fixed:
- A check before modifying files has been implemented.
- Content change detection has been improved.
- 0.21.4
- This release had been skipped.
- 0.21.3
- Implemented:
- Now we can use SHA1 for hash function as fallback.
- 0.21.2
- IMPORTANT NOTICE: **0.21.1 CONTAINS A BUG WHILE REBUILDING THE DATABASE. IF YOU HAVE REBUILT, PLEASE MAKE SURE THAT ALL FILES ARE SANE.**
- This has been fixed in this version.
- Fixed:
- Files are no longer broken while rebuilding.
- Large binary files can now be written correctly on mobile platforms.
- Any decoding errors now make zero-byte files.
- Modified:
- All files are now processed sequentially, one by one.
- 0.21.1
- Fixed:
- No more infinite loops on larger files.
- A message is now shown on decode errors.
- Refactored:
- Fixed to avoid obsolete global variables.
- 0.21.0
- Changes and performance improvements:
- Files being saved are now processed as Blobs.
- The V2-Format has been reverted.
- The new encoding format has been enabled by default.
- WARNING: Since this version, the compatibilities with older Filesystem LiveSync have been lost.
## 0.20.0
At 0.20.0, Self-hosted LiveSync has changed the binary file format and encrypting format, for efficient synchronisation.
A dialogue will be shown asking us to decide whether to keep v1 or use v2. Once we have enabled v2, all subsequent edits will be saved in v2. Therefore, devices running 0.19 or below cannot understand this and might report a decryption error. Please update all devices.
Then we will get impressive performance.
Of course, these are very impactful changes. If you have any questions or troubled things, please feel free to open an issue and mention me.
Note: if you want to roll it back to v1, please enable `Use binary and encryption version 1` on the `Hatch` pane and perform the `rebuild everything` once.
Extra but notable information:
This format change gives us the ability to detect certain `marks` in binary files, just as in text files. Therefore, we can split binary files, and some specific kinds of them (e.g., PDF files), at specific characters. This means that edits in the middle of a file can be detected by these marks.
Now only a few chunks are transferred, even if we add a comment to a PDF or put new files into a ZIP archive.
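As a rough sketch of the mark-based splitting described above (the marker value and size limit are placeholders, not the plugin's real parameters), the content is cut at a marker byte so that an edit in the middle of a file only changes the chunks around it:

```ts
// Sketch only: split binary content at a marker byte (values are placeholders).
// Chunks before and after an edited region keep their content and hashes,
// so only the modified chunks have to be transferred again.
function splitAtMarks(data: Uint8Array, mark = 0x0a, maxChunkSize = 100 * 1024): Uint8Array[] {
    const chunks: Uint8Array[] = [];
    let start = 0;
    for (let i = 0; i < data.length; i++) {
        // Cut after a mark, or when the current chunk grows too large.
        if (data[i] === mark || i - start + 1 >= maxChunkSize) {
            chunks.push(data.subarray(start, i + 1));
            start = i + 1;
        }
    }
    if (start < data.length) chunks.push(data.subarray(start));
    return chunks;
}
```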
- 0.20.7
- Fixed
- To improve replication, path obfuscation is now deterministic even with E2EE (a sketch of the idea follows after this list).
Note: Compatible with previous databases without any conversion. Only new files will be obfuscated deterministically.
- 0.20.6
- Fixed
- Empty files can now be decoded.
- Local files are no longer pre-saved before fetching from a remote database.
- No more deadlocks while applying customisation sync.
- Configurations consisting of multiple files can now be applied correctly.
- Folder-deletion propagation now works without enabling the trash bin.
- 0.20.5
- Fixed
- Files whose paths have digit or character prefixes are no longer ignored.
- 0.20.4
- Fixed
- The text-input-dialogue is no longer broken.
- Finally, we can use the Setup URI again on mobile.
- 0.20.3
- New feature:
- We can launch Customization sync from the Ribbon if we enabled it.
- Fixed:
- The Setup URI is now back to the previous spec; it is encrypted with V1.
- This may avoid the trouble with iOS 17.
- The Settings dialogue is now registered at the beginning of the start-up process.
- We can change the configuration even if LiveSync could not be launched normally.
- Improved:
- Enumerating documents has become faster.
- 0.20.2
- New feature:
- We can delete all data of customization sync from the `Delete all customization sync data` on the `Hatch` pane.
- Fixed:
- Prevented repeated restarts on iOS by yielding microtasks.
- 0.20.1
- Fixed:
- No more UI freezes and repeated restarts on iOS.
- Diffs of non-Markdown documents are now shown correctly.
- Improved:
- Performance has been a bit improved.
- Customization sync has gotten faster.
- However, we lost forward compatibility again (only for this feature). Please update all devices.
- Misc
- Terser configuration has been more aggressive.
- 0.20.0
- Improved:
- New binary file handling has been implemented
- A new encrypted format has been implemented
- Chunk sizes are now adjusted for efficient sync
- Fixed:
- The log levels of some exceptions have been fixed
- Tidied:
- Some Lint warnings have been suppressed.
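Regarding the deterministic path obfuscation in 0.20.7 above: the point is that the obfuscated ID is derived purely from the path and the E2EE passphrase, with no random component, so every device computes the same ID for the same file. A minimal sketch, assuming an HMAC-SHA-256 derivation with the Web Crypto API (the plugin's actual derivation may differ):

```ts
// Sketch only: deterministic path obfuscation. The same path and passphrase
// always produce the same ID, so every device computes an identical value.
// HMAC-SHA-256 is an assumed derivation; the plugin's real scheme may differ.
async function obfuscatePath(path: string, passphrase: string): Promise<string> {
    const enc = new TextEncoder();
    const key = await crypto.subtle.importKey(
        "raw", enc.encode(passphrase),
        { name: "HMAC", hash: "SHA-256" }, false, ["sign"]
    );
    const mac = await crypto.subtle.sign("HMAC", key, enc.encode(path));
    // Hex-encode to get a stable ID that is safe as a CouchDB document key.
    return [...new Uint8Array(mac)].map((b) => b.toString(16).padStart(2, "0")).join("");
}
```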
### 0.19.0
#### Customization sync
@@ -82,6 +191,114 @@ I hope you will give it a try.
- Logging keeps 400 lines now.
- Refactored:
- Import statement has been fixed about types.
- 0.19.12
- Improved:
- Boot-up performance has been improved.
- Customisation sync performance has been improved.
- Synchronising performance has been improved.
- 0.19.13
- Implemented:
- Database clean-up is now in beta 2!
We can shrink the remote database by deleting unused chunks while keeping history.
Note: The local database is not cleaned up completely. We have to `Fetch` again to finish it.
**Note2**: Still in beta. Please back up your vault before using it.
- Fixed:
- The log updates are not thinned out now.
- 0.19.14
- Fixed:
- Internal documents are now ignored.
- Merge dialogue now responds immediately to button presses.
- Periodic processing now works fine.
- The interval for checking for conflicts has been shortened.
- Replication is now cancelled while cleaning up.
- The database lock taken during clean-up is now carefully released.
- The missing-chunks message is now correctly reported.
- New feature:
- Suspending database reflection has been implemented.
- This can be disabled by `Fetch database with previous behaviour`.
- Fetching now temporarily suspends reflecting changes to the database and storage to improve performance.
- We can choose the action to take when the remote database has been cleaned up.
- Merge dialogue now shows `↲` before the new line.
- Improved:
- Progress is now reported during the clean-up and fetch processes.
- Cancelled replication is now detected.
- 0.19.15
- Fixed:
- Storing files after clean-up now works correctly.
- Improved:
- Cleaning up the local database has become incredibly fast.
Now we can clean up instead of fetching again when synchronising with a remote that has been cleaned up.
- 0.19.16
- Many upgrades in this release. I have tried not to break anything, but if something got corrupted, please feel free to notify me.
- New feature:
- (Beta) ignore files handling
We can use `.gitignore`, `.dockerignore`, or anything you like to filter the files to be synchronised.
- Fixed:
- Buttons on the lock-detected dialogue can now be shown on narrow-width devices.
- Improved:
- Some constants have been flattened so that they can be evaluated.
- Usage of deprecated Obsidian APIs has been reduced.
- The IndexedDB adapter is now enabled while importing the configuration.
- Misc:
- Compiler, framework, and dependencies have been upgraded.
- To withstand the impact of these upgrades (especially esbuild and Svelte), terser has been introduced.
Feel free to share your opinion with me! I do not like obfuscating the code either.
- 0.19.17
- Fixed:
- Nested ignore files can now be parsed correctly.
- The unexpected deletion of hidden files in some cases has been corrected.
- Hidden file changes are no longer reflected back on the device that made them.
- Behaviour changed:
- From this version, files which have `:` in their names are ignored even on Linux devices.
- 0.19.18
- Fixed:
- Empty (or deleted) files can now be conflict-resolved.
- 0.19.19
- Fixed:
- Resolving conflicted revisions has become more robust.
- LiveSync now tries to keep local changes when fetching from a rebuilt remote database.
Local changes are kept as a revision, and fetched content becomes new revisions.
- Now, all files are restored immediately after performing `fetch`.
- 0.19.20
- New feature:
- `Sync on Editor save` has been implemented
- We can start synchronisation when we explicitly save from Obsidian.
- Now we can use the `Hidden file sync` and the `Customization sync` cooperatively.
- We can exclude files from `Hidden file sync` which are already handled by Customization sync.
- We can ignore specific plugins in Customization sync.
- The message about leftover conflicted files now accepts our clicks.
- We can open `Resolve all conflicted files` in an instant.
- Refactored:
- Parallelism functions have been made more explicit.
- Type errors have been reduced.
- Fixed:
- Documents are no longer overwritten if they are conflicted.
They are saved as a new conflicted revision.
- Some error messages have been fixed.
- Missing dialogue titles are now shown.
- We can click close buttons on mobile now.
- Conflicted Customisation sync files will be resolved automatically by their modified time.
- 0.19.21
- Fixed:
- Hidden files are no longer handled in the initial replication.
- The report from `Making report` has been fixed:
- No longer contains customisation sync information.
- Version of LiveSync has been added.
- 0.19.22
- Fixed:
- Now the synchronisation will begin without our interaction.
- The remote database configuration is no longer written to the log while checking the configuration.
- Some outdated description notes have been removed.
- Options that are meaningless given other configured settings are now hidden.
- Scan for hidden files before replication
- Scan customization periodically
- 0.19.23
- Improved:
- We can open the log pane also from the command palette now.
- Now, the hidden-file scanning interval can be set to 0.
- `Check database configuration` now points out when we do not have administrator permission.
### 0.18.0
#### Paths of files in the database can now be obfuscated. (Experimental Feature)