Mirror of https://github.com/vrtmrz/obsidian-livesync.git (synced 2026-05-08 00:31:54 +00:00)
Compare commits (18 commits; the author and date columns of the original table were not preserved in this capture):

- d3dc1e7328
- 45304af369
- 7f422d58f2
- c2491fdfad
- 06a6e391e8
- f99475f6b7
- 109fc00b9d
- c071d822e1
- d2de5b4710
- cf5ecd8922
- b337a05b5a
- 9ea6bee9d1
- 9747c26d50
- bb4b764586
- 279b4b41e5
- b644fb791d
- ac9428e96b
- 280d9e1dd9
README.md (25 lines changed)

```diff
@@ -59,14 +59,23 @@ Synchronization status is shown in statusbar.
 - Status
   - ⏹️ Stopped
-  - 💤 LiveSync enabled. Waiting for changes.
-  - ⚡️ Synchronization in progress.
-  - ⚠ An error occurred.
-- ↑ Uploaded chunks and metadata
-- ↓ Downloaded chunks and metadata
-- ⏳ Number of pending processes
-- 🧩 Number of files waiting for their chunks.
-If you have deleted or renamed files, please wait until ⏳ icon disappears.
+  - 💤 LiveSync enabled. Waiting for changes
+  - ⚡️ Synchronization in progress
+  - ⚠ An error occurred
+- Statistical indicator
+  - ↑ Uploaded chunks and metadata
+  - ↓ Downloaded chunks and metadata
+- Progress indicator
+  - 📥 Unprocessed transferred items
+  - 📄 Working database operation
+  - 💾 Working write storage processes
+  - ⏳ Working read storage processes
+  - 🛫 Pending read storage processes
+  - ⚙️ Working or pending storage processes of hidden files
+  - 🧩 Waiting chunks
+  - 🔌 Working Customisation items (Configuration, snippets and plug-ins)
+
+To prevent file and database corruption, please wait until all progress indicators have disappeared. Especially in case of if you have deleted or renamed files.
 
 ## Hints
```
deploy_couchdb_to_flyio_v2_with_swap.ipynb (filename taken from the Colab badge inside the diff)

```diff
@@ -3,8 +3,8 @@
  {
   "cell_type": "markdown",
   "metadata": {
-   "id": "view-in-github",
-   "colab_type": "text"
+   "colab_type": "text",
+   "id": "view-in-github"
  },
  "source": [
   "<a href=\"https://colab.research.google.com/gist/vrtmrz/37c3efd7842e49947aaaa7f665e5020a/deploy_couchdb_to_flyio_v2_with_swap.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
@@ -12,15 +12,16 @@
  },
  {
   "cell_type": "markdown",
+  "metadata": {
+   "id": "HiRV7G8Gk1Rs"
+  },
   "source": [
    "History:\n",
    "- 18, May, 2023: Initial.\n",
    "- 19, Jun., 2023: Patched for enabling swap.\n",
-   "- 22, Aug., 2023: Generating Setup-URI implemented."
-  ],
-  "metadata": {
-   "id": "HiRV7G8Gk1Rs"
-  }
+   "- 22, Aug., 2023: Generating Setup-URI implemented.\n",
+   "- 7, Nov., 2023: Fixed the issue of TOML editing."
+  ]
  },
  {
   "cell_type": "code",
@@ -45,7 +46,7 @@
  },
  "outputs": [],
  "source": [
-  "# Delete once\n",
+  "# Delete once (Do not care about `cannot remove './fly.toml': No such file or directory`)\n",
   "!rm ./fly.toml"
  ]
 },
```
```diff
@@ -78,15 +79,15 @@
  },
  {
   "cell_type": "code",
-  "source": [
-   "# Check the toml once.\n",
-   "!cat fly.toml"
-  ],
+  "execution_count": null,
   "metadata": {
    "id": "2RSoO9o-i2TT"
   },
-  "execution_count": null,
-  "outputs": []
+  "outputs": [],
+  "source": [
+   "# Check the toml once.\n",
+   "!cat fly.toml"
+  ]
  },
  {
   "cell_type": "code",
```
```diff
@@ -96,52 +97,45 @@
  },
  "outputs": [],
  "source": [
-  "# Modify fly.toml\n",
-  "## Port modification\n",
-  "!sed -i 's/8080/5984/g' fly.toml\n",
-  "## Add user into.\n",
-  "!echo -e \"\\n[env]\\n COUCHDB_USER = \\\"${couchUser}\\\"\" >> ./fly.toml\n",
-  "## Set the location of an ini file which to save configurations persistently via erlang flags.\n",
-  "!echo -e \"\\nERL_FLAGS=\\\"-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini\\\"\" >> ./fly.toml\n",
-  "## Mounting volumes to store data and ini file.\n",
-  "!echo -e \"\\n[mounts]\\n source=\\\"couchdata\\\"\\n destination=\\\"/opt/couchdb/data\\\"\" >> ./fly.toml\n",
-  "!cat fly.toml"
- ]
-},
-{
- "cell_type": "code",
- "source": [
+  "# Modify the TOML and generate Dockerfile\n",
+  "!pip install mergedeep\n",
+  "from mergedeep import merge\n",
+  "import toml\n",
+  "fly = toml.load('fly.toml')\n",
+  "override = {\n",
+  "  \"http_service\":{\n",
+  "    \"internal_port\":5984\n",
+  "  },\n",
+  "  \"build\":{\n",
+  "    \"dockerfile\":\"./Dockerfile\"\n",
+  "  },\n",
+  "  \"mounts\":{\n",
+  "    \"source\":\"couchdata\",\n",
+  "    \"destination\":\"/opt/couchdb/data\"\n",
+  "  },\n",
+  "  \"env\":{\n",
+  "    \"COUCHDB_USER\":os.environ['couchUser'],\n",
+  "    \"ERL_FLAGS\":\"-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini\",\n",
+  "  }\n",
+  "}\n",
+  "out = merge(fly,override)\n",
+  "with open('fly.toml', 'wt') as fp:\n",
+  "    toml.dump(out, fp)\n",
+  "    fp.close()\n",
+  "\n",
   "# Make the Dockerfile to modify the permission of the ini file. If you want to use a specific version, you should change `latest` here.\n",
-  "!echo -e \"\\n[build]\\n dockerfile = \\\"./Dockerfile\\\"\" >> ./fly.toml"
- ],
- "metadata": {
-  "id": "LQPsZ_dYxkTu"
- },
- "execution_count": null,
- "outputs": []
-},
-{
- "cell_type": "code",
- "source": [
-  "!echo -e \"FROM couchdb:latest\\nRUN sed -i '2itouch /opt/couchdb/data/persistence.ini && chmod +w /opt/couchdb/data/persistence.ini && fallocate -l 512M /swapfile && chmod 0600 /swapfile && mkswap /swapfile && echo 10 > /proc/sys/vm/swappiness && swapon /swapfile && echo 1 > /proc/sys/vm/overcommit_memory' /docker-entrypoint.sh\" > ./Dockerfile"
- ],
- "metadata": {
-  "id": "44cBeGJ9on5i"
- },
- "execution_count": null,
- "outputs": []
-},
-{
- "cell_type": "code",
- "source": [
-  "# Check dockerfile\n",
-  "!cat ./Dockerfile"
- ],
- "metadata": {
-  "id": "ai2R3BbpxRSe"
- },
- "execution_count": null,
- "outputs": []
+  "dockerfile = '''FROM couchdb:latest\n",
+  "RUN sed -i '2itouch /opt/couchdb/data/persistence.ini && chmod +w /opt/couchdb/data/persistence.ini && fallocate -l 512M /swapfile && chmod 0600 /swapfile && mkswap /swapfile && echo 10 > /proc/sys/vm/swappiness && swapon /swapfile && echo 1 > /proc/sys/vm/overcommit_memory' /docker-entrypoint.sh\n",
+  "'''\n",
+  "with open(\"./Dockerfile\",\"wt\") as fp:\n",
+  "    fp.write(dockerfile)\n",
+  "    fp.close()\n",
+  "\n",
+  "!echo ------\n",
+  "!cat fly.toml\n",
+  "!echo ------\n",
+  "!cat Dockerfile"
+ ]
 },
 {
  "cell_type": "code",
```
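The hunk above replaces ad-hoc `sed`/`echo` edits of `fly.toml` with a structured deep merge of the parsed TOML. A minimal sketch of that merge idea in plain Python, using a hand-rolled recursive merge instead of the third-party `mergedeep` and `toml` packages so it stays dependency-free (the `fly` and `override` values below are illustrative, not the exact file contents):

```python
# Sketch: deep-merge an override dict into a parsed fly.toml config.
# The notebook uses mergedeep.merge on toml.load() output; this stand-in
# recursive merge behaves the same way for nested dicts.
def deep_merge(base: dict, override: dict) -> dict:
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            deep_merge(base[key], value)  # recurse into shared sub-tables
        else:
            base[key] = value  # scalar or new key: overwrite/insert
    return base

# Illustrative stand-ins for the parsed fly.toml and the notebook's override.
fly = {"app": "my-couch", "http_service": {"internal_port": 8080, "force_https": True}}
override = {
    "http_service": {"internal_port": 5984},
    "build": {"dockerfile": "./Dockerfile"},
    "mounts": {"source": "couchdata", "destination": "/opt/couchdb/data"},
}
merged = deep_merge(fly, override)
print(merged["http_service"])  # port replaced, force_https preserved
```

The advantage over the old `sed`/`echo` approach is that merging survives upstream changes to the generated `fly.toml` layout, which is exactly the "Fixed the issue of TOML editing" history entry.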
```diff
@@ -189,20 +183,27 @@
 },
 {
  "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+  "id": "cGlSzVqlQG_z"
+ },
+ "outputs": [],
  "source": [
   "# Finish setting up the CouchDB\n",
   "# Please repeat until the request is completed without error messages\n",
   "# i.e., You have to redo this block while \"curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to xxxx\" is showing.\n",
+  "#\n",
+  "# Note: A few minutes might be required to be booted.\n",
   "!curl -X POST \"${couchHost}/_cluster_setup\" -H \"Content-Type: application/json\" -d \"{\\\"action\\\":\\\"enable_single_node\\\",\\\"username\\\":\\\"${couchUser}\\\",\\\"password\\\":\\\"${couchPwd}\\\",\\\"bind_address\\\":\\\"0.0.0.0\\\",\\\"port\\\":5984,\\\"singlenode\\\":true}\" --user \"${couchUser}:${couchPwd}\""
- ],
- "metadata": {
-  "id": "cGlSzVqlQG_z"
- },
- "execution_count": null,
- "outputs": []
+ ]
 },
 {
  "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+  "id": "JePzrsHypY18"
+ },
+ "outputs": [],
  "source": [
   "# Please repeat until all lines are completed without error messages\n",
   "!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
```
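The cell above finishes CouchDB setup by POSTing an `enable_single_node` action to `/_cluster_setup`. A sketch of building that same JSON payload in Python (the credentials are placeholders; the notebook substitutes `${couchUser}`/`${couchPwd}` environment variables into a curl command instead):

```python
import json

# Sketch: the enable_single_node payload the notebook sends to
# ${couchHost}/_cluster_setup via curl. Username/password are placeholders.
def cluster_setup_payload(username: str, password: str, port: int = 5984) -> str:
    return json.dumps({
        "action": "enable_single_node",
        "username": username,
        "password": password,
        "bind_address": "0.0.0.0",  # listen on all interfaces inside the container
        "port": port,
        "singlenode": True,
    })

payload = cluster_setup_payload("admin", "secret")
print(payload)
```

Building the body with `json.dumps` avoids the triple-escaped quoting the inline curl command needs.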
```diff
@@ -214,28 +215,28 @@
   "!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/couchdb/max_document_size\" -H \"Content-Type: application/json\" -d '\"50000000\"' --user \"${couchUser}:${couchPwd}\"\n",
   "!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/credentials\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
   "!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/origins\" -H \"Content-Type: application/json\" -d '\"app://obsidian.md,capacitor://localhost,http://localhost\"' --user \"${couchUser}:${couchPwd}\""
- ],
- "metadata": {
-  "id": "JePzrsHypY18"
- },
- "execution_count": null,
- "outputs": []
+ ]
 },
 {
  "cell_type": "markdown",
+ "metadata": {
+  "id": "YfSOomsoXbGS"
+ },
  "source": [
   "Now, our CouchDB has been surely installed and configured. Cheers!\n",
   "\n",
   "In the steps that follow, create a setup-URI.\n",
   "\n",
   "This URI could be imported directly into Self-hosted LiveSync, to configure the use of the CouchDB which we configured now."
- ],
- "metadata": {
-  "id": "YfSOomsoXbGS"
- }
+ ]
 },
 {
  "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+  "id": "416YncOqXdNn"
+ },
+ "outputs": [],
  "source": [
   "# Database config\n",
   "import random, string\n",
```
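The configuration cell issues one `PUT` per `_node/nonode@nohost/_config/<section>/<key>` setting. The same settings can be expressed as data and iterated, which makes the endpoint list easy to check; a sketch (the host below is a placeholder, and `nonode@nohost` is CouchDB's default node name for a single-node install):

```python
# Sketch: the per-node settings the notebook applies with curl PUTs,
# expressed as (section, key, value) tuples. Values mirror the cell above.
COUCH_CONFIG = [
    ("chttpd", "require_valid_user", "true"),
    ("couchdb", "max_document_size", "50000000"),
    ("cors", "credentials", "true"),
    ("cors", "origins", "app://obsidian.md,capacitor://localhost,http://localhost"),
]

def config_urls(couch_host: str) -> list[str]:
    # One PUT target per setting, against the single-node default node name.
    return [
        f"{couch_host}/_node/nonode@nohost/_config/{section}/{key}"
        for section, key, _value in COUCH_CONFIG
    ]

for url in config_urls("https://example.fly.dev"):
    print(url)
```

The `app://obsidian.md` and `capacitor://localhost` origins are what the Obsidian desktop and mobile apps present, which is why they must be in the CORS allow-list.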
```diff
@@ -250,39 +251,39 @@
   "\n",
   "print(\"Your database:\"+os.environ['database'])\n",
   "print(\"Your passphrase:\"+os.environ['passphrase'])"
- ],
- "metadata": {
-  "id": "416YncOqXdNn"
- },
- "execution_count": null,
- "outputs": []
+ ]
 },
 {
  "cell_type": "code",
- "source": [
-  "# Install deno for make setup uri\n",
-  "!curl -fsSL https://deno.land/x/install/install.sh | sh"
- ],
+ "execution_count": null,
  "metadata": {
   "id": "C4d7C0HAXgsr"
  },
- "execution_count": null,
- "outputs": []
+ "outputs": [],
+ "source": [
+  "# Install deno for make setup uri\n",
+  "!curl -fsSL https://deno.land/x/install/install.sh | sh"
+ ]
 },
 {
  "cell_type": "code",
- "source": [
-  "# Fetch module for encrypting a Setup URI\n",
-  "!curl -o encrypt.ts https://gist.githubusercontent.com/vrtmrz/f9d1d95ee2ca3afa1a924a2c6759b854/raw/d7a070d864a6f61403d8dc74208238d5741aeb5a/encrypt.ts"
- ],
+ "execution_count": null,
  "metadata": {
   "id": "hQL_Dx-PXise"
  },
- "execution_count": null,
- "outputs": []
+ "outputs": [],
+ "source": [
+  "# Fetch module for encrypting a Setup URI\n",
+  "!curl -o encrypt.ts https://gist.githubusercontent.com/vrtmrz/f9d1d95ee2ca3afa1a924a2c6759b854/raw/d7a070d864a6f61403d8dc74208238d5741aeb5a/encrypt.ts"
+ ]
 },
 {
  "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+  "id": "o0gX_thFXlIZ"
+ },
+ "outputs": [],
  "source": [
   "# Make buttons!\n",
   "from IPython.display import HTML\n",
```
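The "Database config" cell generates a random database name and passphrase with `random` and `string` before printing them. A hedged sketch of the same idea (the notebook's exact name format and lengths are not visible in this view, so those below are illustrative; `secrets` is used here because it is the stdlib module intended for credentials):

```python
import secrets
import string

# Sketch: random database name and passphrase, as in the notebook's
# "# Database config" cell. Prefix and lengths are illustrative assumptions.
ALPHABET = string.ascii_lowercase + string.digits

def random_token(length: int) -> str:
    # secrets.choice draws from a CSPRNG, suitable for passphrases.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

database = "vault-" + random_token(8)
passphrase = random_token(24)
print("Your database:" + database)
print("Your passphrase:" + passphrase)
```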
```diff
@@ -294,28 +295,23 @@
   "else:\n",
   " result = \"Failed to encrypt the setup URI\"\n",
   "result"
- ],
- "metadata": {
-  "id": "o0gX_thFXlIZ"
- },
- "execution_count": null,
- "outputs": []
+ ]
  }
 ],
 "metadata": {
  "colab": {
-  "provenance": [],
+  "include_colab_link": true,
   "private_outputs": true,
-  "include_colab_link": true
+  "provenance": []
  },
+ "gpuClass": "standard",
  "kernelspec": {
   "display_name": "Python 3",
   "name": "python3"
  },
  "language_info": {
   "name": "python"
- },
- "gpuClass": "standard"
+ }
 },
 "nbformat": 4,
 "nbformat_minor": 0
```
manifest.json

```diff
@@ -1,7 +1,7 @@
 {
  "id": "obsidian-livesync",
  "name": "Self-hosted LiveSync",
- "version": "0.20.6",
+ "version": "0.22.0",
  "minAppVersion": "0.9.12",
  "description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
  "author": "vorotamoroz",
```
package-lock.json (generated, 4 lines changed)

```diff
@@ -1,12 +1,12 @@
 {
  "name": "obsidian-livesync",
- "version": "0.20.6",
+ "version": "0.22.0",
  "lockfileVersion": 2,
  "requires": true,
  "packages": {
   "": {
    "name": "obsidian-livesync",
-   "version": "0.20.6",
+   "version": "0.22.0",
    "license": "MIT",
    "dependencies": {
     "diff-match-patch": "^1.0.5",
```
package.json

```diff
@@ -1,6 +1,6 @@
 {
  "name": "obsidian-livesync",
- "version": "0.20.6",
+ "version": "0.22.0",
  "description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
  "main": "main.js",
  "type": "module",
```
TypeScript module exporting `ConfigSync` (filename not shown in this view)

```diff
@@ -1,20 +1,22 @@
 import { writable } from 'svelte/store';
 import { Notice, type PluginManifest, parseYaml, normalizePath } from "./deps";
 
-import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry } from "./lib/src/types";
+import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "./lib/src/types";
 import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "./lib/src/types";
 import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
-import { delay, getDocData } from "./lib/src/utils";
+import { createTextBlob, delay, getDocData } from "./lib/src/utils";
 import { Logger } from "./lib/src/logger";
 import { WrappedNotice } from "./lib/src/wrapper";
-import { readString, crc32CKHash, decodeBinary, encodeBinary } from "./lib/src/strbin";
+import { readString, decodeBinary, arrayBufferToBase64, sha1 } from "./lib/src/strbin";
 import { serialized } from "./lib/src/lock";
 import { LiveSyncCommands } from "./LiveSyncCommands";
 import { stripAllPrefixes } from "./lib/src/path";
 import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "./utils";
 import { PluginDialogModal } from "./dialogs";
 import { JsonResolveModal } from "./JsonResolveModal";
-import { pipeGeneratorToGenerator, processAllGeneratorTasksWithConcurrencyLimit } from './lib/src/task';
+import { QueueProcessor } from './lib/src/processor';
+import { pluginScanningCount } from './lib/src/stores';
+import type ObsidianLiveSyncPlugin from './main';
 
 const d = "\u200b";
 const d2 = "\n";
```
```diff
@@ -162,6 +164,16 @@ export type PluginDataEx = {
     mtime: number,
 };
 export class ConfigSync extends LiveSyncCommands {
+    constructor(plugin: ObsidianLiveSyncPlugin) {
+        super(plugin);
+        pluginScanningCount.onChanged((e) => {
+            const total = e.value;
+            pluginIsEnumerating.set(total != 0);
+            if (total == 0) {
+                Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
+            }
+        })
+    }
     confirmPopup: WrappedNotice = null;
     get kvDB() {
         return this.plugin.kvDB;
```
```diff
@@ -270,7 +282,7 @@ export class ConfigSync extends LiveSyncCommands {
         for (const file of data.files) {
             const work = { ...file };
             const tempStr = getDocData(work.data);
-            work.data = [crc32CKHash(tempStr)];
+            work.data = [await sha1(tempStr)];
             xFiles.push(work);
         }
         return ({
```
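This hunk swaps the non-cryptographic `crc32CKHash` for `sha1` when fingerprinting customization-file content: equal content still yields equal digests, but a 160-bit digest makes accidental collisions far less likely than a 32-bit CRC. The same fingerprinting idea sketched in Python with the stdlib:

```python
import hashlib

# Sketch: fingerprint file content with SHA-1, mirroring the
# crc32CKHash -> sha1 change in the diff above. Identical content
# produces identical digests; differing content almost surely differs.
def content_fingerprint(text: str) -> str:
    return hashlib.sha1(text.encode("utf-8")).hexdigest()

a = content_fingerprint('{"theme": "obsidian"}')
b = content_fingerprint('{"theme": "obsidian"}')
c = content_fingerprint('{"theme": "moonstone"}')
print(a == b, a == c)  # → True False
```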
```diff
@@ -302,65 +314,65 @@ export class ConfigSync extends LiveSyncCommands {
             this.plugin.saveSettingData();
         }
     }
 
+    pluginScanProcessor = new QueueProcessor(async (v: AnyEntry[]) => {
+        const plugin = v[0];
+        const path = plugin.path || this.getPath(plugin);
+        const oldEntry = (this.pluginList.find(e => e.documentPath == path));
+        if (oldEntry && oldEntry.mtime == plugin.mtime) return;
+        try {
+            const pluginData = await this.loadPluginData(path);
+            if (pluginData) {
+                return [pluginData];
+            }
+            // Failed to load
+            return;
+        } catch (ex) {
+            Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
+            Logger(ex, LOG_LEVEL_VERBOSE);
+        }
+        return;
+    }, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 300, yieldThreshold: 10 }).pipeTo(
+        new QueueProcessor(
+            (pluginDataList) => {
+                let newList = [...this.pluginList];
+                for (const item of pluginDataList) {
+                    newList = newList.filter(x => x.documentPath != item.documentPath);
+                    newList.push(item)
+                }
+                this.pluginList = newList;
+                pluginList.set(newList);
+                return;
+            }
+            , { suspended: true, batchSize: 1000, concurrentLimit: 10, delay: 200, yieldThreshold: 25, totalRemainingReactiveSource: pluginScanningCount })).startPipeline().root.onIdle(() => {
+                Logger(`All files enumerated`, LOG_LEVEL_INFO, "get-plugins");
+                this.createMissingConfigurationEntry();
+            });
+
     async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
-        const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
         // pluginList.set([]);
         if (!this.settings.usePluginSync) {
+            this.pluginScanProcessor.clearQueue();
             this.pluginList = [];
             pluginList.set(this.pluginList)
             return;
         }
-        await Promise.resolve(); // Just to prevent warning.
-        scheduleTask("update-plugin-list-task", 200, async () => {
-            await serialized("update-plugin-list", async () => {
-                try {
-                    const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
-                    const plugins = updatedDocumentPath ?
-                        this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
-                        this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
-                    let count = 0;
-                    pluginIsEnumerating.set(true);
-                    for await (const v of processAllGeneratorTasksWithConcurrencyLimit(20, pipeGeneratorToGenerator(plugins, async plugin => {
-                        const path = plugin.path || this.getPath(plugin);
-                        if (updatedDocumentPath && updatedDocumentPath != path) {
-                            return false;
-                        }
-                        const oldEntry = (this.pluginList.find(e => e.documentPath == path));
-                        if (oldEntry && oldEntry.mtime == plugin.mtime) return false;
-                        try {
-                            count++;
-                            if (count % 10 == 0) Logger(`Enumerating files... ${count}`, logLevel, "get-plugins");
-                            Logger(`plugin-${path}`, LOG_LEVEL_VERBOSE);
-                            return this.loadPluginData(path);
-                            // return entries;
-                        } catch (ex) {
-                            //TODO
-                            Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
-                            console.warn(ex);
-                        }
-                        return false;
-                    }))) {
-                        if ("ok" in v) {
-                            if (v.ok !== false) {
-                                let newList = [...this.pluginList];
-                                const item = v.ok;
-                                newList = newList.filter(x => x.documentPath != item.documentPath);
-                                newList.push(item)
-                                if (updatedDocumentPath != "") newList = newList.filter(e => e.documentPath != updatedDocumentPath);
-                                this.pluginList = newList;
-                                pluginList.set(newList);
-                            }
-                        }
-                    }
-                    Logger(`All files enumerated`, logLevel, "get-plugins");
-                    pluginIsEnumerating.set(false);
-                    this.createMissingConfigurationEntry();
-                } finally {
-                    pluginIsEnumerating.set(false);
-                }
-            });
+        try {
+            const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
+            const plugins = updatedDocumentPath ?
+                this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
+                this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
+            for await (const v of plugins) {
+                const path = v.path || this.getPath(v);
+                if (updatedDocumentPath && updatedDocumentPath != path) continue;
+                this.pluginScanProcessor.enqueue(v);
+            }
+        } finally {
             pluginIsEnumerating.set(false);
-        });
+        }
+        pluginIsEnumerating.set(false);
         // return entries;
     }
     async compareUsingDisplayData(dataA: PluginDataExDisplay, dataB: PluginDataExDisplay) {
```
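The rewrite above moves plugin scanning from an inline generator pipeline into a two-stage `QueueProcessor`: stage one loads each enumerated entry, stage two folds the results into `pluginList`, replacing any existing entry with the same `documentPath`. A sketch of that replace-by-key fold in Python (names and dict shapes are illustrative, not the plugin's actual types):

```python
# Sketch: the second-stage fold used by pluginScanProcessor — each incoming
# item evicts any existing entry with the same documentPath, then is appended,
# so the list never holds two entries for one path.
def fold_plugin_list(current: list[dict], incoming: list[dict]) -> list[dict]:
    new_list = list(current)  # work on a copy, like [...this.pluginList]
    for item in incoming:
        new_list = [x for x in new_list if x["documentPath"] != item["documentPath"]]
        new_list.append(item)
    return new_list

plugins = [{"documentPath": "a", "mtime": 1}, {"documentPath": "b", "mtime": 1}]
updated = fold_plugin_list(plugins, [{"documentPath": "a", "mtime": 2}])
print(updated)
```

Batching this fold (the real pipeline uses `batchSize: 1000`) means the reactive `pluginList` store is updated once per batch instead of once per file.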
```diff
@@ -418,9 +430,9 @@ export class ConfigSync extends LiveSyncCommands {
         await this.ensureDirectoryEx(path);
         if (!content) {
             const dt = decodeBinary(f.data);
-            await this.app.vault.adapter.writeBinary(path, dt);
+            await this.vaultAccess.adapterWrite(path, dt);
         } else {
-            await this.app.vault.adapter.write(path, content);
+            await this.vaultAccess.adapterWrite(path, content);
         }
         Logger(`Applying ${f.filename} of ${data.displayName || data.name}.. Done`);
@@ -540,16 +552,16 @@ export class ConfigSync extends LiveSyncCommands {
 
     recentProcessedInternalFiles = [] as string[];
     async makeEntryFromFile(path: FilePath): Promise<false | PluginDataExFile> {
-        const stat = await this.app.vault.adapter.stat(path);
+        const stat = await this.vaultAccess.adapterStat(path);
         let version: string | undefined;
         let displayName: string | undefined;
         if (!stat) {
             return false;
         }
-        const contentBin = await this.app.vault.adapter.readBinary(path);
+        const contentBin = await this.vaultAccess.adapterReadBinary(path);
         let content: string[];
         try {
-            content = await encodeBinary(contentBin, this.settings.useV1);
+            content = await arrayBufferToBase64(contentBin);
             if (path.toLowerCase().endsWith("/manifest.json")) {
                 const v = readString(new Uint8Array(contentBin));
                 try {
@@ -652,10 +664,10 @@ export class ConfigSync extends LiveSyncCommands {
             return
         }
 
-        const content = serialize(dt);
+        const content = createTextBlob(serialize(dt));
         try {
             const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false);
-            let saveData: LoadedEntry;
+            let saveData: SavingEntry;
             if (old === false) {
                 saveData = {
                     _id: id,
@@ -664,7 +676,7 @@ export class ConfigSync extends LiveSyncCommands {
                     mtime,
                     ctime: mtime,
                     datatype: "newnote",
-                    size: content.length,
+                    size: content.size,
                     children: [],
                     deleted: false,
                     type: "newnote",
@@ -679,7 +691,7 @@ export class ConfigSync extends LiveSyncCommands {
                     ...old,
                     data: content,
                     mtime,
-                    size: content.length,
+                    size: content.size,
                     datatype: "newnote",
                     children: [],
                     deleted: false,
@@ -701,7 +713,7 @@ export class ConfigSync extends LiveSyncCommands {
     async watchVaultRawEventsAsync(path: FilePath) {
         if (!this.settings.usePluginSync) return false;
         if (!this.isTargetPath(path)) return false;
-        const stat = await this.app.vault.adapter.stat(path);
+        const stat = await this.vaultAccess.adapterStat(path);
         // Make sure that target is a file.
         if (stat && stat.type != "file")
             return false;
```
|||||||
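The `size: content.length` → `size: content.size` change in the ConfigSync hunks follows from the serialized settings payload now being wrapped in a Blob rather than kept as a plain string. A minimal sketch of that wrapping, assuming a `createTextBlob` helper along these lines (the plugin's real helper lives in `lib/src/utils` and may differ):

```typescript
// Hypothetical sketch of Blob-based sizing, not the plugin's actual helper.
function createTextBlob(text: string): Blob {
    return new Blob([text], { type: "text/plain" });
}

// A Blob reports its byte length via `.size`, where a string used `.length`;
// for non-ASCII text the two can differ, which makes `.size` the safer value
// to record as the stored entry's size.
const content = createTextBlob(JSON.stringify({ key: "value" }));
console.log(content.size);
```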
@@ -1,16 +1,18 @@
 import { normalizePath, type PluginManifest } from "./deps";
-import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED } from "./lib/src/types";
+import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry } from "./lib/src/types";
 import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
-import { Parallels, delay, isDocContentSame } from "./lib/src/utils";
+import { createBinaryBlob, delay, isDocContentSame } from "./lib/src/utils";
 import { Logger } from "./lib/src/logger";
 import { PouchDB } from "./lib/src/pouchdb-browser.js";
-import { scheduleTask, isInternalMetadata, PeriodicProcessor } from "./utils";
+import { isInternalMetadata, PeriodicProcessor } from "./utils";
 import { WrappedNotice } from "./lib/src/wrapper";
 import { decodeBinary, encodeBinary } from "./lib/src/strbin";
 import { serialized } from "./lib/src/lock";
 import { JsonResolveModal } from "./JsonResolveModal";
 import { LiveSyncCommands } from "./LiveSyncCommands";
 import { addPrefix, stripAllPrefixes } from "./lib/src/path";
+import { KeyedQueueProcessor, QueueProcessor } from "./lib/src/processor";
+import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "./lib/src/stores";
 
 export class HiddenFileSync extends LiveSyncCommands {
 periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
@@ -75,22 +77,17 @@ export class HiddenFileSync extends LiveSyncCommands {
 return;
 }
 
-procInternalFiles: string[] = [];
-async execInternalFile() {
-await serialized("execInternal", async () => {
-const w = [...this.procInternalFiles];
-this.procInternalFiles = [];
-Logger(`Applying hidden ${w.length} files change...`);
-await this.syncInternalFilesAndDatabase("pull", false, false, w);
-Logger(`Applying hidden ${w.length} files changed`);
-});
-}
 procInternalFile(filename: string) {
-this.procInternalFiles.push(filename);
-scheduleTask("procInternal", 500, async () => {
-await this.execInternalFile();
-});
+this.internalFileProcessor.enqueueWithKey(filename, filename);
 }
+internalFileProcessor = new KeyedQueueProcessor<string, any>(
+async (filenames) => {
+Logger(`START :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
+await this.syncInternalFilesAndDatabase("pull", false, false, filenames);
+Logger(`DONE :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
+return;
+}, { batchSize: 100, concurrentLimit: 1, delay: 100, yieldThreshold: 10, suspended: false, totalRemainingReactiveSource: hiddenFilesEventCount }
+);
 recentProcessedInternalFiles = [] as string[];
 async watchVaultRawEventsAsync(path: FilePath) {
@@ -103,7 +100,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 Logger(`Hidden file skipped: ${path} is synchronized in customization sync.`, LOG_LEVEL_VERBOSE);
 return;
 }
-const stat = await this.app.vault.adapter.stat(path);
+const stat = await this.vaultAccess.adapterStat(path);
 // sometimes folder is coming.
 if (stat && stat.type != "file")
 return;
@@ -171,12 +168,12 @@ export class HiddenFileSync extends LiveSyncCommands {
 if (result) {
 Logger(`Object merge:${path}`, LOG_LEVEL_INFO);
 const filename = stripAllPrefixes(path);
-const isExists = await this.app.vault.adapter.exists(filename);
+const isExists = await this.plugin.vaultAccess.adapterExists(filename);
 if (!isExists) {
 await this.ensureDirectoryEx(filename);
 }
-await this.app.vault.adapter.write(filename, result);
-const stat = await this.app.vault.adapter.stat(filename);
+await this.plugin.vaultAccess.adapterWrite(filename, result);
+const stat = await this.vaultAccess.adapterStat(filename);
 await this.storeInternalFileToDatabase({ path: filename, ...stat });
 await this.extractInternalFileFromDatabase(filename);
 await this.localDatabase.removeRaw(id, revB);
@@ -278,28 +275,38 @@ export class HiddenFileSync extends LiveSyncCommands {
 acc[stripAllPrefixes(this.getPath(cur))] = cur;
 return acc;
 }, {} as { [key: string]: InternalFileEntry; });
-const para = Parallels();
-for (const filename of allFileNames) {
+await new QueueProcessor(async (filenames: FilePath[]) => {
+const filename = filenames[0];
 processed++;
 if (processed % 100 == 0) {
 Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
 }
-if (!filename) continue;
+if (!filename) return;
 if (ignorePatterns.some(e => filename.match(e)))
-continue;
+return;
 if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
-continue;
+return;
 }
 
 const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
 const fileOnDatabase = filename in filesOnDBMap ? filesOnDBMap[filename] : undefined;
 
-const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
-await para.wait(5);
-const proc = (async (xFileOnStorage: InternalFileInfo, xFileOnDatabase: InternalFileEntry) => {
+return [{
+filename,
+fileOnStorage,
+fileOnDatabase,
+}]
 
+}, { suspended: true, batchSize: 1, concurrentLimit: 10, delay: 0, totalRemainingReactiveSource: hiddenFilesProcessingCount })
+.pipeTo(new QueueProcessor(async (params) => {
+const
+{
+filename,
+fileOnStorage: xFileOnStorage,
+fileOnDatabase: xFileOnDatabase
+} = params[0];
 if (xFileOnStorage && xFileOnDatabase) {
+const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
 // Both => Synchronize
 if ((direction != "pullForce" && direction != "pushForce") && xFileOnDatabase.mtime == cache.docMtime && xFileOnStorage.mtime == cache.storageMtime) {
 return;
@@ -340,11 +347,12 @@ export class HiddenFileSync extends LiveSyncCommands {
 throw new Error("Invalid state on hidden file sync");
 // Something corrupted?
 }
+return;
+}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 0 }))
+.root
+.enqueueAll(allFileNames)
+.startPipeline().waitForPipeline();
 
-});
-para.add(proc(fileOnStorage, fileOnDatabase))
-}
-await para.all();
 await this.kvDB.set("diff-caches-internal", caches);
 
 // When files has been retrieved from the database. they must be reloaded.
@@ -408,10 +416,10 @@ export class HiddenFileSync extends LiveSyncCommands {
 
 const id = await this.path2id(file.path, ICHeader);
 const prefixedFileName = addPrefix(file.path, ICHeader);
-const contentBin = await this.app.vault.adapter.readBinary(file.path);
-let content: string[];
+const contentBin = await this.plugin.vaultAccess.adapterReadBinary(file.path);
+let content: Blob;
 try {
-content = await encodeBinary(contentBin, this.settings.useV1);
+content = createBinaryBlob(contentBin);
 } catch (ex) {
 Logger(`The file ${file.path} could not be encoded`);
 Logger(ex, LOG_LEVEL_VERBOSE);
@@ -421,7 +429,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 return await serialized("file-" + prefixedFileName, async () => {
 try {
 const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
-let saveData: LoadedEntry;
+let saveData: SavingEntry;
 if (old === false) {
 saveData = {
 _id: id,
@@ -436,7 +444,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 type: "newnote",
 };
 } else {
-if (isDocContentSame(old.data, content) && !forceWrite) {
+if (await isDocContentSame(old.data, content) && !forceWrite) {
 // Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
 return;
 }
@@ -511,7 +519,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 }
 
 async extractInternalFileFromDatabase(filename: FilePath, force = false) {
-const isExists = await this.app.vault.adapter.exists(filename);
+const isExists = await this.plugin.vaultAccess.adapterExists(filename);
 const prefixedFileName = addPrefix(filename, ICHeader);
 if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
 return;
@@ -534,7 +542,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 Logger(`STORAGE <x- DB:$(unknown): deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
 } else {
 Logger(`STORAGE <x- DB:$(unknown): deleted (hidden).`);
-await this.app.vault.adapter.remove(filename);
+await this.plugin.vaultAccess.adapterRemove(filename);
 try {
 //@ts-ignore internalAPI
 await this.app.vault.adapter.reconcileInternalFile(filename);
@@ -547,7 +555,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 }
 if (!isExists) {
 await this.ensureDirectoryEx(filename);
-await this.app.vault.adapter.writeBinary(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
+await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
 try {
 //@ts-ignore internalAPI
 await this.app.vault.adapter.reconcileInternalFile(filename);
@@ -558,13 +566,13 @@ export class HiddenFileSync extends LiveSyncCommands {
 Logger(`STORAGE <-- DB:$(unknown): written (hidden,new${force ? ", force" : ""})`);
 return true;
 } else {
-const contentBin = await this.app.vault.adapter.readBinary(filename);
-const content = await encodeBinary(contentBin, this.settings.useV1);
-if (isDocContentSame(content, fileOnDB.data) && !force) {
+const contentBin = await this.plugin.vaultAccess.adapterReadBinary(filename);
+const content = await encodeBinary(contentBin);
+if (await isDocContentSame(content, fileOnDB.data) && !force) {
 // Logger(`STORAGE <-- DB:$(unknown): skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
 return true;
 }
-await this.app.vault.adapter.writeBinary(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
+await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
 try {
 //@ts-ignore internalAPI
 await this.app.vault.adapter.reconcileInternalFile(filename);
@@ -613,12 +621,12 @@ export class HiddenFileSync extends LiveSyncCommands {
 }
 }
 if (!keep && result) {
-const isExists = await this.app.vault.adapter.exists(filename);
+const isExists = await this.plugin.vaultAccess.adapterExists(filename);
 if (!isExists) {
 await this.ensureDirectoryEx(filename);
 }
-await this.app.vault.adapter.write(filename, result);
-const stat = await this.app.vault.adapter.stat(filename);
+await this.plugin.vaultAccess.adapterWrite(filename, result);
+const stat = await this.plugin.vaultAccess.adapterStat(filename);
 await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
 try {
 //@ts-ignore internalAPI
@@ -657,7 +665,7 @@ export class HiddenFileSync extends LiveSyncCommands {
 const files = filenames.filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile))).map(async (e) => {
 return {
 path: e as FilePath,
-stat: await this.plugin.vaultAccess.adapterStat(e)
 };
 });
 const result: InternalFileInfo[] = [];
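The HiddenFileSync changes above replace ad-hoc `scheduleTask`/`Parallels` batching with `KeyedQueueProcessor`/`QueueProcessor` pipelines, where repeated events for the same file collapse under one key. As a rough, self-contained sketch of that keyed-batching idea (the class name, options, and semantics here are illustrative assumptions, not the plugin library's API):

```typescript
// Minimal keyed batching queue: events enqueued under a key are deduplicated,
// and a single worker drains them in batches, like the diff's processor does
// for hidden-file change events. Illustrative only.
class SimpleKeyedQueue<T> {
    private entries = new Map<string, T>();
    private running: Promise<void> | null = null;
    constructor(private processor: (items: T[]) => Promise<void>, private batchSize = 100) { }
    // Returns a promise that resolves once the queue has fully drained.
    enqueueWithKey(key: string, item: T): Promise<void> {
        this.entries.set(key, item); // a newer event for the same key replaces the older one
        if (!this.running) this.running = this.drain();
        return this.running;
    }
    private async drain() {
        while (this.entries.size > 0) {
            const batch = [...this.entries.entries()].slice(0, this.batchSize);
            for (const [key] of batch) this.entries.delete(key);
            await this.processor(batch.map(([, item]) => item));
        }
        this.running = null;
    }
}
```

A call shaped like `queue.enqueueWithKey(filename, filename)` then mirrors the `procInternalFile` call in the diff: bursts of change notifications for one file are coalesced instead of each triggering a full sync pass.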
@@ -1,8 +1,8 @@
 import { normalizePath, type PluginManifest } from "./deps";
-import type { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry } from "./lib/src/types";
+import type { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry, SavingEntry } from "./lib/src/types";
 import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
 import { type PluginDataEntry, PERIODIC_PLUGIN_SWEEP, type PluginList, type DevicePluginList, PSCHeader, PSCHeaderEnd } from "./types";
-import { getDocData, isDocContentSame } from "./lib/src/utils";
+import { createTextBlob, getDocData, isDocContentSame } from "./lib/src/utils";
 import { Logger } from "./lib/src/logger";
 import { PouchDB } from "./lib/src/pouchdb-browser.js";
 import { isPluginMetadata, PeriodicProcessor } from "./utils";
@@ -186,18 +186,17 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
 }
 Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
 const path = normalizePath(m.dir) + "/";
-const adapter = this.app.vault.adapter;
 const files = ["manifest.json", "main.js", "styles.css", "data.json"];
 const pluginData: { [key: string]: string; } = {};
 for (const file of files) {
 const thePath = path + file;
-if (await adapter.exists(thePath)) {
-pluginData[file] = await adapter.read(thePath);
+if (await this.plugin.vaultAccess.adapterExists(thePath)) {
+pluginData[file] = await this.plugin.vaultAccess.adapterRead(thePath);
 }
 }
 let mtime = 0;
-if (await adapter.exists(path + "/data.json")) {
-mtime = (await adapter.stat(path + "/data.json")).mtime;
+if (await this.plugin.vaultAccess.adapterExists(path + "/data.json")) {
+mtime = (await this.plugin.vaultAccess.adapterStat(path + "/data.json")).mtime;
 }
 
 const p: PluginDataEntry = {
@@ -211,13 +210,14 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
 mtime: mtime,
 type: "plugin",
 };
-const d: LoadedEntry = {
+const blob = createTextBlob(JSON.stringify(p));
+const d: SavingEntry = {
 _id: p._id,
 path: p._id as string as FilePathWithPrefix,
-data: JSON.stringify(p),
+data: blob,
 ctime: mtime,
 mtime: mtime,
-size: 0,
+size: blob.size,
 children: [],
 datatype: "plain",
 type: "plain"
@@ -228,7 +228,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
 if (old !== false) {
 const oldData = { data: old.data, deleted: old._deleted };
 const newData = { data: d.data, deleted: d._deleted };
-if (isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
+if (await isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
 Logger(`Nothing changed:${m.name}`);
 return;
 }
@@ -268,7 +268,6 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
 async applyPluginData(plugin: PluginDataEntry) {
 await serialized("plugin-" + plugin.manifest.id, async () => {
 const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
-const adapter = this.app.vault.adapter;
 // @ts-ignore
 const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
 if (stat) {
@@ -277,7 +276,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
 Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
 }
 if (plugin.dataJson)
-await adapter.write(pluginTargetFolderPath + "data.json", plugin.dataJson);
+await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "data.json", plugin.dataJson);
 Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL_NOTICE);
 if (stat) {
 // @ts-ignore
@@ -298,14 +297,13 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
 }
 
 const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
-const adapter = this.app.vault.adapter;
-if ((await adapter.exists(pluginTargetFolderPath)) === false) {
-await adapter.mkdir(pluginTargetFolderPath);
+if ((await this.plugin.vaultAccess.adapterExists(pluginTargetFolderPath)) === false) {
+await this.app.vault.adapter.mkdir(pluginTargetFolderPath);
 }
-await adapter.write(pluginTargetFolderPath + "main.js", plugin.mainJs);
-await adapter.write(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
+await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "main.js", plugin.mainJs);
+await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
 if (plugin.styleCss)
-await adapter.write(pluginTargetFolderPath + "styles.css", plugin.styleCss);
+await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "styles.css", plugin.styleCss);
 if (stat) {
 // @ts-ignore
 await this.app.plugins.loadPlugin(plugin.manifest.id);
@@ -60,7 +60,7 @@ export class SetupLiveSync extends LiveSyncCommands {
 delete setting[k];
 }
 }
-const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false, true));
+const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
 const uri = `${configURIBase}${encryptedSetting}`;
 await navigator.clipboard.writeText(uri);
 Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
@@ -70,7 +70,7 @@ export class SetupLiveSync extends LiveSyncCommands {
 if (encryptingPassphrase === false)
 return;
 const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
-const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false, true));
+const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
 const uri = `${configURIBase}${encryptedSetting}`;
 await navigator.clipboard.writeText(uri);
 Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
@@ -321,7 +321,6 @@ Of course, we are able to disable these features.`
 this.plugin.settings.suspendFileWatching = false;
 await this.plugin.syncAllFiles(true);
 await this.plugin.loadQueuedFiles();
-this.plugin.procQueuedFiles();
 await this.plugin.saveSettings();
 
 }
@@ -200,11 +200,11 @@ export class DocumentHistoryModal extends Modal {
 Logger("Path is not valid to write content.", LOG_LEVEL_INFO);
 }
 if (this.currentDoc?.datatype == "plain") {
-await this.app.vault.adapter.write(pathToWrite, getDocData(this.currentDoc.data));
+await this.plugin.vaultAccess.adapterWrite(pathToWrite, getDocData(this.currentDoc.data));
 await focusFile(pathToWrite);
 this.close();
 } else if (this.currentDoc?.datatype == "newnote") {
-await this.app.vault.adapter.writeBinary(pathToWrite, decodeBinary(this.currentDoc.data));
+await this.plugin.vaultAccess.adapterWrite(pathToWrite, decodeBinary(this.currentDoc.data));
 await focusFile(pathToWrite);
 this.close();
 } else {
@@ -2,12 +2,11 @@
 import ObsidianLiveSyncPlugin from "./main";
 import { onDestroy, onMount } from "svelte";
 import type { AnyEntry, FilePathWithPrefix } from "./lib/src/types";
-import { getDocData, isDocContentSame } from "./lib/src/utils";
+import { createBinaryBlob, getDocData, isDocContentSame } from "./lib/src/utils";
 import { diff_match_patch } from "./deps";
 import { DocumentHistoryModal } from "./DocumentHistoryModal";
 import { isPlainText, stripAllPrefixes } from "./lib/src/path";
 import { TFile } from "./deps";
-import { encodeBinary } from "./lib/src/strbin";
 export let plugin: ObsidianLiveSyncPlugin;
 
 let showDiffInfo = false;
@@ -67,10 +66,7 @@
 
 for (const revInfo of reversedRevs) {
 if (revInfo.status == "available") {
-const doc =
-(!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev)
-? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true)
-: await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
+const doc = (!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev) ? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true) : await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
 if (doc === false) continue;
 const rev = revInfo.rev;
 
@@ -108,16 +104,16 @@
 }
 if (rev == docA._rev) {
 if (checkStorageDiff) {
-const abs = plugin.app.vault.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
+const abs = plugin.vaultAccess.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
 if (abs instanceof TFile) {
 let result = false;
 if (isPlainText(docA.path)) {
-const data = await plugin.app.vault.read(abs);
-result = isDocContentSame(data, doc.data);
+const data = await plugin.vaultAccess.adapterRead(abs);
+result = await isDocContentSame(data, doc.data);
 } else {
-const data = await plugin.app.vault.readBinary(abs);
-const dataEEncoded = await encodeBinary(data, plugin.settings.useV1);
-result = isDocContentSame(dataEEncoded, doc.data);
+const data = await plugin.vaultAccess.adapterReadBinary(abs);
+const dataEEncoded = createBinaryBlob(data);
+result = await isDocContentSame(dataEEncoded, doc.data);
 }
 if (result) {
 diffDetail += " ⚖️";
@@ -14,6 +14,9 @@ export abstract class LiveSyncCommands {
 get localDatabase() {
 return this.plugin.localDatabase;
 }
+get vaultAccess() {
+    return this.plugin.vaultAccess;
+}
 id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
 return this.plugin.id2path(id, entry, stripPrefix);
 }
@@ -1,5 +1,6 @@
 import { App, Modal } from "./deps";
-import { logMessageStore } from "./lib/src/stores";
+import type { ReactiveInstance, } from "./lib/src/reactive";
+import { logMessages } from "./lib/src/stores";
 import { escapeStringToHTML } from "./lib/src/strbin";
 import ObsidianLiveSyncPlugin from "./main";

@@ -21,14 +22,16 @@ export class LogDisplayModal extends Modal {
 div.addClass("op-scrollable");
 div.addClass("op-pre");
 this.logEl = div;
-this.unsubscribe = logMessageStore.observe((e) => {
+function updateLog(logs: ReactiveInstance<string[]>) {
+    const e = logs.value;
 let msg = "";
 for (const v of e) {
 msg += escapeStringToHTML(v) + "<br>";
 }
 this.logEl.innerHTML = msg;
-})
-logMessageStore.invalidate();
+}
+logMessages.onChanged(updateLog);
+this.unsubscribe = () => logMessages.offChanged(updateLog);
 }
 onClose() {
 const { contentEl } = this;
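The modal above now subscribes through `onChanged`/`offChanged` on a reactive value instead of `logMessageStore.observe()`/`invalidate()`. A minimal sketch of that subscription shape, using illustrative names only (the real `ReactiveInstance` in `lib/src/reactive` may differ in detail):

```typescript
// Hypothetical reactive container; listeners receive the instance and
// read .value from it, just as updateLog() does in the patch above.
type Listener<T> = (instance: ReactiveValue<T>) => void;

class ReactiveValue<T> {
    private listeners = new Set<Listener<T>>();
    constructor(private _value: T) { }
    get value(): T { return this._value; }
    set(next: T) {
        this._value = next;
        for (const l of this.listeners) l(this); // notify all subscribers
    }
    onChanged(listener: Listener<T>) {
        this.listeners.add(listener);
        listener(this); // deliver the current value on subscription
    }
    offChanged(listener: Listener<T>) {
        this.listeners.delete(listener);
    }
}

// Usage mirroring LogDisplayModal: subscribe, render, detach on close.
const logMessages = new ReactiveValue<string[]>([]);
let rendered: string[] = [];
const updateLog = (logs: ReactiveValue<string[]>) => { rendered = [...logs.value]; };

logMessages.onChanged(updateLog);
logMessages.set(["hello"]);
const unsubscribe = () => logMessages.offChanged(updateLog);
unsubscribe();
logMessages.set(["hello", "ignored"]); // no longer observed
```

Storing `() => logMessages.offChanged(updateLog)` as the unsubscribe handle matches how `onClose()` can tear the listener down later without keeping a reference to the store itself.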
@@ -1,26 +1,27 @@
 <script lang="ts">
 import { onDestroy, onMount } from "svelte";
-import { logMessageStore } from "./lib/src/stores";
+import { logMessages } from "./lib/src/stores";
+import type { ReactiveInstance } from "./lib/src/reactive";
+import { Logger } from "./lib/src/logger";

 let unsubscribe: () => void;
 let messages = [] as string[];
 let wrapRight = false;
 let autoScroll = true;
 let suspended = false;
+function updateLog(logs: ReactiveInstance<string[]>) {
+    const e = logs.value;
+    if (!suspended) {
+        messages = [...e];
+        setTimeout(() => {
+            if (scroll) scroll.scrollTop = scroll.scrollHeight;
+        }, 10);
+    }
+}
 onMount(async () => {
-    unsubscribe = logMessageStore.observe((e) => {
-        if (!suspended) {
-            messages = [...e];
-            if (autoScroll) {
-                if (scroll) scroll.scrollTop = scroll.scrollHeight;
-            }
-        }
-    });
-    logMessageStore.invalidate();
-    setTimeout(() => {
-        if (scroll) scroll.scrollTop = scroll.scrollHeight;
-    }, 100);
+    logMessages.onChanged(updateLog);
+    Logger("Log window opened");
+    unsubscribe = () => logMessages.offChanged(updateLog);
 });
 onDestroy(() => {
 if (unsubscribe) unsubscribe();
@@ -4,7 +4,7 @@ import { delay } from "./lib/src/utils";
 import { Semaphore } from "./lib/src/semaphore";
 import { versionNumberString2Number } from "./lib/src/strbin";
 import { Logger } from "./lib/src/logger";
-import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb.js";
+import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb";
 import { testCrypt } from "./lib/src/e2ee_v2";
 import ObsidianLiveSyncPlugin from "./main";
 import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "./utils";
@@ -746,8 +746,20 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
 toggle.setValue(this.plugin.settings.showStatusOnEditor).onChange(async (value) => {
 this.plugin.settings.showStatusOnEditor = value;
 await this.plugin.saveSettings();
+this.display();
 })
 );
+if (this.plugin.settings.showStatusOnEditor) {
+    new Setting(containerGeneralSettingsEl)
+        .setName("Show status as icons only")
+        .setDesc("")
+        .addToggle((toggle) =>
+            toggle.setValue(this.plugin.settings.showOnlyIconsOnEditor).onChange(async (value) => {
+                this.plugin.settings.showOnlyIconsOnEditor = value;
+                await this.plugin.saveSettings();
+            })
+        );
+}

 containerGeneralSettingsEl.createEl("h4", { text: "Logging" });
 new Setting(containerGeneralSettingsEl)
@@ -1813,7 +1825,7 @@ ${stringifyYaml(pluginConfig)}`;
 .setClass("wizardHidden")
 .addDropdown((dropdown) =>
 dropdown
-.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)" } as Record<HashAlgorithm, string>)
+.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)", "sha1": "Fallback (Without WebAssembly)" } as Record<HashAlgorithm, string>)
 .setValue(this.plugin.settings.hashAlg)
 .onChange(async (value) => {
 this.plugin.settings.hashAlg = value as HashAlgorithm;
@@ -1832,15 +1844,6 @@ ${stringifyYaml(pluginConfig)}`;
 await this.plugin.saveSettings();
 })
 );
-new Setting(containerHatchEl)
-    .setName("Use binary and encryption version 1")
-    .setDesc("Use the previous data format for other products which shares the remote database.")
-    .addToggle((toggle) =>
-        toggle.setValue(this.plugin.settings.useV1).onChange(async (value) => {
-            this.plugin.settings.useV1 = value;
-            await this.plugin.saveSettings();
-        })
-    );
 addScreenElement("50", containerHatchEl);
src/SerializedFileAccess.ts (new file, 132 lines)
@@ -0,0 +1,132 @@
+import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "./deps";
+import { serialized } from "./lib/src/lock";
+import type { FilePath } from "./lib/src/types";
+import { createBinaryBlob, isDocContentSame } from "./lib/src/utils";
+function getFileLockKey(file: TFile | TFolder | string) {
+    return `fl:${typeof (file) == "string" ? file : file.path}`;
+}
+function toArrayBuffer(arr: Uint8Array | ArrayBuffer | DataView): ArrayBufferLike {
+    if (arr instanceof Uint8Array) {
+        return arr.buffer;
+    }
+    if (arr instanceof DataView) {
+        return arr.buffer;
+    }
+    return arr;
+}
+
+export class SerializedFileAccess {
+    app: App
+    constructor(app: App) {
+        this.app = app;
+    }
+
+    async adapterStat(file: TFile | string) {
+        const path = file instanceof TFile ? file.path : file;
+        return await serialized(getFileLockKey(path), () => this.app.vault.adapter.stat(path));
+    }
+    async adapterExists(file: TFile | string) {
+        const path = file instanceof TFile ? file.path : file;
+        return await serialized(getFileLockKey(path), () => this.app.vault.adapter.exists(path));
+    }
+    async adapterRemove(file: TFile | string) {
+        const path = file instanceof TFile ? file.path : file;
+        return await serialized(getFileLockKey(path), () => this.app.vault.adapter.remove(path));
+    }
+
+    async adapterRead(file: TFile | string) {
+        const path = file instanceof TFile ? file.path : file;
+        return await serialized(getFileLockKey(path), () => this.app.vault.adapter.read(path));
+    }
+    async adapterReadBinary(file: TFile | string) {
+        const path = file instanceof TFile ? file.path : file;
+        return await serialized(getFileLockKey(path), () => this.app.vault.adapter.readBinary(path));
+    }
+
+    async adapterWrite(file: TFile | string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
+        const path = file instanceof TFile ? file.path : file;
+        if (typeof (data) === "string") {
+            return await serialized(getFileLockKey(path), () => this.app.vault.adapter.write(path, data, options));
+        } else {
+            return await serialized(getFileLockKey(path), () => this.app.vault.adapter.writeBinary(path, toArrayBuffer(data), options));
+        }
+    }
+
+    async vaultCacheRead(file: TFile) {
+        return await serialized(getFileLockKey(file), () => this.app.vault.cachedRead(file));
+    }
+
+    async vaultRead(file: TFile) {
+        return await serialized(getFileLockKey(file), () => this.app.vault.read(file));
+    }
+
+    async vaultReadBinary(file: TFile) {
+        return await serialized(getFileLockKey(file), () => this.app.vault.readBinary(file));
+    }
+
+    async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
+        if (typeof (data) === "string") {
+            return await serialized(getFileLockKey(file), async () => {
+                const oldData = await this.app.vault.read(file);
+                if (data === oldData) return false
+                await this.app.vault.modify(file, data, options)
+                return true;
+            }
+            );
+        } else {
+            return await serialized(getFileLockKey(file), async () => {
+                const oldData = await this.app.vault.readBinary(file);
+                if (isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
+                    return false;
+                }
+                await this.app.vault.modifyBinary(file, toArrayBuffer(data), options)
+                return true;
+            });
+        }
+    }
+    async vaultCreate(path: string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions): Promise<TFile> {
+        if (typeof (data) === "string") {
+            return await serialized(getFileLockKey(path), () => this.app.vault.create(path, data, options));
+        } else {
+            return await serialized(getFileLockKey(path), () => this.app.vault.createBinary(path, toArrayBuffer(data), options));
+        }
+    }
+    async delete(file: TFile | TFolder, force = false) {
+        return await serialized(getFileLockKey(file), () => this.app.vault.delete(file, force));
+    }
+    async trash(file: TFile | TFolder, force = false) {
+        return await serialized(getFileLockKey(file), () => this.app.vault.trash(file, force));
+    }
+
+    getAbstractFileByPath(path: FilePath | string): TAbstractFile | null {
+        // Disabled temporary.
+        return this.app.vault.getAbstractFileByPath(path);
+        // // Hidden API but so useful.
+        // // @ts-ignore
+        // if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
+        // // @ts-ignore
+        // return app.vault.getAbstractFileByPathInsensitive(path);
+        // } else {
+        // return app.vault.getAbstractFileByPath(path);
+        // }
+    }
+
+    touchedFiles: string[] = [];
+
+    touch(file: TFile | FilePath) {
+        const f = file instanceof TFile ? file : this.getAbstractFileByPath(file) as TFile;
+        const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
+        this.touchedFiles.unshift(key);
+        this.touchedFiles = this.touchedFiles.slice(0, 100);
+    }
+    recentlyTouched(file: TFile) {
+        const key = `${file.path}-${file.stat.mtime}-${file.stat.size}`;
+        if (this.touchedFiles.indexOf(key) == -1) return false;
+        return true;
+    }
+    clearTouched() {
+        this.touchedFiles = [];
+    }
+}
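Every method of `SerializedFileAccess` wraps its vault call in `serialized(getFileLockKey(file), …)`, so two operations on the same path queue up behind each other while different paths proceed independently. A rough sketch of such a per-key serializer, assuming it chains each task onto the previous promise for its key (the real `serialized` lives in `lib/src/lock` and may differ):

```typescript
// Hypothetical per-key serializer: one promise chain per lock key.
const chains = new Map<string, Promise<unknown>>();

function serialized<T>(key: string, task: () => Promise<T> | T): Promise<T> {
    // Chain this task behind whatever is already queued for the key.
    const prev = chains.get(key) ?? Promise.resolve();
    const next = prev.then(() => task());
    // Keep the chain usable even if this task rejects.
    chains.set(key, next.catch(() => undefined));
    return next;
}

// Same key derivation as SerializedFileAccess above.
function getFileLockKey(path: string) {
    return `fl:${path}`;
}

// Two operations on a.md share one chain (they run strictly in order);
// b.md gets its own chain and is never blocked by a.md.
const order: string[] = [];
void serialized(getFileLockKey("a.md"), async () => { order.push("a-write"); });
void serialized(getFileLockKey("a.md"), async () => { order.push("a-read"); });
void serialized(getFileLockKey("b.md"), async () => { order.push("b-write"); });
```

Serializing by path rather than with one global lock is the design point here: it prevents read-modify-write races on a single file without making the whole vault single-threaded.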
@@ -1,36 +1,34 @@
+import type { SerializedFileAccess } from "./SerializedFileAccess";
 import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
 import { isPlainText, shouldBeIgnored } from "./lib/src/path";
-import { getGlobalStore } from "./lib/src/store";
+import type { KeyedQueueProcessor } from "./lib/src/processor";
 import { type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
-import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo, type queueItem } from "./types";
+import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "./types";
-import { recentlyTouched } from "./utils";


 export abstract class StorageEventManager {
-abstract fetchEvent(): FileEventItem | false;
-abstract cancelRelativeEvent(item: FileEventItem): void;
-abstract getQueueLength(): number;
+abstract beginWatch(): void;
 }

 type LiveSyncForStorageEventManager = Plugin &
 {
 settings: ObsidianLiveSyncSettings
 ignoreFiles: string[],
+vaultAccess: SerializedFileAccess
 } & {
 isTargetFile: (file: string | TAbstractFile) => Promise<boolean>,
-procFileEvent: (applyBatch?: boolean) => Promise<any>,
+fileEventQueue: KeyedQueueProcessor<FileEventItem, any>
 };


 export class StorageEventManagerObsidian extends StorageEventManager {
 plugin: LiveSyncForStorageEventManager;
-queuedFilesStore = getGlobalStore("queuedFiles", { queuedItems: [] as queueItem[], fileEventItems: [] as FileEventItem[] });
-watchedFileEventQueue = [] as FileEventItem[];
 constructor(plugin: LiveSyncForStorageEventManager) {
 super();
 this.plugin = plugin;
+}
+beginWatch() {
+    const plugin = this.plugin;
 this.watchVaultChange = this.watchVaultChange.bind(this);
 this.watchVaultCreate = this.watchVaultCreate.bind(this);
 this.watchVaultDelete = this.watchVaultDelete.bind(this);
@@ -42,6 +40,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
 plugin.registerEvent(plugin.app.vault.on("create", this.watchVaultCreate));
 //@ts-ignore : Internal API
 plugin.registerEvent(plugin.app.vault.on("raw", this.watchVaultRawEvents));
+plugin.fileEventQueue.startPipeline();
 }

 watchVaultCreate(file: TAbstractFile, ctx?: any) {
@@ -89,7 +88,6 @@ export class StorageEventManagerObsidian extends StorageEventManager {
 }
 // Cache file and waiting to can be proceed.
 async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
-let forcePerform = false;
 for (const param of params) {
 if (shouldBeIgnored(param.file.path)) {
 continue;
@@ -105,44 +103,16 @@ export class StorageEventManagerObsidian extends StorageEventManager {
 let cache: null | string | ArrayBuffer;
 // new file or something changed, cache the changes.
 if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
-if (recentlyTouched(file)) {
+if (this.plugin.vaultAccess.recentlyTouched(file)) {
 continue;
 }
 if (!isPlainText(file.name)) {
-cache = await this.plugin.app.vault.readBinary(file);
+cache = await this.plugin.vaultAccess.vaultReadBinary(file);
 } else {
-// cache = await this.app.vault.read(file);
-cache = await this.plugin.app.vault.cachedRead(file);
-if (!cache) cache = await this.plugin.app.vault.read(file);
+cache = await this.plugin.vaultAccess.vaultCacheRead(file);
+if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
 }
 }
-if (type == "DELETE" || type == "RENAME") {
-    forcePerform = true;
-}
-
-if (this.plugin.settings.batchSave && !this.plugin.settings.liveSync) {
-    // if the latest event is the same type, omit that
-    // a.md MODIFY <- this should be cancelled when a.md MODIFIED
-    // b.md MODIFY <- this should be cancelled when b.md MODIFIED
-    // a.md MODIFY
-    // a.md CREATE
-    // :
-    let i = this.watchedFileEventQueue.length;
-    L1:
-    while (i >= 0) {
-        i--;
-        if (i < 0) break L1;
-        if (this.watchedFileEventQueue[i].args.file.path != file.path) {
-            continue L1;
-        }
-        if (this.watchedFileEventQueue[i].type != type) break L1;
-        this.watchedFileEventQueue.remove(this.watchedFileEventQueue[i]);
-        //this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
-        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
-    }
-}
-
 const fileInfo = file instanceof TFile ? {
 ctime: file.stat.ctime,
 mtime: file.stat.mtime,
@@ -150,7 +120,8 @@ export class StorageEventManagerObsidian extends StorageEventManager {
 path: file.path,
 size: file.stat.size
 } as FileInfo : file as InternalFileInfo;
-this.watchedFileEventQueue.push({
+this.plugin.fileEventQueue.enqueueWithKey(`file-${fileInfo.path}`, {
 type,
 args: {
 file: fileInfo,
@@ -161,21 +132,5 @@ export class StorageEventManagerObsidian extends StorageEventManager {
 key: atomicKey
 })
 }
-// this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
-this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
-this.plugin.procFileEvent(forcePerform);
-}
-fetchEvent(): FileEventItem | false {
-    if (this.watchedFileEventQueue.length == 0) return false;
-    const item = this.watchedFileEventQueue.shift();
-    this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
-    return item;
-}
-cancelRelativeEvent(item: FileEventItem) {
-    this.watchedFileEventQueue = [...this.watchedFileEventQueue].filter(e => e.key != item.key);
-    this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
-}
-getQueueLength() {
-    return this.watchedFileEventQueue.length;
-}
 }
 }
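`StorageEventManagerObsidian` now hands each event to `fileEventQueue.enqueueWithKey(`file-${path}`, …)` instead of maintaining `watchedFileEventQueue` by hand. The sketch below shows only the keyed-deduplication idea that replaced the manual `L1:` loop: at most one pending entry per key, so rapid successive events for the same file collapse into the latest one. The real `KeyedQueueProcessor` in `lib/src/processor` is asynchronous and pipeline-based; this stand-in is deliberately simplified and its names are illustrative.

```typescript
// Simplified stand-in for a keyed queue: enqueueWithKey() drops any
// still-pending entry that shares the key before appending the new one,
// so the queue keeps each key in the order of its most recent enqueue.
class KeyedQueue<T> {
    private entries: { key: string, item: T }[] = [];
    constructor(private handler: (item: T) => void) { }
    enqueueWithKey(key: string, item: T) {
        this.entries = this.entries.filter(e => e.key !== key);
        this.entries.push({ key, item });
    }
    // Process everything currently pending and return how many items ran.
    drain(): number {
        const pending = this.entries;
        this.entries = [];
        for (const e of pending) this.handler(e.item);
        return pending.length;
    }
}

const handled: string[] = [];
const queue = new KeyedQueue<string>(ev => handled.push(ev));
queue.enqueueWithKey("file-a.md", "CHANGED (stale)");
queue.enqueueWithKey("file-b.md", "CREATE");
queue.enqueueWithKey("file-a.md", "CHANGED (latest)"); // supersedes the stale one
const processed = queue.drain();
```

Keying by `file-${path}` is what lets a burst of modify events on one note collapse into a single pending item while events for other files stay untouched.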
src/lib (submodule) updated: b099865cee...33f7e69433
src/main.ts (931 lines changed): file diff suppressed because it is too large
src/utils.ts (89 lines changed):
@@ -1,4 +1,4 @@
-import { type DataWriteOptions, normalizePath, TFile, Platform, TAbstractFile, App, Plugin, type RequestUrlParam, requestUrl } from "./deps";
+import { normalizePath, Platform, TAbstractFile, App, Plugin, type RequestUrlParam, requestUrl } from "./deps";
 import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";

 import { Logger } from "./lib/src/logger";
@@ -8,6 +8,8 @@ import { InputStringDialog, PopoverSelectString } from "./dialogs";
 import ObsidianLiveSyncPlugin from "./main";
 import { writeString } from "./lib/src/strbin";

+export { scheduleTask, setPeriodicTask, cancelTask, cancelAllTasks, cancelPeriodicTask, cancelAllPeriodicTask, } from "./lib/src/task";
+
 // For backward compatibility, using the path for determining id.
 // Only CouchDB unacceptable ID (that starts with an underscore) has been prefixed with "/".
 // The first slash will be deleted when the path is normalized.
@@ -43,49 +45,6 @@ export function getPathFromTFile(file: TAbstractFile) {
 return file.path as FilePath;
 }

-const tasks: { [key: string]: ReturnType<typeof setTimeout> } = {};
-export function scheduleTask(key: string, timeout: number, proc: (() => Promise<any> | void), skipIfTaskExist?: boolean) {
-    if (skipIfTaskExist && key in tasks) {
-        return;
-    }
-    cancelTask(key);
-    tasks[key] = setTimeout(async () => {
-        delete tasks[key];
-        await proc();
-    }, timeout);
-}
-export function cancelTask(key: string) {
-    if (key in tasks) {
-        clearTimeout(tasks[key]);
-        delete tasks[key];
-    }
-}
-export function cancelAllTasks() {
-    for (const v in tasks) {
-        clearTimeout(tasks[v]);
-        delete tasks[v];
-    }
-}
-const intervals: { [key: string]: ReturnType<typeof setInterval> } = {};
-export function setPeriodicTask(key: string, timeout: number, proc: (() => Promise<any> | void)) {
-    cancelPeriodicTask(key);
-    intervals[key] = setInterval(async () => {
-        delete intervals[key];
-        await proc();
-    }, timeout);
-}
-export function cancelPeriodicTask(key: string) {
-    if (key in intervals) {
-        clearInterval(intervals[key]);
-        delete intervals[key];
-    }
-}
-export function cancelAllPeriodicTask() {
-    for (const v in intervals) {
-        clearInterval(intervals[v]);
-        delete intervals[v];
-    }
-}
-
 const memos: { [key: string]: any } = {};
 export function memoObject<T>(key: string, obj: T): T {
@@ -303,20 +262,6 @@ export function flattenObject(obj: Record<string | number | symbol, any>, path:
 return ret;
 }

-export function modifyFile(file: TFile, data: string | ArrayBuffer, options?: DataWriteOptions) {
-    if (typeof (data) === "string") {
-        return app.vault.modify(file, data, options);
-    } else {
-        return app.vault.modifyBinary(file, data, options);
-    }
-}
-export function createFile(path: string, data: string | ArrayBuffer, options?: DataWriteOptions): Promise<TFile> {
-    if (typeof (data) === "string") {
-        return app.vault.create(path, data, options);
-    } else {
-        return app.vault.createBinary(path, data, options);
-    }
-}
-
 export function isValidPath(filename: string) {
 if (Platform.isDesktop) {
@@ -332,38 +277,10 @@ export function isValidPath(filename: string) {
 return isValidFilenameInWidows(filename);
 }

-let touchedFiles: string[] = [];
-
-export function getAbstractFileByPath(path: FilePath): TAbstractFile | null {
-    // Disabled temporary.
-    return app.vault.getAbstractFileByPath(path);
-    // // Hidden API but so useful.
-    // // @ts-ignore
-    // if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
-    // // @ts-ignore
-    // return app.vault.getAbstractFileByPathInsensitive(path);
-    // } else {
-    // return app.vault.getAbstractFileByPath(path);
-    // }
-}
 export function trimPrefix(target: string, prefix: string) {
 return target.startsWith(prefix) ? target.substring(prefix.length) : target;
 }

-export function touch(file: TFile | FilePath) {
-    const f = file instanceof TFile ? file : getAbstractFileByPath(file) as TFile;
-    const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
-    touchedFiles.unshift(key);
-    touchedFiles = touchedFiles.slice(0, 100);
-}
-export function recentlyTouched(file: TFile) {
-    const key = `${file.path}-${file.stat.mtime}-${file.stat.size}`;
-    if (touchedFiles.indexOf(key) == -1) return false;
-    return true;
-}
-export function clearTouched() {
-    touchedFiles = [];
-}
-
 /**
 * returns is internal chunk of file
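The task helpers removed from utils.ts above now come from `lib/src/task` via the new re-export. Their contract is keyed one-shot scheduling: scheduling an existing key resets its timer unless `skipIfTaskExist` is set, and cancelling clears both the timer and the bookkeeping. A condensed version of the removed helpers:

```typescript
// Condensed version of the removed scheduleTask/cancelTask helpers;
// the canonical implementations now live in lib/src/task.
const tasks: { [key: string]: ReturnType<typeof setTimeout> } = {};

function scheduleTask(key: string, timeout: number, proc: () => Promise<any> | void, skipIfTaskExist?: boolean) {
    if (skipIfTaskExist && key in tasks) return; // keep the earlier schedule
    cancelTask(key); // otherwise a repeated call resets the timer
    tasks[key] = setTimeout(async () => {
        delete tasks[key];
        await proc();
    }, timeout);
}

function cancelTask(key: string) {
    if (key in tasks) {
        clearTimeout(tasks[key]);
        delete tasks[key];
    }
}

let fired = 0;
scheduleTask("save", 1000, () => { fired++; });
scheduleTask("save", 1000, () => { fired++; }, true); // skipped: "save" is pending
const wasPending = "save" in tasks;
cancelTask("save"); // timer cleared before it could fire
const isCleared = !("save" in tasks);
```

This keyed debounce is what lets callers such as the settings tab coalesce bursts of work (for example repeated saves) into one deferred execution per key.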
@@ -16,6 +16,7 @@
 "importHelpers": false,
 "alwaysStrict": true,
 "allowImportingTsExtensions": true,
+"noEmit": true,
 "lib": [
 "es2018",
 "DOM",
updates.md (84 changed lines)

```diff
@@ -1,67 +1,33 @@
-### 0.20.0
+### 0.22.0
+A few years passed since Self-hosted LiveSync was born, and our codebase had been very complicated. This could be patient now, but it should be a tremendous hurt.
+Therefore at v0.22.0, for future maintainability, I refined task scheduling logic totally.
 
-At 0.20.0, Self-hosted LiveSync has changed the binary file format and encrypting format, for efficient synchronisation.
-The dialogue will be shown and asks us to decide whether to keep v1 or use v2. Once we have enabled v2, all subsequent edits will be saved in v2. Therefore, devices running 0.19 or below cannot understand this and they might say that decryption error. Please update all devices.
-Then we will have an impressive performance.
+Of course, I think this would be our suffering in some cases. However, I would love to ask you for your cooperation and contribution.
 
-Of course, these are very impactful changes. If you have any questions or troubled things, please feel free to open an issue and mention me.
+Sorry for being absent so much long. And thank you for your patience!
 
-Note: if you want to roll it back to v1, please enable `Use binary and encryption version 1` on the `Hatch` pane and perform the `rebuild everything` once.
+Note: we got a very performance improvement.
 
-Extra but notable information:
 
-This format change gives us the ability to detect some `marks` in the binary files as same as text files. Therefore, we can split binary files and some specific sort of them (i.e., PDF files) at the specific character. It means that editing the middle of files could be detected with marks.
 
-Now only a few chunks are transferred, even if we add a comment to the PDF or put new files into the ZIP archives.
 
 #### Version history
-- 0.20.6
-  - Fixed
-    - Now empty file could be decoded.
-    - Local files are no longer pre-saved before fetching from a remote database.
-    - No longer deadlock while applying customisation sync.
-    - Configuration with multiple files is now able to be applied correctly.
-    - Deleting folder propagation now works without enabling the use of a trash bin.
-- 0.20.5
-  - Fixed
-    - Now the files which having digit or character prefixes in the path will not be ignored.
-- 0.20.4
-  - Fixed
-    - The text-input-dialogue is no longer broken.
-    - Finally, we can use the Setup URI again on mobile.
-- 0.20.3
+- 0.22.0
+  - Refined:
+    - Task scheduling logics has been rewritten.
+    - Screen updates are also now efficient.
+    - Possibly many bugs and fragile behaviour has been fixed.
+    - Status updates and logging have been thinned out to display.
+  - Fixed:
+    - Remote-chunk-fetching now works with keeping request intervals
   - New feature:
-    - We can launch Customization sync from the Ribbon if we enabled it.
-  - Fixed:
-    - Setup URI is now back to the previous spec; be encrypted by V1.
-      - It may avoid the trouble with iOS 17.
-    - The Settings dialogue is now registered at the beginning of the start-up process.
-      - We can change the configuration even though LiveSync could not be launched in normal.
-  - Improved:
-    - Enumerating documents has been faster.
-- 0.20.2
-  - New feature:
-    - We can delete all data of customization sync from the `Delete all customization sync data` on the `Hatch` pane.
-  - Fixed:
-    - Prevent keep restarting on iOS by yielding microtasks.
-- 0.20.1
-  - Fixed:
-    - No more UI freezing and keep restarting on iOS.
-    - Diff of Non-markdown documents are now shown correctly.
-  - Improved:
-    - Performance has been a bit improved.
-    - Customization sync has gotten faster.
-      - However, We lost forward compatibility again (only for this feature). Please update all devices.
-  - Misc
-    - Terser configuration has been more aggressive.
-- 0.20.0
-  - Improved:
-    - A New binary file handling implemented
-    - A new encrypted format has been implemented
-    - Now the chunk sizes will be adjusted for efficient sync
-  - Fixed:
-    - levels of exception in some logs have been fixed
-  - Tidied:
-    - Some Lint warnings have been suppressed.
+    - We can show only the icons in the editor.
+  - Progress indicators have been more meaningful:
+    - 📥 Unprocessed transferred items
+    - 📄 Working database operation
+    - 💾 Working write storage processes
+    - ⏳ Working read storage processes
+    - 🛫 Pending read storage processes
+    - ⚙️ Working or pending storage processes of hidden files
+    - 🧩 Waiting chunks
+    - 🔌 Working Customisation items (Configuration, snippets and plug-ins)
 
 ... To continue on to `updates_old.md`.
```
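The 0.22.0 notes mention a totally refined task-scheduling layer. To make the idea concrete, here is a minimal sketch of *serialised* task scheduling, the general pattern such a rewrite implies: tasks sharing a key (for example, a file path) run strictly in submission order, while tasks under different keys may interleave. This is an illustration of the technique only, not Self-hosted LiveSync's actual scheduler; all names here are hypothetical.

```typescript
// Each key maps to the tail of its promise chain; a new task is appended
// to that tail, so tasks for the same key never overlap.
type Task<T> = () => Promise<T>;

const tails = new Map<string, Promise<unknown>>();

function serialized<T>(key: string, task: Task<T>): Promise<T> {
    const tail = tails.get(key) ?? Promise.resolve();
    // Run the new task after everything already queued under this key,
    // whether the previous task resolved or rejected.
    const next = tail.then(task, task);
    // Keep the chain usable even if `next` itself rejects.
    tails.set(key, next.catch(() => undefined));
    return next;
}
```

For instance, `serialized(path, () => write(path, data))` would guarantee that two writes to the same path never race, which is exactly the property a sync plugin needs around storage operations.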
updates_old.md (109 changed lines)

```diff
@@ -1,3 +1,112 @@
+### 0.21.0
+The E2EE encryption V2 format has been reverted. That was probably the cause of the glitch.
+Instead, to maintain efficiency, files are treated with Blob until just before saving. Along with this, the old-fashioned encryption format has also been discontinued.
+There are both forward and backwards compatibilities, with recent versions. However, unfortunately, we lost compatibility with filesystem-livesync or some.
+It will be addressed soon. Please be patient if you are using filesystem-livesync with E2EE.
+
+- 0.21.5
+  - Improved:
+    - Now all revisions will be shown only its first a few letters.
+    - Now ID of the documents is shown in the log with the first 8 letters.
+  - Fixed:
+    - Check before modifying files has been implemented.
+    - Content change detection has been improved.
+- 0.21.4
+  - This release had been skipped.
+- 0.21.3
+  - Implemented:
+    - Now we can use SHA1 for hash function as fallback.
+- 0.21.2
+  - IMPORTANT NOTICE: **0.21.1 CONTAINS A BUG WHILE REBUILDING THE DATABASE. IF YOU HAVE BEEN REBUILT, PLEASE MAKE SURE THAT ALL FILES ARE SANE.**
+    - This has been fixed in this version.
+  - Fixed:
+    - No longer files are broken while rebuilding.
+    - Now, Large binary files can be written correctly on a mobile platform.
+    - Any decoding errors now make zero-byte files.
+  - Modified:
+    - All files are processed sequentially for each.
+- 0.21.1
+  - Fixed:
+    - No more infinity loops on larger files.
+    - Show message on decode error.
+  - Refactored:
+    - Fixed to avoid obsolete global variables.
+- 0.21.0
+  - Changes and performance improvements:
+    - Now the saving files are processed by Blob.
+    - The V2-Format has been reverted.
+    - New encoding format has been enabled in default.
+    - WARNING: Since this version, the compatibilities with older Filesystem LiveSync have been lost.
+
+## 0.20.0
+At 0.20.0, Self-hosted LiveSync has changed the binary file format and encrypting format, for efficient synchronisation.
+The dialogue will be shown and asks us to decide whether to keep v1 or use v2. Once we have enabled v2, all subsequent edits will be saved in v2. Therefore, devices running 0.19 or below cannot understand this and they might say that decryption error. Please update all devices.
+Then we will have an impressive performance.
+
+Of course, these are very impactful changes. If you have any questions or troubled things, please feel free to open an issue and mention me.
+
+Note: if you want to roll it back to v1, please enable `Use binary and encryption version 1` on the `Hatch` pane and perform the `rebuild everything` once.
+
+Extra but notable information:
+
+This format change gives us the ability to detect some `marks` in the binary files as same as text files. Therefore, we can split binary files and some specific sort of them (i.e., PDF files) at the specific character. It means that editing the middle of files could be detected with marks.
+
+Now only a few chunks are transferred, even if we add a comment to the PDF or put new files into the ZIP archives.
+
+
+- 0.20.7
+  - Fixed
+    - To better replication, path obfuscation is now deterministic even if with E2EE.
+      Note: Compatible with previous database without any conversion. Only new files will be obfuscated in deterministic.
+- 0.20.6
+  - Fixed
+    - Now empty file could be decoded.
+    - Local files are no longer pre-saved before fetching from a remote database.
+    - No longer deadlock while applying customisation sync.
+    - Configuration with multiple files is now able to be applied correctly.
+    - Deleting folder propagation now works without enabling the use of a trash bin.
+- 0.20.5
+  - Fixed
+    - Now the files which having digit or character prefixes in the path will not be ignored.
+- 0.20.4
+  - Fixed
+    - The text-input-dialogue is no longer broken.
+    - Finally, we can use the Setup URI again on mobile.
+- 0.20.3
+  - New feature:
+    - We can launch Customization sync from the Ribbon if we enabled it.
+  - Fixed:
+    - Setup URI is now back to the previous spec; be encrypted by V1.
+      - It may avoid the trouble with iOS 17.
+    - The Settings dialogue is now registered at the beginning of the start-up process.
+      - We can change the configuration even though LiveSync could not be launched in normal.
+  - Improved:
+    - Enumerating documents has been faster.
+- 0.20.2
+  - New feature:
+    - We can delete all data of customization sync from the `Delete all customization sync data` on the `Hatch` pane.
+  - Fixed:
+    - Prevent keep restarting on iOS by yielding microtasks.
+- 0.20.1
+  - Fixed:
+    - No more UI freezing and keep restarting on iOS.
+    - Diff of Non-markdown documents are now shown correctly.
+  - Improved:
+    - Performance has been a bit improved.
+    - Customization sync has gotten faster.
+      - However, We lost forward compatibility again (only for this feature). Please update all devices.
+  - Misc
+    - Terser configuration has been more aggressive.
+- 0.20.0
+  - Improved:
+    - A New binary file handling implemented
+    - A new encrypted format has been implemented
+    - Now the chunk sizes will be adjusted for efficient sync
+  - Fixed:
+    - levels of exception in some logs have been fixed
+  - Tidied:
+    - Some Lint warnings have been suppressed.
+
 ### 0.19.0
 
 #### Customization sync
```
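The 0.20.7 entry above notes that path obfuscation became *deterministic* even with E2EE, so that every device derives the same opaque document id for the same path and replication converges instead of producing duplicates. A common way to get that property is a keyed hash of the path. The sketch below illustrates the idea only; it is NOT the plugin's actual algorithm, and the `obfuscatePath` name, the `f:` prefix, and the key derivation are all hypothetical.

```typescript
import { createHmac } from "node:crypto";

// HMAC keyed by the E2EE passphrase: the same (path, passphrase) pair always
// yields the same id, yet the path is unreadable without the passphrase.
function obfuscatePath(path: string, passphrase: string): string {
    return "f:" + createHmac("sha256", passphrase).update(path).digest("hex");
}
```

Because the output depends only on its inputs, no random salt needs to be replicated, which is what makes the scheme compatible with independently syncing devices; the trade-off is that identical paths across vaults sharing a passphrase map to identical ids.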