Compare commits

...

69 Commits

Author SHA1 Message Date
vorotamoroz
ac9428e96b Fixed:
- For better replication, path obfuscation is now deterministic even with E2EE enabled (illustrated below).
2023-11-15 08:44:03 +00:00
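The note above is terse, so here is a rough illustration of what deterministic path obfuscation with E2EE implies: every device that shares the same passphrase must derive the same obfuscated identifier for the same vault path, otherwise the replicas diverge. The sketch below (TypeScript, Web Crypto API) is an assumption-laden illustration only — the function name and the choice of HMAC-SHA-256 are hypothetical, not the plugin's actual implementation.

```typescript
// Hypothetical sketch, not the plugin's actual code: derive an obfuscated path
// deterministically from the vault path and the shared E2EE passphrase, so that
// every device computes the same identifier and replication stays consistent.
async function obfuscatePath(path: string, passphrase: string): Promise<string> {
    const enc = new TextEncoder();
    // HMAC-SHA-256 keyed by the passphrase is deterministic for identical inputs.
    const key = await crypto.subtle.importKey(
        "raw", enc.encode(passphrase), { name: "HMAC", hash: "SHA-256" }, false, ["sign"]);
    const digest = await crypto.subtle.sign("HMAC", key, enc.encode(path));
    // Hex-encode the digest to obtain a stable, filename-safe identifier.
    return Array.from(new Uint8Array(digest))
        .map(b => b.toString(16).padStart(2, "0"))
        .join("");
}
```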
vorotamoroz
280d9e1dd9 Fixed: The TOML editing issue has been fixed. 2023-11-07 01:07:58 +00:00
vorotamoroz
f7209e566c bump 2023-10-24 10:07:29 +01:00
vorotamoroz
4a9ab2d1de Fixed:
- Enumerating file names is no longer broken.
2023-10-24 10:07:17 +01:00
vorotamoroz
cb74b5ee93 - Fixed:
    - Empty files can now be decoded.
    - Local files are no longer pre-saved before fetching from a remote database.
    - Applying customisation sync no longer deadlocks.
    - Configurations consisting of multiple files are now applied correctly.
    - Folder deletions now propagate without requiring the trash bin to be enabled.
2023-10-24 09:54:56 +01:00
vorotamoroz
60eecd7001 bump 2023-10-17 12:00:59 +09:00
vorotamoroz
4bd7b54bcd Fixed:
- Files whose paths have digit or character prefixes are no longer ignored.
2023-10-17 12:00:19 +09:00
vorotamoroz
8923c73d1b bump 2023-10-14 23:08:34 +09:00
vorotamoroz
11e64b13e2 The text-input-dialogue is no longer broken. 2023-10-14 23:07:51 +09:00
vorotamoroz
983d9248ed bump 2023-10-13 04:23:59 +01:00
vorotamoroz
7240e84328 - New feature:
  - We can launch Customization sync from the Ribbon if we enable it.
- Fixed:
  - The Setup URI is now back to the previous spec; it is encrypted with V1 again.
  - The Settings dialogue is now registered at the beginning of the start-up process.
- Improved:
  - Enumerating documents is now faster.
2023-10-13 04:22:24 +01:00
vorotamoroz
0d55ae2532 Merge pull request #298 from LiamSwayne/patch-1
Grammar fix
2023-10-04 13:07:42 +09:00
Liam Swayne
dbd284f5dd grammar fix 2023-10-03 22:26:30 -04:00
vorotamoroz
c000a02f4a bump 2023-10-02 10:39:54 +01:00
vorotamoroz
79754f48d6 New feature:
- We can delete all Customization sync data via `Delete all customization sync data` on the `Hatch` pane.
Fixed:
- Prevented repeated restarts on iOS by yielding microtasks (see the sketch below).
2023-10-02 10:38:54 +01:00
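For context on the iOS fix noted above: long synchronous stretches of work can starve the WebView's event loop, and iOS may then kill and restart the app. Yielding between batches lets queued tasks run in the meantime. The snippet below is a minimal sketch of that general technique under assumed helper names; it is not the plugin's actual code.

```typescript
// Minimal sketch of the general technique (hypothetical names, not the plugin's code):
// break a long-running loop into small batches and yield between them, so queued
// microtasks get a chance to run instead of the loop monopolising the event loop.
const yieldMicrotask = () => new Promise<void>(resolve => queueMicrotask(resolve));

async function processAll<T>(items: T[], handle: (item: T) => Promise<void>): Promise<void> {
    let processed = 0;
    for (const item of items) {
        await handle(item);
        // Yield every 25 items so pending work elsewhere is not starved.
        if (++processed % 25 === 0) await yieldMicrotask();
    }
}
```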
vorotamoroz
dd7a40630b bump 2023-10-02 09:54:56 +01:00
vorotamoroz
14406f8213 - Fixed:
  - No more UI freezes or repeated restarts on iOS.
  - Diffs of non-Markdown documents are now shown correctly.
- Improved:
  - Performance has been slightly improved.
  - Customization sync has become faster.
    - However, we lost forward compatibility again (only for this feature). Please update all devices.
- Misc:
  - The Terser configuration has become more aggressive.
2023-10-02 09:53:41 +01:00
vorotamoroz
3bbd9c048d bump 2023-09-29 18:57:54 +09:00
vorotamoroz
d91c4f50b4 - Improved:
  - New binary file handling has been implemented.
  - A new encrypted format has been implemented.
  - Chunk sizes are now adjusted for efficient sync.
- Fixed:
  - The log levels of some exception messages have been corrected.
- Tidied:
  - Some lint warnings have been suppressed.
2023-09-29 18:55:46 +09:00
vorotamoroz
395b7fbc42 bump 2023-09-22 18:03:09 +09:00
vorotamoroz
3773e57429 Improved:
- The log pane can now also be opened from the command palette.
- The hidden file scanning interval can now be set to 0.
- `Check database configuration` now points out when we do not have administrator permission.
2023-09-22 17:59:32 +09:00
vorotamoroz
4835fce62a bump 2023-09-21 09:44:50 +01:00
vorotamoroz
ff814be4a0 Fixed:
- Synchronisation now begins without our interaction.
- The remote database configuration is no longer written to the log while checking the configuration.
- Some outdated descriptions have been removed.
- Options that are meaningless given other configured settings are now hidden.
2023-09-21 09:44:07 +01:00
vorotamoroz
b271b63efa bump 2023-09-20 07:05:18 +01:00
vorotamoroz
23419e476a - Fixed:
  - Hidden files are no longer handled in the initial replication.
  - The report generated by `Making report` has been fixed.
2023-09-20 07:04:31 +01:00
vorotamoroz
b9bd1f17b8 bump 2023-09-19 09:58:04 +01:00
vorotamoroz
bcce277c36 New feature:
- `Sync on Editor save` has been implemented.
- `Hidden file sync` and `Customization sync` can now be used cooperatively.
- Specific plugins can be ignored in Customization sync.
- The message about leftover conflicted files now accepts our click.

Refactored:
- Parallelism functions have been made more explicit.
- Type errors have been reduced.

Fixed:
- Documents are no longer overwritten if they are conflicted.
- Some error messages have been fixed.
- Missing dialogue titles are now shown.
2023-09-19 09:53:48 +01:00
vorotamoroz
5acbbe479e Merge pull request #276 from cgcel/main
Add Quick setup Chinese translation.
2023-09-13 17:13:55 +09:00
vorotamoroz
c9f9d511e0 bump 2023-09-05 09:16:49 +01:00
vorotamoroz
b8cb94c498 Fixed:
- Resolving conflicted revisions has become more robust.
- LiveSync now tries to keep local changes when fetching from a rebuilt remote database.
- All files are now restored immediately after performing `fetch`.
2023-09-05 09:16:11 +01:00
cgcel
52c736f6b9 update: format markdown titles. 2023-09-01 11:01:06 +08:00
cgcel
ebd1cb7777 add: Quick setup Chinese translation. 2023-09-01 10:58:07 +08:00
vorotamoroz
10decb7909 Merge pull request #259 from jt-wang/main
Upload colab notebook into the repo, and update setup_flyio.md to reference it
2023-08-24 13:04:02 +09:00
vorotamoroz
e0aab8d69d bump 2023-08-24 04:03:03 +01:00
vorotamoroz
618600c753 Fixed:
- Empty (or deleted) files can now be conflict-resolved.
2023-08-24 04:00:56 +01:00
Jingtao Wang
d1aba87e37 Updated with latest changes 2023-08-23 19:22:07 -07:00
Jingtao Wang
db889f635e Merge pull request #1 from jt-wang/move-colab-script-into-repo
Upload deploy_couchdb_to_flyio_v2_with_swap.ipynb into the repo, and reference it from fly.io setup doc.
2023-08-15 14:55:37 -07:00
Jingtao Wang
dd80e634f5 Update setup_flyio.md to reference colab notebook within the repo 2023-08-15 14:54:43 -07:00
Jingtao Wang
bec6fc1a74 Upload deploy_couchdb_to_flyio_v2_with_swap.ipynb
Upload https://gist.github.com/vrtmrz/37c3efd7842e49947aaaa7f665e5020a into the repo for unified management.
2023-08-15 14:49:24 -07:00
vorotamoroz
5c96c7f99b bump 2023-08-08 10:45:50 +01:00
vorotamoroz
7b9724f713 Fixed:
- Nested ignore files are now parsed correctly.
- The unexpected deletion of hidden files in some cases has been corrected.
- Hidden file changes are no longer reflected back onto the device that made them.
2023-08-08 10:42:07 +01:00
vorotamoroz
4cd12c85ed update the changelog. 2023-08-04 10:03:51 +01:00
vorotamoroz
90651540f9 Update release.yml 2023-08-04 17:52:13 +09:00
vorotamoroz
9e504d5002 bump 2023-08-04 09:45:14 +01:00
vorotamoroz
faaa94423c New feature:
- (Beta) ignore file handling

Fixed:
- Buttons on the lock-detected dialogue can now be shown on narrow-width devices.

Improved:
- Some constants have been flattened so they can be evaluated.
- Usage of deprecated Obsidian APIs has been reduced.
- The indexedDB adapter is now enabled while importing a configuration.

Misc:
- Compiler, framework, and dependencies have been upgraded.
- To absorb the impact of these upgrades (especially esbuild and svelte), terser has been introduced.
  Feel free to share your opinion with me! I do not like obfuscating the code either.
2023-08-04 09:45:04 +01:00
vorotamoroz
a7c179fc86 Merge pull request #247 from cgcel/patch-1
Update setup_own_server_cn.md
2023-07-31 12:49:50 +09:00
vorotamoroz
ed1a670b9b bump 2023-07-26 17:35:06 +09:00
vorotamoroz
6c3c265bd6 Dependency updates 2023-07-26 17:32:07 +09:00
vorotamoroz
9d68025f2a Fixed:
- Storing files after a clean-up now works correctly.

Improved:
- Cleaning up the local database has become dramatically faster.
2023-07-26 17:31:55 +09:00
GC Chen
e70972c8f9 Update setup_own_server_cn.md 2023-07-25 22:31:17 +08:00
vorotamoroz
7607be7729 bump 2023-07-25 19:20:18 +09:00
vorotamoroz
db9d428ab4 Fixed:
- Internal documents are now ignored.
- The merge dialogue now responds immediately to button presses.
- Periodic processing now works correctly.
- The checking interval for detecting conflicts has been shortened.
- Replication is now cancelled while cleaning up.
- The database lock taken by the clean-up is now carefully released.
- The missing-chunks message is now reported correctly.

New feature:
- Suspending database reflection has been implemented.
  - `Fetch` now temporarily suspends reflecting database and storage changes to improve performance.
- We can choose the action to take when the remote database has been cleaned.
- The merge dialogue now shows `↲` before a new line.

Improved:
- Progress is now reported during the clean-up and fetch processes.
- Cancelled replication is now detected.
2023-07-25 19:16:39 +09:00
vorotamoroz
0a2caea3c7 bump 2023-07-25 00:30:17 +09:00
vorotamoroz
b1d1ba0e6b - Implemented:
- Database clean-up is now in beta 2!
    We can shrink the remote database by deleting unused chunks while keeping history (see the sketch below).
    Note: The local database is not fully cleaned up. We have to `Fetch` again to finish the job.
    **Note2**: Still in beta. Please back up your vault before trying it.
- Fixed:
  - Log updates are no longer thinned out.
2023-07-25 00:29:47 +09:00
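As a conceptual illustration of the clean-up described above: note entries in a chunked store reference content chunks by ID, and shrinking the remote database means deleting chunk records that no remaining entry (including history revisions) still references. The types and names below are hypothetical, purely to show the idea — they are not the plugin's data model.

```typescript
// Conceptual sketch with hypothetical types; not the plugin's implementation.
type NoteEntry = { path: string; children: string[] };  // chunk IDs referenced by the note
type ChunkRecord = { id: string };

// Collect every chunk ID still referenced by some entry, then report the rest
// as candidates for deletion to shrink the remote database.
function findUnusedChunks(entries: NoteEntry[], chunks: ChunkRecord[]): string[] {
    const referenced = new Set<string>();
    for (const entry of entries) {
        for (const id of entry.children) referenced.add(id);
    }
    return chunks.map(c => c.id).filter(id => !referenced.has(id));
}
```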
vorotamoroz
5e844372cb Merge pull request #236 from bioluks/documentation-rproxy
CouchDB config corrections, reverse proxy documentation, other additions
2023-07-20 10:33:37 +09:00
Meriç Aşkın
99c6911e96 Merge branch 'vrtmrz:main' into documentation-rproxy 2023-07-18 04:25:53 +02:00
vorotamoroz
dc880d7d4e Merge pull request #244 from samyarkd/patch-1
fix: separate languages
2023-07-18 09:24:04 +09:00
Samyar
c157fef76c fix: separate languages
When I first saw it, I thought it was one connected sentence,
so I thought it would be nice to separate them using a dash.
2023-07-14 22:14:51 +03:30
bioluks
2b2011dc49 Added docker-compose, table of contents, a new reverse proxies section populated with traefik for now 2023-07-04 01:52:48 +02:00
bioluks
ae451e005e Moved enable_cors to the right section. Added an explanation of the version differences. Added bind_address to make sure the container uses all given interfaces. Added spaces between the 'origins' entries and removed the spaces between the 'methods' entries, because that is how the official documentation has it. Added a write-permission warning since many newcomers made this mistake with CouchDB. 2023-07-04 00:15:00 +02:00
vorotamoroz
8a75d41cbb bump 2023-07-03 18:45:05 +09:00
vorotamoroz
5252cc0372 Improved:
- Boot-up performance has been improved.
- Customisation sync performance has been improved.
- Synchronising performance has been improved.
2023-07-03 18:44:02 +09:00
vorotamoroz
5022155317 Fix documentation 2023-06-19 16:31:52 +09:00
vorotamoroz
d36f925c65 bump 2023-06-15 18:13:43 +09:00
vorotamoroz
3ae33e0500 Refactored: External dependency merged. 2023-06-15 18:07:11 +09:00
vorotamoroz
13e442a0c7 Improvements:
- ChunkID hashing has been improved.
- Logging now keeps 400 lines.

Refactored:
- Import statements for types have been fixed.
2023-06-15 17:55:58 +09:00
vorotamoroz
6288716966 - Fixed:
  - The issue with fixing the database has been fixed.
2023-06-09 19:28:29 +09:00
vorotamoroz
47d2cf9733 bump 2023-06-09 18:48:59 +09:00
vorotamoroz
ae6a9ecee4 - New feature (for fixing a problem):
  - We can fix a database in which obfuscated and plain paths have been mixed up.
- Improvements:
  - Customisation sync performance has been improved.
2023-06-09 18:48:10 +09:00
44 changed files with 4895 additions and 2464 deletions

View File

@@ -1,4 +1,5 @@
node_modules
build
.eslintrc.js.bak
src/lib/src/patches/pouchdb-utils
src/lib/src/patches/pouchdb-utils
esbuild.config.mjs

View File

@@ -17,7 +17,7 @@ jobs:
- name: Use Node.js
uses: actions/setup-node@v1
with:
node-version: '14.x' # You might need to adjust this value to your own version
node-version: '18.x' # You might need to adjust this value to your own version
# Get the version number and put it in a variable
- name: Get Version
id: version

1
.gitignore vendored
View File

@@ -8,6 +8,7 @@ package-lock.json
# build
main.js
main_org.js
*.js.map
# obsidian

View File

@@ -1,6 +1,6 @@
# Self-hosted LiveSync
[Japanese docs](./README_ja.md) [Chinese docs](./README_cn.md).
[Japanese docs](./README_ja.md) - [Chinese docs](./README_cn.md).
Self-hosted LiveSync is a community-implemented synchronization plugin.
A self-hosted or purchased CouchDB acts as the intermediate server. Available on every obsidian-compatible platform.
@@ -66,7 +66,7 @@ Synchronization status is shown in statusbar.
- ↓ Downloaded chunks and metadata
- ⏳ Number of pending processes
- 🧩 Number of files waiting for their chunks.
If you have deleted or renamed files, please wait until ⏳ icon disappeared.
If you have deleted or renamed files, please wait until ⏳ icon disappears.
## Hints

View File

@@ -0,0 +1,318 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "view-in-github"
},
"source": [
"<a href=\"https://colab.research.google.com/gist/vrtmrz/37c3efd7842e49947aaaa7f665e5020a/deploy_couchdb_to_flyio_v2_with_swap.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "HiRV7G8Gk1Rs"
},
"source": [
"History:\n",
"- 18, May, 2023: Initial.\n",
"- 19, Jun., 2023: Patched for enabling swap.\n",
"- 22, Aug., 2023: Generating Setup-URI implemented.\n",
"- 7, Nov., 2023: Fixed the issue of TOML editing."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "2Vh0mEQEZuAK"
},
"outputs": [],
"source": [
"# Configurations\n",
"import os\n",
"os.environ['region']=\"nrt\"\n",
"os.environ['couchUser']=\"alkcsa93\"\n",
"os.environ['couchPwd']=\"c349usdfnv48fsasd\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "SPmbB0jZauQ1"
},
"outputs": [],
"source": [
"# Delete once (Do not care about `cannot remove './fly.toml': No such file or directory`)\n",
"!rm ./fly.toml"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "Nze7QoxLZ7Yx"
},
"outputs": [],
"source": [
"# Installation\n",
"# You have to set up your account in here.\n",
"!curl -L https://fly.io/install.sh | sh\n",
"!/root/.fly/bin/flyctl auth signup"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "MVJwsIYrbgtx"
},
"outputs": [],
"source": [
"# Generate server\n",
"!/root/.fly/bin/flyctl launch --auto-confirm --generate-name --detach --no-deploy --region ${region}\n",
"!/root/.fly/bin/fly volumes create --region ${region} couchdata --size 2 --yes"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "2RSoO9o-i2TT"
},
"outputs": [],
"source": [
"# Check the toml once.\n",
"!cat fly.toml"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "zUtPZLVnbvdQ"
},
"outputs": [],
"source": [
"# Modify the TOML and generate Dockerfile\n",
"!pip install mergedeep\n",
"from mergedeep import merge\n",
"import toml\n",
"fly = toml.load('fly.toml')\n",
"override = {\n",
" \"http_service\":{\n",
" \"internal_port\":5984\n",
" },\n",
" \"build\":{\n",
" \"dockerfile\":\"./Dockerfile\"\n",
" },\n",
" \"mounts\":{\n",
" \"source\":\"couchdata\",\n",
" \"destination\":\"/opt/couchdb/data\"\n",
" },\n",
" \"env\":{\n",
" \"COUCHDB_USER\":os.environ['couchUser'],\n",
" \"ERL_FLAGS\":\"-couch_ini /opt/couchdb/etc/default.ini /opt/couchdb/etc/default.d/ /opt/couchdb/etc/local.d /opt/couchdb/etc/local.ini /opt/couchdb/data/persistence.ini\",\n",
" }\n",
"}\n",
"out = merge(fly,override)\n",
"with open('fly.toml', 'wt') as fp:\n",
" toml.dump(out, fp)\n",
" fp.close()\n",
"\n",
"# Make the Dockerfile to modify the permission of the ini file. If you want to use a specific version, you should change `latest` here.\n",
"dockerfile = '''FROM couchdb:latest\n",
"RUN sed -i '2itouch /opt/couchdb/data/persistence.ini && chmod +w /opt/couchdb/data/persistence.ini && fallocate -l 512M /swapfile && chmod 0600 /swapfile && mkswap /swapfile && echo 10 > /proc/sys/vm/swappiness && swapon /swapfile && echo 1 > /proc/sys/vm/overcommit_memory' /docker-entrypoint.sh\n",
"'''\n",
"with open(\"./Dockerfile\",\"wt\") as fp:\n",
" fp.write(dockerfile)\n",
" fp.close()\n",
"\n",
"!echo ------\n",
"!cat fly.toml\n",
"!echo ------\n",
"!cat Dockerfile"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "xWdsTCI6bzk2"
},
"outputs": [],
"source": [
"# Configure password\n",
"!/root/.fly/bin/flyctl secrets set COUCHDB_PASSWORD=${couchPwd}"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "k0WIQlShcXGa"
},
"outputs": [],
"source": [
"# Deploy server\n",
"# Be sure to shutdown after the test.\n",
"!/root/.fly/bin/flyctl deploy --detach --remote-only\n",
"!/root/.fly/bin/flyctl status"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "0ySggkdlfq7M"
},
"outputs": [],
"source": [
"import subprocess, json\n",
"result = subprocess.run([\"/root/.fly/bin/flyctl\",\"status\",\"-j\"], capture_output=True, text=True)\n",
"if result.returncode==0:\n",
" hostname = json.loads(result.stdout)[\"Hostname\"]\n",
" os.environ['couchHost']=\"https://%s\" % (hostname)\n",
" print(\"Your couchDB server is https://%s/\" % (hostname))\n",
"else:\n",
" print(\"Something occured.\")\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "cGlSzVqlQG_z"
},
"outputs": [],
"source": [
"# Finish setting up the CouchDB\n",
"# Please repeat until the request is completed without error messages\n",
"# i.e., You have to redo this block while \"curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to xxxx\" is showing.\n",
"#\n",
"# Note: A few minutes might be required to be booted.\n",
"!curl -X POST \"${couchHost}/_cluster_setup\" -H \"Content-Type: application/json\" -d \"{\\\"action\\\":\\\"enable_single_node\\\",\\\"username\\\":\\\"${couchUser}\\\",\\\"password\\\":\\\"${couchPwd}\\\",\\\"bind_address\\\":\\\"0.0.0.0\\\",\\\"port\\\":5984,\\\"singlenode\\\":true}\" --user \"${couchUser}:${couchPwd}\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "JePzrsHypY18"
},
"outputs": [],
"source": [
"# Please repeat until all lines are completed without error messages\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd_auth/require_valid_user\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/httpd/WWW-Authenticate\" -H \"Content-Type: application/json\" -d '\"Basic realm=\\\"couchdb\\\"\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/httpd/enable_cors\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/enable_cors\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/chttpd/max_http_request_size\" -H \"Content-Type: application/json\" -d '\"4294967296\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/couchdb/max_document_size\" -H \"Content-Type: application/json\" -d '\"50000000\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/credentials\" -H \"Content-Type: application/json\" -d '\"true\"' --user \"${couchUser}:${couchPwd}\"\n",
"!curl -X PUT \"${couchHost}/_node/nonode@nohost/_config/cors/origins\" -H \"Content-Type: application/json\" -d '\"app://obsidian.md,capacitor://localhost,http://localhost\"' --user \"${couchUser}:${couchPwd}\""
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YfSOomsoXbGS"
},
"source": [
"Now, our CouchDB has been surely installed and configured. Cheers!\n",
"\n",
"In the steps that follow, create a setup-URI.\n",
"\n",
"This URI could be imported directly into Self-hosted LiveSync, to configure the use of the CouchDB which we configured now."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "416YncOqXdNn"
},
"outputs": [],
"source": [
"# Database config\n",
"import random, string\n",
"\n",
"def randomname(n):\n",
" return ''.join(random.choices(string.ascii_letters + string.digits, k=n))\n",
"\n",
"# The database name\n",
"os.environ['database']=\"obsidiannote\"\n",
"# The passphrase to E2EE\n",
"os.environ['passphrase']=randomname(20)\n",
"\n",
"print(\"Your database:\"+os.environ['database'])\n",
"print(\"Your passphrase:\"+os.environ['passphrase'])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "C4d7C0HAXgsr"
},
"outputs": [],
"source": [
"# Install deno for make setup uri\n",
"!curl -fsSL https://deno.land/x/install/install.sh | sh"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "hQL_Dx-PXise"
},
"outputs": [],
"source": [
"# Fetch module for encrypting a Setup URI\n",
"!curl -o encrypt.ts https://gist.githubusercontent.com/vrtmrz/f9d1d95ee2ca3afa1a924a2c6759b854/raw/d7a070d864a6f61403d8dc74208238d5741aeb5a/encrypt.ts"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "o0gX_thFXlIZ"
},
"outputs": [],
"source": [
"# Make buttons!\n",
"from IPython.display import HTML\n",
"result = subprocess.run([\"/root/.deno/bin/deno\",\"run\",\"-A\",\"encrypt.ts\"], capture_output=True, text=True)\n",
"text=\"\"\n",
"if result.returncode==0:\n",
" text = result.stdout.strip()\n",
" result = HTML(f\"<button onclick=navigator.clipboard.writeText('{text}')>Copy setup uri</button><br>Importing passphrase is `welcome`. <br>If you want to synchronise in live mode, please apply a preset after setup.)\")\n",
"else:\n",
" result = \"Failed to encrypt the setup URI\"\n",
"result"
]
}
],
"metadata": {
"colab": {
"include_colab_link": true,
"private_outputs": true,
"provenance": []
},
"gpuClass": "standard",
"kernelspec": {
"display_name": "Python 3",
"name": "python3"
},
"language_info": {
"name": "python"
}
},
"nbformat": 4,
"nbformat_minor": 0
}

View File

@@ -0,0 +1,46 @@
# For details and other explanations about this file refer to:
# https://github.com/vrtmrz/obsidian-livesync/blob/main/docs/setup_own_server.md#traefik
version: "2.1"
services:
couchdb:
image: couchdb:latest
container_name: obsidian-livesync
user: 1000:1000
environment:
- COUCHDB_USER=username
- COUCHDB_PASSWORD=password
volumes:
- ./data:/opt/couchdb/data
- ./local.ini:/opt/couchdb/etc/local.ini
# Ports not needed when already passed to Traefik
#ports:
# - 5984:5984
restart: unless-stopped
networks:
- proxy
labels:
- "traefik.enable=true"
# The Traefik Network
- "traefik.docker.network=proxy"
# Don't forget to replace 'obsidian-livesync.example.org' with your own domain
- "traefik.http.routers.obsidian-livesync.rule=Host(`obsidian-livesync.example.org`)"
# The 'websecure' entryPoint is basically your HTTPS entrypoint. Check the next code snippet only if you are encountering problems; if this is not the first container you are reverse proxying, you probably already have a working Traefik configuration.
- "traefik.http.routers.obsidian-livesync.entrypoints=websecure"
- "traefik.http.routers.obsidian-livesync.service=obsidian-livesync"
- "traefik.http.services.obsidian-livesync.loadbalancer.server.port=5984"
- "traefik.http.routers.obsidian-livesync.tls=true"
# Replace the string 'letsencrypt' with your own certificate resolver
- "traefik.http.routers.obsidian-livesync.tls.certresolver=letsencrypt"
- "traefik.http.routers.obsidian-livesync.middlewares=obsidiancors"
# The part needed for CORS to work on Traefik 2.x starts here
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowmethods=GET,PUT,POST,HEAD,DELETE"
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowheaders=accept,authorization,content-type,origin,referer"
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolalloworiginlist=app://obsidian.md,capacitor://localhost,http://localhost"
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolmaxage=3600"
- "traefik.http.middlewares.obsidiancors.headers.addvaryheader=true"
- "traefik.http.middlewares.obsidiancors.headers.accessControlAllowCredentials=true"
networks:
proxy:
external: true

View File

@@ -1,4 +1,7 @@
# Quick setup
[Japanese docs](./quick_setup_ja.md) - [Chinese docs](./quick_setup_cn.md).
The plugin has so many configuration options to deal with different circumstances. However, there are not so many settings that are actually used. Therefore, `The Setup wizard` has been implemented to simplify the initial setup.
Note: Subsequent devices are recommended to be set up using the `Copy setup URI` and `Open setup URI`.

93
docs/quick_setup_cn.md Normal file
View File

@@ -0,0 +1,93 @@
# 快速配置 (Quick setup)
该插件有较多配置项, 可以应对不同的情况. 不过, 实际使用的设置并不多. 因此, 我们采用了 "设置向导 (The Setup wizard)" 来简化初始设置.
Note: 建议使用 `Copy setup URI` and `Open setup URI` 来设置后续设备.
## 设置向导 (The Setup wizard)
在设置对话框中打开 `🧙‍♂️ Setup wizard`. 如果之前未配置插件, 则会自动打开该页面.
![quick_setup_1](../images/quick_setup_1.png)
- 放弃现有配置并进行设置
如果您先前有过任何设置, 此按钮允许您在设置前放弃所有更改.
- 保留现有配置和设置
快速重新配置. 请注意, 在向导模式下, 您无法看到所有已经配置过的配置项.
在上述选项中按下 `Next`, 配置对话框将进入向导模式 (wizard mode).
### 向导模式 (Wizard mode)
![quick_setup_2](../images/quick_setup_2.png)
接下来将介绍如何逐步使用向导模式.
## 配置远程数据库
### 开始配置远程数据库
输入已部署好的数据库的信息.
![quick_setup_3](../images/quick_setup_3.png)
#### 测试数据库连接并检查数据库配置
我们可以检查数据库的连接性和数据库设置.
![quick_setup_5](../images/quick_setup_5.png)
#### 测试数据库连接
检查是否能成功连接数据库. 如果连接失败, 可能是多种原因导致的, 但请先点击 `Check database configuration` 来检查数据库配置是否有问题.
#### 检查数据库配置
检查数据库设置并修复问题.
![quick_setup_6](../images/quick_setup_6.png)
Config check 的显示内容可能因不同连接而异. 在上图情况下, 按下所有三个修复按钮.
如果修复按钮消失, 全部变为复选标记, 则表示修复完成.
### 加密配置
![quick_setup_4](../images/quick_setup_4.png)
为您的数据库加密, 以防数据库意外曝光; 启用端到端加密后, 笔记内容在离开设备时就会被加密. 我们强烈建议启用该功能. `路径混淆 (Path Obfuscation)` 还能混淆文件名. 现已稳定并推荐使用.
加密基于 256 位 AES-GCM.
如果你在一个封闭的网络中, 而且很明显第三方不会访问你的文件, 则可以禁用这些设置.
![quick_setup_7](../images/quick_setup_7.png)
#### Next
转到同步设置.
#### 放弃现有数据库并继续
清除远程数据库的内容, 然后转到同步设置.
### 同步设置
最后, 选择一个同步预设完成向导.
![quick_setup_9_1](../images/quick_setup_9_1.png)
选择我们要使用的任何同步方法, 然后 `Apply` 初始化并按要求建立本地和远程数据库. 如果显示 `All done!`, 我们就完成了. `Copy setup URI` 将自动打开,并要求我们输入密码以加密 `Setup URI`.
![quick_setup_10](../images/quick_setup_10.png)
根据需要设置密码。
设置 URI (Setup URI) 将被复制到剪贴板, 然后您可以通过某种方式将其传输到第二个及后续设备.
## 如何设置第二单元和后续单元 (the second and subsequent units)
在第一台设备上安装 Self-hosted LiveSync 后, 从命令面板上选择 `Open setup URI`, 然后输入您传输的设置 URI (Setup URI). 然后输入密码,安装向导就会打开.
在弹窗中选择以下内容.
- `Importing LiveSync's conf, OK?` 选择 `Yes`
- `How would you like to set it up?`. 选择 `Set it up as secondary or subsequent device`
然后, 配置将生效并开始复制. 您的文件很快就会同步! 您可能需要关闭设置对话框并重新打开, 才能看到设置字段正确填充, 但它们都将设置好.

View File

@@ -280,7 +280,7 @@ Now the CouchDB is ready to use from Self-hosted LiveSync. We can use `https://b
## Automatic setup using Colaboratory
We can perform all these steps by using [this Colaboratory notebook](https://gist.github.com/vrtmrz/b437a539af25ef191bd452aae369242f) without installing anything.
We can perform all these steps by using [this Colaboratory notebook](/deploy_couchdb_to_flyio_v2_with_swap.ipynb) without installing anything.
## After testing / before creating a new instance

View File

@@ -1,10 +1,22 @@
# Setup a CouchDB server
## Table of Contents
- [Configure](#configure)
- [Run](#run)
- [Docker CLI](#docker-cli)
- [Docker Compose](#docker-compose)
- [Access from a mobile device](#access-from-a-mobile-device)
- [Testing from a mobile](#testing-from-a-mobile)
- [Setting up your domain](#setting-up-your-domain)
- [Reverse Proxies](#reverse-proxies)
- [Traefik](#traefik)
---
## Configure
The easiest way to set up a CouchDB instance is using the official [docker image](https://hub.docker.com/_/couchdb).
Some initial configuration is required. Create a `local.ini` to use Self-hosted LiveSync as follows:
Some initial configuration is required. Create a `local.ini` to use Self-hosted LiveSync as follows ([CouchDB has to be version 3.2 or higher](https://docs.couchdb.org/en/latest/config/http.html#chttpd/enable_cors); on lower versions, `enable_cors = true` has to be under the `[httpd]` section):
```ini
[couchdb]
@@ -14,6 +26,7 @@ max_document_size = 50000000
[chttpd]
require_valid_user = true
max_http_request_size = 4294967296
enable_cors = true
[chttpd_auth]
require_valid_user = true
@@ -21,13 +34,13 @@ authentication_redirect = /_utils/session.html
[httpd]
WWW-Authenticate = Basic realm="couchdb"
enable_cors = true
bind_address = 0.0.0.0
[cors]
origins = app://obsidian.md,capacitor://localhost,http://localhost
origins = app://obsidian.md, capacitor://localhost, http://localhost
credentials = true
headers = accept, authorization, content-type, origin, referer
methods = GET, PUT, POST, HEAD, DELETE
methods = GET,PUT,POST,HEAD,DELETE
max_age = 3600
```
@@ -48,7 +61,7 @@ $ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=passw
*Remember to replace the path with the path to your local.ini*
### Docker Compose
Create a directory, place your `local.ini` within it, and create a `docker-compose.yml` alongside it. The directory structure should look similar to this:
Create a directory, place your `local.ini` within it, and create a `docker-compose.yml` alongside it. Make sure you still have write permissions on `local.ini` and on the `data` folder that will be created once the container starts. The directory structure should look similar to this:
```
obsidian-livesync
├── docker-compose.yml
@@ -127,6 +140,77 @@ Set the A record of your domain to point to your server, and host reverse proxy
Note: Mounting CouchDB on the top directory is not recommended.
Using Caddy is a handy way to serve the server with SSL automatically.
I have published [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) that launch Caddy and CouchDB at once. Please try it out.
I have published [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) that launch Caddy and CouchDB at once. If you are using Traefik you can check the [Reverse Proxies](#reverse-proxies) section below.
And, be sure to check the server log and be careful of malicious access.
## Reverse Proxies
### Traefik
If you are using Traefik, this [docker-compose.yml](https://github.com/vrtmrz/obsidian-livesync/blob/main/docker-compose.traefik.yml) file (also pasted below) has all the right CORS parameters set. It assumes you have an external network called `proxy`.
```yaml
version: "2.1"
services:
couchdb:
image: couchdb:latest
container_name: obsidian-livesync
user: 1000:1000
environment:
- COUCHDB_USER=username
- COUCHDB_PASSWORD=password
volumes:
- ./data:/opt/couchdb/data
- ./local.ini:/opt/couchdb/etc/local.ini
# Ports not needed when already passed to Traefik
#ports:
# - 5984:5984
restart: unless-stopped
networks:
- proxy
labels:
- "traefik.enable=true"
# The Traefik Network
- "traefik.docker.network=proxy"
# Don't forget to replace 'obsidian-livesync.example.org' with your own domain
- "traefik.http.routers.obsidian-livesync.rule=Host(`obsidian-livesync.example.org`)"
# The 'websecure' entryPoint is basically your HTTPS entrypoint. Check the next code snippet only if you are encountering problems; if this is not the first container you are reverse proxying, you probably already have a working Traefik configuration.
- "traefik.http.routers.obsidian-livesync.entrypoints=websecure"
- "traefik.http.routers.obsidian-livesync.service=obsidian-livesync"
- "traefik.http.services.obsidian-livesync.loadbalancer.server.port=5984"
- "traefik.http.routers.obsidian-livesync.tls=true"
# Replace the string 'letsencrypt' with your own certificate resolver
- "traefik.http.routers.obsidian-livesync.tls.certresolver=letsencrypt"
- "traefik.http.routers.obsidian-livesync.middlewares=obsidiancors"
# The part needed for CORS to work on Traefik 2.x starts here
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowmethods=GET,PUT,POST,HEAD,DELETE"
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolallowheaders=accept,authorization,content-type,origin,referer"
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolalloworiginlist=app://obsidian.md,capacitor://localhost,http://localhost"
- "traefik.http.middlewares.obsidiancors.headers.accesscontrolmaxage=3600"
- "traefik.http.middlewares.obsidiancors.headers.addvaryheader=true"
- "traefik.http.middlewares.obsidiancors.headers.accessControlAllowCredentials=true"
networks:
proxy:
external: true
```
Partial `traefik.yml` config file mentioned above:
```yml
...
entryPoints:
web:
address: ":80"
http:
redirections:
entryPoint:
to: "websecure"
scheme: "https"
websecure:
address: ":443"
...
```

View File

@@ -1,8 +1,19 @@
# 在你自己的服务器上设置 CouchDB
## 目录
- [配置 CouchDB](#配置-CouchDB)
- [运行 CouchDB](#运行-CouchDB)
- [Docker CLI](#docker-cli)
- [Docker Compose](#docker-compose)
- [创建数据库](#创建数据库)
- [从移动设备访问](#从移动设备访问)
- [移动设备测试](#移动设备测试)
- [设置你的域名](#设置你的域名)
---
> 注:提供了 [docker-compose.yml 和 ini 文件](https://github.com/vrtmrz/self-hosted-livesync-server) 可以同时启动 Caddy 和 CouchDB。推荐直接使用该 docker-compose 配置进行搭建。(若使用,请查阅链接中的文档,而不是这个文档)
## 安装 CouchDB 并从 PC 或 Mac 上访问
## 配置 CouchDB
设置 CouchDB 的最简单方法是使用 [CouchDB docker image]((https://hub.docker.com/_/couchdb)).
@@ -33,17 +44,62 @@ methods = GET, PUT, POST, HEAD, DELETE
max_age = 3600
```
创建 `local.ini` 并用如下指令启动 CouchDB
```
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v .local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
Note: 此时 local.ini 的文件所有者会变成 5984:5984。这是 docker 镜像的限制,请修改文件所有者后再编辑 local.ini。
## 运行 CouchDB
在确定 Self-hosted LiveSync 可以和服务器同步后,可以后台启动 docker 镜像:
### Docker CLI
你可以通过指定 `local.ini` 配置运行 CouchDB:
```
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v .local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
$ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
*记得将上述命令中的 local.ini 挂载路径替换成实际的存放路径*
后台运行:
```
$ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /path/to/local.ini:/opt/couchdb/etc/local.ini -p 5984:5984 couchdb
```
*记得将上述命令中的 local.ini 挂载路径替换成实际的存放路径*
### Docker Compose
创建一个文件夹, 将你的 `local.ini` 放在文件夹内, 然后在文件夹内创建 `docker-compose.yml`. 请确保对 `local.ini` 有读写权限并且确保在容器运行后能创建 `data` 文件夹. 文件夹结构大概如下:
```
obsidian-livesync
├── docker-compose.yml
└── local.ini
```
可以参照以下内容编辑 `docker-compose.yml`:
```yaml
version: "2.1"
services:
couchdb:
image: couchdb
container_name: obsidian-livesync
user: 1000:1000
environment:
- COUCHDB_USER=admin
- COUCHDB_PASSWORD=password
volumes:
- ./data:/opt/couchdb/data
- ./local.ini:/opt/couchdb/etc/local.ini
ports:
- 5984:5984
restart: unless-stopped
```
最后, 创建并启动容器:
```
# -d will launch detached so the container runs in background
docker compose up -d
```
## 创建数据库
CouchDB 部署成功后, 需要手动创建一个数据库, 方便插件连接并同步.
1. 访问 `http://localhost:5984/_utils`, 输入帐号密码后进入管理页面
2. 点击 Create Database, 然后根据个人喜好创建数据库
## 从移动设备访问
如果你想要从移动设备访问 Self-hosted LiveSync你需要一个合法的 SSL 证书。

View File

@@ -12,10 +12,10 @@ max_document_size = 50000000
[chttpd]
require_valid_user = true
max_http_request_size = 4294967296
[chttpd_auth]
require_valid_user = true
max_http_request_size = 4294967296
authentication_redirect = /_utils/session.html
[httpd]

View File

@@ -1,46 +1,165 @@
//@ts-check
import esbuild from "esbuild";
import process from "process";
import builtins from "builtin-modules";
import sveltePlugin from "esbuild-svelte";
import sveltePreprocess from "svelte-preprocess";
import fs from "node:fs";
// import terser from "terser";
import { minify } from "terser";
const banner = `/*
THIS IS A GENERATED/BUNDLED FILE BY ESBUILD
THIS IS A GENERATED/BUNDLED FILE BY ESBUILD AND TERSER
if you want to view the source, please visit the github repository of this plugin
*/
`;
const prod = process.argv[2] === "production";
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json"));
const packageJson = JSON.parse(fs.readFileSync("./package.json"));
const terserOpt = {
sourceMap: (!prod ? {
url: "inline"
} : {}),
format: {
indent_level: 2,
beautify: true,
comments: "some",
ecma: 2018,
preamble: banner,
webkit: true
},
parse: {
// parse options
},
compress: {
// compress options
defaults: false,
evaluate: true,
inline: 3,
join_vars: true,
loops: true,
passes: prod ? 4 : 1,
reduce_vars: true,
reduce_funcs: true,
arrows: true,
collapse_vars: true,
comparisons: true,
lhs_constants: true,
hoist_props: true,
side_effects: true,
if_return: true,
ecma: 2018,
unused: true,
},
// mangle: {
// // mangle options
// keep_classnames: true,
// keep_fnames: true,
// properties: {
// // mangle property options
// }
// },
ecma: 2018, // specify one of: 5, 2015, 2016, etc.
enclose: false, // or specify true, or "args:values"
keep_classnames: true,
keep_fnames: true,
ie8: false,
module: false,
// nameCache: null, // or specify a name cache object
safari10: false,
toplevel: false
}
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json") + "");
const packageJson = JSON.parse(fs.readFileSync("./package.json") + "");
const updateInfo = JSON.stringify(fs.readFileSync("./updates.md") + "");
esbuild
.build({
banner: {
js: banner,
},
entryPoints: ["src/main.ts"],
bundle: true,
define: {
"MANIFEST_VERSION": `"${manifestJson.version}"`,
"PACKAGE_VERSION": `"${packageJson.version}"`,
"UPDATE_INFO": `${updateInfo}`,
"global":"window",
},
external: ["obsidian", "electron", "crypto"],
format: "cjs",
watch: !prod,
target: "es2018",
logLevel: "info",
sourcemap: prod ? false : "inline",
treeShaking: true,
platform: "browser",
plugins: [
sveltePlugin({
preprocess: sveltePreprocess(),
compilerOptions: { css: true },
}),
],
outfile: "main.js",
})
.catch(() => process.exit(1));
/** @type esbuild.Plugin[] */
const plugins = [{
name: 'my-plugin',
setup(build) {
let count = 0;
build.onEnd(async result => {
if (count++ === 0) {
console.log('first build:', result);
} else {
console.log('subsequent build:');
}
if (prod) {
console.log("Performing terser");
const src = fs.readFileSync("./main_org.js").toString();
// @ts-ignore
const ret = await minify(src, terserOpt);
if (ret && ret.code) {
fs.writeFileSync("./main.js", ret.code);
}
console.log("Finished terser");
} else {
fs.copyFileSync("./main_org.js", "./main.js");
}
});
},
}];
const context = await esbuild.context({
banner: {
js: banner,
},
entryPoints: ["src/main.ts"],
bundle: true,
define: {
"MANIFEST_VERSION": `"${manifestJson.version}"`,
"PACKAGE_VERSION": `"${packageJson.version}"`,
"UPDATE_INFO": `${updateInfo}`,
"global": "window",
},
external: [
"obsidian",
"electron",
"crypto",
"@codemirror/autocomplete",
"@codemirror/collab",
"@codemirror/commands",
"@codemirror/language",
"@codemirror/lint",
"@codemirror/search",
"@codemirror/state",
"@codemirror/view",
"@lezer/common",
"@lezer/highlight",
"@lezer/lr"],
// minifyWhitespace: true,
format: "cjs",
target: "es2018",
logLevel: "info",
platform: "browser",
sourcemap: prod ? false : "inline",
treeShaking: true,
outfile: "main_org.js",
minifyWhitespace: false,
minifySyntax: false,
minifyIdentifiers: false,
minify: false,
// keepNames: true,
plugins: [
sveltePlugin({
preprocess: sveltePreprocess(),
compilerOptions: { css: true, preserveComments: true },
}),
...plugins
],
})
if (prod) {
await context.rebuild();
process.exit(0);
} else {
await context.watch();
}

View File

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.19.8",
"version": "0.20.7",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

3350
package-lock.json generated

File diff suppressed because it is too large.

View File

@@ -1,6 +1,6 @@
{
"name": "obsidian-livesync",
"version": "0.19.8",
"version": "0.20.7",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"type": "module",
@@ -13,40 +13,51 @@
"author": "vorotamoroz",
"license": "MIT",
"devDependencies": {
"@tsconfig/svelte": "^4.0.1",
"@tsconfig/svelte": "^5.0.0",
"@types/diff-match-patch": "^1.0.32",
"@types/node": "^20.2.5",
"@types/pouchdb": "^6.4.0",
"@types/pouchdb-adapter-http": "^6.1.3",
"@types/pouchdb-adapter-idb": "^6.1.4",
"@types/pouchdb-browser": "^6.1.3",
"@typescript-eslint/eslint-plugin": "^5.54.0",
"@typescript-eslint/parser": "^5.54.0",
"@types/pouchdb-core": "^7.0.11",
"@types/pouchdb-mapreduce": "^6.1.7",
"@types/pouchdb-replication": "^6.4.4",
"@types/transform-pouch": "^1.0.2",
"@typescript-eslint/eslint-plugin": "^6.2.1",
"@typescript-eslint/parser": "^6.2.1",
"builtin-modules": "^3.3.0",
"esbuild": "0.15.15",
"esbuild-svelte": "^0.7.3",
"eslint": "^8.35.0",
"esbuild": "0.18.17",
"esbuild-svelte": "^0.7.4",
"eslint": "^8.46.0",
"eslint-config-airbnb-base": "^15.0.0",
"eslint-plugin-import": "^2.27.5",
"eslint-plugin-import": "^2.28.0",
"events": "^3.3.0",
"obsidian": "^1.1.1",
"postcss": "^8.4.21",
"obsidian": "^1.4.11",
"postcss": "^8.4.27",
"postcss-load-config": "^4.0.1",
"pouchdb-adapter-http": "^8.0.1",
"pouchdb-adapter-idb": "^8.0.1",
"pouchdb-adapter-indexeddb": "^8.0.1",
"pouchdb-core": "^8.0.1",
"pouchdb-errors": "^8.0.1",
"pouchdb-find": "^8.0.1",
"pouchdb-mapreduce": "^8.0.1",
"pouchdb-merge": "^8.0.1",
"pouchdb-replication": "^8.0.1",
"pouchdb-utils": "^8.0.1",
"svelte": "^3.59.1",
"svelte-preprocess": "^5.0.3",
"svelte": "^4.1.2",
"svelte-preprocess": "^5.0.4",
"terser": "^5.19.2",
"transform-pouch": "^2.0.0",
"tslib": "^2.5.0",
"typescript": "^5.0.4"
"tslib": "^2.6.1",
"typescript": "^5.1.6"
},
"dependencies": {
"diff-match-patch": "^1.0.5",
"idb": "^7.1.1",
"xxhash-wasm": "^0.4.2"
"minimatch": "^9.0.3",
"xxhash-wasm": "0.4.2",
"xxhash-wasm-102": "npm:xxhash-wasm@^1.0.2"
}
}

View File

@@ -1,27 +1,124 @@
import { writable } from 'svelte/store';
import { Notice, type PluginManifest, parseYaml } from "./deps";
import { Notice, type PluginManifest, parseYaml, normalizePath } from "./deps";
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry } from "./lib/src/types";
import { LOG_LEVEL } from "./lib/src/types";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "./lib/src/types";
import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
import { Parallels, delay, getDocData } from "./lib/src/utils";
import { delay, getDocData } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { WrappedNotice } from "./lib/src/wrapper";
import { base64ToArrayBuffer, arrayBufferToBase64, readString, uint8ArrayToHexString } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { readString, crc32CKHash, decodeBinary, encodeBinary } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { stripAllPrefixes } from "./lib/src/path";
import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "./utils";
import { Semaphore } from "./lib/src/semaphore";
import { PluginDialogModal } from "./dialogs";
import { JsonResolveModal } from "./JsonResolveModal";
import { pipeGeneratorToGenerator, processAllGeneratorTasksWithConcurrencyLimit } from './lib/src/task';
const d = "\u200b";
const d2 = "\n";
function serialize<T>(obj: T): string {
return JSON.stringify(obj, null, 1);
function serialize(data: PluginDataEx): string {
// For higher performance, create custom plug-in data strings.
// Self-hosted LiveSync uses `\n` to split chunks. Therefore, grouping together those with similar entropy would work nicely.
let ret = "";
ret += ":";
ret += data.category + d + data.name + d + data.term + d2;
ret += (data.version ?? "") + d2;
ret += data.mtime + d2;
for (const file of data.files) {
ret += file.filename + d + (file.displayName ?? "") + d + (file.version ?? "") + d2;
ret += file.mtime + d + file.size + d2;
for (const data of file.data ?? []) {
ret += data + d
}
ret += d2;
}
return ret;
}
function fetchToken(source: string, from: number): [next: number, token: string] {
const limitIdx = source.indexOf(d2, from);
const limit = limitIdx == -1 ? source.length : limitIdx;
const delimiterIdx = source.indexOf(d, from);
const delimiter = delimiterIdx == -1 ? source.length : delimiterIdx;
const tokenEnd = Math.min(limit, delimiter);
let next = tokenEnd;
if (limit < delimiter) {
next = tokenEnd;
} else {
next = tokenEnd + 1
}
return [next, source.substring(from, tokenEnd)];
}
function getTokenizer(source: string) {
const t = {
pos: 1,
next() {
const [next, token] = fetchToken(source, this.pos);
this.pos = next;
return token;
},
nextLine() {
const nextPos = source.indexOf(d2, this.pos);
if (nextPos == -1) {
this.pos = source.length;
} else {
this.pos = nextPos + 1;
}
}
}
return t;
}
function deserialize2(str: string): PluginDataEx {
const tokens = getTokenizer(str);
const ret = {} as PluginDataEx;
const category = tokens.next();
const name = tokens.next();
const term = tokens.next();
tokens.nextLine();
const version = tokens.next();
tokens.nextLine();
const mtime = Number(tokens.next());
tokens.nextLine();
const result: PluginDataEx = Object.assign(ret,
{ category, name, term, version, mtime, files: [] as PluginDataExFile[] })
let filename = "";
do {
filename = tokens.next();
if (!filename) break;
const displayName = tokens.next();
const version = tokens.next();
tokens.nextLine();
const mtime = Number(tokens.next());
const size = Number(tokens.next());
tokens.nextLine();
const data = [] as string[];
let piece = "";
do {
piece = tokens.next();
if (piece == "") break;
data.push(piece);
} while (piece != "");
result.files.push(
{
filename,
displayName,
version,
mtime,
size,
data
}
)
tokens.nextLine();
} while (filename);
return result;
}
function deserialize<T>(str: string, def: T) {
try {
if (str[0] == ":") return deserialize2(str);
return JSON.parse(str) as T;
} catch (ex) {
try {
@@ -36,14 +133,6 @@ function deserialize<T>(str: string, def: T) {
export const pluginList = writable([] as PluginDataExDisplay[]);
export const pluginIsEnumerating = writable(false);
const encoder = new TextEncoder();
const hashString = (async (key: string) => {
// const buff = writeString(key);
const buff = encoder.encode(key);
const digest = await crypto.subtle.digest('SHA-256', buff);
return uint8ArrayToHexString(new Uint8Array(digest));
})
export type PluginDataExFile = {
filename: string,
data?: string[],
@@ -115,6 +204,7 @@ export class ConfigSync extends LiveSyncCommands {
},
});
}
getFileCategory(filePath: string): "CONFIG" | "THEME" | "SNIPPET" | "PLUGIN_MAIN" | "PLUGIN_ETC" | "PLUGIN_DATA" | "" {
if (filePath.split("/").length == 2 && filePath.endsWith(".json")) return "CONFIG";
if (filePath.split("/").length == 4 && filePath.startsWith(`${this.app.vault.configDir}/themes/`)) return "THEME";
@@ -147,7 +237,7 @@ export class ConfigSync extends LiveSyncCommands {
Logger("Scanning customizations : done");
} catch (ex) {
Logger("Scanning customizations : failed");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
@@ -172,85 +262,104 @@ export class ConfigSync extends LiveSyncCommands {
pluginList.set(this.pluginList)
await this.updatePluginList(showMessage);
}
async loadPluginData(path: FilePathWithPrefix): Promise<PluginDataExDisplay | false> {
const wx = await this.localDatabase.getDBEntry(path, null, false, false);
if (wx) {
const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
const xFiles = [] as PluginDataExFile[];
for (const file of data.files) {
const work = { ...file };
const tempStr = getDocData(work.data);
work.data = [crc32CKHash(tempStr)];
xFiles.push(work);
}
return ({
...data,
documentPath: this.getPath(wx),
files: xFiles
}) as PluginDataExDisplay;
}
return false;
}
createMissingConfigurationEntry() {
let saveRequired = false;
for (const v of this.pluginList) {
const key = `${v.category}/${v.name}`;
if (!(key in this.plugin.settings.pluginSyncExtendedSetting)) {
this.plugin.settings.pluginSyncExtendedSetting[key] = {
key,
mode: MODE_SELECTIVE,
files: []
}
}
if (this.plugin.settings.pluginSyncExtendedSetting[key].files.sort().join(",").toLowerCase() !=
v.files.map(e => e.filename).sort().join(",").toLowerCase()) {
this.plugin.settings.pluginSyncExtendedSetting[key].files = v.files.map(e => e.filename).sort();
saveRequired = true;
}
}
if (saveRequired) {
this.plugin.saveSettingData();
}
}
async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
// pluginList.set([]);
if (!this.settings.usePluginSync) {
this.pluginList = [];
pluginList.set(this.pluginList)
return;
}
await runWithLock("update-plugin-list", false, async () => {
// if (updatedDocumentPath != "") pluginList.update(e => e.filter(ee => ee.documentPath != updatedDocumentPath));
// const work: Record<string, Record<string, Record<string, Record<string, PluginDataEntryEx>>>> = {};
const entries = [] as PluginDataExDisplay[]
const plugins = this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
const semaphore = Semaphore(4);
const para = Parallels();
let count = 0;
pluginIsEnumerating.set(true);
let processed = false;
try {
for await (const plugin of plugins) {
const path = plugin.path || this.getPath(plugin);
if (updatedDocumentPath && updatedDocumentPath != path) {
continue;
}
processed = true;
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) continue;
await para.wait(5);
para.add((async (v) => {
const release = await semaphore.acquire(1);
await Promise.resolve(); // Just to prevent warning.
scheduleTask("update-plugin-list-task", 200, async () => {
await serialized("update-plugin-list", async () => {
try {
const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
const plugins = updatedDocumentPath ?
this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
let count = 0;
pluginIsEnumerating.set(true);
for await (const v of processAllGeneratorTasksWithConcurrencyLimit(20, pipeGeneratorToGenerator(plugins, async plugin => {
const path = plugin.path || this.getPath(plugin);
if (updatedDocumentPath && updatedDocumentPath != path) {
return false;
}
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) return false;
try {
count++;
if (count % 10 == 0) Logger(`Enumerating files... ${count}`, logLevel, "get-plugins");
Logger(`plugin-${path}`, LOG_LEVEL.VERBOSE);
const wx = await this.localDatabase.getDBEntry(path, null, false, false);
if (wx) {
const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
const xFiles = [] as PluginDataExFile[];
for (const file of data.files) {
const work = { ...file };
const tempStr = getDocData(work.data);
work.data = [await hashString(tempStr)];
xFiles.push(work);
}
entries.push({
...data,
documentPath: this.getPath(wx),
files: xFiles
});
}
Logger(`plugin-${path}`, LOG_LEVEL_VERBOSE);
return this.loadPluginData(path);
// return entries;
} catch (ex) {
//TODO
Logger(`Something happened at enumerating customization :${v.path}`, LOG_LEVEL.NOTICE);
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
console.warn(ex);
} finally {
release();
}
return false;
}))) {
if ("ok" in v) {
if (v.ok !== false) {
let newList = [...this.pluginList];
const item = v.ok;
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
if (updatedDocumentPath != "") newList = newList.filter(e => e.documentPath != updatedDocumentPath);
this.pluginList = newList;
pluginList.set(newList);
}
}
}
)(plugin));
Logger(`All files enumerated`, logLevel, "get-plugins");
pluginIsEnumerating.set(false);
this.createMissingConfigurationEntry();
} finally {
pluginIsEnumerating.set(false);
}
await para.all();
let newList = [...this.pluginList];
for (const item of entries) {
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
}
if (updatedDocumentPath != "" && !processed) newList = newList.filter(e => e.documentPath != updatedDocumentPath);
this.pluginList = newList;
pluginList.set(newList);
Logger(`All files enumerated`, logLevel, "get-plugins");
} finally {
pluginIsEnumerating.set(false);
}
});
pluginIsEnumerating.set(false);
});
// return entries;
}
@@ -274,8 +383,8 @@ export class ConfigSync extends LiveSyncCommands {
const fileA = { ...pluginDataA.files[0], ctime: pluginDataA.files[0].mtime, _id: `${pluginDataA.documentPath}` as DocumentID };
const fileB = pluginDataB.files[0];
const docAx = { ...docA, ...fileA } as LoadedEntry, docBx = { ...docB, ...fileB } as LoadedEntry
return runWithLock("config:merge-data", false, () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
return serialized("config:merge-data", () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
// const docs = [docA, docB];
const path = stripAllPrefixes(docAx.path.split("/").slice(-1).join("/") as FilePath);
const modal = new JsonResolveModal(this.app, path, [docAx, docBx], async (keep, result) => {
@@ -284,7 +393,7 @@ export class ConfigSync extends LiveSyncCommands {
res(await this.applyData(pluginDataA, result));
} catch (ex) {
Logger("Could not apply merged file");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
res(false);
}
}, "📡", "🛰️", "B");
@@ -308,7 +417,7 @@ export class ConfigSync extends LiveSyncCommands {
const path = `${baseDir}/${f.filename}`;
await this.ensureDirectoryEx(path);
if (!content) {
const dt = base64ToArrayBuffer(f.data);
const dt = decodeBinary(f.data);
await this.app.vault.adapter.writeBinary(path, dt);
} else {
await this.app.vault.adapter.write(path, content);
@@ -317,7 +426,7 @@ export class ConfigSync extends LiveSyncCommands {
} catch (ex) {
Logger(`Applying ${f.filename} of ${data.displayName || data.name}.. Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
@@ -325,7 +434,7 @@ export class ConfigSync extends LiveSyncCommands {
await this.storeCustomizationFiles(uPath);
await this.updatePluginList(true, uPath);
await delay(100);
Logger(`Config ${data.displayName || data.name} has been applied`, LOG_LEVEL.NOTICE);
Logger(`Config ${data.displayName || data.name} has been applied`, LOG_LEVEL_NOTICE);
if (data.category == "PLUGIN_DATA" || data.category == "PLUGIN_MAIN") {
//@ts-ignore
const manifests = Object.values(this.app.plugins.manifests) as any as PluginManifest[];
@@ -333,12 +442,12 @@ export class ConfigSync extends LiveSyncCommands {
const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
const pluginManifest = manifests.find((manifest) => enabledPlugins.has(manifest.id) && manifest.dir == `${baseDir}/plugins/${data.name}`);
if (pluginManifest) {
Logger(`Unloading plugin: ${pluginManifest.name}`, LOG_LEVEL.NOTICE, "plugin-reload-" + pluginManifest.id);
Logger(`Unloading plugin: ${pluginManifest.name}`, LOG_LEVEL_NOTICE, "plugin-reload-" + pluginManifest.id);
// @ts-ignore
await this.app.plugins.unloadPlugin(pluginManifest.id);
// @ts-ignore
await this.app.plugins.loadPlugin(pluginManifest.id);
Logger(`Plugin reloaded: ${pluginManifest.name}`, LOG_LEVEL.NOTICE, "plugin-reload-" + pluginManifest.id);
Logger(`Plugin reloaded: ${pluginManifest.name}`, LOG_LEVEL_NOTICE, "plugin-reload-" + pluginManifest.id);
}
} else if (data.category == "CONFIG") {
scheduleTask("configReload", 250, async () => {
@@ -351,7 +460,7 @@ export class ConfigSync extends LiveSyncCommands {
return true;
} catch (ex) {
Logger(`Applying ${data.displayName || data.name}.. Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
}
@@ -360,11 +469,11 @@ export class ConfigSync extends LiveSyncCommands {
if (data.documentPath) {
await this.deleteConfigOnDatabase(data.documentPath);
await this.updatePluginList(false, data.documentPath);
Logger(`Delete: ${data.documentPath}`, LOG_LEVEL.NOTICE);
Logger(`Delete: ${data.documentPath}`, LOG_LEVEL_NOTICE);
}
return true;
} catch (ex) {
Logger(`Failed to delete: ${data.documentPath}`, LOG_LEVEL.NOTICE);
Logger(`Failed to delete: ${data.documentPath}`, LOG_LEVEL_NOTICE);
return false;
}
@@ -428,6 +537,7 @@ export class ConfigSync extends LiveSyncCommands {
this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
return;
}
recentProcessedInternalFiles = [] as string[];
async makeEntryFromFile(path: FilePath): Promise<false | PluginDataExFile> {
const stat = await this.app.vault.adapter.stat(path);
@@ -439,7 +549,7 @@ export class ConfigSync extends LiveSyncCommands {
const contentBin = await this.app.vault.adapter.readBinary(path);
let content: string[];
try {
content = await arrayBufferToBase64(contentBin);
content = await encodeBinary(contentBin, this.settings.useV1);
if (path.toLowerCase().endsWith("/manifest.json")) {
const v = readString(new Uint8Array(contentBin));
try {
@@ -451,12 +561,12 @@ export class ConfigSync extends LiveSyncCommands {
displayName = `${json.name}`;
}
} catch (ex) {
Logger(`Configuration sync data: ${path} looks like manifest, but could not read the version`, LOG_LEVEL.INFO);
Logger(`Configuration sync data: ${path} looks like manifest, but could not read the version`, LOG_LEVEL_INFO);
}
}
} catch (ex) {
Logger(`The file ${path} could not be encoded`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
const mtime = stat.mtime;
@@ -483,11 +593,11 @@ export class ConfigSync extends LiveSyncCommands {
async storeCustomizationFiles(path: FilePath, termOverRide?: string) {
const term = termOverRide || this.plugin.deviceAndVaultName;
if (term == "") {
Logger("We have to configure the device name", LOG_LEVEL.NOTICE);
Logger("We have to configure the device name", LOG_LEVEL_NOTICE);
return;
}
const vf = this.filenameToUnifiedKey(path, term);
return await runWithLock(`plugin-${vf}`, false, async () => {
return await serialized(`plugin-${vf}`, async () => {
const category = this.getFileCategory(path);
let mtime = 0;
let fileTargets = [] as FilePath[];
@@ -519,7 +629,7 @@ export class ConfigSync extends LiveSyncCommands {
for (const target of fileTargets) {
const data = await this.makeEntryFromFile(target);
if (data == false) {
// Logger(`Config: skipped: ${target} `, LOG_LEVEL.VERBOSE);
// Logger(`Config: skipped: ${target} `, LOG_LEVEL_VERBOSE);
continue;
}
if (data.version) {
@@ -561,7 +671,7 @@ export class ConfigSync extends LiveSyncCommands {
};
} else {
if (old.mtime == mtime) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
saveData =
@@ -582,7 +692,7 @@ export class ConfigSync extends LiveSyncCommands {
return ret;
} catch (ex) {
Logger(`STORAGE --> DB:${prefixedFileName}: (config) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
})
@@ -595,6 +705,13 @@ export class ConfigSync extends LiveSyncCommands {
// Make sure that target is a file.
if (stat && stat.type != "file")
return false;
const configDir = normalizePath(this.app.vault.configDir);
const synchronisedInConfigSync = Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode != MODE_SELECTIVE).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
if (synchronisedInConfigSync.some(e => e.startsWith(path.toLowerCase()))) {
Logger(`Customization file skipped: ${path}`, LOG_LEVEL_VERBOSE);
return;
}
const storageMTime = ~~((stat && stat.mtime || 0) / 1000);
const key = `${path}-${storageMTime}`;
if (this.recentProcessedInternalFiles.contains(key)) {
@@ -609,11 +726,11 @@ export class ConfigSync extends LiveSyncCommands {
async scanAllConfigFiles(showMessage: boolean) {
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
Logger("Scanning customizing files.", logLevel, "scan-all-config");
const term = this.plugin.deviceAndVaultName;
if (term == "") {
Logger("We have to configure the device name", LOG_LEVEL.NOTICE);
Logger("We have to configure the device name", LOG_LEVEL_NOTICE);
return;
}
const filesAll = await this.scanInternalFiles();
@@ -635,7 +752,7 @@ export class ConfigSync extends LiveSyncCommands {
// const id = await this.path2id(prefixedFileName);
const mtime = new Date().getTime();
await runWithLock("file-x-" + prefixedFileName, false, async () => {
await serialized("file-x-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false) as InternalFileEntry | false;
let saveData: InternalFileEntry;
@@ -661,7 +778,7 @@ export class ConfigSync extends LiveSyncCommands {
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) Done`);
} catch (ex) {
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
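The changes above are largely mechanical: every `runWithLock(key, false, fn)` becomes `serialized(key, fn)`, and the `LOG_LEVEL.X` enum members become flat `LOG_LEVEL_X` constants imported from `lib/src/types`. A minimal sketch of the pattern, with stand-in implementations (the real helpers live in `lib/src/lock` and `lib/src/types`; the signatures and semantics here are assumptions read off this diff):

```ts
// Stand-ins for illustration only; not the actual implementations from lib/src/lock or lib/src/types.
const LOG_LEVEL_VERBOSE = 1;   // assumed numeric values
const LOG_LEVEL_NOTICE = 10;

function Logger(message: unknown, level: number = LOG_LEVEL_VERBOSE) {
    console.log(`[${level}]`, message);
}

const queues = new Map<string, Promise<unknown>>();

// Assumed semantics of `serialized`: tasks sharing a key run one at a time, in submission order.
function serialized<T>(key: string, task: () => Promise<T>): Promise<T> {
    const prev = queues.get(key) ?? Promise.resolve();
    const next = prev.then(task, task);
    queues.set(key, next.catch(() => undefined));
    return next;
}

// Before: await runWithLock(`plugin-${id}`, false, async () => { ... });
// After:
async function storeCustomizationExample(id: string) {
    return await serialized(`plugin-${id}`, async () => {
        Logger(`Storing customization for ${id}`, LOG_LEVEL_VERBOSE);
        // ... read files, compare mtimes, and write to the local database ...
        return true;
    });
}
```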

View File

@@ -1,13 +1,13 @@
import { Notice, normalizePath, PluginManifest } from "./deps";
import { EntryDoc, LoadedEntry, LOG_LEVEL, InternalFileEntry, FilePathWithPrefix, FilePath } from "./lib/src/types";
import { InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
import { normalizePath, type PluginManifest } from "./deps";
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED } from "./lib/src/types";
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
import { Parallels, delay, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask, isInternalMetadata, PeriodicProcessor } from "./utils";
import { scheduleTask, isInternalMetadata, PeriodicProcessor } from "./utils";
import { WrappedNotice } from "./lib/src/wrapper";
import { base64ToArrayBuffer, arrayBufferToBase64 } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { decodeBinary, encodeBinary } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { JsonResolveModal } from "./JsonResolveModal";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { addPrefix, stripAllPrefixes } from "./lib/src/path";
@@ -44,7 +44,7 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger("Synchronizing hidden files done");
} catch (ex) {
Logger("Synchronizing hidden files failed");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
}
@@ -77,7 +77,7 @@ export class HiddenFileSync extends LiveSyncCommands {
procInternalFiles: string[] = [];
async execInternalFile() {
await runWithLock("execInternal", false, async () => {
await serialized("execInternal", async () => {
const w = [...this.procInternalFiles];
this.procInternalFiles = [];
Logger(`Applying hidden ${w.length} files change...`);
@@ -95,6 +95,14 @@ export class HiddenFileSync extends LiveSyncCommands {
recentProcessedInternalFiles = [] as string[];
async watchVaultRawEventsAsync(path: FilePath) {
if (!this.settings.syncInternalFiles) return;
// Exclude files handled by customization sync
const configDir = normalizePath(this.app.vault.configDir);
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
if (synchronisedInConfigSync.some(e => e.startsWith(path.toLowerCase()))) {
Logger(`Hidden file skipped: ${path} is synchronized in customization sync.`, LOG_LEVEL_VERBOSE);
return;
}
const stat = await this.app.vault.adapter.stat(path);
// Sometimes a folder is passed here.
if (stat && stat.type != "file")
@@ -161,7 +169,7 @@ export class HiddenFileSync extends LiveSyncCommands {
const commonBase = revFrom._revs_info.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
const result = await this.plugin.mergeObject(path, commonBase, doc._rev, conflictedRev);
if (result) {
Logger(`Object merge:${path}`, LOG_LEVEL.INFO);
Logger(`Object merge:${path}`, LOG_LEVEL_INFO);
const filename = stripAllPrefixes(path);
const isExists = await this.app.vault.adapter.exists(filename);
if (!isExists) {
@@ -174,7 +182,7 @@ export class HiddenFileSync extends LiveSyncCommands {
await this.localDatabase.removeRaw(id, revB);
return this.resolveConflictOnInternalFile(path);
} else {
Logger(`Object merge is not applicable.`, LOG_LEVEL.VERBOSE);
Logger(`Object merge is not applicable.`, LOG_LEVEL_VERBOSE);
}
const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
@@ -203,24 +211,30 @@ export class HiddenFileSync extends LiveSyncCommands {
return this.resolveConflictOnInternalFile(path);
} catch (ex) {
Logger(`Failed to resolve conflict (Hidden): ${path}`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
}
//TODO: Tidy up. Even though this is an experimental feature, it is still too messy.
async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, files: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, filesAll: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
await this.resolveConflictOnInternalFiles();
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
Logger("Scanning hidden files.", logLevel, "sync_internal");
const ignorePatterns = this.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
if (!files)
files = await this.scanInternalFiles();
const configDir = normalizePath(this.app.vault.configDir);
let files: InternalFileInfo[] =
filesAll ? filesAll : (await this.scanInternalFiles())
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
files = files.filter(file => synchronisedInConfigSync.every(filterFile => !file.path.toLowerCase().startsWith(filterFile)))
const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICHeader, endkey: ICHeaderEnd, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
const allFileNamesSrc = [...new Set([...files.map(e => normalizePath(e.path)), ...filesOnDB.map(e => stripAllPrefixes(this.getPath(e)))])];
const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1));
const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1)).filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile)))
function compareMTime(a: number, b: number) {
const wa = ~~(a / 1000);
const wb = ~~(b / 1000);
@@ -273,6 +287,9 @@ export class HiddenFileSync extends LiveSyncCommands {
if (!filename) continue;
if (ignorePatterns.some(e => filename.match(e)))
continue;
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
continue;
}
const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
const fileOnDatabase = filename in filesOnDBMap ? filesOnDBMap[filename] : undefined;
@@ -332,7 +349,6 @@ export class HiddenFileSync extends LiveSyncCommands {
// When files have been retrieved from the database, they must be reloaded.
if ((direction == "pull" || direction == "pullForce") && filesChanged != 0) {
const configDir = normalizePath(this.app.vault.configDir);
// Show notification to restart obsidian when something has been changed in configDir.
if (configDir in updatedFolders) {
// Number of updated files below configDir.
@@ -349,78 +365,33 @@ export class HiddenFileSync extends LiveSyncCommands {
updatedCount -= updatedFolders[manifest.dir];
const updatePluginId = manifest.id;
const updatePluginName = manifest.name;
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
a.appendText(`Files in ${updatePluginName} has been updated, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", async () => {
Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
// @ts-ignore
await this.app.plugins.unloadPlugin(updatePluginId);
// @ts-ignore
await this.app.plugins.loadPlugin(updatePluginId);
Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
});
}));
a.appendText(` to reload ${updatePluginName}, or press elsewhere to dismiss this message.`);
this.plugin.askInPopup(`updated-${updatePluginId}`, `Files in ${updatePluginName} have been updated. Press {HERE} to reload ${updatePluginName}, or press elsewhere to dismiss this message.`, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", async () => {
Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL_NOTICE, "plugin-reload-" + updatePluginId);
// @ts-ignore
await this.app.plugins.unloadPlugin(updatePluginId);
// @ts-ignore
await this.app.plugins.loadPlugin(updatePluginId);
Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL_NOTICE, "plugin-reload-" + updatePluginId);
});
});
const updatedPluginKey = "popupUpdated-" + updatePluginId;
scheduleTask(updatedPluginKey, 1000, async () => {
const popup = await memoIfNotExist(updatedPluginKey, () => new Notice(fragment, 0));
//@ts-ignore
const isShown = popup?.noticeEl?.isShown();
if (!isShown) {
memoObject(updatedPluginKey, new Notice(fragment, 0));
}
scheduleTask(updatedPluginKey + "-close", 20000, () => {
const popup = retrieveMemoObject<Notice>(updatedPluginKey);
if (!popup)
return;
//@ts-ignore
if (popup?.noticeEl?.isShown()) {
popup.hide();
}
disposeMemoObject(updatedPluginKey);
});
});
}
);
}
}
} catch (ex) {
Logger("Error on checking plugin status.");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
// If something changes left, notify for reloading Obsidian.
if (updatedCount != 0) {
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
a.appendText(`Hidden files have been synchronized, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
// @ts-ignore
this.app.commands.executeCommandById("app:reload");
});
}));
a.appendText(` to reload obsidian, or press elsewhere to dismiss this message.`);
});
});
scheduleTask("popupUpdated-" + configDir, 1000, () => {
//@ts-ignore
const isShown = this.confirmPopup?.noticeEl?.isShown();
if (!isShown) {
this.confirmPopup = new Notice(fragment, 0);
}
scheduleTask("popupClose" + configDir, 20000, () => {
this.confirmPopup?.hide();
this.confirmPopup = null;
this.plugin.askInPopup(`updated-any-hidden`, `Hidden files have been synchronized. Press {HERE} to reload Obsidian, or press elsewhere to dismiss this message.`, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
// @ts-ignore
this.app.commands.executeCommandById("app:reload");
});
});
}
@@ -431,19 +402,23 @@ export class HiddenFileSync extends LiveSyncCommands {
}
async storeInternalFileToDatabase(file: InternalFileInfo, forceWrite = false) {
if (await this.plugin.isIgnoredByIgnoreFiles(file.path)) {
return
}
const id = await this.path2id(file.path, ICHeader);
const prefixedFileName = addPrefix(file.path, ICHeader);
const contentBin = await this.app.vault.adapter.readBinary(file.path);
let content: string[];
try {
content = await arrayBufferToBase64(contentBin);
content = await encodeBinary(contentBin, this.settings.useV1);
} catch (ex) {
Logger(`The file ${file.path} could not be encoded`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
const mtime = file.mtime;
return await runWithLock("file-" + prefixedFileName, false, async () => {
return await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
let saveData: LoadedEntry;
@@ -462,7 +437,7 @@ export class HiddenFileSync extends LiveSyncCommands {
};
} else {
if (isDocContentSame(old.data, content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return;
}
saveData =
@@ -482,7 +457,7 @@ export class HiddenFileSync extends LiveSyncCommands {
return ret;
} catch (ex) {
Logger(`STORAGE --> DB:${file.path}: (hidden) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
@@ -492,7 +467,10 @@ export class HiddenFileSync extends LiveSyncCommands {
const id = await this.path2id(filename, ICHeader);
const prefixedFileName = addPrefix(filename, ICHeader);
const mtime = new Date().getTime();
await runWithLock("file-" + prefixedFileName, false, async () => {
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return
}
await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, true) as InternalFileEntry | false;
let saveData: InternalFileEntry;
@@ -526,7 +504,7 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger(`STORAGE -x> DB:${filename}: (hidden) Done`);
} catch (ex) {
Logger(`STORAGE -x> DB:${filename}: (hidden) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
@@ -535,8 +513,10 @@ export class HiddenFileSync extends LiveSyncCommands {
async extractInternalFileFromDatabase(filename: FilePath, force = false) {
const isExists = await this.app.vault.adapter.exists(filename);
const prefixedFileName = addPrefix(filename, ICHeader);
return await runWithLock("file-" + prefixedFileName, false, async () => {
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return;
}
return await serialized("file-" + prefixedFileName, async () => {
try {
// Check conflicted status
//TODO option
@@ -545,7 +525,7 @@ export class HiddenFileSync extends LiveSyncCommands {
throw new Error(`File not found on database.:${filename}`);
// Prevent overwriting while some conflicted revision exists.
if (fileOnDB?._conflicts?.length) {
Logger(`Hidden file ${filename} has conflicted revisions; to keep it safe, writing to storage has been prevented`, LOG_LEVEL.INFO);
Logger(`Hidden file ${filename} has conflicted revisions; to keep it safe, writing to storage has been prevented`, LOG_LEVEL_INFO);
return;
}
const deleted = "deleted" in fileOnDB ? fileOnDB.deleted : false;
@@ -557,40 +537,40 @@ export class HiddenFileSync extends LiveSyncCommands {
await this.app.vault.adapter.remove(filename);
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
return true;
}
if (!isExists) {
await this.ensureDirectoryEx(filename);
await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.app.vault.adapter.writeBinary(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden,new${force ? ", force" : ""})`);
return true;
} else {
const contentBin = await this.app.vault.adapter.readBinary(filename);
const content = await arrayBufferToBase64(contentBin);
if (content == fileOnDB.data && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL.VERBOSE);
const content = await encodeBinary(contentBin, this.settings.useV1);
if (isDocContentSame(content, fileOnDB.data) && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.app.vault.adapter.writeBinary(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""})`);
return true;
@@ -598,7 +578,7 @@ export class HiddenFileSync extends LiveSyncCommands {
}
} catch (ex) {
Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""}) Failed`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
});
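Binary content handling also changes shape in these hunks: `arrayBufferToBase64`/`base64ToArrayBuffer` are replaced by `encodeBinary(buf, useV1)`/`decodeBinary(data)`, and direct string comparison against the stored data gives way to `isDocContentSame`. A toy sketch of why, under the assumption that the new encoders return the content as a `string[]` of chunks rather than one base64 string (the real format lives in `lib/src/strbin` and is not shown in this diff):

```ts
// Toy codec for illustration only; the real encodeBinary/decodeBinary are in lib/src/strbin.
function encodeBinary(buf: ArrayBuffer, useV1: boolean): string[] {
    let hex = "";
    for (const b of new Uint8Array(buf)) hex += b.toString(16).padStart(2, "0");
    const chunkSize = useV1 ? 1024 : 4096; // assumption: chunking stands in for the v1/v2 format difference
    const out: string[] = [];
    for (let i = 0; i < hex.length; i += chunkSize) out.push(hex.slice(i, i + chunkSize));
    return out;
}

function decodeBinary(data: string | string[]): ArrayBuffer {
    const hex = Array.isArray(data) ? data.join("") : data;
    const bytes = new Uint8Array(hex.length / 2);
    for (let i = 0; i < bytes.length; i++) bytes[i] = parseInt(hex.slice(i * 2, i * 2 + 2), 16);
    return bytes.buffer;
}

// Because the stored content may now be split across several strings, equality has to compare
// the joined content, which is what isDocContentSame is assumed to do.
function isDocContentSame(a: string | string[], b: string | string[]): boolean {
    const flat = (x: string | string[]) => (Array.isArray(x) ? x.join("") : x);
    return flat(a) === flat(b);
}
```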
@@ -607,8 +587,8 @@ export class HiddenFileSync extends LiveSyncCommands {
showJSONMergeDialogAndMerge(docA: LoadedEntry, docB: LoadedEntry): Promise<boolean> {
return runWithLock("conflict:merge-data", false, () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
return serialized("conflict:merge-data", () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
const docs = [docA, docB];
const path = stripAllPrefixes(docA.path);
const modal = new JsonResolveModal(this.app, path, [docA, docB], async (keep, result) => {
@@ -642,10 +622,10 @@ export class HiddenFileSync extends LiveSyncCommands {
await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
try {
//@ts-ignore internalAPI
await app.vault.adapter.reconcileInternalFile(filename);
await this.app.vault.adapter.reconcileInternalFile(filename);
} catch (ex) {
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
Logger(`STORAGE <-- DB:${filename}: written (hidden,merged)`);
}
@@ -656,7 +636,7 @@ export class HiddenFileSync extends LiveSyncCommands {
res(true);
} catch (ex) {
Logger("Could not merge conflicted json");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
res(false);
}
});
@@ -665,13 +645,16 @@ export class HiddenFileSync extends LiveSyncCommands {
}
async scanInternalFiles(): Promise<InternalFileInfo[]> {
const configDir = normalizePath(this.app.vault.configDir);
const ignoreFilter = this.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
const synchronisedInConfigSync = !this.settings.usePluginSync ? [] : Object.values(this.settings.pluginSyncExtendedSetting).filter(e => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED).map(e => e.files).flat().map(e => `${configDir}/${e}`.toLowerCase());
const root = this.app.vault.getRoot();
const findRoot = root.path;
const filenames = (await this.getFiles(findRoot, [], null, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
const files = filenames.map(async (e) => {
const files = filenames.filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile))).map(async (e) => {
return {
path: e as FilePath,
stat: await this.app.vault.adapter.stat(e)
@@ -680,6 +663,9 @@ export class HiddenFileSync extends LiveSyncCommands {
const result: InternalFileInfo[] = [];
for (const f of files) {
const w = await f;
if (await this.plugin.isIgnoredByIgnoreFiles(w.path)) {
continue
}
result.push({
...w,
...w.stat
@@ -698,12 +684,18 @@ export class HiddenFileSync extends LiveSyncCommands {
) {
const w = await this.app.vault.adapter.list(path);
let files = [
const filesSrc = [
...w.files
.filter((e) => !ignoreList.some((ee) => e.endsWith(ee)))
.filter((e) => !filter || filter.some((ee) => e.match(ee)))
.filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee))),
.filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee)))
];
let files = [] as string[];
for (const file of filesSrc) {
if (!await this.plugin.isIgnoredByIgnoreFiles(file)) {
files.push(file);
}
}
L1: for (const v of w.folders) {
for (const ignore of ignoreList) {
@@ -714,6 +706,9 @@ export class HiddenFileSync extends LiveSyncCommands {
if (ignoreFilter && ignoreFilter.some(e => v.match(e))) {
continue L1;
}
if (await this.plugin.isIgnoredByIgnoreFiles(v)) {
continue L1;
}
files = files.concat(await this.getFiles(v, ignoreList, filter, ignoreFilter));
}
return files;
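A second theme in this file is the new exclusion rule: hidden-file sync now skips paths that customization sync already manages, built from `pluginSyncExtendedSetting` and the vault's `configDir`. A rough, self-contained sketch of that prefix filter, with assumed setting shapes and constant values:

```ts
// Sketch only; the real MODE_* constants and setting types come from lib/src/types.
const MODE_SELECTIVE = 0; // assumed values
const MODE_PAUSED = 2;

interface AssumedPluginSyncSetting {
    mode: number;
    files: string[];
}

function buildConfigSyncPrefixes(
    configDir: string,
    pluginSyncExtendedSetting: Record<string, AssumedPluginSyncSetting>,
    usePluginSync: boolean
): string[] {
    if (!usePluginSync) return [];
    return Object.values(pluginSyncExtendedSetting)
        .filter((e) => e.mode === MODE_SELECTIVE || e.mode === MODE_PAUSED)
        .flatMap((e) => e.files)
        .map((file) => `${configDir}/${file}`.toLowerCase());
}

// Files whose path starts with one of those prefixes are left to customization sync.
function filterOutConfigSynced(files: string[], prefixes: string[]): string[] {
    return files.filter((path) => prefixes.every((prefix) => !path.toLowerCase().startsWith(prefix)));
}
```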

View File

@@ -1,6 +1,6 @@
import { normalizePath, type PluginManifest } from "./deps";
import type { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry } from "./lib/src/types";
import { LOG_LEVEL } from "./lib/src/types";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
import { type PluginDataEntry, PERIODIC_PLUGIN_SWEEP, type PluginList, type DevicePluginList, PSCHeader, PSCHeaderEnd } from "./types";
import { getDocData, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
@@ -9,7 +9,7 @@ import { isPluginMetadata, PeriodicProcessor } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { NewNotice } from "./lib/src/wrapper";
import { versionNumberString2Number } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { serialized, skipIfDuplicated } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
export class PluginAndTheirSettings extends LiveSyncCommands {
@@ -79,7 +79,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
Logger("Scanning plugins done");
} catch (ex) {
Logger("Scanning plugins failed");
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
@@ -148,7 +148,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
});
NewNotice(fragment, 10000);
} else {
Logger("Everything is up to date.", LOG_LEVEL.NOTICE);
Logger("Everything is up to date.", LOG_LEVEL_NOTICE);
}
}
@@ -164,10 +164,10 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (specificPluginPath != "") {
specificPlugin = manifests.find(e => e.dir.endsWith("/" + specificPluginPath))?.id ?? "";
}
await runWithLock("sweepplugin", true, async () => {
const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
await skipIfDuplicated("sweepplugin", async () => {
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
if (!this.deviceAndVaultName) {
Logger("You have to set your device name.", LOG_LEVEL.NOTICE);
Logger("You have to set your device name.", LOG_LEVEL_NOTICE);
return;
}
Logger("Scanning plugins", logLevel);
@@ -176,7 +176,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
endkey: `ps:${this.deviceAndVaultName}-${specificPlugin}\u{10ffff}`,
include_docs: true,
});
// Logger("OLD DOCS.", LOG_LEVEL.VERBOSE);
// Logger("OLD DOCS.", LOG_LEVEL_VERBOSE);
// sweep current plugin.
const procs = manifests.map(async (m) => {
const pluginDataEntryID = `ps:${this.deviceAndVaultName}-${m.id}` as DocumentID;
@@ -184,7 +184,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (specificPlugin && m.id != specificPlugin) {
return;
}
Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
const path = normalizePath(m.dir) + "/";
const adapter = this.app.vault.adapter;
const files = ["manifest.json", "main.js", "styles.css", "data.json"];
@@ -222,8 +222,8 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
datatype: "plain",
type: "plain"
};
Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
await runWithLock("plugin-" + m.id, false, async () => {
Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
await serialized("plugin-" + m.id, async () => {
const old = await this.localDatabase.getDBEntry(p._id as string as FilePathWithPrefix /* This also should be explained */, null, false, false);
if (old !== false) {
const oldData = { data: old.data, deleted: old._deleted };
@@ -237,7 +237,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
Logger(`Plugin saved:${m.name}`, logLevel);
});
} catch (ex) {
Logger(`Plugin save failed:${m.name}`, LOG_LEVEL.NOTICE);
Logger(`Plugin save failed:${m.name}`, LOG_LEVEL_NOTICE);
} finally {
oldDocs.rows = oldDocs.rows.filter((e) => e.id != pluginDataEntryID);
}
@@ -259,14 +259,14 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
}
return e.doc;
});
Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL.VERBOSE);
Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL_VERBOSE);
await this.localDatabase.bulkDocsRaw(delDocs);
Logger(`Scan plugin done.`, logLevel);
});
}
async applyPluginData(plugin: PluginDataEntry) {
await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
await serialized("plugin-" + plugin.manifest.id, async () => {
const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
const adapter = this.app.vault.adapter;
// @ts-ignore
@@ -274,27 +274,27 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (stat) {
// @ts-ignore
await this.app.plugins.unloadPlugin(plugin.manifest.id);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
if (plugin.dataJson)
await adapter.write(pluginTargetFolderPath + "data.json", plugin.dataJson);
Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL.NOTICE);
Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL_NOTICE);
if (stat) {
// @ts-ignore
await this.app.plugins.loadPlugin(plugin.manifest.id);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
});
}
async applyPlugin(plugin: PluginDataEntry) {
await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
await serialized("plugin-" + plugin.manifest.id, async () => {
// @ts-ignore
const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
if (stat) {
// @ts-ignore
await this.app.plugins.unloadPlugin(plugin.manifest.id);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
@@ -309,7 +309,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (stat) {
// @ts-ignore
await this.app.plugins.loadPlugin(plugin.manifest.id);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
});
}
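Here the old `runWithLock("sweepplugin", true, ...)` call becomes `skipIfDuplicated(...)` rather than `serialized(...)`, which suggests the boolean second argument used to select skip-on-contention instead of queueing. A stand-in sketch of that assumed behaviour:

```ts
// Illustrative stand-in; the real helper is exported from lib/src/lock.
const inFlight = new Set<string>();

// Assumed semantics: if another task already holds this key, return null instead of queueing.
async function skipIfDuplicated<T>(key: string, task: () => Promise<T>): Promise<T | null> {
    if (inFlight.has(key)) return null;
    inFlight.add(key);
    try {
        return await task();
    } finally {
        inFlight.delete(key);
    }
}

// Usage mirroring the plugin sweep: concurrent invocations while a sweep is running are dropped.
async function sweepPluginExample() {
    await skipIfDuplicated("sweepplugin", async () => {
        // ... scan plugin manifests and store them in the local database ...
    });
}
```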

View File

@@ -1,4 +1,4 @@
import { type EntryDoc, type ObsidianLiveSyncSettings, LOG_LEVEL, DEFAULT_SETTINGS } from "./lib/src/types";
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE } from "./lib/src/types";
import { configURIBase } from "./types";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
@@ -8,6 +8,7 @@ import { LiveSyncCommands } from "./LiveSyncCommands";
import { delay } from "./lib/src/utils";
import { confirmWithMessage } from "./dialogs";
import { Platform } from "./deps";
import { fetchAllUsedChunks } from "./lib/src/utils_couchdb";
export class SetupLiveSync extends LiveSyncCommands {
onunload() { }
@@ -19,6 +20,11 @@ export class SetupLiveSync extends LiveSyncCommands {
name: "Copy the setup URI",
callback: this.command_copySetupURI.bind(this),
});
this.plugin.addCommand({
id: "livesync-copysetupuri-short",
name: "Copy the setup URI (With customization sync)",
callback: this.command_copySetupURIWithSync.bind(this),
});
this.plugin.addCommand({
id: "livesync-copysetupurifull",
@@ -40,38 +46,44 @@ export class SetupLiveSync extends LiveSyncCommands {
}
async realizeSettingSyncMode() { }
async command_copySetupURI() {
async command_copySetupURI(stripExtra = true) {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
if (encryptingPassphrase === false)
return;
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
if (stripExtra) {
delete setting.pluginSyncExtendedSetting;
}
const keys = Object.keys(setting) as (keyof ObsidianLiveSyncSettings)[];
for (const k of keys) {
if (JSON.stringify(k in setting ? setting[k] : "") == JSON.stringify(k in DEFAULT_SETTINGS ? DEFAULT_SETTINGS[k] : "*")) {
delete setting[k];
}
}
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false, true));
const uri = `${configURIBase}${encryptedSetting}`;
await navigator.clipboard.writeText(uri);
Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIFull() {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
if (encryptingPassphrase === false)
return;
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false, true));
const uri = `${configURIBase}${encryptedSetting}`;
await navigator.clipboard.writeText(uri);
Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
Logger("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIWithSync() {
this.command_copySetupURI(false);
}
async command_openSetupURI() {
const setupURI = await askString(this.app, "Easy setup", "Set up URI", `${configURIBase}aaaaa`);
if (setupURI === false)
return;
if (!setupURI.startsWith(`${configURIBase}`)) {
Logger("Set up URI looks wrong.", LOG_LEVEL.NOTICE);
Logger("Set up URI looks wrong.", LOG_LEVEL_NOTICE);
return;
}
const config = decodeURIComponent(setupURI.substring(configURIBase.length));
@@ -98,11 +110,17 @@ export class SetupLiveSync extends LiveSyncCommands {
newSettingW.encryptedCouchDBConnection = "";
const setupJustImport = "Just import setting";
const setupAsNew = "Set it up as secondary or subsequent device";
const setupAsMerge = "Secondary device but try keeping local changes";
const setupAgain = "Reconfigure and reconstitute the data";
const setupManually = "Leave everything to me";
newSettingW.syncInternalFiles = false;
newSettingW.usePluginSync = false;
const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupJustImport, setupManually]);
// Migrate completely obsoleted configuration.
if (!newSettingW.useIndexedDBAdapter) {
newSettingW.useIndexedDBAdapter = true;
}
const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupAsMerge, setupJustImport, setupManually]);
if (setupType == setupJustImport) {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
@@ -111,6 +129,10 @@ export class SetupLiveSync extends LiveSyncCommands {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
await this.fetchLocal();
} else if (setupType == setupAsMerge) {
this.plugin.settings = newSettingW;
this.plugin.usedPassphrase = "";
await this.fetchLocalWithKeepLocal();
} else if (setupType == setupAgain) {
const confirm = "I know this operation will rebuild all my databases with files on this device, and files that are on the remote database and I didn't synchronize to any other devices will be lost and want to proceed indeed.";
if (await askSelectString(this.app, "Do you really want to do this?", ["Cancel", confirm]) != confirm) {
@@ -134,13 +156,13 @@ export class SetupLiveSync extends LiveSyncCommands {
await this.plugin.replicate(true);
await this.plugin.markRemoteUnlocked();
}
Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
Logger("Configuration loaded.", LOG_LEVEL_NOTICE);
return;
}
if (keepLocalDB == "no" && keepRemoteDB == "no") {
const reset = await askYesNo(this.app, "Drop everything?");
if (reset != "yes") {
Logger("Cancelled", LOG_LEVEL.NOTICE);
Logger("Cancelled", LOG_LEVEL_NOTICE);
this.plugin.settings = oldConf;
return;
}
@@ -175,17 +197,17 @@ export class SetupLiveSync extends LiveSyncCommands {
}
}
Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
Logger("Configuration loaded.", LOG_LEVEL_NOTICE);
} else {
Logger("Cancelled.", LOG_LEVEL.NOTICE);
Logger("Cancelled.", LOG_LEVEL_NOTICE);
}
} catch (ex) {
Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL.NOTICE);
Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL_NOTICE);
}
}
suspendExtraSync() {
Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL.NOTICE)
Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
this.plugin.settings.syncInternalFiles = false;
this.plugin.settings.usePluginSync = false;
this.plugin.settings.autoSweepPlugins = false;
@@ -230,7 +252,7 @@ Of course, we are able to disable these features.`
return;
}
if (mode != "CUSTOMIZE") {
Logger("Gathering files for enabling Hidden File Sync", LOG_LEVEL.NOTICE);
Logger("Gathering files for enabling Hidden File Sync", LOG_LEVEL_NOTICE);
if (mode == "FETCH") {
await this.plugin.addOnHiddenFileSync.syncInternalFilesAndDatabase("pullForce", true);
} else if (mode == "OVERWRITE") {
@@ -240,7 +262,7 @@ Of course, we are able to disable these features.`
}
this.plugin.settings.syncInternalFiles = true;
await this.plugin.saveSettings();
Logger(`Done! Restarting the app is strongly recommended!`, LOG_LEVEL.NOTICE);
Logger(`Done! Restarting the app is strongly recommended!`, LOG_LEVEL_NOTICE);
} else if (mode == "CUSTOMIZE") {
if (!this.plugin.deviceAndVaultName) {
let name = await askString(this.app, "Device name", "Please set this device name", `desktop`);
@@ -279,11 +301,30 @@ Of course, we are able to disable these features.`
this.plugin.settings.liveSync = false;
this.plugin.settings.periodicReplication = false;
this.plugin.settings.syncOnSave = false;
this.plugin.settings.syncOnEditorSave = false;
this.plugin.settings.syncOnStart = false;
this.plugin.settings.syncOnFileOpen = false;
this.plugin.settings.syncAfterMerge = false;
//this.suspendExtraSync();
}
async suspendReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
Logger(`Suspending reflection: Database and storage changes will not be reflected in each other until fetching has completely finished.`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = true;
this.plugin.settings.suspendFileWatching = true;
await this.plugin.saveSettings();
}
async resumeReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
Logger(`Database and storage reflection has been resumed!`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = false;
this.plugin.settings.suspendFileWatching = false;
await this.plugin.syncAllFiles(true);
await this.plugin.loadQueuedFiles();
this.plugin.procQueuedFiles();
await this.plugin.saveSettings();
}
async askUseNewAdapter() {
if (!this.plugin.settings.useIndexedDBAdapter) {
const message = `Now this plugin is configured to use the old database adapter to keep compatibility. Do you want to deactivate it?`;
@@ -297,9 +338,22 @@ Of course, we are able to disable these features.`
}
}
}
async fetchRemoteChunks() {
if (!this.plugin.settings.doNotSuspendOnFetching && this.plugin.settings.readChunksOnline) {
Logger(`Fetching chunks`, LOG_LEVEL_NOTICE);
const remoteDB = await this.plugin.getReplicator().connectRemoteCouchDBWithSetting(this.settings, this.plugin.getIsMobile(), true);
if (typeof remoteDB == "string") {
Logger(remoteDB, LOG_LEVEL_NOTICE);
} else {
await fetchAllUsedChunks(this.localDatabase.localDatabase, remoteDB.db);
}
Logger(`Fetching chunks done`, LOG_LEVEL_NOTICE);
}
}
async fetchLocal() {
this.suspendExtraSync();
this.askUseNewAdapter();
await this.suspendReflectingDatabase();
await this.plugin.realizeSettingSyncMode();
await this.plugin.resetLocalDatabase();
await delay(1000);
@@ -310,6 +364,27 @@ Of course, we are able to disable these features.`
await this.plugin.replicateAllFromServer(true);
await delay(1000);
await this.plugin.replicateAllFromServer(true);
await this.fetchRemoteChunks();
await this.resumeReflectingDatabase();
await this.askHiddenFileConfiguration({ enableFetch: true });
}
async fetchLocalWithKeepLocal() {
this.suspendExtraSync();
this.askUseNewAdapter();
await this.suspendReflectingDatabase();
await this.plugin.realizeSettingSyncMode();
await this.plugin.resetLocalDatabase();
await delay(1000);
await this.plugin.initializeDatabase(true);
await this.plugin.markRemoteResolved();
await this.plugin.openDatabase();
this.plugin.isReady = true;
await delay(500);
await this.plugin.replicateAllFromServer(true);
await delay(1000);
await this.plugin.replicateAllFromServer(true);
await this.fetchRemoteChunks();
await this.resumeReflectingDatabase();
await this.askHiddenFileConfiguration({ enableFetch: true });
}
async rebuildRemote() {

View File

@@ -1,6 +1,6 @@
import { App, Modal } from "./deps";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
import { diff_result } from "./lib/src/types";
import { type diff_result } from "./lib/src/types";
import { escapeStringToHTML } from "./lib/src/strbin";
export class ConflictResolveModal extends Modal {
@@ -18,10 +18,8 @@ export class ConflictResolveModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Conflicting changes");
contentEl.empty();
contentEl.createEl("h2", { text: "This document has conflicted changes." });
contentEl.createEl("span", { text: this.filename });
const div = contentEl.createDiv("");
div.addClass("op-scrollable");
@@ -30,11 +28,11 @@ export class ConflictResolveModal extends Modal {
const x1 = v[0];
const x2 = v[1];
if (x1 == DIFF_DELETE) {
diff += "<span class='deleted'>" + escapeStringToHTML(x2) + "</span>";
diff += "<span class='deleted'>" + escapeStringToHTML(x2).replace(/\n/g, "<span class='ls-mark-cr'></span>\n") + "</span>";
} else if (x1 == DIFF_EQUAL) {
diff += "<span class='normal'>" + escapeStringToHTML(x2) + "</span>";
diff += "<span class='normal'>" + escapeStringToHTML(x2).replace(/\n/g, "<span class='ls-mark-cr'></span>\n") + "</span>";
} else if (x1 == DIFF_INSERT) {
diff += "<span class='added'>" + escapeStringToHTML(x2) + "</span>";
diff += "<span class='added'>" + escapeStringToHTML(x2).replace(/\n/g, "<span class='ls-mark-cr'></span>\n") + "</span>";
}
}
@@ -48,23 +46,26 @@ export class ConflictResolveModal extends Modal {
`;
contentEl.createEl("button", { text: "Keep A" }, (e) => {
e.addEventListener("click", async () => {
await this.callback(this.result.right.rev);
const callback = this.callback;
this.callback = null;
this.close();
await callback(this.result.right.rev);
});
});
contentEl.createEl("button", { text: "Keep B" }, (e) => {
e.addEventListener("click", async () => {
await this.callback(this.result.left.rev);
const callback = this.callback;
this.callback = null;
this.close();
await callback(this.result.left.rev);
});
});
contentEl.createEl("button", { text: "Concat both" }, (e) => {
e.addEventListener("click", async () => {
await this.callback("");
const callback = this.callback;
this.callback = null;
this.close();
await callback("");
});
});
contentEl.createEl("button", { text: "Not now" }, (e) => {

View File

@@ -1,9 +1,8 @@
import { TFile, Modal, App } from "./deps";
import { TFile, Modal, App, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "./deps";
import { getPathFromTFile, isValidPath } from "./utils";
import { base64ToArrayBuffer, base64ToString, escapeStringToHTML } from "./lib/src/strbin";
import { decodeBinary, escapeStringToHTML, readString } from "./lib/src/strbin";
import ObsidianLiveSyncPlugin from "./main";
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "diff-match-patch";
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL } from "./lib/src/types";
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
import { Logger } from "./lib/src/logger";
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
import { getDocData } from "./lib/src/utils";
@@ -11,29 +10,29 @@ import { stripPrefix } from "./lib/src/path";
export class DocumentHistoryModal extends Modal {
plugin: ObsidianLiveSyncPlugin;
range: HTMLInputElement;
contentView: HTMLDivElement;
info: HTMLDivElement;
fileInfo: HTMLDivElement;
range!: HTMLInputElement;
contentView!: HTMLDivElement;
info!: HTMLDivElement;
fileInfo!: HTMLDivElement;
showDiff = false;
id: DocumentID;
id?: DocumentID;
file: FilePathWithPrefix;
revs_info: PouchDB.Core.RevisionInfo[] = [];
currentDoc: LoadedEntry;
currentDoc?: LoadedEntry;
currentText = "";
currentDeleted = false;
initialRev: string;
initialRev?: string;
constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id: DocumentID, revision?: string) {
constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id?: DocumentID, revision?: string) {
super(app);
this.plugin = plugin;
this.file = (file instanceof TFile) ? getPathFromTFile(file) : file;
this.id = id;
this.initialRev = revision;
if (!file) {
this.file = this.plugin.id2path(id, null);
if (!file && id) {
this.file = this.plugin.id2path(id);
}
if (localStorage.getItem("ols-history-highlightdiff") == "1") {
this.showDiff = true;
@@ -47,8 +46,8 @@ export class DocumentHistoryModal extends Modal {
const db = this.plugin.localDatabase;
try {
const w = await db.localDatabase.get(this.id, { revs_info: true });
this.revs_info = w._revs_info.filter((e) => e?.status == "available");
this.range.max = `${this.revs_info.length - 1}`;
this.revs_info = w._revs_info?.filter((e) => e?.status == "available") ?? [];
this.range.max = `${Math.max(this.revs_info.length - 1, 0)}`;
this.range.value = this.range.max;
this.fileInfo.setText(`${this.file} / ${this.revs_info.length} revisions`);
await this.loadRevs(initialRev);
@@ -61,7 +60,7 @@ export class DocumentHistoryModal extends Modal {
this.contentView.setText(`History of this file was not recorded.`);
} else {
this.contentView.setText(`Error occurred.`);
Logger(ex, LOG_LEVEL.VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
}
@@ -90,8 +89,8 @@ export class DocumentHistoryModal extends Modal {
this.currentDoc = w;
this.info.innerHTML = `Modified:${new Date(w.mtime).toLocaleString()}`;
let result = "";
const w1data = w.datatype == "plain" ? getDocData(w.data) : base64ToString(w.data);
this.currentDeleted = w.deleted;
const w1data = w.datatype == "plain" ? getDocData(w.data) : readString(new Uint8Array(decodeBinary(w.data)));
this.currentDeleted = !!w.deleted;
this.currentText = w1data;
if (this.showDiff) {
const prevRevIdx = this.revs_info.length - 1 - ((this.range.value as any) / 1 - 1);
@@ -100,7 +99,7 @@ export class DocumentHistoryModal extends Modal {
const w2 = await db.getDBEntry(this.file, { rev: oldRev }, false, false, true);
if (w2 != false) {
const dmp = new diff_match_patch();
const w2data = w2.datatype == "plain" ? getDocData(w2.data) : base64ToString(w2.data);
const w2data = w2.datatype == "plain" ? getDocData(w2.data) : readString(new Uint8Array(decodeBinary(w2.data)));
const diff = dmp.diff_main(w2data, w1data);
dmp.diff_cleanupSemantic(diff);
for (const v of diff) {
@@ -131,9 +130,8 @@ export class DocumentHistoryModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Document History");
contentEl.empty();
contentEl.createEl("h2", { text: "Document History" });
this.fileInfo = contentEl.createDiv("");
this.fileInfo.addClass("op-info");
const divView = contentEl.createDiv("");
@@ -179,7 +177,7 @@ export class DocumentHistoryModal extends Modal {
e.addClass("mod-cta");
e.addEventListener("click", async () => {
await navigator.clipboard.writeText(this.currentText);
Logger(`Old content copied to clipboard`, LOG_LEVEL.NOTICE);
Logger(`Old content copied to clipboard`, LOG_LEVEL_NOTICE);
});
});
async function focusFile(path: string) {
@@ -190,7 +188,7 @@ export class DocumentHistoryModal extends Modal {
const leaf = app.workspace.getLeaf(false);
await leaf.openFile(targetFile);
} else {
Logger("The file could not view on the editor", LOG_LEVEL.NOTICE)
Logger("The file could not view on the editor", LOG_LEVEL_NOTICE)
}
}
buttons.createEl("button", { text: "Back to this revision" }, (e) => {
@@ -199,19 +197,19 @@ export class DocumentHistoryModal extends Modal {
// const pathToWrite = this.plugin.id2path(this.id, true);
const pathToWrite = stripPrefix(this.file);
if (!isValidPath(pathToWrite)) {
Logger("Path is not valid to write content.", LOG_LEVEL.INFO);
Logger("Path is not valid to write content.", LOG_LEVEL_INFO);
}
if (this.currentDoc?.datatype == "plain") {
await this.app.vault.adapter.write(pathToWrite, getDocData(this.currentDoc.data));
await focusFile(pathToWrite);
this.close();
} else if (this.currentDoc?.datatype == "newnote") {
await this.app.vault.adapter.writeBinary(pathToWrite, base64ToArrayBuffer(this.currentDoc.data));
await this.app.vault.adapter.writeBinary(pathToWrite, decodeBinary(this.currentDoc.data));
await focusFile(pathToWrite);
this.close();
} else {
Logger(`Could not parse entry`, LOG_LEVEL.NOTICE);
Logger(`Could not parse entry`, LOG_LEVEL_NOTICE);
}
});
});
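The history view follows the same strbin migration: the old `base64ToString(doc.data)` becomes `readString(new Uint8Array(decodeBinary(doc.data)))` for non-plain entries. A small sketch of the shape of that change, using stand-in decoders (the real ones are in `lib/src/strbin`):

```ts
// Stand-ins for illustration only.
function readString(buf: Uint8Array): string {
    return new TextDecoder().decode(buf); // assumed: readString is a UTF-8 decode
}
function decodeBinaryStandIn(data: string | string[]): Uint8Array {
    const hex = Array.isArray(data) ? data.join("") : data; // toy format, not the real one
    const bytes = new Uint8Array(hex.length / 2);
    for (let i = 0; i < bytes.length; i++) bytes[i] = parseInt(hex.slice(i * 2, i * 2 + 2), 16);
    return bytes;
}

// The pattern the diff introduces: plain entries are read as-is, everything else is decoded to
// bytes and then to text, instead of going through base64ToString.
function entryToText(datatype: "plain" | "newnote", data: string | string[]): string {
    const joined = Array.isArray(data) ? data.join("") : data;
    return datatype === "plain" ? joined : readString(decodeBinaryStandIn(data));
}
```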

View File

@@ -3,11 +3,11 @@
import { onDestroy, onMount } from "svelte";
import type { AnyEntry, FilePathWithPrefix } from "./lib/src/types";
import { getDocData, isDocContentSame } from "./lib/src/utils";
import { diff_match_patch } from "diff-match-patch";
import { diff_match_patch } from "./deps";
import { DocumentHistoryModal } from "./DocumentHistoryModal";
import { isPlainText, stripAllPrefixes } from "./lib/src/path";
import { TFile } from "./deps";
import { arrayBufferToBase64 } from "./lib/src/strbin";
import { encodeBinary } from "./lib/src/strbin";
export let plugin: ObsidianLiveSyncPlugin;
let showDiffInfo = false;
@@ -116,7 +116,7 @@
result = isDocContentSame(data, doc.data);
} else {
const data = await plugin.app.vault.readBinary(abs);
const dataEEncoded = await arrayBufferToBase64(data);
const dataEEncoded = await encodeBinary(data, plugin.settings.useV1);
result = isDocContentSame(dataEEncoded, doc.data);
}
if (result) {

View File

@@ -32,6 +32,7 @@ export class GlobalHistoryView extends ItemView {
return "Vault history";
}
// eslint-disable-next-line require-await
async onOpen() {
this.component = new GlobalHistoryComponent({
target: this.contentEl,
@@ -41,6 +42,7 @@ export class GlobalHistoryView extends ItemView {
});
}
// eslint-disable-next-line require-await
async onClose() {
this.component.$destroy();
}

View File

@@ -1,5 +1,5 @@
import { App, Modal } from "./deps";
import { FilePath, LoadedEntry } from "./lib/src/types";
import { type FilePath, type LoadedEntry } from "./lib/src/types";
import JsonResolvePane from "./JsonResolvePane.svelte";
export class JsonResolveModal extends Modal {
@@ -29,7 +29,7 @@ export class JsonResolveModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Conflicted Setting");
contentEl.empty();
if (this.component == null) {
@@ -41,7 +41,7 @@ export class JsonResolveModal extends Modal {
nameA: this.nameA,
nameB: this.nameB,
defaultSelect: this.defaultSelect,
callback: (keepRev, mergedStr) => this.UICallback(keepRev, mergedStr),
callback: (keepRev: string, mergedStr: string) => this.UICallback(keepRev, mergedStr),
},
});
}

View File

@@ -1,7 +1,7 @@
<script lang="ts">
import { type Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "diff-match-patch";
import { type Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "./deps";
import type { FilePath, LoadedEntry } from "./lib/src/types";
import { base64ToString } from "./lib/src/strbin";
import { decodeBinary, readString } from "./lib/src/strbin";
import { getDocData } from "./lib/src/utils";
import { mergeObject } from "./utils";
@@ -26,7 +26,7 @@
let mode: SelectModes = defaultSelect as SelectModes;
function docToString(doc: LoadedEntry) {
return doc.datatype == "plain" ? getDocData(doc.data) : base64ToString(doc.data);
return doc.datatype == "plain" ? getDocData(doc.data) : readString(new Uint8Array(decodeBinary(doc.data)));
}
function revStringToRevNumber(rev: string) {
return rev.split("-")[0];
@@ -104,7 +104,6 @@
] as ["" | "A" | "B" | "AB" | "BA", string][];
</script>
<h1>Conflicted settings</h1>
<h2>{filename}</h2>
{#if !docA || !docB}
<div class="message">Just for a minute, please!</div>

View File

@@ -1,4 +1,4 @@
import { deleteDB, IDBPDatabase, openDB } from "idb";
import { deleteDB, type IDBPDatabase, openDB } from "idb";
export interface KeyValueDatabase {
get<T>(key: string): Promise<T>;
set<T>(key: string, value: T): Promise<IDBValidKey>;

View File

@@ -1,4 +1,4 @@
import { AnyEntry, DocumentID, EntryDoc, EntryHasPath, FilePath, FilePathWithPrefix } from "./lib/src/types";
import { type AnyEntry, type DocumentID, type EntryDoc, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "./lib/src/types";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import type ObsidianLiveSyncPlugin from "./main";

View File

@@ -14,9 +14,9 @@ export class LogDisplayModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Sync status");
contentEl.empty();
contentEl.createEl("h2", { text: "Sync Status" });
const div = contentEl.createDiv("");
div.addClass("op-scrollable");
div.addClass("op-pre");

View File

@@ -5,7 +5,7 @@ import {
import LogPaneComponent from "./LogPane.svelte";
import type ObsidianLiveSyncPlugin from "./main";
export const VIEW_TYPE_LOG = "log-log";
// Show notes as like scroll.
//Log view
export class LogPaneView extends ItemView {
component: LogPaneComponent;
@@ -32,6 +32,7 @@ export class LogPaneView extends ItemView {
return "Self-hosted LiveSync Log";
}
// eslint-disable-next-line require-await
async onOpen() {
this.component = new LogPaneComponent({
target: this.contentEl,
@@ -40,6 +41,7 @@ export class LogPaneView extends ItemView {
});
}
// eslint-disable-next-line require-await
async onClose() {
this.component.$destroy();
}

View File

@@ -1,5 +1,5 @@
import { App, PluginSettingTab, Setting, sanitizeHTMLToDom, TextAreaComponent, MarkdownRenderer, stringifyYaml } from "./deps";
import { DEFAULT_SETTINGS, LOG_LEVEL, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings } from "./lib/src/types";
import { DEFAULT_SETTINGS, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings, type FilePathWithPrefix, type HashAlgorithm, type DocumentID, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, LOG_LEVEL_INFO } from "./lib/src/types";
import { delay } from "./lib/src/utils";
import { Semaphore } from "./lib/src/semaphore";
import { versionNumberString2Number } from "./lib/src/strbin";
@@ -7,7 +7,7 @@ import { Logger } from "./lib/src/logger";
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb.js";
import { testCrypt } from "./lib/src/e2ee_v2";
import ObsidianLiveSyncPlugin from "./main";
import { balanceChunks, localDatabaseCleanUp, performRebuildDB, remoteDatabaseCleanup, requestToCouchDB } from "./utils";
import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "./utils";
export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
@@ -21,10 +21,10 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
async testConnection(): Promise<void> {
const db = await this.plugin.replicator.connectRemoteCouchDBWithSetting(this.plugin.settings, this.plugin.isMobile, true);
if (typeof db === "string") {
this.plugin.addLog(`could not connect to ${this.plugin.settings.couchDB_URI} : ${this.plugin.settings.couchDB_DBNAME} \n(${db})`, LOG_LEVEL.NOTICE);
this.plugin.addLog(`could not connect to ${this.plugin.settings.couchDB_URI} : ${this.plugin.settings.couchDB_DBNAME} \n(${db})`, LOG_LEVEL_NOTICE);
return;
}
this.plugin.addLog(`Connected to ${db.info.db_name}`, LOG_LEVEL.NOTICE);
this.plugin.addLog(`Connected to ${db.info.db_name}`, LOG_LEVEL_NOTICE);
}
display(): void {
const { containerEl } = this;
@@ -111,7 +111,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
MarkdownRenderer.renderMarkdown(updateInformation, informationDivEl, "/", null);
MarkdownRenderer.render(this.plugin.app, updateInformation, informationDivEl, "/", this.plugin);
addScreenElement("100", containerInformationEl);
@@ -120,6 +120,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
if (this.plugin.settings.periodicReplication) return true;
if (this.plugin.settings.syncOnFileOpen) return true;
if (this.plugin.settings.syncOnSave) return true;
if (this.plugin.settings.syncOnEditorSave) return true;
if (this.plugin.settings.syncOnStart) return true;
if (this.plugin.settings.syncAfterMerge) return true;
if (this.plugin.replicator.syncStatus == "CONNECTED") return true;
@@ -133,12 +134,13 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
new Setting(setupWizardEl)
.setName("Discard the existing configuration and set up")
.addButton((text) => {
// eslint-disable-next-line require-await
text.setButtonText("Next").onClick(async () => {
if (JSON.stringify(this.plugin.settings) != JSON.stringify(DEFAULT_SETTINGS)) {
this.plugin.replicator.closeReplication();
this.plugin.settings = { ...DEFAULT_SETTINGS };
this.plugin.saveSettings();
Logger("Configuration has been flushed, please open it again", LOG_LEVEL.NOTICE)
Logger("Configuration has been flushed, please open it again", LOG_LEVEL_NOTICE)
// @ts-ignore
this.plugin.app.setting.close()
} else {
@@ -156,6 +158,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
this.plugin.settings.liveSync = false;
this.plugin.settings.periodicReplication = false;
this.plugin.settings.syncOnSave = false;
this.plugin.settings.syncOnEditorSave = false;
this.plugin.settings.syncOnStart = false;
this.plugin.settings.syncOnFileOpen = false;
this.plugin.settings.syncAfterMerge = false;
@@ -215,7 +218,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
syncLive.forEach((e) => {
e.setDisabled(false).setTooltip("");
});
} else if (this.plugin.settings.syncOnFileOpen || this.plugin.settings.syncOnSave || this.plugin.settings.syncOnStart || this.plugin.settings.periodicReplication || this.plugin.settings.syncAfterMerge) {
} else if (this.plugin.settings.syncOnFileOpen || this.plugin.settings.syncOnSave || this.plugin.settings.syncOnEditorSave || this.plugin.settings.syncOnStart || this.plugin.settings.periodicReplication || this.plugin.settings.syncAfterMerge) {
syncNonLive.forEach((e) => {
e.setDisabled(false).setTooltip("");
});
@@ -300,46 +303,44 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
.setDisabled(false)
.onClick(async () => {
const checkConfig = async () => {
Logger(`Checking database configuration`, LOG_LEVEL_INFO);
const emptyDiv = createDiv();
emptyDiv.innerHTML = "<span></span>";
checkResultDiv.replaceChildren(...[emptyDiv]);
const addResult = (msg: string, classes?: string[]) => {
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
if (classes) {
tmpDiv.addClasses(classes);
}
tmpDiv.innerHTML = `${msg}`;
checkResultDiv.appendChild(tmpDiv);
};
try {
if (isCloudantURI(this.plugin.settings.couchDB_URI)) {
Logger("This feature cannot be used with IBM Cloudant.", LOG_LEVEL.NOTICE);
Logger("This feature cannot be used with IBM Cloudant.", LOG_LEVEL_NOTICE);
return;
}
const r = await requestToCouchDB(this.plugin.settings.couchDB_URI, this.plugin.settings.couchDB_USER, this.plugin.settings.couchDB_PASSWORD, window.origin);
Logger(JSON.stringify(r.json, null, 2));
const responseConfig = r.json;
const emptyDiv = createDiv();
emptyDiv.innerHTML = "<span></span>";
checkResultDiv.replaceChildren(...[emptyDiv]);
const addResult = (msg: string, classes?: string[]) => {
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
if (classes) {
tmpDiv.addClasses(classes);
}
tmpDiv.innerHTML = `${msg}`;
checkResultDiv.appendChild(tmpDiv);
};
const addConfigFixButton = (title: string, key: string, value: string) => {
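// Renders a row with a "Fix" button; clicking it writes the suggested value to the CouchDB configuration and re-runs the whole check on success.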
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
tmpDiv.innerHTML = `<label>${title}</label><button>Fix</button>`;
const x = checkResultDiv.appendChild(tmpDiv);
x.querySelector("button").addEventListener("click", async () => {
console.dir({ key, value });
Logger(`CouchDB Configuration: ${title} -> Set ${key} to ${value}`)
const res = await requestToCouchDB(this.plugin.settings.couchDB_URI, this.plugin.settings.couchDB_USER, this.plugin.settings.couchDB_PASSWORD, undefined, key, value);
console.dir(res);
if (res.status == 200) {
Logger(`${title} successfully updated`, LOG_LEVEL.NOTICE);
Logger(`CouchDB Configuration: ${title} successfully updated`, LOG_LEVEL_NOTICE);
checkResultDiv.removeChild(x);
checkConfig();
} else {
Logger(`${title} failed`, LOG_LEVEL.NOTICE);
Logger(res.text);
Logger(`CouchDB Configuration: ${title} failed`, LOG_LEVEL_NOTICE);
Logger(res.text, LOG_LEVEL_VERBOSE);
}
});
};
@@ -349,7 +350,6 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
["ob-btn-config-info"]
);
addResult("Your configuration is dumped to Log", ["ob-btn-config-info"]);
addResult("--Config check--", ["ob-btn-config-head"]);
// Admin check
@@ -450,9 +450,16 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
addResult("--Done--", ["ob-btn-config-head"]);
addResult("If you have some trouble with Connection-check even though all Config-check has been passed, Please check your reverse proxy's configuration.", ["ob-btn-config-info"]);
Logger(`Checking configuration done`, LOG_LEVEL_INFO);
} catch (ex) {
Logger(`Checking configuration failed`, LOG_LEVEL.NOTICE);
Logger(ex);
if (ex?.status == 401) {
addResult(`❗ Access forbidden.`);
addResult(`We could not continue the test.`);
Logger(`Checking configuration done`, LOG_LEVEL_INFO);
} else {
Logger(`Checking configuration failed`, LOG_LEVEL_NOTICE);
Logger(ex);
}
}
};
await checkConfig();
@@ -609,25 +616,25 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
console.dir(settingForCheck);
const db = await this.plugin.replicator.connectRemoteCouchDBWithSetting(settingForCheck, this.plugin.isMobile, true);
if (typeof db === "string") {
Logger("Could not connect to the database.", LOG_LEVEL.NOTICE);
Logger("Could not connect to the database.", LOG_LEVEL_NOTICE);
return false;
} else {
if (await checkSyncInfo(db.db)) {
// Logger("Database connected", LOG_LEVEL.NOTICE);
// Logger("Database connected", LOG_LEVEL_NOTICE);
return true;
} else {
Logger("Failed to read remote database", LOG_LEVEL.NOTICE);
Logger("Failed to read remote database", LOG_LEVEL_NOTICE);
return false;
}
}
};
const applyEncryption = async (sendToServer: boolean) => {
if (encrypt && passphrase == "") {
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL.NOTICE);
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL_NOTICE);
return;
}
if (encrypt && !(await testCrypt())) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL.NOTICE);
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL_NOTICE);
return;
}
if (!(await checkWorkingPassphrase()) && !sendToServer) {
@@ -654,11 +661,11 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const rebuildDB = async (method: "localOnly" | "remoteOnly" | "rebuildBothByThisDevice") => {
if (encrypt && passphrase == "") {
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL.NOTICE);
Logger("If you enable encryption, you have to set the passphrase", LOG_LEVEL_NOTICE);
return;
}
if (encrypt && !(await testCrypt())) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL.NOTICE);
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL_NOTICE);
return;
}
if (!encrypt) {
@@ -670,7 +677,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
this.plugin.settings.passphrase = passphrase;
this.plugin.settings.useDynamicIterationCount = useDynamicIterationCount;
this.plugin.settings.usePathObfuscation = usePathObfuscation;
Logger("All synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL.NOTICE)
Logger("All synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
await this.plugin.saveSettings();
updateE2EControls();
applyDisplayEnabled();
@@ -882,7 +889,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
.setCta()
.onClick(async () => {
if (currentPreset == "") {
Logger("Select any preset.", LOG_LEVEL.NOTICE);
Logger("Select any preset.", LOG_LEVEL_NOTICE);
return;
}
const presetAllDisabled = {
@@ -890,6 +897,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
liveSync: false,
periodicReplication: false,
syncOnSave: false,
syncOnEditorSave: false,
syncOnStart: false,
syncOnFileOpen: false,
syncAfterMerge: false,
@@ -903,6 +911,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
batchSave: true,
periodicReplication: true,
syncOnSave: false,
syncOnEditorSave: false,
syncOnStart: true,
syncOnFileOpen: true,
syncAfterMerge: true,
@@ -913,15 +922,15 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
...this.plugin.settings,
...presetLiveSync
}
Logger("Synchronization setting configured as LiveSync.", LOG_LEVEL.NOTICE);
Logger("Synchronization setting configured as LiveSync.", LOG_LEVEL_NOTICE);
} else if (currentPreset == "PERIODIC") {
this.plugin.settings = {
...this.plugin.settings,
...presetPeriodic
}
Logger("Synchronization setting configured as Periodic sync with batch database update.", LOG_LEVEL.NOTICE);
Logger("Synchronization setting configured as Periodic sync with batch database update.", LOG_LEVEL_NOTICE);
} else {
Logger("All synchronization disabled.", LOG_LEVEL.NOTICE);
Logger("All synchronization disabled.", LOG_LEVEL_NOTICE);
this.plugin.settings = {
...this.plugin.settings,
...presetAllDisabled
@@ -943,7 +952,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
await this.plugin.replicate(true);
Logger("All done! Please set up subsequent devices with 'Copy setup URI' and 'Open setup URI'.", LOG_LEVEL.NOTICE);
Logger("All done! Please set up subsequent devices with 'Copy setup URI' and 'Open setup URI'.", LOG_LEVEL_NOTICE);
// @ts-ignore
this.plugin.app.commands.executeCommandById("obsidian-livesync:livesync-copysetupuri")
}
@@ -1013,6 +1022,17 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
applyDisplayEnabled();
})
)
new Setting(containerSyncSettingEl)
.setName("Sync on Editor Save")
.setDesc("When you save file on the editor, sync automatically")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncOnEditorSave).onChange(async (value) => {
this.plugin.settings.syncOnEditorSave = value;
await this.plugin.saveSettings();
applyDisplayEnabled();
})
)
new Setting(containerSyncSettingEl)
.setName("Sync on File Open")
.setDesc("When you open file, sync automatically")
@@ -1167,26 +1187,27 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
});
}
new Setting(containerSyncSettingEl)
.setName("Scan for hidden files before replication")
.setDesc("This configuration will be ignored if monitoring changes is enabled.")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncInternalFilesBeforeReplication).onChange(async (value) => {
this.plugin.settings.syncInternalFilesBeforeReplication = value;
await this.plugin.saveSettings();
})
);
if (!this.plugin.settings.watchInternalFileChanges) {
new Setting(containerSyncSettingEl)
.setName("Scan for hidden files before replication")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncInternalFilesBeforeReplication).onChange(async (value) => {
this.plugin.settings.syncInternalFilesBeforeReplication = value;
await this.plugin.saveSettings();
})
);
}
new Setting(containerSyncSettingEl)
.setName("Scan hidden files periodically")
.setDesc("Seconds, 0 to disable. This configuration will be ignored if monitoring changes is enabled.")
.setDesc("Seconds, 0 to disable")
.setClass("wizardHidden")
.addText((text) => {
text.setPlaceholder("")
.setValue(this.plugin.settings.syncInternalFilesInterval + "")
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 10) {
if (v !== 0 && (isNaN(v) || v < 10)) {
v = 10;
}
this.plugin.settings.syncInternalFilesInterval = v;
@@ -1198,7 +1219,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const defaultSkipPattern = "\\/node_modules\\/, \\/\\.git\\/, \\/obsidian-livesync\\/";
const defaultSkipPatternXPlat = defaultSkipPattern + ",\\/workspace$ ,\\/workspace.json$";
new Setting(containerSyncSettingEl)
.setName("Skip patterns")
.setName("Folders and files to ignore")
.setDesc(
"Regular expression, If you use hidden file sync between desktop and mobile, adding `workspace$` is recommended."
)
@@ -1249,7 +1270,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
new Setting(containerSyncSettingEl)
.setName("Enhance chunk size")
.setDesc("Enhance chunk size for binary files (0.1MBytes). This cannot be increased when using IBM Cloudant.")
.setDesc("Enhance chunk size for binary files (Ratio). This cannot be increased when using IBM Cloudant.")
.setClass("wizardHidden")
.addText((text) => {
text.setPlaceholder("")
@@ -1257,7 +1278,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 1) {
v = 1;
v = 0;
}
this.plugin.settings.customChunkSize = v;
await this.plugin.saveSettings();
@@ -1330,7 +1351,38 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
return text;
}
);
new Setting(containerSyncSettingEl)
.setName("(Beta) Use ignore files")
.setDesc("If this is set, changes to local files which are matched by the ignore files will be skipped. Remote changes are determined using local ignore files.")
.setClass("wizardHidden")
.addToggle((toggle) => {
toggle
.setValue(this.plugin.settings.useIgnoreFiles)
.onChange(async (value) => {
this.plugin.settings.useIgnoreFiles = value;
await this.plugin.saveSettings();
this.display();
})
return toggle;
}
);
if (this.plugin.settings.useIgnoreFiles) {
new Setting(containerSyncSettingEl)
.setName("Ignore files")
.setDesc("We can use multiple ignore files, e.g.) `.gitignore, .dockerignore`")
.setClass("wizardHidden")
.addTextArea((text) => {
text
.setValue(this.plugin.settings.ignoreFiles)
.setPlaceholder(".gitignore, .dockerignore")
.onChange(async (value) => {
this.plugin.settings.ignoreFiles = value;
await this.plugin.saveSettings();
})
return text;
}
);
}
containerSyncSettingEl.createEl("h4", {
text: sanitizeHTMLToDom(`Advanced settings`),
}).addClass("wizardHidden");
@@ -1461,14 +1513,16 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
pluginConfig.passphrase = REDACTED;
pluginConfig.encryptedPassphrase = REDACTED;
pluginConfig.encryptedCouchDBConnection = REDACTED;
pluginConfig.pluginSyncExtendedSetting = {};
const msgConfig = `----remote config----
${stringifyYaml(responseConfig)}
---- Plug-in config ---
version:${manifestVersion}
${stringifyYaml(pluginConfig)}`;
console.log(msgConfig);
await navigator.clipboard.writeText(msgConfig);
Logger(`Information has been copied to clipboard`, LOG_LEVEL.NOTICE);
Logger(`Information has been copied to clipboard`, LOG_LEVEL_NOTICE);
})
);
@@ -1521,11 +1575,11 @@ ${stringifyYaml(pluginConfig)}`;
Logger(`UPDATE DATABASE ${file.path}`);
await this.plugin.updateIntoDB(file, false, null, true);
i++;
Logger(`${i}/${files.length}\n${file.path}`, LOG_LEVEL.NOTICE, "verify");
Logger(`${i}/${files.length}\n${file.path}`, LOG_LEVEL_NOTICE, "verify");
} catch (ex) {
i++;
Logger(`Error while verifyAndRepair`, LOG_LEVEL.NOTICE);
Logger(`Error while verifyAndRepair`, LOG_LEVEL_NOTICE);
Logger(ex);
} finally {
releaser();
@@ -1533,10 +1587,92 @@ ${stringifyYaml(pluginConfig)}`;
}
)(e));
await Promise.all(processes);
Logger("done", LOG_LEVEL.NOTICE, "verify");
Logger("done", LOG_LEVEL_NOTICE, "verify");
})
);
new Setting(containerHatchEl)
.setName("Check and convert non-path-obfuscated files")
.setDesc("")
.addButton((button) =>
button
.setButtonText("Perform")
.setDisabled(false)
.setWarning()
.onClick(async () => {
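// Documents whose ID is still a plain path (no "f:" prefix) predate path obfuscation. Convert each to its obfuscated ID, keeping the previous content as a conflicted revision when an obfuscated copy already exists, then remove the old entry.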
for await (const docName of this.plugin.localDatabase.findAllDocNames()) {
if (!docName.startsWith("f:")) {
const idEncoded = await this.plugin.path2id(docName as FilePathWithPrefix);
const doc = await this.plugin.localDatabase.getRaw(docName as DocumentID);
if (!doc) continue;
if (doc.type != "newnote" && doc.type != "plain") {
continue;
}
if (doc?.deleted ?? false) continue;
const newDoc = { ...doc };
//Prepare converted data
newDoc._id = idEncoded;
newDoc.path = docName as FilePathWithPrefix;
delete newDoc._rev;
try {
const obfuscatedDoc = await this.plugin.localDatabase.getRaw(idEncoded, { revs_info: true });
// Unfortunately we have to delete one of them.
// For now, save it as a conflicted document.
obfuscatedDoc._revs_info?.shift(); // Drop latest revision.
const previousRev = obfuscatedDoc._revs_info?.shift(); // Use second revision.
if (previousRev) {
newDoc._rev = previousRev.rev;
} else {
// If there are no revisions, generate a presumably unique one
newDoc._rev = "1-" + (`00000000000000000000000000000000${~~(Math.random() * 1e9)}${~~(Math.random() * 1e9)}${~~(Math.random() * 1e9)}${~~(Math.random() * 1e9)}`.slice(-32));
}
const ret = await this.plugin.localDatabase.putRaw(newDoc, { force: true });
if (ret.ok) {
Logger(`${docName} has been converted as conflicted document`, LOG_LEVEL_NOTICE);
doc._deleted = true;
if ((await this.plugin.localDatabase.putRaw(doc)).ok) {
Logger(`Old ${docName} has been deleted`, LOG_LEVEL_NOTICE);
}
await this.plugin.showIfConflicted(docName as FilePathWithPrefix);
} else {
Logger(`Converting ${docName} Failed!`, LOG_LEVEL_NOTICE);
Logger(ret, LOG_LEVEL_VERBOSE);
}
} catch (ex) {
if (ex?.status == 404) {
// We can perform this safely
if ((await this.plugin.localDatabase.putRaw(newDoc)).ok) {
Logger(`${docName} has been converted`, LOG_LEVEL_NOTICE);
doc._deleted = true;
if ((await this.plugin.localDatabase.putRaw(doc)).ok) {
Logger(`Old ${docName} has been deleted`, LOG_LEVEL_NOTICE);
}
}
} else {
Logger(`Something went wrong on converting ${docName}`, LOG_LEVEL_NOTICE);
Logger(ex, LOG_LEVEL_VERBOSE);
// Something wrong.
}
}
}
}
Logger(`Converting finished`, LOG_LEVEL_NOTICE);
}));
new Setting(containerHatchEl)
.setName("Delete all customization sync data")
.addButton((button) =>
button
.setButtonText("Delete")
.setDisabled(false)
.setWarning()
.onClick(async () => {
Logger(`Deleting customization sync data`, LOG_LEVEL_NOTICE);
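// Customization sync entries are stored under the "ix:" prefix; mark every one of them as deleted with a single bulk write.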
const entriesToDelete = (await this.plugin.localDatabase.allDocsRaw({ startkey: "ix:", endkey: "ix:\u{10ffff}", include_docs: true }));
const newData = entriesToDelete.rows.map(e => ({ ...e.doc, _deleted: true }));
const r = await this.plugin.localDatabase.bulkDocsRaw(newData as any[]);
// Do not care about the result.
Logger(`${r.length} items have been removed. To confirm how many remain, please run this again.`, LOG_LEVEL_NOTICE);
}))
new Setting(containerHatchEl)
.setName("Suspend file watching")
.setDesc("Stop watching for file change.")
@@ -1544,6 +1680,27 @@ ${stringifyYaml(pluginConfig)}`;
toggle.setValue(this.plugin.settings.suspendFileWatching).onChange(async (value) => {
this.plugin.settings.suspendFileWatching = value;
await this.plugin.saveSettings();
scheduleTask("configReload", 250, async () => {
if (await askYesNo(this.app, "Do you want to restart and reload Obsidian now?") == "yes") {
// @ts-ignore
this.app.commands.executeCommandById("app:reload")
}
})
})
);
new Setting(containerHatchEl)
.setName("Suspend database reflecting")
.setDesc("Stop reflecting database changes to storage files.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.suspendParseReplicationResult).onChange(async (value) => {
this.plugin.settings.suspendParseReplicationResult = value;
await this.plugin.saveSettings();
scheduleTask("configReload", 250, async () => {
if (await askYesNo(this.app, "Do you want to restart and reload Obsidian now?") == "yes") {
// @ts-ignore
this.app.commands.executeCommandById("app:reload")
}
})
})
);
new Setting(containerHatchEl)
@@ -1640,15 +1797,50 @@ ${stringifyYaml(pluginConfig)}`;
button.setButtonText("Change")
.onClick(async () => {
if (this.plugin.settings.additionalSuffixOfDatabaseName == newDatabaseName) {
Logger("Suffix was not changed.", LOG_LEVEL.NOTICE);
Logger("Suffix was not changed.", LOG_LEVEL_NOTICE);
return;
}
this.plugin.settings.additionalSuffixOfDatabaseName = newDatabaseName;
await this.plugin.saveSettings();
Logger("Suffix has been changed. Reopening database...", LOG_LEVEL.NOTICE);
Logger("Suffix has been changed. Reopening database...", LOG_LEVEL_NOTICE);
await this.plugin.initializeDatabase();
})
})
new Setting(containerHatchEl)
.setName("The Hash algorithm for chunk IDs")
.setDesc("xxhash64 is the current default.")
.setClass("wizardHidden")
.addDropdown((dropdown) =>
dropdown
.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)" } as Record<HashAlgorithm, string>)
.setValue(this.plugin.settings.hashAlg)
.onChange(async (value) => {
this.plugin.settings.hashAlg = value as HashAlgorithm;
await this.plugin.saveSettings();
})
)
.setClass("wizardHidden");
new Setting(containerHatchEl)
.setName("Fetch database with previous behaviour")
.setDesc("")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.doNotSuspendOnFetching).onChange(async (value) => {
this.plugin.settings.doNotSuspendOnFetching = value;
await this.plugin.saveSettings();
})
);
new Setting(containerHatchEl)
.setName("Use binary and encryption version 1")
.setDesc("Use the previous data format for other products which shares the remote database.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.useV1).onChange(async (value) => {
this.plugin.settings.useV1 = value;
await this.plugin.saveSettings();
})
);
addScreenElement("50", containerHatchEl);
@@ -1673,11 +1865,11 @@ ${stringifyYaml(pluginConfig)}`;
vaultName.setDisabled(this.plugin.settings.usePluginSync);
// vaultName.setTooltip(this.plugin.settings.autoSweepPlugins || this.plugin.settings.autoSweepPluginsPeriodic ? "You could not change when you enabling auto scan." : "");
};
updateDisabledOfDeviceAndVaultName
updateDisabledOfDeviceAndVaultName();
new Setting(containerPluginSettings).setName("Enable customization sync").addToggle((toggle) =>
toggle.setValue(this.plugin.settings.usePluginSync).onChange(async (value) => {
if (value && this.plugin.deviceAndVaultName.trim() == "") {
Logger("We have to configure `Device name` to use this feature.", LOG_LEVEL.NOTICE);
Logger("We have to configure `Device name` to use this feature.", LOG_LEVEL_NOTICE);
toggle.setValue(false);
return false;
}
@@ -1699,16 +1891,18 @@ ${stringifyYaml(pluginConfig)}`;
})
);
new Setting(containerPluginSettings)
.setName("Scan customization periodically")
.setDesc("Scan customization every 1 minute. This configuration will be ignored if monitoring changes of hidden files has been enabled.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.autoSweepPluginsPeriodic).onChange(async (value) => {
this.plugin.settings.autoSweepPluginsPeriodic = value;
updateDisabledOfDeviceAndVaultName();
await this.plugin.saveSettings();
})
);
if (!this.plugin.settings.watchInternalFileChanges) {
new Setting(containerPluginSettings)
.setName("Scan customization periodically")
.setDesc("Scan customization every 1 minute.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.autoSweepPluginsPeriodic).onChange(async (value) => {
this.plugin.settings.autoSweepPluginsPeriodic = value;
updateDisabledOfDeviceAndVaultName();
await this.plugin.saveSettings();
})
);
}
new Setting(containerPluginSettings)
.setName("Notify customized")
@@ -1769,26 +1963,6 @@ ${stringifyYaml(pluginConfig)}`;
})
)
new Setting(containerMaintenanceEl)
.setName("(Beta) Clean the remote database")
.setDesc("")
.addButton((button) =>
button.setButtonText("Count")
.setDisabled(false)
.onClick(async () => {
await remoteDatabaseCleanup(this.plugin, true);
})
).addButton((button) =>
button.setButtonText("Perform cleaning")
.setDisabled(false)
.setWarning()
.onClick(async () => {
// @ts-ignore
this.plugin.app.setting.close()
await remoteDatabaseCleanup(this.plugin, false);
await balanceChunks(this.plugin, false);
})
);
containerMaintenanceEl.createEl("h4", { text: "The local database" });
@@ -1805,26 +1979,6 @@ ${stringifyYaml(pluginConfig)}`;
})
)
new Setting(containerMaintenanceEl)
.setName("(Beta) Clean the local database")
.setDesc("This feature requires disabling 'Use an old adapter for compatibility'")
.addButton((button) =>
button.setButtonText("Count")
.setDisabled(false)
.onClick(async () => {
await localDatabaseCleanUp(this.plugin, false, true);
})
).addButton((button) =>
button.setButtonText("Perform cleaning")
.setDisabled(false)
.setWarning()
.onClick(async () => {
// @ts-ignore
this.plugin.app.setting.close()
await localDatabaseCleanUp(this.plugin, false, false);
})
);
new Setting(containerMaintenanceEl)
.setName("Discard local database to reset or uninstall Self-hosted LiveSync")
.addButton((button) =>
@@ -1840,6 +1994,25 @@ ${stringifyYaml(pluginConfig)}`;
containerMaintenanceEl.createEl("h4", { text: "Both databases" });
new Setting(containerMaintenanceEl)
.setName("(Beta2) Clean up databases")
.setDesc("Delete unused chunks to shrink the database. This feature requires disabling 'Use an old adapter for compatibility'")
.addButton((button) =>
button.setButtonText("DryRun")
.setDisabled(false)
.onClick(async () => {
await this.plugin.dryRunGC();
})
).addButton((button) =>
button.setButtonText("Perform cleaning")
.setDisabled(false)
.setWarning()
.onClick(async () => {
// @ts-ignore
this.plugin.app.setting.close()
await this.plugin.dbGC();
})
);
new Setting(containerMaintenanceEl)
.setName("Rebuild everything")
.setDesc("Rebuild local and remote database with local files.")
@@ -1852,19 +2025,6 @@ ${stringifyYaml(pluginConfig)}`;
await rebuildDB("rebuildBothByThisDevice");
})
)
new Setting(containerMaintenanceEl)
.setName("(Beta) Complement each other with possible missing chunks.")
.setDesc("")
.addButton((button) =>
button
.setButtonText("Balance")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await balanceChunks(this.plugin, false);
})
)
applyDisplayEnabled();
addScreenElement("70", containerMaintenanceEl);

View File

@@ -2,7 +2,7 @@
import type { PluginDataExDisplay } from "./CmdConfigSync";
import { Logger } from "./lib/src/logger";
import { versionNumberString2Number } from "./lib/src/strbin";
import { type FilePath, LOG_LEVEL } from "./lib/src/types";
import { type FilePath, LOG_LEVEL_NOTICE } from "./lib/src/types";
import { getDocData } from "./lib/src/utils";
import type ObsidianLiveSyncPlugin from "./main";
import { askString, scheduleTask } from "./utils";
@@ -108,7 +108,7 @@
}
}
})
.reduce((p, c) => p | c, 0);
.reduce((p, c) => p | (c as number), 0 as number);
if (matchingStatus == 0b0000100) {
equivalency = "⚖️ Same";
canApply = false;
@@ -207,21 +207,21 @@
const local = list.find((e) => e.term == thisTerm);
const selectedItem = list.find((e) => e.term == selected);
if (selectedItem && (await applyData(selectedItem))) {
scheduleTask("update-plugin-list", 250, () => addOn.updatePluginList(true, local.documentPath));
addOn.updatePluginList(true, local?.documentPath);
}
}
async function compareSelected() {
const local = list.find((e) => e.term == thisTerm);
const selectedItem = list.find((e) => e.term == selected);
if (local && selectedItem && (await compareData(local, selectedItem))) {
scheduleTask("update-plugin-list", 250, () => addOn.updatePluginList(true, local.documentPath));
addOn.updatePluginList(true, local.documentPath);
}
}
async function deleteSelected() {
const selectedItem = list.find((e) => e.term == selected);
// const deletedPath = selectedItem.documentPath;
if (selectedItem && (await deleteData(selectedItem))) {
scheduleTask("update-plugin-list", 250, () => addOn.reloadPluginList(true));
addOn.reloadPluginList(true);
}
}
async function duplicateItem() {
@@ -229,7 +229,7 @@
const duplicateTermName = await askString(plugin.app, "Duplicate", "device name", "");
if (duplicateTermName) {
if (duplicateTermName.contains("/")) {
Logger(`We can not use "/" to the device name`, LOG_LEVEL.NOTICE);
Logger(`We can not use "/" to the device name`, LOG_LEVEL_NOTICE);
return;
}
const key = `${plugin.app.vault.configDir}/${local.files[0].filename}`;
@@ -274,7 +274,7 @@
{/if}
{:else}
<span class="spacer" />
<span class="message even">All devices are even</span>
<span class="message even">All the same or non-existent</span>
<button disabled />
<button disabled />
{/if}

View File

@@ -3,9 +3,13 @@
import ObsidianLiveSyncPlugin from "./main";
import { type PluginDataExDisplay, pluginIsEnumerating, pluginList } from "./CmdConfigSync";
import PluginCombo from "./PluginCombo.svelte";
import { Menu } from "obsidian";
import { unique } from "./lib/src/utils";
import { MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED, type SYNC_MODE, type PluginSyncSettingEntry } from "./lib/src/types";
import { normalizePath } from "./deps";
export let plugin: ObsidianLiveSyncPlugin;
$: hideNotApplicable = true;
$: hideNotApplicable = false;
$: thisTerm = plugin.deviceAndVaultName;
const addOn = plugin.addOnConfigSync;
@@ -13,7 +17,7 @@
let list: PluginDataExDisplay[] = [];
let selectNewestPulse = 0;
let hideEven = true;
let hideEven = false;
let loading = false;
let applyAllPluse = 0;
let isMaintenanceMode = false;
@@ -80,6 +84,54 @@
async function deleteData(data: PluginDataExDisplay): Promise<boolean> {
return await addOn.deleteData(data);
}
function askMode(evt: MouseEvent, title: string, key: string) {
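// Opens a context menu to pick the sync mode for this key. Choosing Automatic first asks which side should provide the initial state.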
const menu = new Menu();
menu.addItem((item) => item.setTitle(title).setIsLabel(true));
menu.addSeparator();
const prevMode = automaticList.get(key) ?? MODE_SELECTIVE;
for (const mode of [MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED]) {
menu.addItem((item) => {
item.setTitle(`${getIcon(mode as SYNC_MODE)}:${TITLES[mode]}`)
.onClick((e) => {
if (mode === MODE_AUTOMATIC) {
askOverwriteModeForAutomatic(evt, key);
} else {
setMode(key, mode as SYNC_MODE);
}
})
.setChecked(prevMode == mode)
.setDisabled(prevMode == mode);
});
}
menu.showAtMouseEvent(evt);
}
function applyAutomaticSync(key: string, direction: "pushForce" | "pullForce" | "safe") {
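// Switch the key to Automatic and immediately sync its files in the chosen direction (force push, force pull, or newer-wins).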
setMode(key, MODE_AUTOMATIC);
const configDir = normalizePath(plugin.app.vault.configDir);
const files = (plugin.settings.pluginSyncExtendedSetting[key]?.files ?? []).map((e) => `${configDir}/${e}`);
plugin.addOnHiddenFileSync.syncInternalFilesAndDatabase(direction, true, false, files);
}
function askOverwriteModeForAutomatic(evt: MouseEvent, key: string) {
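// Asks how to seed Automatic mode: overwrite the remote copy, overwrite the local copy, or keep whichever is newer.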
const menu = new Menu();
menu.addItem((item) => item.setTitle("Initial Action").setIsLabel(true));
menu.addSeparator();
menu.addItem((item) => {
item.setTitle(`↑: Overwrite Remote`).onClick((e) => {
applyAutomaticSync(key, "pushForce");
});
})
.addItem((item) => {
item.setTitle(`↓: Overwrite Local`).onClick((e) => {
applyAutomaticSync(key, "pullForce");
});
})
.addItem((item) => {
item.setTitle(`⇅: Use newer`).onClick((e) => {
applyAutomaticSync(key, "safe");
});
});
menu.showAtMouseEvent(evt);
}
$: options = {
thisTerm,
@@ -92,11 +144,84 @@
plugin,
isMaintenanceMode,
};
const ICON_EMOJI_PAUSED = `⛔`;
const ICON_EMOJI_AUTOMATIC = `✨`;
const ICON_EMOJI_SELECTIVE = `🔀`;
const ICONS: { [key: number]: string } = {
[MODE_SELECTIVE]: ICON_EMOJI_SELECTIVE,
[MODE_PAUSED]: ICON_EMOJI_PAUSED,
[MODE_AUTOMATIC]: ICON_EMOJI_AUTOMATIC,
};
const TITLES: { [key: number]: string } = {
[MODE_SELECTIVE]: "Selective",
[MODE_PAUSED]: "Ignore",
[MODE_AUTOMATIC]: "Automatic",
};
const PREFIX_PLUGIN_ALL = "PLUGIN_ALL";
const PREFIX_PLUGIN_DATA = "PLUGIN_DATA";
const PREFIX_PLUGIN_MAIN = "PLUGIN_MAIN";
function setMode(key: string, mode: SYNC_MODE) {
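// "PLUGIN_ALL/<name>" is a virtual key: applying a mode to it fans out to the corresponding MAIN and DATA entries before the per-key settings are saved.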
if (key.startsWith(PREFIX_PLUGIN_ALL + "/")) {
setMode(PREFIX_PLUGIN_DATA + key.substring(PREFIX_PLUGIN_ALL.length), mode);
setMode(PREFIX_PLUGIN_MAIN + key.substring(PREFIX_PLUGIN_ALL.length), mode);
}
const files = unique(
list
.filter((e) => `${e.category}/${e.name}` == key)
.map((e) => e.files)
.flat()
.map((e) => e.filename)
);
automaticList.set(key, mode);
automaticListDisp = automaticList;
if (!(key in plugin.settings.pluginSyncExtendedSetting)) {
plugin.settings.pluginSyncExtendedSetting[key] = {
key,
mode,
files: [],
};
}
plugin.settings.pluginSyncExtendedSetting[key].files = files;
plugin.settings.pluginSyncExtendedSetting[key].mode = mode;
plugin.saveSettingData();
}
function getIcon(mode: SYNC_MODE) {
if (mode in ICONS) {
return ICONS[mode];
} else {
("");
}
}
let automaticList = new Map<string, SYNC_MODE>();
let automaticListDisp = new Map<string, SYNC_MODE>();
// apply current configuration to the dialogue
for (const { key, mode } of Object.values(plugin.settings.pluginSyncExtendedSetting)) {
automaticList.set(key, mode);
}
automaticListDisp = automaticList;
let displayKeys: Record<string, string[]> = {};
$: {
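// Build the category -> item-name map from both the scanned list and keys that exist only in the saved settings, so configured but currently missing items stay visible.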
const extraKeys = Object.keys(plugin.settings.pluginSyncExtendedSetting);
displayKeys = [
...list,
...extraKeys
.map((e) => `${e}///`.split("/"))
.filter((e) => e[0] && e[1])
.map((e) => ({ category: e[0], name: e[1], displayName: e[1] })),
]
.sort((a, b) => (a.displayName ?? a.name).localeCompare(b.displayName ?? b.name))
.reduce((p, c) => ({ ...p, [c.category]: unique(c.category in p ? [...p[c.category], c.displayName ?? c.name] : [c.displayName ?? c.name]) }), {} as Record<string, string[]>);
}
</script>
<div>
<div>
<h1>Customization sync</h1>
<div class="buttons">
<button on:click={() => scanAgain()}>Scan changes</button>
<button on:click={() => replicate()}>Sync once</button>
@@ -119,15 +244,24 @@
{#if list.length == 0}
<div class="center">No Items.</div>
{:else}
{#each Object.entries(displays) as [key, label]}
{#each Object.entries(displays).filter(([key, _]) => key in displayKeys) as [key, label]}
<div>
<h3>{label}</h3>
{#each groupBy(filterList(list, [key]), "name") as [name, listX]}
{#each displayKeys[key] as name}
{@const bindKey = `${key}/${name}`}
{@const mode = automaticListDisp.get(bindKey) ?? MODE_SELECTIVE}
<div class="labelrow {hideEven ? 'hideeven' : ''}">
<div class="title">
{name}
<button class="status" on:click={(evt) => askMode(evt, `${key}/${name}`, bindKey)}>
{getIcon(mode)}
</button>
<span class="name">{name}</span>
</div>
<PluginCombo {...options} list={listX} hidden={false} />
{#if mode == MODE_SELECTIVE}
<PluginCombo {...options} list={list.filter((e) => e.category == key && e.name == name)} hidden={false} />
{:else}
<div class="statusnote">{TITLES[mode]}</div>
{/if}
</div>
{/each}
</div>
@@ -135,20 +269,55 @@
<div>
<h3>Plugins</h3>
{#each groupBy(filterList(list, ["PLUGIN_MAIN", "PLUGIN_DATA", "PLUGIN_ETC"]), "name") as [name, listX]}
{@const bindKeyAll = `${PREFIX_PLUGIN_ALL}/${name}`}
{@const modeAll = automaticListDisp.get(bindKeyAll) ?? MODE_SELECTIVE}
{@const bindKeyMain = `${PREFIX_PLUGIN_MAIN}/${name}`}
{@const modeMain = automaticListDisp.get(bindKeyMain) ?? MODE_SELECTIVE}
{@const bindKeyData = `${PREFIX_PLUGIN_DATA}/${name}`}
{@const modeData = automaticListDisp.get(bindKeyData) ?? MODE_SELECTIVE}
<div class="labelrow {hideEven ? 'hideeven' : ''}">
<div class="title">
{name}
<button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_ALL}/${name}`, bindKeyAll)}>
{getIcon(modeAll)}
</button>
<span class="name">{name}</span>
</div>
<PluginCombo {...options} list={listX} hidden={true} />
</div>
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">Main</div>
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_MAIN"])} hidden={false} />
</div>
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">Data</div>
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_DATA"])} hidden={false} />
{#if modeAll == MODE_SELECTIVE}
<PluginCombo {...options} list={listX} hidden={true} />
{/if}
</div>
{#if modeAll == MODE_SELECTIVE}
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">
<button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_MAIN}/${name}/MAIN`, bindKeyMain)}>
{getIcon(modeMain)}
</button>
<span class="name">MAIN</span>
</div>
{#if modeMain == MODE_SELECTIVE}
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_MAIN"])} hidden={false} />
{:else}
<div class="statusnote">{TITLES[modeMain]}</div>
{/if}
</div>
<div class="filerow {hideEven ? 'hideeven' : ''}">
<div class="filetitle">
<button class="status" on:click={(evt) => askMode(evt, `${PREFIX_PLUGIN_DATA}/${name}`, bindKeyData)}>
{getIcon(modeData)}
</button>
<span class="name">DATA</span>
</div>
{#if modeData == MODE_SELECTIVE}
<PluginCombo {...options} list={filterList(listX, ["PLUGIN_DATA"])} hidden={false} />
{:else}
<div class="statusnote">{TITLES[modeData]}</div>
{/if}
</div>
{:else}
<div class="noterow">
<div class="statusnote">{TITLES[modeAll]}</div>
</div>
{/if}
{/each}
</div>
{/if}
@@ -162,6 +331,15 @@
</div>
<style>
span.spacer {
min-width: 1px;
flex-grow: 1;
}
h3 {
position: sticky;
top: 0;
background-color: var(--modal-background);
}
.labelrow {
margin-left: 0.4em;
display: flex;
@@ -183,6 +361,24 @@
.labelrow.hideeven:has(.even) {
display: none;
}
.noterow {
min-height: 2em;
display: flex;
}
button.status {
flex-grow: 0;
margin: 2px 4px;
min-width: 3em;
max-width: 4em;
}
.statusnote {
display: flex;
justify-content: flex-end;
padding-right: var(--size-4-12);
align-items: center;
min-width: 10em;
flex-grow: 1;
}
.title {
color: var(--text-normal);

View File

@@ -1,8 +1,8 @@
import { Plugin_2, TAbstractFile, TFile, TFolder } from "./deps";
import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
import { getGlobalStore } from "./lib/src/store";
import { FilePath, ObsidianLiveSyncSettings } from "./lib/src/types";
import { FileEventItem, FileEventType, FileInfo, InternalFileInfo, queueItem } from "./types";
import { type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo, type queueItem } from "./types";
import { recentlyTouched } from "./utils";
@@ -12,12 +12,13 @@ export abstract class StorageEventManager {
abstract getQueueLength(): number;
}
type LiveSyncForStorageEventManager = Plugin_2 &
type LiveSyncForStorageEventManager = Plugin &
{
settings: ObsidianLiveSyncSettings
ignoreFiles: string[],
} & {
isTargetFile: (file: string | TAbstractFile) => boolean,
procFileEvent: (applyBatch?: boolean) => Promise<boolean>
isTargetFile: (file: string | TAbstractFile) => Promise<boolean>,
procFileEvent: (applyBatch?: boolean) => Promise<any>,
};
@@ -35,12 +36,12 @@ export class StorageEventManagerObsidian extends StorageEventManager {
this.watchVaultDelete = this.watchVaultDelete.bind(this);
this.watchVaultRename = this.watchVaultRename.bind(this);
this.watchVaultRawEvents = this.watchVaultRawEvents.bind(this);
plugin.registerEvent(app.vault.on("modify", this.watchVaultChange));
plugin.registerEvent(app.vault.on("delete", this.watchVaultDelete));
plugin.registerEvent(app.vault.on("rename", this.watchVaultRename));
plugin.registerEvent(app.vault.on("create", this.watchVaultCreate));
plugin.registerEvent(plugin.app.vault.on("modify", this.watchVaultChange));
plugin.registerEvent(plugin.app.vault.on("delete", this.watchVaultDelete));
plugin.registerEvent(plugin.app.vault.on("rename", this.watchVaultRename));
plugin.registerEvent(plugin.app.vault.on("create", this.watchVaultCreate));
//@ts-ignore : Internal API
plugin.registerEvent(app.vault.on("raw", this.watchVaultRawEvents));
plugin.registerEvent(plugin.app.vault.on("raw", this.watchVaultRawEvents));
}
watchVaultCreate(file: TAbstractFile, ctx?: any) {
@@ -64,9 +65,18 @@ export class StorageEventManagerObsidian extends StorageEventManager {
}
// Watch raw events (Internal API)
watchVaultRawEvents(path: FilePath) {
if (this.plugin.settings.useIgnoreFiles && this.plugin.ignoreFiles.some(e => path.endsWith(e.trim()))) {
// If the changed file is itself one of the ignore files, refresh the cached ignore rules before handling the event.
this.plugin.isTargetFile(path).then(() => this._watchVaultRawEvents(path));
} else {
this._watchVaultRawEvents(path);
}
}
_watchVaultRawEvents(path: FilePath) {
if (!this.plugin.settings.syncInternalFiles && !this.plugin.settings.usePluginSync) return;
if (!this.plugin.settings.watchInternalFileChanges) return;
if (!path.startsWith(app.vault.configDir)) return;
if (!path.startsWith(this.plugin.app.vault.configDir)) return;
const ignorePatterns = this.plugin.settings.syncInternalFilesIgnorePatterns
.replace(/\n| /g, "")
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
@@ -77,7 +87,6 @@ export class StorageEventManagerObsidian extends StorageEventManager {
file: { path, mtime: 0, ctime: 0, size: 0 }
}], null);
}
// Cache file and waiting to can be proceed.
async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
let forcePerform = false;
@@ -90,7 +99,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
const file = param.file;
const oldPath = param.oldPath;
if (file instanceof TFolder) continue;
if (!this.plugin.isTargetFile(file.path)) continue;
if (!await this.plugin.isTargetFile(file.path)) continue;
if (this.plugin.settings.suspendFileWatching) continue;
let cache: null | string | ArrayBuffer;
@@ -100,11 +109,11 @@ export class StorageEventManagerObsidian extends StorageEventManager {
continue;
}
if (!isPlainText(file.name)) {
cache = await app.vault.readBinary(file);
cache = await this.plugin.app.vault.readBinary(file);
} else {
// cache = await this.app.vault.read(file);
cache = await app.vault.cachedRead(file);
if (!cache) cache = await app.vault.read(file);
cache = await this.plugin.app.vault.cachedRead(file);
if (!cache) cache = await this.plugin.app.vault.read(file);
}
}
if (type == "DELETE" || type == "RENAME") {

View File

@@ -1,12 +1,13 @@
import { type FilePath } from "./lib/src/types";
export {
addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, Plugin_2, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
parseYaml, ItemView, WorkspaceLeaf
} from "obsidian";
export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse } from "obsidian";
export type { DataWriteOptions, PluginManifest, RequestUrlParam, RequestUrlResponse, MarkdownFileInfo } from "obsidian";
import {
normalizePath as normalizePath_
} from "obsidian";
const normalizePath = normalizePath_ as <T extends string | FilePath>(from: T) => T;
export { normalizePath }
export { type Diff, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "diff-match-patch";

View File

@@ -20,6 +20,7 @@ export class PluginDialogModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Customization Sync (Beta2)")
if (this.component == null) {
this.component = new PluginPane({
target: contentEl,
@@ -38,7 +39,7 @@ export class PluginDialogModal extends Modal {
export class InputStringDialog extends Modal {
result: string | false = false;
onSubmit: (result: string | boolean) => void;
onSubmit: (result: string | false) => void;
title: string;
key: string;
placeholder: string;
@@ -56,10 +57,8 @@ export class InputStringDialog extends Modal {
onOpen() {
const { contentEl } = this;
contentEl.createEl("h1", { text: this.title });
// For enter to submit
const formEl = contentEl.createEl("form");
this.titleEl.setText(this.title);
const formEl = contentEl.createDiv();
new Setting(formEl).setName(this.key).setClass(this.isPassword ? "password-input" : "normal-input").addText((text) =>
text.onChange((value) => {
this.result = value;
@@ -144,7 +143,7 @@ export class MessageBox extends Modal {
timer: ReturnType<typeof setInterval> = undefined;
defaultButtonComponent: ButtonComponent | undefined;
onSubmit: (result: string | boolean) => void;
onSubmit: (result: string | false) => void;
constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number, onSubmit: (result: (typeof buttons)[number] | false) => void) {
super(plugin.app);
@@ -175,16 +174,17 @@ export class MessageBox extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText(this.title);
contentEl.addEventListener("click", () => {
if (this.timer) {
clearInterval(this.timer);
this.timer = undefined;
}
})
contentEl.createEl("h1", { text: this.title });
const div = contentEl.createDiv();
MarkdownRenderer.renderMarkdown(this.contentMd, div, "/", null);
MarkdownRenderer.render(this.plugin.app, this.contentMd, div, "/", this.plugin);
const buttonSetting = new Setting(contentEl);
buttonSetting.controlEl.style.flexWrap = "wrap";
for (const button of this.buttons) {
buttonSetting.addButton((btn) => {
btn

Submodule src/lib updated: 63fa0074fe...4a4e7a26b4

File diff suppressed because it is too large

View File

@@ -1,5 +1,5 @@
import { PluginManifest, TFile } from "./deps";
import { DatabaseEntry, EntryBody, FilePath } from "./lib/src/types";
import { type PluginManifest, TFile } from "./deps";
import { type DatabaseEntry, type EntryBody, type FilePath } from "./lib/src/types";
export interface PluginDataEntry extends DatabaseEntry {
deviceVaultName: string;

View File

@@ -1,12 +1,12 @@
import { type DataWriteOptions, normalizePath, TFile, Platform, TAbstractFile, App, Plugin_2, type RequestUrlParam, requestUrl } from "./deps";
import { type DataWriteOptions, normalizePath, TFile, Platform, TAbstractFile, App, Plugin, type RequestUrlParam, requestUrl } from "./deps";
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";
import { Logger } from "./lib/src/logger";
import { type AnyEntry, type DocumentID, type EntryDoc, type EntryHasPath, type FilePath, type FilePathWithPrefix, LOG_LEVEL, type NewEntry } from "./lib/src/types";
import { CHeader, ICHeader, ICHeaderLength, PSCHeader } from "./types";
import { LOG_LEVEL_VERBOSE, type AnyEntry, type DocumentID, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "./lib/src/types";
import { CHeader, ICHeader, ICHeaderLength, ICXHeader, PSCHeader } from "./types";
import { InputStringDialog, PopoverSelectString } from "./dialogs";
import ObsidianLiveSyncPlugin from "./main";
import { runWithLock } from "./lib/src/lock";
import { writeString } from "./lib/src/strbin";
// For backward compatibility, the path is used to determine the id.
// Only IDs that CouchDB cannot accept (those starting with an underscore) are prefixed with "/".
@@ -328,7 +328,7 @@ export function isValidPath(filename: string) {
if (Platform.isAndroidApp) return isValidFilenameInAndroid(filename);
if (Platform.isIosApp) return isValidFilenameInDarwin(filename);
//Fallback
Logger("Could not determine platform for checking filename", LOG_LEVEL.VERBOSE);
Logger("Could not determine platform for checking filename", LOG_LEVEL_VERBOSE);
return isValidFilenameInWidows(filename);
}
@@ -388,6 +388,9 @@ export function isChunk(str: string): boolean {
export function isPluginMetadata(str: string): boolean {
return str.startsWith(PSCHeader);
}
export function isCustomisationSyncMetadata(str: string): boolean {
return str.startsWith(ICXHeader);
}
export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
return new Promise((res) => {
@@ -416,8 +419,8 @@ export const askString = (app: App, title: string, key: string, placeholder: str
export class PeriodicProcessor {
_process: () => Promise<any>;
_timer?: number;
_plugin: Plugin_2;
constructor(plugin: Plugin_2, process: () => Promise<any>) {
_plugin: Plugin;
constructor(plugin: Plugin, process: () => Promise<any>) {
this._plugin = plugin;
this._process = process;
}
@@ -431,22 +434,17 @@ export class PeriodicProcessor {
enable(interval: number) {
this.disable();
if (interval == 0) return;
this._timer = window.setInterval(() => this._process().then(() => { }), interval);
this._timer = window.setInterval(() => this.process().then(() => { }), interval);
this._plugin.registerInterval(this._timer);
}
disable() {
if (this._timer) clearInterval(this._timer);
if (this._timer !== undefined) window.clearInterval(this._timer);
this._timer = undefined;
}
}
function sizeToHumanReadable(size: number | undefined) {
if (!size) return "-";
const i = Math.floor(Math.log(size) / Math.log(1024));
return Number.parseInt((size / Math.pow(1024, i)).toFixed(2)) + ' ' + ['B', 'kB', 'MB', 'GB', 'TB'][i];
}
export const _requestToCouchDBFetch = async (baseUri: string, username: string, password: string, path?: string, body?: string | any, method?: string) => {
const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${username}:${password}`));
const utf8str = String.fromCharCode.apply(null, [...writeString(`${username}:${password}`)]);
const encoded = window.btoa(utf8str);
const authHeader = "Basic " + encoded;
const transformedHeaders: Record<string, string> = { authorization: authHeader, "content-type": "application/json" };
@@ -462,7 +460,7 @@ export const _requestToCouchDBFetch = async (baseUri: string, username: string,
}
export const _requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, path?: string, body?: any, method?: string) => {
const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${username}:${password}`));
const utf8str = String.fromCharCode.apply(null, [...writeString(`${username}:${password}`)]);
const encoded = window.btoa(utf8str);
const authHeader = "Basic " + encoded;
const transformedHeaders: Record<string, string> = { authorization: authHeader, origin: origin };
@@ -492,249 +490,3 @@ export async function performRebuildDB(plugin: ObsidianLiveSyncPlugin, method: "
await plugin.addOnSetup.rebuildEverything();
}
}
export const gatherChunkUsage = async (db: PouchDB.Database<EntryDoc>) => {
const used = new Map();
const unreferenced = new Map();
const removed = new Map();
const missing = new Map();
const xx = await db.allDocs({ startkey: "h:", endkey: `h:\u{10ffff}` });
for (const xxd of xx.rows) {
const chunk = xxd.id
unreferenced.set(chunk, xxd.value.rev);
}
const x = await db.find({ limit: 999999999, selector: { children: { $exists: true, $type: "array" } }, fields: ["_id", "path", "mtime", "children"] });
for (const temp of x.docs) {
for (const chunk of (temp as NewEntry).children) {
used.set(chunk, (used.has(chunk) ? used.get(chunk) : 0) + 1);
if (unreferenced.has(chunk)) {
removed.set(chunk, unreferenced.get(chunk));
unreferenced.delete(chunk);
} else {
if (!removed.has(chunk)) {
if (!missing.has(temp._id)) {
missing.set(temp._id, []);
}
missing.get(temp._id).push(chunk);
}
}
}
}
return { used, unreferenced, missing };
}
export const localDatabaseCleanUp = async (plugin: ObsidianLiveSyncPlugin, force: boolean, dryRun: boolean) => {
await runWithLock("clean-up:local", true, async () => {
const db = plugin.localDatabase.localDatabase;
if ((db as any)?.adapter != "indexeddb") {
if (force && !dryRun) {
Logger("Fetch from the remote database", LOG_LEVEL.NOTICE, "clean-up-db");
await performRebuildDB(plugin, "localOnly");
return;
} else {
Logger("This feature requires disabling `Use an old adapter for compatibility`.", LOG_LEVEL.NOTICE, "clean-up-db");
return;
}
}
Logger(`The remote database has been locked for garbage collection`, LOG_LEVEL.NOTICE, "clean-up-db");
Logger(`Gathering chunk usage information`, LOG_LEVEL.NOTICE, "clean-up-db");
const { unreferenced, missing } = await gatherChunkUsage(db);
if (missing.size != 0) {
Logger(`Some chunks are not found! We have to rescue`, LOG_LEVEL.NOTICE);
Logger(missing, LOG_LEVEL.VERBOSE);
} else {
Logger(`All chunks are OK`, LOG_LEVEL.NOTICE);
}
const payload = {} as Record<string, string[]>;
for (const [id, rev] of unreferenced) {
payload[id] = [rev];
}
const removeItems = Object.keys(payload).length;
if (removeItems == 0) {
Logger(`No unreferenced chunks found (Local)`, LOG_LEVEL.NOTICE);
await plugin.markRemoteResolved();
}
if (dryRun) {
Logger(`There are ${removeItems} unreferenced chunks (Local)`, LOG_LEVEL.NOTICE);
return;
}
Logger(`Deleting unreferenced chunks: ${removeItems}`, LOG_LEVEL.NOTICE, "clean-up-db");
for (const [id, rev] of unreferenced) {
//@ts-ignore
const ret = await db.purge(id, rev);
Logger(ret, LOG_LEVEL.VERBOSE);
}
plugin.localDatabase.refreshSettings();
Logger(`Compacting local database...`, LOG_LEVEL.NOTICE, "clean-up-db");
await db.compact();
await plugin.markRemoteResolved();
Logger("Done!", LOG_LEVEL.NOTICE, "clean-up-db");
})
}
export const balanceChunks = async (plugin: ObsidianLiveSyncPlugin, dryRun: boolean) => {
await runWithLock("clean-up:balance", true, async () => {
const localDB = plugin.localDatabase.localDatabase;
Logger(`Gathering chunk usage information`, LOG_LEVEL.NOTICE, "clean-up-db");
const ret = await plugin.replicator.connectRemoteCouchDBWithSetting(plugin.settings, plugin.isMobile);
if (typeof ret === "string") {
Logger(`Connect error: ${ret}`, LOG_LEVEL.NOTICE, "clean-up-db");
return;
}
const localChunks = new Map<string, string>();
const xx = await localDB.allDocs({ startkey: "h:", endkey: `h:\u{10ffff}` });
for (const xxd of xx.rows) {
const chunk = xxd.id
localChunks.set(chunk, xxd.value.rev);
}
// const info = ret.info;
const remoteDB = ret.db;
const remoteChunks = new Map<string, string>();
const xxr = await remoteDB.allDocs({ startkey: "h:", endkey: `h:\u{10ffff}` });
for (const xxd of xxr.rows) {
const chunk = xxd.id
remoteChunks.set(chunk, xxd.value.rev);
}
const localToRemote = new Map<string, string>([...localChunks]);
const remoteToLocal = new Map<string, string>([...remoteChunks]);
for (const id of new Set([...localChunks.keys(), ...remoteChunks.keys()])) {
if (remoteChunks.has(id)) {
localToRemote.delete(id);
}
if (localChunks.has(id)) {
remoteToLocal.delete(id);
}
}
function arrayToChunkedArray<T>(src: T[], size = 25) {
const ret = [] as T[][];
let i = 0;
while (i < src.length) {
ret.push(src.slice(i, i += size));
}
return ret;
}
if (localToRemote.size == 0) {
Logger(`No chunks need to be sent`, LOG_LEVEL.NOTICE);
} else {
Logger(`${localToRemote.size} chunks need to be sent`, LOG_LEVEL.NOTICE);
if (!dryRun) {
const w = arrayToChunkedArray([...localToRemote]);
for (const chunk of w) {
for (const [id,] of chunk) {
const queryRet = await localDB.allDocs({ keys: [id], include_docs: true });
const docs = queryRet.rows.filter(e => !("error" in e)).map(x => x.doc);
const ret = await remoteDB.bulkDocs(docs, { new_edits: false });
Logger(ret, LOG_LEVEL.VERBOSE);
}
}
Logger(`Done! ${remoteToLocal.size} chunks have been sent`, LOG_LEVEL.NOTICE);
}
}
if (remoteToLocal.size == 0) {
Logger(`No chunks need to be retrieved`, LOG_LEVEL.NOTICE);
} else {
Logger(`${remoteToLocal.size} chunks need to be retrieved`, LOG_LEVEL.NOTICE);
if (!dryRun) {
const w = arrayToChunkedArray([...remoteToLocal]);
for (const chunk of w) {
for (const [id,] of chunk) {
const queryRet = await remoteDB.allDocs({ keys: [id], include_docs: true });
const docs = queryRet.rows.filter(e => !("error" in e)).map(x => x.doc);
const ret = await localDB.bulkDocs(docs, { new_edits: false });
Logger(ret, LOG_LEVEL.VERBOSE);
}
}
Logger(`Done! ${remoteToLocal.size} chunks have been retrieved`, LOG_LEVEL.NOTICE);
}
}
})
}
export const remoteDatabaseCleanup = async (plugin: ObsidianLiveSyncPlugin, dryRun: boolean) => {
const getSize = function (info: PouchDB.Core.DatabaseInfo, key: "active" | "external" | "file") {
return Number.parseInt((info as any)?.sizes?.[key] ?? 0);
}
await runWithLock("clean-up:remote", true, async () => {
const CHUNK_SIZE = 100;
function makeChunkedArrayFromArray<T>(items: T[]): T[][] {
const chunked = [];
for (let i = 0; i < items.length; i += CHUNK_SIZE) {
chunked.push(items.slice(i, i + CHUNK_SIZE));
}
return chunked;
}
try {
const ret = await plugin.replicator.connectRemoteCouchDBWithSetting(plugin.settings, plugin.isMobile);
if (typeof ret === "string") {
Logger(`Connect error: ${ret}`, LOG_LEVEL.NOTICE, "clean-up-db");
return;
}
const info = ret.info;
Logger(JSON.stringify(info), LOG_LEVEL.VERBOSE, "clean-up-db");
Logger(`Database active-size: ${sizeToHumanReadable(getSize(info, "active"))}, external-size:${sizeToHumanReadable(getSize(info, "external"))}, file-size: ${sizeToHumanReadable(getSize(info, "file"))}`, LOG_LEVEL.NOTICE);
if (!dryRun) {
Logger(`The remote database has been locked for garbage collection`, LOG_LEVEL.NOTICE, "clean-up-db");
await plugin.markRemoteLocked(true);
}
Logger(`Gathering chunk usage information`, LOG_LEVEL.NOTICE, "clean-up-db");
const db = ret.db;
const { unreferenced, missing } = await gatherChunkUsage(db);
if (missing.size != 0) {
Logger(`Some chunks are not found! We have to rescue`, LOG_LEVEL.NOTICE);
Logger(missing, LOG_LEVEL.VERBOSE);
} else {
Logger(`All chunks are OK`, LOG_LEVEL.NOTICE);
}
const payload = {} as Record<string, string[]>;
for (const [id, rev] of unreferenced) {
payload[id] = [rev];
}
const removeItems = Object.keys(payload).length;
if (removeItems == 0) {
Logger(`No unreferenced chunks found (Remote)`, LOG_LEVEL.NOTICE);
return;
}
if (dryRun) {
Logger(`There are ${removeItems} unreferenced chunks (Remote)`, LOG_LEVEL.NOTICE);
return;
}
Logger(`Deleting unreferenced chunks: ${removeItems}`, LOG_LEVEL.NOTICE, "clean-up-db");
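// Purge in batches of CHUNK_SIZE entries to keep each _purge request small.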
const buffer = makeChunkedArrayFromArray(Object.entries(payload));
for (const chunkedPayload of buffer) {
const rets = await _requestToCouchDBFetch(
`${plugin.settings.couchDB_URI}/${plugin.settings.couchDB_DBNAME}`,
plugin.settings.couchDB_USER,
plugin.settings.couchDB_PASSWORD,
"_purge",
chunkedPayload.reduce((p, c) => ({ ...p, [c[0]]: c[1] }), {}), "POST");
// const result = await rets();
Logger(JSON.stringify(await rets.json()), LOG_LEVEL.VERBOSE);
}
Logger(`Compacting database...`, LOG_LEVEL.NOTICE, "clean-up-db");
await db.compact();
const endInfo = await db.info();
Logger(`Processed database active-size: ${sizeToHumanReadable(getSize(endInfo, "active"))}, external-size:${sizeToHumanReadable(getSize(endInfo, "external"))}, file-size: ${sizeToHumanReadable(getSize(endInfo, "file"))}`, LOG_LEVEL.NOTICE);
Logger(`Reduced sizes: active-size: ${sizeToHumanReadable(getSize(info, "active") - getSize(endInfo, "active"))}, external-size:${sizeToHumanReadable(getSize(info, "external") - getSize(endInfo, "external"))}, file-size: ${sizeToHumanReadable(getSize(info, "file") - getSize(endInfo, "file"))}`, LOG_LEVEL.NOTICE);
Logger(JSON.stringify(endInfo), LOG_LEVEL.VERBOSE, "clean-up-db");
Logger(`Cleaning up the local database...`);
await localDatabaseCleanUp(plugin, true, false);
} catch (ex) {
Logger("Failed to clean up db.")
Logger(ex, LOG_LEVEL.VERBOSE);
}
});
}

View File

@@ -260,3 +260,14 @@ div.sls-setting-menu-btn {
.password-input > .setting-item-control >input {
-webkit-text-security: disc;
}
span.ls-mark-cr::after {
user-select: none;
content: "↲";
color: var(--text-muted);
font-size: 0.8em;
}
.deleted span.ls-mark-cr::after {
color: var(--text-on-accent);
}

View File

@@ -15,13 +15,15 @@
// "importsNotUsedAsValues": "error",
"importHelpers": false,
"alwaysStrict": true,
"allowImportingTsExtensions": true,
"lib": [
"es2018",
"DOM",
"ES5",
"ES6",
"ES7",
"es2019.array"
"es2019.array",
"ES2020.BigInt",
]
},
"include": [

View File

@@ -1,74 +1,71 @@
### 0.20.0
At 0.20.0, Self-hosted LiveSync has changed the binary file format and the encryption format for more efficient synchronisation.
A dialogue will be shown asking us to decide whether to keep v1 or use v2. Once we have enabled v2, all subsequent edits will be saved in v2. Therefore, devices running 0.19 or below cannot understand this and may report a decryption error. Please update all devices.
Then we will get impressive performance.
Of course, these are very impactful changes. If you have any questions or run into trouble, please feel free to open an issue and mention me.
Note: if you want to roll back to v1, please enable `Use binary and encryption version 1` on the `Hatch` pane and perform `rebuild everything` once.
Extra but notable information:
This format change gives us the ability to detect `marks` in binary files, the same as in text files. Therefore, we can split binary files, and some specific sorts of them (i.e., PDF files), at specific characters. It means that edits in the middle of a file can be detected at those marks.
Now only a few chunks are transferred, even if we add a comment to a PDF or put new files into a ZIP archive.
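To illustrate the idea (a minimal sketch only, not the plugin's actual chunker; `MARK` and `splitAtMark` are hypothetical names), splitting binary content at a marker byte keeps chunk boundaries stable, so an edit in the middle of a file only changes the chunks around it:

```typescript
// Sketch: split binary content at a marker byte so chunk boundaries stay
// stable across edits. MARK and splitAtMark are illustrative names only;
// the real format and marker differ.
const MARK = 0x0a; // assume we split after every line-feed-like byte

function splitAtMark(data: Uint8Array, mark: number = MARK): Uint8Array[] {
    const chunks: Uint8Array[] = [];
    let start = 0;
    for (let i = 0; i < data.length; i++) {
        if (data[i] === mark) {
            // Include the mark itself so concatenating the chunks restores the file.
            chunks.push(data.slice(start, i + 1));
            start = i + 1;
        }
    }
    if (start < data.length) chunks.push(data.slice(start));
    return chunks;
}

// e.g. splitAtMark(new TextEncoder().encode("a\nb\nc")) yields three chunks:
// "a\n", "b\n", "c". Unchanged chunks keep the same content (and thus the same
// ids), so only the edited region produces new chunks to transfer.
```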
#### Version history
- 0.20.7
- Fixed
- For better replication, path obfuscation is now deterministic even with E2EE.
Note: Compatible with the previous database without any conversion. Only new files will be obfuscated deterministically.
- 0.20.6
- Fixed
- Empty files can now be decoded.
- Local files are no longer pre-saved before fetching from a remote database.
- No more deadlocks while applying customisation sync.
- Configurations consisting of multiple files are now applied correctly.
- Folder deletions are now propagated without enabling the use of a trash bin.
- 0.20.5
- Fixed
- Files whose paths have digit or character prefixes are no longer ignored.
- 0.20.4
- Fixed
- The text-input-dialogue is no longer broken.
- Finally, we can use the Setup URI again on mobile.
- 0.20.3
- New feature:
- Vault history: A tab has been implemented to give a bird's-eye view of the changes that have occurred in the vault.
- Improved:
- The passphrases on the dialogue are now masked out. Thank you @antoKeinanen!
- The log dialogue is now shown as one of the tabs.
- We can launch Customization sync from the Ribbon if we have enabled it.
- Fixed:
- Some minor issues have been fixed.
- Setup URI is now back to the previous spec; it is encrypted by V1.
- It may avoid the trouble with iOS 17.
- The Settings dialogue is now registered at the beginning of the start-up process.
- We can change the configuration even if LiveSync could not be launched normally.
- Improved:
- Enumerating documents has become faster.
- 0.20.2
- New feature:
- We can delete all data of customization sync from the `Delete all customization sync data` on the `Hatch` pane.
- Fixed:
- Prevented iOS from repeatedly restarting by yielding microtasks.
- 0.20.1
- Fixed:
- No more UI freezing and repeated restarting on iOS.
- Diffs of non-markdown documents are now shown correctly.
- Improved:
- Performance has been improved a bit.
- Customization sync has gotten faster.
- However, we lost forward compatibility again (only for this feature). Please update all devices.
- Misc
- Terser configuration has been more aggressive.
- 0.20.0
- Improved:
- A new binary file handling has been implemented
- A new encrypted format has been implemented
- Chunk sizes will now be adjusted for efficient sync
- Fixed:
- Exception levels in some logs have been fixed
- Tidied:
- Some Lint warnings have been suppressed.
... Continued in `updates_old.md`.

View File

@@ -1,3 +1,194 @@
### 0.19.0
#### Customization sync
Since `Plugin and their settings` had been broken, I tried to fix it; not just fix it, but fix it the way it should be.
Now, we have `Customization sync`.
It is a real shame that the compatibility between these features has been broken. However, this new feature is surely useful and I believe it is worth getting over the pain.
We can use the new feature with the same configuration. Only the menu on the command palette has been changed. The dialog can be opened by `Show customization sync dialog`.
I hope you will give it a try.
#### Minors
- 0.19.1
- Fixed: Fixed hidden file handling on Linux
- Improved: Now customization sync works more smoothly.
- 0.19.2
- Fixed:
- Fixed a garbage collection error when many unreferenced chunks exist.
- Fixed filename validation on Linux.
- Improved:
- Status display is now thinned out for performance.
- Enhanced caching while collecting chunks.
- 0.19.3
- Improved:
- Now replication will be paced by collecting chunks. If synchronisation has been deadlocked, please enable `Do not pace synchronization` once.
- 0.19.4
- Improved:
- Reduced remote database checking to improve speed and reduce bandwidth.
- Fixed:
- Chunks which were previously misinterpreted are now interpreted correctly.
- No more chunks reported as missing forever, unless they are actually missing.
- Deleted file detection during hidden file synchronisation now works fine.
- The Customisation sync now stays quiet while it is disabled.
- 0.19.5
- Fixed:
- Hidden file synchronisation no longer hangs, even if many files exist.
- Improved:
- Customisation sync works more smoothly.
- Note: Concurrent processing has been rolled back to the original implementation. As a result, the total number of processes is no longer shown next to the hourglass icon; only the processes that are running concurrently are shown.
- 0.19.6
- Fixed:
- Logging has been tweaked.
- No more too many planes and rockets.
- The batch database update now surely only works in non-live mode.
- Internal things:
- Some frameworks have been upgraded.
- Import declaration has been fixed.
- Improved:
- The plug-in now asks to enable a new adaptor, when rebuilding, if it is not enabled yet.
- The settings dialogue has been refined.
- Configurations for compatibility have been moved under the `Hatch` pane.
- Made it clear that disabled is the default.
- Ambiguously named configurations have been renamed.
- Items that have no meaning in the settings are no longer displayed.
- Some items have been reordered for clarity.
- Each configuration has been grouped.
- 0.19.7
- Fixed:
- The initial pane of the Settings dialogue has been changed to General Settings.
- The Setup Wizard is now able to flush existing settings and enter the wizard mode again.
- 0.19.8
- New feature:
- Vault history: A tab has been implemented to give a bird's-eye view of the changes that have occurred in the vault.
- Improved:
- The passphrases on the dialogue are now masked out. Thank you @antoKeinanen!
- The log dialogue is now shown as one of the tabs.
- Fixed:
- Some minor issues have been fixed.
- 0.19.9
- New feature (For fixing a problem):
- We can fix the database when obfuscated and plain paths have been mixed up.
- Improvements
- Customisation Sync performance has been improved.
- 0.19.10
- Fixed
- Fixed the issue with fixing the database.
- 0.19.11
- Improvements:
- Hashing ChunkID has been improved.
- Logging keeps 400 lines now.
- Refactored:
- Import statements for types have been fixed.
- 0.19.12
- Improved:
- Boot-up performance has been improved.
- Customisation sync performance has been improved.
- Synchronising performance has been improved.
- 0.19.13
- Implemented:
- Database clean-up is now in beta 2!
We can shrink the remote database by deleting unused chunks, while keeping history.
Note: The local database is not cleaned up completely. We have to `Fetch` again to finish it.
**Note2**: Still in beta. Please back up your vault before doing anything.
- Fixed:
- The log updates are no longer thinned out.
- 0.19.14
- Fixed:
- Internal documents are now ignored.
- The merge dialogue now responds immediately to button presses.
- Periodic processing now works fine.
- The checking interval for detecting conflicts has become shorter.
- Replication is now cancelled while cleaning up.
- The database lock set by the clean-up is now carefully released.
- The missing chunks message is now correctly reported.
- New feature:
- Suspending database reflection has been implemented.
- This can be disabled by `Fetch database with previous behaviour`.
- Fetch now temporarily suspends reflecting database and storage changes to improve performance.
- We can choose the action to take when the remote database has been cleaned up.
- The merge dialogue now shows `↲` before the new line.
- Improved:
- Progress is now reported during the clean-up and fetch process.
- Cancelled replication is now detected.
- 0.19.15
- Fixed:
- Storing files after cleaning up now works correctly.
- Improved:
- Cleaning up the local database has become incredibly fast.
Now we can clean up instead of fetching again when synchronising with a remote which has been cleaned up.
- 0.19.16
- Many upgrades in this release. I have tried not to break anything, but if something got corrupted, please feel free to notify me.
- New feature:
- (Beta) ignore files handling
We can use `.gitignore`, `.dockerignore`, and anything you like to filter the synchronising files.
- Fixed:
- Buttons on the lock-detected dialogue can now be shown on narrow-width devices.
- Improved:
- Some constants have been flattened so they can be evaluated.
- The usage of deprecated Obsidian APIs has been reduced.
- The indexedDB adapter will now be enabled while importing the configuration.
- Misc:
- Compiler, framework, and dependencies have been upgraded.
- To withstand the impact of these (especially esbuild and svelte), terser has been introduced.
Feel free to share your opinion with me! I do not like obfuscating the code either.
- 0.19.17
- Fixed:
- Nested ignore files can now be parsed correctly.
- The unexpected deletion of hidden files in some cases has been corrected.
- Hidden file changes are no longer reflected back onto the device which made the change.
- Behaviour changed:
- From this version, files which have `:` in their names will be ignored even on Linux devices.
- 0.19.18
- Fixed:
- Empty (or deleted) files can now be conflict-resolved.
- 0.19.19
- Fixed:
- Resolving conflicted revisions has become more robust.
- LiveSync now tries to keep local changes when fetching from a rebuilt remote database.
Local changes are kept as a revision, and fetched content becomes new revisions.
- Now, all files will be restored immediately after performing `fetch`.
- 0.19.20
- New feature:
- `Sync on Editor save` has been implemented
- We can start synchronisation when we explicitly save from Obsidian.
- Now we can use `Hidden file sync` and `Customization sync` cooperatively.
- We can exclude files from `Hidden file sync` which are already handled in Customization sync.
- We can ignore specific plugins in Customization sync.
- Now the message of leftover conflicted files accepts our click.
- We can open `Resolve all conflicted files` in an instant.
- Refactored:
- Parallelism functions made more explicit.
- Type errors have been reduced.
- Fixed:
- Documents will no longer be overwritten if they are conflicted.
They will be saved as a new conflicted revision.
- Some error messages have been fixed.
- Missing dialogue titles have been shown now.
- We can click close buttons on mobile now.
- Conflicted Customisation sync files will be resolved automatically by their modified time.
- 0.19.21
- Fixed:
- Hidden files are no longer handled in the initial replication.
- The report from `Making report` has been fixed
- No longer contains customisation sync information.
- Version of LiveSync has been added.
- 0.19.22
- Fixed:
- Now the synchronisation will begin without our interaction.
- No longer puts the configuration of the remote database into the log while checking configuration.
- Some outdated description notes have been removed.
- Options that are meaningless given other configured settings are now hidden.
- Scan for hidden files before replication
- Scan customization periodically
- 0.19.23
- Improved:
- We can now open the log pane from the command palette as well.
- Now, the hidden file scanning interval can be configured to 0.
- `Check database configuration` now points out when we do not have administrator permission.
### 0.18.0