Mirror of https://github.com/vrtmrz/obsidian-livesync.git (synced 2026-02-22 20:18:48 +00:00)
Compare commits
80 Commits
| SHA1 |
|---|
| 5e75917b8d |
| 3322d13b55 |
| 851c9f8a71 |
| b02596dfa1 |
| 02c69b202e |
| 6b2c7b56a5 |
| 820168a5ab |
| 40015642e4 |
| 7a5cffb6a8 |
| e395e53248 |
| 97f91b1eb0 |
| 2f4159182e |
| 302a4024a8 |
| bc17f4f70d |
| 6f33d23088 |
| 4998e2ef0b |
| f5e0b826a6 |
| 3a3f79bb99 |
| 9efb6ed0c1 |
| 6b7956ab67 |
| 58196c2423 |
| 3940260d42 |
| b16333c604 |
| 7bf6d1f663 |
| 7046928068 |
| 333fcbaaeb |
| 009f92c307 |
| 3e541bd061 |
| 52d08301cc |
| 49d4c239f2 |
| 748d031b36 |
| dbe77718c8 |
| f334974cc3 |
| 8f2ae437c6 |
| a0efda9e71 |
| be3d61c1c7 |
| b24c4ef55b |
| ff850b48ca |
| ad3860ac40 |
| 437b7ebae1 |
| e3305c24e1 |
| d008e2a1d0 |
| bb8c7eb043 |
| e61bebd3ee |
| 99594fe517 |
| 972d208af4 |
| 2c36ec497c |
| 677895547c |
| aec0b2986b |
| e6025b92d8 |
| fad9fed5ca |
| e46246cd63 |
| 0f3be19dd7 |
| bc568ff479 |
| 2fdc7669f3 |
| ec8d9785ed |
| 71a80cacc3 |
| 38daeca89f |
| 7ec64a6a93 |
| c5c6deb742 |
| ef57fbfdda |
| bc158e9f2b |
| 6513c53c7e |
| 5d1074065c |
| b444082b0c |
| d5e6419504 |
| 1bf1e1540d |
| be1e6b11ac |
| a486788572 |
| e5784a1da6 |
| 2100e22276 |
| ec08dc5fe8 |
| c92e94e552 |
| c7db8592c6 |
| fc3617d9f9 |
| 34c1b040db |
| 6b85aecafe |
| 4dabadd5ea |
| 0619c96c48 |
| b0f612b61c |
@@ -1,3 +1,4 @@
 node_modules
 build
 .eslintrc.js.bak
+src/lib/src/patches/pouchdb-utils
27 .eslintrc
@@ -1,19 +1,34 @@
 {
     "root": true,
     "parser": "@typescript-eslint/parser",
-    "plugins": ["@typescript-eslint"],
-    "extends": ["eslint:recommended", "plugin:@typescript-eslint/eslint-recommended", "plugin:@typescript-eslint/recommended"],
+    "plugins": [
+        "@typescript-eslint"
+    ],
+    "extends": [
+        "eslint:recommended",
+        "plugin:@typescript-eslint/eslint-recommended",
+        "plugin:@typescript-eslint/recommended"
+    ],
     "parserOptions": {
-        "sourceType": "module"
+        "sourceType": "module",
+        "project": [
+            "tsconfig.json"
+        ]
     },
     "rules": {
         "no-unused-vars": "off",
-        "@typescript-eslint/no-unused-vars": ["error", { "args": "none" }],
+        "@typescript-eslint/no-unused-vars": [
+            "error",
+            {
+                "args": "none"
+            }
+        ],
         "@typescript-eslint/ban-ts-comment": "off",
         "no-prototype-builtins": "off",
         "@typescript-eslint/no-empty-function": "off",
         "require-await": "warn",
         "no-async-promise-executor": "off",
-        "@typescript-eslint/no-explicit-any": "off"
+        "@typescript-eslint/no-explicit-any": "off",
+        "@typescript-eslint/no-unnecessary-type-assertion": "error"
     }
 }
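Of the new rules, `@typescript-eslint/no-unnecessary-type-assertion` is the one that needs the type-aware `project` setting added above. A hypothetical snippet showing what it flags (illustrative only, not code from this repository):

```ts
function greet(name: string): string {
    // Flagged by @typescript-eslint/no-unnecessary-type-assertion:
    // `name` is already a string, so the assertion changes nothing.
    return "Hello, " + (name as string);
}

function parseVersion(raw: unknown): string {
    // Not flagged: this assertion narrows `unknown` to a concrete type.
    return raw as string;
}
```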
37 README.md
@@ -2,7 +2,7 @@
 
 [Japanese docs](./README_ja.md) [Chinese docs](./README_cn.md).
 
-Self-hosted LiveSync is a community implemented synchronization plugin.
+Self-hosted LiveSync is a community-implemented synchronization plugin.
 A self-hosted or purchased CouchDB acts as the intermediate server. Available on every obsidian-compatible platform.
 
 Note: It has no compatibility with the official "Obsidian Sync".
@@ -16,19 +16,19 @@ Before installing or upgrading LiveSync, please back your vault up.
 - Visual conflict resolver included.
 - Bidirectional synchronization between devices nearly in real-time
 - You can use CouchDB or its compatibles like IBM Cloudant.
-- End-to-End encryption supported.
+- End-to-End encryption is supported.
 - Plugin synchronization(Beta)
 - Receive WebClip from [obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf) (End-to-End encryption will not be applicable.)
 
-Useful for researchers, engineers and developers with a need to keep their notes fully self-hosted for security reasons. Or just anyone who would like the peace of mind knowing that their notes are fully private.
+Useful for researchers, engineers and developers with a need to keep their notes fully self-hosted for security reasons. Or just anyone who would like the peace of mind of knowing that their notes are fully private.
 
 ## IMPORTANT NOTICE
 
-- Do not use in conjunction with another synchronization solution (including iCloud, Obsidian Sync). Before enabling this plugin, make sure to disable all the other synchronization methods to avoid content corruption or duplication. If you want to synchronize to two or more services, do them one by one and never enable two synchronization methods at the same time.
+- Do not enable this plugin with another synchronization solution at the same time (including iCloud and Obsidian Sync). Before enabling this plugin, make sure to disable all the other synchronization methods to avoid content corruption or duplication. If you want to synchronize to two or more services, do them one by one and never enable two synchronization methods at the same time.
   This includes not putting your vault inside a cloud-synchronized folder (eg. an iCloud folder or Dropbox folder)
-- This is a synchronization plugin. Not a backup solutions. Do not rely on this for backup.
+- This is a synchronization plugin. Not a backup solution. Do not rely on this for backup.
 - If the device's storage runs out, database corruption may happen.
-- Hidden files or any other invisible files wouldn't be kept in the database, thus won't be synchronized. (**and may also get deleted**)
+- Hidden files or any other invisible files wouldn't be kept in the database, and thus won't be synchronized. (**and may also get deleted**)
 
 ## How to use
 
@@ -38,7 +38,7 @@ First, get your database ready. IBM Cloudant is preferred for testing. Or you ca
 1. [Setup IBM Cloudant](docs/setup_cloudant.md)
 2. [Setup your CouchDB](docs/setup_own_server.md)
 
-Note: More information about alternative hosting methods needed! Currently, [using fly.io](https://github.com/vrtmrz/obsidian-livesync/discussions/85) is being discussed.
+Note: More information about alternative hosting methods is needed! Currently, [using fly.io](https://github.com/vrtmrz/obsidian-livesync/discussions/85) is being discussed.
 
 ### Configure the plugin
 
@@ -46,17 +46,18 @@ See [Quick setup guide](doccs/../docs/quick_setup.md)
 
 ## Something looks corrupted...
 
-Please open the configuration link again and Answer as below:
+Please open the configuration link again and Answer below:
 - If your local database looks corrupted (in other words, when your Obsidian getting weird even standalone.)
   - Answer `No` to `Keep local DB?`
 - If your remote database looks corrupted (in other words, when something happens while replicating)
   - Answer `No` to `Keep remote DB?`
 
-If you answered `No` to both, your databases will be rebuilt by the content on your device. And the remote database will lock out other devices. You have to synchronize all your devices again. (When this time, almost all your files should be synchronized with a timestamp. So you can use a existed vault).
+If you answered `No` to both, your databases will be rebuilt by the content on your device. And the remote database will lock out other devices. You have to synchronize all your devices again. (When this time, almost all your files should be synchronized with a timestamp. So you can use an existing vault).
 
 ## Test Server
 
-Setting up an instance of Cloudant or local CouchDB is a little complicated, so I set up a [Tasting server for self-hosted-livesync](https://olstaste.vrtmrz.net/). Try it out for free!
+~~Setting up an instance of Cloudant or local CouchDB is a little complicated, so I set up a [Tasting server for self-hosted-livesync](https://olstaste.vrtmrz.net/). Try it out for free!~~
+Now (30 May 2023) suspending while the server transfer.
 Note: Please read "Limitations" carefully. Do not send your private vault.
 
 ## Information in StatusBar
@@ -78,18 +79,20 @@ If you have deleted or renamed files, please wait until ⏳ icon disappeared.
 ## Hints
 - If a folder becomes empty after a replication, it will be deleted by default. But you can toggle this behaviour. Check the [Settings](docs/settings.md).
 - LiveSync mode drains more batteries in mobile devices. Periodic sync with some automatic sync is recommended.
-- Mobile Obsidian can not connect to a non-secure (HTTP) or a locally-signed servers, even if the root certificate is installed on the device.
+- Mobile Obsidian can not connect to non-secure (HTTP) or locally-signed servers, even if the root certificate is installed on the device.
 - There are no 'exclude_folders' like configurations.
 - While synchronizing, files are compared by their modification time and the older ones will be overwritten by the newer ones. Then plugin checks for conflicts and if a merge is needed, a dialog will open.
-- Rarely, a file in the database could be corrupted. The plugin will not write to local storage when a file looks corrupted. If a local version of the file is on your device, the corruption could be fixed by editing the local file and synchronizing it. But if the file does not exist on any of your devices, then it can not be rescued. In this case you can delete these items from the settings dialog.
-- To stop the boot up sequence (eg. for fixing problems on databases), you can put a `redflag.md` file at the root of your vault.
-- Q: Database is growing, how can I shrink it down?
-  A: each of the docs is saved with their past 100 revisions for detecting and resolving conflicts. Picturing that one device has been offline for a while, and comes online again. The device has to compare its notes with the remotely saved ones. If there exists a historic revision in which the note used to be identical, it could be updated safely (like git fast-forward). Even if that is not in revision histories, we only have to check the differences after the revision that both devices commonly have. This is like git's conflict resolving method. So, We have to make the database again like an enlarged git repo if you want to solve the root of the problem.
-- And more technical Information are in the [Technical Information](docs/tech_info.md)
+- Rarely, a file in the database could be corrupted. The plugin will not write to local storage when a file looks corrupted. If a local version of the file is on your device, the corruption could be fixed by editing the local file and synchronizing it. But if the file does not exist on any of your devices, then it can not be rescued. In this case, you can delete these items from the settings dialog.
+- To stop the boot-up sequence (eg. for fixing problems on databases), you can put a `redflag.md` file (or directory) at the root of your vault.
+  Tip for iOS: a redflag directory can be created at the root of the vault using the File application.
+- Also, with `redflag2.md` placed, we can automatically rebuild both the local and the remote databases during the boot-up sequence. With `redflag3.md`, we can discard only the local database and fetch from the remote again.
+- Q: The database is growing, how can I shrink it down?
+  A: each of the docs is saved with their past 100 revisions for detecting and resolving conflicts. Picturing that one device has been offline for a while, and comes online again. The device has to compare its notes with the remotely saved ones. If there exists a historic revision in which the note used to be identical, it could be updated safely (like git fast-forward). Even if that is not in revision histories, we only have to check the differences after the revision that both devices commonly have. This is like git's conflict-resolving method. So, We have to make the database again like an enlarged git repo if you want to solve the root of the problem.
+- And more technical Information is in the [Technical Information](docs/tech_info.md)
 - If you want to synchronize files without obsidian, you can use [filesystem-livesync](https://github.com/vrtmrz/filesystem-livesync).
 - WebClipper is also available on Chrome Web Store:[obsidian-livesync-webclip](https://chrome.google.com/webstore/detail/obsidian-livesync-webclip/jfpaflmpckblieefkegjncjoceapakdf)
 
-Repo is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). (Docs are work in progress.)
+Repo is here: [obsidian-livesync-webclip](https://github.com/vrtmrz/obsidian-livesync-webclip). (Docs are a work in progress.)
 
 ## License
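The revision-history answer in the Hints above can be made concrete with a small sketch. Assuming a plain PouchDB handle and hypothetical names (this is not the plugin's code), finding the newest still-available revision older than a conflicting one looks roughly like this:

```ts
import PouchDB from "pouchdb";

// Sketch only: locate a common base revision for a git-like fast-forward
// check. `db`, `id` and `conflictedRev` are assumed inputs.
async function findCommonBase(db: PouchDB.Database, id: string, conflictedRev: string): Promise<string | undefined> {
    const conflictedNo = Number(conflictedRev.split("-")[0]);
    // revs_info lists revisions newest-first, each with an availability status.
    const doc = await db.get(id, { revs_info: true });
    return (doc._revs_info ?? [])
        .filter((e) => e.status === "available" && Number(e.rev.split("-")[0]) < conflictedNo)
        .map((e) => e.rev)[0];
}
```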
@@ -1,5 +1,5 @@
 # Quick setup
-The Setup wizard has been implemented since v0.15.0. This simplifies the initial set-up.
+The Setup wizard has been implemented since v0.15.0. This simplifies the initial setup.
 
 Note: The subsequent devices should be set up using the `Copy setup URI` and `Open setup URI`.
 
@@ -34,18 +34,18 @@ Enter the information in the database we have set up.
 
 
 
-If End to End encryption is enabled, the possibility of a third party who does not know the Passphrase being able to read the contents of the Remote database in the event that they are leaked is reduced. So we strongly recommend to enable it.
+If End to End encryption is enabled, the possibility of a third party who does not know the Passphrase being able to read the contents of the Remote database if they are leaked is reduced. So we strongly recommend enabling it.
 Encryption is based on 256-bit AES-GCM.
 This setting can be disabled if you are inside a closed network and it is clear that you will not be accessed by third parties.
 
-### Test database connectionとCheck database configuraion
+### Test database connection and Check database configuration
 
 Here we can check the status of the connection to the database and the database settings.
 
 
 
 #### Test Database Connection
-Check whether we can connect to the database. If it fails, there are a number of reasons, but once you have done the `Check database configuration`, check if it fails there too.
+Check whether we can connect to the database. If it fails, there are several reasons, but once you have done the `Check database configuration`, check if it fails there too.
 
 #### Check database configuration
 
@@ -64,11 +64,11 @@ Go to the Local Database configuration.
 ### Discard exist database and proceed
 Discard the contents of the Remote database and go to the Local Database configuration.
 
-## Local Database confiuration
+## Local Database configuration
 
 
 
-Configure the local database. If we already have a Vaults with Self-hosted LiveSync installed and having same directory name as currently we are setting up, please specify a different suffix than the Vault you have already set up here.
+Configure the local database. If we already have a Vaults with Self-hosted LiveSync installed and having the same directory name as currently we are setting up, please specify a different suffix than the Vault you have already set up here.
 
 ## Miscellaneous
 Finally, finish the miscellaneous configurations and select a preset for synchronisation.
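For reference, the 256-bit AES-GCM mentioned above is available in every Obsidian-capable runtime through WebCrypto. The following is a minimal sketch of that primitive only; the key-derivation parameters and encoding here are assumptions, and the plugin's real implementation (in `lib/src`) differs:

```ts
// Minimal AES-256-GCM sketch (illustrative, not the plugin's actual scheme).
async function encryptWithPassphrase(plain: string, passphrase: string): Promise<{ iv: Uint8Array; cipher: ArrayBuffer }> {
    const enc = new TextEncoder();
    const baseKey = await crypto.subtle.importKey("raw", enc.encode(passphrase), "PBKDF2", false, ["deriveKey"]);
    // PBKDF2 salt and iteration count here are placeholders.
    const key = await crypto.subtle.deriveKey(
        { name: "PBKDF2", salt: enc.encode("example-salt"), iterations: 100000, hash: "SHA-256" },
        baseKey,
        { name: "AES-GCM", length: 256 },
        false,
        ["encrypt"],
    );
    const iv = crypto.getRandomValues(new Uint8Array(12)); // 96-bit nonce, unique per message
    const cipher = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, enc.encode(plain));
    return { iv, cipher };
}
```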
@@ -17,14 +17,14 @@ If you feel something, please feel free to inform me.
 | 🚑 | [Corrupted data](#corrupted-data) |
 
 ## Remote Database Configurations
-Configure settings of synchronize server. If any synchronization is enabled, you can't edit this section. Please disable all synchronization to change.
+Configure the settings of synchronize server. If any synchronization is enabled, you can't edit this section. Please disable all synchronization to change.
 
 ### URI
 URI of CouchDB. In the case of Cloudant, It's "External Endpoint(preferred)".
 **Do not end it up with a slash** when it doesn't contain the database name.
 
 ### Username
-Your CouchDB's Username. With administrator's privilege is preferred.
+Your CouchDB's Username. Administrator's privilege is preferred.
 
 ### Password
 Your CouchDB's Password.
@@ -47,11 +47,11 @@ The passphrase to used as the key of encryption. Please use the long text.
 
 ### Apply
 Set the End to End encryption enabled and its passphrase for use in replication.
-If you change the passphrase of a existing database, overwriting the remote database is strongly recommended.
+If you change the passphrase of an existing database, overwriting the remote database is strongly recommended.
 
 
 ### Overwrite remote database
-Overwrite the remote database by the local database using the passphrase you applied.
+Overwrite the remote database with the local database using the passphrase you applied.
 
 
 ### Rebuild
@@ -61,17 +61,17 @@ Rebuild remote and local databases with local files. It will delete all document
 You can check the connection by clicking this button.
 
 ### Check database configuration
-You can check and modify your CouchDB's configuration from here directly.
+You can check and modify your CouchDB configuration from here directly.
 
 ### Lock remote database.
 Other devices are banned from the database when you have locked the database.
-If you have something troubled with other devices, you can protect the vault and remote database by your device.
+If you have something troubled with other devices, you can protect the vault and remote database with your device.
 
 ## Local Database Configurations
 "Local Database" is created inside your obsidian.
 
 ### Batch database update
-Delay database update until raise replication, open another file, window visibility changed, or file events except for file modification.
+Delay database update until raise replication, open another file, window visibility changes, or file events except for file modification.
 This option can not be used with LiveSync at the same time.
 
@@ -81,23 +81,23 @@ If one device rebuilds or locks the remote database, every other device will be
 ### minimum chunk size and LongLine threshold
 The configuration of chunk splitting.
 
-Self-hosted LiveSync splits the note into chunks for efficient synchronization. This chunk should be longer than "Minimum chunk size".
+Self-hosted LiveSync splits the note into chunks for efficient synchronization. This chunk should be longer than the "Minimum chunk size".
 
 Specifically, the length of the chunk is determined by the following orders.
 
 1. Find the nearest newline character, and if it is farther than LongLineThreshold, this piece becomes an independent chunk.
-2. If not, find nearest to these items.
-   1. Newline character
-   2. Empty line (Windows style)
-   3. Empty line (non-Windows style)
-3. Compare the farther in these 3 positions and next "\[newline\]#" position, pick a shorter piece to as chunk.
+2. If not, find the nearest to these items.
+   1. A newline character
+   2. An empty line (Windows style)
+   3. An empty line (non-Windows style)
+3. Compare the farther in these 3 positions and the next "\[newline\]#" position, and pick a shorter piece as a chunk.
 
 This rule was made empirically from my dataset. If this rule acts as badly on your data. Please give me the information.
 
-You can dump saved note structure to `Dump informations of this doc`. Replace every character to x except newline and "#" when sending information to me.
+You can dump saved note structure to `Dump informations of this doc`. Replace every character with x except newline and "#" when sending information to me.
 
-Default values are 20 letters and 250 letters.
+The default values are 20 letters and 250 letters.
 
 ## General Settings
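As a rough sketch, the splitting order described above might look like the following. All names and cut-point details are assumptions for illustration; the plugin's actual splitter lives in `lib/src` and handles more cases:

```ts
// Illustrative chunk splitter following the three rules above.
const MINIMUM_CHUNK_SIZE = 20;   // default: 20 letters
const LONG_LINE_THRESHOLD = 250; // default: 250 letters

function splitIntoChunks(text: string): string[] {
    const chunks: string[] = [];
    let rest = text;
    while (rest.length > 0) {
        let cut: number;
        const newline = rest.indexOf("\n", MINIMUM_CHUNK_SIZE);
        if (newline === -1 || newline > LONG_LINE_THRESHOLD) {
            // Rule 1: no nearby newline, so this piece becomes an independent chunk.
            cut = Math.min(rest.length, LONG_LINE_THRESHOLD);
        } else {
            // Rule 2: nearest newline, Windows-style empty line, non-Windows empty line.
            const positions = [
                newline,
                rest.indexOf("\r\n\r\n", MINIMUM_CHUNK_SIZE),
                rest.indexOf("\n\n", MINIMUM_CHUNK_SIZE),
            ].filter((p) => p >= 0);
            const farthest = Math.max(...positions) + 1;
            // Rule 3: compare with the next "[newline]#" position and take the shorter piece.
            const heading = rest.indexOf("\n#", MINIMUM_CHUNK_SIZE);
            cut = heading >= 0 ? Math.min(farthest, heading + 1) : farthest;
        }
        chunks.push(rest.slice(0, cut));
        rest = rest.slice(cut);
    }
    return chunks;
}
```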
@@ -1,7 +1,7 @@
 # Setup CouchDB to your server
 
 
-## Install CouchDB and access from PC or Mac
+## Install CouchDB and access from a PC or Mac
 
 The easiest way to set up the CouchDB is using the [docker image]((https://hub.docker.com/_/couchdb)).
 
@@ -39,7 +39,7 @@ $ docker run --rm -it -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=password -v /pat
 *Remember to replace the path with the path to your local.ini*
 Note: At this time, the file owner of local.ini became 5984:5984. It's the limitation docker image. please change the owner before editing local.ini again.
 
-If you could confirm that Self-hosted LiveSync can sync with the server, launch docker image as background as you like.
+If you could confirm that Self-hosted LiveSync can sync with the server, launch the docker image as a background as you like.
 
 Example to run docker in detached mode:
 ```
@@ -47,10 +47,10 @@ $ docker run -d --restart always -e COUCHDB_USER=admin -e COUCHDB_PASSWORD=passw
 ```
 *Remember to replace the path with the path to your local.ini*
 
-## Access from mobile device
+## Access from a mobile device
 If you want to access Self-hosted LiveSync from mobile devices, you need a valid SSL certificate.
 
-### Testing from mobile
+### Testing from a mobile
 In the testing phase, [localhost.run](http://localhost.run/) or something like services is very useful.
 
 example on using localhost.run)
@@ -94,6 +94,6 @@ Set the A record of your domain to point to your server, and host reverse proxy
 Note: Mounting CouchDB on the top directory is not recommended.
 Using Caddy is a handy way to serve the server with SSL automatically.
 
-I have published [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) that launches Caddy and CouchDB at once. Please try it out.
+I have published [docker-compose.yml and ini files](https://github.com/vrtmrz/self-hosted-livesync-server) that launch Caddy and CouchDB at once. Please try it out.
 
 And, be sure to check the server log and be careful of malicious access.
@@ -25,14 +25,16 @@ esbuild
         "MANIFEST_VERSION": `"${manifestJson.version}"`,
         "PACKAGE_VERSION": `"${packageJson.version}"`,
         "UPDATE_INFO": `${updateInfo}`,
+        "global": "window",
     },
-    external: ["obsidian", "electron", ...builtins],
+    external: ["obsidian", "electron", "crypto"],
     format: "cjs",
     watch: !prod,
     target: "es2018",
     logLevel: "info",
     sourcemap: prod ? false : "inline",
     treeShaking: true,
+    platform: "browser",
     plugins: [
         sveltePlugin({
             preprocess: sveltePreprocess(),
@@ -1,7 +1,7 @@
 {
     "id": "obsidian-livesync",
     "name": "Self-hosted LiveSync",
-    "version": "0.17.4",
+    "version": "0.18.4",
     "minAppVersion": "0.9.12",
     "description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
     "author": "vorotamoroz",
2266 package-lock.json (generated)
File diff suppressed because it is too large.
32 package.json
@@ -1,6 +1,6 @@
 {
     "name": "obsidian-livesync",
-    "version": "0.17.4",
+    "version": "0.18.4",
     "description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
     "main": "main.js",
     "type": "module",
@@ -16,21 +16,31 @@
         "@types/diff-match-patch": "^1.0.32",
         "@types/pouchdb": "^6.4.0",
         "@types/pouchdb-browser": "^6.1.3",
-        "@typescript-eslint/eslint-plugin": "^5.44.0",
-        "@typescript-eslint/parser": "^5.44.0",
+        "@typescript-eslint/eslint-plugin": "^5.54.0",
+        "@typescript-eslint/parser": "^5.54.0",
         "builtin-modules": "^3.3.0",
         "esbuild": "0.15.15",
         "esbuild-svelte": "^0.7.3",
-        "eslint": "^8.28.0",
+        "eslint": "^8.35.0",
         "eslint-config-airbnb-base": "^15.0.0",
-        "eslint-plugin-import": "^2.26.0",
-        "obsidian": "^0.16.3",
-        "postcss": "^8.4.19",
+        "eslint-plugin-import": "^2.27.5",
+        "events": "^3.3.0",
+        "obsidian": "^1.1.1",
+        "postcss": "^8.4.21",
         "postcss-load-config": "^4.0.1",
-        "svelte": "^3.53.1",
-        "svelte-preprocess": "^4.10.7",
-        "tslib": "^2.4.1",
-        "typescript": "^4.9.3"
+        "pouchdb-adapter-http": "^8.0.1",
+        "pouchdb-adapter-idb": "^8.0.1",
+        "pouchdb-adapter-indexeddb": "^8.0.1",
+        "pouchdb-core": "^8.0.1",
+        "pouchdb-find": "^8.0.1",
+        "pouchdb-mapreduce": "^8.0.1",
+        "pouchdb-replication": "^8.0.1",
+        "pouchdb-utils": "^8.0.1",
+        "svelte": "^3.55.1",
+        "svelte-preprocess": "^5.0.1",
+        "transform-pouch": "^2.0.0",
+        "tslib": "^2.5.0",
+        "typescript": "^4.9.5"
     },
     "dependencies": {
         "diff-match-patch": "^1.0.5",
725 src/CmdHiddenFileSync.ts Normal file
@@ -0,0 +1,725 @@
import { Notice, normalizePath, PluginManifest } from "./deps";
import { EntryDoc, LoadedEntry, LOG_LEVEL, InternalFileEntry, FilePathWithPrefix, FilePath } from "./lib/src/types";
import { InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
import { delay, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask, trimPrefix, isIdOfInternalMetadata, PeriodicProcessor } from "./utils";
import { WrappedNotice } from "./lib/src/wrapper";
import { base64ToArrayBuffer, arrayBufferToBase64 } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { Semaphore } from "./lib/src/semaphore";
import { JsonResolveModal } from "./JsonResolveModal";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { addPrefix, stripAllPrefixes } from "./lib/src/path";

export class HiddenFileSync extends LiveSyncCommands {
    periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
    confirmPopup: WrappedNotice = null;
    get kvDB() {
        return this.plugin.kvDB;
    }
    ensureDirectoryEx(fullPath: string) {
        return this.plugin.ensureDirectoryEx(fullPath);
    }
    getConflictedDoc(path: FilePathWithPrefix, rev: string) {
        return this.plugin.getConflictedDoc(path, rev);
    }
    onunload() {
        this.periodicInternalFileScanProcessor?.disable();
    }
    onload(): void | Promise<void> {
        this.plugin.addCommand({
            id: "livesync-scaninternal",
            name: "Sync hidden files",
            callback: () => {
                this.syncInternalFilesAndDatabase("safe", true);
            },
        });
    }
    async onInitializeDatabase(showNotice: boolean) {
        if (this.settings.syncInternalFiles) {
            try {
                Logger("Synchronizing hidden files...");
                await this.syncInternalFilesAndDatabase("push", showNotice);
                Logger("Synchronizing hidden files done");
            } catch (ex) {
                Logger("Synchronizing hidden files failed");
                Logger(ex, LOG_LEVEL.VERBOSE);
            }
        }
    }
    async beforeReplicate(showNotice: boolean) {
        if (this.localDatabase.isReady && this.settings.syncInternalFiles && this.settings.syncInternalFilesBeforeReplication && !this.settings.watchInternalFileChanges) {
            await this.syncInternalFilesAndDatabase("push", showNotice);
        }
    }
    async onResume() {
        this.periodicInternalFileScanProcessor?.disable();
        if (this.plugin.suspended)
            return;
        if (this.settings.syncInternalFiles) {
            await this.syncInternalFilesAndDatabase("safe", false);
        }
        this.periodicInternalFileScanProcessor.enable(this.settings.syncInternalFiles && this.settings.syncInternalFilesInterval ? (this.settings.syncInternalFilesInterval * 1000) : 0);
    }
    parseReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>) {
        return false;
    }
    realizeSettingSyncMode(): Promise<void> {
        this.periodicInternalFileScanProcessor?.disable();
        if (this.plugin.suspended)
            return;
        if (!this.plugin.isReady)
            return;
        this.periodicInternalFileScanProcessor.enable(this.settings.syncInternalFiles && this.settings.syncInternalFilesInterval ? (this.settings.syncInternalFilesInterval * 1000) : 0);
        return;
    }

    procInternalFiles: string[] = [];
    async execInternalFile() {
        await runWithLock("execinternal", false, async () => {
            const w = [...this.procInternalFiles];
            this.procInternalFiles = [];
            Logger(`Applying hidden ${w.length} files change...`);
            await this.syncInternalFilesAndDatabase("pull", false, false, w);
            Logger(`Applying hidden ${w.length} files changed`);
        });
    }
    procInternalFile(filename: string) {
        this.procInternalFiles.push(filename);
        scheduleTask("procInternal", 500, async () => {
            await this.execInternalFile();
        });
    }

    recentProcessedInternalFiles = [] as string[];
    async watchVaultRawEventsAsync(path: FilePath) {
        const stat = await this.app.vault.adapter.stat(path);
        // sometimes folder is coming.
        if (stat && stat.type != "file")
            return;
        const storageMTime = ~~((stat && stat.mtime || 0) / 1000);
        const key = `${path}-${storageMTime}`;
        if (this.recentProcessedInternalFiles.contains(key)) {
            //If recently processed, it may caused by self.
            return;
        }
        this.recentProcessedInternalFiles = [key, ...this.recentProcessedInternalFiles].slice(0, 100);
        // const id = await this.path2id(path, ICHeader);
        const prefixedFileName = addPrefix(path, ICHeader);
        const filesOnDB = await this.localDatabase.getDBEntryMeta(prefixedFileName);
        const dbMTime = ~~((filesOnDB && filesOnDB.mtime || 0) / 1000);

        // Skip unchanged file.
        if (dbMTime == storageMTime) {
            // Logger(`STORAGE --> DB:${path}: (hidden) Nothing changed`);
            return;
        }

        // Do not compare timestamp. Always local data should be preferred except this plugin wrote one.
        if (storageMTime == 0) {
            await this.deleteInternalFileOnDatabase(path);
        } else {
            await this.storeInternalFileToDatabase({ path: path, ...stat });
            const pluginDir = this.app.vault.configDir + "/plugins/";
            const pluginFiles = ["manifest.json", "data.json", "style.css", "main.js"];
            if (path.startsWith(pluginDir) && pluginFiles.some(e => path.endsWith(e)) && this.settings.usePluginSync) {
                const pluginName = trimPrefix(path, pluginDir).split("/")[0];
                await this.plugin.addOnPluginAndTheirSettings.sweepPlugin(false, pluginName);
            }
        }

    }

    async resolveConflictOnInternalFiles() {
        // Scan all conflicted internal files
        const conflicted = this.localDatabase.findEntries(ICHeader, ICHeaderEnd, { conflicts: true });
        for await (const doc of conflicted) {
            if (!("_conflicts" in doc))
                continue;
            if (isIdOfInternalMetadata(doc._id)) {
                await this.resolveConflictOnInternalFile(doc.path);
            }
        }
    }

    async resolveConflictOnInternalFile(path: FilePathWithPrefix): Promise<boolean> {
        try {
            // Retrieve data
            const id = await this.path2id(path, ICHeader);
            const doc = await this.localDatabase.getRaw(id, { conflicts: true });
            // If there is no conflict, return with false.
            if (!("_conflicts" in doc))
                return false;
            if (doc._conflicts.length == 0)
                return false;
            Logger(`Hidden file conflicted:${path}`);
            const conflicts = doc._conflicts.sort((a, b) => Number(a.split("-")[0]) - Number(b.split("-")[0]));
            const revA = doc._rev;
            const revB = conflicts[0];

            if (path.endsWith(".json")) {
                const conflictedRev = conflicts[0];
                const conflictedRevNo = Number(conflictedRev.split("-")[0]);
                //Search
                const revFrom = (await this.localDatabase.getRaw<EntryDoc>(id, { revs_info: true }));
                const commonBase = revFrom._revs_info.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
                const result = await this.plugin.mergeObject(path, commonBase, doc._rev, conflictedRev);
                if (result) {
                    Logger(`Object merge:${path}`, LOG_LEVEL.INFO);
                    const filename = stripAllPrefixes(path);
                    const isExists = await this.app.vault.adapter.exists(filename);
                    if (!isExists) {
                        await this.ensureDirectoryEx(filename);
                    }
                    await this.app.vault.adapter.write(filename, result);
                    const stat = await this.app.vault.adapter.stat(filename);
                    await this.storeInternalFileToDatabase({ path: filename, ...stat });
                    await this.extractInternalFileFromDatabase(filename);
                    await this.localDatabase.removeRaw(id, revB);
                    return this.resolveConflictOnInternalFile(path);
                } else {
                    Logger(`Object merge is not applicable.`, LOG_LEVEL.VERBOSE);
                }

                const docAMerge = await this.localDatabase.getDBEntry(path, { rev: revA });
                const docBMerge = await this.localDatabase.getDBEntry(path, { rev: revB });
                if (docAMerge != false && docBMerge != false) {
                    if (await this.showJSONMergeDialogAndMerge(docAMerge, docBMerge)) {
                        await delay(200);
                        // Again for other conflicted revisions.
                        return this.resolveConflictOnInternalFile(path);
                    }
                    return false;
                }
            }
            const revBDoc = await this.localDatabase.getRaw(id, { rev: revB });
            // determine which revision should been deleted.
            // simply check modified time
            const mtimeA = ("mtime" in doc && doc.mtime) || 0;
            const mtimeB = ("mtime" in revBDoc && revBDoc.mtime) || 0;
            // Logger(`Revisions:${new Date(mtimeA).toLocaleString} and ${new Date(mtimeB).toLocaleString}`);
            // console.log(`mtime:${mtimeA} - ${mtimeB}`);
            const delRev = mtimeA < mtimeB ? revA : revB;
            // delete older one.
            await this.localDatabase.removeRaw(id, delRev);
            Logger(`Older one has been deleted:${path}`);
            // check the file again
            return this.resolveConflictOnInternalFile(path);
        } catch (ex) {
            Logger(`Failed to resolve conflict (Hidden): ${path}`);
            Logger(ex, LOG_LEVEL.VERBOSE);
            return false;
        }
    }

    //TODO: Tidy up. Even though it is experimental feature, So dirty...
    async syncInternalFilesAndDatabase(direction: "push" | "pull" | "safe" | "pullForce" | "pushForce", showMessage: boolean, files: InternalFileInfo[] | false = false, targetFiles: string[] | false = false) {
        await this.resolveConflictOnInternalFiles();
        const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
        Logger("Scanning hidden files.", logLevel, "sync_internal");
        const ignorePatterns = this.settings.syncInternalFilesIgnorePatterns
            .replace(/\n| /g, "")
            .split(",").filter(e => e).map(e => new RegExp(e, "i"));
        if (!files)
            files = await this.scanInternalFiles();
        const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICHeader, endkey: ICHeaderEnd, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
        const allFileNamesSrc = [...new Set([...files.map(e => normalizePath(e.path)), ...filesOnDB.map(e => stripAllPrefixes(this.getPath(e)))])];
        const allFileNames = allFileNamesSrc.filter(filename => !targetFiles || (targetFiles && targetFiles.indexOf(filename) !== -1));
        function compareMTime(a: number, b: number) {
            const wa = ~~(a / 1000);
            const wb = ~~(b / 1000);
            const diff = wa - wb;
            return diff;
        }

        const fileCount = allFileNames.length;
        let processed = 0;
        let filesChanged = 0;
        // count updated files up as like this below:
        // .obsidian: 2
        // .obsidian/workspace: 1
        // .obsidian/plugins: 1
        // .obsidian/plugins/recent-files-obsidian: 1
        // .obsidian/plugins/recent-files-obsidian/data.json: 1
        const updatedFolders: { [key: string]: number; } = {};
        const countUpdatedFolder = (path: string) => {
            const pieces = path.split("/");
            let c = pieces.shift();
            let pathPieces = "";
            filesChanged++;
            while (c) {
                pathPieces += (pathPieces != "" ? "/" : "") + c;
                pathPieces = normalizePath(pathPieces);
                if (!(pathPieces in updatedFolders)) {
                    updatedFolders[pathPieces] = 0;
                }
                updatedFolders[pathPieces]++;
                c = pieces.shift();
            }
        };
        const p = [] as Promise<void>[];
        const semaphore = Semaphore(10);
        // Cache update time information for files which have already been processed (mainly for files that were skipped due to the same content)
        let caches: { [key: string]: { storageMtime: number; docMtime: number; }; } = {};
        caches = await this.kvDB.get<{ [key: string]: { storageMtime: number; docMtime: number; }; }>("diff-caches-internal") || {};
        for (const filename of allFileNames) {
            if (!filename) continue;
            processed++;
            if (processed % 100 == 0)
                Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
            if (ignorePatterns.some(e => filename.match(e)))
                continue;

            const fileOnStorage = files.find(e => e.path == filename);
            const fileOnDatabase = filesOnDB.find(e => stripAllPrefixes(this.getPath(e)) == filename);
            const addProc = async (p: () => Promise<void>): Promise<void> => {
                const releaser = await semaphore.acquire(1);
                try {
                    return p();
                } catch (ex) {
                    Logger("Some process failed", logLevel);
                    Logger(ex);
                } finally {
                    releaser();
                }
            };
            const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };

            p.push(addProc(async () => {
                const xFileOnStorage = fileOnStorage;
                const xFileOnDatabase = fileOnDatabase;
                if (xFileOnStorage && xFileOnDatabase) {
                    // Both => Synchronize
                    if ((direction != "pullForce" && direction != "pushForce") && xFileOnDatabase.mtime == cache.docMtime && xFileOnStorage.mtime == cache.storageMtime) {
                        return;
                    }
                    const nw = compareMTime(xFileOnStorage.mtime, xFileOnDatabase.mtime);
                    if (nw > 0 || direction == "pushForce") {
                        await this.storeInternalFileToDatabase(xFileOnStorage);
                    }
                    if (nw < 0 || direction == "pullForce") {
                        // skip if not extraction performed.
                        if (!await this.extractInternalFileFromDatabase(filename))
                            return;
                    }
                    // If process successfully updated or file contents are same, update cache.
                    cache.docMtime = xFileOnDatabase.mtime;
                    cache.storageMtime = xFileOnStorage.mtime;
                    caches[filename] = cache;
                    countUpdatedFolder(filename);
                } else if (!xFileOnStorage && xFileOnDatabase) {
                    if (direction == "push" || direction == "pushForce") {
                        if (xFileOnDatabase.deleted)
                            return;
                        await this.deleteInternalFileOnDatabase(filename, false);
                    } else if (direction == "pull" || direction == "pullForce") {
                        if (await this.extractInternalFileFromDatabase(filename)) {
                            countUpdatedFolder(filename);
                        }
                    } else if (direction == "safe") {
                        if (xFileOnDatabase.deleted)
                            return;
                        if (await this.extractInternalFileFromDatabase(filename)) {
                            countUpdatedFolder(filename);
                        }
                    }
                } else if (xFileOnStorage && !xFileOnDatabase) {
                    await this.storeInternalFileToDatabase(xFileOnStorage);
                } else {
                    throw new Error("Invalid state on hidden file sync");
                    // Something corrupted?
                }
            }));
        }
        await Promise.all(p);
        await this.kvDB.set("diff-caches-internal", caches);

        // When files has been retrieved from the database. they must be reloaded.
        if ((direction == "pull" || direction == "pullForce") && filesChanged != 0) {
            const configDir = normalizePath(this.app.vault.configDir);
            // Show notification to restart obsidian when something has been changed in configDir.
            if (configDir in updatedFolders) {
                // Numbers of updated files that is below of configDir.
                let updatedCount = updatedFolders[configDir];
                try {
                    //@ts-ignore
                    const manifests = Object.values(this.app.plugins.manifests) as any as PluginManifest[];
                    //@ts-ignore
                    const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
                    const enabledPluginManifests = manifests.filter(e => enabledPlugins.has(e.id));
                    for (const manifest of enabledPluginManifests) {
                        if (manifest.dir in updatedFolders) {
                            // If notified about plug-ins, reloading Obsidian may not be necessary.
                            updatedCount -= updatedFolders[manifest.dir];
                            const updatePluginId = manifest.id;
                            const updatePluginName = manifest.name;
                            const fragment = createFragment((doc) => {
                                doc.createEl("span", null, (a) => {
                                    a.appendText(`Files in ${updatePluginName} has been updated, Press `);
                                    a.appendChild(a.createEl("a", null, (anchor) => {
                                        anchor.text = "HERE";
                                        anchor.addEventListener("click", async () => {
                                            Logger(`Unloading plugin: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
                                            // @ts-ignore
                                            await this.app.plugins.unloadPlugin(updatePluginId);
                                            // @ts-ignore
                                            await this.app.plugins.loadPlugin(updatePluginId);
                                            Logger(`Plugin reloaded: ${updatePluginName}`, LOG_LEVEL.NOTICE, "plugin-reload-" + updatePluginId);
                                        });
                                    }));

                                    a.appendText(` to reload ${updatePluginName}, or press elsewhere to dismiss this message.`);
                                });
                            });

                            const updatedPluginKey = "popupUpdated-" + updatePluginId;
                            scheduleTask(updatedPluginKey, 1000, async () => {
                                const popup = await memoIfNotExist(updatedPluginKey, () => new Notice(fragment, 0));
                                //@ts-ignore
                                const isShown = popup?.noticeEl?.isShown();
                                if (!isShown) {
                                    memoObject(updatedPluginKey, new Notice(fragment, 0));
                                }
                                scheduleTask(updatedPluginKey + "-close", 20000, () => {
                                    const popup = retrieveMemoObject<Notice>(updatedPluginKey);
                                    if (!popup)
                                        return;
                                    //@ts-ignore
                                    if (popup?.noticeEl?.isShown()) {
                                        popup.hide();
                                    }
                                    disposeMemoObject(updatedPluginKey);
                                });
                            });
                        }
                    }
                } catch (ex) {
                    Logger("Error on checking plugin status.");
                    Logger(ex, LOG_LEVEL.VERBOSE);

                }

                // If something changes left, notify for reloading Obsidian.
                if (updatedCount != 0) {
                    const fragment = createFragment((doc) => {
                        doc.createEl("span", null, (a) => {
                            a.appendText(`Hidden files have been synchronized, Press `);
                            a.appendChild(a.createEl("a", null, (anchor) => {
                                anchor.text = "HERE";
                                anchor.addEventListener("click", () => {
                                    // @ts-ignore
                                    this.app.commands.executeCommandById("app:reload");
                                });
                            }));

                            a.appendText(` to reload obsidian, or press elsewhere to dismiss this message.`);
                        });
                    });

                    scheduleTask("popupUpdated-" + configDir, 1000, () => {
                        //@ts-ignore
                        const isShown = this.confirmPopup?.noticeEl?.isShown();
                        if (!isShown) {
                            this.confirmPopup = new Notice(fragment, 0);
                        }
                        scheduleTask("popupClose" + configDir, 20000, () => {
                            this.confirmPopup?.hide();
                            this.confirmPopup = null;
                        });
                    });
                }
            }
        }

        Logger(`Hidden files scanned: ${filesChanged} files had been modified`, logLevel, "sync_internal");
    }

    async storeInternalFileToDatabase(file: InternalFileInfo, forceWrite = false) {
        const id = await this.path2id(file.path, ICHeader);
        const prefixedFileName = addPrefix(file.path, ICHeader);
        const contentBin = await this.app.vault.adapter.readBinary(file.path);
        let content: string[];
        try {
            content = await arrayBufferToBase64(contentBin);
        } catch (ex) {
            Logger(`The file ${file.path} could not be encoded`);
            Logger(ex, LOG_LEVEL.VERBOSE);
            return false;
        }
        const mtime = file.mtime;
        return await runWithLock("file-" + prefixedFileName, false, async () => {
            try {
                const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
                let saveData: LoadedEntry;
                if (old === false) {
                    saveData = {
                        _id: id,
                        path: prefixedFileName,
                        data: content,
                        mtime,
                        ctime: mtime,
                        datatype: "newnote",
                        size: file.size,
                        children: [],
                        deleted: false,
                        type: "newnote",
                    };
                } else {
                    if (isDocContentSame(old.data, content) && !forceWrite) {
                        // Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL.VERBOSE);
                        return;
                    }
                    saveData =
                    {
                        ...old,
                        data: content,
                        mtime,
                        size: file.size,
                        datatype: "newnote",
                        children: [],
                        deleted: false,
                        type: "newnote",
                    };
                }
                const ret = await this.localDatabase.putDBEntry(saveData, true);
                Logger(`STORAGE --> DB:${file.path}: (hidden) Done`);
                return ret;
            } catch (ex) {
                Logger(`STORAGE --> DB:${file.path}: (hidden) Failed`);
                Logger(ex, LOG_LEVEL.VERBOSE);
                return false;
            }
        });
    }

    async deleteInternalFileOnDatabase(filename: FilePath, forceWrite = false) {
        const id = await this.path2id(filename, ICHeader);
        const prefixedFileName = addPrefix(filename, ICHeader);
        const mtime = new Date().getTime();
        await runWithLock("file-" + prefixedFileName, false, async () => {
            try {
                const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false) as InternalFileEntry | false;
                let saveData: InternalFileEntry;
                if (old === false) {
                    saveData = {
                        _id: id,
                        path: prefixedFileName,
                        mtime,
                        ctime: mtime,
                        size: 0,
                        children: [],
                        deleted: true,
                        type: "newnote",
                    };
                } else {
                    if (old.deleted) {
                        Logger(`STORAGE -x> DB:${filename}: (hidden) already deleted`);
                        return;
                    }
                    saveData =
                    {
                        ...old,
                        mtime,
                        size: 0,
                        children: [],
                        deleted: true,
                        type: "newnote",
                    };
                }
                await this.localDatabase.putRaw(saveData);
                Logger(`STORAGE -x> DB:${filename}: (hidden) Done`);
            } catch (ex) {
                Logger(`STORAGE -x> DB:${filename}: (hidden) Failed`);
                Logger(ex, LOG_LEVEL.VERBOSE);
                return false;
            }
        });
    }

    async extractInternalFileFromDatabase(filename: FilePath, force = false) {
        const isExists = await this.app.vault.adapter.exists(filename);
        const prefixedFileName = addPrefix(filename, ICHeader);

        return await runWithLock("file-" + prefixedFileName, false, async () => {
            try {
                // Check conflicted status
                //TODO option
                const fileOnDB = await this.localDatabase.getDBEntry(prefixedFileName, { conflicts: true }, false, false);
                if (fileOnDB === false)
                    throw new Error(`File not found on database.:${filename}`);
                // Prevent overwrite for Prevent overwriting while some conflicted revision exists.
                if (fileOnDB?._conflicts?.length) {
                    Logger(`Hidden file ${filename} has conflicted revisions, to keep in safe, writing to storage has been prevented`, LOG_LEVEL.INFO);
                    return;
                }
                const deleted = "deleted" in fileOnDB ? fileOnDB.deleted : false;
                if (deleted) {
                    if (!isExists) {
                        Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
                    } else {
                        Logger(`STORAGE <x- DB:${filename}: deleted (hidden).`);
                        await this.app.vault.adapter.remove(filename);
                        try {
                            //@ts-ignore internalAPI
                            await app.vault.adapter.reconcileInternalFile(filename);
                        } catch (ex) {
                            Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
                            Logger(ex, LOG_LEVEL.VERBOSE);
                        }
                    }
                    return true;
                }
                if (!isExists) {
                    await this.ensureDirectoryEx(filename);
                    await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
                    try {
                        //@ts-ignore internalAPI
                        await app.vault.adapter.reconcileInternalFile(filename);
                    } catch (ex) {
                        Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
                        Logger(ex, LOG_LEVEL.VERBOSE);
                    }
                    Logger(`STORAGE <-- DB:${filename}: written (hidden,new${force ? ", force" : ""})`);
                    return true;
                } else {
                    const contentBin = await this.app.vault.adapter.readBinary(filename);
                    const content = await arrayBufferToBase64(contentBin);
                    if (content == fileOnDB.data && !force) {
                        // Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL.VERBOSE);
                        return true;
                    }
                    await this.app.vault.adapter.writeBinary(filename, base64ToArrayBuffer(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
                    try {
                        //@ts-ignore internalAPI
                        await app.vault.adapter.reconcileInternalFile(filename);
                    } catch (ex) {
                        Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
                        Logger(ex, LOG_LEVEL.VERBOSE);
                    }
                    Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""})`);
                    return true;

                }
            } catch (ex) {
                Logger(`STORAGE <-- DB:${filename}: written (hidden, overwrite${force ? ", force" : ""}) Failed`);
                Logger(ex, LOG_LEVEL.VERBOSE);
                return false;
            }
        });
    }

    showJSONMergeDialogAndMerge(docA: LoadedEntry, docB: LoadedEntry): Promise<boolean> {
        return new Promise((res) => {
            Logger("Opening data-merging dialog", LOG_LEVEL.VERBOSE);
            const docs = [docA, docB];
            const path = stripAllPrefixes(docA.path);
            const modal = new JsonResolveModal(this.app, path, [docA, docB], async (keep, result) => {
                // modal.close();
                try {
                    const filename = path;
                    let needFlush = false;
                    if (!result && !keep) {
                        Logger(`Skipped merging: ${filename}`);
                    }
                    //Delete old revisions
                    if (result || keep) {
                        for (const doc of docs) {
                            if (doc._rev != keep) {
                                if (await this.localDatabase.deleteDBEntry(this.getPath(doc), { rev: doc._rev })) {
                                    Logger(`Conflicted revision has been deleted: ${filename}`);
                                    needFlush = true;
                                }
                            }
                        }
                    }
                    if (!keep && result) {
                        const isExists = await this.app.vault.adapter.exists(filename);
                        if (!isExists) {
                            await this.ensureDirectoryEx(filename);
                        }
                        await this.app.vault.adapter.write(filename, result);
                        const stat = await this.app.vault.adapter.stat(filename);
                        await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
                        try {
                            //@ts-ignore internalAPI
                            await app.vault.adapter.reconcileInternalFile(filename);
                        } catch (ex) {
                            Logger("Failed to call internal API(reconcileInternalFile)", LOG_LEVEL.VERBOSE);
                            Logger(ex, LOG_LEVEL.VERBOSE);
                        }
                        Logger(`STORAGE <-- DB:${filename}: written (hidden,merged)`);
                    }
                    if (needFlush) {
                        await this.extractInternalFileFromDatabase(filename, false);
                        Logger(`STORAGE --> DB:${filename}: extracted (hidden,merged)`);
                    }
                    res(true);
                } catch (ex) {
                    Logger("Could not merge conflicted json");
                    Logger(ex, LOG_LEVEL.VERBOSE);
                    res(false);
                }
            });
            modal.open();
        });
    }

    async scanInternalFiles(): Promise<InternalFileInfo[]> {
        const ignoreFilter = this.settings.syncInternalFilesIgnorePatterns
            .replace(/\n| /g, "")
            .split(",").filter(e => e).map(e => new RegExp(e, "i"));
        const root = this.app.vault.getRoot();
        const findRoot = root.path;
        const filenames = (await this.getFiles(findRoot, [], null, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
        const files = filenames.map(async (e) => {
            return {
                path: e as FilePath,
                stat: await this.app.vault.adapter.stat(e)
            };
        });
        const result: InternalFileInfo[] = [];
        for (const f of files) {
            const w = await f;
            result.push({
                ...w,
                ...w.stat
            });
        }
        return result;
    }

    async getFiles(
        path: string,
        ignoreList: string[],
        filter: RegExp[],
        ignoreFilter: RegExp[]
    ) {
        const w = await this.app.vault.adapter.list(path);
        let files = [
            ...w.files
                .filter((e) => !ignoreList.some((ee) => e.endsWith(ee)))
                .filter((e) => !filter || filter.some((ee) => e.match(ee)))
                .filter((e) => !ignoreFilter || ignoreFilter.every((ee) => !e.match(ee))),
        ];

        L1: for (const v of w.folders) {
            for (const ignore of ignoreList) {
                if (v.endsWith(ignore)) {
                    continue L1;
                }
            }
            if (ignoreFilter && ignoreFilter.some(e => v.match(e))) {
                continue L1;
            }
            files = files.concat(await this.getFiles(v, ignoreList, filter, ignoreFilter));
        }
        return files;
    }
}
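Distilled from `syncInternalFilesAndDatabase` above: for a file present on both storage and database, the transfer direction comes down to a second-granularity mtime comparison plus the force flags. A standalone sketch (names are illustrative and deletion handling is simplified):

```ts
type SyncAction = "storeToDatabase" | "extractToStorage" | "skip";

// Simplified decision for a file that exists on both sides.
function decideAction(
    direction: "push" | "pull" | "safe" | "pullForce" | "pushForce",
    storageMtime: number,
    dbMtime: number,
): SyncAction {
    // compareMTime truncates to seconds before comparing.
    const diff = ~~(storageMtime / 1000) - ~~(dbMtime / 1000);
    if (diff > 0 || direction === "pushForce") return "storeToDatabase";
    if (diff < 0 || direction === "pullForce") return "extractToStorage";
    return "skip";
}
```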
314
src/CmdPluginAndTheirSettings.ts
Normal file
314
src/CmdPluginAndTheirSettings.ts
Normal file
@@ -0,0 +1,314 @@
import { normalizePath, PluginManifest } from "./deps";
import { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry, LOG_LEVEL } from "./lib/src/types";
import { PluginDataEntry, PERIODIC_PLUGIN_SWEEP, PluginList, DevicePluginList, PSCHeader, PSCHeaderEnd } from "./types";
import { getDocData, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { isPluginMetadata, PeriodicProcessor } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { NewNotice } from "./lib/src/wrapper";
import { versionNumberString2Number } from "./lib/src/strbin";
import { runWithLock } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";

export class PluginAndTheirSettings extends LiveSyncCommands {

    get deviceAndVaultName() {
        return this.plugin.deviceAndVaultName;
    }
    pluginDialog: PluginDialogModal = null;
    periodicPluginSweepProcessor = new PeriodicProcessor(this.plugin, async () => await this.sweepPlugin(false));

    showPluginSyncModal() {
        if (this.pluginDialog != null) {
            this.pluginDialog.open();
        } else {
            this.pluginDialog = new PluginDialogModal(this.app, this.plugin);
            this.pluginDialog.open();
        }
    }

    hidePluginSyncModal() {
        if (this.pluginDialog != null) {
            this.pluginDialog.close();
            this.pluginDialog = null;
        }
    }
    onload(): void | Promise<void> {
        this.plugin.addCommand({
            id: "livesync-plugin-dialog",
            name: "Show Plugins and their settings",
            callback: () => {
                this.showPluginSyncModal();
            },
        });
    }
    onunload() {
        this.hidePluginSyncModal();
        this.periodicPluginSweepProcessor?.disable();
    }
    parseReplicationResultItem(doc: PouchDB.Core.ExistingDocument<EntryDoc>) {
        if (isPluginMetadata(doc._id)) {
            if (this.settings.notifyPluginOrSettingUpdated) {
                this.triggerCheckPluginUpdate();
                return true;
            }
        }
        return false;
    }
    async beforeReplicate(showMessage: boolean) {
        if (this.settings.autoSweepPlugins) {
            await this.sweepPlugin(showMessage);
        }
    }
    async onResume() {
        if (this.plugin.suspended)
            return;
        if (this.settings.autoSweepPlugins) {
            await this.sweepPlugin(false);
        }
        this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
    }
    async onInitializeDatabase(showNotice: boolean) {
        if (this.settings.usePluginSync) {
            try {
                Logger("Scanning plugins...");
                await this.sweepPlugin(showNotice);
                Logger("Scanning plugins done");
            } catch (ex) {
                Logger("Scanning plugins failed");
                Logger(ex, LOG_LEVEL.VERBOSE);
            }
        }
    }

    async realizeSettingSyncMode() {
        this.periodicPluginSweepProcessor?.disable();
        if (this.plugin.suspended)
            return;
        if (this.settings.autoSweepPlugins) {
            await this.sweepPlugin(false);
        }
        this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
    }

    triggerCheckPluginUpdate() {
        (async () => await this.checkPluginUpdate())();
    }

    async getPluginList(): Promise<{ plugins: PluginList; allPlugins: DevicePluginList; thisDevicePlugins: DevicePluginList; }> {
        const docList = await this.localDatabase.allDocsRaw<PluginDataEntry>({ startkey: PSCHeader, endkey: PSCHeaderEnd, include_docs: false });
        const oldDocs: PluginDataEntry[] = ((await Promise.all(docList.rows.map(async (e) => await this.localDatabase.getDBEntry(e.id as FilePathWithPrefix /* WARN!! THIS SHOULD BE WRAPPED */)))).filter((e) => e !== false) as LoadedEntry[]).map((e) => JSON.parse(getDocData(e.data)));
        const plugins: { [key: string]: PluginDataEntry[]; } = {};
        const allPlugins: { [key: string]: PluginDataEntry; } = {};
        const thisDevicePlugins: { [key: string]: PluginDataEntry; } = {};
        for (const v of oldDocs) {
            if (typeof plugins[v.deviceVaultName] === "undefined") {
                plugins[v.deviceVaultName] = [];
            }
            plugins[v.deviceVaultName].push(v);
            allPlugins[v._id] = v;
            if (v.deviceVaultName == this.deviceAndVaultName) {
                thisDevicePlugins[v.manifest.id] = v;
            }
        }
        return { plugins, allPlugins, thisDevicePlugins };
    }

    async checkPluginUpdate() {
        if (!this.plugin.settings.usePluginSync)
            return;
        await this.sweepPlugin(false);
        const { allPlugins, thisDevicePlugins } = await this.getPluginList();
        const arrPlugins = Object.values(allPlugins);
        let updateFound = false;
        for (const plugin of arrPlugins) {
            const ownPlugin = thisDevicePlugins[plugin.manifest.id];
            if (ownPlugin) {
                const remoteVersion = versionNumberString2Number(plugin.manifest.version);
                const ownVersion = versionNumberString2Number(ownPlugin.manifest.version);
                if (remoteVersion > ownVersion) {
                    updateFound = true;
                }
                if (((plugin.mtime / 1000) | 0) > ((ownPlugin.mtime / 1000) | 0) && (plugin.dataJson ?? "") != (ownPlugin.dataJson ?? "")) {
                    updateFound = true;
                }
            }
        }
        if (updateFound) {
            const fragment = createFragment((doc) => {
                doc.createEl("a", null, (a) => {
                    a.text = "There're some new plugins or their settings";
                    a.addEventListener("click", () => this.showPluginSyncModal());
                });
            });
            NewNotice(fragment, 10000);
        } else {
            Logger("Everything is up to date.", LOG_LEVEL.NOTICE);
        }
    }

    async sweepPlugin(showMessage = false, specificPluginPath = "") {
        if (!this.settings.usePluginSync)
            return;
        if (!this.localDatabase.isReady)
            return;
        // @ts-ignore
        const pl = this.app.plugins;
        const manifests: PluginManifest[] = Object.values(pl.manifests);
        let specificPlugin = "";
        if (specificPluginPath != "") {
            specificPlugin = manifests.find(e => e.dir.endsWith("/" + specificPluginPath))?.id ?? "";
        }
        await runWithLock("sweepplugin", true, async () => {
            const logLevel = showMessage ? LOG_LEVEL.NOTICE : LOG_LEVEL.INFO;
            if (!this.deviceAndVaultName) {
                Logger("You have to set your device and vault name.", LOG_LEVEL.NOTICE);
                return;
            }
            Logger("Scanning plugins", logLevel);
            const oldDocs = await this.localDatabase.allDocsRaw<EntryDoc>({
                startkey: `ps:${this.deviceAndVaultName}-${specificPlugin}`,
                endkey: `ps:${this.deviceAndVaultName}-${specificPlugin}\u{10ffff}`,
                include_docs: true,
            });
            // Logger("OLD DOCS.", LOG_LEVEL.VERBOSE);
            // sweep current plugin.
            const procs = manifests.map(async (m) => {
                const pluginDataEntryID = `ps:${this.deviceAndVaultName}-${m.id}` as DocumentID;
                try {
                    if (specificPlugin && m.id != specificPlugin) {
                        return;
                    }
                    Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
                    const path = normalizePath(m.dir) + "/";
                    const adapter = this.app.vault.adapter;
                    const files = ["manifest.json", "main.js", "styles.css", "data.json"];
                    const pluginData: { [key: string]: string; } = {};
                    for (const file of files) {
                        const thePath = path + file;
                        if (await adapter.exists(thePath)) {
                            pluginData[file] = await adapter.read(thePath);
                        }
                    }
                    let mtime = 0;
                    if (await adapter.exists(path + "/data.json")) {
                        mtime = (await adapter.stat(path + "/data.json")).mtime;
                    }

                    const p: PluginDataEntry = {
                        _id: pluginDataEntryID,
                        dataJson: pluginData["data.json"],
                        deviceVaultName: this.deviceAndVaultName,
                        mainJs: pluginData["main.js"],
                        styleCss: pluginData["styles.css"],
                        manifest: m,
                        manifestJson: pluginData["manifest.json"],
                        mtime: mtime,
                        type: "plugin",
                    };
                    const d: LoadedEntry = {
                        _id: p._id,
                        path: p._id as string as FilePathWithPrefix,
                        data: JSON.stringify(p),
                        ctime: mtime,
                        mtime: mtime,
                        size: 0,
                        children: [],
                        datatype: "plain",
                        type: "plain"
                    };
                    Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL.VERBOSE);
                    await runWithLock("plugin-" + m.id, false, async () => {
                        const old = await this.localDatabase.getDBEntry(p._id as string as FilePathWithPrefix /* This also should be explained */, null, false, false);
                        if (old !== false) {
                            const oldData = { data: old.data, deleted: old._deleted };
                            const newData = { data: d.data, deleted: d._deleted };
                            if (isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
                                Logger(`Nothing changed:${m.name}`);
                                return;
                            }
                        }
                        await this.localDatabase.putDBEntry(d);
                        Logger(`Plugin saved:${m.name}`, logLevel);
                    });
                } catch (ex) {
                    Logger(`Plugin save failed:${m.name}`, LOG_LEVEL.NOTICE);
                } finally {
                    oldDocs.rows = oldDocs.rows.filter((e) => e.id != pluginDataEntryID);
                }
                //remove saved plugin data.
            });

            await Promise.all(procs);

            const delDocs = oldDocs.rows.map((e) => {
                // e.doc._deleted = true;
                if (e.doc.type == "newnote" || e.doc.type == "plain") {
                    e.doc.deleted = true;
                    if (this.settings.deleteMetadataOfDeletedFiles) {
                        e.doc._deleted = true;
                    }
                } else {
                    e.doc._deleted = true;
                }
                return e.doc;
            });
            Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL.VERBOSE);
            await this.localDatabase.bulkDocsRaw(delDocs);
            Logger(`Scan plugin done.`, logLevel);
        });
    }

    async applyPluginData(plugin: PluginDataEntry) {
        await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
            const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
            const adapter = this.app.vault.adapter;
            // @ts-ignore
            const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
            if (stat) {
                // @ts-ignore
                await this.app.plugins.unloadPlugin(plugin.manifest.id);
                Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }
            if (plugin.dataJson)
                await adapter.write(pluginTargetFolderPath + "data.json", plugin.dataJson);
            Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL.NOTICE);
            if (stat) {
                // @ts-ignore
                await this.app.plugins.loadPlugin(plugin.manifest.id);
                Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }
        });
    }

    async applyPlugin(plugin: PluginDataEntry) {
        await runWithLock("plugin-" + plugin.manifest.id, false, async () => {
            // @ts-ignore
            const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
            if (stat) {
                // @ts-ignore
                await this.app.plugins.unloadPlugin(plugin.manifest.id);
                Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }

            const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
            const adapter = this.app.vault.adapter;
            if ((await adapter.exists(pluginTargetFolderPath)) === false) {
                await adapter.mkdir(pluginTargetFolderPath);
            }
            await adapter.write(pluginTargetFolderPath + "main.js", plugin.mainJs);
            await adapter.write(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
            if (plugin.styleCss)
                await adapter.write(pluginTargetFolderPath + "styles.css", plugin.styleCss);
            if (stat) {
                // @ts-ignore
                await this.app.plugins.loadPlugin(plugin.manifest.id);
                Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL.NOTICE);
            }
        });
    }
}
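Note: sweepPlugin above keys one document per device/vault and plugin as `ps:${deviceAndVaultName}-${pluginId}`, which is what makes the startkey/endkey range scan return exactly one device's entries. A small sketch of the scheme; the names below are hypothetical examples:

```ts
// One document per device/vault and plugin.
const deviceAndVaultName = "desk-main";      // assumed example name
const pluginId = "obsidian-livesync";        // assumed example plugin id

const docId = `ps:${deviceAndVaultName}-${pluginId}`;

// All plugin entries of this device sort between these two keys,
// because \u{10ffff} is the highest Unicode code point.
const startkey = `ps:${deviceAndVaultName}-`;
const endkey = `ps:${deviceAndVaultName}-\u{10ffff}`;
```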
232 src/CmdSetupLiveSync.ts Normal file
@@ -0,0 +1,232 @@
import { EntryDoc, ObsidianLiveSyncSettings, LOG_LEVEL, DEFAULT_SETTINGS } from "./lib/src/types";
import { configURIBase } from "./types";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { askSelectString, askYesNo, askString } from "./utils";
import { decrypt, encrypt } from "./lib/src/e2ee_v2";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { delay } from "./lib/src/utils";

export class SetupLiveSync extends LiveSyncCommands {
    onunload() { }
    onload(): void | Promise<void> {
        this.plugin.registerObsidianProtocolHandler("setuplivesync", async (conf: any) => await this.setupWizard(conf.settings));

        this.plugin.addCommand({
            id: "livesync-copysetupuri",
            name: "Copy the setup URI",
            callback: this.command_copySetupURI.bind(this),
        });

        this.plugin.addCommand({
            id: "livesync-copysetupurifull",
            name: "Copy the setup URI (Full)",
            callback: this.command_copySetupURIFull.bind(this),
        });

        this.plugin.addCommand({
            id: "livesync-opensetupuri",
            name: "Open the setup URI",
            callback: this.command_openSetupURI.bind(this),
        });
    }
    onInitializeDatabase(showNotice: boolean) { }
    beforeReplicate(showNotice: boolean) { }
    onResume() { }
    parseReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>): boolean | Promise<boolean> {
        return false;
    }
    async realizeSettingSyncMode() { }

    async command_copySetupURI() {
        const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "");
        if (encryptingPassphrase === false)
            return;
        const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
        const keys = Object.keys(setting) as (keyof ObsidianLiveSyncSettings)[];
        for (const k of keys) {
            if (JSON.stringify(k in setting ? setting[k] : "") == JSON.stringify(k in DEFAULT_SETTINGS ? DEFAULT_SETTINGS[k] : "*")) {
                delete setting[k];
            }
        }
        const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
        const uri = `${configURIBase}${encryptedSetting}`;
        await navigator.clipboard.writeText(uri);
        Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
    }
    async command_copySetupURIFull() {
        const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "");
        if (encryptingPassphrase === false)
            return;
        const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
        const encryptedSetting = encodeURIComponent(await encrypt(JSON.stringify(setting), encryptingPassphrase, false));
        const uri = `${configURIBase}${encryptedSetting}`;
        await navigator.clipboard.writeText(uri);
        Logger("Setup URI copied to clipboard", LOG_LEVEL.NOTICE);
    }
    async command_openSetupURI() {
        const setupURI = await askString(this.app, "Easy setup", "Set up URI", `${configURIBase}aaaaa`);
        if (setupURI === false)
            return;
        if (!setupURI.startsWith(`${configURIBase}`)) {
            Logger("Set up URI looks wrong.", LOG_LEVEL.NOTICE);
            return;
        }
        const config = decodeURIComponent(setupURI.substring(configURIBase.length));
        console.dir(config);
        await this.setupWizard(config);
    }
    async setupWizard(confString: string) {
        try {
            const oldConf = JSON.parse(JSON.stringify(this.settings));
            const encryptingPassphrase = await askString(this.app, "Passphrase", "The passphrase to decrypt your setup URI", "");
            if (encryptingPassphrase === false)
                return;
            const newConf = await JSON.parse(await decrypt(confString, encryptingPassphrase, false));
            if (newConf) {
                const result = await askYesNo(this.app, "Importing LiveSync's conf, OK?");
                if (result == "yes") {
                    const newSettingW = Object.assign({}, DEFAULT_SETTINGS, newConf) as ObsidianLiveSyncSettings;
                    this.plugin.replicator.closeReplication();
                    this.settings.suspendFileWatching = true;
                    console.dir(newSettingW);
                    // Back into the default method once.
                    newSettingW.configPassphraseStore = "";
                    newSettingW.encryptedPassphrase = "";
                    newSettingW.encryptedCouchDBConnection = "";
                    const setupJustImport = "Just import setting";
                    const setupAsNew = "Set it up as secondary or subsequent device";
                    const setupAgain = "Reconfigure and reconstitute the data";
                    const setupManually = "Leave everything to me";
                    newSettingW.syncInternalFiles = false;
                    newSettingW.usePluginSync = false;
                    const setupType = await askSelectString(this.app, "How would you like to set it up?", [setupAsNew, setupAgain, setupJustImport, setupManually]);
                    if (setupType == setupJustImport) {
                        this.plugin.settings = newSettingW;
                        this.plugin.usedPassphrase = "";
                        await this.plugin.saveSettings();
                    } else if (setupType == setupAsNew) {
                        this.plugin.settings = newSettingW;
                        this.plugin.usedPassphrase = "";
                        await this.fetchLocal();
                    } else if (setupType == setupAgain) {
                        const confirm = "I know this operation will rebuild all my databases with files on this device, and files that are on the remote database and I didn't synchronize to any other devices will be lost and want to proceed indeed.";
                        if (await askSelectString(this.app, "Do you really want to do this?", ["Cancel", confirm]) != confirm) {
                            return;
                        }
                        this.plugin.settings = newSettingW;
                        this.plugin.usedPassphrase = "";
                        await this.rebuildEverything();
                    } else if (setupType == setupManually) {
                        const keepLocalDB = await askYesNo(this.app, "Keep local DB?");
                        const keepRemoteDB = await askYesNo(this.app, "Keep remote DB?");
                        if (keepLocalDB == "yes" && keepRemoteDB == "yes") {
                            // nothing to do. so peaceful.
                            this.plugin.settings = newSettingW;
                            this.plugin.usedPassphrase = "";
                            this.suspendAllSync();
                            this.suspendExtraSync();
                            await this.plugin.saveSettings();
                            const replicate = await askYesNo(this.app, "Unlock and replicate?");
                            if (replicate == "yes") {
                                await this.plugin.replicate(true);
                                await this.plugin.markRemoteUnlocked();
                            }
                            Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
                            return;
                        }
                        if (keepLocalDB == "no" && keepRemoteDB == "no") {
                            const reset = await askYesNo(this.app, "Drop everything?");
                            if (reset != "yes") {
                                Logger("Cancelled", LOG_LEVEL.NOTICE);
                                this.plugin.settings = oldConf;
                                return;
                            }
                        }
                        let initDB;
                        this.plugin.settings = newSettingW;
                        this.plugin.usedPassphrase = "";
                        await this.plugin.saveSettings();
                        if (keepLocalDB == "no") {
                            await this.plugin.resetLocalDatabase();
                            await this.plugin.localDatabase.initializeDatabase();
                            const rebuild = await askYesNo(this.app, "Rebuild the database?");
                            if (rebuild == "yes") {
                                initDB = this.plugin.initializeDatabase(true);
                            } else {
                                await this.plugin.markRemoteResolved();
                            }
                        }
                        if (keepRemoteDB == "no") {
                            await this.plugin.tryResetRemoteDatabase();
                            await this.plugin.markRemoteLocked();
                        }
                        if (keepLocalDB == "no" || keepRemoteDB == "no") {
                            const replicate = await askYesNo(this.app, "Replicate once?");
                            if (replicate == "yes") {
                                if (initDB != null) {
                                    await initDB;
                                }
                                await this.plugin.replicate(true);
                            }
                        }
                    }
                }
                Logger("Configuration loaded.", LOG_LEVEL.NOTICE);
            } else {
                Logger("Cancelled.", LOG_LEVEL.NOTICE);
            }
        } catch (ex) {
            Logger("Couldn't parse or decrypt configuration uri.", LOG_LEVEL.NOTICE);
        }
    }

    suspendExtraSync() {
        Logger("Hidden files and plugin synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL.NOTICE)
        this.plugin.settings.syncInternalFiles = false;
        this.plugin.settings.usePluginSync = false;
        this.plugin.settings.autoSweepPlugins = false;
    }
    suspendAllSync() {
        this.plugin.settings.liveSync = false;
        this.plugin.settings.periodicReplication = false;
        this.plugin.settings.syncOnSave = false;
        this.plugin.settings.syncOnStart = false;
        this.plugin.settings.syncOnFileOpen = false;
        this.plugin.settings.syncAfterMerge = false;
        //this.suspendExtraSync();
    }
    async fetchLocal() {
        this.suspendExtraSync();
        await this.plugin.realizeSettingSyncMode();
        await this.plugin.resetLocalDatabase();
        await delay(1000);
        await this.plugin.markRemoteResolved();
        await this.plugin.openDatabase();
        this.plugin.isReady = true;
        await delay(500);
        await this.plugin.replicateAllFromServer(true);
    }
    async rebuildRemote() {
        this.suspendExtraSync();
        await this.plugin.realizeSettingSyncMode();
        await this.plugin.markRemoteLocked();
        await this.plugin.tryResetRemoteDatabase();
        await this.plugin.markRemoteLocked();
        await delay(500);
        await this.plugin.replicateAllToServer(true);
    }
    async rebuildEverything() {
        this.suspendExtraSync();
        await this.plugin.realizeSettingSyncMode();
        await this.plugin.resetLocalDatabase();
        await delay(1000);
        await this.plugin.initializeDatabase(true);
        await this.plugin.markRemoteLocked();
        await this.plugin.tryResetRemoteDatabase();
        await this.plugin.markRemoteLocked();
        await delay(500);
        await this.plugin.replicateAllToServer(true);
    }
}
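A sketch of the setup-URI round trip the commands above implement, using the same encrypt/decrypt helpers from lib/src/e2ee_v2 and configURIBase from ./types; the wrapper functions themselves are illustrative, not part of the plugin:

```ts
import { decrypt, encrypt } from "./lib/src/e2ee_v2";
import { configURIBase } from "./types";

async function settingsToURI(settings: object, passphrase: string): Promise<string> {
    // Serialize, encrypt, then URI-encode so the result survives as a URL.
    const encrypted = await encrypt(JSON.stringify(settings), passphrase, false);
    return `${configURIBase}${encodeURIComponent(encrypted)}`;
}

async function uriToSettings(uri: string, passphrase: string): Promise<object> {
    // Reverse order: strip the base, URI-decode, decrypt, parse.
    const body = decodeURIComponent(uri.substring(configURIBase.length));
    return JSON.parse(await decrypt(body, passphrase, false));
}
```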
@@ -1,17 +1,19 @@
-import { App, Modal } from "obsidian";
+import { App, Modal } from "./deps";
 import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
 import { diff_result } from "./lib/src/types";
-import { escapeStringToHTML } from "./lib/src/utils";
+import { escapeStringToHTML } from "./lib/src/strbin";

 export class ConflictResolveModal extends Modal {
     // result: Array<[number, string]>;
     result: diff_result;
+    filename: string;
     callback: (remove_rev: string) => Promise<void>;

-    constructor(app: App, diff: diff_result, callback: (remove_rev: string) => Promise<void>) {
+    constructor(app: App, filename: string, diff: diff_result, callback: (remove_rev: string) => Promise<void>) {
         super(app);
         this.result = diff;
         this.callback = callback;
+        this.filename = filename;
     }

     onOpen() {
@@ -20,6 +22,7 @@ export class ConflictResolveModal extends Modal {
         contentEl.empty();

         contentEl.createEl("h2", { text: "This document has conflicted changes." });
+        contentEl.createEl("span", { text: this.filename });
         const div = contentEl.createDiv("");
         div.addClass("op-scrollable");
         let diff = "";
@@ -1,10 +1,13 @@
|
||||
import { TFile, Modal, App } from "obsidian";
|
||||
import { path2id } from "./utils";
|
||||
import { base64ToArrayBuffer, base64ToString, escapeStringToHTML, isValidPath } from "./lib/src/utils";
|
||||
import { TFile, Modal, App } from "./deps";
|
||||
import { getPathFromTFile, isValidPath } from "./utils";
|
||||
import { base64ToArrayBuffer, base64ToString, escapeStringToHTML } from "./lib/src/strbin";
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "diff-match-patch";
|
||||
import { LoadedEntry, LOG_LEVEL } from "./lib/src/types";
|
||||
import { DocumentID, FilePathWithPrefix, LoadedEntry, LOG_LEVEL } from "./lib/src/types";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
|
||||
import { getDocData } from "./lib/src/utils";
|
||||
import { stripPrefix } from "./lib/src/path";
|
||||
|
||||
export class DocumentHistoryModal extends Modal {
|
||||
plugin: ObsidianLiveSyncPlugin;
|
||||
@@ -13,38 +16,50 @@ export class DocumentHistoryModal extends Modal {
|
||||
info: HTMLDivElement;
|
||||
fileInfo: HTMLDivElement;
|
||||
showDiff = false;
|
||||
id: DocumentID;
|
||||
|
||||
file: string;
|
||||
file: FilePathWithPrefix;
|
||||
|
||||
revs_info: PouchDB.Core.RevisionInfo[] = [];
|
||||
currentDoc: LoadedEntry;
|
||||
currentText = "";
|
||||
currentDeleted = false;
|
||||
|
||||
constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | string) {
|
||||
constructor(app: App, plugin: ObsidianLiveSyncPlugin, file: TFile | FilePathWithPrefix, id: DocumentID) {
|
||||
super(app);
|
||||
this.plugin = plugin;
|
||||
this.file = (file instanceof TFile) ? file.path : file;
|
||||
this.file = (file instanceof TFile) ? getPathFromTFile(file) : file;
|
||||
this.id = id;
|
||||
if (!file) {
|
||||
this.file = this.plugin.id2path(id, null);
|
||||
}
|
||||
if (localStorage.getItem("ols-history-highlightdiff") == "1") {
|
||||
this.showDiff = true;
|
||||
}
|
||||
}
|
||||
|
||||
async loadFile() {
|
||||
if (!this.id) {
|
||||
this.id = await this.plugin.path2id(this.file);
|
||||
}
|
||||
const db = this.plugin.localDatabase;
|
||||
try {
|
||||
const w = await db.localDatabase.get(path2id(this.file), { revs_info: true });
|
||||
this.revs_info = w._revs_info.filter((e) => e.status == "available");
|
||||
const w = await db.localDatabase.get(this.id, { revs_info: true });
|
||||
this.revs_info = w._revs_info.filter((e) => e?.status == "available");
|
||||
this.range.max = `${this.revs_info.length - 1}`;
|
||||
this.range.value = this.range.max;
|
||||
this.fileInfo.setText(`${this.file} / ${this.revs_info.length} revisions`);
|
||||
await this.loadRevs();
|
||||
} catch (ex) {
|
||||
if (ex.status && ex.status == 404) {
|
||||
if (isErrorOfMissingDoc(ex)) {
|
||||
this.range.max = "0";
|
||||
this.range.value = "";
|
||||
this.range.disabled = true;
|
||||
this.showDiff
|
||||
this.contentView.setText(`History of this file was not recorded.`);
|
||||
} else {
|
||||
this.contentView.setText(`Error occurred.`);
|
||||
Logger(ex, LOG_LEVEL.VERBOSE);
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -53,7 +68,7 @@ export class DocumentHistoryModal extends Modal {
|
||||
const db = this.plugin.localDatabase;
|
||||
const index = this.revs_info.length - 1 - (this.range.value as any) / 1;
|
||||
const rev = this.revs_info[index];
|
||||
const w = await db.getDBEntry(path2id(this.file), { rev: rev.rev }, false, false, true);
|
||||
const w = await db.getDBEntry(this.file, { rev: rev.rev }, false, false, true);
|
||||
this.currentText = "";
|
||||
this.currentDeleted = false;
|
||||
if (w === false) {
|
||||
@@ -64,17 +79,17 @@ export class DocumentHistoryModal extends Modal {
|
||||
this.currentDoc = w;
|
||||
this.info.innerHTML = `Modified:${new Date(w.mtime).toLocaleString()}`;
|
||||
let result = "";
|
||||
const w1data = w.datatype == "plain" ? w.data : base64ToString(w.data);
|
||||
const w1data = w.datatype == "plain" ? getDocData(w.data) : base64ToString(w.data);
|
||||
this.currentDeleted = w.deleted;
|
||||
this.currentText = w1data;
|
||||
if (this.showDiff) {
|
||||
const prevRevIdx = this.revs_info.length - 1 - ((this.range.value as any) / 1 - 1);
|
||||
if (prevRevIdx >= 0 && prevRevIdx < this.revs_info.length) {
|
||||
const oldRev = this.revs_info[prevRevIdx].rev;
|
||||
const w2 = await db.getDBEntry(path2id(this.file), { rev: oldRev }, false, false, true);
|
||||
const w2 = await db.getDBEntry(this.file, { rev: oldRev }, false, false, true);
|
||||
if (w2 != false) {
|
||||
const dmp = new diff_match_patch();
|
||||
const w2data = w2.datatype == "plain" ? w2.data : base64ToString(w2.data);
|
||||
const w2data = w2.datatype == "plain" ? getDocData(w2.data) : base64ToString(w2.data);
|
||||
const diff = dmp.diff_main(w2data, w1data);
|
||||
dmp.diff_cleanupSemantic(diff);
|
||||
for (const v of diff) {
|
||||
@@ -100,7 +115,6 @@ export class DocumentHistoryModal extends Modal {
|
||||
result = escapeStringToHTML(w1data);
|
||||
}
|
||||
this.contentView.innerHTML = (this.currentDeleted ? "(At this revision, the file has been deleted)\n" : "") + result;
|
||||
|
||||
}
|
||||
}
|
||||
|
||||
@@ -171,12 +185,13 @@ export class DocumentHistoryModal extends Modal {
|
||||
buttons.createEl("button", { text: "Back to this revision" }, (e) => {
|
||||
e.addClass("mod-cta");
|
||||
e.addEventListener("click", async () => {
|
||||
const pathToWrite = this.file.startsWith("i:") ? this.file.substring("i:".length) : this.file;
|
||||
// const pathToWrite = this.plugin.id2path(this.id, true);
|
||||
const pathToWrite = stripPrefix(this.file);
|
||||
if (!isValidPath(pathToWrite)) {
|
||||
Logger("Path is not valid to write content.", LOG_LEVEL.INFO);
|
||||
}
|
||||
if (this.currentDoc?.datatype == "plain") {
|
||||
await this.app.vault.adapter.write(pathToWrite, this.currentDoc.data);
|
||||
await this.app.vault.adapter.write(pathToWrite, getDocData(this.currentDoc.data));
|
||||
await focusFile(pathToWrite);
|
||||
this.close();
|
||||
} else if (this.currentDoc?.datatype == "newnote") {
|
||||
|
||||
55 src/JsonResolveModal.ts Normal file
@@ -0,0 +1,55 @@
import { App, Modal } from "./deps";
import { FilePath, LoadedEntry } from "./lib/src/types";
import JsonResolvePane from "./JsonResolvePane.svelte";

export class JsonResolveModal extends Modal {
    // result: Array<[number, string]>;
    filename: FilePath;
    callback: (keepRev: string, mergedStr?: string) => Promise<void>;
    docs: LoadedEntry[];
    component: JsonResolvePane;

    constructor(app: App, filename: FilePath, docs: LoadedEntry[], callback: (keepRev: string, mergedStr?: string) => Promise<void>) {
        super(app);
        this.callback = callback;
        this.filename = filename;
        this.docs = docs;
    }
    async UICallback(keepRev: string, mergedStr?: string) {
        this.close();
        await this.callback(keepRev, mergedStr);
        this.callback = null;
    }

    onOpen() {
        const { contentEl } = this;

        contentEl.empty();

        if (this.component == null) {
            this.component = new JsonResolvePane({
                target: contentEl,
                props: {
                    docs: this.docs,
                    filename: this.filename,
                    callback: (keepRev, mergedStr) => this.UICallback(keepRev, mergedStr),
                },
            });
        }
        return;
    }

    onClose() {
        const { contentEl } = this;
        contentEl.empty();
        // contentEl.empty();
        if (this.callback != null) {
            this.callback(null);
        }
        if (this.component != null) {
            this.component.$destroy();
            this.component = null;
        }
    }
}
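One way this callback-style modal can be consumed is to wrap it in a Promise so callers simply await the user's choice. This wrapper is an illustrative sketch, not part of the plugin:

```ts
import { App } from "./deps";
import { FilePath, LoadedEntry } from "./lib/src/types";
import { JsonResolveModal } from "./JsonResolveModal";

// Resolves once the user picks a revision or a merged body in the pane.
function askJsonResolution(app: App, filename: FilePath, docs: LoadedEntry[]): Promise<{ keepRev: string; mergedStr?: string }> {
    return new Promise((resolve) => {
        new JsonResolveModal(app, filename, docs, async (keepRev, mergedStr) => {
            resolve({ keepRev, mergedStr });
        }).open();
    });
}
```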
164 src/JsonResolvePane.svelte Normal file
@@ -0,0 +1,164 @@
<script lang="ts">
    import { Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "diff-match-patch";
    import type { FilePath, LoadedEntry } from "./lib/src/types";
    import { base64ToString } from "./lib/src/strbin";
    import { getDocData } from "./lib/src/utils";
    import { mergeObject } from "./utils";

    export let docs: LoadedEntry[] = [];
    export let callback: (keepRev: string, mergedStr?: string) => Promise<void> = async (_, __) => {
        Promise.resolve();
    };
    export let filename: FilePath = "" as FilePath;

    let docA: LoadedEntry = undefined;
    let docB: LoadedEntry = undefined;
    let docAContent = "";
    let docBContent = "";
    let objA: any = {};
    let objB: any = {};
    let objAB: any = {};
    let objBA: any = {};
    let diffs: Diff[];
    const modes = [
        ["", "Not now"],
        ["A", "A"],
        ["B", "B"],
        ["AB", "A + B"],
        ["BA", "B + A"],
    ] as ["" | "A" | "B" | "AB" | "BA", string][];
    let mode: "" | "A" | "B" | "AB" | "BA" = "";

    function docToString(doc: LoadedEntry) {
        return doc.datatype == "plain" ? getDocData(doc.data) : base64ToString(doc.data);
    }
    function revStringToRevNumber(rev: string) {
        return rev.split("-")[0];
    }

    function getDiff(left: string, right: string) {
        const dmp = new diff_match_patch();
        const mapLeft = dmp.diff_linesToChars_(left, right);
        const diffLeftSrc = dmp.diff_main(mapLeft.chars1, mapLeft.chars2, false);
        dmp.diff_charsToLines_(diffLeftSrc, mapLeft.lineArray);
        return diffLeftSrc;
    }
    function getJsonDiff(a: object, b: object) {
        return getDiff(JSON.stringify(a, null, 2), JSON.stringify(b, null, 2));
    }
    function apply() {
        if (mode == "A") return callback(docA._rev, null);
        if (mode == "B") return callback(docB._rev, null);
        if (mode == "BA") return callback(null, JSON.stringify(objBA, null, 2));
        if (mode == "AB") return callback(null, JSON.stringify(objAB, null, 2));
        callback(null, null);
    }
    $: {
        if (docs && docs.length >= 1) {
            if (docs[0].mtime < docs[1].mtime) {
                docA = docs[0];
                docB = docs[1];
            } else {
                docA = docs[1];
                docB = docs[0];
            }
            docAContent = docToString(docA);
            docBContent = docToString(docB);

            try {
                objA = false;
                objB = false;
                objA = JSON.parse(docAContent);
                objB = JSON.parse(docBContent);
                objAB = mergeObject(objA, objB);
                objBA = mergeObject(objB, objA);
                if (JSON.stringify(objAB) == JSON.stringify(objBA)) {
                    objBA = false;
                }
            } catch (ex) {
                objBA = false;
                objAB = false;
            }
        }
    }
    $: mergedObjs = {
        "": false,
        A: objA,
        B: objB,
        AB: objAB,
        BA: objBA,
    };

    $: selectedObj = mode in mergedObjs ? mergedObjs[mode] : {};
    $: {
        diffs = getJsonDiff(objA, selectedObj);
        console.dir(selectedObj);
    }
</script>

<h1>Conflicted settings</h1>
<div><span>{filename}</span></div>
{#if !docA || !docB}
    <div class="message">Just for a minute, please!</div>
    <div class="buttons">
        <button on:click={apply}>Dismiss</button>
    </div>
{:else}
    <div class="options">
        {#each modes as m}
            {#if m[0] == "" || mergedObjs[m[0]] != false}
                <label class={`sls-setting-label ${m[0] == mode ? "selected" : ""}`}
                    ><input type="radio" name="disp" bind:group={mode} value={m[0]} class="sls-setting-tab" />
                    <div class="sls-setting-menu-btn">{m[1]}</div></label
                >
            {/if}
        {/each}
    </div>

    {#if selectedObj != false}
        <div class="op-scrollable json-source">
            {#each diffs as diff}
                <span class={diff[0] == DIFF_DELETE ? "deleted" : diff[0] == DIFF_INSERT ? "added" : "normal"}>{diff[1]}</span>
            {/each}
        </div>
    {:else}
        NO PREVIEW
    {/if}
    <div>
        A Rev:{revStringToRevNumber(docA._rev)} ,{new Date(docA.mtime).toLocaleString()}
        {docAContent.length} letters
    </div>

    <div>
        B Rev:{revStringToRevNumber(docB._rev)} ,{new Date(docB.mtime).toLocaleString()}
        {docBContent.length} letters
    </div>

    <div class="buttons">
        <button on:click={apply}>Apply</button>
    </div>
{/if}

<style>
    .deleted {
        text-decoration: line-through;
    }
    * {
        box-sizing: border-box;
    }

    .scroller {
        display: flex;
        flex-direction: column;
        overflow-y: scroll;
        max-height: 60vh;
        user-select: text;
    }
    .json-source {
        white-space: pre;
        height: auto;
        overflow: auto;
        min-height: var(--font-ui-medium);
        flex-grow: 1;
    }
</style>
37 src/LiveSyncCommands.ts Normal file
@@ -0,0 +1,37 @@
import { AnyEntry, DocumentID, EntryDoc, EntryHasPath, FilePath, FilePathWithPrefix } from "./lib/src/types";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import type ObsidianLiveSyncPlugin from "./main";

export abstract class LiveSyncCommands {
    plugin: ObsidianLiveSyncPlugin;
    get app() {
        return this.plugin.app;
    }
    get settings() {
        return this.plugin.settings;
    }
    get localDatabase() {
        return this.plugin.localDatabase;
    }
    id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
        return this.plugin.id2path(id, entry, stripPrefix);
    }
    async path2id(filename: FilePathWithPrefix | FilePath, prefix?: string): Promise<DocumentID> {
        return await this.plugin.path2id(filename, prefix);
    }
    getPath(entry: AnyEntry): FilePathWithPrefix {
        return this.plugin.getPath(entry);
    }

    constructor(plugin: ObsidianLiveSyncPlugin) {
        this.plugin = plugin;
    }
    abstract onunload(): void;
    abstract onload(): void | Promise<void>;
    abstract onInitializeDatabase(showNotice: boolean): void | Promise<void>;
    abstract beforeReplicate(showNotice: boolean): void | Promise<void>;
    abstract onResume(): void | Promise<void>;
    abstract parseReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>): Promise<boolean> | boolean;
    abstract realizeSettingSyncMode(): Promise<void>;
}
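This abstract class is the add-on pattern the two new Cmd* files above follow: each feature subclasses LiveSyncCommands and plugs into the plugin's lifecycle hooks. A minimal sketch of a conforming subclass; the class itself is hypothetical:

```ts
import { EntryDoc } from "./lib/src/types";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { LiveSyncCommands } from "./LiveSyncCommands";

export class ExampleAddOn extends LiveSyncCommands {
    onload() {
        // Register commands via this.plugin.addCommand(...) here.
    }
    onunload() { }
    onInitializeDatabase(showNotice: boolean) { }
    beforeReplicate(showNotice: boolean) { }
    onResume() { }
    parseReplicationResultItem(doc: PouchDB.Core.ExistingDocument<EntryDoc>) {
        return false; // true would mean "handled, do not process as a file".
    }
    async realizeSettingSyncMode() { }
}
```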
@@ -1,173 +0,0 @@
import { requestUrl, RequestUrlParam, RequestUrlResponse } from "obsidian";
import { KeyValueDatabase, OpenKeyValueDatabase } from "./KeyValueDB.js";
import { LocalPouchDBBase } from "./lib/src/LocalPouchDBBase.js";
import { Logger } from "./lib/src/logger.js";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { EntryDoc, LOG_LEVEL } from "./lib/src/types.js";
import { enableEncryption } from "./lib/src/utils.js";
import { isCloudantURI, isValidRemoteCouchDBURI } from "./lib/src/utils_couchdb.js";
import { id2path, path2id } from "./utils.js";

export class LocalPouchDB extends LocalPouchDBBase {
    kvDB: KeyValueDatabase;
    id2path(filename: string): string {
        return id2path(filename);
    }
    path2id(filename: string): string {
        return path2id(filename);
    }
    CreatePouchDBInstance<T>(name?: string, options?: PouchDB.Configuration.DatabaseConfiguration): PouchDB.Database<T> {
        return new PouchDB(name, options);
    }
    beforeOnUnload(): void {
        this.kvDB.close();
    }
    onClose(): void {
        this.kvDB.close();
    }
    async onInitializeDatabase(): Promise<void> {
        this.kvDB = await OpenKeyValueDatabase(this.dbname + "-livesync-kv");
    }
    async onResetDatabase(): Promise<void> {
        await this.kvDB.destroy();
    }

    last_successful_post = false;
    getLastPostFailedBySize() {
        return !this.last_successful_post;
    }
    async fetchByAPI(request: RequestUrlParam): Promise<RequestUrlResponse> {
        const ret = await requestUrl(request);
        if (ret.status - (ret.status % 100) !== 200) {
            const er: Error & { status?: number } = new Error(`Request Error:${ret.status}`);
            if (ret.json) {
                er.message = ret.json.reason;
                er.name = `${ret.json.error ?? ""}:${ret.json.message ?? ""}`;
            }
            er.status = ret.status;
            throw er;
        }
        return ret;
    }

    async connectRemoteCouchDB(uri: string, auth: { username: string; password: string }, disableRequestURI: boolean, passphrase: string | boolean): Promise<string | { db: PouchDB.Database<EntryDoc>; info: PouchDB.Core.DatabaseInfo }> {
        if (!isValidRemoteCouchDBURI(uri)) return "Remote URI is not valid";
        if (uri.toLowerCase() != uri) return "Remote URI and database name could not contain capital letters.";
        if (uri.indexOf(" ") !== -1) return "Remote URI and database name could not contain spaces.";
        let authHeader = "";
        if (auth.username && auth.password) {
            const utf8str = String.fromCharCode.apply(null, new TextEncoder().encode(`${auth.username}:${auth.password}`));
            const encoded = window.btoa(utf8str);
            authHeader = "Basic " + encoded;
        } else {
            authHeader = "";
        }
        // const _this = this;

        const conf: PouchDB.HttpAdapter.HttpAdapterConfiguration = {
            adapter: "http",
            auth,
            fetch: async (url: string | Request, opts: RequestInit) => {
                let size = "";
                const localURL = url.toString().substring(uri.length);
                const method = opts.method ?? "GET";
                if (opts.body) {
                    const opts_length = opts.body.toString().length;
                    if (opts_length > 1000 * 1000 * 10) {
                        // over 10MB
                        if (isCloudantURI(uri)) {
                            this.last_successful_post = false;
                            Logger("This request should fail on IBM Cloudant.", LOG_LEVEL.VERBOSE);
                            throw new Error("This request should fail on IBM Cloudant.");
                        }
                    }
                    size = ` (${opts_length})`;
                }

                if (!disableRequestURI && typeof url == "string" && typeof (opts.body ?? "") == "string") {
                    const body = opts.body as string;

                    const transformedHeaders = { ...(opts.headers as Record<string, string>) };
                    if (authHeader != "") transformedHeaders["authorization"] = authHeader;
                    delete transformedHeaders["host"];
                    delete transformedHeaders["Host"];
                    delete transformedHeaders["content-length"];
                    delete transformedHeaders["Content-Length"];
                    const requestParam: RequestUrlParam = {
                        url: url as string,
                        method: opts.method,
                        body: body,
                        headers: transformedHeaders,
                        contentType: "application/json",
                        // contentType: opts.headers,
                    };

                    try {
                        const r = await this.fetchByAPI(requestParam);
                        if (method == "POST" || method == "PUT") {
                            this.last_successful_post = r.status - (r.status % 100) == 200;
                        } else {
                            this.last_successful_post = true;
                        }
                        Logger(`HTTP:${method}${size} to:${localURL} -> ${r.status}`, LOG_LEVEL.DEBUG);

                        return new Response(r.arrayBuffer, {
                            headers: r.headers,
                            status: r.status,
                            statusText: `${r.status}`,
                        });
                    } catch (ex) {
                        Logger(`HTTP:${method}${size} to:${localURL} -> failed`, LOG_LEVEL.VERBOSE);
                        // limit only in bulk_docs.
                        if (url.toString().indexOf("_bulk_docs") !== -1) {
                            this.last_successful_post = false;
                        }
                        Logger(ex);
                        throw ex;
                    }
                }

                // -old implementation

                try {
                    const response: Response = await fetch(url, opts);
                    if (method == "POST" || method == "PUT") {
                        this.last_successful_post = response.ok;
                    } else {
                        this.last_successful_post = true;
                    }
                    Logger(`HTTP:${method}${size} to:${localURL} -> ${response.status}`, LOG_LEVEL.DEBUG);
                    return response;
                } catch (ex) {
                    Logger(`HTTP:${method}${size} to:${localURL} -> failed`, LOG_LEVEL.VERBOSE);
                    // limit only in bulk_docs.
                    if (url.toString().indexOf("_bulk_docs") !== -1) {
                        this.last_successful_post = false;
                    }
                    Logger(ex);
                    throw ex;
                }
                // return await fetch(url, opts);
            },
        };

        const db: PouchDB.Database<EntryDoc> = new PouchDB<EntryDoc>(uri, conf);
        if (passphrase && typeof passphrase === "string") {
            enableEncryption(db, passphrase);
        }
        try {
            const info = await db.info();
            return { db: db, info: info };
        } catch (ex) {
            let msg = `${ex.name}:${ex.message}`;
            if (ex.name == "TypeError" && ex.message == "Failed to fetch") {
                msg += "\n**Note** This error caused by many reasons. The only sure thing is you didn't touch the server.\nTo check details, open inspector.";
            }
            Logger(ex, LOG_LEVEL.VERBOSE);
            return msg;
        }
    }
}
@@ -1,21 +1,17 @@
-import { App, Modal } from "obsidian";
-import { escapeStringToHTML } from "./lib/src/utils";
+import { App, Modal } from "./deps";
+import { logMessageStore } from "./lib/src/stores";
+import { escapeStringToHTML } from "./lib/src/strbin";
 import ObsidianLiveSyncPlugin from "./main";

 export class LogDisplayModal extends Modal {
     plugin: ObsidianLiveSyncPlugin;
     logEl: HTMLDivElement;
+    unsubscribe: () => void;
     constructor(app: App, plugin: ObsidianLiveSyncPlugin) {
         super(app);
         this.plugin = plugin;
     }
-    updateLog() {
-        let msg = "";
-        for (const v of this.plugin.logMessage) {
-            msg += escapeStringToHTML(v) + "<br>";
-        }
-        this.logEl.innerHTML = msg;
-    }

     onOpen() {
         const { contentEl } = this;
@@ -25,13 +21,18 @@ export class LogDisplayModal extends Modal {
         div.addClass("op-scrollable");
         div.addClass("op-pre");
         this.logEl = div;
-        this.updateLog = this.updateLog.bind(this);
-        this.plugin.addLogHook = this.updateLog;
-        this.updateLog();
+        this.unsubscribe = logMessageStore.observe((e) => {
+            let msg = "";
+            for (const v of e) {
+                msg += escapeStringToHTML(v) + "<br>";
+            }
+            this.logEl.innerHTML = msg;
+        })
+        logMessageStore.invalidate();
     }
     onClose() {
         const { contentEl } = this;
         contentEl.empty();
-        this.plugin.addLogHook = null;
+        if (this.unsubscribe) this.unsubscribe();
     }
 }
File diff suppressed because it is too large
@@ -1,8 +1,9 @@
 <script lang="ts">
-    import ObsidianLiveSyncPlugin from "./main";
     import { onMount } from "svelte";
     import { DevicePluginList, PluginDataEntry } from "./types";
-    import { versionNumberString2Number } from "./lib/src/utils";
+    import { versionNumberString2Number } from "./lib/src/strbin";
+    import ObsidianLiveSyncPlugin from "./main";
+    import { PluginAndTheirSettings } from "./CmdPluginAndTheirSettings";

     type JudgeResult = "" | "NEWER" | "EVEN" | "EVEN_BUT_DIFFERENT" | "OLDER" | "REMOTE_ONLY";
@@ -21,6 +22,13 @@
     let showOwnPlugins = false;
     let targetList: { [key: string]: boolean } = {};

+    let addOn: PluginAndTheirSettings;
+    $: {
+        const f = plugin.addOns.filter((e) => e instanceof PluginAndTheirSettings);
+        if (f && f.length > 0) {
+            addOn = f[0] as PluginAndTheirSettings;
+        }
+    }
     function saveTargetList() {
         window.localStorage.setItem("ols-plugin-targetlist", JSON.stringify(targetList));
     }
@@ -39,7 +47,7 @@
     }

     async function updateList() {
-        let x = await plugin.getPluginList();
+        let x = await addOn.getPluginList();
         ownPlugins = x.thisDevicePlugins;
         plugins = Object.values(x.allPlugins);
         let targetListItems = Array.from(new Set(plugins.map((e) => e.deviceVaultName + "---" + e.manifest.id)));
@@ -62,7 +70,13 @@
     if (!(p.deviceVaultName in deviceAndPlugins)) {
         deviceAndPlugins[p.deviceVaultName] = [];
     }
-    let dispInfo: PluginDataEntryDisp = { ...p, versionInfo: "", mtimeInfo: "", versionFlag: "", mtimeFlag: "" };
+    let dispInfo: PluginDataEntryDisp = {
+        ...p,
+        versionInfo: "",
+        mtimeInfo: "",
+        versionFlag: "",
+        mtimeFlag: "",
+    };
     dispInfo.versionInfo = p.manifest.version;
     let x = new Date().getTime() / 1000;
     let mtime = p.mtime / 1000;
@@ -157,7 +171,7 @@
     async function sweepPlugins() {
         //@ts-ignore
         await plugin.app.plugins.loadManifests();
-        await plugin.sweepPlugin(true);
+        await addOn.sweepPlugin(true);
         updateList();
     }
@@ -169,9 +183,9 @@
     const entry = deviceAndPlugins[deviceAndVault].find((e) => e.manifest.id == id);
     if (entry) {
         if (opt == "plugin") {
-            if (entry.versionFlag != "EVEN") await plugin.applyPlugin(entry);
+            if (entry.versionFlag != "EVEN") await addOn.applyPlugin(entry);
         } else if (opt == "setting") {
-            if (entry.mtimeFlag != "EVEN") await plugin.applyPluginData(entry);
+            if (entry.mtimeFlag != "EVEN") await addOn.applyPluginData(entry);
         }
     }
@@ -179,12 +193,12 @@
     }
     //@ts-ignore
     await plugin.app.plugins.loadManifests();
-    await plugin.sweepPlugin(true);
+    await addOn.sweepPlugin(true);
     updateList();
     }

     async function checkUpdates() {
-        await plugin.checkPluginUpdate();
+        await addOn.checkPluginUpdate();
     }
     async function replicateAndRefresh() {
         await plugin.replicate(true);
172 src/StorageEventManager.ts Normal file
@@ -0,0 +1,172 @@
|
||||
import { Plugin_2, TAbstractFile, TFile, TFolder } from "./deps";
|
||||
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
|
||||
import { getGlobalStore } from "./lib/src/store";
|
||||
import { FilePath, ObsidianLiveSyncSettings } from "./lib/src/types";
|
||||
import { FileEventItem, FileEventType, FileInfo, InternalFileInfo, queueItem } from "./types";
|
||||
import { recentlyTouched } from "./utils";
|
||||
|
||||
|
||||
export abstract class StorageEventManager {
|
||||
abstract fetchEvent(): FileEventItem | false;
|
||||
abstract cancelRelativeEvent(item: FileEventItem): void;
|
||||
abstract getQueueLength(): number;
|
||||
}
|
||||
|
||||
type LiveSyncForStorageEventManager = Plugin_2 &
|
||||
{
|
||||
settings: ObsidianLiveSyncSettings
|
||||
} & {
|
||||
isTargetFile: (file: string | TAbstractFile) => boolean,
|
||||
procFileEvent: (applyBatch?: boolean) => Promise<boolean>
|
||||
};
|
||||
|
||||
|
||||
export class StorageEventManagerObsidian extends StorageEventManager {
|
||||
plugin: LiveSyncForStorageEventManager;
|
||||
queuedFilesStore = getGlobalStore("queuedFiles", { queuedItems: [] as queueItem[], fileEventItems: [] as FileEventItem[] });
|
||||
|
||||
watchedFileEventQueue = [] as FileEventItem[];
|
||||
|
||||
constructor(plugin: LiveSyncForStorageEventManager) {
|
||||
super();
|
||||
this.plugin = plugin;
|
||||
this.watchVaultChange = this.watchVaultChange.bind(this);
|
||||
this.watchVaultCreate = this.watchVaultCreate.bind(this);
|
||||
this.watchVaultDelete = this.watchVaultDelete.bind(this);
|
||||
this.watchVaultRename = this.watchVaultRename.bind(this);
|
||||
this.watchVaultRawEvents = this.watchVaultRawEvents.bind(this);
|
||||
plugin.registerEvent(app.vault.on("modify", this.watchVaultChange));
|
||||
plugin.registerEvent(app.vault.on("delete", this.watchVaultDelete));
|
||||
plugin.registerEvent(app.vault.on("rename", this.watchVaultRename));
|
||||
plugin.registerEvent(app.vault.on("create", this.watchVaultCreate));
|
||||
//@ts-ignore : Internal API
|
||||
plugin.registerEvent(app.vault.on("raw", this.watchVaultRawEvents));
|
||||
}
|
||||
|
||||
watchVaultCreate(file: TAbstractFile, ctx?: any) {
|
||||
this.appendWatchEvent([{ type: "CREATE", file }], ctx);
|
||||
}
|
||||
|
||||
watchVaultChange(file: TAbstractFile, ctx?: any) {
|
||||
this.appendWatchEvent([{ type: "CHANGED", file }], ctx);
|
||||
}
|
||||
|
||||
watchVaultDelete(file: TAbstractFile, ctx?: any) {
|
||||
this.appendWatchEvent([{ type: "DELETE", file }], ctx);
|
||||
}
|
||||
watchVaultRename(file: TAbstractFile, oldFile: string, ctx?: any) {
|
||||
if (file instanceof TFile) {
|
||||
this.appendWatchEvent([
|
||||
{ type: "DELETE", file: { path: oldFile as FilePath, mtime: file.stat.mtime, ctime: file.stat.ctime, size: file.stat.size, deleted: true } },
|
||||
{ type: "CREATE", file },
|
||||
], ctx);
|
||||
}
|
||||
}
|
||||
// Watch raw events (Internal API)
|
||||
watchVaultRawEvents(path: FilePath) {
|
||||
if (!this.plugin.settings.syncInternalFiles) return;
|
||||
if (!this.plugin.settings.watchInternalFileChanges) return;
|
||||
if (!path.startsWith(app.vault.configDir)) return;
|
||||
const ignorePatterns = this.plugin.settings.syncInternalFilesIgnorePatterns
|
||||
.replace(/\n| /g, "")
|
||||
.split(",").filter(e => e).map(e => new RegExp(e, "i"));
|
||||
        if (ignorePatterns.some(e => path.match(e))) return;
        this.appendWatchEvent(
            [{
                type: "INTERNAL",
                file: { path, mtime: 0, ctime: 0, size: 0 }
            }], null);
    }

    // Cache the file content and queue the event until it can be processed.
    async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
        let forcePerform = false;
        for (const param of params) {
            if (shouldBeIgnored(param.file.path)) {
                continue;
            }
            const atomicKey = [0, 0, 0, 0, 0, 0].map(e => `${Math.floor(Math.random() * 100000)}`).join("-");
            const type = param.type;
            const file = param.file;
            const oldPath = param.oldPath;
            if (file instanceof TFolder) continue;
            if (!this.plugin.isTargetFile(file.path)) continue;
            if (this.plugin.settings.suspendFileWatching) continue;

            let cache: null | string | ArrayBuffer;
            // A new file, or something has changed: cache the content.
            if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
                if (recentlyTouched(file)) {
                    continue;
                }
                if (!isPlainText(file.name)) {
                    cache = await app.vault.readBinary(file);
                } else {
                    // cache = await this.app.vault.read(file);
                    cache = await app.vault.cachedRead(file);
                    if (!cache) cache = await app.vault.read(file);
                }
            }
            if (type == "DELETE" || type == "RENAME") {
                forcePerform = true;
            }

            if (this.plugin.settings.batchSave) {
                // If the latest queued event for the same file has the same type, drop the older one:
                // a.md MODIFY <- this should be cancelled when a.md is MODIFIED again
                // b.md MODIFY <- this should be cancelled when b.md is MODIFIED again
                // a.md MODIFY
                // a.md CREATE
                // :
                let i = this.watchedFileEventQueue.length;
                L1:
                while (i >= 0) {
                    i--;
                    if (i < 0) break L1;
                    if (this.watchedFileEventQueue[i].args.file.path != file.path) {
                        continue L1;
                    }
                    if (this.watchedFileEventQueue[i].type != type) break L1;
                    this.watchedFileEventQueue.remove(this.watchedFileEventQueue[i]);
                    //this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
                    this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
                }
            }

            const fileInfo = file instanceof TFile ? {
                ctime: file.stat.ctime,
                mtime: file.stat.mtime,
                file: file,
                path: file.path,
                size: file.stat.size
            } as FileInfo : file as InternalFileInfo;
            this.watchedFileEventQueue.push({
                type,
                args: {
                    file: fileInfo,
                    oldPath,
                    cache,
                    ctx
                },
                key: atomicKey
            })
        }
        // this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
        this.plugin.procFileEvent(forcePerform);
    }
    fetchEvent(): FileEventItem | false {
        if (this.watchedFileEventQueue.length == 0) return false;
        const item = this.watchedFileEventQueue.shift();
        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
        return item;
    }
    cancelRelativeEvent(item: FileEventItem) {
        this.watchedFileEventQueue = [...this.watchedFileEventQueue].filter(e => e.key != item.key);
        this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
    }
    getQueueLength() {
        return this.watchedFileEventQueue.length;
    }
}
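To make the batch-save de-duplication rule above easier to follow, here is a minimal, self-contained sketch of the same queue-collapsing logic (the `QueuedEvent` shape and the `dedupeQueue` name are invented for the example; they are not the plugin's actual types):

```ts
type QueuedEvent = { type: string; path: string; key: string };

// Walk the queue from the newest entry backwards: drop older events for the
// same path while they share the incoming event's type, and stop at the
// first event of a different type, mirroring the loop labelled L1 above.
function dedupeQueue(queue: QueuedEvent[], incoming: QueuedEvent): QueuedEvent[] {
    const result = [...queue];
    for (let i = result.length - 1; i >= 0; i--) {
        if (result[i].path !== incoming.path) continue;
        if (result[i].type !== incoming.type) break;
        result.splice(i, 1);
    }
    result.push(incoming);
    return result;
}

// The older CHANGED event for a.md is superseded by the newer one.
console.log(dedupeQueue(
    [{ type: "CHANGED", path: "a.md", key: "1" }],
    { type: "CHANGED", path: "a.md", key: "2" },
));
```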
11 src/deps.ts Normal file
@@ -0,0 +1,11 @@
import { FilePath } from "./lib/src/types";

export {
    addIcon, App, DataWriteOptions, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginManifest,
    PluginSettingTab, Plugin_2, requestUrl, RequestUrlParam, RequestUrlResponse, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
} from "obsidian";
import {
    normalizePath as normalizePath_
} from "obsidian";
const normalizePath = normalizePath_ as <T extends string | FilePath>(from: T) => T;
export { normalizePath }
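The point of the cast above is that `normalizePath` keeps the caller's specific (branded) path type instead of widening it to `string`. A hedged illustration, with `FilePath` stubbed as a branded string (the real definition lives in `src/lib/src/types`):

```ts
// Stand-in for the branded type from src/lib/src/types.
type FilePath = string & { _brand: "FilePath" };

// The narrowed signature from deps.ts: the return type follows the argument.
declare function normalizePath<T extends string | FilePath>(from: T): T;

const p = "notes//daily/today.md" as FilePath;
const normalized: FilePath = normalizePath(p); // still a FilePath, not a plain string
```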
@@ -1,4 +1,4 @@
import { App, FuzzySuggestModal, Modal, Setting } from "obsidian";
import { App, FuzzySuggestModal, Modal, Setting } from "./deps";
import ObsidianLiveSyncPlugin from "./main";

//@ts-ignore
@@ -52,14 +52,14 @@ export class InputStringDialog extends Modal {
        const { contentEl } = this;

        contentEl.createEl("h1", { text: this.title });

        new Setting(contentEl).setName(this.key).addText((text) =>
        // Use a form element so that pressing Enter submits.
        const formEl = contentEl.createEl("form");
        new Setting(formEl).setName(this.key).addText((text) =>
            text.onChange((value) => {
                this.result = value;
            })
        );

        new Setting(contentEl).addButton((btn) =>
        new Setting(formEl).addButton((btn) =>
            btn
                .setButtonText("Ok")
                .setCta()
2 src/lib
Submodule src/lib updated: af6bbef6fa...f5db618612
2843 src/main.ts
File diff suppressed because it is too large.
48 src/types.ts
@@ -1,5 +1,5 @@
import { PluginManifest } from "obsidian";
import { DatabaseEntry } from "./lib/src/types";
import { PluginManifest, TFile } from "./deps";
import { DatabaseEntry, EntryBody, FilePath } from "./lib/src/types";

export interface PluginDataEntry extends DatabaseEntry {
    deviceVaultName: string;
@@ -24,9 +24,51 @@ export interface DevicePluginList {
export const PERIODIC_PLUGIN_SWEEP = 60;

export interface InternalFileInfo {
    path: string;
    path: FilePath;
    mtime: number;
    ctime: number;
    size: number;
    deleted?: boolean;
}

export interface FileInfo {
    path: FilePath;
    mtime: number;
    ctime: number;
    size: number;
    deleted?: boolean;
    file: TFile;
}

export type queueItem = {
    entry: EntryBody;
    missingChildren: string[];
    timeout?: number;
    done?: boolean;
    warned?: boolean;
};

export type CacheData = string | ArrayBuffer;
export type FileEventType = "CREATE" | "DELETE" | "CHANGED" | "RENAME" | "INTERNAL";
export type FileEventArgs = {
    file: FileInfo | InternalFileInfo;
    cache?: CacheData;
    oldPath?: string;
    ctx?: any;
}
export type FileEventItem = {
    type: FileEventType,
    args: FileEventArgs,
    key: string,
}

export const CHeader = "h:";
export const PSCHeader = "ps:";
export const PSCHeaderEnd = "ps;";
export const ICHeader = "i:";
export const ICHeaderEnd = "i;";
export const ICHeaderLength = ICHeader.length;

export const FileWatchEventQueueMax = 10;
export const configURIBase = "obsidian://setuplivesync?settings=";
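As a rough sketch of how these types fit together, a queued event for a hidden (internal) file might be assembled like this (all values are invented for the example):

```ts
import type { FileEventItem, InternalFileInfo } from "./types";
import type { FilePath } from "./lib/src/types";

const info: InternalFileInfo = {
    path: ".obsidian/app.json" as FilePath, // cast needed: FilePath is a branded string
    mtime: Date.now(),
    ctime: Date.now(),
    size: 123,
};

const item: FileEventItem = {
    type: "INTERNAL", // one of the FileEventType literals
    args: { file: info },
    key: "example-key",
};
```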
274 src/utils.ts
@@ -1,52 +1,84 @@
import { normalizePath } from "obsidian";
import { DataWriteOptions, normalizePath, TFile, Platform, TAbstractFile, App, Plugin_2 } from "./deps";
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";

import { path2id_base, id2path_base } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { AnyEntry, DocumentID, EntryHasPath, FilePath, FilePathWithPrefix, LOG_LEVEL } from "./lib/src/types";
import { CHeader, ICHeader, ICHeaderLength, PSCHeader } from "./types";
import { InputStringDialog, PopoverSelectString } from "./dialogs";

// For backward compatibility, the path is used to determine the ID.
// Only IDs unacceptable to CouchDB (those starting with an underscore) are prefixed with "/".
// The first slash is removed when the path is normalized.
export function path2id(filename: string): string {
    const x = normalizePath(filename);
    return path2id_base(x);
export async function path2id(filename: FilePathWithPrefix | FilePath, obfuscatePassphrase: string | false): Promise<DocumentID> {
    const temp = filename.split(":");
    const path = temp.pop();
    const normalizedPath = normalizePath(path as FilePath);
    temp.push(normalizedPath);
    const fixedPath = temp.join(":") as FilePathWithPrefix;

    const out = await path2id_base(fixedPath, obfuscatePassphrase);
    return out;
}
export function id2path(filename: string): string {
    return id2path_base(normalizePath(filename));
export function id2path(id: DocumentID, entry?: EntryHasPath): FilePathWithPrefix {
    const filename = id2path_base(id, entry);
    const temp = filename.split(":");
    const path = temp.pop();
    const normalizedPath = normalizePath(path as FilePath);
    temp.push(normalizedPath);
    const fixedPath = temp.join(":") as FilePathWithPrefix;
    return fixedPath;
}
export function getPath(entry: AnyEntry) {
    return id2path(entry._id, entry);

}
export function getPathWithoutPrefix(entry: AnyEntry) {
    const f = getPath(entry);
    return stripAllPrefixes(f);
}
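Both directions share one step: split off any `prefix:` part, normalize only the trailing path component, and re-join. A standalone sketch of just that step (`normalizeKeepingPrefix` is an invented name, not an exported function):

```ts
// Normalize only the path component, preserving prefixes such as "i:" or "ps:".
function normalizeKeepingPrefix(filename: string, normalize: (s: string) => string): string {
    const temp = filename.split(":");
    const path = temp.pop() ?? "";
    temp.push(normalize(path));
    return temp.join(":");
}

// With a toy normalizer that collapses duplicate slashes:
// "i:.obsidian//app.json" -> "i:.obsidian/app.json"
console.log(normalizeKeepingPrefix("i:.obsidian//app.json", (s) => s.replace(/\/+/g, "/")));
```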
const triggers: { [key: string]: ReturnType<typeof setTimeout> } = {};
export function setTrigger(key: string, timeout: number, proc: (() => Promise<any> | void)) {
    clearTrigger(key);
    triggers[key] = setTimeout(async () => {
        delete triggers[key];
export function getPathFromTFile(file: TAbstractFile) {
    return file.path as FilePath;
}

const tasks: { [key: string]: ReturnType<typeof setTimeout> } = {};
export function scheduleTask(key: string, timeout: number, proc: (() => Promise<any> | void)) {
    cancelTask(key);
    tasks[key] = setTimeout(async () => {
        delete tasks[key];
        await proc();
    }, timeout);
}
export function clearTrigger(key: string) {
    if (key in triggers) {
        clearTimeout(triggers[key]);
export function cancelTask(key: string) {
    if (key in tasks) {
        clearTimeout(tasks[key]);
        delete tasks[key];
    }
}
export function clearAllTriggers() {
    for (const v in triggers) {
        clearTimeout(triggers[v]);
export function cancelAllTasks() {
    for (const v in tasks) {
        clearTimeout(tasks[v]);
        delete tasks[v];
    }
}
const intervals: { [key: string]: ReturnType<typeof setInterval> } = {};
export function setPeriodic(key: string, timeout: number, proc: (() => Promise<any> | void)) {
    clearPeriodic(key);
export function setPeriodicTask(key: string, timeout: number, proc: (() => Promise<any> | void)) {
    cancelPeriodicTask(key);
    intervals[key] = setInterval(async () => {
        delete intervals[key];
        await proc();
    }, timeout);
}
export function clearPeriodic(key: string) {
export function cancelPeriodicTask(key: string) {
    if (key in intervals) {
        clearInterval(intervals[key]);
        delete intervals[key];
    }
}
export function clearAllPeriodic() {
export function cancelAllPeriodicTask() {
    for (const v in intervals) {
        clearInterval(intervals[v]);
        delete intervals[v];
    }
}
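`scheduleTask` behaves like a keyed, trailing-edge debounce: scheduling again under the same key cancels the pending timer, so only the last call within the window fires. A small usage sketch (the key and the saving function are invented):

```ts
declare function saveSettings(): Promise<void>; // assumed caller-side function

// Only the last re-schedule within 500 ms actually runs.
scheduleTask("save-settings", 500, async () => {
    await saveSettings();
});
```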
@@ -198,3 +230,201 @@ export function applyPatch(from: Record<string | number | symbol, any>, patch: R
    }
    return ret;
}

export function mergeObject(
    objA: Record<string | number | symbol, any>,
    objB: Record<string | number | symbol, any>
) {
    const newEntries = Object.entries(objB);
    const ret: any = { ...objA };
    if (
        typeof objA !== typeof objB ||
        Array.isArray(objA) !== Array.isArray(objB)
    ) {
        return objB;
    }

    for (const [key, v] of newEntries) {
        if (key in ret) {
            const value = ret[key];
            if (
                typeof v !== typeof value ||
                Array.isArray(v) !== Array.isArray(value)
            ) {
                // If the types do not match, replace completely.
                ret[key] = v;
            } else {
                if (
                    typeof v == "object" &&
                    typeof value == "object" &&
                    !Array.isArray(v) &&
                    !Array.isArray(value)
                ) {
                    ret[key] = mergeObject(v, value);
                } else if (
                    typeof v == "object" &&
                    typeof value == "object" &&
                    Array.isArray(v) &&
                    Array.isArray(value)
                ) {
                    ret[key] = [...new Set([...v, ...value])];
                } else {
                    ret[key] = v;
                }
            }
        } else {
            ret[key] = v;
        }
    }
    return Object.entries(ret)
        .sort()
        .reduce((p, [key, value]) => ({ ...p, [key]: value }), {});
}
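`mergeObject` merges recursively, takes the second argument's value on scalar conflicts, unions arrays (second argument's elements first), and returns the top-level keys sorted. A worked example (the output is traced by hand from the code above):

```ts
const merged = mergeObject(
    { keep: 1, list: [1, 2], nested: { a: 1 } },
    { list: [2, 3], nested: { b: 2 } },
);
// -> { keep: 1, list: [2, 3, 1], nested: { a: 1, b: 2 } }
```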
export function flattenObject(obj: Record<string | number | symbol, any>, path: string[] = []): [string, any][] {
    if (typeof (obj) != "object") return [[path.join("."), obj]];
    if (Array.isArray(obj)) return [[path.join("."), JSON.stringify(obj)]];
    const e = Object.entries(obj);
    const ret = []
    for (const [key, value] of e) {
        const p = flattenObject(value, [...path, key]);
        ret.push(...p);
    }
    return ret;
}
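`flattenObject` turns nested objects into dotted key/value pairs and serialises arrays as JSON strings at their own key. For example:

```ts
console.log(flattenObject({ a: { b: 1 }, tags: ["x", "y"] }));
// -> [ [ "a.b", 1 ], [ "tags", "[\"x\",\"y\"]" ] ]
```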
export function modifyFile(file: TFile, data: string | ArrayBuffer, options?: DataWriteOptions) {
    if (typeof (data) === "string") {
        return app.vault.modify(file, data, options);
    } else {
        return app.vault.modifyBinary(file, data, options);
    }
}
export function createFile(path: string, data: string | ArrayBuffer, options?: DataWriteOptions): Promise<TFile> {
    if (typeof (data) === "string") {
        return app.vault.create(path, data, options);
    } else {
        return app.vault.createBinary(path, data, options);
    }
}

export function isValidPath(filename: string) {
    if (Platform.isDesktop) {
        // if(Platform.isMacOS) return isValidFilenameInDarwin(filename);
        if (process.platform == "darwin") return isValidFilenameInDarwin(filename);
        if (process.platform == "linux") return isValidFilenameInLinux(filename);
        return isValidFilenameInWidows(filename);
    }
    if (Platform.isAndroidApp) return isValidFilenameInAndroid(filename);
    if (Platform.isIosApp) return isValidFilenameInDarwin(filename);
    // Fallback
    Logger("Could not determine platform for checking filename", LOG_LEVEL.VERBOSE);
    return isValidFilenameInWidows(filename);
}

let touchedFiles: string[] = [];

export function getAbstractFileByPath(path: FilePath): TAbstractFile | null {
    // Disabled temporarily.
    return app.vault.getAbstractFileByPath(path);
    // // Hidden API, but so useful.
    // // @ts-ignore
    // if ("getAbstractFileByPathInsensitive" in app.vault && (app.vault.adapter?.insensitive ?? false)) {
    //     // @ts-ignore
    //     return app.vault.getAbstractFileByPathInsensitive(path);
    // } else {
    //     return app.vault.getAbstractFileByPath(path);
    // }
}
export function trimPrefix(target: string, prefix: string) {
    return target.startsWith(prefix) ? target.substring(prefix.length) : target;
}

export function touch(file: TFile | FilePath) {
    const f = file instanceof TFile ? file : getAbstractFileByPath(file) as TFile;
    const key = `${f.path}-${f.stat.mtime}-${f.stat.size}`;
    touchedFiles.unshift(key);
    touchedFiles = touchedFiles.slice(0, 100);
}
export function recentlyTouched(file: TFile) {
    const key = `${file.path}-${file.stat.mtime}-${file.stat.size}`;
    if (touchedFiles.indexOf(key) == -1) return false;
    return true;
}
export function clearTouched() {
    touchedFiles = [];
}
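`touch` and `recentlyTouched` form a small self-event filter: a file the plugin has just written is remembered by path, mtime, and size, so the watcher can skip the storage event caused by that very write. A hedged sketch of the intended flow (the wrapper names are invented):

```ts
import { TFile } from "./deps";
import { modifyFile, touch, recentlyTouched } from "./utils";

// Write a file, then mark it so the resulting vault event is recognised as ours.
async function writeWithoutEcho(file: TFile, data: string) {
    await modifyFile(file, data);
    touch(file); // remembers `${path}-${mtime}-${size}`
}

// In the watcher, drop events caused by our own recent writes.
function shouldQueue(file: TFile): boolean {
    return !recentlyTouched(file);
}
```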
/**
 * Returns whether the ID belongs to an internal (hidden) file.
 * @param id ID
 * @returns
 */
export function isIdOfInternalMetadata(id: FilePath | FilePathWithPrefix | DocumentID): boolean {
    return id.startsWith(ICHeader);
}
export function stripInternalMetadataPrefix<T extends FilePath | FilePathWithPrefix | DocumentID>(id: T): T {
    return id.substring(ICHeaderLength) as T;
}
export function id2InternalMetadataId(id: DocumentID): DocumentID {
    return ICHeader + id as DocumentID;
}

// const CHeaderLength = CHeader.length;
export function isChunk(str: string): boolean {
    return str.startsWith(CHeader);
}

export function isPluginMetadata(str: string): boolean {
    return str.startsWith(PSCHeader);
}

export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
    return new Promise((res) => {
        const popover = new PopoverSelectString(app, message, null, null, (result) => res(result as "yes" | "no"));
        popover.open();
    });
};

export const askSelectString = (app: App, message: string, items: string[]): Promise<string> => {
    const getItemsFun = () => items;
    return new Promise((res) => {
        const popover = new PopoverSelectString(app, message, "", getItemsFun, (result) => res(result));
        popover.open();
    });
};

export const askString = (app: App, title: string, key: string, placeholder: string): Promise<string | false> => {
    return new Promise((res) => {
        const dialog = new InputStringDialog(app, title, key, placeholder, (result) => res(result));
        dialog.open();
    });
};
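These dialogue helpers wrap the modal classes into awaitable functions. A usage sketch (the question, items, and labels are invented; note that `askString` resolves to `false` when cancelled):

```ts
import { App } from "./deps";
import { askSelectString, askString } from "./utils";

async function example(app: App) {
    const behaviour = await askSelectString(app, "Initial behaviour for hidden files?",
        ["Merge", "Fetch", "Overwrite"]);
    const name = await askString(app, "Device name", "name", "e.g. laptop");
    if (name === false) return; // the dialogue was cancelled
    console.log(behaviour, name);
}
```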
export class PeriodicProcessor {
    _process: () => Promise<any>;
    _timer?: number;
    _plugin: Plugin_2;
    constructor(plugin: Plugin_2, process: () => Promise<any>) {
        this._plugin = plugin;
        this._process = process;
    }
    async process() {
        try {
            await this._process();
        } catch (ex) {
            Logger(ex);
        }
    }
    enable(interval: number) {
        this.disable();
        if (interval == 0) return;
        this._timer = window.setInterval(() => this._process().then(() => { }), interval);
        this._plugin.registerInterval(this._timer);
    }
    disable() {
        if (this._timer) clearInterval(this._timer);
    }
}
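`PeriodicProcessor` couples a repeating task to the plugin's lifecycle via `registerInterval`, and `enable(0)` leaves it off. A wiring sketch (the interval and the function names are invented):

```ts
import { Plugin_2 } from "./deps";
import { PeriodicProcessor } from "./utils";

function startPeriodicScan(plugin: Plugin_2, scan: () => Promise<void>): PeriodicProcessor {
    const processor = new PeriodicProcessor(plugin, scan);
    processor.enable(60_000); // run every minute; enable(0) keeps it disabled
    return processor;        // call .disable() when it is no longer needed
}
```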
13 styles.css
@@ -85,13 +85,6 @@
} */

.sls-btn-left {
    padding-right: 4px;
}

.sls-btn-right {
    padding-left: 4px;
}

.sls-header-button {
    margin-left: 2em;
@@ -108,7 +101,7 @@
.CodeMirror-wrap::before,
.cm-s-obsidian>.cm-editor::before,
.canvas-wrapper::before {
    content: var(--slsmessage);
    content: attr(data-log);
    text-align: right;
    white-space: pre-wrap;
    position: absolute;
@@ -254,4 +247,8 @@ div.sls-setting-menu-btn {

.sls-setting:not(.isWizard) .wizardOnly {
    display: none;
}

.sls-item-dirty::before {
    content: "✏";
}
97 updates.md
@@ -1,3 +1,31 @@
### 0.18.0

#### Paths of files in the database can now be obfuscated (Experimental Feature)
Before v0.18.0, Self-hosted LiveSync naively used the path of each file to detect and resolve conflicts: the ID of the document stored in CouchDB was simply the filename.
However, this lacked confidentiality. If the credentials of the database leaked, an attacker (or an innocent bystander) could read the paths of our files, so in some environments we could not put anything confidential in a filename.
Since v0.18.0, paths can be obfuscated, so it is no longer possible to decipher the path from the ID. In exchange, this costs a little more CPU load than before, and the data structure has changed a bit.

We can configure `Path Obfuscation` in the `Remote database configuration` pane.
Note: **When changing this configuration, we need to rebuild both the local and the remote databases**.

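As a toy illustration only (this is not the plugin's actual algorithm; the real implementation lives in the `src/lib` submodule), obfuscation means deriving the document ID from the path with a keyed one-way function, so the ID reveals nothing about the path without the passphrase:

```ts
import { createHmac } from "node:crypto";

// Toy sketch: an opaque, deterministic ID derived from path + passphrase.
// One-way: without the passphrase, the path cannot be recovered from the ID.
function toyObfuscatedId(path: string, passphrase: string): string {
    return createHmac("sha256", passphrase).update(path).digest("hex");
}

console.log(toyObfuscatedId("secret-project/plan.md", "my-passphrase"));
```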
#### Minors
- 0.18.1
  - Fixed:
    - Some messages have been fixed (typos).
    - File type detection now works fine!
- 0.18.2
  - Improved:
    - The settings pane has been refined.
    - We can enable `hidden files sync` with one of several initial behaviours: `Merge`, `Fetch` remote, and `Overwrite` remote.
    - No longer `Touch hidden files`.
- 0.18.3
  - Fixed: The pop-up is now correctly shown after hidden file synchronisation.
- 0.18.4
  - Fixed:
    - `Fetch` and `Rebuild database` now work more safely.
    - Case-sensitive renaming now works fine.
      Revoked the logic introduced at #130; however, it looks fine now.

### 0.17.0
- 0.17.0 has no surfaced changes, but the design of saving chunks has been changed. They remain compatible, but changing files after upgrading produces different chunks than before 0.16.x.
  Please rebuild the databases once if you have been worried about storage usage.
@@ -10,51 +38,24 @@
- Chunk ID numbering rules

#### Minors
- 0.17.1
  - Fixed: Now we can verify and repair the database.
  - Refactored inside.

- 0.17.2
  - New feature
    - We can merge conflicted documents automatically if sensible.
  - Fixed
    - Writing to the storage is pended while files have conflicts after replication.

- 0.17.3
  - Now we support canvases! Conflicted JSON files are also synchronised, with their content merged when the merge is obvious.

- 0.17.4
  - Canvases are now treated as a sort of plain text file. Now we transfer only the metadata and the chunks that have differences.
### 0.16.0
- Hidden files no longer need to be scanned. Changes are detected automatically.
  - If you want to return to the previous behaviour, please disable `Monitor changes to internal files`.
  - Due to the use of an internal API, this feature may become unusable after a major update. If this happens, please disable it once.

#### Minors

- 0.16.1 Added missing log updates.
- 0.16.2 Fixed many problems caused by combinations of `Sync On Save` and the tracking logic that changed at 0.15.6.
- 0.16.3
  - Fixed detection of IBM Cloudant (and some issues, if found, are fixed automatically).
  - A configuration information reporting tool has been implemented.
- 0.16.4 Fixed detection failure. Please set the `Chunk size` again when using a self-hosted database.
- 0.16.5
  - Fixed
    - Conflict detection and merging can now handle deleted files.
    - Logs during the boot-up sequence have been tidied up.
    - Fixed incorrect log entries.
  - New Feature
    - Automatic deletion of old expired metadata has been implemented.
      We can configure it via `Delete old metadata of deleted files on start-up` in the `General Settings` pane.
- 0.16.6
  - Fixed
    - Automatic (temporary) batch size adjustment has been restored to work correctly.
    - Chunk splitting has been reverted to the previous behaviour so chunks are saved correctly.
  - Improved
    - Corrupted chunks are now detected automatically.
    - On case-insensitive systems, `aaa.md` and `AAA.md` are now treated as the same file or path when applying changesets.
- 0.16.7 Nothing has changed except for toolsets, the framework library, and the like. Please inform me if anything has become strange!
- 0.16.8 Now we can synchronise without `bad_request:invalid UTF-8 JSON` even while end-to-end encryption is disabled.

Note:
Before 0.16.5, LiveSync had some issues making chunks. In that case, synchronisation would keep failing once a corrupted chunk had been made. Since 0.16.6, corrupted chunks are detected automatically. Sorry for troubling you, but please do `rebuild everything` when this plug-in notifies you to do so.
- __0.17.1 to 0.17.30 have been moved into `updates_old.md`__
- 0.17.31
  - Fixed:
    - Now `redflag3` runs reliably.
    - Synchronisation can now be aborted.
  - Note: The synchronisation flow has been rewritten drastically. Please do not hesitate to inform me if you notice anything.
- 0.17.32
  - Fixed:
    - Periodic internal file scanning now works well.
    - The handler for window-visibility changes has been fixed.
    - And minor fixes are possibly included.
  - Refactored:
    - Unused logic has been removed.
    - Some utility functions have been moved into suitable files.
    - Functions have been renamed.
- 0.17.33
  - Maintenance update: Refactored; the responsibilities that `LocalDatabase` had are now shared. (Hopefully) no changes in behaviour.
- 0.17.34
  - Fixed: The `Fetch` that was broken at 0.17.33 has been fixed.
  - Refactored again: Internal file sync, plug-in sync, and Set up URI have been moved into separate files.
... To continue on to `updates_old.md`.
211 updates_old.md
@@ -1,3 +1,214 @@
### 0.17.0
- 0.17.0 has no surfaced changes, but the design of saving chunks has been changed. They remain compatible, but changing files after upgrading produces different chunks than before 0.16.x.
  Please rebuild the databases once if you have been worried about storage usage.

- Improved:
  - Splitting markdown
  - Saving chunks

- Changed:
  - Chunk ID numbering rules

#### Minors

- 0.17.1
  - Fixed: Now we can verify and repair the database.
  - Refactored inside.

- 0.17.2
  - New feature
    - We can merge conflicted documents automatically if sensible.
  - Fixed
    - Writing to the storage is pended while files have conflicts after replication.

- 0.17.3
  - Now we support canvases! Conflicted JSON files are also synchronised, with their content merged when the merge is obvious.

- 0.17.4
  - Canvases are now treated as a sort of plain text file. Now we transfer only the metadata and the chunks that have differences.

- 0.17.5 Now `read chunks online` has been fixed, and a new feature: `Use dynamic iteration count` to reduce the load of encryption/decryption.
  Note: `Use dynamic iteration count` is not compatible with earlier versions.
- 0.17.6 Renamed or deleted files are now reliably deleted again.
- 0.17.7
  - Fixed:
    - Fixed merging issues.
    - Fixed button styling.
  - Changed:
    - Conflict checking on synchronising is now enabled for every note by default.
- 0.17.8
  - Improved: Performance improved. Prebuilt PouchDB is no longer used.
  - Fixed: Merging hidden files is also fixed.
  - New Feature: Now we can synchronise automatically after merging conflicts.
- 0.17.9
  - Fixed: Conflict merging of internal files is no longer broken.
  - Improved: Smoother status display inside the editor.
- 0.17.10
  - Fixed: Synchronising large files is now handled!
    Note: When synchronising large files, we have to set `Chunk size` lower than 50 and disable `Read chunks online`; `Batch size` should be set to 50-100, and `Batch limit` could be around 20.
- 0.17.11
  - Fixed:
    - Performance improvement.
    - Now `Chunk size` can be set to under one hundred.
  - New feature:
    - The number of transfers required before replication stabilises is now displayed.
- 0.17.12: Skipped.
- 0.17.13
  - Fixed: Document history is now displayed again.
  - Reorganised: Many files have been refactored.
- 0.17.14: Skipped.
- 0.17.15
  - Improved:
    - Confidential information is no longer stored in data.json as-is.
    - Synchronisation progress is now shown in the notification.
    - We can commit passphrases with the keyboard.
    - Configuration which has not been saved yet is now marked.
    - Now the filename is shown on the conflict-resolving dialogue.
  - Fixed:
    - Hidden files are synchronised again.
    - Renaming of files has been fixed again.
  And minor changes have been included.
- 0.17.16:
  - Improved:
    - Plugins and their settings no longer need scanning if changes are monitored.
    - Synchronising plugins and their settings is now performed in parallel, and faster.
    - We can place `redflag2.md` to rebuild the database automatically during the boot sequence.
  - Experimental:
    - We can use a new adapter on PouchDB. This will make things smoother.
      - Note: Not compatible with older versions.
  - Fixed:
    - The default batch size is smaller again.
    - Plugins and their settings can be synchronised again.
    - Hidden files and plugins are correctly scanned while rebuilding.
    - Files whose names start with `_` also undergo conflict checking now.
- 0.17.17
  - Fixed: Now we can merge JSON files even if comparing items such as null previously failed.
- 0.17.18
  - Fixed: Fixed a lack of error handling.
- 0.17.19
  - Fixed: Error reporting has been ensured.
- 0.17.20
  - Improved: Changes to hidden files are now notified to Obsidian.
- 0.17.21
  - Fixed: Skip patterns now handle capital letters.
  - Improved
    - New configuration to avoid exceeding throttle capacity. We are grateful to @karasevm!
    - The conflicted `data.json` is no longer merged automatically.
      - This behaviour is not configurable, unlike `Use newer file if conflicted` for normal files.
- 0.17.22
  - Fixed:
    - Hidden files are no longer synchronised when not configured to be.
    - Some processes could start without waiting for synchronisation to complete; now they wait.
  - Improved
    - Now, by placing `redflag3.md`, we can discard the local database and fetch again.
    - The documentation has been updated! Thanks to @hilsonp!
- 0.17.23
  - Improved:
    - Now we can preserve the logs in a file.
      - Note: This option is also enabled automatically when a red flag is raised.
    - File names can now be made platform-appropriate.
  - Refactored:
    - Some redundant implementations have been sorted out.
- 0.17.24
  - New feature:
    - If any conflicted files have been left, they will be reported.
  - Fixed:
    - Now the name of the conflicting file is shown on the conflict-resolving dialogue.
    - Hidden files can be merged again.
    - No more errors are caused when the plug-in is loaded.
  - Improved:
    - The chunk cache is now limited by the total size of the cached chunks.
- 0.17.25
  - Fixed:
    - Reading errors are now reported.
- 0.17.26
  - Fixed (Urgent):
    - The modified document is now reflected in the storage.
- 0.17.27
  - Improved:
    - Now the filename of the conflicted settings is shown on the merging dialogue.
    - The plugin data can be resolved when conflicted.
    - The semaphore status display has been changed to a count only.
    - Applying changes to the storage is now concurrent, a few files at a time.
- 0.17.28
  - Fixed:
    - Some messages have been refined.
    - The boot sequence has been sped up.
    - Opening the local database multiple times in a short duration has been suppressed.
    - Older migration logic has been removed.
      - Note: If you have used 0.10.0 or lower and have not upgraded, you will need to run 0.17.27 or earlier once, or reinstall Obsidian.
- 0.17.29
  - Fixed:
    - Requests for reading chunks online are now split into a reasonable (and configurable) size.
    - Error messages are no longer shown on Linux devices during hidden file synchronisation.
  - Improved:
    - The interval of reading chunks online is now configurable.
    - The boot sequence has been sped up further.
  - Misc:
    - Messages in the boot sequence are now more detailed. If you want to see them, please enable the verbose log.
    - Logs are kept for 1,000 lines while the verbose log is enabled.
- 0.17.30
  - Implemented:
    - `Resolve all conflicted files` has been implemented.
  - Fixed:
    - Fixed a problem with reading chunks online when a file has more chunks than the concurrency limit.
  - Rolled back:
    - Logs are kept for only 100 lines, again.
### 0.16.0
- Hidden files no longer need to be scanned. Changes are detected automatically.
  - If you want to return to the previous behaviour, please disable `Monitor changes to internal files`.
  - Due to the use of an internal API, this feature may become unusable after a major update. If this happens, please disable it once.

#### Minors

- 0.16.1 Added missing log updates.
- 0.16.2 Fixed many problems caused by combinations of `Sync On Save` and the tracking logic that changed at 0.15.6.
- 0.16.3
  - Fixed detection of IBM Cloudant (and some issues, if found, are fixed automatically).
  - A configuration information reporting tool has been implemented.
- 0.16.4 Fixed detection failure. Please set the `Chunk size` again when using a self-hosted database.
- 0.16.5
  - Fixed
    - Conflict detection and merging can now handle deleted files.
    - Logs during the boot-up sequence have been tidied up.
    - Fixed incorrect log entries.
  - New Feature
    - Automatic deletion of old expired metadata has been implemented.
      We can configure it via `Delete old metadata of deleted files on start-up` in the `General Settings` pane.
- 0.16.6
  - Fixed
    - Automatic (temporary) batch size adjustment has been restored to work correctly.
    - Chunk splitting has been reverted to the previous behaviour so chunks are saved correctly.
  - Improved
    - Corrupted chunks are now detected automatically.
    - On case-insensitive systems, `aaa.md` and `AAA.md` are now treated as the same file or path when applying changesets.
- 0.16.7 Nothing has changed except for toolsets, the framework library, and the like. Please inform me if anything has become strange!
- 0.16.8 Now we can synchronise without `bad_request:invalid UTF-8 JSON` even while end-to-end encryption is disabled.

Note:
Before 0.16.5, LiveSync had some issues making chunks. In that case, synchronisation would keep failing once a corrupted chunk had been made. Since 0.16.6, corrupted chunks are detected automatically. Sorry for troubling you, but please do `rebuild everything` when this plug-in notifies you to do so.

### 0.15.0
- Outdated configuration items have been removed.
- The setup wizard has been implemented!

I appreciate the review and advice, @Pouhon158!

#### Minors
- 0.15.1 Missed the stylesheet.
- 0.15.2 The wizard has been improved and documented!
- 0.15.3 Fixed the issue with locking/unlocking the remote database while rebuilding in the wizard.
- 0.15.4 Fixed issues with asynchronous processing (e.g., conflict checks or hidden file detection).
- 0.15.5 Added new features to make setting up Self-hosted LiveSync easier.
- 0.15.6 File tracking logic has been refined.
- 0.15.7 Fixed a bug with renaming files.
- 0.15.8 Fixed a bug with deleting empty directories, and weird boot-sequence behaviour on mobile devices.
- 0.15.9 Improved chunk retrieval: chunks are now retrieved in batches on continuous requests.
- 0.15.10 Fixed:
  - The boot sequence has been corrected and now boots smoothly.
  - Auto-applying of batch saves is processed earlier than before.

### 0.14.1
- The target selecting filter was implemented.
  Now we can set which files are synchronised using regular expressions.
Block a user