Compare commits

...

115 Commits

Author SHA1 Message Date
vorotamoroz
3cc70b985a bump 2025-11-17 23:30:53 +09:00
vorotamoroz
0e81ec2586 ### Fixed
- Now we can save settings correctly again (#756).
2025-11-17 23:30:16 +09:00
vorotamoroz
bab66a64d7 bump again for release.. 2025-11-17 13:27:26 +09:00
vorotamoroz
477913456f OK. 2025-11-17 13:24:45 +09:00
vorotamoroz
b0661cdbab bump 2025-11-17 13:20:16 +09:00
vorotamoroz
18f9a842b7 ### New feature
- We can now configure hidden file synchronisation to always overwrite with the latest version (#579).

### Fixed
- Timing dependency issues during initialisation have been mitigated (#714)

### Improved
- Error logs now contain stack-traces for better inspection.
2025-11-17 13:18:55 +09:00
vorotamoroz
5130bc5f2a I'm really sorry. Is there any way I can test this locally? 2025-11-17 13:16:50 +09:00
vorotamoroz
ca8af80a27 again. 2025-11-17 13:15:56 +09:00
vorotamoroz
df273d273b Sorry for breaking the existing release... 2025-11-17 13:14:56 +09:00
vorotamoroz
23aa0a82ca fix again.. 2025-11-17 13:09:59 +09:00
vorotamoroz
8f488b205b fix releasing 2025-11-17 13:07:47 +09:00
vorotamoroz
893eac5c92 Move from deprecated 2025-11-17 13:06:26 +09:00
vorotamoroz
cd6946bce2 node 22 to 24 2025-11-17 12:53:11 +09:00
vorotamoroz
174ca08954 Add production switch on environment vars 2025-11-17 12:52:52 +09:00
vorotamoroz
4af4d9c4bd bump 2025-11-12 09:23:49 +00:00
vorotamoroz
1b7a25598a ### Improved
- Now we can switch the database adapter between IndexedDB and IDB without rebuilding (#747).
- The adapter is no longer checked by `Doctor`.

### Changes

- The default adapter is reverted to IDB to avoid memory leaks (#747).

### Fixed (?)

- Reverted the QR code library to v1.4.4 (to be sure about #752).
2025-11-12 09:22:40 +00:00
vorotamoroz
e2a01c14cc Merge branch 'main' of https://github.com/vrtmrz/obsidian-livesync 2025-11-07 09:56:53 +00:00
vorotamoroz
a623b987c8 bump 2025-11-07 09:55:22 +00:00
vorotamoroz
db28b9ec11 ### Improved
- Some JWT notes have been added to the setting dialogue (#742).

### Fixed

- Wrong values are no longer encoded into the QR code.
- We can now see why QR codes have not been generated.

### Refactored

- Some dependencies have been updated.
- Internal functions have been modularised into `octagonal-wheels` packages and are well tested.
- Fixed importing from the parent project in library code (#729).
2025-11-07 09:54:12 +00:00
vorotamoroz
b2fbbb38f5 Fix tip formatting in jwt-on-couchdb.md
Updated tip formatting in JWT on CouchDB documentation.
2025-11-06 18:49:18 +09:00
vorotamoroz
33c01fdf1e bump 2025-11-06 09:43:58 +00:00
vorotamoroz
536c0426d6 Update lib for switch Refiner to Computed 2025-11-06 09:40:37 +00:00
vorotamoroz
2f848878c2 Add doc 2025-11-06 09:24:25 +00:00
vorotamoroz
c4f2baef5e ### Fixed
#### JWT Authentication

- Now we can use JWT Authentication ES512 correctly (#742).
- Several misdirections in the Setting dialogues have been fixed (i.e., seconds and minutes confusion...).
- The key area in the Setting dialogue has been enlarged and accepts newlines correctly.
- Caching of JWT tokens now works correctly
    - Tokens are now cached and reused until they expire.
    - They will be kept until 10% of the expiration duration is remaining or 10 seconds, whichever is longer (but at a maximum of 1 minute).
- JWT settings are now correctly displayed on the Setting dialogue.

#### Other fixes

- Receiving non-latest revisions no longer causes unexpected overwrites.
    - On receiving revisions that made conflicting changes, we are still able to handle them.

### Improved

- Duplicate message notifications are no longer shown when a connection to the remote server fails.
    - Instead, a single notification is shown and kept in the notification area inside the editor until the situation is resolved.
- The notification area is no longer imposing, distracting, and overwhelming.
    - It now has a pale background, with a border and icons.
2025-11-06 09:24:16 +00:00
vorotamoroz
a5b88a8d47 Add tips 2025-11-06 02:10:12 +00:00
vorotamoroz
88e61fb41f Merge branch 'main' of https://github.com/vrtmrz/obsidian-livesync 2025-11-04 11:36:36 +00:00
vorotamoroz
9bf04332bb bump 2025-11-04 11:35:19 +00:00
vorotamoroz
5238dec3f2 fix: Now hidden file synchronisation respects the filters correctly (#631, #735) 2025-11-04 11:34:39 +00:00
vorotamoroz
2b7b411c52 Merge branch 'svelteui' 2025-11-04 11:09:22 +00:00
vorotamoroz
aab0f7f034 Fix Backporting mis 2025-11-04 10:58:46 +00:00
vorotamoroz
b3a0deb0e3 bump 2025-10-31 11:36:55 +01:00
vorotamoroz
b9138d1395 ### Fixed
- We can now fill in the fields of some dialogues correctly on mobile devices.

### New features

- We can now use a TURN server for P2P connections.
2025-10-31 11:33:06 +01:00
vorotamoroz
04997b84c0 Merge pull request #740 from shaielc/_local
Default to _local when node not supplied
2025-10-30 17:58:24 +09:00
vorotamoroz
7eb9807aa5 Fix import path 2025-10-30 09:35:17 +01:00
vorotamoroz
91a4f234f1 bump 2025-10-30 09:30:38 +01:00
vorotamoroz
82f2860938 ### Fixed
- P2P replication is now more robust and stable.

### Breaking changes

- Sending configuration via a Peer-to-Peer connection is not compatible with older versions.
2025-10-30 09:29:51 +01:00
shyelc
41a112cd8a Default to _local when node not supplied 2025-10-30 07:27:52 +00:00
vorotamoroz
294ebf0c31 Add automatic node detection 2025-10-30 13:24:01 +09:00
vorotamoroz
5443317157 merged 2025-10-30 02:38:30 +01:00
vorotamoroz
47fe9d2af3 Adjust method name to actual behaviour 2025-10-30 02:28:18 +01:00
vorotamoroz
4c260a7d2b Merge pull request #723 from GrzybowskiBYD/patch-1
Fix an issue with CouchDB while using couchdb-init.sh
2025-10-29 10:35:00 +09:00
vorotamoroz
82f6fefd35 bump 2025-10-26 19:51:20 +09:00
vorotamoroz
ada8001fcb ### Fixed
- We are now able to enable optional features correctly again (#732).
- Furthermore, oversized files are no longer processed.
  - Before creating a chunk, the file is verified to be an eligible target.
  - The behaviour upon receiving replication has been changed as follows:
    - If the remote file is oversized, it is ignored.
    - If not, but the local file is oversized, it is also ignored.
2025-10-26 19:38:45 +09:00
vorotamoroz
8b81570035 Grammatical fixes 2025-10-22 14:02:52 +01:00
vorotamoroz
d3e50421e4 Merge branch 'svelteui' of https://github.com/vrtmrz/obsidian-livesync into svelteui 2025-10-22 14:02:20 +01:00
vorotamoroz
12605f4604 Add updates 2025-10-22 14:02:00 +01:00
vorotamoroz
2c0dd82886 Add updates 2025-10-22 13:56:24 +01:00
vorotamoroz
f5315aacb8 v0.25.23.beta1
### Fixed (This should be backported to 0.25.22 if the beta phase is prolonged)

- Larger files no longer create chunks during the preparation of `Reset Synchronisation on This Device`.

### Behaviour changes

- Setup wizard is now more `goal-oriented`. Brand-new screens are introduced.
- `Fetch everything` and `Rebuild everything` are now `Reset Synchronisation on This Device` and `Overwrite Server Data with This Device's Files`.
- Remote configuration and E2EE settings are now separated into their own modal dialogues.
- Peer-to-Peer settings are also separated into their own modal dialogue.
- The Setup-URI and the Report for the Issue are no longer copied to the clipboard automatically. Instead, a copy dialogue and buttons let you copy them explicitly.
- Optional features are no longer introduced during setup, `Reset Synchronisation on This Device`, or `Overwrite Server Data with This Device's Files`.
- We can no longer perform `Fetch everything` and `Rebuild everything` (the old names, now removed) without restarting Obsidian.

### Miscellaneous

- Setup QR Code generation is separated into a src/lib/src/API/processSetting.ts file. Please use it as a subrepository if you want to generate QR codes in your own application.
- Setup-URI is also separated into a src/lib/src/API/processSetting.ts
- Some direct access to web-APIs are now wrapped into the services layer.

### Dependency updates

- Many dependencies are updated. Please see `package.json`.
- While upgrading TypeScript, many `Uint8Array<ArrayBuffer>` and `Uint8Array` type mismatches were fixed.
2025-10-22 13:56:15 +01:00
vorotamoroz
5a93066870 bump 2025-10-15 01:02:24 +09:00
vorotamoroz
3a73073505 ### Fixed
- Fixed a bug that caused wrong event bindings and flag inversion (#727)
  - This caused the following issues:
    - In some cases, settings changes were not applied or saved correctly.
    - Automatic synchronisation did not begin correctly.

### Improved
- Very large diffs are no longer shown in the file comparison view, for performance reasons.
2025-10-15 01:00:24 +09:00
vorotamoroz
ee0c0ee611 Add a note so I don't forget 2025-10-13 11:27:03 +09:00
vorotamoroz
d7ea30e304 Merge pull request #726 from vrtmrz/disenchant
Disenchant
2025-10-13 10:40:20 +09:00
vorotamoroz
2b9ded60f7 Add notes 2025-10-13 10:38:32 +09:00
vorotamoroz
40508822cf Bump and add readme 2025-10-13 10:13:45 +09:00
vorotamoroz
6f938d5f54 Update submodule commit reference 2025-10-13 09:48:25 +09:00
vorotamoroz
51dc44bfb0 bump 0.25.21.beta2 2025-10-08 05:01:07 +01:00
vorotamoroz
7c4f2bf78a ### Fixed
- Fixed wrong event type bindings (which caused some events not to be handled correctly).
- Fixed a timing issue detected in StorageEventManager
    - When multiple events for the same file are fired in quick succession, the metadata kept stale information. This caused unexpected, incorrect notifications and write prevention.
2025-10-08 05:00:42 +01:00
Marcin Grzybowski
d82122de24 Update couchdb-init.sh
Fixed the {"error":"nodedown","reason":"nonode@nohost is down"} issue
2025-10-06 15:11:59 +02:00
vorotamoroz
67c9b4cf06 bump for beta 2025-10-06 10:45:59 +01:00
vorotamoroz
4808876968 - Rename methods for automatic binding checking.
- Add automatic binding checks.
2025-10-06 10:40:25 +01:00
vorotamoroz
cccff21ecc Merge branch 'main' into disenchant and run prettier 2025-10-04 17:59:42 +09:00
vorotamoroz
d8415a97e5 Move some dependency to devDependency 2025-10-04 17:49:02 +09:00
vorotamoroz
85e9aa2978 For convenience 2025-10-04 17:20:59 +09:00
vorotamoroz
b4eb0e4868 Improved: copy a dev build to vault folder 2025-10-04 17:12:37 +09:00
vorotamoroz
3ea348f468 Merge pull request #716 from chriscross12324/bugfix/setting-panel-hierarchy
bugfix/setting-panel-hierarchy: Fixed visual inconsistencies
2025-10-04 17:07:25 +09:00
vorotamoroz
81362816d6 disenchanting and dispelling from the nightmarish implicit something
Indeed, even if this changeset is mostly another nightmare. It might be in beta for a while.
2025-10-03 14:59:14 +01:00
Chris Coulthard
d6efe4510f bugfix/setting-panel-hierarchy: Updated styling to improve consistency. Updated various setting panes (Fixed hierarchy issues that caused some titles to overlap rather than bump, and whitespace formatting) 2025-09-28 19:27:14 -07:00
vorotamoroz
ca5a7ae18c bump 2025-09-26 11:42:06 +01:00
vorotamoroz
a27652ac34 ### Fixed
- Chunk fetching no longer reports errors when the fetched chunk could not be saved (#710).
    - The fetched chunk is simply used temporarily.
- Chunk fetching now reports errors when the fetched chunk is definitely corrupted (#710, #712).
- Files that the plug-in itself has modified are no longer detected as changed.
    - This may reduce unnecessary file comparisons and unexpected file states.

### Improved

- The remote database configuration check now respects the CouchDB version (#714).
2025-09-26 11:40:41 +01:00
vorotamoroz
29b89efc47 ## 0.25.19
### Improved
- Encoding/decoding of chunk data and encryption/decryption are now performed with native functions (if available).
2025-09-18 12:29:09 +01:00
vorotamoroz
ef3eef2d08 Bump 2025-09-17 09:07:49 +01:00
vorotamoroz
ffbbe32e36 ### Fixed
- Property encryption detection now works correctly (in Self-hosted LiveSync it was not broken, but as a library it was not working correctly).
- Initialising the chunk splitter is now reliably performed.
- DirectFileManipulator now works fine (as a library).
    - The old `DirectFileManipulatorV1` has been removed.

### Refactored

- Removed some unnecessary intermediate files.
2025-09-17 09:05:16 +01:00
vorotamoroz
0a5371cdee bump 2025-09-16 10:47:00 +01:00
vorotamoroz
466bb142e2 Refactored: removed some unnecessary intermediate file 2025-09-16 10:45:14 +01:00
vorotamoroz
d394a4ce7f ### Fixed
- Information-level logs are no longer produced when toggling `Show only notifications` in the settings (#708).
- Ignore filters for Hidden file sync now work correctly (#709).
2025-09-16 10:30:48 +01:00
vorotamoroz
71ce76e502 Merge pull request #704 from Gron-HD/main
Add Docker Compose troubleshooting for own server setup
2025-09-16 15:33:14 +09:00
vorotamoroz
ae7a7dd456 Merge pull request #706 from abhith/patch-1
docs(README): update links for Customisation Sync and Hidden File Sync
2025-09-16 15:31:00 +09:00
vorotamoroz
4048186bb5 bump 2025-09-04 11:46:50 +01:00
vorotamoroz
2b94fd9139 ## Improved
- Improved connectivity for P2P connections
- The connection to the signalling server can now be disconnected while in the background or when explicitly disconnected.
  - These features use a patch that has not been incorporated upstream.
2025-09-04 11:44:49 +01:00
vorotamoroz
ec72ece86d bump 2025-09-03 10:12:51 +01:00
vorotamoroz
e394a994c5 Fix typo 2025-09-03 10:11:46 +01:00
vorotamoroz
aa23b6a39a ### Improved
- Now we can configure `forcePathStyle` for bucket synchronisation (#707).
2025-09-03 10:08:49 +01:00
vorotamoroz
58e328a591 bump 2025-09-02 10:27:23 +01:00
vorotamoroz
1730c39d70 ### Fixed
- Handling of opening IndexedDB has been made reliable.
- The migration check for corrupted-file detection has been fixed.
    - Conflicted files are now reported as non-recoverable, and noted as such.
    - Errors are no longer raised for files that are not found.
2025-09-02 10:24:13 +01:00
Abhith Rajan
dfeac201a2 docs(README): update links for Customisation Sync and Hidden File Sync sections 2025-09-01 21:54:11 +04:00
vorotamoroz
b42152db5e bump 2025-09-01 12:28:01 +09:00
vorotamoroz
171cfc0a38 ### Fixed
- Conflict resolving dialogue now properly displays the changeset name instead of A or B (#691).
2025-09-01 12:23:38 +09:00
vorotamoroz
d2787bdb6a Update older dependencies 2025-09-01 12:21:12 +09:00
Ron Gerber
44b022f003 Add Docker Compose troubleshooting for own server setup
Added another option for the init command that passes the variables directly. When I tried the command above, I got the mentioned error.
2025-08-30 01:00:58 +02:00
vorotamoroz
58845276e7 bump 2025-08-29 11:48:33 +01:00
vorotamoroz
a2cc093a9e ### Fixed
- Fixed an issue with automatic synchronisation starting (#702).
2025-08-29 11:46:11 +01:00
vorotamoroz
fec203a751 bump 2025-08-28 10:27:31 +01:00
vorotamoroz
1a06837769 ### Fixed
- Automatic translation detection on the first launch now works correctly (#630).
- Errors are no longer shown during synchronisation while offline (unless explicitly requested) (#699).
- Some previously missing checks during automatic synchronisation are now performed correctly.
2025-08-28 10:26:17 +01:00
vorotamoroz
18d1ce8ec8 bump 2025-08-26 11:17:09 +01:00
vorotamoroz
2221d8c4e8 Update dependency 2025-08-26 11:14:30 +01:00
vorotamoroz
08548f8630 ### New experimental feature
- We can perform Garbage Collection (Beta2) without rebuilding the entire database, and also fetch the database.

### Fixed

- Resetting the bucket now properly clears all uploaded files.

### Refactored

- Some files have been moved to better reflect their purpose and improve maintainability.
- The extensive LiveSyncLocalDB has been split into separate files for each role.
2025-08-26 11:09:33 +01:00
vorotamoroz
5d24c3b984 bump 2025-08-20 10:36:47 +01:00
vorotamoroz
de8fd43c8b ### Fixed
- CORS checking messages now use replacements.
- Configuring CORS settings via the UI now respects the existing rules.
- Startup checking now works correctly again; it performs the migration check serially and then also fixes starting LiveSync or start-up sync (#696).
- The status line in the editor now supports 'Bases'.
2025-08-20 10:36:22 +01:00
vorotamoroz
ed88761eaa bump 2025-08-18 06:32:19 +01:00
vorotamoroz
4dcb37f5a2 ## 0.25.8
### New feature
- Insecure chunk detection has been implemented.

### Fixed
- Unexpected `Failed to obtain PBKDF2 salt` or similar errors during bucket-synchronisation no longer occur.
- Unexpected long delays for chunk-missing documents when using bucket-synchronisation have been resolved.
- Fetched remote chunks are now properly stored in the local database if `Fetch chunks on demand` is enabled.
- The 'fetch' dialogue's message has been refined.
- Corrupted documents are no longer written to the storage during the boot sequence.

### Refactored
- Type errors have been corrected.
2025-08-18 06:26:50 +01:00
vorotamoroz
db0562eda1 A minor note added 2025-08-15 11:06:55 +01:00
vorotamoroz
b610d5d959 bump 2025-08-15 11:04:34 +01:00
vorotamoroz
5abba74f3b ## 0.25.7
### Fixed

- Off-loaded chunking has been fixed to ensure proper functionality (#693).
- Chunk document ID assignment has been fixed.
- The replication-prevention message during version-up detection has been improved (#686).
- `Keep A` and `Keep B` in the conflict-resolving dialogue have been renamed to `Use Base` and `Use Conflicted` (#691).

### Improved

- Documents whose metadata and content size do not match are now detected and reported, and are prevented from being applied to the storage.

### New Features

- `Scan for Broken files` has been implemented on `Hatch` -> `TroubleShooting`.

### Refactored

- Off-loaded processes have been refactored for better maintainability.
- Removed unused code.
2025-08-15 10:51:39 +01:00
vorotamoroz
021c1fccfe Merge pull request #667 from h-exx/main
Edited the Docker Compose documentation to make it work
2025-08-14 14:24:14 +09:00
vorotamoroz
0a30af479f Fix Fetch chunks on demand 2025-08-09 02:17:50 +09:00
vorotamoroz
a9c3f60fe7 bump 2025-08-09 01:51:17 +09:00
vorotamoroz
f996e056af ### Fixed
- Storage scanning no longer occurs when `Suspend file watching` is enabled (including boot-sequence).

### Improved
- Saving notes and files now consumes less memory.
- Chunk caching is now more efficient.
- Both of these may help with #692, #680, and some more.

### Changed
- `Incubate Chunks in Document` (also known as `Eden`) is now fully sunset.
- The `Compute revisions for chunks` setting has also been removed.
- As mentioned, `Memory cache size (by total characters)` has been removed.

### Refactored
- A significant refactoring of the core codebase is underway (please refer to the release note).
2025-08-09 01:45:41 +09:00
vorotamoroz
1073ee9e30 bump 2025-07-29 12:50:25 +01:00
vorotamoroz
f94653e60e ## 0.25.4
- The PBKDF2Salt is no longer corrupted when attempting replication while the device is offline (#686)
2025-07-29 12:45:28 +01:00
vorotamoroz
3dccf2076f Add draft design documents 2025-07-28 19:14:22 +09:00
Jacques Faulkner
341f0ab12d Updated Table of Contents 2025-06-27 10:39:13 +01:00
Jacques Faulkner
39340c1e1b Sorry 1 more, reworded the creating directories comments 2025-06-26 14:45:04 +01:00
Jacques Faulkner
55cdc58857 Edited warning to make a bit more sense 2025-06-26 14:42:23 +01:00
Jacques Faulkner
4f1a9dc4e8 Forgot a # 2025-06-25 18:54:22 +01:00
Jacques Faulkner
013818b7d0 Update setup_own_server.md 2025-06-25 18:53:31 +01:00
121 changed files with 13821 additions and 6240 deletions

View File

@@ -10,19 +10,19 @@ jobs:
build:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
with:
fetch-depth: 0 # otherwise, you will failed to push refs to dest repo
submodules: recursive
- name: Use Node.js
uses: actions/setup-node@v1
uses: actions/setup-node@v4
with:
node-version: '22.x' # You might need to adjust this value to your own version
node-version: '24.x' # You might need to adjust this value to your own version
# Get the version number and put it in a variable
- name: Get Version
id: version
run: |
echo "::set-output name=tag::$(git describe --abbrev=0 --tags)"
echo "tag=$(git describe --abbrev=0 --tags)" >> $GITHUB_OUTPUT
# Build the plugin
- name: Build
id: build
@@ -36,59 +36,69 @@ jobs:
cp main.js manifest.json styles.css README.md ${{ github.event.repository.name }}
zip -r ${{ github.event.repository.name }}.zip ${{ github.event.repository.name }}
# Create the release on github
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
VERSION: ${{ github.ref }}
# - name: Create Release
# id: create_release
# uses: actions/create-release@v1
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# VERSION: ${{ steps.version.outputs.tag }}
# with:
# tag_name: ${{ steps.version.outputs.tag }}
# release_name: ${{ steps.version.outputs.tag }}
# draft: true
# prerelease: false
# # Upload the packaged release file
# - name: Upload zip file
# id: upload-zip
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }}
# asset_path: ./${{ github.event.repository.name }}.zip
# asset_name: ${{ github.event.repository.name }}-${{ steps.version.outputs.tag }}.zip
# asset_content_type: application/zip
# # Upload the main.js
# - name: Upload main.js
# id: upload-main
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }}
# asset_path: ./main.js
# asset_name: main.js
# asset_content_type: text/javascript
# # Upload the manifest.json
# - name: Upload manifest.json
# id: upload-manifest
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }}
# asset_path: ./manifest.json
# asset_name: manifest.json
# asset_content_type: application/json
# # Upload the style.css
# - name: Upload styles.css
# id: upload-css
# uses: actions/upload-release-asset@v1
# env:
# GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
# with:
# upload_url: ${{ steps.create_release.outputs.upload_url }}
# asset_path: ./styles.css
# asset_name: styles.css
# asset_content_type: text/css
- name: Create Release and Upload Assets
uses: softprops/action-gh-release@v2
with:
tag_name: ${{ github.ref }}
release_name: ${{ github.ref }}
draft: true
prerelease: false
# Upload the packaged release file
- name: Upload zip file
id: upload-zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./${{ github.event.repository.name }}.zip
asset_name: ${{ github.event.repository.name }}-${{ steps.version.outputs.tag }}.zip
asset_content_type: application/zip
# Upload the main.js
- name: Upload main.js
id: upload-main
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./main.js
asset_name: main.js
asset_content_type: text/javascript
# Upload the manifest.json
- name: Upload manifest.json
id: upload-manifest
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./manifest.json
asset_name: manifest.json
asset_content_type: application/json
# Upload the style.css
- name: Upload styles.css
id: upload-css
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./styles.css
asset_name: styles.css
asset_content_type: text/css
# TODO: release notes???
files: |
${{ github.event.repository.name }}.zip
main.js
manifest.json
styles.css
name: ${{ steps.version.outputs.tag }}
tag_name: ${{ steps.version.outputs.tag }}
draft: true

4
.gitignore vendored
View File

@@ -9,6 +9,7 @@ package-lock.json
# build
main.js
main_org.js
main_org_*.js
*.js.map
meta.json
meta-*.json
@@ -17,3 +18,6 @@ meta-*.json
# obsidian
data.json
.vscode
# environment variables
.env

View File

@@ -18,7 +18,7 @@ Additionally, it supports peer-to-peer synchronisation using WebRTC now (experim
- Use open-source solutions for the server.
- Compatible solutions are supported.
- Support end-to-end encryption.
- Synchronise settings, snippets, themes, and plug-ins via [Customisation Sync (Beta)](#customization-sync) or [Hidden File Sync](#hiddenfilesync).
- Synchronise settings, snippets, themes, and plug-ins via [Customisation Sync (Beta)](docs/settings.md#6-customization-sync-advanced) or [Hidden File Sync](docs/settings.md#7-hidden-files-advanced).
- Enable WebRTC peer-to-peer synchronisation without requiring a `host` (Experimental).
- This feature is still in the experimental stage. Please exercise caution when using it.
- WebRTC is a peer-to-peer synchronisation method, so **at least one device must be online to synchronise**.

View File

@@ -0,0 +1,122 @@
# [WITHDRAWN] Chunk Aggregation by Prefix
## Goal
To address the "document explosion" and storage bloat issues caused by the current chunking mechanism, while preserving the benefits of content-addressable storage and efficient delta synchronisation. This design aims to significantly reduce the number of documents in the database and simplify Garbage Collection (GC).
## Motivation
Our current synchronisation solution splits files into content-defined chunks, with each chunk stored as a separate document in CouchDB, identified by its hash. This architecture effectively leverages CouchDB's replication for automatic deduplication and efficient transfer.
However, this approach faces significant challenges as the number of files and edits increases:
1. **Document Explosion:** A large vault can generate millions of chunk documents, severely degrading CouchDB's performance, particularly during view building and replication.
2. **Storage Bloat & GC Difficulty:** Obsolete chunks generated during edits are difficult to identify and remove. Since CouchDB's deletion (`_deleted: true`) is a soft delete, and compaction is a heavy, space-intensive operation, unused chunks perpetually consume storage, making GC impractical for many users.
3. **The "Eden" Problem:** A previous attempt, "Keep newborn chunks in Eden", aimed to mitigate this by embedding volatile chunks within the parent document. While it reduced the number of standalone chunks, it introduced a new issue: the parent document's history (`_revs_info`) became excessively large, causing its own form of database bloat and making compaction equally necessary but difficult to manage.
This new design addresses the root cause—the sheer number of documents—by aggregating chunks into sets.
## Prerequisites
- The new implementation must maintain the core benefit of deduplication to ensure efficient synchronisation.
- The solution must not introduce a single point of bottleneck and should handle concurrent writes from multiple clients gracefully.
- The system must provide a clear and feasible strategy for Garbage Collection.
- The design should be forward-compatible, allowing for a smooth migration path for existing users.
## Outlined Methods and Implementation Plans
### Abstract
This design introduces a two-tiered document structure to manage chunks: **Index Documents** and **Data Documents**. Chunks are no longer stored as individual documents. Instead, they are grouped into `Data Documents` based on a common hash prefix. The existence and location of each chunk are tracked by `Index Documents`, which are also grouped by the same prefix. This approach dramatically reduces the total document count.
### Detailed Implementation
**1. Document Structure:**
- **Index Document:** Maps chunk hashes to their corresponding Data Document ID. Identified by a prefix of the chunk hash.
- `_id`: `idx:{prefix}` (e.g., `idx:a9f1b`)
- Content:
```json
{
    "_id": "idx:a9f1b",
    "_rev": "...",
    "chunks": {
        "a9f1b12...": "dat:a9f1b-001",
        "a9f1b34...": "dat:a9f1b-001",
        "a9f1b56...": "dat:a9f1b-002"
    }
}
```
- **Data Document:** Contains the actual chunk data as base64-encoded strings. Identified by a prefix and a sequential number.
- `_id`: `dat:{prefix}-{sequence}` (e.g., `dat:a9f1b-001`)
- Content:
```json
{
    "_id": "dat:a9f1b-001",
    "_rev": "...",
    "chunks": {
        "a9f1b12...": "...", // base64 data
        "a9f1b34...": "..."  // base64 data
    }
}
```
**2. Configuration:**
- `chunk_prefix_length`: The number of characters from the start of a chunk hash to use as a prefix (e.g., `5`). This determines the granularity of aggregation.
- `data_doc_size_limit`: The maximum size for a single Data Document to prevent it from becoming too large (e.g., 1MB). When this limit is reached, a new Data Document with an incremented sequence number is created.
**3. Write/Save Operation Flow:**
When a client creates new chunks:
1. For each new chunk, determine its hash prefix.
2. Read the corresponding `Index Document` (e.g., `idx:a9f1b`).
3. From the index, determine which of the new chunks already exist in the database.
4. For the **truly new chunks only**:
a. Read the last `Data Document` for that prefix (e.g., `dat:a9f1b-005`).
b. If it is nearing its size limit, create a new one (`dat:a9f1b-006`).
c. Add the new chunk data to the Data Document and save it.
5. Update the `Index Document` with the locations of the newly added chunks.
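A minimal sketch of steps 1 to 3 of the flow above, in TypeScript; the `Chunk` shape, the `CHUNK_PREFIX_LENGTH` constant, and the function names are illustrative only, not the plug-in's actual API:
```ts
type Chunk = { hash: string; data: string };

const CHUNK_PREFIX_LENGTH = 5; // corresponds to the `chunk_prefix_length` setting

// Group freshly created chunks by hash prefix, so each group maps to one
// `idx:{prefix}` Index Document and its `dat:{prefix}-{sequence}` Data Documents.
function groupByPrefix(chunks: Chunk[]): Map<string, Chunk[]> {
    const groups = new Map<string, Chunk[]>();
    for (const chunk of chunks) {
        const prefix = chunk.hash.slice(0, CHUNK_PREFIX_LENGTH);
        const group = groups.get(prefix) ?? [];
        group.push(chunk);
        groups.set(prefix, group);
    }
    return groups;
}

// Given the `chunks` map of the corresponding Index Document, keep only the truly new chunks.
function filterTrulyNew(group: Chunk[], indexed: Record<string, string>): Chunk[] {
    return group.filter((chunk) => !(chunk.hash in indexed));
}
```
Only the chunks returned by `filterTrulyNew` need to touch a Data Document; everything else is already deduplicated.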
**4. Handling Write Conflicts:**
Concurrent writes to the same `Index Document` or `Data Document` from multiple clients will cause conflicts (409 Conflict). This is expected and must be handled gracefully. Since additions are incremental, the client application must implement a **retry-and-merge loop**:
1. Attempt to save the document.
2. On a conflict, re-fetch the latest version of the document from the server.
3. Merge its own changes into the latest version.
4. Attempt to save again.
5. Repeat until successful or a retry limit is reached.
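A minimal sketch of such a retry-and-merge loop against PouchDB, assuming the Index Document shape shown above; the function name and the retry limit are illustrative:
```ts
import PouchDB from "pouchdb";

type IndexDoc = { _id: string; _rev?: string; chunks: Record<string, string> };

// Add `newEntries` (chunk hash -> Data Document id) to an Index Document,
// re-fetching and merging on every 409 Conflict.
async function upsertIndex(
    db: PouchDB.Database<IndexDoc>,
    indexId: string,
    newEntries: Record<string, string>,
    maxRetries = 5
): Promise<void> {
    for (let attempt = 0; attempt < maxRetries; attempt++) {
        let doc: IndexDoc;
        try {
            doc = await db.get(indexId);
        } catch (e: any) {
            if (e.status !== 404) throw e;
            doc = { _id: indexId, chunks: {} }; // The first writer creates the document.
        }
        doc.chunks = { ...doc.chunks, ...newEntries }; // Merge our additions into the latest revision.
        try {
            await db.put(doc);
            return; // Saved successfully.
        } catch (e: any) {
            if (e.status !== 409) throw e; // Only conflicts are retried.
        }
    }
    throw new Error(`Could not update ${indexId} within ${maxRetries} attempts`);
}
```
Because additions are purely incremental, merging is just a key-wise union; the same pattern applies to Data Documents.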
**5. Garbage Collection (GC):**
GC becomes a manageable, periodic batch process:
1. Scan all file metadata documents to build a master set of all *currently referenced* chunk hashes.
2. Iterate through all `Index Documents`. For each chunk listed:
a. If the chunk hash is not in the master reference set, it is garbage.
b. Remove the garbage entry from the `Index Document`.
c. Remove the corresponding data from its `Data Document`.
3. If a `Data Document` becomes empty after this process, it can be deleted.
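A hedged sketch of this pass with PouchDB, assuming the set of currently referenced hashes has already been built from the file metadata (step 1) and that documents follow the `idx:`/`dat:` layout above:
```ts
import PouchDB from "pouchdb";

type IndexDoc = { _id: string; _rev: string; chunks: Record<string, string> };
type DataDoc = { _id: string; _rev: string; chunks: Record<string, string> };

async function collectGarbage(db: PouchDB.Database, referenced: Set<string>): Promise<void> {
    // Walk every Index Document via a key range on the "idx:" prefix.
    const result = await db.allDocs<IndexDoc>({ startkey: "idx:", endkey: "idx:\ufff0", include_docs: true });
    for (const row of result.rows) {
        const index = row.doc;
        if (!index) continue;
        // Group garbage hashes by the Data Document that holds them.
        const garbageByDataDoc = new Map<string, string[]>();
        for (const [hash, dataId] of Object.entries(index.chunks)) {
            if (referenced.has(hash)) continue;
            garbageByDataDoc.set(dataId, [...(garbageByDataDoc.get(dataId) ?? []), hash]);
            delete index.chunks[hash];
        }
        if (garbageByDataDoc.size === 0) continue;
        for (const [dataId, hashes] of garbageByDataDoc) {
            const dataDoc = await db.get<DataDoc>(dataId);
            for (const hash of hashes) delete dataDoc.chunks[hash];
            if (Object.keys(dataDoc.chunks).length === 0) {
                await db.remove(dataDoc); // An emptied Data Document can be deleted outright.
            } else {
                await db.put(dataDoc);
            }
        }
        await db.put(index);
    }
}
```
In practice these writes would also go through the retry-and-merge loop shown earlier.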
## Test Strategy
1. **Unit Tests:** Implement tests for the conflict resolution logic (retry-and-merge loop) to ensure robustness.
2. **Integration Tests:**
- Verify that concurrent writes from multiple simulated clients result in a consistent, merged state without data loss.
- Run a full synchronisation scenario and confirm the resulting database has a significantly lower document count compared to the previous implementation.
3. **GC Test:** Simulate a scenario where files are deleted, run the GC process, and verify that orphaned chunks are correctly removed from both Index and Data documents, and that storage is reclaimed after compaction.
4. **Migration Test:** Develop and test a "rebuild" process for existing users, which migrates their chunk data into the new aggregated structure.
## Documentation Strategy
- This design document will be published to explain the new architecture.
- The configuration options (`chunk_prefix_length`, etc.) will be documented for advanced users.
- A guide for the migration/rebuild process will be provided.
## Future Work
The separation of index and data opens up a powerful possibility. While this design initially implements both within CouchDB, the `Data Documents` could be offloaded to a dedicated object storage service such as **S3, MinIO, or Cloudflare R2**.
In such a hybrid model, CouchDB would handle only the lightweight `Index Documents` and file metadata, serving as a high-speed synchronisation and coordination layer. The bulky chunk data would reside in a more cost-effective and scalable blob store. This would represent the ultimate evolution of this architecture, combining the best of both worlds.
## Consideration and Conclusion
This design directly addresses the scalability limitations of the original chunk-per-document model. By aggregating chunks into sets, it significantly reduces the document count, which in turn improves database performance and makes maintenance feasible. The explicit handling of write conflicts and a clear strategy for garbage collection make this a robust and sustainable long-term solution. It effectively resolves the problems identified in previous approaches, including the "Eden" experiment, by tackling the root cause of database bloat. This architecture provides a solid foundation for future growth and scalability.

View File

@@ -0,0 +1,127 @@
# [WIP] The design intent explanation for using metadata and chunks
## Abstract
## Goal
- To explain the following:
- What metadata and chunks are
- The design intent of using metadata and chunks
## Background and Motivation
We are using PouchDB and CouchDB for storing files and synchronising them. PouchDB is a JavaScript database that stores data on the device (browser, and of course, Obsidian), while CouchDB is a NoSQL database that stores data on the server. The two databases can be synchronised to keep data consistent across devices via the CouchDB replication protocol. This is a powerful and flexible way to store and synchronise data, including conflict management, but it is not well suited for files. Therefore, we needed to manage how to store files and synchronise them.
## Terminology
- Password:
- A string used to authenticate the user.
- Passphrase:
- A string used to encrypt and decrypt data.
- This is not a password.
- Encrypt:
- To convert data into a format that is unreadable to anyone.
- Can be decrypted by the user who has the passphrase.
- Should be 1:n, containing random data to ensure that even the same data, when encrypted, results in different outputs.
- Obfuscate:
- To convert data into a format that is not easily readable.
- Can be decrypted by the user who has the passphrase.
- Should be 1:1, containing no random data, and the same data is always obfuscated to the same result. It is necessarily unreadable.
- Hash:
- To convert data into a fixed-length string that is not easily readable.
- Cannot be decrypted.
- Should be 1:1, containing no random data, and the same data is always hashed to the same result.
## Designs
### Principles
- To synchronise and handle conflicts, we should keep the history of modifications.
- No data should be lost. Even though some extra data may be stored, it should be removed later, safely.
- Each stored data item should be as small as possible to transfer efficiently, but not so small as to be inefficient.
- Any type of file should be supported, including binary files.
- Encryption should be supported efficiently.
- This method should not depart too far from the PouchDB/CouchDB philosophy. It needs to leave room for other `remote`s, to benefit from custom replicators.
As a result, we have adopted the following design.
- Files are stored as one metadata entry and multiple chunks.
- Chunks are content-addressable, and the metadata contains the ids of the chunks.
- Chunks may be referenced from multiple metadata entries. They should be efficiently managed to avoid redundancy.
### Metadata Design
The metadata contains the following information:
| Field | Type | Description | Note |
| -------- | -------------------- | ---------------------------- | ----------------------------------------------------------------------------------------------------- |
| _id | string | The id of the metadata | It is created from the file path |
| _rev | string | The revision of the metadata | It is created by PouchDB |
| children | [string] | The ids of the chunks | |
| path | string | The path of the file | If Obfuscate path has been enabled, it has been encrypted |
| size | number | The size of the metadata | Not respected; for troubleshooting |
| ctime | string | The creation timestamp | This is not used to compare files, but when writing to storage, it will be used |
| mtime | string | The modification timestamp | This will be used to compare files, and will be written to storage |
| type | `plain` \| `newnote` | The type of the file | Children of type `plain` will not be base64 encoded, while `newnote` will be |
| e_ | boolean | The file is encrypted | Encryption is processed during transfer to the remote. In local storage, this property does not exist |
#### Decision Rule for `_id` of Metadata
```ts
// Note: This is pseudo code. The flags and helpers below stand in for the
// corresponding settings and utilities of the plug-in.
declare const HANDLE_FILES_AS_CASE_SENSITIVE: boolean;
declare const OBFUSCATE_PATH: boolean;
declare const E2EE_PASSPHRASE: string;
declare function obfuscatePath(path: string, passphrase: string): string;

function metadataIdOf(path: string): string {
    let _id = path;
    if (!HANDLE_FILES_AS_CASE_SENSITIVE) {
        _id = _id.toLowerCase();
    }
    if (_id.startsWith("_")) {
        _id = "/" + _id;
    }
    if (OBFUSCATE_PATH) {
        _id = `f:${obfuscatePath(_id, E2EE_PASSPHRASE)}`;
    }
    return _id;
}
```
#### Expected Questions
- Why do we need to handle files as case-sensitive?
- Some filesystems are case-sensitive, while others are not. For example, Windows is not case-sensitive, while Linux is. Therefore, we need to handle files as case-sensitive to manage conflicts.
- The trade-off is that you will not be able to manage files with different cases, so this can be disabled if you only have case-sensitive terminals.
- Why obfuscate the path?
- E2EE only encrypts the content of the file, not metadata. Hence, E2EE alone is not enough to protect the vault completely. The path is also part of the metadata, so it should be obfuscated. This is a trade-off between security and performance. However, if you title a note with sensitive information, you should obfuscate the path.
- What is `f:`?
- It is a prefix to indicate that the path is obfuscated. It is used to distinguish between normal paths and obfuscated paths. For file enumeration, Self-hosted LiveSync must scan documents to find the metadata while excluding chunks and other information.
- Why does an unobfuscated path not start with `f:`?
- For compatibility. Self-hosted LiveSync, by its nature, must also be able to handle files created with newer versions as far as possible.
### Chunk Design
#### Chunk Structure
The chunk contains the following information:
| Field | Type | Description | Note |
| ----- | ------------ | ------------------------- | ----------------------------------------------------------------------------------------------------- |
| _id | `h:{string}` | The id of the chunk | It is created from the hash of the chunk content |
| _rev | string | The revision of the chunk | It is created by PouchDB |
| data | string | The content of the chunk | |
| type | `leaf` | Fixed | |
| e_ | boolean | The chunk is encrypted | Encryption is processed during transfer to the remote. In local storage, this property does not exist |
**SORRY, TO BE WRITTEN, BUT WE HAVE IMPLEMENTED `v2`, WHICH REQUIRES MORE INFORMATION.**
### How they are unified
## Deduplication and Optimisation
## Synchronisation Strategy
## Performance Considerations
## Security and Privacy
## Edge Cases

View File

@@ -0,0 +1,117 @@
# [IN DESIGN] Tiered Chunk Storage with Live Compaction
**VERY IMPORTANT NOTE: This design must be used with the new journal synchronisation method. Otherwise, we risk introducing the bloat of changes from the Hot-Pack into the Bucket. (CouchDB/PouchDB can synchronise only the most recent changes, or resolve conflicts; the previous Journal Sync CANNOT.) Please proceed with caution.**
## Goal
To establish a highly efficient, robust, and scalable synchronisation architecture by introducing a tiered storage system inspired by Log-Structured Merge-Trees (LSM-Trees). This design aims to address the challenges of real-time synchronisation, specifically the massive generation of transient data, while minimising storage bloat and ensuring high performance.
## Motivation
Our previous designs, including "Chunk Aggregation by Prefix", successfully addressed the "document explosion" problem. However, the introduction of real-time editor synchronisation exposed a new, critical challenge: the constant generation of short-lived "garbage" chunks during user input. This "garbage storm" places immense pressure on storage, I/O, and the Garbage Collection (GC) process.
A simple aggregation strategy is insufficient because it treats all data equally, mixing valuable, stable chunks with transient, garbage chunks in permanent storage. This leads to storage bloat and inefficient compaction. We require a system that can intelligently distinguish between "hot" (volatile) and "cold" (stable) data, processing them in the most efficient manner possible.
## Outlined Methods and Implementation Plans
### Abstract
This design implements a two-tiered storage system within CouchDB.
1. **Level 0 Hot Storage:** A set of "Hot-Packs", one for each active client. These act as fast, append-only logs for all newly created chunks. They serve as a temporary staging area, absorbing the "garbage storm" of real-time editing.
2. **Level 1 Cold Storage:** The permanent, immutable storage for stable chunks, consisting of **Index Documents** for fast lookups and **Data Documents (Cold-Packs)** for storing chunk data.
A background "Compaction" process continuously promotes stable chunks from Hot Storage to Cold Storage, while automatically discarding garbage. This keeps the permanent storage clean and highly optimised.
### Detailed Implementation
**1. Document Structure:**
- **Hot-Pack Document (Level 0):** A per-client, append-only log.
- `_id`: `hotpack:{client_id}` (`client_id` could be the same as the `deviceNodeID` used in the `accepted_nodes` in MILESTONE_DOC; enables database 'lockout' for safe synchronisation)
- Content: A log of chunk creation events.
```json
{
    "_id": "hotpack:a9f1b12...",
    "_rev": "...",
    "log": [
        { "hash": "abc...", "data": "...", "ts": ..., "file_id": "file1" },
        { "hash": "def...", "data": "...", "ts": ..., "file_id": "file2" }
    ]
}
```
- **Index Document (Level 1):** A fast, prefix-based lookup table for stable chunks.
- `_id`: `idx:{prefix}` (e.g., `idx:a9f1b`)
- Content: Maps a chunk hash to the ID of the Cold-Pack it resides in.
```json
{
    "_id": "idx:a9f1b",
    "chunks": { "a9f1b12...": "dat:1678886400" }
}
```
- **Cold-Pack Document (Level 1):** An immutable data block created by the compaction process.
- `_id`: `dat:{timestamp_or_uuid}` (e.g., `dat:1678886400123`)
- Content: A collection of stable chunks.
```json
{
    "_id": "dat:1678886400123",
    "chunks": { "a9f1b12...": "...", "c3d4e5f...": "..." }
}
```
- **Hot-Pack List Document:** A central registry of all active Hot-Packs. This might be a computed document that clients maintain in memory on startup.
- `_id`: `hotpack_list`
- Content: `{"active_clients": ["hotpack:a9f1b12...", "hotpack:c3d4e5f..."]}`
**2. Write/Save Operation Flow (Real-time Editing):**
1. A client generates a new chunk.
2. It **immediately appends** the chunk object (`{hash, data, ts, file_id}`) to its **own** Hot-Pack document's `log` array within its local PouchDB. This operation is extremely fast.
3. The PouchDB synchronisation process replicates this change to the remote CouchDB and other clients in the background. No other Hot-Packs are consulted during this write operation.
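A minimal sketch of this append with the local PouchDB, assuming the Hot-Pack shape from section 1; the helper name is illustrative:
```ts
import PouchDB from "pouchdb";

type HotPackEntry = { hash: string; data: string; ts: number; file_id: string };
type HotPackDoc = { _id: string; _rev?: string; log: HotPackEntry[] };

// Append a freshly created chunk to this client's own Hot-Pack log.
async function appendToHotPack(db: PouchDB.Database<HotPackDoc>, clientId: string, entry: HotPackEntry): Promise<void> {
    const id = `hotpack:${clientId}`;
    let doc: HotPackDoc;
    try {
        doc = await db.get(id);
    } catch (e: any) {
        if (e.status !== 404) throw e;
        doc = { _id: id, log: [] }; // The first write creates the Hot-Pack.
    }
    doc.log.push(entry);
    await db.put(doc); // Replication to the remote happens in the background.
}
```
Since each client writes only to its own Hot-Pack, conflicts on this path should be rare.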
**3. Read/Load Operation Flow:**
To find a chunk's data:
1. The client first consults its in-memory list of active Hot-Pack IDs (see section 5).
2. It searches for the chunk hash in all **Hot-Pack documents**, starting from its own, then others. It reads them in reverse log order (newest first).
3. If not found, it consults the appropriate **Index Document (`idx:...`)** to get the ID of the Cold-Pack.
4. It then reads the chunk data from the corresponding **Cold-Pack document (`dat:...`)**.
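A sketch of this lookup order, assuming the document shapes from section 1 and an in-memory list of active Hot-Pack ids (see section 5); the prefix length mirrors the Level 1 layout:
```ts
import PouchDB from "pouchdb";

type HotPackDoc = { log: { hash: string; data: string }[] };
type IndexDoc = { chunks: Record<string, string> };
type ColdPackDoc = { chunks: Record<string, string> };

async function readChunk(db: PouchDB.Database, chunkHash: string, hotPackIds: string[], prefixLength = 5): Promise<string | undefined> {
    // 1-2. Search the Hot-Packs, newest log entries first.
    for (const id of hotPackIds) {
        try {
            const hot = await db.get<HotPackDoc>(id);
            for (let i = hot.log.length - 1; i >= 0; i--) {
                if (hot.log[i].hash === chunkHash) return hot.log[i].data;
            }
        } catch (e: any) {
            if (e.status !== 404) throw e; // A missing Hot-Pack is not an error.
        }
    }
    // 3-4. Fall back to Cold Storage via the Index Document.
    try {
        const index = await db.get<IndexDoc>(`idx:${chunkHash.slice(0, prefixLength)}`);
        const coldPackId = index.chunks[chunkHash];
        if (!coldPackId) return undefined;
        const coldPack = await db.get<ColdPackDoc>(coldPackId);
        return coldPack.chunks[chunkHash];
    } catch (e: any) {
        if (e.status === 404) return undefined; // Unknown chunk.
        throw e;
    }
}
```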
**4. Compaction & Promotion Process (The "GC"):**
This is a background task run periodically by clients, or triggered when the number of unprocessed log entries exceeds a threshold (to maintain the ability to synchronise with the remote database, which has a limited document size).
1. The client takes its own Hot-Pack (`hotpack:{client_id}`) and scans its `log` array from the beginning (oldest first).
2. For each chunk in the log, it checks if the chunk is still referenced in the latest revision of any file.
- **If not referenced (Garbage):** The log entry is simply discarded.
- **If referenced (Stable):** The chunk is added to a "promotion batch".
3. After scanning a certain number of log entries, the client takes the "promotion batch".
4. It creates one or more new, immutable **Cold-Pack (`dat:...`)** documents to store the chunk data from the batch.
5. It updates the corresponding **Index (`idx:...`)** documents to point to the new Cold-Pack(s).
6. Once the promotion is successfully saved to the database, it **removes the processed entries from its Hot-Pack's `log` array**. This is a critical step to prevent reprocessing and keep the Hot-Pack small.
**5. Hot-Pack List Management:**
To know which Hot-Packs to read, clients will:
1. On startup, load the `hotpack_list` document into memory.
2. Use PouchDB's live `changes` feed to monitor the creation of new `hotpack:*` documents.
3. Upon detecting an unknown Hot-Pack, the client updates its in-memory list and attempts to update the central `hotpack_list` document (on a best-effort basis, with conflict resolution).
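A minimal sketch of steps 2 and 3 using PouchDB's live `changes` feed; the update of `hotpack_list` is deliberately best-effort, as described above:
```ts
import PouchDB from "pouchdb";

type HotPackList = { _id: string; _rev?: string; active_clients: string[] };

// Keep the in-memory set of known Hot-Packs up to date and mirror it into `hotpack_list`.
function watchHotPacks(db: PouchDB.Database, known: Set<string>) {
    return db.changes({ live: true, since: "now" }).on("change", async (change) => {
        if (!change.id.startsWith("hotpack:") || known.has(change.id)) return;
        known.add(change.id);
        try {
            const list = await db.get<HotPackList>("hotpack_list").catch((e: any) => {
                if (e.status !== 404) throw e;
                return { _id: "hotpack_list", active_clients: [] } as HotPackList;
            });
            if (!list.active_clients.includes(change.id)) {
                list.active_clients.push(change.id);
                await db.put(list); // Best effort: a 409 here is tolerated.
            }
        } catch {
            // Another client updated the list first; the in-memory copy is enough for now.
        }
    });
}
```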
## Planned Test Strategy
1. **Unit Tests:** Test the Compaction/Promotion logic extensively. Ensure garbage is correctly identified and stable chunks are promoted correctly.
2. **Integration Tests:** Simulate a multi-client real-time editing session.
- Verify that writes are fast and responsive.
- Confirm that transient garbage chunks do not pollute the Cold Storage.
- Confirm that after a period of inactivity, compaction runs and the Hot-Packs shrink.
3. **Stress Tests:** Simulate many clients joining and leaving to test the robustness of the `hotpack_list` management.
## Documentation Strategy
- This design document will serve as the core architectural reference.
- The roles of each document type (Hot-Pack, Index, Cold-Pack, List) will be clearly explained for future developers.
- The logic of the Compaction/Promotion process will be detailed.
## Consideration and Conclusion
This tiered storage design is a direct evolution, born from the lessons of previous architectures. It embraces the ephemeral nature of data in real-time applications. By creating a "staging area" (Hot-Packs) for volatile data, it protects the integrity and performance of the permanent "cold" storage. The Compaction process acts as a self-cleaning mechanism, ensuring that only valuable, stable data is retained long-term. This is not just an optimisation; it is a fundamental shift that enables robust, high-performance, and scalable real-time synchronisation on top of CouchDB.

View File

@@ -0,0 +1,97 @@
# [IN DESIGN] Tiered Chunk Storage for Bucket Sync
## Goal
To evolve the "Journal Sync" mechanism by integrating the Tiered Storage architecture. This design aims to drastically reduce the size and number of sync packs, minimise storage consumption on the backend bucket, and establish a clear, efficient process for Garbage Collection, all while remaining protocol-agnostic.
## Motivation
The original "Journal Sync" liberates us from CouchDB's protocol, but it still packages and transfers entire document changes, including bulky and often transient chunk data. In a real-time or frequent-editing scenario, this results in:
1. **Bloated Sync Packs:** Packs become large with redundant or short-lived chunk data, increasing upload and download times.
2. **Inefficient Storage:** The backend bucket stores numerous packs containing overlapping and obsolete chunk data, wasting space.
3. **Impractical Garbage Collection:** Identifying and purging obsolete *chunk data* from within the pack-based journal history is extremely difficult.
This new design addresses these problems by fundamentally changing *what* is synchronised in the journal packs. We will synchronise lightweight metadata and logs, while handling bulk data separately.
## Outlined methods and implementation plans
### Abstract
This design adapts the Tiered Storage model for a bucket-based backend. The backend bucket is partitioned into distinct areas for different data types. The "Journal Sync" process is now responsible for synchronising only the "hot" volatile data and lightweight metadata. A separate, asynchronous "Compaction" process, which can be run by any client, is responsible for migrating stable data into permanent, deduplicated "cold" storage.
### Detailed Implementation
**1. Bucket Structure:**
The backend bucket will have four distinct logical areas (prefixes):
- `packs/`: For "Journal Sync" packs, containing the journal of metadata and Hot-Log changes.
- `hot_logs/`: A dedicated area for each client's "Hot-Log," containing newly created, volatile chunks.
- `indices/`: For prefix-based Index files, mapping chunk hashes to their permanent location in Cold Storage.
- `cold_chunks/`: For deduplicated, stable chunk data, stored by content hash.
**2. Data Structures (Client-side PouchDB & Backend Bucket):**
- **Client Metadata:** Standard file metadata documents, kept in the client's PouchDB.
- **Hot-Log (in `hot_logs/`):** A per-client, append-only log file on the bucket.
- Path: `hot_logs/{client_id}.jsonlog`
- Content: A sequence of JSON objects, one per line, representing chunk creation events. `{"hash": "...", "data": "...", "ts": ..., "file_id": "..."}`
- **Index File (in `indices/`):** A JSON file for a given hash prefix.
- Path: `indices/{prefix}.json`
- Content: Maps a chunk hash to its content hash (which is its key in `cold_chunks/`). `{"hash_abc...": true, "hash_def...": true}`
- **Cold Chunk (in `cold_chunks/`):** The raw, immutable, deduplicated chunk data.
- Path: `cold_chunks/{chunk_hash}`
**3. "Journal Sync" - Send/Receive Operation (Not Live):**
This process is now extremely lightweight.
1. **Send:**
a. The client takes all newly generated chunks and **appends them to its own Hot-Log file (`hot_logs/{client_id}.jsonlog`)** on the bucket.
b. The client updates its local file metadata in PouchDB.
c. It then creates a "Journal Sync" pack containing **only the PouchDB journal of the file metadata changes.** This pack is very small as it contains no chunk data.
d. The pack is uploaded to `packs/`.
2. **Receive:**
a. The client downloads new packs from `packs/` and applies the metadata journal to its local PouchDB.
b. It downloads the latest versions of all **other clients' Hot-Log files** from `hot_logs/`.
c. Now the client has a complete, up-to-date view of all metadata and all "hot" chunks.
**4. Read/Load Operation Flow:**
To find a chunk's data:
1. The client searches for the chunk hash in its local copy of all **Hot-Logs**.
2. If not found, it downloads and consults the appropriate **Index file (`indices/{prefix}.json`)**.
3. If the index confirms existence, it downloads the data from **`cold_chunks/{chunk_hash}`**.
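A sketch of this lookup, assuming a hypothetical `Bucket` interface with a `get(path)` that resolves to the object body, or `null` when the object is missing (any S3-compatible SDK could back it); the JSON shapes follow section 2:
```ts
// Hypothetical, minimal bucket client used for illustration only.
interface Bucket {
    get(path: string): Promise<string | null>;
}

type HotLogEntry = { hash: string; data: string; ts: number; file_id: string };

async function readChunkFromBucket(
    bucket: Bucket,
    chunkHash: string,
    localHotLogs: HotLogEntry[][], // already-downloaded copies of all Hot-Logs
    prefixLength = 5
): Promise<string | undefined> {
    // 1. Search the local copies of the Hot-Logs first, newest entries last.
    for (const log of localHotLogs) {
        for (let i = log.length - 1; i >= 0; i--) {
            if (log[i].hash === chunkHash) return log[i].data;
        }
    }
    // 2. Consult the prefix index under `indices/`.
    const indexBody = await bucket.get(`indices/${chunkHash.slice(0, prefixLength)}.json`);
    if (!indexBody) return undefined;
    const index = JSON.parse(indexBody) as Record<string, boolean>;
    if (!index[chunkHash]) return undefined;
    // 3. Download the raw chunk from `cold_chunks/`.
    return (await bucket.get(`cold_chunks/${chunkHash}`)) ?? undefined;
}
```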
**5. Compaction & Promotion Process (Asynchronous "GC"):**
This is a deliberate, offline-capable process that any client can choose to run.
1. The client "leases" its own Hot-Log for compaction.
2. It reads its entire `hot_logs/{client_id}.jsonlog`.
3. For each chunk in the log, it checks if the chunk is referenced in the *current, latest state* of the file metadata.
- **If not referenced (Garbage):** The log entry is discarded.
- **If referenced (Stable):** The chunk is added to a "promotion batch."
4. For each chunk in the promotion batch:
a. It checks the corresponding `indices/{prefix}.json` to see if the chunk already exists in Cold Storage.
b. If it does not exist, it **uploads the chunk data to `cold_chunks/{chunk_hash}`** and updates the `indices/{prefix}.json` file.
5. Once the entire Hot-Log has been processed, the client **deletes its `hot_logs/{client_id}.jsonlog` file** (or truncates it to empty), effectively completing the cycle.
## Test strategy
1. **Component Tests:** Test the Compaction process independently. Ensure it correctly identifies stable versus garbage chunks and populates the `cold_chunks/` and `indices/` areas correctly.
2. **Integration Tests:**
- Simulate a multi-client sync cycle. Verify that sync packs in `packs/` are small.
- Confirm that `hot_logs/` are correctly created and updated.
- Run the Compaction process and verify that data migrates correctly to cold storage and the hot log is cleared.
3. **Conflict Tests:** Simulate two clients trying to compact the same index file simultaneously and ensure the outcome is consistent (for example, via a locking mechanism or last-write-wins).
## Documentation strategy
- This design document will be the primary reference for the bucket-based architecture.
- The structure of the backend bucket (`packs/`, `hot_logs/`, etc.) will be clearly defined.
- A detailed description of how to run the Compaction process will be provided to users.
## Consideration and Conclusion
By applying the Tiered Storage model to "Journal Sync", we transform it into a remarkably efficient system. The synchronisation of everyday changes becomes extremely fast and lightweight, as only metadata journals are exchanged. The heavy lifting of data deduplication and permanent storage is offloaded to a separate, asynchronous Compaction process. This clear separation of concerns makes the system highly scalable, minimises storage costs, and finally provides a practical, robust solution for Garbage Collection in a protocol-agnostic, bucket-based environment.

View File

@@ -5,10 +5,15 @@
- [Setup a CouchDB server](#setup-a-couchdb-server)
- [Table of Contents](#table-of-contents)
- [1. Prepare CouchDB](#1-prepare-couchdb)
- [A. Using Docker container](#a-using-docker-container)
- [A. Using Docker](#a-using-docker)
- [1. Prepare](#1-prepare)
- [2. Run docker container](#2-run-docker-container)
- [B. Install CouchDB directly](#b-install-couchdb-directly)
- [B. Using Docker Compose](#b-using-docker-compose)
- [1. Prepare](#1-prepare-1)
- [2. Creating Compose file](#2-create-a-docker-composeyml-file-with-the-following-added-to-it)
- [3. Boot check](#3-run-the-docker-compose-file-to-boot-check)
- [4. Starting Docker Compose in background](#4-run-the-docker-compose-file-in-the-background)
- [C. Install CouchDB directly](#c-install-couchdb-directly)
- [2. Run couchdb-init.sh for initialise](#2-run-couchdb-initsh-for-initialise)
- [3. Expose CouchDB to the Internet](#3-expose-couchdb-to-the-internet)
- [4. Client Setup](#4-client-setup)
@@ -21,44 +26,56 @@
---
## 1. Prepare CouchDB
### A. Using Docker container
### A. Using Docker
#### 1. Prepare
```bash
# Prepare environment variables.
# Adding environment variables.
export hostname=localhost:5984
export username=goojdasjdas #Please change as you like.
export password=kpkdasdosakpdsa #Please change as you like
# Prepare directories which save data and configurations.
# Creating the save data & configuration directories.
mkdir couchdb-data
mkdir couchdb-etc
```
#### 2. Run docker container
1. Boot Check.
```
$ docker run --name couchdb-for-ols --rm -it -e COUCHDB_USER=${username} -e COUCHDB_PASSWORD=${password} -v ${PWD}/couchdb-data:/opt/couchdb/data -v ${PWD}/couchdb-etc:/opt/couchdb/etc/local.d -p 5984:5984 couchdb
```
If your container has been exited, please check the permission of couchdb-data, and couchdb-etc.
Once CouchDB run, these directories will be owned by uid:`5984`. Please chown it for you again.
> [!WARNING]
> If your container threw an error or exited unexpectedly, please check the permission of couchdb-data, and couchdb-etc.
> Once CouchDB starts, these directories will be owned by uid:`5984`. Please chown it for that uid again.
2. Enable it in the background
```
$ docker run --name couchdb-for-ols -d --restart always -e COUCHDB_USER=${username} -e COUCHDB_PASSWORD=${password} -v ${PWD}/couchdb-data:/opt/couchdb/data -v ${PWD}/couchdb-etc:/opt/couchdb/etc/local.d -p 5984:5984 couchdb
```
If you prefer a compose file instead of docker run, here is the equivalent below:
Congrats, move on to [step 2](#2-run-couchdb-initsh-for-initialise)
### B. Using Docker Compose
#### 1. Prepare
```
# Creating the save data & configuration directories.
mkdir couchdb-data
mkdir couchdb-etc
```
#### 2. Create a `docker-compose.yml` file with the following added to it
```
services:
couchdb:
image: couchdb:latest
container_name: couchdb-for-ols
user: 1000:1000
user: 5984:5984
environment:
- COUCHDB_USER=${username}
- COUCHDB_PASSWORD=${password}
- COUCHDB_USER=<INSERT USERNAME HERE> #Please change as you like.
- COUCHDB_PASSWORD=<INSERT PASSWORD HERE> #Please change as you like.
volumes:
- ./couchdb-data:/opt/couchdb/data
- ./couchdb-etc:/opt/couchdb/etc/local.d
@@ -66,7 +83,30 @@ services:
- 5984:5984
restart: unless-stopped
```
### B. Install CouchDB directly
#### 3. Run the Docker Compose file to boot check
```
docker compose up
# Or if using the old version
docker-compose up
```
> [!WARNING]
> If your container threw an error or exited unexpectedly, please check the permissions of couchdb-data and couchdb-etc.
> Once CouchDB starts, these directories will be owned by uid `5984`. Please chown them to that uid again.
#### 4. Run the Docker Compose file in the background
If all went well and it didn't throw any errors, press `CTRL+C` to stop it, and then run this command:
```
docker compose up -d
# Or if using the old version
docker-compose up -d
```
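You may wish to confirm that the stack is healthy before moving on. A sketch using standard Compose commands:
```
# Show the service and its state
docker compose ps
# Inspect the CouchDB logs if anything looks wrong
docker compose logs couchdb
```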
Congrats, move on to [step 2](#2-run-couchdb-initsh-for-initialise)
### C. Install CouchDB directly
Please refer to the [official documentation](https://docs.couchdb.org/en/stable/install/index.html). However, we do not have to configure it fully; only the administrator needs to be configured.
## 2. Run couchdb-init.sh for initialise
@@ -92,6 +132,11 @@ If it results like the following:
Your CouchDB has been initialised successfully. If you want to perform this manually, please read the script.
If you are using Docker Compose and the above command does not work or displays `ERROR: Hostname missing`, you can try running the following command, replacing the placeholders with your own values:
```
curl -s https://raw.githubusercontent.com/vrtmrz/obsidian-livesync/main/utils/couchdb/couchdb-init.sh | hostname=http://<YOUR SERVER IP>:5984 username=<INSERT USERNAME HERE> password=<INSERT PASSWORD HERE> bash
```
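To confirm that the initialisation took effect, you can query the server directly. The following is a sketch assuming `hostname`, `username`, and `password` are set as in step A-1 (`host:port` without a scheme); the exact settings applied are defined in the script itself:
```
# The server should report itself as up
curl -s -u ${username}:${password} http://${hostname}/_up
# Reading a configuration section back confirms that the admin credentials work;
# see couchdb-init.sh for the full list of settings it applies
curl -s -u ${username}:${password} http://${hostname}/_node/_local/_config/chttpd
```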
## 3. Expose CouchDB to the Internet
- You can skip this instruction if you are using CouchDB only on an intranet and only with desktop devices.

View File

@@ -0,0 +1,81 @@
---
title: "JWT Authentication on CouchDB"
livesync-version: 0.25.24
tags:
- tips
- CouchDB
- JWT
authors:
- vorotamoroz
---
# JWT Authentication on CouchDB
When using CouchDB as a backend for Self-hosted LiveSync, it is possible to enhance security by employing JWT (JSON Web Token) Authentication. In particular, using asymmetric keys (ES256 and ES512) provides greater security against token interception.
## Setting up JWT Authentication (Asymmetrical Key Example)
### 1. Generate a key pair
We can use `openssl` to generate an EC key pair as follows:
```bash
# Generate private key
# ES512 uses the secp521r1 curve; we can also use ES256 with the prime256v1 curve
openssl ecparam -name secp521r1 -genkey -noout | openssl pkcs8 -topk8 -inform PEM -nocrypt -out private_key.pem
# openssl ecparam -name prime256v1 -genkey -noout | openssl pkcs8 -topk8 -inform PEM -nocrypt -out private_key.pem
# Generate public key in SPKI format
openssl ec -in private_key.pem -pubout -outform PEM -out public_key.pem
```
> [!TIP]
> A key generator will be provided again in a future version of the user interface.
### 2. Configure CouchDB to accept JWT tokens
The following configuration is required:
| Key | Value | Note |
| ------------------------------ | ----------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| chttpd/authentication_handlers | {chttpd_auth, jwt_authentication_handler} | In total, it may be `{chttpd_auth, jwt_authentication_handler}, {chttpd_auth, cookie_authentication_handler}, {chttpd_auth, default_authentication_handler}`, or something similar. |
| jwt_auth/required_claims | "exp" | |
| jwt_keys/ec:your_key_id        | Your public key in PEM (SPKI) format       | Replace `your_key_id` with your actual key ID; you may choose any value, and multiple keys can be added if needed. If you want to use HSxxx, set `jwt_keys/hmac:your_key_id` with your HMAC secret instead. |
Note: When configuring CouchDB via the web interface (Fauxton), the new-lines of the public key should be replaced with `\n` for the header and footer lines (weird, but true; I have tested it), as follows:
```
-----BEGIN PUBLIC KEY-----
\nMIGbMBAGByqGSM49AgEGBSuBBAAjA4GGAAQBq0irb/+K0Qzo7ayIHj0Xtthcntjz
r665J5UYdEQMiTtku5rnp95RuN97uA2pPOJOacMBAoiVUnZ1pqEBz9xH9yoAixji
Ju...........................................................gTt
/xtqrJRwrEy986oRZRQ=
\n-----END PUBLIC KEY-----
```
For detailed information, please refer to the [CouchDB JWT Authentication Documentation](https://docs.couchdb.org/en/stable/api/server/authn.html#jwt-authentication).
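If you prefer to apply the first two settings over the HTTP configuration API rather than through Fauxton, a sketch such as the following should work on a single-node installation (adjust the host and the admin credentials; for the public key itself, the newline handling described above applies, so Fauxton or editing `local.ini` directly may be less error-prone):
```bash
# Append the JWT handler to the authentication handlers
curl -X PUT -u admin:password \
  http://localhost:5984/_node/_local/_config/chttpd/authentication_handlers \
  -d '"{chttpd_auth, jwt_authentication_handler}, {chttpd_auth, cookie_authentication_handler}, {chttpd_auth, default_authentication_handler}"'
# Require the exp claim on incoming tokens
curl -X PUT -u admin:password \
  http://localhost:5984/_node/_local/_config/jwt_auth/required_claims \
  -d '"exp"'
```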
### 3. Configure Self-hosted LiveSync to use JWT Authentication
| Setting | Description |
| ----------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------- |
| Use JWT Authentication | Enable this option to use JWT Authentication. |
| JWT Algorithm | Select the JWT signing algorithm (e.g., ES256, ES512) that matches your key pair. |
| JWT Key | Paste your private key in PEM (pkcs8) format. |
| JWT Expiration Duration | Set the token expiration time in minutes. Locally cached tokens are also invalidated after this duration. |
| JWT Key ID (kid) | Enter the key ID that you used when configuring CouchDB, i.e., the one that replaced `your_key_id`. |
| JWT Subject (sub)       | Set your user ID; this overrides the original `Username` setting. If you still see access attributed to `Username`, JWT authorisation has failed.       |
> [!IMPORTANT]
> Requests from Self-hosted LiveSync to CouchDB are made as `_admin`. If you want to restrict access, configure `jwt_auth/roles_claim_name` to a custom claim name (Self-hosted LiveSync always sets the `_couchdb.roles` claim to `["_admin"]`).
### 4. Test the configuration
Simply try `Test Settings and Continue` in the remote setup dialogue. If you have successfully authenticated, you are all set.
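If you would like to check the CouchDB side independently of the plug-in, you can sign a token yourself (with any JWT tool that supports your algorithm, using your private key and the `kid`, `sub`, and `exp` values configured above) and ask CouchDB who it thinks you are. A sketch with a placeholder token:
```bash
# userCtx.name in the response should match the sub claim of your token
curl -H "Authorization: Bearer <YOUR SIGNED JWT>" http://localhost:5984/_session
```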
## Additional Notes
This feature is still experimental. Please make sure to test it thoroughly in your environment before deploying it to production.
Nevertheless, we think that this is a great step towards enhancing security when using CouchDB with Self-hosted LiveSync. We shall enable this setting by default in future releases.
We would love to hear your feedback and any issues you encounter.

View File

@@ -0,0 +1,29 @@
---
title: "Peer-to-Peer Synchronisation Tips"
livesync-version: 0.25.24
tags:
- tips
- p2p
authors:
- vorotamoroz
---
# Peer-to-Peer Synchronisation Tips
> [!IMPORTANT]
> Peer-to-peer synchronisation is still an experimental feature. Although we have made every effort to ensure its reliability, it may not function correctly in all environments.
## Difficulties with Peer-to-Peer Synchronisation
It is often the case that peer-to-peer connections do not function correctly, for instance, when using mobile data services.
In such circumstances, we recommend connecting all devices to a single Virtual Private Network (VPN). It is advisable to select a service, such as Tailscale, which facilitates direct communication between peers wherever possible.
Should one be in an environment where even Tailscale is unable to connect, or where it cannot be lawfully installed, please continue reading.
## A More Detailed Explanation
The failure of a Peer-to-Peer connection via WebRTC can be attributed to several factors. These may include an unsuccessful UDP hole-punching attempt, or an intermediary gateway intentionally terminating the connection. Troubleshooting this matter is not a simple undertaking. Furthermore, and rather unfortunately, gateway administrators are typically aware of this type of network behaviour. Whilst a legitimate purpose for such traffic can be cited, such as for web conferencing, this is often insufficient to prevent it from being blocked.
This situation, however, is the primary reason that our project does not provide a TURN server. Although it is said that a TURN server within WebRTC does not decrypt communications, the project holds the view that the risk of a malicious party impersonating a TURN server must be avoided. Consequently, configuring a TURN server for relay communication is not currently possible through the user interface. Furthermore, there is no official project TURN server, which is to say, one that could be monitored by a third party.
We request that you provide your own server, using your own Fully Qualified Domain Name (FQDN), and subsequently enter its details into the advanced settings.
For testing purposes, Cloudflare's Real-Time TURN Service is exceedingly convenient and offers a generous amount of free data. However, it must be noted that because it is a well-known destination, such traffic is highly conspicuous. There is also a significant possibility that it may be blocked by default. We advise proceeding with caution.

View File

@@ -12,13 +12,25 @@ import inlineWorkerPlugin from "esbuild-plugin-inline-worker";
import { terserOption } from "./terser.config.mjs";
import path from "node:path";
const prod = process.argv[2] === "production";
const prod = process.argv[2] === "production" || process.env?.BUILD_MODE === "production";
const keepTest = true; //!prod;
const manifestJson = JSON.parse(fs.readFileSync("./manifest.json") + "");
const packageJson = JSON.parse(fs.readFileSync("./package.json") + "");
const updateInfo = JSON.stringify(fs.readFileSync("./updates.md") + "");
const PATHS_TEST_INSTALL = process.env?.PATHS_TEST_INSTALL || "";
const PATH_TEST_INSTALL = PATHS_TEST_INSTALL.split(path.delimiter).map(p => p.trim()).filter(p => p.length);
if (!prod) {
if (PATH_TEST_INSTALL) {
console.log(`Built files will be copied to ${PATH_TEST_INSTALL}`);
} else {
console.log("Development build: You can install the plug-in to Obsidian for testing by exporting the PATHS_TEST_INSTALL environment variable with the paths to your vault plugins directories separated by your system path delimiter (':' on Unix, ';' on Windows).");
}
} else {
console.log("Production build");
}
const moduleAliasPlugin = {
name: "module-alias",
setup(build) {
@@ -95,6 +107,21 @@ const plugins = [
} else {
fs.copyFileSync("./main_org.js", "./main.js");
}
if (PATH_TEST_INSTALL) {
for (const installPath of PATH_TEST_INSTALL) {
const realPath = path.resolve(installPath);
console.log(`Copying built files to ${realPath}`);
if (!fs.existsSync(realPath)) {
console.warn(`Test install path ${installPath} does not exist`);
continue;
}
const manifestX = JSON.parse(fs.readFileSync("./manifest.json") + "");
manifestX.version = manifestJson.version + "." + Date.now();
fs.writeFileSync(path.join(installPath, "manifest.json"), JSON.stringify(manifestX, null, 2));
fs.copyFileSync("./main.js", path.join(installPath, "main.js"));
fs.copyFileSync("./styles.css", path.join(installPath, "styles.css"));
}
}
});
},
},

example.env (new file)
View File

@@ -0,0 +1 @@
PATHS_TEST_INSTALL=your-vault-plugin-path:and-another-path

View File

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.25.2",
"version": "0.25.24.beta3",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

View File

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.25.3",
"version": "0.25.30",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

package-lock.json (generated)

File diff suppressed because it is too large.

View File

@@ -1,6 +1,6 @@
{
"name": "obsidian-livesync",
"version": "0.25.3",
"version": "0.25.30",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"type": "module",
@@ -13,7 +13,7 @@
"postbakei18n": "prettier --config ./.prettierrc ./src/lib/src/common/messages/*.ts --write --log-level error",
"posti18n:yaml2json": "npm run prettyjson",
"predev": "npm run bakei18n",
"dev": "node esbuild.config.mjs",
"dev": "node --env-file=.env esbuild.config.mjs",
"prebuild": "npm run bakei18n",
"build": "node esbuild.config.mjs production",
"buildDev": "node esbuild.config.mjs dev",
@@ -23,7 +23,8 @@
"pretty": "npm run prettyNoWrite -- --write --log-level error",
"prettyCheck": "npm run prettyNoWrite -- --check",
"prettyNoWrite": "prettier --config ./.prettierrc \"**/*.js\" \"**/*.ts\" \"**/*.json\" ",
"check": "npm run lint && npm run svelte-check && npm run tsc-check"
"check": "npm run lint && npm run svelte-check",
"unittest": "deno test -A --no-check --coverage=cov_profile --v8-flags=--expose-gc --trace-leaks ./src/"
},
"keywords": [],
"author": "vorotamoroz",
@@ -33,7 +34,9 @@
"@eslint/compat": "^1.2.7",
"@eslint/eslintrc": "^3.3.0",
"@eslint/js": "^9.21.0",
"@tsconfig/svelte": "^5.0.4",
"@sveltejs/vite-plugin-svelte": "^6.2.1",
"@tsconfig/svelte": "^5.0.5",
"@types/deno": "^2.3.0",
"@types/diff-match-patch": "^1.0.36",
"@types/node": "^22.13.8",
"@types/pouchdb": "^6.4.2",
@@ -44,14 +47,15 @@
"@types/pouchdb-mapreduce": "^6.1.10",
"@types/pouchdb-replication": "^6.4.7",
"@types/transform-pouch": "^1.0.6",
"@typescript-eslint/eslint-plugin": "8.25.0",
"@typescript-eslint/parser": "8.25.0",
"@typescript-eslint/eslint-plugin": "8.46.2",
"@typescript-eslint/parser": "8.46.2",
"builtin-modules": "5.0.0",
"esbuild": "0.25.0",
"esbuild-svelte": "^0.9.0",
"eslint": "^9.21.0",
"eslint-plugin-import": "^2.31.0",
"eslint-plugin-svelte": "^3.0.2",
"esbuild-plugin-inline-worker": "^0.1.1",
"esbuild-svelte": "^0.9.3",
"eslint": "^9.38.0",
"eslint-plugin-import": "^2.32.0",
"eslint-plugin-svelte": "^3.12.4",
"events": "^3.3.0",
"glob": "^11.0.3",
"obsidian": "^1.8.7",
@@ -60,6 +64,7 @@
"pouchdb-adapter-http": "^9.0.0",
"pouchdb-adapter-idb": "^9.0.0",
"pouchdb-adapter-indexeddb": "^9.0.0",
"pouchdb-adapter-memory": "^9.0.0",
"pouchdb-core": "^9.0.0",
"pouchdb-errors": "^9.0.0",
"pouchdb-find": "^9.0.0",
@@ -68,13 +73,14 @@
"pouchdb-replication": "^9.0.0",
"pouchdb-utils": "^9.0.0",
"prettier": "3.5.2",
"svelte": "5.28.6",
"svelte": "5.41.1",
"svelte-check": "^4.3.3",
"svelte-preprocess": "^6.0.3",
"terser": "^5.39.0",
"transform-pouch": "^2.0.0",
"tslib": "^2.8.1",
"tsx": "^4.19.4",
"typescript": "5.7.3",
"tsx": "^4.20.6",
"typescript": "5.9.3",
"yaml": "^2.8.0"
},
"dependencies": {
@@ -85,14 +91,12 @@
"@smithy/protocol-http": "^5.1.0",
"@smithy/querystring-builder": "^4.0.2",
"diff-match-patch": "^1.0.5",
"esbuild-plugin-inline-worker": "^0.1.1",
"fflate": "^0.8.2",
"idb": "^8.0.3",
"minimatch": "^10.0.1",
"octagonal-wheels": "^0.1.37",
"minimatch": "^10.0.2",
"octagonal-wheels": "^0.1.44",
"qrcode-generator": "^1.4.4",
"svelte-check": "^4.1.7",
"trystero": "^0.21.5",
"trystero": "^0.22.0",
"xxhash-wasm-102": "npm:xxhash-wasm@^1.0.2"
}
}

View File

@@ -1,13 +1,5 @@
import { deleteDB, type IDBPDatabase, openDB } from "idb";
export interface KeyValueDatabase {
get<T>(key: IDBValidKey): Promise<T>;
set<T>(key: IDBValidKey, value: T): Promise<IDBValidKey>;
del(key: IDBValidKey): Promise<void>;
clear(): Promise<void>;
keys(query?: IDBValidKey | IDBKeyRange, count?: number): Promise<IDBValidKey[]>;
close(): void;
destroy(): Promise<void>;
}
import type { KeyValueDatabase } from "../lib/src/interfaces/KeyValueDatabase.ts";
const databaseCache: { [key: string]: IDBPDatabase<any> } = {};
export const OpenKeyValueDatabase = async (dbKey: string): Promise<KeyValueDatabase> => {
if (dbKey in databaseCache) {
@@ -16,8 +8,8 @@ export const OpenKeyValueDatabase = async (dbKey: string): Promise<KeyValueDatab
}
const storeKey = dbKey;
const dbPromise = openDB(dbKey, 1, {
upgrade(db) {
db.createObjectStore(storeKey);
upgrade(db, _oldVersion, _newVersion, _transaction, _event) {
return db.createObjectStore(storeKey);
},
});
const db = await dbPromise;

View File

@@ -20,6 +20,8 @@ export const EVENT_REQUEST_OPEN_P2P = "request-open-p2p";
export const EVENT_REQUEST_CLOSE_P2P = "request-close-p2p";
export const EVENT_REQUEST_RUN_DOCTOR = "request-run-doctor";
export const EVENT_REQUEST_RUN_FIX_INCOMPLETE = "request-run-fix-incomplete";
export const EVENT_ON_UNRESOLVED_ERROR = "on-unresolved-error";
// export const EVENT_FILE_CHANGED = "file-changed";
@@ -38,6 +40,8 @@ declare global {
[EVENT_REQUEST_COPY_SETUP_URI]: undefined;
[EVENT_REQUEST_SHOW_SETUP_QR]: undefined;
[EVENT_REQUEST_RUN_DOCTOR]: string;
[EVENT_REQUEST_RUN_FIX_INCOMPLETE]: undefined;
[EVENT_ON_UNRESOLVED_ERROR]: undefined;
}
}

View File

@@ -1,4 +1,4 @@
import { PersistentMap } from "../lib/src/dataobject/PersistentMap.ts";
import { PersistentMap } from "octagonal-wheels/dataobject/PersistentMap";
export let sameChangePairs: PersistentMap<number[]>;

View File

@@ -1,11 +1,6 @@
import { type PluginManifest, TFile } from "../deps.ts";
import {
type DatabaseEntry,
type EntryBody,
type FilePath,
type UXFileInfoStub,
type UXInternalFileInfoStub,
} from "../lib/src/common/types.ts";
import { type DatabaseEntry, type EntryBody, type FilePath } from "../lib/src/common/types.ts";
export type { CacheData, FileEventItem } from "../lib/src/common/types.ts";
export interface PluginDataEntry extends DatabaseEntry {
deviceVaultName: string;
@@ -54,23 +49,6 @@ export type queueItem = {
warned?: boolean;
};
export type CacheData = string | ArrayBuffer;
export type FileEventType = "CREATE" | "DELETE" | "CHANGED" | "INTERNAL";
export type FileEventArgs = {
file: UXFileInfoStub | UXInternalFileInfoStub;
cache?: CacheData;
oldPath?: string;
ctx?: any;
};
export type FileEventItem = {
type: FileEventType;
args: FileEventArgs;
key: string;
skipBatchWait?: boolean;
cancelled?: boolean;
batched?: boolean;
};
// Hidden items (Now means `chunk`)
export const CHeader = "h:";
@@ -87,5 +65,5 @@ export const ICHeaderLength = ICHeader.length;
export const ICXHeader = "ix:";
export const FileWatchEventQueueMax = 10;
export const configURIBase = "obsidian://setuplivesync?settings=";
export const configURIBaseQR = "obsidian://setuplivesync?settingsQR=";
export { configURIBase, configURIBaseQR } from "../lib/src/common/types.ts";

View File

@@ -28,13 +28,14 @@ import type ObsidianLiveSyncPlugin from "../main.ts";
import { writeString } from "../lib/src/string_and_binary/convert.ts";
import { fireAndForget } from "../lib/src/common/utils.ts";
import { sameChangePairs } from "./stores.ts";
import type { KeyValueDatabase } from "./KeyValueDB.ts";
import { scheduleTask } from "octagonal-wheels/concurrency/task";
import { EVENT_PLUGIN_UNLOADED, eventHub } from "./events.ts";
import { promiseWithResolver, type PromiseWithResolvers } from "octagonal-wheels/promises";
import { AuthorizationHeaderGenerator } from "../lib/src/replication/httplib.ts";
import type { KeyValueDatabase } from "../lib/src/interfaces/KeyValueDatabase.ts";
export { scheduleTask, cancelTask, cancelAllTasks } from "../lib/src/concurrency/task.ts";
export { scheduleTask, cancelTask, cancelAllTasks } from "octagonal-wheels/concurrency/task";
// For backward compatibility, using the path for determining id.
// Only CouchDB unacceptable ID (that starts with an underscore) has been prefixed with "/".
@@ -188,7 +189,7 @@ export class PeriodicProcessor {
() =>
fireAndForget(async () => {
await this.process();
if (this._plugin.$$isUnloaded()) {
if (this._plugin.services?.appLifecycle?.hasUnloaded()) {
this.disable();
}
}),
@@ -565,119 +566,3 @@ export function updatePreviousExecutionTime(key: string, timeDelta: number = 0)
}
waitingTasks[key].leastNext = Math.max(Date.now() + timeDelta, waitingTasks[key].leastNext);
}
const prefixMapObject = {
s: {
1: "V",
2: "W",
3: "X",
4: "Y",
5: "Z",
},
o: {
1: "v",
2: "w",
3: "x",
4: "y",
5: "z",
},
} as Record<string, Record<number, string>>;
const decodePrefixMapObject = Object.fromEntries(
Object.entries(prefixMapObject).flatMap(([prefix, map]) =>
Object.entries(map).map(([len, char]) => [char, { prefix, len: parseInt(len) }])
)
);
const prefixMapNumber = {
n: {
1: "a",
2: "b",
3: "c",
4: "d",
5: "e",
},
N: {
1: "A",
2: "B",
3: "C",
4: "D",
5: "E",
},
} as Record<string, Record<number, string>>;
const decodePrefixMapNumber = Object.fromEntries(
Object.entries(prefixMapNumber).flatMap(([prefix, map]) =>
Object.entries(map).map(([len, char]) => [char, { prefix, len: parseInt(len) }])
)
);
export function encodeAnyArray(obj: any[]): string {
const tempArray = obj.map((v) => {
if (v === null) return "n";
if (v === false) return "f";
if (v === true) return "t";
if (v === undefined) return "u";
if (typeof v == "number") {
const b36 = v.toString(36);
const strNum = v.toString();
const expression = b36.length < strNum.length ? "N" : "n";
const encodedStr = expression == "N" ? b36 : strNum;
const len = encodedStr.length.toString(36);
const lenLen = len.length;
const prefix2 = prefixMapNumber[expression][lenLen];
return prefix2 + len + encodedStr;
}
const str = typeof v == "string" ? v : JSON.stringify(v);
const prefix = typeof v == "string" ? "s" : "o";
const length = str.length.toString(36);
const lenLen = length.length;
const prefix2 = prefixMapObject[prefix][lenLen];
return prefix2 + length + str;
});
const w = tempArray.join("");
return w;
}
const decodeMapConstant = {
u: undefined,
n: null,
f: false,
t: true,
} as Record<string, any>;
export function decodeAnyArray(str: string): any[] {
const result = [];
let i = 0;
while (i < str.length) {
const char = str[i];
i++;
if (char in decodeMapConstant) {
result.push(decodeMapConstant[char]);
continue;
}
if (char in decodePrefixMapNumber) {
const { prefix, len } = decodePrefixMapNumber[char];
const lenStr = str.substring(i, i + len);
i += len;
const radix = prefix == "N" ? 36 : 10;
const lenNum = parseInt(lenStr, 36);
const value = str.substring(i, i + lenNum);
i += lenNum;
result.push(parseInt(value, radix));
continue;
}
const { prefix, len } = decodePrefixMapObject[char];
const lenStr = str.substring(i, i + len);
i += len;
const lenNum = parseInt(lenStr, 36);
const value = str.substring(i, i + lenNum);
i += lenNum;
if (prefix == "s") {
result.push(value);
} else {
result.push(JSON.parse(value));
}
}
return result;
}

View File

@@ -45,7 +45,7 @@ import {
} from "../../lib/src/common/utils.ts";
import { digestHash } from "../../lib/src/string_and_binary/hash.ts";
import { arrayBufferToBase64, decodeBinary, readString } from "../../lib/src/string_and_binary/convert.ts";
import { serialized, shareRunningResult } from "../../lib/src/concurrency/lock.ts";
import { serialized, shareRunningResult } from "octagonal-wheels/concurrency/lock";
import { LiveSyncCommands } from "../LiveSyncCommands.ts";
import { stripAllPrefixes } from "../../lib/src/string_and_binary/path.ts";
import {
@@ -62,20 +62,29 @@ import {
scheduleTask,
} from "../../common/utils.ts";
import { JsonResolveModal } from "../HiddenFileCommon/JsonResolveModal.ts";
import { QueueProcessor } from "../../lib/src/concurrency/processor.ts";
import { QueueProcessor } from "octagonal-wheels/concurrency/processor";
import { pluginScanningCount } from "../../lib/src/mock_and_interop/stores.ts";
import type ObsidianLiveSyncPlugin from "../../main.ts";
import { base64ToArrayBuffer, base64ToString } from "octagonal-wheels/binary/base64";
import { ConflictResolveModal } from "../../modules/features/InteractiveConflictResolving/ConflictResolveModal.ts";
import { Semaphore } from "octagonal-wheels/concurrency/semaphore";
import type { IObsidianModule } from "../../modules/AbstractObsidianModule.ts";
import { EVENT_REQUEST_OPEN_PLUGIN_SYNC_DIALOG, eventHub } from "../../common/events.ts";
import { PluginDialogModal } from "./PluginDialogModal.ts";
import { $msg } from "src/lib/src/common/i18n.ts";
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
import type { LiveSyncCore } from "../../main.ts";
const d = "\u200b";
const d2 = "\n";
declare global {
interface OPTIONAL_SYNC_FEATURES {
DISABLE: "DISABLE";
CUSTOMIZE: "CUSTOMIZE";
DISABLE_CUSTOM: "DISABLE_CUSTOM";
}
}
function serialize(data: PluginDataEx): string {
// For higher performance, create custom plug-in data strings.
// Self-hosted LiveSync uses `\n` to split chunks. Therefore, grouping together those with similar entropy would work nicely.
@@ -384,7 +393,7 @@ export type PluginDataEx = {
mtime: number;
};
export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
export class ConfigSync extends LiveSyncCommands {
constructor(plugin: ObsidianLiveSyncPlugin) {
super(plugin);
pluginScanningCount.onChanged((e) => {
@@ -402,7 +411,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
get useSyncPluginEtc() {
return this.plugin.settings.usePluginEtc;
}
_isThisModuleEnabled() {
isThisModuleEnabled() {
return this.plugin.settings.usePluginSync;
}
@@ -411,7 +420,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
pluginList: IPluginDataExDisplay[] = [];
showPluginSyncModal() {
if (!this._isThisModuleEnabled()) {
if (!this.isThisModuleEnabled()) {
return;
}
if (this.pluginDialog) {
@@ -482,8 +491,8 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
// Idea non-filter option?
return this.getFileCategory(filePath) != "";
}
async $everyOnDatabaseInitialized(showNotice: boolean) {
if (!this._isThisModuleEnabled()) return true;
private async _everyOnDatabaseInitialized(showNotice: boolean) {
if (!this.isThisModuleEnabled()) return true;
try {
this._log("Scanning customizations...");
await this.scanAllConfigFiles(showNotice);
@@ -494,16 +503,16 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
return true;
}
async $everyBeforeReplicate(showNotice: boolean) {
if (!this._isThisModuleEnabled()) return true;
async _everyBeforeReplicate(showNotice: boolean) {
if (!this.isThisModuleEnabled()) return true;
if (this.settings.autoSweepPlugins) {
await this.scanAllConfigFiles(showNotice);
return true;
}
return true;
}
async $everyOnResumeProcess(): Promise<boolean> {
if (!this._isThisModuleEnabled()) return true;
async _everyOnResumeProcess(): Promise<boolean> {
if (!this.isThisModuleEnabled()) return true;
if (this._isMainSuspended()) {
return true;
}
@@ -517,9 +526,9 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
);
return true;
}
$everyAfterResumeProcess(): Promise<boolean> {
_everyAfterResumeProcess(): Promise<boolean> {
const q = activeDocument.querySelector(`.livesync-ribbon-showcustom`);
q?.toggleClass("sls-hidden", !this._isThisModuleEnabled());
q?.toggleClass("sls-hidden", !this.isThisModuleEnabled());
return Promise.resolve(true);
}
async reloadPluginList(showMessage: boolean) {
@@ -633,7 +642,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
).startPipeline();
filenameToUnifiedKey(path: string, termOverRide?: string) {
const term = termOverRide || this.plugin.$$getDeviceAndVaultName();
const term = termOverRide || this.services.setting.getDeviceAndVaultName();
const category = this.getFileCategory(path);
const name =
category == "CONFIG" || category == "SNIPPET"
@@ -645,7 +654,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
filenameWithUnifiedKey(path: string, termOverRide?: string) {
const term = termOverRide || this.plugin.$$getDeviceAndVaultName();
const term = termOverRide || this.services.setting.getDeviceAndVaultName();
const category = this.getFileCategory(path);
const name =
category == "CONFIG" || category == "SNIPPET" ? path.split("/").slice(-1)[0] : path.split("/").slice(-2)[0];
@@ -654,7 +663,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
unifiedKeyPrefixOfTerminal(termOverRide?: string) {
const term = termOverRide || this.plugin.$$getDeviceAndVaultName();
const term = termOverRide || this.services.setting.getDeviceAndVaultName();
return `${ICXHeader}${term}/` as FilePathWithPrefix;
}
@@ -831,7 +840,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
const v2Path = (prefixPath + relativeFilename) as FilePathWithPrefix;
// console.warn(`Migrating ${v1Path} / ${relativeFilename} to ${v2Path}`);
this._log(`Migrating ${v1Path} / ${relativeFilename} to ${v2Path}`, LOG_LEVEL_VERBOSE);
const newId = await this.plugin.$$path2id(v2Path);
const newId = await this.services.path.path2id(v2Path);
// const buf =
const data = createBlob([DUMMY_HEAD, DUMMY_END, ...getDocDataAsArray(f.data)]);
@@ -861,7 +870,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
if (!this._isThisModuleEnabled()) {
if (!this.isThisModuleEnabled()) {
this.pluginScanProcessor.clearQueue();
this.pluginList = [];
pluginList.set(this.pluginList);
@@ -999,7 +1008,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
await this.plugin.storageAccess.ensureDir(path);
// If the content has applied, modified time will be updated to the current time.
await this.plugin.storageAccess.writeHiddenFileAuto(path, content);
await this.storeCustomisationFileV2(path, this.plugin.$$getDeviceAndVaultName());
await this.storeCustomisationFileV2(path, this.services.setting.getDeviceAndVaultName());
} else {
const files = data.files;
for (const f of files) {
@@ -1042,7 +1051,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
await this.plugin.storageAccess.writeHiddenFileAuto(path, content, stat);
}
this._log(`Applied ${f.filename} of ${data.displayName || data.name}..`);
await this.storeCustomisationFileV2(path, this.plugin.$$getDeviceAndVaultName());
await this.storeCustomisationFileV2(path, this.services.setting.getDeviceAndVaultName());
}
}
} catch (ex) {
@@ -1114,7 +1123,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
);
}
} else if (data.category == "CONFIG") {
this.plugin.$$askReload();
this.services.appLifecycle.askRestart();
}
return true;
} catch (ex) {
@@ -1157,15 +1166,15 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
return false;
}
}
async $anyModuleParsedReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>) {
if (!docs._id.startsWith(ICXHeader)) return undefined;
if (this._isThisModuleEnabled()) {
async _anyModuleParsedReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>) {
if (!docs._id.startsWith(ICXHeader)) return false;
if (this.isThisModuleEnabled()) {
await this.updatePluginList(
false,
(docs as AnyEntry).path ? (docs as AnyEntry).path : this.getPath(docs as AnyEntry)
);
}
if (this._isThisModuleEnabled() && this.plugin.settings.notifyPluginOrSettingUpdated) {
if (this.isThisModuleEnabled() && this.plugin.settings.notifyPluginOrSettingUpdated) {
if (!this.pluginDialog || (this.pluginDialog && !this.pluginDialog.isOpened())) {
const fragment = createFragment((doc) => {
doc.createEl("span", undefined, (a) => {
@@ -1205,11 +1214,11 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
return true;
}
async $everyRealizeSettingSyncMode(): Promise<boolean> {
async _everyRealizeSettingSyncMode(): Promise<boolean> {
this.periodicPluginSweepProcessor?.disable();
if (!this._isMainReady) return true;
if (!this._isMainSuspended()) return true;
if (!this._isThisModuleEnabled()) return true;
if (!this.isThisModuleEnabled()) return true;
if (this.settings.autoSweepPlugins) {
await this.scanAllConfigFiles(false);
}
@@ -1345,7 +1354,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
});
}
async storeCustomizationFiles(path: FilePath, termOverRide?: string) {
const term = termOverRide || this.plugin.$$getDeviceAndVaultName();
const term = termOverRide || this.services.setting.getDeviceAndVaultName();
if (term == "") {
this._log("We have to configure the device name", LOG_LEVEL_NOTICE);
return;
@@ -1488,14 +1497,14 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
});
}
async $anyProcessOptionalFileEvent(path: FilePath): Promise<boolean | undefined> {
async _anyProcessOptionalFileEvent(path: FilePath): Promise<boolean> {
return await this.watchVaultRawEventsAsync(path);
}
async watchVaultRawEventsAsync(path: FilePath) {
if (!this._isMainReady) return false;
if (this._isMainSuspended()) return false;
if (!this._isThisModuleEnabled()) return false;
if (!this.isThisModuleEnabled()) return false;
// if (!this.isTargetPath(path)) return false;
const stat = await this.plugin.storageAccess.statHidden(path);
// Make sure that target is a file.
@@ -1535,7 +1544,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
await shareRunningResult("scanAllConfigFiles", async () => {
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
this._log("Scanning customizing files.", logLevel, "scan-all-config");
const term = this.plugin.$$getDeviceAndVaultName();
const term = this.services.setting.getDeviceAndVaultName();
if (term == "") {
this._log("We have to configure the device name", LOG_LEVEL_NOTICE);
return;
@@ -1673,11 +1682,14 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
return filenames as FilePath[];
}
async $allAskUsingOptionalSyncFeature(opt: { enableFetch?: boolean; enableOverwrite?: boolean }): Promise<boolean> {
await this._askHiddenFileConfiguration(opt);
private async _allAskUsingOptionalSyncFeature(opt: {
enableFetch?: boolean;
enableOverwrite?: boolean;
}): Promise<boolean> {
await this.__askHiddenFileConfiguration(opt);
return true;
}
async _askHiddenFileConfiguration(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
private async __askHiddenFileConfiguration(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
const message = `Would you like to enable **Customization sync**?
> [!DETAILS]-
@@ -1707,7 +1719,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
}
$anyGetOptionalConflictCheckMethod(path: FilePathWithPrefix): Promise<boolean | "newer"> {
_anyGetOptionalConflictCheckMethod(path: FilePathWithPrefix): Promise<boolean | "newer"> {
if (isPluginMetadata(path)) {
return Promise.resolve("newer");
}
@@ -1717,7 +1729,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
return Promise.resolve(false);
}
$allSuspendExtraSync(): Promise<boolean> {
private _allSuspendExtraSync(): Promise<boolean> {
if (this.plugin.settings.usePluginSync || this.plugin.settings.autoSweepPlugins) {
this._log(
"Customisation sync have been temporarily disabled. Please enable them after the fetching, if you need them.",
@@ -1729,10 +1741,11 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
return Promise.resolve(true);
}
async $anyConfigureOptionalSyncFeature(mode: "CUSTOMIZE" | "DISABLE" | "DISABLE_CUSTOM") {
private async _allConfigureOptionalSyncFeature(mode: keyof OPTIONAL_SYNC_FEATURES) {
await this.configureHiddenFileSync(mode);
return true;
}
async configureHiddenFileSync(mode: "CUSTOMIZE" | "DISABLE" | "DISABLE_CUSTOM") {
async configureHiddenFileSync(mode: keyof OPTIONAL_SYNC_FEATURES) {
if (mode == "DISABLE") {
this.plugin.settings.usePluginSync = false;
await this.plugin.saveSettings();
@@ -1740,7 +1753,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
if (mode == "CUSTOMIZE") {
if (!this.plugin.$$getDeviceAndVaultName()) {
if (!this.services.setting.getDeviceAndVaultName()) {
let name = await this.plugin.confirm.askString("Device name", "Please set this device name", `desktop`);
if (!name) {
if (Platform.isAndroidApp) {
@@ -1764,7 +1777,7 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
name = name + Math.random().toString(36).slice(-4);
}
this.plugin.$$setDeviceAndVaultName(name);
this.services.setting.setDeviceAndVaultName(name);
}
this.plugin.settings.usePluginSync = true;
this.plugin.settings.useAdvancedMode = true;
@@ -1789,4 +1802,17 @@ export class ConfigSync extends LiveSyncCommands implements IObsidianModule {
}
return files;
}
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.fileProcessing.handleOptionalFileEvent(this._anyProcessOptionalFileEvent.bind(this));
services.conflict.handleGetOptionalConflictCheckMethod(this._anyGetOptionalConflictCheckMethod.bind(this));
services.replication.handleProcessVirtualDocuments(this._anyModuleParsedReplicationResultItem.bind(this));
services.setting.handleOnRealiseSetting(this._everyRealizeSettingSyncMode.bind(this));
services.appLifecycle.handleOnResuming(this._everyOnResumeProcess.bind(this));
services.appLifecycle.handleOnResumed(this._everyAfterResumeProcess.bind(this));
services.replication.handleBeforeReplicate(this._everyBeforeReplicate.bind(this));
services.databaseEvents.handleDatabaseInitialised(this._everyOnDatabaseInitialized.bind(this));
services.setting.handleSuspendExtraSync(this._allSuspendExtraSync.bind(this));
services.setting.handleSuggestOptionalFeatures(this._allAskUsingOptionalSyncFeature.bind(this));
services.setting.handleEnableOptionalFeature(this._allConfigureOptionalSyncFeature.bind(this));
}
}

View File

@@ -25,7 +25,7 @@
export let plugin: ObsidianLiveSyncPlugin;
$: hideNotApplicable = false;
$: thisTerm = plugin.$$getDeviceAndVaultName();
$: thisTerm = plugin.services.setting.getDeviceAndVaultName();
const addOn = plugin.getAddOn(ConfigSync.name) as ConfigSync;
if (!addOn) {
@@ -98,7 +98,7 @@
await requestUpdate();
}
async function replicate() {
await plugin.$$replicate(true);
await plugin.services.replication.replicate(true);
}
function selectAllNewest(selectMode: boolean) {
selectNewestPulse++;
@@ -237,7 +237,7 @@
plugin.settings.pluginSyncExtendedSetting[key].files = files;
plugin.settings.pluginSyncExtendedSetting[key].mode = mode;
}
plugin.$$saveSettingData();
plugin.services.setting.saveSettingData();
}
function getIcon(mode: SYNC_MODE) {
if (mode in ICONS) {

View File

@@ -1,4 +1,4 @@
import { normalizePath, type PluginManifest, type ListedFiles } from "../../deps.ts";
import { type PluginManifest, type ListedFiles } from "../../deps.ts";
import {
type LoadedEntry,
type FilePathWithPrefix,
@@ -10,7 +10,6 @@ import {
MODE_PAUSED,
type SavingEntry,
type DocumentID,
type FilePathWithPrefixLC,
type UXFileInfo,
type UXStat,
LOG_LEVEL_DEBUG,
@@ -45,18 +44,26 @@ import {
BASE_IS_NEW,
EVEN,
} from "../../common/utils.ts";
import { serialized, skipIfDuplicated } from "../../lib/src/concurrency/lock.ts";
import { serialized, skipIfDuplicated } from "octagonal-wheels/concurrency/lock";
import { JsonResolveModal } from "../HiddenFileCommon/JsonResolveModal.ts";
import { LiveSyncCommands } from "../LiveSyncCommands.ts";
import { addPrefix, stripAllPrefixes } from "../../lib/src/string_and_binary/path.ts";
import { QueueProcessor } from "../../lib/src/concurrency/processor.ts";
import { QueueProcessor } from "octagonal-wheels/concurrency/processor";
import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "../../lib/src/mock_and_interop/stores.ts";
import type { IObsidianModule } from "../../modules/AbstractObsidianModule.ts";
import { EVENT_SETTING_SAVED, eventHub } from "../../common/events.ts";
import type { LiveSyncLocalDB } from "../../lib/src/pouchdb/LiveSyncLocalDB.ts";
import { Semaphore } from "octagonal-wheels/concurrency/semaphore";
import type { LiveSyncCore } from "../../main.ts";
type SyncDirection = "push" | "pull" | "safe" | "pullForce" | "pushForce";
declare global {
interface OPTIONAL_SYNC_FEATURES {
FETCH: "FETCH";
OVERWRITE: "OVERWRITE";
MERGE: "MERGE";
DISABLE: "DISABLE";
DISABLE_HIDDEN: "DISABLE_HIDDEN";
}
}
function getComparingMTime(
doc: (MetaEntry | LoadedEntry | false) | UXFileInfo | UXStat | null | undefined,
includeDeleted = false
@@ -72,21 +79,21 @@ function getComparingMTime(
return doc.mtime ?? 0;
}
export class HiddenFileSync extends LiveSyncCommands implements IObsidianModule {
_isThisModuleEnabled() {
export class HiddenFileSync extends LiveSyncCommands {
isThisModuleEnabled() {
return this.plugin.settings.syncInternalFiles;
}
periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(
this.plugin,
async () => this._isThisModuleEnabled() && this._isDatabaseReady() && (await this.scanAllStorageChanges(false))
async () => this.isThisModuleEnabled() && this._isDatabaseReady() && (await this.scanAllStorageChanges(false))
);
get kvDB() {
return this.plugin.kvDB;
}
getConflictedDoc(path: FilePathWithPrefix, rev: string) {
return this.plugin.localDatabase.getConflictedDoc(path, rev);
return this.plugin.managers.conflictManager.getConflictedDoc(path, rev);
}
onunload() {
this.periodicInternalFileScanProcessor?.disable();
@@ -132,14 +139,18 @@ export class HiddenFileSync extends LiveSyncCommands implements IObsidianModule
this.updateSettingCache();
});
}
async $everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
// We cannot initialise autosaveCache because kvDB is not ready yet
// async _everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
// this._fileInfoLastProcessed = await autosaveCache(this.kvDB, "hidden-file-lastProcessed");
// this._databaseInfoLastProcessed = await autosaveCache(this.kvDB, "hidden-file-lastProcessed-database");
// this._fileInfoLastKnown = await autosaveCache(this.kvDB, "hidden-file-lastKnown");
// return true;
// }
private async _everyOnDatabaseInitialized(showNotice: boolean) {
this._fileInfoLastProcessed = await autosaveCache(this.kvDB, "hidden-file-lastProcessed");
this._databaseInfoLastProcessed = await autosaveCache(this.kvDB, "hidden-file-lastProcessed-database");
this._fileInfoLastKnown = await autosaveCache(this.kvDB, "hidden-file-lastKnown");
return true;
}
async $everyOnDatabaseInitialized(showNotice: boolean) {
if (this._isThisModuleEnabled()) {
if (this.isThisModuleEnabled()) {
if (this._fileInfoLastProcessed.size == 0 && this._fileInfoLastProcessed.size == 0) {
this._log(`No cache found. Performing startup scan.`, LOG_LEVEL_VERBOSE);
await this.performStartupScan(true);
@@ -149,9 +160,9 @@ export class HiddenFileSync extends LiveSyncCommands implements IObsidianModule
}
return true;
}
async $everyBeforeReplicate(showNotice: boolean) {
async _everyBeforeReplicate(showNotice: boolean) {
if (
this._isThisModuleEnabled() &&
this.isThisModuleEnabled() &&
this._isDatabaseReady() &&
this.settings.syncInternalFilesBeforeReplication &&
!this.settings.watchInternalFileChanges
@@ -161,79 +172,66 @@ export class HiddenFileSync extends LiveSyncCommands implements IObsidianModule
return true;
}
$everyOnloadAfterLoadSettings(): Promise<boolean> {
private _everyOnloadAfterLoadSettings(): Promise<boolean> {
this.updateSettingCache();
return Promise.resolve(true);
}
updateSettingCache() {
const ignorePatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
this.ignorePatterns = ignorePatterns;
const targetFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
this.targetPatterns = targetFilter;
this.shouldSkipFile = [] as FilePathWithPrefixLC[];
// Exclude files handled by customization sync
const configDir = normalizePath(this.app.vault.configDir);
const shouldSKip = !this.settings.usePluginSync
? []
: Object.values(this.settings.pluginSyncExtendedSetting)
.filter((e) => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED)
.map((e) => e.files)
.flat()
.map((e) => `${configDir}/${e}`.toLowerCase());
this.shouldSkipFile = shouldSKip as FilePathWithPrefixLC[];
this._log(`Hidden file will skip ${this.shouldSkipFile.length} files`, LOG_LEVEL_INFO);
updateSettingCache() {
this.cacheCustomisationSyncIgnoredFiles.clear();
this.cacheFileRegExps.clear();
}
isReady() {
if (!this._isMainReady) return false;
if (this._isMainSuspended()) return false;
if (!this._isThisModuleEnabled()) return false;
if (!this.isThisModuleEnabled()) return false;
return true;
}
shouldSkipFile = [] as FilePathWithPrefixLC[];
async performStartupScan(showNotice: boolean) {
await this.applyOfflineChanges(showNotice);
}
async $everyOnResumeProcess(): Promise<boolean> {
async _everyOnResumeProcess(): Promise<boolean> {
this.periodicInternalFileScanProcessor?.disable();
if (this._isMainSuspended()) return true;
if (this._isThisModuleEnabled()) {
if (this.isThisModuleEnabled()) {
await this.performStartupScan(false);
}
this.periodicInternalFileScanProcessor.enable(
this._isThisModuleEnabled() && this.settings.syncInternalFilesInterval
this.isThisModuleEnabled() && this.settings.syncInternalFilesInterval
? this.settings.syncInternalFilesInterval * 1000
: 0
);
return true;
}
$everyRealizeSettingSyncMode(): Promise<boolean> {
_everyRealizeSettingSyncMode(): Promise<boolean> {
this.periodicInternalFileScanProcessor?.disable();
if (this._isMainSuspended()) return Promise.resolve(true);
if (!this.plugin.$$isReady()) return Promise.resolve(true);
if (!this.services.appLifecycle.isReady()) return Promise.resolve(true);
this.periodicInternalFileScanProcessor.enable(
this._isThisModuleEnabled() && this.settings.syncInternalFilesInterval
this.isThisModuleEnabled() && this.settings.syncInternalFilesInterval
? this.settings.syncInternalFilesInterval * 1000
: 0
);
const ignorePatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
this.ignorePatterns = ignorePatterns;
const targetFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
this.targetPatterns = targetFilter;
this.cacheFileRegExps.clear();
// const ignorePatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
// this.ignorePatterns = ignorePatterns;
// const targetFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
// this.targetPatterns = targetFilter;
return Promise.resolve(true);
}
async $anyProcessOptionalFileEvent(path: FilePath): Promise<boolean | undefined> {
async _anyProcessOptionalFileEvent(path: FilePath): Promise<boolean> {
if (this.isReady()) {
return await this.trackStorageFileModification(path);
return (await this.trackStorageFileModification(path)) || false;
}
return false;
}
$anyGetOptionalConflictCheckMethod(path: FilePathWithPrefix): Promise<boolean | "newer"> {
_anyGetOptionalConflictCheckMethod(path: FilePathWithPrefix): Promise<boolean | "newer"> {
if (isInternalMetadata(path)) {
this.queueConflictCheck(path);
return Promise.resolve(true);
@@ -241,12 +239,12 @@ export class HiddenFileSync extends LiveSyncCommands implements IObsidianModule
return Promise.resolve(false);
}
async $anyProcessOptionalSyncFiles(doc: LoadedEntry): Promise<boolean | undefined> {
async _anyProcessOptionalSyncFiles(doc: LoadedEntry): Promise<boolean> {
if (isInternalMetadata(doc._id)) {
if (this._isThisModuleEnabled()) {
if (this.isThisModuleEnabled()) {
//system file
const filename = getPath(doc);
if (await this.plugin.$$isTargetFile(filename)) {
if (await this.services.vault.isTargetFile(filename)) {
// this.procInternalFile(filename);
await this.processReplicationResult(doc);
return true;
@@ -545,8 +543,11 @@ Offline Changed files: ${processFiles.length}`;
forceWrite = false,
includeDeleted = true
): Promise<boolean | undefined> {
if (this.shouldSkipFile.some((e) => e.startsWith(path.toLowerCase()))) {
this._log(`Hidden file skipped: ${path} is synchronized in customization sync.`, LOG_LEVEL_VERBOSE);
if (!(await this.isTargetFile(path))) {
this._log(
`Storage file tracking: Hidden file skipped: ${path} is filtered out by the defined patterns.`,
LOG_LEVEL_VERBOSE
);
return false;
}
try {
@@ -699,7 +700,7 @@ Offline Changed files: ${processFiles.length}`;
revFrom._revs_info
?.filter((e) => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo)
.first()?.rev ?? "";
const result = await this.plugin.localDatabase.mergeObject(
const result = await this.plugin.managers.conflictManager.mergeObject(
doc.path,
commonBase,
doc._rev,
@@ -721,6 +722,13 @@ Offline Changed files: ${processFiles.length}`;
} else {
this._log(`Object merge is not applicable.`, LOG_LEVEL_VERBOSE);
}
// const pat = this.settings.syncInternalFileOverwritePatterns;
const regExp = getFileRegExp(this.settings, "syncInternalFileOverwritePatterns");
if (regExp.some((r) => r.test(stripAllPrefixes(path)))) {
this._log(`Overwrite rule applied for conflicted hidden file: ${path}`, LOG_LEVEL_INFO);
await this.resolveByNewerEntry(id, path, doc, revA, revB);
return [];
}
return [{ path, revA, revB, id, doc }];
}
// When not JSON file, resolve conflicts by choosing a newer one.
@@ -849,6 +857,108 @@ Offline Changed files: ${processFiles.length}`;
// --> Database Event Functions
cacheFileRegExps = new Map<string, CustomRegExp[][]>();
/**
* Parses the regular expression settings for hidden file synchronization.
* @returns An object containing the ignore and target filters.
*/
parseRegExpSettings() {
const regExpKey = `${this.plugin.settings.syncInternalFilesTargetPatterns}||${this.plugin.settings.syncInternalFilesIgnorePatterns}`;
let ignoreFilter: CustomRegExp[];
let targetFilter: CustomRegExp[];
if (this.cacheFileRegExps.has(regExpKey)) {
const cached = this.cacheFileRegExps.get(regExpKey)!;
ignoreFilter = cached[1];
targetFilter = cached[0];
} else {
ignoreFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
targetFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
this.cacheFileRegExps.clear();
this.cacheFileRegExps.set(regExpKey, [targetFilter, ignoreFilter]);
}
return { ignoreFilter, targetFilter };
}
/**
* Checks if the target file path matches the defined patterns.
*/
isTargetFileInPatterns(path: string): boolean {
const { ignoreFilter, targetFilter } = this.parseRegExpSettings();
if (ignoreFilter && ignoreFilter.length > 0) {
for (const pattern of ignoreFilter) {
if (pattern.test(path)) {
return false;
}
}
}
if (targetFilter && targetFilter.length > 0) {
for (const pattern of targetFilter) {
if (pattern.test(path)) {
return true;
}
}
// While having target patterns, it effects as an allow-list.
return false;
}
return true;
}
cacheCustomisationSyncIgnoredFiles = new Map<string, string[]>();
/**
* Gets the list of files ignored for customization synchronization.
* @returns An array of ignored file paths (lowercase).
*/
getCustomisationSynchronizationIgnoredFiles(): string[] {
const configDir = this.plugin.app.vault.configDir;
const key =
JSON.stringify(this.settings.pluginSyncExtendedSetting) + `||${this.settings.usePluginSync}||${configDir}`;
if (this.cacheCustomisationSyncIgnoredFiles.has(key)) {
return this.cacheCustomisationSyncIgnoredFiles.get(key)!;
}
this.cacheCustomisationSyncIgnoredFiles.clear();
const synchronisedInConfigSync = !this.settings.usePluginSync
? []
: Object.values(this.settings.pluginSyncExtendedSetting)
.filter((e) => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED)
.map((e) => e.files)
.flat()
.map((e) => `${configDir}/${e}`.toLowerCase());
this.cacheCustomisationSyncIgnoredFiles.set(key, synchronisedInConfigSync);
return synchronisedInConfigSync;
}
/**
* Checks if the given path is not ignored by customization synchronization.
* @param path The file path to check.
* @returns True if the path is not ignored; otherwise, false.
*/
isNotIgnoredByCustomisationSync(path: string): boolean {
const ignoredFiles = this.getCustomisationSynchronizationIgnoredFiles();
const result = !ignoredFiles.some((e) => path.startsWith(e));
// console.warn(`Assertion: isNotIgnoredByCustomisationSync(${path}) = ${result}`);
return result;
}
isHiddenFileSyncHandlingPath(path: FilePath): boolean {
const result = path.startsWith(".") && !path.startsWith(".trash");
// console.warn(`Assertion: isHiddenFileSyncHandlingPath(${path}) = ${result}`);
return result;
}
async isTargetFile(path: FilePath): Promise<boolean> {
const result =
this.isTargetFileInPatterns(path) &&
this.isNotIgnoredByCustomisationSync(path) &&
this.isHiddenFileSyncHandlingPath(path);
// console.warn(`Assertion: isTargetFile(${path}) : ${result ? "✔️" : "❌"}`);
if (!result) {
return false;
}
const resultByFile = await this.services.vault.isIgnoredByIgnoreFile(path);
// console.warn(`${path} -> isIgnoredByIgnoreFile: ${resultByFile ? "❌" : "✔️"}`);
return !resultByFile;
}
async trackScannedDatabaseChange(
processFiles: MetaEntry[],
showNotice: boolean = false,
@@ -862,14 +972,21 @@ Offline Changed files: ${processFiles.length}`;
const processes = processFiles.map(async (file) => {
try {
const path = stripAllPrefixes(this.getPath(file));
await this.trackDatabaseFileModification(
path,
"[Hidden file scan]",
!forceWriteAll,
onlyNew,
file,
includeDeletion
);
if (!(await this.isTargetFile(path))) {
this._log(
`Database file tracking: Hidden file skipped: ${path} is filtered out by the defined patterns.`,
LOG_LEVEL_VERBOSE
);
} else {
await this.trackDatabaseFileModification(
path,
"[Hidden file scan]",
!forceWriteAll,
onlyNew,
file,
includeDeletion
);
}
notifyProgress();
} catch (ex) {
this._log(`Failed to process storage change file:${file}`, logLevel);
@@ -1091,14 +1208,14 @@ Offline Changed files: ${files.length}`;
// If something changes left, notify for reloading Obsidian.
if (updatedFolders.indexOf(this.plugin.app.vault.configDir) >= 0) {
if (!this.plugin.$$isReloadingScheduled()) {
if (!this.services.appLifecycle.isReloadingScheduled()) {
this.plugin.confirm.askInPopup(
`updated-any-hidden`,
`Some setting files have been modified\nPress {HERE} to schedule a reload of Obsidian, or press elsewhere to dismiss this message.`,
(anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
this.plugin.$$scheduleAppReload();
this.services.appLifecycle.scheduleRestart();
});
}
);
@@ -1202,7 +1319,13 @@ Offline Changed files: ${files.length}`;
).rows
.filter((e) => isInternalMetadata(e.id as DocumentID))
.map((e) => e.doc) as MetaEntry[];
return allFiles;
const files = [] as MetaEntry[];
for (const file of allFiles) {
if (await this.isTargetFile(stripAllPrefixes(this.getPath(file)))) {
files.push(file);
}
}
return files;
}
async rebuildFromDatabase(showNotice: boolean, targetFiles: FilePath[] | false = false, onlyNew = false) {
@@ -1318,7 +1441,7 @@ Offline Changed files: ${files.length}`;
async storeInternalFileToDatabase(file: InternalFileInfo | UXFileInfo, forceWrite = false) {
const storeFilePath = stripAllPrefixes(file.path as FilePath);
const storageFilePath = file.path;
if (await this.plugin.$$isIgnoredByIgnoreFiles(storageFilePath)) {
if (await this.services.vault.isIgnoredByIgnoreFile(storageFilePath)) {
return undefined;
}
const prefixedFileName = addPrefix(storeFilePath, ICHeader);
@@ -1372,7 +1495,7 @@ Offline Changed files: ${files.length}`;
const displayFileName = filenameSrc;
const prefixedFileName = addPrefix(storeFilePath, ICHeader);
const mtime = new Date().getTime();
if (await this.plugin.$$isIgnoredByIgnoreFiles(storageFilePath)) {
if (await this.services.vault.isIgnoredByIgnoreFile(storageFilePath)) {
return undefined;
}
return await serialized("file-" + prefixedFileName, async () => {
@@ -1432,7 +1555,7 @@ Offline Changed files: ${files.length}`;
includeDeletion = true
) {
const prefixedFileName = addPrefix(storageFilePath, ICHeader);
if (await this.plugin.$$isIgnoredByIgnoreFiles(storageFilePath)) {
if (await this.services.vault.isIgnoredByIgnoreFile(storageFilePath)) {
return undefined;
}
return await serialized("file-" + prefixedFileName, async () => {
@@ -1479,7 +1602,7 @@ Offline Changed files: ${files.length}`;
}
const deleted = metaOnDB.deleted || metaOnDB._deleted || false;
if (deleted) {
const result = await this._deleteFile(storageFilePath);
const result = await this.__deleteFile(storageFilePath);
if (result == "OK") {
this.updateLastProcessedDeletion(storageFilePath, metaOnDB);
return true;
@@ -1493,7 +1616,7 @@ Offline Changed files: ${files.length}`;
if (fileOnDB === false) {
throw new Error(`Failed to read file from database:${storageFilePath}`);
}
const resultStat = await this._writeFile(storageFilePath, fileOnDB, force);
const resultStat = await this.__writeFile(storageFilePath, fileOnDB, force);
if (resultStat) {
this.updateLastProcessed(storageFilePath, metaOnDB, resultStat);
this.queueNotification(storageFilePath);
@@ -1526,7 +1649,7 @@ Offline Changed files: ${files.length}`;
}
}
async _writeFile(storageFilePath: FilePath, fileOnDB: LoadedEntry, force: boolean): Promise<false | UXStat> {
async __writeFile(storageFilePath: FilePath, fileOnDB: LoadedEntry, force: boolean): Promise<false | UXStat> {
try {
const statBefore = await this.plugin.storageAccess.statHidden(storageFilePath);
const isExist = statBefore != null;
@@ -1565,7 +1688,7 @@ Offline Changed files: ${files.length}`;
}
}
async _deleteFile(storageFilePath: FilePath): Promise<false | "OK" | "ALREADY"> {
async __deleteFile(storageFilePath: FilePath): Promise<false | "OK" | "ALREADY"> {
const result = await this.__removeFile(storageFilePath);
if (result === false) {
this._log(`STORAGE <x- DB: ${storageFilePath}: deleting (hidden) Failed`);
@@ -1582,11 +1705,11 @@ Offline Changed files: ${files.length}`;
// <-- Database To Storage Functions
async $allAskUsingOptionalSyncFeature(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
await this._askHiddenFileConfiguration(opt);
private async _allAskUsingOptionalSyncFeature(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
await this.__askHiddenFileConfiguration(opt);
return true;
}
async _askHiddenFileConfiguration(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
private async __askHiddenFileConfiguration(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
const messageFetch = `${opt.enableFetch ? `> - Fetch: Use the files stored by other devices. Choose this option if you have already configured hidden file synchronization on those devices and wish to accept their files.\n` : ""}`;
const messageOverwrite = `${opt.enableOverwrite ? `> - Overwrite: Use the files from this device. Select this option if you want to overwrite the files stored on other devices.\n` : ""}`;
const messageMerge = `> - Merge: Merge the files from this device with those on other devices. Choose this option if you wish to combine files from multiple sources.
@@ -1632,7 +1755,7 @@ ${messageFetch}${messageOverwrite}${messageMerge}
}
}
$allSuspendExtraSync(): Promise<boolean> {
private _allSuspendExtraSync(): Promise<boolean> {
if (this.plugin.settings.syncInternalFiles) {
this._log(
"Hidden file synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.",
@@ -1644,11 +1767,12 @@ ${messageFetch}${messageOverwrite}${messageMerge}
}
// --> Configuration handling
async $anyConfigureOptionalSyncFeature(mode: "FETCH" | "OVERWRITE" | "MERGE" | "DISABLE" | "DISABLE_HIDDEN") {
private async _allConfigureOptionalSyncFeature(mode: keyof OPTIONAL_SYNC_FEATURES) {
await this.configureHiddenFileSync(mode);
return true;
}
async configureHiddenFileSync(mode: "FETCH" | "OVERWRITE" | "MERGE" | "DISABLE" | "DISABLE_HIDDEN") {
async configureHiddenFileSync(mode: keyof OPTIONAL_SYNC_FEATURES) {
if (
mode != "FETCH" &&
mode != "OVERWRITE" &&
@@ -1682,29 +1806,13 @@ ${messageFetch}${messageOverwrite}${messageMerge}
// <-- Configuration handling
// --> Local Storage SubFunctions
ignorePatterns: CustomRegExp[] = [];
targetPatterns: CustomRegExp[] = [];
async scanInternalFileNames() {
const configDir = normalizePath(this.app.vault.configDir);
const ignoreFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
const targetFilter = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
const synchronisedInConfigSync = !this.settings.usePluginSync
? []
: Object.values(this.settings.pluginSyncExtendedSetting)
.filter((e) => e.mode == MODE_SELECTIVE || e.mode == MODE_PAUSED)
.map((e) => e.files)
.flat()
.map((e) => `${configDir}/${e}`.toLowerCase());
const root = this.app.vault.getRoot();
const findRoot = root.path;
const filenames = (await this.getFiles(findRoot, [], targetFilter, ignoreFilter))
.filter((e) => e.startsWith("."))
.filter((e) => !e.startsWith(".trash"));
const files = filenames.filter((path) =>
synchronisedInConfigSync.every((filterFile) => !path.toLowerCase().startsWith(filterFile))
);
return files as FilePath[];
const filenames = await this.getFiles(findRoot, (path) => this.isTargetFile(path));
return filenames as FilePath[];
}
async scanInternalFiles(): Promise<InternalFileInfo[]> {
@@ -1718,7 +1826,7 @@ ${messageFetch}${messageOverwrite}${messageMerge}
const result: InternalFileInfo[] = [];
for (const f of files) {
const w = await f;
if (await this.plugin.$$isIgnoredByIgnoreFiles(w.path)) {
if (await this.services.vault.isIgnoredByIgnoreFile(w.path)) {
continue;
}
const mtime = w.stat?.mtime ?? 0;
@@ -1734,7 +1842,32 @@ ${messageFetch}${messageOverwrite}${messageMerge}
return result;
}
async getFiles(path: string, ignoreList: string[], filter?: CustomRegExp[], ignoreFilter?: CustomRegExp[]) {
async getFiles(path: string, checkFunction: (path: FilePath) => Promise<boolean> | boolean) {
let w: ListedFiles;
try {
w = await this.app.vault.adapter.list(path);
} catch (ex) {
this._log(`Could not traverse(HiddenSync):${path}`, LOG_LEVEL_INFO);
this._log(ex, LOG_LEVEL_VERBOSE);
return [];
}
let files = [] as string[];
for (const file of w.files) {
if (!(await checkFunction(file as FilePath))) {
continue;
}
files.push(file);
}
for (const v of w.folders) {
if (!(await checkFunction(v as FilePath))) {
continue;
}
files = files.concat(await this.getFiles(v, checkFunction));
}
return files;
}
/*
async getFiles_(path: string, ignoreList: string[], filter?: CustomRegExp[], ignoreFilter?: CustomRegExp[]) {
let w: ListedFiles;
try {
w = await this.app.vault.adapter.list(path);
@@ -1756,7 +1889,7 @@ ${messageFetch}${messageOverwrite}${messageMerge}
if (ignoreFilter && ignoreFilter.some((ee) => ee.test(file))) {
continue;
}
if (await this.plugin.$$isIgnoredByIgnoreFiles(file)) continue;
if (await this.services.vault.isIgnoredByIgnoreFile(file)) continue;
files.push(file);
}
L1: for (const v of w.folders) {
@@ -1765,19 +1898,32 @@ ${messageFetch}${messageOverwrite}${messageMerge}
continue L1;
}
}
if (
ignoreFilter &&
ignoreFilter.some((e) => (e.pattern.startsWith("/") || e.pattern.startsWith("\\/")) && e.test(v))
) {
if (ignoreFilter && ignoreFilter.some((e) => e.test(v))) {
continue L1;
}
if (await this.plugin.$$isIgnoredByIgnoreFiles(v)) {
if (await this.services.vault.isIgnoredByIgnoreFile(v)) {
continue L1;
}
files = files.concat(await this.getFiles(v, ignoreList, filter, ignoreFilter));
files = files.concat(await this.getFiles_(v, ignoreList, filter, ignoreFilter));
}
return files;
}
*/
// <-- Local Storage SubFunctions
onBindFunction(core: LiveSyncCore, services: typeof core.services) {
// No longer needed on initialisation
// services.databaseEvents.handleOnDatabaseInitialisation(this._everyOnInitializeDatabase.bind(this));
services.appLifecycle.handleOnSettingLoaded(this._everyOnloadAfterLoadSettings.bind(this));
services.fileProcessing.handleOptionalFileEvent(this._anyProcessOptionalFileEvent.bind(this));
services.conflict.handleGetOptionalConflictCheckMethod(this._anyGetOptionalConflictCheckMethod.bind(this));
services.replication.handleProcessOptionalSynchroniseResult(this._anyProcessOptionalSyncFiles.bind(this));
services.setting.handleOnRealiseSetting(this._everyRealizeSettingSyncMode.bind(this));
services.appLifecycle.handleOnResuming(this._everyOnResumeProcess.bind(this));
services.replication.handleBeforeReplicate(this._everyBeforeReplicate.bind(this));
services.databaseEvents.handleDatabaseInitialised(this._everyOnDatabaseInitialized.bind(this));
services.setting.handleSuspendExtraSync(this._allSuspendExtraSync.bind(this));
services.setting.handleSuggestOptionalFeatures(this._allAskUsingOptionalSyncFeature.bind(this));
services.setting.handleEnableOptionalFeature(this._allConfigureOptionalSyncFeature.bind(this));
}
}

View File

@@ -5,13 +5,14 @@ import {
LOG_LEVEL_NOTICE,
type AnyEntry,
type DocumentID,
type EntryHasPath,
type FilePath,
type FilePathWithPrefix,
type LOG_LEVEL,
} from "../lib/src/common/types.ts";
import type ObsidianLiveSyncPlugin from "../main.ts";
import { MARK_DONE } from "../modules/features/ModuleLog.ts";
import type { LiveSyncCore } from "../main.ts";
import { __$checkInstanceBinding } from "../lib/src/dev/checks.ts";
let noticeIndex = 0;
export abstract class LiveSyncCommands {
@@ -25,12 +26,15 @@ export abstract class LiveSyncCommands {
get localDatabase() {
return this.plugin.localDatabase;
}
id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
return this.plugin.$$id2path(id, entry, stripPrefix);
get services() {
return this.plugin.services;
}
// id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
// return this.plugin.$$id2path(id, entry, stripPrefix);
// }
async path2id(filename: FilePathWithPrefix | FilePath, prefix?: string): Promise<DocumentID> {
return await this.plugin.$$path2id(filename, prefix);
return await this.services.path.path2id(filename, prefix);
}
getPath(entry: AnyEntry): FilePathWithPrefix {
return getPath(entry);
@@ -38,18 +42,20 @@ export abstract class LiveSyncCommands {
constructor(plugin: ObsidianLiveSyncPlugin) {
this.plugin = plugin;
this.onBindFunction(plugin, plugin.services);
__$checkInstanceBinding(this);
}
abstract onunload(): void;
abstract onload(): void | Promise<void>;
_isMainReady() {
return this.plugin.$$isReady();
return this.plugin.services.appLifecycle.isReady();
}
_isMainSuspended() {
return this.plugin.$$isSuspended();
return this.services.appLifecycle.isSuspended();
}
_isDatabaseReady() {
return this.plugin.$$isDatabaseReady();
return this.services.database.isDatabaseReady();
}
_log = (msg: any, level: LOG_LEVEL = LOG_LEVEL_INFO, key?: string) => {
@@ -89,4 +95,8 @@ export abstract class LiveSyncCommands {
_debug = (msg: any, key?: string) => {
this._log(msg, LOG_LEVEL_VERBOSE, key);
};
onBindFunction(core: LiveSyncCore, services: typeof core.services) {
// Override if needed.
}
}

View File

@@ -1,13 +1,27 @@
import { sizeToHumanReadable } from "octagonal-wheels/number";
import { LOG_LEVEL_NOTICE, type MetaEntry } from "../../lib/src/common/types";
import {
EntryTypes,
LOG_LEVEL_INFO,
LOG_LEVEL_NOTICE,
LOG_LEVEL_VERBOSE,
type DocumentID,
type EntryDoc,
type EntryLeaf,
type MetaEntry,
} from "../../lib/src/common/types";
import { getNoFromRev } from "../../lib/src/pouchdb/LiveSyncLocalDB";
import type { IObsidianModule } from "../../modules/AbstractObsidianModule";
import { LiveSyncCommands } from "../LiveSyncCommands";
import { serialized } from "octagonal-wheels/concurrency/lock_v2";
import { arrayToChunkedArray } from "octagonal-wheels/collection";
const DB_KEY_SEQ = "gc-seq";
const DB_KEY_CHUNK_SET = "chunk-set";
const DB_KEY_DOC_USAGE_MAP = "doc-usage-map";
type ChunkID = DocumentID;
type NoteDocumentID = DocumentID;
type Rev = string;
export class LocalDatabaseMaintenance extends LiveSyncCommands implements IObsidianModule {
$everyOnload(): Promise<boolean> {
return Promise.resolve(true);
}
type ChunkUsageMap = Map<NoteDocumentID, Map<Rev, Set<ChunkID>>>;
export class LocalDatabaseMaintenance extends LiveSyncCommands {
onunload(): void {
// NO OP.
}
@@ -28,7 +42,7 @@ export class LocalDatabaseMaintenance extends LiveSyncCommands implements IObsid
return this.localDatabase.localDatabase;
}
clearHash() {
this.localDatabase.hashCaches.clear();
this.localDatabase.clearCaches();
}
async confirm(title: string, message: string, affirmative = "Yes", negative = "No") {
@@ -262,4 +276,213 @@ Note: **Make sure to synchronise all devices before deletion.**
this.clearHash();
}
}
async scanUnusedChunks() {
const kvDB = this.plugin.kvDB;
const chunkSet = (await kvDB.get<Set<DocumentID>>(DB_KEY_CHUNK_SET)) || new Set();
const chunkUsageMap = (await kvDB.get<ChunkUsageMap>(DB_KEY_DOC_USAGE_MAP)) || new Map();
const KEEP_MAX_REVS = 10;
const unusedSet = new Set<DocumentID>([...chunkSet]);
for (const [, revIdMap] of chunkUsageMap) {
const sortedRevId = [...revIdMap.entries()].sort((a, b) => getNoFromRev(b[0]) - getNoFromRev(a[0]));
if (sortedRevId.length > KEEP_MAX_REVS) {
// If we have more revisions than we want to keep, we need to delete the extras
}
const keepRevID = sortedRevId.slice(0, KEEP_MAX_REVS);
keepRevID.forEach((e) => e[1].forEach((ee) => unusedSet.delete(ee)));
}
return {
chunkSet,
chunkUsageMap,
unusedSet,
};
}
/**
* Track changes in the database and update the chunk usage map for garbage collection.
* Note that this can only be performed when "Fetch chunks on demand" is disabled (see the usage sketch after this class).
*/
async trackChanges(fromStart: boolean = false, showNotice: boolean = false) {
if (!this.isAvailable()) return;
const logLevel = showNotice ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
const kvDB = this.plugin.kvDB;
const previousSeq = fromStart ? "" : await kvDB.get<string>(DB_KEY_SEQ);
const chunkSet = (await kvDB.get<Set<DocumentID>>(DB_KEY_CHUNK_SET)) || new Set();
const chunkUsageMap = (await kvDB.get<ChunkUsageMap>(DB_KEY_DOC_USAGE_MAP)) || new Map();
const db = this.localDatabase.localDatabase;
const verbose = (msg: string) => this._verbose(msg);
const processDoc = async (doc: EntryDoc, isDeleted: boolean) => {
if (!("children" in doc)) {
return;
}
const id = doc._id;
const rev = doc._rev!;
const deleted = doc._deleted || isDeleted;
const softDeleted = doc.deleted;
const children = (doc.children || []) as DocumentID[];
if (!chunkUsageMap.has(id)) {
chunkUsageMap.set(id, new Map<Rev, Set<ChunkID>>());
}
for (const chunkId of children) {
if (deleted) {
chunkUsageMap.get(id)!.delete(rev);
// chunkSet.add(chunkId as DocumentID);
} else {
if (softDeleted) {
//TODO: Soft delete
chunkUsageMap.get(id)!.set(rev, (chunkUsageMap.get(id)!.get(rev) || new Set()).add(chunkId));
} else {
chunkUsageMap.get(id)!.set(rev, (chunkUsageMap.get(id)!.get(rev) || new Set()).add(chunkId));
}
}
}
verbose(
`Tracking chunk: ${id}/${rev} (${doc?.path}), deleted: ${deleted ? "yes" : "no"} Soft-Deleted:${softDeleted ? "yes" : "no"}`
);
return await Promise.resolve();
};
// let saveQueue = 0;
const saveState = async (seq: string | number) => {
await kvDB.set(DB_KEY_SEQ, seq);
await kvDB.set(DB_KEY_CHUNK_SET, chunkSet);
await kvDB.set(DB_KEY_DOC_USAGE_MAP, chunkUsageMap);
};
const processDocRevisions = async (doc: EntryDoc) => {
try {
const oldRevisions = await db.get(doc._id, { revs: true, revs_info: true, conflicts: true });
const allRevs = oldRevisions._revs_info?.length || 0;
const info = (oldRevisions._revs_info || [])
.filter((e) => e.status == "available" && e.rev != doc._rev)
.filter((info) => !chunkUsageMap.get(doc._id)?.has(info.rev));
const infoLength = info.length;
this._log(`Found ${allRevs} old revisions for ${doc._id}. ${infoLength} items to check.`);
if (info.length > 0) {
const oldDocs = await Promise.all(
info
.filter((revInfo) => revInfo.status == "available")
.map((revInfo) => db.get(doc._id, { rev: revInfo.rev }))
).then((docs) => docs.filter((doc) => doc));
for (const oldDoc of oldDocs) {
await processDoc(oldDoc as EntryDoc, false);
}
}
} catch (ex) {
if ((ex as any)?.status == 404) {
this._log(`No revisions found for ${doc._id}`, LOG_LEVEL_VERBOSE);
} else {
this._log(`Error finding revisions for ${doc._id}`);
this._verbose(ex);
}
}
};
const processChange = async (doc: EntryDoc, isDeleted: boolean, seq: string | number) => {
if (doc.type === EntryTypes.CHUNK) {
if (isDeleted) return;
chunkSet.add(doc._id);
} else if ("children" in doc) {
await processDoc(doc, isDeleted);
await serialized("x-process-doc", async () => await processDocRevisions(doc));
}
};
// Track changes
let i = 0;
await db
.changes({
since: previousSeq || "",
live: false,
conflicts: true,
include_docs: true,
style: "all_docs",
return_docs: false,
})
.on("change", async (change) => {
// handle change
await processChange(change.doc!, change.deleted ?? false, change.seq);
if (i++ % 100 == 0) {
await saveState(change.seq);
}
})
.on("complete", async (info) => {
await saveState(info.last_seq);
});
// Track all changed docs and new-leafs;
const result = await this.scanUnusedChunks();
const message = `Total chunks: ${result.chunkSet.size}\nUnused chunks: ${result.unusedSet.size}`;
this._log(message, logLevel);
}
async performGC(showingNotice = false) {
if (!this.isAvailable()) return;
await this.trackChanges(false, showingNotice);
const title = "Are all devices synchronised?";
const confirmMessage = `This function deletes unused chunks from the device. If there are differences between devices, some chunks may be missing when resolving conflicts.
Be sure to synchronise before executing.
If any chunks do end up missing after deletion, you may be able to recover them by performing Hatch -> Recreate missing chunks for all files.
Are you ready to delete unused chunks?`;
const logLevel = showingNotice ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
const BUTTON_OK = `Yes, delete chunks`;
const BUTTON_CANCEL = "Cancel";
const result = await this.plugin.confirm.askSelectStringDialogue(
confirmMessage,
[BUTTON_OK, BUTTON_CANCEL] as const,
{
title,
defaultAction: BUTTON_CANCEL,
}
);
if (result !== BUTTON_OK) {
this._log("User cancelled chunk deletion", logLevel);
return;
}
const { unusedSet, chunkSet } = await this.scanUnusedChunks();
const deleteChunks = await this.database.allDocs({
keys: [...unusedSet],
include_docs: true,
});
for (const chunk of deleteChunks.rows) {
if ((chunk as any)?.value?.deleted) {
chunkSet.delete(chunk.key as DocumentID);
}
}
const deleteDocs = deleteChunks.rows
.filter((e) => "doc" in e)
.map((e) => ({
...(e as any).doc!,
_deleted: true,
}));
this._log(`Deleting chunks: ${deleteDocs.length}`, logLevel);
const deleteChunkBatch = arrayToChunkedArray(deleteDocs, 100);
let successCount = 0;
let errored = 0;
for (const batch of deleteChunkBatch) {
const results = await this.database.bulkDocs(batch as EntryLeaf[]);
for (const result of results) {
if ("ok" in result) {
chunkSet.delete(result.id as DocumentID);
successCount++;
} else {
this._log(`Failed to delete doc: ${result.id}`, LOG_LEVEL_VERBOSE);
errored++;
}
}
this._log(`Deleting chunks: ${successCount}`, logLevel, "gc-performing");
}
const message = `Garbage Collection completed.
Success: ${successCount}, Errored: ${errored}`;
this._log(message, logLevel);
const kvDB = this.plugin.kvDB;
await kvDB.set(DB_KEY_CHUNK_SET, chunkSet);
}
}
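For orientation, here is a minimal usage sketch of the garbage-collection flow shown above. It is not part of the diff: `runChunkGarbageCollection` and the trimmed `ChunkMaintenance` interface are hypothetical illustrations, and only the method names (`trackChanges`, `scanUnusedChunks`, `performGC`) and the KEEP_MAX_REVS behaviour are taken from the `LocalDatabaseMaintenance` hunks above. In the plugin itself these methods would be called on the command instance directly; this wrapper only shows the intended order of operations.

// Usage sketch only (not from the diff). The interface below is a hypothetical,
// trimmed-down view of LocalDatabaseMaintenance.
interface ChunkMaintenance {
    trackChanges(fromStart?: boolean, showNotice?: boolean): Promise<void>;
    scanUnusedChunks(): Promise<{ chunkSet: Set<string>; unusedSet: Set<string> }>;
    performGC(showingNotice?: boolean): Promise<void>;
}

async function runChunkGarbageCollection(maintenance: ChunkMaintenance): Promise<void> {
    // 1. Bring the chunk-usage map up to date from the local changes feed.
    await maintenance.trackChanges(false, true);
    // 2. Report what would be removed (all known chunks minus those referenced by
    //    the newest KEEP_MAX_REVS revisions of each document).
    const { chunkSet, unusedSet } = await maintenance.scanUnusedChunks();
    console.log(`Tracked chunks: ${chunkSet.size}, unused: ${unusedSet.size}`);
    // 3. Ask the user for confirmation and delete the unused chunks in batches.
    await maintenance.performGC(true);
}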

View File

@@ -1,21 +1,27 @@
import type { IObsidianModule } from "../../modules/AbstractObsidianModule";
import { P2PReplicatorPaneView, VIEW_TYPE_P2P } from "./P2PReplicator/P2PReplicatorPaneView.ts";
import {
AutoAccepting,
LOG_LEVEL_NOTICE,
P2P_DEFAULT_SETTINGS,
REMOTE_P2P,
type EntryDoc,
type P2PSyncSetting,
type RemoteDBSettings,
} from "../../lib/src/common/types.ts";
import { LiveSyncCommands } from "../LiveSyncCommands.ts";
import { LiveSyncTrysteroReplicator } from "../../lib/src/replication/trystero/LiveSyncTrysteroReplicator.ts";
import {
LiveSyncTrysteroReplicator,
setReplicatorFunc,
} from "../../lib/src/replication/trystero/LiveSyncTrysteroReplicator.ts";
import { EVENT_REQUEST_OPEN_P2P, eventHub } from "../../common/events.ts";
import type { LiveSyncAbstractReplicator } from "../../lib/src/replication/LiveSyncAbstractReplicator.ts";
import { Logger } from "octagonal-wheels/common/logger";
import { LOG_LEVEL_INFO, LOG_LEVEL_VERBOSE, Logger } from "octagonal-wheels/common/logger";
import type { CommandShim } from "../../lib/src/replication/trystero/P2PReplicatorPaneCommon.ts";
import {
P2PReplicatorMixIn,
addP2PEventHandlers,
closeP2PReplicator,
openP2PReplicator,
P2PLogCollector,
removeP2PReplicatorInstance,
type P2PReplicatorBase,
} from "../../lib/src/replication/trystero/P2PReplicatorCore.ts";
@@ -24,8 +30,11 @@ import type { Confirm } from "../../lib/src/interfaces/Confirm.ts";
import type ObsidianLiveSyncPlugin from "../../main.ts";
import type { SimpleStore } from "octagonal-wheels/databases/SimpleStoreBase";
import { getPlatformName } from "../../lib/src/PlatformAPIs/obsidian/Environment.ts";
import type { LiveSyncCore } from "../../main.ts";
import { TrysteroReplicator } from "../../lib/src/replication/trystero/TrysteroReplicator.ts";
import { SETTING_KEY_P2P_DEVICE_NAME } from "../../lib/src/common/types.ts";
class P2PReplicatorCommandBase extends LiveSyncCommands implements P2PReplicatorBase {
export class P2PReplicator extends LiveSyncCommands implements P2PReplicatorBase, CommandShim {
storeP2PStatusLine = reactiveSource("");
getSettings(): P2PSyncSetting {
@@ -49,47 +58,124 @@ class P2PReplicatorCommandBase extends LiveSyncCommands implements P2PReplicator
constructor(plugin: ObsidianLiveSyncPlugin) {
super(plugin);
setReplicatorFunc(() => this._replicatorInstance);
addP2PEventHandlers(this);
this.afterConstructor();
// onBindFunction is called in the super class
// this.onBindFunction(plugin, plugin.services);
}
async handleReplicatedDocuments(docs: EntryDoc[]): Promise<void> {
// console.log("Processing Replicated Docs", docs);
return await this.plugin.$$parseReplicationResult(docs as PouchDB.Core.ExistingDocument<EntryDoc>[]);
}
onunload(): void {
throw new Error("Method not implemented.");
}
onload(): void | Promise<void> {
throw new Error("Method not implemented.");
return await this.services.replication.parseSynchroniseResult(
docs as PouchDB.Core.ExistingDocument<EntryDoc>[]
);
}
init() {
this._simpleStore = this.plugin.$$getSimpleStore("p2p-sync");
return Promise.resolve(this);
}
}
export class P2PReplicator
extends P2PReplicatorMixIn(P2PReplicatorCommandBase)
implements IObsidianModule, CommandShim
{
storeP2PStatusLine = reactiveSource("");
$anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator> {
_anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator> {
const settings = { ...this.settings, ...settingOverride };
if (settings.remoteType == REMOTE_P2P) {
return Promise.resolve(new LiveSyncTrysteroReplicator(this.plugin));
}
return undefined!;
}
override getPlatform(): string {
_replicatorInstance?: TrysteroReplicator;
p2pLogCollector = new P2PLogCollector();
afterConstructor() {
return;
}
async open() {
await openP2PReplicator(this);
}
async close() {
await closeP2PReplicator(this);
}
getConfig(key: string) {
return this.services.config.getSmallConfig(key);
}
setConfig(key: string, value: string) {
return this.services.config.setSmallConfig(key, value);
}
enableBroadcastCastings() {
return this?._replicatorInstance?.enableBroadcastChanges();
}
disableBroadcastCastings() {
return this?._replicatorInstance?.disableBroadcastChanges();
}
init() {
this._simpleStore = this.services.database.openSimpleStore("p2p-sync");
return Promise.resolve(this);
}
async initialiseP2PReplicator(): Promise<TrysteroReplicator> {
await this.init();
try {
if (this._replicatorInstance) {
await this._replicatorInstance.close();
this._replicatorInstance = undefined;
}
if (!this.settings.P2P_AppID) {
this.settings.P2P_AppID = P2P_DEFAULT_SETTINGS.P2P_AppID;
}
const getInitialDeviceName = () =>
this.getConfig(SETTING_KEY_P2P_DEVICE_NAME) || this.services.vault.getVaultName();
const getSettings = () => this.settings;
const store = () => this.simpleStore();
const getDB = () => this.getDB();
const getConfirm = () => this.confirm;
const getPlatform = () => this.getPlatform();
const env = {
get db() {
return getDB();
},
get confirm() {
return getConfirm();
},
get deviceName() {
return getInitialDeviceName();
},
get platform() {
return getPlatform();
},
get settings() {
return getSettings();
},
processReplicatedDocs: async (docs: EntryDoc[]): Promise<void> => {
await this.handleReplicatedDocuments(docs);
// Delegates to handleReplicatedDocuments above.
},
get simpleStore() {
return store();
},
};
this._replicatorInstance = new TrysteroReplicator(env);
return this._replicatorInstance;
} catch (e) {
this._log(
e instanceof Error ? e.message : "Something went wrong while initialising the P2P Replicator",
LOG_LEVEL_INFO
);
this._log(e, LOG_LEVEL_VERBOSE);
throw e;
}
}
getPlatform(): string {
return getPlatformName();
}
override onunload(): void {
onunload(): void {
removeP2PReplicatorInstance();
void this.close();
}
override onload(): void | Promise<void> {
onload(): void | Promise<void> {
eventHub.onEvent(EVENT_REQUEST_OPEN_P2P, () => {
void this.openPane();
});
@@ -97,12 +183,12 @@ export class P2PReplicator
this.storeP2PStatusLine.value = line.value;
});
}
async $everyOnInitializeDatabase(): Promise<boolean> {
async _everyOnInitializeDatabase(): Promise<boolean> {
await this.initialiseP2PReplicator();
return Promise.resolve(true);
}
async $allSuspendExtraSync() {
private async _allSuspendExtraSync() {
this.plugin.settings.P2P_Enabled = false;
this.plugin.settings.P2P_AutoAccepting = AutoAccepting.NONE;
this.plugin.settings.P2P_AutoBroadcast = false;
@@ -112,15 +198,15 @@ export class P2PReplicator
return await Promise.resolve(true);
}
async $everyOnLoadStart() {
return await Promise.resolve();
}
// async $everyOnLoadStart() {
// return await Promise.resolve();
// }
async openPane() {
await this.plugin.$$showView(VIEW_TYPE_P2P);
await this.services.API.showWindow(VIEW_TYPE_P2P);
}
async $everyOnloadStart(): Promise<boolean> {
async _everyOnloadStart(): Promise<boolean> {
this.plugin.registerView(VIEW_TYPE_P2P, (leaf) => new P2PReplicatorPaneView(leaf, this.plugin));
this.plugin.addCommand({
id: "open-p2p-replicator",
@@ -170,10 +256,26 @@ export class P2PReplicator
return await Promise.resolve(true);
}
$everyAfterResumeProcess(): Promise<boolean> {
_everyAfterResumeProcess(): Promise<boolean> {
if (this.settings.P2P_Enabled && this.settings.P2P_AutoStart) {
setTimeout(() => void this.open(), 100);
}
const rep = this._replicatorInstance;
rep?.allowReconnection();
return Promise.resolve(true);
}
_everyBeforeSuspendProcess(): Promise<boolean> {
const rep = this._replicatorInstance;
rep?.disconnectFromServer();
return Promise.resolve(true);
}
override onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.replicator.handleGetNewReplicator(this._anyNewReplicator.bind(this));
services.databaseEvents.handleOnDatabaseInitialisation(this._everyOnInitializeDatabase.bind(this));
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.appLifecycle.handleOnSuspending(this._everyBeforeSuspendProcess.bind(this));
services.appLifecycle.handleOnResumed(this._everyAfterResumeProcess.bind(this));
services.setting.handleSuspendExtraSync(this._allSuspendExtraSync.bind(this));
}
}

View File

@@ -19,6 +19,7 @@
} from "../../../lib/src/replication/trystero/TrysteroReplicatorP2PServer";
import { type P2PReplicatorStatus } from "../../../lib/src/replication/trystero/TrysteroReplicator";
import { $msg as _msg } from "../../../lib/src/common/i18n";
import { SETTING_KEY_P2P_DEVICE_NAME } from "../../../lib/src/common/types";
interface Props {
plugin: PluginShim;
@@ -32,10 +33,10 @@
const initialSettings = { ...plugin.settings };
let settings = $state<P2PSyncSetting>(initialSettings);
// const vaultName = plugin.$$getVaultName();
// const vaultName = service.vault.getVaultName();
// const dbKey = `${vaultName}-p2p-device-name`;
const initialDeviceName = cmdSync.getConfig("p2p_device_name") ?? plugin.$$getVaultName();
const initialDeviceName = cmdSync.getConfig(SETTING_KEY_P2P_DEVICE_NAME) ?? plugin.services.vault.getVaultName();
let deviceName = $state<string>(initialDeviceName);
let eP2PEnabled = $state<boolean>(initialSettings.P2P_Enabled);
@@ -84,7 +85,7 @@
P2P_AutoBroadcast: eAutoBroadcast,
};
plugin.settings = newSettings;
cmdSync.setConfig("p2p_device_name", eDeviceName);
cmdSync.setConfig(SETTING_KEY_P2P_DEVICE_NAME, eDeviceName);
deviceName = eDeviceName;
await plugin.saveSettings();
}
@@ -250,6 +251,9 @@
};
cmdSync.setConfig(initialDialogStatusKey, JSON.stringify(dialogStatus));
});
let isObsidian = $derived.by(() => {
return plugin.services.API.getPlatform() === "obsidian";
});
</script>
<article>
@@ -265,95 +269,105 @@
{/each}
</details>
<h2>Connection Settings</h2>
<details bind:open={isSettingOpened}>
<summary>{eRelay}</summary>
<table class="settings">
<tbody>
<tr>
<th> Enable P2P Replicator </th>
<td>
<label class={{ "is-dirty": isP2PEnabledModified }}>
<input type="checkbox" bind:checked={eP2PEnabled} />
</label>
</td>
</tr><tr>
<th> Relay settings </th>
<td>
<label class={{ "is-dirty": isRelayModified }}>
<input
type="text"
placeholder="wss://exp-relay.vrtmrz.net, wss://xxxxx"
bind:value={eRelay}
autocomplete="off"
/>
<button onclick={() => useDefaultRelay()}> Use vrtmrz's relay </button>
</label>
</td>
</tr>
<tr>
<th> Room ID </th>
<td>
<label class={{ "is-dirty": isRoomIdModified }}>
<input
type="text"
placeholder="anything-you-like"
bind:value={eRoomId}
autocomplete="off"
spellcheck="false"
autocorrect="off"
/>
<button onclick={() => chooseRandom()}> Use Random Number </button>
</label>
<span>
<small>
This can isolate your connections between devices. Use the same Room ID for the same
devices.</small
>
</span>
</td>
</tr>
<tr>
<th> Password </th>
<td>
<label class={{ "is-dirty": isPasswordModified }}>
<input type="password" placeholder="password" bind:value={ePassword} />
</label>
<span>
<small> This password is used to encrypt the connection. Use something long enough. </small>
</span>
</td>
</tr>
<tr>
<th> This device name </th>
<td>
<label class={{ "is-dirty": isDeviceNameModified }}>
<input type="text" placeholder="iphone-16" bind:value={eDeviceName} autocomplete="off" />
</label>
<span>
<small>
Device name to identify the device. Please use a short name for stable peer
detection, e.g., "iphone-16" or "macbook-2021".
</small>
</span>
</td>
</tr>
<tr>
<th> Auto Connect </th>
<td>
<label class={{ "is-dirty": isAutoStartModified }}>
<input type="checkbox" bind:checked={eAutoStart} />
</label>
</td>
</tr>
<tr>
<th> Start change-broadcasting on Connect </th>
<td>
<label class={{ "is-dirty": isAutoBroadcastModified }}>
<input type="checkbox" bind:checked={eAutoBroadcast} />
</label>
</td>
</tr>
<!-- <tr>
{#if isObsidian}
You can configure this in the Obsidian Plugin Settings.
{:else}
<details bind:open={isSettingOpened}>
<summary>{eRelay}</summary>
<table class="settings">
<tbody>
<tr>
<th> Enable P2P Replicator </th>
<td>
<label class={{ "is-dirty": isP2PEnabledModified }}>
<input type="checkbox" bind:checked={eP2PEnabled} />
</label>
</td>
</tr><tr>
<th> Relay settings </th>
<td>
<label class={{ "is-dirty": isRelayModified }}>
<input
type="text"
placeholder="wss://exp-relay.vrtmrz.net, wss://xxxxx"
bind:value={eRelay}
autocomplete="off"
/>
<button onclick={() => useDefaultRelay()}> Use vrtmrz's relay </button>
</label>
</td>
</tr>
<tr>
<th> Room ID </th>
<td>
<label class={{ "is-dirty": isRoomIdModified }}>
<input
type="text"
placeholder="anything-you-like"
bind:value={eRoomId}
autocomplete="off"
spellcheck="false"
autocorrect="off"
/>
<button onclick={() => chooseRandom()}> Use Random Number </button>
</label>
<span>
<small>
This can isolate your connections between devices. Use the same Room ID for the same
devices.</small
>
</span>
</td>
</tr>
<tr>
<th> Password </th>
<td>
<label class={{ "is-dirty": isPasswordModified }}>
<input type="password" placeholder="password" bind:value={ePassword} />
</label>
<span>
<small>
This password is used to encrypt the connection. Use something long enough.
</small>
</span>
</td>
</tr>
<tr>
<th> This device name </th>
<td>
<label class={{ "is-dirty": isDeviceNameModified }}>
<input
type="text"
placeholder="iphone-16"
bind:value={eDeviceName}
autocomplete="off"
/>
</label>
<span>
<small>
Device name to identify the device. Please use a short name for stable peer
detection, e.g., "iphone-16" or "macbook-2021".
</small>
</span>
</td>
</tr>
<tr>
<th> Auto Connect </th>
<td>
<label class={{ "is-dirty": isAutoStartModified }}>
<input type="checkbox" bind:checked={eAutoStart} />
</label>
</td>
</tr>
<tr>
<th> Start change-broadcasting on Connect </th>
<td>
<label class={{ "is-dirty": isAutoBroadcastModified }}>
<input type="checkbox" bind:checked={eAutoBroadcast} />
</label>
</td>
</tr>
<!-- <tr>
<th> Auto Accepting </th>
<td>
<label class={{ "is-dirty": isAutoAcceptModified }}>
@@ -361,11 +375,12 @@
</label>
</td>
</tr> -->
</tbody>
</table>
<button disabled={!isAnyModified} class="button mod-cta" onclick={saveAndApply}>Save and Apply</button>
<button disabled={!isAnyModified} class="button" onclick={revert}>Revert changes</button>
</details>
</tbody>
</table>
<button disabled={!isAnyModified} class="button mod-cta" onclick={saveAndApply}>Save and Apply</button>
<button disabled={!isAnyModified} class="button" onclick={revert}>Revert changes</button>
</details>
{/if}
<div>
<h2>Signaling Server Connection</h2>

View File

@@ -95,7 +95,7 @@ And you can also drop the local database to rebuild from the remote device.`,
if (yn === DROP) {
await this.plugin.rebuilder.scheduleFetch();
} else {
await this.plugin.$$scheduleAppReload();
this.plugin.services.appLifecycle.scheduleRestart();
}
} else {
Logger(`Cancelled\nRemote config for ${peer.name} is not applied`, LOG_LEVEL_NOTICE);

Submodule src/lib updated: 6a8d1738bb...86b0a95d56

View File

@@ -1,41 +1,24 @@
import { Plugin } from "./deps";
import {
type EntryDoc,
type LoadedEntry,
type ObsidianLiveSyncSettings,
type LOG_LEVEL,
type diff_result,
type DatabaseConnectingStatus,
type EntryHasPath,
type DocumentID,
type FilePathWithPrefix,
type FilePath,
LOG_LEVEL_INFO,
type HasSettings,
type MetaEntry,
type UXFileInfoStub,
type MISSING_OR_ERROR,
type AUTO_MERGED,
type RemoteDBSettings,
type TweakValues,
type CouchDBCredentials,
} from "./lib/src/common/types.ts";
import { type FileEventItem } from "./common/types.ts";
import { type SimpleStore } from "./lib/src/common/utils.ts";
import { LiveSyncLocalDB, type LiveSyncLocalDBEnv } from "./lib/src/pouchdb/LiveSyncLocalDB.ts";
import {
LiveSyncAbstractReplicator,
type LiveSyncReplicatorEnv,
} from "./lib/src/replication/LiveSyncAbstractReplicator.js";
import { type KeyValueDatabase } from "./common/KeyValueDB.ts";
import { type KeyValueDatabase } from "./lib/src/interfaces/KeyValueDatabase.ts";
import { LiveSyncCommands } from "./features/LiveSyncCommands.ts";
import { HiddenFileSync } from "./features/HiddenFileSync/CmdHiddenFileSync.ts";
import { ConfigSync } from "./features/ConfigSync/CmdConfigSync.ts";
import { reactiveSource, type ReactiveValue } from "./lib/src/dataobject/reactive.js";
import { reactiveSource, type ReactiveValue } from "octagonal-wheels/dataobject/reactive";
import { type LiveSyncJournalReplicatorEnv } from "./lib/src/replication/journal/LiveSyncJournalReplicator.js";
import { type LiveSyncCouchDBReplicatorEnv } from "./lib/src/replication/couchdb/LiveSyncReplicator.js";
import type { CheckPointInfo } from "./lib/src/replication/journal/JournalSyncTypes.js";
import { ObsHttpHandler } from "./modules/essentialObsidian/APILib/ObsHttpHandler.js";
import type { IObsidianModule } from "./modules/AbstractObsidianModule.ts";
import { ModuleDev } from "./modules/extras/ModuleDev.ts";
@@ -51,6 +34,7 @@ import { ModuleObsidianSettings } from "./modules/features/ModuleObsidianSetting
import { ModuleRedFlag } from "./modules/coreFeatures/ModuleRedFlag.ts";
import { ModuleObsidianMenu } from "./modules/essentialObsidian/ModuleObsidianMenu.ts";
import { ModuleSetupObsidian } from "./modules/features/ModuleSetupObsidian.ts";
import { SetupManager } from "./modules/features/SetupManager.ts";
import type { StorageAccess } from "./modules/interfaces/StorageAccess.ts";
import type { Confirm } from "./lib/src/interfaces/Confirm.ts";
import type { Rebuilder } from "./modules/interfaces/DatabaseRebuilder.ts";
@@ -59,8 +43,7 @@ import { ModuleDatabaseFileAccess } from "./modules/core/ModuleDatabaseFileAcces
import { ModuleFileHandler } from "./modules/core/ModuleFileHandler.ts";
import { ModuleObsidianAPI } from "./modules/essentialObsidian/ModuleObsidianAPI.ts";
import { ModuleObsidianEvents } from "./modules/essentialObsidian/ModuleObsidianEvents.ts";
import { injectModules, type AbstractModule } from "./modules/AbstractModule.ts";
import type { ICoreModule } from "./modules/ModuleTypes.ts";
import { type AbstractModule } from "./modules/AbstractModule.ts";
import { ModuleObsidianSettingDialogue } from "./modules/features/ModuleObsidianSettingTab.ts";
import { ModuleObsidianDocumentHistory } from "./modules/features/ModuleObsidianDocumentHistory.ts";
import { ModuleObsidianGlobalHistory } from "./modules/features/ModuleGlobalHistory.ts";
@@ -84,13 +67,17 @@ import { ModuleLiveSyncMain } from "./modules/main/ModuleLiveSyncMain.ts";
import { ModuleExtraSyncObsidian } from "./modules/extraFeaturesObsidian/ModuleExtraSyncObsidian.ts";
import { LocalDatabaseMaintenance } from "./features/LocalDatabaseMainte/CmdLocalDatabaseMainte.ts";
import { P2PReplicator } from "./features/P2PSync/CmdP2PReplicator.ts";
import type { LiveSyncManagers } from "./lib/src/managers/LiveSyncManagers.ts";
import { ObsidianServiceHub } from "./modules/services/ObsidianServices.ts";
import type { InjectableServiceHub } from "./lib/src/services/InjectableServices.ts";
// function throwShouldBeOverridden(): never {
// throw new Error("This function should be overridden by the module.");
// }
// const InterceptiveAll = Promise.resolve(true);
// const InterceptiveEvery = Promise.resolve(true);
// const InterceptiveAny = Promise.resolve(undefined);
function throwShouldBeOverridden(): never {
throw new Error("This function should be overridden by the module.");
}
const InterceptiveAll = Promise.resolve(true);
const InterceptiveEvery = Promise.resolve(true);
const InterceptiveAny = Promise.resolve(undefined);
/**
* All $-prefixed functions are hooked by the modules. Be careful when calling them directly.
* Please refer to the module's source code to understand the function.
@@ -100,6 +87,13 @@ const InterceptiveAny = Promise.resolve(undefined);
* $any : Process all modules until the first success.
* $ : Other interceptive points. You should manually assign the module
* All of above performed on injectModules function.
*
* No longer used! See AppLifecycleService in Services.ts.
* For the time being, the previously used code has just been commented out. (Sorry, some of it has been deleted...)
* 'Convention over configuration' turned out to be a lie for me; at the very least, it severely lacked refactorability.
*
* Some modules are still separated and connected by the `ThroughHole` class.
* However, this is not a good design; I am going to manage the modules in a more explicit way.
* (A short sketch of the new handler-registration pattern follows this hunk.)
*/
export default class ObsidianLiveSyncPlugin
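To make the migration described in the comment above concrete, here is a minimal sketch of the explicit handler-registration pattern, assuming trimmed-down service interfaces. Only the handler names (`handleOnResuming`, `handleSuspendExtraSync`) and the `onBindFunction` hook are taken from the hunks in this compare; everything else is illustrative, not the exact implementation.

// Illustrative sketch only (not part of the diff). The minimal service interfaces below
// are assumptions made so the sketch stands alone.
interface AppLifecycleService {
    handleOnResuming(handler: () => Promise<boolean>): void;
}
interface SettingService {
    handleSuspendExtraSync(handler: () => Promise<boolean>): void;
}
interface ServiceHub {
    appLifecycle: AppLifecycleService;
    setting: SettingService;
}

class ExampleCommand {
    // Old convention: $everyOnResumeProcess() / $allSuspendExtraSync() were discovered by
    // injectModules() through their `$` prefix. New pattern: bind handlers explicitly.
    constructor(services: ServiceHub) {
        this.onBindFunction(services);
    }
    onBindFunction(services: ServiceHub): void {
        services.appLifecycle.handleOnResuming(this._everyOnResumeProcess.bind(this));
        services.setting.handleSuspendExtraSync(this._allSuspendExtraSync.bind(this));
    }
    private _everyOnResumeProcess(): Promise<boolean> {
        return Promise.resolve(true);
    }
    private _allSuspendExtraSync(): Promise<boolean> {
        return Promise.resolve(true);
    }
}

The point of the design, as the comment states, is that each command declares its own bindings instead of relying on `$`-prefixed method names being discovered at injection time.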
@@ -111,6 +105,18 @@ export default class ObsidianLiveSyncPlugin
LiveSyncCouchDBReplicatorEnv,
HasSettings<ObsidianLiveSyncSettings>
{
/**
* The service hub for managing all services.
*/
_services: InjectableServiceHub = new ObsidianServiceHub(this);
get services() {
return this._services;
}
/**
* Bind functions to the service hub (for migration purposes).
*/
// bindFunctions = (this.serviceHub as ObsidianServiceHub).bindFunctions.bind(this.serviceHub);
// --> Module System
getAddOn<T extends LiveSyncCommands>(cls: string) {
for (const addon of this.addOns) {
@@ -171,46 +177,23 @@ export default class ObsidianLiveSyncPlugin
new ModuleDev(this, this),
new ModuleReplicateTest(this, this),
new ModuleIntegratedTest(this, this),
new SetupManager(this, this),
] as (IObsidianModule | AbstractModule)[];
injected = injectModules(this, [...this.modules, ...this.addOns] as ICoreModule[]);
getModule<T extends IObsidianModule>(constructor: new (...args: any[]) => T): T {
for (const module of this.modules) {
if (module.constructor === constructor) return module as T;
}
throw new Error(`Module ${constructor} not found or not loaded.`);
}
// injected = injectModules(this, [...this.modules, ...this.addOns] as ICoreModule[]);
// <-- Module System
$$isSuspended(): boolean {
throwShouldBeOverridden();
}
$$setSuspended(value: boolean): void {
throwShouldBeOverridden();
}
$$isDatabaseReady(): boolean {
throwShouldBeOverridden();
}
$$getDeviceAndVaultName(): string {
throwShouldBeOverridden();
}
$$setDeviceAndVaultName(name: string): void {
throwShouldBeOverridden();
}
$$addLog(message: any, level: LOG_LEVEL = LOG_LEVEL_INFO, key = ""): void {
throwShouldBeOverridden();
}
$$isReady(): boolean {
throwShouldBeOverridden();
}
$$markIsReady(): void {
throwShouldBeOverridden();
}
$$resetIsReady(): void {
throwShouldBeOverridden();
}
// Following are plugged by the modules.
settings!: ObsidianLiveSyncSettings;
localDatabase!: LiveSyncLocalDB;
managers!: LiveSyncManagers;
simpleStore!: SimpleStore<CheckPointInfo>;
replicator!: LiveSyncAbstractReplicator;
confirm!: Confirm;
@@ -227,30 +210,6 @@ export default class ObsidianLiveSyncPlugin
return this.settings;
}
$$markFileListPossiblyChanged(): void {
throwShouldBeOverridden();
}
$$customFetchHandler(): ObsHttpHandler {
throwShouldBeOverridden();
}
$$getLastPostFailedBySize(): boolean {
throwShouldBeOverridden();
}
$$isStorageInsensitive(): boolean {
throwShouldBeOverridden();
}
$$shouldCheckCaseInsensitive(): boolean {
throwShouldBeOverridden();
}
$$isUnloaded(): boolean {
throwShouldBeOverridden();
}
requestCount = reactiveSource(0);
responseCount = reactiveSource(0);
totalQueued = reactiveSource(0);
@@ -275,101 +234,6 @@ export default class ObsidianLiveSyncPlugin
syncStatus: "CLOSED" as DatabaseConnectingStatus,
});
$$isReloadingScheduled(): boolean {
throwShouldBeOverridden();
}
$$getReplicator(): LiveSyncAbstractReplicator {
throwShouldBeOverridden();
}
$$connectRemoteCouchDB(
uri: string,
auth: CouchDBCredentials,
disableRequestURI: boolean,
passphrase: string | false,
useDynamicIterationCount: boolean,
performSetup: boolean,
skipInfo: boolean,
compression: boolean,
customHeaders: Record<string, string>,
useRequestAPI: boolean,
getPBKDF2Salt: () => Promise<Uint8Array>
): Promise<
| string
| {
db: PouchDB.Database<EntryDoc>;
info: PouchDB.Core.DatabaseInfo;
}
> {
throwShouldBeOverridden();
}
$$isMobile(): boolean {
throwShouldBeOverridden();
}
$$vaultName(): string {
throwShouldBeOverridden();
}
// --> Path
$$getActiveFilePath(): FilePathWithPrefix | undefined {
throwShouldBeOverridden();
}
// <-- Path
// --> Path conversion
$$id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
throwShouldBeOverridden();
}
$$path2id(filename: FilePathWithPrefix | FilePath, prefix?: string): Promise<DocumentID> {
throwShouldBeOverridden();
}
// <!-- Path conversion
// --> Database
$$createPouchDBInstance<T extends object>(
name?: string,
options?: PouchDB.Configuration.DatabaseConfiguration
): PouchDB.Database<T> {
throwShouldBeOverridden();
}
$allOnDBUnload(db: LiveSyncLocalDB): void {
return;
}
$allOnDBClose(db: LiveSyncLocalDB): void {
return;
}
// <!-- Database
$anyNewReplicator(settingOverride: Partial<ObsidianLiveSyncSettings> = {}): Promise<LiveSyncAbstractReplicator> {
throwShouldBeOverridden();
}
$everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
return InterceptiveEvery;
}
$everyOnResetDatabase(db: LiveSyncLocalDB): Promise<boolean> {
return InterceptiveEvery;
}
// end interfaces
$$getVaultName(): string {
throwShouldBeOverridden();
}
$$getSimpleStore<T>(kind: string): SimpleStore<T> {
throwShouldBeOverridden();
}
// trench!: Trench;
// --> Events
/*
@@ -410,322 +274,404 @@ export default class ObsidianLiveSyncPlugin
*/
$everyOnLayoutReady(): Promise<boolean> {
return InterceptiveEvery;
}
$everyOnFirstInitialize(): Promise<boolean> {
return InterceptiveEvery;
}
// $everyOnLayoutReady(): Promise<boolean> {
// //TODO: AppLifecycleService.onLayoutReady
// return InterceptiveEvery;
// }
// $everyOnFirstInitialize(): Promise<boolean> {
// //TODO: AppLifecycleService.onFirstInitialize
// return InterceptiveEvery;
// }
// Some Module should call this function to start the plugin.
$$onLiveSyncReady(): Promise<false | undefined> {
throwShouldBeOverridden();
}
$$wireUpEvents(): void {
throwShouldBeOverridden();
}
$$onLiveSyncLoad(): Promise<void> {
throwShouldBeOverridden();
}
// $$onLiveSyncReady(): Promise<false | undefined> {
// //TODO: AppLifecycleService.onLiveSyncReady
// throwShouldBeOverridden();
// }
// $$wireUpEvents(): void {
// //TODO: AppLifecycleService.wireUpEvents
// throwShouldBeOverridden();
// }
// $$onLiveSyncLoad(): Promise<void> {
// //TODO: AppLifecycleService.onLoad
// throwShouldBeOverridden();
// }
$$onLiveSyncUnload(): Promise<void> {
throwShouldBeOverridden();
}
// $$onLiveSyncUnload(): Promise<void> {
// //TODO: AppLifecycleService.onAppUnload
// throwShouldBeOverridden();
// }
$allScanStat(): Promise<boolean> {
return InterceptiveAll;
}
$everyOnloadStart(): Promise<boolean> {
return InterceptiveEvery;
}
// $allScanStat(): Promise<boolean> {
// //TODO: AppLifecycleService.scanStartupIssues
// return InterceptiveAll;
// }
// $everyOnloadStart(): Promise<boolean> {
// //TODO: AppLifecycleService.onInitialise
// return InterceptiveEvery;
// }
$everyOnloadAfterLoadSettings(): Promise<boolean> {
return InterceptiveEvery;
}
// $everyOnloadAfterLoadSettings(): Promise<boolean> {
// //TODO: AppLifecycleService.onApplyStartupLoaded
// return InterceptiveEvery;
// }
$everyOnload(): Promise<boolean> {
return InterceptiveEvery;
}
// $everyOnload(): Promise<boolean> {
// //TODO: AppLifecycleService.onLoaded
// return InterceptiveEvery;
// }
$anyHandlerProcessesFileEvent(item: FileEventItem): Promise<boolean | undefined> {
return InterceptiveAny;
}
// $anyHandlerProcessesFileEvent(item: FileEventItem): Promise<boolean | undefined> {
// //TODO: FileProcessingService.processFileEvent
// return InterceptiveAny;
// }
$allStartOnUnload(): Promise<boolean> {
return InterceptiveAll;
}
$allOnUnload(): Promise<boolean> {
return InterceptiveAll;
}
// $allStartOnUnload(): Promise<boolean> {
// //TODO: AppLifecycleService.onBeforeUnload
// return InterceptiveAll;
// }
// $allOnUnload(): Promise<boolean> {
// //TODO: AppLifecycleService.onUnload
// return InterceptiveAll;
// }
$$openDatabase(): Promise<boolean> {
throwShouldBeOverridden();
}
// $$openDatabase(): Promise<boolean> {
// // DatabaseService.openDatabase
// throwShouldBeOverridden();
// }
$$realizeSettingSyncMode(): Promise<void> {
throwShouldBeOverridden();
}
$$performRestart() {
throwShouldBeOverridden();
}
// $$realizeSettingSyncMode(): Promise<void> {
// // SettingService.realiseSetting
// throwShouldBeOverridden();
// }
// $$performRestart() {
// // AppLifecycleService.performRestart
// throwShouldBeOverridden();
// }
$$clearUsedPassphrase(): void {
throwShouldBeOverridden();
}
// $$clearUsedPassphrase(): void {
// // SettingService.clearUsedPassphrase
// throwShouldBeOverridden();
// }
$$decryptSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
throwShouldBeOverridden();
}
$$adjustSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
throwShouldBeOverridden();
}
// $$decryptSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
// // SettingService.decryptSettings
// throwShouldBeOverridden();
// }
// $$adjustSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
// // SettingService.adjustSettings
// throwShouldBeOverridden();
// }
$$loadSettings(): Promise<void> {
throwShouldBeOverridden();
}
// $$loadSettings(): Promise<void> {
// // SettingService.loadSettings
// throwShouldBeOverridden();
// }
$$saveDeviceAndVaultName(): void {
throwShouldBeOverridden();
}
// $$saveDeviceAndVaultName(): void {
// // SettingService.saveDeviceAndVaultName
// throwShouldBeOverridden();
// }
$$saveSettingData(): Promise<void> {
throwShouldBeOverridden();
}
// $$saveSettingData(): Promise<void> {
// // SettingService.saveSettingData
// throwShouldBeOverridden();
// }
$anyProcessOptionalFileEvent(path: FilePath): Promise<boolean | undefined> {
return InterceptiveAny;
}
// $anyProcessOptionalFileEvent(path: FilePath): Promise<boolean | undefined> {
// // FileProcessingService.processOptionalFileEvent
// return InterceptiveAny;
// }
$everyCommitPendingFileEvent(): Promise<boolean> {
return InterceptiveEvery;
}
// $everyCommitPendingFileEvent(): Promise<boolean> {
// // FileProcessingService.commitPendingFileEvent
// return InterceptiveEvery;
// }
// ->
$anyGetOptionalConflictCheckMethod(path: FilePathWithPrefix): Promise<boolean | undefined | "newer"> {
return InterceptiveAny;
}
// $anyGetOptionalConflictCheckMethod(path: FilePathWithPrefix): Promise<boolean | undefined | "newer"> {
// return InterceptiveAny;
// }
$$queueConflictCheckIfOpen(file: FilePathWithPrefix): Promise<void> {
throwShouldBeOverridden();
}
// $$queueConflictCheckIfOpen(file: FilePathWithPrefix): Promise<void> {
// // ConflictEventManager.queueCheckForConflictIfOpen
// throwShouldBeOverridden();
// }
$$queueConflictCheck(file: FilePathWithPrefix): Promise<void> {
throwShouldBeOverridden();
}
// $$queueConflictCheck(file: FilePathWithPrefix): Promise<void> {
// // ConflictEventManager.queueCheckForConflict
// throwShouldBeOverridden();
// }
$$waitForAllConflictProcessed(): Promise<boolean> {
throwShouldBeOverridden();
}
// $$waitForAllConflictProcessed(): Promise<boolean> {
// // ConflictEventManager.ensureAllConflictProcessed
// throwShouldBeOverridden();
// }
//<-- Conflict Check
$anyProcessOptionalSyncFiles(doc: LoadedEntry): Promise<boolean | undefined> {
return InterceptiveAny;
}
// $anyProcessOptionalSyncFiles(doc: LoadedEntry): Promise<boolean | undefined> {
// // ReplicationService.processOptionalSyncFile
// return InterceptiveAny;
// }
$anyProcessReplicatedDoc(doc: MetaEntry): Promise<boolean | undefined> {
return InterceptiveAny;
}
// $anyProcessReplicatedDoc(doc: MetaEntry): Promise<boolean | undefined> {
// // ReplicationService.processReplicatedDocument
// return InterceptiveAny;
// }
//---> Sync
$$parseReplicationResult(docs: Array<PouchDB.Core.ExistingDocument<EntryDoc>>): void {
throwShouldBeOverridden();
}
// $$parseReplicationResult(docs: Array<PouchDB.Core.ExistingDocument<EntryDoc>>): void {
// // ReplicationService.parseSynchroniseResult
// throwShouldBeOverridden();
// }
$anyModuleParsedReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>): Promise<boolean | undefined> {
return InterceptiveAny;
}
$everyBeforeRealizeSetting(): Promise<boolean> {
return InterceptiveEvery;
}
$everyAfterRealizeSetting(): Promise<boolean> {
return InterceptiveEvery;
}
$everyRealizeSettingSyncMode(): Promise<boolean> {
return InterceptiveEvery;
}
// $anyModuleParsedReplicationResultItem(docs: PouchDB.Core.ExistingDocument<EntryDoc>): Promise<boolean | undefined> {
// // ReplicationService.processVirtualDocument
// return InterceptiveAny;
// }
// $everyBeforeRealizeSetting(): Promise<boolean> {
// // SettingEventManager.beforeRealiseSetting
// return InterceptiveEvery;
// }
// $everyAfterRealizeSetting(): Promise<boolean> {
// // SettingEventManager.onSettingRealised
// return InterceptiveEvery;
// }
// $everyRealizeSettingSyncMode(): Promise<boolean> {
// // SettingEventManager.onRealiseSetting
// return InterceptiveEvery;
// }
$everyBeforeSuspendProcess(): Promise<boolean> {
return InterceptiveEvery;
}
$everyOnResumeProcess(): Promise<boolean> {
return InterceptiveEvery;
}
$everyAfterResumeProcess(): Promise<boolean> {
return InterceptiveEvery;
}
// $everyBeforeSuspendProcess(): Promise<boolean> {
// // AppLifecycleService.onSuspending
// return InterceptiveEvery;
// }
// $everyOnResumeProcess(): Promise<boolean> {
// // AppLifecycleService.onResuming
// return InterceptiveEvery;
// }
// $everyAfterResumeProcess(): Promise<boolean> {
// // AppLifecycleService.onResumed
// return InterceptiveEvery;
// }
$$fetchRemotePreferredTweakValues(trialSetting: RemoteDBSettings): Promise<TweakValues | false> {
throwShouldBeOverridden();
}
$$checkAndAskResolvingMismatchedTweaks(preferred: Partial<TweakValues>): Promise<[TweakValues | boolean, boolean]> {
throwShouldBeOverridden();
}
$$askResolvingMismatchedTweaks(preferredSource: TweakValues): Promise<"OK" | "CHECKAGAIN" | "IGNORE"> {
throwShouldBeOverridden();
}
// $$fetchRemotePreferredTweakValues(trialSetting: RemoteDBSettings): Promise<TweakValues | false> {
// //TODO:TweakValueService.fetchRemotePreferred
// throwShouldBeOverridden();
// }
// $$checkAndAskResolvingMismatchedTweaks(preferred: Partial<TweakValues>): Promise<[TweakValues | boolean, boolean]> {
// //TODO:TweakValueService.checkAndAskResolvingMismatched
// throwShouldBeOverridden();
// }
// $$askResolvingMismatchedTweaks(preferredSource: TweakValues): Promise<"OK" | "CHECKAGAIN" | "IGNORE"> {
// //TODO:TweakValueService.askResolvingMismatched
// throwShouldBeOverridden();
// }
$$checkAndAskUseRemoteConfiguration(
settings: RemoteDBSettings
): Promise<{ result: false | TweakValues; requireFetch: boolean }> {
throwShouldBeOverridden();
}
// $$checkAndAskUseRemoteConfiguration(
// settings: RemoteDBSettings
// ): Promise<{ result: false | TweakValues; requireFetch: boolean }> {
// // TweakValueService.checkAndAskUseRemoteConfiguration
// throwShouldBeOverridden();
// }
$$askUseRemoteConfiguration(
trialSetting: RemoteDBSettings,
preferred: TweakValues
): Promise<{ result: false | TweakValues; requireFetch: boolean }> {
throwShouldBeOverridden();
}
$everyBeforeReplicate(showMessage: boolean): Promise<boolean> {
return InterceptiveEvery;
}
$$replicate(showMessage: boolean = false): Promise<boolean | void> {
throwShouldBeOverridden();
}
$$replicateByEvent(showMessage: boolean = false): Promise<boolean | void> {
throwShouldBeOverridden();
}
// $$askUseRemoteConfiguration(
// trialSetting: RemoteDBSettings,
// preferred: TweakValues
// ): Promise<{ result: false | TweakValues; requireFetch: boolean }> {
// // TweakValueService.askUseRemoteConfiguration
// throwShouldBeOverridden();
// }
// $everyBeforeReplicate(showMessage: boolean): Promise<boolean> {
// // ReplicationService.beforeReplicate
// return InterceptiveEvery;
// }
$everyOnDatabaseInitialized(showingNotice: boolean): Promise<boolean> {
throwShouldBeOverridden();
}
// $$canReplicate(showMessage: boolean = false): Promise<boolean> {
// // ReplicationService.isReplicationReady
// throwShouldBeOverridden();
// }
$$initializeDatabase(showingNotice: boolean = false, reopenDatabase = true): Promise<boolean> {
throwShouldBeOverridden();
}
// $$replicate(showMessage: boolean = false): Promise<boolean | void> {
// // ReplicationService.replicate
// throwShouldBeOverridden();
// }
// $$replicateByEvent(showMessage: boolean = false): Promise<boolean | void> {
// // ReplicationService.replicateByEvent
// throwShouldBeOverridden();
// }
$anyAfterConnectCheckFailed(): Promise<boolean | "CHECKAGAIN" | undefined> {
return InterceptiveAny;
}
// $everyOnDatabaseInitialized(showingNotice: boolean): Promise<boolean> {
// // DatabaseEventService.onDatabaseInitialised
// throwShouldBeOverridden();
// }
$$replicateAllToServer(
showingNotice: boolean = false,
sendChunksInBulkDisabled: boolean = false
): Promise<boolean> {
throwShouldBeOverridden();
}
$$replicateAllFromServer(showingNotice: boolean = false): Promise<boolean> {
throwShouldBeOverridden();
}
// $$initializeDatabase(
// showingNotice: boolean = false,
// reopenDatabase = true,
// ignoreSuspending: boolean = false
// ): Promise<boolean> {
// // DatabaseEventService.initializeDatabase
// throwShouldBeOverridden();
// }
// $anyAfterConnectCheckFailed(): Promise<boolean | "CHECKAGAIN" | undefined> {
// // ReplicationService.checkConnectionFailure
// return InterceptiveAny;
// }
// $$replicateAllToServer(
// showingNotice: boolean = false,
// sendChunksInBulkDisabled: boolean = false
// ): Promise<boolean> {
// // RemoteService.replicateAllToRemote
// throwShouldBeOverridden();
// }
// $$replicateAllFromServer(showingNotice: boolean = false): Promise<boolean> {
// // RemoteService.replicateAllFromRemote
// throwShouldBeOverridden();
// }
// Remote Governing
$$markRemoteLocked(lockByClean: boolean = false): Promise<void> {
throwShouldBeOverridden();
}
// $$markRemoteLocked(lockByClean: boolean = false): Promise<void> {
// // RemoteService.markLocked;
// throwShouldBeOverridden();
// }
$$markRemoteUnlocked(): Promise<void> {
throwShouldBeOverridden();
}
// $$markRemoteUnlocked(): Promise<void> {
// // RemoteService.markUnlocked;
// throwShouldBeOverridden();
// }
$$markRemoteResolved(): Promise<void> {
throwShouldBeOverridden();
}
// $$markRemoteResolved(): Promise<void> {
// // RemoteService.markResolved;
// throwShouldBeOverridden();
// }
// <-- Remote Governing
$$isFileSizeExceeded(size: number): boolean {
throwShouldBeOverridden();
}
// $$isFileSizeExceeded(size: number): boolean {
// // VaultService.isFileSizeTooLarge
// throwShouldBeOverridden();
// }
$$performFullScan(showingNotice?: boolean): Promise<void> {
throwShouldBeOverridden();
}
// $$performFullScan(showingNotice?: boolean, ignoreSuspending?: boolean): Promise<void> {
// // VaultService.scanVault
// throwShouldBeOverridden();
// }
$anyResolveConflictByUI(
filename: FilePathWithPrefix,
conflictCheckResult: diff_result
): Promise<boolean | undefined> {
return InterceptiveAny;
}
$$resolveConflictByDeletingRev(
path: FilePathWithPrefix,
deleteRevision: string,
subTitle = ""
): Promise<typeof MISSING_OR_ERROR | typeof AUTO_MERGED> {
throwShouldBeOverridden();
}
$$resolveConflict(filename: FilePathWithPrefix): Promise<void> {
throwShouldBeOverridden();
}
$anyResolveConflictByNewest(filename: FilePathWithPrefix): Promise<boolean> {
throwShouldBeOverridden();
}
// $anyResolveConflictByUI(
// filename: FilePathWithPrefix,
// conflictCheckResult: diff_result
// ): Promise<boolean | undefined> {
// // ConflictService.resolveConflictByUserInteraction
// return InterceptiveAny;
// }
// $$resolveConflictByDeletingRev(
// path: FilePathWithPrefix,
// deleteRevision: string,
// subTitle = ""
// ): Promise<typeof MISSING_OR_ERROR | typeof AUTO_MERGED> {
// // ConflictService.resolveByDeletingRevision
// throwShouldBeOverridden();
// }
// $$resolveConflict(filename: FilePathWithPrefix): Promise<void> {
// // ConflictService.resolveConflict
// throwShouldBeOverridden();
// }
// $anyResolveConflictByNewest(filename: FilePathWithPrefix): Promise<boolean> {
// // ConflictService.resolveByNewest
// throwShouldBeOverridden();
// }
$$resetLocalDatabase(): Promise<void> {
throwShouldBeOverridden();
}
// $$resetLocalDatabase(): Promise<void> {
// // DatabaseService.resetDatabase;
// throwShouldBeOverridden();
// }
$$tryResetRemoteDatabase(): Promise<void> {
throwShouldBeOverridden();
}
// $$tryResetRemoteDatabase(): Promise<void> {
// // RemoteService.tryResetDatabase;
// throwShouldBeOverridden();
// }
$$tryCreateRemoteDatabase(): Promise<void> {
throwShouldBeOverridden();
}
// $$tryCreateRemoteDatabase(): Promise<void> {
// // RemoteService.tryCreateDatabase;
// throwShouldBeOverridden();
// }
$$isIgnoredByIgnoreFiles(file: string | UXFileInfoStub): Promise<boolean> {
throwShouldBeOverridden();
}
// $$isIgnoredByIgnoreFiles(file: string | UXFileInfoStub): Promise<boolean> {
// // VaultService.isIgnoredByIgnoreFiles
// throwShouldBeOverridden();
// }
$$isTargetFile(file: string | UXFileInfoStub, keepFileCheckList = false): Promise<boolean> {
throwShouldBeOverridden();
}
// $$isTargetFile(file: string | UXFileInfoStub, keepFileCheckList = false): Promise<boolean> {
// // VaultService.isTargetFile
// throwShouldBeOverridden();
// }
$$askReload(message?: string) {
throwShouldBeOverridden();
}
$$scheduleAppReload() {
throwShouldBeOverridden();
}
// $$askReload(message?: string) {
// // AppLifecycleService.askRestart
// throwShouldBeOverridden();
// }
// $$scheduleAppReload() {
// // AppLifecycleService.scheduleRestart
// throwShouldBeOverridden();
// }
//--- Setup
$allSuspendAllSync(): Promise<boolean> {
return InterceptiveAll;
}
$allSuspendExtraSync(): Promise<boolean> {
return InterceptiveAll;
}
// $allSuspendAllSync(): Promise<boolean> {
// // SettingEventManager.suspendAllSync
// return InterceptiveAll;
// }
// $allSuspendExtraSync(): Promise<boolean> {
// // SettingEventManager.suspendExtraSync
// return InterceptiveAll;
// }
$allAskUsingOptionalSyncFeature(opt: { enableFetch?: boolean; enableOverwrite?: boolean }): Promise<boolean> {
throwShouldBeOverridden();
}
$anyConfigureOptionalSyncFeature(mode: string): Promise<void> {
throwShouldBeOverridden();
}
// $allAskUsingOptionalSyncFeature(opt: { enableFetch?: boolean; enableOverwrite?: boolean }): Promise<boolean> {
// // SettingEventManager.suggestOptionalFeatures
// throwShouldBeOverridden();
// }
// $anyConfigureOptionalSyncFeature(mode: string): Promise<void> {
// // SettingEventManager.enableOptionalFeature
// throwShouldBeOverridden();
// }
$$showView(viewType: string): Promise<void> {
throwShouldBeOverridden();
}
// $$showView(viewType: string): Promise<void> {
// // UIManager.showWindow //
// throwShouldBeOverridden();
// }
// For Development: Ensure reliability more and more. May this plug-in help all of us.
$everyModuleTest(): Promise<boolean> {
return InterceptiveEvery;
}
$everyModuleTestMultiDevice(): Promise<boolean> {
return InterceptiveEvery;
}
$$addTestResult(name: string, key: string, result: boolean, summary?: string, message?: string): void {
throwShouldBeOverridden();
}
// $everyModuleTest(): Promise<boolean> {
// return InterceptiveEvery;
// }
// $everyModuleTestMultiDevice(): Promise<boolean> {
// return InterceptiveEvery;
// }
// $$addTestResult(name: string, key: string, result: boolean, summary?: string, message?: string): void {
// throwShouldBeOverridden();
// }
_isThisModuleEnabled(): boolean {
return true;
}
// _isThisModuleEnabled(): boolean {
// return true;
// }
$anyGetAppId(): Promise<string | undefined> {
return InterceptiveAny;
}
// $anyGetAppId(): Promise<string | undefined> {
// // APIService.getAppId
// return InterceptiveAny;
// }
// Plug-in's overrideable functions
onload() {
void this.$$onLiveSyncLoad();
void this.services.appLifecycle.onLoad();
}
async saveSettings() {
await this.$$saveSettingData();
await this.services.setting.saveSettingData();
}
onunload() {
return void this.$$onLiveSyncUnload();
return void this.services.appLifecycle.onAppUnload();
}
// <-- Plug-in's overrideable functions
}
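
Reading the block above as a whole: each `$`-prefixed hook that used to live on the core class is paired with a commented-out copy whose annotation (`// ServiceName.methodName`) names the service method that replaces it. The sketch below illustrates that replacement pattern in isolation; `SettingService` and `handleSaveSettingData` are simplified stand-ins for the registration style visible in the later `onBindFunction(core, services)` implementations, not the plug-in's actual service API.

```ts
// Illustrative only: a module registers a bound handler on a service instead of
// overriding a $$-prefixed method on the core class.
type Handler<A extends unknown[], R> = (...args: A) => R;

class SettingService {
    private _saveSettingData?: Handler<[], Promise<void>>;
    handleSaveSettingData(fn: Handler<[], Promise<void>>) {
        this._saveSettingData = fn;
    }
    saveSettingData(): Promise<void> {
        if (!this._saveSettingData) throw new Error("No handler bound for saveSettingData");
        return this._saveSettingData();
    }
}

class ExampleModule {
    private async _saveSettingData(): Promise<void> {
        // Persist settings here (omitted in this sketch).
    }
    // Plays the role of onBindFunction(core, services) in the diffs below.
    onBindFunction(services: { setting: SettingService }) {
        services.setting.handleSaveSettingData(this._saveSettingData.bind(this));
    }
}

// Wiring: the module registers its handler; callers go through the service.
const services = { setting: new SettingService() };
new ExampleModule().onBindFunction(services);
void services.setting.saveSettingData();
```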

View File

@@ -1,34 +1,35 @@
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, Logger } from "octagonal-wheels/common/logger";
import type { LOG_LEVEL } from "../lib/src/common/types";
import type { LiveSyncCore } from "../main";
import { unique } from "octagonal-wheels/collection";
import type { IObsidianModule } from "./AbstractObsidianModule.ts";
import type {
ICoreModuleBase,
AllInjectableProps,
AllExecuteProps,
EveryExecuteProps,
AnyExecuteProps,
ICoreModule,
} from "./ModuleTypes";
import { __$checkInstanceBinding } from "../lib/src/dev/checks";
// import { unique } from "octagonal-wheels/collection";
// import type { IObsidianModule } from "./AbstractObsidianModule.ts";
// import type {
// ICoreModuleBase,
// AllInjectableProps,
// AllExecuteProps,
// EveryExecuteProps,
// AnyExecuteProps,
// ICoreModule,
// } from "./ModuleTypes";
function isOverridableKey(key: string): key is keyof ICoreModuleBase {
return key.startsWith("$");
}
// function isOverridableKey(key: string): key is keyof ICoreModuleBase {
// return key.startsWith("$");
// }
function isInjectableKey(key: string): key is keyof AllInjectableProps {
return key.startsWith("$$");
}
// function isInjectableKey(key: string): key is keyof AllInjectableProps {
// return key.startsWith("$$");
// }
function isAllExecuteKey(key: string): key is keyof AllExecuteProps {
return key.startsWith("$all");
}
function isEveryExecuteKey(key: string): key is keyof EveryExecuteProps {
return key.startsWith("$every");
}
function isAnyExecuteKey(key: string): key is keyof AnyExecuteProps {
return key.startsWith("$any");
}
// function isAllExecuteKey(key: string): key is keyof AllExecuteProps {
// return key.startsWith("$all");
// }
// function isEveryExecuteKey(key: string): key is keyof EveryExecuteProps {
// return key.startsWith("$every");
// }
// function isAnyExecuteKey(key: string): key is keyof AnyExecuteProps {
// return key.startsWith("$any");
// }
/**
* All $-prefixed functions are hooked by the modules. Be careful when calling them directly.
* Please refer to each module's source code to understand the function.
@@ -39,100 +40,100 @@ function isAnyExecuteKey(key: string): key is keyof AnyExecuteProps {
* $ : Other interceptive points. You should assign the module manually.
* All of the above are performed in the injectModules function.
*/
export function injectModules<T extends ICoreModule>(target: T, modules: ICoreModule[]) {
const allKeys = unique([
...Object.keys(Object.getOwnPropertyDescriptors(target)),
...Object.keys(Object.getOwnPropertyDescriptors(Object.getPrototypeOf(target))),
]).filter((e) => e.startsWith("$")) as (keyof ICoreModule)[];
const moduleMap = new Map<string, IObsidianModule[]>();
for (const module of modules) {
for (const key of allKeys) {
if (isOverridableKey(key)) {
if (key in module) {
const list = moduleMap.get(key) || [];
if (typeof module[key] === "function") {
module[key] = module[key].bind(module) as any;
}
list.push(module);
moduleMap.set(key, list);
}
}
}
}
Logger(`Injecting modules for ${target.constructor.name}`, LOG_LEVEL_VERBOSE);
for (const key of allKeys) {
const modules = moduleMap.get(key) || [];
if (isInjectableKey(key)) {
if (modules.length == 0) {
throw new Error(`No module injected for ${key}. This is a fatal error.`);
}
target[key] = modules[0][key]! as any;
Logger(`[${modules[0].constructor.name}]: Injected ${key} `, LOG_LEVEL_VERBOSE);
} else if (isAllExecuteKey(key)) {
const modules = moduleMap.get(key) || [];
target[key] = async (...args: any) => {
for (const module of modules) {
try {
//@ts-ignore
await module[key]!(...args);
} catch (ex) {
Logger(`[${module.constructor.name}]: All handler for ${key} failed`, LOG_LEVEL_VERBOSE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
return true;
};
for (const module of modules) {
Logger(`[${module.constructor.name}]: Injected (All) ${key} `, LOG_LEVEL_VERBOSE);
}
} else if (isEveryExecuteKey(key)) {
target[key] = async (...args: any) => {
for (const module of modules) {
try {
//@ts-ignore:2556
const ret = await module[key]!(...args);
if (ret !== undefined && !ret) {
// If failed, return that falsy value.
return ret;
}
} catch (ex) {
Logger(`[${module.constructor.name}]: Every handler for ${key} failed`);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
return true;
};
for (const module of modules) {
Logger(`[${module.constructor.name}]: Injected (Every) ${key} `, LOG_LEVEL_VERBOSE);
}
} else if (isAnyExecuteKey(key)) {
//@ts-ignore
target[key] = async (...args: any[]) => {
for (const module of modules) {
try {
//@ts-ignore:2556
const ret = await module[key](...args);
// If a truthy value was returned, return that value.
if (ret) {
return ret;
}
} catch (ex) {
Logger(`[${module.constructor.name}]: Any handler for ${key} failed`);
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
return false;
};
for (const module of modules) {
Logger(`[${module.constructor.name}]: Injected (Any) ${key} `, LOG_LEVEL_VERBOSE);
}
} else {
Logger(`No injected handler for ${key} `, LOG_LEVEL_VERBOSE);
}
}
Logger(`Injected modules for ${target.constructor.name}`, LOG_LEVEL_VERBOSE);
return true;
}
// export function injectModules<T extends ICoreModule>(target: T, modules: ICoreModule[]) {
// const allKeys = unique([
// ...Object.keys(Object.getOwnPropertyDescriptors(target)),
// ...Object.keys(Object.getOwnPropertyDescriptors(Object.getPrototypeOf(target))),
// ]).filter((e) => e.startsWith("$")) as (keyof ICoreModule)[];
// const moduleMap = new Map<string, IObsidianModule[]>();
// for (const module of modules) {
// for (const key of allKeys) {
// if (isOverridableKey(key)) {
// if (key in module) {
// const list = moduleMap.get(key) || [];
// if (typeof module[key] === "function") {
// module[key] = module[key].bind(module) as any;
// }
// list.push(module);
// moduleMap.set(key, list);
// }
// }
// }
// }
// Logger(`Injecting modules for ${target.constructor.name}`, LOG_LEVEL_VERBOSE);
// for (const key of allKeys) {
// const modules = moduleMap.get(key) || [];
// if (isInjectableKey(key)) {
// if (modules.length == 0) {
// throw new Error(`No module injected for ${key}. This is a fatal error.`);
// }
// target[key] = modules[0][key]! as any;
// Logger(`[${modules[0].constructor.name}]: Injected ${key} `, LOG_LEVEL_VERBOSE);
// } else if (isAllExecuteKey(key)) {
// const modules = moduleMap.get(key) || [];
// target[key] = async (...args: any) => {
// for (const module of modules) {
// try {
// //@ts-ignore
// await module[key]!(...args);
// } catch (ex) {
// Logger(`[${module.constructor.name}]: All handler for ${key} failed`, LOG_LEVEL_VERBOSE);
// Logger(ex, LOG_LEVEL_VERBOSE);
// }
// }
// return true;
// };
// for (const module of modules) {
// Logger(`[${module.constructor.name}]: Injected (All) ${key} `, LOG_LEVEL_VERBOSE);
// }
// } else if (isEveryExecuteKey(key)) {
// target[key] = async (...args: any) => {
// for (const module of modules) {
// try {
// //@ts-ignore:2556
// const ret = await module[key]!(...args);
// if (ret !== undefined && !ret) {
// // If failed, return that falsy value.
// return ret;
// }
// } catch (ex) {
// Logger(`[${module.constructor.name}]: Every handler for ${key} failed`);
// Logger(ex, LOG_LEVEL_VERBOSE);
// }
// }
// return true;
// };
// for (const module of modules) {
// Logger(`[${module.constructor.name}]: Injected (Every) ${key} `, LOG_LEVEL_VERBOSE);
// }
// } else if (isAnyExecuteKey(key)) {
// //@ts-ignore
// target[key] = async (...args: any[]) => {
// for (const module of modules) {
// try {
// //@ts-ignore:2556
// const ret = await module[key](...args);
// // If a truthy value was returned, return that value.
// if (ret) {
// return ret;
// }
// } catch (ex) {
// Logger(`[${module.constructor.name}]: Any handler for ${key} failed`);
// Logger(ex, LOG_LEVEL_VERBOSE);
// }
// }
// return false;
// };
// for (const module of modules) {
// Logger(`[${module.constructor.name}]: Injected (Any) ${key} `, LOG_LEVEL_VERBOSE);
// }
// } else {
// Logger(`No injected handler for ${key} `, LOG_LEVEL_VERBOSE);
// }
// }
// Logger(`Injected modules for ${target.constructor.name}`, LOG_LEVEL_VERBOSE);
// return true;
// }
export abstract class AbstractModule {
_log = (msg: any, level: LOG_LEVEL = LOG_LEVEL_INFO, key?: string) => {
@@ -153,14 +154,18 @@ export abstract class AbstractModule {
this.core.settings = value;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services) {
// Override if needed.
}
constructor(public core: LiveSyncCore) {
this.onBindFunction(core, core.services);
Logger(`[${this.constructor.name}] Loaded`, LOG_LEVEL_VERBOSE);
__$checkInstanceBinding(this);
}
saveSettings = this.core.saveSettings.bind(this.core);
// abstract $everyTest(): Promise<boolean>;
addTestResult(key: string, value: boolean, summary?: string, message?: string) {
this.core.$$addTestResult(`${this.constructor.name}`, key, value, summary, message);
this.services.test.addTestResult(`${this.constructor.name}`, key, value, summary, message);
}
testDone(result: boolean = true) {
return Promise.resolve(result);
@@ -185,4 +190,8 @@ export abstract class AbstractModule {
}
return this.testDone();
}
get services() {
return this.core._services;
}
}
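
The prefix rules documented above `injectModules` reduce to three fold strategies over the registered handlers: `$all` runs every handler, `$every` stops at the first explicit failure, and `$any` returns the first truthy result. The standalone sketch below restates those semantics; it is a simplification that omits the logging and `@ts-ignore` plumbing of the real function.

```ts
type AsyncHandler<A extends unknown[], R> = (...args: A) => Promise<R>;

// $all: run every handler; the combined result is always true.
function combineAll<A extends unknown[]>(handlers: AsyncHandler<A, unknown>[]) {
    return async (...args: A): Promise<boolean> => {
        for (const handler of handlers) {
            await handler(...args); // sketch: errors propagate; the real injector catches and logs them
        }
        return true;
    };
}

// $every: the first explicit falsy result short-circuits the chain.
function combineEvery<A extends unknown[]>(handlers: AsyncHandler<A, boolean | undefined>[]) {
    return async (...args: A): Promise<boolean | undefined> => {
        for (const handler of handlers) {
            const ret = await handler(...args);
            if (ret !== undefined && !ret) return ret; // first failure wins
        }
        return true;
    };
}

// $any: the first truthy result wins; false means nobody handled it.
function combineAny<A extends unknown[], R>(handlers: AsyncHandler<A, R | undefined>[]) {
    return async (...args: A): Promise<R | false> => {
        for (const handler of handlers) {
            const ret = await handler(...args);
            if (ret) return ret;
        }
        return false;
    };
}
```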

View File

@@ -37,18 +37,18 @@ export abstract class AbstractObsidianModule extends AbstractModule {
saveSettings = this.plugin.saveSettings.bind(this.plugin);
_isMainReady() {
return this.core.$$isReady();
isMainReady() {
return this.services.appLifecycle.isReady();
}
_isMainSuspended() {
return this.core.$$isSuspended();
isMainSuspended() {
return this.services.appLifecycle.isSuspended();
}
_isDatabaseReady() {
return this.core.$$isDatabaseReady();
isDatabaseReady() {
return this.services.database.isDatabaseReady();
}
// should be overridden
_isThisModuleEnabled() {
isThisModuleEnabled() {
return true;
}
}
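
The renames above only drop the leading underscore; the guards still answer the same questions, now routed through the services. A hypothetical helper showing how a module might combine them (the `LifecycleLike` and `DatabaseLike` shapes are stand-ins for `services.appLifecycle` and `services.database`, not the real interfaces):

```ts
interface LifecycleLike {
    isReady(): boolean;
    isSuspended(): boolean;
}
interface DatabaseLike {
    isDatabaseReady(): boolean;
}

// Mirrors the checks modules perform before doing any work: the app must be
// ready, not suspended, and the local database must be open.
function canProcess(appLifecycle: LifecycleLike, database: DatabaseLike): boolean {
    return appLifecycle.isReady() && !appLifecycle.isSuspended() && database.isDatabaseReady();
}
```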

View File

@@ -17,7 +17,6 @@ import type {
DocumentID,
} from "../../lib/src/common/types";
import type { DatabaseFileAccess } from "../interfaces/DatabaseFileAccess";
import { type IObsidianModule } from "../AbstractObsidianModule.ts";
import { isPlainText, shouldBeIgnored, stripAllPrefixes } from "../../lib/src/string_and_binary/path";
import {
createBlob,
@@ -30,14 +29,15 @@ import {
import { serialized } from "octagonal-wheels/concurrency/lock";
import { AbstractModule } from "../AbstractModule.ts";
import { ICHeader } from "../../common/types.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidianModule, DatabaseFileAccess {
$everyOnload(): Promise<boolean> {
export class ModuleDatabaseFileAccess extends AbstractModule implements DatabaseFileAccess {
private _everyOnload(): Promise<boolean> {
this.core.databaseFileAccess = this;
return Promise.resolve(true);
}
async $everyModuleTest(): Promise<boolean> {
private async _everyModuleTest(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
const testString = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. Nullam nec purus nec nunc";
// Before test, we need to delete completely.
@@ -53,7 +53,7 @@ export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidia
async () => await this.storeContent("autoTest.md" as FilePathWithPrefix, testString)
);
// For test, we need to clear the caches.
await this.localDatabase.hashCaches.clear();
this.localDatabase.clearCaches();
await this._test("readContent", async () => {
const content = await this.fetch("autoTest.md" as FilePathWithPrefix);
if (!content) return "File not found";
@@ -75,7 +75,7 @@ export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidia
async checkIsTargetFile(file: UXFileInfoStub | FilePathWithPrefix): Promise<boolean> {
const path = getStoragePathFromUXFileInfo(file);
if (!(await this.core.$$isTargetFile(path))) {
if (!(await this.services.vault.isTargetFile(path))) {
this._log(`File is not target`, LOG_LEVEL_VERBOSE);
return false;
}
@@ -102,11 +102,11 @@ export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidia
}
async createChunks(file: UXFileInfo, force: boolean = false, skipCheck?: boolean): Promise<boolean> {
return await this._store(file, force, skipCheck, true);
return await this.__store(file, force, skipCheck, true);
}
async store(file: UXFileInfo, force: boolean = false, skipCheck?: boolean): Promise<boolean> {
return await this._store(file, force, skipCheck, false);
return await this.__store(file, force, skipCheck, false);
}
async storeContent(path: FilePathWithPrefix, content: string): Promise<boolean> {
const blob = createTextBlob(content);
@@ -124,10 +124,10 @@ export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidia
body: blob,
isInternal,
};
return await this._store(dummyUXFileInfo, true, false, false);
return await this.__store(dummyUXFileInfo, true, false, false);
}
async _store(
private async __store(
file: UXFileInfo,
force: boolean = false,
skipCheck?: boolean,
@@ -177,7 +177,7 @@ export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidia
}
}
const idMain = await this.core.$$path2id(fullPath);
const idMain = await this.services.path.path2id(fullPath);
const id = (idPrefix + idMain) as DocumentID;
const d: SavingEntry = {
@@ -345,4 +345,8 @@ export class ModuleDatabaseFileAccess extends AbstractModule implements IObsidia
eventHub.emitEvent(EVENT_FILE_SAVED);
return ret;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
services.test.handleTest(this._everyModuleTest.bind(this));
}
}
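
This module funnels writes for a given path through `serialized(path, fn)` so that operations on the same file never interleave. The sketch below is a minimal re-implementation of that idea for illustration only; it is not the `octagonal-wheels` implementation used in the diff.

```ts
// Per-key serialisation: each key gets a promise chain, so callbacks for the
// same key run one after another while different keys stay parallel.
const queues = new Map<string, Promise<unknown>>();

function serializedSketch<T>(key: string, fn: () => Promise<T>): Promise<T> {
    const previous = queues.get(key) ?? Promise.resolve();
    // Chain after the previous task, but do not inherit its failure.
    const next = previous.catch(() => undefined).then(fn);
    // Keep the chain alive for the next caller regardless of our outcome.
    queues.set(key, next.catch(() => undefined));
    return next;
}

// Usage, mirroring the storeContent test above (storeSomething is hypothetical):
// await serializedSketch("autoTest.md", () => storeSomething("autoTest.md", "..."));
```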

View File

@@ -18,13 +18,13 @@ import {
getStoragePathFromUXFileInfo,
markChangesAreSame,
} from "../../common/utils";
import { getDocDataAsArray, isDocContentSame, readContent } from "../../lib/src/common/utils";
import { getDocDataAsArray, isDocContentSame, readAsBlob, readContent } from "../../lib/src/common/utils";
import { shouldBeIgnored } from "../../lib/src/string_and_binary/path";
import type { ICoreModule } from "../ModuleTypes";
import { Semaphore } from "octagonal-wheels/concurrency/semaphore";
import { eventHub } from "../../common/events.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleFileHandler extends AbstractModule implements ICoreModule {
export class ModuleFileHandler extends AbstractModule {
get db() {
return this.core.databaseFileAccess;
}
@@ -32,7 +32,7 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
return this.core.storageAccess;
}
$everyOnloadStart(): Promise<boolean> {
_everyOnloadStart(): Promise<boolean> {
this.core.fileHandler = this;
return Promise.resolve(true);
}
@@ -52,7 +52,7 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
info: UXFileInfoStub | UXFileInfo | UXInternalFileInfoStub | FilePathWithPrefix,
force: boolean = false,
onlyChunks: boolean = false
): Promise<boolean | undefined> {
): Promise<boolean> {
const file = typeof info === "string" ? this.storage.getFileStub(info) : info;
if (file == null) {
this._log(`File ${info} does not exist on the storage`, LOG_LEVEL_VERBOSE);
@@ -94,10 +94,14 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
let readFile: UXFileInfo | undefined = undefined;
if (!shouldApplied) {
readFile = await this.readFileFromStub(file);
if (!readFile) {
this._log(`File ${file.path} does not exist on the storage`, LOG_LEVEL_NOTICE);
return false;
}
if (await isDocContentSame(getDocDataAsArray(entry.data), readFile.body)) {
// Timestamp is different but the content is the same; therefore, the two timestamps should be handled as the same.
// So, mark the changes as the same.
markChangesAreSame(file, file.stat.mtime, entry.mtime);
markChangesAreSame(readFile, readFile.stat.mtime, entry.mtime);
} else {
shouldApplied = true;
}
@@ -125,7 +129,7 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
}
}
async deleteFileFromDB(info: UXFileInfoStub | UXInternalFileInfoStub | FilePath): Promise<boolean | undefined> {
async deleteFileFromDB(info: UXFileInfoStub | UXInternalFileInfoStub | FilePath): Promise<boolean> {
const file = typeof info === "string" ? this.storage.getFileStub(info) : info;
if (file == null) {
this._log(`File ${info} does not exist on the storage`, LOG_LEVEL_VERBOSE);
@@ -222,7 +226,7 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
// NO OP
} else {
// If not, it should be checked and will be processed later (i.e., after the conflict is resolved).
await this.core.$$queueConflictCheckIfOpen(path);
await this.services.conflict.queueCheckForIfOpen(path);
return true;
}
}
@@ -256,6 +260,17 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
this._log(`File ${path} does not exist on the database`, LOG_LEVEL_VERBOSE);
return false;
}
// If we want to process size-mismatched files (e.g., files created by some integrations), enable the toggle.
if (!this.settings.processSizeMismatchedFiles) {
// Check the file is not corrupted
// (Zero is a special case: it may be created by some APIs and might be acceptable.)
if (docRead.size != 0 && docRead.size !== readAsBlob(docRead).size) {
this._log(`File ${path} seems to be corrupted! Writing prevented.`, LOG_LEVEL_NOTICE);
return false;
}
}
const docData = readContent(docRead);
if (existOnStorage && !force) {
@@ -302,11 +317,11 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
return ret;
}
async $anyHandlerProcessesFileEvent(item: FileEventItem): Promise<boolean | undefined> {
private async _anyHandlerProcessesFileEvent(item: FileEventItem): Promise<boolean> {
const eventItem = item.args;
const type = item.type;
const path = eventItem.file.path;
if (!(await this.core.$$isTargetFile(path))) {
if (!(await this.services.vault.isTargetFile(path))) {
this._log(`File ${path} is not the target file`, LOG_LEVEL_VERBOSE);
return false;
}
@@ -332,12 +347,16 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
});
}
async $anyProcessReplicatedDoc(entry: MetaEntry): Promise<boolean | undefined> {
async _anyProcessReplicatedDoc(entry: MetaEntry): Promise<boolean> {
return await serialized(entry.path, async () => {
if (!(await this.core.$$isTargetFile(entry.path))) {
if (!(await this.services.vault.isTargetFile(entry.path))) {
this._log(`File ${entry.path} is not the target file`, LOG_LEVEL_VERBOSE);
return false;
}
if (this.services.vault.isFileSizeTooLarge(entry.size)) {
this._log(`File ${entry.path} is too large (on database) to be processed`, LOG_LEVEL_VERBOSE);
return false;
}
if (shouldBeIgnored(entry.path)) {
this._log(`File ${entry.path} should be ignored`, LOG_LEVEL_VERBOSE);
return false;
@@ -350,8 +369,12 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
// Nothing to do, and other modules should also do nothing.
return true;
} else {
if (targetFile && this.services.vault.isFileSizeTooLarge(targetFile.stat.size)) {
this._log(`File ${targetFile.path} is too large (on storage) to be processed`, LOG_LEVEL_VERBOSE);
return false;
}
this._log(
`Processing ${path} (${entry._id.substring(0, 8)}: ${entry._rev?.substring(0, 5)}) :Started...`,
`Processing ${path} (${entry._id.substring(0, 8)} :${entry._rev?.substring(0, 5)}) : Started...`,
LOG_LEVEL_VERBOSE
);
// Before writing (or skipping), the merging dialogue should be cancelled.
@@ -380,7 +403,11 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
};
const total = filesStorageSrc.length;
const procAllChunks = filesStorageSrc.map(async (file) => {
if (!(await this.core.$$isTargetFile(file))) {
if (!(await this.services.vault.isTargetFile(file))) {
incProcessed();
return true;
}
if (this.services.vault.isFileSizeTooLarge(file.stat.size)) {
incProcessed();
return true;
}
@@ -405,4 +432,9 @@ export class ModuleFileHandler extends AbstractModule implements ICoreModule {
"chunkCreation"
);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.fileProcessing.handleProcessFileEvent(this._anyHandlerProcessesFileEvent.bind(this));
services.replication.handleProcessSynchroniseResult(this._anyProcessReplicatedDoc.bind(this));
}
}
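
A behavioural addition in this file is the size-mismatch guard applied before a replicated document is written to storage. The sketch below restates that decision on its own; `recordedSize` and `actualSize` stand in for `docRead.size` and `readAsBlob(docRead).size`, and the boolean mirrors the `processSizeMismatchedFiles` setting.

```ts
// Returns true when the entry may be written to storage, per the guard above:
// a zero recorded size is tolerated (some APIs create such entries), and the
// whole check can be bypassed by enabling processSizeMismatchedFiles.
function mayWriteEntry(
    recordedSize: number,
    actualSize: number,
    processSizeMismatchedFiles: boolean
): boolean {
    if (processSizeMismatchedFiles) return true;
    if (recordedSize === 0) return true; // special case noted in the source comment
    return recordedSize === actualSize; // a mismatch suggests a corrupted entry
}
```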

View File

@@ -2,24 +2,45 @@ import { $msg } from "../../lib/src/common/i18n";
import { LiveSyncLocalDB } from "../../lib/src/pouchdb/LiveSyncLocalDB.ts";
import { initializeStores } from "../../common/stores.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import { LiveSyncManagers } from "../../lib/src/managers/LiveSyncManagers.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleLocalDatabaseObsidian extends AbstractModule implements ICoreModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleLocalDatabaseObsidian extends AbstractModule {
_everyOnloadStart(): Promise<boolean> {
return Promise.resolve(true);
}
async $$openDatabase(): Promise<boolean> {
private async _openDatabase(): Promise<boolean> {
if (this.localDatabase != null) {
await this.localDatabase.close();
}
const vaultName = this.core.$$getVaultName();
const vaultName = this.services.vault.getVaultName();
this._log($msg("moduleLocalDatabase.logWaitingForReady"));
const getDB = () => this.core.localDatabase.localDatabase;
const getSettings = () => this.core.settings;
this.core.managers = new LiveSyncManagers({
get database() {
return getDB();
},
getActiveReplicator: () => this.core.replicator,
id2path: this.services.path.id2path,
// path2id: this.core.$$path2id.bind(this.core),
path2id: this.services.path.path2id,
get settings() {
return getSettings();
},
});
this.core.localDatabase = new LiveSyncLocalDB(vaultName, this.core);
initializeStores(vaultName);
return await this.localDatabase.initializeDatabase();
}
$$isDatabaseReady(): boolean {
_isDatabaseReady(): boolean {
return this.localDatabase != null && this.localDatabase.isReady;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.database.handleIsDatabaseReady(this._isDatabaseReady.bind(this));
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.database.handleOpenDatabase(this._openDatabase.bind(this));
}
}
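
Note that the `LiveSyncManagers` construction above passes getters rather than captured values, so the managers always see the database and settings that exist at call time, even after the database is reopened. A minimal sketch of the same technique, with hypothetical names:

```ts
interface ManagerDeps {
    readonly database: { name: string };
    readonly settings: { liveSync: boolean };
}

// `current` may be swapped out later (e.g., when the database is reopened);
// the getters keep the dependency object pointing at the live instances.
function makeDeps(current: { db: { name: string }; settings: { liveSync: boolean } }): ManagerDeps {
    return {
        get database() {
            return current.db;
        },
        get settings() {
            return current.settings;
        },
    };
}

const current = { db: { name: "vault" }, settings: { liveSync: true } };
const deps = makeDeps(current);
current.db = { name: "vault-reopened" };
console.log(deps.database.name); // "vault-reopened"
```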

View File

@@ -1,33 +1,41 @@
import { PeriodicProcessor } from "../../common/utils";
import type { LiveSyncCore } from "../../main";
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
export class ModulePeriodicProcess extends AbstractModule implements ICoreModule {
periodicSyncProcessor = new PeriodicProcessor(this.core, async () => await this.core.$$replicate());
export class ModulePeriodicProcess extends AbstractModule {
periodicSyncProcessor = new PeriodicProcessor(this.core, async () => await this.services.replication.replicate());
_disablePeriodic() {
disablePeriodic() {
this.periodicSyncProcessor?.disable();
return Promise.resolve(true);
}
_resumePeriodic() {
resumePeriodic() {
this.periodicSyncProcessor.enable(
this.settings.periodicReplication ? this.settings.periodicReplicationInterval * 1000 : 0
);
return Promise.resolve(true);
}
$allOnUnload() {
return this._disablePeriodic();
private _allOnUnload() {
return this.disablePeriodic();
}
$everyBeforeRealizeSetting(): Promise<boolean> {
return this._disablePeriodic();
private _everyBeforeRealizeSetting(): Promise<boolean> {
return this.disablePeriodic();
}
$everyBeforeSuspendProcess(): Promise<boolean> {
return this._disablePeriodic();
private _everyBeforeSuspendProcess(): Promise<boolean> {
return this.disablePeriodic();
}
$everyAfterResumeProcess(): Promise<boolean> {
return this._resumePeriodic();
private _everyAfterResumeProcess(): Promise<boolean> {
return this.resumePeriodic();
}
$everyAfterRealizeSetting(): Promise<boolean> {
return this._resumePeriodic();
private _everyAfterRealizeSetting(): Promise<boolean> {
return this.resumePeriodic();
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnUnload(this._allOnUnload.bind(this));
services.setting.handleBeforeRealiseSetting(this._everyBeforeRealizeSetting.bind(this));
services.setting.handleSettingRealised(this._everyAfterRealizeSetting.bind(this));
services.appLifecycle.handleOnSuspending(this._everyBeforeSuspendProcess.bind(this));
services.appLifecycle.handleOnResumed(this._everyAfterResumeProcess.bind(this));
}
}
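
The module above simply ties a periodic replication timer to the lifecycle: disabled on unload, on suspension, and before settings are realised; re-enabled afterwards with `periodicReplicationInterval * 1000` milliseconds, or `0` to mean "off". The class below is a simplified stand-in for `PeriodicProcessor`, showing only that arithmetic.

```ts
// Simplified periodic runner: enable(0) disables, enable(ms) restarts the timer.
class PeriodicRunnerSketch {
    private timer?: ReturnType<typeof setInterval>;
    constructor(private task: () => Promise<unknown>) {}
    enable(intervalMs: number) {
        this.disable();
        if (intervalMs <= 0) return; // 0 means periodic replication is off
        this.timer = setInterval(() => void this.task(), intervalMs);
    }
    disable() {
        if (this.timer) clearInterval(this.timer);
        this.timer = undefined;
    }
}

// Wiring mirrors the module: seconds in settings, milliseconds for the timer.
const settings = { periodicReplication: true, periodicReplicationInterval: 60 };
const runner = new PeriodicRunnerSketch(async () => {
    /* replicate() */
});
runner.enable(settings.periodicReplication ? settings.periodicReplicationInterval * 1000 : 0);
```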

View File

@@ -1,9 +1,10 @@
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
import { PouchDB } from "../../lib/src/pouchdb/pouchdb-browser";
import type { LiveSyncCore } from "../../main";
import { ExtraSuffixIndexedDB } from "../../lib/src/common/types";
export class ModulePouchDB extends AbstractModule implements ICoreModule {
$$createPouchDBInstance<T extends object>(
export class ModulePouchDB extends AbstractModule {
_createPouchDBInstance<T extends object>(
name?: string,
options?: PouchDB.Configuration.DatabaseConfiguration
): PouchDB.Database<T> {
@@ -12,8 +13,11 @@ export class ModulePouchDB extends AbstractModule implements ICoreModule {
optionPass.adapter = "indexeddb";
//@ts-ignore :missing def
optionPass.purged_infos_limit = 1;
return new PouchDB(name + "-indexeddb", optionPass);
return new PouchDB(name + ExtraSuffixIndexedDB, optionPass);
}
return new PouchDB(name, optionPass);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.database.handleCreatePouchDBInstance(this._createPouchDBInstance.bind(this));
}
}
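
The visible hunk shows only the IndexedDB branch of instance creation: the adapter is forced to `indexeddb`, `purged_infos_limit` is set, and the database name now gains the `ExtraSuffixIndexedDB` suffix instead of the old `-indexeddb` literal. The condition that selects this branch lies outside the hunk, so it is modelled below as a plain boolean; `EXTRA_SUFFIX` and the config shape are likewise stand-ins.

```ts
const EXTRA_SUFFIX = "-indexeddb"; // stand-in for ExtraSuffixIndexedDB

interface DatabaseConfigSketch {
    adapter?: string;
    [key: string]: unknown;
}

// Builds the (name, options) pair without touching PouchDB itself.
function buildDatabaseConfig(
    name: string,
    useIndexedDB: boolean,
    options: DatabaseConfigSketch = {}
): { name: string; options: DatabaseConfigSketch } {
    const optionPass = { ...options };
    if (useIndexedDB) {
        optionPass.adapter = "indexeddb";
        optionPass.purged_infos_limit = 1; // as in the diff; not part of the typed options
        return { name: name + EXTRA_SUFFIX, options: optionPass };
    }
    return { name, options: optionPass };
}
```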

View File

@@ -9,13 +9,13 @@ import {
} from "../../lib/src/common/types.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { Rebuilder } from "../interfaces/DatabaseRebuilder.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import type { LiveSyncCouchDBReplicator } from "../../lib/src/replication/couchdb/LiveSyncReplicator.ts";
import { fetchAllUsedChunks } from "@/lib/src/pouchdb/chunks.ts";
import { EVENT_DATABASE_REBUILT, eventHub } from "src/common/events.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebuilder {
$everyOnload(): Promise<boolean> {
export class ModuleRebuilder extends AbstractModule implements Rebuilder {
private _everyOnload(): Promise<boolean> {
this.core.rebuilder = this;
return Promise.resolve(true);
}
@@ -36,6 +36,14 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
}
}
async informOptionalFeatures() {
await this.core.services.UI.showMarkdownDialog(
"All optional features are disabled",
`Customisation Sync and Hidden File Sync will both be disabled.
Please enable them from the settings screen after setup is complete.`,
["OK"]
);
}
async askUsingOptionalFeature(opt: { enableFetch?: boolean; enableOverwrite?: boolean }) {
if (
(await this.core.confirm.askYesNoDialog(
@@ -43,47 +51,49 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
{ title: "Enable extra features", defaultOption: "No", timeout: 15 }
)) == "yes"
) {
await this.core.$allAskUsingOptionalSyncFeature(opt);
await this.services.setting.suggestOptionalFeatures(opt);
}
}
async rebuildRemote() {
await this.core.$allSuspendExtraSync();
await this.services.setting.suspendExtraSync();
this.core.settings.isConfigured = true;
await this.core.$$realizeSettingSyncMode();
await this.core.$$markRemoteLocked();
await this.core.$$tryResetRemoteDatabase();
await this.core.$$markRemoteLocked();
await this.services.setting.realiseSetting();
await this.services.remote.markLocked();
await this.services.remote.tryResetDatabase();
await this.services.remote.markLocked();
await delay(500);
await this.askUsingOptionalFeature({ enableOverwrite: true });
// await this.askUsingOptionalFeature({ enableOverwrite: true });
await delay(1000);
await this.core.$$replicateAllToServer(true);
await this.services.remote.replicateAllToRemote(true);
await delay(1000);
await this.core.$$replicateAllToServer(true, true);
await this.services.remote.replicateAllToRemote(true, true);
await this.informOptionalFeatures();
}
$rebuildRemote(): Promise<void> {
return this.rebuildRemote();
}
async rebuildEverything() {
await this.core.$allSuspendExtraSync();
await this.services.setting.suspendExtraSync();
await this.askUseNewAdapter();
this.core.settings.isConfigured = true;
await this.core.$$realizeSettingSyncMode();
await this.services.setting.realiseSetting();
await this.resetLocalDatabase();
await delay(1000);
await this.core.$$initializeDatabase(true);
await this.core.$$markRemoteLocked();
await this.core.$$tryResetRemoteDatabase();
await this.core.$$markRemoteLocked();
await this.services.databaseEvents.initialiseDatabase(true, true, true);
await this.services.remote.markLocked();
await this.services.remote.tryResetDatabase();
await this.services.remote.markLocked();
await delay(500);
// We do not have any other devices' data, so we do not need to ask for overwriting.
await this.askUsingOptionalFeature({ enableOverwrite: false });
// await this.askUsingOptionalFeature({ enableOverwrite: false });
await delay(1000);
await this.core.$$replicateAllToServer(true);
await this.services.remote.replicateAllToRemote(true);
await delay(1000);
await this.core.$$replicateAllToServer(true, true);
await this.services.remote.replicateAllToRemote(true, true);
await this.informOptionalFeatures();
}
$rebuildEverything(): Promise<void> {
@@ -101,7 +111,7 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
this._log("Could not create red_flag_rebuild.md", LOG_LEVEL_NOTICE);
this._log(ex, LOG_LEVEL_VERBOSE);
}
this.core.$$performRestart();
this.services.appLifecycle.performRestart();
}
async scheduleFetch(): Promise<void> {
try {
@@ -110,20 +120,20 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
this._log("Could not create red_flag_fetch.md", LOG_LEVEL_NOTICE);
this._log(ex, LOG_LEVEL_VERBOSE);
}
this.core.$$performRestart();
this.services.appLifecycle.performRestart();
}
async $$tryResetRemoteDatabase(): Promise<void> {
private async _tryResetRemoteDatabase(): Promise<void> {
await this.core.replicator.tryResetRemoteDatabase(this.settings);
}
async $$tryCreateRemoteDatabase(): Promise<void> {
private async _tryCreateRemoteDatabase(): Promise<void> {
await this.core.replicator.tryCreateRemoteDatabase(this.settings);
}
async $$resetLocalDatabase(): Promise<void> {
private async _resetLocalDatabase(): Promise<boolean> {
this.core.storageAccess.clearTouched();
await this.localDatabase.resetDatabase();
return await this.localDatabase.resetDatabase();
}
async suspendAllSync() {
@@ -134,7 +144,7 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
this.core.settings.syncOnStart = false;
this.core.settings.syncOnFileOpen = false;
this.core.settings.syncAfterMerge = false;
await this.core.$allSuspendExtraSync();
await this.services.setting.suspendExtraSync();
}
async suspendReflectingDatabase() {
if (this.core.settings.doNotSuspendOnFetching) return;
@@ -153,8 +163,8 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
this._log(`Database and storage reflection has been resumed!`, LOG_LEVEL_NOTICE);
this.core.settings.suspendParseReplicationResult = false;
this.core.settings.suspendFileWatching = false;
await this.core.$$performFullScan(true);
await this.core.$everyBeforeReplicate(false); //TODO: Check actual need of this.
await this.services.vault.scanVault(true);
await this.services.replication.onBeforeReplicate(false); //TODO: Check actual need of this.
await this.core.saveSettings();
}
async askUseNewAdapter() {
@@ -177,36 +187,38 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
}
}
async fetchLocal(makeLocalChunkBeforeSync?: boolean, preventMakeLocalFilesBeforeSync?: boolean) {
await this.core.$allSuspendExtraSync();
await this.services.setting.suspendExtraSync();
await this.askUseNewAdapter();
this.core.settings.isConfigured = true;
await this.suspendReflectingDatabase();
await this.core.$$realizeSettingSyncMode();
await this.services.setting.realiseSetting();
await this.resetLocalDatabase();
await delay(1000);
await this.core.$$openDatabase();
await this.services.database.openDatabase();
// this.core.isReady = true;
this.core.$$markIsReady();
this.services.appLifecycle.markIsReady();
if (makeLocalChunkBeforeSync) {
await this.core.fileHandler.createAllChunks(true);
} else if (!preventMakeLocalFilesBeforeSync) {
await this.core.$$initializeDatabase(true);
await this.services.databaseEvents.initialiseDatabase(true, true, true);
} else {
// Do not create local file entries before sync (i.e., use the remote information)
}
await this.core.$$markRemoteResolved();
await this.services.remote.markResolved();
await delay(500);
await this.core.$$replicateAllFromServer(true);
await this.services.remote.replicateAllFromRemote(true);
await delay(1000);
await this.core.$$replicateAllFromServer(true);
await this.services.remote.replicateAllFromRemote(true);
await this.resumeReflectingDatabase();
await this.askUsingOptionalFeature({ enableFetch: true });
await this.informOptionalFeatures();
// We no longer enable this here:
// await this.askUsingOptionalFeature({ enableFetch: true });
}
async fetchLocalWithRebuild() {
return await this.fetchLocal(true);
}
async $allSuspendAllSync(): Promise<boolean> {
private async _allSuspendAllSync(): Promise<boolean> {
await this.suspendAllSync();
return true;
}
@@ -214,11 +226,11 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
async resetLocalDatabase() {
if (this.core.settings.isConfigured && this.core.settings.additionalSuffixOfDatabaseName == "") {
// Discard the non-suffixed database
await this.core.$$resetLocalDatabase();
await this.services.database.resetDatabase();
}
const suffix = (await this.core.$anyGetAppId()) || "";
const suffix = this.services.API.getAppID() || "";
this.core.settings.additionalSuffixOfDatabaseName = suffix;
await this.core.$$resetLocalDatabase();
await this.services.database.resetDatabase();
eventHub.emitEvent(EVENT_DATABASE_REBUILT);
}
async fetchRemoteChunks() {
@@ -228,10 +240,10 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
this.core.settings.remoteType == REMOTE_COUCHDB
) {
this._log(`Fetching chunks`, LOG_LEVEL_NOTICE);
const replicator = this.core.$$getReplicator() as LiveSyncCouchDBReplicator;
const replicator = this.services.replicator.getActiveReplicator() as LiveSyncCouchDBReplicator;
const remoteDB = await replicator.connectRemoteCouchDBWithSetting(
this.settings,
this.core.$$isMobile(),
this.services.API.isMobile(),
true
);
if (typeof remoteDB == "string") {
@@ -254,8 +266,15 @@ export class ModuleRebuilder extends AbstractModule implements ICoreModule, Rebu
LOG_LEVEL_NOTICE,
"resolveAllConflictedFilesByNewerOnes"
);
await this.core.$anyResolveConflictByNewest(file);
await this.services.conflict.resolveByNewest(file);
}
this._log(`Done!`, LOG_LEVEL_NOTICE, "resolveAllConflictedFilesByNewerOnes");
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
services.database.handleResetDatabase(this._resetLocalDatabase.bind(this));
services.remote.handleTryResetDatabase(this._tryResetRemoteDatabase.bind(this));
services.remote.handleTryCreateDatabase(this._tryCreateRemoteDatabase.bind(this));
services.setting.handleSuspendAllSync(this._allSuspendAllSync.bind(this));
}
}
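
`resetLocalDatabase` above has a subtle ordering: when the currently configured database has no suffix, that legacy database is discarded first, then the suffix is switched to the app ID and the reset runs again before the rebuild event fires. A sketch of that ordering with simplified dependencies (the `RebuildDeps` shape is hypothetical):

```ts
interface RebuildDeps {
    settings: { isConfigured: boolean; additionalSuffixOfDatabaseName: string };
    getAppID(): string | undefined;
    resetDatabase(): Promise<void>;
    onRebuilt(): void; // EVENT_DATABASE_REBUILT in the real module
}

async function resetLocalDatabaseSketch(deps: RebuildDeps): Promise<void> {
    if (deps.settings.isConfigured && deps.settings.additionalSuffixOfDatabaseName === "") {
        // Discard the non-suffixed (legacy) database first.
        await deps.resetDatabase();
    }
    deps.settings.additionalSuffixOfDatabaseName = deps.getAppID() || "";
    await deps.resetDatabase();
    deps.onRebuilt();
}
```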

View File

@@ -1,8 +1,15 @@
import { fireAndForget, yieldMicrotask } from "octagonal-wheels/promises";
import type { LiveSyncLocalDB } from "../../lib/src/pouchdb/LiveSyncLocalDB";
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
import { Logger, LOG_LEVEL_NOTICE, LOG_LEVEL_INFO, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import {
Logger,
LOG_LEVEL_NOTICE,
LOG_LEVEL_INFO,
LOG_LEVEL_VERBOSE,
LEVEL_NOTICE,
LEVEL_INFO,
type LOG_LEVEL,
} from "octagonal-wheels/common/logger";
import { isLockAcquired, shareRunningResult, skipIfDuplicated } from "octagonal-wheels/concurrency/lock";
import { balanceChunkPurgedDBs } from "@/lib/src/pouchdb/chunks";
import { purgeUnreferencedChunks } from "@/lib/src/pouchdb/chunks";
@@ -17,6 +24,7 @@ import {
type EntryLeaf,
type LoadedEntry,
type MetaEntry,
type RemoteType,
} from "../../lib/src/common/types";
import { QueueProcessor } from "octagonal-wheels/concurrency/processor";
import {
@@ -28,20 +36,37 @@ import {
updatePreviousExecutionTime,
} from "../../common/utils";
import { isAnyNote } from "../../lib/src/common/utils";
import { EVENT_FILE_SAVED, EVENT_SETTING_SAVED, eventHub } from "../../common/events";
import { EVENT_FILE_SAVED, EVENT_ON_UNRESOLVED_ERROR, EVENT_SETTING_SAVED, eventHub } from "../../common/events";
import type { LiveSyncAbstractReplicator } from "../../lib/src/replication/LiveSyncAbstractReplicator";
import { globalSlipBoard } from "../../lib/src/bureau/bureau";
import { $msg } from "../../lib/src/common/i18n";
import { clearHandlers } from "../../lib/src/replication/SyncParamsHandler";
import type { LiveSyncCore } from "../../main";
const KEY_REPLICATION_ON_EVENT = "replicationOnEvent";
const REPLICATION_ON_EVENT_FORECASTED_TIME = 5000;
export class ModuleReplicator extends AbstractModule implements ICoreModule {
_replicatorType?: string;
$everyOnloadAfterLoadSettings(): Promise<boolean> {
export class ModuleReplicator extends AbstractModule {
_replicatorType?: RemoteType;
_previousErrors = new Set<string>();
showError(msg: string, max_log_level: LOG_LEVEL = LEVEL_NOTICE) {
const level = this._previousErrors.has(msg) ? LEVEL_INFO : max_log_level;
this._log(msg, level);
if (!this._previousErrors.has(msg)) {
this._previousErrors.add(msg);
eventHub.emitEvent(EVENT_ON_UNRESOLVED_ERROR);
}
}
clearErrors() {
this._previousErrors.clear();
eventHub.emitEvent(EVENT_ON_UNRESOLVED_ERROR);
}
private _everyOnloadAfterLoadSettings(): Promise<boolean> {
eventHub.onEvent(EVENT_FILE_SAVED, () => {
if (this.settings.syncOnSave && !this.core.$$isSuspended()) {
scheduleTask("perform-replicate-after-save", 250, () => this.core.$$replicateByEvent());
if (this.settings.syncOnSave && !this.core.services.appLifecycle.isSuspended()) {
scheduleTask("perform-replicate-after-save", 250, () => this.services.replication.replicateByEvent());
}
});
eventHub.onEvent(EVENT_SETTING_SAVED, (setting) => {
@@ -54,9 +79,9 @@ export class ModuleReplicator extends AbstractModule implements ICoreModule {
}
async setReplicator() {
const replicator = await this.core.$anyNewReplicator();
const replicator = await this.services.replicator.getNewReplicator();
if (!replicator) {
this._log($msg("Replicator.Message.InitialiseFatalError"), LOG_LEVEL_NOTICE);
this.showError($msg("Replicator.Message.InitialiseFatalError"), LOG_LEVEL_NOTICE);
return false;
}
if (this.core.replicator) {
@@ -66,37 +91,49 @@ export class ModuleReplicator extends AbstractModule implements ICoreModule {
this.core.replicator = replicator;
this._replicatorType = this.settings.remoteType;
await yieldMicrotask();
// Clear any existing sync parameter handlers (this clears the key-deriving salt).
clearHandlers();
return true;
}
$$getReplicator(): LiveSyncAbstractReplicator {
_getReplicator(): LiveSyncAbstractReplicator {
return this.core.replicator;
}
$everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
_everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
return this.setReplicator();
}
$everyOnResetDatabase(db: LiveSyncLocalDB): Promise<boolean> {
_everyOnResetDatabase(db: LiveSyncLocalDB): Promise<boolean> {
return this.setReplicator();
}
async ensureReplicatorPBKDF2Salt(showMessage: boolean = false): Promise<boolean> {
// Checking salt
const replicator = this.core.$$getReplicator();
const replicator = this.services.replicator.getActiveReplicator();
if (!replicator) {
this.showError($msg("Replicator.Message.InitialiseFatalError"), LOG_LEVEL_NOTICE);
return false;
}
return await replicator.ensurePBKDF2Salt(this.settings, showMessage, true);
}
async $everyBeforeReplicate(showMessage: boolean): Promise<boolean> {
async _everyBeforeReplicate(showMessage: boolean): Promise<boolean> {
// Checking salt
if (!(await this.ensureReplicatorPBKDF2Salt(showMessage))) {
Logger("Failed to ensure PBKDF2 salt for replication.", LOG_LEVEL_NOTICE);
if (!this.core.managers.networkManager.isOnline) {
this.showError("Network is offline", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO);
return false;
}
// showMessage is passed as false because the error is shown here. (And it is a fatal error; there is no way to hide it.)
if (!(await this.ensureReplicatorPBKDF2Salt(false))) {
this.showError("Failed to initialise the encryption key, preventing replication.");
return false;
}
await this.loadQueuedFiles();
this.clearErrors();
return true;
}
async $$replicate(showMessage: boolean = false): Promise<boolean | void> {
private async _replicate(showMessage: boolean = false): Promise<boolean | void> {
try {
updatePreviousExecutionTime(KEY_REPLICATION_ON_EVENT, REPLICATION_ON_EVENT_FORECASTED_TIME);
return await this.$$_replicate(showMessage);
@@ -133,11 +170,11 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
await this.core.rebuilder.$performRebuildDB("localOnly");
}
if (ret == CHOICE_CLEAN) {
const replicator = this.core.$$getReplicator();
const replicator = this.services.replicator.getActiveReplicator();
if (!(replicator instanceof LiveSyncCouchDBReplicator)) return;
const remoteDB = await replicator.connectRemoteCouchDBWithSetting(
this.settings,
this.core.$$isMobile(),
this.services.API.isMobile(),
true
);
if (typeof remoteDB == "string") {
@@ -146,13 +183,13 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
}
await purgeUnreferencedChunks(this.localDatabase.localDatabase, false);
this.localDatabase.hashCaches.clear();
this.localDatabase.clearCaches();
// Perform the synchronisation once.
if (await this.core.replicator.openReplication(this.settings, false, showMessage, true)) {
await balanceChunkPurgedDBs(this.localDatabase.localDatabase, remoteDB.db);
await purgeUnreferencedChunks(this.localDatabase.localDatabase, false);
this.localDatabase.hashCaches.clear();
await this.core.$$getReplicator().markRemoteResolved(this.settings);
this.localDatabase.clearCaches();
await this.services.replicator.getActiveReplicator()?.markRemoteResolved(this.settings);
Logger("The local database has been cleaned up.", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO);
} else {
Logger(
@@ -163,31 +200,49 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
}
});
}
async $$_replicate(showMessage: boolean = false): Promise<boolean | void> {
//--?
if (!this.core.$$isReady()) return;
async _canReplicate(showMessage: boolean = false): Promise<boolean> {
if (!this.services.appLifecycle.isReady()) {
Logger(`Not ready`);
return false;
}
if (isLockAcquired("cleanup")) {
Logger($msg("Replicator.Message.Cleaned"), LOG_LEVEL_NOTICE);
return;
return false;
}
if (this.settings.versionUpFlash != "") {
Logger($msg("Replicator.Message.VersionUpFlash"), LOG_LEVEL_NOTICE);
return;
}
if (!(await this.core.$everyCommitPendingFileEvent())) {
Logger($msg("Replicator.Message.Pending"), LOG_LEVEL_NOTICE);
return false;
}
if (!(await this.core.$everyBeforeReplicate(showMessage))) {
Logger($msg("Replicator.Message.SomeModuleFailed"), LOG_LEVEL_NOTICE);
if (!(await this.services.fileProcessing.commitPendingFileEvents())) {
this.showError($msg("Replicator.Message.Pending"), LOG_LEVEL_NOTICE);
return false;
}
if (!this.core.managers.networkManager.isOnline) {
this.showError("Network is offline", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO);
return false;
}
if (!(await this.services.replication.onBeforeReplicate(showMessage))) {
this.showError($msg("Replicator.Message.SomeModuleFailed"), LOG_LEVEL_NOTICE);
return false;
}
this.clearErrors();
return true;
}
async $$_replicate(showMessage: boolean = false): Promise<boolean | void> {
const checkBeforeReplicate = await this.services.replication.isReplicationReady(showMessage);
if (!checkBeforeReplicate) return false;
//<-- Here could be a module.
const ret = await this.core.replicator.openReplication(this.settings, false, showMessage, false);
if (!ret) {
if (this.core.replicator.tweakSettingsMismatched && this.core.replicator.preferredTweakValue) {
await this.core.$$askResolvingMismatchedTweaks(this.core.replicator.preferredTweakValue);
await this.services.tweakValue.askResolvingMismatched(this.core.replicator.preferredTweakValue);
} else {
if (this.core.replicator?.remoteLockedAndDeviceNotAccepted) {
if (this.core.replicator.remoteCleaned && this.settings.useIndexedDBAdapter) {
@@ -209,7 +264,7 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
if (ret == CHOICE_FETCH) {
this._log($msg("Replicator.Dialogue.Locked.Message.Fetch"), LOG_LEVEL_NOTICE);
await this.core.rebuilder.scheduleFetch();
this.core.$$scheduleAppReload();
this.services.appLifecycle.scheduleRestart();
return;
} else if (ret == CHOICE_UNLOCK) {
await this.core.replicator.markRemoteResolved(this.settings);
@@ -223,16 +278,16 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
return ret;
}
async $$replicateByEvent(): Promise<boolean | void> {
private async _replicateByEvent(): Promise<boolean | void> {
const least = this.settings.syncMinimumInterval;
if (least > 0) {
return rateLimitedSharedExecution(KEY_REPLICATION_ON_EVENT, least, async () => {
return await this.$$replicate();
return await this.services.replication.replicate();
});
}
return await shareRunningResult(`replication`, () => this.core.$$replicate());
return await shareRunningResult(`replication`, () => this.services.replication.replicate());
}
$$parseReplicationResult(docs: Array<PouchDB.Core.ExistingDocument<EntryDoc>>): void {
_parseReplicationResult(docs: Array<PouchDB.Core.ExistingDocument<EntryDoc>>): void {
if (this.settings.suspendParseReplicationResult && !this.replicationResultProcessor.isSuspended) {
this.replicationResultProcessor.suspend();
}
@@ -306,10 +361,10 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
const change = docs[0];
if (!change) return;
if (isChunk(change._id)) {
globalSlipBoard.submit("read-chunk", change._id, change as EntryLeaf);
this.localDatabase.onNewLeaf(change as EntryLeaf);
return;
}
if (await this.core.$anyModuleParsedReplicationResultItem(change)) return;
if (await this.services.replication.processVirtualDocument(change)) return;
// Does any add-on need this item?
// for (const proc of this.core.addOns) {
// if (await proc.parseReplicationResultItem(change)) {
@@ -334,7 +389,7 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
}
if (isAnyNote(change)) {
const docPath = getPath(change);
if (!(await this.core.$$isTargetFile(docPath))) {
if (!(await this.services.vault.isTargetFile(docPath))) {
Logger(`Skipped: ${docPath}`, LOG_LEVEL_VERBOSE);
return;
}
@@ -342,7 +397,7 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
Logger(`Processing scheduled: ${docPath}`, LOG_LEVEL_INFO);
}
const size = change.size;
if (this.core.$$isFileSizeExceeded(size)) {
if (this.services.vault.isFileSizeTooLarge(size)) {
Logger(
`Processing ${docPath} has been skipped due to file size exceeding the limit`,
LOG_LEVEL_NOTICE
@@ -370,11 +425,56 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
this.saveQueuedFiles();
});
async checkIsChangeRequiredForDatabaseProcessing(dbDoc: LoadedEntry): Promise<boolean> {
const path = getPath(dbDoc);
try {
const savedDoc = await this.localDatabase.getRaw<LoadedEntry>(dbDoc._id, {
conflicts: true,
revs_info: true,
});
const newRev = dbDoc._rev ?? "";
const latestRev = savedDoc._rev ?? "";
const revisions = savedDoc._revs_info?.map((e) => e.rev) ?? [];
if (savedDoc._conflicts && savedDoc._conflicts.length > 0) {
// There are conflicts, so we have to process it.
return true;
}
if (newRev == latestRev) {
// The latest revision. We need to process it.
return true;
}
const index = revisions.indexOf(newRev);
if (index >= 0) {
// the revision has been inserted before.
return false; // Already processed.
}
return true; // This mostly should not happen, but we have to process it just in case.
} catch (e: any) {
if ("status" in e && e.status == 404) {
return true;
// Not existing, so we have to process it.
} else {
Logger(
`Failed to get existing document for ${path} (${dbDoc._id.substring(0, 8)}, ${dbDoc._rev?.substring(0, 10)}) `,
LOG_LEVEL_NOTICE
);
Logger(e, LOG_LEVEL_VERBOSE);
return true;
}
}
return true;
}
databaseQueuedProcessor = new QueueProcessor(
async (docs: EntryBody[]) => {
const dbDoc = docs[0] as LoadedEntry; // It has no `data`
const path = getPath(dbDoc);
// If the document already exists with any revision, confirm that we have to process it.
const isRequired = await this.checkIsChangeRequiredForDatabaseProcessing(dbDoc);
if (!isRequired) {
Logger(`Skipped (Not latest): ${path} (${dbDoc._id.substring(0, 8)})`, LOG_LEVEL_VERBOSE);
return;
}
// If `Read chunks online` is disabled, chunks should have been transferred before this point.
// However, in some cases, chunks arrive after that. So, if any chunks are missing, we have to wait for them.
const doc = await this.localDatabase.getDBEntryFromMeta({ ...dbDoc }, false, true);
@@ -386,7 +486,7 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
return;
}
if (await this.core.$anyProcessOptionalSyncFiles(dbDoc)) {
if (await this.services.replication.processOptionalSynchroniseResult(dbDoc)) {
// Already processed
} else if (isValidPath(getPath(doc))) {
this.storageApplyingProcessor.enqueue(doc as MetaEntry);
@@ -413,7 +513,7 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
storageApplyingProcessor = new QueueProcessor(
async (docs: MetaEntry[]) => {
const entry = docs[0];
await this.core.$anyProcessReplicatedDoc(entry);
await this.services.replication.processSynchroniseResult(entry);
return;
},
{
@@ -431,17 +531,17 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
})
.startPipeline();
$everyBeforeSuspendProcess(): Promise<boolean> {
this.core.replicator.closeReplication();
_everyBeforeSuspendProcess(): Promise<boolean> {
this.core.replicator?.closeReplication();
return Promise.resolve(true);
}
async $$replicateAllToServer(
private async _replicateAllToServer(
showingNotice: boolean = false,
sendChunksInBulkDisabled: boolean = false
): Promise<boolean> {
if (!this.core.$$isReady()) return false;
if (!(await this.core.$everyBeforeReplicate(showingNotice))) {
if (!this.services.appLifecycle.isReady()) return false;
if (!(await this.services.replication.onBeforeReplicate(showingNotice))) {
Logger($msg("Replicator.Message.SomeModuleFailed"), LOG_LEVEL_NOTICE);
return false;
}
@@ -459,16 +559,36 @@ Even if you choose to clean up, you will see this option again if you exit Obsid
}
const ret = await this.core.replicator.replicateAllToServer(this.settings, showingNotice);
if (ret) return true;
const checkResult = await this.core.$anyAfterConnectCheckFailed();
if (checkResult == "CHECKAGAIN") return await this.core.$$replicateAllToServer(showingNotice);
const checkResult = await this.services.replication.checkConnectionFailure();
if (checkResult == "CHECKAGAIN") return await this.services.remote.replicateAllToRemote(showingNotice);
return !checkResult;
}
async $$replicateAllFromServer(showingNotice: boolean = false): Promise<boolean> {
if (!this.core.$$isReady()) return false;
async _replicateAllFromServer(showingNotice: boolean = false): Promise<boolean> {
if (!this.services.appLifecycle.isReady()) return false;
const ret = await this.core.replicator.replicateAllFromServer(this.settings, showingNotice);
if (ret) return true;
const checkResult = await this.core.$anyAfterConnectCheckFailed();
if (checkResult == "CHECKAGAIN") return await this.core.$$replicateAllFromServer(showingNotice);
const checkResult = await this.services.replication.checkConnectionFailure();
if (checkResult == "CHECKAGAIN") return await this.services.remote.replicateAllFromRemote(showingNotice);
return !checkResult;
}
private _reportUnresolvedMessages(): Promise<string[]> {
return Promise.resolve([...this._previousErrors]);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.replicator.handleGetActiveReplicator(this._getReplicator.bind(this));
services.databaseEvents.handleOnDatabaseInitialisation(this._everyOnInitializeDatabase.bind(this));
services.databaseEvents.handleOnResetDatabase(this._everyOnResetDatabase.bind(this));
services.appLifecycle.handleOnSettingLoaded(this._everyOnloadAfterLoadSettings.bind(this));
services.replication.handleParseSynchroniseResult(this._parseReplicationResult.bind(this));
services.appLifecycle.handleOnSuspending(this._everyBeforeSuspendProcess.bind(this));
services.replication.handleBeforeReplicate(this._everyBeforeReplicate.bind(this));
services.replication.handleIsReplicationReady(this._canReplicate.bind(this));
services.replication.handleReplicate(this._replicate.bind(this));
services.replication.handleReplicateByEvent(this._replicateByEvent.bind(this));
services.remote.handleReplicateAllToRemote(this._replicateAllToServer.bind(this));
services.remote.handleReplicateAllFromRemote(this._replicateAllFromServer.bind(this));
services.appLifecycle.reportUnresolvedMessages(this._reportUnresolvedMessages.bind(this));
}
}
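The hunks above show the new wiring convention: the `$$`/`$every`-prefixed overrides become private, underscore-prefixed methods, and `onBindFunction` registers them on the injectable service hub. A minimal sketch of that pattern follows; it is not part of the diff. `AbstractModule`, `LiveSyncCore` and `services.replication.handleReplicate` come from the hunks above, while the module body and the handler signature are assumptions:

import { AbstractModule } from "../AbstractModule";
import type { LiveSyncCore } from "../../main";

export class ExampleModule extends AbstractModule {
    // Plain private implementation instead of overriding a `$$`-prefixed core method.
    private _replicate(showMessage: boolean): Promise<boolean | void> {
        // ... replication work would go here ...
        return Promise.resolve(true);
    }
    onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
        // The hub later dispatches services.replication.replicate(...) to this handler.
        services.replication.handleReplicate(this._replicate.bind(this));
    }
}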

View File

@@ -3,30 +3,40 @@ import { REMOTE_MINIO, REMOTE_P2P, type RemoteDBSettings } from "../../lib/src/c
import { LiveSyncCouchDBReplicator } from "../../lib/src/replication/couchdb/LiveSyncReplicator";
import type { LiveSyncAbstractReplicator } from "../../lib/src/replication/LiveSyncAbstractReplicator";
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
import type { LiveSyncCore } from "../../main";
export class ModuleReplicatorCouchDB extends AbstractModule implements ICoreModule {
$anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator> {
export class ModuleReplicatorCouchDB extends AbstractModule {
_anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator | false> {
const settings = { ...this.settings, ...settingOverride };
// If new remote types are added, handle them here. Do not use `REMOTE_COUCHDB` directly, as a safety valve.
if (settings.remoteType == REMOTE_MINIO || settings.remoteType == REMOTE_P2P) {
return undefined!;
return Promise.resolve(false);
}
return Promise.resolve(new LiveSyncCouchDBReplicator(this.core));
}
$everyAfterResumeProcess(): Promise<boolean> {
_everyAfterResumeProcess(): Promise<boolean> {
if (this.services.appLifecycle.isSuspended()) return Promise.resolve(true);
if (!this.services.appLifecycle.isReady()) return Promise.resolve(true);
if (this.settings.remoteType != REMOTE_MINIO && this.settings.remoteType != REMOTE_P2P) {
// If LiveSync enabled, open replication
if (this.settings.liveSync) {
fireAndForget(() => this.core.replicator.openReplication(this.settings, true, false, false));
}
// If sync on start enabled, open replication
if (!this.settings.liveSync && this.settings.syncOnStart) {
// Possibly OK, as it only shares the result
fireAndForget(() => this.core.replicator.openReplication(this.settings, false, false, false));
const LiveSyncEnabled = this.settings.liveSync;
const continuous = LiveSyncEnabled;
const eventualOnStart = !LiveSyncEnabled && this.settings.syncOnStart;
// If LiveSync or sync-on-start is enabled, open replication
if (LiveSyncEnabled || eventualOnStart) {
// And note that we do not open the conflict detection dialogue directly during this process.
// This should be raised explicitly if needed.
fireAndForget(async () => {
const canReplicate = await this.services.replication.isReplicationReady(false);
if (!canReplicate) return;
void this.core.replicator.openReplication(this.settings, continuous, false, false);
});
}
}
return Promise.resolve(true);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.replicator.handleGetNewReplicator(this._anyNewReplicator.bind(this));
services.appLifecycle.handleOnResumed(this._everyAfterResumeProcess.bind(this));
}
}
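Returning `false` instead of `undefined!` lets callers of the replicator factory detect an unsupported remote type safely. A hedged sketch of consuming code; `getNewReplicator` and `tryConnectRemote` appear in the hunks of this compare, while the surrounding control flow is illustrative:

const replicator = await services.replicator.getNewReplicator(settings);
if (!replicator) {
    // No registered module accepted this remoteType, so report it instead of crashing on `undefined!`.
    Logger("The configured remote type is not supported.", LOG_LEVEL_NOTICE);
    return false;
}
if (!(await replicator.tryConnectRemote(settings))) {
    return false;
}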

View File

@@ -1,15 +1,18 @@
import { REMOTE_MINIO, type RemoteDBSettings } from "../../lib/src/common/types";
import { LiveSyncJournalReplicator } from "../../lib/src/replication/journal/LiveSyncJournalReplicator";
import type { LiveSyncAbstractReplicator } from "../../lib/src/replication/LiveSyncAbstractReplicator";
import type { LiveSyncCore } from "../../main";
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
export class ModuleReplicatorMinIO extends AbstractModule implements ICoreModule {
$anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator> {
export class ModuleReplicatorMinIO extends AbstractModule {
_anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator | false> {
const settings = { ...this.settings, ...settingOverride };
if (settings.remoteType == REMOTE_MINIO) {
return Promise.resolve(new LiveSyncJournalReplicator(this.core));
}
return undefined!;
return Promise.resolve(false);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.replicator.handleGetNewReplicator(this._anyNewReplicator.bind(this));
}
}

View File

@@ -1,18 +1,18 @@
import { REMOTE_P2P, type RemoteDBSettings } from "../../lib/src/common/types";
import type { LiveSyncAbstractReplicator } from "../../lib/src/replication/LiveSyncAbstractReplicator";
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
import { LiveSyncTrysteroReplicator } from "../../lib/src/replication/trystero/LiveSyncTrysteroReplicator";
import type { LiveSyncCore } from "../../main";
export class ModuleReplicatorP2P extends AbstractModule implements ICoreModule {
$anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator> {
export class ModuleReplicatorP2P extends AbstractModule {
_anyNewReplicator(settingOverride: Partial<RemoteDBSettings> = {}): Promise<LiveSyncAbstractReplicator | false> {
const settings = { ...this.settings, ...settingOverride };
if (settings.remoteType == REMOTE_P2P) {
return Promise.resolve(new LiveSyncTrysteroReplicator(this.core));
}
return undefined!;
return Promise.resolve(false);
}
$everyAfterResumeProcess(): Promise<boolean> {
_everyAfterResumeProcess(): Promise<boolean> {
if (this.settings.remoteType == REMOTE_P2P) {
// // If LiveSync enabled, open replication
// if (this.settings.liveSync) {
@@ -27,4 +27,8 @@ export class ModuleReplicatorP2P extends AbstractModule implements ICoreModule {
return Promise.resolve(true);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.replicator.handleGetNewReplicator(this._anyNewReplicator.bind(this));
services.appLifecycle.handleOnResumed(this._everyAfterResumeProcess.bind(this));
}
}

View File

@@ -18,14 +18,15 @@ import {
} from "../../lib/src/common/types";
import { addPrefix, isAcceptedAll } from "../../lib/src/string_and_binary/path";
import { AbstractModule } from "../AbstractModule";
import type { ICoreModule } from "../ModuleTypes";
import { EVENT_REQUEST_RELOAD_SETTING_TAB, EVENT_SETTING_SAVED, eventHub } from "../../common/events";
import { isDirty } from "../../lib/src/common/utils";
export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
import type { LiveSyncCore } from "../../main";
export class ModuleTargetFilter extends AbstractModule {
reloadIgnoreFiles() {
this.ignoreFiles = this.settings.ignoreFiles.split(",").map((e) => e.trim());
}
$everyOnload(): Promise<boolean> {
private _everyOnload(): Promise<boolean> {
this.reloadIgnoreFiles();
eventHub.onEvent(EVENT_SETTING_SAVED, (evt: ObsidianLiveSyncSettings) => {
this.reloadIgnoreFiles();
});
@@ -35,7 +36,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
return Promise.resolve(true);
}
$$id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
_id2path(id: DocumentID, entry?: EntryHasPath, stripPrefix?: boolean): FilePathWithPrefix {
const tempId = id2path(id, entry);
if (stripPrefix && isInternalMetadata(tempId)) {
const out = stripInternalMetadataPrefix(tempId);
@@ -43,7 +44,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
}
return tempId;
}
async $$path2id(filename: FilePathWithPrefix | FilePath, prefix?: string): Promise<DocumentID> {
async _path2id(filename: FilePathWithPrefix | FilePath, prefix?: string): Promise<DocumentID> {
const destPath = addPrefix(filename, prefix ?? "");
return await path2id(
destPath,
@@ -52,7 +53,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
);
}
$$isFileSizeExceeded(size: number) {
private _isFileSizeExceeded(size: number) {
if (this.settings.syncMaxSizeInMB > 0 && size > 0) {
if (this.settings.syncMaxSizeInMB * 1024 * 1024 < size) {
return true;
@@ -61,7 +62,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
return false;
}
$$markFileListPossiblyChanged(): void {
_markFileListPossiblyChanged(): void {
this.totalFileEventCount++;
}
totalFileEventCount = 0;
@@ -72,7 +73,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
return false;
}
async $$isTargetFile(file: string | UXFileInfoStub, keepFileCheckList = false) {
private async _isTargetFile(file: string | UXFileInfoStub, keepFileCheckList = false) {
const fileCount = useMemo<Record<string, number>>(
{
key: "fileCount", // forceUpdate: !keepFileCheckList,
@@ -109,7 +110,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
const filepath = getStoragePathFromUXFileInfo(file);
const lc = filepath.toLowerCase();
if (this.core.$$shouldCheckCaseInsensitive()) {
if (this.services.setting.shouldCheckCaseInsensitively()) {
if (lc in fileCount && fileCount[lc] > 1) {
return false;
}
@@ -120,7 +121,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
// We must reload ignore files because they have changed.
await this.readIgnoreFile(filepath);
}
if (await this.core.$$isIgnoredByIgnoreFiles(file)) {
if (await this.services.vault.isIgnoredByIgnoreFile(file)) {
return false;
}
}
@@ -132,12 +133,19 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
ignoreFiles = [] as string[];
async readIgnoreFile(path: string) {
try {
const file = await this.core.storageAccess.readFileText(path);
// this._log(`[ignore]Reading ignore file: ${path}`, LOG_LEVEL_VERBOSE);
if (!(await this.core.storageAccess.isExistsIncludeHidden(path))) {
this.ignoreFileCache.set(path, false);
// this._log(`[ignore]Ignore file not found: ${path}`, LOG_LEVEL_VERBOSE);
return false;
}
const file = await this.core.storageAccess.readHiddenFileText(path);
const gitignore = file.split(/\r?\n/g);
this.ignoreFileCache.set(path, gitignore);
this._log(`[ignore]Ignore file loaded: ${path}`, LOG_LEVEL_VERBOSE);
return gitignore;
} catch (ex) {
this._log(`Failed to read ignore file ${path}`);
this._log(`[ignore]Failed to read ignore file ${path}`);
this._log(ex, LOG_LEVEL_VERBOSE);
this.ignoreFileCache.set(path, false);
return false;
@@ -150,7 +158,7 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
return await this.readIgnoreFile(path);
}
}
async $$isIgnoredByIgnoreFiles(file: string | UXFileInfoStub): Promise<boolean> {
private async _isIgnoredByIgnoreFiles(file: string | UXFileInfoStub): Promise<boolean> {
if (!this.settings.useIgnoreFiles) {
return false;
}
@@ -164,4 +172,14 @@ export class ModuleTargetFilter extends AbstractModule implements ICoreModule {
}
return false;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.vault.handleMarkFileListPossiblyChanged(this._markFileListPossiblyChanged.bind(this));
services.path.handleId2Path(this._id2path.bind(this));
services.path.handlePath2Id(this._path2id.bind(this));
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
services.vault.handleIsFileSizeTooLarge(this._isFileSizeExceeded.bind(this));
services.vault.handleIsIgnoredByIgnoreFile(this._isIgnoredByIgnoreFiles.bind(this));
services.vault.handleIsTargetFile(this._isTargetFile.bind(this));
}
}
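The size gate in `_isFileSizeExceeded` compares the file size against `syncMaxSizeInMB` converted to bytes. A standalone worked example (not part of the module):

const syncMaxSizeInMB = 50; // threshold: 50 * 1024 * 1024 = 52_428_800 bytes
const isExceeded = (size: number) => syncMaxSizeInMB > 0 && size > 0 && syncMaxSizeInMB * 1024 * 1024 < size;
isExceeded(62_914_560);       // true: a 60 MB attachment is excluded from synchronisation
isExceeded(10 * 1024 * 1024); // false: a 10 MB file is still synchronised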

View File

@@ -1,11 +1,15 @@
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { AbstractModule } from "../AbstractModule.ts";
import { sizeToHumanReadable } from "octagonal-wheels/number";
import type { ICoreModule } from "../ModuleTypes.ts";
import { $msg } from "src/lib/src/common/i18n.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleCheckRemoteSize extends AbstractModule implements ICoreModule {
async $allScanStat(): Promise<boolean> {
export class ModuleCheckRemoteSize extends AbstractModule {
async _allScanStat(): Promise<boolean> {
if (this.core.managers.networkManager.isOnline === false) {
this._log("Network is offline, skipping remote size check.", LOG_LEVEL_INFO);
return true;
}
this._log($msg("moduleCheckRemoteSize.logCheckingStorageSizes"), LOG_LEVEL_VERBOSE);
if (this.settings.notifyThresholdOfRemoteStorageSize < 0) {
const message = $msg("moduleCheckRemoteSize.msgSetDBCapacity");
@@ -105,4 +109,7 @@ export class ModuleCheckRemoteSize extends AbstractModule implements ICoreModule
}
return true;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnScanningStartupIssues(this._allScanStat.bind(this));
}
}

View File

@@ -2,35 +2,36 @@ import { AbstractModule } from "../AbstractModule.ts";
import { LOG_LEVEL_NOTICE, type FilePathWithPrefix } from "../../lib/src/common/types";
import { QueueProcessor } from "octagonal-wheels/concurrency/processor";
import { sendValue } from "octagonal-wheels/messagepassing/signal";
import type { ICoreModule } from "../ModuleTypes.ts";
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleConflictChecker extends AbstractModule implements ICoreModule {
async $$queueConflictCheckIfOpen(file: FilePathWithPrefix): Promise<void> {
export class ModuleConflictChecker extends AbstractModule {
async _queueConflictCheckIfOpen(file: FilePathWithPrefix): Promise<void> {
const path = file;
if (this.settings.checkConflictOnlyOnOpen) {
const af = this.core.$$getActiveFilePath();
const af = this.services.vault.getActiveFilePath();
if (af && af != path) {
this._log(`${file} is conflicted, merging process has been postponed.`, LOG_LEVEL_NOTICE);
return;
}
}
await this.core.$$queueConflictCheck(path);
await this.services.conflict.queueCheckFor(path);
}
async $$queueConflictCheck(file: FilePathWithPrefix): Promise<void> {
const optionalConflictResult = await this.core.$anyGetOptionalConflictCheckMethod(file);
async _queueConflictCheck(file: FilePathWithPrefix): Promise<void> {
const optionalConflictResult = await this.services.conflict.getOptionalConflictCheckMethod(file);
if (optionalConflictResult == true) {
// The conflict has been resolved by another process.
return;
} else if (optionalConflictResult === "newer") {
// The conflict should be resolved by the newer entry.
await this.core.$anyResolveConflictByNewest(file);
await this.services.conflict.resolveByNewest(file);
} else {
this.conflictCheckQueue.enqueue(file);
}
}
$$waitForAllConflictProcessed(): Promise<boolean> {
_waitForAllConflictProcessed(): Promise<boolean> {
return this.conflictResolveQueue.waitForAllProcessed();
}
@@ -38,7 +39,7 @@ export class ModuleConflictChecker extends AbstractModule implements ICoreModule
conflictResolveQueue = new QueueProcessor(
async (filenames: FilePathWithPrefix[]) => {
const filename = filenames[0];
return await this.core.$$resolveConflict(filename);
return await this.services.conflict.resolve(filename);
},
{
suspended: false,
@@ -73,4 +74,9 @@ export class ModuleConflictChecker extends AbstractModule implements ICoreModule
totalRemainingReactiveSource: this.core.conflictProcessQueueCount,
}
);
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.conflict.handleQueueCheckForIfOpen(this._queueConflictCheckIfOpen.bind(this));
services.conflict.handleQueueCheckFor(this._queueConflictCheck.bind(this));
services.conflict.handleEnsureAllProcessed(this._waitForAllConflictProcessed.bind(this));
}
}

View File

@@ -20,8 +20,9 @@ import {
} from "../../common/utils";
import diff_match_patch from "diff-match-patch";
import { stripAllPrefixes, isPlainText } from "../../lib/src/string_and_binary/path";
import type { ICoreModule } from "../ModuleTypes.ts";
import { eventHub } from "../../common/events.ts";
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
import type { LiveSyncCore } from "../../main.ts";
declare global {
interface LSEvents {
@@ -29,8 +30,8 @@ declare global {
}
}
export class ModuleConflictResolver extends AbstractModule implements ICoreModule {
async $$resolveConflictByDeletingRev(
export class ModuleConflictResolver extends AbstractModule {
private async _resolveConflictByDeletingRev(
path: FilePathWithPrefix,
deleteRevision: string,
subTitle = ""
@@ -82,7 +83,7 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
return MISSING_OR_ERROR;
}
// 2. As usual, delete the conflicted revision and if there are no conflicts, write the resolved content to the storage.
return await this.core.$$resolveConflictByDeletingRev(path, ret.conflictedRev, "Sensible");
return await this.services.conflict.resolveByDeletingRevision(path, ret.conflictedRev, "Sensible");
}
const { rightRev, leftLeaf, rightLeaf } = ret;
@@ -95,7 +96,7 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
}
if (rightLeaf == false) {
// Conflicted item could not load, delete this.
return await this.core.$$resolveConflictByDeletingRev(path, rightRev, "MISSING OLD REV");
return await this.services.conflict.resolveByDeletingRevision(path, rightRev, "MISSING OLD REV");
}
const isSame = leftLeaf.data == rightLeaf.data && leftLeaf.deleted == rightLeaf.deleted;
@@ -115,7 +116,7 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
]
.filter((e) => e.trim())
.join(",");
return await this.core.$$resolveConflictByDeletingRev(path, loser.rev, subTitle);
return await this.services.conflict.resolveByDeletingRevision(path, loser.rev, subTitle);
}
// make diff.
const dmp = new diff_match_patch();
@@ -129,7 +130,7 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
};
}
async $$resolveConflict(filename: FilePathWithPrefix): Promise<void> {
private async _resolveConflict(filename: FilePathWithPrefix): Promise<void> {
// const filename = filenames[0];
return await serialized(`conflict-resolve:${filename}`, async () => {
const conflictCheckResult = await this.checkConflictAndPerformAutoMerge(filename);
@@ -144,16 +145,16 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
}
if (conflictCheckResult === AUTO_MERGED) {
//auto resolved, but need check again;
if (this.settings.syncAfterMerge && !this.core.$$isSuspended()) {
if (this.settings.syncAfterMerge && !this.services.appLifecycle.isSuspended()) {
//Wait for the running replication, if not running replication, run it once.
await this.core.$$replicateByEvent();
await this.services.replication.replicateByEvent();
}
this._log("[conflict] Automatically merged, but we have to check it again");
await this.core.$$queueConflictCheck(filename);
await this.services.conflict.queueCheckFor(filename);
return;
}
if (this.settings.showMergeDialogOnlyOnActive) {
const af = this.core.$$getActiveFilePath();
const af = this.services.vault.getActiveFilePath();
if (af && af != filename) {
this._log(
`[conflict] ${filename} is conflicted. The merging process has been postponed until the file is opened.`,
@@ -164,11 +165,11 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
}
this._log("[conflict] Manual merge required!");
eventHub.emitEvent("conflict-cancelled", filename);
await this.core.$anyResolveConflictByUI(filename, conflictCheckResult);
await this.services.conflict.resolveByUserInteraction(filename, conflictCheckResult);
});
}
async $anyResolveConflictByNewest(filename: FilePathWithPrefix): Promise<boolean> {
private async _anyResolveConflictByNewest(filename: FilePathWithPrefix): Promise<boolean> {
const currentRev = await this.core.databaseFileAccess.fetchEntryMeta(filename, undefined, true);
if (currentRev == false) {
this._log(`Could not get current revision of ${filename}`);
@@ -206,8 +207,14 @@ export class ModuleConflictResolver extends AbstractModule implements ICoreModul
this._log(
`conflict: Deleting the older revision ${mTimeAndRev[i][1]} (${new Date(mTimeAndRev[i][0]).toLocaleString()}) of ${filename}`
);
await this.core.$$resolveConflictByDeletingRev(filename, mTimeAndRev[i][1], "NEWEST");
await this.services.conflict.resolveByDeletingRevision(filename, mTimeAndRev[i][1], "NEWEST");
}
return true;
}
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.conflict.handleResolveByDeletingRevision(this._resolveConflictByDeletingRev.bind(this));
services.conflict.handleResolve(this._resolveConflict.bind(this));
services.conflict.handleResolveByNewest(this._anyResolveConflictByNewest.bind(this));
}
}
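The "resolve by newest" path keeps only the most recent revision and deletes the rest through `resolveByDeletingRevision`. A sketch of that loop, assuming `mTimeAndRev` holds `[mtime, rev]` pairs sorted newest-first (the sorting itself is outside the shown hunk):

for (let i = 1; i < mTimeAndRev.length; i++) {
    // Every revision except the newest one (index 0) is removed.
    await this.services.conflict.resolveByDeletingRevision(filename, mTimeAndRev[i][1], "NEWEST");
}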

View File

@@ -1,18 +1,19 @@
import { LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { normalizePath } from "../../deps.ts";
import {
FLAGMD_REDFLAG,
FLAGMD_REDFLAG2,
FLAGMD_REDFLAG2_HR,
FLAGMD_REDFLAG3,
FLAGMD_REDFLAG3_HR,
FlagFilesHumanReadable,
FlagFilesOriginal,
TweakValuesShouldMatchedTemplate,
type ObsidianLiveSyncSettings,
} from "../../lib/src/common/types.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import { $msg } from "../../lib/src/common/i18n.ts";
import type { LiveSyncCore } from "../../main.ts";
import { SvelteDialogManager } from "../features/SetupWizard/ObsidianSvelteDialog.ts";
import FetchEverything from "../features/SetupWizard/dialogs/FetchEverything.svelte";
import RebuildEverything from "../features/SetupWizard/dialogs/RebuildEverything.svelte";
import { extractObject } from "octagonal-wheels/object";
export class ModuleRedFlag extends AbstractModule implements ICoreModule {
export class ModuleRedFlag extends AbstractModule {
async isFlagFileExist(path: string) {
const redflag = await this.core.storageAccess.isExists(normalizePath(path));
if (redflag) {
@@ -33,169 +34,292 @@ export class ModuleRedFlag extends AbstractModule implements ICoreModule {
}
}
isRedFlagRaised = async () => await this.isFlagFileExist(FLAGMD_REDFLAG);
isRedFlag2Raised = async () =>
(await this.isFlagFileExist(FLAGMD_REDFLAG2)) || (await this.isFlagFileExist(FLAGMD_REDFLAG2_HR));
isRedFlag3Raised = async () =>
(await this.isFlagFileExist(FLAGMD_REDFLAG3)) || (await this.isFlagFileExist(FLAGMD_REDFLAG3_HR));
isSuspendFlagActive = async () => await this.isFlagFileExist(FlagFilesOriginal.SUSPEND_ALL);
isRebuildFlagActive = async () =>
(await this.isFlagFileExist(FlagFilesOriginal.REBUILD_ALL)) ||
(await this.isFlagFileExist(FlagFilesHumanReadable.REBUILD_ALL));
isFetchAllFlagActive = async () =>
(await this.isFlagFileExist(FlagFilesOriginal.FETCH_ALL)) ||
(await this.isFlagFileExist(FlagFilesHumanReadable.FETCH_ALL));
async deleteRedFlag2() {
await this.deleteFlagFile(FLAGMD_REDFLAG2);
await this.deleteFlagFile(FLAGMD_REDFLAG2_HR);
async cleanupRebuildFlag() {
await this.deleteFlagFile(FlagFilesOriginal.REBUILD_ALL);
await this.deleteFlagFile(FlagFilesHumanReadable.REBUILD_ALL);
}
async deleteRedFlag3() {
await this.deleteFlagFile(FLAGMD_REDFLAG3);
await this.deleteFlagFile(FLAGMD_REDFLAG3_HR);
async cleanupFetchAllFlag() {
await this.deleteFlagFile(FlagFilesOriginal.FETCH_ALL);
await this.deleteFlagFile(FlagFilesHumanReadable.FETCH_ALL);
}
async $everyOnLayoutReady(): Promise<boolean> {
try {
const isRedFlagRaised = await this.isRedFlagRaised();
const isRedFlag2Raised = await this.isRedFlag2Raised();
const isRedFlag3Raised = await this.isRedFlag3Raised();
dialogManager = new SvelteDialogManager(this.core);
if (isRedFlagRaised || isRedFlag2Raised || isRedFlag3Raised) {
if (isRedFlag2Raised) {
if (
(await this.core.confirm.askYesNoDialog(
"Rebuild everything has been scheduled! Are you sure to rebuild everything?",
{ defaultOption: "Yes", timeout: 0 }
)) !== "yes"
) {
await this.deleteRedFlag2();
await this.core.$$performRestart();
return false;
/**
* Adjust settings to the remote configuration if needed.
* @param extra result of dialogues that may contain the preventFetchingConfig flag (e.g., from FetchEverything or RebuildEverything)
* @param config current configuration used to retrieve the remote-preferred configuration
*/
async adjustSettingToRemoteIfNeeded(extra: { preventFetchingConfig: boolean }, config: ObsidianLiveSyncSettings) {
if (extra && extra.preventFetchingConfig) {
return;
}
// Remote configuration fetched and applied.
if (await this.adjustSettingToRemote(config)) {
config = this.core.settings;
} else {
this._log("Remote configuration not applied.", LOG_LEVEL_NOTICE);
}
console.debug(config);
}
/**
* Adjust settings to the remote configuration.
* @param config current configuration used to retrieve the remote-preferred configuration
* @returns the updated configuration if applied; otherwise null.
*/
async adjustSettingToRemote(config: ObsidianLiveSyncSettings) {
// Fetch remote configuration unless prevented.
const SKIP_FETCH = "Skip and proceed";
const RETRY_FETCH = "Retry (recommended)";
let canProceed = false;
do {
const remoteTweaks = await this.services.tweakValue.fetchRemotePreferred(config);
if (!remoteTweaks) {
const choice = await this.core.confirm.askSelectStringDialogue(
"Could not fetch remote configuration. What do you want to do?",
[SKIP_FETCH, RETRY_FETCH] as const,
{
defaultAction: RETRY_FETCH,
timeout: 0,
title: "Fetch Remote Configuration Failed",
}
);
if (choice === SKIP_FETCH) {
canProceed = true;
}
if (isRedFlag3Raised) {
if (
(await this.core.confirm.askYesNoDialog("Fetch again has been scheduled! Are you sure?", {
defaultOption: "Yes",
timeout: 0,
})) !== "yes"
) {
await this.deleteRedFlag3();
await this.core.$$performRestart();
return false;
}
}
this.settings.batchSave = false;
await this.core.$allSuspendAllSync();
await this.core.$allSuspendExtraSync();
this.settings.suspendFileWatching = true;
await this.saveSettings();
if (isRedFlag2Raised) {
} else {
const necessary = extractObject(TweakValuesShouldMatchedTemplate, remoteTweaks);
// Check if any necessary tweak value is different from current config.
const differentItems = Object.entries(necessary).filter(([key, value]) => {
return (config as any)[key] !== value;
});
if (differentItems.length === 0) {
this._log(
`${FLAGMD_REDFLAG2} or ${FLAGMD_REDFLAG2_HR} has been detected! Self-hosted LiveSync suspends all sync and rebuild everything.`,
"Remote configuration matches local configuration. No changes applied.",
LOG_LEVEL_NOTICE
);
await this.core.rebuilder.$rebuildEverything();
await this.deleteRedFlag2();
if (
(await this.core.confirm.askYesNoDialog(
"Do you want to resume file and database processing, and restart obsidian now?",
{ defaultOption: "Yes", timeout: 15 }
)) == "yes"
) {
this.settings.suspendFileWatching = false;
await this.saveSettings();
this.core.$$performRestart();
return false;
}
} else if (isRedFlag3Raised) {
this._log(
`${FLAGMD_REDFLAG3} or ${FLAGMD_REDFLAG3_HR} has been detected! Self-hosted LiveSync will discard the local database and fetch everything from the remote once again.`,
LOG_LEVEL_NOTICE
);
const method1 = $msg("RedFlag.Fetch.Method.FetchSafer");
const method2 = $msg("RedFlag.Fetch.Method.FetchSmoother");
const method3 = $msg("RedFlag.Fetch.Method.FetchTraditional");
const methods = [method1, method2, method3] as const;
const chunkMode = await this.core.confirm.askSelectStringDialogue(
$msg("RedFlag.Fetch.Method.Desc"),
methods,
{
defaultAction: method1,
timeout: 0,
title: $msg("RedFlag.Fetch.Method.Title"),
}
);
let makeLocalChunkBeforeSync = false;
let makeLocalFilesBeforeSync = false;
if (chunkMode === method1) {
makeLocalFilesBeforeSync = true;
} else if (chunkMode === method2) {
makeLocalChunkBeforeSync = true;
} else if (chunkMode === method3) {
// Do nothing.
} else {
this._log("Cancelled the fetch operation", LOG_LEVEL_NOTICE);
return false;
}
const optionFetchRemoteConf = $msg("RedFlag.FetchRemoteConfig.Buttons.Fetch");
const optionCancel = $msg("RedFlag.FetchRemoteConfig.Buttons.Cancel");
const fetchRemote = await this.core.confirm.askSelectStringDialogue(
$msg("RedFlag.FetchRemoteConfig.Message"),
[optionFetchRemoteConf, optionCancel],
{
defaultAction: optionFetchRemoteConf,
timeout: 0,
title: $msg("RedFlag.FetchRemoteConfig.Title"),
}
);
if (fetchRemote === optionFetchRemoteConf) {
this._log("Fetching remote configuration", LOG_LEVEL_NOTICE);
const newSettings = JSON.parse(JSON.stringify(this.core.settings)) as ObsidianLiveSyncSettings;
const remoteConfig = await this.core.$$fetchRemotePreferredTweakValues(newSettings);
if (remoteConfig) {
this._log("Remote configuration found.", LOG_LEVEL_NOTICE);
const mergedSettings = {
...this.core.settings,
...remoteConfig,
} satisfies ObsidianLiveSyncSettings;
this._log("Remote configuration applied.", LOG_LEVEL_NOTICE);
this.core.settings = mergedSettings;
} else {
this._log("Remote configuration not applied.", LOG_LEVEL_NOTICE);
}
}
await this.core.rebuilder.$fetchLocal(makeLocalChunkBeforeSync, !makeLocalFilesBeforeSync);
await this.deleteRedFlag3();
if (this.settings.suspendFileWatching) {
if (
(await this.core.confirm.askYesNoDialog(
"Do you want to resume file and database processing, and restart obsidian now?",
{ defaultOption: "Yes", timeout: 15 }
)) == "yes"
) {
this.settings.suspendFileWatching = false;
await this.saveSettings();
this.core.$$performRestart();
return false;
}
} else {
this._log(
"Your content of files will be synchronised gradually. Please wait for the completion.",
LOG_LEVEL_NOTICE
);
}
} else {
// Case of FLAGMD_REDFLAG.
this.settings.writeLogToTheFile = true;
// await this.plugin.openDatabase();
const warningMessage =
"The red flag is raised! The whole initialize steps are skipped, and any file changes are not captured.";
this._log(warningMessage, LOG_LEVEL_NOTICE);
await this.core.confirm.askSelectStringDialogue(
"Your settings differed slightly from the server's. The plug-in has supplemented the incompatible parts with the server settings!",
["OK"] as const,
{
defaultAction: "OK",
timeout: 0,
}
);
}
config = {
...config,
...Object.fromEntries(differentItems),
} satisfies ObsidianLiveSyncSettings;
this.core.settings = config;
await this.core.services.setting.saveSettingData();
this._log("Remote configuration applied.", LOG_LEVEL_NOTICE);
canProceed = true;
return this.core.settings;
}
} while (!canProceed);
}
/**
* Process vault initialisation while suspending file watching and sync.
* @param proc process to be executed during initialisation; should return true if processing can continue, false if the app is unable to continue.
* @param keepSuspending whether to keep file watching suspended after the process.
* @returns result of the process, or false if an error occurs.
*/
async processVaultInitialisation(proc: () => Promise<boolean>, keepSuspending = false) {
try {
// Disable batch saving and file watching during initialisation.
this.settings.batchSave = false;
await this.services.setting.suspendAllSync();
await this.services.setting.suspendExtraSync();
this.settings.suspendFileWatching = true;
await this.saveSettings();
try {
const result = await proc();
return result;
} catch (ex) {
this._log("Error during vault initialisation process.", LOG_LEVEL_NOTICE);
this._log(ex, LOG_LEVEL_VERBOSE);
return false;
}
} catch (ex) {
this._log("Error during vault initialisation.", LOG_LEVEL_NOTICE);
this._log(ex, LOG_LEVEL_VERBOSE);
return false;
} finally {
if (!keepSuspending) {
// Re-enable file watching after initialisation.
this.settings.suspendFileWatching = false;
await this.saveSettings();
}
}
}
/**
* Handle the scheduled rebuild-everything operation.
* @returns true if processing can continue, false if an app restart is needed.
*/
async onRebuildEverythingScheduled() {
const method = await this.dialogManager.openWithExplicitCancel(RebuildEverything);
if (method === "cancelled") {
// Clean up the flag file and restart the app.
this._log("Rebuild everything cancelled by user.", LOG_LEVEL_NOTICE);
await this.cleanupRebuildFlag();
this.services.appLifecycle.performRestart();
return false;
}
const { extra } = method;
await this.adjustSettingToRemoteIfNeeded(extra, this.settings);
return await this.processVaultInitialisation(async () => {
await this.core.rebuilder.$rebuildEverything();
await this.cleanupRebuildFlag();
this._log("Rebuild everything operation completed.", LOG_LEVEL_NOTICE);
return true;
});
}
/**
* Handle the scheduled fetch-all operation.
* @returns true if processing can continue, false if an app restart is needed.
*/
async onFetchAllScheduled() {
const method = await this.dialogManager.openWithExplicitCancel(FetchEverything);
if (method === "cancelled") {
this._log("Fetch everything cancelled by user.", LOG_LEVEL_NOTICE);
// Clean up the flag file and restart the app.
await this.cleanupFetchAllFlag();
this.services.appLifecycle.performRestart();
return false;
}
const { vault, extra } = method;
const mapVaultStateToAction = {
identical: {
// If both are identical, there is no need to make local files or chunks before sync;
// chunks are still made beforehand purely for efficiency.
makeLocalChunkBeforeSync: true,
makeLocalFilesBeforeSync: false,
},
independent: {
// If both are independent, nothing needs to be made before sync.
// Respect the remote state.
makeLocalChunkBeforeSync: false,
makeLocalFilesBeforeSync: false,
},
unbalanced: {
// If both are unbalanced, local files should be made before sync to avoid data loss.
// Chunks could also be made beforehand for efficiency, but the metadata made with them should then be detected as conflicting.
makeLocalChunkBeforeSync: false,
makeLocalFilesBeforeSync: true,
},
cancelled: {
// Cancelled case, not actually used.
makeLocalChunkBeforeSync: false,
makeLocalFilesBeforeSync: false,
},
} as const;
return await this.processVaultInitialisation(async () => {
await this.adjustSettingToRemoteIfNeeded(extra, this.settings);
// Okay, proceed to fetch everything.
const { makeLocalChunkBeforeSync, makeLocalFilesBeforeSync } = mapVaultStateToAction[vault];
this._log(
`Fetching everything with settings: makeLocalChunkBeforeSync=${makeLocalChunkBeforeSync}, makeLocalFilesBeforeSync=${makeLocalFilesBeforeSync}`,
LOG_LEVEL_INFO
);
await this.core.rebuilder.$fetchLocal(makeLocalChunkBeforeSync, !makeLocalFilesBeforeSync);
await this.cleanupFetchAllFlag();
this._log("Fetch everything operation completed. Vault files will be gradually synced.", LOG_LEVEL_NOTICE);
return true;
});
}
async onSuspendAllScheduled() {
this._log("SCRAM is detected. All operations are suspended.", LOG_LEVEL_NOTICE);
return await this.processVaultInitialisation(async () => {
this._log(
"All operations are suspended as per SCRAM.\nLogs will be written to the file. This might be a performance impact.",
LOG_LEVEL_NOTICE
);
this.settings.writeLogToTheFile = true;
await this.core.services.setting.saveSettingData();
return Promise.resolve(false);
}, true);
}
async verifyAndUnlockSuspension() {
if (!this.settings.suspendFileWatching) {
return true;
}
if (
(await this.core.confirm.askYesNoDialog(
"Do you want to resume file and database processing, and restart obsidian now?",
{ defaultOption: "Yes", timeout: 15 }
)) != "yes"
) {
// TODO: Confirm whether to actually proceed to the next process.
return true;
}
this.settings.suspendFileWatching = false;
await this.saveSettings();
this.services.appLifecycle.performRestart();
return false;
}
private async processFlagFilesOnStartup(): Promise<boolean> {
const isFlagSuspensionActive = await this.isSuspendFlagActive();
const isFlagRebuildActive = await this.isRebuildFlagActive();
const isFlagFetchAllActive = await this.isFetchAllFlagActive();
// TODO: Address the case when both flags are active (very unlikely though).
// if(isFlagFetchAllActive && isFlagRebuildActive) {
// const message = "Rebuild everything and Fetch everything flags are both detected.";
// await this.core.confirm.askSelectStringDialogue(
// "Both Rebuild Everything and Fetch Everything flags are detected. Please remove one of them and restart the app.",
// ["OK"] as const,)
if (isFlagFetchAllActive) {
const res = await this.onFetchAllScheduled();
if (res) {
return await this.verifyAndUnlockSuspension();
}
return false;
}
if (isFlagRebuildActive) {
const res = await this.onRebuildEverythingScheduled();
if (res) {
return await this.verifyAndUnlockSuspension();
}
return false;
}
if (isFlagSuspensionActive) {
const res = await this.onSuspendAllScheduled();
return res;
}
return true;
}
async _everyOnLayoutReady(): Promise<boolean> {
try {
const flagProcessResult = await this.processFlagFilesOnStartup();
return flagProcessResult;
} catch (ex) {
this._log("Something went wrong on FlagFile Handling", LOG_LEVEL_NOTICE);
this._log(ex, LOG_LEVEL_VERBOSE);
}
return true;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
super.onBindFunction(core, services);
services.appLifecycle.handleLayoutReady(this._everyOnLayoutReady.bind(this));
}
}
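For reference, combining `mapVaultStateToAction` with the call site `$fetchLocal(makeLocalChunkBeforeSync, !makeLocalFilesBeforeSync)` yields the following arguments per vault state (a restatement of the hunk above, nothing new):

const fetchLocalArgs = {
    identical:   [true,  true],   // makeLocalChunkBeforeSync = true,  makeLocalFilesBeforeSync = false
    independent: [false, true],   // makeLocalChunkBeforeSync = false, makeLocalFilesBeforeSync = false
    unbalanced:  [false, false],  // makeLocalChunkBeforeSync = false, makeLocalFilesBeforeSync = true
} as const;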

View File

@@ -1,16 +1,22 @@
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
import type { LiveSyncCore } from "../../main.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
export class ModuleRemoteGovernor extends AbstractModule implements ICoreModule {
async $$markRemoteLocked(lockByClean: boolean = false): Promise<void> {
export class ModuleRemoteGovernor extends AbstractModule {
private async _markRemoteLocked(lockByClean: boolean = false): Promise<void> {
return await this.core.replicator.markRemoteLocked(this.settings, true, lockByClean);
}
async $$markRemoteUnlocked(): Promise<void> {
private async _markRemoteUnlocked(): Promise<void> {
return await this.core.replicator.markRemoteLocked(this.settings, false, false);
}
async $$markRemoteResolved(): Promise<void> {
private async _markRemoteResolved(): Promise<void> {
return await this.core.replicator.markRemoteResolved(this.settings);
}
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.remote.handleMarkLocked(this._markRemoteLocked.bind(this));
services.remote.handleMarkUnlocked(this._markRemoteUnlocked.bind(this));
services.remote.handleMarkResolved(this._markRemoteResolved.bind(this));
}
}

View File

@@ -11,21 +11,22 @@ import {
} from "../../lib/src/common/types.ts";
import { escapeMarkdownValue } from "../../lib/src/common/utils.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import { $msg } from "../../lib/src/common/i18n.ts";
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleResolvingMismatchedTweaks extends AbstractModule implements ICoreModule {
async $anyAfterConnectCheckFailed(): Promise<boolean | "CHECKAGAIN" | undefined> {
export class ModuleResolvingMismatchedTweaks extends AbstractModule {
async _anyAfterConnectCheckFailed(): Promise<boolean | "CHECKAGAIN" | undefined> {
if (!this.core.replicator.tweakSettingsMismatched && !this.core.replicator.preferredTweakValue) return false;
const preferred = this.core.replicator.preferredTweakValue;
if (!preferred) return false;
const ret = await this.core.$$askResolvingMismatchedTweaks(preferred);
const ret = await this.services.tweakValue.askResolvingMismatched(preferred);
if (ret == "OK") return false;
if (ret == "CHECKAGAIN") return "CHECKAGAIN";
if (ret == "IGNORE") return true;
}
async $$checkAndAskResolvingMismatchedTweaks(
async _checkAndAskResolvingMismatchedTweaks(
preferred: Partial<TweakValues>
): Promise<[TweakValues | boolean, boolean]> {
const mine = extractObject(TweakValuesShouldMatchedTemplate, this.settings);
@@ -127,7 +128,7 @@ export class ModuleResolvingMismatchedTweaks extends AbstractModule implements I
return CHOICES[retKey];
}
async $$askResolvingMismatchedTweaks(): Promise<"OK" | "CHECKAGAIN" | "IGNORE"> {
async _askResolvingMismatchedTweaks(): Promise<"OK" | "CHECKAGAIN" | "IGNORE"> {
if (!this.core.replicator.tweakSettingsMismatched) {
return "OK";
}
@@ -137,7 +138,7 @@ export class ModuleResolvingMismatchedTweaks extends AbstractModule implements I
}
const preferred = extractObject(TweakValuesShouldMatchedTemplate, tweaks);
const [conf, rebuildRequired] = await this.core.$$checkAndAskResolvingMismatchedTweaks(preferred);
const [conf, rebuildRequired] = await this.services.tweakValue.checkAndAskResolvingMismatched(preferred);
if (!conf) return "IGNORE";
if (conf === true) {
@@ -154,7 +155,7 @@ export class ModuleResolvingMismatchedTweaks extends AbstractModule implements I
if (conf) {
this.settings = { ...this.settings, ...conf };
await this.core.replicator.setPreferredRemoteTweakSettings(this.settings);
await this.core.$$saveSettingData();
await this.services.setting.saveSettingData();
if (rebuildRequired) {
await this.core.rebuilder.$fetchLocal();
}
@@ -164,8 +165,12 @@ export class ModuleResolvingMismatchedTweaks extends AbstractModule implements I
return "IGNORE";
}
async $$fetchRemotePreferredTweakValues(trialSetting: RemoteDBSettings): Promise<TweakValues | false> {
const replicator = await this.core.$anyNewReplicator();
async _fetchRemotePreferredTweakValues(trialSetting: RemoteDBSettings): Promise<TweakValues | false> {
const replicator = await this.services.replicator.getNewReplicator(trialSetting);
if (!replicator) {
this._log("The remote type is not supported for fetching preferred tweak values.", LOG_LEVEL_NOTICE);
return false;
}
if (await replicator.tryConnectRemote(trialSetting)) {
const preferred = await replicator.getRemotePreferredTweakValues(trialSetting);
if (preferred) {
@@ -178,17 +183,17 @@ export class ModuleResolvingMismatchedTweaks extends AbstractModule implements I
return false;
}
async $$checkAndAskUseRemoteConfiguration(
async _checkAndAskUseRemoteConfiguration(
trialSetting: RemoteDBSettings
): Promise<{ result: false | TweakValues; requireFetch: boolean }> {
const preferred = await this.core.$$fetchRemotePreferredTweakValues(trialSetting);
const preferred = await this.services.tweakValue.fetchRemotePreferred(trialSetting);
if (preferred) {
return await this.$$askUseRemoteConfiguration(trialSetting, preferred);
return await this.services.tweakValue.askUseRemoteConfiguration(trialSetting, preferred);
}
return { result: false, requireFetch: false };
}
async $$askUseRemoteConfiguration(
async _askUseRemoteConfiguration(
trialSetting: RemoteDBSettings,
preferred: TweakValues
): Promise<{ result: false | TweakValues; requireFetch: boolean }> {
@@ -278,4 +283,13 @@ export class ModuleResolvingMismatchedTweaks extends AbstractModule implements I
}
return { result: false, requireFetch: false };
}
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.tweakValue.handleFetchRemotePreferred(this._fetchRemotePreferredTweakValues.bind(this));
services.tweakValue.handleCheckAndAskResolvingMismatched(this._checkAndAskResolvingMismatchedTweaks.bind(this));
services.tweakValue.handleAskResolvingMismatched(this._askResolvingMismatchedTweaks.bind(this));
services.tweakValue.handleCheckAndAskUseRemoteConfiguration(this._checkAndAskUseRemoteConfiguration.bind(this));
services.tweakValue.handleAskUseRemoteConfiguration(this._askUseRemoteConfiguration.bind(this));
services.replication.handleCheckConnectionFailure(this._anyAfterConnectCheckFailed.bind(this));
}
}
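`_anyAfterConnectCheckFailed`, now bound to `services.replication.checkConnectionFailure`, returns "CHECKAGAIN" when the mismatch was resolved and replication should be retried, `true` when the user chose to ignore it, and `false` when there was nothing to resolve. A small restatement of how the replication entry points shown earlier interpret that value; the helper names here are hypothetical:

type ConnectCheckResult = boolean | "CHECKAGAIN" | undefined;

async function afterFailedReplication(
    check: () => Promise<ConnectCheckResult>, // e.g. services.replication.checkConnectionFailure
    retry: () => Promise<boolean>             // e.g. services.remote.replicateAllToRemote
): Promise<boolean> {
    const result = await check();
    if (result === "CHECKAGAIN") return await retry(); // resolved: run the whole replication again
    return !result; // true (ignored) -> report failure; false/undefined (OK) -> report success
}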

View File

@@ -1,6 +1,6 @@
import { TFile, TFolder, type ListedFiles } from "obsidian";
import { SerializedFileAccess } from "./storageLib/SerializedFileAccess";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { LOG_LEVEL_INFO, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import type {
FilePath,
@@ -15,43 +15,72 @@ import { TFileToUXFileInfoStub, TFolderToUXFileInfoStub } from "./storageLib/uti
import { StorageEventManagerObsidian, type StorageEventManager } from "./storageLib/StorageEventManager";
import type { StorageAccess } from "../interfaces/StorageAccess";
import { createBlob, type CustomRegExp } from "../../lib/src/common/utils";
import { serialized } from "octagonal-wheels/concurrency/lock_v2";
import type { LiveSyncCore } from "../../main.ts";
import type ObsidianLiveSyncPlugin from "../../main.ts";
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
export class ModuleFileAccessObsidian extends AbstractObsidianModule implements IObsidianModule, StorageAccess {
const fileLockPrefix = "file-lock:";
export class ModuleFileAccessObsidian extends AbstractObsidianModule implements StorageAccess {
processingFiles: Set<FilePathWithPrefix> = new Set();
processWriteFile<T>(file: UXFileInfoStub | FilePathWithPrefix, proc: () => Promise<T>): Promise<T> {
const path = typeof file === "string" ? file : file.path;
return serialized(`${fileLockPrefix}${path}`, async () => {
try {
this.processingFiles.add(path);
return await proc();
} finally {
this.processingFiles.delete(path);
}
});
}
processReadFile<T>(file: UXFileInfoStub | FilePathWithPrefix, proc: () => Promise<T>): Promise<T> {
const path = typeof file === "string" ? file : file.path;
return serialized(`${fileLockPrefix}${path}`, async () => {
try {
this.processingFiles.add(path);
return await proc();
} finally {
this.processingFiles.delete(path);
}
});
}
isFileProcessing(file: UXFileInfoStub | FilePathWithPrefix): boolean {
const path = typeof file === "string" ? file : file.path;
return this.processingFiles.has(path);
}
vaultAccess!: SerializedFileAccess;
vaultManager: StorageEventManager = new StorageEventManagerObsidian(this.plugin, this.core);
$everyOnload(): Promise<boolean> {
vaultManager: StorageEventManager = new StorageEventManagerObsidian(this.plugin, this.core, this);
private _everyOnload(): Promise<boolean> {
this.core.storageAccess = this;
return Promise.resolve(true);
}
$everyOnFirstInitialize(): Promise<boolean> {
_everyOnFirstInitialize(): Promise<boolean> {
this.vaultManager.beginWatch();
return Promise.resolve(true);
}
$allOnUnload(): Promise<boolean> {
// this.vaultManager.
return Promise.resolve(true);
}
// $$flushFileEventQueue(): void {
// this.vaultManager.flushQueue();
// }
$everyCommitPendingFileEvent(): Promise<boolean> {
_everyCommitPendingFileEvent(): Promise<boolean> {
this.vaultManager.flushQueue();
return Promise.resolve(true);
}
$everyOnloadStart(): Promise<boolean> {
this.vaultAccess = new SerializedFileAccess(this.app, this.plugin);
_everyOnloadStart(): Promise<boolean> {
this.vaultAccess = new SerializedFileAccess(this.app, this.plugin, this);
return Promise.resolve(true);
}
$$isStorageInsensitive(): boolean {
_isStorageInsensitive(): boolean {
return this.vaultAccess.isStorageInsensitive();
}
$$shouldCheckCaseInsensitive(): boolean {
if (this.$$isStorageInsensitive()) return false;
_shouldCheckCaseInsensitive(): boolean {
if (this.services.vault.isStorageInsensitive()) return false;
return !this.settings.handleFilenameCaseSensitive;
}
@@ -193,6 +222,7 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
return null;
}
}
async readStubContent(stub: UXFileInfoStub): Promise<UXFileInfo | false> {
const file = this.vaultAccess.getAbstractFileByPath(stub.path);
if (!(file instanceof TFile)) {
@@ -202,6 +232,7 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
const data = await this.vaultAccess.vaultReadAuto(file);
return {
...stub,
...TFileToUXFileInfoStub(file),
body: createBlob(data),
};
}
@@ -245,7 +276,7 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
if (excludeFilter && excludeFilter.some((ee) => ee.test(file))) {
continue;
}
if (await this.plugin.$$isIgnoredByIgnoreFiles(file)) continue;
if (await this.services.vault.isIgnoredByIgnoreFile(file)) continue;
files.push(file);
}
@@ -258,7 +289,7 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
if (excludeFilter && excludeFilter.some((e) => e.test(v))) {
continue;
}
if (await this.plugin.$$isIgnoredByIgnoreFiles(v)) {
if (await this.services.vault.isIgnoredByIgnoreFile(v)) {
continue;
}
// OK, deep dive!
@@ -314,9 +345,9 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
// }
// }
async _deleteVaultItem(file: TFile | TFolder) {
async __deleteVaultItem(file: TFile | TFolder) {
if (file instanceof TFile) {
if (!(await this.core.$$isTargetFile(file.path))) return;
if (!(await this.services.vault.isTargetFile(file.path))) return;
}
const dir = file.parent;
if (this.settings.trashInsteadDelete) {
@@ -332,7 +363,7 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
this._log(
`All files under the parent directory (${dir.path}) have been deleted, so delete this one.`
);
await this._deleteVaultItem(dir);
await this.__deleteVaultItem(dir);
}
}
}
@@ -343,7 +374,19 @@ export class ModuleFileAccessObsidian extends AbstractObsidianModule implements
const file = this.vaultAccess.getAbstractFileByPath(path);
if (file === null) return;
if (file instanceof TFile || file instanceof TFolder) {
return await this._deleteVaultItem(file);
return await this.__deleteVaultItem(file);
}
}
constructor(plugin: ObsidianLiveSyncPlugin, core: LiveSyncCore) {
super(plugin, core);
}
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.vault.handleIsStorageInsensitive(this._isStorageInsensitive.bind(this));
services.setting.handleShouldCheckCaseInsensitively(this._shouldCheckCaseInsensitive.bind(this));
services.appLifecycle.handleFirstInitialise(this._everyOnFirstInitialize.bind(this));
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
services.fileProcessing.handleCommitPendingFileEvents(this._everyCommitPendingFileEvent.bind(this));
}
}
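`processReadFile` and `processWriteFile` serialise all access to one path under a single `file-lock:<path>` key and record in-flight paths so `isFileProcessing` can answer synchronously. A stripped-down sketch of that idea; only `serialized` and the prefix come from the hunk, the wrapper itself is illustrative:

import { serialized } from "octagonal-wheels/concurrency/lock_v2";

const fileLockPrefix = "file-lock:";
const processingFiles = new Set<string>();

async function withFileLock<T>(path: string, proc: () => Promise<T>): Promise<T> {
    // Callers targeting the same path run strictly one after another.
    return await serialized(`${fileLockPrefix}${path}`, async () => {
        try {
            processingFiles.add(path); // lets isFileProcessing(path) report true meanwhile
            return await proc();
        } finally {
            processingFiles.delete(path);
        }
    });
}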

View File

@@ -1,5 +1,5 @@
// ModuleInputUIObsidian.ts
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { scheduleTask } from "octagonal-wheels/concurrency/task";
import { disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject } from "../../common/utils.ts";
import {
@@ -13,12 +13,13 @@ import { Notice } from "../../deps.ts";
import type { Confirm } from "../../lib/src/interfaces/Confirm.ts";
import { setConfirmInstance } from "../../lib/src/PlatformAPIs/obsidian/Confirm.ts";
import { $msg } from "src/lib/src/common/i18n.ts";
import type { LiveSyncCore } from "../../main.ts";
// This module cannot be a common module because it depends on Obsidian's API.
// However, we have to make a compatible one for other platforms.
export class ModuleInputUIObsidian extends AbstractObsidianModule implements IObsidianModule, Confirm {
$everyOnload(): Promise<boolean> {
export class ModuleInputUIObsidian extends AbstractObsidianModule implements Confirm {
private _everyOnload(): Promise<boolean> {
this.core.confirm = this;
setConfirmInstance(this);
return Promise.resolve(true);
@@ -110,4 +111,8 @@ export class ModuleInputUIObsidian extends AbstractObsidianModule implements IOb
): Promise<(typeof buttons)[number] | false> {
return confirmWithMessage(this.plugin, title, contentMd, buttons, defaultAction, timeout);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
}
}

View File

@@ -1,17 +1,12 @@
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "../../../deps.ts";
import { serialized } from "../../../lib/src/concurrency/lock.ts";
import { Logger } from "../../../lib/src/common/logger.ts";
import { isPlainText } from "../../../lib/src/string_and_binary/path.ts";
import type { FilePath, HasSettings, UXFileInfoStub } from "../../../lib/src/common/types.ts";
import { createBinaryBlob, isDocContentSame } from "../../../lib/src/common/utils.ts";
import type { InternalFileInfo } from "../../../common/types.ts";
import { markChangesAreSame } from "../../../common/utils.ts";
import { type UXFileInfo } from "../../../lib/src/common/types.ts";
function getFileLockKey(file: TFile | TFolder | string | UXFileInfo) {
return `fl:${typeof file == "string" ? file : file.path}`;
}
function toArrayBuffer(arr: Uint8Array | ArrayBuffer | DataView): ArrayBufferLike {
import type { StorageAccess } from "../../interfaces/StorageAccess.ts";
function toArrayBuffer(arr: Uint8Array<ArrayBuffer> | ArrayBuffer | DataView<ArrayBuffer>): ArrayBuffer {
if (arr instanceof Uint8Array) {
return arr.buffer;
}
@@ -21,94 +16,97 @@ function toArrayBuffer(arr: Uint8Array | ArrayBuffer | DataView): ArrayBufferLik
return arr;
}
// function isFile(file: TFile | TFolder | string | UXFileInfo): boolean {
// file instanceof TFile;
// }
async function processReadFile<T>(file: TFile | TFolder | string | UXFileInfo, proc: () => Promise<T>) {
const ret = await serialized(getFileLockKey(file), () => proc());
return ret;
}
async function processWriteFile<T>(file: TFile | TFolder | string | UXFileInfo, proc: () => Promise<T>) {
const ret = await serialized(getFileLockKey(file), () => proc());
return ret;
}
export class SerializedFileAccess {
app: App;
plugin: HasSettings<{ handleFilenameCaseSensitive: boolean }>;
constructor(app: App, plugin: (typeof this)["plugin"]) {
storageAccess: StorageAccess;
constructor(app: App, plugin: SerializedFileAccess["plugin"], storageAccess: StorageAccess) {
this.app = app;
this.plugin = plugin;
this.storageAccess = storageAccess;
}
async tryAdapterStat(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, async () => {
return await this.storageAccess.processReadFile(path as FilePath, async () => {
if (!(await this.app.vault.adapter.exists(path))) return null;
return this.app.vault.adapter.stat(path);
});
}
async adapterStat(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.stat(path));
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.adapter.stat(path));
}
async adapterExists(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.exists(path));
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.adapter.exists(path));
}
async adapterRemove(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.remove(path));
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.adapter.remove(path));
}
async adapterRead(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.read(path));
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.adapter.read(path));
}
async adapterReadBinary(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
return await this.storageAccess.processReadFile(path as FilePath, () =>
this.app.vault.adapter.readBinary(path)
);
}
async adapterReadAuto(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.adapter.read(path));
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
if (isPlainText(path)) {
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.adapter.read(path));
}
return await this.storageAccess.processReadFile(path as FilePath, () =>
this.app.vault.adapter.readBinary(path)
);
}
async adapterWrite(file: TFile | string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
async adapterWrite(
file: TFile | string,
data: string | ArrayBuffer | Uint8Array<ArrayBuffer>,
options?: DataWriteOptions
) {
const path = file instanceof TFile ? file.path : file;
if (typeof data === "string") {
return await processWriteFile(file, () => this.app.vault.adapter.write(path, data, options));
return await this.storageAccess.processWriteFile(path as FilePath, () =>
this.app.vault.adapter.write(path, data, options)
);
} else {
return await processWriteFile(file, () =>
return await this.storageAccess.processWriteFile(path as FilePath, () =>
this.app.vault.adapter.writeBinary(path, toArrayBuffer(data), options)
);
}
}
async vaultCacheRead(file: TFile) {
return await processReadFile(file, () => this.app.vault.cachedRead(file));
return await this.storageAccess.processReadFile(file.path as FilePath, () => this.app.vault.cachedRead(file));
}
async vaultRead(file: TFile) {
return await processReadFile(file, () => this.app.vault.read(file));
return await this.storageAccess.processReadFile(file.path as FilePath, () => this.app.vault.read(file));
}
async vaultReadBinary(file: TFile) {
return await processReadFile(file, () => this.app.vault.readBinary(file));
return await this.storageAccess.processReadFile(file.path as FilePath, () => this.app.vault.readBinary(file));
}
async vaultReadAuto(file: TFile) {
const path = file.path;
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.read(file));
return await processReadFile(file, () => this.app.vault.readBinary(file));
if (isPlainText(path)) {
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.read(file));
}
return await this.storageAccess.processReadFile(path as FilePath, () => this.app.vault.readBinary(file));
}
async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array<ArrayBuffer>, options?: DataWriteOptions) {
if (typeof data === "string") {
return await processWriteFile(file, async () => {
return await this.storageAccess.processWriteFile(file.path as FilePath, async () => {
const oldData = await this.app.vault.read(file);
if (data === oldData) {
if (options && options.mtime) markChangesAreSame(file.path, file.stat.mtime, options.mtime);
@@ -118,7 +116,7 @@ export class SerializedFileAccess {
return true;
});
} else {
return await processWriteFile(file, async () => {
return await this.storageAccess.processWriteFile(file.path as FilePath, async () => {
const oldData = await this.app.vault.readBinary(file);
if (await isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
if (options && options.mtime) markChangesAreSame(file.path, file.stat.mtime, options.mtime);
@@ -131,13 +129,17 @@ export class SerializedFileAccess {
}
async vaultCreate(
path: string,
data: string | ArrayBuffer | Uint8Array,
data: string | ArrayBuffer | Uint8Array<ArrayBuffer>,
options?: DataWriteOptions
): Promise<TFile> {
if (typeof data === "string") {
return await processWriteFile(path, () => this.app.vault.create(path, data, options));
return await this.storageAccess.processWriteFile(path as FilePath, () =>
this.app.vault.create(path, data, options)
);
} else {
return await processWriteFile(path, () => this.app.vault.createBinary(path, toArrayBuffer(data), options));
return await this.storageAccess.processWriteFile(path as FilePath, () =>
this.app.vault.createBinary(path, toArrayBuffer(data), options)
);
}
}
@@ -150,10 +152,14 @@ export class SerializedFileAccess {
}
async delete(file: TFile | TFolder, force = false) {
return await processWriteFile(file, () => this.app.vault.delete(file, force));
return await this.storageAccess.processWriteFile(file.path as FilePath, () =>
this.app.vault.delete(file, force)
);
}
async trash(file: TFile | TFolder, force = false) {
return await processWriteFile(file, () => this.app.vault.trash(file, force));
return await this.storageAccess.processWriteFile(file.path as FilePath, () =>
this.app.vault.trash(file, force)
);
}
isStorageInsensitive(): boolean {

View File
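For context on the `SerializedFileAccess` changes above: reads and writes are still serialized per file path, but the locking is now routed through `StorageAccess.processReadFile` / `processWriteFile` instead of the module-local `serialized(getFileLockKey(file), ...)` helpers. The sketch below illustrates the per-key serialization idea in isolation; `serializedSketch` and its queueing details are hypothetical stand-ins, not the octagonal-wheels implementation.

```ts
// Minimal sketch of per-key serialization: operations queued under the same key
// run strictly one after another, while different keys proceed concurrently.
const tails = new Map<string, Promise<unknown>>();

async function serializedSketch<T>(key: string, proc: () => Promise<T>): Promise<T> {
    const prev = tails.get(key) ?? Promise.resolve();
    // Chain after the previous task; swallow its error so one failure does not poison the queue.
    const next = prev.catch(() => undefined).then(() => proc());
    tails.set(key, next);
    try {
        return await next;
    } finally {
        // Clean up once we are the last queued task for this key.
        if (tails.get(key) === next) tails.delete(key);
    }
}

// Usage: the two writes to `notes/a.md` never interleave; `notes/b.md` runs independently.
void Promise.all([
    serializedSketch("notes/a.md", async () => console.log("write a (1)")),
    serializedSketch("notes/a.md", async () => console.log("write a (2)")),
    serializedSketch("notes/b.md", async () => console.log("write b")),
]);
```

This is what keeps a read from observing a half-written file while a write to the same path is still in flight.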

@@ -7,24 +7,27 @@ import {
LOG_LEVEL_INFO,
LOG_LEVEL_NOTICE,
LOG_LEVEL_VERBOSE,
type FileEventType,
type FilePath,
type FilePathWithPrefix,
type UXFileInfoStub,
type UXInternalFileInfoStub,
} from "../../../lib/src/common/types.ts";
import { delay, fireAndForget, getFileRegExp } from "../../../lib/src/common/utils.ts";
import { type FileEventItem, type FileEventType } from "../../../common/types.ts";
import { serialized, skipIfDuplicated } from "../../../lib/src/concurrency/lock.ts";
import { delay, fireAndForget } from "../../../lib/src/common/utils.ts";
import { type FileEventItem } from "../../../common/types.ts";
import { serialized, skipIfDuplicated } from "octagonal-wheels/concurrency/lock";
import {
finishAllWaitingForTimeout,
finishWaitingForTimeout,
isWaitingForTimeout,
waitForTimeout,
} from "../../../lib/src/concurrency/task.ts";
import { Semaphore } from "../../../lib/src/concurrency/semaphore.ts";
} from "octagonal-wheels/concurrency/task";
import { Semaphore } from "octagonal-wheels/concurrency/semaphore";
import type { LiveSyncCore } from "../../../main.ts";
import { InternalFileToUXFileInfoStub, TFileToUXFileInfoStub } from "./utilObsidian.ts";
import ObsidianLiveSyncPlugin from "../../../main.ts";
import type { StorageAccess } from "../../interfaces/StorageAccess.ts";
import { HiddenFileSync } from "../../../features/HiddenFileSync/CmdHiddenFileSync.ts";
// import { InternalFileToUXFileInfo } from "../platforms/obsidian.ts";
export type FileEvent = {
@@ -46,6 +49,10 @@ export abstract class StorageEventManager {
export class StorageEventManagerObsidian extends StorageEventManager {
plugin: ObsidianLiveSyncPlugin;
core: LiveSyncCore;
storageAccess: StorageAccess;
get services() {
return this.core.services;
}
get shouldBatchSave() {
return this.core.settings?.batchSave && this.core.settings?.liveSync != true;
@@ -56,10 +63,15 @@ export class StorageEventManagerObsidian extends StorageEventManager {
get batchSaveMaximumDelay(): number {
return this.core.settings?.batchSaveMaximumDelay ?? DEFAULT_SETTINGS.batchSaveMaximumDelay;
}
constructor(plugin: ObsidianLiveSyncPlugin, core: LiveSyncCore) {
// Necessary evil.
cmdHiddenFileSync: HiddenFileSync;
constructor(plugin: ObsidianLiveSyncPlugin, core: LiveSyncCore, storageAccess: StorageAccess) {
super();
this.storageAccess = storageAccess;
this.plugin = plugin;
this.core = core;
this.cmdHiddenFileSync = this.plugin.getAddOn(HiddenFileSync.name) as HiddenFileSync;
}
beginWatch() {
const plugin = this.plugin;
@@ -88,6 +100,10 @@ export class StorageEventManagerObsidian extends StorageEventManager {
}
const file = info?.file as TFile;
if (!file) return;
if (this.storageAccess.isFileProcessing(file.path as FilePath)) {
// Logger(`Editor change skipped because the file is being processed: ${file.path}`, LOG_LEVEL_VERBOSE);
return;
}
if (!this.isWaiting(file.path as FilePath)) {
return;
}
@@ -102,22 +118,35 @@ export class StorageEventManagerObsidian extends StorageEventManager {
watchVaultCreate(file: TAbstractFile, ctx?: any) {
if (file instanceof TFolder) return;
if (this.storageAccess.isFileProcessing(file.path as FilePath)) {
// Logger(`File create skipped because the file is being processed: ${file.path}`, LOG_LEVEL_VERBOSE);
return;
}
const fileInfo = TFileToUXFileInfoStub(file);
void this.appendQueue([{ type: "CREATE", file: fileInfo }], ctx);
}
watchVaultChange(file: TAbstractFile, ctx?: any) {
if (file instanceof TFolder) return;
if (this.storageAccess.isFileProcessing(file.path as FilePath)) {
// Logger(`File change skipped because the file is being processed: ${file.path}`, LOG_LEVEL_VERBOSE);
return;
}
const fileInfo = TFileToUXFileInfoStub(file);
void this.appendQueue([{ type: "CHANGED", file: fileInfo }], ctx);
}
watchVaultDelete(file: TAbstractFile, ctx?: any) {
if (file instanceof TFolder) return;
if (this.storageAccess.isFileProcessing(file.path as FilePath)) {
// Logger(`File delete skipped because the file is being processed: ${file.path}`, LOG_LEVEL_VERBOSE);
return;
}
const fileInfo = TFileToUXFileInfoStub(file, true);
void this.appendQueue([{ type: "DELETE", file: fileInfo }], ctx);
}
watchVaultRename(file: TAbstractFile, oldFile: string, ctx?: any) {
// vault Rename will not be raised for self-events (Self-hosted LiveSync will not handle 'rename').
if (file instanceof TFile) {
const fileInfo = TFileToUXFileInfoStub(file);
void this.appendQueue(
@@ -145,30 +174,32 @@ export class StorageEventManagerObsidian extends StorageEventManager {
}
// Watch raw events (Internal API)
watchVaultRawEvents(path: FilePath) {
if (this.storageAccess.isFileProcessing(path)) {
// Logger(`Raw file event skipped because the file is being processed: ${path}`, LOG_LEVEL_VERBOSE);
return;
}
// Only for internal files.
if (!this.plugin.settings) return;
// if (this.plugin.settings.useIgnoreFiles && this.plugin.ignoreFiles.some(e => path.endsWith(e.trim()))) {
if (this.plugin.settings.useIgnoreFiles) {
// If it is one of the ignore files, refresh the cached one.
// (Calling $$isTargetFile will refresh the cache.)
void this.plugin.$$isTargetFile(path).then(() => this._watchVaultRawEvents(path));
void this.services.vault.isTargetFile(path).then(() => this._watchVaultRawEvents(path));
} else {
this._watchVaultRawEvents(path);
void this._watchVaultRawEvents(path);
}
}
_watchVaultRawEvents(path: FilePath) {
async _watchVaultRawEvents(path: FilePath) {
if (!this.plugin.settings.syncInternalFiles && !this.plugin.settings.usePluginSync) return;
if (!this.plugin.settings.watchInternalFileChanges) return;
if (!path.startsWith(this.plugin.app.vault.configDir)) return;
const ignorePatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
const targetPatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
if (ignorePatterns.some((e) => e.test(path))) return;
if (!targetPatterns.some((e) => e.test(path))) return;
if (path.endsWith("/")) {
// Folder
return;
}
const isTargetFile = await this.cmdHiddenFileSync.isTargetFile(path);
if (!isTargetFile) return;
void this.appendQueue(
[
@@ -185,7 +216,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
async appendQueue(params: FileEvent[], ctx?: any) {
if (!this.core.settings.isConfigured) return;
if (this.core.settings.suspendFileWatching) return;
this.core.$$markFileListPossiblyChanged();
this.core.services.vault.markFileListPossiblyChanged();
// Flag up so the file list will be reloaded
const processFiles = new Set<FilePath>();
for (const param of params) {
@@ -198,7 +229,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
const oldPath = param.oldPath;
if (type !== "INTERNAL") {
const size = (file as UXFileInfoStub).stat.size;
if (this.core.$$isFileSizeExceeded(size) && (type == "CREATE" || type == "CHANGED")) {
if (this.services.vault.isFileSizeTooLarge(size) && (type == "CREATE" || type == "CHANGED")) {
Logger(
`The storage file has been changed but exceeds the maximum size. Skipping: ${param.file.path}`,
LOG_LEVEL_NOTICE
@@ -207,7 +238,10 @@ export class StorageEventManagerObsidian extends StorageEventManager {
}
}
if (file instanceof TFolder) continue;
if (!(await this.core.$$isTargetFile(file.path))) continue;
// TODO: Confirm why only TFolder instances are skipped here.
// Possibly the following line is also needed...
// if (file?.isFolder) continue;
if (!(await this.services.vault.isTargetFile(file.path))) continue;
// Stop using the cache to prevent corruption.
// let cache: null | string | ArrayBuffer;
@@ -264,7 +298,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
concurrentProcessing = Semaphore(5);
waitedSince = new Map<FilePath | FilePathWithPrefix, number>();
async startStandingBy(filename: FilePath) {
// If waited, cancel previous waiting.
// If already waiting, there is no need to start again (the loop continues inside the function)
await skipIfDuplicated(`storage-event-manager-${filename}`, async () => {
Logger(`Processing ${filename}: Starting`, LOG_LEVEL_DEBUG);
const release = await this.concurrentProcessing.acquire();
@@ -284,6 +318,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
// continue;
// }
const type = target.type;
// If already cancelled by another operation, skip this.
if (target.cancelled) {
Logger(`Processing ${filename}: Cancelled (scheduled): ${operationType}`, LOG_LEVEL_DEBUG);
this.cancelStandingBy(target);
@@ -384,12 +419,12 @@ export class StorageEventManagerObsidian extends StorageEventManager {
const lockKey = `handleFile:${file.path}`;
return await serialized(lockKey, async () => {
if (queue.type == "INTERNAL" || file.isInternal) {
await this.core.$anyProcessOptionalFileEvent(file.path as unknown as FilePath);
await this.core.services.fileProcessing.processOptionalFileEvent(file.path as unknown as FilePath);
} else {
const key = `file-last-proc-${queue.type}-${file.path}`;
const last = Number((await this.core.kvDB.get(key)) || 0);
if (queue.type == "DELETE") {
await this.core.$anyHandlerProcessesFileEvent(queue);
await this.core.services.fileProcessing.processFileEvent(queue);
} else {
if (file.stat.mtime == last) {
Logger(`File has been already scanned on ${queue.type}, skip: ${file.path}`, LOG_LEVEL_VERBOSE);
@@ -397,7 +432,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
// this.cancelRelativeEvent(queue);
return;
}
if (!(await this.core.$anyHandlerProcessesFileEvent(queue))) {
if (!(await this.core.services.fileProcessing.processFileEvent(queue))) {
Logger(
`STORAGE -> DB: Handler failed, cancel the relative operations: ${file.path}`,
LOG_LEVEL_INFO

View File
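The `StorageEventManagerObsidian` changes above add `storageAccess.isFileProcessing(path)` guards to every vault watcher, so events for a path that the plugin is currently processing are skipped instead of being queued again. A minimal sketch of that guard pattern follows; `ProcessingGuard` and its reference counting are illustrative assumptions, not the actual `StorageAccess` implementation.

```ts
// Sketch: mark a path as "being processed" for the duration of an operation,
// and let watchers drop events for such paths.
class ProcessingGuard {
    private processing = new Map<string, number>();

    async withProcessing<T>(path: string, proc: () => Promise<T>): Promise<T> {
        this.processing.set(path, (this.processing.get(path) ?? 0) + 1);
        try {
            return await proc();
        } finally {
            const count = (this.processing.get(path) ?? 1) - 1;
            if (count <= 0) this.processing.delete(path);
            else this.processing.set(path, count);
        }
    }

    isFileProcessing(path: string): boolean {
        return this.processing.has(path);
    }
}

const guard = new ProcessingGuard();
function onVaultChange(path: string) {
    if (guard.isFileProcessing(path)) return; // the path is being processed right now; skip
    console.log(`queue CHANGED for ${path}`);
}

async function demo() {
    await guard.withProcessing("notes/a.md", async () => onVaultChange("notes/a.md")); // skipped
    onVaultChange("notes/a.md"); // queued
}
void demo();
```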

@@ -17,10 +17,11 @@ import {
import { isAnyNote } from "../../lib/src/common/utils.ts";
import { stripAllPrefixes } from "../../lib/src/string_and_binary/path.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import { withConcurrency } from "octagonal-wheels/iterable/map";
export class ModuleInitializerFile extends AbstractModule implements ICoreModule {
async $$performFullScan(showingNotice?: boolean): Promise<void> {
import type { InjectableServiceHub } from "../../lib/src/services/InjectableServices.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleInitializerFile extends AbstractModule {
private async _performFullScan(showingNotice?: boolean, ignoreSuspending: boolean = false): Promise<boolean> {
this._log("Opening the key-value database", LOG_LEVEL_VERBOSE);
const isInitialized = (await this.core.kvDB.get<boolean>("initialized")) || false;
// synchronize all files between database and storage.
@@ -32,7 +33,17 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
"syncAll"
);
}
return;
return false;
}
if (!ignoreSuspending && this.settings.suspendFileWatching) {
if (showingNotice) {
this._log(
"Now suspending file watching. Synchronising between the storage and the local database is now prevented.",
LOG_LEVEL_NOTICE,
"syncAll"
);
}
return false;
}
if (showingNotice) {
@@ -49,7 +60,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
const _filesStorage = [] as typeof filesStorageSrc;
for (const f of filesStorageSrc) {
if (await this.core.$$isTargetFile(f.path, f != filesStorageSrc[0])) {
if (await this.services.vault.isTargetFile(f.path, f != filesStorageSrc[0])) {
_filesStorage.push(f);
}
}
@@ -93,7 +104,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
);
const path = getPath(doc);
if (isValidPath(path) && (await this.core.$$isTargetFile(path, true))) {
if (isValidPath(path) && (await this.services.vault.isTargetFile(path, true))) {
if (!isMetaEntry(doc)) {
this._log(`Invalid entry: ${path}`, LOG_LEVEL_INFO);
continue;
@@ -123,7 +134,6 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
this._log(`Total files in the database: ${databaseFileNames.length}`, LOG_LEVEL_VERBOSE, "syncAll");
this._log(`Total files in the storage: ${storageFileNames.length}`, LOG_LEVEL_VERBOSE, "syncAll");
this._log(`Total files: ${allFiles.length}`, LOG_LEVEL_VERBOSE, "syncAll");
const filesExistOnlyInStorage = allFiles.filter((e) => !databaseFileNameCI2CS[e]);
const filesExistOnlyInDatabase = allFiles.filter((e) => !storageFileNameCI2CS[e]);
const filesExistBoth = allFiles.filter((e) => databaseFileNameCI2CS[e] && storageFileNameCI2CS[e]);
@@ -182,7 +192,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
runAll("UPDATE DATABASE", filesExistOnlyInStorage, async (e) => {
// Exists in storage but not in database.
const file = storageFileNameMap[storageFileNameCI2CS[e]];
if (!this.core.$$isFileSizeExceeded(file.stat.size)) {
if (!this.services.vault.isFileSizeTooLarge(file.stat.size)) {
const path = file.path;
await this.core.fileHandler.storeFileToDB(file);
// fireAndForget(() => this.checkAndApplySettingFromMarkdown(path, true));
@@ -198,7 +208,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
// Exists in database but not in storage.
const path = getPath(w) ?? e;
if (w && !(w.deleted || w._deleted)) {
if (!this.core.$$isFileSizeExceeded(w.size)) {
if (!this.services.vault.isFileSizeTooLarge(w.size)) {
// Prevent applying the conflicted state to the storage.
if ((w._conflicts?.length ?? 0) > 0) {
this._log(`UPDATE STORAGE: ${path} has conflicts. skipped (x)`, LOG_LEVEL_INFO);
@@ -240,7 +250,10 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
this._log(`SYNC DATABASE AND STORAGE: ${file.path} has conflicts. skipped`, LOG_LEVEL_INFO);
return;
}
if (!this.core.$$isFileSizeExceeded(file.stat.size) && !this.core.$$isFileSizeExceeded(doc.size)) {
if (
!this.services.vault.isFileSizeTooLarge(file.stat.size) &&
!this.services.vault.isFileSizeTooLarge(doc.size)
) {
await this.syncFileBetweenDBandStorage(file, doc);
} else {
this._log(
@@ -261,6 +274,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
if (showingNotice) {
this._log("Initialize done!", LOG_LEVEL_NOTICE, "syncAll");
}
return true;
}
async syncFileBetweenDBandStorage(file: UXFileInfoStub, doc: MetaEntry) {
@@ -279,7 +293,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
const compareResult = compareFileFreshness(file, doc);
switch (compareResult) {
case BASE_IS_NEW:
if (!this.core.$$isFileSizeExceeded(file.stat.size)) {
if (!this.services.vault.isFileSizeTooLarge(file.stat.size)) {
this._log("STORAGE -> DB :" + file.path);
await this.core.fileHandler.storeFileToDB(file);
} else {
@@ -290,7 +304,7 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
}
break;
case TARGET_IS_NEW:
if (!this.core.$$isFileSizeExceeded(doc.size)) {
if (!this.services.vault.isFileSizeTooLarge(doc.size)) {
this._log("STORAGE <- DB :" + file.path);
if (await this.core.fileHandler.dbToStorage(doc, stripAllPrefixes(file.path), true)) {
eventHub.emitEvent("event-file-changed", {
@@ -355,23 +369,31 @@ export class ModuleInitializerFile extends AbstractModule implements ICoreModule
this._log(`Checking expired file history done`);
}
async $$initializeDatabase(showingNotice: boolean = false, reopenDatabase = true): Promise<boolean> {
this.core.$$resetIsReady();
if (!reopenDatabase || (await this.core.$$openDatabase())) {
private async _initializeDatabase(
showingNotice: boolean = false,
reopenDatabase = true,
ignoreSuspending: boolean = false
): Promise<boolean> {
this.services.appLifecycle.resetIsReady();
if (!reopenDatabase || (await this.services.database.openDatabase())) {
if (this.localDatabase.isReady) {
await this.core.$$performFullScan(showingNotice);
await this.services.vault.scanVault(showingNotice, ignoreSuspending);
}
if (!(await this.core.$everyOnDatabaseInitialized(showingNotice))) {
this._log(`Initializing database has been failed on some module`, LOG_LEVEL_NOTICE);
if (!(await this.services.databaseEvents.onDatabaseInitialised(showingNotice))) {
this._log(`Initializing database has been failed on some module!`, LOG_LEVEL_NOTICE);
return false;
}
this.core.$$markIsReady();
this.services.appLifecycle.markIsReady();
// run queued event once.
await this.core.$everyCommitPendingFileEvent();
await this.services.fileProcessing.commitPendingFileEvents();
return true;
} else {
this.core.$$resetIsReady();
this.services.appLifecycle.resetIsReady();
return false;
}
}
onBindFunction(core: LiveSyncCore, services: InjectableServiceHub): void {
services.databaseEvents.handleInitialiseDatabase(this._initializeDatabase.bind(this));
services.vault.handleScanVault(this._performFullScan.bind(this));
}
}

View File
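`ModuleInitializerFile` above no longer implements `$$`-prefixed core methods; instead it registers its private implementations on the service hub in `onBindFunction` (for example `services.vault.handleScanVault(this._performFullScan.bind(this))`), and other modules call through `services.vault.scanVault(...)`. The sketch below shows the register-then-call-through shape of that pattern; the `VaultServiceSketch` class and its error message are illustrative, not the actual `InjectableServiceHub` API.

```ts
// Sketch of the "handleX registers, X calls through" pattern used by the service hub.
type ScanVault = (showingNotice?: boolean, ignoreSuspending?: boolean) => Promise<boolean>;

class VaultServiceSketch {
    private scanVaultHandler?: ScanVault;

    // A module binds its private implementation exactly once.
    handleScanVault(handler: ScanVault) {
        this.scanVaultHandler = handler;
    }

    // Everyone else calls the service without knowing which module implements it.
    scanVault(showingNotice = false, ignoreSuspending = false): Promise<boolean> {
        if (!this.scanVaultHandler) throw new Error("scanVault handler has not been bound");
        return this.scanVaultHandler(showingNotice, ignoreSuspending);
    }
}

const vault = new VaultServiceSketch();
vault.handleScanVault(async (showingNotice) => {
    console.log(`scanning vault (notice: ${showingNotice ?? false})`);
    return true;
});
void vault.scanVault(true);
```

This keeps modules decoupled: callers depend on the service interface, and the implementing module can change (or be replaced in tests) without touching call sites.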

@@ -3,9 +3,9 @@ import { OpenKeyValueDatabase } from "../../common/KeyValueDB.ts";
import type { LiveSyncLocalDB } from "../../lib/src/pouchdb/LiveSyncLocalDB.ts";
import { LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleKeyValueDB extends AbstractModule implements ICoreModule {
export class ModuleKeyValueDB extends AbstractModule {
tryCloseKvDB() {
try {
this.core.kvDB?.close();
@@ -22,7 +22,7 @@ export class ModuleKeyValueDB extends AbstractModule implements ICoreModule {
this.tryCloseKvDB();
await delay(10);
await yieldMicrotask();
this.core.kvDB = await OpenKeyValueDatabase(this.core.$$getVaultName() + "-livesync-kv");
this.core.kvDB = await OpenKeyValueDatabase(this.services.vault.getVaultName() + "-livesync-kv");
await yieldMicrotask();
await delay(100);
} catch (e) {
@@ -33,21 +33,23 @@ export class ModuleKeyValueDB extends AbstractModule implements ICoreModule {
}
return true;
}
$allOnDBUnload(db: LiveSyncLocalDB): void {
_onDBUnload(db: LiveSyncLocalDB) {
if (this.core.kvDB) this.core.kvDB.close();
return Promise.resolve(true);
}
$allOnDBClose(db: LiveSyncLocalDB): void {
_onDBClose(db: LiveSyncLocalDB) {
if (this.core.kvDB) this.core.kvDB.close();
return Promise.resolve(true);
}
async $everyOnloadAfterLoadSettings(): Promise<boolean> {
private async _everyOnloadAfterLoadSettings(): Promise<boolean> {
if (!(await this.openKeyValueDB())) {
return false;
}
this.core.simpleStore = this.core.$$getSimpleStore<any>("os");
this.core.simpleStore = this.services.database.openSimpleStore<any>("os");
return Promise.resolve(true);
}
$$getSimpleStore<T>(kind: string) {
_getSimpleStore<T>(kind: string) {
const prefix = `${kind}-`;
return {
get: async (key: string): Promise<T> => {
@@ -75,18 +77,18 @@ export class ModuleKeyValueDB extends AbstractModule implements ICoreModule {
},
};
}
$everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
_everyOnInitializeDatabase(db: LiveSyncLocalDB): Promise<boolean> {
return this.openKeyValueDB();
}
async $everyOnResetDatabase(db: LiveSyncLocalDB): Promise<boolean> {
async _everyOnResetDatabase(db: LiveSyncLocalDB): Promise<boolean> {
try {
const kvDBKey = "queued-files";
await this.core.kvDB.del(kvDBKey);
// localStorage.removeItem(lsKey);
await this.core.kvDB.destroy();
await yieldMicrotask();
this.core.kvDB = await OpenKeyValueDatabase(this.core.$$getVaultName() + "-livesync-kv");
this.core.kvDB = await OpenKeyValueDatabase(this.services.vault.getVaultName() + "-livesync-kv");
await delay(100);
} catch (e) {
this.core.kvDB = undefined!;
@@ -96,4 +98,12 @@ export class ModuleKeyValueDB extends AbstractModule implements ICoreModule {
}
return true;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.databaseEvents.handleOnUnloadDatabase(this._onDBUnload.bind(this));
services.databaseEvents.handleOnCloseDatabase(this._onDBClose.bind(this));
services.databaseEvents.handleOnDatabaseInitialisation(this._everyOnInitializeDatabase.bind(this));
services.databaseEvents.handleOnResetDatabase(this._everyOnResetDatabase.bind(this));
services.database.handleOpenSimpleStore(this._getSimpleStore.bind(this));
services.appLifecycle.handleOnSettingLoaded(this._everyOnloadAfterLoadSettings.bind(this));
}
}

View File
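`ModuleKeyValueDB` above exposes `_getSimpleStore<T>(kind)`, a thin wrapper that namespaces every key with a `${kind}-` prefix on top of the key-value database. A rough sketch of that shape, backed by an in-memory `Map` purely for illustration (the real store delegates to `kvDB`):

```ts
// Sketch of a prefix-scoped key-value store.
type SimpleStoreSketch<T> = {
    get(key: string): Promise<T | undefined>;
    set(key: string, value: T): Promise<void>;
    del(key: string): Promise<void>;
    keys(): Promise<string[]>;
};

function getSimpleStoreSketch<T>(kind: string, backend = new Map<string, T>()): SimpleStoreSketch<T> {
    const prefix = `${kind}-`;
    return {
        get: (key) => Promise.resolve(backend.get(prefix + key)),
        set: (key, value) => {
            backend.set(prefix + key, value);
            return Promise.resolve();
        },
        del: (key) => {
            backend.delete(prefix + key);
            return Promise.resolve();
        },
        // Return only the keys in this namespace, with the prefix stripped.
        keys: () =>
            Promise.resolve(
                [...backend.keys()].filter((k) => k.startsWith(prefix)).map((k) => k.slice(prefix.length))
            ),
    };
}

const store = getSimpleStoreSketch<number>("os");
void store.set("last-run", Date.now());
```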

@@ -1,18 +1,32 @@
import { LOG_LEVEL_NOTICE } from "octagonal-wheels/common/logger";
import { LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, Logger } from "../../lib/src/common/logger.ts";
import {
EVENT_REQUEST_OPEN_P2P,
EVENT_REQUEST_OPEN_SETTING_WIZARD,
EVENT_REQUEST_OPEN_SETTINGS,
EVENT_REQUEST_OPEN_SETUP_URI,
EVENT_REQUEST_RUN_DOCTOR,
EVENT_REQUEST_RUN_FIX_INCOMPLETE,
eventHub,
} from "../../common/events.ts";
import { AbstractModule } from "../AbstractModule.ts";
import type { ICoreModule } from "../ModuleTypes.ts";
import { $msg } from "src/lib/src/common/i18n.ts";
import { performDoctorConsultation, RebuildOptions } from "../../lib/src/common/configForDoc.ts";
import { getPath, isValidPath } from "../../common/utils.ts";
import { isMetaEntry } from "../../lib/src/common/types.ts";
import { isDeletedEntry, isDocContentSame, isLoadedEntry, readAsBlob } from "../../lib/src/common/utils.ts";
import { countCompromisedChunks } from "../../lib/src/pouchdb/negotiation.ts";
import type { LiveSyncCore } from "../../main.ts";
import { SetupManager } from "../features/SetupManager.ts";
export class ModuleMigration extends AbstractModule implements ICoreModule {
type ErrorInfo = {
path: string;
recordedSize: number;
actualSize: number;
storageSize: number;
contentMatched: boolean;
isConflicted?: boolean;
};
export class ModuleMigration extends AbstractModule {
async migrateUsingDoctor(skipRebuild: boolean = false, activateReason = "updated", forceRescan = false) {
const { shouldRebuild, shouldRebuildLocal, isModified, settings } = await performDoctorConsultation(
this.core,
@@ -31,12 +45,15 @@ export class ModuleMigration extends AbstractModule implements ICoreModule {
if (!skipRebuild) {
if (shouldRebuild) {
await this.core.rebuilder.scheduleRebuild();
await this.core.$$performRestart();
this.services.appLifecycle.performRestart();
return false;
} else if (shouldRebuildLocal) {
await this.core.rebuilder.scheduleFetch();
await this.core.$$performRestart();
this.services.appLifecycle.performRestart();
return false;
}
}
return true;
}
async migrateDisableBulkSend() {
@@ -49,6 +66,9 @@ export class ModuleMigration extends AbstractModule implements ICoreModule {
}
async initialMessage() {
const manager = this.core.getModule(SetupManager);
return await manager.startOnBoarding();
/*
const message = $msg("moduleMigration.msgInitialSetup", {
URI_DOC: $msg("moduleMigration.docUri"),
});
@@ -66,6 +86,7 @@ export class ModuleMigration extends AbstractModule implements ICoreModule {
return true;
}
return false;
*/
}
async askAgainForSetupURI() {
@@ -96,30 +117,245 @@ export class ModuleMigration extends AbstractModule implements ICoreModule {
return false;
}
async $everyOnFirstInitialize(): Promise<boolean> {
async hasIncompleteDocs(force: boolean = false): Promise<boolean> {
const incompleteDocsChecked = (await this.core.kvDB.get<boolean>("checkIncompleteDocs")) || false;
if (incompleteDocsChecked && !force) {
this._log("Incomplete docs check already done, skipping.", LOG_LEVEL_VERBOSE);
return Promise.resolve(true);
}
this._log("Checking for incomplete documents...", LOG_LEVEL_NOTICE, "check-incomplete");
const errorFiles = [] as ErrorInfo[];
for await (const metaDoc of this.localDatabase.findAllNormalDocs({ conflicts: true })) {
const path = getPath(metaDoc);
if (!isValidPath(path)) {
continue;
}
if (!(await this.services.vault.isTargetFile(path, true))) {
continue;
}
if (!isMetaEntry(metaDoc)) {
continue;
}
const doc = await this.localDatabase.getDBEntryFromMeta(metaDoc);
if (!doc || !isLoadedEntry(doc)) {
continue;
}
if (isDeletedEntry(doc)) {
continue;
}
const isConflicted = metaDoc?._conflicts && metaDoc._conflicts.length > 0;
let storageFileContent;
try {
storageFileContent = await this.core.storageAccess.readHiddenFileBinary(path);
} catch (e) {
Logger(`Failed to read file ${path}: Possibly unprocessed or missing`);
Logger(e, LOG_LEVEL_VERBOSE);
continue;
}
// const storageFileBlob = createBlob(storageFileContent);
const sizeOnStorage = storageFileContent.byteLength;
const recordedSize = doc.size;
const docBlob = readAsBlob(doc);
const actualSize = docBlob.size;
if (
recordedSize !== actualSize ||
sizeOnStorage !== actualSize ||
sizeOnStorage !== recordedSize ||
isConflicted
) {
const contentMatched = await isDocContentSame(doc.data, storageFileContent);
errorFiles.push({
path,
recordedSize,
actualSize,
storageSize: sizeOnStorage,
contentMatched,
isConflicted,
});
Logger(
`Size mismatch for ${path}: ${recordedSize} (DB Recorded) , ${actualSize} (DB Stored) , ${sizeOnStorage} (Storage Stored), ${contentMatched ? "Content Matched" : "Content Mismatched"} ${isConflicted ? "Conflicted" : "Not Conflicted"}`
);
}
}
if (errorFiles.length == 0) {
Logger("No size mismatches found", LOG_LEVEL_NOTICE);
await this.core.kvDB.set("checkIncompleteDocs", true);
return Promise.resolve(true);
}
Logger(`Found ${errorFiles.length} size mismatches`, LOG_LEVEL_NOTICE);
// We have to repair them according to the following rules and situations:
// A. DB Recorded != DB Stored
// A.1. DB Recorded == Storage Stored
// Possibly recoverable from storage. Just overwrite the DB content with the storage content.
// A.2. Neither matches
// This probably cannot be resolved on this device. Even if the storage content is larger than DB Recorded, it is possibly corrupted.
// We do not fix it automatically. Leave it as is; another device may be able to resolve it.
// B. DB Recorded == DB Stored, < Storage Stored
// Very fragile: if the DB Recorded size is less than the Storage Stored size, we can possibly repair the content (the issue was an `unexpectedly shortened file`).
// We do not fix it automatically, but it will be overwritten automatically by another process.
// C. DB Recorded == DB Stored, > Storage Stored
// Probably restored by the user by resolving A or B on another device; we should overwrite the storage.
// We also do not fix it automatically. It should be overwritten by replication.
const recoverable = errorFiles.filter((e) => {
return e.recordedSize === e.storageSize && !e.isConflicted;
});
const unrecoverable = errorFiles.filter((e) => {
return e.recordedSize !== e.storageSize || e.isConflicted;
});
const fileInfo = (e: (typeof errorFiles)[0]) => {
return `${e.path} (M: ${e.recordedSize}, A: ${e.actualSize}, S: ${e.storageSize}) ${e.isConflicted ? "(Conflicted)" : ""}`;
};
const messageUnrecoverable =
unrecoverable.length > 0
? $msg("moduleMigration.fix0256.messageUnrecoverable", {
filesNotRecoverable: unrecoverable.map((e) => `- ${fileInfo(e)}`).join("\n"),
})
: "";
const message = $msg("moduleMigration.fix0256.message", {
files: recoverable.map((e) => `- ${fileInfo(e)}`).join("\n"),
messageUnrecoverable,
});
const CHECK_IT_LATER = $msg("moduleMigration.fix0256.buttons.checkItLater");
const FIX = $msg("moduleMigration.fix0256.buttons.fix");
const DISMISS = $msg("moduleMigration.fix0256.buttons.DismissForever");
const ret = await this.core.confirm.askSelectStringDialogue(message, [CHECK_IT_LATER, FIX, DISMISS], {
title: $msg("moduleMigration.fix0256.title"),
defaultAction: CHECK_IT_LATER,
});
if (ret == FIX) {
for (const file of recoverable) {
// Overwrite the database with the files on the storage
const stubFile = this.core.storageAccess.getFileStub(file.path);
if (stubFile == null) {
Logger(`Could not find stub file for ${file.path}`, LOG_LEVEL_NOTICE);
continue;
}
stubFile.stat.mtime = Date.now();
const result = await this.core.fileHandler.storeFileToDB(stubFile, true, false);
if (result) {
Logger(`Successfully restored ${file.path} from storage`);
} else {
Logger(`Failed to restore ${file.path} from storage`, LOG_LEVEL_NOTICE);
}
}
} else if (ret === DISMISS) {
// User chose to dismiss the issue
await this.core.kvDB.set("checkIncompleteDocs", true);
}
return Promise.resolve(true);
}
async hasCompromisedChunks(): Promise<boolean> {
Logger(`Checking for compromised chunks...`, LOG_LEVEL_VERBOSE);
if (!this.settings.encrypt) {
// If not encrypted, we do not need to check for compromised chunks.
return true;
}
// Check local database for compromised chunks
const localCompromised = await countCompromisedChunks(this.localDatabase.localDatabase);
const remote = this.services.replicator.getActiveReplicator();
const remoteCompromised = this.core.managers.networkManager.isOnline
? await remote?.countCompromisedChunks()
: 0;
if (localCompromised === false) {
Logger(`Failed to count compromised chunks in local database`, LOG_LEVEL_NOTICE);
return false;
}
if (remoteCompromised === false) {
Logger(`Failed to count compromised chunks in remote database`, LOG_LEVEL_NOTICE);
return false;
}
if (remoteCompromised === 0 && localCompromised === 0) {
return true;
}
Logger(
`Found compromised chunks : ${localCompromised} in local, ${remoteCompromised} in remote`,
LOG_LEVEL_NOTICE
);
const title = $msg("moduleMigration.insecureChunkExist.title");
const msg = $msg("moduleMigration.insecureChunkExist.message");
const REBUILD = $msg("moduleMigration.insecureChunkExist.buttons.rebuild");
const FETCH = $msg("moduleMigration.insecureChunkExist.buttons.fetch");
const DISMISS = $msg("moduleMigration.insecureChunkExist.buttons.later");
const buttons = [REBUILD, FETCH, DISMISS];
if (remoteCompromised != 0) {
buttons.splice(buttons.indexOf(FETCH), 1);
}
const result = await this.core.confirm.askSelectStringDialogue(msg, buttons, {
title,
defaultAction: DISMISS,
timeout: 0,
});
if (result === REBUILD) {
// Rebuild the database
await this.core.rebuilder.scheduleRebuild();
this.services.appLifecycle.performRestart();
return false;
} else if (result === FETCH) {
// Fetch the latest data from remote
await this.core.rebuilder.scheduleFetch();
this.services.appLifecycle.performRestart();
return false;
} else {
// User chose to dismiss the issue
this._log($msg("moduleMigration.insecureChunkExist.laterMessage"), LOG_LEVEL_NOTICE);
}
return true;
}
async _everyOnFirstInitialize(): Promise<boolean> {
if (!this.localDatabase.isReady) {
this._log($msg("moduleMigration.logLocalDatabaseNotReady"), LOG_LEVEL_NOTICE);
return false;
}
if (this.settings.isConfigured) {
await this.migrateUsingDoctor(false);
if (!(await this.hasCompromisedChunks())) {
return false;
}
if (!(await this.hasIncompleteDocs())) {
return false;
}
if (!(await this.migrateUsingDoctor(false))) {
return false;
}
// await this.migrationCheck();
await this.migrateDisableBulkSend();
}
if (!this.settings.isConfigured) {
// Case sensitivity
if (!(await this.initialMessage()) || !(await this.askAgainForSetupURI())) {
// if (!(await this.initialMessage()) || !(await this.askAgainForSetupURI())) {
// this._log($msg("moduleMigration.logSetupCancelled"), LOG_LEVEL_NOTICE);
// return false;
// }
if (!(await this.initialMessage())) {
this._log($msg("moduleMigration.logSetupCancelled"), LOG_LEVEL_NOTICE);
return false;
}
await this.migrateUsingDoctor(true);
if (!(await this.migrateUsingDoctor(true))) {
return false;
}
}
return true;
}
$everyOnLayoutReady(): Promise<boolean> {
_everyOnLayoutReady(): Promise<boolean> {
eventHub.onEvent(EVENT_REQUEST_RUN_DOCTOR, async (reason) => {
await this.migrateUsingDoctor(false, reason, true);
});
eventHub.onEvent(EVENT_REQUEST_RUN_FIX_INCOMPLETE, async () => {
await this.hasIncompleteDocs(true);
});
return Promise.resolve(true);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
super.onBindFunction(core, services);
services.appLifecycle.handleLayoutReady(this._everyOnLayoutReady.bind(this));
services.appLifecycle.handleFirstInitialise(this._everyOnFirstInitialize.bind(this));
}
}

View File
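The recoverability rules spelled out in `hasIncompleteDocs` above reduce to the two filters in the diff: a mismatching document can be auto-repaired from storage only when the size recorded in the database equals the size on storage and the document is not conflicted. A compact sketch of that classification, reusing the `ErrorInfo` field names from the diff:

```ts
// Sketch of the recoverable / unrecoverable split used before offering the "Fix" action.
type ErrorInfoSketch = {
    path: string;
    recordedSize: number; // size recorded in the DB metadata
    actualSize: number; // size of the content actually stored in the DB
    storageSize: number; // size of the file on storage
    isConflicted?: boolean;
};

function classify(errors: ErrorInfoSketch[]) {
    const recoverable = errors.filter((e) => e.recordedSize === e.storageSize && !e.isConflicted);
    const unrecoverable = errors.filter((e) => e.recordedSize !== e.storageSize || e.isConflicted);
    return { recoverable, unrecoverable };
}

// Case A.1 (DB record matches storage) can be repaired by rewriting the DB from storage;
// case A.2 and conflicted files are left for another device or replication to resolve.
const { recoverable, unrecoverable } = classify([
    { path: "a.md", recordedSize: 100, actualSize: 60, storageSize: 100 }, // A.1
    { path: "b.md", recordedSize: 100, actualSize: 60, storageSize: 80 }, // A.2
    { path: "c.md", recordedSize: 100, actualSize: 100, storageSize: 100, isConflicted: true },
]);
console.log(recoverable.map((e) => e.path), unrecoverable.map((e) => e.path));
```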

@@ -1,7 +1,14 @@
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { LOG_LEVEL_DEBUG, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import {
LEVEL_INFO,
LEVEL_NOTICE,
LOG_LEVEL_DEBUG,
LOG_LEVEL_NOTICE,
LOG_LEVEL_VERBOSE,
type LOG_LEVEL,
} from "octagonal-wheels/common/logger";
import { Notice, requestUrl, type RequestUrlParam, type RequestUrlResponse } from "../../deps.ts";
import { type CouchDBCredentials, type EntryDoc, type FilePathWithPrefix } from "../../lib/src/common/types.ts";
import { type CouchDBCredentials, type EntryDoc, type FilePath } from "../../lib/src/common/types.ts";
import { getPathFromTFile } from "../../common/utils.ts";
import { isCloudantURI, isValidRemoteCouchDBURI } from "../../lib/src/pouchdb/utils_couchdb.ts";
import { replicationFilter } from "@/lib/src/pouchdb/compress.ts";
@@ -11,6 +18,8 @@ import { setNoticeClass } from "../../lib/src/mock_and_interop/wrapper.ts";
import { ObsHttpHandler } from "./APILib/ObsHttpHandler.ts";
import { PouchDB } from "../../lib/src/pouchdb/pouchdb-browser.ts";
import { AuthorizationHeaderGenerator } from "../../lib/src/replication/httplib.ts";
import type { LiveSyncCore } from "../../main.ts";
import { EVENT_ON_UNRESOLVED_ERROR, eventHub } from "../../common/events.ts";
setNoticeClass(Notice);
@@ -19,21 +28,34 @@ async function fetchByAPI(request: RequestUrlParam, errorAsResult = false): Prom
return ret;
}
export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidianModule {
export class ModuleObsidianAPI extends AbstractObsidianModule {
_customHandler!: ObsHttpHandler;
_authHeader = new AuthorizationHeaderGenerator();
_previousErrors = new Set<string>();
showError(msg: string, max_log_level: LOG_LEVEL = LEVEL_NOTICE) {
const level = this._previousErrors.has(msg) ? LEVEL_INFO : max_log_level;
this._log(msg, level);
if (!this._previousErrors.has(msg)) {
this._previousErrors.add(msg);
eventHub.emitEvent(EVENT_ON_UNRESOLVED_ERROR);
}
}
clearErrors() {
this._previousErrors.clear();
eventHub.emitEvent(EVENT_ON_UNRESOLVED_ERROR);
}
last_successful_post = false;
$$customFetchHandler(): ObsHttpHandler {
_customFetchHandler(): ObsHttpHandler {
if (!this._customHandler) this._customHandler = new ObsHttpHandler(undefined, undefined);
return this._customHandler;
}
$$getLastPostFailedBySize(): boolean {
_getLastPostFailedBySize(): boolean {
return !this.last_successful_post;
}
async _fetchByAPI(url: string, authHeader: string, opts?: RequestInit): Promise<Response> {
async __fetchByAPI(url: string, authHeader: string, opts?: RequestInit): Promise<Response> {
const body = opts?.body as string;
const transformedHeaders = { ...(opts?.headers as Record<string, string>) };
@@ -68,7 +90,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
const body = opts?.body as string;
const size = body ? ` (${body.length})` : "";
try {
const r = await this._fetchByAPI(url, authHeader, opts);
const r = await this.__fetchByAPI(url, authHeader, opts);
this.plugin.requestCount.value = this.plugin.requestCount.value + 1;
if (method == "POST" || method == "PUT") {
this.last_successful_post = r.status - (r.status % 100) == 200;
@@ -90,7 +112,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
}
}
async $$connectRemoteCouchDB(
async _connectRemoteCouchDB(
uri: string,
auth: CouchDBCredentials,
disableRequestURI: boolean,
@@ -101,11 +123,14 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
compression: boolean,
customHeaders: Record<string, string>,
useRequestAPI: boolean,
getPBKDF2Salt: () => Promise<Uint8Array>
getPBKDF2Salt: () => Promise<Uint8Array<ArrayBuffer>>
): Promise<string | { db: PouchDB.Database<EntryDoc>; info: PouchDB.Core.DatabaseInfo }> {
if (!isValidRemoteCouchDBURI(uri)) return "Remote URI is not valid";
if (uri.toLowerCase() != uri) return "Remote URI and database name must not contain capital letters.";
if (uri.indexOf(" ") !== -1) return "Remote URI and database name must not contain spaces.";
if (!this.core.managers.networkManager.isOnline) {
return "Network is offline";
}
// let authHeader = await this._authHeader.getAuthorizationHeader(auth);
const conf: PouchDB.HttpAdapter.HttpAdapterConfiguration = {
@@ -145,7 +170,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
try {
this.plugin.requestCount.value = this.plugin.requestCount.value + 1;
const response: Response = await (useRequestAPI
? this._fetchByAPI(url.toString(), authHeader, { ...opts, headers })
? this.__fetchByAPI(url.toString(), authHeader, { ...opts, headers })
: fetch(url, { ...opts, headers }));
if (method == "POST" || method == "PUT") {
this.last_successful_post = response.ok;
@@ -176,6 +201,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
}
}
}
this.clearErrors();
return response;
} catch (ex) {
if (ex instanceof TypeError) {
@@ -191,7 +217,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
headers,
});
if (resp2.status / 100 == 2) {
this._log(
this.showError(
"The request was successful by API. But the native fetch API failed! Please check CORS settings on the remote database!. While this condition, you cannot enable LiveSync",
LOG_LEVEL_NOTICE
);
@@ -199,7 +225,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
}
const r2 = resp2.clone();
const msg = await r2.text();
this._log(`Failed to fetch by API. ${resp2.status}: ${msg}`, LOG_LEVEL_NOTICE);
this.showError(`Failed to fetch by API. ${resp2.status}: ${msg}`, LOG_LEVEL_NOTICE);
return resp2;
}
throw ex;
@@ -207,7 +233,7 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
} catch (ex: any) {
this._log(`HTTP:${method}${size} to:${localURL} -> failed`, LOG_LEVEL_VERBOSE);
const msg = ex instanceof Error ? `${ex?.name}:${ex?.message}` : ex?.toString();
this._log(`Failed to fetch: ${msg}`, LOG_LEVEL_NOTICE);
this.showError(`Failed to fetch: ${msg}`); // Do not show notice, due to throwing below
this._log(ex, LOG_LEVEL_VERBOSE);
// limit only in bulk_docs.
if (url.toString().indexOf("_bulk_docs") !== -1) {
@@ -249,21 +275,21 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
}
}
$$isMobile(): boolean {
_isMobile(): boolean {
//@ts-ignore : internal API
return this.app.isMobile;
}
$$vaultName(): string {
_vaultName(): string {
return this.app.vault.getName();
}
$$getVaultName(): string {
_getVaultName(): string {
return (
this.core.$$vaultName() +
this.services.vault.vaultName() +
(this.settings?.additionalSuffixOfDatabaseName ? "-" + this.settings.additionalSuffixOfDatabaseName : "")
);
}
$$getActiveFilePath(): FilePathWithPrefix | undefined {
_getActiveFilePath(): FilePath | undefined {
const file = this.app.workspace.getActiveFile();
if (file) {
return getPathFromTFile(file);
@@ -271,7 +297,23 @@ export class ModuleObsidianAPI extends AbstractObsidianModule implements IObsidi
return undefined;
}
$anyGetAppId(): Promise<string | undefined> {
return Promise.resolve(`${"appId" in this.app ? this.app.appId : ""}`);
_anyGetAppId(): string {
return `${"appId" in this.app ? this.app.appId : ""}`;
}
private _reportUnresolvedMessages(): Promise<string[]> {
return Promise.resolve([...this._previousErrors]);
}
onBindFunction(core: LiveSyncCore, services: typeof core.services) {
services.API.handleGetCustomFetchHandler(this._customFetchHandler.bind(this));
services.API.handleIsLastPostFailedDueToPayloadSize(this._getLastPostFailedBySize.bind(this));
services.remote.handleConnect(this._connectRemoteCouchDB.bind(this));
services.API.handleIsMobile(this._isMobile.bind(this));
services.vault.handleGetVaultName(this._getVaultName.bind(this));
services.vault.handleVaultName(this._vaultName.bind(this));
services.vault.handleGetActiveFilePath(this._getActiveFilePath.bind(this));
services.API.handleGetAppID(this._anyGetAppId.bind(this));
services.appLifecycle.reportUnresolvedMessages(this._reportUnresolvedMessages.bind(this));
}
}

View File
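`ModuleObsidianAPI` above introduces `showError` / `clearErrors`: the first occurrence of a failure message is surfaced at notice level, repeats are demoted, and a successful response clears the set so the next failure is loud again. A stripped-down sketch of that de-duplication; the `ErrorDeduper` name and the string log levels are illustrative only.

```ts
// Sketch of error de-duplication for repeated connection failures.
class ErrorDeduper {
    private previousErrors = new Set<string>();

    showError(msg: string, log: (msg: string, level: "notice" | "info") => void) {
        const level = this.previousErrors.has(msg) ? "info" : "notice";
        log(msg, level);
        this.previousErrors.add(msg);
    }

    clearErrors() {
        this.previousErrors.clear();
    }
}

const dedupe = new ErrorDeduper();
const log = (msg: string, level: string) => console.log(`[${level}] ${msg}`);
dedupe.showError("Failed to fetch: network down", log); // [notice]
dedupe.showError("Failed to fetch: network down", log); // [info] (already reported)
dedupe.clearErrors(); // e.g. after a successful request
dedupe.showError("Failed to fetch: network down", log); // [notice] again
```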

@@ -1,4 +1,4 @@
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { EVENT_FILE_RENAMED, EVENT_LEAF_ACTIVE_CHANGED, eventHub } from "../../common/events.js";
import { LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { scheduleTask } from "octagonal-wheels/concurrency/task";
@@ -12,9 +12,10 @@ import {
hiddenFilesEventCount,
hiddenFilesProcessingCount,
} from "../../lib/src/mock_and_interop/stores.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleObsidianEvents extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleObsidianEvents extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
// this.registerEvent(this.app.workspace.on("editor-change", ));
this.plugin.registerEvent(
this.app.vault.on("rename", (file, oldPath) => {
@@ -30,11 +31,11 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
return Promise.resolve(true);
}
$$performRestart(): void {
this._performAppReload();
private _performRestart(): void {
this.__performAppReload();
}
_performAppReload() {
__performAppReload() {
//@ts-ignore
this.app.commands.executeCommandById("app:reload");
}
@@ -49,14 +50,14 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
this.initialCallback = save;
saveCommandDefinition.callback = () => {
scheduleTask("syncOnEditorSave", 250, () => {
if (this.core.$$isUnloaded()) {
if (this.services.appLifecycle.hasUnloaded()) {
this._log("Unload and remove the handler.", LOG_LEVEL_VERBOSE);
saveCommandDefinition.callback = this.initialCallback;
this.initialCallback = undefined;
} else {
if (this.settings.syncOnEditorSave) {
this._log("Sync on Editor Save.", LOG_LEVEL_VERBOSE);
fireAndForget(() => this.core.$$replicateByEvent());
fireAndForget(() => this.services.replication.replicateByEvent());
}
}
});
@@ -106,14 +107,14 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
// TODO:FIXME AT V0.17.31, this logic has been disabled.
if (navigator.onLine && this.localDatabase.needScanning) {
this.localDatabase.needScanning = false;
await this.core.$$performFullScan();
await this.services.vault.scanVault();
}
}
async watchWindowVisibilityAsync() {
if (this.settings.suspendFileWatching) return;
if (!this.settings.isConfigured) return;
if (!this.core.$$isReady()) return;
if (!this.services.appLifecycle.isReady()) return;
if (this.isLastHidden && !this.hasFocus) {
// No-op while unfocused after being hidden.
@@ -126,22 +127,22 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
}
this.isLastHidden = isHidden;
await this.core.$everyCommitPendingFileEvent();
await this.services.fileProcessing.commitPendingFileEvents();
if (isHidden) {
await this.core.$everyBeforeSuspendProcess();
await this.services.appLifecycle.onSuspending();
} else {
// suspend all temporarily.
if (this.core.$$isSuspended()) return;
if (this.services.appLifecycle.isSuspended()) return;
if (!this.hasFocus) return;
await this.core.$everyOnResumeProcess();
await this.core.$everyAfterResumeProcess();
await this.services.appLifecycle.onResuming();
await this.services.appLifecycle.onResumed();
}
}
watchWorkspaceOpen(file: TFile | null) {
if (this.settings.suspendFileWatching) return;
if (!this.settings.isConfigured) return;
if (!this.core.$$isReady()) return;
if (!this.services.appLifecycle.isReady()) return;
if (!file) return;
scheduleTask("watch-workspace-open", 500, () => fireAndForget(() => this.watchWorkspaceOpenAsync(file)));
}
@@ -149,25 +150,25 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
async watchWorkspaceOpenAsync(file: TFile) {
if (this.settings.suspendFileWatching) return;
if (!this.settings.isConfigured) return;
if (!this.core.$$isReady()) return;
await this.core.$everyCommitPendingFileEvent();
if (!this.services.appLifecycle.isReady()) return;
await this.services.fileProcessing.commitPendingFileEvents();
if (file == null) {
return;
}
if (this.settings.syncOnFileOpen && !this.core.$$isSuspended()) {
await this.core.$$replicateByEvent();
if (this.settings.syncOnFileOpen && !this.services.appLifecycle.isSuspended()) {
await this.services.replication.replicateByEvent();
}
await this.core.$$queueConflictCheckIfOpen(file.path as FilePathWithPrefix);
await this.services.conflict.queueCheckForIfOpen(file.path as FilePathWithPrefix);
}
$everyOnLayoutReady(): Promise<boolean> {
_everyOnLayoutReady(): Promise<boolean> {
this.swapSaveCommand();
this.registerWatchEvents();
return Promise.resolve(true);
}
$$askReload(message?: string) {
if (this.core.$$isReloadingScheduled()) {
private _askReload(message?: string) {
if (this.services.appLifecycle.isReloadingScheduled()) {
this._log(`Reloading is already scheduled`, LOG_LEVEL_VERBOSE);
return;
}
@@ -181,13 +182,13 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
{ defaultAction: RETRY_LATER }
);
if (ret == RESTART_NOW) {
this._performAppReload();
this.__performAppReload();
} else if (ret == RESTART_AFTER_STABLE) {
this.core.$$scheduleAppReload();
this.services.appLifecycle.scheduleRestart();
}
});
}
$$scheduleAppReload() {
private _scheduleAppReload() {
if (!this.core._totalProcessingCount) {
const __tick = reactiveSource(0);
this.core._totalProcessingCount = reactive(() => {
@@ -224,7 +225,7 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
this.core._totalProcessingCount.onChanged((e) => {
if (e.value == 0) {
if (stableCheck-- <= 0) {
this._performAppReload();
this.__performAppReload();
}
this._log(
`Obsidian will be restarted soon! (Within ${stableCheck} seconds)`,
@@ -237,4 +238,11 @@ export class ModuleObsidianEvents extends AbstractObsidianModule implements IObs
});
}
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleLayoutReady(this._everyOnLayoutReady.bind(this));
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.appLifecycle.handlePerformRestart(this._performRestart.bind(this));
services.appLifecycle.handleAskRestart(this._askReload.bind(this));
services.appLifecycle.handleScheduleRestart(this._scheduleAppReload.bind(this));
}
}

View File
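`ModuleObsidianEvents` above wraps the editor save command in `scheduleTask("syncOnEditorSave", 250, ...)` so that a burst of saves triggers replication once. The sketch below is a hand-rolled stand-in for that debounce; restarting the window on every call is an assumption of this sketch, not a claim about the octagonal-wheels `scheduleTask`.

```ts
// Sketch of a keyed, delayed task: repeated scheduling under the same key
// collapses into a single execution after the delay.
const timers = new Map<string, ReturnType<typeof setTimeout>>();

function scheduleTaskSketch(key: string, delayMs: number, proc: () => void) {
    const existing = timers.get(key);
    if (existing) clearTimeout(existing); // restart the window (assumption of this sketch)
    timers.set(
        key,
        setTimeout(() => {
            timers.delete(key);
            proc();
        }, delayMs)
    );
}

// Pressing save three times in quick succession replicates once.
scheduleTaskSketch("syncOnEditorSave", 250, () => console.log("replicate by event"));
scheduleTaskSketch("syncOnEditorSave", 250, () => console.log("replicate by event"));
scheduleTaskSketch("syncOnEditorSave", 250, () => console.log("replicate by event"));
```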

@@ -1,11 +1,12 @@
import { fireAndForget } from "octagonal-wheels/promises";
import { addIcon, type Editor, type MarkdownFileInfo, type MarkdownView } from "../../deps.ts";
import { LOG_LEVEL_NOTICE, type FilePathWithPrefix } from "../../lib/src/common/types.ts";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { $msg } from "src/lib/src/common/i18n.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleObsidianMenu extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
// UI
addIcon(
"replicate",
@@ -18,21 +19,21 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
);
this.addRibbonIcon("replicate", $msg("moduleObsidianMenu.replicate"), async () => {
await this.core.$$replicate(true);
await this.services.replication.replicate(true);
}).addClass("livesync-ribbon-replicate");
this.addCommand({
id: "livesync-replicate",
name: "Replicate now",
callback: async () => {
await this.core.$$replicate();
await this.services.replication.replicate();
},
});
this.addCommand({
id: "livesync-dump",
name: "Dump information of this doc ",
callback: () => {
const file = this.core.$$getActiveFilePath();
const file = this.services.vault.getActiveFilePath();
if (!file) return;
fireAndForget(() => this.localDatabase.getDBEntry(file, {}, true, false));
},
@@ -43,7 +44,7 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
editorCallback: (editor: Editor, view: MarkdownView | MarkdownFileInfo) => {
const file = view.file;
if (!file) return;
void this.core.$$queueConflictCheckIfOpen(file.path as FilePathWithPrefix);
void this.services.conflict.queueCheckForIfOpen(file.path as FilePathWithPrefix);
},
});
@@ -58,23 +59,23 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
this.settings.liveSync = true;
this._log("LiveSync Enabled.", LOG_LEVEL_NOTICE);
}
await this.core.$$realizeSettingSyncMode();
await this.core.$$saveSettingData();
await this.services.setting.realiseSetting();
await this.services.setting.saveSettingData();
},
});
this.addCommand({
id: "livesync-suspendall",
name: "Toggle All Sync.",
callback: async () => {
if (this.core.$$isSuspended()) {
this.core.$$setSuspended(false);
if (this.services.appLifecycle.isSuspended()) {
this.services.appLifecycle.setSuspended(false);
this._log("Self-hosted LiveSync resumed", LOG_LEVEL_NOTICE);
} else {
this.core.$$setSuspended(true);
this.services.appLifecycle.setSuspended(true);
this._log("Self-hosted LiveSync suspended", LOG_LEVEL_NOTICE);
}
await this.core.$$realizeSettingSyncMode();
await this.core.$$saveSettingData();
await this.services.setting.realiseSetting();
await this.services.setting.saveSettingData();
},
});
@@ -82,7 +83,7 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
id: "livesync-scan-files",
name: "Scan storage and database again",
callback: async () => {
await this.core.$$performFullScan(true);
await this.services.vault.scanVault(true);
},
});
@@ -90,7 +91,7 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
id: "livesync-runbatch",
name: "Run pended batch processes",
callback: async () => {
await this.core.$everyCommitPendingFileEvent();
await this.services.fileProcessing.commitPendingFileEvents();
},
});
@@ -104,12 +105,15 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
});
return Promise.resolve(true);
}
$everyOnload(): Promise<boolean> {
this.app.workspace.onLayoutReady(this.core.$$onLiveSyncReady.bind(this.core));
private __onWorkspaceReady() {
void this.services.appLifecycle.onReady();
}
private _everyOnload(): Promise<boolean> {
this.app.workspace.onLayoutReady(this.__onWorkspaceReady.bind(this));
return Promise.resolve(true);
}
async $$showView(viewType: string) {
private async _showView(viewType: string) {
const leaves = this.app.workspace.getLeavesOfType(viewType);
if (leaves.length == 0) {
await this.app.workspace.getLeaf(true).setViewState({
@@ -126,4 +130,9 @@ export class ModuleObsidianMenu extends AbstractObsidianModule implements IObsid
await this.app.workspace.revealLeaf(leaves[0]);
}
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
services.API.handleShowWindow(this._showView.bind(this));
}
}

View File

@@ -1,12 +1,18 @@
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import type { LiveSyncCore } from "../../main.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
export class ModuleExtraSyncObsidian extends AbstractObsidianModule implements IObsidianModule {
export class ModuleExtraSyncObsidian extends AbstractObsidianModule {
deviceAndVaultName: string = "";
$$getDeviceAndVaultName(): string {
_getDeviceAndVaultName(): string {
return this.deviceAndVaultName;
}
$$setDeviceAndVaultName(name: string): void {
_setDeviceAndVaultName(name: string): void {
this.deviceAndVaultName = name;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.setting.handleGetDeviceAndVaultName(this._getDeviceAndVaultName.bind(this));
services.setting.handleSetDeviceAndVaultName(this._setDeviceAndVaultName.bind(this));
}
}

View File

@@ -1,15 +1,16 @@
import { delay, fireAndForget } from "octagonal-wheels/promises";
import { __onMissingTranslation } from "../../lib/src/common/i18n";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { eventHub } from "../../common/events";
import { enableTestFunction } from "./devUtil/testUtils.ts";
import { TestPaneView, VIEW_TYPE_TEST } from "./devUtil/TestPaneView.ts";
import { writable } from "svelte/store";
import type { FilePathWithPrefix } from "../../lib/src/common/types.ts";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleDev extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleDev extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
__onMissingTranslation(() => {});
return Promise.resolve(true);
}
@@ -35,7 +36,7 @@ export class ModuleDev extends AbstractObsidianModule implements IObsidianModule
}
}
$everyOnloadAfterLoadSettings(): Promise<boolean> {
private _everyOnloadAfterLoadSettings(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
this.onMissingTranslation = this.onMissingTranslation.bind(this);
__onMissingTranslation((key) => {
@@ -92,12 +93,12 @@ export class ModuleDev extends AbstractObsidianModule implements IObsidianModule
id: "view-test",
name: "Open Test dialogue",
callback: () => {
void this.core.$$showView(VIEW_TYPE_TEST);
void this.services.API.showWindow(VIEW_TYPE_TEST);
},
});
return Promise.resolve(true);
}
async $everyOnLayoutReady(): Promise<boolean> {
async _everyOnLayoutReady(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
// if (await this.core.storageAccess.isExistsIncludeHidden("_SHOWDIALOGAUTO.md")) {
// void this.core.$$showView(VIEW_TYPE_TEST);
@@ -121,7 +122,7 @@ export class ModuleDev extends AbstractObsidianModule implements IObsidianModule
},
});
if (w) {
const id = await this.core.$$path2id(filename as FilePathWithPrefix);
const id = await this.services.path.path2id(filename as FilePathWithPrefix);
const f = await this.core.localDatabase.getRaw(id);
console.log(f);
console.log(f._rev);
@@ -139,14 +140,14 @@ export class ModuleDev extends AbstractObsidianModule implements IObsidianModule
testResults = writable<[boolean, string, string][]>([]);
// testResults: string[] = [];
$$addTestResult(name: string, key: string, result: boolean, summary?: string, message?: string): void {
private _addTestResult(name: string, key: string, result: boolean, summary?: string, message?: string): void {
const logLine = `${name}: ${key} ${summary ?? ""}`;
this.testResults.update((results) => {
results.push([result, logLine, message ?? ""]);
return results;
});
}
$everyModuleTest(): Promise<boolean> {
private _everyModuleTest(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
// this.core.$$addTestResult("DevModule", "Test", true);
// return Promise.resolve(true);
@@ -155,4 +156,11 @@ export class ModuleDev extends AbstractObsidianModule implements IObsidianModule
// this.addTestResult("Test of test3", true);
return this.testDone();
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleLayoutReady(this._everyOnLayoutReady.bind(this));
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.appLifecycle.handleOnSettingLoaded(this._everyOnloadAfterLoadSettings.bind(this));
services.test.handleTest(this._everyModuleTest.bind(this));
services.test.handleAddTestResult(this._addTestResult.bind(this));
}
}

View File

@@ -1,9 +1,9 @@
import { delay } from "octagonal-wheels/promises";
import { LOG_LEVEL_NOTICE, REMOTE_MINIO, type FilePathWithPrefix } from "src/lib/src/common/types";
import { shareRunningResult } from "octagonal-wheels/concurrency/lock";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule";
import { AbstractObsidianModule } from "../AbstractObsidianModule";
export class ModuleIntegratedTest extends AbstractObsidianModule implements IObsidianModule {
export class ModuleIntegratedTest extends AbstractObsidianModule {
async waitFor(proc: () => Promise<boolean>, timeout = 10000): Promise<boolean> {
await delay(100);
const start = Date.now();
@@ -45,7 +45,7 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
}
return true;
}
async _orDie(key: string, proc: () => Promise<boolean>): Promise<true> | never {
async __orDie(key: string, proc: () => Promise<boolean>): Promise<true> | never {
if (!(await this._test(key, proc))) {
throw new Error(`${key}`);
}
@@ -54,7 +54,7 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
tryReplicate() {
if (!this.settings.liveSync) {
return shareRunningResult("replicate-test", async () => {
await this.core.$$replicate();
await this.services.replication.replicate();
});
}
}
@@ -64,13 +64,13 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
}
return await this.core.storageAccess.readHiddenFileText(file);
}
async _proceed(no: number, title: string): Promise<boolean> {
async __proceed(no: number, title: string): Promise<boolean> {
const stepFile = "_STEP.md" as FilePathWithPrefix;
const stepAckFile = "_STEP_ACK.md" as FilePathWithPrefix;
const stepContent = `Step ${no}`;
await this.core.$anyResolveConflictByNewest(stepFile);
await this.services.conflict.resolveByNewest(stepFile);
await this.core.storageAccess.writeFileAuto(stepFile, stepContent);
await this._orDie(`Wait for acknowledge ${no}`, async () => {
await this.__orDie(`Wait for acknowledge ${no}`, async () => {
if (
!(await this.waitWithReplicating(async () => {
return await this.storageContentIsEqual(stepAckFile, stepContent);
@@ -81,13 +81,13 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
});
return true;
}
async _join(no: number, title: string): Promise<boolean> {
async __join(no: number, title: string): Promise<boolean> {
const stepFile = "_STEP.md" as FilePathWithPrefix;
const stepAckFile = "_STEP_ACK.md" as FilePathWithPrefix;
// const otherStepFile = `_STEP_${isLeader ? "R" : "L"}.md` as FilePathWithPrefix;
const stepContent = `Step ${no}`;
await this._orDie(`Wait for step ${no} (${title})`, async () => {
await this.__orDie(`Wait for step ${no} (${title})`, async () => {
if (
!(await this.waitWithReplicating(async () => {
return await this.storageContentIsEqual(stepFile, stepContent);
@@ -96,7 +96,7 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
return false;
return true;
});
await this.core.$anyResolveConflictByNewest(stepAckFile);
await this.services.conflict.resolveByNewest(stepAckFile);
await this.core.storageAccess.writeFileAuto(stepAckFile, stepContent);
await this.tryReplicate();
return true;
@@ -116,16 +116,16 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
check: () => Promise<boolean>;
}): Promise<boolean> {
if (isGameChanger) {
await this._proceed(step, title);
await this.__proceed(step, title);
try {
await proc();
} catch (e) {
this._log(`Error: ${e}`);
return false;
}
return await this._orDie(`Step ${step} - ${title}`, async () => await this.waitWithReplicating(check));
return await this.__orDie(`Step ${step} - ${title}`, async () => await this.waitWithReplicating(check));
} else {
return await this._join(step, title);
return await this.__join(step, title);
}
}
// // see scenario.md
@@ -151,7 +151,7 @@ export class ModuleIntegratedTest extends AbstractObsidianModule implements IObs
`Test as ${isLeader ? "Leader" : "Receiver"} command file ${testCommandFile}`
);
if (isLeader) {
await this._proceed(0, "start");
await this.__proceed(0, "start");
}
await this.tryReplicate();
@@ -424,9 +424,9 @@ Line4:D`;
await this._test("basic", async () => await this.nonLiveTestRunner(isLeader, (t, l) => this.testBasic(t, l)));
}
async $everyModuleTestMultiDevice(): Promise<boolean> {
async _everyModuleTestMultiDevice(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
const isLeader = this.core.$$vaultName().indexOf("recv") === -1;
const isLeader = this.core.services.vault.vaultName().indexOf("recv") === -1;
this.addTestResult("-------", true, `Test as ${isLeader ? "Leader" : "Receiver"}`);
try {
this._log(`Starting Test`);
@@ -440,4 +440,7 @@ Line4:D`;
return Promise.resolve(true);
}
onBindFunction(core: typeof this.core, services: typeof core.services): void {
services.test.handleTestMultiDevice(this._everyModuleTestMultiDevice.bind(this));
}
}
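
The `__proceed`/`__join` pair above coordinates two test vaults through replicated marker files: the leader writes `Step N` to `_STEP.md` and waits until `_STEP_ACK.md` carries the same content, while the receiver waits for `_STEP.md` and then writes the acknowledgement. A rough sketch of that handshake over an abstract store follows; `SharedStore`, `waitUntil`, `proceed`, and `join` are hypothetical names used only for illustration.

```typescript
// Hypothetical stand-in for the replicated storage the integration test uses.
interface SharedStore {
    read(path: string): Promise<string | undefined>;
    write(path: string, content: string): Promise<void>;
}

// Poll a condition until it holds or the timeout elapses.
async function waitUntil(check: () => Promise<boolean>, timeout = 10000, interval = 100): Promise<boolean> {
    const start = Date.now();
    while (Date.now() - start < timeout) {
        if (await check()) return true;
        await new Promise((resolve) => setTimeout(resolve, interval));
    }
    return false;
}

// Leader side: announce the step, then wait for the acknowledgement.
async function proceed(store: SharedStore, no: number): Promise<boolean> {
    const content = `Step ${no}`;
    await store.write("_STEP.md", content);
    return waitUntil(async () => (await store.read("_STEP_ACK.md")) === content);
}

// Receiver side: wait for the step announcement, then acknowledge it.
async function join(store: SharedStore, no: number): Promise<boolean> {
    const content = `Step ${no}`;
    if (!(await waitUntil(async () => (await store.read("_STEP.md")) === content))) return false;
    await store.write("_STEP_ACK.md", content);
    return true;
}
```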

View File

@@ -1,5 +1,5 @@
import { delay } from "octagonal-wheels/promises";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
import { eventHub } from "../../common/events";
import { getWebCrypto } from "../../lib/src/mods.ts";
@@ -8,6 +8,7 @@ import { parseYaml, requestUrl, stringifyYaml } from "obsidian";
import type { FilePath } from "../../lib/src/common/types.ts";
import { scheduleTask } from "octagonal-wheels/concurrency/task";
import { getFileRegExp } from "../../lib/src/common/utils.ts";
import type { LiveSyncCore } from "../../main.ts";
declare global {
interface LSEvents {
@@ -15,12 +16,15 @@ declare global {
}
}
export class ModuleReplicateTest extends AbstractObsidianModule implements IObsidianModule {
export class ModuleReplicateTest extends AbstractObsidianModule {
testRootPath = "_test/";
testInfoPath = "_testinfo/";
get isLeader() {
return this.core.$$getVaultName().indexOf("dev") >= 0 && this.core.$$vaultName().indexOf("recv") < 0;
return (
this.services.vault.getVaultName().indexOf("dev") >= 0 &&
this.services.vault.vaultName().indexOf("recv") < 0
);
}
get nameByKind() {
@@ -52,24 +56,24 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
async dumpList() {
if (this.settings.syncInternalFiles) {
this._log("Write file list (Include Hidden)");
await this._dumpFileListIncludeHidden("files.md");
await this.__dumpFileListIncludeHidden("files.md");
} else {
this._log("Write file list");
await this._dumpFileList("files.md");
await this.__dumpFileList("files.md");
}
}
async $everyBeforeReplicate(showMessage: boolean): Promise<boolean> {
async _everyBeforeReplicate(showMessage: boolean): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
await this.dumpList();
return true;
}
$everyOnloadAfterLoadSettings(): Promise<boolean> {
private _everyOnloadAfterLoadSettings(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
this.addCommand({
id: "dump-file-structure-normal",
name: `Dump Structure (Normal)`,
callback: () => {
void this._dumpFileList("files.md").finally(() => {
void this.__dumpFileList("files.md").finally(() => {
void this.refreshSyncStatus();
});
},
@@ -79,7 +83,7 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
name: "Dump Structure (Include Hidden)",
callback: () => {
const d = "files.md";
void this._dumpFileListIncludeHidden(d);
void this.__dumpFileListIncludeHidden(d);
},
});
this.addCommand({
@@ -160,7 +164,7 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
}
}
async _dumpFileList(outFile?: string) {
async __dumpFileList(outFile?: string) {
if (!this.core || !this.core.storageAccess) {
this._log("No storage access", LOG_LEVEL_INFO);
return;
@@ -169,7 +173,7 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
const out = [] as any[];
const webcrypto = await getWebCrypto();
for (const file of files) {
if (!(await this.core.$$isTargetFile(file.path))) {
if (!(await this.services.vault.isTargetFile(file.path))) {
continue;
}
if (file.path.startsWith(this.testInfoPath)) continue;
@@ -200,7 +204,7 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
this._log(`Dumped ${out.length} files`, LOG_LEVEL_INFO);
}
async _dumpFileListIncludeHidden(outFile?: string) {
async __dumpFileListIncludeHidden(outFile?: string) {
const ignorePatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
const targetPatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
const out = [] as any[];
@@ -316,7 +320,7 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
}
async testConflictedManually1() {
await this.core.$$replicate();
await this.services.replication.replicate();
const commonFile = `Resolve!
*****, the amazing chocolatier!!`;
@@ -325,8 +329,8 @@ export class ModuleReplicateTest extends AbstractObsidianModule implements IObsi
await this.core.storageAccess.writeHiddenFileAuto(this.testRootPath + "wonka.md", commonFile);
}
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
if (
(await this.core.confirm.askYesNoDialog("Ready to begin the test conflict Manually 1?", {
timeout: 30,
@@ -356,12 +360,12 @@ Charlie Bucket, Charlie Bucket, the amazing chocolatier!!`;
) {
return;
}
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
if (
!(await this.waitFor(async () => {
await this.core.$$replicate();
await this.services.replication.replicate();
return (
(await this.__assertStorageContent(
(this.testRootPath + "wonka.md") as FilePath,
@@ -379,7 +383,7 @@ Charlie Bucket, Charlie Bucket, the amazing chocolatier!!`;
}
async testConflictedManually2() {
await this.core.$$replicate();
await this.services.replication.replicate();
const commonFile = `Resolve To concatenate
ABCDEFG`;
@@ -388,8 +392,8 @@ ABCDEFG`;
await this.core.storageAccess.writeHiddenFileAuto(this.testRootPath + "concat.md", commonFile);
}
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
if (
(await this.core.confirm.askYesNoDialog("Ready to begin the test conflict Manually 2?", {
timeout: 30,
@@ -420,12 +424,12 @@ ABCDEFGHIJKLMNOPQRSTUVWXYZ`;
) {
return;
}
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
if (
!(await this.waitFor(async () => {
await this.core.$$replicate();
await this.services.replication.replicate();
return (
(await this.__assertStorageContent(
(this.testRootPath + "concat.md") as FilePath,
@@ -457,8 +461,8 @@ ABCDEFGHIJKLMNOPQRSTUVWXYZ`;
await this.core.storageAccess.writeHiddenFileAuto(this.testRootPath + "task.md", baseDoc);
}
await delay(100);
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
if (
(await this.core.confirm.askYesNoDialog("Ready to test conflict?", {
@@ -487,8 +491,8 @@ ABCDEFGHIJKLMNOPQRSTUVWXYZ`;
await this.core.storageAccess.writeHiddenFileAuto(this.testRootPath + "task.md", mod2Doc);
}
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
await delay(1000);
if (
(await this.core.confirm.askYesNoDialog("Ready to check result?", { timeout: 30, defaultOption: "Yes" })) ==
@@ -496,8 +500,8 @@ ABCDEFGHIJKLMNOPQRSTUVWXYZ`;
) {
return;
}
await this.core.$$replicate();
await this.core.$$replicate();
await this.services.replication.replicate();
await this.services.replication.replicate();
const mergedDoc = `Tasks!
- [ ] Task 1
- [v] Task 2
@@ -511,7 +515,7 @@ ABCDEFGHIJKLMNOPQRSTUVWXYZ`;
this._log("Before testing conflicted files, resolve all once", LOG_LEVEL_NOTICE);
await this.core.rebuilder.resolveAllConflictedFilesByNewerOnes();
await this.core.rebuilder.resolveAllConflictedFilesByNewerOnes();
await this.core.$$replicate();
await this.services.replication.replicate();
await delay(1000);
if (!(await this.testConflictAutomatic())) {
this._log("Conflict resolution (Auto) failed", LOG_LEVEL_NOTICE);
@@ -569,11 +573,16 @@ ABCDEFGHIJKLMNOPQRSTUVWXYZ`;
// return results;
// });
// }
async $everyModuleTestMultiDevice(): Promise<boolean> {
private async _everyModuleTestMultiDevice(): Promise<boolean> {
if (!this.settings.enableDebugTools) return Promise.resolve(true);
// this.core.$$addTestResult("DevModule", "Test", true);
// return Promise.resolve(true);
await this._test("Conflict resolution", async () => await this.checkConflictResolution());
return this.testDone();
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnSettingLoaded(this._everyOnloadAfterLoadSettings.bind(this));
services.replication.handleBeforeReplicate(this._everyBeforeReplicate.bind(this));
services.test.handleTestMultiDevice(this._everyModuleTestMultiDevice.bind(this));
}
}
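
`__dumpFileList`/`__dumpFileListIncludeHidden` above write a listing of the vault to `files.md` so two vaults can be compared after replication, and the module obtains WebCrypto for that purpose. As a hedged illustration (the exact fields written per file are not shown in this diff), a per-file content digest could be computed with the standard SubtleCrypto API like this:

```typescript
// Hedged sketch: compute a SHA-256 digest of file content with the standard
// WebCrypto API, e.g. as a per-file fingerprint in a dumped file list.
async function fingerprint(content: string): Promise<string> {
    const data = new TextEncoder().encode(content);
    const digest = await crypto.subtle.digest("SHA-256", data);
    return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
}

// Example: one line of a hypothetical files.md entry.
async function fileListLine(path: string, content: string): Promise<string> {
    return `- ${path}: ${await fingerprint(content)}`;
}
```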

View File

@@ -57,14 +57,14 @@
function moduleMultiDeviceTest() {
if (moduleTesting) return;
moduleTesting = true;
plugin.$everyModuleTestMultiDevice().finally(() => {
plugin.services.test.testMultiDevice().finally(() => {
moduleTesting = false;
});
}
function moduleSingleDeviceTest() {
if (moduleTesting) return;
moduleTesting = true;
plugin.$everyModuleTest().finally(() => {
plugin.services.test.test().finally(() => {
moduleTesting = false;
});
}
@@ -72,8 +72,8 @@
if (moduleTesting) return;
moduleTesting = true;
try {
await plugin.$everyModuleTest();
await plugin.$everyModuleTestMultiDevice();
await plugin.services.test.test();
await plugin.services.test.testMultiDevice();
} finally {
moduleTesting = false;
}

View File

@@ -1,5 +1,5 @@
import { fireAndForget } from "../../../lib/src/common/utils.ts";
import { serialized } from "../../../lib/src/concurrency/lock.ts";
import { serialized } from "octagonal-wheels/concurrency/lock";
import type ObsidianLiveSyncPlugin from "../../../main.ts";
let plugin: ObsidianLiveSyncPlugin;

View File

@@ -1,4 +1,4 @@
import { Trench } from "../../../lib/src/memory/memutil.ts";
import { Trench } from "octagonal-wheels/memory/memutil";
import type ObsidianLiveSyncPlugin from "../../../main.ts";
type MeasureResult = [times: number, spent: number];
type NamedMeasureResult = [name: string, result: MeasureResult];

View File

@@ -46,6 +46,9 @@ function readDocument(w: LoadedEntry) {
}
export class DocumentHistoryModal extends Modal {
plugin: ObsidianLiveSyncPlugin;
get services() {
return this.plugin.services;
}
range!: HTMLInputElement;
contentView!: HTMLDivElement;
info!: HTMLDivElement;
@@ -74,7 +77,7 @@ export class DocumentHistoryModal extends Modal {
this.id = id;
this.initialRev = revision;
if (!file && id) {
this.file = this.plugin.$$id2path(id);
this.file = this.services.path.id2path(id);
}
if (localStorage.getItem("ols-history-highlightdiff") == "1") {
this.showDiff = true;
@@ -83,7 +86,7 @@ export class DocumentHistoryModal extends Modal {
async loadFile(initialRev?: string) {
if (!this.id) {
this.id = await this.plugin.$$path2id(this.file);
this.id = await this.services.path.path2id(this.file);
}
const db = this.plugin.localDatabase;
try {
@@ -126,7 +129,7 @@ export class DocumentHistoryModal extends Modal {
}
this.BlobURLs.delete(key);
}
generateBlobURL(key: string, data: Uint8Array) {
generateBlobURL(key: string, data: Uint8Array<ArrayBuffer>) {
this.revokeURL(key);
const v = URL.createObjectURL(new Blob([data], { endings: "transparent", type: "application/octet-stream" }));
this.BlobURLs.set(key, v);
@@ -175,7 +178,10 @@ export class DocumentHistoryModal extends Modal {
result = result.replace(/\n/g, "<br>");
} else if (isImage(this.file)) {
const src = this.generateBlobURL("base", w1data);
const overlay = this.generateBlobURL("overlay", readDocument(w2) as Uint8Array);
const overlay = this.generateBlobURL(
"overlay",
readDocument(w2) as Uint8Array<ArrayBuffer>
);
result = `<div class='ls-imgdiff-wrap'>
<div class='overlay'>
<img class='img-base' src="${src}">

View File

@@ -25,8 +25,8 @@ export class ConflictResolveModal extends Modal {
title: string = "Conflicting changes";
pluginPickMode: boolean = false;
localName: string = "Keep A";
remoteName: string = "Keep B";
localName: string = "Base";
remoteName: string = "Conflicted";
offEvent?: ReturnType<typeof eventHub.onEvent>;
constructor(app: App, filename: string, diff: diff_result, pluginPickMode?: boolean, remoteName?: string) {
@@ -36,8 +36,8 @@ export class ConflictResolveModal extends Modal {
this.pluginPickMode = pluginPickMode || false;
if (this.pluginPickMode) {
this.title = "Pick a version";
this.remoteName = `Use ${remoteName || "Remote"}`;
this.localName = "Use Local";
this.remoteName = `${remoteName || "Remote"}`;
this.localName = "Local";
}
// Send a cancel signal to the previous merge dialogue;
// if there is none, it will simply be ignored.
@@ -85,20 +85,19 @@ export class ConflictResolveModal extends Modal {
}
}
diff = diff.replace(/\n/g, "<br>");
div.innerHTML = diff;
const div2 = contentEl.createDiv("");
const date1 =
new Date(this.result.left.mtime).toLocaleString() + (this.result.left.deleted ? " (Deleted)" : "");
const date2 =
new Date(this.result.right.mtime).toLocaleString() + (this.result.right.deleted ? " (Deleted)" : "");
div2.innerHTML = `
<span class='deleted'>A:${date1}</span><br /><span class='added'>B:${date2}</span><br>
`;
contentEl.createEl("button", { text: this.localName }, (e) =>
div2.setHTMLUnsafe(`
<span class='deleted'><span class='conflict-dev-name'>${this.localName}</span>: ${date1}</span><br>
<span class='added'><span class='conflict-dev-name'>${this.remoteName}</span>: ${date2}</span><br>
`);
contentEl.createEl("button", { text: `Use ${this.localName}` }, (e) =>
e.addEventListener("click", () => this.sendResponse(this.result.right.rev))
).style.marginRight = "4px";
contentEl.createEl("button", { text: this.remoteName }, (e) =>
contentEl.createEl("button", { text: `Use ${this.remoteName}` }, (e) =>
e.addEventListener("click", () => this.sendResponse(this.result.left.rev))
).style.marginRight = "4px";
if (!this.pluginPickMode) {
@@ -109,6 +108,13 @@ export class ConflictResolveModal extends Modal {
contentEl.createEl("button", { text: !this.pluginPickMode ? "Not now" : "Cancel" }, (e) =>
e.addEventListener("click", () => this.sendResponse(CANCELLED))
).style.marginRight = "4px";
diff = diff.replace(/\n/g, "<br>");
// div.innerHTML = diff;
if (diff.length > 100 * 1024) {
div.setText("(Too large diff to display)");
} else {
div.setHTMLUnsafe(diff);
}
}
sendResponse(result: MergeDialogResult) {
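
The modal change above stops assigning `innerHTML` directly and instead skips rendering when the diff exceeds 100 KiB, using `setHTMLUnsafe` otherwise. A small sketch of that guard, with a fallback for runtimes whose DOM typings lack `setHTMLUnsafe` (the function name `renderDiff` is illustrative):

```typescript
// Sketch of the size guard added above: very large diffs are not injected.
const MAX_DIFF_HTML = 100 * 1024;

function renderDiff(container: HTMLElement, diffHtml: string): void {
    if (diffHtml.length > MAX_DIFF_HTML) {
        container.textContent = "(Too large diff to display)";
    } else if ("setHTMLUnsafe" in container) {
        // Newer DOM API used by the modal; cast for older TypeScript DOM typings.
        (container as HTMLElement & { setHTMLUnsafe(html: string): void }).setHTMLUnsafe(diffHtml);
    } else {
        container.innerHTML = diffHtml;
    }
}
```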

View File

@@ -1,7 +1,7 @@
<script lang="ts">
import { onDestroy, onMount } from "svelte";
import { logMessages } from "../../../lib/src/mock_and_interop/stores";
import { reactive, type ReactiveInstance } from "../../../lib/src/dataobject/reactive";
import { reactive, type ReactiveInstance } from "octagonal-wheels/dataobject/reactive";
import { Logger } from "../../../lib/src/common/logger";
import { $msg as msg, currentLang as lang } from "../../../lib/src/common/i18n.ts";

View File

@@ -1,8 +1,8 @@
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { VIEW_TYPE_GLOBAL_HISTORY, GlobalHistoryView } from "./GlobalHistory/GlobalHistoryView.ts";
export class ModuleObsidianGlobalHistory extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleObsidianGlobalHistory extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
this.addCommand({
id: "livesync-global-history",
name: "Show vault history",
@@ -17,6 +17,9 @@ export class ModuleObsidianGlobalHistory extends AbstractObsidianModule implemen
}
showGlobalHistory() {
void this.core.$$showView(VIEW_TYPE_GLOBAL_HISTORY);
void this.services.API.showWindow(VIEW_TYPE_GLOBAL_HISTORY);
}
onBindFunction(core: typeof this.core, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
}
}

View File

@@ -10,13 +10,14 @@ import {
type diff_result,
} from "../../lib/src/common/types.ts";
import { ConflictResolveModal } from "./InteractiveConflictResolving/ConflictResolveModal.ts";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { displayRev, getPath, getPathWithoutPrefix } from "../../common/utils.ts";
import { fireAndForget } from "octagonal-wheels/promises";
import { serialized } from "../../lib/src/concurrency/lock.ts";
import { serialized } from "octagonal-wheels/concurrency/lock";
import type { LiveSyncCore } from "../../main.ts";
export class ModuleInteractiveConflictResolver extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleInteractiveConflictResolver extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
this.addCommand({
id: "livesync-conflictcheck",
name: "Pick a file to resolve conflict",
@@ -34,7 +35,7 @@ export class ModuleInteractiveConflictResolver extends AbstractObsidianModule im
return Promise.resolve(true);
}
async $anyResolveConflictByUI(filename: FilePathWithPrefix, conflictCheckResult: diff_result): Promise<boolean> {
async _anyResolveConflictByUI(filename: FilePathWithPrefix, conflictCheckResult: diff_result): Promise<boolean> {
// UI for resolving conflicts should be shown one-by-one.
return await serialized(`conflict-resolve-ui`, async () => {
this._log("Merge:open conflict dialog", LOG_LEVEL_VERBOSE);
@@ -68,7 +69,7 @@ export class ModuleInteractiveConflictResolver extends AbstractObsidianModule im
}
// 2. As usual, delete the conflicted revision and if there are no conflicts, write the resolved content to the storage.
if (
(await this.core.$$resolveConflictByDeletingRev(filename, delRev, "UI Concatenated")) ==
(await this.services.conflict.resolveByDeletingRevision(filename, delRev, "UI Concatenated")) ==
MISSING_OR_ERROR
) {
this._log(
@@ -80,7 +81,7 @@ export class ModuleInteractiveConflictResolver extends AbstractObsidianModule im
} else if (typeof toDelete === "string") {
// Select one of the conflicted revision to delete.
if (
(await this.core.$$resolveConflictByDeletingRev(filename, toDelete, "UI Selected")) ==
(await this.services.conflict.resolveByDeletingRevision(filename, toDelete, "UI Selected")) ==
MISSING_OR_ERROR
) {
this._log(`Merge: Something went wrong: ${filename}, (${toDelete})`, LOG_LEVEL_NOTICE);
@@ -93,11 +94,11 @@ export class ModuleInteractiveConflictResolver extends AbstractObsidianModule im
// In here, some merge has been processed.
// So we have to run replication if configured.
// TODO: Make this an event request
if (this.settings.syncAfterMerge && !this.core.$$isSuspended()) {
await this.core.$$replicateByEvent();
if (this.settings.syncAfterMerge && !this.services.appLifecycle.isSuspended()) {
await this.services.replication.replicateByEvent();
}
// And, check it again.
await this.core.$$queueConflictCheck(filename);
await this.services.conflict.queueCheckFor(filename);
return false;
});
}
@@ -120,14 +121,14 @@ export class ModuleInteractiveConflictResolver extends AbstractObsidianModule im
const target = await this.core.confirm.askSelectString("File to resolve conflict", notesList);
if (target) {
const targetItem = notes.find((e) => e.dispPath == target)!;
await this.core.$$queueConflictCheck(targetItem.path);
await this.core.$$waitForAllConflictProcessed();
await this.services.conflict.queueCheckFor(targetItem.path);
await this.services.conflict.ensureAllProcessed();
return true;
}
return false;
}
async $allScanStat(): Promise<boolean> {
async _allScanStat(): Promise<boolean> {
const notes: { path: string; mtime: number }[] = [];
this._log(`Checking conflicted files`, LOG_LEVEL_VERBOSE);
for await (const doc of this.localDatabase.findAllDocs({ conflicts: true })) {
@@ -157,4 +158,9 @@ export class ModuleInteractiveConflictResolver extends AbstractObsidianModule im
}
return true;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnScanningStartupIssues(this._allScanStat.bind(this));
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.conflict.handleResolveByUserInteraction(this._anyResolveConflictByUI.bind(this));
}
}
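
`_anyResolveConflictByUI` above wraps the dialogue in `serialized("conflict-resolve-ui", ...)` so conflict dialogues open one at a time. A generic sketch of such keyed serialisation follows; it is not the `octagonal-wheels` implementation, only an illustration of the idea.

```typescript
// Generic sketch of keyed serialisation: tasks sharing a key run one at a
// time, in submission order.
const queues = new Map<string, Promise<unknown>>();

function serialized<T>(key: string, task: () => Promise<T>): Promise<T> {
    const previous = queues.get(key) ?? Promise.resolve();
    // Chain after the previous task for this key, ignoring its outcome.
    const next = previous.catch(() => undefined).then(task);
    queues.set(key, next);
    return next;
}

// Usage: even if two conflicts arrive at once, the dialogues open one-by-one.
void serialized("conflict-resolve-ui", async () => {
    /* show dialogue for file A */
});
void serialized("conflict-resolve-ui", async () => {
    /* show dialogue for file B */
});
```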

View File

@@ -19,8 +19,13 @@ import {
logMessages,
} from "../../lib/src/mock_and_interop/stores.ts";
import { eventHub } from "../../lib/src/hub/hub.ts";
import { EVENT_FILE_RENAMED, EVENT_LAYOUT_READY, EVENT_LEAF_ACTIVE_CHANGED } from "../../common/events.ts";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import {
EVENT_FILE_RENAMED,
EVENT_LAYOUT_READY,
EVENT_LEAF_ACTIVE_CHANGED,
EVENT_ON_UNRESOLVED_ERROR,
} from "../../common/events.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { addIcon, normalizePath, Notice } from "../../deps.ts";
import { LOG_LEVEL_NOTICE, setGlobalLogFunction } from "octagonal-wheels/common/logger";
import { QueueProcessor } from "octagonal-wheels/concurrency/processor";
@@ -28,12 +33,18 @@ import { LogPaneView, VIEW_TYPE_LOG } from "./Log/LogPaneView.ts";
import { serialized } from "octagonal-wheels/concurrency/lock";
import { $msg } from "src/lib/src/common/i18n.ts";
import { P2PLogCollector } from "../../lib/src/replication/trystero/P2PReplicatorCore.ts";
import type { LiveSyncCore } from "../../main.ts";
import { LiveSyncError } from "@/lib/src/common/LSError.ts";
// This module cannot be a core module because it depends on the Obsidian UI.
// DI the log again.
setGlobalLogFunction((message: any, level?: number, key?: string) => {
const entry = { message, level, key } as LogEntry;
const messageX =
message instanceof Error
? new LiveSyncError("[Error Logged]: " + message.message, { cause: message })
: message;
const entry = { message: messageX, level, key } as LogEntry;
logStore.enqueue(entry);
});
let recentLogs = [] as string[];
@@ -50,7 +61,7 @@ const recentLogProcessor = new QueueProcessor(
const showDebugLog = false;
export const MARK_DONE = "\u{2009}\u{2009}";
export class ModuleLog extends AbstractObsidianModule implements IObsidianModule {
export class ModuleLog extends AbstractObsidianModule {
registerView = this.plugin.registerView.bind(this.plugin);
statusBar?: HTMLElement;
@@ -178,7 +189,7 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
});
const statusBarLabels = reactive(() => {
const scheduleMessage = this.core.$$isReloadingScheduled()
const scheduleMessage = this.services.appLifecycle.isReloadingScheduled()
? `WARNING! RESTARTING OBSIDIAN IS SCHEDULED!\n`
: "";
const { message } = statusLineLabel();
@@ -197,11 +208,13 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
this.applyStatusBarText();
}, 20);
statusBarLabels.onChanged((label) => applyToDisplay(label.value));
this.activeFileStatus.onChanged(() => this.updateMessageArea());
}
$everyOnload(): Promise<boolean> {
private _everyOnload(): Promise<boolean> {
eventHub.onEvent(EVENT_LEAF_ACTIVE_CHANGED, () => this.onActiveLeafChange());
eventHub.onceEvent(EVENT_LAYOUT_READY, () => this.onActiveLeafChange());
eventHub.onEvent(EVENT_ON_UNRESOLVED_ERROR, () => this.updateMessageArea());
return Promise.resolve(true);
}
@@ -219,22 +232,33 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
const thisFile = this.app.workspace.getActiveFile();
if (!thisFile) return "";
// Case Sensitivity
if (this.core.$$shouldCheckCaseInsensitive()) {
if (this.services.setting.shouldCheckCaseInsensitively()) {
const f = this.core.storageAccess
.getFiles()
.map((e) => e.path)
.filter((e) => e.toLowerCase() == thisFile.path.toLowerCase());
if (f.length > 1) return "Not synchronised: There are multiple files with the same name";
}
if (!(await this.core.$$isTargetFile(thisFile.path))) return "Not synchronised: not a target file";
if (this.core.$$isFileSizeExceeded(thisFile.stat.size)) return "Not synchronised: File size exceeded";
if (!(await this.services.vault.isTargetFile(thisFile.path))) return "Not synchronised: not a target file";
if (this.services.vault.isFileSizeTooLarge(thisFile.stat.size)) return "Not synchronised: File size exceeded";
return "";
}
async setFileStatus() {
const fileStatus = await this.getActiveFileStatus();
this.activeFileStatus.value = fileStatus;
this.messageArea!.innerText = this.settings.hideFileWarningNotice ? "" : fileStatus;
}
async updateMessageArea() {
if (this.messageArea) {
const messageLines = [];
const fileStatus = this.activeFileStatus.value;
if (fileStatus && !this.settings.hideFileWarningNotice) messageLines.push(fileStatus);
const messages = (await this.services.appLifecycle.getUnresolvedMessages()).flat().filter((e) => e);
messageLines.push(...messages);
this.messageArea.innerText = messageLines.map((e) => `⚠️ ${e}`).join("\n");
}
}
onActiveLeafChange() {
fireAndForget(async () => {
this.adjustStatusDivPosition();
@@ -287,14 +311,14 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
});
}
$allStartOnUnload(): Promise<boolean> {
private _allStartOnUnload(): Promise<boolean> {
if (this.statusDiv) {
this.statusDiv.remove();
}
document.querySelectorAll(`.livesync-status`)?.forEach((e) => e.remove());
return Promise.resolve(true);
}
$everyOnloadStart(): Promise<boolean> {
_everyOnloadStart(): Promise<boolean> {
addIcon(
"view-log",
`<g transform="matrix(1.28 0 0 1.28 -131 -411)" fill="currentColor" fill-rule="evenodd">
@@ -303,23 +327,23 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
</g>`
);
this.addRibbonIcon("view-log", $msg("moduleLog.showLog"), () => {
void this.core.$$showView(VIEW_TYPE_LOG);
void this.services.API.showWindow(VIEW_TYPE_LOG);
}).addClass("livesync-ribbon-showlog");
this.addCommand({
id: "view-log",
name: "Show log",
callback: () => {
void this.core.$$showView(VIEW_TYPE_LOG);
void this.services.API.showWindow(VIEW_TYPE_LOG);
},
});
this.registerView(VIEW_TYPE_LOG, (leaf) => new LogPaneView(leaf, this.plugin));
return Promise.resolve(true);
}
$everyOnloadAfterLoadSettings(): Promise<boolean> {
private _everyOnloadAfterLoadSettings(): Promise<boolean> {
logStore
.pipeTo(
new QueueProcessor((logs) => logs.forEach((e) => this.core.$$addLog(e.message, e.level, e.key)), {
new QueueProcessor((logs) => logs.forEach((e) => this.__addLog(e.message, e.level, e.key)), {
suspended: false,
batchSize: 20,
concurrentLimit: 1,
@@ -366,32 +390,42 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
})
);
}
$$addLog(message: any, level: LOG_LEVEL = LOG_LEVEL_INFO, key = ""): void {
__addLog(message: any, level: LOG_LEVEL = LOG_LEVEL_INFO, key = ""): void {
if (level == LOG_LEVEL_DEBUG && !showDebugLog) {
return;
}
if (level < LOG_LEVEL_INFO && this.settings && this.settings.lessInformationInLog) {
if (level <= LOG_LEVEL_INFO && this.settings && this.settings.lessInformationInLog) {
return;
}
if (this.settings && !this.settings.showVerboseLog && level == LOG_LEVEL_VERBOSE) {
return;
}
const vaultName = this.core.$$getVaultName();
const vaultName = this.services.vault.getVaultName();
const now = new Date();
const timestamp = now.toLocaleString();
let errorInfo = "";
if (message instanceof Error) {
if (message instanceof LiveSyncError) {
errorInfo = `${message.cause?.name}:${message.cause?.message}\n[StackTrace]: ${message.stack}\n[CausedBy]: ${message.cause?.stack}`;
} else {
const thisStack = new Error().stack;
errorInfo = `${message.name}:${message.message}\n[StackTrace]: ${message.stack}\n[LogCallStack]: ${thisStack}`;
}
}
const messageContent =
typeof message == "string"
? message
: message instanceof Error
? `${message.name}:${message.message}`
? `${errorInfo}`
: JSON.stringify(message, null, 2);
if (message instanceof Error) {
// debugger;
console.dir(message.stack);
}
const newMessage = timestamp + "->" + messageContent;
console.log(vaultName + ":" + newMessage);
if (message instanceof Error) {
console.error(vaultName + ":" + newMessage);
} else if (level >= LOG_LEVEL_INFO) {
console.log(vaultName + ":" + newMessage);
} else {
console.debug(vaultName + ":" + newMessage);
}
if (!this.settings?.showOnlyIconsOnEditor) {
this.statusLog.value = messageContent;
}
@@ -437,4 +471,10 @@ export class ModuleLog extends AbstractObsidianModule implements IObsidianModule
}
}
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
services.appLifecycle.handleOnSettingLoaded(this._everyOnloadAfterLoadSettings.bind(this));
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
services.appLifecycle.handleOnBeforeUnload(this._allStartOnUnload.bind(this));
}
}
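
The logging changes above wrap incoming `Error`s in a `LiveSyncError` carrying the original as `cause`, so `__addLog` can print both the logging site's and the original failure's stack traces and route the line to `console.error`/`console.log`/`console.debug` by level. A hedged sketch of the wrap-and-format part, using only the standard `Error` `cause` option (`WrappedError` and `formatForLog` are illustrative names):

```typescript
// Hedged sketch: wrap an error so both the logging site and the original
// failure keep their stack traces. WrappedError is an illustrative name.
class WrappedError extends Error {
    constructor(message: string, options?: { cause?: unknown }) {
        super(message, options);
        this.name = "WrappedError";
    }
}

function formatForLog(error: unknown): string {
    if (error instanceof WrappedError && error.cause instanceof Error) {
        return (
            `${error.cause.name}: ${error.cause.message}\n` +
            `[StackTrace]: ${error.stack}\n[CausedBy]: ${error.cause.stack}`
        );
    }
    if (error instanceof Error) {
        return `${error.name}: ${error.message}\n[StackTrace]: ${error.stack}`;
    }
    return JSON.stringify(error, null, 2);
}

// Usage: wrap at the logging boundary, format when flushing to the console.
try {
    throw new Error("replication failed");
} catch (e) {
    console.error(formatForLog(new WrappedError("[Error Logged]: replication failed", { cause: e })));
}
```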

View File

@@ -2,18 +2,18 @@ import { type TFile } from "obsidian";
import { eventHub } from "../../common/events.ts";
import { EVENT_REQUEST_SHOW_HISTORY } from "../../common/obsidianEvents.ts";
import type { FilePathWithPrefix, LoadedEntry, DocumentID } from "../../lib/src/common/types.ts";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { DocumentHistoryModal } from "./DocumentHistory/DocumentHistoryModal.ts";
import { getPath } from "../../common/utils.ts";
import { fireAndForget } from "octagonal-wheels/promises";
export class ModuleObsidianDocumentHistory extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleObsidianDocumentHistory extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
this.addCommand({
id: "livesync-history",
name: "Show history",
callback: () => {
const file = this.core.$$getActiveFilePath();
const file = this.services.vault.getActiveFilePath();
if (file) this.showHistory(file, undefined);
},
});
@@ -51,4 +51,7 @@ export class ModuleObsidianDocumentHistory extends AbstractObsidianModule implem
this.showHistory(targetId.path, targetId.id);
}
}
onBindFunction(core: typeof this.core, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
}
}

View File

@@ -1,6 +1,6 @@
import { type IObsidianModule, AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
// import { PouchDB } from "../../lib/src/pouchdb/pouchdb-browser";
import { EVENT_REQUEST_RELOAD_SETTING_TAB, EVENT_SETTING_SAVED, eventHub } from "../../common/events";
import { EVENT_REQUEST_RELOAD_SETTING_TAB, EVENT_SETTING_SAVED, eventHub } from "../../common/events.ts";
import {
type BucketSyncSetting,
ChunkAlgorithmNames,
@@ -9,22 +9,23 @@ import {
DEFAULT_SETTINGS,
type ObsidianLiveSyncSettings,
SALT_OF_PASSPHRASE,
SETTING_KEY_P2P_DEVICE_NAME,
} from "../../lib/src/common/types";
import { LOG_LEVEL_NOTICE, LOG_LEVEL_URGENT } from "octagonal-wheels/common/logger";
import { $msg, setLang } from "../../lib/src/common/i18n";
import { isCloudantURI } from "../../lib/src/pouchdb/utils_couchdb";
import { $msg, setLang } from "../../lib/src/common/i18n.ts";
import { isCloudantURI } from "../../lib/src/pouchdb/utils_couchdb.ts";
import { getLanguage } from "obsidian";
import { SUPPORTED_I18N_LANGS, type I18N_LANGS } from "../../lib/src/common/rosetta.ts";
import { decryptString, encryptString } from "@/lib/src/encryption/stringEncryption.ts";
export class ModuleObsidianSettings extends AbstractObsidianModule implements IObsidianModule {
async $everyOnLayoutReady(): Promise<boolean> {
import type { LiveSyncCore } from "../../main.ts";
export class ModuleObsidianSettings extends AbstractObsidianModule {
async _everyOnLayoutReady(): Promise<boolean> {
let isChanged = false;
if (this.settings.displayLanguage == "") {
const obsidianLanguage = getLanguage();
if (
SUPPORTED_I18N_LANGS.indexOf(obsidianLanguage) !== -1 && // Check if the language is supported
obsidianLanguage != this.settings.displayLanguage && // Check if the language is different from the current setting
this.settings.displayLanguage != ""
obsidianLanguage != this.settings.displayLanguage // Check if the language is different from the current setting
) {
// Check if the current setting is not empty (Means migrated or installed).
this.settings.displayLanguage = obsidianLanguage as I18N_LANGS;
@@ -33,7 +34,7 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
} else if (this.settings.displayLanguage == "") {
this.settings.displayLanguage = "def";
setLang(this.settings.displayLanguage);
await this.core.$$saveSettingData();
await this.services.setting.saveSettingData();
}
}
if (isChanged) {
@@ -47,7 +48,7 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
this.settings.displayLanguage = "def";
setLang(this.settings.displayLanguage);
}
await this.core.$$saveSettingData();
await this.services.setting.saveSettingData();
}
return true;
}
@@ -62,13 +63,13 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
return methodFunc();
}
$$saveDeviceAndVaultName(): void {
const lsKey = "obsidian-live-sync-vaultanddevicename-" + this.core.$$getVaultName();
localStorage.setItem(lsKey, this.core.$$getDeviceAndVaultName() || "");
_saveDeviceAndVaultName(): void {
const lsKey = "obsidian-live-sync-vaultanddevicename-" + this.services.vault.getVaultName();
localStorage.setItem(lsKey, this.services.setting.getDeviceAndVaultName() || "");
}
usedPassphrase = "";
$$clearUsedPassphrase(): void {
private _clearUsedPassphrase(): void {
this.usedPassphrase = "";
}
@@ -107,10 +108,15 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
return `${"appId" in this.app ? this.app.appId : ""}`;
}
async $$saveSettingData() {
this.core.$$saveDeviceAndVaultName();
async _saveSettingData() {
this.services.setting.saveDeviceAndVaultName();
const settings = { ...this.settings };
settings.deviceAndVaultName = "";
if (settings.P2P_DevicePeerName && settings.P2P_DevicePeerName.trim() !== "") {
console.log("Saving device peer name to small config");
this.services.config.setSmallConfig(SETTING_KEY_P2P_DEVICE_NAME, settings.P2P_DevicePeerName.trim());
settings.P2P_DevicePeerName = "";
}
if (this.usedPassphrase == "" && !(await this.getPassphrase(settings))) {
this._log("Failed to retrieve passphrase. data.json contains unencrypted items!", LOG_LEVEL_NOTICE);
} else {
@@ -141,6 +147,7 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
jwtSub: settings.jwtSub,
useRequestAPI: settings.useRequestAPI,
bucketPrefix: settings.bucketPrefix,
forcePathStyle: settings.forcePathStyle,
};
settings.encryptedCouchDBConnection = await this.encryptConfigurationItem(
JSON.stringify(connectionSetting),
@@ -174,7 +181,7 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
}
}
async $$decryptSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
async _decryptSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
const passphrase = await this.getPassphrase(settings);
if (passphrase === false) {
this._log("No passphrase found for data.json! Verify configuration before syncing.", LOG_LEVEL_URGENT);
@@ -234,7 +241,7 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
* @param settings
* @returns
*/
$$adjustSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
_adjustSettings(settings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
// Adjust settings as needed
// Delete this feature to avoid problems on mobile.
@@ -264,7 +271,7 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
return Promise.resolve(settings);
}
async $$loadSettings(): Promise<void> {
async _loadSettings(): Promise<void> {
const settings = Object.assign({}, DEFAULT_SETTINGS, await this.core.loadData()) as ObsidianLiveSyncSettings;
if (typeof settings.isConfigured == "undefined") {
@@ -277,17 +284,17 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
}
}
this.settings = await this.core.$$decryptSettings(settings);
this.settings = await this.services.setting.decryptSettings(settings);
setLang(this.settings.displayLanguage);
await this.core.$$adjustSettings(this.settings);
await this.services.setting.adjustSettings(this.settings);
const lsKey = "obsidian-live-sync-vaultanddevicename-" + this.core.$$getVaultName();
const lsKey = "obsidian-live-sync-vaultanddevicename-" + this.services.vault.getVaultName();
if (this.settings.deviceAndVaultName != "") {
if (!localStorage.getItem(lsKey)) {
this.core.$$setDeviceAndVaultName(this.settings.deviceAndVaultName);
this.$$saveDeviceAndVaultName();
this.services.setting.setDeviceAndVaultName(this.settings.deviceAndVaultName);
this.services.setting.saveDeviceAndVaultName();
this.settings.deviceAndVaultName = "";
}
}
@@ -298,8 +305,8 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
);
this.settings.customChunkSize = 0;
}
this.core.$$setDeviceAndVaultName(localStorage.getItem(lsKey) || "");
if (this.core.$$getDeviceAndVaultName() == "") {
this.services.setting.setDeviceAndVaultName(localStorage.getItem(lsKey) || "");
if (this.services.setting.getDeviceAndVaultName() == "") {
if (this.settings.usePluginSync) {
this._log("Device name missing. Disabling plug-in sync.", LOG_LEVEL_NOTICE);
this.settings.usePluginSync = false;
@@ -309,4 +316,20 @@ export class ModuleObsidianSettings extends AbstractObsidianModule implements IO
// this.core.ignoreFiles = this.settings.ignoreFiles.split(",").map(e => e.trim());
eventHub.emitEvent(EVENT_REQUEST_RELOAD_SETTING_TAB);
}
private _currentSettings(): ObsidianLiveSyncSettings {
return this.settings;
}
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
super.onBindFunction(core, services);
services.appLifecycle.handleLayoutReady(this._everyOnLayoutReady.bind(this));
services.setting.handleClearUsedPassphrase(this._clearUsedPassphrase.bind(this));
services.setting.handleDecryptSettings(this._decryptSettings.bind(this));
services.setting.handleAdjustSettings(this._adjustSettings.bind(this));
services.setting.handleLoadSettings(this._loadSettings.bind(this));
services.setting.handleCurrentSettings(this._currentSettings.bind(this));
services.setting.handleSaveDeviceAndVaultName(this._saveDeviceAndVaultName.bind(this));
services.setting.handleSaveSettingData(this._saveSettingData.bind(this));
}
}
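
`_saveDeviceAndVaultName` and `_loadSettings` above keep the device name out of the synchronised `data.json` by storing it under a per-vault `localStorage` key and blanking `settings.deviceAndVaultName` before saving. A small sketch of that scheme (the key prefix is taken from the diff; the function names are illustrative):

```typescript
// Sketch of the per-vault device name storage shown above. The device name
// never travels with data.json; it lives only in this device's localStorage.
const KEY_PREFIX = "obsidian-live-sync-vaultanddevicename-";

function saveDeviceAndVaultName(vaultName: string, deviceAndVaultName: string): void {
    localStorage.setItem(KEY_PREFIX + vaultName, deviceAndVaultName);
}

function loadDeviceAndVaultName(vaultName: string): string {
    return localStorage.getItem(KEY_PREFIX + vaultName) ?? "";
}
```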

View File

@@ -1,4 +1,4 @@
import { type IObsidianModule, AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
// import { PouchDB } from "../../lib/src/pouchdb/pouchdb-browser";
import { isObjectDifferent } from "octagonal-wheels/object";
import { EVENT_SETTING_SAVED, eventHub } from "../../common/events";
@@ -8,8 +8,8 @@ import { parseYaml, stringifyYaml } from "../../deps";
import { LOG_LEVEL_DEBUG, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "octagonal-wheels/common/logger";
const SETTING_HEADER = "````yaml:livesync-setting\n";
const SETTING_FOOTER = "\n````";
export class ModuleObsidianSettingsAsMarkdown extends AbstractObsidianModule implements IObsidianModule {
$everyOnloadStart(): Promise<boolean> {
export class ModuleObsidianSettingsAsMarkdown extends AbstractObsidianModule {
_everyOnloadStart(): Promise<boolean> {
this.addCommand({
id: "livesync-export-config",
name: "Write setting markdown manually",
@@ -18,7 +18,7 @@ export class ModuleObsidianSettingsAsMarkdown extends AbstractObsidianModule imp
return this.settings.settingSyncFile != "";
}
fireAndForget(async () => {
await this.core.$$saveSettingData();
await this.services.setting.saveSettingData();
});
},
});
@@ -160,7 +160,7 @@ export class ModuleObsidianSettingsAsMarkdown extends AbstractObsidianModule imp
result == APPLY_AND_FETCH
) {
this.core.settings = settingToApply;
await this.core.$$saveSettingData();
await this.services.setting.saveSettingData();
if (result == APPLY_ONLY) {
this._log("Loaded settings have been applied!", LOG_LEVEL_NOTICE);
return;
@@ -171,7 +171,7 @@ export class ModuleObsidianSettingsAsMarkdown extends AbstractObsidianModule imp
if (result == APPLY_AND_FETCH) {
await this.core.rebuilder.scheduleFetch();
}
this.core.$$performRestart();
this.services.appLifecycle.performRestart();
}
});
});
@@ -242,4 +242,7 @@ We can perform a command in this file.
this._log(`Markdown setting: ${filename} has been updated!`, LOG_LEVEL_VERBOSE);
}
}
onBindFunction(core: typeof this.plugin, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
}
}
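
This module stores the plugin settings inside a Markdown note by wrapping YAML in the fenced block defined by `SETTING_HEADER`/`SETTING_FOOTER` above. A hedged sketch of that round trip, using the `yaml` package in place of Obsidian's `parseYaml`/`stringifyYaml` helpers (`settingsToMarkdown`/`markdownToSettings` are illustrative names):

```typescript
// Hedged sketch of the settings-as-markdown container: settings are
// serialised to YAML and wrapped in a fenced block so the file stays a
// readable note. The `yaml` package stands in for Obsidian's helpers.
import { parse as parseYaml, stringify as stringifyYaml } from "yaml";

// Reconstruct the four-backtick fence used by SETTING_HEADER/SETTING_FOOTER.
const FENCE = "`".repeat(4);
const SETTING_HEADER = FENCE + "yaml:livesync-setting\n";
const SETTING_FOOTER = "\n" + FENCE;

function settingsToMarkdown(settings: Record<string, unknown>): string {
    return SETTING_HEADER + stringifyYaml(settings) + SETTING_FOOTER;
}

function markdownToSettings(markdown: string): Record<string, unknown> | undefined {
    const start = markdown.indexOf(SETTING_HEADER);
    if (start < 0) return undefined;
    const body = markdown.slice(start + SETTING_HEADER.length);
    const end = body.indexOf(SETTING_FOOTER);
    if (end < 0) return undefined;
    return parseYaml(body.slice(0, end)) as Record<string, unknown>;
}
```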

View File

@@ -1,12 +1,12 @@
import { ObsidianLiveSyncSettingTab } from "./SettingDialogue/ObsidianLiveSyncSettingTab.ts";
import { type IObsidianModule, AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
// import { PouchDB } from "../../lib/src/pouchdb/pouchdb-browser";
import { EVENT_REQUEST_OPEN_SETTING_WIZARD, EVENT_REQUEST_OPEN_SETTINGS, eventHub } from "../../common/events.ts";
export class ModuleObsidianSettingDialogue extends AbstractObsidianModule implements IObsidianModule {
export class ModuleObsidianSettingDialogue extends AbstractObsidianModule {
settingTab!: ObsidianLiveSyncSettingTab;
$everyOnloadStart(): Promise<boolean> {
_everyOnloadStart(): Promise<boolean> {
this.settingTab = new ObsidianLiveSyncSettingTab(this.app, this.plugin);
this.plugin.addSettingTab(this.settingTab);
eventHub.onEvent(EVENT_REQUEST_OPEN_SETTINGS, () => this.openSetting());
@@ -29,4 +29,7 @@ export class ModuleObsidianSettingDialogue extends AbstractObsidianModule implem
get appId() {
return `${"appId" in this.app ? this.app.appId : ""}`;
}
onBindFunction(core: typeof this.plugin, services: typeof core.services): void {
services.appLifecycle.handleOnInitialise(this._everyOnloadStart.bind(this));
}
}

View File

@@ -1,33 +1,38 @@
import {
type ObsidianLiveSyncSettings,
DEFAULT_SETTINGS,
KeyIndexOfSettings,
LOG_LEVEL_NOTICE,
LOG_LEVEL_VERBOSE,
} from "../../lib/src/common/types.ts";
import { configURIBase, configURIBaseQR } from "../../common/types.ts";
import { type ObsidianLiveSyncSettings, LOG_LEVEL_NOTICE } from "../../lib/src/common/types.ts";
import { configURIBase } from "../../common/types.ts";
// import { PouchDB } from "../../lib/src/pouchdb/pouchdb-browser.js";
import { fireAndForget } from "../../lib/src/common/utils.ts";
import {
EVENT_REQUEST_COPY_SETUP_URI,
EVENT_REQUEST_OPEN_P2P_SETTINGS,
EVENT_REQUEST_OPEN_SETUP_URI,
EVENT_REQUEST_SHOW_SETUP_QR,
eventHub,
} from "../../common/events.ts";
import { AbstractObsidianModule, type IObsidianModule } from "../AbstractObsidianModule.ts";
import { decodeAnyArray, encodeAnyArray } from "../../common/utils.ts";
import qrcode from "qrcode-generator";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { $msg } from "../../lib/src/common/i18n.ts";
import { performDoctorConsultation, RebuildOptions } from "@/lib/src/common/configForDoc.ts";
import { encryptString, decryptString } from "@/lib/src/encryption/stringEncryption.ts";
// import { performDoctorConsultation, RebuildOptions } from "@/lib/src/common/configForDoc.ts";
import type { LiveSyncCore } from "../../main.ts";
import {
encodeQR,
encodeSettingsToQRCodeData,
encodeSettingsToSetupURI,
OutputFormat,
} from "../../lib/src/API/processSetting.ts";
import { SetupManager, UserMode } from "./SetupManager.ts";
export class ModuleSetupObsidian extends AbstractObsidianModule implements IObsidianModule {
$everyOnload(): Promise<boolean> {
export class ModuleSetupObsidian extends AbstractObsidianModule {
private _setupManager!: SetupManager;
private _everyOnload(): Promise<boolean> {
this._setupManager = this.plugin.getModule(SetupManager);
this.registerObsidianProtocolHandler("setuplivesync", async (conf: any) => {
if (conf.settings) {
await this.setupWizard(conf.settings);
await this._setupManager.onUseSetupURI(
UserMode.Unknown,
`${configURIBase}${encodeURIComponent(conf.settings)}`
);
} else if (conf.settingsQR) {
await this.decodeQR(conf.settingsQR);
await this._setupManager.decodeQR(conf.settingsQR);
}
});
this.addCommand({
@@ -58,291 +63,138 @@ export class ModuleSetupObsidian extends AbstractObsidianModule implements IObsi
name: "Use the copied setup URI (Formerly Open setup URI)",
callback: () => fireAndForget(this.command_openSetupURI()),
});
eventHub.onEvent(EVENT_REQUEST_OPEN_SETUP_URI, () => fireAndForget(() => this.command_openSetupURI()));
eventHub.onEvent(EVENT_REQUEST_COPY_SETUP_URI, () => fireAndForget(() => this.command_copySetupURI()));
eventHub.onEvent(EVENT_REQUEST_SHOW_SETUP_QR, () => fireAndForget(() => this.encodeQR()));
eventHub.onEvent(EVENT_REQUEST_OPEN_P2P_SETTINGS, () =>
fireAndForget(() => {
return this._setupManager.onP2PManualSetup(UserMode.Update, this.settings, false);
})
);
return Promise.resolve(true);
}
async encodeQR() {
const settingArr = [];
const fullIndexes = Object.entries(KeyIndexOfSettings) as [keyof ObsidianLiveSyncSettings, number][];
for (const [settingKey, index] of fullIndexes) {
const settingValue = this.settings[settingKey];
if (index < 0) {
// This setting should be ignored.
continue;
}
settingArr[index] = settingValue;
const settingString = encodeSettingsToQRCodeData(this.settings);
const codeSVG = encodeQR(settingString, OutputFormat.SVG);
if (codeSVG == "") {
return "";
}
const w = encodeAnyArray(settingArr);
const qr = qrcode(0, "L");
const uri = `${configURIBaseQR}${encodeURIComponent(w)}`;
qr.addData(uri);
qr.make();
const img = qr.createSvgTag(3);
const msg = $msg("Setup.QRCode", { qr_image: img });
const msg = $msg("Setup.QRCode", { qr_image: codeSVG });
await this.core.confirm.confirmWithMessage("Settings QR Code", msg, ["OK"], "OK");
return await Promise.resolve(w);
return await Promise.resolve(codeSVG);
}
async decodeQR(qr: string) {
const settingArr = decodeAnyArray(qr);
// console.warn(settingArr);
const fullIndexes = Object.entries(KeyIndexOfSettings) as [keyof ObsidianLiveSyncSettings, number][];
const newSettings = { ...DEFAULT_SETTINGS } as ObsidianLiveSyncSettings;
for (const [settingKey, index] of fullIndexes) {
if (index < 0) {
// This setting should be ignored.
continue;
}
if (index >= settingArr.length) {
// Possibly a new setting added.
continue;
}
const settingValue = settingArr[index];
//@ts-ignore
newSettings[settingKey] = settingValue;
}
await this.applySettingWizard(this.settings, newSettings, "QR Code");
async askEncryptingPassphrase(): Promise<string | false> {
const encryptingPassphrase = await this.core.confirm.askString(
"Encrypt your settings",
"The passphrase to encrypt the setup URI",
"",
true
);
return encryptingPassphrase;
}
async command_copySetupURI(stripExtra = true) {
const encryptingPassphrase = await this.core.confirm.askString(
"Encrypt your settings",
"The passphrase to encrypt the setup URI",
"",
const encryptingPassphrase = await this.askEncryptingPassphrase();
if (encryptingPassphrase === false) return;
const encryptedURI = await encodeSettingsToSetupURI(
this.settings,
encryptingPassphrase,
[...((stripExtra ? ["pluginSyncExtendedSetting"] : []) as (keyof ObsidianLiveSyncSettings)[])],
true
);
if (encryptingPassphrase === false) return;
const setting = {
...this.settings,
configPassphraseStore: "",
encryptedCouchDBConnection: "",
encryptedPassphrase: "",
} as Partial<ObsidianLiveSyncSettings>;
if (stripExtra) {
delete setting.pluginSyncExtendedSetting;
if (await this.services.UI.promptCopyToClipboard("Setup URI", encryptedURI)) {
this._log("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
const keys = Object.keys(setting) as (keyof ObsidianLiveSyncSettings)[];
for (const k of keys) {
if (
JSON.stringify(k in setting ? setting[k] : "") ==
JSON.stringify(k in DEFAULT_SETTINGS ? DEFAULT_SETTINGS[k] : "*")
) {
delete setting[k];
}
}
const encryptedSetting = encodeURIComponent(await encryptString(JSON.stringify(setting), encryptingPassphrase));
const uri = `${configURIBase}${encryptedSetting} `;
await navigator.clipboard.writeText(uri);
this._log("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
// await navigator.clipboard.writeText(encryptedURI);
}
async command_copySetupURIFull() {
const encryptingPassphrase = await this.core.confirm.askString(
"Encrypt your settings",
"The passphrase to encrypt the setup URI",
"",
true
);
const encryptingPassphrase = await this.askEncryptingPassphrase();
if (encryptingPassphrase === false) return;
const setting = {
...this.settings,
configPassphraseStore: "",
encryptedCouchDBConnection: "",
encryptedPassphrase: "",
};
const encryptedSetting = encodeURIComponent(await encryptString(JSON.stringify(setting), encryptingPassphrase));
const uri = `${configURIBase}${encryptedSetting} `;
await navigator.clipboard.writeText(uri);
const encryptedURI = await encodeSettingsToSetupURI(this.settings, encryptingPassphrase, [], false);
await navigator.clipboard.writeText(encryptedURI);
this._log("Setup URI copied to clipboard", LOG_LEVEL_NOTICE);
}
async command_copySetupURIWithSync() {
await this.command_copySetupURI(false);
}
async command_openSetupURI() {
const setupURI = await this.core.confirm.askString("Easy setup", "Set up URI", `${configURIBase} aaaaa`);
if (setupURI === false) return;
if (!setupURI.startsWith(`${configURIBase}`)) {
this._log("Set up URI looks wrong.", LOG_LEVEL_NOTICE);
return;
}
const config = decodeURIComponent(setupURI.substring(configURIBase.length));
await this.setupWizard(config);
}
async askSyncWithRemoteConfig(tryingSettings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
const buttons = {
fetch: $msg("Setup.FetchRemoteConf.Buttons.Fetch"),
no: $msg("Setup.FetchRemoteConf.Buttons.Skip"),
} as const;
const fetchRemoteConf = await this.core.confirm.askSelectStringDialogue(
$msg("Setup.FetchRemoteConf.Message"),
Object.values(buttons),
{ defaultAction: buttons.fetch, timeout: 0, title: $msg("Setup.FetchRemoteConf.Title") }
);
if (fetchRemoteConf == buttons.no) {
return tryingSettings;
}
const newSettings = JSON.parse(JSON.stringify(tryingSettings)) as ObsidianLiveSyncSettings;
const remoteConfig = await this.core.$$fetchRemotePreferredTweakValues(newSettings);
if (remoteConfig) {
this._log("Remote configuration found.", LOG_LEVEL_NOTICE);
const resultSettings = {
...DEFAULT_SETTINGS,
...tryingSettings,
...remoteConfig,
} satisfies ObsidianLiveSyncSettings;
return resultSettings;
} else {
this._log("Remote configuration not applied.", LOG_LEVEL_NOTICE);
return {
...DEFAULT_SETTINGS,
...tryingSettings,
} satisfies ObsidianLiveSyncSettings;
}
}
async askPerformDoctor(
tryingSettings: ObsidianLiveSyncSettings
): Promise<{ settings: ObsidianLiveSyncSettings; shouldRebuild: boolean; isModified: boolean }> {
const buttons = {
yes: $msg("Setup.Doctor.Buttons.Yes"),
no: $msg("Setup.Doctor.Buttons.No"),
} as const;
const performDoctor = await this.core.confirm.askSelectStringDialogue(
$msg("Setup.Doctor.Message"),
Object.values(buttons),
{ defaultAction: buttons.yes, timeout: 0, title: $msg("Setup.Doctor.Title") }
);
if (performDoctor == buttons.no) {
return { settings: tryingSettings, shouldRebuild: false, isModified: false };
}
const newSettings = JSON.parse(JSON.stringify(tryingSettings)) as ObsidianLiveSyncSettings;
const { settings, shouldRebuild, isModified } = await performDoctorConsultation(this.core, newSettings, {
localRebuild: RebuildOptions.AutomaticAcceptable, // Because we are in the setup wizard, we can skip the confirmation.
remoteRebuild: RebuildOptions.SkipEvenIfRequired,
activateReason: "New settings from URI",
});
if (isModified) {
this._log("Doctor has fixed some issues!", LOG_LEVEL_NOTICE);
return {
settings,
shouldRebuild,
isModified,
};
} else {
this._log("Doctor detected no issues!", LOG_LEVEL_NOTICE);
return { settings: tryingSettings, shouldRebuild: false, isModified: false };
}
}
async applySettingWizard(
oldConf: ObsidianLiveSyncSettings,
newConf: ObsidianLiveSyncSettings,
method = "Setup URI"
) {
const result = await this.core.confirm.askYesNoDialog(
"Importing Configuration from the " + method + ". Are you sure to proceed ? ",
{}
);
if (result == "yes") {
let newSettingW = Object.assign({}, DEFAULT_SETTINGS, newConf) as ObsidianLiveSyncSettings;
this.core.replicator.closeReplication();
this.settings.suspendFileWatching = true;
newSettingW = await this.askSyncWithRemoteConfig(newSettingW);
const { settings, shouldRebuild, isModified } = await this.askPerformDoctor(newSettingW);
if (isModified) {
newSettingW = settings;
}
// Back into the default method once.
newSettingW.configPassphraseStore = "";
newSettingW.encryptedPassphrase = "";
newSettingW.encryptedCouchDBConnection = "";
newSettingW.additionalSuffixOfDatabaseName = `${"appId" in this.app ? this.app.appId : ""}`;
const setupJustImport = $msg("Setup.Apply.Buttons.OnlyApply");
const setupAsNew = $msg("Setup.Apply.Buttons.ApplyAndFetch");
const setupAsMerge = $msg("Setup.Apply.Buttons.ApplyAndMerge");
const setupAgain = $msg("Setup.Apply.Buttons.ApplyAndRebuild");
const setupCancel = $msg("Setup.Apply.Buttons.Cancel");
newSettingW.syncInternalFiles = false;
newSettingW.usePluginSync = false;
newSettingW.isConfigured = true;
// Migrate completely obsoleted configuration.
if (!newSettingW.useIndexedDBAdapter) {
newSettingW.useIndexedDBAdapter = true;
}
const warn = shouldRebuild ? $msg("Setup.Apply.WarningRebuildRecommended") : "";
const message = $msg("Setup.Apply.Message", {
method,
warn,
});
// TODO: Where to implement these?
const setupType = await this.core.confirm.askSelectStringDialogue(
message,
[setupAsNew, setupAsMerge, setupAgain, setupJustImport, setupCancel],
{ defaultAction: setupAsNew, title: $msg("Setup.Apply.Title", { method }), timeout: 0 }
);
if (setupType == setupJustImport) {
this.core.settings = newSettingW;
this.core.$$clearUsedPassphrase();
await this.core.saveSettings();
} else if (setupType == setupAsNew) {
this.core.settings = newSettingW;
this.core.$$clearUsedPassphrase();
await this.core.saveSettings();
await this.core.rebuilder.$fetchLocal();
} else if (setupType == setupAsMerge) {
this.core.settings = newSettingW;
this.core.$$clearUsedPassphrase();
await this.core.saveSettings();
await this.core.rebuilder.$fetchLocal(true);
} else if (setupType == setupAgain) {
const confirm =
"This operation will rebuild all databases with files on this device. Any files on the remote database not synced here will be lost.";
if (
(await this.core.confirm.askSelectStringDialogue(
"Are you sure you want to do this?",
["Cancel", confirm],
{ defaultAction: "Cancel" }
)) != confirm
) {
return;
}
this.core.settings = newSettingW;
await this.core.saveSettings();
this.core.$$clearUsedPassphrase();
await this.core.rebuilder.$rebuildEverything();
} else {
// Explicitly cancel the operation or the dialog was closed.
this._log("Cancelled", LOG_LEVEL_NOTICE);
this.core.settings = oldConf;
return;
}
this._log("Configuration loaded.", LOG_LEVEL_NOTICE);
} else {
this._log("Cancelled", LOG_LEVEL_NOTICE);
this.core.settings = oldConf;
return;
}
}
async setupWizard(confString: string) {
try {
const oldConf = JSON.parse(JSON.stringify(this.settings));
const encryptingPassphrase = await this.core.confirm.askString(
"Passphrase",
"The passphrase to decrypt your setup URI",
"",
true
);
if (encryptingPassphrase === false) return;
const newConf = await JSON.parse(await decryptString(confString, encryptingPassphrase));
if (newConf) {
await this.applySettingWizard(oldConf, newConf);
this._log("Configuration loaded.", LOG_LEVEL_NOTICE);
} else {
this._log("Cancelled.", LOG_LEVEL_NOTICE);
}
} catch (ex) {
this._log("Couldn't parse or decrypt configuration uri.", LOG_LEVEL_NOTICE);
this._log(ex, LOG_LEVEL_VERBOSE);
}
}
// async askSyncWithRemoteConfig(tryingSettings: ObsidianLiveSyncSettings): Promise<ObsidianLiveSyncSettings> {
// const buttons = {
// fetch: $msg("Setup.FetchRemoteConf.Buttons.Fetch"),
// no: $msg("Setup.FetchRemoteConf.Buttons.Skip"),
// } as const;
// const fetchRemoteConf = await this.core.confirm.askSelectStringDialogue(
// $msg("Setup.FetchRemoteConf.Message"),
// Object.values(buttons),
// { defaultAction: buttons.fetch, timeout: 0, title: $msg("Setup.FetchRemoteConf.Title") }
// );
// if (fetchRemoteConf == buttons.no) {
// return tryingSettings;
// }
// const newSettings = JSON.parse(JSON.stringify(tryingSettings)) as ObsidianLiveSyncSettings;
// const remoteConfig = await this.services.tweakValue.fetchRemotePreferred(newSettings);
// if (remoteConfig) {
// this._log("Remote configuration found.", LOG_LEVEL_NOTICE);
// const resultSettings = {
// ...DEFAULT_SETTINGS,
// ...tryingSettings,
// ...remoteConfig,
// } satisfies ObsidianLiveSyncSettings;
// return resultSettings;
// } else {
// this._log("Remote configuration not applied.", LOG_LEVEL_NOTICE);
// return {
// ...DEFAULT_SETTINGS,
// ...tryingSettings,
// } satisfies ObsidianLiveSyncSettings;
// }
// }
// async askPerformDoctor(
// tryingSettings: ObsidianLiveSyncSettings
// ): Promise<{ settings: ObsidianLiveSyncSettings; shouldRebuild: boolean; isModified: boolean }> {
// const buttons = {
// yes: $msg("Setup.Doctor.Buttons.Yes"),
// no: $msg("Setup.Doctor.Buttons.No"),
// } as const;
// const performDoctor = await this.core.confirm.askSelectStringDialogue(
// $msg("Setup.Doctor.Message"),
// Object.values(buttons),
// { defaultAction: buttons.yes, timeout: 0, title: $msg("Setup.Doctor.Title") }
// );
// if (performDoctor == buttons.no) {
// return { settings: tryingSettings, shouldRebuild: false, isModified: false };
// }
// const newSettings = JSON.parse(JSON.stringify(tryingSettings)) as ObsidianLiveSyncSettings;
// const { settings, shouldRebuild, isModified } = await performDoctorConsultation(this.core, newSettings, {
// localRebuild: RebuildOptions.AutomaticAcceptable, // Because we are in the setup wizard, we can skip the confirmation.
// remoteRebuild: RebuildOptions.SkipEvenIfRequired,
// activateReason: "New settings from URI",
// });
// if (isModified) {
// this._log("Doctor has fixed some issues!", LOG_LEVEL_NOTICE);
// return {
// settings,
// shouldRebuild,
// isModified,
// };
// } else {
// this._log("Doctor detected no issues!", LOG_LEVEL_NOTICE);
// return { settings: tryingSettings, shouldRebuild: false, isModified: false };
// }
// }
onBindFunction(core: LiveSyncCore, services: typeof core.services): void {
services.appLifecycle.handleOnLoaded(this._everyOnload.bind(this));
}
}

View File

@@ -0,0 +1,15 @@
<script lang="ts">
/**
* Info Panel to display key-value information from the port
* Mostly used in the Setting Dialogue
*/
import { type SveltePanelProps } from "./SveltePanel";
import InfoTable from "@lib/UI/components/InfoTable.svelte";
type Props = SveltePanelProps<{
info: Record<string, any>;
}>;
const { port }: Props = $props();
const info = $derived.by(() => $port?.info ?? {});
</script>
<InfoTable {info} />
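For orientation, a minimal sketch of how a port-driven panel like this might be mounted from plain TypeScript. The container element and the sample info values are hypothetical; only the Svelte 5 `mount` API and the `writable` store used elsewhere in this diff are assumed.
import { mount } from "svelte";
import { writable } from "svelte/store";
import InfoPanel from "./InfoPanel.svelte";

// Hypothetical host element (e.g. a div inside a setting pane).
const containerEl = document.createElement("div");
// The port carries the key-value information; `undefined` simply renders an empty table.
const port = writable<{ info: Record<string, any> } | undefined>(undefined);
mount(InfoPanel, { target: containerEl, props: { port } });
// Updating the port re-renders the derived `info` shown by InfoTable.
port.set({ info: { Adapter: "IndexedDB", "Local documents": 1234 } });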

View File

@@ -17,9 +17,7 @@ import { delay, isObjectDifferent, sizeToHumanReadable } from "../../../lib/src/
import { versionNumberString2Number } from "../../../lib/src/string_and_binary/convert.ts";
import { Logger } from "../../../lib/src/common/logger.ts";
import { checkSyncInfo } from "@/lib/src/pouchdb/negotiation.ts";
import { balanceChunkPurgedDBs } from "@/lib/src/pouchdb/chunks.ts";
import { purgeUnreferencedChunks } from "@/lib/src/pouchdb/chunks.ts";
import { testCrypt } from "../../../lib/src/encryption/e2ee_v2.ts";
import { testCrypt } from "octagonal-wheels/encryption/encryption";
import ObsidianLiveSyncPlugin from "../../../main.ts";
import { scheduleTask } from "../../../common/utils.ts";
import { LiveSyncCouchDBReplicator } from "../../../lib/src/replication/couchdb/LiveSyncReplicator.ts";
@@ -38,7 +36,6 @@ import { LiveSyncSetting as Setting } from "./LiveSyncSetting.ts";
import { fireAndForget, yieldNextAnimationFrame } from "octagonal-wheels/promises";
import { confirmWithMessage } from "../../coreObsidian/UILib/dialogs.ts";
import { EVENT_REQUEST_RELOAD_SETTING_TAB, eventHub } from "../../../common/events.ts";
import { skipIfDuplicated } from "octagonal-wheels/concurrency/lock";
import { JournalSyncMinio } from "../../../lib/src/replication/journal/objectstore/JournalSyncMinio.ts";
import { paneChangeLog } from "./PaneChangeLog.ts";
import {
@@ -89,6 +86,9 @@ export function createStub(name: string, key: string, value: string, panel: stri
export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
plugin: ObsidianLiveSyncPlugin;
get services() {
return this.plugin.services;
}
selectedScreen = "";
_editingSettings?: AllSettings;
@@ -142,8 +142,8 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
return await Promise.resolve();
}
if (key == "deviceAndVaultName") {
this.plugin.$$setDeviceAndVaultName(this.editingSettings?.[key] ?? "");
this.plugin.$$saveDeviceAndVaultName();
this.services.setting.setDeviceAndVaultName(this.editingSettings?.[key] ?? "");
this.services.setting.saveDeviceAndVaultName();
return await Promise.resolve();
}
}
@@ -213,7 +213,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const ret = { ...OnDialogSettingsDefault };
ret.configPassphrase = localStorage.getItem("ls-setting-passphrase") || "";
ret.preset = "";
ret.deviceAndVaultName = this.plugin.$$getDeviceAndVaultName();
ret.deviceAndVaultName = this.services.setting.getDeviceAndVaultName();
return ret;
}
computeAllLocalSettings(): Partial<OnDialogSettings> {
@@ -298,7 +298,11 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
async testConnection(settingOverride: Partial<ObsidianLiveSyncSettings> = {}): Promise<void> {
const trialSetting = { ...this.editingSettings, ...settingOverride };
const replicator = await this.plugin.$anyNewReplicator(trialSetting);
const replicator = await this.services.replicator.getNewReplicator(trialSetting);
if (!replicator) {
Logger("No replicator available for the current settings.", LOG_LEVEL_NOTICE);
return;
}
await replicator.tryConnectRemote(trialSetting);
const status = await replicator.getRemoteStatus(trialSetting);
if (status) {
@@ -549,10 +553,14 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const settingForCheck: RemoteDBSettings = {
...this.editingSettings,
};
const replicator = this.plugin.$anyNewReplicator(settingForCheck);
const replicator = this.services.replicator.getNewReplicator(settingForCheck);
if (!(replicator instanceof LiveSyncCouchDBReplicator)) return true;
const db = await replicator.connectRemoteCouchDBWithSetting(settingForCheck, this.plugin.$$isMobile(), true);
const db = await replicator.connectRemoteCouchDBWithSetting(
settingForCheck,
this.services.API.isMobile(),
true
);
if (typeof db === "string") {
Logger($msg("obsidianLiveSyncSettingTab.logCheckPassphraseFailed", { db }), LOG_LEVEL_NOTICE);
return false;
@@ -591,8 +599,8 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
this.editingSettings.passphrase = "";
}
this.applyAllSettings();
await this.plugin.$allSuspendAllSync();
await this.plugin.$allSuspendExtraSync();
await this.services.setting.suspendAllSync();
await this.services.setting.suspendExtraSync();
this.reloadAllSettings();
this.editingSettings.isConfigured = true;
Logger($msg("obsidianLiveSyncSettingTab.logRebuildNote"), LOG_LEVEL_NOTICE);
@@ -641,12 +649,12 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
await this.applyAllSettings();
if (result == OPTION_FETCH) {
await this.plugin.storageAccess.writeFileAuto(FLAGMD_REDFLAG3_HR, "");
this.plugin.$$scheduleAppReload();
this.services.appLifecycle.scheduleRestart();
this.closeSetting();
// await rebuildDB("localOnly");
} else if (result == OPTION_REBUILD_BOTH) {
await this.plugin.storageAccess.writeFileAuto(FLAGMD_REDFLAG2_HR, "");
this.plugin.$$scheduleAppReload();
this.services.appLifecycle.scheduleRestart();
this.closeSetting();
} else if (result == OPTION_ONLY_SETTING) {
await this.plugin.saveSettings();
@@ -861,70 +869,8 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
});
}
async dryRunGC() {
await skipIfDuplicated("cleanup", async () => {
const replicator = this.plugin.$$getReplicator();
if (!(replicator instanceof LiveSyncCouchDBReplicator)) return;
const remoteDBConn = await replicator.connectRemoteCouchDBWithSetting(
this.plugin.settings,
this.plugin.$$isMobile()
);
if (typeof remoteDBConn == "string") {
Logger(remoteDBConn);
return;
}
await purgeUnreferencedChunks(remoteDBConn.db, true, this.plugin.settings, false);
await purgeUnreferencedChunks(this.plugin.localDatabase.localDatabase, true);
this.plugin.localDatabase.hashCaches.clear();
});
}
async dbGC() {
// Lock the remote completely once.
await skipIfDuplicated("cleanup", async () => {
const replicator = this.plugin.$$getReplicator();
if (!(replicator instanceof LiveSyncCouchDBReplicator)) return;
await this.plugin.$$getReplicator().markRemoteLocked(this.plugin.settings, true, true);
const remoteDBConnection = await replicator.connectRemoteCouchDBWithSetting(
this.plugin.settings,
this.plugin.$$isMobile()
);
if (typeof remoteDBConnection == "string") {
Logger(remoteDBConnection);
return;
}
await purgeUnreferencedChunks(remoteDBConnection.db, false, this.plugin.settings, true);
await purgeUnreferencedChunks(this.plugin.localDatabase.localDatabase, false);
this.plugin.localDatabase.hashCaches.clear();
await balanceChunkPurgedDBs(this.plugin.localDatabase.localDatabase, remoteDBConnection.db);
this.plugin.localDatabase.refreshSettings();
Logger(
"The remote database has been cleaned up! Other devices will be cleaned up on the next synchronisation."
);
});
}
getMinioJournalSyncClient() {
const id = this.plugin.settings.accessKey;
const key = this.plugin.settings.secretKey;
const bucket = this.plugin.settings.bucket;
const prefix = this.plugin.settings.bucketPrefix;
const region = this.plugin.settings.region;
const endpoint = this.plugin.settings.endpoint;
const useCustomRequestHandler = this.plugin.settings.useCustomRequestHandler;
const customHeaders = this.plugin.settings.bucketCustomHeaders;
return new JournalSyncMinio(
id,
key,
endpoint,
bucket,
prefix,
this.plugin.simpleStore,
this.plugin,
useCustomRequestHandler,
region,
customHeaders
);
return new JournalSyncMinio(this.plugin.settings, this.plugin.simpleStore, this.plugin);
}
async resetRemoteBucket() {
const minioJournal = this.getMinioJournalSyncClient();

View File

@@ -6,7 +6,7 @@ import type { PageFunctions } from "./SettingPane.ts";
export function paneAdvanced(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElement, { addPanel }: PageFunctions): void {
void addPanel(paneEl, "Memory cache").then((paneEl) => {
new Setting(paneEl).autoWireNumeric("hashCacheMaxCount", { clampMin: 10 });
new Setting(paneEl).autoWireNumeric("hashCacheMaxAmount", { clampMin: 1 });
// new Setting(paneEl).autoWireNumeric("hashCacheMaxAmount", { clampMin: 1 });
});
void addPanel(paneEl, "Local Database Tweak").then((paneEl) => {
paneEl.addClass("wizardHidden");

View File

@@ -3,6 +3,7 @@ import { versionNumberString2Number } from "../../../lib/src/string_and_binary/c
import { $msg } from "../../../lib/src/common/i18n.ts";
import { fireAndForget } from "octagonal-wheels/promises";
import type { ObsidianLiveSyncSettingTab } from "./ObsidianLiveSyncSettingTab.ts";
import { visibleOnly } from "./SettingPane.ts";
//@ts-ignore
const manifestVersion: string = MANIFEST_VERSION || "-";
//@ts-ignore
@@ -10,8 +11,34 @@ const updateInformation: string = UPDATE_INFO || "";
const lastVersion = ~~(versionNumberString2Number(manifestVersion) / 1000);
export function paneChangeLog(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElement): void {
const informationDivEl = this.createEl(paneEl, "div", { text: "" });
const cx = this.createEl(
paneEl,
"div",
{
cls: "op-warn-info",
},
undefined,
visibleOnly(() => !this.isConfiguredAs("versionUpFlash", ""))
);
this.createEl(
cx,
"div",
{
text: this.editingSettings.versionUpFlash,
},
undefined
);
this.createEl(cx, "button", { text: $msg("obsidianLiveSyncSettingTab.btnGotItAndUpdated") }, (e) => {
e.addClass("mod-cta");
e.addEventListener("click", () => {
fireAndForget(async () => {
this.editingSettings.versionUpFlash = "";
await this.saveAllDirtySettings();
});
});
});
const informationDivEl = this.createEl(paneEl, "div", { text: "" });
const tmpDiv = createDiv();
// tmpDiv.addClass("sls-header-button");
tmpDiv.addClass("op-warn-info");

View File

@@ -26,7 +26,7 @@ import { addPrefix, shouldBeIgnored, stripAllPrefixes } from "../../../lib/src/s
import { $msg } from "../../../lib/src/common/i18n.ts";
import { Semaphore } from "octagonal-wheels/concurrency/semaphore";
import { LiveSyncSetting as Setting } from "./LiveSyncSetting.ts";
import { EVENT_REQUEST_RUN_DOCTOR, eventHub } from "../../../common/events.ts";
import { EVENT_REQUEST_RUN_DOCTOR, EVENT_REQUEST_RUN_FIX_INCOMPLETE, eventHub } from "../../../common/events.ts";
import { ICHeader, ICXHeader, PSCHeader } from "../../../common/types.ts";
import { HiddenFileSync } from "../../../features/HiddenFileSync/CmdHiddenFileSync.ts";
import { EVENT_REQUEST_SHOW_HISTORY } from "../../../common/obsidianEvents.ts";
@@ -50,6 +50,19 @@ export function paneHatch(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElement,
eventHub.emitEvent(EVENT_REQUEST_RUN_DOCTOR, "you wanted(Thank you)!");
})
);
new Setting(paneEl)
.setName($msg("Setting.TroubleShooting.ScanBrokenFiles"))
.setDesc($msg("Setting.TroubleShooting.ScanBrokenFiles.Desc"))
.addButton((button) =>
button
.setButtonText("Scan for Broken files")
.setCta()
.setDisabled(false)
.onClick(() => {
this.closeSetting();
eventHub.emitEvent(EVENT_REQUEST_RUN_FIX_INCOMPLETE);
})
);
new Setting(paneEl).setName("Prepare the 'report' to create an issue").addButton((button) =>
button
.setButtonText("Copy Report to clipboard")
@@ -130,6 +143,9 @@ export function paneHatch(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElement,
pluginConfig.jwtKid = redact(pluginConfig.jwtKid);
pluginConfig.bucketCustomHeaders = redact(pluginConfig.bucketCustomHeaders);
pluginConfig.couchDB_CustomHeaders = redact(pluginConfig.couchDB_CustomHeaders);
pluginConfig.P2P_turnCredential = redact(pluginConfig.P2P_turnCredential);
pluginConfig.P2P_turnUsername = redact(pluginConfig.P2P_turnUsername);
pluginConfig.P2P_turnServers = `(${pluginConfig.P2P_turnServers.split(",").length} servers configured)`;
const endpoint = pluginConfig.endpoint;
if (endpoint == "") {
pluginConfig.endpoint = "Not configured or AWS";
@@ -143,7 +159,7 @@ export function paneHatch(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElement,
}
const obsidianInfo = {
navigator: navigator.userAgent,
fileSystem: this.plugin.$$isStorageInsensitive() ? "insensitive" : "sensitive",
fileSystem: this.plugin.services.vault.isStorageInsensitive() ? "insensitive" : "sensitive",
};
const msgConfig = `# ---- Obsidian info ----
${stringifyYaml(obsidianInfo)}
@@ -157,11 +173,13 @@ ${stringifyYaml({
...pluginConfig,
})}`;
console.log(msgConfig);
await navigator.clipboard.writeText(msgConfig);
Logger(
`Generated report has been copied to clipboard. Please report the issue with this! Thank you for your cooperation!`,
LOG_LEVEL_NOTICE
);
if ((await this.services.UI.promptCopyToClipboard("Generated report", msgConfig)) == true) {
// await navigator.clipboard.writeText(msgConfig);
// Logger(
// `Generated report has been copied to clipboard. Please report the issue with this! Thank you for your cooperation!`,
// LOG_LEVEL_NOTICE
// );
}
})
);
new Setting(paneEl).autoWireToggle("writeLogToTheFile");
@@ -169,10 +187,10 @@ ${stringifyYaml({
void addPanel(paneEl, "Scram Switches").then((paneEl) => {
new Setting(paneEl).autoWireToggle("suspendFileWatching");
this.addOnSaved("suspendFileWatching", () => this.plugin.$$askReload());
this.addOnSaved("suspendFileWatching", () => this.services.appLifecycle.askRestart());
new Setting(paneEl).autoWireToggle("suspendParseReplicationResult");
this.addOnSaved("suspendParseReplicationResult", () => this.plugin.$$askReload());
this.addOnSaved("suspendParseReplicationResult", () => this.services.appLifecycle.askRestart());
});
void addPanel(paneEl, "Recovery and Repair").then((paneEl) => {
@@ -190,7 +208,7 @@ ${stringifyYaml({
);
infoGroupEl.appendChild(
this.createEl(infoGroupEl, "div", {
text: `Database: Modified: ${!fileOnDB ? `Missing:` : `${new Date(fileOnDB.mtime).toLocaleString()}, Size:${fileOnDB.size}`}`,
text: `Database: Modified: ${!fileOnDB ? `Missing:` : `${new Date(fileOnDB.mtime).toLocaleString()}, Size:${fileOnDB.size} (actual size:${readAsBlob(fileOnDB).size})`}`,
})
);
})
@@ -335,7 +353,7 @@ ${stringifyYaml({
Logger("Start verifying all files", LOG_LEVEL_NOTICE, "verify");
const ignorePatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesIgnorePatterns");
const targetPatterns = getFileRegExp(this.plugin.settings, "syncInternalFilesTargetPatterns");
this.plugin.localDatabase.hashCaches.clear();
this.plugin.localDatabase.clearCaches();
Logger("Start verifying all files", LOG_LEVEL_NOTICE, "verify");
const files = this.plugin.settings.syncInternalFiles
? await this.plugin.storageAccess.getFilesIncludeHidden("/", targetPatterns, ignorePatterns)
@@ -371,15 +389,16 @@ ${stringifyYaml({
? await this.plugin.storageAccess.statHidden(path)
: false;
const fileOnStorage = stat != null ? stat : false;
if (!(await this.plugin.$$isTargetFile(path))) return incProc();
if (!(await this.services.vault.isTargetFile(path))) return incProc();
const releaser = await semaphore.acquire(1);
if (fileOnStorage && this.plugin.$$isFileSizeExceeded(fileOnStorage.size))
if (fileOnStorage && this.services.vault.isFileSizeTooLarge(fileOnStorage.size))
return incProc();
try {
const isHiddenFile = path.startsWith(".");
const dbPath = isHiddenFile ? addPrefix(path, ICHeader) : path;
const fileOnDB = await this.plugin.localDatabase.getDBEntry(dbPath);
if (fileOnDB && this.plugin.$$isFileSizeExceeded(fileOnDB.size)) return incProc();
if (fileOnDB && this.services.vault.isFileSizeTooLarge(fileOnDB.size))
return incProc();
if (!fileOnDB && fileOnStorage) {
Logger(`Compare: Not found on the local database: ${path}`, LOG_LEVEL_NOTICE);
@@ -423,7 +442,7 @@ ${stringifyYaml({
.onClick(async () => {
for await (const docName of this.plugin.localDatabase.findAllDocNames()) {
if (!docName.startsWith("f:")) {
const idEncoded = await this.plugin.$$path2id(docName as FilePathWithPrefix);
const idEncoded = await this.services.path.path2id(docName as FilePathWithPrefix);
const doc = await this.plugin.localDatabase.getRaw(docName as DocumentID);
if (!doc) continue;
if (doc.type != "newnote" && doc.type != "plain") {
@@ -464,7 +483,7 @@ ${stringifyYaml({
if ((await this.plugin.localDatabase.putRaw(doc)).ok) {
Logger(`Old ${docName} has been deleted`, LOG_LEVEL_NOTICE);
}
await this.plugin.$$queueConflictCheckIfOpen(docName as FilePathWithPrefix);
await this.services.conflict.queueCheckForIfOpen(docName as FilePathWithPrefix);
} else {
Logger(`Converting ${docName} Failed!`, LOG_LEVEL_NOTICE);
Logger(ret, LOG_LEVEL_VERBOSE);
@@ -499,7 +518,7 @@ ${stringifyYaml({
.onClick(async () => {
this.editingSettings.isConfigured = false;
await this.saveAllDirtySettings();
this.plugin.$$askReload();
this.services.appLifecycle.askRestart();
})
);

View File

@@ -1,6 +1,6 @@
import { LocalDatabaseMaintenance } from "../../../features/LocalDatabaseMainte/CmdLocalDatabaseMainte.ts";
import { LOG_LEVEL_NOTICE, Logger } from "../../../lib/src/common/logger.ts";
import { FLAGMD_REDFLAG, FLAGMD_REDFLAG2_HR, FLAGMD_REDFLAG3_HR } from "../../../lib/src/common/types.ts";
import { FlagFilesHumanReadable, FLAGMD_REDFLAG } from "../../../lib/src/common/types.ts";
import { fireAndForget } from "../../../lib/src/common/utils.ts";
import { LiveSyncCouchDBReplicator } from "../../../lib/src/replication/couchdb/LiveSyncReplicator.ts";
import { LiveSyncSetting as Setting } from "./LiveSyncSetting.ts";
@@ -32,7 +32,7 @@ export function paneMaintenance(
(e) => {
e.addEventListener("click", () => {
fireAndForget(async () => {
await this.plugin.$$markRemoteResolved();
await this.services.remote.markResolved();
this.display();
});
});
@@ -59,7 +59,7 @@ export function paneMaintenance(
(e) => {
e.addEventListener("click", () => {
fireAndForget(async () => {
await this.plugin.$$markRemoteUnlocked();
await this.services.remote.markUnlocked();
this.display();
});
});
@@ -78,7 +78,7 @@ export function paneMaintenance(
.setDisabled(false)
.setWarning()
.onClick(async () => {
await this.plugin.$$markRemoteLocked();
await this.services.remote.markLocked();
})
)
.addOnUpdate(this.onlyOnCouchDBOrMinIO);
@@ -93,7 +93,36 @@ export function paneMaintenance(
.setWarning()
.onClick(async () => {
await this.plugin.storageAccess.writeFileAuto(FLAGMD_REDFLAG, "");
this.plugin.$$performRestart();
this.services.appLifecycle.performRestart();
})
);
});
void addPanel(paneEl, "Reset Synchronisation information").then((paneEl) => {
new Setting(paneEl)
.setName("Reset Synchronisation on This Device")
.setDesc("Restore or reconstruct local database from remote.")
.addButton((button) =>
button
.setButtonText("Schedule and Restart")
.setCta()
.setDisabled(false)
.onClick(async () => {
await this.plugin.storageAccess.writeFileAuto(FlagFilesHumanReadable.FETCH_ALL, "");
this.services.appLifecycle.performRestart();
})
);
new Setting(paneEl)
.setName("Overwrite Server Data with This Device's Files")
.setDesc("Rebuild local and remote database with local files.")
.addButton((button) =>
button
.setButtonText("Schedule and Restart")
.setCta()
.setDisabled(false)
.onClick(async () => {
await this.plugin.storageAccess.writeFileAuto(FlagFilesHumanReadable.REBUILD_ALL, "");
this.services.appLifecycle.performRestart();
})
);
});
@@ -158,36 +187,40 @@ export function paneMaintenance(
)
.addOnUpdate(this.onlyOnMinIO);
});
void addPanel(paneEl, "Garbage Collection (Beta)", (e) => e, this.onlyOnP2POrCouchDB).then((paneEl) => {
void addPanel(paneEl, "Garbage Collection (Beta2)", (e) => e, this.onlyOnP2POrCouchDB).then((paneEl) => {
new Setting(paneEl)
.setName("Remove all orphaned chunks")
.setDesc("Remove all orphaned chunks from the local database.")
.setName("Scan garbage")
.setDesc("Scan for garbage chunks in the database.")
.addButton((button) =>
button
.setButtonText("Remove")
.setWarning()
.setButtonText("Scan")
// .setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin
.getAddOn<LocalDatabaseMaintenance>(LocalDatabaseMaintenance.name)
?.removeUnusedChunks();
?.trackChanges(false, true);
})
);
new Setting(paneEl)
.setName("Resurrect deleted chunks")
.setDesc(
"If you have deleted chunks before fully synchronised and missed some chunks, you possibly can resurrect them."
)
.addButton((button) =>
button.setButtonText("Rescan").onClick(async () => {
await this.plugin
.getAddOn<LocalDatabaseMaintenance>(LocalDatabaseMaintenance.name)
?.trackChanges(true, true);
})
);
new Setting(paneEl)
.setName("Collect garbage")
.setDesc("Remove all unused chunks from the local database.")
.addButton((button) =>
button
.setButtonText("Try resurrect")
.setButtonText("Collect")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin
.getAddOn<LocalDatabaseMaintenance>(LocalDatabaseMaintenance.name)
?.resurrectChunks();
?.performGC(true);
})
);
new Setting(paneEl)
@@ -205,69 +238,42 @@ export function paneMaintenance(
})
);
});
void addPanel(paneEl, "Rebuilding Operations (Local)").then((paneEl) => {
new Setting(paneEl)
.setName("Fetch from remote")
.setDesc("Restore or reconstruct local database from remote.")
.addButton((button) =>
button
.setButtonText("Fetch")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin.storageAccess.writeFileAuto(FLAGMD_REDFLAG3_HR, "");
this.plugin.$$performRestart();
})
)
.addButton((button) =>
button
.setButtonText("Fetch w/o restarting")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.rebuildDB("localOnly");
})
);
void addPanel(paneEl, "Garbage Collection (Old and Experimental)", (e) => e, this.onlyOnP2POrCouchDB).then(
(paneEl) => {
new Setting(paneEl)
.setName("Remove all orphaned chunks")
.setDesc("Remove all orphaned chunks from the local database.")
.addButton((button) =>
button
.setButtonText("Remove")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin
.getAddOn<LocalDatabaseMaintenance>(LocalDatabaseMaintenance.name)
?.removeUnusedChunks();
})
);
new Setting(paneEl)
.setName("Fetch rebuilt DB (Save local documents before)")
.setDesc("Restore or reconstruct local database from remote database but use local chunks.")
.addButton((button) =>
button
.setButtonText("Save and Fetch")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.rebuildDB("localOnlyWithChunks");
})
)
.addOnUpdate(this.onlyOnCouchDB);
});
new Setting(paneEl)
.setName("Resurrect deleted chunks")
.setDesc(
"If you have deleted chunks before fully synchronised and missed some chunks, you possibly can resurrect them."
)
.addButton((button) =>
button
.setButtonText("Try resurrect")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin
.getAddOn<LocalDatabaseMaintenance>(LocalDatabaseMaintenance.name)
?.resurrectChunks();
})
);
}
);
void addPanel(paneEl, "Total Overhaul", () => {}, this.onlyOnCouchDBOrMinIO).then((paneEl) => {
new Setting(paneEl)
.setName("Rebuild everything")
.setDesc("Rebuild local and remote database with local files.")
.addButton((button) =>
button
.setButtonText("Rebuild")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin.storageAccess.writeFileAuto(FLAGMD_REDFLAG2_HR, "");
this.plugin.$$performRestart();
})
)
.addButton((button) =>
button
.setButtonText("Rebuild w/o restarting")
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.rebuildDB("rebuildBothByThisDevice");
})
);
});
void addPanel(paneEl, "Rebuilding Operations (Remote Only)", () => {}, this.onlyOnCouchDBOrMinIO).then((paneEl) => {
new Setting(paneEl)
.setName("Perform cleanup")
@@ -366,8 +372,8 @@ export function paneMaintenance(
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin.$$resetLocalDatabase();
await this.plugin.$$initializeDatabase();
await this.services.database.resetDatabase();
await this.services.databaseEvents.initialiseDatabase();
})
);
});

View File

@@ -3,12 +3,16 @@ import {
E2EEAlgorithms,
type HashAlgorithm,
LOG_LEVEL_NOTICE,
SuffixDatabaseName,
} from "../../../lib/src/common/types.ts";
import { Logger } from "../../../lib/src/common/logger.ts";
import { LiveSyncSetting as Setting } from "./LiveSyncSetting.ts";
import type { ObsidianLiveSyncSettingTab } from "./ObsidianLiveSyncSettingTab.ts";
import type { PageFunctions } from "./SettingPane.ts";
import { visibleOnly } from "./SettingPane.ts";
import { PouchDB } from "../../../lib/src/pouchdb/pouchdb-browser";
import { ExtraSuffixIndexedDB } from "../../../lib/src/common/types.ts";
import { migrateDatabases } from "./settingUtils.ts";
export function panePatches(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElement, { addPanel }: PageFunctions): void {
void addPanel(paneEl, "Compatibility (Metadata)").then((paneEl) => {
@@ -26,17 +30,88 @@ export function panePatches(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElemen
});
void addPanel(paneEl, "Compatibility (Database structure)").then((paneEl) => {
new Setting(paneEl).autoWireToggle("useIndexedDBAdapter", { invert: true, holdValue: true });
new Setting(paneEl)
.autoWireToggle("doNotUseFixedRevisionForChunks", { holdValue: true })
.setClass("wizardHidden");
const migrateAllToIndexedDB = async () => {
const dbToName = this.plugin.localDatabase.dbname + SuffixDatabaseName + ExtraSuffixIndexedDB;
const options = {
adapter: "indexeddb",
//@ts-ignore :missing def
purged_infos_limit: 1,
auto_compaction: false,
deterministic_revs: true,
};
const openTo = () => {
return new PouchDB(dbToName, options);
};
if (await migrateDatabases("to IndexedDB", this.plugin.localDatabase.localDatabase, openTo)) {
Logger(
"Migration to IndexedDB completed. Obsidian will be restarted with new configuration immediately.",
LOG_LEVEL_NOTICE
);
this.plugin.settings.useIndexedDBAdapter = true;
await this.services.setting.saveSettingData();
this.services.appLifecycle.performRestart();
}
};
const migrateAllToIDB = async () => {
const dbToName = this.plugin.localDatabase.dbname + SuffixDatabaseName;
const options = {
adapter: "idb",
auto_compaction: false,
deterministic_revs: true,
};
const openTo = () => {
return new PouchDB(dbToName, options);
};
if (await migrateDatabases("to IDB", this.plugin.localDatabase.localDatabase, openTo)) {
Logger(
"Migration to IDB completed. Obsidian will be restarted with new configuration immediately.",
LOG_LEVEL_NOTICE
);
this.plugin.settings.useIndexedDBAdapter = false;
await this.services.setting.saveSettingData();
this.services.appLifecycle.performRestart();
}
};
{
const infoClass = this.editingSettings.useIndexedDBAdapter ? "op-warn" : "op-warn-info";
paneEl.createDiv({
text: "The IndexedDB adapter often offers superior performance in certain scenarios, but it has been found to cause memory leaks when used with LiveSync mode. When using LiveSync mode, please use IDB adapter instead.",
cls: infoClass,
});
paneEl.createDiv({
text: "Changing this setting requires migrating existing data (a bit time may be taken) and restarting Obsidian. Please make sure to back up your data before proceeding.",
cls: "op-warn-info",
});
const setting = new Setting(paneEl)
.setName("Database Adapter")
.setDesc("Select the database adapter to use. ");
const el = setting.controlEl.createDiv({});
el.setText(`Current adapter: ${this.editingSettings.useIndexedDBAdapter ? "IndexedDB" : "IDB"}`);
if (!this.editingSettings.useIndexedDBAdapter) {
setting.addButton((button) => {
button.setButtonText("Switch to IndexedDB").onClick(async () => {
Logger("Migrating all data to IndexedDB...", LOG_LEVEL_NOTICE);
await migrateAllToIndexedDB();
Logger(
"Migration to IndexedDB completed. Please switch the adapter and restart Obsidian.",
LOG_LEVEL_NOTICE
);
});
});
} else {
setting.addButton((button) => {
button.setButtonText("Switch to IDB").onClick(async () => {
Logger("Migrating all data to IDB...", LOG_LEVEL_NOTICE);
await migrateAllToIDB();
Logger(
"Migration to IDB completed. Please switch the adapter and restart Obsidian.",
LOG_LEVEL_NOTICE
);
});
});
}
}
new Setting(paneEl).autoWireToggle("handleFilenameCaseSensitive", { holdValue: true }).setClass("wizardHidden");
this.addOnSaved("useIndexedDBAdapter", async () => {
await this.saveAllDirtySettings();
await this.rebuildDB("localOnly");
});
});
void addPanel(paneEl, "Compatibility (Internal API Usage)").then((paneEl) => {
@@ -63,7 +138,7 @@ export function panePatches(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElemen
this.addOnSaved("additionalSuffixOfDatabaseName", async (key) => {
Logger("Suffix has been changed. Reopening database...", LOG_LEVEL_NOTICE);
await this.plugin.$$initializeDatabase();
await this.services.databaseEvents.initialiseDatabase();
});
new Setting(paneEl).autoWireDropDown("hashAlg", {
@@ -82,6 +157,7 @@ export function panePatches(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElemen
void addPanel(paneEl, "Edge case addressing (Behaviour)").then((paneEl) => {
new Setting(paneEl).autoWireToggle("doNotSuspendOnFetching");
new Setting(paneEl).setClass("wizardHidden").autoWireToggle("doNotDeleteFolder");
new Setting(paneEl).autoWireToggle("processSizeMismatchedFiles");
});
void addPanel(paneEl, "Edge case addressing (Processing)").then((paneEl) => {
@@ -99,13 +175,13 @@ export function panePatches(this: ObsidianLiveSyncSettingTab, paneEl: HTMLElemen
});
void addPanel(paneEl, "Remote Database Tweak (In sunset)").then((paneEl) => {
new Setting(paneEl).autoWireToggle("useEden").setClass("wizardHidden");
const onlyUsingEden = visibleOnly(() => this.isConfiguredAs("useEden", true));
new Setting(paneEl).autoWireNumeric("maxChunksInEden", { onUpdate: onlyUsingEden }).setClass("wizardHidden");
new Setting(paneEl)
.autoWireNumeric("maxTotalLengthInEden", { onUpdate: onlyUsingEden })
.setClass("wizardHidden");
new Setting(paneEl).autoWireNumeric("maxAgeInEden", { onUpdate: onlyUsingEden }).setClass("wizardHidden");
// new Setting(paneEl).autoWireToggle("useEden").setClass("wizardHidden");
// const onlyUsingEden = visibleOnly(() => this.isConfiguredAs("useEden", true));
// new Setting(paneEl).autoWireNumeric("maxChunksInEden", { onUpdate: onlyUsingEden }).setClass("wizardHidden");
// new Setting(paneEl)
// .autoWireNumeric("maxTotalLengthInEden", { onUpdate: onlyUsingEden })
// .setClass("wizardHidden");
// new Setting(paneEl).autoWireNumeric("maxAgeInEden", { onUpdate: onlyUsingEden }).setClass("wizardHidden");
new Setting(paneEl).autoWireToggle("enableCompression").setClass("wizardHidden");
});

File diff suppressed because it is too large

View File

@@ -117,5 +117,26 @@ export function paneSelector(this: ObsidianLiveSyncSettingTab, paneEl: HTMLEleme
await addDefaultPatterns(defaultSkipPatternXPlat);
});
});
const overwritePatterns = new Setting(paneEl)
.setName("Overwrite patterns")
.setClass("wizardHidden")
.setDesc("Patterns to match files for overwriting instead of merging");
const patTarget2 = splitCustomRegExpList(this.editingSettings.syncInternalFileOverwritePatterns, ",");
mount(MultipleRegExpControl, {
target: overwritePatterns.controlEl,
props: {
patterns: patTarget2,
originals: [...patTarget2],
apply: async (newPatterns: CustomRegExpSource[]) => {
this.editingSettings.syncInternalFileOverwritePatterns = constructCustomRegExpList(
newPatterns,
","
);
await this.saveAllDirtySettings();
this.display();
},
},
});
});
}

View File

@@ -13,6 +13,7 @@ import type { PageFunctions } from "./SettingPane.ts";
import { visibleOnly } from "./SettingPane.ts";
import { DEFAULT_SETTINGS } from "../../../lib/src/common/types.ts";
import { request } from "obsidian";
import { SetupManager, UserMode } from "../SetupManager.ts";
export function paneSetup(
this: ObsidianLiveSyncSettingTab,
paneEl: HTMLElement,
@@ -30,11 +31,13 @@ export function paneSetup(
});
new Setting(paneEl)
.setName($msg("obsidianLiveSyncSettingTab.nameManualSetup"))
.setDesc($msg("obsidianLiveSyncSettingTab.descManualSetup"))
.setName("Rerun Onboarding Wizard")
.setDesc("Rerun the onboarding wizard to set up Self-hosted LiveSync again.")
.addButton((text) => {
text.setButtonText($msg("obsidianLiveSyncSettingTab.btnStart")).onClick(async () => {
await this.enableMinimalSetup();
text.setButtonText("Rerun Wizard").onClick(async () => {
const setupManager = this.plugin.getModule(SetupManager);
await setupManager.onOnboard(UserMode.ExistingUser);
// await this.plugin.moduleSetupObsidian.onBoardingWizard(true);
});
});
@@ -46,7 +49,7 @@ export function paneSetup(
text.setButtonText($msg("obsidianLiveSyncSettingTab.btnEnable")).onClick(async () => {
this.editingSettings.isConfigured = true;
await this.saveAllDirtySettings();
this.plugin.$$askReload();
this.services.appLifecycle.askRestart();
});
});
});
@@ -91,10 +94,10 @@ export function paneSetup(
this.editingSettings = { ...this.editingSettings, ...DEFAULT_SETTINGS };
await this.saveAllDirtySettings();
this.plugin.settings = { ...DEFAULT_SETTINGS };
await this.plugin.$$saveSettingData();
await this.plugin.$$resetLocalDatabase();
await this.services.setting.saveSettingData();
await this.services.database.resetDatabase();
// await this.plugin.initializeDatabase();
this.plugin.$$askReload();
this.services.appLifecycle.askRestart();
}
})
.setWarning();

View File

@@ -7,7 +7,6 @@ import {
import { Logger } from "../../../lib/src/common/logger.ts";
import { $msg } from "../../../lib/src/common/i18n.ts";
import { LiveSyncSetting as Setting } from "./LiveSyncSetting.ts";
import { fireAndForget } from "octagonal-wheels/promises";
import { EVENT_REQUEST_COPY_SETUP_URI, eventHub } from "../../../common/events.ts";
import type { ObsidianLiveSyncSettingTab } from "./ObsidianLiveSyncSettingTab.ts";
import type { PageFunctions } from "./SettingPane.ts";
@@ -17,30 +16,6 @@ export function paneSyncSettings(
paneEl: HTMLElement,
{ addPanel, addPane }: PageFunctions
): void {
if (this.editingSettings.versionUpFlash != "") {
const c = this.createEl(
paneEl,
"div",
{
text: this.editingSettings.versionUpFlash,
cls: "op-warn sls-setting-hidden",
},
(el) => {
this.createEl(el, "button", { text: $msg("obsidianLiveSyncSettingTab.btnGotItAndUpdated") }, (e) => {
e.addClass("mod-cta");
e.addEventListener("click", () => {
fireAndForget(async () => {
this.editingSettings.versionUpFlash = "";
await this.saveAllDirtySettings();
c.remove();
});
});
});
},
visibleOnly(() => !this.isConfiguredAs("versionUpFlash", ""))
);
}
this.createEl(paneEl, "div", {
text: $msg("obsidianLiveSyncSettingTab.msgSelectAndApplyPreset"),
cls: "wizardOnly",
@@ -130,7 +105,7 @@ export function paneSyncSettings(
if (!this.editingSettings.isConfigured) {
this.editingSettings.isConfigured = true;
await this.saveAllDirtySettings();
await this.plugin.$$realizeSettingSyncMode();
await this.services.setting.realiseSetting();
await this.rebuildDB("localOnly");
// this.resetEditingSettings();
if (
@@ -149,13 +124,13 @@ export function paneSyncSettings(
await this.confirmRebuild();
} else {
await this.saveAllDirtySettings();
await this.plugin.$$realizeSettingSyncMode();
this.plugin.$$askReload();
await this.services.setting.realiseSetting();
this.services.appLifecycle.askRestart();
}
}
} else {
await this.saveAllDirtySettings();
await this.plugin.$$realizeSettingSyncMode();
await this.services.setting.realiseSetting();
}
});
});
@@ -194,7 +169,7 @@ export function paneSyncSettings(
}
await this.saveSettings(["liveSync", "periodicReplication"]);
await this.plugin.$$realizeSettingSyncMode();
await this.services.setting.realiseSetting();
});
new Setting(paneEl)
@@ -314,21 +289,21 @@ export function paneSyncSettings(
button.setButtonText("Merge").onClick(async () => {
this.closeSetting();
// this.resetEditingSettings();
await this.plugin.$anyConfigureOptionalSyncFeature("MERGE");
await this.services.setting.enableOptionalFeature("MERGE");
});
})
.addButton((button) => {
button.setButtonText("Fetch").onClick(async () => {
this.closeSetting();
// this.resetEditingSettings();
await this.plugin.$anyConfigureOptionalSyncFeature("FETCH");
await this.services.setting.enableOptionalFeature("FETCH");
});
})
.addButton((button) => {
button.setButtonText("Overwrite").onClick(async () => {
this.closeSetting();
// this.resetEditingSettings();
await this.plugin.$anyConfigureOptionalSyncFeature("OVERWRITE");
await this.services.setting.enableOptionalFeature("OVERWRITE");
});
});
}

View File

@@ -0,0 +1,54 @@
import { mount, type Component, unmount } from "svelte";
import { type Writable, writable, get } from "svelte/store";
/**
* Props passed to Svelte panels, containing a writable port
* to communicate with the panel
*/
export type SveltePanelProps<T = any> = {
port: Writable<T | undefined>;
};
/**
* A class to manage a Svelte panel within Obsidian
* Especially useful for settings panels
*/
export class SveltePanel<T = any> {
private _mountedComponent: ReturnType<typeof mount>;
private _componentValue = writable<T | undefined>(undefined);
/**
* Creates a Svelte panel instance
* @param component Component to mount
* @param mountTo HTMLElement to mount the component to
* @param valueStore Optional writable store to bind to the component's port, if not provided a new one will be created
* @returns The SveltePanel instance
*/
constructor(component: Component<SveltePanelProps<T>>, mountTo: HTMLElement, valueStore?: Writable<T>) {
this._componentValue = valueStore ?? writable<T | undefined>(undefined);
this._mountedComponent = mount(component, {
target: mountTo,
props: {
port: this._componentValue,
},
});
return this;
}
/**
* Destroys the Svelte panel instance by unmounting the component
*/
destroy() {
if (this._mountedComponent) {
void unmount(this._mountedComponent);
}
}
/**
* Gets or sets the current value of the component's port
*/
get componentValue() {
return get(this._componentValue);
}
set componentValue(value: T | undefined) {
this._componentValue.set(value);
}
}
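As a rough usage sketch (the wrapped component, mount target, and values are hypothetical; the InfoPanel added earlier in this diff is reused here, and its import path is assumed):
import InfoPanel from "./InfoPanel.svelte";
import { SveltePanel } from "./SveltePanel";

// Mount the panel; a dedicated writable store is created internally when none is passed.
const containerEl = document.createElement("div");
const panel = new SveltePanel(InfoPanel, containerEl);
// Push data to the component through its port.
panel.componentValue = { info: { Endpoint: "https://example.invalid" } };
// Unmount when the hosting pane is disposed.
panel.destroy();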

View File

@@ -0,0 +1,154 @@
import { escapeStringToHTML } from "octagonal-wheels/string";
import {
E2EEAlgorithmNames,
MILESTONE_DOCID,
NODEINFO_DOCID,
type ObsidianLiveSyncSettings,
} from "../../../lib/src/common/types";
import {
pickCouchDBSyncSettings,
pickBucketSyncSettings,
pickP2PSyncSettings,
pickEncryptionSettings,
} from "../../../lib/src/common/utils";
import { getConfig, type AllSettingItemKey } from "./settingConstants";
import { LOG_LEVEL_NOTICE, Logger } from "octagonal-wheels/common/logger";
/**
* Generates a summary of P2P configuration settings
* @param setting Settings object
* @param additional Additional summary information to include
* @param showAdvanced Whether to include advanced settings
* @returns Summary object
*/
export function getP2PConfigSummary(
setting: ObsidianLiveSyncSettings,
additional: Record<string, string> = {},
showAdvanced = false
) {
const settingTable: Partial<ObsidianLiveSyncSettings> = pickP2PSyncSettings(setting);
return { ...getSummaryFromPartialSettings({ ...settingTable }, showAdvanced), ...additional };
}
/**
* Generates a summary of Object Storage configuration settings
* @param setting Settings object
* @param showAdvanced Whether to include advanced settings
* @returns Summary object
*/
export function getBucketConfigSummary(setting: ObsidianLiveSyncSettings, showAdvanced = false) {
const settingTable: Partial<ObsidianLiveSyncSettings> = pickBucketSyncSettings(setting);
return getSummaryFromPartialSettings(settingTable, showAdvanced);
}
/**
* Generates a summary of CouchDB configuration settings
* @param setting Settings object
* @param showAdvanced Whether to include advanced settings
* @returns Summary object
*/
export function getCouchDBConfigSummary(setting: ObsidianLiveSyncSettings, showAdvanced = false) {
const settingTable: Partial<ObsidianLiveSyncSettings> = pickCouchDBSyncSettings(setting);
return getSummaryFromPartialSettings(settingTable, showAdvanced || setting.useJWT);
}
/**
* Generates a summary of E2EE configuration settings
* @param setting Settings object
* @param showAdvanced Whether to include advanced settings
* @returns Summary object
*/
export function getE2EEConfigSummary(setting: ObsidianLiveSyncSettings, showAdvanced = false) {
const settingTable: Partial<ObsidianLiveSyncSettings> = pickEncryptionSettings(setting);
return getSummaryFromPartialSettings(settingTable, showAdvanced);
}
/**
* Converts partial settings into a summary object
* @param setting Partial settings object
* @param showAdvanced Whether to include advanced settings
* @returns Summary object
*/
export function getSummaryFromPartialSettings(setting: Partial<ObsidianLiveSyncSettings>, showAdvanced = false) {
const outputSummary: Record<string, string> = {};
for (const key of Object.keys(setting) as (keyof ObsidianLiveSyncSettings)[]) {
const config = getConfig(key as AllSettingItemKey);
if (!config) continue;
if (config.isAdvanced && !showAdvanced) continue;
const value =
key != "E2EEAlgorithm"
? `${setting[key]}`
: E2EEAlgorithmNames[`${setting[key]}` as keyof typeof E2EEAlgorithmNames];
const displayValue = config.isHidden ? "•".repeat(value.length) : escapeStringToHTML(value);
outputSummary[config.name] = displayValue;
}
return outputSummary;
}
// Migration or de-migration helper functions
/**
* Copy document from one database to another for migration purposes
* @param docName document ID
* @param dbFrom source database
* @param dbTo destination database
* @returns
*/
export async function copyMigrationDocs(docName: string, dbFrom: PouchDB.Database, dbTo: PouchDB.Database) {
try {
const doc = await dbFrom.get(docName);
delete (doc as any)._rev;
await dbTo.put(doc);
} catch (e) {
if ((e as any).status === 404) {
return;
}
throw e;
}
}
type PouchDBOpenFunction = () => Promise<PouchDB.Database> | PouchDB.Database;
/**
* Migrate databases from one to another
* @param operationName Name of the migration operation
* @param from source database
* @param openTo function to open destination database
* @returns True if migration succeeded
*/
export async function migrateDatabases(operationName: string, from: PouchDB.Database, openTo: PouchDBOpenFunction) {
const dbTo = await openTo();
await dbTo.info(); // ensure created
Logger(`Opening destination database for migration: ${operationName}.`, LOG_LEVEL_NOTICE, "migration");
// destroy existing data
await dbTo.destroy();
Logger(`Destroyed existing destination database for migration: ${operationName}.`, LOG_LEVEL_NOTICE, "migration");
const dbTo2 = await openTo();
const info2 = await dbTo2.info(); // ensure created
console.log(info2);
Logger(`Re-created destination database for migration: ${operationName}.`, LOG_LEVEL_NOTICE, "migration");
const info = await from.info();
const totalDocs = info.doc_count || 0;
const result = await from.replicate
.to(dbTo2, {
//@ts-ignore Missing in typedefs
style: "all_docs",
})
.on("change", (info) => {
Logger(
`Replicating... Docs replicated: ${info.docs_written} / ${totalDocs}`,
LOG_LEVEL_NOTICE,
"migration"
);
});
if (result.ok) {
Logger(`Replication completed for migration: ${operationName}.`, LOG_LEVEL_NOTICE, "migration");
} else {
throw new Error(`Replication failed for migration: ${operationName}.`);
}
await copyMigrationDocs(MILESTONE_DOCID, from, dbTo2);
await copyMigrationDocs(NODEINFO_DOCID, from, dbTo2);
Logger(`Copied migration documents for migration: ${operationName}.`, LOG_LEVEL_NOTICE, "migration");
await dbTo2.close();
return true;
}
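A minimal sketch of driving migrateDatabases directly; the database names, adapter options, and import paths below are placeholders that mirror the "Switch to IndexedDB" path shown in panePatches above, not the plugin's real configuration.
import { PouchDB } from "../../../lib/src/pouchdb/pouchdb-browser";
import { migrateDatabases } from "./settingUtils";

async function migrateExample() {
    // Hypothetical source database and destination factory.
    const from = new PouchDB("livesync-example-db");
    const openTo = () => new PouchDB("livesync-example-db-indexeddb", { adapter: "indexeddb" });
    // Replicates every document (style: "all_docs"), then copies the milestone and node-info documents.
    if (await migrateDatabases("to IndexedDB (example)", from, openTo)) {
        console.log("Migration finished; switch the adapter setting and restart Obsidian.");
    }
}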

View File

@@ -0,0 +1,274 @@
import { requestToCouchDBWithCredentials } from "../../../common/utils";
import { $msg } from "../../../lib/src/common/i18n";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, Logger } from "../../../lib/src/common/logger";
import type { ObsidianLiveSyncSettings } from "../../../lib/src/common/types";
import { fireAndForget, parseHeaderValues } from "../../../lib/src/common/utils";
import { isCloudantURI } from "../../../lib/src/pouchdb/utils_couchdb";
import { generateCredentialObject } from "../../../lib/src/replication/httplib";
export const checkConfig = async (
checkResultDiv: HTMLDivElement | undefined,
editingSettings: ObsidianLiveSyncSettings
) => {
Logger($msg("obsidianLiveSyncSettingTab.logCheckingDbConfig"), LOG_LEVEL_INFO);
let isSuccessful = true;
const emptyDiv = createDiv();
emptyDiv.innerHTML = "<span></span>";
checkResultDiv?.replaceChildren(...[emptyDiv]);
const addResult = (msg: string, classes?: string[]) => {
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
if (classes) {
tmpDiv.addClasses(classes);
}
tmpDiv.innerHTML = `${msg}`;
checkResultDiv?.appendChild(tmpDiv);
};
try {
if (isCloudantURI(editingSettings.couchDB_URI)) {
Logger($msg("obsidianLiveSyncSettingTab.logCannotUseCloudant"), LOG_LEVEL_NOTICE);
return;
}
// Tip: Add log for cloudant as Logger($msg("obsidianLiveSyncSettingTab.logServerConfigurationCheck"));
const customHeaders = parseHeaderValues(editingSettings.couchDB_CustomHeaders);
const credential = generateCredentialObject(editingSettings);
const r = await requestToCouchDBWithCredentials(
editingSettings.couchDB_URI,
credential,
window.origin,
undefined,
undefined,
undefined,
customHeaders
);
const responseConfig = r.json;
const addConfigFixButton = (title: string, key: string, value: string) => {
if (!checkResultDiv) return;
const tmpDiv = createDiv();
tmpDiv.addClass("ob-btn-config-fix");
tmpDiv.innerHTML = `<label>${title}</label><button>${$msg("obsidianLiveSyncSettingTab.btnFix")}</button>`;
const x = checkResultDiv.appendChild(tmpDiv);
x.querySelector("button")?.addEventListener("click", () => {
fireAndForget(async () => {
Logger($msg("obsidianLiveSyncSettingTab.logCouchDbConfigSet", { title, key, value }));
const res = await requestToCouchDBWithCredentials(
editingSettings.couchDB_URI,
credential,
undefined,
key,
value,
undefined,
customHeaders
);
if (res.status == 200) {
Logger($msg("obsidianLiveSyncSettingTab.logCouchDbConfigUpdated", { title }), LOG_LEVEL_NOTICE);
checkResultDiv.removeChild(x);
await checkConfig(checkResultDiv, editingSettings);
} else {
Logger($msg("obsidianLiveSyncSettingTab.logCouchDbConfigFail", { title }), LOG_LEVEL_NOTICE);
Logger(res.text, LOG_LEVEL_VERBOSE);
}
});
});
};
addResult($msg("obsidianLiveSyncSettingTab.msgNotice"), ["ob-btn-config-head"]);
addResult($msg("obsidianLiveSyncSettingTab.msgIfConfigNotPersistent"), ["ob-btn-config-info"]);
addResult($msg("obsidianLiveSyncSettingTab.msgConfigCheck"), ["ob-btn-config-head"]);
const serverBanner = r.headers["server"] ?? r.headers["Server"] ?? "unknown";
addResult($msg("obsidianLiveSyncSettingTab.serverVersion", { info: serverBanner }));
const versionMatch = serverBanner.match(/CouchDB(\/([0-9.]+))?/);
const versionStr = versionMatch ? versionMatch[2] : "0.0.0";
const versionParts = `${versionStr}.0.0.0`.split(".");
// Compare version string with the target version.
// version must be a string like "3.2.1" or "3.10.2", and must be two or three parts.
function isGreaterThanOrEqual(version: string) {
const targetParts = version.split(".");
for (let i = 0; i < targetParts.length; i++) {
// compare as number if possible (so 3.10 > 3.2, 3.10.1b > 3.10.1a)
const result = versionParts[i].localeCompare(targetParts[i], undefined, { numeric: true });
if (result > 0) return true;
if (result < 0) return false;
}
return true;
}
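// Illustrative example (hypothetical banners): "CouchDB/3.3.2" yields versionStr "3.3.2",
// so isGreaterThanOrEqual("3.2.0") is true, whereas a banner of "CouchDB/3.1.1" would yield false.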
// Admin check
// for database creation and deletion
if (!(editingSettings.couchDB_USER in responseConfig.admins)) {
addResult($msg("obsidianLiveSyncSettingTab.warnNoAdmin"));
} else {
addResult($msg("obsidianLiveSyncSettingTab.okAdminPrivileges"));
}
if (isGreaterThanOrEqual("3.2.0")) {
// HTTP user-authorization check
if (responseConfig?.chttpd?.require_valid_user != "true") {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errRequireValidUser"));
addConfigFixButton(
$msg("obsidianLiveSyncSettingTab.msgSetRequireValidUser"),
"chttpd/require_valid_user",
"true"
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okRequireValidUser"));
}
} else {
if (responseConfig?.chttpd_auth?.require_valid_user != "true") {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errRequireValidUserAuth"));
addConfigFixButton(
$msg("obsidianLiveSyncSettingTab.msgSetRequireValidUserAuth"),
"chttpd_auth/require_valid_user",
"true"
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okRequireValidUserAuth"));
}
}
// HTTPD check
// Check Authentication header
if (!responseConfig?.httpd["WWW-Authenticate"]) {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errMissingWwwAuth"));
addConfigFixButton(
$msg("obsidianLiveSyncSettingTab.msgSetWwwAuth"),
"httpd/WWW-Authenticate",
'Basic realm="couchdb"'
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okWwwAuth"));
}
if (isGreaterThanOrEqual("3.2.0")) {
if (responseConfig?.chttpd?.enable_cors != "true") {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errEnableCorsChttpd"));
addConfigFixButton(
$msg("obsidianLiveSyncSettingTab.msgEnableCorsChttpd"),
"chttpd/enable_cors",
"true"
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okEnableCorsChttpd"));
}
} else {
if (responseConfig?.httpd?.enable_cors != "true") {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errEnableCors"));
addConfigFixButton($msg("obsidianLiveSyncSettingTab.msgEnableCors"), "httpd/enable_cors", "true");
} else {
addResult($msg("obsidianLiveSyncSettingTab.okEnableCors"));
}
}
// If the server is not cloudant, configure request size
if (!isCloudantURI(editingSettings.couchDB_URI)) {
// REQUEST SIZE
if (Number(responseConfig?.chttpd?.max_http_request_size ?? 0) < 4294967296) {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errMaxRequestSize"));
addConfigFixButton(
$msg("obsidianLiveSyncSettingTab.msgSetMaxRequestSize"),
"chttpd/max_http_request_size",
"4294967296"
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okMaxRequestSize"));
}
if (Number(responseConfig?.couchdb?.max_document_size ?? 0) < 50000000) {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errMaxDocumentSize"));
addConfigFixButton(
$msg("obsidianLiveSyncSettingTab.msgSetMaxDocSize"),
"couchdb/max_document_size",
"50000000"
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okMaxDocumentSize"));
}
}
// CORS check
// checking connectivity for mobile
if (responseConfig?.cors?.credentials != "true") {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errCorsCredentials"));
addConfigFixButton($msg("obsidianLiveSyncSettingTab.msgSetCorsCredentials"), "cors/credentials", "true");
} else {
addResult($msg("obsidianLiveSyncSettingTab.okCorsCredentials"));
}
const ConfiguredOrigins = ((responseConfig?.cors?.origins ?? "") + "").split(",");
if (
responseConfig?.cors?.origins == "*" ||
(ConfiguredOrigins.indexOf("app://obsidian.md") !== -1 &&
ConfiguredOrigins.indexOf("capacitor://localhost") !== -1 &&
ConfiguredOrigins.indexOf("http://localhost") !== -1)
) {
addResult($msg("obsidianLiveSyncSettingTab.okCorsOrigins"));
} else {
const fixedValue = [
...new Set([
...ConfiguredOrigins.map((e) => e.trim()),
"app://obsidian.md",
"capacitor://localhost",
"http://localhost",
]),
].join(",");
addResult($msg("obsidianLiveSyncSettingTab.errCorsOrigins"));
addConfigFixButton($msg("obsidianLiveSyncSettingTab.msgSetCorsOrigins"), "cors/origins", fixedValue);
isSuccessful = false;
}
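// Editor's note (hypothetical value, for illustration only): if cors/origins is currently
// "https://notes.example.com", fixedValue becomes
// "https://notes.example.com,app://obsidian.md,capacitor://localhost,http://localhost";
// existing entries are kept and each required Obsidian origin is appended exactly once.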
addResult($msg("obsidianLiveSyncSettingTab.msgConnectionCheck"), ["ob-btn-config-head"]);
addResult($msg("obsidianLiveSyncSettingTab.msgCurrentOrigin", { origin: window.location.origin }));
// Request header check
const origins = ["app://obsidian.md", "capacitor://localhost", "http://localhost"];
for (const org of origins) {
const rr = await requestToCouchDBWithCredentials(
editingSettings.couchDB_URI,
credential,
org,
undefined,
undefined,
undefined,
customHeaders
);
const responseHeaders = Object.fromEntries(
Object.entries(rr.headers).map((e) => {
e[0] = `${e[0]}`.toLowerCase();
return e;
})
);
addResult($msg("obsidianLiveSyncSettingTab.msgOriginCheck", { org }));
if (responseHeaders["access-control-allow-credentials"] != "true") {
addResult($msg("obsidianLiveSyncSettingTab.errCorsNotAllowingCredentials"));
isSuccessful = false;
} else {
addResult($msg("obsidianLiveSyncSettingTab.okCorsCredentialsForOrigin"));
}
if (responseHeaders["access-control-allow-origin"] != org) {
addResult(
$msg("obsidianLiveSyncSettingTab.warnCorsOriginUnmatched", {
from: org,
to: responseHeaders["access-control-allow-origin"],
})
);
} else {
addResult($msg("obsidianLiveSyncSettingTab.okCorsOriginMatched"));
}
}
addResult($msg("obsidianLiveSyncSettingTab.msgDone"), ["ob-btn-config-head"]);
addResult($msg("obsidianLiveSyncSettingTab.msgConnectionProxyNote"), ["ob-btn-config-info"]);
Logger($msg("obsidianLiveSyncSettingTab.logCheckingConfigDone"), LOG_LEVEL_INFO);
} catch (ex: any) {
if (ex?.status == 401) {
isSuccessful = false;
addResult($msg("obsidianLiveSyncSettingTab.errAccessForbidden"));
addResult($msg("obsidianLiveSyncSettingTab.errCannotContinueTest"));
Logger($msg("obsidianLiveSyncSettingTab.logCheckingConfigDone"), LOG_LEVEL_INFO);
} else {
Logger($msg("obsidianLiveSyncSettingTab.logCheckingConfigFailed"), LOG_LEVEL_NOTICE);
Logger(ex);
isSuccessful = false;
}
}
return isSuccessful;
};
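// Usage sketch (editor-added; assumes this is the `checkConfig` helper that the Fix buttons above
// re-invoke, taking the result container and the settings being edited):
// const ok = await checkConfig(checkResultDiv, editingSettings);
// if (!ok) {
//     Logger("Remote configuration check reported problems.", LOG_LEVEL_NOTICE);
// }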


@@ -0,0 +1,378 @@
import {
type ObsidianLiveSyncSettings,
DEFAULT_SETTINGS,
LOG_LEVEL_NOTICE,
LOG_LEVEL_VERBOSE,
REMOTE_COUCHDB,
REMOTE_MINIO,
REMOTE_P2P,
} from "../../lib/src/common/types.ts";
import { generatePatchObj, isObjectDifferent } from "../../lib/src/common/utils.ts";
import { AbstractObsidianModule } from "../AbstractObsidianModule.ts";
import { SvelteDialogManager } from "./SetupWizard/ObsidianSvelteDialog.ts";
import Intro from "./SetupWizard/dialogs/Intro.svelte";
import SelectMethodNewUser from "./SetupWizard/dialogs/SelectMethodNewUser.svelte";
import SelectMethodExisting from "./SetupWizard/dialogs/SelectMethodExisting.svelte";
import ScanQRCode from "./SetupWizard/dialogs/ScanQRCode.svelte";
import UseSetupURI from "./SetupWizard/dialogs/UseSetupURI.svelte";
import OutroNewUser from "./SetupWizard/dialogs/OutroNewUser.svelte";
import OutroExistingUser from "./SetupWizard/dialogs/OutroExistingUser.svelte";
import OutroAskUserMode from "./SetupWizard/dialogs/OutroAskUserMode.svelte";
import SetupRemote from "./SetupWizard/dialogs/SetupRemote.svelte";
import SetupRemoteCouchDB from "./SetupWizard/dialogs/SetupRemoteCouchDB.svelte";
import SetupRemoteBucket from "./SetupWizard/dialogs/SetupRemoteBucket.svelte";
import SetupRemoteP2P from "./SetupWizard/dialogs/SetupRemoteP2P.svelte";
import SetupRemoteE2EE from "./SetupWizard/dialogs/SetupRemoteE2EE.svelte";
import { decodeSettingsFromQRCodeData } from "../../lib/src/API/processSetting.ts";
/**
* User modes for onboarding and setup
*/
export const enum UserMode {
/**
* New User Mode - for users who are new to the plugin
*/
NewUser = "new-user",
/**
* Existing User Mode - for users who have used the plugin before, or just configuring again
*/
ExistingUser = "existing-user",
/**
* Unknown User Mode - for cases where the user mode is not determined
*/
Unknown = "unknown",
/**
* Update User Mode - for users who are updating configuration. May be `existing-user` as well, but possibly they want to treat it differently.
*/
// eslint-disable-next-line @typescript-eslint/no-duplicate-enum-values
Update = "unknown", // Alias for Unknown for better readability
}
/**
* Setup Manager to handle onboarding and configuration setup
*/
export class SetupManager extends AbstractObsidianModule {
/**
* Dialog manager for handling Svelte dialogs
*/
private dialogManager: SvelteDialogManager = new SvelteDialogManager(this.plugin);
/**
* Starts the onboarding process
* @returns Promise that resolves to true if onboarding completed successfully, false otherwise
*/
async startOnBoarding(): Promise<boolean> {
const isUserNewOrExisting = await this.dialogManager.openWithExplicitCancel(Intro);
if (isUserNewOrExisting === "new-user") {
await this.onOnboard(UserMode.NewUser);
} else if (isUserNewOrExisting === "existing-user") {
await this.onOnboard(UserMode.ExistingUser);
} else if (isUserNewOrExisting === "cancelled") {
this._log("Onboarding cancelled by user.", LOG_LEVEL_NOTICE);
return false;
}
return false;
}
/**
* Handles the onboarding process based on user mode
* @param userMode
* @returns Promise that resolves to true if onboarding completed successfully, false otherwise
*/
async onOnboard(userMode: UserMode): Promise<boolean> {
const originalSetting = userMode === UserMode.NewUser ? DEFAULT_SETTINGS : this.core.settings;
if (userMode === UserMode.NewUser) {
//Ask how to apply initial setup
const method = await this.dialogManager.openWithExplicitCancel(SelectMethodNewUser);
if (method === "use-setup-uri") {
await this.onUseSetupURI(userMode);
} else if (method === "configure-manually") {
await this.onConfigureManually(originalSetting, userMode);
} else if (method === "cancelled") {
this._log("Onboarding cancelled by user.", LOG_LEVEL_NOTICE);
return false;
}
} else if (userMode === UserMode.ExistingUser) {
const method = await this.dialogManager.openWithExplicitCancel(SelectMethodExisting);
if (method === "use-setup-uri") {
await this.onUseSetupURI(userMode);
} else if (method === "configure-manually") {
await this.onConfigureManually(originalSetting, userMode);
} else if (method === "scan-qr-code") {
await this.onPromptQRCodeInstruction();
} else if (method === "cancelled") {
this._log("Onboarding cancelled by user.", LOG_LEVEL_NOTICE);
return false;
}
}
return false;
}
/**
* Handles setup using a setup URI
* @param userMode
* @param setupURI
* @returns Promise that resolves to true if onboarding completed successfully, false otherwise
*/
async onUseSetupURI(userMode: UserMode, setupURI: string = ""): Promise<boolean> {
const newSetting = await this.dialogManager.openWithExplicitCancel(UseSetupURI, setupURI);
if (newSetting === "cancelled") {
this._log("Setup URI dialog cancelled.", LOG_LEVEL_NOTICE);
return false;
}
this._log("Setup URI dialog closed.", LOG_LEVEL_VERBOSE);
return await this.onConfirmApplySettingsFromWizard(newSetting, userMode);
}
/**
* Handles manual setup for CouchDB
* @param userMode
* @param currentSetting
* @param activate Whether to activate the CouchDB as remote type
* @returns Promise that resolves to true if setup completed successfully, false otherwise
*/
async onCouchDBManualSetup(
userMode: UserMode,
currentSetting: ObsidianLiveSyncSettings,
activate = true
): Promise<boolean> {
const originalSetting = JSON.parse(JSON.stringify(currentSetting)) as ObsidianLiveSyncSettings;
const baseSetting = JSON.parse(JSON.stringify(originalSetting)) as ObsidianLiveSyncSettings;
const couchConf = await this.dialogManager.openWithExplicitCancel(SetupRemoteCouchDB, originalSetting);
if (couchConf === "cancelled") {
this._log("Manual configuration cancelled.", LOG_LEVEL_NOTICE);
return await this.onOnboard(userMode);
}
const newSetting = { ...baseSetting, ...couchConf } as ObsidianLiveSyncSettings;
if (activate) {
newSetting.remoteType = REMOTE_COUCHDB;
}
return await this.onConfirmApplySettingsFromWizard(newSetting, userMode, activate);
}
/**
* Handles manual setup for S3-compatible bucket
* @param userMode
* @param currentSetting
* @param activate Whether to activate the Bucket as remote type
* @returns Promise that resolves to true if setup completed successfully, false otherwise
*/
async onBucketManualSetup(
userMode: UserMode,
currentSetting: ObsidianLiveSyncSettings,
activate = true
): Promise<boolean> {
const bucketConf = await this.dialogManager.openWithExplicitCancel(SetupRemoteBucket, currentSetting);
if (bucketConf === "cancelled") {
this._log("Manual configuration cancelled.", LOG_LEVEL_NOTICE);
return await this.onOnboard(userMode);
}
const newSetting = { ...currentSetting, ...bucketConf } as ObsidianLiveSyncSettings;
if (activate) {
newSetting.remoteType = REMOTE_MINIO;
}
return await this.onConfirmApplySettingsFromWizard(newSetting, userMode, activate);
}
/**
* Handles manual setup for P2P
* @param userMode
* @param currentSetting
* @param activate Whether to activate the P2P as remote type (as P2P Only setup)
* @returns Promise that resolves to true if setup completed successfully, false otherwise
*/
async onP2PManualSetup(
userMode: UserMode,
currentSetting: ObsidianLiveSyncSettings,
activate = true
): Promise<boolean> {
const p2pConf = await this.dialogManager.openWithExplicitCancel(SetupRemoteP2P, currentSetting);
if (p2pConf === "cancelled") {
this._log("Manual configuration cancelled.", LOG_LEVEL_NOTICE);
return await this.onOnboard(userMode);
}
const newSetting = { ...currentSetting, ...p2pConf } as ObsidianLiveSyncSettings;
if (activate) {
newSetting.remoteType = REMOTE_P2P;
}
return await this.onConfirmApplySettingsFromWizard(newSetting, userMode, activate);
}
/**
* Handles only E2EE configuration
* @param userMode
* @param currentSetting
* @returns
*/
async onlyE2EEConfiguration(userMode: UserMode, currentSetting: ObsidianLiveSyncSettings): Promise<boolean> {
const e2eeConf = await this.dialogManager.openWithExplicitCancel(SetupRemoteE2EE, currentSetting);
if (e2eeConf === "cancelled") {
this._log("E2EE configuration cancelled.", LOG_LEVEL_NOTICE);
return false;
}
const newSetting = {
...currentSetting,
...e2eeConf,
} as ObsidianLiveSyncSettings;
return await this.onConfirmApplySettingsFromWizard(newSetting, userMode);
}
/**
* Handles manual configuration flow (E2EE + select server)
* @param originalSetting
* @param userMode
* @returns
*/
async onConfigureManually(originalSetting: ObsidianLiveSyncSettings, userMode: UserMode): Promise<boolean> {
const e2eeConf = await this.dialogManager.openWithExplicitCancel(SetupRemoteE2EE, originalSetting);
if (e2eeConf === "cancelled") {
this._log("Manual configuration cancelled.", LOG_LEVEL_NOTICE);
return await this.onOnboard(userMode);
}
const currentSetting = {
...originalSetting,
...e2eeConf,
} as ObsidianLiveSyncSettings;
return await this.onSelectServer(currentSetting, userMode);
}
/**
* Handles server selection during manual configuration
* @param currentSetting
* @param userMode
* @returns
*/
async onSelectServer(currentSetting: ObsidianLiveSyncSettings, userMode: UserMode): Promise<boolean> {
const method = await this.dialogManager.openWithExplicitCancel(SetupRemote);
if (method === "couchdb") {
return await this.onCouchDBManualSetup(userMode, currentSetting, true);
} else if (method === "bucket") {
return await this.onBucketManualSetup(userMode, currentSetting, true);
} else if (method === "p2p") {
return await this.onP2PManualSetup(userMode, currentSetting, true);
} else if (method === "cancelled") {
this._log("Manual configuration cancelled.", LOG_LEVEL_NOTICE);
if (userMode !== UserMode.Unknown) {
return await this.onOnboard(userMode);
}
}
// Should not reach here.
return false;
}
/**
* Confirms and applies settings obtained from the wizard
* @param newConf
* @param _userMode
* @param activate Whether to activate the remote type in the new settings
* @param extra Extra function to run before applying settings
* @returns Promise that resolves to true if settings applied successfully, false otherwise
*/
async onConfirmApplySettingsFromWizard(
newConf: ObsidianLiveSyncSettings,
_userMode: UserMode,
activate: boolean = true,
extra: () => void = () => {}
): Promise<boolean> {
let userMode = _userMode;
if (userMode === UserMode.Unknown) {
if (isObjectDifferent(this.settings, newConf, true) === false) {
this._log("No changes in settings detected. Skipping applying settings from wizard.", LOG_LEVEL_NOTICE);
return true;
}
const patch = generatePatchObj(this.settings, newConf);
console.log(`Changes:`);
console.dir(patch);
if (!activate) {
extra();
await this.applySetting(newConf, UserMode.ExistingUser);
this._log("Setting Applied", LOG_LEVEL_NOTICE);
return true;
}
// Check virtual changes
const original = { ...this.settings, P2P_DevicePeerName: "" } as ObsidianLiveSyncSettings;
const modified = { ...newConf, P2P_DevicePeerName: "" } as ObsidianLiveSyncSettings;
const isOnlyVirtualChange = isObjectDifferent(original, modified, true) === false;
if (isOnlyVirtualChange) {
extra();
await this.applySetting(newConf, UserMode.ExistingUser);
this._log("Settings from wizard applied.", LOG_LEVEL_NOTICE);
return true;
} else {
const userModeResult = await this.dialogManager.openWithExplicitCancel(OutroAskUserMode);
if (userModeResult === "new-user") {
userMode = UserMode.NewUser;
} else if (userModeResult === "existing-user") {
userMode = UserMode.ExistingUser;
} else if (userModeResult === "compatible-existing-user") {
extra();
await this.applySetting(newConf, UserMode.ExistingUser);
this._log("Settings from wizard applied.", LOG_LEVEL_NOTICE);
return true;
} else if (userModeResult === "cancelled") {
this._log("User cancelled applying settings from wizard.", LOG_LEVEL_NOTICE);
return false;
}
}
}
const component = userMode === UserMode.NewUser ? OutroNewUser : OutroExistingUser;
const confirm = await this.dialogManager.openWithExplicitCancel(component);
if (confirm === "cancelled") {
this._log("User cancelled applying settings from wizard..", LOG_LEVEL_NOTICE);
return false;
}
if (confirm) {
extra();
await this.applySetting(newConf, userMode);
if (userMode === UserMode.NewUser) {
// For new users, schedule a rebuild everything.
await this.core.rebuilder.scheduleRebuild();
} else {
// For existing users, schedule a fetch.
await this.core.rebuilder.scheduleFetch();
}
}
// Settings applied, but may require rebuild to take effect.
return false;
}
/**
* Prompts the user with QR code scanning instructions
* @returns Promise that resolves to false as QR code instruction dialog does not yield settings directly
*/
async onPromptQRCodeInstruction(): Promise<boolean> {
const qrResult = await this.dialogManager.open(ScanQRCode);
this._log("QR Code dialog closed.", LOG_LEVEL_VERBOSE);
// Result is not used, but log it for debugging.
this._log(`QR Code result: ${qrResult}`, LOG_LEVEL_VERBOSE);
// QR Code instruction dialog never yields settings directly.
return false;
}
/**
* Decodes settings from a QR code string and applies them
* @param qr QR code string containing encoded settings
* @returns Promise that resolves to true if settings applied successfully, false otherwise
*/
async decodeQR(qr: string) {
const newSettings = decodeSettingsFromQRCodeData(qr);
return await this.onConfirmApplySettingsFromWizard(newSettings, UserMode.Unknown);
}
/**
* Applies the new settings to the core settings and saves them
* @param newConf
* @param userMode
* @returns Promise that resolves to true if settings applied successfully, false otherwise
*/
async applySetting(newConf: ObsidianLiveSyncSettings, userMode: UserMode) {
const newSetting = {
...this.core.settings,
...newConf,
};
this.core.settings = newSetting;
this.services.setting.clearUsedPassphrase();
await this.services.setting.saveSettingData();
return true;
}
}


@@ -0,0 +1,141 @@
import { eventHub, EVENT_PLUGIN_UNLOADED } from "@/common/events";
import { Modal } from "@/deps";
import type ObsidianLiveSyncPlugin from "@/main";
import { mount, unmount } from "svelte";
import DialogHost from "@lib/UI/DialogHost.svelte";
import { fireAndForget, promiseWithResolvers, type PromiseWithResolvers } from "octagonal-wheels/promises";
import { LOG_LEVEL_NOTICE, Logger } from "octagonal-wheels/common/logger";
import {
type DialogControlBase,
type DialogSvelteComponentBaseProps,
type ComponentHasResult,
setupDialogContext,
getDialogContext,
type SvelteDialogManagerBase,
} from "@/lib/src/UI/svelteDialog.ts";
export type DialogSvelteComponentProps = DialogSvelteComponentBaseProps & {
plugin: ObsidianLiveSyncPlugin;
services: ObsidianLiveSyncPlugin["services"];
};
export type DialogControls<T = any, U = any> = DialogControlBase<T, U> & {
plugin: ObsidianLiveSyncPlugin;
services: ObsidianLiveSyncPlugin["services"];
};
export type DialogMessageProps = Record<string, any>;
// type DialogSvelteComponent<T extends DialogSvelteComponentProps = DialogSvelteComponentProps> = Component<SvelteComponent<T>,any>;
export class SvelteDialog<T, U> extends Modal {
plugin: ObsidianLiveSyncPlugin;
mountedComponent?: ReturnType<typeof mount>;
component: ComponentHasResult<T, U>;
result?: T;
initialData?: U;
title: string = "Obsidian LiveSync - Setup Wizard";
constructor(plugin: ObsidianLiveSyncPlugin, component: ComponentHasResult<T, U>, initialData?: U) {
super(plugin.app);
this.plugin = plugin;
this.component = component;
this.initialData = initialData;
}
resolveResult() {
this.resultPromiseWithResolvers?.resolve(this.result);
this.resultPromiseWithResolvers = undefined;
}
resultPromiseWithResolvers?: PromiseWithResolvers<T | undefined>;
onOpen() {
const { contentEl } = this;
contentEl.empty();
// eslint-disable-next-line @typescript-eslint/no-this-alias
const dialog = this;
if (this.resultPromiseWithResolvers) {
this.resultPromiseWithResolvers.reject("Dialog opened again");
}
const pr = promiseWithResolvers<any>();
eventHub.once(EVENT_PLUGIN_UNLOADED, () => {
if (this.resultPromiseWithResolvers === pr) {
pr.reject("Plugin unloaded");
this.close();
}
});
this.resultPromiseWithResolvers = pr;
this.mountedComponent = mount(DialogHost, {
target: contentEl,
props: {
onSetupContext: (props: DialogSvelteComponentBaseProps) => {
setupDialogContext({
...props,
plugin: this.plugin,
services: this.plugin.services,
});
},
setTitle: (title: string) => {
dialog.setTitle(title);
},
closeDialog: () => {
dialog.close();
},
setResult: (result: T) => {
this.result = result;
},
getInitialData: () => this.initialData,
mountComponent: this.component,
},
});
}
waitForClose(): Promise<T | undefined> {
if (!this.resultPromiseWithResolvers) {
throw new Error("Dialog not opened yet");
}
return this.resultPromiseWithResolvers.promise;
}
onClose() {
this.resolveResult();
fireAndForget(async () => {
if (this.mountedComponent) {
await unmount(this.mountedComponent);
}
});
}
}
export async function openSvelteDialog<T, U>(
plugin: ObsidianLiveSyncPlugin,
component: ComponentHasResult<T, U>,
initialData?: U
): Promise<T | undefined> {
const dialog = new SvelteDialog<T, U>(plugin, component, initialData);
dialog.open();
return await dialog.waitForClose();
}
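// Usage sketch (editor-added; `SomeDialogComponent` is a placeholder for any ComponentHasResult):
// const choice = await openSvelteDialog(plugin, SomeDialogComponent);
// `choice` is whatever the component passed to setResult(), or `undefined` when the dialog
// was closed without an explicit decision.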
export class SvelteDialogManager implements SvelteDialogManagerBase {
plugin: ObsidianLiveSyncPlugin;
constructor(plugin: ObsidianLiveSyncPlugin) {
this.plugin = plugin;
}
async open<T, U>(component: ComponentHasResult<T, U>, initialData?: U): Promise<T | undefined> {
return await openSvelteDialog<T, U>(this.plugin, component, initialData);
}
async openWithExplicitCancel<T, U>(component: ComponentHasResult<T, U>, initialData?: U): Promise<T> {
for (let i = 0; i < 10; i++) {
const ret = await openSvelteDialog<T, U>(this.plugin, component, initialData);
if (ret !== undefined) {
return ret;
}
if (this.plugin.services.appLifecycle.hasUnloaded()) {
throw new Error("Operation cancelled due to app shutdown.");
}
Logger("Please select 'Cancel' explicitly to cancel this operation.", LOG_LEVEL_NOTICE);
}
throw new Error("Operation Forcibly cancelled by user.");
}
}
export function getObsidianDialogContext<T = any>(): DialogControls<T> {
return getDialogContext<T>() as DialogControls<T>;
}


@@ -0,0 +1,154 @@
<script lang="ts">
import DialogHeader from "@/lib/src/UI/components/DialogHeader.svelte";
import Guidance from "@/lib/src/UI/components/Guidance.svelte";
import Decision from "@/lib/src/UI/components/Decision.svelte";
import Question from "@/lib/src/UI/components/Question.svelte";
import Option from "@/lib/src/UI/components/Option.svelte";
import Options from "@/lib/src/UI/components/Options.svelte";
import Instruction from "@/lib/src/UI/components/Instruction.svelte";
import UserDecisions from "@/lib/src/UI/components/UserDecisions.svelte";
import InfoNote from "@/lib/src/UI/components/InfoNote.svelte";
import ExtraItems from "@/lib/src/UI/components/ExtraItems.svelte";
import Check from "@/lib/src/UI/components/Check.svelte";
const TYPE_IDENTICAL = "identical";
const TYPE_INDEPENDENT = "independent";
const TYPE_UNBALANCED = "unbalanced";
const TYPE_CANCEL = "cancelled";
const TYPE_BACKUP_DONE = "backup_done";
const TYPE_BACKUP_SKIPPED = "backup_skipped";
const TYPE_UNABLE_TO_BACKUP = "unable_to_backup";
type ResultTypeVault =
| typeof TYPE_IDENTICAL
| typeof TYPE_INDEPENDENT
| typeof TYPE_UNBALANCED
| typeof TYPE_CANCEL;
type ResultTypeBackup =
| typeof TYPE_BACKUP_DONE
| typeof TYPE_BACKUP_SKIPPED
| typeof TYPE_UNABLE_TO_BACKUP
| typeof TYPE_CANCEL;
type ResultTypeExtra = {
preventFetchingConfig: boolean;
};
type ResultType =
| {
vault: ResultTypeVault;
backup: ResultTypeBackup;
extra: ResultTypeExtra;
}
| typeof TYPE_CANCEL;
type Props = {
setResult: (result: ResultType) => void;
};
const { setResult }: Props = $props();
let vaultType = $state<ResultTypeVault>(TYPE_CANCEL);
let backupType = $state<ResultTypeBackup>(TYPE_CANCEL);
const canProceed = $derived.by(() => {
return (
(vaultType === TYPE_IDENTICAL || vaultType === TYPE_INDEPENDENT || vaultType === TYPE_UNBALANCED) &&
(backupType === TYPE_BACKUP_DONE || backupType === TYPE_BACKUP_SKIPPED)
);
});
let preventFetchingConfig = $state(false);
function commit() {
setResult({
vault: vaultType,
backup: backupType,
extra: {
preventFetchingConfig,
},
});
}
</script>
<DialogHeader title="Reset Synchronisation on This Device" />
<Guidance
>This will rebuild the local database on this device using the most recent data from the server. This action is
designed to resolve synchronisation inconsistencies and restore correct functionality.</Guidance
>
<Guidance important title="⚠️ Important Notice">
<strong
>If you have unsynchronised changes in your Vault on this device, they will likely diverge from the server's
versions after the reset. This may result in a large number of file conflicts.</strong
><br />
Furthermore, if conflicts are already present in the server data, they will be synchronised to this device as they are,
and you will need to resolve them locally.
</Guidance>
<hr />
<Instruction>
<Question
><strong>To minimise the creation of new conflicts</strong>, please select the option that best describes the
current state of your Vault. The application will then check your files in the most appropriate way based on
your selection.</Question
>
<Options>
<Option
selectedValue={TYPE_IDENTICAL}
title="The files in this Vault are almost identical to the server's."
bind:value={vaultType}
>
(e.g., immediately after restoring on another computer, or having recovered from a backup)
</Option>
<Option
selectedValue={TYPE_INDEPENDENT}
title="This Vault is empty, or contains only new files that are not on the server."
bind:value={vaultType}
>
(e.g., setting up for the first time on a new smartphone, starting from a clean slate)
</Option>
<Option
selectedValue={TYPE_UNBALANCED}
title="There may be differences between the files in this Vault and the server."
bind:value={vaultType}
>
(e.g., after editing many files whilst offline)
<InfoNote info>
In this scenario, Self-hosted LiveSync will recreate metadata for every file and deliberately generate
conflicts. Where the file content is identical, these conflicts will be resolved automatically.
</InfoNote>
</Option>
</Options>
</Instruction>
<hr />
<Instruction>
<Question>Have you created a backup before proceeding?</Question>
<InfoNote>
We recommend that you copy your Vault folder to a safe location. This will provide a safeguard in case a large
number of conflicts arise, or if you accidentally synchronise with an incorrect destination.
</InfoNote>
<Options>
<Option selectedValue={TYPE_BACKUP_DONE} title="I have created a backup of my Vault." bind:value={backupType} />
<Option
selectedValue={TYPE_BACKUP_SKIPPED}
title="I understand the risks and will proceed without a backup."
bind:value={backupType}
/>
<Option
selectedValue={TYPE_UNABLE_TO_BACKUP}
title="I am unable to create a backup of my Vault."
bind:value={backupType}
>
<InfoNote error visible={backupType === TYPE_UNABLE_TO_BACKUP}>
<strong
>It is strongly advised to create a backup before proceeding. Continuing without a backup may lead
to data loss.
</strong>
<br />
If you understand the risks and still wish to proceed, select this option.
</InfoNote>
</Option>
</Options>
</Instruction>
<Instruction>
<ExtraItems title="Advanced">
<Check title="Prevent fetching configuration from server" bind:value={preventFetchingConfig} />
</ExtraItems>
</Instruction>
<UserDecisions>
<Decision title="Reset and Resume Synchronisation" important disabled={!canProceed} commit={() => commit()} />
<Decision title="Cancel" commit={() => setResult(TYPE_CANCEL)} />
</UserDecisions>


@@ -0,0 +1,55 @@
<script lang="ts">
import DialogHeader from "@/lib/src/UI/components/DialogHeader.svelte";
import Guidance from "@/lib/src/UI/components/Guidance.svelte";
import Decision from "@/lib/src/UI/components/Decision.svelte";
import Question from "@/lib/src/UI/components/Question.svelte";
import Option from "@/lib/src/UI/components/Option.svelte";
import Options from "@/lib/src/UI/components/Options.svelte";
import Instruction from "@/lib/src/UI/components/Instruction.svelte";
import UserDecisions from "@/lib/src/UI/components/UserDecisions.svelte";
const TYPE_NEW_USER = "new-user";
const TYPE_EXISTING_USER = "existing-user";
const TYPE_CANCELLED = "cancelled";
type ResultType = typeof TYPE_NEW_USER | typeof TYPE_EXISTING_USER | typeof TYPE_CANCELLED;
type Props = {
setResult: (result: ResultType) => void;
};
const { setResult }: Props = $props();
let userType = $state<ResultType>(TYPE_CANCELLED);
let proceedTitle = $derived.by(() => {
if (userType === TYPE_NEW_USER) {
return "Yes, I want to set up a new synchronisation";
} else if (userType === TYPE_EXISTING_USER) {
return "Yes, I want to add this device to my existing synchronisation";
} else {
return "Please select an option to proceed";
}
});
const canProceed = $derived.by(() => {
return userType === TYPE_NEW_USER || userType === TYPE_EXISTING_USER;
});
</script>
<DialogHeader title="Welcome to Self-hosted LiveSync" />
<Guidance>We will now guide you through a few questions to simplify the synchronisation setup.</Guidance>
<Instruction>
<Question>First, please select the option that best describes your current situation.</Question>
<Options>
<Option selectedValue={TYPE_NEW_USER} title="I am setting this up for the first time" bind:value={userType}>
(Select this if you are configuring this device as the first synchronisation device.) This option is
suitable if you are new to LiveSync and want to set it up from scratch.
</Option>
<Option
selectedValue={TYPE_EXISTING_USER}
title="I am adding a device to an existing synchronisation setup"
bind:value={userType}
>
(Select this if you are already using synchronisation on another computer or smartphone.) This option is
suitable if you want to connect this new device to a setup that is already in use.
</Option>
</Options>
</Instruction>
<UserDecisions>
<Decision title={proceedTitle} important={canProceed} disabled={!canProceed} commit={() => setResult(userType)} />
<Decision title="No, please take me back" commit={() => setResult(TYPE_CANCELLED)} />
</UserDecisions>


@@ -0,0 +1,75 @@
<script lang="ts">
import DialogHeader from "@/lib/src/UI/components/DialogHeader.svelte";
import Guidance from "@/lib/src/UI/components/Guidance.svelte";
import Decision from "@/lib/src/UI/components/Decision.svelte";
import Question from "@/lib/src/UI/components/Question.svelte";
import Option from "@/lib/src/UI/components/Option.svelte";
import Instruction from "@/lib/src/UI/components/Instruction.svelte";
import UserDecisions from "@/lib/src/UI/components/UserDecisions.svelte";
import InfoNote from "@/lib/src/UI/components/InfoNote.svelte";
const TYPE_EXISTING = "existing-user";
const TYPE_NEW = "new-user";
const TYPE_COMPATIBLE_EXISTING = "compatible-existing-user";
const TYPE_CANCELLED = "cancelled";
type ResultType = typeof TYPE_EXISTING | typeof TYPE_NEW | typeof TYPE_COMPATIBLE_EXISTING | typeof TYPE_CANCELLED;
type Props = {
setResult: (result: ResultType) => void;
};
const { setResult }: Props = $props();
let userType = $state<ResultType>(TYPE_CANCELLED);
const canProceed = $derived.by(() => {
return userType === TYPE_EXISTING || userType === TYPE_NEW || userType === TYPE_COMPATIBLE_EXISTING;
});
const proceedMessage = $derived.by(() => {
if (userType === TYPE_NEW) {
return "Proceed to the next step.";
} else if (userType === TYPE_EXISTING) {
return "Proceed to the next step.";
} else if (userType === TYPE_COMPATIBLE_EXISTING) {
return "Apply the settings";
} else {
return "Please select an option to proceed";
}
});
</script>
<DialogHeader title="Mostly Complete: Decision Required" />
<Guidance>
The connection to the server has been configured successfully. As the next step, <strong
>the local database, that is to say the synchronisation information, must be reconstituted.</strong
>
</Guidance>
<Instruction>
<Question>Please select your situation.</Question>
<Option title="I am setting up a new server for the first time / I want to reset my existing server." bind:value={userType} selectedValue={TYPE_NEW}>
<InfoNote>
Selecting this option will result in the current data on this device being used to initialise the server.
Any existing data on the server will be completely overwritten.
</InfoNote>
</Option>
<Option
title="My remote server is already set up. I want to join this device."
bind:value={userType}
selectedValue={TYPE_EXISTING}
>
<InfoNote>
Selecting this option will result in this device joining the existing server. You will need to fetch the
existing synchronisation data from the server onto this device.
</InfoNote>
</Option>
<Option
title="The remote is already set up, and the configuration is compatible (or got compatible by this operation)."
bind:value={userType}
selectedValue={TYPE_COMPATIBLE_EXISTING}
>
<InfoNote warning>
Unless you are certain, selecting this option is a bit dangerous. It assumes that the server configuration is
compatible with this device. If this is not the case, data loss may occur. Please ensure you know what you
are doing.
</InfoNote>
</Option>
</Instruction>
<UserDecisions>
<Decision title={proceedMessage} important={true} disabled={!canProceed} commit={() => setResult(userType)} />
<Decision title="No, please take me back" commit={() => setResult(TYPE_CANCELLED)} />
</UserDecisions>


@@ -0,0 +1,37 @@
<script lang="ts">
import DialogHeader from "@/lib/src/UI/components/DialogHeader.svelte";
import Guidance from "@/lib/src/UI/components/Guidance.svelte";
import Decision from "@/lib/src/UI/components/Decision.svelte";
import Question from "@/lib/src/UI/components/Question.svelte";
import Instruction from "@/lib/src/UI/components/Instruction.svelte";
import UserDecisions from "@/lib/src/UI/components/UserDecisions.svelte";
const TYPE_APPLY = "apply";
const TYPE_CANCELLED = "cancelled";
type ResultType = typeof TYPE_APPLY | typeof TYPE_CANCELLED;
type Props = {
setResult: (result: ResultType) => void;
};
const { setResult }: Props = $props();
</script>
<DialogHeader title="Setup Complete: Preparing to Fetch Synchronisation Data" />
<Guidance>
<p>
The connection to the server has been configured successfully. As the next step, <strong
>the latest synchronisation data will be downloaded from the server to this device.</strong
>
</p>
<p>
<strong>PLEASE NOTE</strong>
<br />
After restarting, the database on this device will be rebuilt using data from the server. If there are any unsynchronised
files in this vault, conflicts may occur with the server data.
</p>
</Guidance>
<Instruction>
<Question>Please select the button below to restart and proceed to the data fetching confirmation.</Question>
</Instruction>
<UserDecisions>
<Decision title="Restart and Fetch Data" important={true} commit={() => setResult(TYPE_APPLY)} />
<Decision title="No, please take me back" commit={() => setResult(TYPE_CANCELLED)} />
</UserDecisions>


@@ -0,0 +1,38 @@
<script lang="ts">
import DialogHeader from "@/lib/src/UI/components/DialogHeader.svelte";
import Guidance from "@/lib/src/UI/components/Guidance.svelte";
import Decision from "@/lib/src/UI/components/Decision.svelte";
import Question from "@/lib/src/UI/components/Question.svelte";
import Instruction from "@/lib/src/UI/components/Instruction.svelte";
import UserDecisions from "@/lib/src/UI/components/UserDecisions.svelte";
const TYPE_APPLY = "apply";
const TYPE_CANCELLED = "cancelled";
type ResultType = typeof TYPE_APPLY | typeof TYPE_CANCELLED;
type Props = {
setResult: (result: ResultType) => void;
};
const { setResult }: Props = $props();
// let userType = $state<ResultType>(TYPE_CANCELLED);
</script>
<DialogHeader title="Setup Complete: Preparing to Initialise Server" />
<Guidance>
<p>
The connection to the server has been configured successfully. As the next step, <strong
>the synchronisation data on the server will be built based on the current data on this device.</strong
>
</p>
<p>
<strong>IMPORTANT</strong>
<br />
After restarting, the data on this device will be uploaded to the server as the 'master copy'. Please be aware that
any existing data currently on the server will be completely overwritten.
</p>
</Guidance>
<Instruction>
<Question>Please select the button below to restart and proceed to the final confirmation.</Question>
</Instruction>
<UserDecisions>
<Decision title="Restart and Initialise Server" important={true} commit={() => setResult(TYPE_APPLY)} />
<Decision title="No, please take me back" commit={() => setResult(TYPE_CANCELLED)} />
</UserDecisions>


@@ -0,0 +1,141 @@
<script lang="ts">
/**
* Panel to check and fix CouchDB configuration issues
*/
import type { ObsidianLiveSyncSettings } from "../../../../lib/src/common/types";
import Decision from "../../../../lib/src/UI/components/Decision.svelte";
import UserDecisions from "../../../../lib/src/UI/components/UserDecisions.svelte";
import { checkConfig, type ConfigCheckResult, type ResultError, type ResultErrorMessage } from "./utilCheckCouchDB";
type Props = {
trialRemoteSetting: ObsidianLiveSyncSettings;
};
const { trialRemoteSetting }: Props = $props();
let detectedIssues = $state<ConfigCheckResult[]>([]);
async function testAndFixSettings() {
detectedIssues = [];
try {
const fixResults = await checkConfig(trialRemoteSetting);
console.dir(fixResults);
detectedIssues = fixResults;
} catch (e) {
console.error("Error during testAndFixSettings:", e);
detectedIssues.push({ message: `Error during testAndFixSettings: ${e}`, result: "error", classes: [] });
}
}
function isErrorResult(result: ConfigCheckResult): result is ResultError | ResultErrorMessage {
return "result" in result && result.result === "error";
}
function isFixableError(result: ConfigCheckResult): result is ResultError {
return isErrorResult(result) && "fix" in result && typeof result.fix === "function";
}
function isSuccessResult(result: ConfigCheckResult): result is { message: string; result: "ok"; value?: any } {
return "result" in result && result.result === "ok";
}
let processing = $state(false);
async function fixIssue(issue: ResultError) {
try {
processing = true;
await issue.fix();
} catch (e) {
console.error("Error during fixIssue:", e);
}
await testAndFixSettings();
processing = false;
}
const errorIssueCount = $derived.by(() => {
return detectedIssues.filter((issue) => isErrorResult(issue)).length;
});
const isAllSuccess = $derived.by(() => {
return !(errorIssueCount > 0 && detectedIssues.length > 0);
});
</script>
{#snippet result(issue: ConfigCheckResult)}
<div class="check-result {isErrorResult(issue) ? 'error' : isSuccessResult(issue) ? 'success' : ''}">
<div class="message">
{issue.message}
</div>
{#if isFixableError(issue)}
<div class="operations">
<button onclick={() => fixIssue(issue)} class="mod-cta" disabled={processing}>Fix</button>
</div>
{/if}
</div>
{/snippet}
<UserDecisions>
<Decision title="Detect and Fix CouchDB Issues" important={true} commit={testAndFixSettings} />
</UserDecisions>
<div class="check-results">
<details open={!isAllSuccess}>
<summary>
{#if detectedIssues.length === 0}
No checks have been performed yet.
{:else if isAllSuccess}
All checks passed successfully!
{:else}
{errorIssueCount} issue(s) detected!
{/if}
</summary>
{#if detectedIssues.length > 0}
<h3>Issue detection log:</h3>
{#each detectedIssues as issue}
{@render result(issue)}
{/each}
{/if}
</details>
</div>
<style>
/* Make .check-result a CSS Grid: let .message expand and keep .operations at minimum width, aligned to the right */
.check-results {
/* Adjust spacing as required */
margin-top: 0.75rem;
}
.check-result {
display: grid;
grid-template-columns: 1fr auto; /* message takes remaining space, operations use minimum width */
align-items: center; /* vertically centre align */
gap: 0.5rem 1rem;
padding: 0rem 0.5rem;
border-radius: 0;
box-shadow: none;
border-left: 0.5em solid var(--interactive-accent);
margin-bottom: 0.25lh;
}
.check-result.error {
border-left: 0.5em solid var(--text-error);
}
.check-result.success {
border-left: 0.5em solid var(--text-success);
}
.check-result .message {
/* Wrap long messages */
white-space: normal;
word-break: break-word;
font-size: 0.95rem;
color: var(--text-normal);
}
.check-result .operations {
/* Centre the button(s) vertically and align to the right */
display: flex;
align-items: center;
justify-content: flex-end;
gap: 0.5rem;
}
/* For small screens: move .operations below and stack vertically */
@media (max-width: 520px) {
.check-result {
grid-template-columns: 1fr;
grid-auto-rows: auto;
}
.check-result .operations {
justify-content: flex-start;
margin-top: 0.5rem;
}
}
</style>

Some files were not shown because too many files have changed in this diff.