mirror of https://github.com/vrtmrz/obsidian-livesync.git
synced 2026-02-22 20:18:48 +00:00

Compare commits

9 Commits

- c6ed867498
- 4f4923e977
- a5ebf29b3d
- ee465184c8
- d7d4f1e6f2
- cbf5023593
- 3925052f92
- 1934418258
- 2ae018b2bd
81
docs/design_docs_of_keep_newborn_chunks.md
Normal file
@@ -0,0 +1,81 @@
# Keep newborn chunks in Eden

NOTE: This is a design document for a planned feature. It is planned, but not yet implemented as of v0.23.3. It has not reached design freeze and will be extended from time to time.

## Goal

Reduce the number of volatile chunks, and reduce the storage usage of the remote database over the medium to long term.

## Motivation

- In the current implementation, Self-hosted LiveSync splits documents into metadata and multiple chunks. In particular, chunks are split so that they do not exceed a certain length.
- This is done to optimise transfers and to take advantage of the properties of CouchDB. It also complies with IBM Cloudant's restriction on the size of a single document.
- However, creating chunks partway through each editing operation increases the number of unnecessary chunks.
- Chunks are shared by several documents. For this reason, it is not clear whether a chunk is still needed unless all revisions of all documents are checked. This makes it difficult to remove unnecessary data.
- On the other hand, chunks are split at boundaries that divide Markdown neatly, to ensure relatively accurate de-duplication even when they are created simultaneously on multiple devices. Therefore, it is unlikely that data produced partway through editing will ever be reused.
- For this reason, features such as Batch save have been made available, but they are not a fundamental solution.
- As a result, a large amount of data accumulates that cannot be erased and is probably unused. Therefore, `Fetch chunks on demand` is currently offered to keep communication optimal.
- If the generation of unnecessary chunks is sufficiently reduced, this function will become unnecessary.
- The problem is that this unnecessary chunking slows down both local and remote operations.

## Prerequisite

- The implementation must be able to control the size of a document appropriately so that it does not become non-transferable (1).
- The implementation must be such that data corruption can be avoided even if forward compatibility is not maintained; due to the nature of Self-hosted LiveSync, connections from older versions are to be expected.
- Viewed as a feature:
  - This feature should be disabled for migrating users.
  - This feature should be enabled for new users, and for migrated users after a rebuild.
- Returning to the implementation view: ideally, the implementation should be such that data recovery can be achieved simply by upgrading immediately after replication.

## Outlined methods and implementation plans

### Abstract

To store and transfer only stable chunks independently, and to share them from multiple documents once they have stabilised, new chunks (i.e. chunks considered not yet stable) are instead stored inside the document and transferred together with it. In doing so, care must be taken not to violate prerequisite (1).

If this is achieved, newborn chunks are no longer transferred as separate leaf documents; and even when they are transferred inside the document, they remain stored within it, so their footprint can later be reduced by compaction.

Details are given below.

1. The document will henceforth have the property `eden`.
```typescript
// Partial type: only the eden-related part of the entry is shown.
type EntryWithEden = {
    eden: {
        [key: DocumentID]: {
            data: string,   // The chunk content, kept inside the document.
            epoch: number,  // The document revision at which this chunk was born.
        }
    }
}
```
2. The following configuration items are added (assumed defaults are sketched after this list of steps).
   Note: these configurations should be shared between clients as `Tweak values`.
   - useEden: boolean
   - Max chunks in eden: number
   - Max total chunk length in eden: number
   - Max age while in eden: number
3. In the document saving operation, chunks are added to the eden within each document, recorded with the revision number of the existing document. If some chunks in the eden are no longer used in the revision being saved, they are removed.
   After that, some chunks are chosen to graduate into independent `chunk` documents according to the following rules, and these leave the eden (see the sketch after these steps):
   - Those that have already been confirmed to exist as independent chunks.
     - Ideally, this confirmation of existence is made by a fast first-order check, e.g. a Bloom filter.
   - Those whose length exceeds the configured maximum chunk length.
   - Those that have aged beyond the configured value, measured from their epoch to the revision being saved.
   - Those beyond the point at which the configured maximum total length is exceeded when the chunks are arranged in reverse order of the revision in which they were generated, or those beyond the configured maximum number of items.
4. In the document loading operation, chunks are read from the eden first.
5. With End-to-End Encryption, the `eden` property of documents is also encrypted.
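The configuration items and the save/load behaviour above could be sketched roughly as follows. This is a minimal illustration only: the field names, default values, and helper functions (`isKnownStableChunk`, `saveIndependentChunk`, `readIndependentChunk`) are assumptions made for this document, not the actual implementation.

```typescript
// Sketch only: assumed setting names, not the final configuration keys.
type EdenSettings = {
    useEden: boolean;                  // Keep newborn chunks inside the document's eden.
    maxChunksInEden: number;           // Graduate chunks beyond this count.
    maxTotalChunkLengthInEden: number; // Graduate chunks once their total length exceeds this.
    maxAgeInEden: number;              // Graduate chunks older than this many revisions.
};

// Step 3 (simplified): decide which newborn chunks graduate out of the eden on save.
function graduateChunks(
    entry: EntryWithEden,
    currentRev: number,
    conf: EdenSettings,
    maxChunkLength: number, // the general chunk-size limit, assumed to be configured elsewhere
    isKnownStableChunk: (id: string) => boolean, // e.g. a Bloom-filter lookup
    saveIndependentChunk: (id: string, data: string) => void,
) {
    // Walk newest-born first so that older chunks fall past the count/length budgets.
    const chunks = Object.entries(entry.eden).sort((a, b) => b[1].epoch - a[1].epoch);
    let totalLength = 0;
    for (const [index, [id, chunk]] of chunks.entries()) {
        totalLength += chunk.data.length;
        const alreadyStable = isKnownStableChunk(id);
        const tooLong = chunk.data.length > maxChunkLength;
        const tooOld = currentRev - chunk.epoch > conf.maxAgeInEden;
        const overBudget = index >= conf.maxChunksInEden || totalLength > conf.maxTotalChunkLengthInEden;
        if (alreadyStable || tooLong || tooOld || overBudget) {
            saveIndependentChunk(id, chunk.data); // graduate into an independent chunk document
            delete entry.eden[id as DocumentID];  // and leave the eden
        }
    }
}

// Step 4 (simplified): when loading, prefer the copy kept in the eden over the independent chunk.
function readChunkData(
    entry: EntryWithEden,
    id: DocumentID,
    readIndependentChunk: (id: DocumentID) => string,
): string {
    return entry.eden[id]?.data ?? readIndependentChunk(id);
}
```

Under this shape, a document that is edited repeatedly keeps its short-lived chunks in its own eden, and only the chunks that survive the rules above ever become shared leaf documents.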
### Note

- When this feature is enabled, forward compatibility is temporarily lost. However, the situation is detected as missing chunks, and the data is simply not reflected in the storage of the old version. Therefore, no data loss will occur.

## Test strategy

1. Confirm that synchronisation with the previous version is possible with this feature disabled.
2. With this feature enabled, connect from the previous version and confirm that errors are detected in the previous version but the files are not corrupted.
3. Ensure that two clients on the version with this feature enabled can withstand normal use.

## Documentation strategy

- This document is published and will be referred to from the release note.
- Admittedly, we still lack a complete configuration table. Efforts will be made and, if one can be produced, this document will then reference it. It is not required while the feature is experimental or in beta.
- However, this might become an essential feature. Further efforts are desired.

### Consideration and Conclusion

To be described after the feature has been implemented, tested, and released.
55
docs/design_docs_of_sharing_tweak_value.md
Normal file
@@ -0,0 +1,55 @@
# Sharing `Tweak values`

NOTE: This is a design document for a planned feature. It is planned, but not yet implemented as of v0.23.3. It has not reached design freeze and will be extended from time to time.

## Goal

Share `Tweak values` between clients so that chunk lengths match, and align per-server configurations for better performance.

## Motivation

- In the current implementation, Self-hosted LiveSync splits documents into metadata and multiple chunks. In particular, chunks are split so that they do not exceed a certain length.
- This is done to optimise transfers and to take advantage of the properties of CouchDB. It also complies with IBM Cloudant's restriction on the size of a single document.
- The chunk length is adjusted according to a configured factor. Therefore, if this factor is inconsistent between clients, de-duplication will not work: the chunks point to the same content in total, but are split in different places. This results in unnecessary transfers or storage consumption (a small sketch follows this list).
- The same applies to hash algorithms.
- There are further configurations which are preferred, though not required, to match, such as the maximum size of files to be handled and the interval between requests to the remote database, unless there are specific circumstances.
- To avoid the tragedy of "Too many toggles", "Unexpected transfer amount", or "Poor performance" striking all at once, the plug-in should know about these problems, or potential problems, and be able to let us know.
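To illustrate why a mismatched chunk length defeats de-duplication, here is a minimal sketch; the fixed-length splitter below is a stand-in for the plug-in's boundary-aware chunking, and the numbers are arbitrary.

```typescript
// Stand-in splitter: the real chunking is boundary-aware, but the effect is the same.
const splitEvery = (text: string, len: number): string[] => {
    const out: string[] = [];
    for (let i = 0; i < text.length; i += len) out.push(text.slice(i, i + len));
    return out;
};

// The same note exists on two clients configured with different chunk lengths.
const note = Array.from({ length: 100 }, (_, i) => `line ${i}\n`).join("");
const chunksA = splitEvery(note, 100); // client A splits at 100 characters
const chunksB = splitEvery(note, 120); // client B splits at 120 characters

// The concatenated content is identical, but the individual pieces differ,
// so their hashes differ and no chunk is shared between the two clients.
const shared = chunksA.filter((c) => chunksB.includes(c));
console.log(chunksA.length, chunksB.length, shared.length); // shared.length is 0
```

Both clients therefore upload their own complete set of chunks for content that is, in total, identical.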
## Prerequisite

- We must be informed of a discrepancy in a configured value that is required to be absolutely consistent, and be able to make a decision on the spot.
- We should be able to see, in the configuration dialogue, that there is a discrepancy between configured values that should match, and it should be possible to align them to one specific set of them (or to the defaults).
- We must not be exposed to the unexpected, such as leaking credentials or other secrets.
## Outlined methods and implementation plans

### Abstract

- In the current implementation, each client checks the remote database for the existence of its node information, to detect whether the remote database accepts it.
- This is what 'Lock' is all about.
- To achieve this feature, the client will also send each configuration value. However, the configuration contains credentials and/or secret values, hence we cannot send all of them.
- With a favourable prediction, Self-hosted LiveSync will continue to grow in features. Each time this happens, the number of configuration values to be kept secret will also increase. Therefore, they must be handled by an allow-list.
- The allow-listed configuration values are the `Tweak values` (a sketch of the allow-list approach follows this list).
- If the plug-in detects mismatched `Tweak values` while checking the remote database, it will ask us to decide which should win (mine, or theirs).
- Node information is an ordinary document. Therefore, it will be replicated and saved locally. While showing the dialogue, a notice is also shown on each `Match preferred` configuration item.
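As an illustration, the allow-list could work roughly as sketched below. The template keys and helpers here are assumptions made for this document (the accompanying code changes refer to a `TweakValuesShouldMatchedTemplate` and an `extractObject` helper, which this sketch only imitates).

```typescript
// Sketch only: a minimal allow-list extraction and comparison.
// The template keys and their values below are illustrative, not the real list.
const TweakValuesTemplate = {
    customChunkSize: 0,  // chunk length factor: must match between clients
    hashAlg: "xxhash64", // hash algorithm: must match between clients
};
type TweakValues = typeof TweakValuesTemplate;

// Copy only the allow-listed keys out of the full settings object,
// so credentials and other secrets are never sent to the remote database.
function extractTweakValues(template: TweakValues, settings: Record<string, unknown>): TweakValues {
    return Object.fromEntries(
        Object.keys(template).map((key) => [key, settings[key]])
    ) as TweakValues;
}

// Compare our values against the values stored with the remote node information.
function findMismatches(ours: TweakValues, theirs: TweakValues): (keyof TweakValues)[] {
    return (Object.keys(TweakValuesTemplate) as (keyof TweakValues)[])
        .filter((key) => ours[key] !== theirs[key]);
}
```

When `findMismatches` returns anything, the plug-in can show the dialogue described above and, depending on the answer, either overwrite the local values or reset the remote ones.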
## Note

This feature should be mostly harmless. It will not be possible to disable it.

## Test strategy

A: During synchronisation.

1. No message shall be displayed when all settings match.
2. A message shall be displayed when there are mismatched items that are required to match.
   1. The setting values can be changed according to the message.
   2. The message can be ignored.
3. The message shall not be displayed even if there are mismatched items that are only recommended to match.

B: On the settings dialogue.

1. All mismatched items shall be highlighted in some way.

## Documentation strategy

- This document is published and will be referred to from the release note.
- Admittedly, we still lack a complete configuration table. Efforts will be made and, if one can be produced, this document will then reference it. It is not required while the feature is experimental or in beta.
- However, this might become an essential feature. Further efforts are desired.

### Consideration and Conclusion

To be described after the feature has been implemented, tested, and released.
@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.23.3",
"version": "0.23.5",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",
4
package-lock.json
generated
@@ -1,12 +1,12 @@
{
"name": "obsidian-livesync",
"version": "0.23.3",
"version": "0.23.5",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "obsidian-livesync",
"version": "0.23.3",
"version": "0.23.5",
"license": "MIT",
"dependencies": {
"@aws-sdk/client-s3": "^3.556.0",
@@ -1,6 +1,6 @@
{
"name": "obsidian-livesync",
"version": "0.23.3",
"version": "0.23.5",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"type": "module",
@@ -11,7 +11,7 @@ import { HttpRequest, HttpResponse, type HttpHandlerOptions } from "@smithy/prot
//@ts-ignore
import { requestTimeout } from "@smithy/fetch-http-handler/dist-es/request-timeout";
import { buildQueryString } from "@smithy/querystring-builder";
import { requestUrl, type RequestUrlParam } from "./deps";
import { requestUrl, type RequestUrlParam } from "../deps.ts";
////////////////////////////////////////////////////////////////////////////////
// special handler using Obsidian requestUrl
////////////////////////////////////////////////////////////////////////////////
@@ -1,9 +1,9 @@
|
||||
import { ButtonComponent } from "obsidian";
|
||||
import { App, FuzzySuggestModal, MarkdownRenderer, Modal, Plugin, Setting } from "./deps";
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import { App, FuzzySuggestModal, MarkdownRenderer, Modal, Plugin, Setting } from "../deps.ts";
|
||||
import ObsidianLiveSyncPlugin from "../main.ts";
|
||||
|
||||
//@ts-ignore
|
||||
import PluginPane from "./PluginPane.svelte";
|
||||
import PluginPane from "../ui/PluginPane.svelte";
|
||||
|
||||
export class PluginDialogModal extends Modal {
|
||||
plugin: ObsidianLiveSyncPlugin;
|
||||
@@ -1,4 +1,4 @@
|
||||
import { PersistentMap } from "./lib/src/PersistentMap";
|
||||
import { PersistentMap } from "../lib/src/dataobject/PersistentMap.ts";
|
||||
|
||||
export let sameChangePairs: PersistentMap<number[]>;
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
import { type PluginManifest, TFile } from "./deps";
|
||||
import { type DatabaseEntry, type EntryBody, type FilePath } from "./lib/src/types";
|
||||
import { type PluginManifest, TFile } from "../deps.ts";
|
||||
import { type DatabaseEntry, type EntryBody, type FilePath } from "../lib/src/common/types.ts";
|
||||
|
||||
export interface PluginDataEntry extends DatabaseEntry {
|
||||
deviceVaultName: string;
|
||||
@@ -1,16 +1,16 @@
|
||||
import { normalizePath, Platform, TAbstractFile, App, type RequestUrlParam, requestUrl, TFile } from "./deps";
|
||||
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";
|
||||
import { normalizePath, Platform, TAbstractFile, App, type RequestUrlParam, requestUrl, TFile } from "../deps.ts";
|
||||
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "../lib/src/string_and_binary/path.ts";
|
||||
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { LOG_LEVEL_VERBOSE, type AnyEntry, type DocumentID, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "./lib/src/types";
|
||||
import { CHeader, ICHeader, ICHeaderLength, ICXHeader, PSCHeader } from "./types";
|
||||
import { InputStringDialog, PopoverSelectString } from "./dialogs";
|
||||
import type ObsidianLiveSyncPlugin from "./main";
|
||||
import { writeString } from "./lib/src/strbin";
|
||||
import { fireAndForget } from "./lib/src/utils";
|
||||
import { sameChangePairs } from "./stores";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { LOG_LEVEL_VERBOSE, type AnyEntry, type DocumentID, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "../lib/src/common/types.ts";
|
||||
import { CHeader, ICHeader, ICHeaderLength, ICXHeader, PSCHeader } from "./types.ts";
|
||||
import { InputStringDialog, PopoverSelectString } from "./dialogs.ts";
|
||||
import type ObsidianLiveSyncPlugin from "../main.ts";
|
||||
import { writeString } from "../lib/src/string_and_binary/strbin.ts";
|
||||
import { fireAndForget } from "../lib/src/common/utils.ts";
|
||||
import { sameChangePairs } from "./stores.ts";
|
||||
|
||||
export { scheduleTask, setPeriodicTask, cancelTask, cancelAllTasks, cancelPeriodicTask, cancelAllPeriodicTask, } from "./lib/src/task";
|
||||
export { scheduleTask, setPeriodicTask, cancelTask, cancelAllTasks, cancelPeriodicTask, cancelAllPeriodicTask, } from "../lib/src/concurrency/task.ts";
|
||||
|
||||
// For backward compatibility, using the path for determining id.
|
||||
// Only CouchDB unacceptable ID (that starts with an underscore) has been prefixed with "/".
|
||||
@@ -1,4 +1,4 @@
|
||||
import { type FilePath } from "./lib/src/types";
|
||||
import { type FilePath } from "./lib/src/common/types.ts";
|
||||
|
||||
export {
|
||||
addIcon, App, debounce, Editor, FuzzySuggestModal, MarkdownRenderer, MarkdownView, Modal, Notice, Platform, Plugin, PluginSettingTab, requestUrl, sanitizeHTMLToDom, Setting, stringifyYaml, TAbstractFile, TextAreaComponent, TFile, TFolder,
|
||||
|
||||
@@ -1,21 +1,21 @@
|
||||
import { writable } from 'svelte/store';
|
||||
import { Notice, type PluginManifest, parseYaml, normalizePath, type ListedFiles } from "./deps";
|
||||
import { Notice, type PluginManifest, parseYaml, normalizePath, type ListedFiles } from "../deps.ts";
|
||||
|
||||
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "./lib/src/types";
|
||||
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "./lib/src/types";
|
||||
import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
|
||||
import { createSavingEntryFromLoadedEntry, createTextBlob, delay, fireAndForget, getDocData, isDocContentSame, throttle } from "./lib/src/utils";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { readString, decodeBinary, arrayBufferToBase64, digestHash } from "./lib/src/strbin";
|
||||
import { serialized } from "./lib/src/lock";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
import { stripAllPrefixes } from "./lib/src/path";
|
||||
import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "./utils";
|
||||
import { PluginDialogModal } from "./dialogs";
|
||||
import { JsonResolveModal } from "./JsonResolveModal";
|
||||
import { QueueProcessor } from './lib/src/processor';
|
||||
import { pluginScanningCount } from './lib/src/stores';
|
||||
import type ObsidianLiveSyncPlugin from './main';
|
||||
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "../lib/src/common/types.ts";
|
||||
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "../lib/src/common/types.ts";
|
||||
import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "../common/types.ts";
|
||||
import { createSavingEntryFromLoadedEntry, createTextBlob, delay, fireAndForget, getDocData, isDocContentSame, throttle } from "../lib/src/common/utils.ts";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { readString, decodeBinary, arrayBufferToBase64, digestHash } from "../lib/src/string_and_binary/strbin.ts";
|
||||
import { serialized } from "../lib/src/concurrency/lock.ts";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands.ts";
|
||||
import { stripAllPrefixes } from "../lib/src/string_and_binary/path.ts";
|
||||
import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "../common/utils.ts";
|
||||
import { PluginDialogModal } from "../common/dialogs.ts";
|
||||
import { JsonResolveModal } from "../ui/JsonResolveModal.ts";
|
||||
import { QueueProcessor } from '../lib/src/concurrency/processor.ts';
|
||||
import { pluginScanningCount } from '../lib/src/mock_and_interop/stores.ts';
|
||||
import type ObsidianLiveSyncPlugin from '../main.ts';
|
||||
|
||||
const d = "\u200b";
|
||||
const d2 = "\n";
|
||||
@@ -1,16 +1,16 @@
|
||||
import { normalizePath, type PluginManifest, type ListedFiles } from "./deps";
|
||||
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry, type DocumentID } from "./lib/src/types";
|
||||
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
|
||||
import { readAsBlob, isDocContentSame, sendSignal, readContent, createBlob } from "./lib/src/utils";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { PouchDB } from "./lib/src/pouchdb-browser.js";
|
||||
import { isInternalMetadata, PeriodicProcessor } from "./utils";
|
||||
import { serialized } from "./lib/src/lock";
|
||||
import { JsonResolveModal } from "./JsonResolveModal";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
import { addPrefix, stripAllPrefixes } from "./lib/src/path";
|
||||
import { QueueProcessor } from "./lib/src/processor";
|
||||
import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "./lib/src/stores";
|
||||
import { normalizePath, type PluginManifest, type ListedFiles } from "../deps.ts";
|
||||
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry, type DocumentID } from "../lib/src/common/types.ts";
|
||||
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "../common/types.ts";
|
||||
import { readAsBlob, isDocContentSame, sendSignal, readContent, createBlob } from "../lib/src/common/utils.ts";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { PouchDB } from "../lib/src/pouchdb/pouchdb-browser.js";
|
||||
import { isInternalMetadata, PeriodicProcessor } from "../common/utils.ts";
|
||||
import { serialized } from "../lib/src/concurrency/lock.ts";
|
||||
import { JsonResolveModal } from "../ui/JsonResolveModal.ts";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands.ts";
|
||||
import { addPrefix, stripAllPrefixes } from "../lib/src/string_and_binary/path.ts";
|
||||
import { QueueProcessor } from "../lib/src/concurrency/processor.ts";
|
||||
import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "../lib/src/mock_and_interop/stores.ts";
|
||||
|
||||
export class HiddenFileSync extends LiveSyncCommands {
|
||||
periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
|
||||
@@ -1,15 +1,15 @@
|
||||
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE, REMOTE_COUCHDB, REMOTE_MINIO } from "./lib/src/types";
|
||||
import { configURIBase } from "./types";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { PouchDB } from "./lib/src/pouchdb-browser.js";
|
||||
import { askSelectString, askYesNo, askString } from "./utils";
|
||||
import { decrypt, encrypt } from "./lib/src/e2ee_v2";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
import { delay, fireAndForget } from "./lib/src/utils";
|
||||
import { confirmWithMessage } from "./dialogs";
|
||||
import { Platform } from "./deps";
|
||||
import { fetchAllUsedChunks } from "./lib/src/utils_couchdb";
|
||||
import type { LiveSyncCouchDBReplicator } from "./lib/src/LiveSyncReplicator.js";
|
||||
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE, REMOTE_COUCHDB, REMOTE_MINIO } from "../lib/src/common/types.ts";
|
||||
import { configURIBase } from "../common/types.ts";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { PouchDB } from "../lib/src/pouchdb/pouchdb-browser.js";
|
||||
import { askSelectString, askYesNo, askString } from "../common/utils.ts";
|
||||
import { decrypt, encrypt } from "../lib/src/encryption/e2ee_v2.ts";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands.ts";
|
||||
import { delay, fireAndForget } from "../lib/src/common/utils.ts";
|
||||
import { confirmWithMessage } from "../common/dialogs.ts";
|
||||
import { Platform } from "../deps.ts";
|
||||
import { fetchAllUsedChunks } from "../lib/src/pouchdb/utils_couchdb.ts";
|
||||
import type { LiveSyncCouchDBReplicator } from "../lib/src/replication/couchdb/LiveSyncReplicator.js";
|
||||
|
||||
export class SetupLiveSync extends LiveSyncCommands {
|
||||
onunload() { }
|
||||
@@ -1,6 +1,6 @@
|
||||
import { type AnyEntry, type DocumentID, type EntryDoc, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "./lib/src/types";
|
||||
import { PouchDB } from "./lib/src/pouchdb-browser.js";
|
||||
import type ObsidianLiveSyncPlugin from "./main";
|
||||
import { type AnyEntry, type DocumentID, type EntryDoc, type EntryHasPath, type FilePath, type FilePathWithPrefix } from "../lib/src/common/types.ts";
|
||||
import { PouchDB } from "../lib/src/pouchdb/pouchdb-browser.js";
|
||||
import type ObsidianLiveSyncPlugin from "../main.ts";
|
||||
|
||||
|
||||
export abstract class LiveSyncCommands {
|
||||
2
src/lib
Submodule src/lib updated: 60d012f92d...3c0ff967e9
214
src/main.ts
@@ -2,43 +2,44 @@ const isDebug = false;
|
||||
|
||||
import { type Diff, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch, stringifyYaml, parseYaml } from "./deps";
|
||||
import { Notice, Plugin, TFile, addIcon, TFolder, normalizePath, TAbstractFile, Editor, MarkdownView, type RequestUrlParam, type RequestUrlResponse, requestUrl, type MarkdownFileInfo } from "./deps";
|
||||
import { type EntryDoc, type LoadedEntry, type ObsidianLiveSyncSettings, type diff_check_result, type diff_result_leaf, type EntryBody, LOG_LEVEL, VER, DEFAULT_SETTINGS, type diff_result, FLAGMD_REDFLAG, SYNCINFO_ID, SALT_OF_PASSPHRASE, type ConfigPassphraseStore, type CouchDBConnection, FLAGMD_REDFLAG2, FLAGMD_REDFLAG3, PREFIXMD_LOGFILE, type DatabaseConnectingStatus, type EntryHasPath, type DocumentID, type FilePathWithPrefix, type FilePath, type AnyEntry, LOG_LEVEL_DEBUG, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_URGENT, LOG_LEVEL_VERBOSE, type SavingEntry, MISSING_OR_ERROR, NOT_CONFLICTED, AUTO_MERGED, CANCELLED, LEAVE_TO_SUBSEQUENT, FLAGMD_REDFLAG2_HR, FLAGMD_REDFLAG3_HR, REMOTE_MINIO, REMOTE_COUCHDB, type BucketSyncSetting, } from "./lib/src/types";
|
||||
import { type InternalFileInfo, type CacheData, type FileEventItem, FileWatchEventQueueMax } from "./types";
|
||||
import { arrayToChunkedArray, createBlob, delay, determineTypeFromBlob, fireAndForget, getDocData, isAnyNote, isDocContentSame, isObjectDifferent, readContent, sendValue, throttle, type SimpleStore } from "./lib/src/utils";
|
||||
import { Logger, setGlobalLogFunction } from "./lib/src/logger";
|
||||
import { PouchDB } from "./lib/src/pouchdb-browser.js";
|
||||
import { ConflictResolveModal } from "./ConflictResolveModal";
|
||||
import { ObsidianLiveSyncSettingTab } from "./ObsidianLiveSyncSettingTab";
|
||||
import { DocumentHistoryModal } from "./DocumentHistoryModal";
|
||||
import { applyPatch, cancelAllPeriodicTask, cancelAllTasks, cancelTask, generatePatchObj, id2path, isObjectMargeApplicable, isSensibleMargeApplicable, flattenObject, path2id, scheduleTask, tryParseJSON, isValidPath, isInternalMetadata, isPluginMetadata, stripInternalMetadataPrefix, isChunk, askSelectString, askYesNo, askString, PeriodicProcessor, getPath, getPathWithoutPrefix, getPathFromTFile, performRebuildDB, memoIfNotExist, memoObject, retrieveMemoObject, disposeMemoObject, isCustomisationSyncMetadata, compareFileFreshness, BASE_IS_NEW, TARGET_IS_NEW, EVEN, compareMTime, markChangesAreSame } from "./utils";
|
||||
import { encrypt, tryDecrypt } from "./lib/src/e2ee_v2";
|
||||
import { balanceChunkPurgedDBs, enableCompression, enableEncryption, isCloudantURI, isErrorOfMissingDoc, isValidRemoteCouchDBURI, purgeUnreferencedChunks } from "./lib/src/utils_couchdb";
|
||||
import { logStore, type LogEntry, collectingChunks, pluginScanningCount, hiddenFilesProcessingCount, hiddenFilesEventCount, logMessages } from "./lib/src/stores";
|
||||
import { setNoticeClass } from "./lib/src/wrapper";
|
||||
import { versionNumberString2Number, writeString, decodeBinary, readString } from "./lib/src/strbin";
|
||||
import { addPrefix, isAcceptedAll, isPlainText, shouldBeIgnored, stripAllPrefixes } from "./lib/src/path";
|
||||
import { isLockAcquired, serialized, shareRunningResult, skipIfDuplicated } from "./lib/src/lock";
|
||||
import { StorageEventManager, StorageEventManagerObsidian } from "./StorageEventManager";
|
||||
import { LiveSyncLocalDB, type LiveSyncLocalDBEnv } from "./lib/src/LiveSyncLocalDB";
|
||||
import { LiveSyncAbstractReplicator, type LiveSyncReplicatorEnv } from "./lib/src/LiveSyncAbstractReplicator.js";
|
||||
import { type KeyValueDatabase, OpenKeyValueDatabase } from "./KeyValueDB";
|
||||
import { LiveSyncCommands } from "./LiveSyncCommands";
|
||||
import { HiddenFileSync } from "./CmdHiddenFileSync";
|
||||
import { SetupLiveSync } from "./CmdSetupLiveSync";
|
||||
import { ConfigSync } from "./CmdConfigSync";
|
||||
import { confirmWithMessage } from "./dialogs";
|
||||
import { GlobalHistoryView, VIEW_TYPE_GLOBAL_HISTORY } from "./GlobalHistoryView";
|
||||
import { LogPaneView, VIEW_TYPE_LOG } from "./LogPaneView";
|
||||
import { LRUCache } from "./lib/src/LRUCache";
|
||||
import { SerializedFileAccess } from "./SerializedFileAccess.js";
|
||||
import { QueueProcessor } from "./lib/src/processor.js";
|
||||
import { reactive, reactiveSource } from "./lib/src/reactive.js";
|
||||
import { initializeStores } from "./stores.js";
|
||||
import { JournalSyncMinio } from "./lib/src/JournalSyncMinio.js";
|
||||
import { LiveSyncJournalReplicator, type LiveSyncJournalReplicatorEnv } from "./lib/src/LiveSyncJournalReplicator.js";
|
||||
import { LiveSyncCouchDBReplicator, type LiveSyncCouchDBReplicatorEnv } from "./lib/src/LiveSyncReplicator.js";
|
||||
import type { CheckPointInfo } from "./lib/src/JournalSyncTypes.js";
|
||||
import { ObsHttpHandler } from "./ObsHttpHandler.js";
|
||||
import { type EntryDoc, type LoadedEntry, type ObsidianLiveSyncSettings, type diff_check_result, type diff_result_leaf, type EntryBody, LOG_LEVEL, VER, DEFAULT_SETTINGS, type diff_result, FLAGMD_REDFLAG, SYNCINFO_ID, SALT_OF_PASSPHRASE, type ConfigPassphraseStore, type CouchDBConnection, FLAGMD_REDFLAG2, FLAGMD_REDFLAG3, PREFIXMD_LOGFILE, type DatabaseConnectingStatus, type EntryHasPath, type DocumentID, type FilePathWithPrefix, type FilePath, type AnyEntry, LOG_LEVEL_DEBUG, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_URGENT, LOG_LEVEL_VERBOSE, type SavingEntry, MISSING_OR_ERROR, NOT_CONFLICTED, AUTO_MERGED, CANCELLED, LEAVE_TO_SUBSEQUENT, FLAGMD_REDFLAG2_HR, FLAGMD_REDFLAG3_HR, REMOTE_MINIO, REMOTE_COUCHDB, type BucketSyncSetting, TweakValuesShouldMatchedTemplate, confName, type TweakValues, } from "./lib/src/common/types.ts";
|
||||
import { type InternalFileInfo, type CacheData, type FileEventItem, FileWatchEventQueueMax } from "./common/types.ts";
|
||||
import { arrayToChunkedArray, createBlob, delay, determineTypeFromBlob, escapeMarkdownValue, extractObject, fireAndForget, getDocData, isAnyNote, isDocContentSame, isObjectDifferent, readContent, sendValue, throttle, type SimpleStore } from "./lib/src/common/utils.ts";
|
||||
import { Logger, setGlobalLogFunction } from "./lib/src/common/logger.ts";
|
||||
import { PouchDB } from "./lib/src/pouchdb/pouchdb-browser.js";
|
||||
import { ConflictResolveModal } from "./ui/ConflictResolveModal.ts";
|
||||
import { ObsidianLiveSyncSettingTab } from "./ui/ObsidianLiveSyncSettingTab.ts";
|
||||
import { DocumentHistoryModal } from "./ui/DocumentHistoryModal.ts";
|
||||
import { applyPatch, cancelAllPeriodicTask, cancelAllTasks, cancelTask, generatePatchObj, id2path, isObjectMargeApplicable, isSensibleMargeApplicable, flattenObject, path2id, scheduleTask, tryParseJSON, isValidPath, isInternalMetadata, isPluginMetadata, stripInternalMetadataPrefix, isChunk, askSelectString, askYesNo, askString, PeriodicProcessor, getPath, getPathWithoutPrefix, getPathFromTFile, performRebuildDB, memoIfNotExist, memoObject, retrieveMemoObject, disposeMemoObject, isCustomisationSyncMetadata, compareFileFreshness, BASE_IS_NEW, TARGET_IS_NEW, EVEN, compareMTime, markChangesAreSame } from "./common/utils.ts";
|
||||
import { encrypt, tryDecrypt } from "./lib/src/encryption/e2ee_v2.ts";
|
||||
import { balanceChunkPurgedDBs, enableCompression, enableEncryption, isCloudantURI, isErrorOfMissingDoc, isValidRemoteCouchDBURI, purgeUnreferencedChunks } from "./lib/src/pouchdb/utils_couchdb.ts";
|
||||
import { logStore, type LogEntry, collectingChunks, pluginScanningCount, hiddenFilesProcessingCount, hiddenFilesEventCount, logMessages } from "./lib/src/mock_and_interop/stores.ts";
|
||||
import { setNoticeClass } from "./lib/src/mock_and_interop/wrapper.ts";
|
||||
import { versionNumberString2Number, writeString, decodeBinary, readString } from "./lib/src/string_and_binary/strbin.ts";
|
||||
import { addPrefix, isAcceptedAll, isPlainText, shouldBeIgnored, stripAllPrefixes } from "./lib/src/string_and_binary/path.ts";
|
||||
import { isLockAcquired, serialized, shareRunningResult, skipIfDuplicated } from "./lib/src/concurrency/lock.ts";
|
||||
import { StorageEventManager, StorageEventManagerObsidian } from "./storages/StorageEventManager.ts";
|
||||
import { LiveSyncLocalDB, type LiveSyncLocalDBEnv } from "./lib/src/pouchdb/LiveSyncLocalDB.ts";
|
||||
import { LiveSyncAbstractReplicator, type LiveSyncReplicatorEnv } from "./lib/src/replication/LiveSyncAbstractReplicator.js";
|
||||
import { type KeyValueDatabase, OpenKeyValueDatabase } from "./common/KeyValueDB.ts";
|
||||
import { LiveSyncCommands } from "./features/LiveSyncCommands.ts";
|
||||
import { HiddenFileSync } from "./features/CmdHiddenFileSync.ts";
|
||||
import { SetupLiveSync } from "./features/CmdSetupLiveSync.ts";
|
||||
import { ConfigSync } from "./features/CmdConfigSync.ts";
|
||||
import { confirmWithMessage } from "./common/dialogs.ts";
|
||||
import { GlobalHistoryView, VIEW_TYPE_GLOBAL_HISTORY } from "./ui/GlobalHistoryView.ts";
|
||||
import { LogPaneView, VIEW_TYPE_LOG } from "./ui/LogPaneView.ts";
|
||||
import { LRUCache } from "./lib/src/memory/LRUCache.ts";
|
||||
import { SerializedFileAccess } from "./storages/SerializedFileAccess.js";
|
||||
import { QueueProcessor } from "./lib/src/concurrency/processor.js";
|
||||
import { reactive, reactiveSource } from "./lib/src/dataobject/reactive.js";
|
||||
import { initializeStores } from "./common/stores.js";
|
||||
import { JournalSyncMinio } from "./lib/src/replication/journal/objectstore/JournalSyncMinio.js";
|
||||
import { LiveSyncJournalReplicator, type LiveSyncJournalReplicatorEnv } from "./lib/src/replication/journal/LiveSyncJournalReplicator.js";
|
||||
import { LiveSyncCouchDBReplicator, type LiveSyncCouchDBReplicatorEnv } from "./lib/src/replication/couchdb/LiveSyncReplicator.js";
|
||||
import type { CheckPointInfo } from "./lib/src/replication/journal/JournalSyncTypes.js";
|
||||
import { ObsHttpHandler } from "./common/ObsHttpHandler.js";
|
||||
// import { Trench } from "./lib/src/memory/memutil.js";
|
||||
|
||||
setNoticeClass(Notice);
|
||||
|
||||
@@ -327,6 +328,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
|
||||
}
|
||||
async onInitializeDatabase(db: LiveSyncLocalDB): Promise<void> {
|
||||
this.kvDB = await OpenKeyValueDatabase(db.dbname + "-livesync-kv");
|
||||
// this.trench = new Trench(this.simpleStore);
|
||||
this.replicator = this.getNewReplicator();
|
||||
}
|
||||
async onResetDatabase(db: LiveSyncLocalDB): Promise<void> {
|
||||
@@ -335,6 +337,7 @@ export default class ObsidianLiveSyncPlugin extends Plugin
|
||||
// localStorage.removeItem(lsKey);
|
||||
await this.kvDB.destroy();
|
||||
this.kvDB = await OpenKeyValueDatabase(db.dbname + "-livesync-kv");
|
||||
// this.trench = new Trench(this.simpleStore);
|
||||
this.replicator = this.getNewReplicator()
|
||||
}
|
||||
getReplicator() {
|
||||
@@ -480,6 +483,8 @@ export default class ObsidianLiveSyncPlugin extends Plugin
|
||||
return (await ret).map(e => e.toString()).filter(e => e.startsWith("os-")).map(e => e.substring(3));
|
||||
}
|
||||
}
|
||||
// trench!: Trench;
|
||||
|
||||
getMinioJournalSyncClient() {
|
||||
const id = this.settings.accessKey
|
||||
const key = this.settings.secretKey
|
||||
@@ -2047,58 +2052,114 @@ We can perform a command in this file.
|
||||
await this.loadQueuedFiles();
|
||||
const ret = await this.replicator.openReplication(this.settings, false, showMessage, false);
|
||||
if (!ret) {
|
||||
if (this.replicator.remoteLockedAndDeviceNotAccepted) {
|
||||
if (this.replicator.remoteCleaned && this.settings.useIndexedDBAdapter) {
|
||||
Logger(`The remote database has been cleaned.`, showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO);
|
||||
await skipIfDuplicated("cleanup", async () => {
|
||||
const count = await purgeUnreferencedChunks(this.localDatabase.localDatabase, true);
|
||||
const message = `The remote database has been cleaned up.
|
||||
if (this.replicator.tweakSettingsMismatched) {
|
||||
const remoteSettings = this.replicator.mismatchedTweakValues;
|
||||
const mustSettings = remoteSettings.map(e => extractObject(TweakValuesShouldMatchedTemplate, e));
|
||||
const items = Object.entries(TweakValuesShouldMatchedTemplate);
|
||||
// Making tables:
|
||||
let table = `| Value name | Ours | ${mustSettings.map((_, i) => `Remote ${i + 1} |`).join("")}\n` +
|
||||
`|: --- |: --- :${`|: --- :`.repeat(mustSettings.length)}|\n`
|
||||
for (const v of items) {
|
||||
const key = v[0] as keyof typeof TweakValuesShouldMatchedTemplate;
|
||||
const value = mustSettings.map(e => e[key]);
|
||||
table += `| ${confName(key)} | ${escapeMarkdownValue(this.settings[key])} | ${value.map((v) => `${escapeMarkdownValue(v)} |`).join("")}\n`;
|
||||
}
|
||||
|
||||
const message = `
|
||||
Configuration mismatching between the clients has been detected.
|
||||
This can be harmful or extra capacity consumption. We have to make these value unified.
|
||||
|
||||
Configured values:
|
||||
|
||||
${table}
|
||||
|
||||
Please select a unification method.
|
||||
|
||||
However, even if we answer that you will \`Use mine\`, we will be prompted to accept it again on the other device and have to decide accept or not.`;
|
||||
|
||||
//TODO: apply this settings.
|
||||
const CHOICE_USE_REMOTE = "Use Remote ";
|
||||
const CHOICE_USR_MINE = "Use ours";
|
||||
const CHOICE_DISMISS = "Dismiss";
|
||||
// const ourConfig = extractObject(TweakValuesShouldMatchedTemplate, this.settings);
|
||||
const CHOICE_AND_VALUES = [
|
||||
...mustSettings.map((e, i) => [`${CHOICE_USE_REMOTE} ${i + 1}`, e]),
|
||||
[CHOICE_USR_MINE, true],
|
||||
[CHOICE_DISMISS, false]
|
||||
]
|
||||
const CHOICES = Object.fromEntries(CHOICE_AND_VALUES) as Record<string, TweakValues | boolean>;
|
||||
const retKey = await confirmWithMessage(this, "Locked", message, Object.keys(CHOICES), CHOICE_DISMISS, 60);
|
||||
if (!retKey) return;
|
||||
const conf = CHOICES[retKey];
|
||||
if (!conf) {
|
||||
return;
|
||||
}
|
||||
if (conf === true) {
|
||||
await this.replicator.resetRemoteTweakSettings(this.settings);
|
||||
Logger(`Tweak values on the remote server have been cleared, and will be overwritten in next synchronisation.`, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
}
|
||||
if (conf) {
|
||||
this.settings = { ...this.settings, ...conf };
|
||||
await this.saveSettingData();
|
||||
Logger(`Tweak Values have been overwritten by the chosen one.`, LOG_LEVEL_NOTICE);
|
||||
return;
|
||||
}
|
||||
|
||||
} else {
|
||||
if (this.replicator.remoteLockedAndDeviceNotAccepted) {
|
||||
if (this.replicator.remoteCleaned && this.settings.useIndexedDBAdapter) {
|
||||
Logger(`The remote database has been cleaned.`, showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO);
|
||||
await skipIfDuplicated("cleanup", async () => {
|
||||
const count = await purgeUnreferencedChunks(this.localDatabase.localDatabase, true);
|
||||
const message = `The remote database has been cleaned up.
|
||||
To synchronize, this device must be also cleaned up. ${count} chunk(s) will be erased from this device.
|
||||
However, If there are many chunks to be deleted, maybe fetching again is faster.
|
||||
We will lose the history of this device if we fetch the remote database again.
|
||||
Even if you choose to clean up, you will see this option again if you exit Obsidian and then synchronise again.`
|
||||
const CHOICE_FETCH = "Fetch again";
|
||||
const CHOICE_CLEAN = "Cleanup";
|
||||
const CHOICE_DISMISS = "Dismiss";
|
||||
const ret = await confirmWithMessage(this, "Cleaned", message, [CHOICE_FETCH, CHOICE_CLEAN, CHOICE_DISMISS], CHOICE_DISMISS, 30);
|
||||
if (ret == CHOICE_FETCH) {
|
||||
await performRebuildDB(this, "localOnly");
|
||||
}
|
||||
if (ret == CHOICE_CLEAN) {
|
||||
const replicator = this.getReplicator();
|
||||
if (!(replicator instanceof LiveSyncCouchDBReplicator)) return;
|
||||
const remoteDB = await replicator.connectRemoteCouchDBWithSetting(this.settings, this.getIsMobile(), true);
|
||||
if (typeof remoteDB == "string") {
|
||||
Logger(remoteDB, LOG_LEVEL_NOTICE);
|
||||
return false;
|
||||
const CHOICE_FETCH = "Fetch again";
|
||||
const CHOICE_CLEAN = "Cleanup";
|
||||
const CHOICE_DISMISS = "Dismiss";
|
||||
const ret = await confirmWithMessage(this, "Cleaned", message, [CHOICE_FETCH, CHOICE_CLEAN, CHOICE_DISMISS], CHOICE_DISMISS, 30);
|
||||
if (ret == CHOICE_FETCH) {
|
||||
await performRebuildDB(this, "localOnly");
|
||||
}
|
||||
if (ret == CHOICE_CLEAN) {
|
||||
const replicator = this.getReplicator();
|
||||
if (!(replicator instanceof LiveSyncCouchDBReplicator)) return;
|
||||
const remoteDB = await replicator.connectRemoteCouchDBWithSetting(this.settings, this.getIsMobile(), true);
|
||||
if (typeof remoteDB == "string") {
|
||||
Logger(remoteDB, LOG_LEVEL_NOTICE);
|
||||
return false;
|
||||
}
|
||||
|
||||
await purgeUnreferencedChunks(this.localDatabase.localDatabase, false);
|
||||
this.localDatabase.hashCaches.clear();
|
||||
// Perform the synchronisation once.
|
||||
if (await this.replicator.openReplication(this.settings, false, showMessage, true)) {
|
||||
await balanceChunkPurgedDBs(this.localDatabase.localDatabase, remoteDB.db);
|
||||
await purgeUnreferencedChunks(this.localDatabase.localDatabase, false);
|
||||
this.localDatabase.hashCaches.clear();
|
||||
await this.getReplicator().markRemoteResolved(this.settings);
|
||||
Logger("The local database has been cleaned up.", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO)
|
||||
} else {
|
||||
Logger("Replication has been cancelled. Please try it again.", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO)
|
||||
}
|
||||
// Perform the synchronisation once.
|
||||
if (await this.replicator.openReplication(this.settings, false, showMessage, true)) {
|
||||
await balanceChunkPurgedDBs(this.localDatabase.localDatabase, remoteDB.db);
|
||||
await purgeUnreferencedChunks(this.localDatabase.localDatabase, false);
|
||||
this.localDatabase.hashCaches.clear();
|
||||
await this.getReplicator().markRemoteResolved(this.settings);
|
||||
Logger("The local database has been cleaned up.", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO)
|
||||
} else {
|
||||
Logger("Replication has been cancelled. Please try it again.", showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO)
|
||||
}
|
||||
|
||||
}
|
||||
});
|
||||
} else {
|
||||
const message = `
|
||||
}
|
||||
});
|
||||
} else {
|
||||
const message = `
|
||||
The remote database has been rebuilt.
|
||||
To synchronize, this device must fetch everything again once.
|
||||
Or if you are sure know what had been happened, we can unlock the database from the setting dialog.
|
||||
`
|
||||
const CHOICE_FETCH = "Fetch again";
|
||||
const CHOICE_DISMISS = "Dismiss";
|
||||
const ret = await confirmWithMessage(this, "Locked", message, [CHOICE_FETCH, CHOICE_DISMISS], CHOICE_DISMISS, 10);
|
||||
if (ret == CHOICE_FETCH) {
|
||||
await performRebuildDB(this, "localOnly");
|
||||
const CHOICE_FETCH = "Fetch again";
|
||||
const CHOICE_DISMISS = "Dismiss";
|
||||
const ret = await confirmWithMessage(this, "Locked", message, [CHOICE_FETCH, CHOICE_DISMISS], CHOICE_DISMISS, 10);
|
||||
if (ret == CHOICE_FETCH) {
|
||||
await performRebuildDB(this, "localOnly");
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -2873,6 +2934,7 @@ Or if you are sure know what had been happened, we can unlock the database from
|
||||
children: [],
|
||||
datatype: datatype,
|
||||
type: datatype,
|
||||
eden: {},
|
||||
};
|
||||
//upsert should locked
|
||||
const msg = `STORAGE -> DB (${datatype}) `;
|
||||
|
||||
@@ -1,11 +1,11 @@
|
||||
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "./deps";
|
||||
import { serialized } from "./lib/src/lock";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { isPlainText } from "./lib/src/path";
|
||||
import type { FilePath } from "./lib/src/types";
|
||||
import { createBinaryBlob, isDocContentSame } from "./lib/src/utils";
|
||||
import type { InternalFileInfo } from "./types";
|
||||
import { markChangesAreSame } from "./utils";
|
||||
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "../deps.ts";
|
||||
import { serialized } from "../lib/src/concurrency/lock.ts";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { isPlainText } from "../lib/src/string_and_binary/path.ts";
|
||||
import type { FilePath } from "../lib/src/common/types.ts";
|
||||
import { createBinaryBlob, isDocContentSame } from "../lib/src/common/utils.ts";
|
||||
import type { InternalFileInfo } from "../common/types.ts";
|
||||
import { markChangesAreSame } from "../common/utils.ts";
|
||||
|
||||
function getFileLockKey(file: TFile | TFolder | string) {
|
||||
return `fl:${typeof (file) == "string" ? file : file.path}`;
|
||||
@@ -1,11 +1,11 @@
|
||||
import type { SerializedFileAccess } from "./SerializedFileAccess";
|
||||
import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { shouldBeIgnored } from "./lib/src/path";
|
||||
import type { QueueProcessor } from "./lib/src/processor";
|
||||
import { LOG_LEVEL_NOTICE, type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
|
||||
import { delay } from "./lib/src/utils";
|
||||
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "./types";
|
||||
import type { SerializedFileAccess } from "./SerializedFileAccess.ts";
|
||||
import { Plugin, TAbstractFile, TFile, TFolder } from "../deps.ts";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { shouldBeIgnored } from "../lib/src/string_and_binary/path.ts";
|
||||
import type { QueueProcessor } from "../lib/src/concurrency/processor.ts";
|
||||
import { LOG_LEVEL_NOTICE, type FilePath, type ObsidianLiveSyncSettings } from "../lib/src/common/types.ts";
|
||||
import { delay } from "../lib/src/common/utils.ts";
|
||||
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "../common/types.ts";
|
||||
|
||||
|
||||
export abstract class StorageEventManager {
|
||||
@@ -1,8 +1,8 @@
|
||||
import { App, Modal } from "./deps";
|
||||
import { App, Modal } from "../deps.ts";
|
||||
import { DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT } from "diff-match-patch";
|
||||
import { CANCELLED, LEAVE_TO_SUBSEQUENT, RESULT_TIMED_OUT, type diff_result } from "./lib/src/types";
|
||||
import { escapeStringToHTML } from "./lib/src/strbin";
|
||||
import { delay, sendValue, waitForValue } from "./lib/src/utils";
|
||||
import { CANCELLED, LEAVE_TO_SUBSEQUENT, RESULT_TIMED_OUT, type diff_result } from "../lib/src/common/types.ts";
|
||||
import { escapeStringToHTML } from "../lib/src/string_and_binary/strbin.ts";
|
||||
import { delay, sendValue, waitForValue } from "../lib/src/common/utils.ts";
|
||||
|
||||
export type MergeDialogResult = typeof LEAVE_TO_SUBSEQUENT | typeof CANCELLED | string;
|
||||
export class ConflictResolveModal extends Modal {
|
||||
@@ -1,12 +1,12 @@
|
||||
import { TFile, Modal, App, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "./deps";
|
||||
import { getPathFromTFile, isValidPath } from "./utils";
|
||||
import { decodeBinary, escapeStringToHTML, readString } from "./lib/src/strbin";
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
|
||||
import { getDocData, readContent } from "./lib/src/utils";
|
||||
import { isPlainText, stripPrefix } from "./lib/src/path";
|
||||
import { TFile, Modal, App, DIFF_DELETE, DIFF_EQUAL, DIFF_INSERT, diff_match_patch } from "../deps.ts";
|
||||
import { getPathFromTFile, isValidPath } from "../common/utils.ts";
|
||||
import { decodeBinary, escapeStringToHTML, readString } from "../lib/src/string_and_binary/strbin.ts";
|
||||
import ObsidianLiveSyncPlugin from "../main.ts";
|
||||
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "../lib/src/common/types.ts";
|
||||
import { Logger } from "../lib/src/common/logger.ts";
|
||||
import { isErrorOfMissingDoc } from "../lib/src/pouchdb/utils_couchdb.ts";
|
||||
import { getDocData, readContent } from "../lib/src/common/utils.ts";
|
||||
import { isPlainText, stripPrefix } from "../lib/src/string_and_binary/path.ts";
|
||||
|
||||
function isImage(path: string) {
|
||||
const ext = path.split(".").splice(-1)[0].toLowerCase();
|
||||
@@ -1,12 +1,12 @@
|
||||
<script lang="ts">
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import ObsidianLiveSyncPlugin from "../main";
|
||||
import { onDestroy, onMount } from "svelte";
|
||||
import type { AnyEntry, FilePathWithPrefix } from "./lib/src/types";
|
||||
import { getDocData, isAnyNote, isDocContentSame, readAsBlob } from "./lib/src/utils";
|
||||
import { diff_match_patch } from "./deps";
|
||||
import type { AnyEntry, FilePathWithPrefix } from "../lib/src/common/types";
|
||||
import { getDocData, isAnyNote, isDocContentSame, readAsBlob } from "../lib/src/common/utils";
|
||||
import { diff_match_patch } from "../deps";
|
||||
import { DocumentHistoryModal } from "./DocumentHistoryModal";
|
||||
import { isPlainText, stripAllPrefixes } from "./lib/src/path";
|
||||
import { TFile } from "./deps";
|
||||
import { isPlainText, stripAllPrefixes } from "../lib/src/string_and_binary/path";
|
||||
import { TFile } from "../deps";
|
||||
export let plugin: ObsidianLiveSyncPlugin;
|
||||
|
||||
let showDiffInfo = false;
|
||||
@@ -1,9 +1,9 @@
|
||||
import {
|
||||
ItemView,
|
||||
WorkspaceLeaf
|
||||
} from "./deps";
|
||||
} from "../deps.ts";
|
||||
import GlobalHistoryComponent from "./GlobalHistory.svelte";
|
||||
import type ObsidianLiveSyncPlugin from "./main";
|
||||
import type ObsidianLiveSyncPlugin from "../main.ts";
|
||||
|
||||
export const VIEW_TYPE_GLOBAL_HISTORY = "global-history";
|
||||
export class GlobalHistoryView extends ItemView {
|
||||
@@ -1,7 +1,7 @@
|
||||
import { App, Modal } from "./deps";
|
||||
import { type FilePath, type LoadedEntry } from "./lib/src/types";
|
||||
import { App, Modal } from "../deps.ts";
|
||||
import { type FilePath, type LoadedEntry } from "../lib/src/common/types.ts";
|
||||
import JsonResolvePane from "./JsonResolvePane.svelte";
|
||||
import { waitForSignal } from "./lib/src/utils";
|
||||
import { waitForSignal } from "../lib/src/common/utils.ts";
|
||||
|
||||
export class JsonResolveModal extends Modal {
|
||||
// result: Array<[number, string]>;
|
||||
@@ -1,9 +1,9 @@
|
||||
<script lang="ts">
|
||||
import { type Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "./deps";
|
||||
import type { FilePath, LoadedEntry } from "./lib/src/types";
|
||||
import { decodeBinary, readString } from "./lib/src/strbin";
|
||||
import { getDocData } from "./lib/src/utils";
|
||||
import { mergeObject } from "./utils";
|
||||
import { type Diff, DIFF_DELETE, DIFF_INSERT, diff_match_patch } from "../deps";
|
||||
import type { FilePath, LoadedEntry } from "../lib/src/common/types";
|
||||
import { decodeBinary, readString } from "../lib/src/string_and_binary/strbin";
|
||||
import { getDocData } from "../lib/src/common/utils";
|
||||
import { mergeObject } from "../common/utils";
|
||||
|
||||
export let docs: LoadedEntry[] = [];
|
||||
export let callback: (keepRev?: string, mergedStr?: string) => Promise<void> = async (_, __) => {
|
||||
@@ -1,8 +1,8 @@
|
||||
import { App, Modal } from "./deps";
|
||||
import type { ReactiveInstance, } from "./lib/src/reactive";
|
||||
import { logMessages } from "./lib/src/stores";
|
||||
import { escapeStringToHTML } from "./lib/src/strbin";
|
||||
import ObsidianLiveSyncPlugin from "./main";
|
||||
import { App, Modal } from "../deps.ts";
|
||||
import type { ReactiveInstance, } from "../lib/src/dataobject/reactive.ts";
|
||||
import { logMessages } from "../lib/src/mock_and_interop/stores.ts";
|
||||
import { escapeStringToHTML } from "../lib/src/string_and_binary/strbin.ts";
|
||||
import ObsidianLiveSyncPlugin from "../main.ts";
|
||||
|
||||
export class LogDisplayModal extends Modal {
|
||||
plugin: ObsidianLiveSyncPlugin;
|
||||
@@ -1,8 +1,8 @@
|
||||
<script lang="ts">
|
||||
import { onDestroy, onMount } from "svelte";
|
||||
import { logMessages } from "./lib/src/stores";
|
||||
import type { ReactiveInstance } from "./lib/src/reactive";
|
||||
import { Logger } from "./lib/src/logger";
|
||||
import { logMessages } from "../lib/src/mock_and_interop/stores";
|
||||
import type { ReactiveInstance } from "../lib/src/dataobject/reactive";
|
||||
import { Logger } from "../lib/src/common/logger";
|
||||
|
||||
let unsubscribe: () => void;
|
||||
let messages = [] as string[];
|
||||
@@ -3,7 +3,7 @@ import {
|
||||
WorkspaceLeaf
|
||||
} from "obsidian";
|
||||
import LogPaneComponent from "./LogPane.svelte";
|
||||
import type ObsidianLiveSyncPlugin from "./main";
|
||||
import type ObsidianLiveSyncPlugin from "../main.ts";
|
||||
export const VIEW_TYPE_LOG = "log-log";
|
||||
//Log view
|
||||
export class LogPaneView extends ItemView {
|
||||
@@ -1,16 +1,38 @@
import { App, PluginSettingTab, Setting, sanitizeHTMLToDom, MarkdownRenderer, stringifyYaml } from "./deps";
import { DEFAULT_SETTINGS, type ObsidianLiveSyncSettings, type ConfigPassphraseStore, type RemoteDBSettings, type FilePathWithPrefix, type HashAlgorithm, type DocumentID, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, LOG_LEVEL_INFO, type LoadedEntry, PREFERRED_SETTING_CLOUDANT, PREFERRED_SETTING_SELF_HOSTED, FLAGMD_REDFLAG2_HR, FLAGMD_REDFLAG3_HR, REMOTE_COUCHDB, REMOTE_MINIO, type BucketSyncSetting, type RemoteType, PREFERRED_JOURNAL_SYNC } from "./lib/src/types";
import { createBlob, delay, extractObject, isDocContentSame, readAsBlob } from "./lib/src/utils";
import { versionNumberString2Number } from "./lib/src/strbin";
import { Logger } from "./lib/src/logger";
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb";
import { testCrypt } from "./lib/src/e2ee_v2";
import ObsidianLiveSyncPlugin from "./main";
import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "./utils";
import { App, PluginSettingTab, Setting, sanitizeHTMLToDom, MarkdownRenderer, stringifyYaml } from "../deps.ts";
import {
DEFAULT_SETTINGS,
type ObsidianLiveSyncSettings,
type ConfigPassphraseStore,
type RemoteDBSettings,
type FilePathWithPrefix,
type HashAlgorithm,
type DocumentID,
LOG_LEVEL_NOTICE,
LOG_LEVEL_VERBOSE,
LOG_LEVEL_INFO,
type LoadedEntry,
PREFERRED_SETTING_CLOUDANT,
PREFERRED_SETTING_SELF_HOSTED,
FLAGMD_REDFLAG2_HR,
FLAGMD_REDFLAG3_HR,
REMOTE_COUCHDB,
REMOTE_MINIO,
type BucketSyncSetting,
type RemoteType,
PREFERRED_JOURNAL_SYNC,
confName
} from "../lib/src/common/types.ts";
import { createBlob, delay, extractObject, isDocContentSame, readAsBlob } from "../lib/src/common/utils.ts";
import { versionNumberString2Number } from "../lib/src/string_and_binary/strbin.ts";
import { Logger } from "../lib/src/common/logger.ts";
import { checkSyncInfo, isCloudantURI } from "../lib/src/pouchdb/utils_couchdb.ts";
import { testCrypt } from "../lib/src/encryption/e2ee_v2.ts";
import ObsidianLiveSyncPlugin from "../main.ts";
import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "../common/utils.ts";
import { request, type ButtonComponent, TFile } from "obsidian";
import { shouldBeIgnored } from "./lib/src/path";
import MultipleRegExpControl from './MultipleRegExpControl.svelte';
import { LiveSyncCouchDBReplicator } from "./lib/src/LiveSyncReplicator";
import { shouldBeIgnored } from "../lib/src/string_and_binary/path.ts";
import MultipleRegExpControl from './components/MultipleRegExpControl.svelte';
import { LiveSyncCouchDBReplicator } from "../lib/src/replication/couchdb/LiveSyncReplicator.ts";

export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
@@ -21,12 +43,14 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
super(app, plugin);
this.plugin = plugin;
}

async testConnection(settingOverride: Partial<ObsidianLiveSyncSettings> = {}): Promise<void> {
const trialSetting = { ...this.plugin.settings, ...settingOverride };
const replicator = this.plugin.getNewReplicator(trialSetting);

await replicator.tryConnectRemote(trialSetting);
}

askReload(message?: string) {
scheduleTask("configReload", 250, async () => {
if (await askYesNo(this.app, message || "Do you want to restart and reload Obsidian now?") == "yes") {
@@ -35,10 +59,12 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
})
}

closeSetting() {
// @ts-ignore
this.plugin.app.setting.close()
}

display(): void {
const { containerEl } = this;
let encrypt = this.plugin.settings.encrypt;
@@ -112,7 +138,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {

const tmpDiv = createSpan();
tmpDiv.addClass("sls-header-button");
tmpDiv.innerHTML = `<button> OK, I read all. </button>`;
tmpDiv.innerHTML = `<button> OK, I read everything. </button>`;
if (lastVersion > this.plugin.settings.lastReadUpdates) {
const informationButtonDiv = h3El.appendChild(tmpDiv);
informationButtonDiv.querySelector("button")?.addEventListener("click", async () => {
@@ -186,7 +212,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
})
if (!this.plugin.settings.isConfigured) {
new Setting(setupWizardEl)
.setName("Enable LiveSync on this device as the set-up was completed manually")
.setName("Enable LiveSync on this device as the setup was completed manually")
.addButton((text) => {
text.setButtonText("Enable").onClick(async () => {
this.plugin.settings.isConfigured = true;
@@ -197,10 +223,10 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
}
if (this.plugin.settings.isConfigured) {
new Setting(setupWizardEl)
.setName("Discard exist settings and databases")
.setName("Discard existing settings and databases")
.addButton((text) => {
text.setButtonText("Discard").onClick(async () => {
if (await askYesNo(this.plugin.app, "Do you really want to discard exist settings and databases?") == "yes") {
if (await askYesNo(this.plugin.app, "Do you really want to discard existing settings and databases?") == "yes") {
this.plugin.settings = { ...DEFAULT_SETTINGS };
await this.plugin.saveSettingData();
await this.plugin.resetLocalDatabase();
@@ -230,13 +256,17 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
try {
remoteTroubleShootMDSrc = await request(`${rawRepoURI}${basePath}/${filename}`);
} catch (ex: any) {
remoteTroubleShootMDSrc = "Error Occurred!!\n" + ex.toString();
remoteTroubleShootMDSrc = "An error occurred!!\n" + ex.toString();
}
const remoteTroubleShootMD = remoteTroubleShootMDSrc.replace(/\((.*?(.png)|(.jpg))\)/g, `(${rawRepoURI}${basePath}/$1)`)
// Render markdown
await MarkdownRenderer.render(this.plugin.app, `<a class='sls-troubleshoot-anchor'></a> [Tips and Troubleshooting](${topPath}) [PageTop](${filename})\n\n${remoteTroubleShootMD}`, troubleShootEl, `${rawRepoURI}`, this.plugin);
// Menu
troubleShootEl.querySelector<HTMLAnchorElement>(".sls-troubleshoot-anchor")?.parentElement?.setCssStyles({ position: "sticky", top: "-1em", backgroundColor: "var(--modal-background)" });
troubleShootEl.querySelector<HTMLAnchorElement>(".sls-troubleshoot-anchor")?.parentElement?.setCssStyles({
position: "sticky",
top: "-1em",
backgroundColor: "var(--modal-background)"
});
// Trap internal links.
troubleShootEl.querySelectorAll<HTMLAnchorElement>("a.internal-link").forEach((anchorEl) => {
anchorEl.addEventListener("click", async (evt) => {
@@ -286,7 +316,8 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
})
})

let applyDisplayEnabled = () => { }
let applyDisplayEnabled = () => {
}
const editing = extractObject<BucketSyncSetting>({
accessKey: "",
bucket: "",
@@ -303,7 +334,7 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
const ObjectStorageMessage = `Kindly notice: this is a pretty experimental feature, hence we have some limitations.
- Append only architecture. It will not shrink used storage if we do not perform a rebuild.
- A bit fragile.
- During the first synchronization, the entire history to date will be transferred. For this reason, it is preferable to do this under the WiFi network.
- During the first synchronization, the entire history to date will be transferred. For this reason, it is preferable to do this while connected to a Wi-Fi network.
- From the second, we always transfer only differences.

However, your report is needed to stabilise this. I appreciate you for your great dedication.
@@ -373,7 +404,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
})
);
new Setting(containerRemoteDatabaseEl)
.setName("Apply Setting")
.setName("Apply Settings")
.setClass("wizardHidden")
.addButton((button) =>
button
@@ -480,7 +511,6 @@ However, your report is needed to stabilise this. I appreciate you for your grea
await this.plugin.saveSettings();
})
)

);

new Setting(containerRemoteDatabaseEl)
@@ -497,7 +527,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
);

new Setting(containerRemoteDatabaseEl)
.setName("Check and Fix database configuration")
.setName("Check and fix database configuration")
.setDesc("Check the database configuration, and fix if there are any problems.")
.addButton((button) =>
button
@@ -563,13 +593,13 @@ However, your report is needed to stabilise this. I appreciate you for your grea
}
// HTTP user-authorization check
if (responseConfig?.chttpd?.require_valid_user != "true") {
addResult("❗ chttpd.require_valid_user looks like wrong.");
addResult("❗ chttpd.require_valid_user is wrong.");
addConfigFixButton("Set chttpd.require_valid_user = true", "chttpd/require_valid_user", "true");
} else {
addResult("✔ chttpd.require_valid_user is ok.");
}
if (responseConfig?.chttpd_auth?.require_valid_user != "true") {
addResult("❗ chttpd_auth.require_valid_user looks like wrong.");
addResult("❗ chttpd_auth.require_valid_user is wrong.");
addConfigFixButton("Set chttpd_auth.require_valid_user = true", "chttpd_auth/require_valid_user", "true");
} else {
addResult("✔ chttpd_auth.require_valid_user is ok.");
@@ -636,9 +666,9 @@ However, your report is needed to stabilise this. I appreciate you for your grea
}));
addResult(`Origin check:${org}`);
if (responseHeaders["access-control-allow-credentials"] != "true") {
addResult("❗ CORS is not allowing credential");
addResult("❗ CORS is not allowing credentials");
} else {
addResult("✔ CORS credential OK");
addResult("✔ CORS credentials OK");
}
if (responseHeaders["access-control-allow-origin"] != org) {
addResult(`❗ CORS Origin is unmatched:${origin}->${responseHeaders["access-control-allow-origin"]}`);
@@ -647,7 +677,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
}
}
addResult("--Done--", ["ob-btn-config-head"]);
addResult("If you have some trouble with Connection-check even though all Config-check has been passed, Please check your reverse proxy's configuration.", ["ob-btn-config-info"]);
addResult("If you have some trouble with Connection-check even though all Config-check has been passed, please check your reverse proxy's configuration.", ["ob-btn-config-info"]);
Logger(`Checking configuration done`, LOG_LEVEL_INFO);
} catch (ex: any) {
if (ex?.status == 401) {
@@ -667,9 +697,73 @@ However, your report is needed to stabilise this. I appreciate you for your grea
text: "",
});

containerRemoteDatabaseEl.createEl("h4", { text: "Effective Storage Using" });
containerRemoteDatabaseEl.createEl("h4", { text: "Effective Storage Using" }).addClass("wizardHidden")
new Setting(containerRemoteDatabaseEl)
.setName("Data Compression (Experimental)")
.setName(confName("useEden"))
.setDesc("If enabled, newly created chunks are temporarily kept within the document, and graduated to become independent chunks once stabilised.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.useEden).onChange(async (value) => {
this.plugin.settings.useEden = value;
await this.plugin.saveSettings();
this.display();
})
)
.setClass("wizardHidden");
if (this.plugin.settings.useEden) {
new Setting(containerRemoteDatabaseEl)
.setName("Maximum Incubating Chunks")
.setDesc("The maximum number of chunks that can be incubated within the document. Chunks exceeding this number will immediately graduate to independent chunks.")
.addText((text) => {
text.setPlaceholder("")
.setValue(this.plugin.settings.maxChunksInEden + "")
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 3) {
v = 3;
}
this.plugin.settings.maxChunksInEden = v;
await this.plugin.saveSettings();
});
text.inputEl.setAttribute("type", "number");
})
.setClass("wizardHidden");
new Setting(containerRemoteDatabaseEl)
.setName("Maximum Incubating Chunk Size")
.setDesc("The maximum total size of chunks that can be incubated within the document. Chunks exceeding this size will immediately graduate to independent chunks.")
.addText((text) => {
text.setPlaceholder("")
.setValue(this.plugin.settings.maxTotalLengthInEden + "")
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 100) {
v = 100;
}
this.plugin.settings.maxTotalLengthInEden = v;
await this.plugin.saveSettings();
});
text.inputEl.setAttribute("type", "number");
})
.setClass("wizardHidden");
new Setting(containerRemoteDatabaseEl)
.setName("Maximum Incubation Period")
.setDesc("The maximum duration for which chunks can be incubated within the document. Chunks exceeding this period will graduate to independent chunks.")
.addText((text) => {
text.setPlaceholder("")
.setValue(this.plugin.settings.maxAgeInEden + "")
.onChange(async (value) => {
let v = Number(value);
if (isNaN(v) || v < 3) {
v = 3;
}
this.plugin.settings.maxAgeInEden = v;
await this.plugin.saveSettings();
});
text.inputEl.setAttribute("type", "number");
})
.setClass("wizardHidden");
}
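// Editor's note: the following is an illustrative sketch, not part of this commit. It only shows how
// the three thresholds configured above (maxChunksInEden, maxTotalLengthInEden, maxAgeInEden) could be
// combined to decide when an incubated ("newborn") chunk should leave the document and become an
// independent chunk. The EdenChunk shape, the epoch counter and shouldGraduate() are assumptions made
// for illustration only.
type EdenChunk = { data: string; epoch: number };
type EdenLimits = { maxChunksInEden: number; maxTotalLengthInEden: number; maxAgeInEden: number };
function shouldGraduate(eden: Record<string, EdenChunk>, currentEpoch: number, limits: EdenLimits): boolean {
    const chunks = Object.values(eden);
    // Too many newborn chunks kept inline: graduate immediately.
    if (chunks.length > limits.maxChunksInEden) return true;
    // The incubated chunks grew too large for a single document: graduate immediately.
    const totalLength = chunks.reduce((sum, chunk) => sum + chunk.data.length, 0);
    if (totalLength > limits.maxTotalLengthInEden) return true;
    // A chunk that has survived long enough is considered stable: graduate it.
    return chunks.some((chunk) => currentEpoch - chunk.epoch > limits.maxAgeInEden);
}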
new Setting(containerRemoteDatabaseEl)
.setName(confName("enableCompression"))
.setDesc("Compresses data during transfer, saving space in the remote database. Note: Please ensure that all devices have v0.22.18 and connected tools are also supported compression.")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.enableCompression).onChange(async (value) => {
@@ -677,15 +771,15 @@ However, your report is needed to stabilise this. I appreciate you for your grea
await this.plugin.saveSettings();
this.display();
})
);
)
.setClass("wizardHidden");
}

containerRemoteDatabaseEl.createEl("h4", { text: "Confidentiality" });

const e2e = new Setting(containerRemoteDatabaseEl)
.setName("End to End Encryption")
.setName(confName("encrypt"))
.setDesc("Encrypt contents on the remote database. If you use the plugin's synchronization feature, enabling this is recommend.")
.addToggle((toggle) =>
toggle.setValue(encrypt).onChange(async (value) => {
@@ -734,7 +828,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
// if (showEncryptOptionDetail) {
const passphraseSetting = new Setting(containerRemoteDatabaseEl)
.setName("Passphrase")
.setDesc("Encrypting passphrase. If you change the passphrase of a existing database, overwriting the remote database is strongly recommended.")
.setDesc("Encrypting passphrase. If you change the passphrase of an existing database, overwriting the remote database is strongly recommended.")
.addText((text) => {
text.setPlaceholder("")
.setValue(passphrase)
@@ -753,7 +847,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
});

const usePathObfuscationEl = new Setting(containerRemoteDatabaseEl)
.setName("Path Obfuscation")
.setName(confName("usePathObfuscation"))
.setDesc("Obfuscate paths of files. If we configured, we should rebuild the database.")
.addToggle((toggle) =>
toggle.setValue(usePathObfuscation).onChange(async (value) => {
@@ -770,7 +864,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
);

const dynamicIteration = new Setting(containerRemoteDatabaseEl)
.setName("Use dynamic iteration count (experimental)")
.setName(confName("useDynamicIterationCount"))
.setDesc("Balancing the encryption/decryption load against the length of the passphrase if toggled.")
.addToggle((toggle) => {
toggle.setValue(useDynamicIterationCount)
@@ -803,7 +897,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
)
.addButton((button) =>
button
.setButtonText("Apply and Fetch")
.setButtonText("Apply and fetch")
.setWarning()
.setDisabled(false)
.onClick(async () => {
@@ -812,7 +906,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
)
.addButton((button) =>
button
.setButtonText("Apply and Rebuild")
.setButtonText("Apply and rebuild")
.setWarning()
.setDisabled(false)
.onClick(async () => {
@@ -853,7 +947,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
return;
}
if (encrypt && !(await testCrypt())) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL_NOTICE);
Logger("WARNING! Your device does not support encryption.", LOG_LEVEL_NOTICE);
return;
}
if (!(await checkWorkingPassphrase()) && !sendToServer) {
@@ -885,7 +979,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
return;
}
if (encrypt && !(await testCrypt())) {
Logger("WARNING! Your device would not support encryption.", LOG_LEVEL_NOTICE);
Logger("WARNING! Your device does not support encryption.", LOG_LEVEL_NOTICE);
return;
}
if (!encrypt) {
@@ -898,7 +992,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
this.plugin.settings.useDynamicIterationCount = useDynamicIterationCount;
this.plugin.settings.usePathObfuscation = usePathObfuscation;
this.plugin.settings.isConfigured = true;
Logger("All synchronization have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
Logger("All synchronizations have been temporarily disabled. Please enable them after the fetching, if you need them.", LOG_LEVEL_NOTICE)
await this.plugin.saveSettings();
updateE2EControls();
applyDisplayEnabled();
@@ -1034,7 +1128,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
let buttonApplyFilename: ButtonComponent;
new Setting(containerGeneralSettingsEl)
.setName("Filename")
.setDesc("If you set this, all settings are saved in a markdown file. You will also be notified when new settings were arrived. You can set different files by the platform.")
.setDesc("If you set this, all settings are saved in a markdown file. You will be notified when new settings arrive. You can set different files by the platform.")
.addText((text) => {
text.setPlaceholder("livesync/setting.md")
.setValue(settingSyncFile)
@@ -1140,9 +1234,14 @@ However, your report is needed to stabilise this. I appreciate you for your grea

let currentPreset = "NONE";
containerSyncSettingEl.createEl("div",
{ text: `Please select any preset to complete wizard.` }
{ text: `Please select any preset to complete the wizard.` }
).addClasses(["op-warn-info", "wizardOnly"]);
const options: Record<string, string> = this.plugin.settings.remoteType == REMOTE_COUCHDB ? { NONE: "", LIVESYNC: "LiveSync", PERIODIC: "Periodic w/ batch", DISABLE: "Disable all automatic" } : { NONE: "", PERIODIC: "Periodic w/ batch", DISABLE: "Disable all automatic" };
const options: Record<string, string> = this.plugin.settings.remoteType == REMOTE_COUCHDB ? {
NONE: "",
LIVESYNC: "LiveSync",
PERIODIC: "Periodic w/ batch",
DISABLE: "Disable all automatic"
} : { NONE: "", PERIODIC: "Periodic w/ batch", DISABLE: "Disable all automatic" };
new Setting(containerSyncSettingEl)
.setName("Presets")
.setDesc("Apply preset configuration")
@@ -1200,7 +1299,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
}
Logger("Synchronization setting configured as Periodic sync with batch database update.", LOG_LEVEL_NOTICE);
} else {
Logger("All synchronization disabled.", LOG_LEVEL_NOTICE);
Logger("All synchronizations disabled.", LOG_LEVEL_NOTICE);
this.plugin.settings = {
...this.plugin.settings,
...presetAllDisabled
@@ -1234,7 +1333,11 @@ However, your report is needed to stabilise this. I appreciate you for your grea
syncMode = "PERIODIC";
}

const optionsSyncMode = this.plugin.settings.remoteType == REMOTE_COUCHDB ? { "": "On events", PERIODIC: "Periodic and On events", "LIVESYNC": "LiveSync" } : { "": "On events", PERIODIC: "Periodic and On events" }
const optionsSyncMode = this.plugin.settings.remoteType == REMOTE_COUCHDB ? {
"": "On events",
PERIODIC: "Periodic and On events",
"LIVESYNC": "LiveSync"
} : { "": "On events", PERIODIC: "Periodic and On events" }
new Setting(containerSyncSettingEl)
.setName("Sync Mode")
.setClass("wizardHidden")
@@ -1280,7 +1383,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea

new Setting(containerSyncSettingEl)
.setName("Sync on Save")
.setDesc("When you save file, sync automatically")
.setDesc("When you save a file, sync automatically")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncOnSave).onChange(async (value) => {
@@ -1291,7 +1394,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
)
new Setting(containerSyncSettingEl)
.setName("Sync on Editor Save")
.setDesc("When you save file on the editor, sync automatically")
.setDesc("When you save a file in the editor, sync automatically")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncOnEditorSave).onChange(async (value) => {
@@ -1302,7 +1405,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
)
new Setting(containerSyncSettingEl)
.setName("Sync on File Open")
.setDesc("When you open file, sync automatically")
.setDesc("When you open a file, sync automatically")
.setClass("wizardHidden")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.syncOnFileOpen).onChange(async (value) => {
@@ -1390,7 +1493,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
);
containerSyncSettingEl.createEl("h4", { text: "Compatibility" }).addClass("wizardHidden");
new Setting(containerSyncSettingEl)
.setName("Always resolve conflict manually")
.setName("Always resolve conflicts manually")
.setDesc("If this switch is turned on, a merge dialog will be displayed, even if the sensible-merge is possible automatically. (Turn on to previous behavior)")
.setClass("wizardHidden")
.addToggle((toggle) =>
@@ -1548,7 +1651,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
);

new Setting(containerSyncSettingEl)
.setName("Enhance chunk size")
.setName(confName("customChunkSize"))
.setDesc("Enhance chunk size for binary files (Ratio). This cannot be increased when using IBM Cloudant.")
.setClass("wizardHidden")
.addText((text) => {
@@ -1587,13 +1690,15 @@ However, your report is needed to stabilise this. I appreciate you for your grea

const syncFilesSetting = new Setting(containerSyncSettingEl)
.setName("Synchronising files")
.setDesc("(RegExp) Empty to sync all files. set filter as a regular expression to limit synchronising files.")
.setDesc("(RegExp) Empty to sync all files. Set filter as a regular expression to limit synchronising files.")
.setClass("wizardHidden")
new MultipleRegExpControl(
{
target: syncFilesSetting.controlEl,
props: {
patterns: this.plugin.settings.syncOnlyRegEx.split("|[]|"), originals: [...this.plugin.settings.syncOnlyRegEx.split("|[]|")], apply: async (newPatterns) => {
patterns: this.plugin.settings.syncOnlyRegEx.split("|[]|"),
originals: [...this.plugin.settings.syncOnlyRegEx.split("|[]|")],
apply: async (newPatterns) => {
this.plugin.settings.syncOnlyRegEx = newPatterns.map(e => e.trim()).filter(e => e != "").join("|[]|");
await this.plugin.saveSettings();
this.display();
@@ -1611,7 +1716,9 @@ However, your report is needed to stabilise this. I appreciate you for your grea
{
target: nonSyncFilesSetting.controlEl,
props: {
patterns: this.plugin.settings.syncIgnoreRegEx.split("|[]|"), originals: [...this.plugin.settings.syncIgnoreRegEx.split("|[]|")], apply: async (newPatterns) => {
patterns: this.plugin.settings.syncIgnoreRegEx.split("|[]|"),
originals: [...this.plugin.settings.syncIgnoreRegEx.split("|[]|")],
apply: async (newPatterns) => {
this.plugin.settings.syncIgnoreRegEx = newPatterns.map(e => e.trim()).filter(e => e != "").join("|[]|");
await this.plugin.saveSettings();
this.display();
@@ -1791,7 +1898,7 @@ However, your report is needed to stabilise this. I appreciate you for your grea
responseConfig["admins"] = REDACTED;

} catch (ex) {
responseConfig = "Requesting information to the remote CouchDB has been failed. If you are using IBM Cloudant, it is the normal behaviour."
responseConfig = "Requesting information from the remote CouchDB has failed. If you are using IBM Cloudant, this is normal behaviour."
}
} else if (this.plugin.settings.remoteType == REMOTE_MINIO) {
responseConfig = "Object Storage Synchronisation";
@@ -1834,7 +1941,7 @@ ${stringifyYaml(pluginConfig)}`;

if (this.plugin.replicator.remoteLockedAndDeviceNotAccepted) {
const c = containerHatchEl.createEl("div", {
text: "To prevent unwanted vault corruption, the remote database has been locked for synchronization, and this device was not marked as 'resolved'. it caused by some operations like this. re-initialized. Local database initialization should be required. please back your vault up, reset local database, and press 'Mark this device as resolved'. ",
text: "To prevent unwanted vault corruption, the remote database has been locked for synchronization, and this device was not marked as 'resolved'. It caused by some operations like this. Re-initialized. Local database initialization should be required. Please back your vault up, reset the local database, and press 'Mark this device as resolved'. ",
});
c.createEl("button", { text: "I'm ready, mark this device 'resolved'" }, (e) => {
e.addClass("mod-warning");
@@ -1875,7 +1982,6 @@ ${stringifyYaml(pluginConfig)}`;
hatchWarn.addClass("op-warn-info");

const addResult = (path: string, file: TFile | false, fileOnDB: LoadedEntry | false) => {
resultArea.appendChild(resultArea.createEl("div", {}, el => {
el.appendChild(el.createEl("h6", { text: path }));
@@ -1922,7 +2028,7 @@ ${stringifyYaml(pluginConfig)}`;
}
new Setting(containerHatchEl)
.setName("Verify and repair all files")
.setDesc("Compare the content of files between on local database and storage. If not matched, you will asked which one want to keep.")
.setDesc("Compare the content of files between on local database and storage. If not matched, you will be asked which one you want to keep.")
.addButton((button) =>
button
.setButtonText("Verify all")
@@ -2025,7 +2131,7 @@ ${stringifyYaml(pluginConfig)}`;
}
}
} else {
Logger(`Something went wrong on converting ${docName}`, LOG_LEVEL_NOTICE);
Logger(`Something went wrong while converting ${docName}`, LOG_LEVEL_NOTICE);
Logger(ex, LOG_LEVEL_VERBOSE);
// Something wrong.
}
@@ -2044,7 +2150,11 @@ ${stringifyYaml(pluginConfig)}`;
.setWarning()
.onClick(async () => {
Logger(`Deleting customization sync data`, LOG_LEVEL_NOTICE);
const entriesToDelete = (await this.plugin.localDatabase.allDocsRaw({ startkey: "ix:", endkey: "ix:\u{10ffff}", include_docs: true }));
const entriesToDelete = (await this.plugin.localDatabase.allDocsRaw({
startkey: "ix:",
endkey: "ix:\u{10ffff}",
include_docs: true
}));
const newData = entriesToDelete.rows.map(e => ({ ...e.doc, _deleted: true }));
const r = await this.plugin.localDatabase.bulkDocsRaw(newData as any[]);
// Do not care about the result.
@@ -2175,12 +2285,17 @@ ${stringifyYaml(pluginConfig)}`;
})

new Setting(containerHatchEl)
.setName("The Hash algorithm for chunk IDs")
.setName(confName("hashAlg"))
.setDesc("xxhash64 is the current default.")
.setClass("wizardHidden")
.addDropdown((dropdown) =>
dropdown
.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)", "sha1": "Fallback (Without WebAssembly)" } as Record<HashAlgorithm, string>)
.addOptions({
"": "Old Algorithm",
"xxhash32": "xxhash32 (Fast)",
"xxhash64": "xxhash64 (Fastest)",
"sha1": "Fallback (Without WebAssembly)"
} as Record<HashAlgorithm, string>)
.setValue(this.plugin.settings.hashAlg)
.onChange(async (value) => {
this.plugin.settings.hashAlg = value as HashAlgorithm;
@@ -2199,6 +2314,15 @@ ${stringifyYaml(pluginConfig)}`;
await this.plugin.saveSettings();
})
);
new Setting(containerHatchEl)
.setName("Do not check configuration mismatch before replication")
.setDesc("")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.disableCheckingConfigMismatch).onChange(async (value) => {
this.plugin.settings.disableCheckingConfigMismatch = value;
await this.plugin.saveSettings();
})
);
addScreenElement("50", containerHatchEl);

@@ -2295,6 +2419,26 @@ ${stringifyYaml(pluginConfig)}`;

containerMaintenanceEl.createEl("h4", { text: "Remote" });

if (this.plugin.settings.remoteType == REMOTE_COUCHDB) {
new Setting(containerMaintenanceEl)
.setName("Perform compaction")
.setDesc("Compaction discards all of Eden in the non-latest revisions, reducing the storage usage. However, this operation requires the same free space on the remote as the current database.")
.addButton((button) =>
button
.setButtonText("Perform")
.setDisabled(false)
.onClick(async () => {
const replicator = this.plugin.replicator as LiveSyncCouchDBReplicator;
Logger(`Compaction has been began`, LOG_LEVEL_NOTICE, "compaction")
if (await replicator.compactRemote(this.plugin.settings)) {
Logger(`Compaction has been completed!`, LOG_LEVEL_NOTICE, "compaction");
} else {
Logger(`Compaction has been failed!`, LOG_LEVEL_NOTICE, "compaction");
}
})
)
}
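// Editor's note: the following is an illustrative sketch, not part of this commit. CouchDB triggers
// database compaction via `POST /{db}/_compact` (admin credentials and a JSON Content-Type are
// required, and the server replies `202 Accepted` while compacting in the background), so
// compactRemote() above presumably boils down to a request of roughly this shape. The parameter
// names are assumptions made for illustration only.
async function requestRemoteCompaction(uri: string, database: string, username: string, password: string): Promise<boolean> {
    const response = await fetch(`${uri}/${database}/_compact`, {
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "Authorization": `Basic ${btoa(`${username}:${password}`)}`,
        },
    });
    // 202 means the compaction request was accepted; it completes asynchronously on the server.
    return response.status === 202;
}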

new Setting(containerMaintenanceEl)
.setName("Lock remote")
.setDesc("Lock remote to prevent synchronization with other devices.")
@@ -2321,6 +2465,7 @@ ${stringifyYaml(pluginConfig)}`;
})
)

if (this.plugin.settings.remoteType != REMOTE_COUCHDB) {
new Setting(containerMaintenanceEl)
.setName("Reset journal received history")
@@ -2331,7 +2476,11 @@ ${stringifyYaml(pluginConfig)}`;
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin.getMinioJournalSyncClient().updateCheckPointInfo((info) => ({ ...info, receivedFiles: new Set(), knownIDs: new Set() }));
await this.plugin.getMinioJournalSyncClient().updateCheckPointInfo((info) => ({
...info,
receivedFiles: new Set(),
knownIDs: new Set()
}));
Logger(`Journal received history has been cleared.`, LOG_LEVEL_NOTICE);
})
)
@@ -2344,7 +2493,12 @@ ${stringifyYaml(pluginConfig)}`;
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin.getMinioJournalSyncClient().updateCheckPointInfo((info) => ({ ...info, lastLocalSeq: 0, sentIDs: new Set(), sentFiles: new Set() }));
await this.plugin.getMinioJournalSyncClient().updateCheckPointInfo((info) => ({
...info,
lastLocalSeq: 0,
sentIDs: new Set(),
sentFiles: new Set()
}));
Logger(`Journal sent history has been cleared.`, LOG_LEVEL_NOTICE);
})
)
@@ -2384,7 +2538,14 @@ ${stringifyYaml(pluginConfig)}`;
.setWarning()
.setDisabled(false)
.onClick(async () => {
await this.plugin.getMinioJournalSyncClient().updateCheckPointInfo((info) => ({ ...info, receivedFiles: new Set(), knownIDs: new Set(), lastLocalSeq: 0, sentIDs: new Set(), sentFiles: new Set() }));
await this.plugin.getMinioJournalSyncClient().updateCheckPointInfo((info) => ({
...info,
receivedFiles: new Set(),
knownIDs: new Set(),
lastLocalSeq: 0,
sentIDs: new Set(),
sentFiles: new Set()
}));
await this.plugin.resetRemoteBucket();
Logger(`the bucket has been cleared.`, LOG_LEVEL_NOTICE);
})
@@ -1,12 +1,12 @@
<script lang="ts">
import { onMount } from "svelte";
import ObsidianLiveSyncPlugin from "./main";
import { type PluginDataExDisplay, pluginIsEnumerating, pluginList } from "./CmdConfigSync";
import PluginCombo from "./PluginCombo.svelte";
import ObsidianLiveSyncPlugin from "../main";
import { type PluginDataExDisplay, pluginIsEnumerating, pluginList } from "../features/CmdConfigSync";
import PluginCombo from "./components/PluginCombo.svelte";
import { Menu } from "obsidian";
import { unique } from "./lib/src/utils";
import { MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED, type SYNC_MODE, type PluginSyncSettingEntry } from "./lib/src/types";
import { normalizePath } from "./deps";
import { unique } from "../lib/src/common/utils";
import { MODE_SELECTIVE, MODE_AUTOMATIC, MODE_PAUSED, type SYNC_MODE, type PluginSyncSettingEntry } from "../lib/src/common/types";
import { normalizePath } from "../deps";
export let plugin: ObsidianLiveSyncPlugin;

$: hideNotApplicable = false;
@@ -1,11 +1,11 @@
<script lang="ts">
import type { PluginDataExDisplay } from "./CmdConfigSync";
import { Logger } from "./lib/src/logger";
import { versionNumberString2Number } from "./lib/src/strbin";
import { type FilePath, LOG_LEVEL_NOTICE } from "./lib/src/types";
import { getDocData } from "./lib/src/utils";
import type ObsidianLiveSyncPlugin from "./main";
import { askString, scheduleTask } from "./utils";
import type { PluginDataExDisplay } from "../../features/CmdConfigSync";
import { Logger } from "../../lib/src/common/logger";
import { versionNumberString2Number } from "../../lib/src/string_and_binary/strbin";
import { type FilePath, LOG_LEVEL_NOTICE } from "../../lib/src/common/types";
import { getDocData } from "../../lib/src/common/utils";
import type ObsidianLiveSyncPlugin from "../../main";
import { askString, scheduleTask } from "../../common/utils";

export let list: PluginDataExDisplay[] = [];
export let thisTerm = "";
updates.md
@@ -18,6 +18,24 @@ I have a lot of respect for that plugin, even though it is sometimes treated as
Hooray for open source, and generous licences, and the sharing of knowledge by experts.

#### Version history
- 0.23.5:
  - New feature:
    - Now we can check for configuration mismatches between clients before synchronisation.
      - Default: enabled / Preferred: enabled / We can disable this with the `Do not check configuration mismatch before replication` toggle in the `Hatch` pane.
      - It detects configuration mismatches and prevents synchronisation failures and wasted storage.
    - Now we can perform remote database compaction from the `Maintenance` pane.
  - Fixed:
    - We can now detect when the bucket is not reachable.
  - Note:
    - Known issue: recently (possibly while enabling `Incubate chunks in Document` and `Fetch chunks on demand`, or some other toggles), customisation sync data sometimes gets corrupted. This will be addressed in the next release.
- 0.23.4
  - Fixed:
    - Experimental configuration is no longer shown in the Minimal Setup.
  - New feature:
    - We can now use `Incubate Chunks in Document` to reduce non-well-formed chunks.
      - Default: disabled / Preferred: enabled on all devices.
      - When this toggle is enabled, newly created chunks are temporarily kept within the document and graduate to become independent chunks once stabilised (see the sketch after this list).
      - The [design document](https://github.com/vrtmrz/obsidian-livesync/blob/3925052f9290b3579e45a4b716b3679c833d8ca0/docs/design_docs_of_keep_newborn_chunks.md) is also available.
- 0.23.3
  - Fixed: No longer unwanted `\f` in journal sync.
- 0.23.2
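
To make the `Incubate Chunks in Document` item above concrete, here is a rough sketch of the shape a note entry might take while incubation is enabled. It is illustrative only and based on the design document linked above; the field names (`eden`, `children`) and types are assumptions, not the plugin's actual schema.

```ts
// Illustrative only: a note entry that keeps newborn chunks inline ("in Eden") until they
// stabilise, while already-stable chunks remain referenced by their ids.
type EdenChunk = { data: string; epoch: number };

interface NoteEntryWithEden {
    _id: string;
    path: string;
    children: string[];                // ids of stable, independent chunk documents
    eden: Record<string, EdenChunk>;   // newborn chunks carried inside the document itself
    mtime: number;
}

// On each save, newly split chunks are placed into `eden` and travel with the document.
// Once a chunk outlives the configured count / size / age limits, it "graduates": it is
// written out as an independent chunk document and only its id remains in `children`.
```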