Compare commits

...

27 Commits

Author SHA1 Message Date
vorotamoroz
133f5a7109 bump 2024-04-30 11:49:16 +01:00
vorotamoroz
daa3feebf1 Fixed:
- Journal Sync no longer hangs up during large replications, especially the initial one.
- Changes replicated while rebuilding are no longer postponed (reverting to the previous behaviour).
Improved:
- Journal Sync is now efficient in both download-and-parse and pack-and-upload.
- The new chunk format uses less server storage and packs/unpacks faster.
2024-04-30 11:48:27 +01:00
vorotamoroz
7b5f7d0fbf bump 2024-04-30 01:40:01 +09:00
vorotamoroz
29532193cb - Fixed:
  - Journal synchronisation now tracks what has not yet been transferred, for both sending and receiving.
  - Journal sync now retries on failure.
  - Journal synchronisation no longer treats the synchronisation of chunks as revision updates (they are simply ignored).
  - Journal sync now splits the journal pack to prevent mobile devices from rebooting.
  - Maintenance menus that had been on the command palette are now back in the maintenance pane of the settings dialogue.
- Improved:
  - All changes replicated while rebuilding are now postponed.
2024-04-30 01:39:09 +09:00
vorotamoroz
5b4309c09d For the future. Because of a good opportunity. 2024-04-29 02:01:27 +09:00
vorotamoroz
16ef582453 Update: wrote about the new Remote Type. 2024-04-28 23:37:26 +09:00
vorotamoroz
3e22f70c7a Update README.md 2024-04-28 17:49:50 +09:00
vorotamoroz
0a8dbe097e bump 2024-04-27 03:35:32 +09:00
vorotamoroz
2c0fcf74d0 New feature: Object storage support 2024-04-27 03:33:59 +09:00
vorotamoroz
a1ab1efd5d Update README.md 2024-04-20 21:45:21 +09:00
vorotamoroz
c8fcf2d0d5 Bump 2024-04-19 12:06:09 +01:00
vorotamoroz
c384e2f7fb Fixed:
- Data is no longer corrupted by false BASE64 detections.
2024-04-19 12:04:14 +01:00
vorotamoroz
99c1c7dc1a bump 2024-04-18 12:37:49 +01:00
vorotamoroz
84adec4b1a New feature: Automatic data compression to reduce traffic and remote-database usage. 2024-04-18 12:30:29 +01:00
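For illustration, chunk compression of this kind can be sketched with `fflate`, which appears among this release's dependencies. This is only a sketch of the general technique, not the plugin's actual wire format; the function names here are hypothetical.

```typescript
import { gzipSync, gunzipSync, strToU8, strFromU8 } from "fflate";

// Illustrative only: compress a text chunk before uploading it to the
// remote database, and restore it when reading it back.
function packChunk(text: string): Uint8Array {
    return gzipSync(strToU8(text), { level: 6 });
}

function unpackChunk(bytes: Uint8Array): string {
    return strFromU8(gunzipSync(bytes));
}

const original = "# My note\nSome repetitive content...\n".repeat(100);
const packed = packChunk(original);
console.log(`compressed ${original.length} chars into ${packed.length} bytes`);
console.assert(unpackChunk(packed) === original);
```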
vorotamoroz
f0b202bd91 bump 2024-04-12 01:32:03 +09:00
vorotamoroz
d54b7e2d93 - Fixed:
  - Error handling during boot now works correctly.
  - Replication now starts automatically in LiveSync mode.
  - Batch database updates are now disabled in LiveSync mode.
  - No more automatic reconnection while the window is unfocused.
  - Status saves are now thinned out.
  - Self-hosted LiveSync now waits until all files are reliably checked between the local database and storage.
- Improved:
  - The job scheduler is now more robust and stable.
  - The status indicator no longer flickers or sticks at zero for a while.
  - No more meaningless, frequent updates of the status indicators.
  - Regular-expression filters can now be configured in a handy UI. Thank you so much, @eth-p!
  - `Fetch` and `Rebuild everything` are now performed more safely.
- Minor things
  - Some utility functions have been added.
  - Customisation sync now produces fewer incorrect messages.
  - Continued weeding out type errors.
2024-04-12 01:30:35 +09:00
vorotamoroz
6952ef37f5 Update quick_setup.md 2024-04-09 13:10:31 +09:00
vorotamoroz
9630bcbae8 bump 2024-03-22 10:50:03 +01:00
vorotamoroz
c3f925ab9a Merge branch 'main' of https://github.com/vrtmrz/obsidian-livesync 2024-03-22 10:48:25 +01:00
vorotamoroz
034dc0538f - Fixed:
  - Fixed an issue where binary files were sometimes corrupted.
  - Fixed an issue where customisation sync data could be corrupted.
- Improved:
  - The remote database now uses less memory.
    - This release requires a brief wait on the first synchronisation, to track the latest changeset again.
  - A description has been added for the `Device name` setting.
- Refactored:
  - Many type errors have been resolved.
  - An obsolete file has been deleted.
2024-03-22 10:48:16 +01:00
vorotamoroz
b6136df836 Update quick_setup.md 2024-03-22 14:27:34 +09:00
vorotamoroz
24aacdc2a1 bump 2024-03-22 04:07:17 +01:00
vorotamoroz
f91109b1ad - Improved:
  - Faster start-up, by removing the excessive logs that merely indicate normality.
  - Streamlined scanning of customisation sync has eliminated extra phases.
2024-03-22 04:07:07 +01:00
vorotamoroz
e76e7ae8ea bump 2024-03-19 17:59:38 +01:00
vorotamoroz
f7fbe85d65 - New feature:
  - The status bar can now be disabled in the settings dialogue.
- Improved:
  - Some files are now handled with the correct data type.
  - Customisation sync now uses a digest of each file for better performance.
  - The in-editor status display now performs better.
- Refactored:
  - Common functions are now in place and the codebase has been organised.
  - Stricter type checking, following TypeScript updates.
  - Removed an old iOS workaround for simplicity and performance.
2024-03-19 17:58:55 +01:00
vorotamoroz
0313443b29 Merge pull request #389 from Seeker0472/fix-command
Fixed docker-compose command in docs
2024-03-19 14:06:23 +09:00
seeker0472
755c30f468 fix docker-compose command 2024-03-17 14:30:35 +08:00
28 changed files with 4572 additions and 1358 deletions

View File

@@ -2,7 +2,7 @@
# Self-hosted LiveSync
[Japanese docs](./README_ja.md) - [Chinese docs](./README_cn.md).
Self-hosted LiveSync is a community-implemented synchronization plugin, available on every obsidian-compatible platform and using CouchDB as the server.
Self-hosted LiveSync is a community-implemented synchronization plugin, available on every obsidian-compatible platform and using CouchDB or Object Storage (e.g., MinIO, S3, R2, etc.) as the server.
![obsidian_live_sync_demo](https://user-images.githubusercontent.com/45774780/137355323-f57a8b09-abf2-4501-836c-8cb7d2ff24a3.gif)
@@ -45,7 +45,7 @@ This plug-in might be useful for researchers, engineers, and developers with a n
2. Configure plug-in in [Quick Setup](docs/quick_setup.md)
> [!TIP]
> We are still able to use IBM Cloudant. However, it is not recommended for several reasons nowadays. Here is [Setup IBM Cloudant](docs/setup_cloudant.md)
> Now, fly.io is no longer free. Fortunately, even though there are some issues, we can still use IBM Cloudant. Here is [Setup IBM Cloudant](docs/setup_cloudant.md). It will be updated soon!
## Information in StatusBar

View File

@@ -0,0 +1,50 @@
## The design document of the journal sync
Original title: Synchronise without CouchDB
### Goal
- Synchronise vaults without CouchDB
### Motivation
- Serving CouchDB is not particularly easy.
- A full-spec DBaaS (paid IBM Cloudant) is a bit expensive, and alternatives are lacking.
- Securing alternatives beyond a single protocol.
### Prerequisite
- We should have multiple implementations of the server software.
- We should also be able to use SaaS, with a choice of options.
- They should come at a reasonable cost, ideally free of charge for trials.
- We should be able to serve an instance of the server software ourselves, as OSS: with transparency, auditability, and evidence that audits actually take place.
### Methods and implementations
Ordinarily, the local PouchDB and the remote CouchDB are synchronised by sending each missing document through several exchanges of their replication protocol. However, to achieve this plan, we cannot rely on CouchDB and its protocol. This limitation is harsh, but overcoming it means gaining new possibilities. After some trials, it was concluded that synchronisation could be completed even if the available actions were limited to uploading, downloading, and listing. This means we can use any old-fashioned WebDAV server, or sophisticated object storage such as self-hosted MinIO, S3, R2, or anything else we like. This is realised by each client sharing and complementing the differences of the journal. The focus is therefore on how to identify which entries are the differences, and how to send them without dynamic communication.
All clients manage their data in PouchDB which, as is probably well known, keeps its own journal.
First, each client records the point in the journal up to which it last sent. When sending, the client packs everything from that previous point up to the latest entry, then updates its record. This pack is uploaded to the server under a name starting with the timestamp of its creation. This is the send operation.
Conversely, when receiving, the packs on the server that have not yet been received are fetched in order. This is easy, as their names sort in date order. When the process completes successfully, the names of the received files are recorded. The journal entries from each pack are then reflected into the client's own database. Conflict resolution is left to PouchDB, so the client only needs to apply the differences. And here is the key: the client records the ID and revision of every document that arrived in the journal and was applied.
This key comes into play when creating a pack. At that point, the client omits any document recorded as received and applied, because "received and applied" means it was already sent by another client and therefore already exists on the server. This ensures that unnecessary transmissions do not take place.
Synchronisation therefore always starts with receiving. This is a little trick to avoid including unnecessary documents in the pack.
These behaviours allow clients to voluntarily send and receive only the missing parts of the journal, without having to communicate with each other directly, while still keeping a single, consistent journal on the server.
Source code implementing this has already been committed to the repository; a simplified sketch follows.
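A minimal sketch of this behaviour in TypeScript, where every name (`RemoteStorage`, `Checkpoint`, the callbacks) is hypothetical, and pack splitting and retrying are glossed over; the committed implementation differs in detail:

```typescript
// Journal sync over "dumb" storage that only supports upload, download, and list.

type JournalEntry = { id: string; rev: string; doc: unknown };
type Pack = { name: string; entries: JournalEntry[] };

interface RemoteStorage {
    list(): Promise<string[]>;          // pack names; timestamps sort chronologically
    download(name: string): Promise<Pack>;
    upload(pack: Pack): Promise<void>;
}

interface Checkpoint {
    lastSentSeq: number;                // how far our own journal has been packed
    receivedPacks: Set<string>;         // pack names already applied
    appliedDocs: Set<string>;           // `${id}:${rev}` applied from other clients
}

async function receive(
    remote: RemoteStorage,
    cp: Checkpoint,
    applyToLocalDb: (e: JournalEntry) => Promise<void>,
): Promise<void> {
    const names = (await remote.list()).filter((n) => !cp.receivedPacks.has(n)).sort();
    for (const name of names) {
        const pack = await remote.download(name);
        for (const entry of pack.entries) {
            await applyToLocalDb(entry); // conflict resolution is left to PouchDB
            cp.appliedDocs.add(`${entry.id}:${entry.rev}`); // the key: remember arrivals
        }
        cp.receivedPacks.add(name);
    }
}

async function send(
    remote: RemoteStorage,
    cp: Checkpoint,
    readJournalSince: (seq: number) => Promise<{ seq: number; entries: JournalEntry[] }>,
): Promise<void> {
    const { seq, entries } = await readJournalSince(cp.lastSentSeq);
    // Omit entries received from others: they already exist on the server.
    const fresh = entries.filter((e) => !cp.appliedDocs.has(`${e.id}:${e.rev}`));
    if (fresh.length > 0) {
        await remote.upload({ name: `${Date.now()}.pack`, entries: fresh });
    }
    cp.lastSentSeq = seq;
}

// Always start with receiving, so the next pack never repeats
// documents that are already on the server.
export async function synchronise(
    remote: RemoteStorage,
    cp: Checkpoint,
    applyToLocalDb: (e: JournalEntry) => Promise<void>,
    readJournalSince: (seq: number) => Promise<{ seq: number; entries: JournalEntry[] }>,
): Promise<void> {
    await receive(remote, cp, applyToLocalDb);
    await send(remote, cp, readJournalSince);
}
```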
### Test strategy
This implementation replaces the synchronisation performed by CouchDB. Therefore, it was tested simply by applying the same changes to the same vault and comparing the result replicated via CouchDB with the result produced by this implementation.
### Documentation strategy
- Documentation should be provided in the quick setup, at least.
- As several server implementations can be selected, descriptions of specific configuration values are omitted.
- A MinIO set-up might be nice to have. However, it is not considered essential.
- It would be a good opportunity to also publish these design documents.
### Consideration and Conclusion
This design offers a novel approach to journal synchronisation without relying on CouchDB. It leverages PouchDB's journaling capabilities and simple server-side storage for efficient data exchange. Hence, the new design can be said to have gained a broader outlook.

View File

@@ -16,7 +16,8 @@ There are three methods to set up Self-hosted LiveSync.
### 1. Using setup URIs
> [!TIP] What is the setup URI? Why is it required?
> [!TIP]
> What is the setup URI? Why is it required?
> The setup URI is the encrypted representation of the Self-hosted LiveSync configuration as a URI. It starts with `obsidian://setuplivesync?settings=`. It is encrypted with a passphrase, so it can be shared relatively securely between devices. It is a bit long, but it is a single line. This allows a whole series of settings to be applied at once, without any inconsistencies.
>
> If you have configured the remote database by [Automated setup on Fly.io](./setup_flyio.md#a-very-automated-setup) or [set up your server with the tool](./setup_own_server.md#1-generate-the-setup-uri-on-a-desktop-device-or-server), **you should already have one**.
@@ -44,19 +45,38 @@ If you do not have any setup URI, Press the `start` button. The setting dialogue
![](../images/quick_setup_2.png)
#### Remote database configuration
1. Enter the information for the database we have set up.
#### Select the remote type
1. Select the Remote Type from the dropdown list.
We now have a choice between CouchDB (and its compatibles) and Object Storage (MinIO, S3, R2). CouchDB is the first choice and is also the recommended one; Object Storage support is an experimental feature.
#### Remote configuration
##### CouchDB
Enter the information for the database we have set up.
![](../images/quick_setup_3.png)
##### Object Storage
#### Test database connection and Check database configuration
1. Enter the information for the S3 API and bucket.
![](../images/quick_setup_3b.png)
Note 1: if you use S3, you can leave the Endpoint URL empty.
Note 2: if the CORS settings of your Object Storage cannot be configured fully, you may be able to connect to the server by enabling the `Use Custom HTTP Handler` toggle.
2. Press `Test` under `Test Connection` once and make sure that you can connect to the Object Storage.
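If your storage exposes the S3 API, the bucket's CORS rules can usually be set programmatically. Below is a minimal sketch using `@aws-sdk/client-s3` (a dependency of this plugin as of this release); the endpoint, bucket name, and credentials are placeholders, and the permissive origins are an assumption to tighten for your environment.

```typescript
import { S3Client, PutBucketCorsCommand } from "@aws-sdk/client-s3";

// Placeholder values: replace the endpoint, bucket, and credentials with your own.
const client = new S3Client({
    region: "us-east-1",
    endpoint: "https://minio.example.com", // omit for plain Amazon S3
    credentials: { accessKeyId: "ACCESS_KEY", secretAccessKey: "SECRET_KEY" },
    forcePathStyle: true, // path-style addressing is typical for MinIO
});

// Allow the client application to read and write the bucket cross-origin.
await client.send(new PutBucketCorsCommand({
    Bucket: "obsidian-livesync",
    CORSConfiguration: {
        CORSRules: [{
            AllowedOrigins: ["*"], // restrict to your app's origin if possible
            AllowedMethods: ["GET", "PUT", "POST", "DELETE", "HEAD"],
            AllowedHeaders: ["*"],
            ExposeHeaders: ["ETag"],
        }],
    },
}));
```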
#### Only CouchDB: Test database connection and Check database configuration
We can check the connectivity to the database, and the database settings.
![](../images/quick_setup_5.png)
#### Check and Fix database configuration
#### Only CouchDB: Check and Fix database configuration
Check the database settings and fix any problems on the spot.
@@ -81,6 +101,8 @@ We should proceed to the Next step.
#### Sync Settings
Finally, finish the wizard by selecting a preset for synchronisation.
Note: If you are going to use Object Storage, you cannot select `LiveSync`.
![](../images/quick_setup_9_1.png)
Select any synchronisation methods we want to use and `Apply`. If database initialisation is required, it will be performed at this time. When `All done!` is displayed, we are ready to synchronise.
@@ -104,4 +126,4 @@ And, please copy the setup URI by `Copy current settings as a new setup URI` and
## At the subsequent device
After installing Self-hosted LiveSync on the first device, we should have a setup URI. **The first choice is to use it**. Please share it with the device you want to set up.
It is exactly the same as [Using setup URIs on the first device](#1-using-setup-uris). Please refer to it.

View File

@@ -91,7 +91,7 @@ services:
Finally, create and start the container:
```
# -d will launch detached so the container runs in background
docker compose up -d
docker-compose up -d
```
## Create the database

BIN images/quick_setup_3b.png: new file, 74 KiB (binary file not shown)

View File

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.22.13",
"version": "0.23.2",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

package-lock.json (generated): 2748 lines changed. File diff suppressed because it is too large.

View File

@@ -1,6 +1,6 @@
{
"name": "obsidian-livesync",
"version": "0.22.13",
"version": "0.23.2",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"type": "module",
@@ -54,7 +54,12 @@
"typescript": "^5.4.2"
},
"dependencies": {
"@aws-sdk/client-s3": "^3.556.0",
"@smithy/fetch-http-handler": "^2.5.0",
"@smithy/protocol-http": "^3.3.0",
"@smithy/querystring-builder": "^2.2.0",
"diff-match-patch": "^1.0.5",
"fflate": "^0.8.2",
"idb": "^8.0.0",
"minimatch": "^9.0.3",
"xxhash-wasm": "0.4.2",

View File

@@ -4,10 +4,9 @@ import { Notice, type PluginManifest, parseYaml, normalizePath, type ListedFiles
import type { EntryDoc, LoadedEntry, InternalFileEntry, FilePathWithPrefix, FilePath, DocumentID, AnyEntry, SavingEntry } from "./lib/src/types";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE } from "./lib/src/types";
import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
import { createTextBlob, delay, getDocData, isDocContentSame, sendSignal, waitForSignal } from "./lib/src/utils";
import { createSavingEntryFromLoadedEntry, createTextBlob, delay, fireAndForget, getDocData, isDocContentSame, throttle } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { WrappedNotice } from "./lib/src/wrapper";
import { readString, decodeBinary, arrayBufferToBase64, sha1 } from "./lib/src/strbin";
import { readString, decodeBinary, arrayBufferToBase64, digestHash } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { stripAllPrefixes } from "./lib/src/path";
@@ -31,7 +30,8 @@ function serialize(data: PluginDataEx): string {
ret += data.mtime + d2;
for (const file of data.files) {
ret += file.filename + d + (file.displayName ?? "") + d + (file.version ?? "") + d2;
ret += file.mtime + d + file.size + d2;
const hash = digestHash((file.data ?? []).join());
ret += file.mtime + d + file.size + d + hash + d2;
for (const data of file.data ?? []) {
ret += data + d
}
@@ -95,6 +95,7 @@ function deserialize2(str: string): PluginDataEx {
tokens.nextLine();
const mtime = Number(tokens.next());
const size = Number(tokens.next());
const hash = tokens.next();
tokens.nextLine();
const data = [] as string[];
let piece = "";
@@ -110,7 +111,8 @@ function deserialize2(str: string): PluginDataEx {
version,
mtime,
size,
data
data,
hash
}
)
tokens.nextLine();
@@ -137,10 +139,11 @@ export const pluginIsEnumerating = writable(false);
export type PluginDataExFile = {
filename: string,
data?: string[],
data: string[],
mtime: number,
size: number,
version?: string,
hash?: string,
displayName?: string,
}
export type PluginDataExDisplay = {
@@ -169,17 +172,16 @@ export class ConfigSync extends LiveSyncCommands {
pluginScanningCount.onChanged((e) => {
const total = e.value;
pluginIsEnumerating.set(total != 0);
if (total == 0) {
Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
}
// if (total == 0) {
// Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
// }
})
}
confirmPopup: WrappedNotice = null;
get kvDB() {
return this.plugin.kvDB;
}
pluginDialog: PluginDialogModal = null;
pluginDialog?: PluginDialogModal = undefined;
periodicPluginSweepProcessor = new PeriodicProcessor(this.plugin, async () => await this.scanAllConfigFiles(false));
pluginList: PluginDataExDisplay[] = [];
@@ -187,7 +189,7 @@ export class ConfigSync extends LiveSyncCommands {
if (!this.settings.usePluginSync) {
return;
}
if (this.pluginDialog != null) {
if (this.pluginDialog) {
this.pluginDialog.open();
} else {
this.pluginDialog = new PluginDialogModal(this.app, this.plugin);
@@ -198,7 +200,7 @@ export class ConfigSync extends LiveSyncCommands {
hidePluginSyncModal() {
if (this.pluginDialog != null) {
this.pluginDialog.close();
this.pluginDialog = null;
this.pluginDialog = undefined;
}
}
onunload() {
@@ -273,16 +275,28 @@ export class ConfigSync extends LiveSyncCommands {
await this.updatePluginList(showMessage);
}
async loadPluginData(path: FilePathWithPrefix): Promise<PluginDataExDisplay | false> {
const wx = await this.localDatabase.getDBEntry(path, null, false, false);
const wx = await this.localDatabase.getDBEntry(path, undefined, false, false);
if (wx) {
const data = deserialize(getDocData(wx.data), {}) as PluginDataEx;
const xFiles = [] as PluginDataExFile[];
let missingHash = false;
for (const file of data.files) {
const work = { ...file };
const tempStr = getDocData(work.data);
work.data = [await sha1(tempStr)];
const work = { ...file, data: [] as string[] };
if (!file.hash) {
// debugger;
const tempStr = getDocData(work.data);
const hash = digestHash(tempStr);
file.hash = hash;
missingHash = true;
}
work.data = [file.hash];
xFiles.push(work);
}
if (missingHash) {
Logger(`Digest created for ${path} to improve checking`, LOG_LEVEL_VERBOSE);
wx.data = serialize(data);
fireAndForget(() => this.localDatabase.putDBEntry(createSavingEntryFromLoadedEntry(wx)));
}
return ({
...data,
documentPath: this.getPath(wx),
@@ -291,7 +305,8 @@ export class ConfigSync extends LiveSyncCommands {
}
return false;
}
createMissingConfigurationEntry() {
createMissingConfigurationEntry = throttle(() => this._createMissingConfigurationEntry(), 1000);
_createMissingConfigurationEntry() {
let saveRequired = false;
for (const v of this.pluginList) {
const key = `${v.category}/${v.name}`;
@@ -317,42 +332,27 @@ export class ConfigSync extends LiveSyncCommands {
const plugin = v[0];
const path = plugin.path || this.getPath(plugin);
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) return;
if (oldEntry && oldEntry.mtime == plugin.mtime) return [];
try {
const pluginData = await this.loadPluginData(path);
if (pluginData) {
return [pluginData];
let newList = [...this.pluginList];
newList = newList.filter(x => x.documentPath != pluginData.documentPath);
newList.push(pluginData);
this.pluginList = newList;
pluginList.set(newList);
}
// Failed to load
return;
return [];
} catch (ex) {
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
return;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 300, yieldThreshold: 10 }).pipeTo(
new QueueProcessor(
async (pluginDataList) => {
// Concurrency is two, therefore, we can unlock the previous awaiting.
sendSignal("plugin-next-load");
let newList = [...this.pluginList];
for (const item of pluginDataList) {
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
}
this.pluginList = newList;
pluginList.set(newList);
if (pluginDataList.length != 10) {
// If the queue is going to be empty, await subsequent for a while.
await waitForSignal("plugin-next-load", 1000);
}
return;
}
, { suspended: true, batchSize: 10, concurrentLimit: 2, delay: 250, yieldThreshold: 25, totalRemainingReactiveSource: pluginScanningCount })).startPipeline().root.onIdle(() => {
Logger(`All files enumerated`, LOG_LEVEL_INFO, "get-plugins");
this.createMissingConfigurationEntry();
});
return [];
}, { suspended: false, batchSize: 1, concurrentLimit: 10, delay: 100, yieldThreshold: 10, maintainDelay: false, totalRemainingReactiveSource: pluginScanningCount }).startPipeline().root.onUpdateProgress(() => {
this.createMissingConfigurationEntry();
});
async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
@@ -398,7 +398,7 @@ export class ConfigSync extends LiveSyncCommands {
showJSONMergeDialogAndMerge(docA: LoadedEntry, docB: LoadedEntry, pluginDataA: PluginDataEx, pluginDataB: PluginDataEx): Promise<boolean> {
const fileA = { ...pluginDataA.files[0], ctime: pluginDataA.files[0].mtime, _id: `${pluginDataA.documentPath}` as DocumentID };
const fileB = pluginDataB.files[0];
const docAx = { ...docA, ...fileA } as LoadedEntry, docBx = { ...docB, ...fileB } as LoadedEntry
const docAx = { ...docA, ...fileA, datatype: "newnote" } as LoadedEntry, docBx = { ...docB, ...fileB, datatype: "newnote" } as LoadedEntry
return serialized("config:merge-data", () => new Promise((res) => {
Logger("Opening data-merging dialog", LOG_LEVEL_VERBOSE);
// const docs = [docA, docB];
@@ -502,9 +502,9 @@ export class ConfigSync extends LiveSyncCommands {
if (this.plugin.settings.usePluginSync && this.plugin.settings.notifyPluginOrSettingUpdated) {
if (!this.pluginDialog || (this.pluginDialog && !this.pluginDialog.isOpened())) {
const fragment = createFragment((doc) => {
doc.createEl("span", null, (a) => {
doc.createEl("span", undefined, (a) => {
a.appendText(`Some configuration has been arrived, Press `);
a.appendChild(a.createEl("a", null, (anchor) => {
a.appendChild(a.createEl("a", undefined, (anchor) => {
anchor.text = "HERE";
anchor.addEventListener("click", () => {
this.showPluginSyncModal();
@@ -670,7 +670,7 @@ export class ConfigSync extends LiveSyncCommands {
const content = createTextBlob(serialize(dt));
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false);
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, false);
let saveData: SavingEntry;
if (old === false) {
saveData = {
@@ -694,7 +694,7 @@ export class ConfigSync extends LiveSyncCommands {
if (oldC) {
const d = await deserialize(getDocData(oldC.data), {}) as PluginDataEx;
const diffs = (d.files.map(previous => ({ prev: previous, curr: dt.files.find(e => e.filename == previous.filename) })).map(async e => {
try { return await isDocContentSame(e.curr.data, e.prev.data) } catch (_) { return false }
try { return await isDocContentSame(e.curr?.data ?? [], e.prev.data) } catch (_) { return false }
}))
const isSame = (await Promise.all(diffs)).every(e => e == true);
if (isSame) {
@@ -767,7 +767,11 @@ export class ConfigSync extends LiveSyncCommands {
const filesOnDB = ((await this.localDatabase.allDocsRaw({ startkey: ICXHeader + "", endkey: `${ICXHeader}\u{10ffff}`, include_docs: true })).rows.map(e => e.doc) as InternalFileEntry[]).filter(e => !e.deleted);
let deleteCandidate = filesOnDB.map(e => this.getPath(e)).filter(e => e.startsWith(`${ICXHeader}${term}/`));
for (const vp of virtualPathsOfLocalFiles) {
const p = files.find(e => e.key == vp).file;
const p = files.find(e => e.key == vp)?.file;
if (!p) {
Logger(`scanAllConfigFiles - File not found: ${vp}`, LOG_LEVEL_VERBOSE);
continue;
}
await this.storeCustomizationFiles(p);
deleteCandidate = deleteCandidate.filter(e => e != vp);
}
@@ -782,10 +786,11 @@ export class ConfigSync extends LiveSyncCommands {
const mtime = new Date().getTime();
await serialized("file-x-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, false) as InternalFileEntry | false;
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, false) as InternalFileEntry | false;
let saveData: InternalFileEntry;
if (old === false) {
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) already deleted (Not found on database)`);
return;
} else {
if (old.deleted) {
Logger(`STORAGE -x> DB:${prefixedFileName}: (config) already deleted`);

View File

@@ -1,22 +1,20 @@
import { normalizePath, type PluginManifest, type ListedFiles } from "./deps";
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry, type DocumentID } from "./lib/src/types";
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
import { createBinaryBlob, isDocContentSame, sendSignal } from "./lib/src/utils";
import { readAsBlob, isDocContentSame, sendSignal, readContent, createBlob } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { isInternalMetadata, PeriodicProcessor } from "./utils";
import { WrappedNotice } from "./lib/src/wrapper";
import { decodeBinary, encodeBinary } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { JsonResolveModal } from "./JsonResolveModal";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { addPrefix, stripAllPrefixes } from "./lib/src/path";
import { KeyedQueueProcessor, QueueProcessor } from "./lib/src/processor";
import { QueueProcessor } from "./lib/src/processor";
import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "./lib/src/stores";
export class HiddenFileSync extends LiveSyncCommands {
periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
confirmPopup: WrappedNotice = null;
get kvDB() {
return this.plugin.kvDB;
}
@@ -67,23 +65,23 @@ export class HiddenFileSync extends LiveSyncCommands {
realizeSettingSyncMode(): Promise<void> {
this.periodicInternalFileScanProcessor?.disable();
if (this.plugin.suspended)
return;
return Promise.resolve();
if (!this.plugin.isReady)
return;
return Promise.resolve();
this.periodicInternalFileScanProcessor.enable(this.settings.syncInternalFiles && this.settings.syncInternalFilesInterval ? (this.settings.syncInternalFilesInterval * 1000) : 0);
return;
return Promise.resolve();
}
procInternalFile(filename: string) {
this.internalFileProcessor.enqueueWithKey(filename, filename);
this.internalFileProcessor.enqueue(filename);
}
internalFileProcessor = new KeyedQueueProcessor<string, any>(
internalFileProcessor = new QueueProcessor<string, any>(
async (filenames) => {
Logger(`START :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
await this.syncInternalFilesAndDatabase("pull", false, false, filenames);
Logger(`DONE :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
return;
}, { batchSize: 100, concurrentLimit: 1, delay: 10, yieldThreshold: 10, suspended: false, totalRemainingReactiveSource: hiddenFilesEventCount }
}, { batchSize: 100, concurrentLimit: 1, delay: 10, yieldThreshold: 100, suspended: false, totalRemainingReactiveSource: hiddenFilesEventCount }
);
recentProcessedInternalFiles = [] as string[];
@@ -125,7 +123,7 @@ export class HiddenFileSync extends LiveSyncCommands {
if (storageMTime == 0) {
await this.deleteInternalFileOnDatabase(path);
} else {
await this.storeInternalFileToDatabase({ path: path, ...stat });
await this.storeInternalFileToDatabase({ path: path, mtime, ctime: stat?.ctime ?? mtime, size: stat?.size ?? 0 });
}
}
@@ -162,7 +160,7 @@ export class HiddenFileSync extends LiveSyncCommands {
await this.localDatabase.removeRevision(id, delRev);
Logger(`Older one has been deleted:${path}`);
const cc = await this.localDatabase.getRaw(id, { conflicts: true });
if (cc._conflicts.length == 0) {
if (cc._conflicts?.length === 0) {
await this.extractInternalFileFromDatabase(stripAllPrefixes(path))
} else {
this.conflictResolutionProcessor.enqueue(path);
@@ -177,11 +175,12 @@ export class HiddenFileSync extends LiveSyncCommands {
// Retrieve data
const id = await this.path2id(path, ICHeader);
const doc = await this.localDatabase.getRaw(id, { conflicts: true });
// If there is no conflict, return with false.
if (!("_conflicts" in doc))
return;
// if (!("_conflicts" in doc)){
// return [];
// }
if (doc._conflicts === undefined) return [];
if (doc._conflicts.length == 0)
return;
return [];
Logger(`Hidden file conflicted:${path}`);
const conflicts = doc._conflicts.sort((a, b) => Number(a.split("-")[0]) - Number(b.split("-")[0]));
const revA = doc._rev;
@@ -192,7 +191,7 @@ export class HiddenFileSync extends LiveSyncCommands {
const conflictedRevNo = Number(conflictedRev.split("-")[0]);
//Search
const revFrom = (await this.localDatabase.getRaw<EntryDoc>(id, { revs_info: true }));
const commonBase = revFrom._revs_info.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
const commonBase = revFrom._revs_info?.filter(e => e.status == "available" && Number(e.rev.split("-")[0]) < conflictedRevNo).first()?.rev ?? "";
const result = await this.plugin.mergeObject(path, commonBase, doc._rev, conflictedRev);
if (result) {
Logger(`Object merge:${path}`, LOG_LEVEL_INFO);
@@ -203,11 +202,14 @@ export class HiddenFileSync extends LiveSyncCommands {
}
await this.plugin.vaultAccess.adapterWrite(filename, result);
const stat = await this.vaultAccess.adapterStat(filename);
if (!stat) {
throw new Error(`conflictResolutionProcessor: Failed to stat file ${filename}`);
}
await this.storeInternalFileToDatabase({ path: filename, ...stat });
await this.extractInternalFileFromDatabase(filename);
await this.localDatabase.removeRevision(id, revB);
this.conflictResolutionProcessor.enqueue(path);
return;
return [];
} else {
Logger(`Object merge is not applicable.`, LOG_LEVEL_VERBOSE);
}
@@ -215,11 +217,11 @@ export class HiddenFileSync extends LiveSyncCommands {
}
// When not JSON file, resolve conflicts by choosing a newer one.
await this.resolveByNewerEntry(id, path, doc, revA, revB);
return;
return [];
} catch (ex) {
Logger(`Failed to resolve conflict (Hidden): ${path}`);
Logger(ex, LOG_LEVEL_VERBOSE);
return;
return [];
}
}, {
suspended: false, batchSize: 1, concurrentLimit: 5, delay: 10, keepResultUntilDownstreamConnected: true, yieldThreshold: 10,
@@ -312,11 +314,11 @@ export class HiddenFileSync extends LiveSyncCommands {
if (processed % 100 == 0) {
Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
}
if (!filename) return;
if (!filename) return [];
if (ignorePatterns.some(e => filename.match(e)))
return;
return [];
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
return;
return [];
}
const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
@@ -403,7 +405,7 @@ export class HiddenFileSync extends LiveSyncCommands {
const enabledPlugins = this.app.plugins.enabledPlugins as Set<string>;
const enabledPluginManifests = manifests.filter(e => enabledPlugins.has(e.id));
for (const manifest of enabledPluginManifests) {
if (manifest.dir in updatedFolders) {
if (manifest.dir && manifest.dir in updatedFolders) {
// If notified about plug-ins, reloading Obsidian may not be necessary.
updatedCount -= updatedFolders[manifest.dir];
const updatePluginId = manifest.id;
@@ -451,19 +453,11 @@ export class HiddenFileSync extends LiveSyncCommands {
const id = await this.path2id(file.path, ICHeader);
const prefixedFileName = addPrefix(file.path, ICHeader);
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(file.path);
let content: Blob;
try {
content = createBinaryBlob(contentBin);
} catch (ex) {
Logger(`The file ${file.path} could not be encoded`);
Logger(ex, LOG_LEVEL_VERBOSE);
return false;
}
const content = createBlob(await this.plugin.vaultAccess.adapterReadAuto(file.path));
const mtime = file.mtime;
return await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntry(prefixedFileName, null, false, false);
const old = await this.localDatabase.getDBEntry(prefixedFileName, undefined, false, false);
let saveData: SavingEntry;
if (old === false) {
saveData = {
@@ -479,7 +473,7 @@ export class HiddenFileSync extends LiveSyncCommands {
type: "newnote",
};
} else {
if (await isDocContentSame(createBinaryBlob(decodeBinary(old.data)), content) && !forceWrite) {
if (await isDocContentSame(readAsBlob(old), content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return;
}
@@ -489,10 +483,10 @@ export class HiddenFileSync extends LiveSyncCommands {
data: content,
mtime,
size: file.size,
datatype: "newnote",
datatype: old.datatype,
children: [],
deleted: false,
type: "newnote",
type: old.datatype,
};
}
const ret = await this.localDatabase.putDBEntry(saveData);
@@ -515,7 +509,7 @@ export class HiddenFileSync extends LiveSyncCommands {
}
await serialized("file-" + prefixedFileName, async () => {
try {
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, null, true) as InternalFileEntry | false;
const old = await this.localDatabase.getDBEntryMeta(prefixedFileName, undefined, true) as InternalFileEntry | false;
let saveData: InternalFileEntry;
if (old === false) {
saveData = {
@@ -531,7 +525,7 @@ export class HiddenFileSync extends LiveSyncCommands {
} else {
// Remove all conflicted before deleting.
const conflicts = await this.localDatabase.getRaw(old._id, { conflicts: true });
if ("_conflicts" in conflicts) {
if (conflicts._conflicts !== undefined) {
for (const conflictRev of conflicts._conflicts) {
await this.localDatabase.removeRevision(old._id, conflictRev);
Logger(`STORAGE -x> DB:${filename}: (hidden) conflict removed ${old._rev} => ${conflictRev}`, LOG_LEVEL_VERBOSE);
@@ -581,7 +575,7 @@ export class HiddenFileSync extends LiveSyncCommands {
const deleted = fileOnDB.deleted || fileOnDB._deleted || false;
if (deleted) {
if (!isExists) {
Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
Logger(`STORAGE <x- DB:${filename}: deleted (hidden) Deleted on DB, but the file is already not found on storage.`);
} else {
Logger(`STORAGE <x- DB:${filename}: deleted (hidden).`);
await this.plugin.vaultAccess.adapterRemove(filename);
@@ -597,7 +591,7 @@ export class HiddenFileSync extends LiveSyncCommands {
}
if (!isExists) {
await this.vaultAccess.ensureDirectory(filename);
await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.plugin.vaultAccess.adapterWrite(filename, readContent(fileOnDB), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await this.app.vault.adapter.reconcileInternalFile(filename);
@@ -608,13 +602,13 @@ export class HiddenFileSync extends LiveSyncCommands {
Logger(`STORAGE <-- DB:${filename}: written (hidden,new${force ? ", force" : ""})`);
return true;
} else {
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(filename);
const content = await encodeBinary(contentBin);
if (await isDocContentSame(content, fileOnDB.data) && !force) {
const content = await this.plugin.vaultAccess.adapterReadAuto(filename);
const docContent = readContent(fileOnDB);
if (await isDocContentSame(content, docContent) && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
await this.plugin.vaultAccess.adapterWrite(filename, decodeBinary(fileOnDB.data), { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
await this.plugin.vaultAccess.adapterWrite(filename, docContent, { mtime: fileOnDB.mtime, ctime: fileOnDB.ctime });
try {
//@ts-ignore internalAPI
await this.app.vault.adapter.reconcileInternalFile(filename);
@@ -669,7 +663,11 @@ export class HiddenFileSync extends LiveSyncCommands {
}
await this.plugin.vaultAccess.adapterWrite(filename, result);
const stat = await this.plugin.vaultAccess.adapterStat(filename);
await this.storeInternalFileToDatabase({ path: filename, ...stat }, true);
if (!stat) {
throw new Error("Stat failed");
}
const mtime = stat?.mtime ?? 0;
await this.storeInternalFileToDatabase({ path: filename, mtime, ctime: stat?.ctime ?? mtime, size: stat?.size ?? 0 }, true);
try {
//@ts-ignore internalAPI
await this.app.vault.adapter.reconcileInternalFile(filename);
@@ -703,7 +701,7 @@ export class HiddenFileSync extends LiveSyncCommands {
const root = this.app.vault.getRoot();
const findRoot = root.path;
const filenames = (await this.getFiles(findRoot, [], null, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
const filenames = (await this.getFiles(findRoot, [], undefined, ignoreFilter)).filter(e => e.startsWith(".")).filter(e => !e.startsWith(".trash"));
const files = filenames.filter(path => synchronisedInConfigSync.every(filterFile => !path.toLowerCase().startsWith(filterFile))).map(async (e) => {
return {
path: e as FilePath,
@@ -716,9 +714,12 @@ export class HiddenFileSync extends LiveSyncCommands {
if (await this.plugin.isIgnoredByIgnoreFiles(w.path)) {
continue
}
const mtime = w.stat?.mtime ?? 0
const ctime = w.stat?.ctime ?? mtime;
const size = w.stat?.size ?? 0;
result.push({
...w,
...w.stat
mtime, ctime, size
});
}
return result;
@@ -729,8 +730,8 @@ export class HiddenFileSync extends LiveSyncCommands {
async getFiles(
path: string,
ignoreList: string[],
filter: RegExp[],
ignoreFilter: RegExp[]
filter?: RegExp[],
ignoreFilter?: RegExp[]
) {
let w: ListedFiles;
try {

View File

@@ -1,314 +0,0 @@
import { normalizePath, type PluginManifest } from "./deps";
import type { DocumentID, EntryDoc, FilePathWithPrefix, LoadedEntry, SavingEntry } from "./lib/src/types";
import { LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
import { type PluginDataEntry, PERIODIC_PLUGIN_SWEEP, type PluginList, type DevicePluginList, PSCHeader, PSCHeaderEnd } from "./types";
import { createTextBlob, getDocData, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { isPluginMetadata, PeriodicProcessor } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { NewNotice } from "./lib/src/wrapper";
import { versionNumberString2Number } from "./lib/src/strbin";
import { serialized, skipIfDuplicated } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
export class PluginAndTheirSettings extends LiveSyncCommands {
get deviceAndVaultName() {
return this.plugin.deviceAndVaultName;
}
pluginDialog: PluginDialogModal = null;
periodicPluginSweepProcessor = new PeriodicProcessor(this.plugin, async () => await this.sweepPlugin(false));
showPluginSyncModal() {
if (this.pluginDialog != null) {
this.pluginDialog.open();
} else {
this.pluginDialog = new PluginDialogModal(this.app, this.plugin);
this.pluginDialog.open();
}
}
hidePluginSyncModal() {
if (this.pluginDialog != null) {
this.pluginDialog.close();
this.pluginDialog = null;
}
}
onload(): void | Promise<void> {
this.plugin.addCommand({
id: "livesync-plugin-dialog",
name: "Show Plugins and their settings",
callback: () => {
this.showPluginSyncModal();
},
});
this.showPluginSyncModal();
}
onunload() {
this.hidePluginSyncModal();
this.periodicPluginSweepProcessor?.disable();
}
parseReplicationResultItem(doc: PouchDB.Core.ExistingDocument<EntryDoc>) {
if (isPluginMetadata(doc._id)) {
if (this.settings.notifyPluginOrSettingUpdated) {
this.triggerCheckPluginUpdate();
return true;
}
}
return false;
}
async beforeReplicate(showMessage: boolean) {
if (this.settings.autoSweepPlugins) {
await this.sweepPlugin(showMessage);
}
}
async onResume() {
if (this.plugin.suspended)
return;
if (this.settings.autoSweepPlugins) {
await this.sweepPlugin(false);
}
this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
}
async onInitializeDatabase(showNotice: boolean) {
if (this.settings.usePluginSync) {
try {
Logger("Scanning plugins...");
await this.sweepPlugin(showNotice);
Logger("Scanning plugins done");
} catch (ex) {
Logger("Scanning plugins failed");
Logger(ex, LOG_LEVEL_VERBOSE);
}
}
}
async realizeSettingSyncMode() {
this.periodicPluginSweepProcessor?.disable();
if (this.plugin.suspended)
return;
if (this.settings.autoSweepPlugins) {
await this.sweepPlugin(false);
}
this.periodicPluginSweepProcessor.enable(this.settings.autoSweepPluginsPeriodic && !this.settings.watchInternalFileChanges ? (PERIODIC_PLUGIN_SWEEP * 1000) : 0);
}
triggerCheckPluginUpdate() {
(async () => await this.checkPluginUpdate())();
}
async getPluginList(): Promise<{ plugins: PluginList; allPlugins: DevicePluginList; thisDevicePlugins: DevicePluginList; }> {
const docList = await this.localDatabase.allDocsRaw<PluginDataEntry>({ startkey: PSCHeader, endkey: PSCHeaderEnd, include_docs: false });
const oldDocs: PluginDataEntry[] = ((await Promise.all(docList.rows.map(async (e) => await this.localDatabase.getDBEntry(e.id as FilePathWithPrefix /* WARN!! THIS SHOULD BE WRAPPED */)))).filter((e) => e !== false) as LoadedEntry[]).map((e) => JSON.parse(getDocData(e.data)));
const plugins: { [key: string]: PluginDataEntry[]; } = {};
const allPlugins: { [key: string]: PluginDataEntry; } = {};
const thisDevicePlugins: { [key: string]: PluginDataEntry; } = {};
for (const v of oldDocs) {
if (typeof plugins[v.deviceVaultName] === "undefined") {
plugins[v.deviceVaultName] = [];
}
plugins[v.deviceVaultName].push(v);
allPlugins[v._id] = v;
if (v.deviceVaultName == this.deviceAndVaultName) {
thisDevicePlugins[v.manifest.id] = v;
}
}
return { plugins, allPlugins, thisDevicePlugins };
}
async checkPluginUpdate() {
if (!this.plugin.settings.usePluginSync)
return;
await this.sweepPlugin(false);
const { allPlugins, thisDevicePlugins } = await this.getPluginList();
const arrPlugins = Object.values(allPlugins);
let updateFound = false;
for (const plugin of arrPlugins) {
const ownPlugin = thisDevicePlugins[plugin.manifest.id];
if (ownPlugin) {
const remoteVersion = versionNumberString2Number(plugin.manifest.version);
const ownVersion = versionNumberString2Number(ownPlugin.manifest.version);
if (remoteVersion > ownVersion) {
updateFound = true;
}
if (((plugin.mtime / 1000) | 0) > ((ownPlugin.mtime / 1000) | 0) && (plugin.dataJson ?? "") != (ownPlugin.dataJson ?? "")) {
updateFound = true;
}
}
}
if (updateFound) {
const fragment = createFragment((doc) => {
doc.createEl("a", null, (a) => {
a.text = "There're some new plugins or their settings";
a.addEventListener("click", () => this.showPluginSyncModal());
});
});
NewNotice(fragment, 10000);
} else {
Logger("Everything is up to date.", LOG_LEVEL_NOTICE);
}
}
async sweepPlugin(showMessage = false, specificPluginPath = "") {
if (!this.settings.usePluginSync)
return;
if (!this.localDatabase.isReady)
return;
// @ts-ignore
const pl = this.app.plugins;
const manifests: PluginManifest[] = Object.values(pl.manifests);
let specificPlugin = "";
if (specificPluginPath != "") {
specificPlugin = manifests.find(e => e.dir.endsWith("/" + specificPluginPath))?.id ?? "";
}
await skipIfDuplicated("sweepplugin", async () => {
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
if (!this.deviceAndVaultName) {
Logger("You have to set your device name.", LOG_LEVEL_NOTICE);
return;
}
Logger("Scanning plugins", logLevel);
const oldDocs = await this.localDatabase.allDocsRaw<EntryDoc>({
startkey: `ps:${this.deviceAndVaultName}-${specificPlugin}`,
endkey: `ps:${this.deviceAndVaultName}-${specificPlugin}\u{10ffff}`,
include_docs: true,
});
// Logger("OLD DOCS.", LOG_LEVEL_VERBOSE);
// sweep current plugin.
const procs = manifests.map(async (m) => {
const pluginDataEntryID = `ps:${this.deviceAndVaultName}-${m.id}` as DocumentID;
try {
if (specificPlugin && m.id != specificPlugin) {
return;
}
Logger(`Reading plugin:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
const path = normalizePath(m.dir) + "/";
const files = ["manifest.json", "main.js", "styles.css", "data.json"];
const pluginData: { [key: string]: string; } = {};
for (const file of files) {
const thePath = path + file;
if (await this.plugin.vaultAccess.adapterExists(thePath)) {
pluginData[file] = await this.plugin.vaultAccess.adapterRead(thePath);
}
}
let mtime = 0;
if (await this.plugin.vaultAccess.adapterExists(path + "/data.json")) {
mtime = (await this.plugin.vaultAccess.adapterStat(path + "/data.json")).mtime;
}
const p: PluginDataEntry = {
_id: pluginDataEntryID,
dataJson: pluginData["data.json"],
deviceVaultName: this.deviceAndVaultName,
mainJs: pluginData["main.js"],
styleCss: pluginData["styles.css"],
manifest: m,
manifestJson: pluginData["manifest.json"],
mtime: mtime,
type: "plugin",
};
const blob = createTextBlob(JSON.stringify(p));
const d: SavingEntry = {
_id: p._id,
path: p._id as string as FilePathWithPrefix,
data: blob,
ctime: mtime,
mtime: mtime,
size: blob.size,
children: [],
datatype: "plain",
type: "plain"
};
Logger(`check diff:${m.name}(${m.id})`, LOG_LEVEL_VERBOSE);
await serialized("plugin-" + m.id, async () => {
const old = await this.localDatabase.getDBEntry(p._id as string as FilePathWithPrefix /* This also should be explained */, null, false, false);
if (old !== false) {
const oldData = { data: old.data, deleted: old._deleted };
const newData = { data: d.data, deleted: d._deleted };
if (await isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
Logger(`Nothing changed:${m.name}`);
return;
}
}
await this.localDatabase.putDBEntry(d);
Logger(`Plugin saved:${m.name}`, logLevel);
});
} catch (ex) {
Logger(`Plugin save failed:${m.name}`, LOG_LEVEL_NOTICE);
} finally {
oldDocs.rows = oldDocs.rows.filter((e) => e.id != pluginDataEntryID);
}
//remove saved plugin data.
}
);
await Promise.all(procs);
const delDocs = oldDocs.rows.map((e) => {
// e.doc._deleted = true;
if (e.doc.type == "newnote" || e.doc.type == "plain") {
e.doc.deleted = true;
if (this.settings.deleteMetadataOfDeletedFiles) {
e.doc._deleted = true;
}
} else {
e.doc._deleted = true;
}
return e.doc;
});
Logger(`Deleting old plugin:(${delDocs.length})`, LOG_LEVEL_VERBOSE);
await this.localDatabase.bulkDocsRaw(delDocs);
Logger(`Scan plugin done.`, logLevel);
});
}
async applyPluginData(plugin: PluginDataEntry) {
await serialized("plugin-" + plugin.manifest.id, async () => {
const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
// @ts-ignore
const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
if (stat) {
// @ts-ignore
await this.app.plugins.unloadPlugin(plugin.manifest.id);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
if (plugin.dataJson)
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "data.json", plugin.dataJson);
Logger("wrote:" + pluginTargetFolderPath + "data.json", LOG_LEVEL_NOTICE);
if (stat) {
// @ts-ignore
await this.app.plugins.loadPlugin(plugin.manifest.id);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
});
}
async applyPlugin(plugin: PluginDataEntry) {
await serialized("plugin-" + plugin.manifest.id, async () => {
// @ts-ignore
const stat = this.app.plugins.enabledPlugins.has(plugin.manifest.id) == true;
if (stat) {
// @ts-ignore
await this.app.plugins.unloadPlugin(plugin.manifest.id);
Logger(`Unload plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
const pluginTargetFolderPath = normalizePath(plugin.manifest.dir) + "/";
if ((await this.plugin.vaultAccess.adapterExists(pluginTargetFolderPath)) === false) {
await this.app.vault.adapter.mkdir(pluginTargetFolderPath);
}
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "main.js", plugin.mainJs);
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "manifest.json", plugin.manifestJson);
if (plugin.styleCss)
await this.plugin.vaultAccess.adapterWrite(pluginTargetFolderPath + "styles.css", plugin.styleCss);
if (stat) {
// @ts-ignore
await this.app.plugins.loadPlugin(plugin.manifest.id);
Logger(`Load plugin:${plugin.manifest.id}`, LOG_LEVEL_NOTICE);
}
});
}
}

View File

@@ -1,4 +1,4 @@
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE } from "./lib/src/types";
import { type EntryDoc, type ObsidianLiveSyncSettings, DEFAULT_SETTINGS, LOG_LEVEL_NOTICE, REMOTE_COUCHDB, REMOTE_MINIO } from "./lib/src/types";
import { configURIBase } from "./types";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
@@ -9,6 +9,7 @@ import { delay, fireAndForget } from "./lib/src/utils";
import { confirmWithMessage } from "./dialogs";
import { Platform } from "./deps";
import { fetchAllUsedChunks } from "./lib/src/utils_couchdb";
import type { LiveSyncCouchDBReplicator } from "./lib/src/LiveSyncReplicator.js";
export class SetupLiveSync extends LiveSyncCommands {
onunload() { }
@@ -50,7 +51,7 @@ export class SetupLiveSync extends LiveSyncCommands {
const encryptingPassphrase = await askString(this.app, "Encrypt your settings", "The passphrase to encrypt the setup URI", "", true);
if (encryptingPassphrase === false)
return;
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" };
const setting = { ...this.settings, configPassphraseStore: "", encryptedCouchDBConnection: "", encryptedPassphrase: "" } as Partial<ObsidianLiveSyncSettings>;
if (stripExtra) {
delete setting.pluginSyncExtendedSetting;
}
@@ -311,6 +312,7 @@ Of course, we are able to disable these features.`
}
async suspendReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
if (this.plugin.settings.remoteType == REMOTE_MINIO) return;
Logger(`Suspending reflection: Database and storage changes will not be reflected in each other until completely finished the fetching.`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = true;
this.plugin.settings.suspendFileWatching = true;
@@ -318,6 +320,7 @@ Of course, we are able to disable these features.`
}
async resumeReflectingDatabase() {
if (this.plugin.settings.doNotSuspendOnFetching) return;
if (this.plugin.settings.remoteType == REMOTE_MINIO) return;
Logger(`Database and storage reflection has been resumed!`, LOG_LEVEL_NOTICE);
this.plugin.settings.suspendParseReplicationResult = false;
this.plugin.settings.suspendFileWatching = false;
@@ -348,9 +351,10 @@ Of course, we are able to disable these features.`
await this.plugin.resetLocalDatabase();
}
async fetchRemoteChunks() {
if (!this.plugin.settings.doNotSuspendOnFetching && this.plugin.settings.readChunksOnline) {
if (!this.plugin.settings.doNotSuspendOnFetching && this.plugin.settings.readChunksOnline && this.plugin.settings.remoteType == REMOTE_COUCHDB) {
Logger(`Fetching chunks`, LOG_LEVEL_NOTICE);
const remoteDB = await this.plugin.getReplicator().connectRemoteCouchDBWithSetting(this.settings, this.plugin.getIsMobile(), true);
const replicator = this.plugin.getReplicator() as LiveSyncCouchDBReplicator;
const remoteDB = await replicator.connectRemoteCouchDBWithSetting(this.settings, this.plugin.getIsMobile(), true);
if (typeof remoteDB == "string") {
Logger(remoteDB, LOG_LEVEL_NOTICE);
} else {
@@ -377,9 +381,6 @@ Of course, we are able to disable these features.`
await this.plugin.replicateAllFromServer(true);
await delay(1000);
await this.plugin.replicateAllFromServer(true);
// if (!tryLessFetching) {
// await this.fetchRemoteChunks();
// }
await this.resumeReflectingDatabase();
await this.askHiddenFileConfiguration({ enableFetch: true });
}

View File

@@ -5,7 +5,7 @@ import ObsidianLiveSyncPlugin from "./main";
import { type DocumentID, type FilePathWithPrefix, type LoadedEntry, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE } from "./lib/src/types";
import { Logger } from "./lib/src/logger";
import { isErrorOfMissingDoc } from "./lib/src/utils_couchdb";
import { getDocData } from "./lib/src/utils";
import { getDocData, readContent } from "./lib/src/utils";
import { isPlainText, stripPrefix } from "./lib/src/path";
function isImage(path: string) {
@@ -25,7 +25,7 @@ function readDocument(w: LoadedEntry) {
if (isImage(w.path)) {
return new Uint8Array(decodeBinary(w.data));
}
if (w.data == "plain") return getDocData(w.data);
if (w.type == "plain" || w.datatype == "plain") return getDocData(w.data);
if (isComparableTextDecode(w.path)) return readString(new Uint8Array(decodeBinary(w.data)));
if (isComparableText(w.path)) return getDocData(w.data);
try {
@@ -66,7 +66,7 @@ export class DocumentHistoryModal extends Modal {
}
}
async loadFile(initialRev: string) {
async loadFile(initialRev?: string) {
if (!this.id) {
this.id = await this.plugin.path2id(this.file);
}
@@ -109,7 +109,7 @@ export class DocumentHistoryModal extends Modal {
if (v) {
URL.revokeObjectURL(v);
}
this.BlobURLs.set(key, undefined);
this.BlobURLs.delete(key);
}
generateBlobURL(key: string, data: Uint8Array) {
this.revokeURL(key);
@@ -247,12 +247,10 @@ export class DocumentHistoryModal extends Modal {
Logger(`Old content copied to clipboard`, LOG_LEVEL_NOTICE);
});
});
async function focusFile(path: string) {
const targetFile = app.vault
.getFiles()
.find((f) => f.path === path);
const focusFile = async (path: string) => {
const targetFile = this.plugin.app.vault.getFileByPath(path);
if (targetFile) {
const leaf = app.workspace.getLeaf(false);
const leaf = this.plugin.app.workspace.getLeaf(false);
await leaf.openFile(targetFile);
} else {
Logger("The file could not view on the editor", LOG_LEVEL_NOTICE)
@@ -265,19 +263,16 @@ export class DocumentHistoryModal extends Modal {
const pathToWrite = stripPrefix(this.file);
if (!isValidPath(pathToWrite)) {
Logger("Path is not valid to write content.", LOG_LEVEL_INFO);
return;
}
if (this.currentDoc?.datatype == "plain") {
await this.plugin.vaultAccess.adapterWrite(pathToWrite, getDocData(this.currentDoc.data));
await focusFile(pathToWrite);
this.close();
} else if (this.currentDoc?.datatype == "newnote") {
await this.plugin.vaultAccess.adapterWrite(pathToWrite, decodeBinary(this.currentDoc.data));
await focusFile(pathToWrite);
this.close();
} else {
Logger(`Could not parse entry`, LOG_LEVEL_NOTICE);
if (!this.currentDoc) {
Logger("No active file loaded.", LOG_LEVEL_INFO);
return;
}
const d = readContent(this.currentDoc);
await this.plugin.vaultAccess.adapterWrite(pathToWrite, d);
await focusFile(pathToWrite);
this.close();
});
});
}

View File

@@ -2,12 +2,11 @@
import ObsidianLiveSyncPlugin from "./main";
import { onDestroy, onMount } from "svelte";
import type { AnyEntry, FilePathWithPrefix } from "./lib/src/types";
import { createBinaryBlob, getDocData, isDocContentSame } from "./lib/src/utils";
import { getDocData, isAnyNote, isDocContentSame, readAsBlob } from "./lib/src/utils";
import { diff_match_patch } from "./deps";
import { DocumentHistoryModal } from "./DocumentHistoryModal";
import { isPlainText, stripAllPrefixes } from "./lib/src/path";
import { TFile } from "./deps";
import { decodeBinary } from "./lib/src/strbin";
export let plugin: ObsidianLiveSyncPlugin;
let showDiffInfo = false;
@@ -31,7 +30,7 @@
type HistoryData = {
id: string;
rev: string;
rev?: string;
path: string;
dirname: string;
filename: string;
@@ -54,12 +53,12 @@
if (docA.mtime < range_from_epoch) {
continue;
}
if (docA.type != "newnote" && docA.type != "plain") continue;
if (!isAnyNote(docA)) continue;
const path = plugin.getPath(docA as AnyEntry);
const isPlain = isPlainText(docA.path);
const revs = await db.getRaw(docA._id, { revs_info: true });
let p: string = undefined;
const reversedRevs = revs._revs_info.reverse();
let p: string | undefined = undefined;
const reversedRevs = (revs._revs_info ?? []).reverse();
const DIFF_DELETE = -1;
const DIFF_EQUAL = 0;
@@ -107,15 +106,9 @@
if (checkStorageDiff) {
const abs = plugin.vaultAccess.getAbstractFileByPath(stripAllPrefixes(plugin.getPath(docA)));
if (abs instanceof TFile) {
let result = false;
if (isPlainText(docA.path)) {
const data = await plugin.vaultAccess.adapterRead(abs);
result = await isDocContentSame(data, doc.data);
} else {
const data = await plugin.vaultAccess.adapterReadBinary(abs);
const dataEEncoded = createBinaryBlob(data);
result = await isDocContentSame(dataEEncoded, createBinaryBlob(decodeBinary(doc.data)));
}
const data = await plugin.vaultAccess.adapterReadAuto(abs);
const d = readAsBlob(doc);
const result = await isDocContentSame(data, d);
if (result) {
diffDetail += " ⚖️";
} else {
@@ -184,7 +177,7 @@
onDestroy(() => {});
function showHistory(file: string, rev: string) {
new DocumentHistoryModal(plugin.app, plugin, file as unknown as FilePathWithPrefix, null, rev).open();
new DocumentHistoryModal(plugin.app, plugin, file as unknown as FilePathWithPrefix, undefined, rev).open();
}
function openFile(file: string) {
plugin.app.workspace.openLinkText(file, file);
@@ -239,7 +232,7 @@
<td>
<span class="rev">
{#if entry.isPlain}
<a on:click={() => showHistory(entry.path, entry.rev)}>{entry.rev}</a>
<a on:click={() => showHistory(entry.path, entry?.rev || "")}>{entry.rev}</a>
{:else}
{entry.rev}
{/if}


@@ -6,15 +6,15 @@
import { mergeObject } from "./utils";
export let docs: LoadedEntry[] = [];
export let callback: (keepRev: string, mergedStr?: string) => Promise<void> = async (_, __) => {
export let callback: (keepRev?: string, mergedStr?: string) => Promise<void> = async (_, __) => {
Promise.resolve();
};
export let filename: FilePath = "" as FilePath;
export let nameA: string = "A";
export let nameB: string = "B";
export let defaultSelect: string = "";
let docA: LoadedEntry = undefined;
let docB: LoadedEntry = undefined;
let docA: LoadedEntry;
let docB: LoadedEntry;
let docAContent = "";
let docBContent = "";
let objA: any = {};
@@ -28,7 +28,8 @@
function docToString(doc: LoadedEntry) {
return doc.datatype == "plain" ? getDocData(doc.data) : readString(new Uint8Array(decodeBinary(doc.data)));
}
function revStringToRevNumber(rev: string) {
function revStringToRevNumber(rev?: string) {
if (!rev) return "";
return rev.split("-")[0];
}
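For reference, CouchDB revision identifiers have the form `<generation>-<hash>`, so the helper above only keeps the generation number; for example:

```ts
revStringToRevNumber("12-a1b2c3d4"); // => "12"
revStringToRevNumber(undefined);     // => "" (handled by the new guard)
```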
@@ -44,15 +45,15 @@
}
function apply() {
if (docA._id == docB._id) {
if (mode == "A") return callback(docA._rev, null);
if (mode == "B") return callback(docB._rev, null);
if (mode == "A") return callback(docA._rev!, undefined);
if (mode == "B") return callback(docB._rev!, undefined);
} else {
if (mode == "A") return callback(null, docToString(docA));
if (mode == "B") return callback(null, docToString(docB));
if (mode == "A") return callback(undefined, docToString(docA));
if (mode == "B") return callback(undefined, docToString(docB));
}
if (mode == "BA") return callback(null, JSON.stringify(objBA, null, 2));
if (mode == "AB") return callback(null, JSON.stringify(objAB, null, 2));
callback(null, null);
if (mode == "BA") return callback(undefined, JSON.stringify(objBA, null, 2));
if (mode == "AB") return callback(undefined, JSON.stringify(objAB, null, 2));
callback(undefined, undefined);
}
$: {
if (docs && docs.length >= 1) {
@@ -133,13 +134,17 @@
{/if}
<div>
{nameA}
{#if docA._id == docB._id} Rev:{revStringToRevNumber(docA._rev)} {/if} ,{new Date(docA.mtime).toLocaleString()}
{#if docA._id == docB._id}
Rev:{revStringToRevNumber(docA._rev)}
{/if} ,{new Date(docA.mtime).toLocaleString()}
{docAContent.length} letters
</div>
<div>
{nameB}
{#if docA._id == docB._id} Rev:{revStringToRevNumber(docB._rev)} {/if} ,{new Date(docB.mtime).toLocaleString()}
{#if docA._id == docB._id}
Rev:{revStringToRevNumber(docB._rev)}
{/if} ,{new Date(docB.mtime).toLocaleString()}
{docBContent.length} letters
</div>


@@ -1,12 +1,12 @@
import { deleteDB, type IDBPDatabase, openDB } from "idb";
export interface KeyValueDatabase {
get<T>(key: string): Promise<T>;
set<T>(key: string, value: T): Promise<IDBValidKey>;
del(key: string): Promise<void>;
get<T>(key: IDBValidKey): Promise<T>;
set<T>(key: IDBValidKey, value: T): Promise<IDBValidKey>;
del(key: IDBValidKey): Promise<void>;
clear(): Promise<void>;
keys(query?: IDBValidKey | IDBKeyRange, count?: number): Promise<IDBValidKey[]>;
close(): void;
destroy(): void;
destroy(): Promise<void>;
}
const databaseCache: { [key: string]: IDBPDatabase<any> } = {};
export const OpenKeyValueDatabase = async (dbKey: string): Promise<KeyValueDatabase> => {
@@ -20,24 +20,23 @@ export const OpenKeyValueDatabase = async (dbKey: string): Promise<KeyValueDatab
db.createObjectStore(storeKey);
},
});
let db: IDBPDatabase<any> = null;
db = await dbPromise;
const db = await dbPromise;
databaseCache[dbKey] = db;
return {
get<T>(key: string): Promise<T> {
return db.get(storeKey, key);
async get<T>(key: IDBValidKey): Promise<T> {
return await db.get(storeKey, key);
},
set<T>(key: string, value: T) {
return db.put(storeKey, value, key);
async set<T>(key: IDBValidKey, value: T) {
return await db.put(storeKey, value, key);
},
del(key: string) {
return db.delete(storeKey, key);
async del(key: IDBValidKey) {
return await db.delete(storeKey, key);
},
clear() {
return db.clear(storeKey);
async clear() {
return await db.clear(storeKey);
},
keys(query?: IDBValidKey | IDBKeyRange, count?: number) {
return db.getAllKeys(storeKey, query, count);
async keys(query?: IDBValidKey | IDBKeyRange, count?: number) {
return await db.getAllKeys(storeKey, query, count);
},
close() {
delete databaseCache[dbKey];

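A minimal usage sketch of the wrapper above; the database name and keys are illustrative:

```ts
// Open (or create) the store, round-trip a value, then release the handle.
const kv = await OpenKeyValueDatabase("example-db");
await kv.set("lastSync", Date.now());
const lastSync = await kv.get<number>("lastSync"); // typed read
const keys = await kv.keys();                      // IDBValidKey[]
await kv.del("lastSync");
kv.close();
```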

@@ -0,0 +1,83 @@
<script lang="ts">
export let patterns = [] as string[];
export let originals = [] as string[];
export let apply: (args: string[]) => Promise<void> = (_: string[]) => Promise.resolve();
function revert() {
patterns = [...originals];
}
const CHECK_OK = "✔";
const CHECK_NG = "⚠";
const MARK_MODIFIED = "✏ ";
function checkRegExp(pattern: string) {
if (pattern.trim() == "") return "";
try {
const _ = new RegExp(pattern);
return CHECK_OK;
} catch (ex) {
return CHECK_NG;
}
}
$: status = patterns.map((e) => checkRegExp(e));
$: modified = patterns.map((e, i) => (e != (originals?.[i] ?? "") ? MARK_MODIFIED : ""));
function remove(idx: number) {
patterns[idx] = "";
}
function add() {
patterns = [...patterns, ""];
}
</script>
<ul>
{#each patterns as pattern, idx}
<li><label>{modified[idx]}{status[idx]}</label><input type="text" bind:value={pattern} class={modified[idx]} /><button class="iconbutton" on:click={() => remove(idx)}>🗑</button></li>
{/each}
<li>
<label><button on:click={() => add()}>Add</button></label>
</li>
<li class="buttons">
<button on:click={() => apply(patterns)} disabled={status.some((e) => e == CHECK_NG) || modified.every((e) => e == "")}>Apply</button>
<button on:click={() => revert()} disabled={status.some((e) => e == CHECK_NG) || modified.every((e) => e == "")}>Revert</button>
</li>
</ul>
<style>
label {
min-width: 4em;
width: 4em;
display: inline-flex;
flex-direction: row;
justify-content: flex-end;
}
ul {
flex-grow: 1;
display: inline-flex;
flex-direction: column;
list-style-type: none;
margin-block-start: 0;
margin-block-end: 0;
margin-inline-start: 0px;
margin-inline-end: 0px;
padding-inline-start: 0;
}
li {
padding: var(--size-2-1) var(--size-4-1);
display: inline-flex;
flex-grow: 1;
align-items: center;
justify-content: flex-end;
gap: var(--size-4-2);
}
li input {
min-width: 10em;
}
li.buttons {
}
button.iconbutton {
max-width: 4em;
}
span.spacer {
flex-grow: 1;
}
</style>
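For context, this is how such a Svelte pane is typically mounted from TypeScript. A sketch only; the module name, target element, and initial patterns are assumptions:

```ts
import RegexpList from "./RegexpList.svelte"; // hypothetical file name for the component above

const pane = new RegexpList({
    target: containerEl, // e.g. an HTMLElement inside the settings dialogue
    props: {
        patterns: ["\\.tmp$"],
        originals: ["\\.tmp$"],
        apply: async (patterns: string[]) => {
            // persist the validated patterns to the plugin settings here
        },
    },
});
// When the pane is closed:
pane.$destroy();
```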

src/ObsHttpHandler.ts (new file, 133 lines)

@@ -0,0 +1,133 @@
// This file is based on a file published by remotely-save under the Apache 2 License.
// I would like to express my deepest gratitude to the original authors for their hard work and dedication. Without their contributions, this project would not have been possible.
//
// Original Implementation is here: https://github.com/remotely-save/remotely-save/blob/28b99557a864ef59c19d2ad96101196e401718f0/src/remoteForS3.ts
import {
FetchHttpHandler,
type FetchHttpHandlerOptions,
} from "@smithy/fetch-http-handler";
import { HttpRequest, HttpResponse, type HttpHandlerOptions } from "@smithy/protocol-http";
//@ts-ignore
import { requestTimeout } from "@smithy/fetch-http-handler/dist-es/request-timeout";
import { buildQueryString } from "@smithy/querystring-builder";
import { requestUrl, type RequestUrlParam } from "./deps";
////////////////////////////////////////////////////////////////////////////////
// special handler using Obsidian requestUrl
////////////////////////////////////////////////////////////////////////////////
/**
* This is close to the original implementation of FetchHttpHandler
* https://github.com/aws/aws-sdk-js-v3/blob/main/packages/fetch-http-handler/src/fetch-http-handler.ts
* that is released under Apache 2 License.
* But this uses Obsidian requestUrl instead.
*/
export class ObsHttpHandler extends FetchHttpHandler {
requestTimeoutInMs: number | undefined;
reverseProxyNoSignUrl: string | undefined;
constructor(
options?: FetchHttpHandlerOptions,
reverseProxyNoSignUrl?: string
) {
super(options);
this.requestTimeoutInMs =
options === undefined ? undefined : options.requestTimeout;
this.reverseProxyNoSignUrl = reverseProxyNoSignUrl;
}
async handle(
request: HttpRequest,
{ abortSignal }: HttpHandlerOptions = {}
): Promise<{ response: HttpResponse }> {
if (abortSignal?.aborted) {
const abortError = new Error("Request aborted");
abortError.name = "AbortError";
return Promise.reject(abortError);
}
let path = request.path;
if (request.query) {
const queryString = buildQueryString(request.query);
if (queryString) {
path += `?${queryString}`;
}
}
const { port, method } = request;
let url = `${request.protocol}//${request.hostname}${port ? `:${port}` : ""
}${path}`;
if (
this.reverseProxyNoSignUrl !== undefined &&
this.reverseProxyNoSignUrl !== ""
) {
const urlObj = new URL(url);
urlObj.host = this.reverseProxyNoSignUrl;
url = urlObj.href;
}
const body =
method === "GET" || method === "HEAD" ? undefined : request.body;
const transformedHeaders: Record<string, string> = {};
for (const key of Object.keys(request.headers)) {
const keyLower = key.toLowerCase();
if (keyLower === "host" || keyLower === "content-length") {
continue;
}
transformedHeaders[keyLower] = request.headers[key];
}
let contentType: string | undefined = undefined;
if (transformedHeaders["content-type"] !== undefined) {
contentType = transformedHeaders["content-type"];
}
let transformedBody: any = body;
if (ArrayBuffer.isView(body)) {
transformedBody = new Uint8Array(body.buffer).buffer;
}
const param: RequestUrlParam = {
body: transformedBody,
headers: transformedHeaders,
method: method,
url: url,
contentType: contentType,
};
const raceOfPromises = [
requestUrl(param).then((rsp) => {
const headers = rsp.headers;
const headersLower: Record<string, string> = {};
for (const key of Object.keys(headers)) {
headersLower[key.toLowerCase()] = headers[key];
}
const stream = new ReadableStream<Uint8Array>({
start(controller) {
controller.enqueue(new Uint8Array(rsp.arrayBuffer));
controller.close();
},
});
return {
response: new HttpResponse({
headers: headersLower,
statusCode: rsp.status,
body: stream,
}),
};
}),
requestTimeout(this.requestTimeoutInMs),
];
if (abortSignal) {
raceOfPromises.push(
new Promise<never>((resolve, reject) => {
abortSignal.onabort = () => {
const abortError = new Error("Request aborted");
abortError.name = "AbortError";
reject(abortError);
};
})
);
}
return Promise.race(raceOfPromises);
}
}

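A sketch of how this handler can be plugged into an AWS SDK v3 client, which is presumably its purpose here; the endpoint and credentials are placeholders:

```ts
import { S3Client } from "@aws-sdk/client-s3";

// Route all S3 traffic through Obsidian's requestUrl via the handler above.
const client = new S3Client({
    endpoint: "https://minio.example.com", // illustrative object-storage endpoint
    region: "us-east-1",
    forcePathStyle: true, // commonly needed for MinIO-style endpoints
    credentials: { accessKeyId: "ACCESS_KEY", secretAccessKey: "SECRET_KEY" },
    requestHandler: new ObsHttpHandler({ requestTimeout: 60_000 }),
});
```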
File diff suppressed because it is too large.


@@ -1,6 +1,7 @@
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "./deps";
import { serialized } from "./lib/src/lock";
import { Logger } from "./lib/src/logger";
import { isPlainText } from "./lib/src/path";
import type { FilePath } from "./lib/src/types";
import { createBinaryBlob, isDocContentSame } from "./lib/src/utils";
import type { InternalFileInfo } from "./types";
@@ -56,6 +57,12 @@ export class SerializedFileAccess {
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
}
async adapterReadAuto(file: TFile | string) {
const path = file instanceof TFile ? file.path : file;
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.adapter.read(path));
return await processReadFile(file, () => this.app.vault.adapter.readBinary(path));
}
async adapterWrite(file: TFile | string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
const path = file instanceof TFile ? file.path : file;
if (typeof (data) === "string") {
@@ -77,12 +84,19 @@ export class SerializedFileAccess {
return await processReadFile(file, () => this.app.vault.readBinary(file));
}
async vaultReadAuto(file: TFile) {
const path = file.path;
if (isPlainText(path)) return await processReadFile(file, () => this.app.vault.read(file));
return await processReadFile(file, () => this.app.vault.readBinary(file));
}
async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
if (typeof (data) === "string") {
return await processWriteFile(file, async () => {
const oldData = await this.app.vault.read(file);
if (data === oldData) {
markChangesAreSame(file, file.stat.mtime, options.mtime);
if (options && options.mtime) markChangesAreSame(file, file.stat.mtime, options.mtime);
return false
}
await this.app.vault.modify(file, data, options)
@@ -93,7 +107,7 @@ export class SerializedFileAccess {
return await processWriteFile(file, async () => {
const oldData = await this.app.vault.readBinary(file);
if (await isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
markChangesAreSame(file, file.stat.mtime, options.mtime);
if (options && options.mtime) markChangesAreSame(file, file.stat.mtime, options.mtime);
return false;
}
await this.app.vault.modifyBinary(file, toArrayBuffer(data), options)
@@ -149,10 +163,9 @@ export class SerializedFileAccess {
c += v;
try {
await this.app.vault.adapter.mkdir(c);
} catch (ex) {
// basically skip exceptions.
if (ex.message && ex.message == "Folder already exists.") {
// especially this message is.
} catch (ex: any) {
if (ex?.message == "Folder already exists.") {
// Skip if already exists.
} else {
Logger("Folder Create Error");
Logger(ex);


@@ -1,8 +1,8 @@
import type { SerializedFileAccess } from "./SerializedFileAccess";
import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
import { Logger } from "./lib/src/logger";
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
import type { KeyedQueueProcessor } from "./lib/src/processor";
import { shouldBeIgnored } from "./lib/src/path";
import type { QueueProcessor } from "./lib/src/processor";
import { LOG_LEVEL_NOTICE, type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
import { delay } from "./lib/src/utils";
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "./types";
@@ -19,7 +19,7 @@ type LiveSyncForStorageEventManager = Plugin &
vaultAccess: SerializedFileAccess
} & {
isTargetFile: (file: string | TAbstractFile) => Promise<boolean>,
fileEventQueue: KeyedQueueProcessor<FileEventItem, any>,
fileEventQueue: QueueProcessor<FileEventItem, any>,
isFileSizeExceeded: (size: number) => boolean;
};
@@ -109,7 +109,8 @@ export class StorageEventManagerObsidian extends StorageEventManager {
if (file instanceof TFolder) continue;
if (!await this.plugin.isTargetFile(file.path)) continue;
let cache: null | string | ArrayBuffer;
// Stopped using the cache to prevent corruption;
// let cache: null | string | ArrayBuffer;
// new file or something changed, cache the changes.
if (file instanceof TFile && (type == "CREATE" || type == "CHANGED")) {
// Wait a little while to let the writer mark the file as `touched`.
@@ -117,12 +118,13 @@ export class StorageEventManagerObsidian extends StorageEventManager {
if (this.plugin.vaultAccess.recentlyTouched(file)) {
continue;
}
if (!isPlainText(file.name)) {
cache = await this.plugin.vaultAccess.vaultReadBinary(file);
} else {
cache = await this.plugin.vaultAccess.vaultCacheRead(file);
if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
}
// cache = await this.plugin.vaultAccess.vaultReadAuto(file);
// if (!isPlainText(file.name)) {
// cache = await this.plugin.vaultAccess.vaultReadBinary(file);
// } else {
// cache = await this.plugin.vaultAccess.vaultCacheRead(file);
// if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
// }
}
const fileInfo = file instanceof TFile ? {
ctime: file.stat.ctime,
@@ -131,13 +133,12 @@ export class StorageEventManagerObsidian extends StorageEventManager {
path: file.path,
size: file.stat.size
} as FileInfo : file as InternalFileInfo;
this.plugin.fileEventQueue.enqueueWithKey(`file-${fileInfo.path}`, {
this.plugin.fileEventQueue.enqueue({
type,
args: {
file: fileInfo,
oldPath,
cache,
// cache,
ctx
},
key: atomicKey


@@ -7,10 +7,9 @@ import PluginPane from "./PluginPane.svelte";
export class PluginDialogModal extends Modal {
plugin: ObsidianLiveSyncPlugin;
logEl: HTMLDivElement;
component: PluginPane = null;
component: PluginPane | undefined;
isOpened() {
return this.component != null;
return this.component != undefined;
}
constructor(app: App, plugin: ObsidianLiveSyncPlugin) {
@@ -21,7 +20,7 @@ export class PluginDialogModal extends Modal {
onOpen() {
const { contentEl } = this;
this.titleEl.setText("Customization Sync (Beta2)")
if (this.component == null) {
if (!this.component) {
this.component = new PluginPane({
target: contentEl,
props: { plugin: this.plugin },
@@ -30,9 +29,9 @@ export class PluginDialogModal extends Modal {
}
onClose() {
if (this.component != null) {
if (this.component) {
this.component.$destroy();
this.component = null;
this.component = undefined;
}
}
}
@@ -94,13 +93,13 @@ export class InputStringDialog extends Modal {
}
export class PopoverSelectString extends FuzzySuggestModal<string> {
app: App;
callback: (e: string) => void = () => { };
callback: ((e: string) => void) | undefined = () => { };
getItemsFun: () => string[] = () => {
return ["yes", "no"];
}
constructor(app: App, note: string, placeholder: string | null, getItemsFun: () => string[], callback: (e: string) => void) {
constructor(app: App, note: string, placeholder: string | undefined, getItemsFun: (() => string[]) | undefined, callback: (e: string) => void) {
super(app);
this.app = app;
this.setPlaceholder((placeholder ?? "y/n) ") + note);
@@ -118,13 +117,14 @@ export class PopoverSelectString extends FuzzySuggestModal<string> {
onChooseItem(item: string, evt: MouseEvent | KeyboardEvent): void {
// debugger;
this.callback(item);
this.callback = null;
this.callback?.(item);
this.callback = undefined;
}
onClose(): void {
setTimeout(() => {
if (this.callback != null) {
if (this.callback) {
this.callback("");
this.callback = undefined;
}
}, 100);
}
@@ -136,16 +136,16 @@ export class MessageBox extends Modal {
title: string;
contentMd: string;
buttons: string[];
result: string;
result: string | false = false;
isManuallyClosed = false;
defaultAction: string | undefined;
timeout: number | undefined;
timer: ReturnType<typeof setInterval> = undefined;
timer: ReturnType<typeof setInterval> | undefined = undefined;
defaultButtonComponent: ButtonComponent | undefined;
onSubmit: (result: string | false) => void;
constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number, onSubmit: (result: (typeof buttons)[number] | false) => void) {
constructor(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout: number | undefined, onSubmit: (result: (typeof buttons)[number] | false) => void) {
super(plugin.app);
this.plugin = plugin;
this.title = title;
@@ -156,6 +156,7 @@ export class MessageBox extends Modal {
this.timeout = timeout;
if (this.timeout) {
this.timer = setInterval(() => {
if (this.timeout === undefined) return;
this.timeout--;
if (this.timeout < 0) {
if (this.timer) {
@@ -166,7 +167,7 @@ export class MessageBox extends Modal {
this.isManuallyClosed = true;
this.close();
} else {
this.defaultButtonComponent.setButtonText(`( ${this.timeout} ) ${defaultAction}`);
this.defaultButtonComponent?.setButtonText(`( ${this.timeout} ) ${defaultAction}`);
}
}, 1000);
}
@@ -223,7 +224,7 @@ export class MessageBox extends Modal {
}
export function confirmWithMessage(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction?: (typeof buttons)[number], timeout?: number): Promise<(typeof buttons)[number] | false> {
export function confirmWithMessage(plugin: Plugin, title: string, contentMd: string, buttons: string[], defaultAction: (typeof buttons)[number], timeout?: number): Promise<(typeof buttons)[number] | false> {
return new Promise((res) => {
const dialog = new MessageBox(plugin, title, contentMd, buttons, defaultAction, timeout, (result) => res(result));
dialog.open();

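A hedged usage sketch of `confirmWithMessage`; the labels and timeout are illustrative:

```ts
// Resolves with the chosen button label, or false if the dialogue was dismissed.
const answer = await confirmWithMessage(
    plugin,
    "Conflicted file",
    "Which copy should be kept?",
    ["Local", "Remote"],
    "Local", // default action, applied when the countdown runs out
    30,      // seconds
);
if (answer === "Local") {
    // keep the local copy
}
```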
Submodule src/lib updated: b9b70535ed...da470ddc41

File diff suppressed because it is too large.


@@ -242,14 +242,11 @@ export function mergeObject(
ret[key] = v;
}
}
const retSorted = Object.fromEntries(Object.entries(ret).sort((a, b) => a[0] < b[0] ? -1 : a[0] > b[0] ? 1 : 0));
if (Array.isArray(objA) && Array.isArray(objB)) {
return Object.values(Object.entries(ret)
.sort()
.reduce((p, [key, value]) => ({ ...p, [key]: value }), {}));
return Object.values(retSorted);
}
return Object.entries(ret)
.sort()
.reduce((p, [key, value]) => ({ ...p, [key]: value }), {});
return retSorted;
}
export function flattenObject(obj: Record<string | number | symbol, any>, path: string[] = []): [string, any][] {
@@ -313,7 +310,7 @@ export function isCustomisationSyncMetadata(str: string): boolean {
export const askYesNo = (app: App, message: string): Promise<"yes" | "no"> => {
return new Promise((res) => {
const popover = new PopoverSelectString(app, message, null, null, (result) => res(result as "yes" | "no"));
const popover = new PopoverSelectString(app, message, undefined, undefined, (result) => res(result as "yes" | "no"));
popover.open();
});
};
@@ -327,7 +324,7 @@ export const askSelectString = (app: App, message: string, items: string[]): Pro
};
export const askString = (app: App, title: string, key: string, placeholder: string, isPassword?: boolean): Promise<string | false> => {
export const askString = (app: App, title: string, key: string, placeholder: string, isPassword: boolean = false): Promise<string | false> => {
return new Promise((res) => {
const dialog = new InputStringDialog(app, title, key, placeholder, isPassword, (result) => res(result));
dialog.open();
@@ -400,7 +397,7 @@ export const _requestToCouchDB = async (baseUri: string, username: string, passw
};
return await requestUrl(requestParam);
}
export const requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string, key?: string, body?: string, method?: string) => {
export const requestToCouchDB = async (baseUri: string, username: string, password: string, origin: string = "", key?: string, body?: string, method?: string) => {
const uri = `_node/_local/_config${key ? "/" + key : ""}`;
return await _requestToCouchDB(baseUri, username, password, origin, uri, body, method);
};
@@ -440,7 +437,7 @@ export function compareMTime(baseMTime: number, targetMTime: number): typeof BAS
export function markChangesAreSame(file: TFile | AnyEntry | string, mtime1: number, mtime2: number) {
if (mtime1 === mtime2) return true;
const key = typeof file == "string" ? file : file instanceof TFile ? file.path : file.path ?? file._id;
const pairs = sameChangePairs.get(key, []);
const pairs = sameChangePairs.get(key, []) || [];
if (pairs.some(e => e == mtime1 || e == mtime2)) {
sameChangePairs.set(key, [...new Set([...pairs, mtime1, mtime2])]);
} else {
@@ -449,12 +446,16 @@ export function markChangesAreSame(file: TFile | AnyEntry | string, mtime1: numb
}
export function isMarkedAsSameChanges(file: TFile | AnyEntry | string, mtimes: number[]) {
const key = typeof file == "string" ? file : file instanceof TFile ? file.path : file.path ?? file._id;
const pairs = sameChangePairs.get(key, []);
const pairs = sameChangePairs.get(key, []) || [];
if (mtimes.every(e => pairs.indexOf(e) !== -1)) {
return EVEN;
}
}
export function compareFileFreshness(baseFile: TFile | AnyEntry, checkTarget: TFile | AnyEntry): typeof BASE_IS_NEW | typeof TARGET_IS_NEW | typeof EVEN {
export function compareFileFreshness(baseFile: TFile | AnyEntry | undefined, checkTarget: TFile | AnyEntry | undefined): typeof BASE_IS_NEW | typeof TARGET_IS_NEW | typeof EVEN {
if (baseFile === undefined && checkTarget == undefined) return EVEN;
if (baseFile == undefined) return TARGET_IS_NEW;
if (checkTarget == undefined) return BASE_IS_NEW;
const modifiedBase = baseFile instanceof TFile ? baseFile?.stat?.mtime ?? 0 : baseFile?.mtime ?? 0;
const modifiedTarget = checkTarget instanceof TFile ? checkTarget?.stat?.mtime ?? 0 : checkTarget?.mtime ?? 0;

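The new guards make the comparison total over possibly-missing inputs; expected results, as a sketch (`someFile`/`someEntry` stand for any existing `TFile` or entry):

```ts
// Illustrative assertions for the new undefined handling.
compareFileFreshness(undefined, undefined); // => EVEN: nothing to compare
compareFileFreshness(undefined, someEntry); // => TARGET_IS_NEW: only the target exists
compareFileFreshness(someFile, undefined);  // => BASE_IS_NEW: only the base exists
// Otherwise both mtimes (TFile.stat.mtime or entry.mtime) are compared as before.
```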

@@ -77,14 +77,6 @@
border-top: 1px solid var(--background-modifier-border);
}
/* .sls-table-head{
width:50%;
}
.sls-table-tail{
width:50%;
} */
.sls-header-button {
margin-left: 2em;
}
@@ -94,7 +86,7 @@
}
:root {
--slsmessage: "";
--sls-log-text: "";
}
.sls-troubleshoot-preview {
@@ -110,7 +102,10 @@
.markdown-source-view.cm-s-obsidian::before,
.canvas-wrapper::before,
.empty-state::before {
content: attr(data-log);
content: var(--sls-log-text, "");
font-variant-numeric: tabular-nums;
font-variant-emoji: emoji;
tab-size: 4;
text-align: right;
white-space: pre-wrap;
position: absolute;

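The status text is now injected through a CSS custom property instead of a `data-log` attribute, so a single `setProperty` call updates every pseudo-element at once. A sketch of the TypeScript side; the function name and formatting are assumptions:

```ts
// Update the status line consumed by `content: var(--sls-log-text, "")` above.
// CSS `content` expects a quoted string; JSON.stringify provides safe quoting
// for typical status text.
function setLogText(text: string) {
    document.documentElement.style.setProperty("--sls-log-text", JSON.stringify(text));
}
setLogText("↑ 12 ↓ 3");
```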

@@ -1,65 +1,41 @@
### 0.22.0
A few years have passed since Self-hosted LiveSync was born, and our codebase has become very complicated. It can be tolerated for now, but it would become a tremendous burden later.
Therefore, at v0.22.0, I completely refined the task-scheduling logic for future maintainability.
### 0.23.0
Incredible new features!
Of course, this may cause us some suffering in some cases. However, I would love to ask for your cooperation and contribution.
Now we can use object storage (MinIO, S3, R2 or anything you like) for synchronising! Moreover, we can still use all the features as if we were using CouchDB.
Note: As this is a pretty experimental feature, there are some limitations.
- It is built on an append-only architecture. Used storage will not shrink unless we perform a rebuild.
- It is a bit fragile. However, our version x.yy.0 always is.
- During the first synchronisation, the entire history to date is transferred. For this reason, it is preferable to do this on a Wi-Fi network.
- Do not worry: from the second synchronisation onwards, only differences are transferred.
Sorry for being absent for so long, and thank you for your patience!
I hope this feature empowers users to maintain independence and self-host their data, offering an alternative for those who prefer to manage their own storage solutions and avoid being caught on the wrong side of a sudden change in business model.
Note: we have also achieved a significant performance improvement.
Note at 0.22.2: **Now, to rescue mobile devices, the maximum file size is set to 50 by default**. Please configure the limit as you need. If you do not want to limit file sizes, please set it to zero manually.
Of course, I use self-hosted MinIO for testing and recommend it, for the same reason as CouchDB: it is open, controllable, auditable, and indeed already audited by numerous eyes.
Let me write one more acknowledgement.
I have a lot of respect for remotely-save, even though it is sometimes treated as if it were a competitor. It has a great architecture that embodies a different approach from mine of recreating history. This time, with all due respect, I have used some of its code as a reference.
Hooray for open source, and generous licences, and the sharing of knowledge by experts.
#### Version history
- 0.22.13:
- Improved:
- Using HTTP for the remote database URI now raises an error (on mobile) or a notice (on desktop).
- Refactored:
- Dependencies have been polished.
- 0.22.12:
- Changed:
- The default settings have been changed.
- Improved:
- Default and preferred settings are applied on completion of the wizard.
- 0.23.2
- Sorry that these are all fixes for experimental features (they were also critical for dogfooding). The next release will bring the main fixes! Thank you for your patience and understanding!
- Fixed:
- Now Initialisation `Fetch` will be performed smoothly and there will be fewer conflicts.
- No longer gets stuck while handling transferred or initialised documents.
- 0.22.11:
- Journal Sync no longer hangs up during large replications, especially the initial one.
- All changes which have been replicated while rebuilding are no longer postponed (restoring the previous behaviour).
- Improved:
- Journal Sync now works efficiently, whether downloading and parsing or packing and uploading.
- The new chunk format uses less server storage and packs/unpacks faster.
- 0.23.1
- Fixed:
- `Verify and repair all files` is no longer broken.
- Journal synchronisation now tracks untransferred items separately for sent and received.
- Journal sync now handles retrying.
- Journal synchronisation no longer treats the synchronisation of chunks as revision updates (they are simply ignored).
- Journal sync now splits the journal pack to prevent mobile devices from rebooting.
- Maintenance menus which had been in the command palette are now back in the maintenance pane of the settings dialogue.
- Improved:
- All changes which have been replicated while rebuilding are now postponed.
- 0.23.0
- New feature:
- Now `Verify and repair all files` is able to...
- Restore a file that exists only in the local database.
- Show the history.
- Improved:
- Performance improved.
- 0.22.10
- Fixed:
- Unchanged hidden files and customisations are no longer saved and transferred.
- File integrity in the vault history is now indicated correctly.
- Improved:
- In the report, the scheme of the remote database URI is now printed.
- 0.22.9
- Fixed:
- Fixed a bug in `fetch chunks on demand` that prevented chunks from being fetched on demand.
- Improved:
- `fetch chunks on demand` works more smoothly.
- Initialisation `Fetch` is now more efficient.
- Tidied:
- Removed some meaningless code.
- 0.22.8
- Fixed:
- Fetching and unlocking a locked remote database now works well again.
- No longer crashes on symbolic links inside hidden folders.
- Improved:
- Chunks are now created more efficiently.
- Old notes are now split into larger chunks.
- Better performance in saving notes.
- Network activities are indicated as an icon.
- Less memory used for binary processing.
- Tidied:
- Cleaned up unused functions.
- Sorted out code that had become nonsense.
- Changed:
- `Fetch chunks on demand` no longer needs `Pacing replication`.
- The setting `Do not pace synchronization` has been deleted.
... Continued in `updates_old.md`.
- Now we can use Object Storage.


@@ -10,6 +10,114 @@ Note: we got a very performance improvement.
Note at 0.22.2: **Now, to rescue mobile devices, the maximum file size is set to 50 by default**. Please configure the limit as you need. If you do not want to limit file sizes, please set it to zero manually.
#### Version history
- 0.22.19
- Fixed:
- Data is no longer corrupted due to false BASE64 detection.
- Improved:
- Automatic data compression is now a bit more efficient.
- 0.22.18
- New feature (Very Experimental):
- Now we can use `Automatic data compression` to reduce the amount of traffic and remote database usage.
- Please make sure all devices are updated to v0.22.18 before trying this feature.
- If you are using other utilities connected to your vault, please make sure they are compatible.
- Note: Setting `File Compression` on the remote database helps shrink the size of the remote database. Please refer to the [documentation](https://docs.couchdb.org/en/stable/config/couchdb.html#couchdb/file_compression).
- 0.22.17:
- Fixed:
- Error handling on booting now works fine.
- Replication is now started automatically in LiveSync mode.
- Batch database update is now disabled in LiveSync mode.
- No longer reconnects automatically while the window is unfocused.
- Status saves are now thinned out.
- Self-hosted LiveSync now waits until all files between the local database and storage have been reliably checked.
- Improved:
- The job scheduler is now more robust and stable.
- The status indicator no longer flickers or sticks at zero for a while.
- No more meaninglessly frequent updates of the status indicators.
- Now we can configure regular expression filters in a handy UI. Thank you so much, @eth-p!
- `Fetch` or `Rebuild everything` is now more safely performed.
- Minor things
- Some utility functions have been added.
- Customisation sync now shows fewer incorrect messages.
- Weeding out type errors.
- 0.22.16:
- Fixed:
- Fixed the issue that binary files were sometimes corrupted.
- Fixed customisation sync data could be corrupted.
- Improved:
- The remote database now uses less memory.
- This release requires a brief wait on the first synchronisation, to track the latest changeset again.
- Description added for the `Device name`.
- Refactored:
- Many type-errors have been resolved.
- An obsolete file has been deleted.
- 0.22.15:
- Improved:
- Faster start-up by removing excessive logs that merely indicated normality.
- Extra phases have been removed by streamlining the scanning of customisation sync.
... Continued in `updates_old.md`.
- 0.22.14:
- New feature:
- We can disable the status bar in the setting dialogue.
- Improved:
- Some files are now handled as the correct data type.
- Customisation sync now uses the digest of each file for better performance.
- The status display in the editor now performs well.
- Refactored:
- Common functions have been prepared and the codebase has been organised.
- Stricter type checking following TypeScript updates.
- Removed an old iOS workaround for simplicity and performance.
- 0.22.13:
- Improved:
- Using HTTP for the remote database URI now raises an error (on mobile) or a notice (on desktop).
- Refactored:
- Dependencies have been polished.
- 0.22.12:
- Changed:
- The default settings have been changed.
- Improved:
- Default and preferred settings are applied on completion of the wizard.
- Fixed:
- Now Initialisation `Fetch` will be performed smoothly and there will be fewer conflicts.
- No longer gets stuck while handling transferred or initialised documents.
- 0.22.11:
- Fixed:
- `Verify and repair all files` is no longer broken.
- New feature:
- Now `Verify and repair all files` is able to...
- Restore a file that exists only in the local database.
- Show the history.
- Improved:
- Performance improved.
- 0.22.10
- Fixed:
- Unchanged hidden files and customisations are no longer saved and transferred.
- File integrity in the vault history is now indicated correctly.
- Improved:
- In the report, the scheme of the remote database URI is now printed.
- 0.22.9
- Fixed:
- Fixed a bug in `fetch chunks on demand` that prevented chunks from being fetched on demand.
- Improved:
- `fetch chunks on demand` works more smoothly.
- Initialisation `Fetch` is now more efficient.
- Tidied:
- Removed some meaningless code.
- 0.22.8
- Fixed:
- Fetching and unlocking a locked remote database now works well again.
- No longer crashes on symbolic links inside hidden folders.
- Improved:
- Chunks are now created more efficiently.
- Old notes are now split into larger chunks.
- Better performance in saving notes.
- Network activities are indicated as an icon.
- Less memory used for binary processing.
- Tidied:
- Cleaned up unused functions.
- Sorted out code that had become nonsense.
- Changed:
- `Fetch chunks on demand` no longer needs `Pacing replication`.
- The setting `Do not pace synchronization` has been deleted.
- 0.22.7
- Fixed:
- Deleted hidden files are no longer ignored.