Compare commits

...

10 Commits

Author SHA1 Message Date
vorotamoroz
d3dc1e7328 Minor fix and refine the readme 2024-01-12 10:29:18 +00:00
vorotamoroz
45304af369 bump 2024-01-12 09:38:57 +00:00
vorotamoroz
7f422d58f2 - Refined:
  - Task scheduling logic has been rewritten.
  - Possibly many bugs and fragile behaviours have been fixed
- Fixed:
  - Remote chunk fetching now keeps request intervals
- New feature:
  - The status in the editor can now be shown as icons only.
2024-01-12 09:36:49 +00:00
vorotamoroz
c2491fdfad bump 2023-12-11 12:55:01 +09:00
vorotamoroz
06a6e391e8 Fixed for change detection bug. 2023-12-11 12:53:50 +09:00
vorotamoroz
f99475f6b7 bump 2023-12-11 12:46:23 +09:00
vorotamoroz
109fc00b9d Fixed
- Document IDs are now shown in the log with only their first 8 letters.
2023-12-11 12:45:40 +09:00
vorotamoroz
c071d822e1 - Improved:
  - All revisions are now shown with only their first few letters.
- Fixed:
  - A check before modifying files has been implemented.
  - Content change detection has been improved.
2023-12-11 12:22:17 +09:00
vorotamoroz
d2de5b4710 bump 2023-12-04 19:39:47 +09:00
vorotamoroz
cf5ecd8922 Implemented:
- SHA-1 can now be used as a fallback hash function.
2023-12-04 19:39:04 +09:00
20 changed files with 642 additions and 706 deletions

View File

@@ -59,14 +59,23 @@ Synchronization status is shown in statusbar.
- Status
- ⏹️ Stopped
- 💤 LiveSync enabled. Waiting for changes.
- ⚡️ Synchronization in progress.
- ⚠ An error occurred.
- ↑ Uploaded chunks and metadata
- ↓ Downloaded chunks and metadata
- ⏳ Number of pending processes
- 🧩 Number of files waiting for their chunks.
If you have deleted or renamed files, please wait until the ⏳ icon disappears.
- 💤 LiveSync enabled. Waiting for changes
- ⚡️ Synchronization in progress
- ⚠ An error occurred
- Statistical indicator
- ↑ Uploaded chunks and metadata
- ↓ Downloaded chunks and metadata
- Progress indicator
- 📥 Unprocessed transferred items
- 📄 Working database operation
- 💾 Working write storage processes
- ⏳ Working read storage processes
- 🛫 Pending read storage processes
- ⚙️ Working or pending storage processes of hidden files
- 🧩 Waiting chunks
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
To prevent file and database corruption, please wait until all progress indicators have disappeared, especially if you have deleted or renamed files.
## Hints

View File

@@ -1,7 +1,7 @@
{
"id": "obsidian-livesync",
"name": "Self-hosted LiveSync",
"version": "0.21.2",
"version": "0.22.0",
"minAppVersion": "0.9.12",
"description": "Community implementation of self-hosted livesync. Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"author": "vorotamoroz",

package-lock.json generated
View File

@@ -1,12 +1,12 @@
{
"name": "obsidian-livesync",
"version": "0.21.2",
"version": "0.22.0",
"lockfileVersion": 2,
"requires": true,
"packages": {
"": {
"name": "obsidian-livesync",
"version": "0.21.2",
"version": "0.22.0",
"license": "MIT",
"dependencies": {
"diff-match-patch": "^1.0.5",

View File

@@ -1,6 +1,6 @@
{
"name": "obsidian-livesync",
"version": "0.21.2",
"version": "0.22.0",
"description": "Reflect your vault changes to some other devices immediately. Please make sure to disable other synchronize solutions to avoid content corruption or duplication.",
"main": "main.js",
"type": "module",

View File

@@ -7,14 +7,16 @@ import { ICXHeader, PERIODIC_PLUGIN_SWEEP, } from "./types";
import { createTextBlob, delay, getDocData } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { WrappedNotice } from "./lib/src/wrapper";
import { readString, crc32CKHash, decodeBinary, arrayBufferToBase64 } from "./lib/src/strbin";
import { readString, decodeBinary, arrayBufferToBase64, sha1 } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { stripAllPrefixes } from "./lib/src/path";
import { PeriodicProcessor, askYesNo, disposeMemoObject, memoIfNotExist, memoObject, retrieveMemoObject, scheduleTask } from "./utils";
import { PluginDialogModal } from "./dialogs";
import { JsonResolveModal } from "./JsonResolveModal";
import { pipeGeneratorToGenerator, processAllGeneratorTasksWithConcurrencyLimit } from './lib/src/task';
import { QueueProcessor } from './lib/src/processor';
import { pluginScanningCount } from './lib/src/stores';
import type ObsidianLiveSyncPlugin from './main';
const d = "\u200b";
const d2 = "\n";
@@ -162,6 +164,16 @@ export type PluginDataEx = {
mtime: number,
};
export class ConfigSync extends LiveSyncCommands {
constructor(plugin: ObsidianLiveSyncPlugin) {
super(plugin);
pluginScanningCount.onChanged((e) => {
const total = e.value;
pluginIsEnumerating.set(total != 0);
if (total == 0) {
Logger(`Processing configurations done`, LOG_LEVEL_INFO, "get-plugins");
}
})
}
confirmPopup: WrappedNotice = null;
get kvDB() {
return this.plugin.kvDB;
@@ -270,7 +282,7 @@ export class ConfigSync extends LiveSyncCommands {
for (const file of data.files) {
const work = { ...file };
const tempStr = getDocData(work.data);
work.data = [crc32CKHash(tempStr)];
work.data = [await sha1(tempStr)];
xFiles.push(work);
}
return ({
@@ -302,65 +314,65 @@ export class ConfigSync extends LiveSyncCommands {
this.plugin.saveSettingData();
}
}
pluginScanProcessor = new QueueProcessor(async (v: AnyEntry[]) => {
const plugin = v[0];
const path = plugin.path || this.getPath(plugin);
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) return;
try {
const pluginData = await this.loadPluginData(path);
if (pluginData) {
return [pluginData];
}
// Failed to load
return;
} catch (ex) {
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
Logger(ex, LOG_LEVEL_VERBOSE);
}
return;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 300, yieldThreshold: 10 }).pipeTo(
new QueueProcessor(
(pluginDataList) => {
let newList = [...this.pluginList];
for (const item of pluginDataList) {
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
}
this.pluginList = newList;
pluginList.set(newList);
return;
}
, { suspended: true, batchSize: 1000, concurrentLimit: 10, delay: 200, yieldThreshold: 25, totalRemainingReactiveSource: pluginScanningCount })).startPipeline().root.onIdle(() => {
Logger(`All files enumerated`, LOG_LEVEL_INFO, "get-plugins");
this.createMissingConfigurationEntry();
});
async updatePluginList(showMessage: boolean, updatedDocumentPath?: FilePathWithPrefix): Promise<void> {
const logLevel = showMessage ? LOG_LEVEL_NOTICE : LOG_LEVEL_INFO;
// pluginList.set([]);
if (!this.settings.usePluginSync) {
this.pluginScanProcessor.clearQueue();
this.pluginList = [];
pluginList.set(this.pluginList)
return;
}
await Promise.resolve(); // Just to prevent warning.
scheduleTask("update-plugin-list-task", 200, async () => {
await serialized("update-plugin-list", async () => {
try {
const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
const plugins = updatedDocumentPath ?
this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
let count = 0;
pluginIsEnumerating.set(true);
for await (const v of processAllGeneratorTasksWithConcurrencyLimit(20, pipeGeneratorToGenerator(plugins, async plugin => {
const path = plugin.path || this.getPath(plugin);
if (updatedDocumentPath && updatedDocumentPath != path) {
return false;
}
const oldEntry = (this.pluginList.find(e => e.documentPath == path));
if (oldEntry && oldEntry.mtime == plugin.mtime) return false;
try {
count++;
if (count % 10 == 0) Logger(`Enumerating files... ${count}`, logLevel, "get-plugins");
Logger(`plugin-${path}`, LOG_LEVEL_VERBOSE);
return this.loadPluginData(path);
// return entries;
} catch (ex) {
//TODO
Logger(`Something happened at enumerating customization :${path}`, LOG_LEVEL_NOTICE);
console.warn(ex);
}
return false;
}))) {
if ("ok" in v) {
if (v.ok !== false) {
let newList = [...this.pluginList];
const item = v.ok;
newList = newList.filter(x => x.documentPath != item.documentPath);
newList.push(item)
if (updatedDocumentPath != "") newList = newList.filter(e => e.documentPath != updatedDocumentPath);
this.pluginList = newList;
pluginList.set(newList);
}
}
}
Logger(`All files enumerated`, logLevel, "get-plugins");
pluginIsEnumerating.set(false);
this.createMissingConfigurationEntry();
} finally {
pluginIsEnumerating.set(false);
}
});
try {
const updatedDocumentId = updatedDocumentPath ? await this.path2id(updatedDocumentPath) : "";
const plugins = updatedDocumentPath ?
this.localDatabase.findEntries(updatedDocumentId, updatedDocumentId + "\u{10ffff}", { include_docs: true, key: updatedDocumentId, limit: 1 }) :
this.localDatabase.findEntries(ICXHeader + "", `${ICXHeader}\u{10ffff}`, { include_docs: true });
for await (const v of plugins) {
const path = v.path || this.getPath(v);
if (updatedDocumentPath && updatedDocumentPath != path) continue;
this.pluginScanProcessor.enqueue(v);
}
} finally {
pluginIsEnumerating.set(false);
});
}
pluginIsEnumerating.set(false);
// return entries;
}
async compareUsingDisplayData(dataA: PluginDataExDisplay, dataB: PluginDataExDisplay) {
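The rewritten plugin scan above is built on the new `QueueProcessor` pipeline from the updated `lib/src` submodule, whose implementation is not part of this diff. The following is only a minimal sketch of the usage pattern visible in this hunk — batched handlers, `pipeTo`, `startPipeline`, `onIdle`, and `enqueue` — with hypothetical stage logic:

```ts
import { QueueProcessor } from "./lib/src/processor";

// Hypothetical loader used by the first stage.
async function loadItem(path: string): Promise<{ path: string } | undefined> {
    return { path };
}

// Stage 1: loads items one at a time (batchSize: 1), up to 5 in parallel.
// Returning an array passes the results to the next stage; returning
// undefined drops the item.
const loader = new QueueProcessor(async (paths: string[]) => {
    const item = await loadItem(paths[0]);
    return item ? [item] : undefined;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 300, yieldThreshold: 10 });

// Stage 2: consumes the loaded items in large batches.
const collector = new QueueProcessor((items: { path: string }[]) => {
    console.log(`Collected ${items.length} items`); // illustrative consumer
    return;
}, { suspended: true, batchSize: 1000, concurrentLimit: 1, delay: 200 });

// Wire the stages together and start; onIdle fires once everything is drained.
loader.pipeTo(collector).startPipeline().root.onIdle(() => {
    console.log("All queued items processed");
});

loader.enqueue("plugins/example/data.json"); // feed work into the head of the pipeline
```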

View File

@@ -1,16 +1,18 @@
import { normalizePath, type PluginManifest } from "./deps";
import { type EntryDoc, type LoadedEntry, type InternalFileEntry, type FilePathWithPrefix, type FilePath, LOG_LEVEL_INFO, LOG_LEVEL_NOTICE, LOG_LEVEL_VERBOSE, MODE_SELECTIVE, MODE_PAUSED, type SavingEntry } from "./lib/src/types";
import { type InternalFileInfo, ICHeader, ICHeaderEnd } from "./types";
import { Parallels, createBinaryBlob, delay, isDocContentSame } from "./lib/src/utils";
import { createBinaryBlob, delay, isDocContentSame } from "./lib/src/utils";
import { Logger } from "./lib/src/logger";
import { PouchDB } from "./lib/src/pouchdb-browser.js";
import { scheduleTask, isInternalMetadata, PeriodicProcessor } from "./utils";
import { isInternalMetadata, PeriodicProcessor } from "./utils";
import { WrappedNotice } from "./lib/src/wrapper";
import { decodeBinary, encodeBinary } from "./lib/src/strbin";
import { serialized } from "./lib/src/lock";
import { JsonResolveModal } from "./JsonResolveModal";
import { LiveSyncCommands } from "./LiveSyncCommands";
import { addPrefix, stripAllPrefixes } from "./lib/src/path";
import { KeyedQueueProcessor, QueueProcessor } from "./lib/src/processor";
import { hiddenFilesEventCount, hiddenFilesProcessingCount } from "./lib/src/stores";
export class HiddenFileSync extends LiveSyncCommands {
periodicInternalFileScanProcessor: PeriodicProcessor = new PeriodicProcessor(this.plugin, async () => this.settings.syncInternalFiles && this.localDatabase.isReady && await this.syncInternalFilesAndDatabase("push", false));
@@ -75,22 +77,17 @@ export class HiddenFileSync extends LiveSyncCommands {
return;
}
procInternalFiles: string[] = [];
async execInternalFile() {
await serialized("execInternal", async () => {
const w = [...this.procInternalFiles];
this.procInternalFiles = [];
Logger(`Applying hidden ${w.length} files change...`);
await this.syncInternalFilesAndDatabase("pull", false, false, w);
Logger(`Applying hidden ${w.length} files changed`);
});
}
procInternalFile(filename: string) {
this.procInternalFiles.push(filename);
scheduleTask("procInternal", 500, async () => {
await this.execInternalFile();
});
this.internalFileProcessor.enqueueWithKey(filename, filename);
}
internalFileProcessor = new KeyedQueueProcessor<string, any>(
async (filenames) => {
Logger(`START :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
await this.syncInternalFilesAndDatabase("pull", false, false, filenames);
Logger(`DONE :Applying hidden ${filenames.length} files change`, LOG_LEVEL_VERBOSE);
return;
}, { batchSize: 100, concurrentLimit: 1, delay: 100, yieldThreshold: 10, suspended: false, totalRemainingReactiveSource: hiddenFilesEventCount }
);
recentProcessedInternalFiles = [] as string[];
async watchVaultRawEventsAsync(path: FilePath) {
@@ -278,28 +275,38 @@ export class HiddenFileSync extends LiveSyncCommands {
acc[stripAllPrefixes(this.getPath(cur))] = cur;
return acc;
}, {} as { [key: string]: InternalFileEntry; });
const para = Parallels();
for (const filename of allFileNames) {
await new QueueProcessor(async (filenames: FilePath[]) => {
const filename = filenames[0];
processed++;
if (processed % 100 == 0) {
Logger(`Hidden file: ${processed}/${fileCount}`, logLevel, "sync_internal");
}
if (!filename) continue;
if (!filename) return;
if (ignorePatterns.some(e => filename.match(e)))
continue;
return;
if (await this.plugin.isIgnoredByIgnoreFiles(filename)) {
continue;
return;
}
const fileOnStorage = filename in filesMap ? filesMap[filename] : undefined;
const fileOnDatabase = filename in filesOnDBMap ? filesOnDBMap[filename] : undefined;
const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
await para.wait(5);
const proc = (async (xFileOnStorage: InternalFileInfo, xFileOnDatabase: InternalFileEntry) => {
return [{
filename,
fileOnStorage,
fileOnDatabase,
}]
}, { suspended: true, batchSize: 1, concurrentLimit: 10, delay: 0, totalRemainingReactiveSource: hiddenFilesProcessingCount })
.pipeTo(new QueueProcessor(async (params) => {
const
{
filename,
fileOnStorage: xFileOnStorage,
fileOnDatabase: xFileOnDatabase
} = params[0];
if (xFileOnStorage && xFileOnDatabase) {
const cache = filename in caches ? caches[filename] : { storageMtime: 0, docMtime: 0 };
// Both => Synchronize
if ((direction != "pullForce" && direction != "pushForce") && xFileOnDatabase.mtime == cache.docMtime && xFileOnStorage.mtime == cache.storageMtime) {
return;
@@ -340,11 +347,12 @@ export class HiddenFileSync extends LiveSyncCommands {
throw new Error("Invalid state on hidden file sync");
// Something corrupted?
}
return;
}, { suspended: true, batchSize: 1, concurrentLimit: 5, delay: 0 }))
.root
.enqueueAll(allFileNames)
.startPipeline().waitForPipeline();
});
para.add(proc(fileOnStorage, fileOnDatabase))
}
await para.all();
await this.kvDB.set("diff-caches-internal", caches);
// When files have been retrieved from the database, they must be reloaded.
@@ -436,7 +444,7 @@ export class HiddenFileSync extends LiveSyncCommands {
type: "newnote",
};
} else {
if (isDocContentSame(old.data, content) && !forceWrite) {
if (await isDocContentSame(old.data, content) && !forceWrite) {
// Logger(`STORAGE --> DB:${file.path}: (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return;
}
@@ -560,7 +568,7 @@ export class HiddenFileSync extends LiveSyncCommands {
} else {
const contentBin = await this.plugin.vaultAccess.adapterReadBinary(filename);
const content = await encodeBinary(contentBin);
if (isDocContentSame(content, fileOnDB.data) && !force) {
if (await isDocContentSame(content, fileOnDB.data) && !force) {
// Logger(`STORAGE <-- DB:${filename}: skipped (hidden) Not changed`, LOG_LEVEL_VERBOSE);
return true;
}
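The new `internalFileProcessor` above replaces the `scheduleTask`-based debounce with a `KeyedQueueProcessor`, also from the updated submodule. Below is a minimal sketch of the enqueue pattern shown in this hunk; the key is the filename, which presumably lets the processor coalesce repeated events for the same file, and the handler body is illustrative:

```ts
import { KeyedQueueProcessor } from "./lib/src/processor";

// The handler receives queued filenames in batches of up to 100.
const hiddenFileQueue = new KeyedQueueProcessor<string, any>(async (filenames) => {
    console.log(`Applying ${filenames.length} hidden file changes`); // illustrative work
}, { batchSize: 100, concurrentLimit: 1, delay: 100, yieldThreshold: 10, suspended: false });

// Each event is enqueued under its filename as the key.
hiddenFileQueue.enqueueWithKey(".obsidian/app.json", ".obsidian/app.json");
hiddenFileQueue.enqueueWithKey(".obsidian/app.json", ".obsidian/app.json"); // same key: presumably coalesced
```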

View File

@@ -228,7 +228,7 @@ export class PluginAndTheirSettings extends LiveSyncCommands {
if (old !== false) {
const oldData = { data: old.data, deleted: old._deleted };
const newData = { data: d.data, deleted: d._deleted };
if (isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
if (await isDocContentSame(oldData.data, newData.data) && oldData.deleted == newData.deleted) {
Logger(`Nothing changed:${m.name}`);
return;
}

View File

@@ -321,7 +321,6 @@ Of course, we are able to disable these features.`
this.plugin.settings.suspendFileWatching = false;
await this.plugin.syncAllFiles(true);
await this.plugin.loadQueuedFiles();
this.plugin.procQueuedFiles();
await this.plugin.saveSettings();
}

View File

@@ -66,10 +66,7 @@
for (const revInfo of reversedRevs) {
if (revInfo.status == "available") {
const doc =
(!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev)
? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true)
: await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
const doc = (!isPlain && showDiffInfo) || (checkStorageDiff && revInfo.rev == docA._rev) ? await db.getDBEntry(path, { rev: revInfo.rev }, false, false, true) : await db.getDBEntryMeta(path, { rev: revInfo.rev }, true);
if (doc === false) continue;
const rev = revInfo.rev;
@@ -112,11 +109,11 @@
let result = false;
if (isPlainText(docA.path)) {
const data = await plugin.vaultAccess.adapterRead(abs);
result = isDocContentSame(data, doc.data);
result = await isDocContentSame(data, doc.data);
} else {
const data = await plugin.vaultAccess.adapterReadBinary(abs);
const dataEEncoded = createBinaryBlob(data);
result = isDocContentSame(dataEEncoded, doc.data);
result = await isDocContentSame(dataEEncoded, doc.data);
}
if (result) {
diffDetail += " ⚖️";

View File

@@ -1,5 +1,6 @@
import { App, Modal } from "./deps";
import { logMessageStore } from "./lib/src/stores";
import type { ReactiveInstance, } from "./lib/src/reactive";
import { logMessages } from "./lib/src/stores";
import { escapeStringToHTML } from "./lib/src/strbin";
import ObsidianLiveSyncPlugin from "./main";
@@ -21,14 +22,16 @@ export class LogDisplayModal extends Modal {
div.addClass("op-scrollable");
div.addClass("op-pre");
this.logEl = div;
this.unsubscribe = logMessageStore.observe((e) => {
function updateLog(logs: ReactiveInstance<string[]>) {
const e = logs.value;
let msg = "";
for (const v of e) {
msg += escapeStringToHTML(v) + "<br>";
}
this.logEl.innerHTML = msg;
})
logMessageStore.invalidate();
}
logMessages.onChanged(updateLog);
this.unsubscribe = () => logMessages.offChanged(updateLog);
}
onClose() {
const { contentEl } = this;
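Both this modal and the Svelte log pane below switch from `logMessageStore.observe(...)` to the `ReactiveInstance` API of the updated submodule. A minimal sketch of the subscription pattern as it appears in this diff; the store and type names are taken from the diff, while the rendering is illustrative:

```ts
import { logMessages } from "./lib/src/stores";
import type { ReactiveInstance } from "./lib/src/reactive";

// The handler receives the reactive instance; the current value is read from `.value`.
function renderLog(logs: ReactiveInstance<string[]>) {
    const lines = logs.value;
    console.log(lines.join("\n")); // illustrative rendering
}

logMessages.onChanged(renderLog);                            // subscribe
const unsubscribe = () => logMessages.offChanged(renderLog); // keep for clean-up
// ...later, e.g. in onClose() / onDestroy():
unsubscribe();
```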

View File

@@ -1,26 +1,27 @@
<script lang="ts">
import { onDestroy, onMount } from "svelte";
import { logMessageStore } from "./lib/src/stores";
import { logMessages } from "./lib/src/stores";
import type { ReactiveInstance } from "./lib/src/reactive";
import { Logger } from "./lib/src/logger";
let unsubscribe: () => void;
let messages = [] as string[];
let wrapRight = false;
let autoScroll = true;
let suspended = false;
function updateLog(logs: ReactiveInstance<string[]>) {
const e = logs.value;
if (!suspended) {
messages = [...e];
setTimeout(() => {
if (scroll) scroll.scrollTop = scroll.scrollHeight;
}, 10);
}
}
onMount(async () => {
unsubscribe = logMessageStore.observe((e) => {
if (!suspended) {
messages = [...e];
if (autoScroll) {
if (scroll) scroll.scrollTop = scroll.scrollHeight;
}
}
});
logMessageStore.invalidate();
setTimeout(() => {
if (scroll) scroll.scrollTop = scroll.scrollHeight;
}, 100);
logMessages.onChanged(updateLog);
Logger("Log window opened");
unsubscribe = () => logMessages.offChanged(updateLog);
});
onDestroy(() => {
if (unsubscribe) unsubscribe();

View File

@@ -4,7 +4,7 @@ import { delay } from "./lib/src/utils";
import { Semaphore } from "./lib/src/semaphore";
import { versionNumberString2Number } from "./lib/src/strbin";
import { Logger } from "./lib/src/logger";
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb.js";
import { checkSyncInfo, isCloudantURI } from "./lib/src/utils_couchdb";
import { testCrypt } from "./lib/src/e2ee_v2";
import ObsidianLiveSyncPlugin from "./main";
import { askYesNo, performRebuildDB, requestToCouchDB, scheduleTask } from "./utils";
@@ -746,8 +746,20 @@ export class ObsidianLiveSyncSettingTab extends PluginSettingTab {
toggle.setValue(this.plugin.settings.showStatusOnEditor).onChange(async (value) => {
this.plugin.settings.showStatusOnEditor = value;
await this.plugin.saveSettings();
this.display();
})
);
if (this.plugin.settings.showStatusOnEditor) {
new Setting(containerGeneralSettingsEl)
.setName("Show status as icons only")
.setDesc("")
.addToggle((toggle) =>
toggle.setValue(this.plugin.settings.showOnlyIconsOnEditor).onChange(async (value) => {
this.plugin.settings.showOnlyIconsOnEditor = value;
await this.plugin.saveSettings();
})
);
}
containerGeneralSettingsEl.createEl("h4", { text: "Logging" });
new Setting(containerGeneralSettingsEl)
@@ -1813,7 +1825,7 @@ ${stringifyYaml(pluginConfig)}`;
.setClass("wizardHidden")
.addDropdown((dropdown) =>
dropdown
.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)" } as Record<HashAlgorithm, string>)
.addOptions({ "": "Old Algorithm", "xxhash32": "xxhash32 (Fast)", "xxhash64": "xxhash64 (Fastest)", "sha1": "Fallback (Without WebAssembly)" } as Record<HashAlgorithm, string>)
.setValue(this.plugin.settings.hashAlg)
.onChange(async (value) => {
this.plugin.settings.hashAlg = value as HashAlgorithm;
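The new `sha1` option relies on the `sha1` helper imported from `./lib/src/strbin`, which lives in the updated submodule and is not visible in this diff. As a rough idea of what a fallback digest without WebAssembly can look like, here is a hedged sketch using the standard Web Crypto API; the function name and hex encoding are assumptions, not necessarily what the library actually does:

```ts
// Hypothetical SHA-1 digest without WebAssembly, via the Web Crypto API.
async function sha1Hex(src: string): Promise<string> {
    const bytes = new TextEncoder().encode(src);
    const digest = await crypto.subtle.digest("SHA-1", bytes);
    return Array.from(new Uint8Array(digest))
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
}

// Hypothetical usage: const hash = await sha1Hex(getDocData(work.data));
```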

View File

@@ -1,6 +1,7 @@
import { type App, TFile, type DataWriteOptions, TFolder, TAbstractFile } from "./deps";
import { serialized } from "./lib/src/lock";
import type { FilePath } from "./lib/src/types";
import { createBinaryBlob, isDocContentSame } from "./lib/src/utils";
function getFileLockKey(file: TFile | TFolder | string) {
return `fl:${typeof (file) == "string" ? file : file.path}`;
}
@@ -65,9 +66,22 @@ export class SerializedFileAccess {
async vaultModify(file: TFile, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions) {
if (typeof (data) === "string") {
return await serialized(getFileLockKey(file), () => this.app.vault.modify(file, data, options));
return await serialized(getFileLockKey(file), async () => {
const oldData = await this.app.vault.read(file);
if (data === oldData) return false
await this.app.vault.modify(file, data, options)
return true;
}
);
} else {
return await serialized(getFileLockKey(file), () => this.app.vault.modifyBinary(file, toArrayBuffer(data), options));
return await serialized(getFileLockKey(file), async () => {
const oldData = await this.app.vault.readBinary(file);
if (isDocContentSame(createBinaryBlob(oldData), createBinaryBlob(data))) {
return false;
}
await this.app.vault.modifyBinary(file, toArrayBuffer(data), options)
return true;
});
}
}
async vaultCreate(path: string, data: string | ArrayBuffer | Uint8Array, options?: DataWriteOptions): Promise<TFile> {
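With this change, `vaultModify` first reads the existing content and returns `false` when writing would change nothing, so callers can use the return value to skip follow-up work. A small hypothetical usage sketch (the `vaultAccess` instance, `file`, and `newContent` are assumed):

```ts
// Only trigger follow-up work when the file was actually rewritten.
const changed = await vaultAccess.vaultModify(file, newContent);
if (changed) {
    console.log(`${file.path} has been updated`);
}
```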

View File

@@ -1,15 +1,13 @@
import type { SerializedFileAccess } from "./SerializedFileAccess";
import { Plugin, TAbstractFile, TFile, TFolder } from "./deps";
import { isPlainText, shouldBeIgnored } from "./lib/src/path";
import { getGlobalStore } from "./lib/src/store";
import type { KeyedQueueProcessor } from "./lib/src/processor";
import { type FilePath, type ObsidianLiveSyncSettings } from "./lib/src/types";
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo, type queueItem } from "./types";
import { type FileEventItem, type FileEventType, type FileInfo, type InternalFileInfo } from "./types";
export abstract class StorageEventManager {
abstract fetchEvent(): FileEventItem | false;
abstract cancelRelativeEvent(item: FileEventItem): void;
abstract getQueueLength(): number;
abstract beginWatch(): void;
}
type LiveSyncForStorageEventManager = Plugin &
@@ -19,19 +17,18 @@ type LiveSyncForStorageEventManager = Plugin &
vaultAccess: SerializedFileAccess
} & {
isTargetFile: (file: string | TAbstractFile) => Promise<boolean>,
procFileEvent: (applyBatch?: boolean) => Promise<any>,
fileEventQueue: KeyedQueueProcessor<FileEventItem, any>
};
export class StorageEventManagerObsidian extends StorageEventManager {
plugin: LiveSyncForStorageEventManager;
queuedFilesStore = getGlobalStore("queuedFiles", { queuedItems: [] as queueItem[], fileEventItems: [] as FileEventItem[] });
watchedFileEventQueue = [] as FileEventItem[];
constructor(plugin: LiveSyncForStorageEventManager) {
super();
this.plugin = plugin;
}
beginWatch() {
const plugin = this.plugin;
this.watchVaultChange = this.watchVaultChange.bind(this);
this.watchVaultCreate = this.watchVaultCreate.bind(this);
this.watchVaultDelete = this.watchVaultDelete.bind(this);
@@ -43,6 +40,7 @@ export class StorageEventManagerObsidian extends StorageEventManager {
plugin.registerEvent(plugin.app.vault.on("create", this.watchVaultCreate));
//@ts-ignore : Internal API
plugin.registerEvent(plugin.app.vault.on("raw", this.watchVaultRawEvents));
plugin.fileEventQueue.startPipeline();
}
watchVaultCreate(file: TAbstractFile, ctx?: any) {
@@ -90,7 +88,6 @@ export class StorageEventManagerObsidian extends StorageEventManager {
}
// Cache the file content and wait until it can be processed.
async appendWatchEvent(params: { type: FileEventType, file: TAbstractFile | InternalFileInfo, oldPath?: string }[], ctx?: any) {
let forcePerform = false;
for (const param of params) {
if (shouldBeIgnored(param.file.path)) {
continue;
@@ -116,33 +113,6 @@ export class StorageEventManagerObsidian extends StorageEventManager {
if (!cache) cache = await this.plugin.vaultAccess.vaultRead(file);
}
}
if (type == "DELETE" || type == "RENAME") {
forcePerform = true;
}
if (this.plugin.settings.batchSave && !this.plugin.settings.liveSync) {
// if the latest event is the same type, omit that
// a.md MODIFY <- this should be cancelled when a.md MODIFIED
// b.md MODIFY <- this should be cancelled when b.md MODIFIED
// a.md MODIFY
// a.md CREATE
// :
let i = this.watchedFileEventQueue.length;
L1:
while (i >= 0) {
i--;
if (i < 0) break L1;
if (this.watchedFileEventQueue[i].args.file.path != file.path) {
continue L1;
}
if (this.watchedFileEventQueue[i].type != type) break L1;
this.watchedFileEventQueue.remove(this.watchedFileEventQueue[i]);
//this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
}
}
const fileInfo = file instanceof TFile ? {
ctime: file.stat.ctime,
mtime: file.stat.mtime,
@@ -150,7 +120,8 @@ export class StorageEventManagerObsidian extends StorageEventManager {
path: file.path,
size: file.stat.size
} as FileInfo : file as InternalFileInfo;
this.watchedFileEventQueue.push({
this.plugin.fileEventQueue.enqueueWithKey(`file-${fileInfo.path}`, {
type,
args: {
file: fileInfo,
@@ -161,21 +132,5 @@ export class StorageEventManagerObsidian extends StorageEventManager {
key: atomicKey
})
}
// this.queuedFilesStore.set({ queuedItems: this.queuedFiles, fileEventItems: this.watchedFileEventQueue });
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
this.plugin.procFileEvent(forcePerform);
}
fetchEvent(): FileEventItem | false {
if (this.watchedFileEventQueue.length == 0) return false;
const item = this.watchedFileEventQueue.shift();
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
return item;
}
cancelRelativeEvent(item: FileEventItem) {
this.watchedFileEventQueue = [...this.watchedFileEventQueue].filter(e => e.key != item.key);
this.queuedFilesStore.apply((value) => ({ ...value, fileEventItems: this.watchedFileEventQueue }));
}
getQueueLength() {
return this.watchedFileEventQueue.length;
}
}

Submodule src/lib updated: 7e79c27035...33f7e69433

File diff suppressed because it is too large

View File

@@ -1,4 +1,4 @@
import { normalizePath, TFile, Platform, TAbstractFile, App, Plugin, type RequestUrlParam, requestUrl } from "./deps";
import { normalizePath, Platform, TAbstractFile, App, Plugin, type RequestUrlParam, requestUrl } from "./deps";
import { path2id_base, id2path_base, isValidFilenameInLinux, isValidFilenameInDarwin, isValidFilenameInWidows, isValidFilenameInAndroid, stripAllPrefixes } from "./lib/src/path";
import { Logger } from "./lib/src/logger";
@@ -8,6 +8,8 @@ import { InputStringDialog, PopoverSelectString } from "./dialogs";
import ObsidianLiveSyncPlugin from "./main";
import { writeString } from "./lib/src/strbin";
export { scheduleTask, setPeriodicTask, cancelTask, cancelAllTasks, cancelPeriodicTask, cancelAllPeriodicTask, } from "./lib/src/task";
// For backward compatibility, using the path for determining id.
// Only CouchDB unacceptable ID (that starts with an underscore) has been prefixed with "/".
// The first slash will be deleted when the path is normalized.
@@ -43,49 +45,6 @@ export function getPathFromTFile(file: TAbstractFile) {
return file.path as FilePath;
}
const tasks: { [key: string]: ReturnType<typeof setTimeout> } = {};
export function scheduleTask(key: string, timeout: number, proc: (() => Promise<any> | void), skipIfTaskExist?: boolean) {
if (skipIfTaskExist && key in tasks) {
return;
}
cancelTask(key);
tasks[key] = setTimeout(async () => {
delete tasks[key];
await proc();
}, timeout);
}
export function cancelTask(key: string) {
if (key in tasks) {
clearTimeout(tasks[key]);
delete tasks[key];
}
}
export function cancelAllTasks() {
for (const v in tasks) {
clearTimeout(tasks[v]);
delete tasks[v];
}
}
const intervals: { [key: string]: ReturnType<typeof setInterval> } = {};
export function setPeriodicTask(key: string, timeout: number, proc: (() => Promise<any> | void)) {
cancelPeriodicTask(key);
intervals[key] = setInterval(async () => {
delete intervals[key];
await proc();
}, timeout);
}
export function cancelPeriodicTask(key: string) {
if (key in intervals) {
clearInterval(intervals[key]);
delete intervals[key];
}
}
export function cancelAllPeriodicTask() {
for (const v in intervals) {
clearInterval(intervals[v]);
delete intervals[v];
}
}
const memos: { [key: string]: any } = {};
export function memoObject<T>(key: string, obj: T): T {
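The task helpers removed here now live in `lib/src/task` and are re-exported from `utils.ts` (see the `export { scheduleTask, ... }` line above). Based on the signature shown in the removed code, repeated calls with the same key cancel the previous timer, which gives a simple debounce; a minimal usage sketch with a hypothetical callback:

```ts
import { scheduleTask, cancelTask } from "./lib/src/task";

// Debounce: each call with the same key resets the 200 ms timer (unless
// skipIfTaskExist is passed), so the callback runs once after the last call.
scheduleTask("update-plugin-list-task", 200, async () => {
    console.log("refreshing plugin list"); // hypothetical work
});

// Cancel explicitly if the pending task is no longer needed.
cancelTask("update-plugin-list-task");
```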

View File

@@ -16,6 +16,7 @@
"importHelpers": false,
"alwaysStrict": true,
"allowImportingTsExtensions": true,
"noEmit": true,
"lib": [
"es2018",
"DOM",

View File

@@ -1,31 +1,33 @@
### 0.21.0
The E2EE encryption V2 format has been reverted, as it was probably the cause of the glitch.
Instead, to maintain efficiency, files are treated as Blobs until just before saving. Along with this, the old-fashioned encryption format has also been discontinued.
Forward and backward compatibility with recent versions is maintained. Unfortunately, however, compatibility with filesystem-livesync and some other tools has been lost.
This will be addressed soon. Please be patient if you are using filesystem-livesync with E2EE.
### 0.22.0
A few years have passed since Self-hosted LiveSync was born, and our codebase has grown very complicated. We could tolerate this for now, but it would become a tremendous burden later.
Therefore, at v0.22.0, I have completely refined the task scheduling logic for future maintainability.
Of course, this may cause some pain in some cases. However, I would love to ask for your cooperation and contribution.
Sorry for being absent for so long, and thank you for your patience!
Note: we also gained a significant performance improvement.
#### Version history
- 0.21.2
- IMPORTANT NOTICE: **0.21.1 CONTAINS A BUG IN REBUILDING THE DATABASE. IF YOU HAVE REBUILT, PLEASE MAKE SURE THAT ALL FILES ARE INTACT.**
- This has been fixed in this version.
- 0.22.0
- Refined:
- Task scheduling logic has been rewritten.
- Screen updates are also now more efficient.
- Possibly many bugs and fragile behaviours have been fixed.
- Status updates and logging have been thinned out for display.
- Fixed:
- Files are no longer broken while rebuilding.
- Large binary files can now be written correctly on mobile platforms.
- Decoding errors now result in zero-byte files.
- Modified:
- All files are now processed sequentially, on a per-file basis.
- 0.21.1
- Fixed:
- No more infinite loops on larger files.
- Show message on decode error.
- Refactored:
- Fixed to avoid obsolete global variables.
- 0.21.0
- Changes and performance improvements:
- Files being saved are now processed as Blobs.
- The V2 format has been reverted.
- The new encoding format is now enabled by default.
- WARNING: Since this version, compatibility with older Filesystem LiveSync has been lost.
- Remote chunk fetching now keeps request intervals.
- New feature:
- The status in the editor can now be shown as icons only.
- Progress indicators are now more meaningful:
- 📥 Unprocessed transferred items
- 📄 Working database operation
- 💾 Working write storage processes
- ⏳ Working read storage processes
- 🛫 Pending read storage processes
- ⚙️ Working or pending storage processes of hidden files
- 🧩 Waiting chunks
- 🔌 Working Customisation items (Configuration, snippets and plug-ins)
... To continue on to `updates_old.md`.

View File

@@ -1,3 +1,43 @@
### 0.21.0
The E2EE encryption V2 format has been reverted, as it was probably the cause of the glitch.
Instead, to maintain efficiency, files are treated as Blobs until just before saving. Along with this, the old-fashioned encryption format has also been discontinued.
Forward and backward compatibility with recent versions is maintained. Unfortunately, however, compatibility with filesystem-livesync and some other tools has been lost.
This will be addressed soon. Please be patient if you are using filesystem-livesync with E2EE.
- 0.21.5
- Improved:
- All revisions are now shown with only their first few letters.
- Document IDs are now shown in the log with only their first 8 letters.
- Fixed:
- A check before modifying files has been implemented.
- Content change detection has been improved.
- 0.21.4
- This release had been skipped.
- 0.21.3
- Implemented:
- SHA-1 can now be used as a fallback hash function.
- 0.21.2
- IMPORTANT NOTICE: **0.21.1 CONTAINS A BUG IN REBUILDING THE DATABASE. IF YOU HAVE REBUILT, PLEASE MAKE SURE THAT ALL FILES ARE INTACT.**
- This has been fixed in this version.
- Fixed:
- Files are no longer broken while rebuilding.
- Large binary files can now be written correctly on mobile platforms.
- Decoding errors now result in zero-byte files.
- Modified:
- All files are now processed sequentially, on a per-file basis.
- 0.21.1
- Fixed:
- No more infinite loops on larger files.
- Show message on decode error.
- Refactored:
- Fixed to avoid obsolete global variables.
- 0.21.0
- Changes and performance improvements:
- Files being saved are now processed as Blobs.
- The V2 format has been reverted.
- The new encoding format is now enabled by default.
- WARNING: Since this version, compatibility with older Filesystem LiveSync has been lost.
## 0.20.0
At 0.20.0, Self-hosted LiveSync changed the binary file format and the encryption format for efficient synchronisation.
A dialogue will be shown asking us to decide whether to keep v1 or use v2. Once v2 has been enabled, all subsequent edits will be saved in v2. Therefore, devices running 0.19 or below cannot understand this and may report a decryption error. Please update all devices.