diff --git a/TECHNICAL.md b/TECHNICAL.md index 3f4d3ec..8d1cdf2 100644 --- a/TECHNICAL.md +++ b/TECHNICAL.md @@ -1088,7 +1088,7 @@ The `sortText` prefix ensures pages always appear before aliases in the list. Bo **Front matter suppression:** `isLineInsideFrontMatter()` checks whether the cursor is between the first two `---` lines. If so, no completions are returned — front matter aliases are plain strings, not wikilinks. -All four pure functions — `findInnermostOpenBracket()`, `findMatchingCloseBracket()`, `isLineInsideFrontMatter()`, and `isPositionInsideCode()` — live in `CompletionUtils.ts` with no VS Code dependency, and are fully unit-tested. +All five pure functions — `findInnermostOpenBracket()`, `findMatchingCloseBracket()`, `isLineInsideFrontMatter()`, `isPositionInsideCode()`, and `hasNewCompleteWikilink()` — live in `CompletionUtils.ts` with no VS Code dependency, and are fully unit-tested. **Code block detection:** `isPositionInsideCode(lines, lineIndex, charIndex)` scans lines 0 through `lineIndex` tracking fenced code block open/close state (supports both `` ` `` and `~` fences, respects fence length — a closing fence must use the same character and at least as many markers as the opener). If a fence is still open at `lineIndex`, the position is inside a code block. Separately checks inline code spans (`` ` ``) on the target line. @@ -1097,7 +1097,7 @@ All four pure functions — `findInnermostOpenBracket()`, `findMatchingCloseBrac The provider maintains a cached array of `CompletionItem[]` objects. The cache is rebuilt **eagerly** whenever `refresh()` is called — there is no dirty flag or lazy rebuild. - **Eager rebuild:** `refresh()` calls `rebuildCache()` immediately, running three SQLite queries (`getAllPages`, `getAllAliases`, `getForwardReferencedPages`) and building lightweight plain data objects (not `CompletionItem` instances). 
This happens at index-update time (file save, create, delete, rename, periodic scan, rebuild), not at `[[` keystroke time. -- **Not called on text-change debounce:** The 500 ms `onDidChangeTextDocument` debounce handler does **not** call `completionProvider.refresh()`. This eliminates three SQLite queries on every typing pause. Forward references appear in autocomplete after the next file save. +- **Selective text-change refresh:** The 500 ms `onDidChangeTextDocument` debounce handler re-indexes the live buffer and calls `completionProvider.refresh()` **only when** `hasNewCompleteWikilink()` detects that the current document now contains at least one newly added complete wikilink compared to the page's last indexed links. This makes new forward references available without save while avoiding the refresh cost on ordinary typing edits. - **Warm on activation:** `enterFullMode()` calls `completionProvider.refresh()` immediately after construction (post stale-scan), so the cache is warm before the user types the first `[[`. - **Not called on editor switch:** The `onDidChangeActiveTextEditor` handler does **not** call `completionProvider.refresh()`. Completion data is global (all pages, aliases, forward refs) and is unaffected by which tab is focused. This avoids 3 SQLite queries on every tab switch. - **Hot path:** `provideCompletionItems()` never touches the database. It builds `CompletionItem` instances from lightweight cached data objects with the per-call replacement range and returns them inside a `CompletionList`. @@ -1128,11 +1128,11 @@ The listener only fires on deletions (where `rangeLength > 0` and `text` is empt When the user types inside a wikilink, two things happen on every keystroke: 1. `WikilinkRenameTracker.onDocumentChanged` — records `pendingEdit` (the outermost wikilink position the cursor is inside). This edit state is cleared and rename detection runs when the cursor exits the wikilink. -2. 
The 500 ms `onDidChangeTextDocument` debounce in `extension.ts` — re-indexes the live buffer into the in-memory DB so that newly typed forward references appear in autocomplete immediately. +2. The 500 ms `onDidChangeTextDocument` debounce in `extension.ts` — re-indexes the live buffer into the in-memory DB and selectively refreshes autocomplete when a newly added complete wikilink is detected. These two behaviours interact at the index: rename detection (`checkForRenames`) works by comparing the **current DB state** (the last-indexed link positions and names) against the live document. If the debounce fires and re-indexes the document before the cursor exits the wikilink, the DB is updated to reflect the edited name. When `checkForRenames` later runs after cursor exit, it compares the edited name in the DB against the same name in the live document — no difference is detected, and the rename dialog is never shown. -**Guard:** The debounce callback in `extension.ts` checks `renameTracker.hasPendingEdit(doc.uri.toString())` before calling `indexFileContent`. If a pending edit is active for that document, the re-index is skipped for that tick. `WikilinkRenameTracker` exposes: +**Guard:** The debounce callback in `extension.ts` checks `renameTracker.hasPendingEdit(doc.uri.toString())` before calling `indexFileContent` or refreshing completion. If a pending edit is active for that document, the re-index is skipped for that tick. `WikilinkRenameTracker` exposes: ```typescript hasPendingEdit(docKey: string): boolean @@ -1310,7 +1310,15 @@ Diagnostic logging is provided by `LogService` (`src/LogService.ts`), a pure Nod **Log format:** `[ISO timestamp] [LEVEL] tag: message` -**Usage in services:** `LogService` is injected as an optional constructor parameter (defaulting to `NO_OP_LOGGER`) into `IndexService`, `IndexScanner`, `WikilinkDecorationManager`, and `WikilinkCompletionProvider`. The `extension.ts` module also uses the instance directly for lifecycle logging. 
A `NO_OP_LOGGER` singleton is used when logging is disabled, avoiding null checks throughout the codebase. +**Usage in services:** `LogService` is injected as an optional constructor parameter (defaulting to `NO_OP_LOGGER`) into `IndexService`, `IndexScanner`, `WikilinkDecorationManager`, `WikilinkCompletionProvider`, and other logger-aware services. The `extension.ts` module also uses the instance directly for lifecycle logging. A `NO_OP_LOGGER` singleton is used when logging is disabled, avoiding null checks throughout the codebase. + +For modules that are not constructed with an injected logger (for example some inline-editor / Mermaid helpers), `LogService.ts` also exposes a shared active logger: + +- `setActiveLogger(logger)` — called by `extension.ts` when logging is configured +- `getActiveLogger()` — returns the current active logger, or `NO_OP_LOGGER` when logging is off +- `formatLogError(error)` — normalises unknown exceptions into a string for logger output + +This allows the extension to avoid naked `console.log/warn/error` calls while still keeping all diagnostic output behind the same logging gate. **Timer API:** `logService.time(tag, label)` returns a closure that, when called, logs the elapsed milliseconds at INFO level — used for performance instrumentation throughout the decoration and completion pipelines. @@ -1576,9 +1584,42 @@ A single dialog lists all affected renames. For each rename: - If the old target file exists → shows as a file rename (e.g. `"Pagey.md" → "Page.md"`) - If the old file doesn't exist → shows as a link-only change +**Merge detection:** + +When the old target file exists, rename execution distinguishes three cases: + +1. **Direct file merge target exists** — the new page name resolves to an existing file by **direct filename match** anywhere in the notes tree. In this case the dialog switches to merge language and the source page is merged into the existing target page. +2. 
**Alias-only target resolution** — if the new page name only resolves via alias, no file merge is attempted. The operation falls back to a normal file rename path instead of merging into the alias's canonical page. +3. **No existing direct target** — the source file is renamed in place (same directory as the source file), preserving the original rename behaviour. + +Direct-merge resolution is intentionally global while plain rename destination selection remains local. This preserves the original "rename beside the source file" behaviour when there is no real merge target, while still allowing merges into an existing page in another folder. + +**Alias self-name guard:** + +If `resolveAlias(oldPageName)` returns the same page whose filename is `oldPageName.md`, the rename tracker does **not** treat this as a true alias rename. This avoids a front-matter alias like `aliases: [Pothos]` on `Pothos.md` blocking merge detection for `[[Pothos]] → [[Monstera]]`. + **Workspace-wide link update:** -`updateLinksInWorkspace()` finds all `.md` and `.markdown` files, parses each for wikilinks, and creates a `WorkspaceEdit` that replaces every `[[oldPageName]]` with `[[newPageName]]`. After applying the edit, it saves modified files. +`updateLinksInWorkspace()` finds all `.md` and `.markdown` files, parses each for wikilinks, and now uses a split write strategy: + +1. **Already-open documents** are updated through a `WorkspaceEdit` so their live editor buffers stay authoritative. +2. **Closed documents** are rewritten directly on disk using `workspace.fs.writeFile()` after a raw `fs.readFile()` pass. + +This avoids VS Code opening newly-dirty tabs for files that were closed before the rename, which in turn avoids working-copy save conflicts on those reference files. + +**Index-driven candidate narrowing:** + +Rename refactors no longer have to discover rewrite candidates by scanning the entire workspace up front. 
`IndexService.findPagesLinkingToPageNames(...)` queries the `links` table for distinct source pages whose indexed `page_name` matches one of the renamed targets. `updateLinksInWorkspace()` now accepts those candidate URIs and only opens that bounded set of files when a candidate set is provided. The actual rewrite still parses the real file text before editing, so the index narrows the search space but does not become the edit source of truth. + +**Notification progress:** + +Accepted in-editor rename operations are wrapped in `withWikilinkRenameProgress()` (`WikilinkRenameProgressService.ts`), which shows a non-cancellable VS Code notification while the slow path runs. The tracker reports three coarse phases: + +1. `Preparing rename operations` +2. `Updating links across workspace` +3. `Refreshing index` + +The progress notification is only shown after the user confirms the rename or merge. Declined or dismissed prompts remain a no-op aside from the existing decline re-index path. ### Post-rename index refresh @@ -1591,6 +1632,44 @@ After a rename operation completes, `refreshIndexAfterRename()` ensures the inde This explicit refresh prevents a stale-index window where the next edit event could compare against outdated links. The extension.ts save/rename handlers may also re-index some of these files (via event triggers), but the operations are idempotent — double-indexing is harmless and keeps the code robust. +### Explorer rename merge handling + +Explorer-driven file renames are handled separately in `extension.ts` via `onDidRenameFiles`. + +After the renamed file is indexed at its new path, AS Notes checks for filename collisions using `IndexService.findPagesByFilename(newFilename)`. Merge handling is intentionally conservative: + +1. Compute the notes-root-relative path of the just-renamed file. +2. Filter that path out of the duplicate list. +3. Only proceed with a merge when **exactly one** pre-existing target remains. +4. 
If multiple pre-existing targets remain, show a warning and skip the merge rather than picking an arbitrary file. + +This selection logic is isolated in `WikilinkExplorerMergeService.ts` so the ambiguity rules are unit-tested independently of the large `extension.ts` event handler. + +The user-confirmed refactor work that follows explorer renames is now extracted into `WikilinkExplorerRenameRefactorService.ts`. That helper applies the same notification UX as in-editor renames: + +1. Accepted merge operations show `AS Notes: Applying rename updates` while the merge, delete, and target re-index complete. +2. Accepted workspace-wide reference updates show `AS Notes: Updating wikilink references` while index-driven candidate rewrite and targeted file re-indexing complete. +3. Declined explorer prompts do not show progress notifications. + +For explorer renames, the old broad `staleScan()` follow-up has been replaced with targeted re-indexing of the files actually edited by the refactor. That removes a second whole-tree pass from the common rename path. + +`updateLinksInWorkspace()` no longer auto-saves affected open editors after applying workspace edits. Instead, both the in-editor and explorer rename flows now use `reindexWorkspaceUri(...)`: if an affected file is currently open, it is re-indexed from the live editor buffer via `indexFileContent(...)`; otherwise it falls back to `indexScanner.indexFile(...)`. `WikilinkRenameTracker` still re-indexes the initiating document from `document.getText()` when its URI is stable, and remaps any old source candidate URI to the post-rename or post-merge target before follow-up indexing. Combined with the direct-to-disk rewrite path for closed files, this avoids save conflicts on both open reference files and files that were previously closed, as well as attempts to reopen a source file that has just been renamed or deleted. 
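The open-editor-versus-disk dispatch in `reindexWorkspaceUri(...)` described above can be sketched as a small pure decision function. This is a simplified illustration, not the extension's actual code: `OpenDocument` and `ReindexSource` are hypothetical structural stand-ins for the VS Code types involved.

```typescript
// Structural stand-ins for vscode.TextDocument / vscode.Uri — hypothetical,
// used only so this sketch is self-contained and testable.
interface OpenDocument {
    uriString: string;
    getText(): string;
}

// 'buffer' -> re-index from the live editor buffer via indexFileContent(...)
// 'disk'   -> fall back to indexScanner.indexFile(...)
type ReindexSource =
    | { kind: 'buffer'; content: string }
    | { kind: 'disk' };

// Pure core of the dispatch: if the target file is open in an editor, its
// live buffer is authoritative; otherwise re-index the file from disk.
function chooseReindexSource(uriString: string, openDocs: OpenDocument[]): ReindexSource {
    for (const doc of openDocs) {
        if (doc.uriString === uriString) {
            return { kind: 'buffer', content: doc.getText() };
        }
    }
    return { kind: 'disk' };
}
```

In the real extension the open-document lookup would go through `vscode.workspace.textDocuments`; the point here is only the priority order — live buffer first, disk fallback — which is what avoids the save-conflict cases described above.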
+ +### Filename-level wikilink refactors + +Rename propagation now also updates filenames that contain the renamed wikilink text itself, not just file contents. The shared planner in `WikilinkFilenameRefactorService.ts` scans indexed pages for filenames containing `[[oldPageName]]` and computes additional file operations such as: + +1. `Topic [[Plant]].md` -> `Topic [[Tree]].md` +2. merge into an existing `Topic [[Tree]].md` when that target already exists + +The planner is reused by both `WikilinkRenameTracker.ts` and `WikilinkExplorerRenameRefactorService.ts`, so in-editor renames and explorer renames follow the same rules. It provides three guarantees: + +1. **Consistent rename/merge classification** - filename collisions become merges instead of failing midway. +2. **Safe ordering** - chained filename renames are topologically ordered so a rename that frees a target path can run before a dependent rename that needs that path. +3. **URI remapping for follow-up work** - candidate files selected for link rewrites or re-indexing are remapped through the filename operations so later content edits and index refreshes use the final file locations. + +This keeps filename refactors, content refactors, and index updates in sync even when nested wikilinks appear inside page filenames. + ### Re-entrancy guard The `isProcessing` flag prevents document-change events fired by the rename operation itself (file renames, workspace edits) from being treated as new user edits. It is set to `true` before any rename work begins and cleared in the `finally` block. diff --git a/common/src/FrontMatterService.ts b/common/src/FrontMatterService.ts index 51d3644..120af31 100644 --- a/common/src/FrontMatterService.ts +++ b/common/src/FrontMatterService.ts @@ -370,4 +370,157 @@ export class FrontMatterService { return null; // alias not found } + + /** + * Merge two markdown documents, combining front matter and appending bodies. 
+ * + * - Front matter is merged at the raw YAML line level so unknown/custom + * properties are preserved. + * - Target properties take priority when both documents define the same key. + * - Aliases are merged and deduplicated (case-sensitive). + * - The source body is appended after the target body, separated by a blank line. + * + * @param targetContent - The authoritative document content (priority for overlapping keys). + * @param sourceContent - The document being merged in. + * @returns The merged document content as a single string. + */ + mergeDocuments(targetContent: string, sourceContent: string): string { + const targetFm = this.extractFrontMatter(targetContent); + const sourceFm = this.extractFrontMatter(sourceContent); + const targetBody = this.stripFrontMatter(targetContent); + const sourceBody = this.stripFrontMatter(sourceContent); + + let mergedFrontMatter: string | null = null; + + if (targetFm !== null && sourceFm !== null) { + mergedFrontMatter = this.mergeFrontMatterBlocks(targetFm, sourceFm); + } else if (targetFm !== null) { + mergedFrontMatter = targetFm; + } else if (sourceFm !== null) { + mergedFrontMatter = sourceFm; + } + + // Build merged body: target body + blank line + source body + const trimmedTargetBody = targetBody.replace(/^\n+/, '').replace(/\n+$/, ''); + const trimmedSourceBody = sourceBody.replace(/^\n+/, '').replace(/\n+$/, ''); + let mergedBody: string; + if (trimmedTargetBody.length > 0 && trimmedSourceBody.length > 0) { + mergedBody = trimmedTargetBody + '\n\n' + trimmedSourceBody; + } else if (trimmedTargetBody.length > 0) { + mergedBody = trimmedTargetBody; + } else { + mergedBody = trimmedSourceBody; + } + + if (mergedFrontMatter !== null) { + return '---\n' + mergedFrontMatter + '\n---\n\n' + mergedBody; + } + return mergedBody; + } + + /** + * Merge two raw front matter blocks (without --- fences). + * Target keys take priority. Source-only keys are appended. + * Aliases are merged and deduplicated. 
+ */ + private mergeFrontMatterBlocks(targetFm: string, sourceFm: string): string { + const targetEntries = this.parseFrontMatterEntries(targetFm); + const sourceEntries = this.parseFrontMatterEntries(sourceFm); + + // Determine which source keys are missing from target + const targetKeys = new Set(targetEntries.map(e => e.key)); + const sourceOnlyEntries = sourceEntries.filter(e => e.key !== 'aliases' && !targetKeys.has(e.key)); + + // Handle aliases specially: merge and dedupe + const targetAliases = this.parseAliasesFromFrontMatter(targetFm); + const sourceAliases = this.parseAliasesFromFrontMatter(sourceFm); + const hasTargetAliases = targetKeys.has('aliases'); + const hasSourceAliases = sourceEntries.some(e => e.key === 'aliases'); + + // Start with the target front matter lines + const resultLines = targetFm.split(/\r?\n/); + + // Append source-only entries + for (const entry of sourceOnlyEntries) { + resultLines.push(...entry.lines); + } + + // Merge aliases if source has any that target doesn't + if (hasSourceAliases && sourceAliases.length > 0) { + const mergedAliases = [...targetAliases]; + for (const alias of sourceAliases) { + if (!mergedAliases.includes(alias)) { + mergedAliases.push(alias); + } + } + + if (hasTargetAliases) { + // Replace existing alias block in result + this.replaceAliasBlock(resultLines, mergedAliases); + } else { + // Append alias block + resultLines.push('aliases:'); + for (const alias of mergedAliases) { + resultLines.push(` - ${alias}`); + } + } + } + + return resultLines.join('\n'); + } + + /** + * Parse front matter into entries: each entry has a key and its raw lines + * (including any indented continuation lines like list items). 
+ */ + private parseFrontMatterEntries(frontMatter: string): { key: string; lines: string[] }[] { + const lines = frontMatter.split(/\r?\n/); + const entries: { key: string; lines: string[] }[] = []; + + for (let i = 0; i < lines.length; i++) { + const kvMatch = lines[i].match(/^(\w[\w-]*)\s*:/); + if (kvMatch) { + const key = kvMatch[1].toLowerCase(); + const entryLines = [lines[i]]; + // Collect indented continuation lines (list items, etc.) + while (i + 1 < lines.length && /^\s+/.test(lines[i + 1]) && !lines[i + 1].match(/^(\w[\w-]*)\s*:/)) { + i++; + entryLines.push(lines[i]); + } + entries.push({ key, lines: entryLines }); + } + } + + return entries; + } + + /** + * Replace the alias block in a set of front matter lines with merged aliases. + * Outputs list-style format. + */ + private replaceAliasBlock(lines: string[], mergedAliases: string[]): void { + // Find alias key line + let aliasStart = -1; + for (let i = 0; i < lines.length; i++) { + if (lines[i].match(/^aliases\s*:/)) { + aliasStart = i; + break; + } + } + if (aliasStart === -1) { return; } + + // Find end of alias block (next top-level key or end) + let aliasEnd = aliasStart + 1; + while (aliasEnd < lines.length && /^\s+/.test(lines[aliasEnd])) { + aliasEnd++; + } + + // Build replacement lines + const replacementLines = ['aliases:']; + for (const alias of mergedAliases) { + replacementLines.push(` - ${alias}`); + } + + lines.splice(aliasStart, aliasEnd - aliasStart, ...replacementLines); + } } diff --git a/common/test/FrontMatterService.test.ts b/common/test/FrontMatterService.test.ts index fa014c0..d8bcc31 100644 --- a/common/test/FrontMatterService.test.ts +++ b/common/test/FrontMatterService.test.ts @@ -303,3 +303,136 @@ describe('FrontMatterService — parseFrontMatterFields', () => { expect(fields.assets).toBe(false); }); }); + +describe('FrontMatterService - mergeDocuments', () => { + const service = new FrontMatterService(); + + it('should append source body after target body with blank line 
separator', () => { + const target = '# Target\n\nTarget content.'; + const source = '# Source\n\nSource content.'; + const result = service.mergeDocuments(target, source); + expect(result).toBe('# Target\n\nTarget content.\n\n# Source\n\nSource content.'); + }); + + it('should keep target front matter when source has none', () => { + const target = '---\ntitle: Target Title\n---\n\n# Target body'; + const source = '# Source body'; + const result = service.mergeDocuments(target, source); + expect(result).toBe('---\ntitle: Target Title\n---\n\n# Target body\n\n# Source body'); + }); + + it('should adopt source front matter when target has none', () => { + const target = '# Target body'; + const source = '---\ntitle: Source Title\n---\n\n# Source body'; + const result = service.mergeDocuments(target, source); + expect(result).toBe('---\ntitle: Source Title\n---\n\n# Target body\n\n# Source body'); + }); + + it('should give target priority for overlapping properties', () => { + const target = '---\ntitle: Target Title\npublic: true\n---\n\nTarget body'; + const source = '---\ntitle: Source Title\npublic: false\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('title: Target Title'); + expect(result).toContain('public: true'); + expect(result).not.toContain('Source Title'); + expect(result).not.toContain('public: false'); + }); + + it('should copy source-only properties to target', () => { + const target = '---\ntitle: Target Title\n---\n\nTarget body'; + const source = '---\ndraft: true\norder: 5\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('title: Target Title'); + expect(result).toContain('draft: true'); + expect(result).toContain('order: 5'); + }); + + it('should merge and dedupe aliases from both documents', () => { + const target = '---\naliases:\n - Alpha\n - Beta\n---\n\nTarget body'; + const source = '---\naliases:\n - Beta\n - 
Gamma\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + // Should contain all three, Beta only once + const merged = service.parseAliases(result); + expect(merged).toEqual(['Alpha', 'Beta', 'Gamma']); + }); + + it('should add source aliases when target has none', () => { + const target = '---\ntitle: Target\n---\n\nTarget body'; + const source = '---\naliases:\n - Alias A\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + const merged = service.parseAliases(result); + expect(merged).toEqual(['Alias A']); + }); + + it('should keep target aliases when source has none', () => { + const target = '---\naliases:\n - Alias A\n---\n\nTarget body'; + const source = '---\ntitle: Source\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + const merged = service.parseAliases(result); + expect(merged).toEqual(['Alias A']); + }); + + it('should preserve unknown/custom properties from source', () => { + const target = '---\ntitle: Target\n---\n\nTarget body'; + const source = '---\ncustom_field: hello\nanother: world\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('custom_field: hello'); + expect(result).toContain('another: world'); + }); + + it('should preserve unknown/custom properties from target', () => { + const target = '---\ncustom_field: original\ntitle: Target\n---\n\nTarget body'; + const source = '---\ncustom_field: replaced\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('custom_field: original'); + expect(result).not.toContain('custom_field: replaced'); + }); + + it('should handle both documents with no front matter', () => { + const target = '# Target'; + const source = '# Source'; + const result = service.mergeDocuments(target, source); + expect(result).toBe('# Target\n\n# Source'); + }); + + it('should handle empty source body', () => { + const target = 
'---\ntitle: Target\n---\n\n# Target body'; + const source = '---\ndraft: true\n---\n'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('title: Target'); + expect(result).toContain('draft: true'); + }); + + it('should handle empty target body', () => { + const target = '---\ntitle: Target\n---\n'; + const source = '# Source body\n\nSome content.'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('title: Target'); + expect(result).toContain('# Source body'); + }); + + it('should merge inline array aliases with list aliases', () => { + const target = '---\naliases: [Alpha, Beta]\n---\n\nTarget body'; + const source = '---\naliases:\n - Gamma\n - Alpha\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + const merged = service.parseAliases(result); + expect(merged).toEqual(['Alpha', 'Beta', 'Gamma']); + }); + + it('should preserve source multi-line custom properties', () => { + const target = '---\ntitle: Target\n---\n\nTarget body'; + const source = '---\ntags:\n - tag1\n - tag2\n---\n\nSource body'; + const result = service.mergeDocuments(target, source); + expect(result).toContain('tags:'); + expect(result).toContain(' - tag1'); + expect(result).toContain(' - tag2'); + }); + + it('should not duplicate the blank line separator when target body ends with newline', () => { + const target = '---\ntitle: Target\n---\n\nTarget body\n'; + const source = 'Source body'; + const result = service.mergeDocuments(target, source); + // Should not have triple newlines + expect(result).not.toMatch(/\n\n\n/); + }); +}); diff --git a/docs-src/docs/Backlinks.md b/docs-src/docs/Backlinks.md index 21a3ef9..706ca00 100644 --- a/docs-src/docs/Backlinks.md +++ b/docs-src/docs/Backlinks.md @@ -4,7 +4,7 @@ order: 3 # Backlinks -The Backlinks panel shows every note in your workspace that links to a given page. 
It is one of the most powerful navigation tools in AS Notes — use it to understand how ideas connect across your knowledge base.
+The Backlinks panel shows every note in your workspace that links to a given page. Use it to understand how ideas connect across your knowledge base.

## Opening the Panel

@@ -24,7 +24,7 @@ For example, if `Project.md` contains:

- [[NGINX]]
```

-…then the backlink chain for `NGINX` from `Project.md` would be:
+… then the backlink chain for `NGINX` from `Project.md` would be:

```
Project → Tasks → NGINX
diff --git a/docs-src/docs/Getting Started.md b/docs-src/docs/Getting Started.md
index f6aa1ea..c62544e 100644
--- a/docs-src/docs/Getting Started.md
+++ b/docs-src/docs/Getting Started.md
@@ -8,7 +8,7 @@ This page gets you from zero to a working AS Notes workspace in a few minutes.

## 1. Install the Extension

-Install **AS Notes** from the [VS Code Marketplace](https://marketplace.visualstudio.com/items?itemName=appsoftwareltd.as-notes).
+Install **AS Notes** from [Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=appsoftwareltd.as-notes) / [Open VSX](https://open-vsx.org/extension/appsoftwareltd/as-notes).

Open VS Code, go to the Extensions view (`Ctrl+Shift+X`), search for **AS Notes**, and click **Install**.

@@ -33,9 +33,9 @@ Look for the AS Notes sidebar icon to view Search, Calendar, Kanban Boards and T

## 4. Write Your First Note

-Create a new `.md` file and start writing. Type `[[` anywhere to trigger [[Wikilinks]] autocomplete - a list of all your pages appears immediately.
+Create a new `.md` file and start writing. Type `[[` anywhere to trigger [[Wikilinks]] autocomplete - a list of all your pages appears immediately (unless this is your first page and wikilink, in which case it will be added to the index, ready for referencing).

-Try adding a task `- [ ] task text` and try toggling the task state from the task management side panel.
+Next, try adding a task `- [ ] task text` (`Ctrl+Shift+Enter` / `Cmd+Shift+Enter`), then toggle its state from the task management side panel, by clicking the checkbox, or by cycling it with the keyboard shortcut.

## Excluding Files from the Index

@@ -62,8 +62,8 @@ If the index ever becomes stale or corrupted, run **AS Notes: Rebuild Index** fr

## Cleaning the Workspace

-If the extension is in a bad state (e.g. persistent errors after a crash), run **AS Notes: Clean Workspace** to remove the `.asnotes/` directory and reset all in-memory state. Your `.asnotesignore` file is preserved. Run **AS Notes: Initialise Workspace** afterwards to start fresh.
+Run **AS Notes: Clean Workspace** to remove the `.asnotes/` directory and reset all in-memory state. Your `.asnotesignore` file is preserved. Run **AS Notes: Initialise Workspace** afterwards to start fresh.

## Compatibility With Other Tools

-AS Notes workspaces are plain markdown files in plain folders - they are compatible with Obsidian and Logseq due to similar file structures. Be aware there are format and behavioural differences, but you can use the same notes folder with multiple tools.
+AS Notes workspaces are plain markdown files in plain folders - they are largely compatible with Obsidian and Logseq due to similar file structures. Be aware there are format and behavioural differences, but you can use the same notes folder with multiple tools.
diff --git a/docs-src/docs/Wikilinks.md b/docs-src/docs/Wikilinks.md
index 90e656c..44862c8 100644
--- a/docs-src/docs/Wikilinks.md
+++ b/docs-src/docs/Wikilinks.md
@@ -59,14 +59,36 @@ Type `[[` in any markdown file to trigger the autocomplete list:

## Rename Synchronisation

+AS Notes keeps files and links consistent when you rename either a wikilink or its backing file.
+
+### In-Editor Rename
+
When you edit a wikilink's text and move the cursor away (or switch files), AS Notes detects the change and offers to:

1.
**Rename the corresponding `.md` file** (if it exists) 2. **Update every matching wikilink** across all markdown files in the workspace -A single confirmation dialog covers all affected pages. You can decline — the link text change is kept but files and other links are left untouched. +A single confirmation dialog covers all affected renames. You can decline -- the link text change is kept but files and other links are left untouched. + +### Merge on Rename to Existing Page + +If you rename a wikilink to match a page that already exists, AS Notes offers to **merge** the two files instead of renaming: + +- The dialog uses "Merge" language so the operation is clear +- **Target page content is preserved** -- the source body is appended below the target body, separated by a blank line +- **Front matter is merged** with target priority -- the target's values win for any shared properties (e.g. `title`, `description`), while properties that only exist in the source are added +- **Aliases are merged and deduplicated** -- both pages' alias lists are combined, with duplicates removed +- The source file is **deleted** after the merge + +Declining a merge is a full no-op -- no files change, no index updates, no other links are touched. + +### Explorer Sidebar Rename + +Renaming a `.md` file in the VS Code explorer sidebar triggers the same link update: all wikilinks that referenced the old filename are updated to match the new name. + +### Alias-Aware Rename -Rename tracking is alias-aware: editing an alias wikilink offers to update the alias in the front matter and all matching references. +Rename tracking is alias-aware. Editing an alias wikilink offers to update the alias value in the canonical page's front matter and all matching references across the workspace. 
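The merge rules above (target priority for shared properties, alias dedupe, body append) can be sketched as a pure function over pre-parsed notes. This is a simplified stand-alone illustration of the semantics, not the actual `FrontMatterService.mergeDocuments` implementation — in particular, the `ParsedNote` shape is an assumption; the real code merges raw YAML lines so unknown properties survive round-tripping.

```typescript
// Hypothetical pre-parsed note shape, for illustration only.
interface ParsedNote {
    properties: Map<string, string>; // front matter key -> value
    aliases: string[];
    body: string;
}

function mergeNotes(target: ParsedNote, source: ParsedNote): ParsedNote {
    // Start from source properties, then overwrite with target's, so the
    // target wins every shared key while source-only keys are kept.
    const properties = new Map<string, string>();
    source.properties.forEach((value, key) => properties.set(key, value));
    target.properties.forEach((value, key) => properties.set(key, value));

    // Merge aliases case-sensitively: target order first, duplicates dropped.
    const aliases = target.aliases.slice();
    for (const alias of source.aliases) {
        if (aliases.indexOf(alias) === -1) {
            aliases.push(alias);
        }
    }

    // Append the source body after the target body with one blank line.
    const parts = [target.body, source.body].filter((b) => b.length > 0);
    return { properties, aliases, body: parts.join('\n\n') };
}
```

Declining the merge skips all of this — as noted above, a declined merge changes no files and touches no links.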
## Page Aliases diff --git a/docs-src/docs/index.md b/docs-src/docs/index.md index 0f22c2d..c5498bc 100644 --- a/docs-src/docs/index.md +++ b/docs-src/docs/index.md @@ -8,11 +8,10 @@ AS Notes brings markdown and [[wikilink]] editing for notes, documentation, blog **Capture ideas, link concepts, write, and stay focused - without ever leaving your editor.** -> **This documentation was written and generated using AS Notes. See [[Publishing a Static Site]] for how you can use AS Notes for your docs, including deploying to GitHub Pages**. +> **This documentation was written and generated using AS Notes. See [[Publishing a Static Site]] for how you can use AS Notes for your docs, including deploying to GitHub Pages, Cloudflare, and more**. -> **Install:** [https://marketplace.visualstudio.com/items?itemName=appsoftwareltd.as-notes](https://marketplace.visualstudio.com/items?itemName=appsoftwareltd.as-notes) - -> **GitHub:** [https://github.com/appsoftwareltd/as-notes](https://github.com/appsoftwareltd/as-notes) +> **Install:** [Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=appsoftwareltd.as-notes) / [Open VSX](https://open-vsx.org/extension/appsoftwareltd/as-notes) +> **GitHub:** [github.com/appsoftwareltd/as-notes](https://github.com/appsoftwareltd/as-notes) ![AS Notes editor screenshot](../assets/images/as-notes-editor-screenshot.png) @@ -22,13 +21,12 @@ If you've already installed AS Notes and want to get started, see [[Getting Star ## Why VS Code? 
-Using VS Code as your notes app gives you a huge amount for free before you even start using AS Notes features: +Using VS Code as your notes app gives you a huge amount for free in addition to the features that AS Notes provides: -- Cross-platform and web-based (via VS Code Workspaces) +- Cross-platform compatibility and web access (via VS Code Workspaces) - Tabs, file explorer, themes, keyboard shortcuts -- A vast extension library — Mermaid diagrams, Vim mode, and more, all usable alongside AS Notes +- A vast extension library - AI chat (GitHub Copilot, Claude, etc.) to query and work with your notes -- Outliner-style indentation via `Ctrl+[` / `Ctrl+]` - Syntax highlighting for code embedded in your notes ## Features at a Glance @@ -65,4 +63,4 @@ AS Notes is privacy-first. It never connects to external servers. All indexing, ## Licence -See [[Licence Rationale]] for an explanation of the source-available licence model. +See [[Licence]]. diff --git a/vs-code-extension/package.json b/vs-code-extension/package.json index e77d8d7..7b4fc63 100644 --- a/vs-code-extension/package.json +++ b/vs-code-extension/package.json @@ -275,33 +275,48 @@ "scope": "resource", "description": "Subdirectory for AS Notes within the workspace (e.g. 'docs' or 'notes'). When empty, the workspace root is used. All AS Notes data (.asnotes, journals, templates, etc.) lives within this directory. Configure this per workspace, not globally. Warning: if you change this after initialisation, you must manually move your notes directory to the new location." }, - "as-notes.periodicScanInterval": { - "type": "number", - "default": 300, - "minimum": 30, - "description": "Interval in seconds between automatic background scans for file changes. Set to 0 to disable periodic scanning." - }, "as-notes.journalFolder": { "type": "string", "default": "journals", "description": "Folder for daily journal files, relative to the AS Notes root directory." 
}, - "as-notes.licenceKey": { + "as-notes.templateFolder": { "type": "string", - "default": "", - "scope": "machine", - "description": "AS Notes Pro licence key (format: ASNO-XXXX-XXXX-XXXX-XXXX). Enter your licence key to activate Pro features. Purchase at https://www.asnotes.io/pricing" + "default": "templates", + "description": "Folder for note templates, relative to the AS Notes root directory. Templates are markdown files that can be inserted via the /Template slash command." + }, + "as-notes.notesFolder": { + "type": "string", + "default": "notes", + "description": "Folder for new notes, relative to the AS Notes root directory. Used when creating pages via wikilink navigation and the Create Note / Create Encrypted Note commands." }, "as-notes.assetPath": { "type": "string", "default": "assets/images", "description": "Folder where dropped or pasted files are saved by VS Code's built-in markdown editor, relative to the AS Notes root directory. AS Notes configures the built-in markdown.copyFiles.destination setting to use this path." }, + "as-notes.createNotesInCurrentDirectory": { + "type": "boolean", + "default": false, + "description": "When enabled, new notes created via wikilink navigation are placed in the current editing file's directory instead of the notes folder. Ignored when the source file is in the journal folder." + }, + "as-notes.licenceKey": { + "type": "string", + "default": "", + "scope": "machine", + "description": "AS Notes Pro licence key (format: ASNO-XXXX-XXXX-XXXX-XXXX). Enter your licence key to activate Pro features. Purchase at https://www.asnotes.io/pricing" + }, "as-notes.enableLogging": { "type": "boolean", "default": false, "description": "Enable diagnostic logging to .asnotes/logs/. Rolling 10 MB files, max 5. Requires reload after changing." }, + "as-notes.periodicScanInterval": { + "type": "number", + "default": 300, + "minimum": 30, + "description": "Interval in seconds between automatic background scans for file changes. 
Set to 0 to disable periodic scanning." + }, "as-notes.wikilinkColour": { "type": "string", "default": "", @@ -317,32 +332,17 @@ "default": false, "description": "Default context display in the backlinks panel. When false, context is compact (single line, truncated). When true, context wraps to show full text." }, - "as-notes.outlinerMode": { - "type": "boolean", - "default": false, - "scope": "window", - "description": "Enable Outliner Mode for markdown files. In outliner mode, Enter on a bullet line inserts a new bullet at the same indentation, Tab indents the bullet line, and Shift+Tab outdents it. Only applies to lines beginning with `- `." - }, "as-notes.kanbanAssetSizeWarningMB": { "type": "number", "default": 10, "minimum": 0, "description": "Warn when attaching files larger than this size (in MB) to a Kanban card. Set to 0 to disable the warning." }, - "as-notes.templateFolder": { - "type": "string", - "default": "templates", - "description": "Folder for note templates, relative to the AS Notes root directory. Templates are markdown files that can be inserted via the /Template slash command." - }, - "as-notes.notesFolder": { - "type": "string", - "default": "notes", - "description": "Folder for new notes, relative to the AS Notes root directory. Used when creating pages via wikilink navigation and the Create Note / Create Encrypted Note commands." - }, - "as-notes.createNotesInCurrentDirectory": { + "as-notes.outlinerMode": { "type": "boolean", "default": false, - "description": "When enabled, new notes created via wikilink navigation are placed in the current editing file's directory instead of the notes folder. Ignored when the source file is in the journal folder." + "scope": "window", + "description": "Enable Outliner Mode for markdown files. In outliner mode, Enter on a bullet line inserts a new bullet at the same indentation, Tab indents the bullet line, and Shift+Tab outdents it. Only applies to lines beginning with `- `." 
}, "as-notes.inlineEditor.enabled": { "type": "boolean", diff --git a/vs-code-extension/src/BacklinkPanelProvider.ts b/vs-code-extension/src/BacklinkPanelProvider.ts index cce2ff9..a576f8c 100644 --- a/vs-code-extension/src/BacklinkPanelProvider.ts +++ b/vs-code-extension/src/BacklinkPanelProvider.ts @@ -1,6 +1,6 @@ import * as vscode from 'vscode'; import { IndexService, type BacklinkChainGroup, type BacklinkChainInstance, type PageRow } from './IndexService.js'; -import type { LogService } from './LogService.js'; +import { getActiveLogger, formatLogError, type LogService } from './LogService.js'; import * as path from 'path'; import { toNotesRelativePath } from './NotesRootService.js'; @@ -463,7 +463,7 @@ export class BacklinkPanelProvider implements vscode.Disposable { vscode.TextEditorRevealType.InCenter, ); } catch (err) { - console.warn('as-notes: failed to navigate to backlink:', err); + (this.logger ?? getActiveLogger()).warn('BacklinkPanel', `failed to navigate to backlink: ${formatLogError(err)}`); } } diff --git a/vs-code-extension/src/CompletionUtils.ts b/vs-code-extension/src/CompletionUtils.ts index bc7751f..94dbb05 100644 --- a/vs-code-extension/src/CompletionUtils.ts +++ b/vs-code-extension/src/CompletionUtils.ts @@ -1,8 +1,14 @@ +import type { IWikilinkService } from 'as-notes-common'; + /** * Pure utility functions for wikilink completion logic. * No VS Code dependencies — safe for unit testing. */ +interface IndexedLinkLike { + page_name: string; +} + /** * Find the column of the innermost unclosed `[[` in the text up to the cursor. * Returns -1 if no valid `[[` trigger is found. @@ -147,3 +153,38 @@ export function isPositionInsideCode(lines: string[], lineIndex: number, charInd return false; } + +/** + * Determine whether the current document contains at least one newly added + * complete wikilink compared to the page's last indexed links. 
+ * + * This uses page-name occurrence counts rather than positions so that plain + * text edits that shift existing links do not count as new links, while added + * duplicate links and nested links are still detected. + */ +export function hasNewCompleteWikilink( + lines: string[], + indexedLinks: IndexedLinkLike[], + wikilinkService: IWikilinkService, +): boolean { + const indexedCounts = new Map<string, number>(); + for (const link of indexedLinks) { + indexedCounts.set(link.page_name, (indexedCounts.get(link.page_name) ?? 0) + 1); + } + + const currentCounts = new Map<string, number>(); + for (const line of lines) { + const wikilinks = wikilinkService.extractWikilinks(line, false, false); + for (const wikilink of wikilinks) { + currentCounts.set(wikilink.pageName, (currentCounts.get(wikilink.pageName) ?? 0) + 1); + } + } + + for (const [pageName, count] of currentCounts) { + if (count > (indexedCounts.get(pageName) ?? 0)) { + return true; + } + } + + return false; +} diff --git a/vs-code-extension/src/IgnoreService.ts b/vs-code-extension/src/IgnoreService.ts index 371aa1d..de02e7d 100644 --- a/vs-code-extension/src/IgnoreService.ts +++ b/vs-code-extension/src/IgnoreService.ts @@ -1,5 +1,6 @@ import * as fs from 'fs'; import ignore, { type Ignore } from 'ignore'; +import { formatLogError, getActiveLogger } from './LogService.js'; /** * Reads and parses an `.asnotesignore` file (`.gitignore` syntax) and exposes @@ -48,7 +49,7 @@ export class IgnoreService { const content = fs.readFileSync(this.ignoreFilePath, 'utf-8'); instance.add(content); } catch (err) { - console.warn(`as-notes: could not read ${this.ignoreFilePath}:`, err); + getActiveLogger().warn('IgnoreService', `could not read ${this.ignoreFilePath}: ${formatLogError(err)}`); } return instance; } } diff --git a/vs-code-extension/src/IndexScanner.ts b/vs-code-extension/src/IndexScanner.ts index dd5ca71..6008b2b 100644 --- a/vs-code-extension/src/IndexScanner.ts +++ b/vs-code-extension/src/IndexScanner.ts @@ -2,7 +2,7 @@ import * as vscode
from 'vscode'; import * as path from 'path'; import { IndexService, type ScanSummary } from './IndexService.js'; import { IgnoreService } from './IgnoreService.js'; -import { LogService, NO_OP_LOGGER } from './LogService.js'; +import { LogService, NO_OP_LOGGER, formatLogError } from './LogService.js'; import { toNotesRelativePath } from './NotesRootService.js'; /** @@ -108,7 +108,7 @@ export class IndexScanner { this.logger.info('IndexScanner', `fullScan: progress ${filesIndexed}/${total} files`); } } catch (err) { - console.warn(`as-notes: failed to index ${relativePath}:`, err); + this.logger.warn('IndexScanner', `failed to index ${relativePath}: ${formatLogError(err)}`); } } @@ -186,7 +186,7 @@ export class IndexScanner { summary.unchanged++; } } catch (err) { - console.warn(`as-notes: failed to scan ${relativePath}:`, err); + this.logger.warn('IndexScanner', `failed to scan ${relativePath}: ${formatLogError(err)}`); } processed++; diff --git a/vs-code-extension/src/IndexService.ts b/vs-code-extension/src/IndexService.ts index f0f6803..04c0dbb 100644 --- a/vs-code-extension/src/IndexService.ts +++ b/vs-code-extension/src/IndexService.ts @@ -570,6 +570,34 @@ export class IndexService { return this.mapLinkRows(result[0].values); } + /** + * Find distinct source pages containing links to any of the supplied page names. + * Used to narrow rename refactors to only files that could contain matches. 
+ */ + findPagesLinkingToPageNames(pageNames: string[]): PageRow[] { + this.ensureOpen(); + if (pageNames.length === 0) { return []; } + + const placeholders = pageNames.map(() => '?').join(', '); + const result = this.db!.exec( + `SELECT DISTINCT p.id, p.path, p.filename, p.title, p.mtime, p.indexed_at + FROM pages p + JOIN links l ON l.source_page_id = p.id + WHERE l.page_name IN (${placeholders}) + ORDER BY p.path COLLATE NOCASE`, + pageNames, + ); + if (result.length === 0) { return []; } + return result[0].values.map(row => ({ + id: row[0] as number, + path: row[1] as string, + filename: row[2] as string, + title: row[3] as string, + mtime: row[4] as number, + indexed_at: row[5] as number, + })); + } + /** * Get the total number of links across all pages. * Uses a single `COUNT(*)` query — O(1) regardless of table size. diff --git a/vs-code-extension/src/LogService.ts b/vs-code-extension/src/LogService.ts index d53349f..318aa74 100644 --- a/vs-code-extension/src/LogService.ts +++ b/vs-code-extension/src/LogService.ts @@ -154,3 +154,17 @@ export class LogService { * Avoids null checks throughout the codebase — callers always have a LogService. */ export const NO_OP_LOGGER = new LogService('', { enabled: false }); + +let activeLogger: LogService = NO_OP_LOGGER; + +export function setActiveLogger(logger: LogService): void { + activeLogger = logger; +} + +export function getActiveLogger(): LogService { + return activeLogger; +} + +export function formatLogError(error: unknown): string { + return error instanceof Error ? 
error.message : String(error); +} diff --git a/vs-code-extension/src/TaskPanelProvider.ts b/vs-code-extension/src/TaskPanelProvider.ts index 9c109ce..017db65 100644 --- a/vs-code-extension/src/TaskPanelProvider.ts +++ b/vs-code-extension/src/TaskPanelProvider.ts @@ -1,5 +1,6 @@ import * as vscode from 'vscode'; import { IndexService, type TaskViewItem } from './IndexService.js'; +import { formatLogError, getActiveLogger } from './LogService.js'; import { toggleTodoLine } from './TodoToggleService.js'; // ── Provider ─────────────────────────────────────────────────────────────── @@ -88,7 +89,7 @@ export class TaskPanelProvider implements vscode.WebviewViewProvider { edit.replace(doc.uri, doc.lineAt(line).range, toggled); return vscode.workspace.applyEdit(edit).then(() => doc.save()); }).then(undefined, err => { - console.warn('as-notes: failed to toggle task from panel:', err); + getActiveLogger().warn('TaskPanel', `failed to toggle task from panel: ${formatLogError(err)}`); }); break; } diff --git a/vs-code-extension/src/WikilinkExplorerMergeService.ts b/vs-code-extension/src/WikilinkExplorerMergeService.ts new file mode 100644 index 0000000..639108c --- /dev/null +++ b/vs-code-extension/src/WikilinkExplorerMergeService.ts @@ -0,0 +1,21 @@ +import type { PageRow } from './IndexService.js'; + +function normalisePath(path: string): string { + return path.replace(/\\/g, '/').toLowerCase(); +} + +export function getExistingExplorerMergeTargets( + pages: PageRow[], + renamedPath: string, +): PageRow[] { + const renamed = normalisePath(renamedPath); + return pages.filter((page) => normalisePath(page.path) !== renamed); +} + +export function pickUniqueExplorerMergeTarget( + pages: PageRow[], + renamedPath: string, +): PageRow | undefined { + const candidates = getExistingExplorerMergeTargets(pages, renamedPath); + return candidates.length === 1 ? 
candidates[0] : undefined; +} \ No newline at end of file diff --git a/vs-code-extension/src/WikilinkExplorerRenameRefactorService.ts b/vs-code-extension/src/WikilinkExplorerRenameRefactorService.ts new file mode 100644 index 0000000..d7b28da --- /dev/null +++ b/vs-code-extension/src/WikilinkExplorerRenameRefactorService.ts @@ -0,0 +1,239 @@ +import * as path from 'path'; +import * as vscode from 'vscode'; +import { FrontMatterService, WikilinkService } from 'as-notes-common'; +import type { IndexScanner } from './IndexScanner.js'; +import type { IndexService } from './IndexService.js'; +import { + getExistingExplorerMergeTargets, + pickUniqueExplorerMergeTarget, +} from './WikilinkExplorerMergeService.js'; +import { toNotesRelativePath } from './NotesRootService.js'; +import { reindexWorkspaceUri, updateLinksInWorkspace } from './WikilinkRefactorService.js'; +import { withWikilinkRenameProgress } from './WikilinkRenameProgressService.js'; +import { + collectFilenameRefactorOperations, + remapUrisForFileOperations, +} from './WikilinkFilenameRefactorService.js'; + +interface ExplorerRenameFile { + oldUri: vscode.Uri; + newUri: vscode.Uri; +} + +interface ExplorerRenameRefactorDeps { + files: ExplorerRenameFile[]; + renameTrackerIsRenaming: boolean; + wikilinkService: WikilinkService; + indexService: Pick<IndexService, 'findPagesByFilename' | 'findPagesLinkingToPageNames' | 'getAllPages' | 'removePage' | 'indexFileContent'>; + indexScanner: Pick<IndexScanner, 'indexFile'>; + notesRootPath?: string; + safeSaveToFile: () => boolean; + refreshProviders: () => void; +} + +function isMarkdownUri(uri: vscode.Uri): boolean { + const ext = path.extname(uri.fsPath).toLowerCase(); + return ext === '.md' || ext === '.markdown'; +} + +export async function handleExplorerRenameRefactors({ + files, + renameTrackerIsRenaming, + wikilinkService, + indexService, + indexScanner, + notesRootPath, + safeSaveToFile, + refreshProviders, +}: ExplorerRenameRefactorDeps): Promise<void> { + if (renameTrackerIsRenaming) { return; } + + const linkRenames: { oldPageName: string; newPageName: string }[] = []; + + for (const { oldUri, newUri } of
files) { + if (!isMarkdownUri(newUri)) { continue; } + const newFilename = path.basename(newUri.fsPath); + const pages = indexService.findPagesByFilename(newFilename); + if (pages.length < 2) { continue; } + + const newPath = notesRootPath + ? toNotesRelativePath(notesRootPath, newUri.fsPath) + : vscode.workspace.asRelativePath(newUri, false); + const existingTargets = getExistingExplorerMergeTargets(pages, newPath); + if (existingTargets.length === 0) { continue; } + if (existingTargets.length > 1) { + vscode.window.showWarningMessage( + `Merge skipped for "${newFilename}": multiple existing targets match this filename.`, + ); + continue; + } + + const existingPage = pickUniqueExplorerMergeTarget(pages, newPath); + if (!existingPage) { continue; } + + const rootUri = notesRootPath + ? vscode.Uri.file(notesRootPath) + : vscode.workspace.workspaceFolders?.[0]?.uri; + if (!rootUri) { continue; } + + const mergeChoice = await vscode.window.showInformationMessage( + `Merge "${newFilename}" into existing "${existingPage.path}"?`, + 'Yes', 'No', + ); + if (mergeChoice === 'Yes') { + await withWikilinkRenameProgress('AS Notes: Applying rename updates', async (progress) => { + progress.report('Merging renamed page'); + const targetUri = vscode.Uri.joinPath(rootUri, existingPage.path); + const sourceDoc = await vscode.workspace.openTextDocument(newUri); + const targetDoc = await vscode.workspace.openTextDocument(targetUri); + + const mergedContent = new FrontMatterService().mergeDocuments( + targetDoc.getText(), + sourceDoc.getText(), + ); + + const edit = new vscode.WorkspaceEdit(); + const fullRange = new vscode.Range( + targetDoc.lineAt(0).range.start, + targetDoc.lineAt(targetDoc.lineCount - 1).range.end, + ); + edit.replace(targetUri, fullRange, mergedContent); + await vscode.workspace.applyEdit(edit); + + progress.report('Refreshing index'); + await vscode.workspace.fs.delete(newUri); + indexService.removePage(newPath); + try { + await reindexWorkspaceUri(targetUri, { 
indexService, indexScanner, notesRootPath }); + } catch { /* best effort */ } + safeSaveToFile(); + }); + } + } + + for (const { oldUri, newUri } of files) { + if (isMarkdownUri(oldUri) && isMarkdownUri(newUri)) { + const oldExt = path.extname(oldUri.fsPath); + const newExt = path.extname(newUri.fsPath); + const oldPageName = path.basename(oldUri.fsPath, oldExt); + const newPageName = path.basename(newUri.fsPath, newExt); + if (oldPageName !== newPageName) { + linkRenames.push({ oldPageName, newPageName }); + } + } + } + + if (linkRenames.length === 0) { return; } + + const summary = linkRenames + .map(r => `[[${r.oldPageName}]] → [[${r.newPageName}]]`) + .join(', '); + const rootUri = notesRootPath + ? vscode.Uri.file(notesRootPath) + : vscode.workspace.workspaceFolders?.[0]?.uri; + const filenameRefactorPlan = rootUri && indexService.getAllPages + ? collectFilenameRefactorOperations( + linkRenames, + indexService.getAllPages(), + rootUri, + { + excludePaths: files.map(file => { + const uri = file.newUri; + return notesRootPath + ? toNotesRelativePath(notesRootPath, uri.fsPath) + : vscode.workspace.asRelativePath(uri, false); + }), + }, + ) + : { fileRenames: [], fileMerges: [] }; + const extraFilenameChanges = filenameRefactorPlan.fileRenames.length + filenameRefactorPlan.fileMerges.length; + const msg = linkRenames.length === 1 + ? `Update all ${summary} references?${extraFilenameChanges > 0 ? ` Also update ${extraFilenameChanges} filename reference(s).` : ''}` + : `Update references for ${linkRenames.length} renamed files? ${summary}${extraFilenameChanges > 0 ? ` Also update ${extraFilenameChanges} filename reference(s).` : ''}`; + const choice = await vscode.window.showInformationMessage(msg, 'Yes', 'No'); + if (choice !== 'Yes') { return; } + + await withWikilinkRenameProgress('AS Notes: Updating wikilink references', async (progress) => { + const candidateUris = rootUri + ? 
indexService.findPagesLinkingToPageNames(linkRenames.map(rename => rename.oldPageName)) + .map(page => vscode.Uri.joinPath(rootUri, page.path)) + : []; + + progress.report('Preparing rename operations'); + for (const merge of filenameRefactorPlan.fileMerges) { + const sourceDoc = await vscode.workspace.openTextDocument(merge.oldUri); + const targetDoc = await vscode.workspace.openTextDocument(merge.newUri); + + const mergedContent = new FrontMatterService().mergeDocuments( + targetDoc.getText(), + sourceDoc.getText(), + ); + + const edit = new vscode.WorkspaceEdit(); + const fullRange = new vscode.Range( + targetDoc.lineAt(0).range.start, + targetDoc.lineAt(targetDoc.lineCount - 1).range.end, + ); + edit.replace(merge.newUri, fullRange, mergedContent); + await vscode.workspace.applyEdit(edit); + await vscode.workspace.fs.delete(merge.oldUri); + } + + for (const rename of filenameRefactorPlan.fileRenames) { + await vscode.workspace.fs.rename(rename.oldUri, rename.newUri, { overwrite: false }); + } + + const rewriteCandidateUris = remapUrisForFileOperations( + candidateUris, + filenameRefactorPlan.fileRenames, + filenameRefactorPlan.fileMerges, + ); + + progress.report('Updating links across workspace'); + const affectedUris = await updateLinksInWorkspace( + wikilinkService, + linkRenames, + rewriteCandidateUris.length > 0 ? { candidateUris: rewriteCandidateUris } : undefined, + ); + + progress.report('Refreshing index'); + for (const uri of affectedUris) { + try { + await reindexWorkspaceUri(uri, { + indexService, + indexScanner, + notesRootPath, + }); + } catch { /* best effort */ } + } + for (const merge of filenameRefactorPlan.fileMerges) { + const oldPath = notesRootPath + ? 
toNotesRelativePath(notesRootPath, merge.oldUri.fsPath) + : vscode.workspace.asRelativePath(merge.oldUri, false); + indexService.removePage(oldPath); + try { + await reindexWorkspaceUri(merge.newUri, { + indexService, + indexScanner, + notesRootPath, + }); + } catch { /* best effort */ } + } + for (const rename of filenameRefactorPlan.fileRenames) { + const oldPath = notesRootPath + ? toNotesRelativePath(notesRootPath, rename.oldUri.fsPath) + : vscode.workspace.asRelativePath(rename.oldUri, false); + indexService.removePage(oldPath); + try { + await reindexWorkspaceUri(rename.newUri, { + indexService, + indexScanner, + notesRootPath, + }); + } catch { /* best effort */ } + } + if (safeSaveToFile()) { + refreshProviders(); + } + }); +} \ No newline at end of file diff --git a/vs-code-extension/src/WikilinkFilenameRefactorService.ts b/vs-code-extension/src/WikilinkFilenameRefactorService.ts new file mode 100644 index 0000000..765170a --- /dev/null +++ b/vs-code-extension/src/WikilinkFilenameRefactorService.ts @@ -0,0 +1,177 @@ +import * as path from 'path'; +import * as vscode from 'vscode'; + +export interface FilenameRefactorRename { + oldPageName: string; + newPageName: string; +} + +export interface FilenameRefactorPage { + path: string; + filename: string; +} + +export interface FilenameRefactorOperation { + oldUri: vscode.Uri; + newUri: vscode.Uri; + label: string; +} + +export interface FilenameRefactorPlan { + fileRenames: FilenameRefactorOperation[]; + fileMerges: FilenameRefactorOperation[]; +} + +export function collectFilenameRefactorOperations( + renames: FilenameRefactorRename[], + pages: FilenameRefactorPage[], + rootUri: vscode.Uri, + options?: { excludePaths?: string[] }, +): FilenameRefactorPlan { + const excludePaths = new Set((options?.excludePaths ?? 
[]).map(normaliseRelativePathLower)); + const orderedRenames = [...renames].sort( + (a, b) => wikilinkToken(b.oldPageName).length - wikilinkToken(a.oldPageName).length, + ); + + const desiredPathByCurrentPath = new Map<string, string>(); + const pagesByPath = new Map<string, FilenameRefactorPage>(); + + for (const page of pages) { + const currentPath = normaliseRelativePath(page.path); + pagesByPath.set(normaliseRelativePathLower(currentPath), page); + + if (excludePaths.has(normaliseRelativePathLower(currentPath))) { + continue; + } + + const nextFilename = rewriteFilename(page.filename, orderedRenames); + if (nextFilename === page.filename) { + continue; + } + + const currentDir = path.posix.dirname(currentPath); + const nextPath = currentDir === '.' ? nextFilename : `${currentDir}/${nextFilename}`; + if (nextPath !== currentPath) { + desiredPathByCurrentPath.set(currentPath, nextPath); + } + } + + const fileRenames: FilenameRefactorOperation[] = []; + const fileMerges: FilenameRefactorOperation[] = []; + + for (const [currentPath, desiredPath] of desiredPathByCurrentPath) { + const targetPage = pagesByPath.get(normaliseRelativePathLower(desiredPath)); + + if (targetPage && normaliseRelativePathLower(targetPage.path) !== normaliseRelativePathLower(currentPath)) { + const targetCurrentPath = normaliseRelativePath(targetPage.path); + const targetDesiredPath = desiredPathByCurrentPath.get(targetCurrentPath) ??
targetCurrentPath; + if (normaliseRelativePathLower(targetDesiredPath) === normaliseRelativePathLower(desiredPath)) { + fileMerges.push(makeOperation(rootUri, currentPath, targetCurrentPath)); + continue; + } + } + + fileRenames.push(makeOperation(rootUri, currentPath, desiredPath)); + } + + return { + fileRenames: orderFileRenameOperations(fileRenames), + fileMerges, + }; +} + +export function orderFileRenameOperations( + fileRenames: FilenameRefactorOperation[], +): FilenameRefactorOperation[] { + const remaining = [...fileRenames]; + const ordered: FilenameRefactorOperation[] = []; + + while (remaining.length > 0) { + let progress = false; + + for (let index = 0; index < remaining.length; index++) { + const candidate = remaining[index]; + const dependsOnRemaining = remaining.some((other, otherIndex) => + otherIndex !== index && sameUri(other.oldUri, candidate.newUri), + ); + + if (!dependsOnRemaining) { + ordered.push(candidate); + remaining.splice(index, 1); + progress = true; + break; + } + } + + if (!progress) { + ordered.push(...remaining); + break; + } + } + + return ordered; +} + +export function remapUrisForFileOperations( + candidateUris: vscode.Uri[], + fileRenames: FilenameRefactorOperation[], + fileMerges: FilenameRefactorOperation[], +): vscode.Uri[] { + const replacements = new Map<string, vscode.Uri>(); + for (const rename of fileRenames) { + replacements.set(rename.oldUri.toString().toLowerCase(), rename.newUri); + } + for (const merge of fileMerges) { + replacements.set(merge.oldUri.toString().toLowerCase(), merge.newUri); + } + + const unique = new Map<string, vscode.Uri>(); + for (const uri of candidateUris) { + const replacement = replacements.get(uri.toString().toLowerCase()) ?? uri; + unique.set(replacement.toString().toLowerCase(), replacement); + } + return [...unique.values()]; +} + +function rewriteFilename( + filename: string, + renames: FilenameRefactorRename[], +): string { + const extension = path.extname(filename); + const basename = extension ?
filename.slice(0, -extension.length) : filename; + + let updated = basename; + for (const rename of renames) { + updated = updated.split(wikilinkToken(rename.oldPageName)).join(wikilinkToken(rename.newPageName)); + } + + return `${updated}${extension}`; +} + +function wikilinkToken(pageName: string): string { + return `[[${pageName}]]`; +} + +function makeOperation( + rootUri: vscode.Uri, + oldPath: string, + newPath: string, +): FilenameRefactorOperation { + return { + oldUri: vscode.Uri.joinPath(rootUri, oldPath), + newUri: vscode.Uri.joinPath(rootUri, newPath), + label: `${path.posix.basename(oldPath)} → ${path.posix.basename(newPath)}`, + }; +} + +function normaliseRelativePath(value: string): string { + return value.replace(/\\/g, '/'); +} + +function normaliseRelativePathLower(value: string): string { + return normaliseRelativePath(value).toLowerCase(); +} + +function sameUri(left: vscode.Uri, right: vscode.Uri): boolean { + return left.toString().toLowerCase() === right.toString().toLowerCase(); +} \ No newline at end of file diff --git a/vs-code-extension/src/WikilinkRefactorService.ts b/vs-code-extension/src/WikilinkRefactorService.ts new file mode 100644 index 0000000..3a05f83 --- /dev/null +++ b/vs-code-extension/src/WikilinkRefactorService.ts @@ -0,0 +1,169 @@ +import * as vscode from 'vscode'; +import { WikilinkService } from 'as-notes-common'; +import type { IndexScanner } from './IndexScanner.js'; +import type { IndexService } from './IndexService.js'; +import { toNotesRelativePath } from './NotesRootService.js'; + +export interface UpdateLinksInWorkspaceOptions { + candidateUris?: vscode.Uri[]; +} + +export interface ReindexWorkspaceUriDeps { + indexService: Pick<IndexService, 'indexFileContent'>; + indexScanner: Pick<IndexScanner, 'indexFile'>; + notesRootPath?: string; +} + +/** + * Replace every matching `[[oldPageName]]` wikilink with `[[newPageName]]` + * across all markdown files in the workspace. + * + * Accepts multiple rename pairs and applies them all in a single + * `WorkspaceEdit`. 
+ */ +export async function updateLinksInWorkspace( + wikilinkService: WikilinkService, + renames: { oldPageName: string; newPageName: string }[], + options?: UpdateLinksInWorkspaceOptions, +): Promise<vscode.Uri[]> { + if (renames.length === 0) { return []; } + + const mdFiles = options?.candidateUris && options.candidateUris.length > 0 + ? dedupeUris(options.candidateUris) + : await vscode.workspace.findFiles('**/*.{md,markdown}'); + const workspaceEdit = new vscode.WorkspaceEdit(); + let hasOpenDocumentEdits = false; + const affectedUris = new Map<string, vscode.Uri>(); + const renameMap = new Map<string, string>(renames.map(rename => [rename.oldPageName, rename.newPageName])); + + // Build a set of old page names for a cheap pre-scan check. + const oldPageNames = new Set(renames.map(r => r.oldPageName)); + + for (const fileUri of mdFiles) { + // Prefer already-open document buffers; for closed files, read raw + // bytes from disk so we don't open a document model unnecessarily. + const openDoc = vscode.workspace.textDocuments.find( + d => d.uri.toString() === fileUri.toString(), + ); + + let fullText: string; + let lines: string[]; + if (openDoc) { + lines = []; + for (let i = 0; i < openDoc.lineCount; i++) { + lines.push(openDoc.lineAt(i).text); + } + fullText = lines.join('\n'); + } else { + const raw = await vscode.workspace.fs.readFile(fileUri); + fullText = Buffer.from(raw).toString('utf-8'); + lines = fullText.split(/\r?\n/); + } + + // Quick pre-scan: skip files whose content doesn't mention any old + // page name (avoids the more expensive per-line wikilink extraction). 
+ if (!Array.from(oldPageNames).some(name => fullText.includes(name))) { + continue; + } + + if (openDoc) { + for (let line = 0; line < lines.length; line++) { + const text = lines[line]; + const wikilinks = wikilinkService.extractWikilinks(text); + + for (const wl of wikilinks) { + const newPageName = renameMap.get(wl.pageName); + if (newPageName) { + const range = new vscode.Range( + line, wl.startPositionInText, + line, wl.endPositionInText + 1, + ); + workspaceEdit.replace(fileUri, range, `[[${newPageName}]]`); + hasOpenDocumentEdits = true; + affectedUris.set(fileUri.toString(), fileUri); + } + } + } + continue; + } + + const updatedText = rewriteClosedDocument(lines, fullText, wikilinkService, renameMap); + if (updatedText !== fullText) { + await vscode.workspace.fs.writeFile(fileUri, Buffer.from(updatedText, 'utf-8')); + affectedUris.set(fileUri.toString(), fileUri); + } + } + + if (hasOpenDocumentEdits) { + await vscode.workspace.applyEdit(workspaceEdit); + } + + return [...affectedUris.values()]; +} + +export async function reindexWorkspaceUri( + uri: vscode.Uri, + { indexService, indexScanner, notesRootPath }: ReindexWorkspaceUriDeps, +): Promise<'buffer' | 'disk'> { + const openDocument = vscode.workspace.textDocuments.find( + doc => doc.uri.toString() === uri.toString(), + ); + + if (openDocument) { + const relativePath = notesRootPath + ? toNotesRelativePath(notesRootPath, openDocument.uri.fsPath) + : vscode.workspace.asRelativePath(openDocument.uri, false); + const filename = openDocument.uri.fsPath.split(/[/\\]/).pop() ?? 
''; + indexService.indexFileContent(relativePath, filename, openDocument.getText(), Date.now()); + return 'buffer'; + } + + await indexScanner.indexFile(uri); + return 'disk'; +} + +function dedupeUris(uris: vscode.Uri[]): vscode.Uri[] { + const unique = new Map<string, vscode.Uri>(); + for (const uri of uris) { + unique.set(uri.toString(), uri); + } + return [...unique.values()]; +} + +function rewriteClosedDocument( + lines: string[], + originalText: string, + wikilinkService: WikilinkService, + renameMap: Map<string, string>, +): string { + const newline = originalText.includes('\r\n') ? '\r\n' : '\n'; + const updatedLines = lines.map((line) => replaceLinksInLine(line, wikilinkService, renameMap)); + return updatedLines.join(newline); +} + +function replaceLinksInLine( + text: string, + wikilinkService: WikilinkService, + renameMap: Map<string, string>, +): string { + const wikilinks = wikilinkService.extractWikilinks(text); + if (wikilinks.length === 0) { + return text; + } + + let updated = text; + for (let i = wikilinks.length - 1; i >= 0; i--) { + const wikilink = wikilinks[i]; + const newPageName = renameMap.get(wikilink.pageName); + if (!newPageName) { + continue; + } + + updated = + updated.slice(0, wikilink.startPositionInText) + + `[[${newPageName}]]` + + updated.slice(wikilink.endPositionInText + 1); + } + + return updated; +} diff --git a/vs-code-extension/src/WikilinkRenameProgressService.ts b/vs-code-extension/src/WikilinkRenameProgressService.ts new file mode 100644 index 0000000..6dba287 --- /dev/null +++ b/vs-code-extension/src/WikilinkRenameProgressService.ts @@ -0,0 +1,23 @@ +import * as vscode from 'vscode'; + +export interface RenameProgressReporter { + report(message: string): void; +} + +export async function withWikilinkRenameProgress<T>( + title: string, + task: (progress: RenameProgressReporter) => Promise<T>, +): Promise<T> { + return vscode.window.withProgress( + { + location: vscode.ProgressLocation.Notification, + title, + cancellable: false, + }, + async (progress) => task({ + report(message: 
string) { + progress.report({ message }); + }, + }), + ); +} \ No newline at end of file diff --git a/vs-code-extension/src/WikilinkRenameTracker.ts b/vs-code-extension/src/WikilinkRenameTracker.ts index c0b6587..ffe00d2 100644 --- a/vs-code-extension/src/WikilinkRenameTracker.ts +++ b/vs-code-extension/src/WikilinkRenameTracker.ts @@ -6,6 +6,14 @@ import type { IndexScanner } from './IndexScanner.js'; import { sanitiseFileName } from './PathUtils.js'; import { FrontMatterService } from 'as-notes-common'; import { toNotesRelativePath } from './NotesRootService.js'; +import { reindexWorkspaceUri, updateLinksInWorkspace } from './WikilinkRefactorService.js'; +import { withWikilinkRenameProgress } from './WikilinkRenameProgressService.js'; +import { type LogService, NO_OP_LOGGER, formatLogError } from './LogService.js'; +import { + collectFilenameRefactorOperations, + orderFileRenameOperations, + remapUrisForFileOperations, +} from './WikilinkFilenameRefactorService.js'; /** * Detected rename: a wikilink at the same position now has a different pageName. @@ -56,8 +64,13 @@ export class WikilinkRenameTracker implements vscode.Disposable { private readonly indexService: IndexService; private readonly indexScanner: IndexScanner; private readonly notesRootUri: vscode.Uri | undefined; + private readonly log: Pick<LogService, 'info' | 'warn'>; private readonly disposables: vscode.Disposable[] = []; + private readonly _onDidDeclineRename = new vscode.EventEmitter<void>(); + /** Fires after a rename is declined and the document has been re-indexed. */ + readonly onDidDeclineRename = this._onDidDeclineRename.event; + /** Tracks the wikilink the cursor was inside during the most recent edit. 
*/ private pendingEdit: PendingEditInfo | undefined; @@ -70,12 +83,14 @@ export class WikilinkRenameTracker implements vscode.Disposable { indexService: IndexService, indexScanner: IndexScanner, notesRootUri?: vscode.Uri, + log?: Pick<LogService, 'info' | 'warn'>, ) { this.wikilinkService = wikilinkService; this.fileService = fileService; this.indexService = indexService; this.indexScanner = indexScanner; this.notesRootUri = notesRootUri; + this.log = log ?? NO_OP_LOGGER; this.disposables.push( vscode.workspace.onDidChangeTextDocument((e) => this.onDocumentChanged(e)), @@ -85,11 +100,17 @@ } dispose(): void { + this._onDidDeclineRename.dispose(); for (const d of this.disposables) { d.dispose(); } } + /** True while a rename operation is in progress (file rename + link updates). */ + get isRenaming(): boolean { + return this.isProcessing; + } + /** * Returns true if there is an unresolved pending edit for the given document key. * Used by the completion debounce in extension.ts to avoid overwriting the @@ -99,6 +120,37 @@ return this.pendingEdit?.docKey === docKey; } + /** + * Returns true when the pageName change is a nesting or un-nesting + * operation rather than a genuine rename. + * + * Covers two cases: + * 1. Full nesting/un-nesting: one pageName contains the other as `[[...]]` + * e.g. `[[A]]` wrapped to `[[[[A]] B]]`, or the reverse. + * 2. Partial bracket manipulation: the user is mid-edit adding/removing + * brackets. After stripping leading `[` and trailing `]` from both + * names, the core page name is identical. + * e.g. pageName "Demo" vs "[Demo" from intermediate `[[[Demo]]`. 
+ */ + static isNestingChange(oldPageName: string, newPageName: string): boolean { + // Full nesting/un-nesting + if (newPageName.includes(`[[${oldPageName}]]`) || + oldPageName.includes(`[[${newPageName}]]`)) { + return true; + } + + // Partial bracket manipulation: same core name after stripping + // leading [ and trailing ] characters + const strip = (s: string) => s.replace(/^\[+/, '').replace(/\]+$/, ''); + const strippedOld = strip(oldPageName); + const strippedNew = strip(newPageName); + if (strippedOld.length > 0 && strippedOld === strippedNew) { + return true; + } + + return false; + } + // ── Change detection ─────────────────────────────────────────────── private onDocumentChanged(event: vscode.TextDocumentChangeEvent): void { @@ -283,6 +335,11 @@ export class WikilinkRenameTracker implements vscode.Disposable { const oldLink = oldMap.get(key); if (oldLink && oldLink.page_name !== curr.pageName) { + // Skip nesting/un-nesting: wrapping or unwrapping a wikilink + // inside another is not a rename. + if (WikilinkRenameTracker.isNestingChange(oldLink.page_name, curr.pageName)) { + continue; + } renames.push({ oldPageName: oldLink.page_name, newPageName: curr.pageName, @@ -297,6 +354,8 @@ export class WikilinkRenameTracker implements vscode.Disposable { return; } + this.log.info('rename', `detected ${renames.length} rename(s): ${renames.map(r => `[[${r.oldPageName}]] → [[${r.newPageName}]]`).join(', ')} in ${relativePath}`); + // Sort outermost first (largest range) so workspace replacements // for outer links happen before inner ones. 
This is correct because // replacing `[[Outer [[Inner]] text]]` in other files also replaces @@ -306,7 +365,7 @@ export class WikilinkRenameTracker implements vscode.Disposable { (b.endPosition - b.startPosition) - (a.endPosition - a.startPosition), ); - await this.promptAndPerformRenames(document, renames); + await this.promptAndPerformRenames(document, renames, relativePath); } // ── Rename execution ─────────────────────────────────────────────── @@ -322,6 +381,7 @@ export class WikilinkRenameTracker implements vscode.Disposable { private async promptAndPerformRenames( document: vscode.TextDocument, renames: DetectedRename[], + relativePath: string, ): Promise { // Classify each rename as alias or direct interface ClassifiedRename extends DetectedRename { @@ -333,6 +393,7 @@ export class WikilinkRenameTracker implements vscode.Disposable { const classifiedRenames: ClassifiedRename[] = []; const renameDescriptions: string[] = []; const fileRenames: { oldUri: vscode.Uri; newUri: vscode.Uri; label: string }[] = []; + const fileMerges: { oldUri: vscode.Uri; newUri: vscode.Uri; label: string }[] = []; for (const r of renames) { const oldFileName = sanitiseFileName(r.oldPageName); @@ -342,7 +403,14 @@ export class WikilinkRenameTracker implements vscode.Disposable { ? this.indexService.resolveAlias(oldFileName) : undefined; - if (aliasResolution) { + // If the alias resolves to a page whose own filename matches the + // old link name, it's not a true alias -- it's the page's own name + // that happens to be stored as an alias. Fall through to the + // direct rename path so merge detection can apply. 
+ const isTrueAlias = aliasResolution + && aliasResolution.filename.toLowerCase() !== `${oldFileName}.md`.toLowerCase(); + + if (isTrueAlias && aliasResolution) { // Alias rename — no file rename, just update front matter + references classifiedRenames.push({ ...r, @@ -366,60 +434,136 @@ export class WikilinkRenameTracker implements vscode.Disposable { const oldFileExists = await this.fileService.fileExists(oldUri); if (oldFileExists) { - const newUri = this.fileService.resolveTargetUri(document.uri, newFileName); - fileRenames.push({ oldUri, newUri, label: `${oldFileName}.md → ${newFileName}.md` }); - renameDescriptions.push(`"${oldFileName}.md" → "${newFileName}.md"`); + // Detect merge targets using global direct-filename resolution, + // but do not merge when the new name only resolves via alias. + const newResolution = await this.fileService.resolveTargetUriCaseInsensitive( + document.uri, + newFileName, + ); + const resolvedTargetExists = await this.fileService.fileExists(newResolution.uri); + const isDirectMergeTarget = resolvedTargetExists + && !newResolution.viaAlias + && newResolution.uri.toString() !== oldUri.toString(); + + if (isDirectMergeTarget) { + fileMerges.push({ oldUri, newUri: newResolution.uri, label: `${oldFileName}.md → ${newFileName}.md` }); + renameDescriptions.push(`Merge "${oldFileName}.md" into "${newFileName}.md"`); + } else { + const newUri = this.fileService.resolveTargetUri(oldUri, newFileName); + fileRenames.push({ oldUri, newUri, label: `${oldFileName}.md → ${newFileName}.md` }); + renameDescriptions.push(`"${oldFileName}.md" → "${newFileName}.md"`); + } } else { renameDescriptions.push(`[[${r.oldPageName}]] → [[${r.newPageName}]]`); } } } + const filenameRefactorPlan = this.notesRootUri && this.indexService.getAllPages + ? 
collectFilenameRefactorOperations( + renames, + this.indexService.getAllPages(), + this.notesRootUri, + { + excludePaths: [ + ...fileRenames.map(operation => this.toRelativePath(operation.oldUri)), + ...fileMerges.map(operation => this.toRelativePath(operation.oldUri)), + ], + }, + ) + : { fileRenames: [], fileMerges: [] }; + + fileRenames.push(...filenameRefactorPlan.fileRenames); + fileMerges.push(...filenameRefactorPlan.fileMerges); + renameDescriptions.push( + ...filenameRefactorPlan.fileRenames.map(operation => `Filename: ${operation.label}`), + ...filenameRefactorPlan.fileMerges.map(operation => `Merge filename ${operation.label}`), + ); + + const orderedFileRenames = orderFileRenameOperations(fileRenames); + + const hasMerges = fileMerges.length > 0; const message = renames.length === 1 - ? `Rename ${renameDescriptions[0]}? This will update all matching links.` - : `Rename ${renames.length} links?\n${renameDescriptions.join('\n')}\nThis will update all matching links.`; + ? `${hasMerges ? 'Merge' : 'Rename'} ${renameDescriptions[0]}? This will update all matching links.` + : `${hasMerges ? 'Merge/Rename' : 'Rename'} ${renames.length} links?\n${renameDescriptions.join('\n')}\nThis will update all matching links.`; const choice = await vscode.window.showInformationMessage(message, 'Yes', 'No'); if (choice !== 'Yes') { + this.log.info('rename', `declined: ${renameDescriptions.join(', ')}`); + if (!hasMerges) { + // Re-index the document so any new/changed wikilinks are captured + const filename = relativePath.split('/').pop() ?? 
''; + this.indexService.indexFileContent(relativePath, filename, document.getText(), Date.now()); + this._onDidDeclineRename.fire(); + } return; } + this.log.info('rename', `accepted: ${renameDescriptions.join(', ')}`); this.isProcessing = true; + this.log.info('rename', 'start (isProcessing=true)'); try { - // Process direct file renames (outermost first — matches rename order) - for (const fr of fileRenames) { - const newFileAlreadyExists = await this.fileService.fileExists(fr.newUri); - if (newFileAlreadyExists) { - vscode.window.showWarningMessage( - `Cannot rename: "${fr.label}" — target already exists.`, - ); - } else { + const rewriteCandidateUris = remapUrisForFileOperations( + this.getRefactorCandidateUris( + renames.map(r => r.oldPageName), + document.uri, + ), + orderedFileRenames, + fileMerges, + ); + + await withWikilinkRenameProgress('AS Notes: Applying rename updates', async (progress) => { + progress.report('Preparing rename operations'); + + // Process file merges (target already exists — merge content) + for (const fm of fileMerges) { + this.log.info('rename', `mergeFiles: ${vscode.workspace.asRelativePath(fm.oldUri)} → ${vscode.workspace.asRelativePath(fm.newUri)}`); + await this.mergeFiles(fm.oldUri, fm.newUri); + this.log.info('rename', 'mergeFiles: done'); + } + + // Process direct file renames + for (const fr of orderedFileRenames) { + this.log.info('rename', `fs.rename: ${vscode.workspace.asRelativePath(fr.oldUri)} → ${vscode.workspace.asRelativePath(fr.newUri)}`); await vscode.workspace.fs.rename(fr.oldUri, fr.newUri, { overwrite: false }); + this.log.info('rename', 'fs.rename: done'); } - } - // Process alias renames — update front matter on canonical pages - for (const r of classifiedRenames) { - if (r.isAlias && r.canonicalPagePath) { - await this.updateAliasFrontMatter( - r.canonicalPagePath, - r.oldPageName, - r.newPageName, - ); + // Process alias renames — update front matter on canonical pages + for (const r of classifiedRenames) { 
+ if (r.isAlias && r.canonicalPagePath) { + this.log.info('rename', `updateAlias: ${r.canonicalPagePath} [[${r.oldPageName}]] → [[${r.newPageName}]]`); + await this.updateAliasFrontMatter( + r.canonicalPagePath, + r.oldPageName, + r.newPageName, + ); + this.log.info('rename', 'updateAlias: done'); + } } - } - // Update links across the workspace (outermost first) - for (const r of renames) { - await this.updateLinksInWorkspace(r.oldPageName, r.newPageName); - } + progress.report('Updating links across workspace'); + this.log.info('rename', `updateLinksInWorkspace: ${rewriteCandidateUris.length} candidate(s)`); + const affectedReferenceUris = await updateLinksInWorkspace( + this.wikilinkService, + renames.map(r => ({ oldPageName: r.oldPageName, newPageName: r.newPageName })), + { + candidateUris: rewriteCandidateUris.length > 0 ? rewriteCandidateUris : undefined, + }, + ); + this.log.info('rename', `updateLinksInWorkspace: ${affectedReferenceUris.length} file(s) affected${affectedReferenceUris.length > 0 ? ': ' + affectedReferenceUris.map(u => vscode.workspace.asRelativePath(u)).join(', ') : ''}`); - // Refresh the index - await this.refreshIndexAfterRename(document, classifiedRenames, fileRenames); + progress.report('Refreshing index'); + this.log.info('rename', 'refreshIndex: start'); + await this.refreshIndexAfterRename(document, classifiedRenames, orderedFileRenames, fileMerges, affectedReferenceUris); + this.log.info('rename', 'refreshIndex: done'); + }); } catch (err) { const detail = err instanceof Error ? 
err.message : String(err); + this.log.info('rename', `error: ${detail}`); vscode.window.showErrorMessage(`Rename failed: ${detail}`); } finally { + this.log.info('rename', 'end (isProcessing=false)'); this.isProcessing = false; } } @@ -452,13 +596,38 @@ export class WikilinkRenameTracker implements vscode.Disposable { ); edit.replace(canonicalUri, fullRange, updatedContent); await vscode.workspace.applyEdit(edit); - await doc.save(); } } catch (err) { - console.warn(`as-notes: failed to update alias front matter for ${canonicalPagePath}:`, err); + this.log.warn('rename', `failed to update alias front matter for ${canonicalPagePath}: ${formatLogError(err)}`); } } + /** + * Merge source file content into target file, then delete the source. + * Front matter is merged (target priority), source body is appended. + */ + private async mergeFiles(sourceUri: vscode.Uri, targetUri: vscode.Uri): Promise<void> { + const sourceDoc = await vscode.workspace.openTextDocument(sourceUri); + const targetDoc = await vscode.workspace.openTextDocument(targetUri); + + const frontMatterService = new FrontMatterService(); + const mergedContent = frontMatterService.mergeDocuments( + targetDoc.getText(), + sourceDoc.getText(), + ); + + const edit = new vscode.WorkspaceEdit(); + const fullRange = new vscode.Range( + targetDoc.lineAt(0).range.start, + targetDoc.lineAt(targetDoc.lineCount - 1).range.end, + ); + edit.replace(targetUri, fullRange, mergedContent); + await vscode.workspace.applyEdit(edit); + + // Delete the source file + await vscode.workspace.fs.delete(sourceUri); + } + /** * After a rename operation, ensure the index is consistent by * re-indexing all files that were touched (the source document, @@ -472,17 +641,64 @@ export class WikilinkRenameTracker implements vscode.Disposable { sourceDocument: vscode.TextDocument, renames: (DetectedRename & { isAlias: boolean; canonicalPageId?: number })[], fileRenames: { oldUri: vscode.Uri; newUri: vscode.Uri; label: string }[], + fileMerges: { 
oldUri: vscode.Uri; newUri: vscode.Uri; label: string }[], + affectedReferenceUris: vscode.Uri[], ): Promise<void> { if (!this.indexService.isOpen) { return; } const indexedUris = new Set<string>(); - // Re-index the source document - try { - await this.indexScanner.indexFile(sourceDocument.uri); - indexedUris.add(sourceDocument.uri.toString()); - } catch { - // File may have been deleted + const replacedSourceUris = new Map<string, vscode.Uri>(); + for (const fr of fileRenames) { + replacedSourceUris.set(fr.oldUri.toString(), fr.newUri); + } + for (const fm of fileMerges) { + replacedSourceUris.set(fm.oldUri.toString(), fm.newUri); + } + + // Re-index the initiating document from its buffer unless its URI has + // been replaced by a rename/merge during the operation. + if (!replacedSourceUris.has(sourceDocument.uri.toString())) { + try { + const relativePath = this.notesRootUri + ? toNotesRelativePath(this.notesRootUri.fsPath, sourceDocument.uri.fsPath) + : vscode.workspace.asRelativePath(sourceDocument.uri, false); + const filename = sourceDocument.uri.fsPath.split(/[/\\]/).pop() ?? 
''; + this.indexService.indexFileContent(relativePath, filename, sourceDocument.getText(), Date.now()); + indexedUris.add(sourceDocument.uri.toString()); + } catch { + // Document may no longer be indexable + } + } + + for (const uri of affectedReferenceUris) { + if (!indexedUris.has(uri.toString())) { + try { + await reindexWorkspaceUri(uri, { + indexService: this.indexService, + indexScanner: this.indexScanner, + notesRootPath: this.notesRootUri?.fsPath, + }); + indexedUris.add(uri.toString()); + } catch { + // File may have been deleted or moved + } + } + } + + for (const fm of fileMerges) { + if (!indexedUris.has(fm.newUri.toString())) { + try { + await reindexWorkspaceUri(fm.newUri, { + indexService: this.indexService, + indexScanner: this.indexScanner, + notesRootPath: this.notesRootUri?.fsPath, + }); + indexedUris.add(fm.newUri.toString()); + } catch { + // Target may not exist if merge failed + } + } } // Re-index renamed files (at their new locations) @@ -495,7 +711,11 @@ export class WikilinkRenameTracker implements vscode.Disposable { : vscode.workspace.asRelativePath(fr.oldUri, false); this.indexService.removePage(oldPath); // Index at the new path - await this.indexScanner.indexFile(fr.newUri); + await reindexWorkspaceUri(fr.newUri, { + indexService: this.indexService, + indexScanner: this.indexScanner, + notesRootPath: this.notesRootUri?.fsPath, + }); indexedUris.add(fr.newUri.toString()); } catch { // Target may not exist if rename failed @@ -546,50 +766,22 @@ export class WikilinkRenameTracker implements vscode.Disposable { this.indexService.saveToFile(); } - /** - * Replace every `[[oldPageName]]` wikilink with `[[newPageName]]` - * across all markdown files in the workspace. 
- */ - private async updateLinksInWorkspace( - oldPageName: string, - newPageName: string, - ): Promise<void> { - const mdFiles = await vscode.workspace.findFiles('**/*.{md,markdown}'); - const workspaceEdit = new vscode.WorkspaceEdit(); - const affectedUris = new Set<string>(); - - for (const fileUri of mdFiles) { - const doc = await vscode.workspace.openTextDocument(fileUri); - - for (let line = 0; line < doc.lineCount; line++) { - const text = doc.lineAt(line).text; - const wikilinks = this.wikilinkService.extractWikilinks(text); - - for (const wl of wikilinks) { - if (wl.pageName === oldPageName) { - const range = new vscode.Range( - line, wl.startPositionInText, - line, wl.endPositionInText + 1, - ); - workspaceEdit.replace(fileUri, range, `[[${newPageName}]]`); - affectedUris.add(fileUri.toString()); - } - } - } - } - - if (affectedUris.size > 0) { - await vscode.workspace.applyEdit(workspaceEdit); + private getRefactorCandidateUris(pageNames: string[], currentDocumentUri?: vscode.Uri): vscode.Uri[] { + const rootUri = this.notesRootUri ?? vscode.workspace.workspaceFolders?.[0]?.uri; + if (!rootUri) { return currentDocumentUri ? [currentDocumentUri] : []; } - // Save affected files so the workspace is in a clean state - for (const uriStr of affectedUris) { - const doc = vscode.workspace.textDocuments.find( - (d) => d.uri.toString() === uriStr, - ); - if (doc?.isDirty) { - await doc.save(); - } - } + const pages = this.indexService.findPagesLinkingToPageNames(pageNames); + const uris = pages.map(page => vscode.Uri.joinPath(rootUri, page.path)); + if (currentDocumentUri) { + uris.push(currentDocumentUri); } + return uris; + } + + private toRelativePath(uri: vscode.Uri): string { + return this.notesRootUri + ? 
toNotesRelativePath(this.notesRootUri.fsPath, uri.fsPath) + : vscode.workspace.asRelativePath(uri, false); } + } diff --git a/vs-code-extension/src/extension.ts b/vs-code-extension/src/extension.ts index b3da7c7..88e0e31 100644 --- a/vs-code-extension/src/extension.ts +++ b/vs-code-extension/src/extension.ts @@ -4,6 +4,7 @@ import * as fs from 'fs'; import { WikilinkService, wikilinkPlugin, type WikilinkResolverFn } from 'as-notes-common'; import { WikilinkFileService } from './WikilinkFileService.js'; import { WikilinkDecorationManager } from './WikilinkDecorationManager.js'; +import { handleExplorerRenameRefactors } from './WikilinkExplorerRenameRefactorService.js'; import { WikilinkDocumentLinkProvider } from './WikilinkDocumentLinkProvider.js'; import { WikilinkHoverProvider } from './WikilinkHoverProvider.js'; import { WikilinkRenameTracker } from './WikilinkRenameTracker.js'; @@ -37,8 +38,8 @@ import { activateLicenceKey, checkServerForRevocation, verifyLicenceFromSettings import * as EncryptionService from './EncryptionService.js'; import { ensurePreCommitHook } from './GitHookService.js'; import { applyAssetPathSettings } from './ImageDropProvider.js'; -import { LogService, NO_OP_LOGGER } from './LogService.js'; -import { findInnermostOpenBracket } from './CompletionUtils.js'; +import { LogService, NO_OP_LOGGER, formatLogError, setActiveLogger } from './LogService.js'; +import { findInnermostOpenBracket, hasNewCompleteWikilink } from './CompletionUtils.js'; import { IgnoreService } from './IgnoreService.js'; import { SlashCommandProvider } from './SlashCommandProvider.js'; import { openDatePicker } from './DatePickerService.js'; @@ -535,7 +536,7 @@ export async function activate(context: vscode.ExtensionContext): Promise<{ exte inlineEditorManager?.refreshLicenceGate(); }).catch((err) => { clearTimeout(progressTimer); - console.warn('as-notes: licence validation failed:', err); + logService.warn('extension', `licence validation failed: 
${formatLogError(err)}`); }); } @@ -562,7 +563,7 @@ export async function activate(context: vscode.ExtensionContext): Promise<{ exte // Clean up legacy SecretStorage keys from the JWT-based system. migrateOldSecrets(context.secrets).catch((err) => { - console.warn('as-notes: failed to migrate old secrets:', err); + logService.warn('extension', `failed to migrate old secrets: ${formatLogError(err)}`); }); // Verify licence from settings on startup (instant, offline Ed25519 check). @@ -573,7 +574,7 @@ export async function activate(context: vscode.ExtensionContext): Promise<{ exte updateFullModeStatusBar(); inlineEditorManager?.refreshLicenceGate(); }).catch((err) => { - console.warn('as-notes: failed to verify licence from settings:', err); + logService.warn('extension', `failed to verify licence from settings: ${formatLogError(err)}`); }); // Periodic background check for revocation (every 7 days). @@ -587,7 +588,7 @@ export async function activate(context: vscode.ExtensionContext): Promise<{ exte updateFullModeStatusBar(); inlineEditorManager?.refreshLicenceGate(); }).catch((err) => { - console.warn('as-notes: periodic licence check failed:', err); + logService.warn('extension', `periodic licence check failed: ${formatLogError(err)}`); }); }, VALIDATION_INTERVAL_MS); @@ -647,7 +648,7 @@ export async function activate(context: vscode.ExtensionContext): Promise<{ exte // extendMarkdownIt — if we block here on DB init + stale scan, // the preview hangs waiting and never renders wikilinks. 
enterFullMode(context, workspaceRoot).catch(err => { - console.error('as-notes: failed to enter full mode, falling back to passive', err); + logService.error('extension', `failed to enter full mode, falling back to passive: ${formatLogError(err)}`); setPassiveMode('Index initialisation failed'); }); } else { @@ -738,6 +739,7 @@ async function enterFullMode( const loggingEnabled = config.get('enableLogging', false) || process.env.AS_NOTES_DEBUG === '1'; logService = new LogService(nrp.logDir, { enabled: loggingEnabled }); + setActiveLogger(logService); if (logService.isEnabled) { logService.info('extension', 'Logging activated'); } @@ -1056,8 +1058,9 @@ async function enterFullMode( const summary = await indexScanner.staleScan(); if (summary.newFiles > 0 || summary.staleFiles > 0 || summary.deletedFiles > 0) { indexService.saveToFile(); - console.log( - `as-notes: stale scan — ${summary.newFiles} new, ${summary.staleFiles} stale, ${summary.deletedFiles} deleted, ${summary.unchanged} unchanged`, + logService.info( + 'extension', + `stale scan — ${summary.newFiles} new, ${summary.staleFiles} stale, ${summary.deletedFiles} deleted, ${summary.unchanged} unchanged`, ); } } @@ -1087,9 +1090,17 @@ async function enterFullMode( // Rename tracker — backed by index for pre-edit state comparison const renameTracker = new WikilinkRenameTracker( - wikilinkService, fileService, indexService, indexScanner, nrUri, + wikilinkService, fileService, indexService, indexScanner, nrUri, logService, ); fullModeDisposables.push(renameTracker); + fullModeDisposables.push( + renameTracker.onDidDeclineRename(() => { + completionProvider?.refresh(); + taskPanelProvider?.refresh(); + searchPanelProvider?.refresh(); + backlinkPanelProvider?.refresh(); + }), + ); // Completion provider — wikilink autocomplete triggered by [[ completionProvider = new WikilinkCompletionProvider(indexService, logService); @@ -1467,7 +1478,7 @@ async function enterFullMode( // Configure the built-in markdown 
copy-files destination to use our asset path applyAssetPathSettings().catch(err => - console.warn('as-notes: failed to apply asset path settings:', err), + logService.warn('extension', `failed to apply asset path settings: ${formatLogError(err)}`), ); // Todo toggle — requires full mode (index needed for task panel sync) @@ -1660,7 +1671,7 @@ async function enterFullMode( } encrypted++; } catch (err) { - console.warn(`as-notes: failed to encrypt ${fileUri.fsPath}:`, err); + logService.warn('extension', `failed to encrypt ${fileUri.fsPath}: ${formatLogError(err)}`); errors++; } } @@ -1718,7 +1729,7 @@ async function enterFullMode( } decrypted++; } catch (err) { - console.warn(`as-notes: failed to decrypt ${fileUri.fsPath}:`, err); + logService.warn('extension', `failed to decrypt ${fileUri.fsPath}: ${formatLogError(err)}`); errors++; } } @@ -1949,18 +1960,18 @@ async function enterFullMode( searchPanelProvider?.refresh(); backlinkPanelProvider?.refresh(); } catch (err) { - console.warn('as-notes: failed to index on save:', err); + logService.warn('extension', `failed to index on save: ${formatLogError(err)}`); } }), ); // On text change: debounced re-index of the live buffer so that newly - // typed wikilinks (forward references) appear in autocomplete immediately - // without requiring a save or editor switch. + // typed wikilinks (forward references) can appear in autocomplete without + // requiring a save or editor switch. // - // Note: completionProvider.refresh() is NOT called here — the 3 SQLite - // queries it runs are expensive and this fires on every typing pause. - // Forward references appear in autocomplete after the next save. + // To keep this cheap, the completion cache is only refreshed when the + // current document contains a newly added complete wikilink compared to + // the page's last indexed link set. 
fullModeDisposables.push( vscode.workspace.onDidChangeTextDocument((e) => { if (!isMarkdown(e.document)) { return; } @@ -1971,20 +1982,35 @@ async function enterFullMode( completionDebounceHandle = setTimeout(() => { completionDebounceHandle = undefined; const doc = e.document; + // Skip re-indexing while a rename operation is in progress. + // The rename flow applies workspace edits to multiple files; + // re-indexing them mid-operation is wasteful and can interfere + // with the rename flow. refreshIndexAfterRename handles all + // re-indexing once the rename completes. + if (renameTracker.isRenaming) { return; } // Skip re-indexing while a rename check is pending for this document. // The rename tracker needs the stale index state to detect the change; // refreshIndexAfterRename will re-index the file once the rename completes. if (renameTracker.hasPendingEdit(doc.uri.toString())) { return; } - const end = logService.time('debounce', 'indexFileContent + refresh'); const relativePath = notesRootPaths ? toNotesRelativePath(notesRootPaths.root, doc.uri.fsPath) : vscode.workspace.asRelativePath(doc.uri, false); const filename = path.basename(doc.uri.fsPath); + const page = indexService!.getPageByPath(relativePath); + const indexedLinks = page ? 
indexService!.getLinksForPage(page.id) : [];
+        const lines: string[] = [];
+        for (let i = 0; i < doc.lineCount; i++) {
+          lines.push(doc.lineAt(i).text);
+        }
+        const refreshCompletion = hasNewCompleteWikilink(lines, indexedLinks, wikilinkService);
+        indexService!.indexFileContent(relativePath, filename, doc.getText(), Date.now());
+        if (refreshCompletion) {
+          completionProvider?.refresh();
+        }
         taskPanelProvider?.refresh();
         searchPanelProvider?.refresh();
         backlinkPanelProvider?.refresh();
-        end();
       }, 500);
     }),
   );
@@ -2029,7 +2055,7 @@ async function enterFullMode(
       try {
        await indexScanner!.indexFile(fileUri);
       } catch (err) {
-        console.warn('as-notes: failed to index created file:', err);
+        logService.warn('extension', `failed to index created file: ${formatLogError(err)}`);
       }
     }
   }
@@ -2063,7 +2089,7 @@ async function enterFullMode(
     try {
       await indexScanner!.staleScan();
     } catch (err) {
-      console.warn('as-notes: stale scan after folder delete failed:', err);
+      logService.warn('extension', `stale scan after folder delete failed: ${formatLogError(err)}`);
     }
   }
   if (!safeSaveToFile()) { return; }
@@ -2093,7 +2119,7 @@ async function enterFullMode(
       try {
         await indexScanner!.indexFile(newUri);
       } catch (err) {
-        console.warn('as-notes: failed to index renamed file:', err);
+        logService.warn('extension', `failed to index renamed file: ${formatLogError(err)}`);
       }
     }
   }
@@ -2103,7 +2129,7 @@ async function enterFullMode(
     try {
       await indexScanner!.staleScan();
     } catch (err) {
-      console.warn('as-notes: stale scan after folder rename failed:', err);
+      logService.warn('extension', `stale scan after folder rename failed: ${formatLogError(err)}`);
     }
   }
   if (!safeSaveToFile()) { return; }
@@ -2112,6 +2138,22 @@ async function enterFullMode(
   searchPanelProvider?.refresh();
   backlinkPanelProvider?.refresh();
   calendarPanelProvider?.refresh();
+
+  await handleExplorerRenameRefactors({
+    files: e.files,
+    renameTrackerIsRenaming: renameTracker.isRenaming,
+    wikilinkService,
+    indexService:
indexService!, + indexScanner: indexScanner!, + notesRootPath: notesRootPaths?.root, + safeSaveToFile, + refreshProviders: () => { + completionProvider?.refresh(); + taskPanelProvider?.refresh(); + searchPanelProvider?.refresh(); + backlinkPanelProvider?.refresh(); + }, + }); }), ); @@ -2184,7 +2226,7 @@ async function enterFullMode( calendarPanelProvider?.refresh(); updateFullModeStatusBar(); } - }).catch(err => console.warn('as-notes: stale scan after .asnotesignore change failed:', err)); + }).catch(err => logService.warn('extension', `stale scan after .asnotesignore change failed: ${formatLogError(err)}`)); }; ignoreFileWatcher.onDidChange(onIgnoreFileChange); ignoreFileWatcher.onDidCreate(onIgnoreFileChange); @@ -2200,7 +2242,7 @@ async function enterFullMode( } if (e.affectsConfiguration('as-notes.assetPath')) { applyAssetPathSettings().catch(err => - console.warn('as-notes: failed to apply asset path settings on config change:', err), + logService.warn('extension', `failed to apply asset path settings on config change: ${formatLogError(err)}`), ); } if (e.affectsConfiguration('as-notes.rootDirectory')) { @@ -2289,10 +2331,11 @@ function exitFullMode(): void { ignoreService = undefined; completionProvider = undefined; logService.info('extension', 'exitFullMode: complete'); + logService.info('extension', '.asnotes/ directory removed — switched to passive mode'); + setActiveLogger(NO_OP_LOGGER); logService = NO_OP_LOGGER; disposeFullMode(); setPassiveMode(); - console.log('as-notes: .asnotes/ directory removed — switched to passive mode'); } /** @@ -2417,6 +2460,7 @@ async function initWorkspace(context: vscode.ExtensionContext): Promise { const loggingEnabled = config.get('enableLogging', false) || process.env.AS_NOTES_DEBUG === '1'; logService = new LogService(nrp.logDir, { enabled: loggingEnabled }); + setActiveLogger(logService); if (logService.isEnabled) { logService.info('extension', 'initWorkspace: logging activated'); } @@ -2466,7 +2510,7 @@ async 
function rebuildIndex(): Promise<void> {
   // Re-apply asset path settings in case they were modified externally
   applyAssetPathSettings().catch(err =>
-    console.warn('as-notes: failed to re-apply asset path settings on rebuild:', err),
+    logService.warn('extension', `failed to re-apply asset path settings on rebuild: ${formatLogError(err)}`),
   );
 
   await vscode.window.withProgress(
@@ -2509,8 +2553,7 @@ async function rebuildIndex(): Promise<void> {
     );
     logService.info('extension', `rebuildIndex: complete — ${result.filesIndexed} files, ${result.linksFound} links`);
   } catch (err: unknown) {
-    const msg = err instanceof Error ? err.message : String(err);
-    console.error('as-notes: rebuildIndex failed:', err);
+    const msg = formatLogError(err);
     logService.error('extension', `rebuildIndex: failed — ${msg}`);
     vscode.window.showErrorMessage(`AS Notes: Rebuild failed — ${msg}`);
   }
@@ -2555,7 +2598,7 @@ async function cleanWorkspace(): Promise<void> {
   try {
     fs.rmSync(asnotesDir, { recursive: true, force: true });
   } catch (err) {
-    const msg = err instanceof Error ?
err.message : String(err);
+    const msg = formatLogError(err);
     vscode.window.showErrorMessage(`AS Notes: Failed to remove .asnotes/ — ${msg}`);
     return;
   }
@@ -2757,7 +2800,7 @@ async function openDailyJournal(notesRoot: vscode.Uri, date?: Date): Promise {
       await vscode.workspace.fs.rename(oldUri, newUri, { overwrite: false });
       renamed++;
     } catch (err) {
-      console.warn(`as-notes: failed to rename ${oldName} to ${newName}:`, err);
+      logService.warn('extension', `failed to rename ${oldName} to ${newName}: ${formatLogError(err)}`);
       errors++;
     }
   }
@@ -2873,7 +2916,7 @@ function startPeriodicScan(): void {
         logService.info('extension', 'periodicScan: no changes');
       }
     } catch (err) {
-      console.warn('as-notes: periodic scan failed:', err);
+      logService.warn('extension', `periodic scan failed: ${formatLogError(err)}`);
     }
   }, intervalMs);
 }
@@ -2930,7 +2973,7 @@ function isEncryptedFileUri(uri: vscode.Uri): boolean {
  */
 function createExtendMarkdownIt(): (md: any) => any {
   const wikilinkService = new WikilinkService();
-  console.log('as-notes: createExtendMarkdownIt() called');
+  logService.info('extension', 'createExtendMarkdownIt() called');
 
   const resolver: WikilinkResolverFn = (pageFileName, env) => {
     const sourcePath = getSourcePathFromEnv(env);
@@ -2968,7 +3011,7 @@ function createExtendMarkdownIt(): (md: any) => any {
   };
 
   return (md: any) => {
-    console.log('as-notes: extendMarkdownIt() invoked by VS Code markdown preview');
+    logService.info('extension', 'extendMarkdownIt() invoked by VS Code markdown preview');
     wikilinkPlugin(md, { wikilinkService, resolver });
     return md;
   };
diff --git a/vs-code-extension/src/inline-editor/code-block-hover-provider.ts b/vs-code-extension/src/inline-editor/code-block-hover-provider.ts
index 2839db2..09026c7 100644
--- a/vs-code-extension/src/inline-editor/code-block-hover-provider.ts
+++ b/vs-code-extension/src/inline-editor/code-block-hover-provider.ts
@@ -5,6 +5,7 @@ import { MarkdownParseCache } from './markdown-parse-cache';
 import {
renderMermaidSvgNatural, createErrorSvg, svgToDataUri } from './mermaid/mermaid-renderer'; import { svgToDataUriBase64 } from './mermaid/svg-processor'; import * as cheerio from 'cheerio'; +import { formatLogError, getActiveLogger } from '../LogService.js'; /** * Configuration for code block hover previews @@ -60,26 +61,26 @@ type CodeBlockHoverHandler = ( function ensureSvgDimensions(svgString: string, width: number, height: number): string { const $ = cheerio.load(svgString, { xmlMode: true }); const svgNode = $('svg').first(); - + if (svgNode.length === 0) { return svgString; } - + // Set explicit pixel-based dimensions svgNode.attr('width', `${width}px`); svgNode.attr('height', `${height}px`); - + // Ensure viewBox exists if it doesn't, using the calculated dimensions if (!svgNode.attr('viewBox')) { svgNode.attr('viewBox', `0 0 ${width} ${height}`); } - + // Remove any percentage-based width that might interfere const currentWidth = svgNode.attr('width'); if (currentWidth && currentWidth.includes('%')) { svgNode.attr('width', `${width}px`); } - + return svgNode.toString(); } @@ -125,7 +126,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { : 'default'; const fontFamily = vscode.workspace.getConfiguration('editor').get('fontFamily'); - + // Render at natural size first to get actual diagram dimensions // Pass cancellation token for shorter timeout (5 seconds) to match VS Code hover timeout const naturalSvg = await renderMermaidSvgNatural(source, { @@ -136,7 +137,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // Parse natural SVG to get actual dimensions const $ = cheerio.load(naturalSvg, { xmlMode: true }); const svgNode = $('svg').first(); - + if (svgNode.length === 0) { return undefined; } @@ -145,11 +146,11 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // Try to get width/height from attributes, or viewBox const widthAttr = svgNode.attr('width') || '0'; const heightAttr = 
svgNode.attr('height') || '0'; - + // Handle percentage-based dimensions (Mermaid often uses width="100%") let svgWidth = widthAttr === '100%' ? 0 : parseFloat(widthAttr) || 0; let svgHeight = parseFloat(heightAttr) || 0; - + // If no explicit width/height, try viewBox (most reliable source) if ((svgWidth === 0 || svgHeight === 0) && svgNode.attr('viewBox')) { const viewBox = svgNode.attr('viewBox')!.split(/\s+/); @@ -227,7 +228,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // Further reduced from 1200px to 1000px for better size reduction const maxWidth = Math.min(config.maxWidth, 1000); // Cap at 1000px (reduced from 1200px) const maxHeight = Math.min(config.maxHeight, 750); // Cap at 750px (reduced from 900px) - + if (hoverWidth > maxWidth) { const aspectRatio = hoverHeight / hoverWidth; hoverWidth = maxWidth; @@ -242,14 +243,14 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // Ensure SVG has explicit dimensions for proper display in hover // Some diagrams (like state diagrams) may have percentage-based or missing dimensions const processedSvg = ensureSvgDimensions(naturalSvg, hoverWidth, hoverHeight); - + return { html: processedSvg, width: Math.round(hoverWidth), height: Math.round(hoverHeight), }; } catch (error) { - console.warn('[Code Block Hover] Mermaid render failed:', error instanceof Error ? 
error.message : error); + getActiveLogger().warn('CodeBlockHover', `Mermaid render failed: ${formatLogError(error)}`); // Create error SVG to display in hover instead of returning undefined // Extract meaningful error message let errorMessage = 'Rendering failed'; @@ -361,7 +362,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // First, find all code blocks with language identifiers const languageDecs = decorations.filter(d => d.type === 'codeBlockLanguage'); const codeBlockDecs = decorations.filter(d => d.type === 'codeBlock'); - + // Check each code block to see if hover is within it for (const codeBlockDec of codeBlockDecs) { if (token.isCancellationRequested) { @@ -374,7 +375,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // Check if hover is within this code block if (hoverOffset >= codeStart && hoverOffset < codeEnd) { // Find the language identifier for this code block - const langDec = languageDecs.find(l => + const langDec = languageDecs.find(l => l.startPos >= codeBlockDec.startPos && l.endPos <= codeBlockDec.endPos ); @@ -382,7 +383,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { if (langDec) { // Extract language from text const language = text.substring(langDec.startPos, langDec.endPos).trim().toLowerCase(); - + // Check if we have a handler for this language if (this.handlers.has(language)) { // Extract source code from the code block @@ -396,7 +397,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { if (lines.length >= 3) { // Remove first line (opening fence with language) and last line (closing fence) const source = lines.slice(1, -1).join('\n'); - + return this.createHoverForCodeBlock( source, language, @@ -441,19 +442,19 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { const markdown = new vscode.MarkdownString(); markdown.supportHtml = true; markdown.isTrusted = true; // SVGs are safe - + if (result.html) { // Try 
multiple encoding strategies to avoid VS Code hover size limits // Strategy: URL encoding (smaller) -> Base64 (different handling) -> fallback message const svgSize = result.html.length; const MAX_DATA_URI_SIZE = 80 * 1024; // 80KB threshold - + let imageSrc: string | undefined; - + // First try URL-encoded data URI (typically smaller for SVG) const urlEncodedUri = svgToDataUri(result.html); const urlEncodedSize = urlEncodedUri.length; - + if (urlEncodedSize <= MAX_DATA_URI_SIZE) { // URL encoding fits - use it imageSrc = urlEncodedUri; @@ -461,7 +462,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // URL encoding too large - try Base64 (might be handled differently by VS Code) const base64Uri = svgToDataUriBase64(result.html); const base64Size = base64Uri.length; - + if (base64Size <= MAX_DATA_URI_SIZE) { // Base64 fits - use it imageSrc = base64Uri; @@ -469,7 +470,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { // Both encodings too large - show helpful, informative message const sizeKB = Math.round(svgSize / 1024); const dataUriKB = Math.round(Math.max(urlEncodedSize, base64Size) / 1024); - + markdown.appendMarkdown( `## 📊 Mermaid Diagram Preview\n\n` + `**Size:** ${sizeKB}KB (data URI: ${dataUriKB}KB) \n` + @@ -484,7 +485,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { imageSrc = undefined; } } - + if (imageSrc) { const escapedUri = escapeHtmlAttribute(imageSrc); markdown.appendMarkdown( @@ -514,7 +515,7 @@ export class CodeBlockHoverProvider implements vscode.HoverProvider { return new vscode.Hover(markdown, hoverRange); } catch (error) { - console.warn(`[Code Block Hover] Failed to create hover for ${language}:`, error); + getActiveLogger().warn('CodeBlockHover', `Failed to create hover for ${language}: ${formatLogError(error)}`); return undefined; } } diff --git a/vs-code-extension/src/inline-editor/decorator.ts b/vs-code-extension/src/inline-editor/decorator.ts index 
f60283e..7e1f1ab 100644 --- a/vs-code-extension/src/inline-editor/decorator.ts +++ b/vs-code-extension/src/inline-editor/decorator.ts @@ -12,6 +12,7 @@ import { MermaidDiagramDecorations } from './decorator/mermaid-diagram-decoratio import { MathDecorations } from './math/math-decorations'; import { renderMermaidSvg, svgToDataUri, createErrorSvg } from './mermaid/mermaid-renderer'; import { MermaidHoverIndicatorDecorationType } from './decorations'; +import { formatLogError, getActiveLogger } from '../LogService.js'; /** * Performance and caching constants. @@ -590,7 +591,7 @@ export class Decorator { const svg = await renderMermaidSvg(block.source, { theme, fontFamily, numLines: block.numLines }); return svgToDataUri(svg); } catch (error) { - console.warn('Mermaid render failed:', error instanceof Error ? error.message : error); + getActiveLogger().warn('InlineDecorator', `Mermaid render failed: ${formatLogError(error)}`); // Create error SVG to display instead of silently failing. let errorMessage = 'Rendering failed'; if (error instanceof Error) { diff --git a/vs-code-extension/src/inline-editor/link-click-handler.ts b/vs-code-extension/src/inline-editor/link-click-handler.ts index 356da1e..ab6cb45 100644 --- a/vs-code-extension/src/inline-editor/link-click-handler.ts +++ b/vs-code-extension/src/inline-editor/link-click-handler.ts @@ -8,6 +8,7 @@ import { } from "./link-targets"; import { MarkdownParseCache } from "./markdown-parse-cache"; import { getForgeContext } from "./forge-context"; +import { formatLogError, getActiveLogger } from "../LogService.js"; /** * Handles single-click navigation for markdown links and images. @@ -20,7 +21,7 @@ export class LinkClickHandler { private disposables: vscode.Disposable[] = []; private isEnabled: boolean = false; - constructor(private parseCache: MarkdownParseCache) {} + constructor(private parseCache: MarkdownParseCache) { } /** * Enables or disables single-click link navigation. 
@@ -175,7 +176,7 @@ export class LinkClickHandler {
       await vscode.commands.executeCommand("vscode.open", target.uri);
     } catch (error) {
       // File might not exist, silently fail
-      console.warn("Failed to open link:", error);
+      getActiveLogger().warn('LinkClickHandler', `Failed to open link: ${formatLogError(error)}`);
     }
   }
diff --git a/vs-code-extension/src/inline-editor/mermaid/mermaid-renderer.ts b/vs-code-extension/src/inline-editor/mermaid/mermaid-renderer.ts
index 50cad3f..eaca66d 100644
--- a/vs-code-extension/src/inline-editor/mermaid/mermaid-renderer.ts
+++ b/vs-code-extension/src/inline-editor/mermaid/mermaid-renderer.ts
@@ -5,6 +5,7 @@ import { processSvg } from './svg-processor';
 import { createErrorSvg, extractErrorMessage } from './error-handler';
 import { MERMAID_CONSTANTS } from './constants';
 import type { MermaidRenderOptions } from './types';
+import { getActiveLogger } from '../../LogService.js';
 
 // Singleton webview manager instance
 let webviewManager: MermaidWebviewManager | undefined;
@@ -19,7 +20,7 @@ let hasLoggedWaitingForWebview = false;
 async function waitForWebviewOnceLogged(manager: MermaidWebviewManager): Promise<void> {
   if (!hasLoggedWaitingForWebview) {
     hasLoggedWaitingForWebview = true;
-    console.warn('Mermaid: waiting for webview');
+    getActiveLogger().warn('MermaidRenderer', 'waiting for webview');
   }
   await manager.waitForWebview();
 }
@@ -32,7 +33,7 @@ export function initMermaidRenderer(context: vscode.ExtensionContext): void {
     // Already initialized
     return;
   }
-  
+
   webviewManager = new MermaidWebviewManager();
   webviewManager.initialize(context);
 }
@@ -56,28 +57,28 @@ export async function renderMermaidSvgNatural(
   await waitForWebviewOnceLogged(webviewManager);
 
   const darkMode = options.theme === 'dark';
-  
+
   // Check cancellation before starting expensive operation
   if (cancellationToken?.isCancellationRequested) {
     throw new vscode.CancellationError();
   }
-  
+
   // Use shorter timeout for hover requests (5 seconds) to match VS Code's hover timeout
// Regular requests use the default 30-second timeout const timeoutMs = cancellationToken ? MERMAID_CONSTANTS.HOVER_REQUEST_TIMEOUT_MS : MERMAID_CONSTANTS.REQUEST_TIMEOUT_MS; - + // Request SVG without processing (get natural dimensions) const svgString = await webviewManager.requestSvg( { source, darkMode, fontFamily: options.fontFamily }, timeoutMs, cancellationToken ); - + // Check cancellation again after await (in case it was cancelled during the request) if (cancellationToken?.isCancellationRequested) { throw new vscode.CancellationError(); } - + // Return raw SVG without height processing return svgString; } @@ -123,7 +124,7 @@ const getMermaidDecoration = memoizeMermaidDecoration(async ( { source, darkMode, fontFamily }, MERMAID_CONSTANTS.REQUEST_TIMEOUT_MS ); - + // Check if this is an error SVG (contains "Mermaid Rendering Error") if (svgString.includes('Mermaid Rendering Error')) { // Recreate error SVG with proper dimensions @@ -135,9 +136,9 @@ const getMermaidDecoration = memoizeMermaidDecoration(async ( ); return errorSvg; } - + const processedSvg = processSvg(svgString, height); - + return processedSvg; }); @@ -161,7 +162,7 @@ export async function renderMermaidSvg( // Default to 200px if numLines not provided const editorConfig = vscode.workspace.getConfiguration('editor'); let lineHeight = editorConfig.get('lineHeight', 0); - + // If lineHeight is 0 or invalid, calculate from fontSize (like Markless does) if (lineHeight === 0 || lineHeight < 8) { const fontSize = editorConfig.get('fontSize', 14); @@ -172,7 +173,7 @@ export async function renderMermaidSvg( lineHeight = 8; // Minimum line height } } - + const numLines = options.numLines || 5; const height = options.height || ((numLines + 2) * lineHeight); @@ -192,7 +193,7 @@ export function disposeMermaidRenderer(): void { webviewManager.dispose(); webviewManager = undefined; } - + // Clear decoration cache decorationCache.clear(); } diff --git 
a/vs-code-extension/src/inline-editor/mermaid/webview-manager.ts b/vs-code-extension/src/inline-editor/mermaid/webview-manager.ts index a266e41..c67ce89 100644 --- a/vs-code-extension/src/inline-editor/mermaid/webview-manager.ts +++ b/vs-code-extension/src/inline-editor/mermaid/webview-manager.ts @@ -3,6 +3,7 @@ import { ColorThemeKind } from 'vscode'; import type { PendingRender, RenderResponse } from './types'; import { MERMAID_CONSTANTS } from './constants'; import { createErrorSvg } from './error-handler'; +import { formatLogError, getActiveLogger } from '../../LogService.js'; /** * Manages the Mermaid webview lifecycle and communication @@ -77,7 +78,7 @@ export class MermaidWebviewManager { }) .catch((err: unknown) => { if (err instanceof Error && err.message === 'timeout') { - console.warn('Mermaid: Webview not ready after opening view'); + getActiveLogger().warn('MermaidWebview', 'Webview not ready after opening view'); } this.initTimeoutId = setTimeout(() => { vscode.commands.executeCommand('workbench.view.explorer'); @@ -87,7 +88,7 @@ export class MermaidWebviewManager { }, (err: unknown) => { if (err !== undefined) { - console.warn('Mermaid: Failed to focus view', err); + getActiveLogger().warn('MermaidWebview', `Failed to focus view: ${formatLogError(err)}`); } } ); diff --git a/vs-code-extension/src/inline-editor/parser.ts b/vs-code-extension/src/inline-editor/parser.ts index 917f67d..2e02515 100644 --- a/vs-code-extension/src/inline-editor/parser.ts +++ b/vs-code-extension/src/inline-editor/parser.ts @@ -18,6 +18,7 @@ import type { import type { Node } from "unist"; import { getRemarkProcessorSync, getRemarkProcessor } from "./parser-remark"; import { getEmojiMap } from "./emoji-map-loader"; +import { formatLogError, getActiveLogger } from '../LogService.js'; import { scanMathRegions } from "./math/math-scanner"; import { config } from "./config"; @@ -259,7 +260,7 @@ export class MarkdownParser { decorations.sort((a, b) => a.startPos - b.startPos); } 
catch (error) { // Gracefully handle parse errors - console.error("Error parsing markdown:", error); + getActiveLogger().error('InlineParser', `Error parsing markdown: ${formatLogError(error)}`); } return { @@ -448,7 +449,7 @@ export class MarkdownParser { } catch (error) { // Gracefully handle invalid positions or processing errors // Individual methods still validate, so this catches unexpected issues - console.warn("Error processing AST node:", node.type, error); + getActiveLogger().warn('InlineParser', `Error processing AST node ${node.type}: ${formatLogError(error)}`); } }, ); diff --git a/vs-code-extension/src/test/IndexService.test.ts b/vs-code-extension/src/test/IndexService.test.ts index ffae70a..807db40 100644 --- a/vs-code-extension/src/test/IndexService.test.ts +++ b/vs-code-extension/src/test/IndexService.test.ts @@ -788,6 +788,24 @@ describe('IndexService — aliases', () => { expect(links[0].page_filename).toBe('New Name.md'); }); + it('finds distinct pages linking to any of the supplied page names', () => { + const sourcePageId = service.upsertPage('notes/Source.md', 'Source.md', 'Source', 1000); + const otherPageId = service.upsertPage('notes/Other.md', 'Other.md', 'Other', 1000); + + service.setLinksForPage(sourcePageId, [ + { page_name: 'OldName', page_filename: 'OldName.md', line: 0, start_col: 0, end_col: 12, context: '[[OldName]]', parent_link_id: null, depth: 0 }, + { page_name: 'AnotherName', page_filename: 'AnotherName.md', line: 1, start_col: 0, end_col: 16, context: '[[AnotherName]]', parent_link_id: null, depth: 0 }, + ]); + + service.setLinksForPage(otherPageId, [ + { page_name: 'OldName', page_filename: 'OldName.md', line: 0, start_col: 0, end_col: 12, context: '[[OldName]]', parent_link_id: null, depth: 0 }, + ]); + + const pages = service.findPagesLinkingToPageNames(['OldName', 'AnotherName']); + + expect(pages.map(p => p.path).sort()).toEqual(['notes/Other.md', 'notes/Source.md']); + }); + it('should find pages by filename for 
subfolder resolution', () => { service.indexFileContent('notes/Page.md', 'Page.md', '# Page in notes', 1000); service.indexFileContent('archive/Page.md', 'Page.md', '# Page in archive', 1000); diff --git a/vs-code-extension/src/test/WikilinkCompletionProvider.test.ts b/vs-code-extension/src/test/WikilinkCompletionProvider.test.ts index 856237a..9cca30d 100644 --- a/vs-code-extension/src/test/WikilinkCompletionProvider.test.ts +++ b/vs-code-extension/src/test/WikilinkCompletionProvider.test.ts @@ -1,6 +1,7 @@ import { describe, it, expect, beforeEach, afterEach } from 'vitest'; import { IndexService } from '../IndexService.js'; -import { findInnermostOpenBracket, findMatchingCloseBracket, isLineInsideFrontMatter, isPositionInsideCode } from '../CompletionUtils.js'; +import { findInnermostOpenBracket, findMatchingCloseBracket, hasNewCompleteWikilink, isLineInsideFrontMatter, isPositionInsideCode } from '../CompletionUtils.js'; +import { WikilinkService } from 'as-notes-common'; // ── Bracket detection ────────────────────────────────────────────────────── @@ -283,6 +284,47 @@ describe('WikilinkCompletionProvider — isPositionInsideCode', () => { }); }); +// ── New wikilink detection ────────────────────────────────────────────────── + +describe('WikilinkCompletionProvider — hasNewCompleteWikilink', () => { + const wikilinkService = new WikilinkService(); + + it('returns true when a new simple wikilink is added', () => { + const lines = ['Before [[New Link]] after']; + const indexedLinks: { page_name: string }[] = []; + + expect(hasNewCompleteWikilink(lines, indexedLinks, wikilinkService)).toBe(true); + }); + + it('returns true when the same wikilink is added a second time', () => { + const lines = ['[[New Link]] and again [[New Link]]']; + const indexedLinks = [{ page_name: 'New Link' }]; + + expect(hasNewCompleteWikilink(lines, indexedLinks, wikilinkService)).toBe(true); + }); + + it('returns true when a nested wikilink is added', () => { + const lines = ['[[Garden 
[[Topic]] Notes]]']; + const indexedLinks: { page_name: string }[] = []; + + expect(hasNewCompleteWikilink(lines, indexedLinks, wikilinkService)).toBe(true); + }); + + it('returns false when text changes but no new wikilink is added', () => { + const lines = ['prefix [[Existing]] suffix']; + const indexedLinks = [{ page_name: 'Existing' }]; + + expect(hasNewCompleteWikilink(lines, indexedLinks, wikilinkService)).toBe(false); + }); + + it('returns false when the only new target is incomplete', () => { + const lines = ['typing [[Incomplete']; + const indexedLinks: { page_name: string }[] = []; + + expect(hasNewCompleteWikilink(lines, indexedLinks, wikilinkService)).toBe(false); + }); +}); + // ── Completion item building (integration with IndexService) ─────────────── describe('WikilinkCompletionProvider — completion item cache', () => { diff --git a/vs-code-extension/src/test/WikilinkExplorerMergeService.test.ts b/vs-code-extension/src/test/WikilinkExplorerMergeService.test.ts new file mode 100644 index 0000000..5dbe12e --- /dev/null +++ b/vs-code-extension/src/test/WikilinkExplorerMergeService.test.ts @@ -0,0 +1,55 @@ +import { describe, expect, it } from 'vitest'; +import type { PageRow } from '../IndexService.js'; +import { + getExistingExplorerMergeTargets, + pickUniqueExplorerMergeTarget, +} from '../WikilinkExplorerMergeService.js'; + +function page(path: string): PageRow { + const filename = path.split('/').pop() ?? 
'Page.md'; + return { + id: 1, + path, + filename, + title: filename.replace(/\.md$/i, ''), + mtime: 0, + indexed_at: 0, + }; +} + +describe('WikilinkExplorerMergeService', () => { + it('returns the single pre-existing merge target when exactly one exists', () => { + const pages = [ + page('source/NewName.md'), + page('target/NewName.md'), + ]; + + expect(pickUniqueExplorerMergeTarget(pages, 'source/NewName.md')?.path).toBe('target/NewName.md'); + }); + + it('returns no merge target when there is no pre-existing duplicate', () => { + const pages = [page('source/NewName.md')]; + + expect(pickUniqueExplorerMergeTarget(pages, 'source/NewName.md')).toBeUndefined(); + }); + + it('returns no merge target when multiple pre-existing duplicates exist', () => { + const pages = [ + page('source/NewName.md'), + page('target-a/NewName.md'), + page('target-b/NewName.md'), + ]; + + expect(getExistingExplorerMergeTargets(pages, 'source/NewName.md')).toHaveLength(2); + expect(pickUniqueExplorerMergeTarget(pages, 'source/NewName.md')).toBeUndefined(); + }); + + it('normalises path casing and slashes before comparing renamed path', () => { + const pages = [ + page('Source/NewName.md'), + page('target/NewName.md'), + ]; + + expect(pickUniqueExplorerMergeTarget(pages, 'source\\newname.md')?.path).toBe('target/NewName.md'); + }); +}); \ No newline at end of file diff --git a/vs-code-extension/src/test/WikilinkExplorerRenameRefactorService.test.ts b/vs-code-extension/src/test/WikilinkExplorerRenameRefactorService.test.ts new file mode 100644 index 0000000..1759eb2 --- /dev/null +++ b/vs-code-extension/src/test/WikilinkExplorerRenameRefactorService.test.ts @@ -0,0 +1,293 @@ +import { beforeEach, describe, expect, it, vi } from 'vitest'; + +vi.mock('vscode', () => { + const disposable = { dispose: vi.fn() }; + class WorkspaceEdit { + replace = vi.fn(); + } + return { + ProgressLocation: { + Notification: 15, + }, + Uri: { + file: vi.fn((fsPath: string) => ({ fsPath, toString: () => 
`file://${fsPath}` })),
+    joinPath: vi.fn((base: { fsPath: string }, child: string) => ({
+      fsPath: `${base.fsPath}/${child}`.replace(/\\/g, '/'),
+      toString: () => `file://${base.fsPath}/${child}`,
+    })),
+  },
+  Range: class {
+    constructor(public startLine: number, public startChar: number, public endLine: number, public endChar: number) { }
+  },
+  WorkspaceEdit,
+  workspace: {
+    asRelativePath: vi.fn((uri: { fsPath: string }) => uri.fsPath),
+    findFiles: vi.fn().mockResolvedValue([{ fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' }]),
+    openTextDocument: vi.fn(),
+    applyEdit: vi.fn().mockResolvedValue(true),
+    textDocuments: [],
+    fs: {
+      delete: vi.fn().mockResolvedValue(undefined),
+      rename: vi.fn().mockResolvedValue(undefined),
+      readFile: vi.fn().mockResolvedValue(new Uint8Array()),
+      writeFile: vi.fn().mockResolvedValue(undefined),
+    },
+    workspaceFolders: [{ uri: { fsPath: '/notes', toString: () => 'file:///notes' } }],
+    onDidRenameFiles: vi.fn(() => disposable),
+  },
+  window: {
+    showInformationMessage: vi.fn(),
+    showWarningMessage: vi.fn(),
+    withProgress: vi.fn(async (_options: unknown, task: Function) => task({ report: vi.fn() }, {})),
+  },
+  };
+});
+
+import * as vscode from 'vscode';
+import { WikilinkService } from 'as-notes-common';
+import { handleExplorerRenameRefactors } from '../WikilinkExplorerRenameRefactorService.js';
+
+describe('WikilinkExplorerRenameRefactorService', () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+  });
+
+  it('shows notification progress when the user accepts explorer reference updates', async () => {
+    const staleScan = vi.fn().mockResolvedValue(undefined);
+    const indexFile = vi.fn().mockResolvedValue(undefined);
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+    // updateLinksInWorkspace now reads via fs.readFile for non-open files
+    vi.mocked(vscode.workspace.fs.readFile).mockResolvedValue(
+      new TextEncoder().encode('[[OldName]]'),
+    );
+
+    await handleExplorerRenameRefactors({
+      files: [{
+        oldUri: { fsPath: '/notes/OldName.md', toString: () => 'file:///notes/OldName.md' },
+        newUri: { fsPath: '/notes/NewName.md', toString: () => 'file:///notes/NewName.md' },
+      }],
+      renameTrackerIsRenaming: false,
+      wikilinkService: new WikilinkService(),
+      indexService: {
+        findPagesByFilename: vi.fn().mockReturnValue([{ id: 1, path: 'NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 }]),
+        findPagesLinkingToPageNames: vi.fn().mockReturnValue([{ id: 3, path: 'Ref.md', filename: 'Ref.md', title: 'Ref', mtime: 0, indexed_at: 0 }]),
+        removePage: vi.fn(),
+      } as never,
+      indexScanner: {
+        staleScan,
+        indexFile,
+      } as never,
+      notesRootPath: '/notes',
+      safeSaveToFile: vi.fn().mockReturnValue(true),
+      refreshProviders: vi.fn(),
+    });
+
+    expect(vscode.window.withProgress).toHaveBeenCalledTimes(1);
+    expect(staleScan).not.toHaveBeenCalled();
+    expect(indexFile).toHaveBeenCalled();
+  });
+
+  it('renames filenames that contain the renamed wikilink text during explorer refactors', async () => {
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+    vi.mocked(vscode.workspace.fs.readFile).mockResolvedValue(
+      new TextEncoder().encode('[[OldName]]'),
+    );
+
+    const indexFile = vi.fn().mockResolvedValue(undefined);
+
+    await handleExplorerRenameRefactors({
+      files: [{
+        oldUri: { fsPath: '/notes/OldName.md', toString: () => 'file:///notes/OldName.md' },
+        newUri: { fsPath: '/notes/NewName.md', toString: () => 'file:///notes/NewName.md' },
+      }],
+      renameTrackerIsRenaming: false,
+      wikilinkService: new WikilinkService(),
+      indexService: {
+        findPagesByFilename: vi.fn().mockReturnValue([{ id: 1, path: 'NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 }]),
+        findPagesLinkingToPageNames: vi.fn().mockReturnValue([{ id: 3, path: 'Ref.md', filename: 'Ref.md', title: 'Ref', mtime: 0, indexed_at: 0 }]),
+        getAllPages: vi.fn().mockReturnValue([
+          { id: 4, path: 'Topic [[OldName]].md', filename: 'Topic [[OldName]].md', title: 'Topic [[OldName]]', mtime: 0, indexed_at: 0 },
+        ]),
+        removePage: vi.fn(),
+      } as never,
+      indexScanner: {
+        staleScan: vi.fn().mockResolvedValue(undefined),
+        indexFile,
+      } as never,
+      notesRootPath: '/notes',
+      safeSaveToFile: vi.fn().mockReturnValue(true),
+      refreshProviders: vi.fn(),
+    });
+
+    expect(vscode.workspace.fs.rename).toHaveBeenCalledWith(
+      expect.objectContaining({ fsPath: '/notes/Topic [[OldName]].md' }),
+      expect.objectContaining({ fsPath: '/notes/Topic [[NewName]].md' }),
+      { overwrite: false },
+    );
+  });
+
+  it('shows notification progress when the user accepts an explorer merge', async () => {
+    vi.mocked(vscode.window.showInformationMessage)
+      .mockResolvedValueOnce('Yes' as never)
+      .mockResolvedValueOnce('No' as never);
+    vi.mocked(vscode.workspace.openTextDocument)
+      .mockResolvedValueOnce({
+        getText: () => '# source',
+        lineCount: 1,
+        lineAt: () => ({ range: { start: 0, end: 7 } }),
+        save: vi.fn().mockResolvedValue(true),
+      } as never)
+      .mockResolvedValueOnce({
+        getText: () => '# target',
+        lineCount: 1,
+        lineAt: () => ({ range: { start: 0, end: 7 } }),
+        save: vi.fn().mockResolvedValue(true),
+      } as never);
+
+    await handleExplorerRenameRefactors({
+      files: [{
+        oldUri: { fsPath: '/notes/OldName.md', toString: () => 'file:///notes/OldName.md' },
+        newUri: { fsPath: '/notes/NewName.md', toString: () => 'file:///notes/NewName.md' },
+      }],
+      renameTrackerIsRenaming: false,
+      wikilinkService: new WikilinkService(),
+      indexService: {
+        findPagesByFilename: vi.fn().mockReturnValue([
+          { id: 1, path: 'NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 },
+          { id: 2, path: 'folder/NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 },
+        ]),
+        findPagesLinkingToPageNames: vi.fn().mockReturnValue([]),
+        removePage: vi.fn(),
+      } as never,
+      indexScanner: {
+        staleScan: vi.fn().mockResolvedValue(undefined),
+        indexFile: vi.fn().mockResolvedValue(undefined),
+      } as never,
+      notesRootPath: '/notes',
+      safeSaveToFile: vi.fn().mockReturnValue(true),
+      refreshProviders: vi.fn(),
+    });
+
+    expect(vscode.window.withProgress).toHaveBeenCalledTimes(1);
+  });
+
+  it('does not show notification progress when the user declines explorer reference updates', async () => {
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never);
+
+    await handleExplorerRenameRefactors({
+      files: [{
+        oldUri: { fsPath: '/notes/OldName.md', toString: () => 'file:///notes/OldName.md' },
+        newUri: { fsPath: '/notes/NewName.md', toString: () => 'file:///notes/NewName.md' },
+      }],
+      renameTrackerIsRenaming: false,
+      wikilinkService: new WikilinkService(),
+      indexService: {
+        findPagesByFilename: vi.fn().mockReturnValue([{ id: 1, path: 'NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 }]),
+        findPagesLinkingToPageNames: vi.fn().mockReturnValue([]),
+        removePage: vi.fn(),
+      } as never,
+      indexScanner: {
+        staleScan: vi.fn().mockResolvedValue(undefined),
+        indexFile: vi.fn().mockResolvedValue(undefined),
+      } as never,
+      notesRootPath: '/notes',
+      safeSaveToFile: vi.fn().mockReturnValue(true),
+      refreshProviders: vi.fn(),
+    });
+
+    expect(vscode.window.withProgress).not.toHaveBeenCalled();
+  });
+
+  it('re-indexes open affected reference documents from their live buffers instead of disk', async () => {
+    const candidateUri = { fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' };
+    const openDoc = {
+      uri: candidateUri,
+      getText: () => '[[OldName]]',
+      lineCount: 1,
+      lineAt: () => ({ text: '[[OldName]]', range: { start: 0, end: 11 } }),
+      isDirty: true,
+      save: vi.fn().mockResolvedValue(true),
+    };
+
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+    vi.mocked(vscode.workspace.textDocuments).splice(0, vi.mocked(vscode.workspace.textDocuments).length, openDoc as never);
+
+    const indexFile = vi.fn().mockResolvedValue(undefined);
+    const indexFileContent = vi.fn();
+
+    await handleExplorerRenameRefactors({
+      files: [{
+        oldUri: { fsPath: '/notes/OldName.md', toString: () => 'file:///notes/OldName.md' },
+        newUri: { fsPath: '/notes/NewName.md', toString: () => 'file:///notes/NewName.md' },
+      }],
+      renameTrackerIsRenaming: false,
+      wikilinkService: new WikilinkService(),
+      indexService: {
+        findPagesByFilename: vi.fn().mockReturnValue([{ id: 1, path: 'NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 }]),
+        findPagesLinkingToPageNames: vi.fn().mockReturnValue([{ id: 3, path: 'Ref.md', filename: 'Ref.md', title: 'Ref', mtime: 0, indexed_at: 0 }]),
+        removePage: vi.fn(),
+        indexFileContent,
+      } as never,
+      indexScanner: {
+        staleScan: vi.fn().mockResolvedValue(undefined),
+        indexFile,
+      } as never,
+      notesRootPath: '/notes',
+      safeSaveToFile: vi.fn().mockReturnValue(true),
+      refreshProviders: vi.fn(),
+    });
+
+    expect(indexFileContent).toHaveBeenCalledWith('Ref.md', 'Ref.md', '[[OldName]]', expect.any(Number));
+    expect(indexFile).not.toHaveBeenCalledWith(candidateUri);
+  });
+
+  it('does not save the target document after an explorer merge', async () => {
+    const targetSave = vi.fn().mockResolvedValue(true);
+    const sourceSave = vi.fn().mockResolvedValue(true);
+
+    vi.mocked(vscode.window.showInformationMessage)
+      .mockResolvedValueOnce('Yes' as never) // merge dialog
+      .mockResolvedValueOnce('No' as never); // reference update dialog
+    vi.mocked(vscode.workspace.openTextDocument)
+      .mockResolvedValueOnce({
+        getText: () => '# source',
+        lineCount: 1,
+        lineAt: () => ({ range: { start: 0, end: 7 } }),
+        save: sourceSave,
+      } as never)
+      .mockResolvedValueOnce({
+        getText: () => '# target',
+        lineCount: 1,
+        lineAt: () => ({ range: { start: 0, end: 7 } }),
+        save: targetSave,
+      } as never);
+
+    await handleExplorerRenameRefactors({
+      files: [{
+        oldUri: { fsPath: '/notes/OldName.md', toString: () => 'file:///notes/OldName.md' },
+        newUri: { fsPath: '/notes/NewName.md', toString: () => 'file:///notes/NewName.md' },
+      }],
+      renameTrackerIsRenaming: false,
+      wikilinkService: new WikilinkService(),
+      indexService: {
+        findPagesByFilename: vi.fn().mockReturnValue([
+          { id: 1, path: 'NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 },
+          { id: 2, path: 'folder/NewName.md', filename: 'NewName.md', title: 'NewName', mtime: 0, indexed_at: 0 },
+        ]),
+        findPagesLinkingToPageNames: vi.fn().mockReturnValue([]),
+        removePage: vi.fn(),
+        indexFileContent: vi.fn(),
+      } as never,
+      indexScanner: {
+        staleScan: vi.fn().mockResolvedValue(undefined),
+        indexFile: vi.fn().mockResolvedValue(undefined),
+      } as never,
+      notesRootPath: '/notes',
+      safeSaveToFile: vi.fn().mockReturnValue(true),
+      refreshProviders: vi.fn(),
+    });
+
+    expect(targetSave).not.toHaveBeenCalled();
+    expect(sourceSave).not.toHaveBeenCalled();
+  });
+});
\ No newline at end of file
diff --git a/vs-code-extension/src/test/WikilinkFilenameRefactorService.test.ts b/vs-code-extension/src/test/WikilinkFilenameRefactorService.test.ts
new file mode 100644
index 0000000..943b670
--- /dev/null
+++ b/vs-code-extension/src/test/WikilinkFilenameRefactorService.test.ts
@@ -0,0 +1,66 @@
+import { describe, expect, it, vi } from 'vitest';
+
+vi.mock('vscode', () => ({
+  Uri: {
+    file: vi.fn((fsPath: string) => ({ fsPath, toString: () => `file://${fsPath}` })),
+    joinPath: vi.fn((base: { fsPath: string }, child: string) => ({
+      fsPath: `${base.fsPath}/${child}`.replace(/\\/g, '/'),
+      toString: () => `file://${base.fsPath}/${child}`,
+    })),
+  },
+}));
+
+import * as vscode from 'vscode';
+import {
+  collectFilenameRefactorOperations,
+  remapUrisForFileOperations,
+} from '../WikilinkFilenameRefactorService.js';
+
+describe('WikilinkFilenameRefactorService', () => {
+  const rootUri = vscode.Uri.file('/notes');
+
+  it('plans a filename rename when a page filename contains the renamed wikilink', () => {
+    const result = collectFilenameRefactorOperations(
+      [{ oldPageName: 'Plant', newPageName: 'Tree' }],
+      [
+        { path: 'Topic [[Plant]].md', filename: 'Topic [[Plant]].md' },
+      ],
+      rootUri,
+    );
+
+    expect(result.fileRenames).toHaveLength(1);
+    expect(result.fileRenames[0].oldUri.fsPath).toContain('Topic [[Plant]].md');
+    expect(result.fileRenames[0].newUri.fsPath).toContain('Topic [[Tree]].md');
+    expect(result.fileMerges).toHaveLength(0);
+  });
+
+  it('plans a filename merge when the renamed filename target already exists', () => {
+    const result = collectFilenameRefactorOperations(
+      [{ oldPageName: 'Plant', newPageName: 'Tree' }],
+      [
+        { path: 'Topic [[Plant]].md', filename: 'Topic [[Plant]].md' },
+        { path: 'Topic [[Tree]].md', filename: 'Topic [[Tree]].md' },
+      ],
+      rootUri,
+    );
+
+    expect(result.fileRenames).toHaveLength(0);
+    expect(result.fileMerges).toHaveLength(1);
+    expect(result.fileMerges[0].oldUri.fsPath).toContain('Topic [[Plant]].md');
+    expect(result.fileMerges[0].newUri.fsPath).toContain('Topic [[Tree]].md');
+  });
+
+  it('remaps candidate URIs through filename rename operations', () => {
+    const oldUri = vscode.Uri.file('/notes/Topic [[Plant]].md');
+    const newUri = vscode.Uri.file('/notes/Topic [[Tree]].md');
+
+    const remapped = remapUrisForFileOperations(
+      [oldUri],
+      [{ oldUri, newUri, label: 'Topic [[Plant]].md → Topic [[Tree]].md' }],
+      [],
+    );
+
+    expect(remapped).toHaveLength(1);
+    expect(remapped[0].fsPath).toBe('/notes/Topic [[Tree]].md');
+  });
+});
\ No newline at end of file
diff --git a/vs-code-extension/src/test/WikilinkRefactorService.test.ts b/vs-code-extension/src/test/WikilinkRefactorService.test.ts
new file mode 100644
index 0000000..2ccbee6
--- /dev/null
+++ b/vs-code-extension/src/test/WikilinkRefactorService.test.ts
@@ -0,0 +1,163 @@
+import { beforeEach, describe, expect, it, vi } from 'vitest';
+
+vi.mock('vscode', () => ({
+  Uri: {
+    file: vi.fn((fsPath: string) => ({ fsPath, toString: () => `file://${fsPath}` })),
+  },
+  Range: class {
+    constructor(public startLine: number, public startChar: number, public endLine: number, public endChar: number) { }
+  },
+  WorkspaceEdit: class {
+    replace = vi.fn();
+  },
+  workspace: {
+    findFiles: vi.fn().mockResolvedValue([]),
+    openTextDocument: vi.fn(),
+    applyEdit: vi.fn().mockResolvedValue(true),
+    textDocuments: [],
+    fs: {
+      readFile: vi.fn(),
+      writeFile: vi.fn().mockResolvedValue(undefined),
+    },
+  },
+}));
+
+import * as vscode from 'vscode';
+import { WikilinkService } from 'as-notes-common';
+import { updateLinksInWorkspace } from '../WikilinkRefactorService.js';
+
+describe('WikilinkRefactorService', () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+    vi.mocked(vscode.workspace.textDocuments).splice(0, vi.mocked(vscode.workspace.textDocuments).length);
+  });
+
+  it('uses provided candidate URIs instead of scanning the whole workspace', async () => {
+    const candidateUri = { fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' };
+    vi.mocked(vscode.workspace.fs.readFile).mockResolvedValue(
+      new TextEncoder().encode('[[OldName]]'),
+    );
+
+    await updateLinksInWorkspace(
+      new WikilinkService(),
+      [{ oldPageName: 'OldName', newPageName: 'NewName' }],
+      { candidateUris: [candidateUri as never] },
+    );
+
+    expect(vscode.workspace.findFiles).not.toHaveBeenCalled();
+    expect(vscode.workspace.fs.readFile).toHaveBeenCalledWith(candidateUri);
+  });
+
+  it('does not auto-save dirty affected documents after applying rename edits', async () => {
+    const candidateUri = { fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' };
+    const save = vi.fn().mockResolvedValue(true);
+    const dirtyDoc = {
+      uri: candidateUri,
+      getText: () => '[[OldName]]',
+      lineCount: 1,
+      lineAt: () => ({ text: '[[OldName]]' }),
+      isDirty: true,
+      save,
+    };
+
+    vi.mocked(vscode.workspace.textDocuments).splice(0, 0, dirtyDoc as never);
+
+    await updateLinksInWorkspace(
+      new WikilinkService(),
+      [{ oldPageName: 'OldName', newPageName: 'NewName' }],
+      { candidateUris: [candidateUri as never] },
+    );
+
+    expect(save).not.toHaveBeenCalled();
+  });
+
+  it('reads from open document buffer instead of disk when file is already open', async () => {
+    const candidateUri = { fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' };
+    const openDoc = {
+      uri: candidateUri,
+      lineCount: 1,
+      lineAt: () => ({ text: '[[OldName]]' }),
+    };
+
+    vi.mocked(vscode.workspace.textDocuments).splice(0, 0, openDoc as never);
+
+    await updateLinksInWorkspace(
+      new WikilinkService(),
+      [{ oldPageName: 'OldName', newPageName: 'NewName' }],
+      { candidateUris: [candidateUri as never] },
+    );
+
+    // Should use the open document, not fs.readFile
+    expect(vscode.workspace.fs.readFile).not.toHaveBeenCalled();
+    expect(vscode.workspace.applyEdit).toHaveBeenCalled();
+  });
+
+  it('reads from disk via fs.readFile for files not already open', async () => {
+    const candidateUri = { fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' };
+    vi.mocked(vscode.workspace.fs.readFile).mockResolvedValue(
+      new TextEncoder().encode('[[OldName]]'),
+    );
+
+    await updateLinksInWorkspace(
+      new WikilinkService(),
+      [{ oldPageName: 'OldName', newPageName: 'NewName' }],
+      { candidateUris: [candidateUri as never] },
+    );
+
+    // Should NOT open a document model — reads raw bytes instead
+    expect(vscode.workspace.openTextDocument).not.toHaveBeenCalled();
+    expect(vscode.workspace.fs.readFile).toHaveBeenCalledWith(candidateUri);
+    expect(vscode.workspace.applyEdit).not.toHaveBeenCalled();
+    expect(vscode.workspace.fs.writeFile).toHaveBeenCalledWith(
+      candidateUri,
+      Buffer.from('[[NewName]]', 'utf-8'),
+    );
+  });
+
+  it('uses workspace edits only for files that are already open', async () => {
+    const openUri = { fsPath: '/notes/Open.md', toString: () => 'file:///notes/Open.md' };
+    const closedUri = { fsPath: '/notes/Closed.md', toString: () => 'file:///notes/Closed.md' };
+    const openDoc = {
+      uri: openUri,
+      lineCount: 1,
+      lineAt: () => ({ text: '[[OldName]]' }),
+    };
+
+    vi.mocked(vscode.workspace.textDocuments).splice(0, 0, openDoc as never);
+    vi.mocked(vscode.workspace.fs.readFile).mockImplementation(async (uri) => {
+      if ((uri as { toString(): string }).toString() === closedUri.toString()) {
+        return new TextEncoder().encode('[[OldName]]');
+      }
+      throw new Error('unexpected uri');
+    });
+
+    await updateLinksInWorkspace(
+      new WikilinkService(),
+      [{ oldPageName: 'OldName', newPageName: 'NewName' }],
+      { candidateUris: [openUri as never, closedUri as never] },
+    );
+
+    expect(vscode.workspace.applyEdit).toHaveBeenCalledTimes(1);
+    expect(vscode.workspace.fs.writeFile).toHaveBeenCalledTimes(1);
+    expect(vscode.workspace.fs.writeFile).toHaveBeenCalledWith(
+      closedUri,
+      Buffer.from('[[NewName]]', 'utf-8'),
+    );
+  });
+
+  it('skips files that do not contain any old page name', async () => {
+    const candidateUri = { fsPath: '/notes/Ref.md', toString: () => 'file:///notes/Ref.md' };
+    vi.mocked(vscode.workspace.fs.readFile).mockResolvedValue(
+      new TextEncoder().encode('No wikilinks here'),
+    );
+
+    const result = await updateLinksInWorkspace(
+      new WikilinkService(),
+      [{ oldPageName: 'OldName', newPageName: 'NewName' }],
+      { candidateUris: [candidateUri as never] },
+    );
+
+    expect(result).toHaveLength(0);
+    expect(vscode.workspace.applyEdit).not.toHaveBeenCalled();
+  });
+});
\ No newline at end of file
diff --git a/vs-code-extension/src/test/WikilinkRenameTracker.test.ts b/vs-code-extension/src/test/WikilinkRenameTracker.test.ts
index 53230db..3512d8c 100644
--- a/vs-code-extension/src/test/WikilinkRenameTracker.test.ts
+++ b/vs-code-extension/src/test/WikilinkRenameTracker.test.ts
@@ -12,23 +12,56 @@ import { describe, it, expect, vi, beforeEach } from 'vitest';
 vi.mock('vscode', () => {
   const disposable = { dispose: vi.fn() };
+  // Minimal EventEmitter stub matching the VS Code API surface
+  class EventEmitter<T> {
+    private listeners: ((e: T) => void)[] = [];
+    event = (listener: (e: T) => void) => {
+      this.listeners.push(listener);
+      return { dispose: () => { this.listeners = this.listeners.filter(l => l !== listener); } };
+    };
+    fire(data: T) { for (const l of this.listeners) { l(data); } }
+    dispose() { this.listeners = []; }
+  }
   return {
+    EventEmitter,
+    ProgressLocation: {
+      Notification: 15,
+    },
+    Uri: {
+      joinPath: vi.fn((...args: unknown[]) => args),
+    },
+    Range: class { constructor(public sl: number, public sc: number, public el: number, public ec: number) { } },
+    WorkspaceEdit: class { replace() { } },
     workspace: {
       onDidChangeTextDocument: vi.fn(() => disposable),
       asRelativePath: vi.fn((uri: { fsPath?: string; toString(): string }) =>
         typeof uri === 'string' ? uri : uri.fsPath ?? uri.toString(),
       ),
+      findFiles: vi.fn().mockResolvedValue([]),
+      fs: {
+        rename: vi.fn().mockResolvedValue(undefined),
+        delete: vi.fn().mockResolvedValue(undefined),
+        readFile: vi.fn().mockResolvedValue(new Uint8Array()),
+      },
+      applyEdit: vi.fn().mockResolvedValue(true),
+      openTextDocument: vi.fn().mockResolvedValue({ getText: () => '', lineCount: 0, lineAt: () => ({ text: '' }) }),
+      textDocuments: [],
     },
     window: {
       onDidChangeActiveTextEditor: vi.fn(() => disposable),
       onDidChangeTextEditorSelection: vi.fn(() => disposable),
       activeTextEditor: undefined,
+      showInformationMessage: vi.fn().mockResolvedValue('No'),
+      showWarningMessage: vi.fn(),
+      showErrorMessage: vi.fn(),
+      withProgress: vi.fn(async (_options: unknown, task: Function) => task({ report: vi.fn() }, {})),
     },
   };
 });

 import { WikilinkRenameTracker } from '../WikilinkRenameTracker.js';
 import { WikilinkService } from 'as-notes-common';
+import * as vscode from 'vscode';

 // ── Minimal dependency stubs ──────────────────────────────────────────────────
@@ -115,3 +148,1121 @@ describe('WikilinkRenameTracker — debounce guard integration', () => {
     expect(tracker.hasPendingEdit(docKey)).toBe(false);
   });
 });
+
+// ── isNestingChange ───────────────────────────────────────────────────────────
+
+describe('WikilinkRenameTracker.isNestingChange', () => {
+  const { isNestingChange } = WikilinkRenameTracker;
+
+  // ── Should detect nesting (skip rename) ───────────────────────────
+
+  it('detects nesting: [[A]] wrapped to [[[[A]] B]]', () => {
+    expect(isNestingChange('A', '[[A]] B')).toBe(true);
+  });
+
+  it('detects wrapping: [[A]] to [[[[A]]]]', () => {
+    expect(isNestingChange('A', '[[A]]')).toBe(true);
+  });
+
+  it('detects un-nesting: [[[[A]] B]] to [[A]]', () => {
+    expect(isNestingChange('[[A]] B', 'A')).toBe(true);
+  });
+
+  it('detects nesting with longer page names', () => {
+    expect(isNestingChange('Project Name', '[[Project Name]] Test Evidences')).toBe(true);
+  });
+
+  it('detects un-nesting with longer page names', () => {
+    expect(isNestingChange('[[Project Name]] Test Evidences', 'Project Name')).toBe(true);
+  });
+
+  // ── Partial bracket manipulation (mid-edit) ───────────────────────
+
+  it('detects partial nesting: pageName "Demo" vs "[Demo" (from [[[Demo]])', () => {
+    expect(isNestingChange('Demo', '[Demo')).toBe(true);
+  });
+
+  it('detects partial nesting: pageName "Demo" vs "[Demo]" (from [[[Demo]]])', () => {
+    expect(isNestingChange('Demo', '[Demo]')).toBe(true);
+  });
+
+  it('detects partial un-nesting: pageName "[Demo" vs "Demo"', () => {
+    expect(isNestingChange('[Demo', 'Demo')).toBe(true);
+  });
+
+  it('detects partial nesting with trailing brackets: "Demo]" vs "Demo"', () => {
+    expect(isNestingChange('Demo]', 'Demo')).toBe(true);
+  });
+
+  // ── Should not detect nesting (allow rename) ──────────────────────
+
+  it('allows simple rename: [[A]] to [[B]]', () => {
+    expect(isNestingChange('A', 'B')).toBe(false);
+  });
+
+  it('allows inner rename: [[X [[A]] Y]] to [[X [[B]] Y]]', () => {
+    expect(isNestingChange('X [[A]] Y', 'X [[B]] Y')).toBe(false);
+  });
+
+  it('allows outer rename: [[X [[A]] Y]] to [[Z [[A]] Y]]', () => {
+    expect(isNestingChange('X [[A]] Y', 'Z [[A]] Y')).toBe(false);
+  });
+
+  it('allows rename when page names are substrings but not bracketed', () => {
+    // "AB" contains "A" but not "[[A]]"
+    expect(isNestingChange('A', 'AB')).toBe(false);
+  });
+
+  it('allows rename of identical-length names', () => {
+    expect(isNestingChange('Foo', 'Bar')).toBe(false);
+  });
+});
+
+// ── Parser output for intermediate nesting states ─────────────────────────────
+
+describe('extractWikilinks — nesting intermediate states', () => {
+  const ws = new WikilinkService();
+
+  it('[[[Demo]] produces wikilink at pos 0 with pageName "[Demo"', () => {
+    const wls = ws.extractWikilinks('[[[Demo]]');
+    const atPos0 = wls.find(w => w.startPositionInText === 0);
+    expect(atPos0).toBeDefined();
+    expect(atPos0!.pageName).toBe('[Demo');
+  });
+
+  it('[[[[Demo]]]] produces inner Demo at pos 2 and outer [[Demo]] at pos 0', () => {
+    const wls = ws.extractWikilinks('[[[[Demo]]]]');
+    const inner = wls.find(w => w.startPositionInText === 2);
+    const outer = wls.find(w => w.startPositionInText === 0);
+    expect(inner).toBeDefined();
+    expect(inner!.pageName).toBe('Demo');
+    expect(outer).toBeDefined();
+    expect(outer!.pageName).toBe('[[Demo]]');
+  });
+
+  it('[[[[Demo]] Test]] produces inner Demo at pos 2 and outer at pos 0', () => {
+    const wls = ws.extractWikilinks('[[[[Demo]] Test]]');
+    const inner = wls.find(w => w.startPositionInText === 2);
+    const outer = wls.find(w => w.startPositionInText === 0);
+    expect(inner).toBeDefined();
+    expect(inner!.pageName).toBe('Demo');
+    expect(outer).toBeDefined();
+    expect(outer!.pageName).toBe('[[Demo]] Test');
+  });
+});
+
+// ── promptAndPerformRenames — rename directory & decline re-index ─────────────
+
+describe('WikilinkRenameTracker — promptAndPerformRenames', () => {
+  function makeMocks() {
+    const oldUri = { fsPath: '/notes/sub/OldName.md', toString: () => 'file:///notes/sub/OldName.md' };
+    const newUri = { fsPath: '/notes/sub/NewName.md', toString: () => 'file:///notes/sub/NewName.md' };
+    const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' };
+    const document = {
+      uri: documentUri,
+      getText: vi.fn(() => '[[NewName]]'),
+      lineCount: 1,
+      lineAt: vi.fn(() => ({ text: '[[NewName]]' })),
+    };
+
+    const resolveTargetUri = vi.fn().mockReturnValue(newUri);
+    const resolveTargetUriCaseInsensitive = vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => {
+      if (pageFileName === 'OldName') {
+        return { uri: oldUri, viaAlias: false };
+      }
+      return { uri: newUri, viaAlias: false };
+    });
+    // Old file exists, new file does not (standard rename, not merge)
+    const fileExists = vi.fn().mockImplementation(async (uri: unknown) => {
+      const uriStr = (uri as { toString(): string }).toString();
+      return uriStr.includes('OldName');
+    });
+
+    const fileService = {
+      resolveTargetUri,
+      resolveTargetUriCaseInsensitive,
+      fileExists,
+      resolveNewFileTargetUri: vi.fn(),
+    };
+
+    const indexFileContent = vi.fn();
+    const indexService = {
+      isOpen: true,
+      getPageByPath: vi.fn().mockReturnValue({ id: 1 }),
+      getLinksForPage: vi.fn().mockReturnValue([]),
+      findPagesLinkingToPageNames: vi.fn().mockReturnValue([]),
+      resolveAlias: vi.fn().mockReturnValue(undefined),
+      indexFileContent,
+      updateRename: vi.fn(),
+      saveToFile: vi.fn(),
+      removePage: vi.fn(),
+      getPageById: vi.fn().mockReturnValue(undefined),
+      updateAliasRename: vi.fn(),
+    };
+
+    const indexScanner = {
+      indexFile: vi.fn().mockResolvedValue(undefined),
+    };
+
+    const wikilinkService = new WikilinkService();
+    const tracker = new WikilinkRenameTracker(
+      wikilinkService,
+      fileService as never,
+      indexService as never,
+      indexScanner as never,
+      { fsPath: '/notes', toString: () => 'file:///notes' } as never,
+    );
+
+    return { tracker, document, fileService, indexService, indexScanner, oldUri, newUri, documentUri };
+  }
+
+  it('resolves newUri from the old file location, not the referencing document', async () => {
+    const { tracker, document, fileService, oldUri } = makeMocks();
+
+    // User clicks Yes to rename
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    const renames = [{
+      oldPageName: 'OldName',
+      newPageName: 'NewName',
+      line: 0,
+      startPosition: 0,
+      endPosition: 11,
+    }];
+
+    // Call the private method directly
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'referencing.md');
+
+    // resolveTargetUri should have been called with the OLD file's URI,
+    // not the referencing document's URI
+    expect(fileService.resolveTargetUri).toHaveBeenCalledWith(oldUri, 'NewName');
+  });
+
+  it('re-indexes the document when the user declines the rename', async () => {
+    const { tracker, document, indexService } = makeMocks();
+
+    // User clicks No
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never);
+
+    const renames = [{
+      oldPageName: 'OldName',
+      newPageName: 'NewName',
+      line: 0,
+      startPosition: 0,
+      endPosition: 11,
+    }];
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'sub/referencing.md');
+
+    // On decline, indexFileContent should be called with buffer text
+    expect(indexService.indexFileContent).toHaveBeenCalledWith(
+      'sub/referencing.md',
+      'referencing.md',
+      '[[NewName]]',
+      expect.any(Number),
+    );
+  });
+
+  it('re-indexes the document when the user dismisses the dialog', async () => {
+    const { tracker, document, indexService } = makeMocks();
+
+    // User dismisses dialog (returns undefined)
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue(undefined as never);
+
+    const renames = [{
+      oldPageName: 'OldName',
+      newPageName: 'NewName',
+      line: 0,
+      startPosition: 0,
+      endPosition: 11,
+    }];
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'notes/page.md');
+
+    // Dismissing is equivalent to declining — should still re-index
+    expect(indexService.indexFileContent).toHaveBeenCalledWith(
+      'notes/page.md',
+      'page.md',
+      '[[NewName]]',
+      expect.any(Number),
+    );
+  });
+
+  it('re-indexes the initiating document buffer when the user accepts the rename', async () => {
+    const { tracker, document, indexService } = makeMocks();
+
+    // User clicks Yes
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    const renames = [{
+      oldPageName: 'OldName',
+      newPageName: 'NewName',
+      line: 0,
+      startPosition: 0,
+      endPosition: 11,
+    }];
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'referencing.md');
+
+    expect(indexService.indexFileContent).toHaveBeenCalledWith(
+      'referencing.md',
+      'referencing.md',
+      '[[NewName]]',
+      expect.any(Number),
+    );
+  });
+
+  it('does not try to re-open the old source URI after the source page itself is renamed', async () => {
+    const oldUri = { fsPath: '/notes/sub/OldName.md', toString: () => 'file:///notes/sub/OldName.md' };
+    const newUri = { fsPath: '/notes/sub/NewName.md', toString: () => 'file:///notes/sub/NewName.md' };
+    vi.mocked(vscode.Uri.joinPath).mockImplementation((base: { fsPath: string }, child: string) => ({
+      fsPath: `${base.fsPath}/${child}`,
+      toString: () => `file://${base.fsPath}/${child}`,
+    }) as never);
+
+    const document = {
+      uri: oldUri,
+      getText: vi.fn(() => '[[NewName]]'),
+      lineCount: 1,
+      lineAt: vi.fn(() => ({ text: '[[NewName]]' })),
+    };
+
+    const fileService = {
+      resolveTargetUri: vi.fn().mockReturnValue(newUri),
+      resolveTargetUriCaseInsensitive: vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => {
+        if (pageFileName === 'OldName') {
+          return { uri: oldUri, viaAlias: false };
+        }
+        return { uri: newUri, viaAlias: false };
+      }),
+      fileExists: vi.fn().mockImplementation(async (uri: unknown) => {
+        const uriStr = (uri as { toString(): string }).toString();
+        return uriStr.includes('OldName');
+      }),
+      resolveNewFileTargetUri: vi.fn(),
+    };
+
+    const indexService = {
+      isOpen: true,
+      getPageByPath: vi.fn().mockReturnValue({ id: 1 }),
+      getLinksForPage: vi.fn().mockReturnValue([]),
+      findPagesLinkingToPageNames: vi.fn().mockReturnValue([{ id: 1, path: 'sub/OldName.md', filename: 'OldName.md', title: 'OldName', mtime: 0, indexed_at: 0 }]),
+      resolveAlias: vi.fn().mockReturnValue(undefined),
+      indexFileContent: vi.fn(),
+      updateRename: vi.fn(),
+      saveToFile: vi.fn(),
+      removePage: vi.fn(),
+      getPageById: vi.fn().mockReturnValue(undefined),
+      updateAliasRename: vi.fn(),
+    };
+
+    const indexScanner = {
+      indexFile: vi.fn().mockResolvedValue(undefined),
+    };
+
+    const tracker = new WikilinkRenameTracker(
+      new WikilinkService(),
+      fileService as never,
+      indexService as never,
+      indexScanner as never,
+      { fsPath: '/notes', toString: () => 'file:///notes' } as never,
+    );
+
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, [{
+        oldPageName: 'OldName',
+        newPageName: 'NewName',
+        line: 0,
+        startPosition: 0,
+        endPosition: 11,
+      }], 'sub/OldName.md');
+
+    expect(indexScanner.indexFile).not.toHaveBeenCalledWith(oldUri);
+    expect(indexScanner.indexFile).toHaveBeenCalledWith(newUri);
+  });
+
+  it('renames filenames that contain the renamed wikilink text', async () => {
+    const { tracker, document, indexService } = makeMocks();
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    indexService.getAllPages = vi.fn().mockReturnValue([
+      { id: 2, path: 'Topic [[OldName]].md', filename: 'Topic [[OldName]].md', title: 'Topic [[OldName]]', mtime: 0, indexed_at: 0 },
+    ]);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, [{
+        oldPageName: 'OldName',
+        newPageName: 'NewName',
+        line: 0,
+        startPosition: 0,
+        endPosition: 11,
+      }], 'referencing.md');
+
+    expect(vscode.workspace.fs.rename).toHaveBeenCalledWith(
+      expect.objectContaining({ fsPath: '/notes/Topic [[OldName]].md' }),
+      expect.objectContaining({ fsPath: '/notes/Topic [[NewName]].md' }),
+      { overwrite: false },
+    );
+  });
+
+  it('fires onDidDeclineRename when user declines', async () => {
+    const { tracker, document } = makeMocks();
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never);
+
+    const listener = vi.fn();
+    tracker.onDidDeclineRename(listener);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, [{
+        oldPageName: 'OldName', newPageName: 'NewName',
+        line: 0, startPosition: 0, endPosition: 11,
+      }], 'referencing.md');
+
+    expect(listener).toHaveBeenCalledOnce();
+  });
+
+  it('fires onDidDeclineRename when user dismisses the dialog', async () => {
+    const { tracker, document } = makeMocks();
+    // Dismissing returns undefined
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue(undefined as never);
+
+    const listener = vi.fn();
+    tracker.onDidDeclineRename(listener);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, [{
+        oldPageName: 'OldName', newPageName: 'NewName',
+        line: 0, startPosition: 0, endPosition: 11,
+      }], 'referencing.md');
+
+    expect(listener).toHaveBeenCalledOnce();
+  });
+
+  it('does not fire onDidDeclineRename when user accepts', async () => {
+    const { tracker, document } = makeMocks();
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    const listener = vi.fn();
+    tracker.onDidDeclineRename(listener);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, [{
+        oldPageName: 'OldName', newPageName: 'NewName',
+        line: 0, startPosition: 0, endPosition: 11,
+      }], 'referencing.md');
+
+    expect(listener).not.toHaveBeenCalled();
+  });
+});
+
+// ── isRenaming getter ─────────────────────────────────────────────────────────
+
+describe('WikilinkRenameTracker — isRenaming', () => {
+  it('returns false when no rename is in progress', () => {
+    const wikilinkService = new WikilinkService();
+    const tracker = new WikilinkRenameTracker(
+      wikilinkService,
+      { resolveTargetUri: vi.fn(), resolveTargetUriCaseInsensitive: vi.fn(), fileExists: vi.fn(), resolveNewFileTargetUri: vi.fn() } as never,
+      { isOpen: true, getPageByPath: vi.fn(), getLinksForPage: vi.fn(), resolveAlias: vi.fn(), indexFileContent: vi.fn(), updateRename: vi.fn(), saveToFile: vi.fn(), removePage: vi.fn(), getPageById: vi.fn(), updateAliasRename: vi.fn() } as never,
+      { indexFile: vi.fn() } as never,
+    );
+    expect(tracker.isRenaming).toBe(false);
+  });
+});
+
+// ── Merge on rename to existing page ──────────────────────────────────────────
+
+describe('WikilinkRenameTracker — merge on rename to existing', () => {
+  beforeEach(() => {
+    vi.clearAllMocks();
+  });
+
+  function makeMergeMocks() {
+    const oldUri = { fsPath: '/notes/sub/OldName.md', toString: () => 'file:///notes/sub/OldName.md' };
+    const newUri = { fsPath: '/notes/sub/NewName.md', toString: () => 'file:///notes/sub/NewName.md' };
+    const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' };
+    const document = {
+      uri: documentUri,
+      getText: vi.fn(() => '[[NewName]]'),
+      lineCount: 1,
+      lineAt: vi.fn(() => ({ text: '[[NewName]]' })),
+    };
+
+    const resolveTargetUri = vi.fn().mockReturnValue(newUri);
+    const resolveTargetUriCaseInsensitive = vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => {
+      if (pageFileName === 'OldName') {
+        return { uri: oldUri, viaAlias: false };
+      }
+      return { uri: newUri, viaAlias: false };
+    });
+    // Both old and new files exist — triggers merge flow
+    const fileExists = vi.fn().mockResolvedValue(true);
+
+    const fileService = {
+      resolveTargetUri,
+      resolveTargetUriCaseInsensitive,
+      fileExists,
+      resolveNewFileTargetUri: vi.fn(),
+    };
+
+    const indexFileContent = vi.fn();
+    const indexService = {
+      isOpen: true,
+      getPageByPath: vi.fn().mockReturnValue({ id: 1 }),
+      getLinksForPage: vi.fn().mockReturnValue([]),
+      resolveAlias: vi.fn().mockReturnValue(undefined),
+      indexFileContent,
+      updateRename: vi.fn(),
+      saveToFile: vi.fn(),
+      removePage: vi.fn(),
+      getPageById: vi.fn().mockReturnValue(undefined),
+      updateAliasRename: vi.fn(),
+    };
+
+    const indexScanner = {
+      indexFile: vi.fn().mockResolvedValue(undefined),
+    };
+
+    // Mock openTextDocument to return different content for old vs new files
+    const oldFileContent = '---\ntitle: Old Page\n---\n\n# Old content';
+    const newFileContent = '---\ntitle: New Page\n---\n\n# New content';
+    const makeDoc = (content: string, uri: unknown) => ({
+      getText: () => content,
+      lineCount: content.split('\n').length,
+      lineAt: (line: number) => ({
+        text: content.split('\n')[line] ?? '',
+        range: { start: { line, character: 0 }, end: { line, character: (content.split('\n')[line] ?? '').length } },
+      }),
+      uri,
+      save: vi.fn().mockResolvedValue(true),
+    });
+    vi.mocked(vscode.workspace.openTextDocument).mockImplementation(async (uri: unknown) => {
+      const uriStr = typeof uri === 'string' ? uri : (uri as { toString(): string }).toString();
+      if (uriStr.includes('OldName')) {
+        return makeDoc(oldFileContent, oldUri) as never;
+      }
+      return makeDoc(newFileContent, newUri) as never;
+    });
+
+    const wikilinkService = new WikilinkService();
+    const tracker = new WikilinkRenameTracker(
+      wikilinkService,
+      fileService as never,
+      indexService as never,
+      indexScanner as never,
+    );
+
+    const renames = [{
+      oldPageName: 'OldName',
+      newPageName: 'NewName',
+      line: 0,
+      startPosition: 0,
+      endPosition: 11,
+    }];
+
+    return { tracker, document, fileService, indexService, indexScanner, oldUri, newUri, renames };
+  }
+
+  it('includes merge language in dialog when target file exists', async () => {
+    const { tracker, document, renames } = makeMergeMocks();
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'referencing.md');
+
+    const messageArg = vi.mocked(vscode.window.showInformationMessage).mock.calls[0][0] as string;
+    expect(messageArg.toLowerCase()).toContain('merge');
+  });
+
+  it('does not rename the file when target exists — merges instead', async () => {
+    const { tracker, document, renames } = makeMergeMocks();
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'referencing.md');
+
+    // fs.rename should NOT have been called (merge, not rename)
+    expect(vscode.workspace.fs.rename).not.toHaveBeenCalled();
+  });
+
+  it('writes merged content to target file on accept', async () => {
+    const { tracker, document, renames } = makeMergeMocks();
+    vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never);
+
+    await (tracker as unknown as { promptAndPerformRenames: Function })
+      .promptAndPerformRenames(document, renames, 'referencing.md');
+ + // applyEdit should have been called to write merged content + expect(vscode.workspace.applyEdit).toHaveBeenCalled(); + }); + + it('deletes source file after merge on accept', async () => { + const { tracker, document, renames } = makeMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + // The old file should be deleted after merge + expect(vscode.workspace.fs.delete).toHaveBeenCalled(); + }); + + it('performs full no-op when user declines a merge rename', async () => { + const { tracker, document, indexService, renames } = makeMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + // No file operations + expect(vscode.workspace.fs.rename).not.toHaveBeenCalled(); + // No index update (full no-op for merge decline) + expect(indexService.indexFileContent).not.toHaveBeenCalled(); + // No link updates (applyEdit not called for merge workspace edits) + // Note: applyEdit for workspace link replacement should not be called + }); + + it('does not show the old warning message when target exists', async () => { + const { tracker, document, renames } = makeMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + expect(vscode.window.showWarningMessage).not.toHaveBeenCalled(); + }); +}); + +// ── Alias-is-page-own-name should not block merge detection ─────────────────── + +describe('WikilinkRenameTracker — alias matching page own name', () => { + beforeEach(() => { + vi.clearAllMocks(); + }); + + /** + * Mocks the scenario from 
the bug report: + * - Pothos.md has aliases: [Pothos] (its own name as an alias) + * - Monstera.md exists (target file) + * - User renames [[Pothos]] → [[Monstera]] in editor + * - resolveAlias("Pothos") returns Pothos.md + * - Expected: merge (not alias rename) + */ + function makeAliasOwnNameMergeMocks() { + const oldUri = { fsPath: '/notes/Pothos.md', toString: () => 'file:///notes/Pothos.md' }; + const newUri = { fsPath: '/notes/Monstera.md', toString: () => 'file:///notes/Monstera.md' }; + const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' }; + const document = { + uri: documentUri, + getText: vi.fn(() => '[[Monstera]]'), + lineCount: 1, + lineAt: vi.fn(() => ({ text: '[[Monstera]]' })), + }; + + const resolveTargetUri = vi.fn().mockReturnValue(newUri); + const resolveTargetUriCaseInsensitive = vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => { + if (pageFileName === 'Pothos') { + return { uri: oldUri, viaAlias: false }; + } + return { uri: newUri, viaAlias: false }; + }); + // Both old and new files exist + const fileExists = vi.fn().mockResolvedValue(true); + + const fileService = { + resolveTargetUri, + resolveTargetUriCaseInsensitive, + fileExists, + resolveNewFileTargetUri: vi.fn(), + }; + + const indexService = { + isOpen: true, + getPageByPath: vi.fn().mockReturnValue({ id: 1 }), + getLinksForPage: vi.fn().mockReturnValue([]), + // This is the key: resolveAlias matches because "Pothos" is an alias on Pothos.md + resolveAlias: vi.fn().mockReturnValue({ + id: 1, + path: 'notes/Pothos.md', + filename: 'Pothos.md', + title: 'Pothos', + mtime: 0, + indexed_at: 0, + }), + indexFileContent: vi.fn(), + updateRename: vi.fn(), + saveToFile: vi.fn(), + removePage: vi.fn(), + getPageById: vi.fn().mockReturnValue(undefined), + updateAliasRename: vi.fn(), + }; + + const indexScanner = { + indexFile: vi.fn().mockResolvedValue(undefined), + }; + + // Mock openTextDocument for merge content + 
const oldContent = '---\ntitle: Pothos\naliases:\n - Pothos\n---\n\nPothos content'; + const newContent = '---\ntitle: Monstera\n---\n\nMonstera content'; + const makeDoc = (content: string, uri: unknown) => ({ + getText: () => content, + lineCount: content.split('\n').length, + lineAt: (line: number) => ({ + text: content.split('\n')[line] ?? '', + range: { start: { line, character: 0 }, end: { line, character: (content.split('\n')[line] ?? '').length } }, + }), + uri, + save: vi.fn().mockResolvedValue(true), + }); + vi.mocked(vscode.workspace.openTextDocument).mockImplementation(async (uri: unknown) => { + const uriStr = typeof uri === 'string' ? uri : (uri as { toString(): string }).toString(); + if (uriStr.includes('Pothos')) { + return makeDoc(oldContent, oldUri) as never; + } + return makeDoc(newContent, newUri) as never; + }); + + const wikilinkService = new WikilinkService(); + const tracker = new WikilinkRenameTracker( + wikilinkService, + fileService as never, + indexService as never, + indexScanner as never, + ); + + const renames = [{ + oldPageName: 'Pothos', + newPageName: 'Monstera', + line: 0, + startPosition: 0, + endPosition: 12, + }]; + + return { tracker, document, fileService, indexService, indexScanner, oldUri, newUri, renames }; + } + + it('shows merge dialog (not alias rename) when alias matches page own name and target exists', async () => { + const { tracker, document, renames } = makeAliasOwnNameMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + const messageArg = vi.mocked(vscode.window.showInformationMessage).mock.calls[0][0] as string; + expect(messageArg.toLowerCase()).toContain('merge'); + expect(messageArg.toLowerCase()).not.toContain('alias'); + }); + + it('merges files when alias matches page own name and user accepts', async () => { + const { 
tracker, document, renames } = makeAliasOwnNameMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + // Should merge (applyEdit + delete), not just update alias + expect(vscode.workspace.applyEdit).toHaveBeenCalled(); + expect(vscode.workspace.fs.delete).toHaveBeenCalled(); + expect(vscode.workspace.fs.rename).not.toHaveBeenCalled(); + }); + + it('does not call updateAliasRename when alias matches page own name', async () => { + const { tracker, document, indexService, renames } = makeAliasOwnNameMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + // Should NOT have taken the alias path + expect(indexService.updateAliasRename).not.toHaveBeenCalled(); + }); +}); + +// ── Cross-directory merge detection ────────────────────────────────────────── + +describe('WikilinkRenameTracker — cross-directory merge detection', () => { + beforeEach(() => { + vi.clearAllMocks(); + }); + + function makeCrossDirectoryMergeMocks(oldPageName = 'OldName', newPageName = 'NewName') { + const oldUri = { fsPath: `/notes/source/${oldPageName}.md`, toString: () => `file:///notes/source/${oldPageName}.md` }; + const sameDirNewUri = { fsPath: `/notes/source/${newPageName}.md`, toString: () => `file:///notes/source/${newPageName}.md` }; + const targetUri = { fsPath: `/notes/target/${newPageName}.md`, toString: () => `file:///notes/target/${newPageName}.md` }; + const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' }; + const document = { + uri: documentUri, + getText: vi.fn(() => `[[${newPageName}]]`), + lineCount: 1, + lineAt: vi.fn(() => ({ text: `[[${newPageName}]]` })), + }; + 
+ const resolveTargetUri = vi.fn().mockReturnValue(sameDirNewUri); + const resolveTargetUriCaseInsensitive = vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => { + if (pageFileName === oldPageName) { + return { uri: oldUri, viaAlias: false }; + } + if (pageFileName === newPageName) { + return { uri: targetUri, viaAlias: false }; + } + return { uri: sameDirNewUri, viaAlias: false }; + }); + const fileExists = vi.fn().mockImplementation(async (uri: { toString(): string }) => { + const uriStr = uri.toString(); + return uriStr === oldUri.toString() || uriStr === targetUri.toString(); + }); + + const fileService = { + resolveTargetUri, + resolveTargetUriCaseInsensitive, + fileExists, + resolveNewFileTargetUri: vi.fn(), + }; + + const indexService = { + isOpen: true, + getPageByPath: vi.fn().mockReturnValue({ id: 1 }), + getLinksForPage: vi.fn().mockReturnValue([]), + resolveAlias: vi.fn().mockReturnValue(undefined), + findPagesByFilename: vi.fn().mockReturnValue([{ path: `target/${newPageName}.md`, filename: `${newPageName}.md` }]), + indexFileContent: vi.fn(), + updateRename: vi.fn(), + saveToFile: vi.fn(), + removePage: vi.fn(), + getPageById: vi.fn().mockReturnValue(undefined), + updateAliasRename: vi.fn(), + }; + + const indexScanner = { + indexFile: vi.fn().mockResolvedValue(undefined), + }; + + const oldFileContent = `---\ntitle: ${oldPageName}\n---\n\nOld content`; + const targetFileContent = `---\ntitle: ${newPageName}\n---\n\nTarget content`; + const makeDoc = (content: string, uri: unknown) => ({ + getText: () => content, + lineCount: content.split('\n').length, + lineAt: (line: number) => ({ + text: content.split('\n')[line] ?? '', + range: { start: { line, character: 0 }, end: { line, character: (content.split('\n')[line] ?? '').length } }, + }), + uri, + save: vi.fn().mockResolvedValue(true), + }); + vi.mocked(vscode.workspace.openTextDocument).mockImplementation(async (uri: unknown) => { + const uriStr = typeof uri === 'string' ? 
uri : (uri as { toString(): string }).toString(); + if (uriStr === oldUri.toString()) { + return makeDoc(oldFileContent, oldUri) as never; + } + return makeDoc(targetFileContent, targetUri) as never; + }); + + const wikilinkService = new WikilinkService(); + const tracker = new WikilinkRenameTracker( + wikilinkService, + fileService as never, + indexService as never, + indexScanner as never, + ); + + const renames = [{ + oldPageName, + newPageName, + line: 0, + startPosition: 0, + endPosition: newPageName.length + 3, + }]; + + return { tracker, document, fileService, indexService, oldUri, sameDirNewUri, targetUri, renames }; + } + + it('merges into an existing page in another directory instead of renaming locally', async () => { + const { tracker, document, renames, oldUri } = makeCrossDirectoryMergeMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + expect(vscode.workspace.fs.rename).not.toHaveBeenCalled(); + expect(vscode.workspace.fs.delete).toHaveBeenCalledWith(oldUri); + expect(vscode.workspace.applyEdit).toHaveBeenCalled(); + }); + + it('does not treat alias-only target resolution as a file merge', async () => { + const { tracker, document, renames, fileService } = makeCrossDirectoryMergeMocks('OldName', 'AliasTarget'); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + vi.mocked(fileService.resolveTargetUriCaseInsensitive).mockImplementation(async (_uri: unknown, pageFileName: string) => { + if (pageFileName === 'OldName') { + return { uri: { fsPath: '/notes/source/OldName.md', toString: () => 'file:///notes/source/OldName.md' }, viaAlias: false }; + } + return { uri: { fsPath: '/notes/target/Canonical.md', toString: () => 'file:///notes/target/Canonical.md' }, viaAlias: true }; + }); + + await (tracker as unknown as { 
promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + expect(vscode.workspace.fs.rename).toHaveBeenCalled(); + expect(vscode.workspace.fs.delete).not.toHaveBeenCalled(); + }); + + it('detects merge targets across directories for nested page names', async () => { + const { tracker, document, renames } = makeCrossDirectoryMergeMocks('[[Topic]] Notes', '[[Topic]] Garden'); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + const messageArg = vi.mocked(vscode.window.showInformationMessage).mock.calls[0][0] as string; + expect(messageArg.toLowerCase()).toContain('merge'); + expect(messageArg).toContain('[[Topic]] Garden.md'); + }); +}); + +describe('WikilinkRenameTracker — progress notifications', () => { + beforeEach(() => { + vi.clearAllMocks(); + }); + + function makeProgressMocks() { + const oldUri = { fsPath: '/notes/sub/OldName.md', toString: () => 'file:///notes/sub/OldName.md' }; + const newUri = { fsPath: '/notes/sub/NewName.md', toString: () => 'file:///notes/sub/NewName.md' }; + const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' }; + const document = { + uri: documentUri, + getText: vi.fn(() => '[[NewName]]'), + lineCount: 1, + lineAt: vi.fn(() => ({ text: '[[NewName]]', range: { start: 0, end: 10 } })), + }; + + const fileService = { + resolveTargetUri: vi.fn().mockReturnValue(newUri), + resolveTargetUriCaseInsensitive: vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => { + if (pageFileName === 'OldName') { + return { uri: oldUri, viaAlias: false }; + } + return { uri: newUri, viaAlias: false }; + }), + fileExists: vi.fn().mockImplementation(async (uri: unknown) => { + const uriStr = (uri as { toString(): string }).toString(); + return uriStr.includes('OldName'); 
+ }), + resolveNewFileTargetUri: vi.fn(), + }; + + const indexService = { + isOpen: true, + getPageByPath: vi.fn().mockReturnValue({ id: 1 }), + getLinksForPage: vi.fn().mockReturnValue([]), + resolveAlias: vi.fn().mockReturnValue(undefined), + indexFileContent: vi.fn(), + updateRename: vi.fn(), + saveToFile: vi.fn(), + removePage: vi.fn(), + getPageById: vi.fn().mockReturnValue(undefined), + updateAliasRename: vi.fn(), + }; + + const indexScanner = { + indexFile: vi.fn().mockResolvedValue(undefined), + }; + + const tracker = new WikilinkRenameTracker( + new WikilinkService(), + fileService as never, + indexService as never, + indexScanner as never, + ); + + const renames = [{ + oldPageName: 'OldName', + newPageName: 'NewName', + line: 0, + startPosition: 0, + endPosition: 11, + }]; + + return { tracker, document, renames }; + } + + it('shows notification progress when the user accepts an in-editor rename', async () => { + const { tracker, document, renames } = makeProgressMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + expect(vscode.window.withProgress).toHaveBeenCalledTimes(1); + }); + + it('does not show notification progress when the user declines an in-editor rename', async () => { + const { tracker, document, renames } = makeProgressMocks(); + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('No' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, renames, 'referencing.md'); + + expect(vscode.window.withProgress).not.toHaveBeenCalled(); + }); +}); + +// ── No-save behaviour during merge and alias flows ──────────────────────────── + +describe('WikilinkRenameTracker — no save() during merge flow', () => { + beforeEach(() => { + vi.clearAllMocks(); + }); + + it('does not save the target 
document after a merge', async () => { + const oldUri = { fsPath: '/notes/sub/OldName.md', toString: () => 'file:///notes/sub/OldName.md' }; + const newUri = { fsPath: '/notes/sub/NewName.md', toString: () => 'file:///notes/sub/NewName.md' }; + const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' }; + const document = { + uri: documentUri, + getText: vi.fn(() => '[[NewName]]'), + lineCount: 1, + lineAt: vi.fn(() => ({ text: '[[NewName]]' })), + }; + + const targetSave = vi.fn().mockResolvedValue(true); + const sourceSave = vi.fn().mockResolvedValue(true); + + const makeDoc = (content: string, uri: unknown, save: ReturnType) => ({ + getText: () => content, + lineCount: content.split('\n').length, + lineAt: (line: number) => ({ + text: content.split('\n')[line] ?? '', + range: { start: { line, character: 0 }, end: { line, character: (content.split('\n')[line] ?? '').length } }, + }), + uri, + save, + }); + + vi.mocked(vscode.workspace.openTextDocument).mockImplementation(async (uri: unknown) => { + const uriStr = typeof uri === 'string' ? 
uri : (uri as { toString(): string }).toString(); + if (uriStr.includes('OldName')) { + return makeDoc('---\ntitle: Old\n---\n\n# Old', oldUri, sourceSave) as never; + } + return makeDoc('---\ntitle: New\n---\n\n# New', newUri, targetSave) as never; + }); + + const tracker = new WikilinkRenameTracker( + new WikilinkService(), + { + resolveTargetUri: vi.fn().mockReturnValue(newUri), + resolveTargetUriCaseInsensitive: vi.fn().mockImplementation(async (_uri: unknown, pageFileName: string) => { + if (pageFileName === 'OldName') return { uri: oldUri, viaAlias: false }; + return { uri: newUri, viaAlias: false }; + }), + fileExists: vi.fn().mockResolvedValue(true), + resolveNewFileTargetUri: vi.fn(), + } as never, + { + isOpen: true, + getPageByPath: vi.fn().mockReturnValue({ id: 1 }), + getLinksForPage: vi.fn().mockReturnValue([]), + resolveAlias: vi.fn().mockReturnValue(undefined), + indexFileContent: vi.fn(), + updateRename: vi.fn(), + saveToFile: vi.fn(), + removePage: vi.fn(), + getPageById: vi.fn().mockReturnValue(undefined), + updateAliasRename: vi.fn(), + } as never, + { indexFile: vi.fn().mockResolvedValue(undefined) } as never, + ); + + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, [{ + oldPageName: 'OldName', + newPageName: 'NewName', + line: 0, + startPosition: 0, + endPosition: 11, + }], 'referencing.md'); + + expect(targetSave).not.toHaveBeenCalled(); + expect(sourceSave).not.toHaveBeenCalled(); + }); +}); + +describe('WikilinkRenameTracker — no save() during alias rename flow', () => { + beforeEach(() => { + vi.clearAllMocks(); + }); + + it('does not save the canonical document after an alias front matter update', async () => { + const documentUri = { fsPath: '/notes/referencing.md', toString: () => 'file:///notes/referencing.md' }; + const canonicalUri = { fsPath: '/notes/Canonical.md', toString: () => 
'file:///notes/Canonical.md' }; + const document = { + uri: documentUri, + getText: vi.fn(() => '[[NewAlias]]'), + lineCount: 1, + lineAt: vi.fn(() => ({ text: '[[NewAlias]]' })), + }; + + const canonicalSave = vi.fn().mockResolvedValue(true); + const canonicalContent = '---\ntitle: Canonical\naliases:\n - OldAlias\n---\n\n# Body'; + vi.mocked(vscode.workspace.openTextDocument).mockResolvedValue({ + getText: () => canonicalContent, + lineCount: canonicalContent.split('\n').length, + lineAt: (line: number) => ({ + text: canonicalContent.split('\n')[line] ?? '', + range: { + start: { line, character: 0 }, + end: { line, character: (canonicalContent.split('\n')[line] ?? '').length }, + }, + }), + uri: canonicalUri, + save: canonicalSave, + } as never); + + const tracker = new WikilinkRenameTracker( + new WikilinkService(), + { + resolveTargetUri: vi.fn(), + resolveTargetUriCaseInsensitive: vi.fn(), + fileExists: vi.fn().mockResolvedValue(false), + resolveNewFileTargetUri: vi.fn(), + } as never, + { + isOpen: true, + getPageByPath: vi.fn().mockReturnValue({ id: 1 }), + getLinksForPage: vi.fn().mockReturnValue([]), + resolveAlias: vi.fn().mockReturnValue({ + id: 10, + path: 'Canonical.md', + filename: 'Canonical.md', + }), + indexFileContent: vi.fn(), + updateRename: vi.fn(), + saveToFile: vi.fn(), + removePage: vi.fn(), + getPageById: vi.fn().mockReturnValue(undefined), + updateAliasRename: vi.fn(), + } as never, + { indexFile: vi.fn().mockResolvedValue(undefined) } as never, + ); + + vi.mocked(vscode.window.showInformationMessage).mockResolvedValue('Yes' as never); + + await (tracker as unknown as { promptAndPerformRenames: Function }) + .promptAndPerformRenames(document, [{ + oldPageName: 'OldAlias', + newPageName: 'NewAlias', + line: 0, + startPosition: 0, + endPosition: 13, + }], 'referencing.md'); + + expect(canonicalSave).not.toHaveBeenCalled(); + }); +});