Add spec, plan, and tasks for 030-bulk-import-sources feature
Defines the "Bulk Import All Sources" feature for the on-demand bestiary system: one-click loading of all ~104 bestiary sources with concurrent fetching, progress feedback, toast notifications, and completion reporting.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
specs/030-bulk-import-sources/research.md (new file, 70 lines)

# Research: Bulk Import All Sources
## R1: Source Code List Availability
**Decision**: Use the existing bestiary index's `sources` object keys to enumerate all source codes for bulk fetching.

**Rationale**: The `loadBestiaryIndex()` function already returns a `BestiaryIndex` with a `sources: Record<string, string>` mapping all ~104 source codes to display names. This is the single source of truth.

**Alternatives considered**:

- Hardcoded source list: Rejected — would drift from the index and require manual maintenance.
- Fetching a remote manifest: Rejected — adds complexity and an extra network call.

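Enumeration then reduces to reading the index's keys. A minimal sketch, assuming the `BestiaryIndex` shape described above; the tiny inline index is an illustrative stand-in for the real ~104-entry one:

```typescript
// Minimal shape of the index, per R1. The real type lives in the codebase;
// only the `sources` field matters for enumeration.
interface BestiaryIndex {
  sources: Record<string, string>; // source code -> display name
}

// Enumerate every source code known to the index.
function listSourceCodes(index: BestiaryIndex): string[] {
  return Object.keys(index.sources);
}

// Illustrative stand-in index (two entries instead of ~104):
const index: BestiaryIndex = {
  sources: { MM: "Monster Manual", VGM: "Volo's Guide to Monsters" },
};

listSourceCodes(index); // ["MM", "VGM"]
```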
## R2: URL Construction Pattern
**Decision**: Construct fetch URLs by appending `bestiary-{sourceCode.toLowerCase()}.json` to a user-provided base URL, matching the existing `getDefaultFetchUrl()` pattern.

**Rationale**: The `getDefaultFetchUrl` helper in `bestiary-index-adapter.ts` already implements this pattern. The bulk import reuses it with a configurable base URL prefix.

**Alternatives considered**:

- Per-source URL customization: Rejected — too complex for a bulk operation; a single base URL is sufficient.

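The URL pattern can be sketched as a pure helper. This hypothetical `buildSourceUrl` mirrors what `getDefaultFetchUrl` is described as doing above, not its actual implementation:

```typescript
// Sketch of the base-URL + filename convention from R2 (hypothetical helper).
// Normalizes a trailing slash so both base-URL styles produce the same result.
function buildSourceUrl(baseUrl: string, sourceCode: string): string {
  const base = baseUrl.endsWith("/") ? baseUrl.slice(0, -1) : baseUrl;
  return `${base}/bestiary-${sourceCode.toLowerCase()}.json`;
}

buildSourceUrl("https://example.com/data/", "MM");
// "https://example.com/data/bestiary-mm.json"
```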
## R3: Concurrent Fetch Strategy
**Decision**: Fire all fetch requests at once via `Promise.allSettled()` and let the browser manage concurrency: over HTTP/2, requests are multiplexed on a single connection; over HTTP/1.1, the browser's per-origin connection pool (typically 6 concurrent connections) throttles them.

**Rationale**: `Promise.allSettled()` (not `Promise.all()`) ensures that individual failures don't abort the entire operation. The browser naturally throttles concurrent connections, so no manual batching is needed.

**Alternatives considered**:

- Manual batching (e.g., 10 at a time): Rejected — adds complexity; browser pooling handles this naturally.
- Sequential fetching: Rejected — too slow for 104 sources.

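A sketch of the fire-everything-and-tally approach. `fetchSource` here is a hypothetical stand-in for the real per-source fetch; `Promise.allSettled` guarantees one result per input, fulfilled or rejected, in input order:

```typescript
// Fire all fetches concurrently and tally outcomes per source code.
async function fetchAll(
  codes: string[],
  fetchSource: (code: string) => Promise<void>,
): Promise<{ succeeded: string[]; failed: string[] }> {
  // allSettled never rejects: each element is {status: "fulfilled"} or
  // {status: "rejected", reason}, aligned with the input array.
  const results = await Promise.allSettled(codes.map((c) => fetchSource(c)));
  const succeeded: string[] = [];
  const failed: string[] = [];
  results.forEach((r, i) => {
    (r.status === "fulfilled" ? succeeded : failed).push(codes[i]);
  });
  return { succeeded, failed };
}
```

With `Promise.all` instead, the first rejection would reject the whole aggregate and the remaining results would be lost.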
## R4: Progress State Management
**Decision**: Create a dedicated `useBulkImport` hook that manages import state (total, completed, failed, status) and exposes it to both the side panel component and the toast component.

**Rationale**: The import state needs to survive the side panel closing (the toast takes over). Lifting state into a hook that lives in `App.tsx` ensures both UI targets can consume the same progress data.

**Alternatives considered**:

- Context provider: Rejected — overkill for a single piece of state consumed by two components.
- Global state (zustand/jotai): Rejected — the project doesn't use external state management; unnecessary dependency.

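The state the hook manages can be sketched as a plain reducer, independent of React. All names below are hypothetical; the real hook would wire this into `useState`/`useReducer`:

```typescript
// Hypothetical model of the bulk-import progress state from R4.
type BulkImportStatus = "idle" | "running" | "done";

interface BulkImportState {
  status: BulkImportStatus;
  total: number;     // uncached sources to fetch
  completed: number; // settled (ok or failed)
  failed: number;
}

type BulkImportEvent =
  | { type: "start"; total: number }
  | { type: "sourceDone"; ok: boolean };

function reduce(state: BulkImportState, event: BulkImportEvent): BulkImportState {
  switch (event.type) {
    case "start":
      return { status: "running", total: event.total, completed: 0, failed: 0 };
    case "sourceDone": {
      const completed = state.completed + 1;
      const failed = state.failed + (event.ok ? 0 : 1);
      // Flip to "done" once every source has settled.
      const status = completed === state.total ? "done" : state.status;
      return { ...state, completed, failed, status };
    }
  }
}
```

Because the logic is a pure function, both the side panel and the toast render from the same state object, whichever of them is currently mounted.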
## R5: Toast Implementation
**Decision**: Build a minimal custom toast component using a React portal rendered at `document.body` level, positioned at bottom-center via fixed positioning.

**Rationale**: The spec requires no third-party toast library. A portal ensures the toast renders above all other content. The component needs only: text, progress bar, optional dismiss button, and auto-dismiss timer.

**Alternatives considered**:

- Third-party library (react-hot-toast, sonner): Rejected — spec explicitly requires custom component.
- Non-portal approach: Rejected — would require careful z-index management and DOM nesting.

## R6: Skip-Already-Cached Strategy
**Decision**: Before firing fetches, check each source against `isSourceCached()` and build a filtered list of uncached sources. Update the total count to reflect only uncached sources.

**Rationale**: This avoids unnecessary network requests and gives accurate progress counts. The existing `isSourceCached()` function supports this directly.

**Alternatives considered**:

- Fetch all and overwrite: Rejected — wastes bandwidth and time.
- Check during fetch (lazy): Rejected — harder to show accurate total count upfront.

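A sketch of the pre-filter step. `isSourceCached` exists in the codebase; here it is injected as a predicate (assumed async, which is a guess) so the filtering logic stays framework-free:

```typescript
// Return only the source codes that are not yet cached (R6).
// The cache checks run concurrently; results align with the input order.
async function uncachedSources(
  codes: string[],
  isSourceCached: (code: string) => Promise<boolean>, // assumed-async signature
): Promise<string[]> {
  const cached = await Promise.all(codes.map((c) => isSourceCached(c)));
  return codes.filter((_, i) => !cached[i]);
}
```

The length of the returned list is exactly the `total` that the progress UI should display.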
## R7: Integration with Existing Bestiary Hook
**Decision**: The `useBulkImport` hook calls the existing `fetchAndCacheSource` from `useBestiary` for each source. After all sources complete, a single `refreshCache()` call reloads the creature map.

**Rationale**: Reuses the existing normalization + caching pipeline. Calling `refreshCache()` once at the end (instead of after each source) avoids rebuilding the full creature map N times, each an O(N) operation.

**Alternatives considered**:

- Inline the fetch/normalize/cache logic in the bulk import hook: Rejected — duplicates code.
- Call `refreshCache` after each source: Rejected — an O(N) rebuild per source, O(N²) overall.

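Putting R3, R6, and R7 together, the whole flow can be sketched with the existing functions injected as plain dependencies. The names mirror the codebase's APIs, but their signatures here are assumptions:

```typescript
// Assumed signatures for the existing functions (R6 and R7).
interface BulkImportDeps {
  isSourceCached: (code: string) => Promise<boolean>;
  fetchAndCacheSource: (code: string) => Promise<void>;
  refreshCache: () => Promise<void>;
}

// Skip cached sources, fetch the rest concurrently, refresh the map once.
async function runBulkImport(
  codes: string[],
  deps: BulkImportDeps,
): Promise<{ total: number; failed: number }> {
  // R6: filter to uncached sources so `total` counts only real work.
  const cached = await Promise.all(codes.map((c) => deps.isSourceCached(c)));
  const toFetch = codes.filter((_, i) => !cached[i]);

  // R3: fire all fetches; individual failures don't abort the batch.
  const results = await Promise.allSettled(
    toFetch.map((c) => deps.fetchAndCacheSource(c)),
  );

  // R7: one O(N) creature-map rebuild at the end, not one per source.
  await deps.refreshCache();

  const failed = results.filter((r) => r.status === "rejected").length;
  return { total: toFetch.length, failed };
}
```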