Three issues in categoryAliasList/categoryMatcher:
1. categoryAliasList appended raw synonyms without deduplication. The
addAlias helper already handles lowercasing and dedup, so route
synonyms through it instead of appending directly.
2. categoryMatcher.matches had a fast path that returned false when the
input contained no separators (hyphen, underscore, or space), skipping
the normalization step entirely. This caused legitimate matches like
"frozen foods" vs "frozen" to fail when the input was a simple word
that needed plural stripping to match.
3. normalizeCategory unconditionally replaced underscores/hyphens and
re-joined fields even for inputs without separators. Gate the
separator logic behind a ContainsAny check, and use direct slice
indexing instead of TrimSuffix for the plural stripping.
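A minimal sketch of the gated normalization (the helper names match the
description above, but the exact plural rule and field handling are
assumptions, not the shipped code):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeCategory: lowercase, then only split and re-join when the
// input actually contains a separator. Gating on ContainsAny keeps the
// common single-word path free of FieldsFunc/Join work.
func normalizeCategory(s string) string {
	s = strings.ToLower(s)
	if !strings.ContainsAny(s, "-_ ") {
		return stripPlural(s) // fast path: no separators at all
	}
	fields := strings.FieldsFunc(s, func(r rune) bool {
		return r == '-' || r == '_' || r == ' '
	})
	for i, f := range fields {
		fields[i] = stripPlural(f)
	}
	return strings.Join(fields, " ")
}

// stripPlural drops a trailing 's' via direct slice indexing rather
// than strings.TrimSuffix, so no second scan of the string is needed.
// The length guard is an assumed heuristic to leave short words alone.
func stripPlural(s string) string {
	if n := len(s); n > 3 && s[n-1] == 's' {
		return s[:n-1]
	}
	return s
}

func main() {
	fmt.Println(normalizeCategory("Frozen_Foods"))
	fmt.Println(normalizeCategory("veggies"))
}
```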
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- expand feature list with sort support, store comparison, and interactive TUI browsing
- add command reference sections for compare and tui with concrete examples
- document new --sort flag in the filter flag table
- clarify behavior notes so category filtering is described as case-insensitive with practical synonym support
- add pubcli compare command to rank nearby stores by filtered deal coverage, bogo count, aggregate score, and distance tie-breaks
- support --count (1-10) for comparison breadth and emit structured JSON/text output with ranked entries
- add robust distance token parsing to tolerate upstream distance string formatting differences
- add pubcli tui interactive terminal browser with paging, deal detail drill-in, and explicit TTY validation for stdin/stdout
- share deal-filter flag registration across root/tui/compare and add --sort support in root execution path
- validate sort mode early and allow canonical aliases (end, expiry, expiration) while preserving explicit invalid-arg guidance
- expand tolerant CLI normalization for new commands/flags and aliases (orderby, sortby, count, bare-flag rewrite for compare/tui)
- update quick-start flag list and integration tests to cover compare help and normalization behavior
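The tolerant distance parsing might look like this sketch; the function
name and the set of accepted formats are assumptions about the upstream
variance, not a spec:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseDistanceMiles keeps only the leading numeric run so "2.3 mi",
// "2.3mi", and "2.3 miles" all reduce to "2.3" before parsing. It
// reports ok=false for tokens with no leading number (e.g. "N/A").
func parseDistanceMiles(s string) (float64, bool) {
	s = strings.TrimSpace(strings.ToLower(s))
	end := 0
	for end < len(s) && (s[end] == '.' || (s[end] >= '0' && s[end] <= '9')) {
		end++
	}
	if end == 0 {
		return 0, false
	}
	v, err := strconv.ParseFloat(s[:end], 64)
	if err != nil {
		return 0, false
	}
	return v, true
}

func main() {
	for _, in := range []string{"2.3 mi", "2.3mi", "N/A"} {
		v, ok := parseDistanceMiles(in)
		fmt.Println(in, v, ok)
	}
}
```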
- extend filter.Options with sort mode support and keep Apply as a single-pass pipeline with limit behavior preserved for unsorted flows
- add sort normalization and two ordering strategies:
* savings: rank by computed DealScore with deterministic title tie-break
* ending: rank by earliest parsed end date, then DealScore fallback
- introduce DealScore heuristics that combine BOGO weighting, dollar-off extraction, and percentage extraction from savings/deal-info text
- add category synonym matcher that supports:
* direct case-insensitive matches
* canonical group synonym expansion (e.g. veggies -> produce)
* normalized fallback for hyphen/underscore/plural variants without breaking exact unknown-category matching
- include explicit tests for synonym matching, hyphenated category handling, unknown plural exact matching, and sort ordering behavior
- keep allocation-sensitive behavior intact while adding matcher precomputation and fast-path checks
New internal/perf package with BenchmarkZipPipeline_1kDeals that
exercises the full hot path: API fetch (stores + savings against an
httptest server with pre-marshaled payloads), filter.Apply with all
predicates active and a 50-item limit, and display.PrintDealsJSON
to io.Discard.
This provides a single top-level benchmark to catch regressions
across package boundaries — e.g. if a filter optimization shifts
allocation pressure into the display layer, this benchmark surfaces
it where per-package benchmarks would not.
Synthetic dataset: 1000 deals with deterministic category/department
distribution to ensure the filter pipeline has meaningful work to do
(~1/3 BOGO, mixed departments, keyword matches in titles).
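The deterministic distribution can be sketched like this; the field
names are assumptions, and the real generator also varies categories and
savings text:

```go
package main

import "fmt"

// Deal is a pared-down stand-in for SavingItem.
type Deal struct {
	Title      string
	Department string
	Bogo       bool
}

// syntheticDeals builds the dataset deterministically: every third deal
// is BOGO (~1/3) and departments cycle, so each filter predicate in the
// benchmark has real work to do on every run.
func syntheticDeals(n int) []Deal {
	departments := []string{"Meat", "Produce", "Dairy", "Bakery"}
	deals := make([]Deal, n)
	for i := range deals {
		deals[i] = Deal{
			Title:      fmt.Sprintf("Deal %d chicken", i),
			Department: departments[i%len(departments)],
			Bogo:       i%3 == 0,
		}
	}
	return deals
}

func main() {
	deals := syntheticDeals(1000)
	bogo := 0
	for _, d := range deals {
		if d.Bogo {
			bogo++
		}
	}
	fmt.Println(len(deals), bogo)
}
```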
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace the simple nil-title → "Unknown" logic in printDeal with a
fallbackDealTitle() function that tries multiple fields in priority
order before giving up:
1. Title (cleaned) — the happy path, same as before
2. Brand + Department — e.g. "Publix deal (Meat)"
3. Brand alone — e.g. "Publix deal"
4. Department alone — e.g. "Meat deal"
5. Description truncated to 48 chars — last-resort meaningful text
6. Item ID — e.g. "Deal 12345"
7. "Untitled deal" — only when every field is empty
This makes the output more useful for the ~5-10% of weekly ad items
that ship with a nil Title from the Publix API, which previously all
showed as "Unknown" and were indistinguishable from each other.
Tests:
- TestPrintDeals_FallbackTitleFromBrandAndDepartment
- TestPrintDeals_FallbackTitleFromID
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace the multi-pass where() chain in Apply() with a single loop that
evaluates all filter predicates per item and skips immediately on first
mismatch. This eliminates N intermediate slice allocations (one per
active filter) and avoids re-scanning the full dataset for each filter
dimension.
Key changes in filter.go:
- Single loop with continue-on-mismatch for BOGO, category, department,
and query filters — combined categories check scans item.Categories
once for both BOGO and category instead of twice
- Pre-allocate result slice capped at min(len(items), opts.Limit) to
avoid grow-and-copy churn
- Fast-path bypass when no filters are active (just apply limit)
- Break early once limit is reached instead of filtering everything
and truncating after
- Remove the now-unused where() helper function
- Add early-return fast paths to CleanText() for the common case where
input contains no HTML entities or newlines, avoiding unnecessary
html.UnescapeString and ReplaceAll calls
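In shape, the new loop looks roughly like this; Options and Deal are
simplified stand-ins for the real types, and the combined-categories
scan is omitted for brevity:

```go
package main

import (
	"fmt"
	"strings"
)

type Deal struct {
	Title, Department string
	Bogo              bool
}

type Options struct {
	BogoOnly   bool
	Department string // substring match
	Query      string // keyword match on title
	Limit      int    // 0 means no limit
}

func Apply(items []Deal, opts Options) []Deal {
	if !opts.BogoOnly && opts.Department == "" && opts.Query == "" {
		// Fast path: no active filters, just apply the limit.
		if opts.Limit > 0 && opts.Limit < len(items) {
			return items[:opts.Limit]
		}
		return items
	}
	capHint := len(items)
	if opts.Limit > 0 && opts.Limit < capHint {
		capHint = opts.Limit // pre-allocate at min(len, limit)
	}
	out := make([]Deal, 0, capHint)
	for _, it := range items {
		// Evaluate predicates per item; skip on first mismatch.
		if opts.BogoOnly && !it.Bogo {
			continue
		}
		if opts.Department != "" &&
			!strings.Contains(strings.ToLower(it.Department), strings.ToLower(opts.Department)) {
			continue
		}
		if opts.Query != "" &&
			!strings.Contains(strings.ToLower(it.Title), strings.ToLower(opts.Query)) {
			continue
		}
		out = append(out, it)
		if opts.Limit > 0 && len(out) == opts.Limit {
			break // stop early once the limit is reached
		}
	}
	return out
}

func main() {
	items := []Deal{
		{Title: "Chicken Breast", Department: "Meat", Bogo: true},
		{Title: "Apples", Department: "Produce"},
		{Title: "Ground Beef", Department: "Meat", Bogo: true},
	}
	fmt.Println(len(Apply(items, Options{BogoOnly: true, Limit: 1})))
}
```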
Test coverage:
- filter_equivalence_test.go (new): Reference implementation of the
original multi-pass algorithm with 500 randomized test cases verifying
behavioral equivalence. Includes allocation budget guardrail (<=80
allocs/op for 1k items) to catch accidental regression to multi-pass.
Benchmarks for new vs legacy reference on identical workload.
- filter_test.go: Benchmark comparisons for CleanText on plain text
(fast path) vs escaped HTML (full path), new vs legacy.
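The CleanText fast path described above might look like the following
sketch; the exact probe characters are an assumption:

```go
package main

import (
	"fmt"
	"html"
	"strings"
)

// CleanText bails out before touching html.UnescapeString or
// strings.ReplaceAll when the input has no '&' (entity start) and no
// newline, which is the common case for deal titles.
func CleanText(s string) string {
	if !strings.ContainsAny(s, "&\n") {
		return s // fast path: nothing to unescape or reflow
	}
	s = html.UnescapeString(s)
	return strings.ReplaceAll(s, "\n", " ")
}

func main() {
	fmt.Println(CleanText("Tom &amp; Jerry"))
}
```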
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Refactor the internal HTTP helper from get() returning raw bytes to
getAndDecode() that streams directly into the target struct via
json.NewDecoder. This eliminates the intermediate []byte allocation
from io.ReadAll on every API response.
The new decoder also validates that responses contain exactly one JSON
value by attempting a second Decode after the primary one — any content
beyond the first value (e.g., concatenated objects from a misbehaving
proxy) returns an error instead of silently discarding it.
Changes:
- api/client.go: Replace get() with getAndDecode(), update FetchStores
and FetchSavings callers to use the new signature
- api/client_test.go: Add TestFetchSavings_TrailingJSONIsRejected and
TestFetchStores_MalformedJSONReturnsDecodeError covering the new
decoder error paths
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Agent-friendly argument normalization that auto-corrects common
CLI syntax mistakes before cobra parses them:
- Single-dash long flags: -zip -> --zip
- Bare key=value: zip=33101 -> --zip=33101
- Typos via Levenshtein distance (max 2): --ziip -> --zip
- Command typos: categoriess -> categories
- Flag aliases: --zipcode, --dept, --search -> canonical names
Corrections emit a "note:" line to stderr showing what was rewritten.
Positional arguments for completion/help subcommands are preserved
(e.g., "completion zsh" is not rewritten). Integration tests verify
end-to-end behavior including tolerance notes, double-dash boundaries,
and help output for rewritten args.
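Two of the rewrites can be sketched as below; the real pass also does
Levenshtein typo correction and alias mapping, and the function name and
knownFlags shape are assumptions:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeArgs rewrites single-dash long flags (-zip -> --zip) and
// bare key=value pairs (zip=33101 -> --zip=33101) for flags it knows
// about, and stops rewriting at a "--" boundary.
func normalizeArgs(args []string, knownFlags map[string]bool) []string {
	out := make([]string, 0, len(args))
	for i, a := range args {
		if a == "--" {
			out = append(out, args[i:]...) // pass the rest through untouched
			break
		}
		name := strings.SplitN(strings.TrimLeft(a, "-"), "=", 2)[0]
		switch {
		case strings.HasPrefix(a, "-") && !strings.HasPrefix(a, "--") &&
			len(name) > 1 && knownFlags[name]:
			out = append(out, "-"+a) // -zip -> --zip
		case !strings.HasPrefix(a, "-") && strings.Contains(a, "=") && knownFlags[name]:
			out = append(out, "--"+a) // zip=33101 -> --zip=33101
		default:
			out = append(out, a)
		}
	}
	return out
}

func main() {
	fmt.Println(normalizeArgs([]string{"-zip", "zip=33101"}, map[string]bool{"zip": true}))
}
```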
Three cobra commands forming the CLI surface:
- root: fetch and filter weekly deals (--store/--zip with BOGO,
category, department, query, and limit filters)
- stores: list nearby Publix locations by ZIP code
- categories: show available deal categories with counts
Structured error system with typed error codes (INVALID_ARGS,
NOT_FOUND, UPSTREAM_ERROR, INTERNAL_ERROR) and semantic exit codes
(0-4). Errors render as human-readable text or JSON depending on
output mode. Robot-mode features: auto-JSON when stdout is not a TTY,
compact quick-start help when invoked with no args, and JSON error
payloads for programmatic consumers.
Rendering layer for deals, stores, and categories with two output
modes: styled terminal text using lipgloss (color-coded BOGO tags,
price highlights, dim metadata, word-wrapped descriptions) and
compact JSON for programmatic consumption.
JSON output types (DealJSON, StoreJSON) normalize raw API fields —
cleaning HTML entities, dereferencing nullable pointers, and
computing derived fields like isBogo. Terminal output includes
contextual headers with item counts and date ranges. Tests verify
both rendering modes including HTML entity handling and nil safety.
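The normalization step can be sketched like this; the field sets are
trimmed down and the BOGO rule (a case-insensitive category match) is
inferred from the filter description:

```go
package main

import (
	"fmt"
	"html"
	"strings"
)

// SavingItem mimics the raw API shape with a nullable title.
type SavingItem struct {
	Title      *string
	Categories []string
}

// DealJSON is the normalized output shape with derived fields.
type DealJSON struct {
	Title  string `json:"title"`
	IsBogo bool   `json:"isBogo"`
}

// deref is the nil-safe pointer dereference used throughout rendering.
func deref(s *string) string {
	if s == nil {
		return ""
	}
	return *s
}

func toDealJSON(it SavingItem) DealJSON {
	isBogo := false
	for _, c := range it.Categories {
		if strings.EqualFold(c, "BOGO") {
			isBogo = true
			break
		}
	}
	return DealJSON{
		Title:  html.UnescapeString(deref(it.Title)),
		IsBogo: isBogo,
	}
}

func main() {
	t := "Tom &amp; Jerry"
	fmt.Println(toDealJSON(SavingItem{Title: &t, Categories: []string{"bogo"}}))
}
```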
Composable filter pipeline that processes SavingItem slices through
chained predicates: BOGO detection (category match), exact category
match, substring department match, and keyword search across title
and description fields. All text matching is case-insensitive.
Includes utility functions for HTML entity unescaping (CleanText),
nil-safe string pointer dereferencing (Deref), and case-insensitive
slice membership (ContainsIgnoreCase). An optional limit truncates
results after all filters are applied. Tests cover each filter in
isolation, combined filters, nil field safety, and the Categories
aggregation helper.
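The utility helpers reduce to a few lines each; this sketch shows
ContainsIgnoreCase and Deref with signatures assumed from the
description:

```go
package main

import (
	"fmt"
	"strings"
)

// ContainsIgnoreCase reports whether needle appears in the slice under
// Unicode case folding.
func ContainsIgnoreCase(haystack []string, needle string) bool {
	for _, h := range haystack {
		if strings.EqualFold(h, needle) {
			return true
		}
	}
	return false
}

// Deref returns the pointed-to string, or "" for nil, so callers never
// have to nil-check API fields inline.
func Deref(s *string) string {
	if s == nil {
		return ""
	}
	return *s
}

func main() {
	fmt.Println(ContainsIgnoreCase([]string{"BOGO", "Meat"}, "bogo"))
	fmt.Println(Deref(nil))
}
```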
HTTP client that wraps the Publix services API with two endpoints:
- /api/v4/savings — fetches weekly ad deals for a given store number
- /api/v1/storelocation — finds nearby stores by ZIP code
Includes request types (SavingsResponse, SavingItem, StoreResponse,
Store) mapping directly to the Publix JSON schema. The client sends
a PublixStore header for store-scoped requests and uses a 15-second
timeout. Tests use httptest servers to verify header propagation,
JSON decoding, and error handling for non-200 responses.
Initialize the publix-deals Go module (go 1.24.4) with core
dependencies: cobra for CLI structure, lipgloss for styled terminal
output, testify for assertions, and x/term for TTY detection.
The main entrypoint at cmd/pubcli/main.go delegates to cmd.Execute().
The .gitignore covers Go build artifacts, editor files, coverage
output, and jj VCS state.