Privacy-first analytics
Payback
A privacy-first consumer intelligence app that extracts signals on-device, stores them in an encrypted vault, and uses a guarded AI backend only for derived analysis.
On-device extraction, encrypted storage, and a tightly scoped AI proxy.
What shipped
- Kept raw Google Takeout and Meta export data on-device in an AES-256-GCM-encrypted SQLite vault.
- Normalized behavior into 135 categories across 10 pillars for consistent downstream analysis.
- Built resumable background processing plus a hardened AI proxy with OAuth verification, rate limits, timeout control, and API-key failover.
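The API-key failover in the last bullet can be sketched as a small retry loop: try each key in order, treating a timeout or quota error as a signal to move on. The names here (`call_with_failover`, `UpstreamError`, the fake `flaky` caller) are hypothetical, not Payback's actual API, and the real proxy would layer OAuth checks and rate limits around this.

```python
class UpstreamError(Exception):
    """Stand-in for any per-key failure: timeout, quota, auth rejection."""

def call_with_failover(prompt, api_keys, call, timeout_s=10.0):
    """Try each API key in order; return the first successful response.

    `call(prompt, key, timeout_s)` is a placeholder for the real upstream
    request. Any UpstreamError fails over to the next key; if every key
    fails, the last error is surfaced to the caller.
    """
    last_err = None
    for key in api_keys:
        try:
            return call(prompt, key, timeout_s)
        except UpstreamError as err:
            last_err = err  # this key is exhausted or timed out; fail over
    raise UpstreamError(f"all keys failed: {last_err}")
```

With a flaky primary key, the call transparently lands on the backup:

```python
def flaky(prompt, key, timeout_s):
    if key == "primary":
        raise UpstreamError("quota exceeded")
    return f"ok:{key}"

call_with_failover("summarize", ["primary", "backup"], flaky)  # "ok:backup"
```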
Core stack
Most consumer-intelligence products centralize raw personal data on a server and call the result insight. Payback needed to produce useful analysis without reproducing the same surveillance pattern it was supposed to critique.
I moved extraction, normalization, and checkpointed long-running work onto the device with ZIP parsing, media agents, resumable background jobs, and encrypted SQLite storage. The backend is intentionally narrow: it verifies Google OAuth tokens, rate-limits aggressively, supports dual-key Gemini failover, and only handles derived category sets instead of raw exports.
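The checkpointed long-running work can be sketched as a job that persists its progress after every item, so a killed process restarts where it left off instead of reprocessing the whole export. `process_resumable` and its checkpoint file format are illustrative, not Payback's implementation, which batches work and runs it in the background.

```python
import json
import os

def process_resumable(items, checkpoint_path, handle):
    """Process `items` in order, checkpointing progress to disk.

    After each successfully handled item, the index of the next item is
    written to `checkpoint_path` (via a temp file + rename, so a crash
    mid-write never corrupts the checkpoint). On restart, processing
    resumes from the recorded index.
    """
    start = 0
    if os.path.exists(checkpoint_path):
        with open(checkpoint_path) as f:
            start = json.load(f)["next_index"]
    for i in range(start, len(items)):
        handle(items[i])
        tmp = checkpoint_path + ".tmp"
        with open(tmp, "w") as f:
            json.dump({"next_index": i + 1}, f)
        os.replace(tmp, checkpoint_path)  # atomic swap of the checkpoint
```

If `handle` raises partway through, the checkpoint still points at the first unprocessed item, so a second invocation picks up exactly there.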
Payback turns Google and Meta exports into 135 behavioral categories across 10 pillars while keeping raw source data local. The interface, documentation, and backend all reinforce the same product promise: insight without surrendering the underlying archive.
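The category-and-pillar normalization above can be pictured as a lookup in two layers: raw export event types map to normalized categories, and each category rolls up into a pillar. The taxonomy slice below is invented for illustration; the real app defines 135 categories across 10 pillars.

```python
# Hypothetical slice of the taxonomy: category -> pillar roll-up.
CATEGORY_TO_PILLAR = {
    "video_watch": "media",
    "search_query": "information",
    "purchase": "commerce",
}

# Hypothetical mapping from raw export event types to categories.
EVENT_TO_CATEGORY = {
    "youtube.watch": "video_watch",
    "search.activity": "search_query",
    "ads.purchase": "purchase",
}

def normalize(raw_event):
    """Map a raw export record to a (category, pillar) pair.

    Returns None for event types outside the taxonomy, so callers can
    drop unrecognized records instead of guessing at a category.
    """
    category = EVENT_TO_CATEGORY.get(raw_event.get("type"))
    if category is None:
        return None
    return category, CATEGORY_TO_PILLAR[category]
```

Only these derived (category, pillar) pairs would ever leave the device; the raw event payloads stay in the local vault.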
Need this level of product depth for your own build?
The same mix of product direction, interface design, and backend engineering that shaped Payback can go into your next app.