
From Data to Insight

The platform treats the market as a stream of events. Transactions, pool updates, price ticks, bridge moves, and social posts arrive in real time from major EVM networks, Solana, and other high‑demand chains; reputable market aggregators and exchange websockets; and public social APIs. Each record is validated (duplicate removal, late/out‑of‑order quarantine), stamped with source and time, and written to two paths: hot caches (Redis) for sub‑second reads and a time‑series store for historical comparisons, with versioned parquet in an object lake for audit and backfill. Lightweight stream jobs enrich events with labels (token metadata, contract type, exchange addresses) and compute rolling aggregates such as 1‑hour whale netflow, pool depth at a defined slippage, and 24‑hour sentiment deltas. These derived metrics, “features,” carry freshness targets, sanity rules, and lineage, so the system knows when to recompute and how to explain results.

When you open a view or ask a question, ZoneIn assembles the smallest relevant slice of data by detecting your context (token page, contract, pool, governance post, or portfolio/watchlist). A reasoning layer then ranks what matters, compares metrics to recent baselines, and produces a concise answer with optional evidence (numbers, charts, links). Impermanent loss estimates, for example, are produced with a constant‑product approximation adjusted by realized volatility; pool depth considers fee tiers and concentrated liquidity; whale netflow relies on a curated smart‑money set maintained by realized PnL and hit‑rate. Answers include method notes and confidence when appropriate.

Latency targets are practical: short Q&A typically under one second p50 and under two seconds p95, while larger research streams start with a TL;DR and fill in details progressively. Guardrails handle chain reorganizations, oracle outliers, and social spam spikes so corrupted data does not reach your screen.
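The constant‑product impermanent‑loss approximation mentioned above can be sketched as follows. The closed‑form IL term is the standard constant‑product result; the volatility‑based range is a hypothetical illustration of "adjusted by realized volatility," not the platform's actual formula.

```python
import math

def impermanent_loss(price_ratio: float) -> float:
    """Constant-product IL relative to holding: 2*sqrt(r)/(1+r) - 1,
    where r is the price ratio since entry. Non-positive; 0 means no loss."""
    r = price_ratio
    return 2 * math.sqrt(r) / (1 + r) - 1

def il_range(realized_vol: float, horizon_days: float) -> tuple[float, float]:
    """Hypothetical adjustment: bound the price ratio by a +/- one-sigma
    move over the horizon, then evaluate IL at each bound."""
    sigma = realized_vol * math.sqrt(horizon_days / 365)
    down, up = math.exp(-sigma), math.exp(sigma)
    return impermanent_loss(down), impermanent_loss(up)
```

For example, a 4x price move gives `impermanent_loss(4.0)` of −20% versus simply holding, which is why the pool views pair APY with an IL range rather than showing yield alone.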

What You Use

The web app is a real‑time dashboard built from moveable widgets. Tiles for portfolio, smart‑money flows, yields, sentiment, events, and a 3D network graph stream updates over a single WebSocket. Portfolio views support tags, notes, and optional cost basis, and research views provide filters by timeframe, token, and role. Alert rules let you specify conditions (metric, comparator, window, cool‑down), such as “whale_netflow_1h < −$500k for TOKEN” with a 15‑minute cool‑down.

The Chrome extension brings the same intelligence to any crypto website. It infers context from the page (token symbol, contract address, pool) without harvesting full browsing history; you highlight a line, right‑click to ask, and receive an overlay answer tied to the current page. Session memory keeps follow‑ups focused without re‑entering addresses.

The Telegram bot delivers the alerts you subscribe to, answers quick checks (e.g., “whale flows 1h TOKEN”), and links back to the relevant app view. Across surfaces, the system aims to show less, not more: a small, verifiable answer first; deeper sources and historical context one click away.

Within these surfaces, several capabilities are worth special note. The sentiment view groups tokens by recent positive or negative momentum and exposes the underlying sources with credibility scores so you can distinguish organic interest from coordinated pushes. Yield opportunities are personalized to your holdings, with APY, TVL, depth, and volatility notes so high numbers are explained rather than hyped. The events calendar focuses on items that can move positions (burns, migrations, listings, vesting, governance deadlines) and adds expected impact where history allows. Smart‑money analytics profile wallets by behavior (profitability, hold time, venues), show cluster rotations, and connect to the 3D graph so you can follow funds across entities and time.
The 3D network graph renders clusters and flows with GPU instancing and level‑of‑detail culling for large views; your wallet is highlighted, exchanges and contracts are visually distinct, and filters let you isolate tokens, time windows, and roles.

[Image: AI‑Powered Dashboard User Flow]
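The level‑of‑detail culling mentioned above amounts to a per‑node policy decision before draw calls are issued. This is a hypothetical sketch of such a policy; the score formula and thresholds are illustrative, not the app's actual tuning.

```python
def lod_level(distance: float, node_importance: float) -> str:
    """Pick a render tier for one graph node: nearby or important nodes
    get full geometry, distant minor nodes collapse into their cluster."""
    score = node_importance / max(distance, 1.0)
    if score > 0.5:
        return "full"        # labeled mesh with individual flow edges
    if score > 0.05:
        return "simplified"  # instanced sphere, no label
    return "culled"          # folded into its cluster's aggregate node
```

In practice the "simplified" tier is what GPU instancing accelerates: thousands of identical low‑poly nodes drawn in one call, with only the handful of "full" nodes paying for labels and per‑edge geometry.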

Trust, Performance & Controls

[Image: Deployment Architecture]

Transport is encrypted end to end with HSTS and strict content security policies. Credentials and API keys stay in secure vaults with rotation and egress allow‑lists. Internal services operate with least privilege and rate limits. The extension reads only minimal context to perform analysis and does not collect full browsing history. Private keys are never requested; ZoneIn performs view‑only analysis of public data. Logs are scrubbed of personal identifiers and kept only as long as required for reliability and support, with retention windows documented and visible on request.

Reliability measures include horizontal scaling, multi‑region availability for read paths, and circuit breakers on upstream providers. Caching strategies combine Redis for hot reads, edge delivery for static assets, and explicit invalidation on chain events. Chain reorg handling (confirmation depths), oracle sanity checks, and social outlier filters protect result quality.

Accessibility covers keyboard navigation for key workflows, icon‑backed color cues, and alt text on critical visuals; times display in your local zone with UTC on hover. Data governance guarantees lineage from sources → features → models → output; user exports for portfolio, alerts, and notes (CSV/JSON); memory controls to edit or clear personalization; and deletion on request, including associated cached artifacts. Known limits are communicated in‑product: coverage depth varies by chain; some social noise persists despite credibility scoring; and speed is balanced with certainty (often a TL;DR first, then deeper analysis).
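The chain‑reorg handling via confirmation depths can be sketched as a buffer that releases events only once they are deep enough in the chain, and silently drops events from orphaned blocks. The class, depth default, and callback shape are hypothetical illustrations of the technique.

```python
class ReorgBuffer:
    """Hold events until their block is `confirmations` deep; discard
    events whose block was replaced by a reorg before finality."""

    def __init__(self, confirmations: int = 12):
        self.confirmations = confirmations
        # block number -> list of (block_hash, event) pairs
        self.pending: dict[int, list[tuple[str, dict]]] = {}

    def add(self, block_number: int, block_hash: str, event: dict) -> None:
        self.pending.setdefault(block_number, []).append((block_hash, event))

    def on_new_head(self, head_number: int, canonical_hash_at) -> list[dict]:
        """Release events that are now final. `canonical_hash_at(n)` returns
        the current chain's hash for block n; mismatches mean a reorg."""
        final: list[dict] = []
        ready = sorted(n for n in self.pending
                       if head_number - n >= self.confirmations)
        for number in ready:
            for block_hash, event in self.pending.pop(number):
                if block_hash == canonical_hash_at(number):
                    final.append(event)   # survived the confirmation window
                # else: orphaned by a reorg; never reaches downstream caches
        return final
```

This is also where the "explicit invalidation on chain events" hooks in: a reorg that orphans a block both drops its pending events and triggers invalidation of any hot‑cache entries derived from it.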

ZoneIn’s Endgame

ZoneIn focuses on doing more with less input from you: crypto‑specific language models with stronger numerical accuracy; optional on‑chain actions initiated from insights with simulation and explicit confirmations; privacy‑preserving learning (federated/edge) to improve personalization without moving raw data; multi‑agent research that divides tokenomics, security, and social analysis and reconciles outputs; and a curated model/strategy marketplace with sandboxing and clear risk labels.

Typical scenarios illustrate how this works in practice: checking unlock risk from a token page via the extension (dates, size%, float, confidence, and sources); evaluating a pool in the dashboard (APY, TVL, depth, fee tier, IL range, and alerts for thresholds); or tracking smart‑money rotation (top net buyers/sellers, common destinations, cluster drill‑downs, and saved watchlists for alerts).

Short answers to common questions: updates for active entities arrive in seconds, and aggregated metrics refresh in short intervals; anonymous use is supported, though connecting a wallet improves personalization; the extension supports major crypto sites and reads only minimal page context; conflicting sources are cross‑checked and, when unresolved, labeled as uncertain or temporarily withheld.
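The last point, cross‑checking conflicting sources and labeling unresolved disagreement, can be sketched as a reconciliation step over per‑source values. The tolerance and label names here are hypothetical.

```python
from statistics import median

def reconcile(values: dict[str, float], rel_tolerance: float = 0.01):
    """Cross-check one metric across sources: agree on the median when
    sources are close; otherwise label the result uncertain rather than
    silently picking a side. Returns (value, label)."""
    if not values:
        return None, "withheld"       # no source available at all
    mid = median(values.values())
    spread = max(abs(v - mid) for v in values.values())
    if mid != 0 and spread / abs(mid) <= rel_tolerance:
        return mid, "confirmed"
    return mid, "uncertain"           # shown with a caveat, or held back
```

Using the median rather than the mean keeps a single bad feed (an oracle outlier, a stale websocket) from dragging the reconciled value, which matches the oracle sanity checks described earlier.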