First Steps
Ludopoly Analytics is designed so that a dApp project can move from registration to live analytics in minutes rather than days. The onboarding flow follows three stages: register your project, configure the chains and contract addresses you want to monitor, and access your dashboard or connect through the API. Each stage is self-service and requires no manual approval for Starter and Professional tiers.
Project Registration
Every interaction with Ludopoly Analytics begins with a project. A project represents a single dApp or a logical grouping of smart contracts that you want to monitor. Registration is handled through the dashboard or via a YAML configuration file submitted through the API. The YAML-based approach is particularly suited to CI/CD workflows where project configuration lives alongside application code.
During registration you provide a project name, the blockchain networks your contracts are deployed on, and the contract addresses to track. The platform's dynamic ABI resolution system handles the rest — it will attempt to fetch your contract ABIs from its internal registry first, then from block explorers, and finally through 4-byte function signature databases. In most cases, you will not need to supply ABI files manually.
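The registration request above can be sketched in Python. This is a minimal sketch only: the endpoint URL, payload field names, and header layout are assumptions for illustration, not taken from the platform's API reference. The payload mirrors what the text describes — a project name, the networks, and the contract addresses, with no ABI supplied because the platform resolves ABIs itself.

```python
import json
from urllib import request

# Hypothetical registration payload. Field names ("name", "networks",
# "contracts") and the endpoint URL are illustrative assumptions.
project_config = {
    "name": "my-dapp",
    "networks": ["ethereum", "polygon"],  # Starter tier allows up to two chains
    "contracts": [
        {
            "network": "ethereum",
            "address": "0x0000000000000000000000000000000000000000",  # placeholder
        },
    ],
    # No "abi" key: the platform attempts resolution via its internal
    # registry, then block explorers, then 4-byte signature databases.
}

# Build (but do not send) the POST request with a placeholder key.
req = request.Request(
    "https://api.example.com/v1/projects",  # placeholder host
    data=json.dumps(project_config).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <API_KEY>",  # placeholder credential
    },
    method="POST",
)
```

In a CI/CD workflow, the same payload would typically live as a YAML file next to the application code and be serialised at deploy time.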
Once registered, the platform begins indexing events from your specified contracts from the current block onward. Historical backfill is available on Professional plans and above, allowing you to populate analytics dashboards with data from before your registration date.
Starter plan projects can monitor up to two blockchain networks. Professional, Business, and Enterprise tiers progressively expand chain coverage up to 200+ networks.
Dashboard and API Access
The analytics dashboard is the primary interface for most users. It provides real-time visualisation of key metrics — daily active users, transaction volumes, cohort retention curves, and risk alerts — organised into panels that can be customised per project. Every metric visible in the dashboard is also accessible through the REST and GraphQL APIs, ensuring that teams who prefer programmatic access or need to integrate analytics into their own systems are fully supported.
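As a sketch of what programmatic access might look like, the snippet below builds a GraphQL request body for a daily-active-users metric. The schema — field names, arguments, and the query shape — is entirely hypothetical; consult the platform's GraphQL reference for the real one.

```python
# Hypothetical GraphQL query for a daily-active-users metric.
# Every field and argument name here is an illustrative assumption.
DAU_QUERY = """
query DailyActiveUsers($projectId: ID!, $days: Int!) {
  project(id: $projectId) {
    dailyActiveUsers(lastDays: $days) {
      date
      count
    }
  }
}
"""

def build_graphql_payload(project_id: str, days: int = 30) -> dict:
    """Wrap the query and its variables in a standard GraphQL POST body."""
    return {
        "query": DAU_QUERY,
        "variables": {"projectId": project_id, "days": days},
    }
```

The same metric would be available as a REST resource; the GraphQL form is shown because it lets a client fetch several dashboard panels in one round trip.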
API authentication uses standard API key pairs with scoped permissions. Keys can be restricted to specific projects, specific modules (analytics only, compliance only, or full access), and specific IP ranges. Rate limits scale with your subscription tier, from 10,000 requests per day on Professional plans to unlimited on Enterprise agreements.
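Because rate limits are enforced per tier, clients should expect and handle HTTP 429 responses. The sketch below shows one reasonable pattern — exponential backoff — under assumed conventions: the bearer-token header and the use of status 429 for throttling are common practice, not confirmed details of this API.

```python
import time
import urllib.error
import urllib.request

def backoff_delays(retries: int) -> list:
    """Exponential delays between retries: 1s, 2s, 4s, ..."""
    return [2 ** n for n in range(retries)]

def fetch_metrics(url: str, api_key: str, retries: int = 3) -> bytes:
    """GET an endpoint with a bearer key, backing off on HTTP 429.

    The Authorization header format and 429 semantics are assumptions;
    scoped keys and tier-based limits are the documented behaviour.
    """
    delays = backoff_delays(retries)
    for attempt in range(retries + 1):
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {api_key}"}
        )
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code == 429 and attempt < retries:
                time.sleep(delays[attempt])  # wait, then retry
                continue
            raise  # non-throttling errors propagate to the caller
```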
For developers who work primarily in their IDE, the Ludopoly Analytics VS Code extension surfaces key metrics, compliance alerts, and ZK-KYC integration status directly within the development environment. The extension queries the same API endpoints available to any external integration, so the data is always consistent.
Subscription Tiers
The platform offers four tiers designed to match the scale and regulatory obligations of different user segments.
The Starter tier is built for individual developers and small dApp projects exploring on-chain analytics for the first time. It covers basic dApp metrics across two chains with daily metric summaries and community support — enough to understand user behaviour without committing to a production-grade subscription.
The Professional tier expands to ten chains, introduces cohort analysis and RFM segmentation, and opens API access with generous rate limits. It is the natural choice for growing projects that need to understand their user base at a strategic level.
The Business tier activates the AML/CFT monitoring module and the LLM risk analysis engine, making it suitable for crypto exchanges, payment providers, and any entity with regulatory reporting obligations. Priority support and SDK access are included.
The Enterprise tier unlocks every module, supports over two hundred chains, and includes dedicated infrastructure deployment, SLA guarantees, and around-the-clock support. It is designed for large exchanges, fintech institutions, and regulatory bodies that require both depth and operational assurance.
All tiers include access to the same underlying data pipeline. Higher tiers unlock additional modules and increase throughput limits rather than gating access to a fundamentally different data set.
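The chain-coverage figures stated above can be summarised as a small helper for capacity planning. Only the limits given in this section are encoded; the Business tier's chain count is not stated here, so the helper deliberately skips it rather than guess.

```python
def min_tier_for(chains: int) -> str:
    """Smallest tier whose documented chain coverage fits the request.

    Limits from this section: Starter covers 2 chains, Professional 10,
    Enterprise 200+. The Business tier's chain count is not documented
    here, so it is not considered.
    """
    if chains <= 2:
        return "starter"
    if chains <= 10:
        return "professional"
    return "enterprise"  # "over two hundred" chains
```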