How to Build a Solana Token Analytics Dashboard (Price, Holders, Volume, Liquidity)

Building a token analytics dashboard is one of the most practical projects for anyone working in the Solana ecosystem. Traders want a unified view of price action, holder distribution, volume trends, and liquidity depth before making decisions. Developers building trading bots or portfolio trackers need the same data in a structured format.
This tutorial walks through building a Solana token analytics dashboard from scratch. You will pull data from multiple sources, normalize it, and render it in a single dashboard view. By the end, you will have a working TypeScript application that fetches and displays comprehensive token metrics.
Platforms like Birdeye and DexScreener provide excellent analytics UIs, but they serve a general audience. A custom dashboard lets you choose exactly which tokens and metrics to track, control refresh rates, and feed the same data directly into your own bots and alerts.
If you are building any kind of Solana trading tool, a token analytics layer is foundational. For a broader look at available data providers, see our guide on the best Solana API providers.
The dashboard pulls four categories of data:
| Data type | Source | Update frequency |
|---|---|---|
| Price feed | MadeOnSol API / on-chain | Real-time or polling |
| Holder data | Helius DAS API | Every 5-15 minutes |
| Trading volume | MadeOnSol DEX stream | Real-time via WebSocket |
| Liquidity | DexScreener or on-chain pools | Every 1-5 minutes |
The application is a Node.js backend that aggregates these feeds and exposes a normalized JSON endpoint. You can connect any frontend framework to consume it.
Install the dependencies:
npm init -y
npm install @solana/web3.js axios ws dotenv
npm install -D typescript @types/node @types/ws tsx
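If the project has no TypeScript configuration yet, a minimal tsconfig.json along these lines works with tsx (the options shown are a reasonable starting point, not a requirement):

```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "strict": true,
    "esModuleInterop": true,
    "outDir": "dist"
  },
  "include": ["src"]
}
```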
Create a src/types.ts file to define the core data structures for your dashboard:
export interface TokenMetrics {
  mint: string;
  symbol: string;
  name: string;
  price: PriceData;
  holders: HolderData;
  volume: VolumeData;
  liquidity: LiquidityData;
  lastUpdated: number;
}

export interface PriceData {
  current: number;
  change24h: number;
  high24h: number;
  low24h: number;
}

export interface HolderData {
  total: number;
  top10Percentage: number;
  top10Holders: { address: string; balance: number; percentage: number }[];
}

export interface VolumeData {
  volume24h: number;
  buyVolume24h: number;
  sellVolume24h: number;
  tradeCount24h: number;
}

export interface LiquidityData {
  totalUsd: number;
  pools: { dex: string; pairAddress: string; liquidityUsd: number }[];
}
The MadeOnSol API provides token price data with a single call. This avoids the complexity of parsing on-chain pool state yourself.
import axios from "axios";
import type { PriceData } from "./types";

const MADEONSOL_API = "https://api.madeonsol.com/v1";
const API_KEY = process.env.MADEONSOL_API_KEY!;

export async function fetchPrice(mint: string): Promise<PriceData> {
  const { data } = await axios.get(`${MADEONSOL_API}/tokens/${mint}/price`, {
    headers: { "x-api-key": API_KEY },
  });
  return {
    current: data.priceUsd,
    change24h: data.priceChange24h,
    high24h: data.high24h,
    low24h: data.low24h,
  };
}
For real-time price updates without polling, you can connect to the MadeOnSol WebSocket DEX stream and calculate price from incoming trades. See How to Stream Solana DEX Trades with WebSockets for the full setup.
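As a sketch of that trade-derived approach: each swap event carries a USD value and a token amount, so the implied price is their ratio. The field names below (`token_amount`, `volume_usd`) are illustrative assumptions, not a documented stream schema; adapt them to the payload you actually receive.

```typescript
// Derive an implied USD price from a parsed DEX trade event.
// Field names are assumptions; map them to the real stream payload.
interface TradeEvent {
  token_amount: number; // tokens exchanged in the swap
  volume_usd: number;   // USD value of the swap
}

export function priceFromTrade(trade: TradeEvent): number | null {
  if (trade.token_amount <= 0) return null; // guard against empty/dust trades
  return trade.volume_usd / trade.token_amount;
}
```

Feeding the most recent non-null result into your PriceData.current gives sub-second updates without any polling.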
Holder data tells you how concentrated ownership is. A token where the top 10 wallets hold 90% of supply carries different risk than one with broad distribution. Helius provides a token holders endpoint through their DAS API:
import axios from "axios";
import type { HolderData } from "./types";

const HELIUS_API = "https://mainnet.helius-rpc.com";
const HELIUS_KEY = process.env.HELIUS_API_KEY!;

export async function fetchHolders(mint: string): Promise<HolderData> {
  const { data } = await axios.post(`${HELIUS_API}/?api-key=${HELIUS_KEY}`, {
    jsonrpc: "2.0",
    id: 1,
    method: "getTokenAccounts",
    params: { mint, limit: 1000 },
  });
  const accounts = data.result.token_accounts;
  // Note: supply here is approximated from the first 1,000 accounts;
  // paginate the endpoint for tokens with larger holder counts.
  const totalSupply = accounts.reduce(
    (sum: number, a: { amount: number }) => sum + a.amount,
    0
  );
  const sorted = [...accounts].sort(
    (a: { amount: number }, b: { amount: number }) => b.amount - a.amount
  );
  const top10 = sorted.slice(0, 10);
  const top10Total = top10.reduce(
    (sum: number, a: { amount: number }) => sum + a.amount,
    0
  );
  return {
    total: accounts.length,
    top10Percentage: (top10Total / totalSupply) * 100,
    top10Holders: top10.map((a: { address: string; amount: number }) => ({
      address: a.address,
      balance: a.amount,
      percentage: (a.amount / totalSupply) * 100,
    })),
  };
}
For a deeper dive into data indexing options beyond Helius, check out the best Solana data indexers.
Volume is best tracked in real-time. The MadeOnSol DEX trade stream provides parsed trades across all major Solana DEXs. Here is how to accumulate volume over a rolling 24-hour window:
import WebSocket from "ws";
import type { VolumeData } from "./types";

const API_KEY = process.env.MADEONSOL_API_KEY!;

interface Trade {
  timestamp: number;
  volumeUsd: number;
  side: "buy" | "sell";
}

const tradeHistory: Map<string, Trade[]> = new Map();

export function startVolumeTracker(mint: string): void {
  const ws = new WebSocket(
    `wss://stream.madeonsol.com/v1/dex-trades?apiKey=${API_KEY}`
  );
  ws.on("open", () => {
    ws.send(JSON.stringify({
      action: "subscribe",
      filters: { token: mint },
    }));
  });
  ws.on("message", (raw: Buffer) => {
    const trade = JSON.parse(raw.toString());
    const trades = tradeHistory.get(mint) || [];
    trades.push({
      timestamp: Date.now(),
      volumeUsd: trade.volume_usd,
      side: trade.side,
    });
    tradeHistory.set(mint, trades);
  });
  ws.on("close", () => {
    // Reconnect after a short delay if the stream drops.
    setTimeout(() => startVolumeTracker(mint), 3000);
  });
}
export function getVolume(mint: string): VolumeData {
  const cutoff = Date.now() - 24 * 60 * 60 * 1000;
  const trades = (tradeHistory.get(mint) || []).filter(
    (t) => t.timestamp > cutoff
  );
  return {
    volume24h: trades.reduce((sum, t) => sum + t.volumeUsd, 0),
    buyVolume24h: trades
      .filter((t) => t.side === "buy")
      .reduce((sum, t) => sum + t.volumeUsd, 0),
    sellVolume24h: trades
      .filter((t) => t.side === "sell")
      .reduce((sum, t) => sum + t.volumeUsd, 0),
    tradeCount24h: trades.length,
  };
}
Periodically prune entries older than 24 hours to keep memory usage stable.
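One way to do that pruning, as a self-contained sketch (the one-minute interval and 24-hour window are suggestions, not requirements):

```typescript
// Drop trades older than the retention window to bound memory growth.
interface Trade {
  timestamp: number;
  volumeUsd: number;
  side: "buy" | "sell";
}

const WINDOW_MS = 24 * 60 * 60 * 1000;

export function pruneTrades(trades: Trade[], now: number = Date.now()): Trade[] {
  const cutoff = now - WINDOW_MS;
  return trades.filter((t) => t.timestamp > cutoff);
}

// Run periodically over every tracked mint, e.g.:
// setInterval(() => {
//   for (const [mint, trades] of tradeHistory) {
//     tradeHistory.set(mint, pruneTrades(trades));
//   }
// }, 60_000);
```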
Liquidity depth determines how much slippage a trade will experience. DexScreener provides a convenient API for pool-level liquidity data:
import axios from "axios";
import type { LiquidityData } from "./types";

export async function fetchLiquidity(mint: string): Promise<LiquidityData> {
  const { data } = await axios.get(
    `https://api.dexscreener.com/latest/dex/tokens/${mint}`
  );
  const pools = (data.pairs || []).map(
    (pair: { dexId: string; pairAddress: string; liquidity?: { usd: number } }) => ({
      dex: pair.dexId,
      pairAddress: pair.pairAddress,
      liquidityUsd: pair.liquidity?.usd || 0,
    })
  );
  return {
    totalUsd: pools.reduce(
      (sum: number, p: { liquidityUsd: number }) => sum + p.liquidityUsd,
      0
    ),
    pools: pools.slice(0, 10),
  };
}
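To turn liquidity into an actionable slippage figure, a rough estimate for a constant-product (xy = k) pool follows directly from the pool math: a trade of size Δ against a quote-side reserve of x moves the price by about Δ / (x + Δ). This sketch assumes the pool's USD liquidity is split evenly between the two sides, which does not hold for concentrated-liquidity or stable-curve pools.

```typescript
// Rough price-impact estimate for a constant-product (xy = k) pool.
// Assumes liquidityUsd is split evenly between both sides of the pair.
export function estimatePriceImpact(tradeUsd: number, liquidityUsd: number): number {
  const quoteSideUsd = liquidityUsd / 2;
  if (quoteSideUsd <= 0) return 1; // no liquidity: total slippage
  return tradeUsd / (quoteSideUsd + tradeUsd);
}
```

For example, a $1,000 trade into a pool with $100,000 total liquidity sees roughly 2% price impact under these assumptions.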
Now tie everything together in a single aggregator that returns the full TokenMetrics object:
import type { TokenMetrics } from "./types";
import { fetchPrice } from "./price";
import { fetchHolders } from "./holders";
import { getVolume } from "./volume";
import { fetchLiquidity } from "./liquidity";

export async function getTokenMetrics(
  mint: string,
  symbol: string,
  name: string
): Promise<TokenMetrics> {
  const [price, holders, liquidity] = await Promise.all([
    fetchPrice(mint),
    fetchHolders(mint),
    fetchLiquidity(mint),
  ]);
  const volume = getVolume(mint);
  return {
    mint,
    symbol,
    name,
    price,
    holders,
    volume,
    liquidity,
    lastUpdated: Date.now(),
  };
}
You can serve this from an Express or Fastify endpoint, or call it from a Next.js API route. The Promise.all call ensures price, holder, and liquidity data fetch in parallel, keeping latency low.
Fetching all four data sources on every request is wasteful and will hit rate limits quickly. Add a simple in-memory cache with per-source TTLs:
const cache = new Map<string, { data: unknown; expiry: number }>();

function getCached<T>(key: string): T | null {
  const entry = cache.get(key);
  if (!entry || Date.now() > entry.expiry) return null;
  return entry.data as T;
}

function setCache(key: string, data: unknown, ttlMs: number): void {
  cache.set(key, { data, expiry: Date.now() + ttlMs });
}
Recommended TTLs: price data at 10 seconds, holder data at 5 minutes, liquidity at 60 seconds. Volume is already tracked in memory via the WebSocket stream, so it needs no cache.
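A convenient pattern is a generic read-through wrapper that combines the lookup and the fetch in one call. This sketch is self-contained (it declares its own cache map) so you can drop it in alongside the helpers above:

```typescript
// Generic read-through cache: return the cached value if still fresh,
// otherwise run the fetcher and store the result with a TTL.
const ttlCache = new Map<string, { data: unknown; expiry: number }>();

export async function withCache<T>(
  key: string,
  ttlMs: number,
  fetcher: () => Promise<T>
): Promise<T> {
  const entry = ttlCache.get(key);
  if (entry && Date.now() < entry.expiry) return entry.data as T;
  const data = await fetcher();
  ttlCache.set(key, { data, expiry: Date.now() + ttlMs });
  return data;
}

// Usage with the fetchers from earlier steps, e.g.:
// const price = await withCache(`price:${mint}`, 10_000, () => fetchPrice(mint));
```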
Production dashboards need to handle failures gracefully. Each data source can fail independently, so never let one source block the entire response:
async function safeFetch<T>(
  fn: () => Promise<T>,
  fallback: T,
  label: string
): Promise<T> {
  try {
    return await fn();
  } catch (err) {
    console.error(`[${label}] fetch failed:`, err);
    return fallback;
  }
}
Wrap each data fetcher in safeFetch so the dashboard returns partial data rather than a 500 error when one provider is down.
For the WebSocket volume tracker, the reconnection logic in Step 4 handles disconnections automatically. In production, add exponential backoff and a maximum retry count.
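A backoff schedule can be kept to a small pure function (the base delay, cap, and retry limit below are illustrative; tune them for your provider's rate limits):

```typescript
// Exponential backoff with a cap and a retry limit for reconnects.
const BASE_DELAY_MS = 1_000;
const MAX_DELAY_MS = 60_000;
const MAX_RETRIES = 10;

export function backoffDelay(attempt: number): number | null {
  if (attempt >= MAX_RETRIES) return null; // signal the caller to give up
  return Math.min(BASE_DELAY_MS * 2 ** attempt, MAX_DELAY_MS);
}

// In the ws.on("close") handler, replace the fixed 3s timeout with:
// const delay = backoffDelay(attempt++);
// if (delay !== null) setTimeout(() => startVolumeTracker(mint), delay);
```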
Once the core metrics are in place, natural extensions include price and liquidity alerts, historical charting, and multi-token watchlists.
Which API is best for a token analytics dashboard?
It depends on which metrics matter most. The MadeOnSol API covers price feeds and real-time DEX trade streaming across all major Solana programs. Helius is the strongest option for holder data and token account queries via the DAS API. DexScreener provides reliable liquidity and pair-level data. For most dashboards, combining two or three of these sources gives you complete coverage without redundancy.
What is the most efficient way to track token prices in real time?
The most efficient approach is subscribing to a DEX trade WebSocket stream and computing price from incoming trades. MadeOnSol provides a filtered WebSocket feed that covers Raydium, Jupiter, Orca, Pump.fun, and other Solana DEXs. Each trade event includes the USD volume and token amounts, so you can derive the latest price from the most recent swap. This eliminates polling entirely and gives you sub-second price updates.
How often should holder data be refreshed?
Holder data changes much more slowly than price or volume, so refreshing every 5 to 15 minutes is sufficient for most use cases. Fetching holder lists is also one of the more expensive API calls in terms of rate limits, especially for tokens with thousands of holders. If you need near-real-time holder tracking, consider subscribing to token account changes via Solana gRPC streams rather than polling the full holder list repeatedly.
Can this dashboard be built on free API tiers?
You can get started on free tiers. Helius offers a free plan with enough requests for holder queries at moderate refresh rates. DexScreener's token endpoint is publicly accessible. The MadeOnSol API includes a free tier for basic price lookups. The main limitation on free plans is rate limits, which matter once you are tracking more than a handful of tokens or need sub-minute refresh intervals. For production use with real-time streaming, a paid plan on at least one provider is recommended.