
Packages Page Ranking Overhaul

Problem

npm download counts on the packages page are being gamed. An attacker published 22 versions of pi-screenshots-picker in 7 days and used a download inflation tool that targets a fixed ~100-120 downloads per version. Since npm counts every tarball HTTP 200 as a "download," the tool pushed the package to 2,581/mo (edging out pi-interactive-shell at 2,262/mo, a package that actually went viral on Twitter). The same pattern exists on pi-super-curl, the attacker's other package.

The current page defaults to "Most downloads" sort, handing the attacker the #1 slot. The root issue: npm downloads are a fundamentally unreliable signal. Any client-side slicing of npm download data (latest-version-only, anomaly detection, etc.) can be routed around as soon as the attacker reads the defense.

Solution

Replace npm downloads with GitHub stars as the primary ranking signal. Stars are far harder to fake at scale (GitHub actively polices fake starring) and better reflect genuine community interest.

The challenge: the GitHub API rate-limits unauthenticated requests to 60/hr. With 128+ packages each needing a repo lookup, the client can't fetch stars directly. The fix is a pre-computed stars.json served as a static file, refreshed automatically.

Architecture

GitHub Action (cron, every 6h)
  1. npm search API → all pi-package names + repo URLs
  2. GitHub API (with GITHUB_TOKEN, 5000 req/hr) → star counts
  3. Write stars.json → commit to repo
                              ↓
Client (packages.html)
  1. Fetch stars.json from raw.githubusercontent.com (or bundled in build)
  2. Merge star counts into package data
  3. Sort by stars instead of downloads

Star data that's up to 6 hours stale is perfectly fine. Stars don't change by the minute.

Implementation

1. GitHub Action: .github/workflows/stars.yml

Runs on a cron schedule. Fetches all pi-packages from npm, extracts their GitHub repo URLs, queries the GitHub API for star counts, and commits src/stars.json to the repo.

name: Update package stars
on:
  schedule:
    - cron: '0 */6 * * *'  # every 6 hours
  workflow_dispatch:        # manual trigger

permissions:
  contents: write

jobs:
  update-stars:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Fetch stars
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          node scripts/fetch-stars.js

      - name: Commit if changed
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add src/stars.json
          git diff --cached --quiet || git commit -m "chore: update package stars"
          git push

The built-in GITHUB_TOKEN provides 5,000 req/hr, which covers 128 packages trivially.
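
If quota ever becomes a concern, the run could log its remaining budget first. A minimal sketch against GitHub's /rate_limit endpoint (the endpoint and response shape are documented; wiring this helper into the script is an assumption):

// Hypothetical helper, not part of fetch-stars.js as written below.
async function logRateLimit() {
    const res = await fetch('https://api.github.com/rate_limit', {
        headers: {
            'User-Agent': 'pi-stars-bot',
            'Authorization': `Bearer ${process.env.GITHUB_TOKEN}`
        }
    });
    const { resources } = await res.json();
    // resources.core.remaining is the hourly REST quota left for this token
    console.log(`REST quota remaining: ${resources.core.remaining}`);
}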

2. Stars fetcher script: scripts/fetch-stars.js

Vanilla Node.js (18+, for the built-in fetch), no dependencies. Queries npm search, extracts each package's GitHub repo URL from links.repository, fetches star counts, and writes src/stars.json.

const fs = require('fs');

const SEARCH_API = 'https://registry.npmjs.org/-/v1/search';
const GITHUB_API = 'https://api.github.com/repos';
const TOKEN = process.env.GITHUB_TOKEN;

async function fetchAllPackages() {
    let all = [];
    let from = 0;
    while (true) {
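        // npm's search API caps page size at 250, so page through with "from"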
        const url = `${SEARCH_API}?text=keywords:pi-package&size=250&from=${from}`;
        const res = await fetch(url);
        const data = await res.json();
        all = all.concat(data.objects || []);
        if (all.length >= data.total || !data.objects?.length) break;
        from = all.length;
    }
    return all;
}

function extractGitHubRepo(links) {
    const url = links?.repository || links?.homepage || '';
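    // Handles git+https URLs and strips ".git"/fragments; the character class
    // also truncates the rare repo name containing a dot (e.g. "next.js" -> "next")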
    const match = url.match(/github\.com\/([^\/]+)\/([^\/\.\#]+)/);
    return match ? { owner: match[1], repo: match[2] } : null;
}

async function fetchStars(owner, repo) {
    const headers = { 'User-Agent': 'pi-stars-bot' };
    if (TOKEN) headers['Authorization'] = `Bearer ${TOKEN}`;
    try {
        const res = await fetch(`${GITHUB_API}/${owner}/${repo}`, { headers });
        if (!res.ok) return null;
        const data = await res.json();
        return data.stargazers_count ?? null;
    } catch {
        return null;
    }
}

async function main() {
    const packages = await fetchAllPackages();
    const stars = {};
    for (const obj of packages) {
        const pkg = obj.package;
        const gh = extractGitHubRepo(pkg.links);
        if (!gh) continue;
        const count = await fetchStars(gh.owner, gh.repo);
        if (count !== null) stars[pkg.name] = count;
    }
    stars._updated = new Date().toISOString();
    fs.writeFileSync('src/stars.json', JSON.stringify(stars, null, 2) + '\n');
    console.log(`Updated stars for ${Object.keys(stars).length - 1} packages`);
}

main().catch(e => { console.error(e); process.exit(1); });
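
One gap worth noting: fetchStars treats every non-OK response as missing data, including a 403 from GitHub's secondary rate limits. If that ever bites, a hedged sketch of a single retry honoring the retry-after header (the header is GitHub's documented hint; the helper name is hypothetical):

async function fetchWithRetry(url, headers) {
    const res = await fetch(url, { headers });
    if (res.status === 403) {
        // Secondary rate limits suggest a wait via the retry-after header
        const wait = Number(res.headers.get('retry-after') || 60) * 1000;
        await new Promise(resolve => setTimeout(resolve, wait));
        return fetch(url, { headers });
    }
    return res;
}

Swapping this in for the bare fetch inside fetchStars would be a one-line change.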

Output (src/stars.json):

{
  "pi-interactive-shell": 45,
  "pi-screenshots-picker": 3,
  "pi-mcp-adapter": 28,
  "_updated": "2026-02-07T18:00:00.000Z"
}

3. Client-side changes to packages.html

a. Add stars.json URL constant and sort option

var STARS_URL = 'https://raw.githubusercontent.com/badlogic/shittycodingagent.ai/main/src/stars.json';

Update the sort dropdown:

<select class="pkg-sort">
    <option value="stars">Most stars</option>
    <option value="downloads">Most downloads</option>
    <option value="recent">Recently published</option>
    <option value="name">A-Z</option>
</select>

The default changes from downloads to stars: with "stars" listed first, browsers select it by default (assuming the existing code reads the initial sort from the <select>).

b. Fetch stars data in init

Fetch stars.json alongside the existing search and flags calls in init():

function fetchStarsData() {
    return fetch(STARS_URL)
        .then(function (r) { return r.ok ? r.json() : {}; })
        .catch(function () { return {}; });
}

Add to the Promise.all in init():

Promise.all([fetchSearch(), fetchFlags(), fetchStarsData()]).then(function (results) {
    packages = processSearchResults(results[0]);
    flaggedPkgs = results[1];
    var starsData = results[2];
    // merge stars into packages
    packages.forEach(function (pkg) {
        pkg.stars = starsData[pkg.name] || 0;
    });
    // ... rest of existing init
});

c. Update applyFilters sort logic

Add the stars sort case:

if (sortVal === 'stars') {
    sorted.sort(function (a, b) { return (b.stars || 0) - (a.stars || 0); });
} else if (sortVal === 'downloads') {
    // existing downloads sort
}

d. Update card display

In createCard, show stars alongside or instead of downloads:

var dl = document.createElement('span');
if (pkg.stars > 0) {
    dl.textContent = pkg.stars.toLocaleString() + ' ★';
} else {
    dl.textContent = (pkg.downloads || 0).toLocaleString() + '/mo';
}
meta.appendChild(dl);

When both stars and downloads are available, the card could show both: 12 ★ 2,262/mo (sketched below). But for simplicity, showing stars when available and falling back to downloads keeps the UI clean.
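
If the combined display is wanted later, a minimal sketch reusing pkg and meta from the createCard snippet above:

var dl = document.createElement('span');
var parts = [];
if (pkg.stars > 0) parts.push(pkg.stars.toLocaleString() + ' ★');
if (pkg.downloads > 0) parts.push(pkg.downloads.toLocaleString() + '/mo');
dl.textContent = parts.join(' · ') || '0/mo';
meta.appendChild(dl);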

e. Cache stars data

Add stars to the existing localStorage cache alongside search results and manifests:

function saveCache(manifests) {
    try {
        localStorage.setItem(CACHE_KEY, JSON.stringify({
            timestamp: Date.now(),
            search: packages,  // already includes .stars from merge
            manifests: manifests
        }));
    } catch (e) {}
}

Stars are already merged into the packages array, so the existing cache mechanism handles them automatically.
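
For completeness, a sketch of the read side, assuming the CACHE_KEY constant from saveCache and a one-hour TTL (the TTL and function name are assumptions, not the existing code):

function loadCache() {
    try {
        var raw = localStorage.getItem(CACHE_KEY);
        if (!raw) return null;
        var cached = JSON.parse(raw);
        // Discard entries older than an hour so stale stars don't linger
        if (Date.now() - cached.timestamp > 60 * 60 * 1000) return null;
        return cached;
    } catch (e) {
        return null;
    }
}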

4. Build integration

Two options for getting stars.json into the deployed site:

Option A: raw.githubusercontent.com (recommended)

The client fetches directly from GitHub's raw content CDN. No build changes needed. The Action commits to the repo, and the URL always serves the latest committed version. raw.githubusercontent.com caches for roughly 5 minutes, which is fine.

Pro: zero deploy coupling. Stars update independently of site deploys. Con: extra network request to a different origin (but it's tiny JSON, <5KB).

Option B: bundled in build

If blargh copies src/stars.json to html/stars.json, the client fetches it from the same origin. Requires a redeploy after each Action run to pick up new data.

Pro: same-origin, one fewer external dependency. Con: stars only update when someone runs publish.sh. If deploys are infrequent, data goes stale.

Recommend Option A. The deploy is manual (publish.sh via rsync), and coupling stars freshness to deploy frequency defeats the purpose of the cron Action.

Additional Hardening

Change default sort to "Most stars" immediately. Even before the Action is set up, flipping the default removes the incentive to game downloads. The "Most downloads" option can remain for anyone who wants it.

Flag the gamed packages. Use the existing flag system (package-flag label on badlogic/pi-mono issues) to flag pi-screenshots-picker and pi-super-curl with a warning badge. The flag infrastructure already exists in the client code.

Consider removing download counts from the display entirely. Once stars are the ranking signal, showing /mo download numbers only serves to mislead. The display could switch to star counts everywhere, or show nothing for packages with zero stars, as sketched below.
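
A sketch of that star-only variant, again reusing the createCard variables:

var dl = document.createElement('span');
if (pkg.stars > 0) {
    dl.textContent = pkg.stars.toLocaleString() + ' ★';
    meta.appendChild(dl);  // zero-star packages show no count at all
}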

Rollout

  1. Add scripts/fetch-stars.js and .github/workflows/stars.yml to the repo
  2. Run the Action manually once (workflow_dispatch) to generate initial stars.json
  3. Update src/packages.html: add stars fetch, merge, sort option, default to stars
  4. Test locally with blargh dev server
  5. Deploy via publish.sh
  6. Flag the gamed packages via GitHub issue

Steps 1-5 can ship in a single commit. The Action auto-runs every 6 hours after that.
