Compare commits: chore/expl...master (56 commits)

SHA1s (Author and Date columns were empty in this capture):
d18c584780, 99eafe5920, 1b393f5ae2, 9ebb056869, 327475c5aa, b41cbc3a90,
048268a68a, 88dc2d4ed2, b3e7212c10, 9286a95fce, 8745ec6841, a1f9ab51e3,
30edd98712, 1ac7f7443e, 6db45b4d2b, 9e0795dbc4, 4258ff307e, 5d1b4993fa,
5879e28cdc, e3dff3a43a, f455348f32, f03694ca13, 63f1d91068, c07edfc5c2,
2eebe9e672, cd8a52ae84, 2124842b38, 49b193b4b7, 90971783ad, 485af45c2b,
3ee78865f7, 45e043ca6b, 31f30decab, 9769225299, 32d236081b, cc074a8828,
2bc18871f5, 7a99cdbe7d, 6eafa119a3, 81718215df, 7c60a61382, 68df9ab90e,
5580bbf181, dbd72b20d5, baa28cc324, 2658438948, 7c5b4507fd, aa0457b7bf,
aa08972436, 34d378f6ef, f8938b2e42, ee165fb432, f1fe48082f, c2560c5b38,
88cad0cd34, 525f992854
@@ -10,9 +10,23 @@ alwaysApply: true
 - **cUSDT:** `0x93E66202A11B1772E55407B32B44e5Cd8eda7f22` (6 decimals)
 - **cUSDC:** `0xf22258f57794CC8E06237084b353Ab30fFfa640b` (6 decimals)
 
-**DODOPMMIntegration:** `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` — reconciled with `docs/11-references/ADDRESS_MATRIX_AND_STATUS.md` (on-chain verification 2026-03-26); `compliantUSDT()` / `compliantUSDC()` return the canonical cUSDT/cUSDC above.
+**DODOPMMIntegration (live, traded):** `0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895` — confirmed live via on-chain probe (2026-04-22): `compliantUSDT()` / `compliantUSDC()` return the canonical cUSDT/cUSDC above; `pools[][]` mapping resolves to the live funded pool set below; `isRegisteredPool` is TRUE for all 8 pools listed under "PMM pools (live, traded)". `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` is a parallel deployment of the same source with different immutables and seeded but un-traded pools — do not wire dApps or routers to it.
-**PMM pools (live funded public):** cUSDT/cUSDC `0xff8d3b8fDF7B112759F076B69f4271D4209C0849` | cUSDT/USDT `0x6fc60DEDc92a2047062294488539992710b99D71` | cUSDC/USDC `0x9f74Be42725f2Aa072a9E0CdCce0E7203C510263` — see `docs/11-references/ADDRESS_MATRIX_AND_STATUS.md` / `PMM_DEX_ROUTING_STATUS.md`.
+**DODOPMMProvider (ILiquidityProvider, live):** `0x3f729632E9553EBacCdE2e9b4c8F2B285b014F2e` — `dodoIntegration() == 0x86ADA6Ef…`, `providerName() == "DODO PMM"`, `isKnownPool` TRUE for all 8 live pools. Use this address as `dodoLiquidityProvider` when deploying `EnhancedSwapRouter`; see `docs/11-references/PMM_DEX_ROUTING_STATUS.md`.
+
+**PMM pools (live, traded — 2026-04-22 on-chain probe):**
+- cUSDT/cUSDC `0x9e89bAe009adf128782E19e8341996c596ac40dC` (~983k cUSDT / ~1.016M cUSDC, asymmetric — actively traded)
+- cUSDT/USDT `0x866Cb44b59303d8dc5f4F9E3E7A8e8b0bf238d66` (~1M / ~1M)
+- cUSDC/USDC `0xc39B7D0F40838cbFb54649d327f49a6DAC964062` (~1M / ~1M)
+- cBTC/cUSDT `0x67049e7333481e2cac91af61403ac7bddfab7bcd` (10k cBTC base / 9M cUSDT quote)
+- cBTC/cUSDC `0x72f1a0794153c3b8a1e8a731f1d8e1a52cb10dc5` (10k cBTC base / 9M cUSDC quote)
+- WETH/USDC `0xb53a0508940b1ff90f1aad4f6cb50a7012fe5593` (~10.1M USDC quote)
+- WETH/USDT `0xe227f6c0520c0c6e8786fe56fa76c4914f861533` (~10.1M USDT quote)
+- cBTC/cXAUC `0xf3e8a07d419b61f002114e64d79f7cf8f7989433` (10k cBTC base / 1.7k cXAUC quote)
+
+The earlier rule's pool addresses (`0xff8d3b8f…`, `0x6fc60D…`, `0x9f74Be…`) belong to the **parallel** integration `0x5BDc62f1…` (Stack B) and are seeded 10M/10M flat or 0/0 — they are not the live PMM trading set. Source-of-truth corrections to follow in `ADDRESS_MATRIX_AND_STATUS.md` and `PMM_DEX_ROUTING_STATUS.md`.
+
+**cBTC:** `0xe94260c555ac1d9d3cc9e1632883452ebdf0082e` (8 decimals) — base token of the three cBTC pools above.
+
+**cXAUC / cXAUT (XAU):** `0x290E52a8819A4fbD0714E517225429aA2B70EC6b`, `0x94e408E26c6FD8F4ee00b54dF19082FDA07dC96E` (6 decimals). **1 full token = 1 troy ounce Au** — not USD face value; see `EXPLORER_TOKEN_LIST_CROSSCHECK.md` section 5.1.
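The decimals noted above matter whenever a raw on-chain amount is constructed by hand (for example in canary swaps): cUSDT, cUSDC and cXAUC use 6 decimals while cBTC uses 8. A minimal sketch of the conversion; the helper name `to_raw` is illustrative, not something from the repo:

```shell
# Convert a whole-token amount to raw base units for a given token's decimals.
# Usage: to_raw <whole_amount> <decimals>
to_raw() {
  echo $(( $1 * 10 ** $2 ))
}

to_raw 1 6   # 6-decimal tokens (cUSDT/cUSDC/cXAUC) → 1000000
to_raw 1 8   # 8-decimal cBTC → 100000000
```

For comparison, the `"amountInRaw": "1000000"` rows recorded for the 6-decimal WALL canary swaps correspond to exactly one whole token.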
@@ -77,6 +77,16 @@ GITEA_URL=
 GITEA_TOKEN=
 GITEA_ORG=
 
+# --- Phoenix deploy API (Gitea Actions secrets on EACH repo that triggers deploy) ---
+# PHOENIX_DEPLOY_URL=   # full POST URL e.g. http://192.168.11.59:4001/api/deploy — same variable name as repo Secrets in Gitea
+# PHOENIX_DEPLOY_TOKEN= # bearer for Phoenix deploy API — set per-repo Secret on Gitea, not necessarily in this root .env
+
+# --- CyberSecur Global (Gov portal static site; optional Web3Forms intake) ---
+# CYBERSECUR_WEB3FORMS_ACCESS_KEY= # web3forms.com — used by CyberSecur-Global/deploy/render-intake.sh (key is public in browser HTML per provider)
+# After rotating the key in the Web3Forms dashboard, update this line and redeploy:
+#   CYBERSECUR_REPO=/path/to/CyberSecur-Global ./scripts/deployment/sync-cybersecur-global-to-ct7810.sh
+# CYBERSECUR_INTAKE_REDIRECT_URL= # optional; default https://cybersecur.d-bis.org/intake-thanks.html
+
 # --- Database & app auth ---
 DATABASE_URL=
 JWT_SECRET=
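Scripts consuming this `.env` typically fail fast on missing values; the workflows later in this diff do it with `: "${VAR:?message}"`. A sketch of an equivalent reusable guard (the `require_var` name and `DEMO_UNSET_VAR` are illustrative, not from the repo):

```shell
# Report whether a named environment variable is set and non-empty.
# Uses bash indirect expansion ${!name}; the workflows use the terser
# `: "${VAR:?msg}"` form, which aborts instead of returning.
require_var() {
  if [ -z "${!1:-}" ]; then
    echo "missing: $1"
    return 1
  fi
  echo "ok: $1"
}

GITEA_URL="https://gitea.d-bis.org"
require_var GITEA_URL
require_var DEMO_UNSET_VAR || true   # prints "missing: DEMO_UNSET_VAR"
```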
@@ -6,6 +6,8 @@
 2. Make changes, ensure tests pass
 3. Open a pull request
 
+Repo ↔ VM CI/CD mapping and templates for **other** Gitea repos: [docs/04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md](../docs/04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md), [config/gitea-workflow-templates/README.md](../config/gitea-workflow-templates/README.md).
+
 Deploy workflow policy:
 `main` and `master` are both deploy-triggering branches, so `.gitea/workflow-sources/deploy-to-phoenix.yml` and `.gitea/workflow-sources/validate-on-pr.yml` must stay identical across both branches.
 Use `bash scripts/verify/sync-gitea-workflows.sh` after editing workflow-source files, and `bash scripts/verify/run-all-validation.sh --skip-genesis` to catch workflow drift before push.
@@ -22,6 +22,25 @@ jobs:
           fi
           git fetch --depth=1 "$REMOTE" main master
 
+      - name: Install validation dependencies
+        run: |
+          corepack enable
+          pnpm install --frozen-lockfile
+
+      # The cW* mesh matrix and deployment-status validators read
+      # cross-chain-pmm-lps/config/*.json. The parent checkout does not
+      # materialize submodules by default, and .gitmodules mixes public HTTPS
+      # with SSH URLs, so clone only the required public validation dependency.
+      - name: Materialize cross-chain-pmm-lps
+        run: |
+          set -euo pipefail
+          if [ ! -f cross-chain-pmm-lps/config/deployment-status.json ]; then
+            rm -rf cross-chain-pmm-lps
+            git clone --depth=1 \
+              https://gitea.d-bis.org/d-bis/cross-chain-pmm-lps.git \
+              cross-chain-pmm-lps
+          fi
+
       - name: Run repo validation gate
         run: |
           bash scripts/verify/run-all-validation.sh --skip-genesis
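The "Materialize" step is an idempotent clone-if-missing guard keyed on a marker file. The same shape can be exercised locally with a plain directory standing in for the clone (paths below are temp stand-ins, not the real repo layout):

```shell
# Idempotent materialize-if-missing: only (re)create the tree when the
# marker file is absent; a second run is a no-op.
materialize() {
  local dir="$1" marker="$2"
  if [ ! -f "${dir}/${marker}" ]; then
    rm -rf "${dir}"
    mkdir -p "${dir}"
    touch "${dir}/${marker}"   # stands in for `git clone --depth=1 …`
  fi
}

base="$(mktemp -d)"
materialize "$base/cross-chain-pmm-lps" deployment-status.json
materialize "$base/cross-chain-pmm-lps" deployment-status.json  # no-op
```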
@@ -35,15 +54,33 @@ jobs:
 
       - name: Trigger Phoenix deployment
         run: |
           set -euo pipefail
           SHA="$(git rev-parse HEAD)"
           BRANCH="$(git rev-parse --abbrev-ref HEAD)"
-          curl -sSf -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
+          set +e
+          curl -sSf --retry 3 --retry-connrefused --retry-delay 10 --retry-max-time 180 \
+            --connect-timeout 10 --max-time 120 \
+            -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
             -H "Authorization: Bearer ${{ secrets.PHOENIX_DEPLOY_TOKEN }}" \
             -H "Content-Type: application/json" \
             -d "{\"repo\":\"${{ gitea.repository }}\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"default\"}"
+          rc="$?"
+          set -e
+          if [ "$rc" -eq 52 ]; then
+            HEALTH_URL="${{ secrets.PHOENIX_DEPLOY_URL }}"
+            HEALTH_URL="${HEALTH_URL%/api/deploy}/health"
+            echo "Phoenix deploy API restarted during self-deploy; verifying ${HEALTH_URL}"
+            for i in $(seq 1 12); do
+              if curl -fsS --max-time 5 "$HEALTH_URL"; then
+                exit 0
+              fi
+              sleep 5
+            done
+          fi
+          exit "$rc"
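Two pieces of the trigger step above can be checked without touching the network: the JSON payload the job POSTs, and the `/health` URL derived from the deploy URL with a suffix-stripping parameter expansion. Helper names are illustrative:

```shell
# Build the deploy payload exactly as the workflow's -d argument does.
build_payload() {
  printf '{"repo":"%s","sha":"%s","branch":"%s","target":"%s"}' "$1" "$2" "$3" "$4"
}

# Derive the health endpoint from the deploy URL, as the curl-exit-52
# fallback does with ${HEALTH_URL%/api/deploy}/health.
health_url() {
  local url="$1"
  echo "${url%/api/deploy}/health"
}

build_payload d-bis/proxmox 0abc master default
echo
health_url http://192.168.11.59:4001/api/deploy   # → http://192.168.11.59:4001/health
```

`${url%/api/deploy}` removes the shortest matching `/api/deploy` suffix, so a URL that does not end in `/api/deploy` passes through unchanged before `/health` is appended.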
   deploy-atomic-swap-dapp:
-    needs: validate
+    needs: deploy
     runs-on: ubuntu-latest
     steps:
       - name: Checkout code
@@ -51,9 +88,12 @@ jobs:
 
       - name: Trigger Atomic Swap dApp deployment (Phoenix)
        run: |
           set -euo pipefail
           SHA="$(git rev-parse HEAD)"
           BRANCH="$(git rev-parse --abbrev-ref HEAD)"
-          curl -sSf -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
+          curl -sSf \
+            --connect-timeout 10 --max-time 900 \
+            -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
             -H "Authorization: Bearer ${{ secrets.PHOENIX_DEPLOY_TOKEN }}" \
             -H "Content-Type: application/json" \
             -d "{\"repo\":\"${{ gitea.repository }}\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"atomic-swap-dapp-live\"}"
@@ -73,9 +113,13 @@ jobs:
 
       - name: Request Cloudflare DNS sync (Phoenix)
        run: |
           set -euo pipefail
           SHA="$(git rev-parse HEAD)"
           BRANCH="$(git rev-parse --abbrev-ref HEAD)"
-          curl -sSf -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
+          curl -sSf --retry 5 --retry-all-errors --retry-connrefused --retry-delay 10 --retry-max-time 300 \
+            --connect-timeout 10 --max-time 120 \
+            -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
             -H "Authorization: Bearer ${{ secrets.PHOENIX_DEPLOY_TOKEN }}" \
             -H "Content-Type: application/json" \
-            -d "{\"repo\":\"${{ gitea.repository }}\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"cloudflare-sync\"}"
+            -d "{\"repo\":\"${{ gitea.repository }}\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"cloudflare-sync\"}" \
+            || { echo "Cloudflare DNS sync request failed; optional sync is non-blocking."; exit 0; }
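The trailing `|| { …; exit 0; }` makes this DNS-sync step non-blocking: the curl may fail, but the step still exits 0 so the job continues. The pattern in isolation, with `false` standing in for the failing curl:

```shell
# Non-blocking step pattern: the command fails, the step still exits 0.
# The subshell plays the role of the Actions step's `run:` script.
(
  false || { echo "Cloudflare DNS sync request failed; optional sync is non-blocking."; exit 0; }
)
echo "step exit status: $?"   # → step exit status: 0
```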
@@ -21,6 +21,10 @@ jobs:
             REMOTE="${GITEA_WORKFLOW_REMOTE:-gitea}"
           fi
           git fetch --depth=1 "$REMOTE" main master
+      - name: Install validation dependencies
+        run: |
+          corepack enable
+          pnpm install --frozen-lockfile
       # Optional: set org/repo variable URA_STRICT_CLOSURE=1 to fail PRs while pilot placeholders
       # remain in manifest (see scripts/ura/validate-manifest-closure.mjs). Not enabled by default.
       - name: run-all-validation (no LAN, no genesis)
.gitea/workflows/bootstrap-phoenix-deploy-api.yml (210 lines, new file)
@@ -0,0 +1,210 @@
name: Bootstrap Phoenix Deploy API

# Reinstalls phoenix-deploy-api on the dev VM (CT 5700) with the latest server.js
# from master. This is the missing link between "code on master is the real
# implementation" and "running service on CT 5700 still has the stub". Run this
# workflow_dispatch job whenever phoenix-deploy-api/server.js, deploy-targets.json
# or related scripts change and you need the running service to pick up the change
# without a manual LAN visit.
#
# Required Gitea repo secrets (Settings -> Secrets):
#   PHOENIX_PVE_HOST          PVE node IP that hosts CT 5700 (e.g. 192.168.11.12)
#   PHOENIX_PVE_USER          SSH user on the PVE node (default: root)
#   PHOENIX_PVE_SSH_KEY       Private SSH key (PEM, OpenSSH format) authorised on the PVE node
#   PHOENIX_PVE_KNOWN_HOSTS   Pre-populated known_hosts entry for the PVE node (avoids strict-host prompt)
#   PHOENIX_DEV_VM_VMID       Container VMID (default: 5700)
#   PHOENIX_DEPLOY_DEV_VM_IP  IP of the dev VM for the post-install health check (default: 192.168.11.59)
#   PHOENIX_DEPLOY_URL        Phoenix deploy webhook URL (already used by deploy job)
#   PHOENIX_DEPLOY_TOKEN      Bearer token for the webhook (already used by deploy job)
#
# Trigger only via Gitea UI (Actions tab -> "Bootstrap Phoenix Deploy API" -> Run
# workflow). NOT triggered on push: reinstalling the deploy service is sensitive
# enough that we want it gated behind a manual click.

on:
  workflow_dispatch:
    inputs:
      verify_only:
        description: "If true, only run the post-install /health + auth probe and skip the reinstall step."
        type: boolean
        required: false
        default: false

jobs:
  bootstrap:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout proxmox repo
        uses: actions/checkout@v4

      - name: Validate repo layout
        run: |
          set -euo pipefail
          test -d phoenix-deploy-api || { echo "phoenix-deploy-api/ missing" >&2; exit 1; }
          test -f phoenix-deploy-api/server.js
          test -f phoenix-deploy-api/scripts/install-systemd.sh
          test -f phoenix-deploy-api/deploy-targets.json
          # Manifest is optional; warn if missing but do not fail.
          if [ ! -f config/public-sector-program-manifest.json ]; then
            echo "::warning::config/public-sector-program-manifest.json missing — install will warn on CT"
          fi
          # Make sure the master implementation we are about to install is NOT
          # the stub message that triggered this whole bootstrap thread.
          if grep -q "Deploy request queued (stub)" phoenix-deploy-api/server.js; then
            echo "phoenix-deploy-api/server.js still contains the stub string — refusing to bootstrap." >&2
            exit 1
          fi

      - name: Install SSH key for PVE access
        if: ${{ github.event.inputs.verify_only != 'true' }}
        run: |
          set -euo pipefail
          mkdir -p "$HOME/.ssh"
          chmod 700 "$HOME/.ssh"
          umask 077
          printf '%s\n' "${{ secrets.PHOENIX_PVE_SSH_KEY }}" > "$HOME/.ssh/id_pve"
          chmod 600 "$HOME/.ssh/id_pve"
          if [ -n "${{ secrets.PHOENIX_PVE_KNOWN_HOSTS }}" ]; then
            printf '%s\n' "${{ secrets.PHOENIX_PVE_KNOWN_HOSTS }}" > "$HOME/.ssh/known_hosts"
            chmod 644 "$HOME/.ssh/known_hosts"
          else
            # Fall back to accept-new on first connect; subsequent connects pin.
            touch "$HOME/.ssh/known_hosts"
            chmod 644 "$HOME/.ssh/known_hosts"
          fi
      - name: Build deploy bundle
        if: ${{ github.event.inputs.verify_only != 'true' }}
        run: |
          set -euo pipefail
          mkdir -p .out
          if [ -f config/public-sector-program-manifest.json ]; then
            tar czf .out/pda-deploy-bundle.tar.gz \
              phoenix-deploy-api \
              config/public-sector-program-manifest.json
          else
            tar czf .out/pda-deploy-bundle.tar.gz phoenix-deploy-api
          fi
          ls -lh .out/pda-deploy-bundle.tar.gz
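The bundle step branches only on whether the optional manifest exists; the archive members differ, nothing else. The same shape against a temp tree (paths below are stand-ins created on the fly, not the real repo layout):

```shell
# Bundle with an optional member, mirroring the step above.
cd "$(mktemp -d)"
mkdir -p phoenix-deploy-api config
echo 'server' > phoenix-deploy-api/server.js

# Manifest absent: bundle the app tree only.
tar czf bundle.tar.gz phoenix-deploy-api

# Manifest present: include it as a second top-level member.
echo 'manifest' > config/public-sector-program-manifest.json
tar czf bundle.tar.gz phoenix-deploy-api config/public-sector-program-manifest.json

tar tzf bundle.tar.gz   # lists both the app tree and the manifest path
```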
      - name: scp bundle to PVE host
        if: ${{ github.event.inputs.verify_only != 'true' }}
        env:
          PVE_HOST: ${{ secrets.PHOENIX_PVE_HOST }}
          PVE_USER: ${{ secrets.PHOENIX_PVE_USER }}
        run: |
          set -euo pipefail
          : "${PVE_HOST:?PHOENIX_PVE_HOST not set in repo secrets}"
          PVE_USER_VAL="${PVE_USER:-root}"
          KNOWN_HOSTS_OPT="-o UserKnownHostsFile=$HOME/.ssh/known_hosts"
          if [ ! -s "$HOME/.ssh/known_hosts" ]; then
            KNOWN_HOSTS_OPT="$KNOWN_HOSTS_OPT -o StrictHostKeyChecking=accept-new"
          else
            KNOWN_HOSTS_OPT="$KNOWN_HOSTS_OPT -o StrictHostKeyChecking=yes"
          fi
          scp -i "$HOME/.ssh/id_pve" $KNOWN_HOSTS_OPT \
            -o ConnectTimeout=20 \
            .out/pda-deploy-bundle.tar.gz \
            "${PVE_USER_VAL}@${PVE_HOST}:/tmp/pda-deploy-bundle.tar.gz"
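The `KNOWN_HOSTS_OPT` branch picks the `StrictHostKeyChecking` mode from whether `known_hosts` is non-empty (`[ ! -s file ]` is true for a missing or empty file): an empty file means accept-new on first connect, a pre-populated one means strict pinning. In isolation, with a temp file instead of `$HOME/.ssh`:

```shell
# Pick the StrictHostKeyChecking mode the way the scp/ssh steps do.
kh="$(mktemp)"
strict_mode() {
  if [ ! -s "$kh" ]; then
    echo "accept-new"
  else
    echo "yes"
  fi
}

strict_mode                       # empty file → accept-new
echo "pinned-host-entry" >> "$kh"
strict_mode                       # non-empty file → yes
```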
      - name: pct push + install-systemd on CT
        if: ${{ github.event.inputs.verify_only != 'true' }}
        env:
          PVE_HOST: ${{ secrets.PHOENIX_PVE_HOST }}
          PVE_USER: ${{ secrets.PHOENIX_PVE_USER }}
          VMID: ${{ secrets.PHOENIX_DEV_VM_VMID }}
        run: |
          set -euo pipefail
          : "${PVE_HOST:?PHOENIX_PVE_HOST not set in repo secrets}"
          PVE_USER_VAL="${PVE_USER:-root}"
          VMID_VAL="${VMID:-5700}"
          KNOWN_HOSTS_OPT="-o UserKnownHostsFile=$HOME/.ssh/known_hosts"
          if [ ! -s "$HOME/.ssh/known_hosts" ]; then
            KNOWN_HOSTS_OPT="$KNOWN_HOSTS_OPT -o StrictHostKeyChecking=accept-new"
          else
            KNOWN_HOSTS_OPT="$KNOWN_HOSTS_OPT -o StrictHostKeyChecking=yes"
          fi
          ssh -i "$HOME/.ssh/id_pve" $KNOWN_HOSTS_OPT \
            -o ConnectTimeout=20 \
            "${PVE_USER_VAL}@${PVE_HOST}" "VMID=${VMID_VAL} bash -s" <<'REMOTE_EOF'
          set -euo pipefail
          : "${VMID:?}"
          # Verify CT exists and is running.
          if ! pct status "${VMID}" >/dev/null 2>&1; then
            echo "CT ${VMID} not found on this PVE node." >&2
            exit 1
          fi
          if ! pct exec "${VMID}" -- true 2>/dev/null; then
            echo "CT ${VMID} not running. Start it first: pct start ${VMID}" >&2
            exit 1
          fi
          STAGE="/tmp/proxmox-pda-stage"
          pct push "${VMID}" /tmp/pda-deploy-bundle.tar.gz /root/pda-deploy.tar.gz
          pct exec "${VMID}" -- bash -c "
            set -euo pipefail
            rm -rf '${STAGE}'
            mkdir -p '${STAGE}'
            tar xzf /root/pda-deploy.tar.gz -C '${STAGE}'
            cd '${STAGE}'
            bash phoenix-deploy-api/scripts/install-systemd.sh
            rm -f /root/pda-deploy.tar.gz
          "
          rm -f /tmp/pda-deploy-bundle.tar.gz
          REMOTE_EOF
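Inside the `pct exec` block the install is a staged extract: wipe the stage, unpack the bundle into it, then run the installer from inside the stage. The same sequence against a local temp tree, with a fake installer standing in for `install-systemd.sh`:

```shell
# Staged-extract pattern from the pct exec block, exercised locally.
cd "$(mktemp -d)"
mkdir -p src/phoenix-deploy-api/scripts
printf '#!/bin/bash\necho installed\n' > src/phoenix-deploy-api/scripts/install-systemd.sh
tar czf bundle.tar.gz -C src phoenix-deploy-api

STAGE="$PWD/stage"
rm -rf "$STAGE"
mkdir -p "$STAGE"
tar xzf bundle.tar.gz -C "$STAGE"
( cd "$STAGE" && bash phoenix-deploy-api/scripts/install-systemd.sh )   # → installed
```

Wiping the stage before extracting means a re-run never mixes files from an older bundle.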
      - name: Health check (no auth)
        env:
          DEV_VM_IP: ${{ secrets.PHOENIX_DEPLOY_DEV_VM_IP }}
        run: |
          set -euo pipefail
          IP="${DEV_VM_IP:-192.168.11.59}"
          # Service may take a moment to come up after install; retry briefly.
          for i in 1 2 3 4 5 6; do
            if curl -sSf -m 5 "http://${IP}:4001/health" -o /tmp/health.json; then
              echo "Health check OK on attempt ${i}"
              cat /tmp/health.json || true
              echo
              break
            fi
            echo "Health check attempt ${i}/6 failed; sleeping 3s"
            sleep 3
            if [ "${i}" = "6" ]; then
              echo "Phoenix Deploy API /health unreachable after install." >&2
              exit 1
            fi
          done
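The health check is a bounded retry loop: break on the first success, fail hard on the last attempt. The control flow in isolation, with a fake probe that succeeds on the third call instead of curl:

```shell
# Bounded retry-until-success, as in the health check step.
tries=0
probe() {
  tries=$((tries + 1))
  [ "$tries" -ge 3 ]   # stand-in for `curl -sSf … /health`
}

for i in 1 2 3 4 5 6; do
  if probe; then
    echo "OK on attempt ${i}"
    break
  fi
  if [ "${i}" = "6" ]; then
    echo "unreachable" >&2
    exit 1
  fi
done
# → OK on attempt 3
```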
      - name: Auth + non-stub probe (POST with bogus target)
        env:
          PHOENIX_DEPLOY_URL: ${{ secrets.PHOENIX_DEPLOY_URL }}
          PHOENIX_DEPLOY_TOKEN: ${{ secrets.PHOENIX_DEPLOY_TOKEN }}
        run: |
          set -euo pipefail
          : "${PHOENIX_DEPLOY_URL:?}"
          : "${PHOENIX_DEPLOY_TOKEN:?}"
          # POST a bogus target. The post-bootstrap server should:
          #   - accept the bearer token (NOT 401)
          #   - reject the unknown target with a non-stub error
          # The pre-bootstrap stub returned 202 with "Deploy request queued (stub)"
          # for ANY target. So we explicitly check the response body does NOT
          # contain that stub phrase.
          BODY="$(curl -sS -m 10 -X POST "${PHOENIX_DEPLOY_URL}" \
            -H "Authorization: Bearer ${PHOENIX_DEPLOY_TOKEN}" \
            -H "Content-Type: application/json" \
            -d '{"repo":"d-bis/proxmox","sha":"HEAD","branch":"master","target":"__bootstrap_probe__"}' || true)"
          echo "Response body:"
          echo "${BODY}"
          if echo "${BODY}" | grep -q 'Deploy request queued (stub)'; then
            echo "::error::Phoenix Deploy API still returning stub response — bootstrap did not take effect."
            exit 1
          fi
          if echo "${BODY}" | grep -qi 'unauthorized\|invalid token\|401'; then
            echo "::error::Phoenix Deploy API rejected the bearer token. PHOENIX_DEPLOY_TOKEN is out of sync with PHOENIX_DEPLOY_SECRET on the CT."
            exit 1
          fi
          echo "Phoenix Deploy API is post-stub and authenticating correctly."
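The probe classifies the response body with two grep checks: stub phrase first, then auth-failure markers, anything else counting as a healthy post-stub rejection. The classification logic in isolation; the sample bodies are illustrative, not real server output:

```shell
# Classify a deploy-API response body the way the probe step does.
classify() {
  if echo "$1" | grep -q 'Deploy request queued (stub)'; then
    echo "stub"
  elif echo "$1" | grep -qi 'unauthorized\|invalid token\|401'; then
    echo "auth-failure"
  else
    echo "post-stub"
  fi
}

classify 'Deploy request queued (stub)'                    # → stub
classify '{"error":"unauthorized"}'                        # → auth-failure
classify '{"error":"unknown target __bootstrap_probe__"}'  # → post-stub
```

Order matters: a hypothetical stub that also echoed "401" would still be flagged as the stub, which is the more actionable diagnosis.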
      - name: Cleanup secrets
        if: always()
        run: |
          rm -f "$HOME/.ssh/id_pve" "$HOME/.ssh/known_hosts" || true
.gitignore (14 lines, vendored)
@@ -136,6 +136,20 @@ reports/status/mainnet-cwusdc-usdc-repeg-plan-*.json
 reports/status/live_inventory_*.json
 reports/status/drift_*.json
 
+# Ephemeral e2e dry-run outputs (local re-runs; not canonical reports)
+reports/e2e-dry-runs/
+
+# Local relay / thirdweb scaffold trees (subtree or vendor experiments — git add -f if promoted)
+relay/
+relay-api/
+relay-docs/
+relay-web/
+thirdweb-contracts/
+
+# One-off liquidity staging helpers (operator-generated; use committed runbooks as source of truth)
+scripts/verify/stage-250m-eth-to-cwusdc-dry-run.sh
+scripts/verify/stage-427m-cusdc-weth-liquidity-funding.sh
+
 # Large optional vendor trees and local checkouts (keep out of main clone)
 smom-dbis-138-publish/
 third-party/
@@ -12,6 +12,7 @@ Orchestration for Proxmox VE, Chain 138 (`smom-dbis-138/`), explorers, NPMplus,
 |------|-----------|
 | Doc index | `docs/MASTER_INDEX.md` |
 | Canonical ecosystem master plan | `docs/02-architecture/DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md` — umbrella root; subordinate roots: `dbis_chain_138_technical_master_plan.md`, `docs/03-deployment/DBIS_RTGS_MASTER_PLAN_IMPLEMENTATION_TRACKER.md`, `docs/04-configuration/universal-resource-activation/URA_MANIFEST_AUTOMATION_IMPLEMENTATION_TRACKER.md` |
 | Treasury / EMI / wallet / VA master plan | `docs/02-architecture/GOVERNMENT_TREASURY_EMI_WALLET_MASTER_PLAN.md` — government treasury, EMIs, digital wallets, virtual accounts (incl. Tatum-style), Rail vs RTGS gates |
 | Universal resource activation (manifest, CI, Phoenix) | `UNIVERSAL_RESOURCE_WIRING.md`, `URA_MANIFEST_AUTOMATION_IMPLEMENTATION_TRACKER.md`, `URA_OPERATIONAL_READINESS_CHECKLIST.md` (under `docs/04-configuration/universal-resource-activation/`); `config/universal-resource-activation/{manifest.json,policy-profiles.json,integration/}`; `pnpm ura:ops-readiness` / `ura:ops-readiness:full`, `ura:production-ready` / `ura:production-ready:connectivity`, `ura:validate`, `ura:validate-profiles`, `ura:merge-manifest`, `ura:validate-ledger-mapping`, `ura:writer:ledger`, `ura:writer:settlement`, `ura:profile-hash`, `ura:validate-closure`, `ura:keccak`, `ura:smoke`; `URA_STRICT_CLOSURE` / Gitea `vars.URA_STRICT_CLOSURE`; `smom-dbis-138/contracts/universal-resource/PolicyProfileRegistry.sol` (scoped forge test); Phoenix `PUBLIC_V1_NO_PARTNER_KEY_PATHS` |
 | Multi-jurisdiction compliance (matrices, onboarding) | `docs/04-configuration/compliance-matrices/README.md`, `INSTITUTION_ONBOARDING_CHARTER.md`, `INSTITUTION_ONBOARDING_PLAYBOOK.md`, `docs/04-configuration/jurisdictions/JURISDICTION_CATALOG.md`, `config/jurisdictions/catalog.v1.json`, `docs/dbis-rail/DBIS_RAIL_JURISDICTION_TRACEABILITY.md`, `docs/03-deployment/DBIS_RTGS_MASTER_PLAN_IMPLEMENTATION_TRACKER.md` |
+| cXAUC/cXAUT unit | 1 full token = 1 troy oz Au — `docs/11-references/EXPLORER_TOKEN_LIST_CROSSCHECK.md` (section 5.1) |
Submodule alltra-lifi-settlement updated: 5e3b9db91a...a218b53de7
config/all-mainnet-canary-evidence.example.json (23 lines, new file)
@@ -0,0 +1,23 @@
{
  "description": "Copy to config/all-mainnet-canary-evidence.json after live canary swaps. Each row needs real transaction hashes and observed balance deltas.",
  "evidence": [
    {
      "poolId": "651940-uniswap_v2-wall-ausdc",
      "status": "canary_passed",
      "generatedAt": "2026-04-29T00:00:00.000Z",
      "canaryTransactions": [
        {
          "amountLabel": "seed",
          "txHash": "0x0000000000000000000000000000000000000000000000000000000000000000",
          "sourceToken": "WALL",
          "destinationToken": "AUSDC",
          "observedInputRaw": "0",
          "observedOutputRaw": "0"
        }
      ],
      "notes": [
        "Replace with real canary transaction evidence before applying."
      ]
    }
  ]
}
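Since the example's all-zero `txHash` is the marker of unfilled evidence, a promotion step can refuse any file that still contains it. A minimal grep-based sketch (no JSON tooling required; the file path below is a temp stand-in, not the real `config/` path):

```shell
# Reject evidence files still carrying the example's placeholder tx hash.
f="$(mktemp)"
cat > "$f" <<'EOF'
{"txHash": "0x0000000000000000000000000000000000000000000000000000000000000000"}
EOF

if grep -q '0x0000000000000000000000000000000000000000000000000000000000000000' "$f"; then
  echo "placeholder evidence: not ready to apply"
fi
```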
config/all-mainnet-canary-evidence.json (77 lines, new file)
@@ -0,0 +1,77 @@
{
  "description": "ALL Mainnet canary evidence recorded from live canary transactions executed with deployer wallet 0x4A666F96fC8764181194447A7dFdb7d471b301C8.",
  "evidence": [
    {
      "poolId": "651940-dodo_pmm-wall-ausdc",
      "generatedAt": "2026-04-29T04:41:13.993Z",
      "canaryTransactions": [
        {
          "direction": "base_to_quote",
          "txHash": "0x727cea66f601b514b0d82c4bc93c29fbc09047e8185c146a05564dce7916829c",
          "fundingTransferTxHash": "0x65f8d2e15556c26b46dd7323a90cb174279fc6bd0e7002a868553dc990bfa656",
          "amountInRaw": "1000000",
          "tokenIn": "WALL",
          "tokenOut": "AUSDC",
          "executor": "DODO_DVM.transfer_then_sellBase"
        }
      ],
      "notes": [
        "Tiny live canary swap executed on ALL Mainnet DODO PMM WALL/AUSDC."
      ]
    },
    {
      "poolId": "651940-uniswap_v2-wall-ausdc",
      "generatedAt": "2026-04-29T04:41:13.993Z",
      "canaryTransactions": [
        {
          "direction": "base_to_quote",
          "txHash": "0x0b76149f25e36919637fbeab10056e45d8ab7757454174966842c3f52f53dd5c",
          "approvalTxHash": "0xc33d872d15628cfe521552ccc9a4b908f31df59189764468775b4557826514b6",
          "amountInRaw": "1000000",
          "tokenIn": "WALL",
          "tokenOut": "AUSDC",
          "executor": "UniswapV2Router.swapExactTokensForTokens"
        }
      ],
      "notes": [
        "Tiny live canary swap executed on ALL Mainnet Uniswap V2 WALL/AUSDC."
      ]
    },
    {
      "poolId": "137-dodo_pmm-cwusdc-usdc",
      "generatedAt": "2026-04-29T04:41:13.993Z",
      "canaryTransactions": [
        {
          "direction": "base_to_quote",
          "txHash": "0x4f68cdb0502b0fd50602013e54cbf898556a5c1181d8009f9b0c166dfccf5ce7",
          "fundingTransferTxHash": "0x2b2721dd505f82488b05f32810f7e94b3a712e9b459b638be9b8ea34c20925d9",
          "amountInRaw": "1",
          "tokenIn": "cWUSDC",
          "tokenOut": "USDC",
          "executor": "DODO_DVM.transfer_then_sellBase"
        }
      ],
      "notes": [
        "Tiny live canary swap executed on Polygon DODO PMM cWUSDC/USDC."
      ]
    },
    {
      "poolId": "137-dodo_pmm-cwusdt-usdt",
      "generatedAt": "2026-04-29T04:41:13.993Z",
      "canaryTransactions": [
        {
          "direction": "base_to_quote",
          "txHash": "0x9c946c7c912e2eabe960c752041b533948e85e2a1603c80de80c5b0ee447908d",
          "fundingTransferTxHash": "0xcac8b9187325869f164f6b7cd5464fcf46dce6be83ef04d760e7ecc21de7d40d",
          "amountInRaw": "1",
          "tokenIn": "cWUSDT",
          "tokenOut": "USDT",
          "executor": "DODO_DVM.transfer_then_sellBase"
        }
      ],
      "notes": [
        "Tiny live canary swap executed on Polygon DODO PMM cWUSDT/USDT."
      ]
    }
  ]
}
config/all-mainnet-enhanced-router-deployment.json (147 lines, new file)
@@ -0,0 +1,147 @@
{
  "name": "ALL Mainnet Enhanced Router Deployment Evidence",
  "version": "0.1.0",
  "generatedAt": "2026-04-29T05:52:00Z",
  "chainId": 651940,
  "network": "ALL Mainnet (Alltra)",
  "evmVersion": "paris",
  "reason": "ALL Mainnet RPC/runtime rejected Cancun bytecode with BadInstruction; Paris bytecode was used for live deployment.",
  "deployer": "0x4A666F96fC8764181194447A7dFdb7d471b301C8",
  "contracts": {
    "dodoPmmProvider": {
      "address": "0x36F65027D21e151F0b7810bae1E94b225AC7Ba9e",
      "transactionHash": "0xd2e69b556e84786338fd526ba149d1f88488a07190d081f935d7fffbe9d1b2e0",
      "constructorArgs": {
        "dodoPmmIntegration": "0x8528E268F3b8C94208d09D131ACa3Ea93Bad57c7",
        "admin": "0x4A666F96fC8764181194447A7dFdb7d471b301C8"
      }
    },
    "enhancedSwapRouterV2": {
      "address": "0xb905fEfA56b028221E2Bc248Bbcd41141dc7aeD3",
      "transactionHash": "0x2c5d409b6e06cbfb69d8e251240d830d624625a4d505cc963edb65b55623bc79",
      "constructorArgs": {
        "weth": "0x798F6762BB40d6801A593459d08F890603D3979C",
        "usdt": "0x66D8Efa0AF63B0e84eb1Dd72bf00f00cd1e2234e",
        "usdc": "0xa95EeD79f84E6A0151eaEb9d441F9Ffd50e8e881",
        "daiSlot": "0x015B1897Ed5279930bC2Be46F661894d219292A6",
        "daiSlotNote": "AUSDT is used as the third stablecoin slot for ALL Mainnet; no canonical ALL DAI token is committed."
      }
    },
    "intentBridgeCoordinatorV2": {
      "address": "0x9276ae27d9c624B43dbE43494f34A9c5F0233a0B",
      "transactionHash": "0x5695b3f9ec59e09d5e4f8569ea8af31578ced0a56aba885a7c475a5187aadd3d"
    },
    "adapters": {
      "dodo": {
        "address": "0x391D192BED6188c4DaB4C93c078bD18432687474",
        "transactionHash": "0xc4a036a6fff5eb9886e797559017cf8709505d13f39f5feddf055967cf9b4648",
        "enabled": true
      },
      "dodoV3": {
        "address": "0x97Ce874142625134aEEBDF42B5E7bB806e731D25",
        "transactionHash": "0x5ad21f59b823adbc2cebc1e9c45ab3f8f0f1286e46a290c09c0667f499577136",
        "enabled": false
      },
      "uniswapV3": {
        "address": "0xBF75F3401de20bebBB1CBb678499941807E3E040",
        "transactionHash": "0x081b86cc99306e694ef9daa3d3f9dc7f35ce91dce08c57ddaedcdd4b9a00008d",
        "enabled": false
      },
      "balancer": {
        "address": "0xDE7F15AF1D84e3694f7E966293d20e64Fc04d9fF",
        "transactionHash": "0xa4f30c029fa062ae1b481786950ab0243541ce5b0b859fc534b55f7b444ba83c",
        "enabled": false
      },
      "curve": {
        "address": "0x753D2b0a723992D7B174D6e19F7b7Cb74be8D61a",
        "transactionHash": "0xcdf0ff9723aedab96aeaa0b8f57f25ad6075f9467e0d19f3b842fb17c0bb6a79",
        "enabled": false
      },
      "oneInch": {
        "address": "0x487090bbb7d17875281692d582a11B445b3A7AC7",
        "transactionHash": "0x4d0dd682b8e22812a258fee497c07e5cecfbc1228f413e67d9fe7b24f327a926",
        "enabled": false
      }
    }
  },
  "routes": [
    {
      "poolId": "651940-dodo_pmm-wall-ausdc",
      "provider": "dodo",
      "tokenA": {
        "symbol": "WALL",
        "address": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4"
      },
      "tokenB": {
        "symbol": "AUSDC",
        "address": "0xa95EeD79f84E6A0151eaEb9d441F9Ffd50e8e881"
      },
      "poolAddress": "0x7b81Dad382BBB57e91a80389bA48e41Abd10794F",
      "status": "quoteable",
      "verification": {
        "amountInRaw": "1000000",
        "amountOutRaw": "1999999",
        "slippageBps": 30,
        "routerQuoteExecutable": true
      }
    },
    {
      "poolId": "651940-dodo_pmm-wall-ausdt",
      "provider": "dodo",
      "tokenA": {
        "symbol": "WALL",
        "address": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4"
      },
      "tokenB": {
        "symbol": "AUSDT",
        "address": "0x015B1897Ed5279930bC2Be46F661894d219292A6"
      },
      "poolAddress": "0x8D9bB238B6a76a438B116Ff22F5F7535191D49b4",
      "status": "quoteable",
      "verification": {
        "amountInRaw": "1000000",
        "amountOutRaw": "1999999",
        "slippageBps": 30,
        "routerQuoteExecutable": true
      }
    }
  ],
  "providerStatus": {
    "enabled": [
      "dodo"
    ],
    "disabled": [
      "dodoV3",
      "uniswapV3",
      "balancer",
      "curve",
      "oneInch",
      "partner"
    ]
  },
  "remainingOptionalBlockers": [
    "HYDX-native router/factory is not deployed or not committed in inventory.",
    "Uniswap V3 factory/router/quoter/pool stack is not deployed or not committed in inventory."
  ],
  "disabledRoutes": [
    {
      "poolId": "651940-dodo_pmm-wall-usdt",
      "provider": "dodo",
      "tokenA": {
        "symbol": "WALL",
        "address": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4"
      },
      "tokenB": {
        "symbol": "USDT",
        "address": "0x66D8Efa0AF63B0e84eb1Dd72bf00f00cd1e2234e"
      },
      "poolAddress": "0x261D7e1447EE88398B2b5a274D49454F5B86800E",
      "status": "disabled_wrong_quote_asset",
      "reason": "AUSDT is the canonical ALL Mainnet cUSDT surface for this routing set.",
      "disabledTransactions": [
        "0x79f171ddc9977e99bb894bf7ff7a11a430441cc1285e7ecd747907ef3f23a0c4",
        "0xec74f92e287cf1e193e791462f66b35cf9487ece8e343108fbdd3de760dc5c55"
      ]
    }
  ]
}
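Each committed route verification pairs `amountInRaw`/`amountOutRaw` with `"slippageBps": 30`. A minimal sketch of the min-out bound that convention implies (assuming the usual basis-points-of-10000 floor; the actual router math is not shown in this diff):

```python
# Sketch: min acceptable output for a quoted amount at a given slippage tolerance,
# minOut = floor(amountOut * (10000 - bps) / 10000), using integer raw units.
def min_amount_out(amount_out_raw: str, slippage_bps: int) -> int:
    amount_out = int(amount_out_raw)
    return amount_out * (10_000 - slippage_bps) // 10_000

# Committed WALL/AUSDC verification: 1000000 in -> 1999999 out at 30 bps.
print(min_amount_out("1999999", 30))  # 1993999
```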
8759 config/all-mainnet-pool-creation-matrix.json Normal file
File diff suppressed because it is too large
68 config/all-mainnet-uniswap-v3-deployment.json Normal file
@@ -0,0 +1,68 @@
{
  "generatedAt": "2026-04-29T06:18:00Z",
  "chainId": 651940,
  "deployer": "0x4A666F96fC8764181194447A7dFdb7d471b301C8",
  "fee": 3000,
  "tokens": {
    "WETH": "0x798F6762BB40d6801A593459d08F890603D3979C",
    "WALL": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4",
    "AUSDT": "0x015B1897Ed5279930bC2Be46F661894d219292A6",
    "token0": "0x015B1897Ed5279930bC2Be46F661894d219292A6",
    "token1": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4"
  },
  "contracts": {
    "nftDescriptorLibrary": "0xb53E8A0A19fB381537c6f28D37b7C2f7DC29EF02",
    "nonfungibleTokenPositionDescriptor": "0x2a76C73458A0C11df4e0E43004598480d6D1E768",
    "factory": "0xF1a334465C5DD628492780B39Be68D561A9AecA2",
    "swapRouter": "0xe9Ea1B70803c18C4CEb8839D5D68681c7903511B",
    "quoter": "0x0ecC56077325863c80cbe516D63e0afAFf7EA579",
    "quoterV2": "0x024Ff178BaB7e6fa1794c3A216D2B299C3F295d2",
    "nonfungiblePositionManager": "0xD29422211e1f2C1015FBb5dC2004657Dd8318aF6",
    "pool": "0x9e0FC06BA367b51a0aBc5c0924306088DBB0e9c4"
  },
  "transactions": {
    "nftDescriptorLibrary": "0x774202382ec2d29cced671b34c2b951682f60d3e60afd7fe64c13488cb341e32",
    "nonfungibleTokenPositionDescriptor": "0xc6b98fc36e4c3b1d4d2e80efd4acacc31e2af2ff45de04f9fb066dcfffd380d3",
    "factory": "0xb6e46b6d145cc707f12f4cf8980bf81d7b5b8d3bea9416737a7465c186b0fefd",
    "swapRouter": "0x5fd7d021e8ac1bad918a1eb470a116f9dc6e750c102a5512e05391858296cc53",
    "quoter": "0x0d5c14d3264c5abd70990349911a6eb3076f41feb2db93ccf74b2de022cd087f",
    "quoterV2": "0x774327c7e7a7650fbfd9d28a8becbd88f86eb8f942a825980052bc50484aa54c",
    "nonfungiblePositionManager": "0xe5be3fa83bd676051e2cc5ff990768d3de87e49a387d94be77352eaf1c38545f"
  },
  "poolState": {
    "sqrtPriceX96": "79228162514264337593543950336",
    "tick": 0,
    "liquidity": "1000000000000000000"
  },
  "name": "ALL Mainnet Uniswap V3 Deployment Evidence",
  "version": "0.1.0",
  "network": "ALL Mainnet (Alltra)",
  "evmVersion": "upstream-uniswap-artifacts-solc-0.7.x",
  "packageSources": {
    "v3Core": "@uniswap/v3-core@1.0.1",
    "v3Periphery": "@uniswap/v3-periphery@1.4.4",
    "swapRouterContracts": "@uniswap/swap-router-contracts@1.3.1"
  },
  "poolStateAfterRouterSwap": {
    "testedAt": "2026-04-29T06:17:00Z",
    "swapRouter": "0xe9Ea1B70803c18C4CEb8839D5D68681c7903511B",
    "direction": "WALL_TO_AUSDT",
    "amountInRaw": "1000000",
    "amountOutRaw": "996999",
    "approveTxHash": "0x572d1c6b2d0cdf6248913cd995e80196fbe0717017411c2251637afbfa825e1f",
    "swapTxHash": "0xddf85aed18a6d872ac72d4f57b241e44946881e404f4f17cb7271180c8caa183",
    "gasUsed": "119111"
  },
  "enhancedRouterIntegration": {
    "enhancedSwapRouterV2": "0xb905fEfA56b028221E2Bc248Bbcd41141dc7aeD3",
    "routeConfigured": true,
    "providerEnabled": false,
    "providerDisabledReason": "Existing UniswapV3RouteExecutorAdapter uses staticcall into the official Uniswap Quoter; the upstream Quoter is callable directly but does not return through that adapter staticcall path. Standalone SwapRouter/Quoter/Pool stack is live; enhanced-router V3 provider remains disabled until adapter quote compatibility is fixed.",
    "routeSetTransactions": [
      "0xa40b24889ab3ad985936562ee3690dafd14bfb1676ff49806a6fcb45c7704ef5",
      "0x848fd6c7cedaebe7787c2f15a931b73afde709dac100cb745eab2d9eaa6da86c"
    ],
    "providerEnableTxHash": "0x4b430081582e1f2db5fedc904b8e90e137480dcae2f1e0a41dd25490f05394c7",
    "providerDisableTxHash": "0x78b8ce4fdc296585ace36dd8c8318731cc5526115e712b14d1ad630c4f63aba6"
  }
}
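The committed `poolState` seeds the pool at tick 0. `sqrtPriceX96` is Uniswap V3's square-root price in Q64.96 fixed point, and the committed value is exactly 2**96, i.e. a token1/token0 price of 1. A small decoding sketch (standard Q64.96 math, not repo code):

```python
from decimal import Decimal

# Sketch: decode a Uniswap V3 sqrtPriceX96 into a token1/token0 price.
# price = (sqrtPriceX96 / 2**96) ** 2; decimals-adjustment is omitted here
# because both pool tokens in the committed deployment use 18 decimals.
def price_from_sqrt_price_x96(sqrt_price_x96: int) -> Decimal:
    ratio = Decimal(sqrt_price_x96) / Decimal(2**96)
    return ratio * ratio

# Committed seed value: 79228162514264337593543950336 == 2**96 -> price 1,
# consistent with "tick": 0 (1.0001**0 == 1).
print(price_from_sqrt_price_x96(79228162514264337593543950336))  # 1
```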
27 config/all-mainnet-vault-assignments.example.json Normal file
@@ -0,0 +1,27 @@
{
  "description": "Copy to config/all-mainnet-vault-assignments.json and replace placeholder addresses with approved per-role vaults/multisigs. The apply script refuses placeholders.",
  "defaultByRole": {
    "treasury_reserve": "0x0000000000000000000000000000000000000000",
    "bridge_liquidity": "0x0000000000000000000000000000000000000000",
    "protocol_adapter": "0x0000000000000000000000000000000000000000",
    "emergency_withdraw": "0x0000000000000000000000000000000000000000",
    "single_sided_inventory": "0x0000000000000000000000000000000000000000"
  },
  "byChain": {
    "651940": {
      "treasury_reserve": "0x0000000000000000000000000000000000000000",
      "bridge_liquidity": "0x0000000000000000000000000000000000000000",
      "protocol_adapter": "0x0000000000000000000000000000000000000000",
      "emergency_withdraw": "0x0000000000000000000000000000000000000000",
      "single_sided_inventory": "0x0000000000000000000000000000000000000000"
    }
  },
  "byPoolId": {
    "651940-uniswap_v2-wall-ausdc": {
      "treasury_reserve": "0x0000000000000000000000000000000000000000",
      "bridge_liquidity": "0x0000000000000000000000000000000000000000",
      "protocol_adapter": "0x0000000000000000000000000000000000000000",
      "emergency_withdraw": "0x0000000000000000000000000000000000000000"
    }
  }
}
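The description above says the apply script refuses placeholders. A minimal sketch of that refusal rule (hypothetical helper; the real apply script is not part of this diff):

```python
# Sketch (assumed behavior): collect every role still mapped to the zero-address
# placeholder, so an apply step can refuse to proceed while any remain.
ZERO = "0x0000000000000000000000000000000000000000"

def find_placeholder_roles(assignments: dict) -> list:
    bad = []
    for role, addr in assignments.get("defaultByRole", {}).items():
        if addr.lower() == ZERO:
            bad.append(role)
    for chain, roles in assignments.get("byChain", {}).items():
        for role, addr in roles.items():
            if addr.lower() == ZERO:
                bad.append(f"{chain}:{role}")
    return bad

example = {"defaultByRole": {"treasury_reserve": ZERO}, "byChain": {}}
print(find_placeholder_roles(example))  # ['treasury_reserve']
```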
20 config/all-mainnet-vault-assignments.json Normal file
@@ -0,0 +1,20 @@
{
  "description": "Operational vault assignments generated from smom-dbis-138/.env public addresses. No private material is stored here.",
  "defaultByRole": {
    "treasury_reserve": "0x74eccf9affb0e0938c2168ebdf7ef63a26964483",
    "bridge_liquidity": "0x31884f84555210FFB36a19D2471b8eBc7372d0A8",
    "protocol_adapter": "0xb9E29cFa1f89d369671E640d0BB3aD94Cab43965",
    "emergency_withdraw": "0xb9E29cFa1f89d369671E640d0BB3aD94Cab43965",
    "single_sided_inventory": "0x31884f84555210FFB36a19D2471b8eBc7372d0A8"
  },
  "byChain": {
    "651940": {
      "treasury_reserve": "0x74eccf9affb0e0938c2168ebdf7ef63a26964483",
      "bridge_liquidity": "0x31884f84555210FFB36a19D2471b8eBc7372d0A8",
      "protocol_adapter": "0xb9E29cFa1f89d369671E640d0BB3aD94Cab43965",
      "emergency_withdraw": "0xb9E29cFa1f89d369671E640d0BB3aD94Cab43965",
      "single_sided_inventory": "0x31884f84555210FFB36a19D2471b8eBc7372d0A8"
    }
  },
  "byPoolId": {}
}
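A sketch of one plausible lookup order over these assignment files, byPoolId then byChain then defaultByRole (the precedence itself is an assumption for illustration; the committed JSON only supplies the data):

```python
# Sketch (assumed precedence): resolve the vault for a role, preferring the most
# specific scope: byPoolId, then byChain, then defaultByRole.
def resolve_vault(assignments: dict, chain_id: str, pool_id: str, role: str) -> str:
    for scope in (assignments.get("byPoolId", {}).get(pool_id, {}),
                  assignments.get("byChain", {}).get(chain_id, {}),
                  assignments.get("defaultByRole", {})):
        if role in scope:
            return scope[role]
    raise KeyError(role)

assignments = {
    "defaultByRole": {"treasury_reserve": "0x74eccf9affb0e0938c2168ebdf7ef63a26964483"},
    "byChain": {"651940": {"treasury_reserve": "0x74eccf9affb0e0938c2168ebdf7ef63a26964483"}},
    "byPoolId": {},
}
print(resolve_vault(assignments, "651940", "651940-uniswap_v2-wall-ausdc", "treasury_reserve"))
```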
@@ -1,17 +1,69 @@
 {
   "name": "ALL Mainnet Non-DODO Protocol Surface",
   "version": "0.1.0",
-  "updated": "2026-04-21",
+  "updated": "2026-04-29",
   "chainId": 651940,
   "network": "ALL Mainnet (Alltra)",
-  "status": "bridge_live_swap_inventory_pending",
+  "status": "bridge_live_enhanced_router_partial_swap_inventory_published",
   "summary": {
-    "bridgeOnlyLive": true,
-    "sameChainSwapInventoryPublished": false,
+    "bridgeOnlyLive": false,
+    "sameChainSwapInventoryPublished": true,
     "notes": [
       "The Chain 138 <-> 651940 AlltraAdapter bridge is live.",
-      "This file documents the known non-DODO Alltra protocol and token surface without asserting live routable pool inventory.",
-      "Promote protocols here into canonical route inventory only after real factory/router/pool addresses are committed and verified."
+      "This file documents the known non-DODO Alltra protocol and token surface plus the committed same-chain inventory fragments that have real factory/router/pool addresses in config/all-mainnet-pool-creation-matrix.json.",
+      "Same-chain inventory publication is partial: production routing remains gated by required vault assignments, funding, live reserve reads, and canary evidence.",
+      "ALL Mainnet EnhancedSwapRouterV2 is deployed and DODO-backed routes are wired for the committed WALL/AUSDC and WALL/AUSDT DODO PMM pools; the earlier WALL/USDT route is disabled because AUSDT is the canonical ALL Mainnet cUSDT surface."
     ]
   },
+  "classificationFramework": {
+    "category": [
+      "tokenized-fiat",
+      "stablecoin",
+      "wrapped-native",
+      "dex-token",
+      "defi-token",
+      "governance-token",
+      "utility-token",
+      "rwa-token",
+      "commodity-token",
+      "other"
+    ],
+    "instrumentType": [
+      "emoney",
+      "deposit-token",
+      "fiat-backed-stablecoin",
+      "wrapped-native",
+      "protocol-token",
+      "governance-token",
+      "utility-token",
+      "other"
+    ],
+    "backingAssets": [
+      "cash",
+      "cash-equivalents",
+      "bank-deposits",
+      "treasuries",
+      "commodity-reserves",
+      "protocol-utility",
+      "native-gas-asset",
+      "unknown"
+    ],
+    "metadataDomains": [
+      "backingMetadata",
+      "bridgeMetadata",
+      "cashMetadata",
+      "commodityMetadata",
+      "reserveMetadata",
+      "securityMetadata",
+      "settlementMetadata"
+    ],
+    "notes": [
+      "Use category for the broad asset bucket.",
+      "Use instrumentType, issuerType, claimType, backingAssets, capabilities, and tags for legal, reserve, and integration semantics.",
+      "Use cash only as a backing, redemption, or settlement asset descriptor; do not use cash as the token category unless the instrument is literally cash-equivalent legal tender.",
+      "Use commodityMetadata only when the token directly references or is backed by a commodity reserve.",
+      "Use securityMetadata for pause/admin/monitoring controls; unknown means not yet committed in this inventory, not absent on-chain.",
+      "GRU tags use lowercase namespace:value strings and include the version, for example gru:v2."
+    ]
+  },
   "documentedTokens": [
@@ -19,21 +71,284 @@
       "symbol": "AUSDT",
       "address": "0x015B1897Ed5279930bC2Be46F661894d219292A6",
       "decimals": 18,
-      "category": "stablecoin",
+      "category": "tokenized-fiat",
+      "instrumentType": "fiat-backed-stablecoin",
+      "issuerType": "token-issuer-unverified",
+      "currencyCode": "USD",
+      "claimType": "claim-on-issuer-unverified",
+      "settlementAssetClass": "fiat",
+      "backingAssets": [
+        "cash",
+        "cash-equivalents"
+      ],
+      "gruVersion": "v2",
+      "gruFamilySymbol": "cAUSDT",
+      "gruTransportRole": "all-mainnet-primary-surface",
+      "tags": [
+        "tokenized-fiat",
+        "fiat:usd",
+        "backing:cash",
+        "backing:cash-equivalents",
+        "gru:v2",
+        "gru:m1",
+        "gru:transport",
+        "gru:all-mainnet",
+        "gru:causdt-family"
+      ],
+      "backingMetadata": {
+        "backingModel": "fiat-reserve-backed",
+        "backingAssetClasses": [
+          "cash",
+          "cash-equivalents"
+        ],
+        "backingVerificationStatus": "reserve-disclosure-not-committed",
+        "overcollateralizationRequired": false
+      },
+      "bridgeMetadata": {
+        "bridgeStatus": "live-canonical-target",
+        "bridgeKind": "AlltraAdapter",
+        "sourceChainId": 138,
+        "destinationChainId": 651940,
+        "sourceSymbol": "cUSDT",
+        "sourceAddress": "0x93E66202A11B1772E55407B32B44e5Cd8eda7f22",
+        "destinationSymbol": "AUSDT",
+        "destinationAddress": "0x015B1897Ed5279930bC2Be46F661894d219292A6",
+        "adapterAddress": "0x66FEBA2fC9a0B47F26DD4284DAd24F970436B8Dc",
+        "bridgeCanonicalAssetVersion": "gru-v2",
+        "bridgeMirroredAssetVersion": "all-mainnet-surface"
+      },
+      "cashMetadata": {
+        "cashRole": "reserve-and-redemption-asset-class",
+        "currency": "USD",
+        "cashBackingAssertedByRepo": false,
+        "cashBackingEvidenceRef": null
+      },
+      "commodityMetadata": {
+        "commodityBacked": false,
+        "commodityType": null,
+        "commodityUnit": null,
+        "reserveLocationRef": null
+      },
+      "reserveMetadata": {
+        "reserveModel": "issuer-or-bridge-reserve-unverified",
+        "reserveDisclosureRef": null,
+        "reserveAccountRef": null,
+        "proofOfReserveRef": null,
+        "reserveVerificationStatus": "pending-disclosure",
+        "riskTier": "policy-review-required",
+        "registryStatus": "documented-surface-not-stablecoin-registry-entry"
+      },
+      "securityMetadata": {
+        "pauseAuthority": "unknown",
+        "adminAuthority": "unknown",
+        "upgradeability": "unknown",
+        "keyManagement": "unknown",
+        "emergencyHalt": "corridor-halt-required-for-issuer-bridge-or-peg-risk",
+        "monitoring": [
+          "peg-deviation",
+          "bridge-health",
+          "liquidity-depth",
+          "contract-admin-changes"
+        ]
+      },
+      "settlementMetadata": {
+        "settlementAssetClass": "fiat",
+        "settlementCurrency": "USD",
+        "settlementFinalityDomain": "off-chain-regulated-ledger-or-issuer-domain",
+        "onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
+        "accountingEvidenceRequired": true,
+        "redemptionPath": "issuer-or-bridge-redemption-unverified",
+        "parRedemption": "unverified"
+      },
       "status": "verified"
     },
     {
       "symbol": "USDT",
       "address": "0x66D8Efa0AF63B0e84eb1Dd72bf00f00cd1e2234e",
       "decimals": 18,
-      "category": "stablecoin",
+      "category": "tokenized-fiat",
+      "instrumentType": "fiat-backed-stablecoin",
+      "issuerType": "token-issuer-unverified",
+      "currencyCode": "USD",
+      "claimType": "claim-on-issuer-unverified",
+      "settlementAssetClass": "fiat",
+      "backingAssets": [
+        "cash",
+        "cash-equivalents"
+      ],
+      "gruVersion": "v2",
+      "gruFamilySymbol": "cUSDT",
+      "gruTransportRole": "all-mainnet-usdt-surface",
+      "tags": [
+        "tokenized-fiat",
+        "fiat:usd",
+        "backing:cash",
+        "backing:cash-equivalents",
+        "gru:v2",
+        "gru:m1",
+        "gru:transport",
+        "gru:all-mainnet",
+        "gru:cusdt-family"
+      ],
+      "backingMetadata": {
+        "backingModel": "fiat-reserve-backed",
+        "backingAssetClasses": [
+          "cash",
+          "cash-equivalents"
+        ],
+        "backingVerificationStatus": "reserve-disclosure-not-committed",
+        "overcollateralizationRequired": false
+      },
+      "bridgeMetadata": {
+        "bridgeStatus": "documented-token-not-canonical-138-to-651940-target",
+        "bridgeKind": "unknown-or-noncanonical",
+        "sourceChainId": null,
+        "destinationChainId": 651940,
+        "sourceSymbol": null,
+        "sourceAddress": null,
+        "destinationSymbol": "USDT",
+        "destinationAddress": "0x66D8Efa0AF63B0e84eb1Dd72bf00f00cd1e2234e",
+        "adapterAddress": null,
+        "bridgeCanonicalAssetVersion": null,
+        "bridgeMirroredAssetVersion": "all-mainnet-surface"
+      },
+      "cashMetadata": {
+        "cashRole": "reserve-and-redemption-asset-class",
+        "currency": "USD",
+        "cashBackingAssertedByRepo": false,
+        "cashBackingEvidenceRef": null
+      },
+      "commodityMetadata": {
+        "commodityBacked": false,
+        "commodityType": null,
+        "commodityUnit": null,
+        "reserveLocationRef": null
+      },
+      "reserveMetadata": {
+        "reserveModel": "issuer-reserve-unverified",
+        "reserveDisclosureRef": null,
+        "reserveAccountRef": null,
+        "proofOfReserveRef": null,
+        "reserveVerificationStatus": "pending-disclosure",
+        "riskTier": "policy-review-required",
+        "registryStatus": "documented-surface-not-stablecoin-registry-entry"
+      },
+      "securityMetadata": {
+        "pauseAuthority": "unknown",
+        "adminAuthority": "unknown",
+        "upgradeability": "unknown",
+        "keyManagement": "unknown",
+        "emergencyHalt": "corridor-halt-required-for-issuer-or-peg-risk",
+        "monitoring": [
+          "peg-deviation",
+          "liquidity-depth",
+          "contract-admin-changes"
+        ]
+      },
+      "settlementMetadata": {
+        "settlementAssetClass": "fiat",
+        "settlementCurrency": "USD",
+        "settlementFinalityDomain": "off-chain-issuer-domain",
+        "onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
+        "accountingEvidenceRequired": true,
+        "redemptionPath": "issuer-redemption-unverified",
+        "parRedemption": "unverified"
+      },
       "status": "verified"
     },
     {
       "symbol": "USDC",
       "address": "0xa95EeD79f84E6A0151eaEb9d441F9Ffd50e8e881",
       "decimals": 18,
-      "category": "stablecoin",
+      "category": "tokenized-fiat",
+      "instrumentType": "fiat-backed-stablecoin",
+      "issuerType": "token-issuer-unverified",
+      "currencyCode": "USD",
+      "claimType": "claim-on-issuer-unverified",
+      "settlementAssetClass": "fiat",
+      "backingAssets": [
+        "cash",
+        "cash-equivalents"
+      ],
+      "gruVersion": "v2",
+      "gruFamilySymbol": "cUSDC",
+      "gruTransportRole": "all-mainnet-usdc-surface",
+      "tags": [
+        "tokenized-fiat",
+        "fiat:usd",
+        "backing:cash",
+        "backing:cash-equivalents",
+        "gru:v2",
+        "gru:m1",
+        "gru:transport",
+        "gru:all-mainnet",
+        "gru:cusdc-family"
+      ],
+      "backingMetadata": {
+        "backingModel": "fiat-reserve-backed",
+        "backingAssetClasses": [
+          "cash",
+          "cash-equivalents"
+        ],
+        "backingVerificationStatus": "reserve-disclosure-not-committed",
+        "overcollateralizationRequired": false
+      },
+      "bridgeMetadata": {
+        "bridgeStatus": "live-canonical-target",
+        "bridgeKind": "AlltraAdapter",
+        "sourceChainId": 138,
+        "destinationChainId": 651940,
+        "sourceSymbol": "cUSDC",
+        "sourceAddress": "0xf22258f57794CC8E06237084b353Ab30fFfa640b",
+        "destinationSymbol": "USDC",
+        "destinationAddress": "0xa95EeD79f84E6A0151eaEb9d441F9Ffd50e8e881",
+        "adapterAddress": "0x66FEBA2fC9a0B47F26DD4284DAd24F970436B8Dc",
+        "bridgeCanonicalAssetVersion": "gru-v2",
+        "bridgeMirroredAssetVersion": "all-mainnet-surface"
+      },
+      "cashMetadata": {
+        "cashRole": "reserve-and-redemption-asset-class",
+        "currency": "USD",
+        "cashBackingAssertedByRepo": false,
+        "cashBackingEvidenceRef": null
+      },
+      "commodityMetadata": {
+        "commodityBacked": false,
+        "commodityType": null,
+        "commodityUnit": null,
+        "reserveLocationRef": null
+      },
+      "reserveMetadata": {
+        "reserveModel": "issuer-or-bridge-reserve-unverified",
+        "reserveDisclosureRef": null,
+        "reserveAccountRef": null,
+        "proofOfReserveRef": null,
+        "reserveVerificationStatus": "pending-disclosure",
+        "riskTier": "policy-review-required",
+        "registryStatus": "documented-surface-not-stablecoin-registry-entry"
+      },
+      "securityMetadata": {
+        "pauseAuthority": "unknown",
+        "adminAuthority": "unknown",
+        "upgradeability": "unknown",
+        "keyManagement": "unknown",
+        "emergencyHalt": "corridor-halt-required-for-issuer-bridge-or-peg-risk",
+        "monitoring": [
+          "peg-deviation",
+          "bridge-health",
+          "liquidity-depth",
+          "contract-admin-changes"
+        ]
+      },
+      "settlementMetadata": {
+        "settlementAssetClass": "fiat",
+        "settlementCurrency": "USD",
+        "settlementFinalityDomain": "off-chain-regulated-ledger-or-issuer-domain",
+        "onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
+        "accountingEvidenceRequired": true,
+        "redemptionPath": "issuer-or-bridge-redemption-unverified",
+        "parRedemption": "unverified"
+      },
       "status": "verified"
     },
     {
@@ -41,6 +356,80 @@
       "address": "0x798F6762BB40d6801A593459d08F890603D3979C",
       "decimals": 18,
       "category": "wrapped-native",
+      "instrumentType": "wrapped-native",
+      "issuerType": "wrapper-contract",
+      "settlementAssetClass": "crypto-native",
+      "backingAssets": [
+        "native-gas-asset"
+      ],
+      "gruVersion": null,
+      "tags": [
+        "wrapped-native",
+        "gas:eth",
+        "all-mainnet"
+      ],
+      "backingMetadata": {
+        "backingModel": "wrapped-native-escrow",
+        "backingAssetClasses": [
+          "native-gas-asset"
+        ],
+        "backingVerificationStatus": "wrapper-contract-address-verified",
+        "overcollateralizationRequired": false
+      },
+      "bridgeMetadata": {
+        "bridgeStatus": "mapped-138-to-651940",
+        "bridgeKind": "AlltraAdapter",
+        "sourceChainId": 138,
+        "destinationChainId": 651940,
+        "sourceSymbol": "WETH9",
+        "sourceAddress": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
+        "destinationSymbol": "WETH",
+        "destinationAddress": "0x798F6762BB40d6801A593459d08F890603D3979C",
+        "adapterAddress": "0x66FEBA2fC9a0B47F26DD4284DAd24F970436B8Dc"
+      },
+      "cashMetadata": {
+        "cashRole": "none",
+        "currency": null,
+        "cashBackingAssertedByRepo": false,
+        "cashBackingEvidenceRef": null
+      },
+      "commodityMetadata": {
+        "commodityBacked": false,
+        "commodityType": null,
+        "commodityUnit": null,
+        "reserveLocationRef": null
+      },
+      "reserveMetadata": {
+        "reserveModel": "native-asset-wrapper-escrow",
+        "reserveDisclosureRef": null,
+        "reserveAccountRef": "wrapper-contract-balance",
+        "proofOfReserveRef": null,
+        "reserveVerificationStatus": "contract-balance-verifiable-on-chain",
+        "riskTier": "bridge-and-wrapper-risk",
+        "registryStatus": "documented-token-surface"
+      },
+      "securityMetadata": {
+        "pauseAuthority": "unknown",
+        "adminAuthority": "unknown",
+        "upgradeability": "unknown",
+        "keyManagement": "unknown",
+        "emergencyHalt": "corridor-halt-required-for-bridge-or-wrapper-risk",
+        "monitoring": [
+          "bridge-health",
+          "wrapper-contract-balance",
+          "liquidity-depth",
+          "contract-admin-changes"
+        ]
+      },
+      "settlementMetadata": {
+        "settlementAssetClass": "crypto-native",
+        "settlementCurrency": "ETH",
+        "settlementFinalityDomain": "chain-finality",
+        "onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
+        "accountingEvidenceRequired": false,
+        "redemptionPath": "unwrap-or-bridge-withdrawal",
+        "parRedemption": "one-to-one-native-asset-when-wrapper-solvent"
+      },
       "status": "verified"
     },
     {
@@ -48,6 +437,79 @@
       "address": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4",
       "decimals": 18,
       "category": "wrapped-native",
+      "instrumentType": "wrapped-native",
+      "issuerType": "wrapper-contract",
+      "settlementAssetClass": "crypto-native",
+      "backingAssets": [
+        "native-gas-asset"
+      ],
+      "gruVersion": null,
+      "tags": [
+        "wrapped-native",
+        "gas:all",
+        "all-mainnet"
+      ],
+      "backingMetadata": {
+        "backingModel": "wrapped-native-escrow",
+        "backingAssetClasses": [
+          "native-gas-asset"
+        ],
+        "backingVerificationStatus": "wrapper-contract-address-verified",
+        "overcollateralizationRequired": false
+      },
+      "bridgeMetadata": {
+        "bridgeStatus": "documented-all-mainnet-native-wrapper",
+        "bridgeKind": "native-wrapper",
+        "sourceChainId": 651940,
+        "destinationChainId": 651940,
+        "sourceSymbol": "ALL",
+        "sourceAddress": null,
+        "destinationSymbol": "WALL",
+        "destinationAddress": "0x2da2b8f961F161ab6320acB3377e2e844a3C3ce4",
+        "adapterAddress": null
+      },
+      "cashMetadata": {
+        "cashRole": "none",
+        "currency": null,
+        "cashBackingAssertedByRepo": false,
+        "cashBackingEvidenceRef": null
+      },
+      "commodityMetadata": {
+        "commodityBacked": false,
+        "commodityType": null,
+        "commodityUnit": null,
+        "reserveLocationRef": null
+      },
+      "reserveMetadata": {
+        "reserveModel": "native-asset-wrapper-escrow",
+        "reserveDisclosureRef": null,
+        "reserveAccountRef": "wrapper-contract-balance",
+        "proofOfReserveRef": null,
+        "reserveVerificationStatus": "contract-balance-verifiable-on-chain",
+        "riskTier": "wrapper-risk",
+        "registryStatus": "documented-token-surface"
+      },
+      "securityMetadata": {
+        "pauseAuthority": "unknown",
+        "adminAuthority": "unknown",
+        "upgradeability": "unknown",
+        "keyManagement": "unknown",
+        "emergencyHalt": "corridor-halt-required-for-wrapper-risk",
+        "monitoring": [
+          "wrapper-contract-balance",
+          "liquidity-depth",
+          "contract-admin-changes"
+        ]
+      },
+      "settlementMetadata": {
+        "settlementAssetClass": "crypto-native",
+        "settlementCurrency": "ALL",
+        "settlementFinalityDomain": "chain-finality",
+        "onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
+        "accountingEvidenceRequired": false,
+        "redemptionPath": "unwrap-to-native-all",
+        "parRedemption": "one-to-one-native-asset-when-wrapper-solvent"
+      },
       "status": "verified"
     },
     {
@@ -55,6 +517,79 @@
       "address": "0x0d9793861AEB9244AD1B34375a83A6730F6AdD38",
       "decimals": 18,
       "category": "dex-token",
+      "instrumentType": "protocol-token",
+      "issuerType": "protocol",
+      "settlementAssetClass": "crypto-native",
+      "backingAssets": [
+        "protocol-utility"
+      ],
+      "gruVersion": null,
+      "tags": [
+        "dex-token",
+        "protocol:hydx",
+        "all-mainnet"
+      ],
+      "backingMetadata": {
+        "backingModel": "protocol-utility",
+        "backingAssetClasses": [
+          "protocol-utility"
+        ],
+        "backingVerificationStatus": "not-reserve-backed",
+        "overcollateralizationRequired": false
+      },
+      "bridgeMetadata": {
+        "bridgeStatus": "not-bridge-canonical-in-this-inventory",
+        "bridgeKind": null,
+        "sourceChainId": null,
+        "destinationChainId": 651940,
+        "sourceSymbol": null,
+        "sourceAddress": null,
+        "destinationSymbol": "HYDX",
+        "destinationAddress": "0x0d9793861AEB9244AD1B34375a83A6730F6AdD38",
+        "adapterAddress": null
+      },
+      "cashMetadata": {
+        "cashRole": "none",
+        "currency": null,
+        "cashBackingAssertedByRepo": false,
+        "cashBackingEvidenceRef": null
+      },
+      "commodityMetadata": {
+        "commodityBacked": false,
+        "commodityType": null,
+        "commodityUnit": null,
+        "reserveLocationRef": null
+      },
+      "reserveMetadata": {
+        "reserveModel": "none-protocol-token",
+        "reserveDisclosureRef": null,
+        "reserveAccountRef": null,
+        "proofOfReserveRef": null,
+        "reserveVerificationStatus": "not-applicable",
+        "riskTier": "protocol-token-risk",
+        "registryStatus": "documented-token-surface"
+      },
+      "securityMetadata": {
+        "pauseAuthority": "unknown",
+        "adminAuthority": "unknown",
+        "upgradeability": "unknown",
+        "keyManagement": "unknown",
+        "emergencyHalt": "routing-halt-required-for-protocol-or-contract-risk",
+        "monitoring": [
+          "liquidity-depth",
+          "contract-admin-changes",
+          "protocol-surface-confirmation"
+        ]
+      },
+      "settlementMetadata": {
+        "settlementAssetClass": "crypto-native",
+        "settlementCurrency": "HYDX",
+        "settlementFinalityDomain": "chain-finality",
+        "onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
+        "accountingEvidenceRequired": false,
+        "redemptionPath": "not-applicable",
+        "parRedemption": "not-applicable"
+      },
       "status": "verified"
     },
     {
@@ -62,6 +597,78 @@
|
||||
"address": "0x1839f77eBed7F388c7035f7061B4B8Ef0E72317a",
|
||||
"decimals": 8,
|
||||
"category": "defi-token",
|
||||
"instrumentType": "protocol-token",
|
||||
"issuerType": "protocol",
|
||||
"settlementAssetClass": "crypto-native",
|
||||
"backingAssets": [
|
||||
"protocol-utility"
|
||||
],
|
||||
"gruVersion": null,
|
||||
"tags": [
|
||||
"defi-token",
|
||||
"protocol:hybx",
|
||||
"all-mainnet"
|
||||
],
|
||||
"backingMetadata": {
|
||||
"backingModel": "protocol-utility",
|
||||
"backingAssetClasses": [
|
||||
"protocol-utility"
|
||||
],
|
||||
"backingVerificationStatus": "not-reserve-backed",
|
||||
"overcollateralizationRequired": false
|
||||
},
|
||||
"bridgeMetadata": {
|
||||
"bridgeStatus": "not-bridge-canonical-in-this-inventory",
|
||||
"bridgeKind": null,
|
||||
"sourceChainId": null,
|
||||
"destinationChainId": 651940,
|
||||
"sourceSymbol": null,
|
||||
"sourceAddress": null,
|
||||
"destinationSymbol": "HYBX",
|
||||
"destinationAddress": "0x1839f77eBed7F388c7035f7061B4B8Ef0E72317a",
|
||||
"adapterAddress": null
|
||||
},
|
||||
"cashMetadata": {
|
||||
"cashRole": "none",
|
||||
"currency": null,
|
||||
"cashBackingAssertedByRepo": false,
|
||||
"cashBackingEvidenceRef": null
|
||||
},
|
||||
"commodityMetadata": {
|
||||
"commodityBacked": false,
|
||||
"commodityType": null,
|
||||
"commodityUnit": null,
|
||||
"reserveLocationRef": null
|
||||
},
|
||||
"reserveMetadata": {
|
||||
"reserveModel": "none-protocol-token",
|
||||
"reserveDisclosureRef": null,
|
||||
"reserveAccountRef": null,
|
||||
"proofOfReserveRef": null,
|
||||
"reserveVerificationStatus": "not-applicable",
|
||||
"riskTier": "protocol-token-risk",
|
||||
"registryStatus": "documented-token-surface"
|
||||
},
|
||||
"securityMetadata": {
|
||||
"pauseAuthority": "unknown",
|
||||
"adminAuthority": "unknown",
|
||||
"upgradeability": "unknown",
|
||||
"keyManagement": "unknown",
|
||||
"emergencyHalt": "routing-halt-required-for-protocol-or-contract-risk",
|
||||
"monitoring": [
|
||||
"liquidity-depth",
|
||||
"contract-admin-changes"
|
||||
]
|
||||
},
|
||||
"settlementMetadata": {
|
||||
"settlementAssetClass": "crypto-native",
|
||||
"settlementCurrency": "HYBX",
|
||||
"settlementFinalityDomain": "chain-finality",
|
||||
"onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
|
||||
"accountingEvidenceRequired": false,
|
||||
"redemptionPath": "not-applicable",
|
||||
"parRedemption": "not-applicable"
|
||||
},
|
||||
"status": "verified"
|
||||
},
|
||||
{
|
||||
@@ -69,6 +676,79 @@
|
||||
"address": "0xE59Bb804F4884FcEA183a4A67B1bb04f4a4567bc",
|
||||
"decimals": 8,
|
||||
"category": "defi-token",
|
||||
"instrumentType": "utility-token",
|
||||
"issuerType": "protocol",
|
||||
"settlementAssetClass": "crypto-native",
|
||||
"backingAssets": [
|
||||
"protocol-utility"
|
||||
],
|
||||
"gruVersion": null,
|
||||
"tags": [
|
||||
"defi-token",
|
||||
"utility-token",
|
||||
"protocol:cht",
|
||||
"all-mainnet"
|
||||
],
|
||||
"backingMetadata": {
|
||||
"backingModel": "protocol-utility",
|
||||
"backingAssetClasses": [
|
||||
"protocol-utility"
|
||||
],
|
||||
"backingVerificationStatus": "not-reserve-backed",
|
||||
"overcollateralizationRequired": false
|
||||
},
|
||||
"bridgeMetadata": {
|
||||
"bridgeStatus": "not-bridge-canonical-in-this-inventory",
|
||||
"bridgeKind": null,
|
||||
"sourceChainId": null,
|
||||
"destinationChainId": 651940,
|
||||
"sourceSymbol": null,
|
||||
"sourceAddress": null,
|
||||
"destinationSymbol": "CHT",
|
||||
"destinationAddress": "0xE59Bb804F4884FcEA183a4A67B1bb04f4a4567bc",
|
||||
"adapterAddress": null
|
||||
},
|
||||
"cashMetadata": {
|
||||
"cashRole": "none",
|
||||
"currency": null,
|
||||
"cashBackingAssertedByRepo": false,
|
||||
"cashBackingEvidenceRef": null
|
||||
},
|
||||
"commodityMetadata": {
|
||||
"commodityBacked": false,
|
||||
"commodityType": null,
|
||||
"commodityUnit": null,
|
||||
"reserveLocationRef": null
|
||||
},
|
||||
"reserveMetadata": {
|
||||
"reserveModel": "none-utility-token",
|
||||
"reserveDisclosureRef": null,
|
||||
"reserveAccountRef": null,
|
||||
"proofOfReserveRef": null,
|
||||
"reserveVerificationStatus": "not-applicable",
|
||||
"riskTier": "utility-token-risk",
|
||||
"registryStatus": "documented-token-surface"
|
||||
},
|
||||
"securityMetadata": {
|
||||
"pauseAuthority": "unknown",
|
||||
"adminAuthority": "unknown",
|
||||
"upgradeability": "unknown",
|
||||
"keyManagement": "unknown",
|
||||
"emergencyHalt": "routing-halt-required-for-protocol-or-contract-risk",
|
||||
"monitoring": [
|
||||
"liquidity-depth",
|
||||
"contract-admin-changes"
|
||||
]
|
||||
},
|
||||
"settlementMetadata": {
|
||||
"settlementAssetClass": "crypto-native",
|
||||
"settlementCurrency": "CHT",
|
||||
"settlementFinalityDomain": "chain-finality",
|
||||
"onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
|
||||
"accountingEvidenceRequired": false,
|
||||
"redemptionPath": "not-applicable",
|
||||
"parRedemption": "not-applicable"
|
||||
},
|
||||
"status": "verified"
|
||||
},
|
||||
{
|
||||
@@ -76,6 +756,78 @@
|
||||
"address": "0x690740f055A41FA7669f5a379Bf71B0cDF353073",
|
||||
"decimals": 18,
|
||||
"category": "defi-token",
|
||||
"instrumentType": "protocol-token",
|
||||
"issuerType": "protocol",
|
||||
"settlementAssetClass": "crypto-native",
|
||||
"backingAssets": [
|
||||
"protocol-utility"
|
||||
],
|
||||
"gruVersion": null,
|
||||
"tags": [
|
||||
"defi-token",
|
||||
"protocol:auda",
|
||||
"all-mainnet"
|
||||
],
|
||||
"backingMetadata": {
|
||||
"backingModel": "protocol-utility",
|
||||
"backingAssetClasses": [
|
||||
"protocol-utility"
|
||||
],
|
||||
"backingVerificationStatus": "not-reserve-backed",
|
||||
"overcollateralizationRequired": false
|
||||
},
|
||||
"bridgeMetadata": {
|
||||
"bridgeStatus": "not-bridge-canonical-in-this-inventory",
|
||||
"bridgeKind": null,
|
||||
"sourceChainId": null,
|
||||
"destinationChainId": 651940,
|
||||
"sourceSymbol": null,
|
||||
"sourceAddress": null,
|
||||
"destinationSymbol": "AUDA",
|
||||
"destinationAddress": "0x690740f055A41FA7669f5a379Bf71B0cDF353073",
|
||||
"adapterAddress": null
|
||||
},
|
||||
"cashMetadata": {
|
||||
"cashRole": "none",
|
||||
"currency": null,
|
||||
"cashBackingAssertedByRepo": false,
|
||||
"cashBackingEvidenceRef": null
|
||||
},
|
||||
"commodityMetadata": {
|
||||
"commodityBacked": false,
|
||||
"commodityType": null,
|
||||
"commodityUnit": null,
|
||||
"reserveLocationRef": null
|
||||
},
|
||||
"reserveMetadata": {
|
||||
"reserveModel": "none-protocol-token",
|
||||
"reserveDisclosureRef": null,
|
||||
"reserveAccountRef": null,
|
||||
"proofOfReserveRef": null,
|
||||
"reserveVerificationStatus": "not-applicable",
|
||||
"riskTier": "protocol-token-risk",
|
||||
"registryStatus": "documented-token-surface"
|
||||
},
|
||||
"securityMetadata": {
|
||||
"pauseAuthority": "unknown",
|
||||
"adminAuthority": "unknown",
|
||||
"upgradeability": "unknown",
|
||||
"keyManagement": "unknown",
|
||||
"emergencyHalt": "routing-halt-required-for-protocol-or-contract-risk",
|
||||
"monitoring": [
|
||||
"liquidity-depth",
|
||||
"contract-admin-changes"
|
||||
]
|
||||
},
|
||||
"settlementMetadata": {
|
||||
"settlementAssetClass": "crypto-native",
|
||||
"settlementCurrency": "AUDA",
|
||||
"settlementFinalityDomain": "chain-finality",
|
||||
"onChainFinality": "token-transfer-final-on-chain-651940-after-confirmation",
|
||||
"accountingEvidenceRequired": false,
|
||||
"redemptionPath": "not-applicable",
|
||||
"parRedemption": "not-applicable"
|
||||
},
|
||||
"status": "verified"
|
||||
}
|
||||
],
|
||||
@@ -83,12 +835,43 @@
|
||||
{
|
||||
"name": "AlltraDEX / EnhancedSwapRouter",
|
||||
"family": "custom_router",
|
||||
"status": "documented_inventory_pending",
|
||||
"status": "partial_live_dodo_backed_router_deployed",
|
||||
"factoryAddress": null,
|
||||
"routerAddress": null,
|
||||
"routerAddress": "0xb905fEfA56b028221E2Bc248Bbcd41141dc7aeD3",
|
||||
"coordinatorAddress": "0x9276ae27d9c624B43dbE43494f34A9c5F0233a0B",
|
||||
"providerAddress": "0x36F65027D21e151F0b7810bae1E94b225AC7Ba9e",
|
||||
"adapters": {
|
||||
"dodo": "0x391D192BED6188c4DaB4C93c078bD18432687474",
|
||||
"dodoV3": "0x97Ce874142625134aEEBDF42B5E7bB806e731D25",
|
||||
"uniswapV3": "0xBF75F3401de20bebBB1CBb678499941807E3E040",
|
||||
"balancer": "0xDE7F15AF1D84e3694f7E966293d20e64Fc04d9fF",
|
||||
"curve": "0x753D2b0a723992D7B174D6e19F7b7Cb74be8D61a",
|
||||
"oneInch": "0x487090bbb7d17875281692d582a11B445b3A7AC7"
|
||||
},
|
||||
"enabledProviders": [
|
||||
"dodo"
|
||||
],
|
||||
"disabledProviders": [
|
||||
"dodoV3",
|
||||
"uniswapV3",
|
||||
"balancer",
|
||||
"curve",
|
||||
"oneInch",
|
||||
"partner"
|
||||
],
|
||||
"publishedRoutePoolIds": [
|
||||
"651940-dodo_pmm-wall-ausdc",
|
||||
"651940-dodo_pmm-wall-ausdt"
|
||||
],
|
||||
"deploymentEvidenceRef": "config/all-mainnet-enhanced-router-deployment.json",
|
||||
"notes": [
|
||||
"Documented in docs/11-references/ALL_MAINNET_ROUTING_ENGINE.md as the intended same-chain swap surface.",
|
||||
"No committed canonical factory/router/pool inventory is currently published in-repo."
|
||||
"EnhancedSwapRouterV2 is deployed on ALL Mainnet with DODO as the only enabled provider.",
|
||||
"Optional adapters were deployed for future wiring but are disabled until canonical provider targets and pools are committed.",
|
||||
"WALL/AUSDC and WALL/AUSDT are funded and quoteable through the router provider path; the earlier WALL/USDT route is disabled because AUSDT is the canonical quote asset."
|
||||
],
|
||||
"disabledRoutePoolIds": [
|
||||
"651940-dodo_pmm-wall-usdt"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -105,40 +888,68 @@
|
||||
],
|
||||
"notes": [
|
||||
"The HYDX token is documented and verified on ALL Mainnet.",
|
||||
"The repo expects factory/router discovery via env, but no canonical pool inventory is currently committed."
|
||||
"The repo expects factory/router discovery via env, but no canonical HYDX-native router inventory is currently committed.",
|
||||
"HYDX currently has committed same-chain exposure through the ALL Mainnet Uniswap V2 HYDX/WALL pool, not through a dedicated HYDX-native router surface."
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "Uniswap V2",
|
||||
"family": "uniswap_v2",
|
||||
"status": "env_placeholder_only",
|
||||
"factoryAddress": null,
|
||||
"routerAddress": null,
|
||||
"status": "partial_live_inventory_published",
|
||||
"factoryAddress": "0x3C3ED514691C06c89Bf6626B05D22991E8924c93",
|
||||
"routerAddress": "0xED04Ee8307C0656207AF5aFE3926AE2380052940",
|
||||
"inventoryRef": "config/all-mainnet-pool-creation-matrix.json",
|
||||
"publishedPoolIds": [
|
||||
"651940-uniswap_v2-wall-ausdc",
|
||||
"651940-uniswap_v2-wall-usdt",
|
||||
"651940-uniswap_v2-usdt-ausdc",
|
||||
"651940-uniswap_v2-hydx-wall"
|
||||
],
|
||||
"notes": [
|
||||
"Referenced in token-aggregation dex-factory config and docs as an env-driven surface.",
|
||||
"Do not treat as routable until real factory/router/pair addresses are committed."
|
||||
"Factory/router and multiple pair addresses are committed in config/all-mainnet-pool-creation-matrix.json.",
|
||||
"Required spend rows remain gated until vault assignments and canary evidence are recorded."
|
||||
]
|
||||
},
|
||||
{
|
||||
"name": "Uniswap V3",
|
||||
"family": "uniswap_v3",
|
||||
"status": "env_placeholder_only",
|
||||
"factoryAddress": null,
|
||||
"routerAddress": null,
|
||||
"status": "standalone_live_router_quoter_pool_deployed",
|
||||
"factoryAddress": "0xF1a334465C5DD628492780B39Be68D561A9AecA2",
|
||||
"routerAddress": "0xe9Ea1B70803c18C4CEb8839D5D68681c7903511B",
|
||||
"notes": [
|
||||
"Referenced in token-aggregation dex-factory config and docs as an env-driven surface.",
|
||||
"Do not treat as routable until real factory/router/pool addresses are committed."
|
||||
]
|
||||
"Official Uniswap V3 factory, legacy SwapRouter, Quoter, QuoterV2, NonfungiblePositionManager, and AUSDT/WALL 0.30% pool are deployed on ALL Mainnet.",
|
||||
"The standalone SwapRouter path was tested with a tiny WALL -> AUSDT canary swap.",
|
||||
"EnhancedSwapRouterV2 route config was written, but provider 1 remains disabled because the current UniswapV3RouteExecutorAdapter staticcall quote path is incompatible with the upstream Quoter behavior."
|
||||
],
|
||||
"quoterAddress": "0x0ecC56077325863c80cbe516D63e0afAFf7EA579",
|
||||
"quoterV2Address": "0x024Ff178BaB7e6fa1794c3A216D2B299C3F295d2",
|
||||
"positionManagerAddress": "0xD29422211e1f2C1015FBb5dC2004657Dd8318aF6",
|
||||
"descriptorAddress": "0x2a76C73458A0C11df4e0E43004598480d6D1E768",
|
||||
"poolAddress": "0x9e0FC06BA367b51a0aBc5c0924306088DBB0e9c4",
|
||||
"inventoryRef": "config/all-mainnet-pool-creation-matrix.json",
|
||||
"deploymentEvidenceRef": "config/all-mainnet-uniswap-v3-deployment.json",
|
||||
"publishedPoolIds": [
|
||||
"651940-uniswap_v3-wall-ausdt"
|
||||
],
|
||||
"enhancedRouterProviderStatus": "disabled_adapter_quote_compatibility_pending"
|
||||
},
|
||||
{
|
||||
"name": "DODO PMM",
|
||||
"family": "dodo_pmm",
|
||||
"status": "env_placeholder_only",
|
||||
"factoryAddress": null,
|
||||
"routerAddress": null,
|
||||
"status": "partial_live_inventory_published",
|
||||
"factoryAddress": "0x8a3403aef8d40c0F4AfaF6Dc2000A537EbC863c2",
|
||||
"routerAddress": "0x8528E268F3b8C94208d09D131ACa3Ea93Bad57c7",
|
||||
"inventoryRef": "config/all-mainnet-pool-creation-matrix.json",
|
||||
"publishedPoolIds": [
|
||||
"651940-dodo_pmm-wall-ausdc",
|
||||
"651940-dodo_pmm-wall-ausdt"
|
||||
],
|
||||
"notes": [
|
||||
"Mentioned in docs as placeholder-only for ALL Mainnet.",
|
||||
"No committed DODO PMM pool inventory is currently published for chain 651940."
|
||||
"DVM factory, DVM factory adapter, integration/router, and DODO PMM pool addresses are committed in config/all-mainnet-pool-creation-matrix.json; WALL/AUSDT supersedes the earlier WALL/USDT row for canonical spend routing.",
|
||||
"Required spend rows remain gated until vault assignments, funding, live reserve reads, and canary evidence are recorded."
|
||||
],
|
||||
"disabledPoolIds": [
|
||||
"651940-dodo_pmm-wall-usdt"
|
||||
]
|
||||
}
|
||||
],
|
||||
@@ -154,8 +965,9 @@
|
||||
]
|
||||
},
|
||||
"nextTasks": [
|
||||
"Publish real same-chain pool inventory before promoting ALL Mainnet beyond bridge-live inventory.",
|
||||
"Commit canonical factory/router metadata once HYDX or AlltraDEX routing addresses are confirmed.",
|
||||
"Add pool-level addresses and verification artifacts before enabling public route generation from this protocol surface."
|
||||
"Keep the disabled WALL/USDT row historical-only unless explicitly re-approved for USDT routing.",
|
||||
"Commit canonical factory/router metadata once a HYDX-native routing address is confirmed.",
|
||||
"Deploy or import canonical Uniswap V3 factory/router/quoter/pool inventory before enabling the ALL Mainnet Uniswap V3 adapter.",
|
||||
"Add pool-level addresses and verification artifacts before enabling public route generation from disabled optional protocol providers."
|
||||
]
|
||||
}
|
||||
|
||||
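The registry entries above repeat the same `settlementMetadata` shape per token. A small consistency check can catch drift between entries, for example a token marked non-redeemable on one field but not the other. This is a hedged sketch (the `check_settlement` helper is hypothetical, not a committed repo script); the entry dict mirrors the JSON shape in this diff:

```python
# Hypothetical consistency check over registry entries shaped like the JSON above.
def check_settlement(entry: dict) -> list:
    """Return a list of consistency problems for one token registry entry."""
    problems = []
    s = entry.get("settlementMetadata", {})
    if not s.get("settlementCurrency"):
        problems.append("missing settlementCurrency")
    # Non-redeemable tokens must mark both redemption fields consistently.
    path_na = s.get("redemptionPath") == "not-applicable"
    par_na = s.get("parRedemption") == "not-applicable"
    if path_na != par_na:
        problems.append("redemptionPath/parRedemption mismatch")
    return problems

# Mirrors the HYDX entry in the diff.
hydx = {
    "settlementMetadata": {
        "settlementCurrency": "HYDX",
        "redemptionPath": "not-applicable",
        "parRedemption": "not-applicable",
    }
}
print(check_settlement(hydx))  # → []
```

The same check applied across all four entries in this hunk (HYDX, HYBX, CHT, AUDA) would pass, since each pairs `not-applicable` on both fields.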
13 config/gitea-workflow-templates/README.md Normal file
@@ -0,0 +1,13 @@
# Gitea Actions workflow templates

Copy one of these into **your repo** as `.gitea/workflows/<workflow-name>.yml`, then set repo **Secrets** in Gitea (`PHOENIX_DEPLOY_URL`, `PHOENIX_DEPLOY_TOKEN`).

| Template | Use when |
|----------|----------|
| [`deploy-via-phoenix-api.yml`](deploy-via-phoenix-api.yml) | App/service with a row in `phoenix-deploy-api/deploy-targets.json` |
| [`validate-only.yml`](validate-only.yml) | Libraries/docs — CI gate only, no VM deploy |
| **[`repos/`](repos/README.md)** | **Concrete YAML** for DBIS, CROMERO, CurrenciCombo — copy into those Gitea repos |

See [docs/04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md](../../docs/04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md) for repo ↔ VM mapping.

**Operator checklist:** [docs/00-meta/GITEA_CD_OPERATOR_CHECKLIST.md](../../docs/00-meta/GITEA_CD_OPERATOR_CHECKLIST.md).
30 config/gitea-workflow-templates/deploy-via-phoenix-api.yml Normal file
@@ -0,0 +1,30 @@
# Template — copy to YOUR_REPO/.gitea/workflows/<name>.yml and replace placeholders.
# Secrets (repo settings): PHOENIX_DEPLOY_URL, PHOENIX_DEPLOY_TOKEN
name: Deploy via Phoenix API

on:
  push:
    branches: [main, master]
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Trigger Phoenix deployment
        env:
          PHOENIX_DEPLOY_URL: ${{ secrets.PHOENIX_DEPLOY_URL }}
          PHOENIX_DEPLOY_TOKEN: ${{ secrets.PHOENIX_DEPLOY_TOKEN }}
          TARGET: default
        run: |
          set -euo pipefail
          SHA="$(git rev-parse HEAD)"
          BRANCH="$(git rev-parse --abbrev-ref HEAD)"
          REPO="${{ gitea.repository }}"
          curl -sSf -X POST "${PHOENIX_DEPLOY_URL}" \
            -H "Authorization: Bearer ${PHOENIX_DEPLOY_TOKEN}" \
            -H "Content-Type: application/json" \
            -d "{\"repo\":\"${REPO}\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"${TARGET}\"}"
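The deploy step in the template above builds its JSON body by string interpolation from git metadata. Before wiring real secrets, the body can be constructed and inspected locally without touching the Phoenix API. A sketch with stand-in values (the real workflow substitutes `${{ gitea.repository }}` and `git rev-parse` output; the commented curl is the actual call):

```shell
#!/usr/bin/env bash
# Sketch: build the same JSON body the workflow's curl step sends,
# without calling the Phoenix API. Values are stand-ins for workflow context.
set -euo pipefail
REPO="d-bis/example"   # workflow uses ${{ gitea.repository }}
SHA="abc123"           # workflow uses $(git rev-parse HEAD)
BRANCH="master"        # workflow uses $(git rev-parse --abbrev-ref HEAD)
TARGET="default"
PAYLOAD="{\"repo\":\"${REPO}\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"${TARGET}\"}"
echo "${PAYLOAD}"
# Real call (requires repo secrets PHOENIX_DEPLOY_URL / PHOENIX_DEPLOY_TOKEN):
# curl -sSf -X POST "${PHOENIX_DEPLOY_URL}" \
#   -H "Authorization: Bearer ${PHOENIX_DEPLOY_TOKEN}" \
#   -H "Content-Type: application/json" \
#   -d "${PAYLOAD}"
```

Echoing the payload first makes it easy to confirm the four fields the deploy API expects (`repo`, `sha`, `branch`, `target`) before the token is ever used.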
14 config/gitea-workflow-templates/repos/README.md Normal file
@@ -0,0 +1,14 @@
# Ready-to-copy workflows (repo-specific)

Copy the matching file into **that** Gitea repo as `.gitea/workflows/<name>.yml`, then set secrets **`PHOENIX_DEPLOY_URL`**, **`PHOENIX_DEPLOY_TOKEN`**.

| File | Gitea `repo` | `target` | Notes |
|------|----------------|----------|--------|
| [`dbis-portal-live.yml`](dbis-portal-live.yml) | `Gov_Web_Portals/DBIS` | `dbis-portal-live` | CT 7804 portal |
| [`cromero-default.yml`](cromero-default.yml) | `d-bis/CROMERO` | `default` | NPM ecosystem build |
| [`currencicombo-default.yml`](currencicombo-default.yml) | `d-bis/CurrenciCombo` | `default` | Phoenix CT 8604 |
| — | `d-bis/explorer-monorepo` | `explorer-live` | Already in **explorer-monorepo** submodule: `.gitea/workflows/deploy-live.yml` |
| — | `Gov_Web_Portals/CyberSecur-Global` | `default` | In **CyberSecur-Global** repo: `.gitea/workflows/deploy-to-ct7810.yml` |
| — | `d-bis/cross-chain-pmm-lps` | _(validate only)_ | `.gitea/workflows/validate-capital-efficiency.yml` |

`d-bis/proxmox` uses monorepo workflows in-repo (no copy from here).
23 config/gitea-workflow-templates/repos/cromero-default.yml Normal file
@@ -0,0 +1,23 @@
# Copy to d-bis/CROMERO → .gitea/workflows/deploy-via-phoenix.yml
# Secrets: PHOENIX_DEPLOY_URL, PHOENIX_DEPLOY_TOKEN
name: Deploy CROMERO (Phoenix)

on:
  push:
    branches: [main, master]
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Trigger Phoenix deployment
        run: |
          SHA="$(git rev-parse HEAD)"
          BRANCH="$(git rev-parse --abbrev-ref HEAD)"
          curl -sSf -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
            -H "Authorization: Bearer ${{ secrets.PHOENIX_DEPLOY_TOKEN }}" \
            -H "Content-Type: application/json" \
            -d "{\"repo\":\"d-bis/CROMERO\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"default\"}"
@@ -0,0 +1,23 @@
# Copy to d-bis/CurrenciCombo → .gitea/workflows/deploy-via-phoenix.yml
# Secrets: PHOENIX_DEPLOY_URL, PHOENIX_DEPLOY_TOKEN
name: Deploy CurrenciCombo (Phoenix)

on:
  push:
    branches: [main, master]
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Trigger Phoenix deployment
        run: |
          SHA="$(git rev-parse HEAD)"
          BRANCH="$(git rev-parse --abbrev-ref HEAD)"
          curl -sSf -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
            -H "Authorization: Bearer ${{ secrets.PHOENIX_DEPLOY_TOKEN }}" \
            -H "Content-Type: application/json" \
            -d "{\"repo\":\"d-bis/CurrenciCombo\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"default\"}"
23 config/gitea-workflow-templates/repos/dbis-portal-live.yml Normal file
@@ -0,0 +1,23 @@
# Copy to Gov_Web_Portals/DBIS → .gitea/workflows/deploy-portal-live.yml
# Secrets: PHOENIX_DEPLOY_URL, PHOENIX_DEPLOY_TOKEN
name: Deploy DBIS portal (Phoenix)

on:
  push:
    branches: [main]
  workflow_dispatch:

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Trigger Phoenix deployment
        run: |
          SHA="$(git rev-parse HEAD)"
          BRANCH="$(git rev-parse --abbrev-ref HEAD)"
          curl -sSf -X POST "${{ secrets.PHOENIX_DEPLOY_URL }}" \
            -H "Authorization: Bearer ${{ secrets.PHOENIX_DEPLOY_TOKEN }}" \
            -H "Content-Type: application/json" \
            -d "{\"repo\":\"Gov_Web_Portals/DBIS\",\"sha\":\"${SHA}\",\"branch\":\"${BRANCH}\",\"target\":\"dbis-portal-live\"}"
18 config/gitea-workflow-templates/validate-only.yml Normal file
@@ -0,0 +1,18 @@
# Template — copy to YOUR_REPO/.gitea/workflows/validate.yml — adjust run steps.
name: Validate

on:
  push:
    branches: [main, master]
  pull_request:

jobs:
  validate:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Placeholder validation
        run: |
          echo "Replace this step with repo-specific checks (e.g. npm test, forge test)."
Submodule cross-chain-pmm-lps updated: 1cf845cb3a...f8593b905f
29 docs/00-meta/GITEA_CD_OPERATOR_CHECKLIST.md Normal file
@@ -0,0 +1,29 @@
# Gitea CD/CI — operator checklist

Use this after changing **`phoenix-deploy-api/deploy-targets.json`** or adding workflows under **`config/gitea-workflow-templates/`**.

## One-time per application repo (on Gitea)

1. **Actions enabled** for the org/repo (Gitea settings).
2. **Secrets** on **that repo** (not only global):
   - **`PHOENIX_DEPLOY_URL`** — full URL for `POST` (same shape as **`d-bis/proxmox`** workflows use), typically `http://<dev-vm>:4001/api/deploy` or HTTPS equivalent.
   - **`PHOENIX_DEPLOY_TOKEN`** — bearer token accepted by Phoenix deploy API.
3. **Workflow file** in the repo: copy from [`config/gitea-workflow-templates/repos/README.md`](../../config/gitea-workflow-templates/repos/README.md) or use the repo’s existing `.gitea/workflows/*.yml`.

## Phoenix deploy host (LAN)

1. **`git pull`** **proxmox** so **`deploy-targets.json`** and **`scripts/deployment/phoenix-deploy-*.sh`** match Gitea **`d-bis/proxmox`** `master` / `main`.
2. Restart or reinstall **phoenix-deploy-api** if you manage it via systemd (see **`phoenix-deploy-api/scripts/install-systemd.sh`**).
3. **`GITEA_TOKEN`** on that host must allow archive fetch for repos you deploy.

## Verify locally (proxmox clone)

```bash
bash scripts/validation/validate-phoenix-deploy-targets.sh phoenix-deploy-api/deploy-targets.json
bash scripts/verify/report-gitea-cd-parity.sh
```

## Canonical references

- [GITEA_REPO_VM_CD_CI_MATRIX.md](../04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md)
- [config/gitea-workflow-templates/README.md](../../config/gitea-workflow-templates/README.md)
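The checklist validates `deploy-targets.json` via the repo's own script. As an illustration of the kind of invariant such a validator can enforce (the real schema lives in `phoenix-deploy-api/deploy-targets.json` and may differ; the sample rows below merely mirror the repo/target pairs in the tables above), a minimal sketch:

```python
import json

# Hypothetical sample mirroring repo/target pairs from the tables above;
# the real deploy-targets.json schema may differ.
sample = json.loads("""
[
  {"repo": "Gov_Web_Portals/DBIS", "target": "dbis-portal-live"},
  {"repo": "d-bis/CROMERO", "target": "default"},
  {"repo": "d-bis/CurrenciCombo", "target": "default"}
]
""")

def validate_targets(rows):
    """Every row must carry an org/name-shaped repo and a non-empty target id."""
    problems = []
    for i, row in enumerate(rows):
        if "/" not in row.get("repo", ""):
            problems.append(f"row {i}: repo must look like org/name")
        if not row.get("target"):
            problems.append(f"row {i}: missing target")
    return problems

print(validate_targets(sample))  # → []
```

A check like this fails fast before a workflow push, which is cheaper than discovering a malformed row when the deploy API rejects the request.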
@@ -1,9 +1,9 @@
|
||||
# Next Steps — Index
|
||||
|
||||
**Last Updated:** 2026-04-23
|
||||
**Last Updated:** 2026-04-29
|
||||
**Purpose:** Single entry point for "what to do next." Pick by audience and granularity.
|
||||
|
||||
**Latest automation run (2026-04-23):** `./scripts/run-completable-tasks-from-anywhere.sh --dry-run --json-out reports/status/run-completable-tasks-latest.json` completed and `bash scripts/verify/run-all-validation.sh --skip-genesis --json-out reports/status/run-all-validation-latest.json` passed, refreshing the current 61/61 on-chain-aware no-LAN flow plus advisory Solana/Tron/XRPL status. `./scripts/run-all-operator-tasks-from-lan.sh --skip-backup` remains the LAN/operator follow-on when secrets and host access are available. **Besu node lists:** push canonical `config/besu-node-lists/*` with `bash scripts/deploy-besu-node-lists-to-all.sh`; reload with `bash scripts/besu/restart-besu-reload-node-lists.sh` during a maintenance window if peers do not pick up static nodes without restart.
|
||||
**Latest automation run (2026-04-29):** `./scripts/run-completable-tasks-from-anywhere.sh --json-out reports/status/run-completable-tasks-latest.json` (config + 61/61 on-chain + validation + non-EVM + reconcile-env). **`./scripts/run-all-operator-tasks-from-lan.sh --skip-backup --json-out reports/status/run-all-operator-tasks-latest.json`** (NPMplus + Blockscout verify). **`./scripts/deployment/run-all-next-steps-chain138.sh --skip-mirror --skip-mesh --skip-register-gru --json-out reports/status/run-all-next-steps-chain138-latest.json`** (preflight + 61/61 verify). **`./scripts/deployment/run-cw-remaining-steps.sh --verify`** (cW* MINTER/BURNER vs CW_BRIDGE_* on configured chains). **`./scripts/run-e2e-flow-tasks-full-parallel.sh --dry-run --json-out reports/status/run-e2e-flow-tasks-latest.json`**. Wrapper scripts are `chmod +x` for `run-completable-tasks-from-anywhere.sh` and `run-all-operator-tasks-from-lan.sh`. **Still external / capital-gated:** Trust/Ledger PRs, CRO/WEMIX CCIP, deep mainnet UniV2 cWUSDC/USDC TVL, HYBX 4.995 zip, NPMplus backup when `NPM_PASSWORD` unset. **Besu node lists:** push canonical `config/besu-node-lists/*` with `bash scripts/deploy-besu-node-lists-to-all.sh`; reload with `bash scripts/besu/restart-besu-reload-node-lists.sh` during a maintenance window if peers do not pick up static nodes without restart.
|
||||
|
||||
**Documentation index:** [../MASTER_INDEX.md](../MASTER_INDEX.md) — canonical docs, deprecated list, and navigation.
|
||||
**Repo-local recommendation tracker:** [REPO_LOCAL_RECOMMENDATIONS_STATUS.md](REPO_LOCAL_RECOMMENDATIONS_STATUS.md) — current slice of recommendations that can be advanced directly in this workspace.
|
||||
@@ -16,7 +16,7 @@
|
||||
|
||||
| # | Action | Command / doc | Status |
|
||||
|---|--------|----------------|--------|
|
||||
| 1 | From anywhere: config + on-chain + validation | `./scripts/run-completable-tasks-from-anywhere.sh [--json-out reports/status/run-completable-tasks-latest.json]` | Done 2026-04-23 |
|
||||
| 1 | From anywhere: config + on-chain + validation | `./scripts/run-completable-tasks-from-anywhere.sh [--json-out reports/status/run-completable-tasks-latest.json]` | Done 2026-04-28 |
|
||||
| 2 | Before Chain 138 deploy: preflight (RPC, dotenv, nonce, cost) | `./scripts/deployment/preflight-chain138-deploy.sh [--cost]` | Done 2026-03-02 |
|
||||
| 3 | **Chain 138 next steps (all in one):** preflight → mirror+pool → register c* as GRU → verify | `./scripts/deployment/run-all-next-steps-chain138.sh [--dry-run] [--skip-mirror] [--skip-register-gru] [--skip-verify]` | Done 2026-03-02 |
|
||||
| 4 | Full deployment order (Phase 0–6) | [DEPLOYMENT_ORDER_OF_OPERATIONS.md](../03-deployment/DEPLOYMENT_ORDER_OF_OPERATIONS.md) | Remaining (Operator) |
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
# TODOs — Consolidated Task List
|
||||
|
||||
**Last Updated:** 2026-04-23
|
||||
**Last verification run:** 2026-03-28 — completable ✅ (61/61 on-chain), operator `--skip-backup` ✅ (NPMplus 40 hosts updated, Blockscout verify batch). Prior 2026-03-06 run: validate-config ✅, check-contracts, PMM pool balances ✅ (Pool 1: 2M/2M), preflight ✅, token-aggregation build ✅, E2E routing ✅ (37 domains, 0 failed). **Mint + add-liquidity** 2026-03-06: 1M each minted, 500k each added. **Next-steps check:** [NEXT_STEPS_LIST.md](NEXT_STEPS_LIST.md); B.1/B.2/B.3 partially blocked (WEMIX tabled; LINK relay runbook pending).
|
||||
**Last Updated:** 2026-04-29
|
||||
**Last verification run:** 2026-04-29 — completable ✅ (61/61 on-chain, ALL Mainnet CI gates), operator `--skip-backup` ✅ (NPMplus + Blockscout verify), **`run-all-next-steps-chain138.sh`** ✅ (preflight + verify; mirror/mesh/GRU skipped as already applied), **`run-cw-remaining-steps.sh --verify`** ✅, **E2E full-parallel** ✅ (dry-run + JSON). Prior 2026-04-28 snapshot remains for historical detail. Prior 2026-03-06 run: validate-config ✅, check-contracts, PMM pool balances ✅ (Pool 1: 2M/2M), preflight ✅, token-aggregation build ✅, E2E routing ✅ (37 domains, 0 failed). **Mint + add-liquidity** 2026-03-06: 1M each minted, 500k each added. **Next-steps check:** [NEXT_STEPS_LIST.md](NEXT_STEPS_LIST.md); B.1/B.2/B.3 partially blocked (WEMIX tabled; LINK relay runbook pending).

**Purpose:** Single checklist of all next steps and remaining tasks. **Indonesia / HYBX-BATCH-001 zip (4.995 ship-ready):** [HYBX-BATCH-001 — transaction package ship-ready](#hybx-batch-001--transaction-package-ship-ready-4995) below. **Full execution order (multiple routes + liquidity):** [EXECUTION_CHECKLIST_MULTIPLE_ROUTES_AND_LIQUIDITY.md](EXECUTION_CHECKLIST_MULTIPLE_ROUTES_AND_LIQUIDITY.md). **Additional paths (registry, LiFi/Jumper, Etherlink, 13×13):** [ADDITIONAL_PATHS_AND_EXTENSIONS.md](../04-configuration/ADDITIONAL_PATHS_AND_EXTENSIONS.md). **Dotenv/markdown audit (required info, gaps, recommendations):** [DOTENV_AND_MARKDOWN_AUDIT_GAPS_AND_RECOMMENDATIONS.md](DOTENV_AND_MARKDOWN_AUDIT_GAPS_AND_RECOMMENDATIONS.md). Source of truth for the full list: [NEXT_STEPS_AND_REMAINING_TODOS.md](NEXT_STEPS_AND_REMAINING_TODOS.md). **Token deployments remaining:** [TOKEN_CONTRACT_DEPLOYMENTS_REMAINING.md](../11-references/TOKEN_CONTRACT_DEPLOYMENTS_REMAINING.md). **Routing / swap / cross-chain:** [TASKS_ROUTING_SWAP_CROSSCHAIN.md](TASKS_ROUTING_SWAP_CROSSCHAIN.md) (A1–A5, B1–B8, C1–C8, D1–D3, E1–E2). **Verified list (LAN/Operator):** [REQUIRED_FIXES_GAPS_AND_DEPLOYMENTS_LIST.md](REQUIRED_FIXES_GAPS_AND_DEPLOYMENTS_LIST.md) — run bash/curl to confirm; doc updated 2026-03-03.

**Quick run:** From anywhere (no LAN): `./scripts/run-completable-tasks-from-anywhere.sh [--json-out reports/status/run-completable-tasks-latest.json]`. Before Chain 138 deploy: `./scripts/deployment/preflight-chain138-deploy.sh [--cost]`. **Chain 138 next steps (all in one):** `./scripts/deployment/run-all-next-steps-chain138.sh [--dry-run] [--skip-mirror] [--skip-register-gru] [--skip-verify] [--json-out reports/status/run-all-next-steps-chain138-latest.json]` — preflight → mirror+pool → register c* as GRU → verify. From LAN with secrets: `./scripts/run-all-operator-tasks-from-lan.sh [--deploy] [--create-vms] [--json-out reports/status/run-all-operator-tasks-latest.json]`. **E2E flows (full parallel):** `./scripts/run-e2e-flow-tasks-full-parallel.sh [--dry-run] [--json-out reports/status/run-e2e-flow-tasks-latest.json]` — [TASKS_TO_INCREASE_ALL_E2E_FLOWS](TASKS_TO_INCREASE_ALL_E2E_FLOWS.md).

@@ -0,0 +1,282 @@

# Government Treasury, EMI, Digital Wallet and Regulated Settlement Master Plan

**Last updated:** 2026-04-28
**Audience:** Program owners, legal/compliance, treasury and banking ops, architecture, engineering
**Purpose:** Single umbrella plan for integrating **Electronic Money Institutions (EMIs)**, **digital wallets**, **virtual accounts** (including vendor patterns such as Tatum Virtual Accounts), **government treasuries**, **central banks / RTGS**, **fully licensed participants**, and **DBIS on-chain settlement and liquidity**—without conflating regulated fiat finality with blockchain authorization or DeFi-style liquidity.

**Non-goal:** This document is not legal advice. Counsel owns statute interpretation; this frames **artifacts**, **roles**, **system boundaries**, and **implementation gates**.

---
## 1. Executive Summary

DBIS already separates concerns correctly at the architectural core:

- **Regulated domain:** Fiat/e-money finality, accounting, sanctions/AML, institutional onboarding, evidence vaults, OMNL/Fineract postings, ISO-20022 evidence bundles.
- **Chain 138 domain:** Authorization integrity, participant/signer policy, replay protection, immutable settlement references, GRU mint gating—**not** “bank decides finality on-chain.”

See [DBIS_RAIL_TECHNICAL_SPEC_V1.md](../dbis-rail/DBIS_RAIL_TECHNICAL_SPEC_V1.md) §0–§5 (design principle: *the chain never decides fiat finality*).

This master plan:

1. Places **EMIs**, **virtual accounts**, and **wallet APIs** in the regulated + ledger layers, with explicit mapping to URA families and policy profiles.
2. Treats **Tatum-style virtual accounts** as an **optional vendor pattern** for off-chain crypto/fiat ledgers parallel to public chains; **Chain 138** remains **custom-RPC / self-hosted** per [smom-dbis-138/docs/api/TATUM_SDK.md](../../smom-dbis-138/docs/api/TATUM_SDK.md).
3. Aligns **government treasury** and **central-bank-grade** narratives with [DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md](../03-deployment/DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md) truth: many institutional rows are **Partial** or **Planned**—the master plan labels gaps by owner type (counsel, implementation, operator, vendor).
4. Preserves **liquidity honesty**: [config/allmainnet-non-dodo-protocol-surface.json](../../config/allmainnet-non-dodo-protocol-surface.json) explicitly distinguishes bridge-live status from **same-chain swap inventory**—regulated claims must not treat pending DEX inventory as institutional liquidity.

---
## 2. Source-of-Truth Hierarchy

Per [DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md](DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md) §1, when artifacts disagree:

| Priority | Kind | Examples |
|----------|------|----------|
| 1 | Machine-readable config + trackers | `config/universal-resource-activation/manifest.json`, `config/jurisdictions/catalog.v1.json`, pool matrices, deployment-status JSON |
| 2 | Validation / implementation scripts | `scripts/verify/*`, `pnpm ura:*`, forge scoped tests |
| 3 | Specialized canonical docs | DBIS Rail specs, RTGS matrix, onboarding charter |
| 4 | Older narrative | Historical plans; use only if reconciled |

**Regulatory vs technical claims:** A statement may be “true in policy design” (Rail rulebook) but **not yet Complete** in [DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md](../03-deployment/DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md). External communications must distinguish **intent** from **production gate**.

---
## 3. Participant and Legal Taxonomy

Use consistent labels across onboarding, URA `ownerParticipantId`, DBIS Rail participant registry, and compliance matrices.

| Role | Typical licenses / regimes | DBIS alignment |
|------|----------------------------|----------------|
| **Government treasury** | Sovereign issuer / fiscal agent rules | Institution + jurisdiction-specific matrix rows; OMNL/treasury accounts |
| **Central bank / RTGS** | Central banking law, RTGS participation | Off-chain finality + ISO evidence; not “RTGS on Chain 138” unless contractually true |
| **Commercial bank** | Banking license, deposit-taking | FI participant; nostro/vostro; safeguarding vs deposits per jurisdiction |
| **EMI / E-money issuer** | EU EMI, UK EMI, analogous | `FIAT_DIGITAL`, safeguarding ledger, virtual IBAN patterns |
| **Payment institution** | PSD2-style, MSB-adjacent | Payment initiation / execution; evidence for good funds |
| **MSB / money transmitter** | FinCEN state overlays | MSB participant class in Rail spec |
| **CASP / VASP** | MiCA, national crypto regimes | Policy profiles for transferable vs restricted tokens |
| **Custodian / CSD** | Custody, CSD regulation | `SKR_SAFEKEEPING`, depository model in RTGS docs |
| **Wallet / tech provider** | Contractual + outsourcing | Not issuer of money unless licensed; keys + API custody boundaries |
| **Liquidity provider / PMM** | Market conduct, licensing per venue | PMM inventory **outside** customer e-money perimeter unless proven |

Definitions for **institution**, **jurisdiction**, **policy profile**, **complete**: [INSTITUTION_ONBOARDING_CHARTER.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_CHARTER.md).

---
## 4. Money Model

### 4.1 Layers of money (conceptual)

| Layer | Examples | System-of-record |
|-------|----------|-------------------|
| Sovereign | CBDC, reserves at central bank | RTGS / CBDC operator |
| Bank money | Deposits, settlement balances | Bank core / correspondent |
| E-money | EMI-issued redeemable electronic money | EMI safeguarding + ledger |
| Ledger balances | Virtual accounts, app wallets | Operator ledger + reconciliation |
| Tokenized claims | Deposit tokens, fiat-backed stablecoins, GRU tiers | Issuer + attestation + chain contracts |
| PMM / DEX inventory | LP positions, pool reserves | **Market-making inventory**—not customer deposits unless segregated |

URA families anchor this: [UNIVERSAL_RESOURCE_ONTOLOGY.md](../04-configuration/universal-resource-activation/UNIVERSAL_RESOURCE_ONTOLOGY.md) (`FIAT_DIGITAL`, `SERVER_FUNDS`, `SKR_SAFEKEEPING`, etc.).

### 4.2 Non-confusion rule

**Customer safeguarded e-money** must never be silently modeled as **AMM inventory**. Treasury execution using PMM must pass **policy**, **limits**, and **segregation** controls documented under Rail + RTGS liquidity sections.

---
## 5. Ledger, Virtual Account, and Wallet Hierarchy

### 5.1 Regulated-domain ledger (target)

- **Omnibus / safeguarding** bank accounts (where jurisdiction requires).
- **Virtual accounts** (customer sub-ledgers): references mapped to OMNL/Fineract **accounts**, deterministic **`accountingRef`**, optional **vIBAN/UETR** correlation—pattern only until frozen with banking partners.
- **ISO-20022** message IDs feeding **MintAuth** (`messageId`, `isoHash`, `accountingRef`) per [DBIS_RAIL_TECHNICAL_SPEC_V1.md](../dbis-rail/DBIS_RAIL_TECHNICAL_SPEC_V1.md).

### 5.2 Virtual account integration (functional requirements)

| Requirement | Notes |
|-------------|--------|
| Single currency per logical pocket | Align with vendor patterns (e.g. Tatum VA: one currency per VA); multi-currency UX via customer grouping |
| Internal transfers | Instant ledger moves; no chain fee; full audit trail |
| Deposit mapping | Blockchain deposit address ↔ VA balance updates where custodial **public** chains use vendor indexing; **Chain 138** requires **self-hosted** indexer or gateway-fed events |
| Withdrawal | Ledger debit → chain payout from **treasury/pooled** on-chain inventory; operator-visible vs customer-visible segregation documented |
| Reconciliation | Daily tie-out: VA sum ↔ omnibus ↔ Chain 138 treasury wallets |
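The daily tie-out in the last row reduces to a three-way balance invariant: sum of virtual-account balances equals the omnibus balance equals the Chain 138 treasury total. A minimal sketch, with illustrative figures in integer minor units (a real run would pull the VA ledger export, the omnibus statement, and on-chain treasury balances):

```shell
# Illustrative tie-out: the three figures below stand in for the VA ledger
# export, the omnibus bank statement, and the Chain 138 treasury wallets.
va_sum=$(printf '1500000\n2500000\n1000000\n' | awk '{s+=$1} END {print s}')
omnibus=5000000
treasury=5000000
if [ "$va_sum" -eq "$omnibus" ] && [ "$omnibus" -eq "$treasury" ]; then
  echo "tie-out OK: $va_sum"
else
  # Any break goes to the exception queue, not silent adjustment.
  echo "tie-out BREAK: va=$va_sum omnibus=$omnibus treasury=$treasury" >&2
  exit 1
fi
```

The point is the invariant, not the plumbing: each leg comes from a different system of record, so a break localizes the discrepancy to one boundary.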
### 5.3 Chain-domain (Chain 138)

- **Operational wallets** for participants (allowlisted where Rail requires).
- **SettlementRouter / GRU** paths—authorization only after off-chain gates.
- **No fiat finality on-chain**—see Rail spec design principle.

### 5.4 Tatum and similar vendors

- **Tatum SDK + custom RPC** on Chain 138: raw JSON-RPC only; cloud Notifications/Data **do not** apply to unsupported/private chains—[TATUM_SDK.md](../../smom-dbis-138/docs/api/TATUM_SDK.md).
- **Tatum Virtual Accounts** (product pattern): off-chain ledger + deposit addresses + periodic sync to chain—see vendor docs (`docs.tatum.io/docs/virtual-accounts`). Access/pricing constraints are vendor-imposed; treat as **integration option** for **supported public chains**, not as Chain 138’s regulated ledger.
- **Alternative:** Self-hosted VA ledger + OMNL as SoR + DBIS Rail MintAuth for token legs.

### 5.5 Wallet API custody tiers

| Tier | Typical stack | Regulatory touch |
|------|----------------|-------------------|
| Non-custodial | User keys | Gateway still does Travel Rule / sanctions as required |
| Custodial hot | Server/HSM | EMI client-money rules, safeguarding |
| MPC / institutional | Fireblocks-class | Custody agreements + attestations |
| Embedded / AA | thirdweb Engine etc. | Policy profiles + sponsor gas + limits |

Refs: [CHAIN138_WALLET_ECOSYSTEM_AND_RATIONALE.md](../04-configuration/CHAIN138_WALLET_ECOSYSTEM_AND_RATIONALE.md), [THIRDWEB_WALLETS_INTEGRATION.md](../04-configuration/THIRDWEB_WALLETS_INTEGRATION.md), [THIRDWEB_ENGINE_CHAIN_OVERRIDES.md](../04-configuration/THIRDWEB_ENGINE_CHAIN_OVERRIDES.md).

---
## 6. ISO, Evidence, and Mint Authorization Flow

End-to-end intent (see Rail technical spec §5):

1. ISO Gateway ingests messages → canonical bundle → `isoHash`, `messageId`.
2. Funds status: `ON_LEDGER_FINAL` vs `OFF_LEDGER_FINAL`.
3. Double-entry accounting → **`accountingRef`**.
4. Compliance gates → threshold signatures → **MintAuth** → SettlementRouter → GRU mint.
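Step 1 hinges on the rendering being canonical: the same bundle must hash to the same `isoHash` on every node. A minimal sketch, assuming a line-oriented key=value rendering (the real canonical bundle format and field set are defined in the Rail technical spec; the values here are made up):

```shell
# Illustrative only: derive a stable isoHash from a byte-exact canonical
# rendering. Field names and values are assumptions, not the Rail format.
message_id="MSGID-2026-0001"
accounting_ref="ACCT-REF-7741"
canonical=$(printf 'messageId=%s\naccountingRef=%s' "$message_id" "$accounting_ref")
iso_hash=$(printf '%s' "$canonical" | sha256sum | awk '{print $1}')
echo "isoHash=$iso_hash"
```

Any whitespace or field-ordering drift in the rendering changes the digest, which is why the canonicalization rules, not the hash function, carry the design weight here.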
Evidence vault, 4.995-style packages, Indonesia pilot: [INDONESIA_PACKAGE_4_995_EVIDENCE_STANDARD.md](../04-configuration/mifos-omnl-central-bank/INDONESIA_PACKAGE_4_995_EVIDENCE_STANDARD.md), [DBIS_RTGS_MASTER_PLAN_IMPLEMENTATION_TRACKER.md](../03-deployment/DBIS_RTGS_MASTER_PLAN_IMPLEMENTATION_TRACKER.md).

---
## 7. Compliance and Licensing Model (EU / UK / US Anchors)

### 7.1 Repository anchors

| Mechanism | Path |
|-----------|------|
| Institution onboarding | [INSTITUTION_ONBOARDING_CHARTER.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_CHARTER.md), [INSTITUTION_ONBOARDING_PLAYBOOK.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_PLAYBOOK.md) |
| Jurisdiction catalog | [JURISDICTION_CATALOG.md](../04-configuration/jurisdictions/JURISDICTION_CATALOG.md), [config/jurisdictions/catalog.v1.json](../../config/jurisdictions/catalog.v1.json) |
| Policy profiles | [UNIVERSAL_RESOURCE_POLICY_PROFILES.md](../04-configuration/universal-resource-activation/UNIVERSAL_RESOURCE_POLICY_PROFILES.md), [policy-profiles.json](../../config/universal-resource-activation/policy-profiles.json) |
| Rail controls | [DBIS_RAIL_JURISDICTION_TRACEABILITY.md](../dbis-rail/DBIS_RAIL_JURISDICTION_TRACEABILITY.md), [DBIS_RAIL_CONTROL_MAPPING_V1.md](../dbis-rail/DBIS_RAIL_CONTROL_MAPPING_V1.md) |
| Stablecoin / conversion policy | [DBIS_RAIL_STABLECOIN_POLICY_V1_5.md](../dbis-rail/DBIS_RAIL_STABLECOIN_POLICY_V1_5.md), [DBIS_RAIL_CONVERSION_ROUTER_SPEC_V1_5.md](../dbis-rail/DBIS_RAIL_CONVERSION_ROUTER_SPEC_V1_5.md) |

### 7.2 Jurisdiction expansion (gap)

Slice-1 charter expects **Indonesia** pilot matrix + stubs; **EU/UK/US** banking matrices must be extended beyond stubs for anchor claims—implementation task for compliance + counsel ([INSTITUTION_ONBOARDING_CHARTER.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_CHARTER.md) exit criteria).

### 7.3 External regime pointers (non-canonical; counsel verifies)

- **EU:** MiCA (ART/EMT), PSD2/e-money frameworks for payment vs issuance—map obligations into compliance matrices.
- **UK:** FCA/BoE stablecoin and payments agenda—monitor regulator publications (e.g. sandbox cohorts for issuance experiments).
- **US:** Money transmission, BSA/AML, sponsor-bank models, federal/state stablecoin developments—matrix rows per activity.

### 7.4 Counsel sign-off points

- First marketing claim implying **national RTGS participation**, **CBDC**, or **government guarantee**.
- Any **Travel Rule** / **data residency** cross-border flow.
- Token taxonomy for **retail** vs **wholesale** and **security-like** instruments (`RESTRICTED_SECURITY` in ontology).

---
## 8. Liquidity, PMM, Bridges, and Market Integrity

- **Chain 138 PMM / routing:** [PMM_DEX_ROUTING_STATUS.md](../11-references/PMM_DEX_ROUTING_STATUS.md), [DEPLOYED_TOKENS_BRIDGES_LPS_AND_ROUTING_STATUS.md](../11-references/DEPLOYED_TOKENS_BRIDGES_LPS_AND_ROUTING_STATUS.md).
- **Route confidence / policy-aware quoting:** Not yet first-class in public quote APIs—see baseline status in [DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md](DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md) (DODO PMM / routing workstream). Do not imply regulator-grade route selection from wallet or aggregator UX alone.
- **Cross-chain PMM graph:** `cross-chain-pmm-lps/config/deployment-status.json` (home chain 138).
- **ALL Mainnet:** [allmainnet-non-dodo-protocol-surface.json](../../config/allmainnet-non-dodo-protocol-surface.json)—**bridge live** does not imply **swap inventory published** (`sameChainSwapInventoryPublished` remains **`false`** until promoted); submodule doc [smom-dbis-138/docs/deployment/ALL_MAINNET_CONFIGURATION.md](../../smom-dbis-138/docs/deployment/ALL_MAINNET_CONFIGURATION.md) must stay aligned with this file.
- **Pool lifecycle:** [all-mainnet-pool-creation-matrix.json](../../config/all-mainnet-pool-creation-matrix.json)—operational gates vs regulated settlement.

**Rule:** PMM LP inventory is **treasury/market** risk unless explicitly structured as **customer-segregated** with legal and operational proof.
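The `sameChainSwapInventoryPublished` flag lends itself to a mechanical guard. The key name and file path come from the surface JSON cited above; the guard script itself is an illustrative sketch, not one of the committed CI checks:

```shell
# Illustrative guard: abort any release step that would claim same-chain
# swap liquidity while the surface file still marks it unpublished.
surface="config/allmainnet-non-dodo-protocol-surface.json"
if grep -q '"sameChainSwapInventoryPublished": false' "$surface" 2>/dev/null; then
  echo "swap inventory not published: do not claim same-chain liquidity" >&2
  exit 1
fi
echo "surface flag clear (or file absent); defer to the committed CI checks"
```

The committed equivalents are the `check-allmainnet-*` scripts wired through `validate-config-files.sh`; this sketch only shows where the flag would bite.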
---
## 9. Artifact Mapping (Master Plan Section → Canonical Repo Files)

| Master plan topic | Primary artifacts |
|-------------------|-------------------|
| Fiat finality vs chain | [DBIS_RAIL_TECHNICAL_SPEC_V1.md](../dbis-rail/DBIS_RAIL_TECHNICAL_SPEC_V1.md), [DBIS_RAIL_RULEBOOK_V1.md](../dbis-rail/DBIS_RAIL_RULEBOOK_V1.md), [DBIS_RAIL_REGULATOR_BRIEF_V1.md](../dbis-rail/DBIS_RAIL_REGULATOR_BRIEF_V1.md) |
| RTGS / OMNL / sidecars | [DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md](../03-deployment/DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md), [DBIS_RTGS_FIRST_SLICE_ARCHITECTURE.md](../03-deployment/DBIS_RTGS_FIRST_SLICE_ARCHITECTURE.md), [DBIS_HYBX_SIDECAR_BOUNDARY_MATRIX.md](../03-deployment/DBIS_HYBX_SIDECAR_BOUNDARY_MATRIX.md) |
| Institution onboarding | [INSTITUTION_ONBOARDING_CHARTER.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_CHARTER.md), [INSTITUTION_ONBOARDING_PLAYBOOK.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_PLAYBOOK.md) |
| URA / ontology | [UNIVERSAL_RESOURCE_ONTOLOGY.md](../04-configuration/universal-resource-activation/UNIVERSAL_RESOURCE_ONTOLOGY.md), [UNIVERSAL_RESOURCE_SERVER_FUNDS_LANE.md](../04-configuration/universal-resource-activation/UNIVERSAL_RESOURCE_SERVER_FUNDS_LANE.md), [manifest.json](../../config/universal-resource-activation/manifest.json) |
| Chain 138 wallets / APIs | [TATUM_SDK.md](../../smom-dbis-138/docs/api/TATUM_SDK.md), [CHAIN138_WALLET_ECOSYSTEM_AND_RATIONALE.md](../04-configuration/CHAIN138_WALLET_ECOSYSTEM_AND_RATIONALE.md), [THIRDWEB_ENGINE_CHAIN_OVERRIDES.md](../04-configuration/THIRDWEB_ENGINE_CHAIN_OVERRIDES.md) |
| Token / explorer truth | [EXPLORER_TOKEN_LIST_CROSSCHECK.md](../11-references/EXPLORER_TOKEN_LIST_CROSSCHECK.md), [ADDRESS_MATRIX_AND_STATUS.md](../11-references/ADDRESS_MATRIX_AND_STATUS.md) |
| E-money / ISO execution hooks (contracts + runbook) | [MULTI_CHAIN_EXECUTION_ISO20022_EMONEY.md](../runbooks/MULTI_CHAIN_EXECUTION_ISO20022_EMONEY.md) |
| GRU M1 instruments, listings, disclosure framing | [GRU_M1_MASTER_IMPLEMENTATION_PLAN.md](../gru-m1/GRU_M1_MASTER_IMPLEMENTATION_PLAN.md), [GRU_M1_LISTING_VALIDATION.md](../compliance/GRU_M1_LISTING_VALIDATION.md) |
| Identity stack vs RTGS / Travel Rule scale | [DBIS_HYPERLEDGER_IDENTITY_STACK_DECISION.md](../03-deployment/DBIS_HYPERLEDGER_IDENTITY_STACK_DECISION.md), [DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md](../03-deployment/DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md) (Aries / AnonCreds rows) |
| Explorer UI legal templates (non-canonical vs Rail) | [LEGAL_COMPLIANCE_REQUIREMENTS.md](../../explorer-monorepo/docs/LEGAL_COMPLIANCE_REQUIREMENTS.md) — harmonize marketing/legal copy with [DBIS_RAIL_RULEBOOK_V1.md](../dbis-rail/DBIS_RAIL_RULEBOOK_V1.md) / counsel; not a substitute for Rail regulator brief |
| Public sector / credentials | [PUBLIC_SECTOR_LIVE_DEPLOYMENT_CHECKLIST.md](../03-deployment/PUBLIC_SECTOR_LIVE_DEPLOYMENT_CHECKLIST.md), [COMPLETE_CREDENTIAL_EIDAS_PROGRAM_REPOS.md](../11-references/COMPLETE_CREDENTIAL_EIDAS_PROGRAM_REPOS.md) |
| ALL Mainnet CI (surface JSON + chains flags) | [check-allmainnet-protocol-surface.sh](../../scripts/verify/check-allmainnet-protocol-surface.sh), [check-allmainnet-chains-flags.sh](../../scripts/verify/check-allmainnet-chains-flags.sh), [validate-config-files.sh](../../scripts/validation/validate-config-files.sh) |
| Umbrella ecosystem | [DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md](DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md), [MASTER_INDEX.md](../MASTER_INDEX.md) |

---
## 10. Gap Register (by Owner Type)

| Gap | Owner | Notes |
|-----|--------|------|
| EU/UK/US compliance matrices beyond stubs | Counsel + Compliance | Charter slice-1 exit criteria |
| HYBX treasury / participant model frozen | Banking architecture + Ops | RTGS matrix: HYBX participant/treasury **Planned** |
| Virtual account ↔ OMNL chart of accounts | Implementation | Deterministic `accountingRef` |
| Tatum VA on public chains vs Chain 138 split | Architecture | RPC-only on 138 per TATUM_SDK |
| Identity stack (Aries/AnonCreds) for Travel Rule scale | Identity lead | RTGS matrix **Planned** |
| Correspondent / BNI live contracts | Operator + external bank | Matrix rows Partial/Planned |
| ALL Mainnet swap inventory | Ops + validation | `sameChainSwapInventoryPublished: false` until promoted; CI: [`scripts/verify/check-allmainnet-protocol-surface.sh`](../../scripts/verify/check-allmainnet-protocol-surface.sh) + [`check-allmainnet-chains-flags.sh`](../../scripts/verify/check-allmainnet-chains-flags.sh) via [`validate-config-files.sh`](../../scripts/validation/validate-config-files.sh) |

---
## 11. Phased Roadmap Gates

### Slice 1 — Government treasury & licensed participant (foundation)

**Goal:** End-to-end **regulated** path: ISO evidence → accounting → MintAuth → Chain 138 settlement record → audit package.

**Gates:**

- [ ] OMNL tenant/auth **frozen** for canonical rail ([DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md](../03-deployment/DBIS_RTGS_E2E_REQUIREMENTS_MATRIX.md) priorities).
- [ ] At least one **Complete** jurisdiction matrix + institution onboarding **Complete** per charter.
- [ ] DBIS Rail MintAuth path exercised with evidence vault reproducibility.
- [ ] No external claim of “RTGS production parity” until checklist rows are **Complete**.

### Slice 2 — EMI / virtual account / digital wallet

**Goal:** Customer **VA ledger** + safeguarding reconciliation + wallet UX; optional Tatum VA for **supported** public chains; Chain 138 via **gateway + self-hosted** signing.

**Gates:**

- [ ] Customer ledger ↔ omnibus reconciliation **daily** with exception queue.
- [ ] Policy profiles for retail vs institutional wallets (`policyProfileId` on URA rows).
- [ ] Withdrawal path: ledger debit → treasury wallet → chain tx with limits and sanctions.

### Slice 3 — Cross-border correspondent & FX

**Goal:** Nostro/vostro, correspondent messaging, FX booking per [DBIS_RTGS_FX_AND_LIQUIDITY_OPERATING_MODEL.md](../03-deployment/DBIS_RTGS_FX_AND_LIQUIDITY_OPERATING_MODEL.md).

**Gates:**

- [ ] FX pricing/dealing engine contract **frozen** (matrix: currently **Planned**).
- [ ] SWIFT/ISO endpoint contracts documented for at least one corridor.

### Slice 4 — Tokenized reserves & policy-aware liquidity

**Goal:** GRU/reserve attestations + **explicit** use of PMM/bridge for **treasury** execution—not commingled with customer e-money.

**Gates:**

- [ ] ReserveOracle / attestation cadence aligned with [DBIS_RAIL_STABLECOIN_POLICY_V1_5.md](../dbis-rail/DBIS_RAIL_STABLECOIN_POLICY_V1_5.md).
- [ ] PMM inventory labeled **non-customer** in ops runbooks.
- [ ] ALL Mainnet: promote protocols in [allmainnet-non-dodo-protocol-surface.json](../../config/allmainnet-non-dodo-protocol-surface.json) only after committed addresses + verification.

---
## 12. Related Documents

- [DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md](DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md)
- [DBIS_RTGS_MASTER_PLAN_IMPLEMENTATION_TRACKER.md](../03-deployment/DBIS_RTGS_MASTER_PLAN_IMPLEMENTATION_TRACKER.md)
- [DBIS_RAIL_TECHNICAL_SPEC_V1.md](../dbis-rail/DBIS_RAIL_TECHNICAL_SPEC_V1.md)
- [INSTITUTION_ONBOARDING_CHARTER.md](../04-configuration/compliance-matrices/INSTITUTION_ONBOARDING_CHARTER.md)
- [UNIVERSAL_RESOURCE_ONTOLOGY.md](../04-configuration/universal-resource-activation/UNIVERSAL_RESOURCE_ONTOLOGY.md)
- [MULTI_CHAIN_EXECUTION_ISO20022_EMONEY.md](../runbooks/MULTI_CHAIN_EXECUTION_ISO20022_EMONEY.md)
- [GRU_M1_MASTER_IMPLEMENTATION_PLAN.md](../gru-m1/GRU_M1_MASTER_IMPLEMENTATION_PLAN.md)
- [DBIS_HYPERLEDGER_IDENTITY_STACK_DECISION.md](../03-deployment/DBIS_HYPERLEDGER_IDENTITY_STACK_DECISION.md)
- [ALL_MAINNET_CONFIGURATION.md](../../smom-dbis-138/docs/deployment/ALL_MAINNET_CONFIGURATION.md) — must stay aligned with [allmainnet-non-dodo-protocol-surface.json](../../config/allmainnet-non-dodo-protocol-surface.json)

---
## Document history

| Date | Change |
|------|--------|
| 2026-04-28 | Initial publication: regulated treasury wallet master plan integrating EMI, wallets, VA patterns, Rail, RTGS, URA, liquidity boundaries. |
| 2026-04-28 | ALL Mainnet doc drift note + artifact links: ISO20022 e-money runbook, GRU M1, identity decision, explorer legal caveat, ecosystem route-confidence baseline; Related Documents expanded. |
| 2026-04-28 | ALL Mainnet verification scripts committed (`check-allmainnet-protocol-surface.sh`, `check-allmainnet-chains-flags.sh`); integrated into `validate-config-files.sh`; `run-all-validation.sh` duplicate 1c/1d block removed; `ALL_MAINNET_VERIFICATION_COMPLETE.md` addendum for swap inventory vs bridge verification. |
78 docs/03-deployment/CROMERO_DAPP_DEPLOYMENT.md Normal file
@@ -0,0 +1,78 @@
# CROMERO Dapp — Phoenix Deploy

Deploys [`d-bis/CROMERO`](https://gitea.d-bis.org/d-bis/CROMERO) (a Vite + React + thirdweb v5 dapp) to `https://d-bis.org/ecosystem/cromero/`.

## Pipeline

1. Push to `main` on `d-bis/CROMERO` triggers `.gitea/workflows/deploy-to-phoenix.yml`, which POSTs `{repo, sha, branch, target: "default"}` to the Phoenix Deploy API.
2. Phoenix runs the registered target (see `phoenix-deploy-api/deploy-targets.json`): `bash scripts/deployment/phoenix-deploy-cromero-from-workspace.sh`.
3. The script builds the staged workspace (`npm ci && npm run build`), copies `dist/` through `r630-01` into NPMplus CT `10233`, and lands files in `/var/www/ecosystem/cromero/`.
4. Healthcheck: `https://d-bis.org/ecosystem/cromero/` must return HTTP 200 with `<div id="root"></div>` in the body.
## Live NPMplus topology

The primary NPMplus ingress is Dockerized inside CT `10233`, reachable via Proxmox host `r630-01` (`192.168.11.11`). Direct SSH to `192.168.11.167` is not assumed. The deploy script therefore uses `pct push` / `pct exec` through the Proxmox host and keeps these paths aligned:

- CT host path: `/var/www/ecosystem/cromero/`
- Persistent NPMplus data path: `/opt/npmplus/html/ecosystem/cromero/`
- Nginx container path: `/var/www -> /data/html`

## One-time nginx setup on the NPMplus host

Install this once in the NPMplus advanced-config tab for the `d-bis.org` proxy host:

```nginx
location = /ecosystem/cromero {
    return 301 /ecosystem/cromero/;
}

location /ecosystem/cromero/ {
    alias /var/www/ecosystem/cromero/;
    index index.html;
    try_files $uri /ecosystem/cromero/index.html;
}
```

Then run nginx config validation/reload inside the NPMplus container.

The Vite app builds with `base: "/ecosystem/cromero/"` so hashed asset URLs resolve under that subpath.
## Required Actions secrets/vars on `d-bis/CROMERO`

| Name | Type | Value |
| --- | --- | --- |
| `PHOENIX_DEPLOY_URL` | secret | `http://192.168.11.59:4001/api/deploy` |
| `PHOENIX_DEPLOY_TOKEN` | secret | matches `PHOENIX_DEPLOY_SECRET` on the Phoenix host |
| `VITE_THIRDWEB_CLIENT_ID` | secret | thirdweb publishable Client ID |
| `VITE_PROJECT_WALLET_ADDRESS` | var | recipient `0x...` address |
| `VITE_CHAIN_138_RPC` | var (optional) | defaults to `https://rpc.d-bis.org` |
| `VITE_CHAIN_138_EXPLORER` | var (optional) | defaults to `https://explorer.d-bis.org` |

## Manual trigger from a LAN box with `phoenix-deploy-api` access

```bash
curl -sSf -X POST "http://192.168.11.59:4001/api/deploy" \
  -H "Authorization: Bearer ${PHOENIX_DEPLOY_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"repo":"d-bis/CROMERO","branch":"main","target":"default"}'
```

## Dry run

From this repo:

```bash
PHOENIX_DEPLOY_WORKSPACE=/path/to/staged/CROMERO bash scripts/deployment/phoenix-deploy-cromero-from-workspace.sh --dry-run
```
@@ -145,6 +145,40 @@ For webhook signing, the bootstrap/helper path also expects:

Do not enable both repo Actions deploys and webhook deploys for the same repo unless you intentionally want duplicate deploy attempts.

### 3a. Bootstrap workflow secrets (one-time per CT)

The reinstall workflow `.gitea/workflows/bootstrap-phoenix-deploy-api.yml` ships the latest `phoenix-deploy-api/` from `master` to CT 5700 via scp + `pct push` and re-runs `install-systemd.sh`. This is the path you take when the running service on the CT is older than the code on `master` (e.g. it still returns the "Deploy request queued (stub)" message). Trigger via the Gitea Actions UI → "Bootstrap Phoenix Deploy API" → Run workflow.

Required secrets (in addition to the deploy secrets above):

- `PHOENIX_PVE_HOST` — PVE node IP that hosts CT 5700 (e.g. `192.168.11.12` for `r630-02`).
- `PHOENIX_PVE_USER` — SSH user on the PVE node (default `root`).
- `PHOENIX_PVE_SSH_KEY` — Private SSH key (OpenSSH format) authorised on the PVE node. Use a dedicated deploy key, not your personal key.
- `PHOENIX_PVE_KNOWN_HOSTS` — Pre-populated `known_hosts` line for the PVE host (skips the strict-host-key prompt). Optional; if absent the workflow uses `accept-new` on first connect.
- `PHOENIX_DEV_VM_VMID` — Container VMID (default `5700`).
- `PHOENIX_DEPLOY_DEV_VM_IP` — IP of the dev VM for the post-install health check (default `192.168.11.59`).

After a successful run the workflow performs a non-stub probe: it POSTs `{ "target": "__bootstrap_probe__" }` with the deploy bearer token and fails the workflow if the response body still contains `Deploy request queued (stub)` or any auth-rejection signal. That gives you an unambiguous "the running service on CT 5700 is now post-stub" signal in CI logs.
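The same probe can be replayed manually from a LAN box with `PHOENIX_DEPLOY_TOKEN` exported; this sketch mirrors what the workflow does rather than reproducing its exact step:

```shell
# Manual replay of the workflow's non-stub probe against CT 5700's service.
resp=$(curl -sS -X POST "http://192.168.11.59:4001/api/deploy" \
  -H "Authorization: Bearer ${PHOENIX_DEPLOY_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"target":"__bootstrap_probe__"}')
case "$resp" in
  *'Deploy request queued (stub)'*) echo "still stub: bootstrap did not take" >&2; exit 1 ;;
  *) echo "post-stub service confirmed" ;;
esac
```

Note the sketch only checks for the stub marker; the workflow additionally treats auth-rejection signals in the body as failure.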
The workflow only triggers on `workflow_dispatch` (never on push) so deploy-service reinstalls remain a deliberate manual step.

## Adding more repos or VM targets

Extend [deploy-targets.json](../../phoenix-deploy-api/deploy-targets.json) with another entry.
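For orientation, a new entry might look like the fragment below. This is hypothetical: the target key and script path are placeholders, and `command` is the only field the surrounding docs name — check the existing entries in `deploy-targets.json` for the authoritative schema.

```json
{
  "example-app-live": {
    "command": "bash scripts/deployment/phoenix-deploy-exampleapp-from-workspace.sh"
  }
}
```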
47 docs/04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md Normal file
@@ -0,0 +1,47 @@
# Gitea repo → VM hosting → CI/CD matrix

Each **application repo** should carry **its own** `.gitea/workflows/*.yml` so pushes trigger the right pipeline for **that** codebase. Deploy execution typically happens on the **designated LAN VM** (via the **Phoenix deploy API** on the dev workspace host), not on the public Gitea runner alone.

**Canonical integration:** [Phoenix deploy API](../../phoenix-deploy-api/server.js) + [`deploy-targets.json`](../../phoenix-deploy-api/deploy-targets.json).

**Operator checklist:** [docs/00-meta/GITEA_CD_OPERATOR_CHECKLIST.md](../00-meta/GITEA_CD_OPERATOR_CHECKLIST.md)

**Parity report (local clone):** `bash scripts/verify/report-gitea-cd-parity.sh`

## Pattern A — Repo workflow triggers Phoenix (recommended)

1. Repo workflow `on: push` runs on Gitea Actions (checkout only + `curl` POST).
2. Body includes `repo` (Gitea `owner/name`), `branch`, `sha`, and `target` (matching `deploy-targets.json`).
3. Phoenix syncs the repo archive from Gitea, sets `PHOENIX_DEPLOY_WORKSPACE`, and runs the target `command` with LAN access (SSH `pct`, rsync, etc.).
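Step 2's request body can be sketched like this. The field set is what `deploy-targets.json` matching needs; `GITHUB_SHA` stands in for whatever commit-SHA variable the Gitea Actions runner exposes (an assumption).

```shell
#!/usr/bin/env bash
# Sketch of the Pattern A trigger step: build the JSON body, then POST it.
# GITHUB_SHA is a stand-in for the runner's commit SHA variable (assumption).
set -euo pipefail

sha="${GITHUB_SHA:-0000000000000000000000000000000000000000}"
body=$(printf '{"repo":"%s","branch":"%s","sha":"%s","target":"%s"}' \
  "d-bis/proxmox" "main" "$sha" "default")
echo "$body"

# The POST itself (PHOENIX_DEPLOY_URL / PHOENIX_DEPLOY_TOKEN come from repo secrets):
# curl -fsS -X POST "$PHOENIX_DEPLOY_URL" \
#   -H "Authorization: Bearer $PHOENIX_DEPLOY_TOKEN" \
#   -H 'Content-Type: application/json' \
#   -d "$body"
```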
**Secrets (per repo in Gitea):** `PHOENIX_DEPLOY_URL`, `PHOENIX_DEPLOY_TOKEN` (same pattern as `d-bis/proxmox` workflows).

## Pattern B — Monorepo-only (`d-bis/proxmox`)

Multiple deploy jobs in one workflow ([`.gitea/workflows/deploy-to-phoenix.yml`](../../.gitea/workflows/deploy-to-phoenix.yml)); targets selected by JSON body `target`. Still one workflow file in **this** repo (not copied to every submodule).

## Matrix (maintain when repos or VMs change)

| Gitea repo | Branch(es) | Hosting / VM | `deploy-targets` `target` | Workflow |
|------------|------------|--------------|-----------------------------|----------|
| `d-bis/proxmox` | `main`, `master` | Phoenix deploy host + varies by job | `default`, `atomic-swap-dapp-live`, `portal-live`, `cloudflare-sync`, … | `.gitea/workflows/deploy-to-phoenix.yml`, `validate-on-pr.yml` |
| `Gov_Web_Portals/CyberSecur-Global` | `main` | CT **7810** | `default` | In **CyberSecur-Global** repo: `.gitea/workflows/deploy-to-ct7810.yml` |
| `Gov_Web_Portals/DBIS` | `main` | CT **7804** | `dbis-portal-live` | Copy [`repos/dbis-portal-live.yml`](../../config/gitea-workflow-templates/repos/dbis-portal-live.yml) → DBIS repo |
| `d-bis/explorer-monorepo` | `main`, `master` | VMID **5000** | `explorer-live` | Submodule: `.gitea/workflows/deploy-live.yml` |
| `d-bis/CROMERO` | `main`, `master` | NPM ecosystem path | `default` | Copy [`repos/cromero-default.yml`](../../config/gitea-workflow-templates/repos/cromero-default.yml) → CROMERO repo |
| `d-bis/CurrenciCombo` | `main`, `master` | Phoenix CT **8604** | `default` | Copy [`repos/currencicombo-default.yml`](../../config/gitea-workflow-templates/repos/currencicombo-default.yml) → CurrenciCombo repo |
| `d-bis/cross-chain-pmm-lps` | `main` | _(simulation/docs — no VM)_ | — | `.gitea/workflows/validate-capital-efficiency.yml` |

## Adding a new repo

1. Add rows to [`deploy-targets.json`](../../phoenix-deploy-api/deploy-targets.json) with `repo`, `branch`, `target`, `command`, `healthcheck`.
2. Implement or reuse a `scripts/deployment/phoenix-deploy-*-from-workspace.sh` wrapper if the deploy needs `PHOENIX_DEPLOY_WORKSPACE`.
3. Copy a template from [`config/gitea-workflow-templates/`](../../config/gitea-workflow-templates/README.md) into **that repo** as `.gitea/workflows/<name>.yml`.
4. In Gitea → Repo → **Secrets**: `PHOENIX_DEPLOY_URL`, `PHOENIX_DEPLOY_TOKEN`.
5. Document the VM / URL here.
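The `healthcheck` keys in step 1 (`url`, `expect_body_includes`, `attempts`, `delay_ms`) imply a retry loop like the one below. The key names are the real ones from `deploy-targets.json`; the loop structure is an assumption about `server.js` behavior, and `fetch` is a stub standing in for `curl` so the sketch runs offline.

```shell
#!/usr/bin/env bash
# Sketch of the healthcheck retry loop implied by the deploy-targets.json keys.
# The loop structure is an assumption, not server.js source.

healthcheck() {
  local url="$1" expect_body="$2" attempts="$3" delay_s="$4"
  local i body
  for ((i = 1; i <= attempts; i++)); do
    body="$(fetch "$url")" || body=""
    if [[ "$body" == *"$expect_body"* ]]; then
      echo "healthy after $i attempt(s)"
      return 0
    fi
    sleep "$delay_s"
  done
  echo "unhealthy after $attempts attempts"
  return 1
}

# Stand-in for: curl -fsS --max-time 15 "$url"
fetch() { echo '{"organization":"DBIS"}'; }

healthcheck "https://d-bis.org/.well-known/trust.json" '"organization"' 12 0
```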

## References

- [GITEA_ORG_STRUCTURE.md](./GITEA_ORG_STRUCTURE.md)
- [DEV_VM_GITOPS_PLAN.md](./DEV_VM_GITOPS_PLAN.md)
- [README-gitea-proxmox-sync.md](../../scripts/git/README-gitea-proxmox-sync.md)
@@ -1,7 +1,7 @@
# Address Matrix and Status — Correlated Reference

**Last Updated:** 2026-03-26
**Purpose:** Single correlated matrix of all existing contract, token, and pool addresses with deployment status. **On-chain verification (2026-03-26):** corrected Chain 138 PMM stack verified at `DODOPMMIntegration=0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` and `DODOPMMProvider=0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381`; desired-state inventory is fully reconciled with `104` existing pools and `104` aligned routes.
**Last Updated:** 2026-04-22
**Purpose:** Single correlated matrix of all existing contract, token, and pool addresses with deployment status. **On-chain verification (2026-04-22):** the **live, traded** Chain 138 PMM stack is `DODOPMMIntegration=0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895` + `DODOPMMProvider=0x3f729632E9553EBacCdE2e9b4c8F2B285b014F2e`. A second parallel deployment (`DODOPMMIntegration=0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` + `DODOPMMProvider=0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381`) exists with seeded but un-traded pools — do not wire dApps or routers to it. Earlier docs that cited the `0x5BDc62f1…` stack as canonical were superseded by this re-verification.
**Sources:** CONTRACT_ADDRESSES_REFERENCE, CHAIN138_TOKEN_ADDRESSES, LIQUIDITY_POOLS_MASTER_MAP, DEPLOYED_COINS_TOKENS_AND_NETWORKS, env examples, PRE_DEPLOYMENT_CHECKLIST.

---
@@ -108,12 +108,22 @@

| Contract / pool | Address | Status | Notes |
|-----------------|---------|--------|-------|
| DODOPMMIntegration | `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` | ✅ | Corrected canonical integration; full JSON desired-state reconciled |
| DODOPMMProvider | `0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381` | ✅ | Corrected canonical provider; 104 aligned routes |
| DODOPMMIntegration (Stack A, live) | `0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895` | ✅ | Live, traded integration; backs all 8 active PMM pools (2026-04-22 on-chain probe) |
| DODOPMMProvider (Stack A, live) | `0x3f729632E9553EBacCdE2e9b4c8F2B285b014F2e` | ✅ | `dodoIntegration() == 0x86ADA6Ef…`, `providerName() == "DODO PMM"`, `isKnownPool` TRUE for all 8 Stack A pools |
| DODOPMMIntegration (Stack B, parallel) | `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` | ⚠ | Same source, different immutables; pools seeded but un-traded — do not wire dApps/routers to it |
| DODOPMMProvider (Stack B, parallel) | `0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381` | ⚠ | Pairs with the Stack B integration; superseded by Stack A for canonical use |
| PrivatePoolRegistry | `0xb27057B27db09e8Df353AF722c299f200519882A` | ✅ | Live private XAU pool registry |
| Pool cUSDT/cUSDC | `0xff8d3b8fDF7B112759F076B69f4271D4209C0849` | ✅ | Funded live |
| Pool cUSDT/USDT (official mirror) | `0x6fc60DEDc92a2047062294488539992710b99D71` | ✅ | Intended funded canonical pool; integration/provider mapping must be repointed if still on older empty pool |
| Pool cUSDC/USDC (official mirror) | `0x9f74Be42725f2Aa072a9E0CdCce0E7203C510263` | ✅ | Canonical corrected-stack pool |
| Pool cUSDT/cUSDC (Stack A, traded) | `0x9e89bAe009adf128782E19e8341996c596ac40dC` | ✅ | Live, asymmetric balances (≈983.9k cUSDT / ≈1.016M cUSDC) — actively traded |
| Pool cUSDT/USDT (Stack A) | `0x866Cb44b59303d8dc5f4F9E3E7A8e8b0bf238d66` | ✅ | Live (≈1M / ≈1M) |
| Pool cUSDC/USDC (Stack A) | `0xc39B7D0F40838cbFb54649d327f49a6DAC964062` | ✅ | Live (≈1M / ≈1M) |
| Pool cBTC/cUSDT (Stack A) | `0x67049e7333481e2cac91af61403ac7bddfab7bcd` | ✅ | Live (10k cBTC / 9M cUSDT) |
| Pool cBTC/cUSDC (Stack A) | `0x72f1a0794153c3b8a1e8a731f1d8e1a52cb10dc5` | ✅ | Live (10k cBTC / 9M cUSDC) |
| Pool WETH/USDC (Stack A) | `0xb53a0508940b1ff90f1aad4f6cb50a7012fe5593` | ✅ | Live (≈10.1M USDC quote) |
| Pool WETH/USDT (Stack A) | `0xe227f6c0520c0c6e8786fe56fa76c4914f861533` | ✅ | Live (≈10.1M USDT quote) |
| Pool cBTC/cXAUC (Stack A) | `0xf3e8a07d419b61f002114e64d79f7cf8f7989433` | ✅ | Live (10k cBTC / 1.74k cXAUC) |
| Pool cUSDT/cUSDC (Stack B, seeded) | `0xff8d3b8fDF7B112759F076B69f4271D4209C0849` | ⚠ | 10M / 10M flat — not traded; superseded by Stack A pool above |
| Pool cUSDT/USDT (Stack B, seeded) | `0x6fc60DEDc92a2047062294488539992710b99D71` | ⚠ | 10M / 10M flat — not traded; superseded by Stack A pool above |
| Pool cUSDC/USDC (Stack B, empty) | `0x9f74Be42725f2Aa072a9E0CdCce0E7203C510263` | ⚠ | 0 / 0 zero liquidity; superseded by Stack A pool above |
| Pool cUSDT/cXAUC (public) | `0x1AA55E2001E5651349AfF5A63FD7A7Ae44f0F1b0` | ✅ | Funded live |
| Pool cUSDC/cXAUC (public) | `0xEA9Ac6357CaCB42a83b9082B870610363B177cBa` | ✅ | Funded live |
| Pool cEURT/cXAUC (public) | `0xbA99bc1eAAC164569d5AcA96C806934DDaF970Cf` | ✅ | Funded live |
@@ -121,7 +131,7 @@
| Pool cUSDC/cXAUC (private) | `0x7867D58567948e5b9908F1057055Ee4440de0851` | ✅ | Funded live |
| Pool cEURT/cXAUC (private) | `0x505403093826D494983A93b43Aa0B8601078A44e` | ✅ | Funded live |
| LiquidityPoolETH (trustless) | — | ❌ | Placeholder 0x0 |
| EnhancedSwapRouter | — | ❌ | Not deployed |
| EnhancedSwapRouter | `0xE6Cc7643ae2A4C720A28D8263BC4972905d7DE0f` | ✅ | Deployed 2026-04-22 (Phase 3, EVM Paris). UniV3 + Balancer + DODO Stack-A wired; Curve disabled; 1inch slot inert. 11 DODO pools registered in `dodoPoolAddresses[tokenA][tokenB]` bidirectionally — 8 at deploy, 3 cBTC pools (cBTC/cUSDT, cBTC/cUSDC, cBTC/cXAUC) added Phase 3j (2026-04-22) via `setDodoPoolAddress(...)`. Balancer pool ids still pending per-pair config. |
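"Registered bidirectionally" means a single `setDodoPoolAddress(...)` call makes the pool resolvable from either token order. The on-chain storage is Solidity; this shell mirror of the mapping behavior is purely illustrative (the Stack A cUSDT/cUSDC pool address is from the table above).

```shell
#!/usr/bin/env bash
# Illustrative mirror of dodoPoolAddresses[tokenA][tokenB] bidirectional
# registration. The pool address is the Stack A cUSDT/cUSDC pool above.
declare -A dodo_pool

set_dodo_pool() {
  local token_a="$1" token_b="$2" pool="$3"
  # One registration call populates both lookup directions.
  dodo_pool["$token_a,$token_b"]="$pool"
  dodo_pool["$token_b,$token_a"]="$pool"
}

set_dodo_pool cUSDT cUSDC 0x9e89bAe009adf128782E19e8341996c596ac40dC
echo "${dodo_pool[cUSDC,cUSDT]}"   # same pool regardless of token order
```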

### 1.7 TransactionMirror / deployer

@@ -193,8 +203,8 @@ Bridges (CCIPWETH9 / CCIPWETH10) and LINK funding per runbook. Addresses in `smo

| Env variable (Chain 138) | Canonical address |
|--------------------------|-------------------|
| DODO_PMM_INTEGRATION_ADDRESS | `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` |
| DODO_PMM_PROVIDER_ADDRESS | `0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381` |
| DODO_PMM_INTEGRATION_ADDRESS | `0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895` (Stack A, live) |
| DODO_PMM_PROVIDER_ADDRESS | `0x3f729632E9553EBacCdE2e9b4c8F2B285b014F2e` (Stack A, live) |
| PRIVATE_POOL_REGISTRY | `0xb27057B27db09e8Df353AF722c299f200519882A` |
| POOL_CUSDTCUSDC | `0xff8d3b8fDF7B112759F076B69f4271D4209C0849` |
| POOL_CUSDTUSDT | `0x6fc60DEDc92a2047062294488539992710b99D71` |
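With two parallel stacks on the same chain, a config-time guard against the Stack B addresses is cheap insurance. The addresses below come from the table above; the guard itself is a suggested check, not existing repo tooling.

```shell
#!/usr/bin/env bash
# Guard against wiring Stack B into env files. Addresses are from the matrix
# above; the guard is a suggested check, not existing tooling.
set -euo pipefail

STACK_A_INTEGRATION="0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895"
STACK_B_INTEGRATION="0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d"

check_integration() {
  case "$1" in
    "$STACK_A_INTEGRATION") echo "ok: Stack A (live, traded)" ;;
    "$STACK_B_INTEGRATION") echo "REJECT: Stack B (parallel, un-traded)" >&2; return 1 ;;
    *) echo "REJECT: unknown DODO_PMM_INTEGRATION_ADDRESS $1" >&2; return 1 ;;
  esac
}

check_integration "$STACK_A_INTEGRATION"
```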
@@ -1,16 +1,19 @@
# PMM Liquidity Pools & DEX/DeFi Routing — Full System Status

**Last Updated:** 2026-03-26
**Last Updated:** 2026-04-22
**Purpose:** Single reference for DEX/DeFi and PMM liquidity pool routing — what is designed, deployed, and in use.

---

## Executive summary (updated 2026-03-26)
## Executive summary (updated 2026-04-22)

- **DODOPMMIntegration** is **deployed** on Chain 138 at `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d`. The corrected canonical stack now has **104 desired-state pools aligned**.
- **DODOPMMProvider** is **deployed** at `0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381` with **104/104 provider routes aligned** to the integration.
- **Live funded public pools** are: cUSDT/cUSDC (`0xff8d3b8fDF7B112759F076B69f4271D4209C0849`), cUSDT/USDT (`0x6fc60DEDc92a2047062294488539992710b99D71`), cUSDC/USDC (`0x9f74Be42725f2Aa072a9E0CdCce0E7203C510263`), cUSDT/cXAUC (`0x94316511621430423a2cff0C036902BAB4aA70c2`), cUSDC/cXAUC (`0x7867D58567948e5b9908F1057055Ee4440de0851`), cEURT/cXAUC (`0x505403093826D494983A93b43Aa0B8601078A44e`).
- **EnhancedSwapRouter** is **not deployed**; multi-provider DEX routing (Uniswap/Balancer/Curve/1inch) is not live.
- **Two parallel DODOPMM deployments exist on Chain 138.** On-chain probe (2026-04-22) shows the **live, traded** stack is **Stack A**:
  - **DODOPMMIntegration (Stack A)** `0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895` — 8 registered, actively-traded pools.
  - **DODOPMMProvider (Stack A)** `0x3f729632E9553EBacCdE2e9b4c8F2B285b014F2e` — wired to the Stack A integration; `isKnownPool` TRUE for all 8 Stack A pools.
- The **Stack B** deployment (`DODOPMMIntegration=0x5BDc62f1…` / `DODOPMMProvider=0x5CAe6Ce1…`) is a parallel deployment with seeded but un-traded pools (10M/10M flat, cUSDC/USDC at 0/0). Do **not** wire dApps or routers to Stack B. Earlier docs that cited Stack B as canonical were superseded by this re-verification.
- **Live funded, traded Stack-A pools (2026-04-22 probe):** cUSDT/cUSDC `0x9e89bAe0…`, cUSDT/USDT `0x866Cb44b…`, cUSDC/USDC `0xc39B7D0F…`, cBTC/cUSDT `0x67049e73…`, cBTC/cUSDC `0x72f1a079…`, WETH/USDC `0xb53a0508…`, WETH/USDT `0xe227f6c0…`, cBTC/cXAUC `0xf3e8a07d…`. The funded XAU pools `0x9431…`/`0x7867…`/`0x5054…` remain live but on the legacy XAU registry, not the Stack A integration.
- **EnhancedSwapRouter** is **deployed** at `0xE6Cc7643ae2A4C720A28D8263BC4972905d7DE0f` on Chain 138 (Phase 3, 2026-04-22, EVM version Paris per `[profile.chain138]` in `smom-dbis-138/foundry.toml`). Wired immutables: UniV3 `0xde9cD8ee…`, Dodoex `0x86ADA6Ef…` (Stack A), Balancer `0x96423d7C…`, 1inch `0x500B84b1…`, Curve disabled. `dodoLiquidityProvider = 0x3f729632…`. **11 DODO pools registered** in `dodoPoolAddresses[tokenA][tokenB]` bidirectionally (8 at deploy + 3 cBTC pools added Phase 3j 2026-04-22). Curve and 1inch slots remain inert; Balancer wiring is configured but functional pool ids must be set per-pair via `setBalancerPoolId(...)` before Balancer routing engages.
- **Phase 1 / 2 dApp work (2026-04-22):** the atomic-swap dApp at `https://atomic-swap.defi-oracle.io/` was failing because its quote handler ignored 4 of 6 protocols. PR #2 (mobile bridge + max-approve) and PR #4 (routing-honesty + protocol/executor mismatch chip) addressed the UI side; this Stack-A canonicalization addresses the on-chain side.
- **Token-aggregation** service is implemented and runnable (single-hop quotes; can index DODO pools). **Bridge quote API** (swap+bridge+swap) is implemented.
- **Full system status:** PMM liquidity and DODOPMMProvider routing are **deployed and in use** on Chain 138. Remaining: add liquidity to pools as needed; optionally deploy EnhancedSwapRouter when other DEX pools exist on 138.
- **Optional items completed (2026-02-27 / 2026-03-01):** DeployCompliantFiatTokens (10 tokens on 138); Blockscout verification run; MCP allowlist-138 (`ai-mcp-pmm-controller/config/allowlist-138.json`); add-liquidity runbook ([ADD_LIQUIDITY_PMM_CHAIN138_RUNBOOK](../03-deployment/ADD_LIQUIDITY_PMM_CHAIN138_RUNBOOK.md)); token-aggregation canonical fallbacks for cEURC/cEURT/cGBP*/cAUDC/cJPYC/cCHFC/cCADC/cXAU*; ENV_EXAMPLE_CONTENT + CREATE2_FACTORY_ADDRESS; E2E routing verification run.
@@ -45,10 +48,13 @@

| Component | Status | Address / Notes |
|-----------|--------|------------------|
| **DODOPMMIntegration** | Deployed | `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` — corrected canonical integration. |
| **PMM pools (desired-state)** | **104 aligned** | Full desired-state JSON reconciled; live funded public pools are listed in [LIQUIDITY_POOLS_MASTER_MAP](LIQUIDITY_POOLS_MASTER_MAP.md). |
| **DODOPMMProvider** | **Deployed** | `0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381`; 104/104 routes aligned to the corrected integration. |
| **EnhancedSwapRouter** | Not deployed | Deploy when Uniswap/Balancer pools exist on 138; configure quoter and Balancer poolId. |
| **DODOPMMIntegration (Stack A, live)** | Deployed | `0x86ADA6Ef91A3B450F89f2b751e93B1b7A3218895` — canonical, actively traded; backs all 8 Stack-A pools. |
| **DODOPMMProvider (Stack A, live)** | Deployed | `0x3f729632E9553EBacCdE2e9b4c8F2B285b014F2e`; `isKnownPool` TRUE for all 8 Stack-A pools; use as `dodoLiquidityProvider` for `EnhancedSwapRouter`. |
| **DODOPMMIntegration (Stack B, parallel)** | Deployed | `0x5BDc62f1ae7D630c37A8B363a1d49845356Ee72d` — same source, different immutables; pools seeded but un-traded; do not wire to it. |
| **DODOPMMProvider (Stack B, parallel)** | Deployed | `0x5CAe6Ce155b7f08D3a956F5Dc82fC9945f29B381` — pairs with the Stack B integration; superseded by Stack A. |
| **PMM pools (Stack A, traded)** | Live | 8 pools registered + traded. See [ADDRESS_MATRIX_AND_STATUS](ADDRESS_MATRIX_AND_STATUS.md) §1.6. |
| **PMM pools (Stack B, seeded)** | Live but un-traded | 3 pools at flat 10M/10M (or 0/0). Superseded by Stack A. |
| **EnhancedSwapRouter** | Deployed | `0xE6Cc7643ae2A4C720A28D8263BC4972905d7DE0f` on Chain 138 (Phase 3, 2026-04-22, EVM Paris). UniV3 + Balancer + DODO Stack A wired; Curve/1inch slots inert; 11 DODO pools registered (8 at deploy + 3 cBTC pools Phase 3j). Balancer pool ids still pending per-pair `setBalancerPoolId(...)`. |
| **LiquidityPoolETH** (trustless bridge) | Placeholder | Not deployed; config uses `0x0`. |

**Doc note:** [LIQUIDITY_POOLS_MASTER_MAP.md](LIQUIDITY_POOLS_MASTER_MAP.md) and [ADDRESS_MATRIX_AND_STATUS.md](ADDRESS_MATRIX_AND_STATUS.md) list pool and DODOPMMProvider addresses. DEX_AND_CROSS_CHAIN_CONTRACTS_NEEDED reflects DODOPMMIntegration deployed and pools created.
@@ -59,7 +65,7 @@
|-----------|--------|--------|
| **Token-aggregation service** | Implemented & runnable | Indexes UniswapV2/V3 and DODO when `CHAIN_138_DODO_PMM_INTEGRATION` is set (set in smom-dbis-138/.env). Single-hop quote only; no N-hop pathfinding. |
| **Orchestration QuoteService** | Implemented | `POST /api/bridge/quote` with optional source/destination swap quotes; requires bridge registry and optional EnhancedSwapRouter addresses. |
| **Liquidity Engine (backend)** | Implemented | Depends on EnhancedSwapRouter being deployed; not usable for routing until router is live. |
| **Liquidity Engine (backend)** | Implemented | EnhancedSwapRouter is now deployed; backend can resolve DODO-registered pairs through `swapTokenToToken(...)`. Multi-provider routing (UniV3 / Balancer / 1inch) needs per-pair pool/quoter config before it engages. |

### 2.3 Swap–bridge–swap & trustless stack

@@ -82,7 +88,7 @@
| Capability | Available? | Where |
|------------|------------|--------|
| Single-hop quote (API) | Yes | `GET /api/v1/quote` (best direct pool for tokenIn/tokenOut) |
| Multi-provider choice (one leg) | No (router not deployed) | EnhancedSwapRouter would provide WETH↔stable across Uniswap/Dodo/Balancer/Curve/1inch |
| Multi-provider choice (one leg) | Partial — router deployed, only DODO Stack-A active | EnhancedSwapRouter `0xE6Cc7643…` exposes `swapTokenToToken(tokenIn,tokenOut,amountIn,minOut)`; routes for the 11 registered DODO pairs work today. UniV3 / Balancer / 1inch slots wired but not yet driving liquidity (no pools / no pool ids). |
| N-hop path (A→B→C on one chain) | No | No graph-based multi-hop swap API or on-chain pathfinder |
| Swap–bridge–swap (cross-chain) | Yes (orchestration) | QuoteService; on-chain coordinator deployable when needed |
| DODO PMM on-chain swaps | **Yes** | Pools created; DODOPMMProvider deployed; use getQuote/executeSwap or DODOPMMIntegration swap functions. |
@@ -94,16 +100,16 @@
| Area | Ready? | In use? |
|------|--------|--------|
| **PMM liquidity pools (Chain 138)** | **Yes** | Corrected canonical stack deployed; 104 desired-state pools aligned; live funded public stable/XAU pools available. |
| **DEX routing (EnhancedSwapRouter)** | No | Contract not deployed; no Uniswap/Balancer pools on 138. |
| **DEX routing (EnhancedSwapRouter)** | Partial | Contract deployed at `0xE6Cc7643…`; DODO Stack-A path live for 11 pairs; UniV3/Balancer/1inch slots wired but inactive (no native pools / no pool ids on 138). |
| **Token-aggregation API** | Yes | Service can run; single-hop quotes; can index DODO pools once DODOPMMIntegration has pools. |
| **Bridge quote (swap+bridge+swap)** | Partial | QuoteService implemented; coordinator and router optional; not full E2E flow. |
| **Liquidity Engine (decision logic)** | No | Depends on EnhancedSwapRouter. |
| **Liquidity Engine (decision logic)** | Partial | Router deployed; decision logic can drive the DODO Stack-A leg today. UniV3 / Balancer / 1inch arms inactive until pools/quoter/poolId are configured. |
| **Cross-chain cW* PMM mesh** | No | Design/simulation only; edge pools and bots not deployed. |

**Conclusion:** PMM liquidity and DODOPMMProvider routing **are** deployed and in use on Chain 138. What is in place:

- **Live:** DODOPMMIntegration (Mock DVM), **three PMM pools created**, **DODOPMMProvider deployed** with pools registered, token-aggregation service, bridge/orchestration quote API.
- **Remaining:** Add liquidity to pools as needed; deploy EnhancedSwapRouter when Uniswap/Balancer pools exist on 138.
- **Remaining:** Configure Balancer pool ids per pair (`setBalancerPoolId(...)`) and add UniV3/1inch-routable native pools on Chain 138 to activate the non-DODO arms of the router.
- **Optional / later:** Full trustless stack (LiquidityPoolETH, etc.), cross-chain cW* edge pools and bots.

---
@@ -15,6 +15,7 @@
|--------|----------|
| **Agent / IDE instructions** | [AGENTS.md](../AGENTS.md) (repo root) |
| **Canonical ecosystem root plan** | [02-architecture/DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md](02-architecture/DBIS_ECOSYSTEM_TECHNICAL_MASTER_PLAN.md) — umbrella technical master plan for the full live and planned ecosystem; subordinate roots: Chain 138 infra/runtime, RTGS execution, and URA control-plane trackers |
| **Government treasury, EMI, wallet, virtual account master plan** | [02-architecture/GOVERNMENT_TREASURY_EMI_WALLET_MASTER_PLAN.md](02-architecture/GOVERNMENT_TREASURY_EMI_WALLET_MASTER_PLAN.md) — regulated fiat/e-money vs Chain 138 authorization; EMIs, VAs, Tatum-style patterns; RTGS/URA/Rail alignment and phased gates |
| **Git submodule hygiene + explorer remotes** | [00-meta/SUBMODULE_HYGIENE.md](00-meta/SUBMODULE_HYGIENE.md) — detached HEAD, push order, Gitea/GitHub, `submodules-clean.sh` |
| **Atomic swap dApp submodule** | [03-deployment/ATOMIC_SWAP_DAPP_SUBMODULE.md](03-deployment/ATOMIC_SWAP_DAPP_SUBMODULE.md) — dedicated swap + bridge dApp repo, manifest sync, and bootstrap-remote note |
| **What to do next** | [00-meta/NEXT_STEPS_INDEX.md](00-meta/NEXT_STEPS_INDEX.md) — ordered actions, by audience, execution plan |
@@ -25,6 +26,8 @@
| **Chain 138 txpool incident recovery** | `bash scripts/fix-all-validators-and-txpool.sh` → `bash scripts/maintenance/apply-chain138-strict-future-tx-pool.sh` → `bash scripts/clear-all-transaction-pools.sh` → `bash scripts/monitoring/monitor-blockchain-health.sh` |
| **Gitea TLS expiry check** | `bash scripts/verify/check-gitea-certificate-expiry.sh` — warns before `gitea.d-bis.org` cert expiry blocks HTTPS pushes |
| **Gitea TLS expiry cron** | `bash scripts/maintenance/schedule-gitea-cert-check-cron.sh --install` — installs a daily warning check with `WARN_DAYS=30` |
| **Gitea repo ↔ VM CI/CD matrix** | [04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md](04-configuration/GITEA_REPO_VM_CD_CI_MATRIX.md) — per-repo workflows, Phoenix deploy targets, templates under `config/gitea-workflow-templates/` |
| **Gitea CD operator checklist** | [00-meta/GITEA_CD_OPERATOR_CHECKLIST.md](00-meta/GITEA_CD_OPERATOR_CHECKLIST.md) — secrets, Phoenix host sync, `report-gitea-cd-parity.sh` |
| **TsunamiSwap DEX plan** | [00-meta/AAVE_CHAIN138_AND_MARIONETTE_TSUNAMISWAP_PLAN.md](00-meta/AAVE_CHAIN138_AND_MARIONETTE_TSUNAMISWAP_PLAN.md) — canonical TsunamiSwap VM `5010` plan, current DEX link, and publish checklist |
| **Required / optional / recommended (full plan)** | [00-meta/COMPLETE_REQUIRED_OPTIONAL_RECOMMENDED_INDEX.md](00-meta/COMPLETE_REQUIRED_OPTIONAL_RECOMMENDED_INDEX.md) |
| **Single task list** | [00-meta/TODOS_CONSOLIDATED.md](00-meta/TODOS_CONSOLIDATED.md) |
@@ -118,6 +121,7 @@

| Topic | Document |
|-------|----------|
| **Government treasury, EMI, digital wallet, virtual account (regulated settlement)** | [02-architecture/GOVERNMENT_TREASURY_EMI_WALLET_MASTER_PLAN.md](02-architecture/GOVERNMENT_TREASURY_EMI_WALLET_MASTER_PLAN.md) |
| **Vault shard custody policy decision** | [04-configuration/VAULT_SHARD_CUSTODY_POLICY.md](04-configuration/VAULT_SHARD_CUSTODY_POLICY.md) |
| **Multi-chain rotation rollout order** | [runbooks/MULTI_CHAIN_ROTATION_ROLLOUT_ORDER_MEMO.md](runbooks/MULTI_CHAIN_ROTATION_ROLLOUT_ORDER_MEMO.md) |

Submodule explorer-monorepo updated: 1aa81f454a...ac401848d1
Submodule metamask-integration updated: b15b13c57e...8896234a8d
@@ -43,7 +43,14 @@
    "mission-control:dev": "pnpm --filter mission-control dev",
    "mission-control:build": "pnpm --filter mission-control build",
    "mission-control:start": "pnpm --filter mission-control start",
    "mission-control:test": "pnpm --filter mission-control test"
    "mission-control:test": "pnpm --filter mission-control test",
    "token-lists:validate-metadata": "node scripts/validation/validate-token-list-metadata.mjs",
    "pool-matrix:validate": "node scripts/validation/validate-pool-creation-matrix.mjs",
    "all-mainnet:readiness": "node scripts/status/generate-all-mainnet-readiness.mjs",
    "all-mainnet:pool-balances": "node scripts/status/check-all-mainnet-required-pool-balances.mjs",
    "all-mainnet:canary-preflight": "node scripts/status/preflight-all-mainnet-canaries.mjs",
    "all-mainnet:apply-vaults": "node scripts/status/apply-all-mainnet-vault-assignments.mjs",
    "all-mainnet:record-canaries": "node scripts/status/record-all-mainnet-canary-evidence.mjs"
  },
  "keywords": [
    "proxmox",

@@ -102,6 +102,54 @@
        "timeout_ms": 15000
      }
    },
    {
      "repo": "Gov_Web_Portals/DBIS",
      "branch": "main",
      "target": "dbis-portal-live",
      "description": "Redeploy the DBIS public portal on CT 7804 from the staged DBIS checkout overlaid into the Gov Portals workspace.",
      "cwd": "${PHOENIX_REPO_ROOT}",
      "command": [
        "bash",
        "scripts/deployment/phoenix-deploy-dbis-portal-live-from-workspace.sh"
      ],
      "required_env": [
        "PHOENIX_REPO_ROOT",
        "PHOENIX_DEPLOY_WORKSPACE"
      ],
      "timeout_sec": 2400,
      "healthcheck": {
        "url": "https://d-bis.org/.well-known/trust.json",
        "expect_status": 200,
        "expect_body_includes": "\"organization\"",
        "attempts": 12,
        "delay_ms": 5000,
        "timeout_ms": 15000
      }
    },
    {
      "repo": "Gov_Web_Portals/CyberSecur-Global",
      "branch": "main",
      "target": "default",
      "description": "Deploy CyberSecur Global static site to CT 7810 docroot (NPM/Cloudflare upstream) from the staged repo workspace.",
      "cwd": "${PHOENIX_REPO_ROOT}",
      "command": [
        "bash",
        "scripts/deployment/phoenix-deploy-cybersecur-from-workspace.sh"
      ],
      "required_env": [
        "PHOENIX_REPO_ROOT",
        "PHOENIX_DEPLOY_WORKSPACE"
      ],
      "timeout_sec": 900,
      "healthcheck": {
        "url": "https://cybersecur.d-bis.org/",
        "expect_status": 200,
        "expect_body_includes": "CyberSecur",
        "attempts": 10,
        "delay_ms": 4000,
        "timeout_ms": 15000
      }
    },
    {
      "repo": "d-bis/CurrenciCombo",
      "branch": "main",
@@ -125,6 +173,29 @@
        "timeout_ms": 15000
      }
    },
    {
      "repo": "d-bis/CROMERO",
      "branch": "main",
      "target": "default",
      "description": "Deploy CROMERO dapp from the staged Gitea workspace: build dist/, rsync to NPMplus host /var/www/ecosystem/cromero/, served at https://d-bis.org/ecosystem/cromero/.",
      "cwd": "${PHOENIX_REPO_ROOT}",
      "command": [
        "bash",
        "scripts/deployment/phoenix-deploy-cromero-from-workspace.sh"
      ],
      "required_env": [
        "PHOENIX_REPO_ROOT",
        "PHOENIX_DEPLOY_WORKSPACE"
      ],
      "healthcheck": {
        "url": "https://d-bis.org/ecosystem/cromero/",
        "expect_status": 200,
        "expect_body_includes": "<div id=\"root\">",
        "attempts": 12,
        "delay_ms": 5000,
        "timeout_ms": 15000
      }
    },
    {
      "repo": "d-bis/proxmox",
      "branch": "main",

@@ -33,6 +33,63 @@ elif [[ -f "$APP_DIR/.env" ]]; then
elif [[ -f "$APP_DIR/.env.example" ]]; then
  cp "$APP_DIR/.env.example" "$TARGET/.env"
fi

ensure_env_value() {
  local key="$1"
  local value="$2"
  local file="$TARGET/.env"
  [[ -n "$value" && -f "$file" ]] || return 0

  local current=""
  if grep -qE "^${key}=" "$file"; then
    current="$(grep -E "^${key}=" "$file" | tail -n 1 | cut -d= -f2-)"
  fi
  [[ -z "$current" ]] || return 0

  local tmp
  tmp="$(mktemp)"
  awk -v key="$key" -v value="$value" '
    BEGIN { found = 0 }
    $0 ~ "^" key "=" {
      print key "=" value
      found = 1
      next
    }
    { print }
    END {
      if (!found) print key "=" value
    }
  ' "$file" > "$tmp"
  cat "$tmp" > "$file"
  rm -f "$tmp"
}

repo_env_value() {
  local key="$1"
  local file="$REPO_ROOT/.env"
  [[ -f "$file" ]] || return 0
  grep -E "^${key}=" "$file" | tail -n 1 | cut -d= -f2-
}

if [[ -f "$TARGET/.env" ]]; then
  ensure_env_value PHOENIX_REPO_ROOT "$REPO_ROOT"
  for key in \
    GITEA_TOKEN \
    PHOENIX_DEPLOY_SECRET \
    PROXMOX_HOST \
    PROXMOX_PORT \
    PROXMOX_USER \
    PROXMOX_TOKEN_NAME \
    PROXMOX_TOKEN_VALUE \
    PROXMOX_TLS_VERIFY \
    PUBLIC_IP \
    CLOUDFLARE_API_TOKEN \
    CLOUDFLARE_GITEA_SYNC_ZONE \
    PHOENIX_CLOUDFLARE_SYNC
  do
    ensure_env_value "$key" "$(repo_env_value "$key")"
  done
fi
chown -R root:root "$TARGET"
cd "$TARGET" && npm install --omit=dev
cp "$APP_DIR/phoenix-deploy-api.service" /etc/systemd/system/

41
scripts/cloudflare/purge-cybersecur-d-bis-edge-cache.sh
Executable file
@@ -0,0 +1,41 @@
#!/usr/bin/env bash
# Purge Cloudflare edge cache for cybersecur.d-bis.org static paths after deploy.
# Fixes stale HITs where HEAD returns 404 while GET is OK (robots.txt) or favicon cached as 404.
#
# Requires CLOUDFLARE_API_TOKEN with Zone → Cache Purge → Purge (same zone as DNS edits).
# DNS-only scoped tokens often fail here — extend the token or purge once in Dashboard:
#   Caching → Configuration → Purge Cache → Custom Purge → URL list.
#
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
if [[ -f "${PROJECT_ROOT}/.env" ]]; then set -a; source "${PROJECT_ROOT}/.env"; set +a; fi
TOKEN="${CLOUDFLARE_API_TOKEN:?Set CLOUDFLARE_API_TOKEN}"
ZONE="${CLOUDFLARE_ZONE_ID_D_BIS_ORG:-}"
if [[ -z "$ZONE" ]]; then
  ZONE=$(curl -sS -H "Authorization: Bearer ${TOKEN}" \
    "https://api.cloudflare.com/client/v4/zones?name=d-bis.org" | jq -r '.result[0].id // empty')
fi
[[ -n "$ZONE" ]] || { echo "Could not resolve zone id for d-bis.org" >&2; exit 1; }

# Note: the purge body only needs the file list; no zone arg is passed to jq here.
BODY=$(jq -n '{
  files: [
    "https://cybersecur.d-bis.org/",
    "https://cybersecur.d-bis.org/robots.txt",
    "https://cybersecur.d-bis.org/sitemap.xml",
    "https://cybersecur.d-bis.org/favicon.ico",
    "https://cybersecur.d-bis.org/index.html"
  ]
}')
RESP=$(curl -sS -X POST "https://api.cloudflare.com/client/v4/zones/${ZONE}/purge_cache" \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "Content-Type: application/json" \
  -d "$BODY")
echo "$RESP" | jq .

if echo "$RESP" | jq -e '.success == true' >/dev/null 2>&1; then
  echo "✓ Purged. Recheck: curl -sSI https://cybersecur.d-bis.org/robots.txt | head -1"
  exit 0
fi
echo "If authentication failed, add Cache Purge permission to the token or run Custom Purge in Cloudflare Dashboard for the URLs above." >&2
exit 1
@@ -5,7 +5,7 @@ SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
SUBMODULE_ROOT="$PROJECT_ROOT/atomic-swap-dapp"
source "$PROJECT_ROOT/config/ip-addresses.conf" 2>/dev/null || true
-PROXMOX_HOST="${PROXMOX_HOST:-${PROXMOX_HOST_R630_02:-192.168.11.12}}"
+PROXMOX_HOST="${PROXMOX_DAPP_HOST:-${PROXMOX_HOST_R630_02:-192.168.11.12}}"
VMID="${VMID:-5801}"
DEPLOY_ROOT="${DEPLOY_ROOT:-/var/www/atomic-swap}"
TMP_ARCHIVE="/tmp/atomic-swap-dapp-5801.tgz"
@@ -61,6 +61,7 @@ ssh $SSH_OPTS "root@$PROXMOX_HOST" true
scp -q $SSH_OPTS "$TMP_ARCHIVE" "root@$PROXMOX_HOST:/tmp/atomic-swap-dapp-5801.tgz"

ssh $SSH_OPTS "root@$PROXMOX_HOST" "
  set -euo pipefail
  pct push $VMID /tmp/atomic-swap-dapp-5801.tgz /tmp/atomic-swap-dapp-5801.tgz
  pct exec $VMID -- bash -lc '
    set -euo pipefail
@@ -35,6 +35,7 @@ trap cleanup EXIT

tar czf "$MESH_TGZ" -C "$PROJECT_ROOT" \
  smom-dbis-138/scripts/reserve/pmm-mesh-6s-automation.sh \
+  smom-dbis-138/scripts/reserve/sync-weth-mock-price.sh \
  smom-dbis-138/scripts/update-oracle-price.sh \
  smom-dbis-138/.env
@@ -126,6 +127,8 @@ Environment=PMM_MESH_INTERVAL_SEC=6
Environment=MESH_CAST_GAS_PRICE=2gwei
Environment=ENABLE_MESH_ORACLE_TICK=1
Environment=ENABLE_MESH_KEEPER_TICK=1
+Environment=ENABLE_MESH_WETH_MOCK_SYNC=0
+Environment=MESH_WETH_MOCK_SYNC_EVERY_N=5
Environment=ENABLE_MESH_PMM_READS=1
Environment=ENABLE_MESH_WETH_READS=1
EnvironmentFile=-/var/tmp/chain138-mesh/smom-dbis-138/.env
@@ -157,7 +160,7 @@ Wants=network-online.target

[Service]
Type=simple
-ExecStart=${PCT_BIN} exec ${VMID} -- env PATH=${BASE}/bin:/usr/bin:/bin HOME=/tmp PMM_MESH_INTERVAL_SEC=6 MESH_CAST_GAS_PRICE=2gwei ENABLE_MESH_ORACLE_TICK=1 ENABLE_MESH_KEEPER_TICK=1 ENABLE_MESH_PMM_READS=1 ENABLE_MESH_WETH_READS=1 /bin/bash --noprofile --norc ${BASE}/smom-dbis-138/scripts/reserve/pmm-mesh-6s-automation.sh
+ExecStart=${PCT_BIN} exec ${VMID} -- env PATH=${BASE}/bin:/usr/bin:/bin HOME=/tmp PMM_MESH_INTERVAL_SEC=6 MESH_CAST_GAS_PRICE=2gwei ENABLE_MESH_ORACLE_TICK=1 ENABLE_MESH_KEEPER_TICK=1 ENABLE_MESH_WETH_MOCK_SYNC=0 MESH_WETH_MOCK_SYNC_EVERY_N=5 ENABLE_MESH_PMM_READS=1 ENABLE_MESH_WETH_READS=1 /bin/bash --noprofile --norc ${BASE}/smom-dbis-138/scripts/reserve/pmm-mesh-6s-automation.sh
Restart=always
RestartSec=5
101 scripts/deployment/phoenix-deploy-cromero-from-workspace.sh Executable file
@@ -0,0 +1,101 @@
#!/usr/bin/env bash
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
if [[ -f "$PROJECT_ROOT/scripts/lib/load-project-env.sh" ]]; then
  # shellcheck source=/dev/null
  source "$PROJECT_ROOT/scripts/lib/load-project-env.sh"
fi
if [[ -f "$PROJECT_ROOT/config/ip-addresses.conf" ]]; then
  # shellcheck source=/dev/null
  source "$PROJECT_ROOT/config/ip-addresses.conf" 2>/dev/null || true
fi

PHOENIX_DEPLOY_WORKSPACE="${PHOENIX_DEPLOY_WORKSPACE:-}"
NPMPLUS_PROXMOX_HOST="${NPMPLUS_PROXMOX_HOST:-${PROXMOX_HOST_R630_01:-192.168.11.11}}"
NPMPLUS_VMID="${NPMPLUS_VMID:-10233}"
NPMPLUS_DEPLOY_ROOT="${NPMPLUS_DEPLOY_ROOT:-/var/www/ecosystem/cromero}"
NPMPLUS_DATA_ROOT="${NPMPLUS_DATA_ROOT:-/opt/npmplus/html/ecosystem/cromero}"
PUBLIC_URL="${PUBLIC_URL:-https://d-bis.org/ecosystem/cromero/}"
DRY_RUN="${DRY_RUN:-0}"
SSH_OPTS=(-o BatchMode=yes -o ConnectTimeout=15 -o StrictHostKeyChecking=accept-new)
TMP_ARCHIVE="/tmp/cromero-dapp-dist-$$.tgz"

usage() { printf 'Usage: %s [--dry-run]\n' "$(basename "$0")"; }
log() { printf '[cromero-phoenix] %s\n' "$*" >&2; }
die() { printf '[cromero-phoenix][FATAL] %s\n' "$*" >&2; exit 1; }
need_cmd() { command -v "$1" >/dev/null 2>&1 || die "missing required command: $1"; }
run() { if [[ "$DRY_RUN" == 1 ]]; then printf '[dry-run] %q ' "$@" >&2; printf '\n' >&2; else "$@"; fi; }
cleanup() { rm -f "$TMP_ARCHIVE"; }
trap cleanup EXIT

while [[ $# -gt 0 ]]; do
  case "$1" in
    --dry-run) DRY_RUN=1; shift ;;
    -h|--help) usage; exit 0 ;;
    *) die "unknown arg: $1" ;;
  esac
done

for cmd in ssh scp tar curl node npm; do need_cmd "$cmd"; done
[[ -n "$PHOENIX_DEPLOY_WORKSPACE" ]] || die "PHOENIX_DEPLOY_WORKSPACE is required"
[[ -d "$PHOENIX_DEPLOY_WORKSPACE" ]] || die "staged workspace missing: $PHOENIX_DEPLOY_WORKSPACE"

log "building CROMERO from staged workspace: $PHOENIX_DEPLOY_WORKSPACE"
pushd "$PHOENIX_DEPLOY_WORKSPACE" >/dev/null
if [[ -f package-lock.json ]]; then
  run npm ci --no-audit --no-fund
else
  run npm install --no-audit --no-fund
fi
run env \
  VITE_THIRDWEB_CLIENT_ID="${VITE_THIRDWEB_CLIENT_ID:-}" \
  VITE_PROJECT_WALLET_ADDRESS="${VITE_PROJECT_WALLET_ADDRESS:-0x3E309b87fA79092767531a0A6F5B6c3480737c5e}" \
  VITE_CHAIN_138_RPC="${VITE_CHAIN_138_RPC:-https://rpc.d-bis.org}" \
  VITE_CHAIN_138_EXPLORER="${VITE_CHAIN_138_EXPLORER:-https://explorer.d-bis.org}" \
  npm run build
[[ -f dist/index.html ]] || die "build produced no dist/index.html"
tar -C "$PHOENIX_DEPLOY_WORKSPACE" -czf "$TMP_ARCHIVE" dist
popd >/dev/null

log "deploying dist/ through $NPMPLUS_PROXMOX_HOST to CT $NPMPLUS_VMID:$NPMPLUS_DEPLOY_ROOT"
run scp "${SSH_OPTS[@]}" "$TMP_ARCHIVE" "root@$NPMPLUS_PROXMOX_HOST:/tmp/cromero-dapp-dist.tgz"
if [[ "$DRY_RUN" == 1 ]]; then
  log "dry-run complete"
  exit 0
fi
ssh "${SSH_OPTS[@]}" "root@$NPMPLUS_PROXMOX_HOST" bash -s -- "$NPMPLUS_VMID" "$NPMPLUS_DEPLOY_ROOT" "$NPMPLUS_DATA_ROOT" <<'INNER'
set -euo pipefail
vmid="$1"
deploy_root="$2"
data_root="$3"
pct exec "$vmid" -- bash -lc "
  set -euo pipefail
  mkdir -p '$data_root' /var/www/ecosystem
  if [ -e '$deploy_root' ] && [ ! -L '$deploy_root' ]; then rm -rf '$deploy_root'; fi
  ln -sfn '$data_root' '$deploy_root'
  rm -rf /tmp/cromero-dist
"
pct push "$vmid" /tmp/cromero-dapp-dist.tgz /tmp/cromero-dapp-dist.tgz
pct exec "$vmid" -- bash -lc "
  set -euo pipefail
  rm -rf /tmp/cromero-dist
  mkdir -p /tmp/cromero-dist '$data_root'
  tar -xzf /tmp/cromero-dapp-dist.tgz -C /tmp/cromero-dist
  find '$data_root' -mindepth 1 -maxdepth 1 -exec rm -rf {} +
  cp -R /tmp/cromero-dist/dist/. '$data_root/'
  chown -R root:root /opt/npmplus/html/ecosystem /var/www/ecosystem
  chmod 755 /opt/npmplus/html /opt/npmplus/html/ecosystem '$data_root' /var/www /var/www/ecosystem
  if command -v docker >/dev/null 2>&1; then
    docker exec npmplus sh -lc 'rm -rf /var/www && ln -s /data/html /var/www && nginx -t && nginx -s reload'
  fi
  rm -rf /tmp/cromero-dist /tmp/cromero-dapp-dist.tgz
"
rm -f /tmp/cromero-dapp-dist.tgz
INNER

log "verifying $PUBLIC_URL"
body="$(curl -fsS --max-time 20 "$PUBLIC_URL")" || die "public URL failed: $PUBLIC_URL"
printf '%s' "$body" | grep -F '<div id="root"></div>' >/dev/null || die "public URL missing React root"
log "CROMERO Phoenix deploy completed"
9 scripts/deployment/phoenix-deploy-cybersecur-from-workspace.sh Executable file
@@ -0,0 +1,9 @@
#!/usr/bin/env bash
# Deploy Gov_Web_Portals/CyberSecur-Global static tree from Phoenix-staged workspace to CT 7810.
# Invoked by phoenix-deploy-api when deploy-targets.json references this script;
# requires PHOENIX_REPO_ROOT (proxmox checkout) and PHOENIX_DEPLOY_WORKSPACE (synced CyberSecur-Global tree).
set -euo pipefail
: "${PHOENIX_REPO_ROOT:?PHOENIX_REPO_ROOT missing}"
: "${PHOENIX_DEPLOY_WORKSPACE:?PHOENIX_DEPLOY_WORKSPACE missing}"
export CYBERSECUR_REPO="$PHOENIX_DEPLOY_WORKSPACE"
exec bash "${PHOENIX_REPO_ROOT}/scripts/deployment/sync-cybersecur-global-to-ct7810.sh"
186 scripts/deployment/phoenix-deploy-dbis-portal-live-from-workspace.sh Executable file
@@ -0,0 +1,186 @@
#!/usr/bin/env bash
# Deploy the DBIS public portal from a Phoenix Deploy API staged DBIS checkout.
#
# The DBIS repo is normally a submodule of Gov_Web_Portals/gov-portals-monorepo
# and depends on the parent workspace package @public-web-portals/shared. This
# wrapper builds a temporary monorepo-shaped workspace, overlays the staged DBIS
# source into it, syncs that tree to CT 7804, then rebuilds/restarts DBIS.

set -euo pipefail

die() {
  echo "ERROR: $*" >&2
  exit 1
}

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"

source "$PROJECT_ROOT/config/ip-addresses.conf" 2>/dev/null || true
if [ -f "$PROJECT_ROOT/.env" ]; then set +u; source "$PROJECT_ROOT/.env" 2>/dev/null || true; set -u; fi

PHOENIX_REPO_ROOT="${PHOENIX_REPO_ROOT:-$PROJECT_ROOT}"
PHOENIX_DEPLOY_WORKSPACE="${PHOENIX_DEPLOY_WORKSPACE:-}"
GOV_PORTALS_REPO_URL="${GOV_PORTALS_REPO_URL:-https://gitea.d-bis.org/Gov_Web_Portals/gov-portals-monorepo.git}"
GOV_PORTALS_REF="${GOV_PORTALS_REF:-main}"

VMID_GOV_PORTALS="${VMID_GOV_PORTALS:-7804}"
IP_GOV_PORTALS_DEV="${IP_GOV_PORTALS_DEV:-192.168.11.54}"
PROXMOX_HOST="${DBIS_PORTAL_PROXMOX_HOST:-${PROXMOX_HOST_GOV_PORTALS:-192.168.11.14}}"
CT_APP_DIR="${DBIS_PORTAL_CT_DIR:-/srv/gov-portals}"
SERVICE_NAME="${DBIS_PORTAL_SERVICE:-gov-portal-DBIS}"
DBIS_PORT="${DBIS_PORT:-3001}"

[[ -d "$PHOENIX_REPO_ROOT" ]] || die "PHOENIX_REPO_ROOT does not exist: $PHOENIX_REPO_ROOT"
[[ -n "$PHOENIX_DEPLOY_WORKSPACE" ]] || die "PHOENIX_DEPLOY_WORKSPACE is required"
[[ -d "$PHOENIX_DEPLOY_WORKSPACE" ]] || die "staged DBIS workspace missing: $PHOENIX_DEPLOY_WORKSPACE"
[[ "$CT_APP_DIR" != "/" ]] || die "refusing to deploy into /"

TMP_DIR="$(mktemp -d)"
BUILD_CONTEXT="$TMP_DIR/gov-portals"
ARCHIVE="$TMP_DIR/gov-portals-dbis-live.tgz"
REMOTE_ARCHIVE="/tmp/gov-portals-dbis-live-${PHOENIX_DEPLOY_SHA:-manual}-$$.tgz"

cleanup() {
  rm -rf "$TMP_DIR"
}
trap cleanup EXIT

echo "Preparing DBIS live deploy context"
echo "  DBIS source: $PHOENIX_DEPLOY_WORKSPACE"
echo "  parent repo: $GOV_PORTALS_REPO_URL#$GOV_PORTALS_REF"
echo "  target: CT $VMID_GOV_PORTALS ($IP_GOV_PORTALS_DEV), service $SERVICE_NAME, port $DBIS_PORT"

git_auth_args=()
if [[ -n "${GITEA_TOKEN:-}" ]]; then
  git_auth_args=(-c "http.extraHeader=Authorization: token ${GITEA_TOKEN}")
fi

git "${git_auth_args[@]}" clone --depth 1 --branch "$GOV_PORTALS_REF" "$GOV_PORTALS_REPO_URL" "$BUILD_CONTEXT"

rm -rf "$BUILD_CONTEXT/DBIS"
mkdir -p "$BUILD_CONTEXT/DBIS"
tar \
  --exclude=.git \
  --exclude=node_modules \
  --exclude=.next \
  --exclude='*.tsbuildinfo' \
  -C "$PHOENIX_DEPLOY_WORKSPACE" \
  -cf - . | tar -C "$BUILD_CONTEXT/DBIS" -xf -

tar \
  --exclude=.git \
  --exclude=node_modules \
  --exclude=.next \
  --exclude='*.tsbuildinfo' \
  -C "$BUILD_CONTEXT" \
  -czf "$ARCHIVE" .

echo "Uploading deploy archive to Proxmox host $PROXMOX_HOST"
scp -q -o ConnectTimeout=10 -o StrictHostKeyChecking=accept-new "$ARCHIVE" "root@$PROXMOX_HOST:$REMOTE_ARCHIVE"

echo "Pushing archive into CT $VMID_GOV_PORTALS"
ssh -o ConnectTimeout=10 -o StrictHostKeyChecking=accept-new "root@$PROXMOX_HOST" \
  "pct push $VMID_GOV_PORTALS '$REMOTE_ARCHIVE' '$REMOTE_ARCHIVE'"

echo "Extracting, building, and restarting DBIS inside CT $VMID_GOV_PORTALS"
ssh -o ConnectTimeout=10 -o StrictHostKeyChecking=accept-new "root@$PROXMOX_HOST" \
  "pct exec $VMID_GOV_PORTALS -- bash -s" <<CT_SCRIPT
set -euo pipefail

CT_APP_DIR="$CT_APP_DIR"
REMOTE_ARCHIVE="$REMOTE_ARCHIVE"
SERVICE_NAME="$SERVICE_NAME"
DBIS_PORT="$DBIS_PORT"

mkdir -p "\$CT_APP_DIR"
ENV_BACKUP="\$(mktemp -d)"
if [ -d "\$CT_APP_DIR/DBIS" ]; then
  for env_file in .env .env.local .env.production; do
    if [ -f "\$CT_APP_DIR/DBIS/\$env_file" ]; then
      cp "\$CT_APP_DIR/DBIS/\$env_file" "\$ENV_BACKUP/\$env_file"
    fi
  done
fi

find "\$CT_APP_DIR" -mindepth 1 -maxdepth 1 \
  ! -name ".env" \
  ! -name ".env.local" \
  ! -name ".env.production" \
  -exec rm -rf {} +

tar -xzf "\$REMOTE_ARCHIVE" -C "\$CT_APP_DIR"
rm -f "\$REMOTE_ARCHIVE"

mkdir -p "\$CT_APP_DIR/DBIS"
for env_file in .env .env.local .env.production; do
  if [ -f "\$ENV_BACKUP/\$env_file" ] && [ ! -f "\$CT_APP_DIR/DBIS/\$env_file" ]; then
    cp "\$ENV_BACKUP/\$env_file" "\$CT_APP_DIR/DBIS/\$env_file"
  fi
done
rm -rf "\$ENV_BACKUP"

if ! command -v curl >/dev/null 2>&1; then
  apt-get update -qq
  apt-get install -y -qq curl ca-certificates
fi

NODE_MAJOR="\$(node -p 'process.versions.node.split(".")[0]' 2>/dev/null || echo 0)"
if [ "\$NODE_MAJOR" -lt 20 ]; then
  curl -fsSL https://deb.nodesource.com/setup_20.x | bash -
  apt-get install -y nodejs
  hash -r
fi

export PATH="/usr/local/bin:/usr/bin:/bin:\$PATH"
if ! command -v pnpm >/dev/null 2>&1; then
  npm install -g pnpm@8.15.0
  hash -r
fi
PNPM_BIN="\$(command -v pnpm || true)"
if [ -z "\$PNPM_BIN" ]; then
  for candidate in /usr/local/bin/pnpm /usr/bin/pnpm; do
    if [ -x "\$candidate" ]; then
      PNPM_BIN="\$candidate"
      break
    fi
  done
fi
[ -n "\$PNPM_BIN" ] || { echo "pnpm is required but was not found after install" >&2; exit 1; }

cd "\$CT_APP_DIR"
"\$PNPM_BIN" install --frozen-lockfile
"\$PNPM_BIN" --filter portal-dbis build

cat > "/etc/systemd/system/\$SERVICE_NAME.service" <<UNIT
[Unit]
Description=Gov Portal DBIS
After=network.target

[Service]
Type=simple
User=root
WorkingDirectory=\$CT_APP_DIR/DBIS
Environment=NODE_ENV=production
Environment=PORT=\$DBIS_PORT
EnvironmentFile=-\$CT_APP_DIR/DBIS/.env.production
EnvironmentFile=-\$CT_APP_DIR/DBIS/.env.local
ExecStart=/usr/bin/node \$CT_APP_DIR/DBIS/node_modules/next/dist/bin/next start -p \$DBIS_PORT
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
UNIT

systemctl daemon-reload
systemctl enable "\$SERVICE_NAME"
systemctl restart "\$SERVICE_NAME"
sleep 3
curl -fsS --max-time 15 "http://127.0.0.1:\$DBIS_PORT/" >/dev/null
CT_SCRIPT

ssh -o ConnectTimeout=10 -o StrictHostKeyChecking=accept-new "root@$PROXMOX_HOST" "rm -f '$REMOTE_ARCHIVE'" >/dev/null 2>&1 || true

echo "DBIS live deployment complete."
echo "Local origin check: http://$IP_GOV_PORTALS_DEV:$DBIS_PORT/"
76 scripts/deployment/provision-cybersecur-npmplus.sh Executable file
@@ -0,0 +1,76 @@
#!/usr/bin/env bash
# Create NPMplus proxy host for cybersecur.d-bis.org → static upstream (default: MIM web nginx IP).
# Prerequisites: DNS A record for cybersecur.d-bis.org (Cloudflare → origin); static files on upstream (see CyberSecur-Global/deploy/).
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
# shellcheck disable=1091
source "${PROJECT_ROOT}/config/ip-addresses.conf" 2>/dev/null || true
_orig_npm_url="${NPM_URL:-}"
_orig_npm_email="${NPM_EMAIL:-}"
_orig_npm_password="${NPM_PASSWORD:-}"
if [[ -f "${PROJECT_ROOT}/.env" ]]; then
  set +u
  set -a
  # shellcheck disable=1091
  source "${PROJECT_ROOT}/.env" 2>/dev/null || true
  set +a
  set -u
  [[ -n "$_orig_npm_url" ]] && NPM_URL="$_orig_npm_url"
  [[ -n "$_orig_npm_email" ]] && NPM_EMAIL="$_orig_npm_email"
  [[ -n "$_orig_npm_password" ]] && NPM_PASSWORD="$_orig_npm_password"
fi

NPM_URL="${NPM_URL:-https://${IP_NPMPLUS:-192.168.11.167}:81}"
NPM_EMAIL="${NPM_EMAIL:?Set NPM_EMAIL}"
NPM_PASSWORD="${NPM_PASSWORD:?Set NPM_PASSWORD}"

DOMAIN="${CYBERSECUR_DOMAIN:-cybersecur.d-bis.org}"
FORWARD_HOST="${CYBERSECUR_FORWARD_HOST:-${IP_MIM_WEB:-192.168.11.37}}"
FORWARD_PORT="${CYBERSECUR_FORWARD_PORT:-80}"

curl_npm() { curl -s -k -L --connect-timeout 10 --max-time "${NPM_CURL_MAX_TIME:-120}" "$@"; }

AUTH_JSON=$(jq -n --arg identity "$NPM_EMAIL" --arg secret "$NPM_PASSWORD" '{identity:$identity,secret:$secret}')
TOKEN_RESPONSE=$(curl_npm -X POST "$NPM_URL/api/tokens" -H "Content-Type: application/json" -d "$AUTH_JSON")
TOKEN=$(echo "$TOKEN_RESPONSE" | jq -r '.token // empty' 2>/dev/null || true)
if [[ -z "$TOKEN" || "$TOKEN" == "null" ]]; then
  echo "❌ NPM authentication failed" >&2
  exit 1
fi

PROXY_HOSTS_JSON=$(curl_npm -X GET "$NPM_URL/api/nginx/proxy-hosts" -H "Authorization: Bearer $TOKEN")
HOST_ID=$(echo "$PROXY_HOSTS_JSON" | jq -r --arg d "$DOMAIN" '.[] | select(.domain_names[]? == $d) | .id' 2>/dev/null | head -1 || true)

if [[ -n "${HOST_ID:-}" && "$HOST_ID" != "null" ]]; then
  echo "✓ Proxy host already exists: $DOMAIN (id=$HOST_ID)"
  exit 0
fi

CREATE_PAYLOAD=$(jq -n \
  --arg domain "$DOMAIN" \
  --arg forward_host "$FORWARD_HOST" \
  --argjson forward_port "$FORWARD_PORT" \
  '{
    domain_names: [$domain],
    forward_scheme: "http",
    forward_host: $forward_host,
    forward_port: ($forward_port | tonumber),
    allow_websocket_upgrade: false,
    certificate_id: null,
    ssl_forced: false
  }')

RESPONSE=$(curl_npm -X POST "$NPM_URL/api/nginx/proxy-hosts" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d "$CREATE_PAYLOAD")

NEW_ID=$(echo "$RESPONSE" | jq -r '.id // empty' 2>/dev/null || true)
if [[ -n "$NEW_ID" && "$NEW_ID" != "null" ]]; then
  echo "✓ Created $DOMAIN → http://${FORWARD_HOST}:${FORWARD_PORT} (proxy host id=$NEW_ID)"
  echo "  Next: deploy static files to upstream (see CyberSecur-Global/deploy/) and request SSL in NPM or run request-npmplus-certificates.sh"
else
  echo "❌ Failed: $(echo "$RESPONSE" | jq -c . 2>/dev/null || echo "$RESPONSE")" >&2
  exit 1
fi
55 scripts/deployment/sync-cybersecur-global-to-ct7810.sh Executable file
@@ -0,0 +1,55 @@
#!/usr/bin/env bash
# Deploy CyberSecur-Global static site to CT 7810 (no direct SSH to container IP required).
# Uses SSH to Proxmox host + pct exec (same pattern as operator workflows).
#
# Prerequisites: SSH key to root@PROXMOX_HOST (default r630-02); CT 7810 running nginx with
# root /var/www/cybersecur-d-bis (see deploy/nginx-cybersecur-d-bis.conf.example in repo).
#
# Usage:
#   CYBERSECUR_REPO=/path/to/CyberSecur-Global ./scripts/deployment/sync-cybersecur-global-to-ct7810.sh
#
# SSH target defaults to 192.168.11.12 (r630-02) where VMID 7810 runs. Override with:
#   CYBERSECUR_CT7810_PROXMOX_HOST=192.168.11.12
#
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
# shellcheck disable=1091
source "${PROJECT_ROOT}/config/ip-addresses.conf" 2>/dev/null || true

REPO="${CYBERSECUR_REPO:-${PROJECT_ROOT}/../CyberSecur-Global}"
REPO="$(cd "$REPO" && pwd)"
VMID="${CYBERSECUR_VMID:-7810}"
REMOTE="${CYBERSECUR_REMOTE_PATH:-/var/www/cybersecur-d-bis}"

if [[ ! -f "$REPO/index.html" ]]; then
  echo "❌ CYBERSECUR_REPO must point to CyberSecur-Global clone (missing index.html): $REPO" >&2
  exit 1
fi

# Render Web3Forms intake locally when CYBERSECUR_WEB3FORMS_ACCESS_KEY is set (do not commit rendered intake with secrets).
if [[ -f "${PROJECT_ROOT}/scripts/lib/load-project-env.sh" ]]; then
  set +u
  # shellcheck disable=1091
  source "${PROJECT_ROOT}/scripts/lib/load-project-env.sh" 2>/dev/null || true
  set -u
fi
if [[ -n "${CYBERSECUR_WEB3FORMS_ACCESS_KEY:-}" && -x "${REPO}/deploy/render-intake.sh" ]]; then
  echo "Rendering intake.html from Web3Forms template (key present)..."
  (cd "$REPO" && ./deploy/render-intake.sh)
fi

# CT 7810 lives on r630-02 (192.168.11.12). Set after dotenv so stray CYBERSECUR_PROXMOX_HOST in .env cannot hijack SSH target.
PROXMOX_HOST="${CYBERSECUR_CT7810_PROXMOX_HOST:-192.168.11.12}"

echo "Packing $REPO → root@${PROXMOX_HOST} pct exec $VMID → ${REMOTE}/"
(
  cd "$REPO"
  tar czf - \
    --exclude=.git \
    --exclude='deploy/*.sh' \
    .
) | ssh -o BatchMode=yes "root@${PROXMOX_HOST}" \
  "pct exec ${VMID} -- bash -c 'set -euo pipefail; mkdir -p ${REMOTE}; tar xzf - -C ${REMOTE}; chown -R www-data:www-data ${REMOTE}; nginx -t && systemctl reload nginx'"

echo "✓ Deployed. Verify: curl -sSI https://cybersecur.d-bis.org/robots.txt https://cybersecur.d-bis.org/sitemap.xml https://cybersecur.d-bis.org/favicon.ico | head"
43 scripts/git/README-gitea-proxmox-sync.md Normal file
@@ -0,0 +1,43 @@
# Syncing `master` with `gitea` remote (d-bis/proxmox)

If **`git push gitea master`** fails with **non-fast-forward**, local and **gitea.d-bis.org/d-bis/proxmox** have **diverged**.

```bash
cd /path/to/proxmox
git fetch gitea
git log --oneline --left-right master...gitea/master | head -40
```

Resolve by either:

1. **Merge** (preserves both histories):
   `git pull gitea master --no-rebase`
   fix conflicts, then `git push gitea master` and `git push origin master` as needed.

2. **Rebase** (linear history; use only if your team agrees):
   `git pull gitea master --rebase`
   then push both remotes.

Do **not** force-push to **gitea** unless you intend to rewrite shared history.

CyberSecur-related commits exist on **GitHub `origin`**; **gitea** may lack them until a successful merge and push.

## Parity check (e.g. before audit submission)

All three should match (same commit hash):

```bash
git fetch origin
git fetch gitea
git rev-parse master origin/master gitea/master
```

If local `master` merely lags behind and can **fast-forward** (no local-only commits you need to keep), update and align:

```bash
git checkout master
git pull origin master --ff-only
# or: git merge --ff-only gitea/master
```

If you have uncommitted changes, **stash** first (`git stash push -u` if untracked files would be overwritten), then pull, then `git stash pop` as needed. After alignment, `git push origin master` and `git push gitea master` should report **up to date** when remotes already share the tip.
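The fast-forward alignment and parity check above can be rehearsed in a throwaway sandbox. Everything below is hypothetical scaffolding for the demo: temporary `mktemp` paths and a local bare `upstream.git` standing in for the real GitHub/Gitea remotes.

```shell
#!/usr/bin/env bash
# Sandbox for the parity check: seed a bare "remote", clone it twice, align the
# second clone's master to the remote tip fast-forward-only, and confirm the
# refs resolve to the same commit. All paths are throwaway mktemp dirs.
set -euo pipefail
work="$(mktemp -d)"
trap 'rm -rf "$work"' EXIT

# Bare repo playing the role of the shared remote.
git init -q --bare "$work/upstream.git"

# First clone: create one commit and publish it as master.
git clone -q "$work/upstream.git" "$work/a" 2>/dev/null
git -C "$work/a" -c user.email=a@example.com -c user.name=a \
  commit -q --allow-empty -m "seed"
git -C "$work/a" push -q origin HEAD:master

# Second clone: align local master to the remote tip, fast-forward only.
git clone -q "$work/upstream.git" "$work/b" 2>/dev/null
git -C "$work/b" checkout -q -B master origin/master
git -C "$work/b" merge -q --ff-only origin/master  # no-op when already aligned

# Parity check from this README: both refs print the same hash.
git -C "$work/b" rev-parse master origin/master
```

The final `rev-parse` prints one hash per ref, and identical hashes are exactly the signal the parity check looks for across `master`, `origin/master`, and `gitea/master`.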
126 scripts/status/apply-all-mainnet-vault-assignments.mjs Executable file
@@ -0,0 +1,126 @@
#!/usr/bin/env node
/**
 * Apply explicit vault-role assignments to the ALL Mainnet pool matrix.
 *
 * Input priority for each assignment:
 *   byPoolId[poolId][role] > byChain[chainId][role] > defaultByRole[role]
 *
 * The script refuses zero addresses and placeholders. By default it only writes
 * requiredForSpend rows; pass --all to update every row.
 */

import { existsSync, readFileSync, writeFileSync } from "node:fs";
import { resolve } from "node:path";

const repoRoot = resolve(new URL("../..", import.meta.url).pathname);
const matrixPath = resolve(repoRoot, "config/all-mainnet-pool-creation-matrix.json");
const defaultAssignmentsPath = resolve(repoRoot, "config/all-mainnet-vault-assignments.json");
const args = process.argv.slice(2);
const dryRun = args.includes("--dry-run");
const updateAll = args.includes("--all");
const assignmentArg = args.find((arg) => !arg.startsWith("--"));
const assignmentPath = resolve(repoRoot, assignmentArg || defaultAssignmentsPath);
const addressPattern = /^0x[a-fA-F0-9]{40}$/;
const zeroAddress = "0x0000000000000000000000000000000000000000";

function usageAndExit(message, code = 1) {
  if (message) console.error(`[ERROR] ${message}`);
  console.error("Usage: node scripts/status/apply-all-mainnet-vault-assignments.mjs [config/all-mainnet-vault-assignments.json] [--dry-run] [--all]");
  process.exit(code);
}

if (args.includes("--help") || args.includes("-h")) {
  usageAndExit(null, 0);
}

if (!existsSync(assignmentPath)) {
  usageAndExit(`Missing assignment file ${assignmentPath}. Copy config/all-mainnet-vault-assignments.example.json first.`);
}

function readJson(path) {
  return JSON.parse(readFileSync(path, "utf8"));
}

function isRealAddress(address) {
  return typeof address === "string" && addressPattern.test(address) && address.toLowerCase() !== zeroAddress;
}

function addressFor(config, row, role) {
  return (
    config.byPoolId?.[row.poolId]?.[role] ||
    config.byChain?.[String(row.chainId)]?.[role] ||
    config.defaultByRole?.[role] ||
    null
  );
}

const matrix = readJson(matrixPath);
const config = readJson(assignmentPath);
const touched = [];
const missing = [];
const invalid = [];

for (const row of matrix.rows) {
  if (!updateAll && row.requiredForSpend !== true) continue;
  if (!Array.isArray(row.vaultAssignments)) continue;

  let rowChanged = false;
  for (const assignment of row.vaultAssignments) {
    if (!assignment?.role || assignment.requiredBeforeFunding !== true) continue;
    const nextAddress = addressFor(config, row, assignment.role);
    if (!nextAddress) {
      missing.push(`${row.poolId}:${assignment.role}`);
      continue;
    }
    if (!isRealAddress(nextAddress)) {
      invalid.push(`${row.poolId}:${assignment.role}:${nextAddress}`);
      continue;
    }
    if (assignment.vaultAddress !== nextAddress) {
      assignment.vaultAddress = nextAddress;
      rowChanged = true;
    }
  }

  const missingRequired = row.vaultAssignments
    .filter((assignment) => assignment.requiredBeforeFunding === true && assignment.vaultAddress === null)
    .map((assignment) => assignment.role);
  row.missingRequiredVaultRoles = missingRequired;
  row.vaultAssignmentStatus = missingRequired.length > 0 ? "missing_required_vaults" : "ready";

  if (rowChanged) {
    if (!row.notes.includes("Vault assignments applied from explicit All Mainnet vault assignment map.")) {
      row.notes.push("Vault assignments applied from explicit All Mainnet vault assignment map.");
    }
    touched.push(row.poolId);
  }
}

if (invalid.length > 0) {
  console.error("[ERROR] Invalid or placeholder vault addresses:");
  for (const item of invalid) console.error(`  - ${item}`);
  process.exit(1);
}

if (missing.length > 0) {
  console.error("[ERROR] Missing vault assignments:");
  for (const item of missing.slice(0, 80)) console.error(`  - ${item}`);
  if (missing.length > 80) console.error(`  ... ${missing.length - 80} more`);
  process.exit(1);
}

matrix.generatedAt = new Date().toISOString();
matrix.statusCounts = matrix.rows.reduce((counts, row) => {
  counts[row.status] = (counts[row.status] || 0) + 1;
  return counts;
}, {});
matrix.protocolCounts = matrix.rows.reduce((counts, row) => {
  counts[row.protocol] = (counts[row.protocol] || 0) + 1;
  return counts;
}, {});

if (!dryRun) {
  writeFileSync(matrixPath, `${JSON.stringify(matrix, null, 2)}\n`);
}

console.log(`[OK] Vault assignment ${dryRun ? "dry run" : "apply"} complete: ${touched.length} row(s) touched.`);
222 scripts/status/check-all-mainnet-required-pool-balances.mjs Executable file
@@ -0,0 +1,222 @@
#!/usr/bin/env node
|
||||
/**
|
||||
* Check required ALL Mainnet spend pool contracts and token balances.
|
||||
*
|
||||
* By default this writes reports/status/all-mainnet-required-pool-balances-latest.json.
|
||||
* Use --update-matrix to promote required rows with non-zero base and quote
|
||||
* balances to live_read and attach reserveEvidence. This does not create,
|
||||
* fund, or canary any pool.
|
||||
*/
|
||||
|
||||
import { mkdirSync, readFileSync, writeFileSync } from "node:fs";
|
||||
import { resolve } from "node:path";
|
||||
|
||||
const repoRoot = resolve(new URL("../..", import.meta.url).pathname);
|
||||
const matrixPath = resolve(repoRoot, "config/all-mainnet-pool-creation-matrix.json");
|
||||
const outDir = resolve(repoRoot, "reports/status");
|
||||
const updateMatrix = process.argv.includes("--update-matrix");
|
||||
|
||||
const rpcByChain = {
|
||||
1: process.env.ETHEREUM_MAINNET_RPC || process.env.RPC_URL_1 || "https://ethereum.publicnode.com",
|
||||
10: process.env.OPTIMISM_RPC || process.env.RPC_URL_10 || "https://optimism.publicnode.com",
|
||||
25: process.env.CRONOS_RPC || process.env.RPC_URL_25 || "https://cronos-evm-rpc.publicnode.com",
|
||||
56: process.env.BSC_RPC || process.env.RPC_URL_56 || "https://bsc-rpc.publicnode.com",
|
||||
100: process.env.GNOSIS_RPC || process.env.RPC_URL_100 || "https://gnosis.publicnode.com",
|
||||
137: process.env.POLYGON_RPC || process.env.RPC_URL_137 || "https://polygon-bor-rpc.publicnode.com",
|
||||
8453: process.env.BASE_RPC || process.env.RPC_URL_8453 || "https://base-rpc.publicnode.com",
|
||||
42161: process.env.ARBITRUM_RPC || process.env.RPC_URL_42161 || "https://arbitrum-one-rpc.publicnode.com",
|
||||
42220: process.env.CELO_RPC || process.env.RPC_URL_42220 || "https://celo-rpc.publicnode.com",
|
||||
43114: process.env.AVALANCHE_RPC || process.env.RPC_URL_43114 || "https://avalanche-c-chain-rpc.publicnode.com",
|
||||
651940: process.env.ALL_MAINNET_RPC || process.env.CHAIN_651940_RPC_URL || "https://mainnet-rpc.alltra.global",
|
||||
};
|
||||
|
||||
const matrix = JSON.parse(readFileSync(matrixPath, "utf8"));
|
||||
|
||||
function padAddress(address) {
|
||||
return String(address).replace(/^0x/i, "").padStart(64, "0");
|
||||
}
|
||||
|
||||
async function rpcCall(rpcUrl, method, params) {
|
||||
const controller = new AbortController();
|
||||
const timeout = setTimeout(() => controller.abort(), 12_000);
|
||||
try {
|
||||
const response = await fetch(rpcUrl, {
|
||||
method: "POST",
|
||||
headers: { "content-type": "application/json" },
|
||||
body: JSON.stringify({ jsonrpc: "2.0", method, params, id: 1 }),
|
||||
signal: controller.signal,
|
||||
});
|
||||
const json = await response.json();
|
||||
if (json.error) {
|
||||
return { ok: false, error: json.error.message || JSON.stringify(json.error) };
|
||||
}
|
||||
return { ok: true, result: json.result };
|
||||
} catch (error) {
|
||||
return { ok: false, error: error.message };
|
||||
} finally {
|
||||
clearTimeout(timeout);
|
||||
}
|
||||
}
|
||||
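Every RPC read in this checker goes through a 12-second AbortController timeout so a single stalled endpoint cannot hang the whole sweep. A minimal standalone sketch of the same pattern (assuming Node 18+ for global `fetch`; the `timeoutMs` parameter is added here for illustration and is not in the script):

```javascript
// Timeout-guarded JSON-RPC POST mirroring the rpcCall helper above.
// AbortController cancels the fetch if the endpoint stalls; the timer
// is always cleared in finally so the process can exit promptly.
async function rpcCall(rpcUrl, method, params, timeoutMs = 12_000) {
  const controller = new AbortController();
  const timeout = setTimeout(() => controller.abort(), timeoutMs);
  try {
    const response = await fetch(rpcUrl, {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({ jsonrpc: "2.0", method, params, id: 1 }),
      signal: controller.signal,
    });
    const json = await response.json();
    if (json.error) return { ok: false, error: json.error.message || JSON.stringify(json.error) };
    return { ok: true, result: json.result };
  } catch (error) {
    // Covers network failures and the AbortError raised on timeout.
    return { ok: false, error: error.message };
  }
  finally {
    clearTimeout(timeout);
  }
}
```

Returning `{ ok, result | error }` instead of throwing lets the per-row loop record the failure as evidence and continue with the next pool.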
async function getCode(rpcUrl, address) {
  return rpcCall(rpcUrl, "eth_getCode", [address, "latest"]);
}

async function balanceOf(rpcUrl, tokenAddress, accountAddress) {
  return rpcCall(rpcUrl, "eth_call", [
    {
      to: tokenAddress,
      data: `0x70a08231${padAddress(accountAddress)}`,
    },
    "latest",
  ]);
}

function bigintFromHex(hex) {
  if (!hex || hex === "0x") return 0n;
  return BigInt(hex);
}

function evidenceRef(reportPath, poolId) {
  return `${reportPath}#${poolId}`;
}

const generatedAt = new Date().toISOString();
const reportName = "reports/status/all-mainnet-required-pool-balances-latest.json";
const rows = matrix.rows.filter((row) => row.requiredForSpend === true);
const results = [];

for (const row of rows) {
  const rpcUrl = rpcByChain[row.chainId];
  const result = {
    poolId: row.poolId,
    chainId: row.chainId,
    network: row.network,
    protocol: row.protocol,
    previousStatus: row.status,
    poolAddress: row.poolAddress,
    baseToken: row.baseToken,
    quoteToken: row.quoteToken,
    rpcConfigured: Boolean(rpcUrl),
    poolHasCode: false,
    baseBalanceRaw: null,
    quoteBalanceRaw: null,
    liveReadStatus: "not_checked",
    errors: [],
  };

  if (!rpcUrl) {
    result.liveReadStatus = "missing_rpc";
    result.errors.push("missing_rpc");
    results.push(result);
    continue;
  }

  if (!row.poolAddress) {
    result.liveReadStatus = "missing_pool_address";
    result.errors.push("missing_pool_address");
    results.push(result);
    continue;
  }

  if (!row.baseToken?.address || !row.quoteToken?.address) {
    result.liveReadStatus = "missing_token_address";
    if (!row.baseToken?.address) result.errors.push(`missing_base_address:${row.baseToken?.symbol || "unknown"}`);
    if (!row.quoteToken?.address) result.errors.push(`missing_quote_address:${row.quoteToken?.symbol || "unknown"}`);
    results.push(result);
    continue;
  }

  const code = await getCode(rpcUrl, row.poolAddress);
  if (!code.ok) {
    result.liveReadStatus = "rpc_error";
    result.errors.push(`pool_code:${code.error}`);
    results.push(result);
    continue;
  }
  result.poolHasCode = Boolean(code.result && code.result !== "0x");

  const [base, quote] = await Promise.all([
    balanceOf(rpcUrl, row.baseToken.address, row.poolAddress),
    balanceOf(rpcUrl, row.quoteToken.address, row.poolAddress),
  ]);

  if (!base.ok) result.errors.push(`base_balance:${base.error}`);
  if (!quote.ok) result.errors.push(`quote_balance:${quote.error}`);

  if (base.ok) result.baseBalanceRaw = bigintFromHex(base.result).toString();
  if (quote.ok) result.quoteBalanceRaw = bigintFromHex(quote.result).toString();

  const basePositive = BigInt(result.baseBalanceRaw || "0") > 0n;
  const quotePositive = BigInt(result.quoteBalanceRaw || "0") > 0n;
  if (!result.poolHasCode) {
    result.liveReadStatus = "missing_pool_code";
  } else if (result.errors.length > 0) {
    result.liveReadStatus = "balance_read_error";
  } else if (basePositive && quotePositive) {
    result.liveReadStatus = "nonzero_base_and_quote";
  } else if (basePositive || quotePositive) {
    result.liveReadStatus = "partial_balance";
  } else {
    result.liveReadStatus = "zero_balances";
  }

  results.push(result);
}

const statusCounts = {};
for (const result of results) {
  statusCounts[result.liveReadStatus] = (statusCounts[result.liveReadStatus] || 0) + 1;
}

const report = {
  generatedAt,
  sourceMatrix: "config/all-mainnet-pool-creation-matrix.json",
  updatedMatrix: updateMatrix,
  statusCounts,
  results,
};

mkdirSync(outDir, { recursive: true });
writeFileSync(resolve(repoRoot, reportName), `${JSON.stringify(report, null, 2)}\n`);

if (updateMatrix) {
  let changed = 0;
  const byPoolId = new Map(results.map((result) => [result.poolId, result]));
  for (const row of matrix.rows) {
    const result = byPoolId.get(row.poolId);
    if (!result || result.liveReadStatus !== "nonzero_base_and_quote") continue;

    const previousStatus = row.status;
    if (!["canary_passed", "production"].includes(row.status)) {
      row.status = "live_read";
    }
    row.reserveSource = "all-mainnet-required-pool-balance-check";
    row.reserveEvidence = {
      generatedAt,
      evidenceRef: evidenceRef(reportName, row.poolId),
      baseBalanceRaw: result.baseBalanceRaw,
      quoteBalanceRaw: result.quoteBalanceRaw,
      poolHasCode: result.poolHasCode,
      liveReadStatus: result.liveReadStatus,
    };
    if (!row.notes.includes("Live reserve read recorded from required-pool balance checker.")) {
      row.notes.push("Live reserve read recorded from required-pool balance checker.");
    }
    if (previousStatus !== row.status) changed += 1;
  }

  matrix.generatedAt = generatedAt;
  matrix.statusCounts = matrix.rows.reduce((counts, row) => {
    counts[row.status] = (counts[row.status] || 0) + 1;
    return counts;
  }, {});
  matrix.protocolCounts = matrix.rows.reduce((counts, row) => {
    counts[row.protocol] = (counts[row.protocol] || 0) + 1;
    return counts;
  }, {});
  writeFileSync(matrixPath, `${JSON.stringify(matrix, null, 2)}\n`);
  console.log(`[OK] Required pool balance report written; matrix live_read promotions: ${changed}.`);
} else {
  console.log("[OK] Required pool balance report written.");
}
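The balance reads in this script avoid any ABI library by hand-encoding the ERC-20 `balanceOf(address)` call: the 4-byte selector `0x70a08231` followed by the owner address left-padded to 32 bytes. A standalone sketch of that encoding (the helper name `encodeBalanceOf` is illustrative; the script composes the same string inline in `balanceOf`):

```javascript
// Hand-encode ERC-20 balanceOf(address) calldata, mirroring the script's
// padAddress helper: strip the 0x prefix, left-pad to 64 hex characters,
// and prepend the function selector.
function encodeBalanceOf(accountAddress) {
  const padded = String(accountAddress).replace(/^0x/i, "").padStart(64, "0");
  return `0x70a08231${padded}`;
}

const data = encodeBalanceOf("0x93E66202A11B1772E55407B32B44e5Cd8eda7f22");
console.log(data.length); // 74: "0x" + 8 selector chars + 64 padded chars
```

The result goes straight into the `data` field of an `eth_call`, so the only runtime dependency for reserve evidence is a JSON-RPC endpoint.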
388
scripts/status/generate-all-mainnet-readiness.mjs
Executable file
@@ -0,0 +1,388 @@
#!/usr/bin/env node

/**
 * Generate ALL Mainnet readiness reports from the canonical matrix and surface.
 *
 * The reports are intentionally conservative: rows are not promoted by this
 * script. It only summarizes what is already proven in config and what remains
 * gated by missing addresses, vault assignments, funding, canaries, or protocol
 * surface evidence.
 */

import { mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { resolve } from "node:path";

const repoRoot = resolve(new URL("../..", import.meta.url).pathname);
const matrixPath = resolve(repoRoot, "config/all-mainnet-pool-creation-matrix.json");
const surfacePath = resolve(repoRoot, "config/allmainnet-non-dodo-protocol-surface.json");
const outDir = resolve(repoRoot, "reports/status");

const statusesReadyForFunding = new Set(["created", "funded", "live_read", "canary_passed", "production"]);
const statusesReadyForCanary = new Set(["funded", "live_read", "canary_passed", "production"]);
const productionStatuses = new Set(["production"]);

function readJson(path) {
  return JSON.parse(readFileSync(path, "utf8"));
}

function countBy(items, keyFn) {
  const counts = {};
  for (const item of items) {
    const key = keyFn(item);
    counts[key] = (counts[key] || 0) + 1;
  }
  return Object.fromEntries(Object.entries(counts).sort(([a], [b]) => a.localeCompare(b)));
}
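`countBy` is the workhorse behind every statusCounts/protocolCounts block in the reports: it tallies items by a derived key and returns the counts with keys sorted alphabetically, so repeated runs diff cleanly. Reproduced standalone with sample input:

```javascript
// Standalone copy of the countBy helper: tally by keyFn, then sort keys
// so report output is deterministic across runs.
function countBy(items, keyFn) {
  const counts = {};
  for (const item of items) {
    const key = keyFn(item);
    counts[key] = (counts[key] || 0) + 1;
  }
  return Object.fromEntries(Object.entries(counts).sort(([a], [b]) => a.localeCompare(b)));
}

const rows = [{ status: "planned" }, { status: "funded" }, { status: "planned" }];
console.log(countBy(rows, (row) => row.status)); // { funded: 1, planned: 2 }
```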
function tokenRef(token) {
  if (!token) return "MISSING";
  return `${token.symbol || "MISSING"} ${token.address || "MISSING"}`;
}

function rowRef(row) {
  return {
    poolId: row.poolId,
    chainId: row.chainId,
    network: row.network,
    protocol: row.protocol,
    status: row.status,
    requiredForSpend: Boolean(row.requiredForSpend),
    publicRoutingEnabled: Boolean(row.publicRoutingEnabled),
    baseToken: row.baseToken,
    quoteToken: row.quoteToken,
    poolAddress: row.poolAddress,
    vaultAddress: row.vaultAddress,
    missingRequiredVaultRoles: row.missingRequiredVaultRoles || [],
    canaryEvidence: row.canaryEvidence,
    notes: row.notes || [],
  };
}

function rowBlockers(row, surface) {
  const blockers = [];
  if (row.status === "planned") blockers.push("pool_not_created");
  if (!row.baseToken?.address) blockers.push(`missing_base_address:${row.baseToken?.symbol || "unknown"}`);
  if (!row.quoteToken?.address) blockers.push(`missing_quote_address:${row.quoteToken?.symbol || "unknown"}`);
  if (statusesReadyForFunding.has(row.status) && !row.poolAddress) blockers.push("missing_pool_address");
  if ((row.missingRequiredVaultRoles || []).length > 0) blockers.push("missing_required_vault_assignments");
  if (row.status === "created") blockers.push("pool_created_but_not_funded");
  if (statusesReadyForCanary.has(row.status) && !row.canaryEvidence) blockers.push("missing_canary_evidence");
  if (!productionStatuses.has(row.status)) blockers.push("not_production_status");
  if (row.chainId === 651940 && surface.summary?.sameChainSwapInventoryPublished !== true) {
    blockers.push("all_mainnet_same_chain_inventory_not_published");
  }
  return blockers;
}

function makeSummary(matrix, surface) {
  const rows = matrix.rows;
  const requiredRows = rows.filter((row) => row.requiredForSpend === true);
  const rowsMissingVaults = rows.filter((row) => (row.missingRequiredVaultRoles || []).length > 0);
  const requiredRowsMissingVaults = requiredRows.filter((row) => (row.missingRequiredVaultRoles || []).length > 0);
  const rowsMissingTokenAddresses = rows.filter((row) => !row.baseToken?.address || !row.quoteToken?.address);
  const requiredRowsMissingTokenAddresses = requiredRows.filter((row) => !row.baseToken?.address || !row.quoteToken?.address);
  const rowsMissingCanary = requiredRows.filter((row) => statusesReadyForCanary.has(row.status) && !row.canaryEvidence);
  const productionRows = requiredRows.filter((row) => row.status === "production");

  return {
    generatedAt: new Date().toISOString(),
    matrixVersion: matrix.version,
    matrixGeneratedAt: matrix.generatedAt,
    totalRows: rows.length,
    requiredForSpendRows: requiredRows.length,
    statusCounts: countBy(rows, (row) => row.status),
    protocolCounts: countBy(rows, (row) => row.protocol),
    chainStatusCounts: Object.fromEntries(
      Object.entries(countBy(rows, (row) => `${row.chainId} ${row.network}`)).sort(([a], [b]) => a.localeCompare(b)),
    ),
    requiredStatusCounts: countBy(requiredRows, (row) => row.status),
    plannedRows: rows.filter((row) => row.status === "planned").length,
    createdUnfundedRows: rows.filter((row) => row.status === "created").length,
    fundedRows: rows.filter((row) => row.status === "funded").length,
    liveReadRows: rows.filter((row) => row.status === "live_read").length,
    productionRows: productionRows.length,
    rowsMissingVaultAssignments: rowsMissingVaults.length,
    requiredRowsMissingVaultAssignments: requiredRowsMissingVaults.length,
    rowsMissingTokenAddresses: rowsMissingTokenAddresses.length,
    requiredRowsMissingTokenAddresses: requiredRowsMissingTokenAddresses.length,
    requiredRowsMissingCanaryEvidence: rowsMissingCanary.length,
    bridgeOnlyLive: surface.summary?.bridgeOnlyLive === true,
    sameChainSwapInventoryPublished: surface.summary?.sameChainSwapInventoryPublished === true,
    productionReady:
      requiredRows.length > 0 &&
      productionRows.length === requiredRows.length &&
      requiredRowsMissingVaults.length === 0 &&
      rowsMissingCanary.length === 0 &&
      surface.summary?.sameChainSwapInventoryPublished === true,
  };
}

function makeReadiness(matrix, surface, summary) {
  const requiredRows = matrix.rows.filter((row) => row.requiredForSpend === true);
  const allBlockers = requiredRows
    .map((row) => ({
      ...rowRef(row),
      blockers: rowBlockers(row, surface),
    }))
    .filter((row) => row.blockers.length > 0);

  const protocolSurfaceBlockers = (surface.protocols || [])
    .filter((protocol) => protocol.status !== "live" && (!protocol.factoryAddress || !protocol.routerAddress))
    .map((protocol) => ({
      name: protocol.name,
      family: protocol.family,
      status: protocol.status,
      factoryAddress: protocol.factoryAddress,
      routerAddress: protocol.routerAddress,
      blockers: [
        !protocol.factoryAddress ? "missing_factory_address" : null,
        !protocol.routerAddress ? "missing_router_address" : null,
        protocol.status !== "live" ? "protocol_surface_not_live" : null,
      ].filter(Boolean),
    }));

  return {
    generatedAt: summary.generatedAt,
    status: summary.productionReady ? "production_ready" : "blocked",
    summary,
    executionOrder: [
      "commit_or_confirm_missing_token_and_accounting_addresses",
      "assign_required_vaults_and_pause_controls",
      "create_remaining_planned_pools",
      "fund_created_and_new_pools",
      "run_live_reserve_reads",
      "run_bridge_fee_and_destination_settlement_preflights",
      "run_10_100_1000_canary_swaps",
      "publish_same_chain_inventory_and_verification_artifacts",
      "enable_public_routing_only_for_canary_passed_or_production_rows",
    ],
    blockers: allBlockers,
    protocolSurfaceBlockers,
    externalActionsRequired: [
      "operator_signing_required_for_pool_creation_or_funding",
      "vault_addresses_required_from_treasury_or_security_owner",
      "canary_swap_transactions_required_on_live_networks",
      "same_chain_factory_router_pool_inventory_required_before_public_route_generation",
    ],
  };
}

function makeProductionGate(summary, readiness, surface) {
  return {
    generatedAt: summary.generatedAt,
    status: summary.productionReady ? "production_ready" : "blocked",
    gates: {
      bridgeConfigOk: surface.bridgeSurface?.adapter?.status === "live",
      sameChainSwapInventoryPublished: summary.sameChainSwapInventoryPublished,
      requiredPoolsCreated: summary.requiredStatusCounts.planned === undefined,
      requiredPoolsFundedOrBetter: !["planned", "created"].some((status) => summary.requiredStatusCounts[status] > 0),
      vaultAssignmentsComplete: summary.requiredRowsMissingVaultAssignments === 0,
      canaryEvidenceComplete: summary.requiredRowsMissingCanaryEvidence === 0,
      productionStatusesComplete: summary.productionRows === summary.requiredForSpendRows,
    },
    counts: summary,
    blockers: readiness.blockers.map((row) => ({
      poolId: row.poolId,
      chainId: row.chainId,
      protocol: row.protocol,
      status: row.status,
      blockers: row.blockers,
    })),
  };
}

function mdTable(headers, rows) {
  const header = `| ${headers.join(" | ")} |`;
  const sep = `| ${headers.map((h) => (h.match(/count|rows|chain/i) ? "---:" : "---")).join(" | ")} |`;
  const body = rows.map((row) => `| ${row.join(" | ")} |`);
  return [header, sep, ...body].join("\n");
}

function escapeCell(value) {
  return String(value ?? "-").replace(/\|/g, "\\|").replace(/\n/g, "<br>");
}

function writeReports(matrix, surface, summary, readiness, productionGate) {
  mkdirSync(outDir, { recursive: true });

  writeFileSync(resolve(outDir, "all-mainnet-deployment-readiness-worklist-latest.json"), `${JSON.stringify(readiness, null, 2)}\n`);
  writeFileSync(resolve(outDir, "all-mainnet-production-gate-latest.json"), `${JSON.stringify(productionGate, null, 2)}\n`);
  writeFileSync(
    resolve(outDir, "all-mainnet-spend-readiness-latest.json"),
    `${JSON.stringify(
      {
        generatedAt: summary.generatedAt,
        version: matrix.version,
        description: "Derived spend-readiness view for required ALL Mainnet pool rows.",
        status: summary.productionReady ? "ready" : "blocked",
        summary,
        statusCounts: countBy(readiness.blockers, (row) => row.status),
        routes: readiness.blockers.map((row) => ({
          routeId: row.poolId,
          chainId: row.chainId,
          network: row.network,
          protocol: row.protocol,
          status: row.blockers.includes("pool_not_created")
            ? "missing_pool"
            : row.blockers.includes("pool_created_but_not_funded")
              ? "pool_created_unfunded"
              : row.blockers.includes("missing_canary_evidence")
                ? "missing_canary"
                : "blocked",
          path: `${row.baseToken?.symbol || "?"} -> ${row.quoteToken?.symbol || "?"}`,
          blockers: row.blockers,
        })),
        assumptions: [
          "Spend readiness is blocked until every required pool has vault assignments, funding/reserve evidence, canary evidence, and production status.",
          "ALL Mainnet same-chain routing remains blocked while config/allmainnet-non-dodo-protocol-surface.json has sameChainSwapInventoryPublished=false.",
        ],
      },
      null,
      2,
    )}\n`,
  );

  const poolMatrixReport = {
    generatedAt: summary.generatedAt,
    summary,
    requiredPools: matrix.rows.filter((row) => row.requiredForSpend === true).map(rowRef),
    rowsByStatus: countBy(matrix.rows, (row) => row.status),
    rowsByProtocol: countBy(matrix.rows, (row) => row.protocol),
  };
  writeFileSync(resolve(outDir, "all-mainnet-pool-creation-matrix-latest.json"), `${JSON.stringify(poolMatrixReport, null, 2)}\n`);

  const summaryRows = [
    ["totalRows", summary.totalRows],
    ["requiredForSpendRows", summary.requiredForSpendRows],
    ["plannedRows", summary.plannedRows],
    ["createdUnfundedRows", summary.createdUnfundedRows],
    ["fundedRows", summary.fundedRows],
    ["liveReadRows", summary.liveReadRows],
    ["productionRows", summary.productionRows],
    ["rowsMissingVaultAssignments", summary.rowsMissingVaultAssignments],
    ["requiredRowsMissingVaultAssignments", summary.requiredRowsMissingVaultAssignments],
    ["requiredRowsMissingCanaryEvidence", summary.requiredRowsMissingCanaryEvidence],
    ["sameChainSwapInventoryPublished", summary.sameChainSwapInventoryPublished],
    ["productionReady", summary.productionReady],
  ];

  const blockersRows = readiness.blockers.map((row) => [
    row.chainId,
    `\`${escapeCell(row.poolId)}\``,
    escapeCell(row.protocol),
    escapeCell(row.status),
    escapeCell(tokenRef(row.baseToken)),
    escapeCell(tokenRef(row.quoteToken)),
    escapeCell(row.poolAddress || "-"),
    escapeCell(row.blockers.join(", ")),
  ]);

  const readinessMd = [
    "# ALL Mainnet Deployment Readiness Worklist",
    "",
    `Generated: \`${summary.generatedAt}\``,
    "",
    "## Summary",
    "",
    mdTable(["Item", "Count"], summaryRows),
    "",
    "## Execution Order",
    "",
    ...readiness.executionOrder.map((step, index) => `${index + 1}. \`${step}\``),
    "",
    "## Required Pool Blockers",
    "",
    blockersRows.length
      ? mdTable(["Chain", "Pool", "Protocol", "Status", "Base", "Quote", "Pool Address", "Blockers"], blockersRows)
      : "No required pool blockers remain.",
    "",
    "## Protocol Surface Blockers",
    "",
    readiness.protocolSurfaceBlockers.length
      ? mdTable(
          ["Protocol", "Family", "Status", "Factory", "Router", "Blockers"],
          readiness.protocolSurfaceBlockers.map((protocol) => [
            escapeCell(protocol.name),
            escapeCell(protocol.family),
            escapeCell(protocol.status),
            escapeCell(protocol.factoryAddress || "-"),
            escapeCell(protocol.routerAddress || "-"),
            escapeCell(protocol.blockers.join(", ")),
          ]),
        )
      : "No protocol surface blockers remain.",
    "",
  ].join("\n");
  writeFileSync(resolve(outDir, "all-mainnet-deployment-readiness-worklist-latest.md"), readinessMd);

  const spendRows = readiness.blockers.map((row) => [
    `\`${escapeCell(row.poolId)}\``,
    row.chainId,
    escapeCell(row.protocol),
    escapeCell(`${row.baseToken?.symbol || "?"} -> ${row.quoteToken?.symbol || "?"}`),
    escapeCell(
      row.blockers.includes("pool_not_created")
        ? "missing_pool"
        : row.blockers.includes("pool_created_but_not_funded")
          ? "pool_created_unfunded"
          : row.blockers.includes("missing_canary_evidence")
            ? "missing_canary"
            : "blocked",
    ),
    escapeCell(row.blockers.join(", ")),
  ]);
  const spendMd = [
    "# ALL Mainnet Spend Readiness",
    "",
    `Generated: \`${summary.generatedAt}\``,
    "",
    "## Summary",
    "",
    mdTable(["Item", "Count"], summaryRows),
    "",
    "## Required Route Gates",
    "",
    spendRows.length
      ? mdTable(["Route", "Chain", "Protocol", "Path", "Status", "Blockers"], spendRows)
      : "All required spend routes are production-ready.",
    "",
    "## Direct Mapping Policy",
    "",
    "Direct ALL Mainnet -> public-network mappings remain inventory-only until verified bridge adapters, fee paths, vault assignments, live reserve reads, and canary evidence are recorded.",
    "",
  ].join("\n");
  writeFileSync(resolve(outDir, "all-mainnet-spend-readiness-latest.md"), spendMd);

  const requiredPoolRows = poolMatrixReport.requiredPools.map((row) => [
    row.chainId,
    `\`${escapeCell(row.poolId)}\``,
    escapeCell(row.protocol),
    escapeCell(row.status),
    escapeCell(row.poolAddress || "-"),
    escapeCell((row.missingRequiredVaultRoles || []).join(", ") || "-"),
  ]);
  const poolMatrixMd = [
    "# ALL Mainnet Pool Creation Matrix",
    "",
    `Generated: \`${summary.generatedAt}\``,
    "",
    "## Summary",
    "",
    mdTable(["Status", "Count"], Object.entries(summary.statusCounts)),
    "",
    "## Required Pools",
    "",
    mdTable(["Chain", "Pool", "Protocol", "Status", "Address", "Missing Vault Roles"], requiredPoolRows),
    "",
  ].join("\n");
  writeFileSync(resolve(outDir, "all-mainnet-pool-creation-matrix-latest.md"), poolMatrixMd);
}

const matrix = readJson(matrixPath);
const surface = readJson(surfacePath);
const summary = makeSummary(matrix, surface);
const readiness = makeReadiness(matrix, surface, summary);
const productionGate = makeProductionGate(summary, readiness, surface);

writeReports(matrix, surface, summary, readiness, productionGate);

console.log(`[OK] ALL Mainnet readiness reports generated (${readiness.status}).`);
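The markdown reports lean on the small `mdTable` helper, which right-aligns any column whose header matches `/count|rows|chain/i` by emitting a `---:` separator. Reproduced standalone to show the separator row it generates:

```javascript
// Standalone copy of the mdTable helper from the readiness script:
// columns whose header mentions count/rows/chain get a right-aligned
// separator ("---:"); everything else is left-aligned ("---").
function mdTable(headers, rows) {
  const header = `| ${headers.join(" | ")} |`;
  const sep = `| ${headers.map((h) => (h.match(/count|rows|chain/i) ? "---:" : "---")).join(" | ")} |`;
  const body = rows.map((row) => `| ${row.join(" | ")} |`);
  return [header, sep, ...body].join("\n");
}

console.log(mdTable(["Item", "Count"], [["plannedRows", 3]]));
// | Item | Count |
// | --- | ---: |
// | plannedRows | 3 |
```

Cell values are assumed to be pre-escaped (the script runs them through `escapeCell` first), so `mdTable` itself stays a pure layout function.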
230
scripts/status/preflight-all-mainnet-canaries.mjs
Normal file
@@ -0,0 +1,230 @@
#!/usr/bin/env node

/**
 * Read-only canary readiness preflight for ALL Mainnet spend pools.
 *
 * This does not approve tokens or execute swaps. It checks the operator
 * wallet inventory, allowances, and quote/query surfaces for rows already
 * promoted to live_read.
 */

import { mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { resolve } from "node:path";
import { ethers } from "ethers";

const repoRoot = resolve(new URL("../..", import.meta.url).pathname);
const matrixPath = resolve(repoRoot, "config/all-mainnet-pool-creation-matrix.json");
const outDir = resolve(repoRoot, "reports/status");
const reportPath = resolve(outDir, "all-mainnet-canary-preflight-latest.json");

const rpcByChain = {
  1: process.env.ETHEREUM_MAINNET_RPC || process.env.RPC_URL_1 || "https://ethereum.publicnode.com",
  10: process.env.OPTIMISM_RPC || process.env.RPC_URL_10 || "https://optimism.publicnode.com",
  25: process.env.CRONOS_RPC || process.env.RPC_URL_25 || "https://cronos-evm-rpc.publicnode.com",
  56: process.env.BSC_RPC || process.env.RPC_URL_56 || "https://bsc-rpc.publicnode.com",
  100: process.env.GNOSIS_RPC || process.env.RPC_URL_100 || "https://gnosis.publicnode.com",
  137: process.env.POLYGON_RPC || process.env.RPC_URL_137 || "https://polygon-bor-rpc.publicnode.com",
  8453: process.env.BASE_RPC || process.env.RPC_URL_8453 || "https://base-rpc.publicnode.com",
  42161: process.env.ARBITRUM_RPC || process.env.RPC_URL_42161 || "https://arbitrum-one-rpc.publicnode.com",
  42220: process.env.CELO_RPC || process.env.RPC_URL_42220 || "https://celo-rpc.publicnode.com",
  43114: process.env.AVALANCHE_RPC || process.env.RPC_URL_43114 || "https://avalanche-c-chain-rpc.publicnode.com",
  651940: process.env.ALL_MAINNET_RPC || process.env.CHAIN_651940_RPC_URL || "https://mainnet-rpc.alltra.global",
};

const erc20Abi = [
  "function balanceOf(address) view returns (uint256)",
  "function allowance(address,address) view returns (uint256)",
  "function decimals() view returns (uint8)",
];
const uniV2RouterAbi = [
  "function getAmountsOut(uint256,address[]) view returns (uint256[])",
];
const dodoDvmAbi = [
  "function querySellBase(address,uint256) view returns (uint256)",
  "function querySellQuote(address,uint256) view returns (uint256)",
];

function operatorAddress() {
  if (process.env.DEPLOYER_ADDRESS && ethers.isAddress(process.env.DEPLOYER_ADDRESS)) {
    return ethers.getAddress(process.env.DEPLOYER_ADDRESS);
  }
  if (process.env.SIGNER_ADDRESS && ethers.isAddress(process.env.SIGNER_ADDRESS)) {
    return ethers.getAddress(process.env.SIGNER_ADDRESS);
  }
  if (process.env.PRIVATE_KEY) {
    return new ethers.Wallet(process.env.PRIVATE_KEY).address;
  }
  throw new Error("Set DEPLOYER_ADDRESS, SIGNER_ADDRESS, or PRIVATE_KEY for canary preflight.");
}

function smallQuoteAmount(decimals) {
  const capped = Math.min(Number(decimals), 6);
  return 10n ** BigInt(capped);
}

function normalizedAddress(address) {
  return ethers.getAddress(String(address).toLowerCase());
}

async function callOrError(fn) {
  try {
    return { ok: true, value: await fn() };
  } catch (error) {
    return { ok: false, error: error.shortMessage || error.message };
  }
}

async function tokenSnapshot(provider, tokenAddress, owner, spender) {
  const token = new ethers.Contract(normalizedAddress(tokenAddress), erc20Abi, provider);
  const allowanceSpender = spender ? normalizedAddress(spender) : null;
  const [balance, decimals, allowance] = await Promise.all([
    callOrError(() => token.balanceOf(owner)),
    callOrError(() => token.decimals()),
    allowanceSpender ? callOrError(() => token.allowance(owner, allowanceSpender)) : Promise.resolve({ ok: false, error: "spender_not_configured" }),
  ]);

  return {
    address: tokenAddress,
    balanceRaw: balance.ok ? balance.value.toString() : null,
    decimals: decimals.ok ? Number(decimals.value) : null,
    allowanceRaw: allowance.ok ? allowance.value.toString() : null,
    errors: [
      ...(balance.ok ? [] : [`balance:${balance.error}`]),
      ...(decimals.ok ? [] : [`decimals:${decimals.error}`]),
      ...(spender && !allowance.ok ? [`allowance:${allowance.error}`] : []),
    ],
  };
}

async function quoteUniswap(provider, row, base, quote) {
  if (!row.routerAddress) return { supported: false, errors: ["missing_router_address"] };
  const router = new ethers.Contract(normalizedAddress(row.routerAddress), uniV2RouterAbi, provider);
  const baseAmount = smallQuoteAmount(base.decimals ?? 18);
  const quoteAmount = smallQuoteAmount(quote.decimals ?? 18);
  const baseToken = normalizedAddress(row.baseToken.address);
  const quoteToken = normalizedAddress(row.quoteToken.address);
  const [baseToQuote, quoteToBase] = await Promise.all([
    callOrError(() => router.getAmountsOut(baseAmount, [baseToken, quoteToken])),
    callOrError(() => router.getAmountsOut(quoteAmount, [quoteToken, baseToken])),
  ]);

  return {
    supported: true,
    routerAddress: row.routerAddress,
    baseToQuote: baseToQuote.ok ? baseToQuote.value.map((v) => v.toString()) : null,
    quoteToBase: quoteToBase.ok ? quoteToBase.value.map((v) => v.toString()) : null,
    errors: [
      ...(baseToQuote.ok ? [] : [`base_to_quote:${baseToQuote.error}`]),
      ...(quoteToBase.ok ? [] : [`quote_to_base:${quoteToBase.error}`]),
    ],
  };
}

async function quoteDodo(provider, row, base, quote, operator) {
  const pool = new ethers.Contract(normalizedAddress(row.poolAddress), dodoDvmAbi, provider);
  const baseAmount = smallQuoteAmount(base.decimals ?? 18);
  const quoteAmount = smallQuoteAmount(quote.decimals ?? 18);
  const [sellBase, sellQuote] = await Promise.all([
    callOrError(() => pool.querySellBase(operator, baseAmount)),
    callOrError(() => pool.querySellQuote(operator, quoteAmount)),
  ]);

  return {
    supported: sellBase.ok || sellQuote.ok,
    poolAddress: row.poolAddress,
    sellBaseOutRaw: sellBase.ok ? sellBase.value.toString() : null,
    sellQuoteOutRaw: sellQuote.ok ? sellQuote.value.toString() : null,
    errors: [
      ...(sellBase.ok ? [] : [`query_sell_base:${sellBase.error}`]),
      ...(sellQuote.ok ? [] : [`query_sell_quote:${sellQuote.error}`]),
    ],
  };
}

const matrix = JSON.parse(readFileSync(matrixPath, "utf8"));
const generatedAt = new Date().toISOString();
const operator = operatorAddress();
const rows = matrix.rows.filter((row) => row.requiredForSpend === true && row.status === "live_read");
const results = [];

for (const row of rows) {
  const rpcUrl = rpcByChain[row.chainId];
  const result = {
    poolId: row.poolId,
    chainId: row.chainId,
    network: row.network,
    protocol: row.protocol,
    poolAddress: row.poolAddress,
    routerAddress: row.routerAddress,
    operator,
    baseToken: row.baseToken,
    quoteToken: row.quoteToken,
    base: null,
    quote: null,
    quoteCheck: null,
    canaryPreflight: "blocked",
    blockers: [],
  };

  if (!rpcUrl) {
    result.blockers.push("missing_rpc");
    results.push(result);
    continue;
  }
  if (!row.poolAddress || !row.baseToken?.address || !row.quoteToken?.address) {
    if (!row.poolAddress) result.blockers.push("missing_pool_address");
    if (!row.baseToken?.address) result.blockers.push("missing_base_token_address");
    if (!row.quoteToken?.address) result.blockers.push("missing_quote_token_address");
    results.push(result);
    continue;
  }

  const provider = new ethers.JsonRpcProvider(rpcUrl, row.chainId, { staticNetwork: true });
  const spender = row.routerAddress || row.poolAddress;
  const [base, quote] = await Promise.all([
    tokenSnapshot(provider, row.baseToken.address, operator, spender),
    tokenSnapshot(provider, row.quoteToken.address, operator, spender),
  ]);
  result.base = base;
  result.quote = quote;

  if (base.errors.length > 0) result.blockers.push(...base.errors.map((error) => `base_${error}`));
  if (quote.errors.length > 0) result.blockers.push(...quote.errors.map((error) => `quote_${error}`));
  if (BigInt(base.balanceRaw || "0") === 0n) result.blockers.push(`missing_operator_inventory:${row.baseToken.symbol}`);
  if (BigInt(quote.balanceRaw || "0") === 0n) result.blockers.push(`missing_operator_inventory:${row.quoteToken.symbol}`);
|
||||
|
||||
if (row.protocol === "uniswap_v2") {
|
||||
result.quoteCheck = await quoteUniswap(provider, row, base, quote);
|
||||
} else if (row.protocol === "dodo_pmm") {
|
||||
result.quoteCheck = await quoteDodo(provider, row, base, quote, operator);
|
||||
} else {
|
||||
result.quoteCheck = { supported: false, errors: [`unsupported_protocol:${row.protocol}`] };
|
||||
}
|
||||
|
||||
if (result.quoteCheck?.errors?.length > 0) {
|
||||
result.blockers.push(...result.quoteCheck.errors.map((error) => `quote_${error}`));
|
||||
}
|
||||
if (!result.quoteCheck?.supported) {
|
||||
result.blockers.push("quote_surface_unverified");
|
||||
}
|
||||
|
||||
result.canaryPreflight = result.blockers.length === 0 ? "ready" : "blocked";
|
||||
results.push(result);
|
||||
}
|
||||
|
||||
const statusCounts = results.reduce((counts, result) => {
|
||||
counts[result.canaryPreflight] = (counts[result.canaryPreflight] || 0) + 1;
|
||||
return counts;
|
||||
}, {});
|
||||
|
||||
const report = {
|
||||
generatedAt,
|
||||
sourceMatrix: "config/all-mainnet-pool-creation-matrix.json",
|
||||
operator,
|
||||
liveReadRequiredRows: rows.length,
|
||||
statusCounts,
|
||||
results,
|
||||
};
|
||||
|
||||
mkdirSync(outDir, { recursive: true });
|
||||
writeFileSync(reportPath, `${JSON.stringify(report, null, 2)}\n`);
|
||||
console.log(`[OK] ALL Mainnet canary preflight written: ${statusCounts.ready || 0} ready, ${statusCounts.blocked || 0} blocked.`);
|
||||
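The quote probes above lean on a `callOrError` helper that is defined earlier in the file and not shown in this hunk. A minimal sketch of the assumed shape, inferred from how the call sites destructure its result (the `shortMessage` fallback is an assumption, matching ethers v6 error objects):

```javascript
// Hypothetical sketch of the callOrError helper assumed by the quote probes:
// it never throws, and folds success/failure into an { ok, value } /
// { ok, error } result the callers can destructure.
async function callOrError(fn) {
  try {
    return { ok: true, value: await fn() };
  } catch (error) {
    // Prefer the concise ethers v6 shortMessage when present, else the plain message.
    return { ok: false, error: error?.shortMessage || error?.message || String(error) };
  }
}
```

This shape keeps the preflight loop linear: every RPC failure becomes a string in `errors[]` rather than an exception that would abort the whole row scan.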
110
scripts/status/record-all-mainnet-canary-evidence.mjs
Executable file
@@ -0,0 +1,110 @@
#!/usr/bin/env node
/**
 * Record explicit canary swap evidence in the ALL Mainnet pool matrix.
 *
 * This script does not execute swaps. It records already-completed canary
 * transaction evidence and promotes rows to canary_passed only when every
 * required vault assignment is present.
 */

import { existsSync, readFileSync, writeFileSync } from "node:fs";
import { resolve } from "node:path";

const repoRoot = resolve(new URL("../..", import.meta.url).pathname);
const matrixPath = resolve(repoRoot, "config/all-mainnet-pool-creation-matrix.json");
const defaultEvidencePath = resolve(repoRoot, "config/all-mainnet-canary-evidence.json");
const args = process.argv.slice(2);
const dryRun = args.includes("--dry-run");
const evidenceArg = args.find((arg) => !arg.startsWith("--"));
const evidencePath = resolve(repoRoot, evidenceArg || defaultEvidencePath);
const txPattern = /^0x[a-fA-F0-9]{64}$/;
const zeroTx = "0x0000000000000000000000000000000000000000000000000000000000000000";

function usageAndExit(message, code = 1) {
  if (message) console.error(`[ERROR] ${message}`);
  console.error("Usage: node scripts/status/record-all-mainnet-canary-evidence.mjs [config/all-mainnet-canary-evidence.json] [--dry-run]");
  process.exit(code);
}

if (args.includes("--help") || args.includes("-h")) {
  usageAndExit(null, 0);
}

if (!existsSync(evidencePath)) {
  usageAndExit(`Missing evidence file ${evidencePath}. Copy config/all-mainnet-canary-evidence.example.json first.`);
}

function readJson(path) {
  return JSON.parse(readFileSync(path, "utf8"));
}

function isRealTxHash(value) {
  return typeof value === "string" && txPattern.test(value) && value.toLowerCase() !== zeroTx;
}

const matrix = readJson(matrixPath);
const evidenceFile = readJson(evidencePath);
const evidenceRows = Array.isArray(evidenceFile.evidence) ? evidenceFile.evidence : [];
const rowsByPoolId = new Map(matrix.rows.map((row) => [row.poolId, row]));
const errors = [];
const touched = [];

for (const evidence of evidenceRows) {
  const row = rowsByPoolId.get(evidence.poolId);
  if (!row) {
    errors.push(`${evidence.poolId}: no matching matrix row`);
    continue;
  }
  if (!Array.isArray(evidence.canaryTransactions) || evidence.canaryTransactions.length === 0) {
    errors.push(`${evidence.poolId}: canaryTransactions[] is required`);
    continue;
  }
  for (const tx of evidence.canaryTransactions) {
    if (!isRealTxHash(tx.txHash)) {
      errors.push(`${evidence.poolId}: invalid txHash ${tx.txHash}`);
    }
  }
  if ((row.missingRequiredVaultRoles || []).length > 0) {
    errors.push(`${evidence.poolId}: cannot promote canary while missing vault roles: ${row.missingRequiredVaultRoles.join(",")}`);
  }
  if (!["live_read", "canary_passed", "production"].includes(row.status)) {
    errors.push(`${evidence.poolId}: status ${row.status} is not eligible for canary promotion`);
  }
}

if (errors.length > 0) {
  console.error("[ERROR] Canary evidence validation failed:");
  for (const error of errors) console.error(`  - ${error}`);
  process.exit(1);
}

for (const evidence of evidenceRows) {
  const row = rowsByPoolId.get(evidence.poolId);
  row.canaryEvidence = {
    generatedAt: evidence.generatedAt || new Date().toISOString(),
    sourceFile: evidenceArg || "config/all-mainnet-canary-evidence.json",
    canaryTransactions: evidence.canaryTransactions,
    notes: evidence.notes || [],
  };
  row.status = evidence.status === "production" ? "production" : "canary_passed";
  if (!row.notes.includes("Canary evidence recorded from explicit All Mainnet canary evidence file.")) {
    row.notes.push("Canary evidence recorded from explicit All Mainnet canary evidence file.");
  }
  touched.push(row.poolId);
}

matrix.generatedAt = new Date().toISOString();
matrix.statusCounts = matrix.rows.reduce((counts, row) => {
  counts[row.status] = (counts[row.status] || 0) + 1;
  return counts;
}, {});
matrix.protocolCounts = matrix.rows.reduce((counts, row) => {
  counts[row.protocol] = (counts[row.protocol] || 0) + 1;
  return counts;
}, {});

if (!dryRun) {
  writeFileSync(matrixPath, `${JSON.stringify(matrix, null, 2)}\n`);
}

console.log(`[OK] Canary evidence ${dryRun ? "dry run" : "record"} complete: ${touched.length} row(s) touched.`);
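The evidence file this script consumes is not part of this diff; from the fields the validation loop reads (`poolId`, `canaryTransactions[].txHash`, optional `status`, `notes`, `generatedAt`), a minimal entry would look roughly like the sketch below. The pool ID and hash are placeholders, not values from the repo:

```json
{
  "evidence": [
    {
      "poolId": "651940-dodo_pmm-cusdt-cusdc",
      "generatedAt": "2026-04-22T00:00:00Z",
      "status": "canary_passed",
      "canaryTransactions": [
        { "txHash": "0x<64 hex chars of the confirmed canary swap>" }
      ],
      "notes": ["Canary swap confirmed on-chain before recording."]
    }
  ]
}
```

Note the zero hash and anything not matching `^0x[a-fA-F0-9]{64}$` are rejected, so the placeholder above must be replaced with a real transaction hash before the script will promote the row.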
@@ -146,6 +146,40 @@ else
    fi
  fi
done
if [[ -f "$PROJECT_ROOT/scripts/validation/validate-token-list-metadata.mjs" ]]; then
  if node "$PROJECT_ROOT/scripts/validation/validate-token-list-metadata.mjs"; then
    log_ok "Token-list metadata conventions valid"
  else
    log_err "Token-list metadata conventions invalid"
    ERRORS=$((ERRORS + 1))
  fi
fi
if [[ -f "$PROJECT_ROOT/config/all-mainnet-pool-creation-matrix.json" ]] && [[ -f "$PROJECT_ROOT/scripts/validation/validate-pool-creation-matrix.mjs" ]]; then
  if node "$PROJECT_ROOT/scripts/validation/validate-pool-creation-matrix.mjs"; then
    log_ok "ALL Mainnet pool-creation matrix valid"
  else
    log_err "ALL Mainnet pool-creation matrix invalid"
    ERRORS=$((ERRORS + 1))
  fi
fi
# ALL Mainnet (651940) protocol surface — canonical swap-inventory / bridge posture (proxmox root JSON)
if [[ -f "$PROJECT_ROOT/config/allmainnet-non-dodo-protocol-surface.json" ]] && [[ -f "$PROJECT_ROOT/scripts/verify/check-allmainnet-protocol-surface.sh" ]]; then
  if bash "$PROJECT_ROOT/scripts/verify/check-allmainnet-protocol-surface.sh"; then
    log_ok "allmainnet-non-dodo-protocol-surface.json validation OK"
  else
    log_err "allmainnet-non-dodo-protocol-surface.json validation failed"
    ERRORS=$((ERRORS + 1))
  fi
fi
# ALL Mainnet: chains.ts must not claim CCIP/LiFi on 651940 until directories list it
if [[ -f "$PROJECT_ROOT/scripts/verify/check-allmainnet-chains-flags.sh" ]]; then
  if bash "$PROJECT_ROOT/scripts/verify/check-allmainnet-chains-flags.sh"; then
    log_ok "ALL_MAINNET chains.ts flags OK (or skipped if submodule missing)"
  else
    log_err "ALL_MAINNET chains.ts flag check failed"
    ERRORS=$((ERRORS + 1))
  fi
fi
# DUAL_CHAIN config (explorer deploy source)
if [[ -f "$PROJECT_ROOT/explorer-monorepo/backend/api/rest/config/metamask/DUAL_CHAIN_TOKEN_LIST.tokenlist.json" ]] && command -v jq &>/dev/null; then
  if jq -e '(.tokens | type == "array") and (.tokens | length > 0)' "$PROJECT_ROOT/explorer-monorepo/backend/api/rest/config/metamask/DUAL_CHAIN_TOKEN_LIST.tokenlist.json" &>/dev/null; then
345
scripts/validation/validate-pool-creation-matrix.mjs
Executable file
@@ -0,0 +1,345 @@
#!/usr/bin/env node
/**
 * Validate the ALL Mainnet pool-creation matrix.
 *
 * This file is an operational dependency: pool rows are used to decide what can
 * be created, funded, or promoted. The checks here keep the matrix internally
 * consistent and cross-check token addresses against the repo token lists.
 */

import { existsSync, readFileSync } from "node:fs";
import { basename, resolve } from "node:path";

const repoRoot = resolve(new URL("../..", import.meta.url).pathname);
const defaultMatrix = "config/all-mainnet-pool-creation-matrix.json";
const defaultTokenLists = [
  "token-lists/lists/all-mainnet.tokenlist.json",
  "token-lists/lists/dbis-138.tokenlist.json",
  "token-lists/lists/ethereum-mainnet.tokenlist.json",
  "token-lists/lists/arbitrum.tokenlist.json",
  "token-lists/lists/avalanche.tokenlist.json",
  "token-lists/lists/cronos.tokenlist.json",
  "metamask-integration/config/token-list.json",
  "smom-dbis-138/metamask/token-list.json",
];

const requiredVaultRoles = [
  "treasury_reserve",
  "bridge_liquidity",
  "single_sided_inventory",
  "protocol_adapter",
  "emergency_withdraw",
];

const statusesRequiringPoolAddress = new Set(["created", "funded", "live_read", "canary_passed", "production"]);
const addressPattern = /^0x[a-fA-F0-9]{40}$/;

function parseArgs() {
  const args = process.argv.slice(2);
  if (args.includes("--help") || args.includes("-h")) {
    console.log(`Usage: node scripts/validation/validate-pool-creation-matrix.mjs [matrix-path]\n\nDefaults to ${defaultMatrix}.`);
    process.exit(0);
  }
  return args[0] || defaultMatrix;
}

function readJson(file, errors) {
  const abs = resolve(repoRoot, file);
  if (!existsSync(abs)) {
    errors.push(`${file}: missing`);
    return null;
  }
  try {
    return JSON.parse(readFileSync(abs, "utf8"));
  } catch (error) {
    errors.push(`${file}: invalid JSON: ${error.message}`);
    return null;
  }
}

function tokenKey(chainId, symbol) {
  return `${chainId}:${String(symbol).toUpperCase()}`;
}

function buildTokenIndex(warnings) {
  const index = new Map();

  for (const file of defaultTokenLists) {
    const abs = resolve(repoRoot, file);
    if (!existsSync(abs)) {
      warnings.push(`${file}: token list missing; address cross-check skipped for that file`);
      continue;
    }

    let list;
    try {
      list = JSON.parse(readFileSync(abs, "utf8"));
    } catch (error) {
      warnings.push(`${file}: invalid JSON; address cross-check skipped for that file: ${error.message}`);
      continue;
    }

    if (!Array.isArray(list.tokens)) {
      warnings.push(`${file}: missing tokens[]; address cross-check skipped for that file`);
      continue;
    }

    for (const token of list.tokens) {
      if (!token?.chainId || !token?.symbol || !addressPattern.test(String(token.address || ""))) {
        continue;
      }
      const key = tokenKey(token.chainId, token.symbol);
      if (!index.has(key)) {
        index.set(key, new Map());
      }
      index.get(key).set(String(token.address).toLowerCase(), `${file}:${token.symbol}`);
    }
  }

  return index;
}

function ref(row, index) {
  return `rows[${index}] ${row.poolId || "<missing-poolId>"}`;
}

function slug(value) {
  return String(value || "").toLowerCase();
}

function sortedStrings(values) {
  return [...values].sort((a, b) => a.localeCompare(b));
}

function countBy(rows, key) {
  const counts = {};
  for (const row of rows) {
    const value = row[key];
    counts[value] = (counts[value] || 0) + 1;
  }
  return counts;
}

function sameCounts(actual, expected) {
  const keys = new Set([...Object.keys(actual || {}), ...Object.keys(expected || {})]);
  for (const key of keys) {
    if ((actual?.[key] || 0) !== (expected?.[key] || 0)) {
      return false;
    }
  }
  return true;
}

function validateAddress(value, path, errors, { allowNull = true } = {}) {
  if (value === null && allowNull) {
    return;
  }
  if (typeof value !== "string" || !addressPattern.test(value)) {
    errors.push(`${path}: expected ${allowNull ? "null or " : ""}0x-prefixed 20-byte address`);
  }
}

function validateTokenAddress(row, index, side, tokenIndex, errors, warnings) {
  const token = row[side];
  const rowRef = ref(row, index);
  if (!token || typeof token !== "object") {
    errors.push(`${rowRef}: ${side} must be an object`);
    return;
  }
  if (typeof token.symbol !== "string" || token.symbol.length === 0) {
    errors.push(`${rowRef}: ${side}.symbol is required`);
  }
  validateAddress(token.address, `${rowRef}: ${side}.address`, errors);

  const known = tokenIndex.get(tokenKey(row.chainId, token.symbol));
  if (!known || known.size !== 1) {
    return;
  }

  const [knownAddress, source] = [...known.entries()][0];
  if (token.address === null) {
    if (row.status !== "planned" || row.publicRoutingEnabled === true) {
      errors.push(`${rowRef}: ${side} ${token.symbol} address is missing but ${source} has ${knownAddress}`);
    }
    return;
  }

  if (String(token.address).toLowerCase() !== knownAddress) {
    warnings.push(`${rowRef}: ${side} ${token.symbol} address ${token.address} differs from ${source} ${knownAddress}`);
  }
}

function validateMatrix(file, matrix, tokenIndex) {
  const errors = [];
  const warnings = [];

  if (!matrix || typeof matrix !== "object" || Array.isArray(matrix)) {
    errors.push(`${file}: root must be an object`);
    return { errors, warnings };
  }

  if (typeof matrix.version !== "string" || matrix.version.length === 0) {
    errors.push(`${file}: version is required`);
  }
  if (typeof matrix.generatedAt !== "string" || Number.isNaN(Date.parse(matrix.generatedAt))) {
    errors.push(`${file}: generatedAt must be an ISO-like date string`);
  }
  if (!Array.isArray(matrix.rows) || matrix.rows.length === 0) {
    errors.push(`${file}: rows[] is required`);
    return { errors, warnings };
  }
  if (!Array.isArray(matrix.lifecycle) || matrix.lifecycle.length === 0) {
    errors.push(`${file}: lifecycle[] is required`);
  }
  if (!Array.isArray(matrix.protocolRolloutOrder) || matrix.protocolRolloutOrder.length === 0) {
    errors.push(`${file}: protocolRolloutOrder[] is required`);
  }

  const protocolCounts = countBy(matrix.rows, "protocol");
  const statusCounts = countBy(matrix.rows, "status");
  if (!sameCounts(protocolCounts, matrix.protocolCounts)) {
    errors.push(`${file}: protocolCounts does not match rows (${JSON.stringify(protocolCounts)})`);
  }
  if (!sameCounts(statusCounts, matrix.statusCounts)) {
    errors.push(`${file}: statusCounts does not match rows (${JSON.stringify(statusCounts)})`);
  }

  const lifecycle = new Set(matrix.lifecycle || []);
  const rollout = new Set(matrix.protocolRolloutOrder || []);
  const poolIds = new Set();

  matrix.rows.forEach((row, index) => {
    const rowRef = ref(row, index);
    if (!row || typeof row !== "object" || Array.isArray(row)) {
      errors.push(`rows[${index}]: row must be an object`);
      return;
    }

    if (typeof row.poolId !== "string" || row.poolId.length === 0) {
      errors.push(`${rowRef}: poolId is required`);
    } else if (poolIds.has(row.poolId)) {
      errors.push(`${rowRef}: duplicate poolId`);
    } else {
      poolIds.add(row.poolId);
    }

    if (!Number.isInteger(row.chainId)) {
      errors.push(`${rowRef}: chainId must be an integer`);
    }
    if (!rollout.has(row.protocol)) {
      errors.push(`${rowRef}: protocol ${row.protocol} is not in protocolRolloutOrder`);
    }
    if (!lifecycle.has(row.status)) {
      errors.push(`${rowRef}: status ${row.status} is not in lifecycle`);
    }

    const expectedPoolId = `${row.chainId}-${row.protocol}-${slug(row.baseToken?.symbol)}-${slug(row.quoteToken?.symbol)}`;
    if (row.poolId && row.poolId !== expectedPoolId) {
      errors.push(`${rowRef}: poolId should be ${expectedPoolId}`);
    }

    validateTokenAddress(row, index, "baseToken", tokenIndex, errors, warnings);
    validateTokenAddress(row, index, "quoteToken", tokenIndex, errors, warnings);

    validateAddress(row.factoryAddress, `${rowRef}: factoryAddress`, errors);
    validateAddress(row.routerAddress, `${rowRef}: routerAddress`, errors);
    validateAddress(row.poolAddress, `${rowRef}: poolAddress`, errors, {
      allowNull: !statusesRequiringPoolAddress.has(row.status),
    });
    validateAddress(row.vaultAddress, `${rowRef}: vaultAddress`, errors);

    if (statusesRequiringPoolAddress.has(row.status) && row.poolAddress === null) {
      errors.push(`${rowRef}: status ${row.status} requires poolAddress`);
    }

    const shouldBeSingleSided = row.protocol === "single_sided_pmm" || row.poolType === "single_sided";
    if (row.singleSided !== shouldBeSingleSided) {
      errors.push(`${rowRef}: singleSided should be ${shouldBeSingleSided}`);
    }

    if (!Array.isArray(row.vaultAssignments)) {
      errors.push(`${rowRef}: vaultAssignments[] is required`);
    } else {
      const roles = row.vaultAssignments.map((assignment) => assignment?.role);
      const roleSet = new Set(roles);
      const expectedRoles = new Set(requiredVaultRoles);
      if (roleSet.size !== expectedRoles.size || requiredVaultRoles.some((role) => !roleSet.has(role))) {
        errors.push(`${rowRef}: vaultAssignments roles must be ${requiredVaultRoles.join(",")}`);
      }
      for (const assignment of row.vaultAssignments) {
        if (!assignment || typeof assignment !== "object") {
          errors.push(`${rowRef}: vaultAssignments entries must be objects`);
          continue;
        }
        validateAddress(assignment.vaultAddress, `${rowRef}: vaultAssignments.${assignment.role || "<missing-role>"}.vaultAddress`, errors);
        if (typeof assignment.requiredBeforeFunding !== "boolean") {
          errors.push(`${rowRef}: vaultAssignments.${assignment.role || "<missing-role>"}.requiredBeforeFunding must be boolean`);
        }
      }

      const actualMissing = sortedStrings(
        row.vaultAssignments
          .filter((assignment) => assignment?.requiredBeforeFunding === true && assignment.vaultAddress === null)
          .map((assignment) => assignment.role),
      );
      const declaredMissing = sortedStrings(row.missingRequiredVaultRoles || []);
      if (actualMissing.join("|") !== declaredMissing.join("|")) {
        errors.push(`${rowRef}: missingRequiredVaultRoles should be [${actualMissing.join(", ")}]`);
      }
      const expectedStatus = actualMissing.length > 0 ? "missing_required_vaults" : "ready";
      if (row.vaultAssignmentStatus !== expectedStatus) {
        errors.push(`${rowRef}: vaultAssignmentStatus should be ${expectedStatus}`);
      }
    }

    const tiers = row.fundingTiersUsd;
    if (!tiers || typeof tiers !== "object") {
      errors.push(`${rowRef}: fundingTiersUsd is required`);
    } else if (!(tiers.seed > 0 && tiers.smoke >= tiers.seed && tiers.productionMinimum >= tiers.smoke)) {
      errors.push(`${rowRef}: fundingTiersUsd must satisfy seed > 0, smoke >= seed, productionMinimum >= smoke`);
    }

    const policy = row.policy;
    if (!policy || typeof policy !== "object") {
      errors.push(`${rowRef}: policy is required`);
    } else {
      for (const key of ["maxPriceImpactBps", "minReserveUsd", "refillTriggerBps"]) {
        if (typeof policy[key] !== "number" || policy[key] < 0) {
          errors.push(`${rowRef}: policy.${key} must be a non-negative number`);
        }
      }
      if (typeof policy.pauseOnReserveReadFailure !== "boolean") {
        errors.push(`${rowRef}: policy.pauseOnReserveReadFailure must be boolean`);
      }
    }

    if (!Array.isArray(row.notes)) {
      errors.push(`${rowRef}: notes[] is required`);
    }
  });

  return { errors, warnings };
}

const matrixPath = parseArgs();
const bootstrapErrors = [];
const bootstrapWarnings = [];
const matrix = readJson(matrixPath, bootstrapErrors);
const tokenIndex = buildTokenIndex(bootstrapWarnings);
const { errors, warnings } = matrix ? validateMatrix(matrixPath, matrix, tokenIndex) : { errors: [], warnings: [] };
const allErrors = [...bootstrapErrors, ...errors];
const allWarnings = [...bootstrapWarnings, ...warnings];

for (const warning of allWarnings) {
  console.warn(`[WARN] ${warning}`);
}

if (allErrors.length > 0) {
  console.error(`[ERROR] Pool-creation matrix validation failed with ${allErrors.length} issue(s):`);
  for (const error of allErrors) {
    console.error(`  - ${error}`);
  }
  process.exit(1);
}

console.log(`[OK] ${basename(matrixPath)} valid: ${matrix.rows.length} row(s), ${Object.keys(matrix.protocolCounts || {}).length} protocol(s).`);
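Pulling the per-row checks together, a matrix row that satisfies every structural rule above would look roughly like the sketch below. This is illustrative only: statuses, tier amounts, and policy values are placeholders, and `poolId` is derived as `chainId-protocol-base-quote` per the validator:

```json
{
  "poolId": "651940-dodo_pmm-cusdt-cusdc",
  "chainId": 651940,
  "protocol": "dodo_pmm",
  "status": "planned",
  "baseToken": { "symbol": "cUSDT", "address": null },
  "quoteToken": { "symbol": "cUSDC", "address": null },
  "factoryAddress": null,
  "routerAddress": null,
  "poolAddress": null,
  "vaultAddress": null,
  "singleSided": false,
  "vaultAssignments": [
    { "role": "treasury_reserve", "vaultAddress": null, "requiredBeforeFunding": false },
    { "role": "bridge_liquidity", "vaultAddress": null, "requiredBeforeFunding": false },
    { "role": "single_sided_inventory", "vaultAddress": null, "requiredBeforeFunding": false },
    { "role": "protocol_adapter", "vaultAddress": null, "requiredBeforeFunding": false },
    { "role": "emergency_withdraw", "vaultAddress": null, "requiredBeforeFunding": false }
  ],
  "missingRequiredVaultRoles": [],
  "vaultAssignmentStatus": "ready",
  "fundingTiersUsd": { "seed": 100, "smoke": 500, "productionMinimum": 5000 },
  "policy": { "maxPriceImpactBps": 100, "minReserveUsd": 1000, "refillTriggerBps": 2500, "pauseOnReserveReadFailure": true },
  "notes": []
}
```

A `null` poolAddress is only legal here because the status is `planned`; any of `created`, `funded`, `live_read`, `canary_passed`, or `production` would require a real 20-byte address.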
240
scripts/validation/validate-token-list-metadata.mjs
Executable file
@@ -0,0 +1,240 @@
#!/usr/bin/env node
/**
 * Validate DBIS token-list metadata conventions.
 *
 * This complements the Uniswap token-list schema validator. The schema checks
 * shape; this script checks the meaning of the compact tags/extensions we use
 * for fiat, cash-like, GRU, commodity, and wrapped-token presentation.
 */

import { existsSync, readFileSync } from "node:fs";
import { resolve } from "node:path";

const repoRoot = resolve(new URL("../..", import.meta.url).pathname);

const defaultTokenLists = [
  "token-lists/lists/all-mainnet.tokenlist.json",
  "token-lists/lists/arbitrum.tokenlist.json",
  "token-lists/lists/avalanche.tokenlist.json",
  "token-lists/lists/cronos.tokenlist.json",
  "token-lists/lists/dbis-138.tokenlist.json",
  "token-lists/lists/ethereum-mainnet.tokenlist.json",
  "metamask-integration/config/token-list.json",
  "metamask-integration/provider/config/DUAL_CHAIN_TOKEN_LIST.tokenlist.json",
  "metamask-integration/docs/METAMASK_TOKEN_LIST.json",
  "smom-dbis-138/metamask/token-list.json",
];

const conventionTags = new Set(["fiat", "cash", "gru", "commodity"]);
const protocolSymbols = new Set(["AUDA", "HYDX", "HYBX", "CHT"]);
const cryptoCollateralStablecoins = new Set(["DAI"]);
const fiatCurrencies = new Set(["USD", "EUR", "GBP", "AUD", "JPY", "CHF", "CAD"]);
const allowedCategories = new Set([
  "tokenized-fiat",
  "stablecoin",
  "wrapped-native",
  "defi-token",
  "dex-token",
  "utility-token",
  "commodity-token",
]);

function parseArgs() {
  const args = process.argv.slice(2);
  if (args.includes("--help") || args.includes("-h")) {
    console.log(`Usage: node scripts/validation/validate-token-list-metadata.mjs [token-list ...]\n\nIf no token-list paths are supplied, validates the repo's canonical token-list files that exist.`);
    process.exit(0);
  }
  return args.length > 0 ? args : defaultTokenLists;
}

function isScalar(value) {
  return value === null || ["string", "number", "boolean"].includes(typeof value);
}

function tokenRef(file, index, token) {
  return `${file} tokens[${index}] ${token.symbol || "<missing-symbol>"} ${token.chainId || "<missing-chain>"} ${token.address || "<missing-address>"}`;
}

function hasTag(token, tag) {
  return Array.isArray(token.tags) && token.tags.includes(tag);
}

function tagDefs(list) {
  return list.tags && typeof list.tags === "object" && !Array.isArray(list.tags)
    ? list.tags
    : {};
}

function validateList(file, list) {
  const errors = [];
  const warnings = [];
  const tags = tagDefs(list);

  if (!Array.isArray(list.tokens)) {
    errors.push(`${file}: missing tokens[]`);
    return { errors, warnings };
  }

  for (const conventionTag of conventionTags) {
    const used = list.tokens.some((token) => hasTag(token, conventionTag));
    if (used && !tags[conventionTag]) {
      errors.push(`${file}: tag "${conventionTag}" is used but missing from top-level tags`);
    }
  }

  list.tokens.forEach((token, index) => {
    const ref = tokenRef(file, index, token);
    const tokenTags = Array.isArray(token.tags) ? token.tags : [];
    const extensions = token.extensions ?? {};

    for (const tag of tokenTags) {
      if (typeof tag !== "string") {
        errors.push(`${ref}: tag values must be strings`);
      } else if (tag.length > 10) {
        errors.push(`${ref}: tag "${tag}" is longer than 10 characters`);
      }
    }

    if (token.extensions !== undefined) {
      if (!extensions || typeof extensions !== "object" || Array.isArray(extensions)) {
        errors.push(`${ref}: extensions must be an object when present`);
      } else {
        const keys = Object.keys(extensions);
        if (keys.length > 10) {
          errors.push(`${ref}: extensions has ${keys.length} keys; max is 10`);
        }
        for (const [key, value] of Object.entries(extensions)) {
          if (!isScalar(value)) {
            errors.push(`${ref}: extensions.${key} must be scalar/null, not ${Array.isArray(value) ? "array" : typeof value}`);
          }
          if (typeof value === "string" && value.length > 42) {
            errors.push(`${ref}: extensions.${key} is longer than 42 characters`);
          }
        }
      }
    }

    if (extensions.category && !allowedCategories.has(extensions.category)) {
      errors.push(`${ref}: extensions.category "${extensions.category}" is not in the allowed metadata category set`);
    }

    if (hasTag(token, "cash")) {
      if (!hasTag(token, "fiat")) {
        errors.push(`${ref}: cash tag requires fiat tag`);
      }
      if (extensions.category !== "tokenized-fiat") {
        errors.push(`${ref}: cash tag requires extensions.category=tokenized-fiat`);
      }
      if (extensions.cashLike !== true) {
        errors.push(`${ref}: cash tag requires extensions.cashLike=true`);
      }
      if (extensions.settlement !== "fiat") {
        errors.push(`${ref}: cash tag requires extensions.settlement=fiat`);
      }
      if (typeof extensions.backing !== "string" || !extensions.backing.includes("cash")) {
        errors.push(`${ref}: cash tag requires extensions.backing to include cash`);
      }
    }

    if (hasTag(token, "fiat")) {
      if (extensions.category !== "tokenized-fiat") {
        errors.push(`${ref}: fiat tag requires extensions.category=tokenized-fiat`);
      }
      if (!fiatCurrencies.has(extensions.currency)) {
        errors.push(`${ref}: fiat tag requires extensions.currency to be one of ${[...fiatCurrencies].join(",")}`);
      }
      if (extensions.settlement !== "fiat") {
        errors.push(`${ref}: fiat tag requires extensions.settlement=fiat`);
      }
    }

    if (hasTag(token, "gru")) {
      if (typeof extensions.gruVersion !== "string" || !/^v\d+$/.test(extensions.gruVersion)) {
        errors.push(`${ref}: gru tag requires extensions.gruVersion like v1 or v2`);
      }
      if (typeof extensions.gruFamily !== "string" || extensions.gruFamily.length === 0) {
        errors.push(`${ref}: gru tag requires extensions.gruFamily`);
      }
    }

    if (hasTag(token, "commodity")) {
      if (hasTag(token, "cash") || hasTag(token, "fiat")) {
        errors.push(`${ref}: commodity token must not be tagged cash or fiat`);
      }
      if (extensions.category !== "commodity-token") {
        errors.push(`${ref}: commodity tag requires extensions.category=commodity-token`);
      }
      if (extensions.cashLike !== false) {
        errors.push(`${ref}: commodity tag requires extensions.cashLike=false`);
      }
      if (extensions.backing !== "commodity-reserves") {
        errors.push(`${ref}: commodity tag requires extensions.backing=commodity-reserves`);
      }
    }

    if (protocolSymbols.has(token.symbol)) {
      if (hasTag(token, "cash") || hasTag(token, "fiat") || hasTag(token, "gru")) {
        errors.push(`${ref}: protocol token ${token.symbol} must not be tagged cash, fiat, or gru`);
      }
      if (extensions.category === "tokenized-fiat") {
        errors.push(`${ref}: protocol token ${token.symbol} must not use category tokenized-fiat`);
      }
      if (extensions.cashLike === true) {
        errors.push(`${ref}: protocol token ${token.symbol} must not be cashLike`);
      }
    }

    if (cryptoCollateralStablecoins.has(token.symbol)) {
      if (hasTag(token, "cash") || hasTag(token, "fiat")) {
        errors.push(`${ref}: ${token.symbol} must not be tagged cash or fiat`);
      }
      if (extensions.instrument !== "crypto-collateralized-stablecoin") {
        errors.push(`${ref}: ${token.symbol} requires extensions.instrument=crypto-collateralized-stablecoin`);
      }
      if (extensions.cashLike !== false) {
        errors.push(`${ref}: ${token.symbol} requires extensions.cashLike=false`);
      }
    }
  });

  return { errors, warnings };
}

const files = parseArgs();
const allErrors = [];
const allWarnings = [];
let validated = 0;

for (const file of files) {
  const abs = resolve(repoRoot, file);
  if (!existsSync(abs)) {
    allWarnings.push(`${file}: missing; skipped`);
    continue;
  }
  let list;
  try {
    list = JSON.parse(readFileSync(abs, "utf8"));
  } catch (error) {
|
||||
allErrors.push(`${file}: invalid JSON: ${error.message}`);
|
||||
continue;
|
||||
}
|
||||
const { errors, warnings } = validateList(file, list);
|
||||
allErrors.push(...errors);
|
||||
allWarnings.push(...warnings);
|
||||
validated += 1;
|
||||
}
|
||||
|
||||
for (const warning of allWarnings) {
|
||||
console.warn(`[WARN] ${warning}`);
|
||||
}
|
||||
|
||||
if (allErrors.length > 0) {
|
||||
console.error(`[ERROR] Token-list metadata validation failed with ${allErrors.length} issue(s):`);
|
||||
for (const error of allErrors) {
|
||||
console.error(` - ${error}`);
|
||||
}
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
console.log(`[OK] Token-list metadata conventions valid for ${validated} file(s).`);
|
||||
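The tag-to-extensions coupling enforced above can be exercised in isolation. A minimal sketch (the `checkCashCoupling` helper, `hasTag` re-implementation, and sample entries are hypothetical, mirroring the script's rule shape and error format; the real validator carries more rule groups and the fiat-currency whitelist):

```javascript
// Sketch of the "cash tag" coupling rules: a cash-tagged token must also be
// fiat-tagged, categorized tokenized-fiat, cashLike, and cash-backed.
const hasTag = (token, tag) => Array.isArray(token.tags) && token.tags.includes(tag);

function checkCashCoupling(ref, token) {
  const errors = [];
  const ext = token.extensions || {};
  if (!hasTag(token, "cash")) return errors; // rules only apply to cash-tagged tokens
  if (!hasTag(token, "fiat")) errors.push(`${ref}: cash tag requires fiat tag`);
  if (ext.category !== "tokenized-fiat") errors.push(`${ref}: cash tag requires extensions.category=tokenized-fiat`);
  if (ext.cashLike !== true) errors.push(`${ref}: cash tag requires extensions.cashLike=true`);
  if (typeof ext.backing !== "string" || !ext.backing.includes("cash")) {
    errors.push(`${ref}: cash tag requires extensions.backing to include cash`);
  }
  return errors;
}

// A cUSDT-shaped entry passes; dropping the fiat tag yields exactly one error.
const good = {
  tags: ["stablecoin", "defi", "fiat", "cash"],
  extensions: { category: "tokenized-fiat", cashLike: true, backing: "cash,cash-equivalents" },
};
const bad = { ...good, tags: ["stablecoin", "cash"] };
console.log(checkCashCoupling("list#0", good).length); // 0
console.log(checkCashCoupling("list#1", bad).length); // 1
```

The point of the coupling is that a tag is never free-floating: each classification tag implies a concrete, machine-checkable extensions shape.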
36  scripts/verify/check-allmainnet-chains-flags.sh  Normal file
@@ -0,0 +1,36 @@
#!/usr/bin/env bash
# Ensure alltra-lifi-settlement ALL_MAINNET keeps ccipSupported:false and lifiSupported:false
# until CCIP/LiFi officially list chain 651940.
# Usage: bash scripts/verify/check-allmainnet-chains-flags.sh
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
F="$PROJECT_ROOT/alltra-lifi-settlement/src/config/chains.ts"

if [[ ! -f "$F" ]]; then
  echo "[SKIP] alltra-lifi-settlement/src/config/chains.ts not found"
  exit 0
fi

BLOCK="$(awk '/ALL_MAINNET: \{/{flag=1} flag{print} flag && /^  \},$/ {exit}' "$F")"

if echo "$BLOCK" | grep -q 'ccipSupported: true'; then
  echo "[ERROR] ALL_MAINNET must keep ccipSupported: false until CCIP Directory lists 651940"
  exit 1
fi
if echo "$BLOCK" | grep -q 'lifiSupported: true'; then
  echo "[ERROR] ALL_MAINNET must keep lifiSupported: false until LiFi API lists 651940"
  exit 1
fi
if ! echo "$BLOCK" | grep -q 'ccipSupported: false'; then
  echo "[ERROR] ALL_MAINNET block missing ccipSupported: false"
  exit 1
fi
if ! echo "$BLOCK" | grep -q 'lifiSupported: false'; then
  echo "[ERROR] ALL_MAINNET block missing lifiSupported: false"
  exit 1
fi

echo "[OK] ALL_MAINNET ccipSupported/lifiSupported flags (651940)"
exit 0
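The script's extract-then-grep guard (slice the ALL_MAINNET block, then require both flags to be explicitly false and never true) can be sketched as a pure function. `checkFlags` and the sample `chains.ts` content below are illustrative stand-ins, not the repo's actual file:

```javascript
// Sketch of the ALL_MAINNET flag guard: extract the config block, then assert
// both bridge/routing flags are explicitly false (mirrors the awk+grep checks).
function checkFlags(chainsTs) {
  const m = chainsTs.match(/ALL_MAINNET: \{[\s\S]*?\n  \},/);
  if (!m) return ["ALL_MAINNET block not found"];
  const block = m[0];
  const errors = [];
  for (const flag of ["ccipSupported", "lifiSupported"]) {
    if (block.includes(`${flag}: true`)) errors.push(`${flag} must stay false until 651940 is listed`);
    if (!block.includes(`${flag}: false`)) errors.push(`missing ${flag}: false`);
  }
  return errors;
}

const sample = `export const CHAINS = {
  ALL_MAINNET: {
    chainId: 651940,
    ccipSupported: false,
    lifiSupported: false,
  },
};`;
console.log(checkFlags(sample)); // []
```

Checking for both the absence of `true` and the presence of an explicit `false` means a silently deleted flag fails the gate rather than passing by omission.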
@@ -0,0 +1,7 @@
#!/usr/bin/env bash
# Historical name: "gate vs surface" alignment. Canonical surface is
# config/allmainnet-non-dodo-protocol-surface.json (no separate production-gate file required).
# Delegates to check-allmainnet-protocol-surface.sh for internal consistency checks.
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
exec bash "$SCRIPT_DIR/check-allmainnet-protocol-surface.sh" "$@"
69  scripts/verify/check-allmainnet-protocol-surface.sh  Normal file
@@ -0,0 +1,69 @@
#!/usr/bin/env bash
# Validate config/allmainnet-non-dodo-protocol-surface.json shape and internal consistency.
# Usage: from repo root: bash scripts/verify/check-allmainnet-protocol-surface.sh
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
F="$PROJECT_ROOT/config/allmainnet-non-dodo-protocol-surface.json"

if [[ ! -f "$F" ]]; then
  echo "[ERROR] Missing $F"
  exit 1
fi
if ! command -v jq &>/dev/null; then
  echo "[ERROR] jq is required for ALL Mainnet surface validation"
  exit 1
fi

jq -e '
  (.chainId == 651940)
  and (.status | type == "string")
  and (.summary | type == "object")
  and (.summary.bridgeOnlyLive | type == "boolean")
  and (.summary.sameChainSwapInventoryPublished | type == "boolean")
  and (.classificationFramework.metadataDomains | type == "array")
  and ((.classificationFramework.metadataDomains - [
    "backingMetadata",
    "bridgeMetadata",
    "cashMetadata",
    "commodityMetadata",
    "reserveMetadata",
    "securityMetadata",
    "settlementMetadata"
  ]) | length == 0)
  and (.documentedTokens | type == "array")
  and ((.documentedTokens | length) > 0)
  and all(.documentedTokens[]; . as $token | (
    (.symbol | type == "string")
    and (.address | test("^0x[0-9a-fA-F]{40}$"))
    and (.category | type == "string")
    and (.instrumentType | type == "string")
    and (.backingAssets | type == "array")
    and (.tags | type == "array")
    and (.backingMetadata | type == "object")
    and (.bridgeMetadata | type == "object")
    and (.cashMetadata | type == "object")
    and (.commodityMetadata | type == "object")
    and (.reserveMetadata | type == "object")
    and (.securityMetadata | type == "object")
    and (.settlementMetadata | type == "object")
    and (($token.gruVersion == null) or (($token.tags | index("gru:" + $token.gruVersion)) != null))
  ))
  and (.bridgeSurface.adapter.address | test("^0x[0-9a-fA-F]{40}$"))
  and (.bridgeSurface.adapter.status == "live")
' "$F" >/dev/null || {
  echo "[ERROR] allmainnet-non-dodo-protocol-surface.json: expected chainId 651940, summary flags, metadata domains, documented token metadata, GRU version tags, and live bridge adapter"
  exit 1
}

PUBLISHED="$(jq -r '.summary.sameChainSwapInventoryPublished' "$F")"
STATUS="$(jq -r '.status' "$F")"

if [[ "$PUBLISHED" == "true" ]] && [[ "$STATUS" == "bridge_live_swap_inventory_pending" ]]; then
  echo "[ERROR] Inconsistent: summary.sameChainSwapInventoryPublished is true but status is still bridge_live_swap_inventory_pending (update status when promoting inventory)"
  exit 1
fi

echo "[OK] allmainnet-non-dodo-protocol-surface.json OK (chainId, summary flags, publication vs status)"
exit 0
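The jq predicate's GRU rule (a documented token with a `gruVersion` must also carry the matching `gru:<version>` tag) can be restated as a small sketch. `gruTagConsistent` is a hypothetical helper; field names mirror the surface JSON:

```javascript
// Sketch of the GRU-version cross-check from the jq predicate: gruVersion is
// optional, but when present the tags array must contain "gru:<version>".
function gruTagConsistent(token) {
  if (token.gruVersion == null) return true; // no version declared, nothing to cross-check
  return Array.isArray(token.tags) && token.tags.includes(`gru:${token.gruVersion}`);
}

console.log(gruTagConsistent({ symbol: "cUSD", gruVersion: "v1", tags: ["gru:v1"] })); // true
console.log(gruTagConsistent({ symbol: "cUSD", gruVersion: "v2", tags: ["gru:v1"] })); // false
console.log(gruTagConsistent({ symbol: "WETH9", tags: [] })); // true
```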
28  scripts/verify/report-gitea-cd-parity.sh  Executable file
@@ -0,0 +1,28 @@
#!/usr/bin/env bash
# Report Gitea CD wiring: unique deploy-target repos vs workflow files in this workspace.
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
TARGETS="$ROOT/phoenix-deploy-api/deploy-targets.json"

if [[ ! -f "$TARGETS" ]]; then
  echo "Missing: $TARGETS" >&2
  exit 1
fi
if ! command -v jq >/dev/null 2>&1; then
  echo "jq required" >&2
  exit 1
fi

echo "== Unique (repo, branch, target) in deploy-targets.json =="
jq -r '.targets[] | "\(.repo)\t\(.branch // "main")\t\(.target // "default")"' "$TARGETS" | sort -u

echo ""
echo "== Workflows in proxmox workspace submodules (sample) =="
for d in explorer-monorepo cross-chain-pmm-lps; do
  if [[ -d "$ROOT/$d/.gitea/workflows" ]]; then
    echo "--- $d ---"
    ls -1 "$ROOT/$d/.gitea/workflows" 2>/dev/null || true
  fi
done
echo "See config/gitea-workflow-templates/repos/ for copy-paste workflows for repos not submodules here."
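The jq one-liner's dedup-with-defaults behavior (branch falls back to `main`, target to `default`, then rows are uniqued and sorted) can be mirrored in a sketch; `uniqueTargets` and the sample payload are hypothetical:

```javascript
// Sketch of the deploy-target report: unique (repo, branch, target) triples
// with the same defaults the jq filter applies via // "main" and // "default".
function uniqueTargets(deployTargets) {
  const rows = deployTargets.targets.map(
    (t) => `${t.repo}\t${t.branch ?? "main"}\t${t.target ?? "default"}`
  );
  return [...new Set(rows)].sort(); // matches `sort -u` over tab-joined rows
}

const sample = {
  targets: [
    { repo: "explorer-monorepo", branch: "master", target: "web" },
    { repo: "cross-chain-pmm-lps" },
    { repo: "cross-chain-pmm-lps" }, // duplicate collapses to one row
  ],
};
console.log(uniqueTargets(sample));
```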
@@ -91,16 +91,6 @@ run_summary_record_step "1d" "main/master workflow parity" "success" "$((SECONDS
 step_done "$STEP_STARTED"
 echo ""
 
-echo "1c. Gitea workflow source sync..."
-bash "$SCRIPT_DIR/check-gitea-workflows.sh" || log_err "Gitea workflow source drift"
-log_ok "Gitea workflows match source-of-truth files"
-echo ""
-
-echo "1d. main/master workflow parity..."
-bash "$SCRIPT_DIR/check-gitea-branch-workflow-parity.sh" || log_err "main/master workflow parity drift"
-log_ok "main/master workflow parity OK"
-echo ""
-
 echo "2. Config files..."
 STEP_STARTED=$SECONDS
 bash "$SCRIPT_DIR/../validation/validate-config-files.sh" || log_err "validate-config-files failed"
@@ -136,6 +126,19 @@ run_summary_record_step "3b" "deployment-status graph" "success" "$((SECONDS - S
 step_done "$STEP_STARTED"
 echo ""
 
+echo "3b2. capital efficiency risk simulator (cross-chain-pmm-lps)..."
+STEP_STARTED=$SECONDS
+CAPITAL_VALIDATE="$PROJECT_ROOT/cross-chain-pmm-lps/scripts/validate-capital-efficiency.cjs"
+if [[ -f "$CAPITAL_VALIDATE" ]] && command -v node &>/dev/null; then
+  node "$CAPITAL_VALIDATE" || log_err "validate-capital-efficiency.cjs failed"
+  log_ok "capital efficiency simulator rules OK"
+else
+  echo "  (skip: node or $CAPITAL_VALIDATE missing)"
+fi
+run_summary_record_step "3b2" "capital efficiency risk simulator" "success" "$((SECONDS - STEP_STARTED))"
+step_done "$STEP_STARTED"
+echo ""
+
 echo "3c. External dependency blockers..."
 STEP_STARTED=$SECONDS
 EXT_CHECK="$SCRIPT_DIR/check-external-dependencies.sh"
Submodule smom-dbis-138 updated: fcd55aa9c4...68cd541265
@@ -3,7 +3,7 @@
   "version": {
     "major": 1,
     "minor": 1,
-    "patch": 0
+    "patch": 1
   },
   "timestamp": "2026-02-28T00:00:00.000Z",
   "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png",
@@ -23,8 +23,22 @@
       "logoURI": "https://ipfs.io/ipfs/QmRfhPs9DcyFPpGjKwF6CCoVDWUHSxkQR34n9NK7JSbPCP",
       "tags": [
         "stablecoin",
-        "defi"
-      ]
+        "defi",
+        "fiat",
+        "cash"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "fiat-backed-stablecoin",
+        "currency": "USD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "x402Ready": false,
+        "fwdCanon": false,
+        "walletClass": "cash-like-token",
+        "bridge": "AlltraAdapter:cUSDT->AUSDT"
+      }
     },
     {
       "chainId": 651940,
@@ -35,8 +49,22 @@
       "logoURI": "https://ipfs.io/ipfs/QmRfhPs9DcyFPpGjKwF6CCoVDWUHSxkQR34n9NK7JSbPCP",
       "tags": [
         "stablecoin",
-        "defi"
-      ]
+        "defi",
+        "fiat",
+        "cash"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "fiat-backed-stablecoin",
+        "currency": "USD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "x402Ready": false,
+        "fwdCanon": false,
+        "walletClass": "cash-like-token",
+        "bridge": "unknown-or-noncanonical:documented-token-n"
+      }
     },
     {
       "chainId": 651940,
@@ -47,8 +75,22 @@
       "logoURI": "https://ipfs.io/ipfs/QmNPq4D5JXzurmi9jAhogVMzhAQRk1PZ1r9H3qQUV9gjDm",
       "tags": [
         "stablecoin",
-        "defi"
-      ]
+        "defi",
+        "fiat",
+        "cash"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "fiat-backed-stablecoin",
+        "currency": "USD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "x402Ready": false,
+        "fwdCanon": false,
+        "walletClass": "cash-like-token",
+        "bridge": "AlltraAdapter:cUSDC->USDC"
+      }
     },
     {
       "chainId": 651940,
@@ -60,7 +102,15 @@
       "tags": [
         "defi",
         "wrapped"
-      ]
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset",
+        "walletClass": "token"
+      }
     },
     {
       "chainId": 651940,
@@ -72,7 +122,15 @@
       "tags": [
         "defi",
         "wrapped"
-      ]
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset",
+        "walletClass": "token"
+      }
     },
     {
       "chainId": 651940,
@@ -116,21 +174,36 @@
       "tags": [
         "defi"
       ],
-      "logoURI": "https://ipfs.io/ipfs/Qma3FKtLce9MjgJgWbtyCxBiPjJ6xi8jGWUSKNS5Jc2ong"
+      "logoURI": "https://ipfs.io/ipfs/Qma3FKtLce9MjgJgWbtyCxBiPjJ6xi8jGWUSKNS5Jc2ong",
+      "extensions": {
+        "category": "defi-token",
+        "instrument": "protocol-token",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "protocol-utility"
+      }
     }
   ],
   "tags": {
     "stablecoin": {
       "name": "Stablecoin",
-      "description": "Stable value tokens pegged to fiat currencies"
+      "description": "Stable value tokens pegged to fiat"
     },
     "defi": {
       "name": "DeFi",
       "description": "Decentralized Finance tokens"
     },
+    "fiat": {
+      "name": "Fiat",
+      "description": "Fiat referenced tokens"
+    },
+    "cash": {
+      "name": "Cashlike",
+      "description": "Cash reserve or cash rail assets"
+    },
     "wrapped": {
       "name": "Wrapped",
-      "description": "Wrapped tokens representing native assets"
+      "description": "Wrapped tokens representing assets"
     }
   }
 }
@@ -1,15 +1,67 @@
 {
   "name": "Arbitrum DBIS Token List",
-  "version": {"major": 1, "minor": 0, "patch": 0},
+  "version": {
+    "major": 1,
+    "minor": 0,
+    "patch": 0
+  },
   "timestamp": "2026-02-16T00:00:00.000Z",
   "logoURI": "https://arbitrum.io/favicon.png",
-  "keywords": ["arbitrum", "42161", "dbis", "ccip", "weth"],
+  "keywords": [
+    "arbitrum",
+    "42161",
+    "dbis",
+    "ccip",
+    "weth"
+  ],
   "tokens": [
-    {"chainId": 42161, "address": "0x89dd12025bfCD38A168455A44B400e913ED33BE2", "name": "Wrapped Ether (WETH9)", "symbol": "WETH9", "decimals": 18, "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png", "tags": ["defi", "wrapped"]},
-    {"chainId": 42161, "address": "0xe0E93247376aa097dB308B92e6Ba36bA015535D0", "name": "Wrapped Ether v10", "symbol": "WETH10", "decimals": 18, "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png", "tags": ["defi", "wrapped"]}
+    {
+      "chainId": 42161,
+      "address": "0x89dd12025bfCD38A168455A44B400e913ED33BE2",
+      "name": "Wrapped Ether (WETH9)",
+      "symbol": "WETH9",
+      "decimals": 18,
+      "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png",
+      "tags": [
+        "defi",
+        "wrapped"
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset"
+      }
+    },
+    {
+      "chainId": 42161,
+      "address": "0xe0E93247376aa097dB308B92e6Ba36bA015535D0",
+      "name": "Wrapped Ether v10",
+      "symbol": "WETH10",
+      "decimals": 18,
+      "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png",
+      "tags": [
+        "defi",
+        "wrapped"
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset"
+      }
+    }
   ],
   "tags": {
-    "defi": {"name": "DeFi", "description": "Decentralized Finance tokens"},
-    "wrapped": {"name": "Wrapped", "description": "Wrapped tokens representing native assets"}
+    "defi": {
+      "name": "DeFi",
+      "description": "Decentralized Finance tokens"
+    },
+    "wrapped": {
+      "name": "Wrapped",
+      "description": "Wrapped tokens representing assets"
+    }
   }
 }
@@ -1,15 +1,67 @@
 {
   "name": "Avalanche DBIS Token List",
-  "version": {"major": 1, "minor": 0, "patch": 0},
+  "version": {
+    "major": 1,
+    "minor": 0,
+    "patch": 0
+  },
   "timestamp": "2026-02-16T00:00:00.000Z",
   "logoURI": "https://crypto.com/defi/static/avalanche-avax-logo.png",
-  "keywords": ["avalanche", "43114", "dbis", "ccip", "weth"],
+  "keywords": [
+    "avalanche",
+    "43114",
+    "dbis",
+    "ccip",
+    "weth"
+  ],
   "tokens": [
-    {"chainId": 43114, "address": "0xa4B9DD039565AeD9641D45b57061f99d9cA6Df08", "name": "Wrapped Ether (WETH9)", "symbol": "WETH9", "decimals": 18, "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png", "tags": ["defi", "wrapped"]},
-    {"chainId": 43114, "address": "0x89dd12025bfCD38A168455A44B400e913ED33BE2", "name": "Wrapped Ether v10", "symbol": "WETH10", "decimals": 18, "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png", "tags": ["defi", "wrapped"]}
+    {
+      "chainId": 43114,
+      "address": "0xa4B9DD039565AeD9641D45b57061f99d9cA6Df08",
+      "name": "Wrapped Ether (WETH9)",
+      "symbol": "WETH9",
+      "decimals": 18,
+      "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png",
+      "tags": [
+        "defi",
+        "wrapped"
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset"
+      }
+    },
+    {
+      "chainId": 43114,
+      "address": "0x89dd12025bfCD38A168455A44B400e913ED33BE2",
+      "name": "Wrapped Ether v10",
+      "symbol": "WETH10",
+      "decimals": 18,
+      "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png",
+      "tags": [
+        "defi",
+        "wrapped"
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset"
+      }
+    }
   ],
   "tags": {
-    "defi": {"name": "DeFi", "description": "Decentralized Finance tokens"},
-    "wrapped": {"name": "Wrapped", "description": "Wrapped tokens representing native assets"}
+    "defi": {
+      "name": "DeFi",
+      "description": "Decentralized Finance tokens"
+    },
+    "wrapped": {
+      "name": "Wrapped",
+      "description": "Wrapped tokens representing assets"
+    }
  }
 }
@@ -25,7 +25,14 @@
       "tags": [
         "defi",
         "wrapped"
-      ]
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset"
+      }
     },
     {
       "chainId": 25,
@@ -37,7 +44,14 @@
       "tags": [
        "defi",
        "wrapped"
-      ]
+      ],
+      "extensions": {
+        "category": "wrapped-native",
+        "instrument": "wrapped-native",
+        "settlement": "crypto-native",
+        "cashLike": false,
+        "backing": "native-gas-asset"
+      }
     },
     {
       "chainId": 25,
@@ -61,8 +75,23 @@
       "logoURI": "https://ipfs.io/ipfs/QmNPq4D5JXzurmi9jAhogVMzhAQRk1PZ1r9H3qQUV9gjDm",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "USD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cUSD",
+        "walletClass": "cash-like-token"
+      }
     },
     {
       "chainId": 25,
@@ -73,8 +102,23 @@
       "logoURI": "https://ipfs.io/ipfs/QmPh16PY241zNtePyeK7ep1uf1RcARV2ynGAuRU8U7sSqS",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "EUR",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cEUR",
+        "walletClass": "cash-like-token"
+      }
     },
     {
       "chainId": 25,
@@ -85,8 +129,23 @@
       "logoURI": "https://ipfs.io/ipfs/QmT2nJ6WyhYBCsYJ6NfS1BPAqiGKkCEuMxiC8ye93Co1hF",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "GBP",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cGBP",
+        "walletClass": "cash-like-token"
+      }
     },
     {
       "chainId": 25,
@@ -97,8 +156,23 @@
       "logoURI": "https://ipfs.io/ipfs/Qmb9JmuD9ehaQtTLBBZmAoiAbvE53e3FMjkEty8rvbPf9K",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "AUD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cAUD",
+        "walletClass": "cash-like-token"
+      }
     },
     {
       "chainId": 25,
@@ -109,8 +183,23 @@
       "logoURI": "https://ipfs.io/ipfs/Qmb9JmuD9ehaQtTLBBZmAoiAbvE53e3FMjkEty8rvbPf9K",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "JPY",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cJPY",
+        "walletClass": "cash-like-token"
+      }
     },
     {
       "chainId": 25,
@@ -121,8 +210,23 @@
       "logoURI": "https://ipfs.io/ipfs/Qmb9JmuD9ehaQtTLBBZmAoiAbvE53e3FMjkEty8rvbPf9K",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "CHF",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cCHF",
+        "walletClass": "cash-like-token"
+      }
     },
     {
       "chainId": 25,
@@ -133,8 +237,23 @@
       "logoURI": "https://ipfs.io/ipfs/Qmb9JmuD9ehaQtTLBBZmAoiAbvE53e3FMjkEty8rvbPf9K",
       "tags": [
         "stablecoin",
-        "iso4217w"
-      ]
+        "iso4217w",
+        "fiat",
+        "cash",
+        "compliant",
+        "gru"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "emoney-or-fiat-backed-stablecoin",
+        "currency": "CAD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "gruVersion": "v1",
+        "gruFamily": "cCAD",
+        "walletClass": "cash-like-token"
+      }
     }
   ],
   "tags": {
@@ -144,16 +263,32 @@
     },
     "wrapped": {
       "name": "Wrapped",
-      "description": "Wrapped tokens representing native assets"
+      "description": "Wrapped tokens representing assets"
     },
     "stablecoin": {
       "name": "Stablecoin",
-      "description": "Stable value tokens pegged to fiat currencies"
+      "description": "Stable value tokens pegged to fiat"
     },
     "iso4217w": {
       "name": "ISO4217W",
       "description": "ISO 4217 compliant wrapped fiat tokens"
     },
+    "fiat": {
+      "name": "Fiat",
+      "description": "Fiat referenced tokens"
+    },
+    "cash": {
+      "name": "Cashlike",
+      "description": "Cash reserve or cash rail assets"
+    },
+    "compliant": {
+      "name": "Compliant",
+      "description": "Regulatory compliant assets"
+    },
+    "gru": {
+      "name": "GRU",
+      "description": "GRU transport assets"
+    },
     "oracle": {
       "name": "Oracle",
       "description": "Oracle and oracle fee tokens"
@@ -163,4 +298,4 @@
       "description": "Cross Chain Interoperability Protocol tokens"
     }
   }
 }

File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@@ -3,7 +3,7 @@
   "version": {
     "major": 1,
     "minor": 0,
-    "patch": 0
+    "patch": 1
   },
   "timestamp": "2026-02-16T00:00:00.000Z",
   "logoURI": "https://raw.githubusercontent.com/ethereum/ethereum.org/main/static/images/eth-diamond-black.png",
@@ -22,18 +22,39 @@
       "logoURI": "https://raw.githubusercontent.com/trustwallet/assets/master/blockchains/ethereum/assets/0xdAC17F958D2ee523a2206206994597C13D831ec7/logo.png",
       "tags": [
         "stablecoin",
-        "defi"
-      ]
+        "defi",
+        "fiat",
+        "cash"
+      ],
+      "extensions": {
+        "category": "tokenized-fiat",
+        "instrument": "fiat-backed-stablecoin",
+        "currency": "USD",
+        "settlement": "fiat",
+        "cashLike": true,
+        "backing": "cash,cash-equivalents",
+        "x402Ready": false,
+        "fwdCanon": false,
+        "walletClass": "cash-like-token"
+      }
     }
   ],
   "tags": {
     "stablecoin": {
       "name": "Stablecoin",
-      "description": "Stable value tokens pegged to fiat currencies"
+      "description": "Stable value tokens pegged to fiat"
     },
     "defi": {
       "name": "DeFi",
       "description": "Decentralized Finance tokens"
     },
+    "fiat": {
+      "name": "Fiat",
+      "description": "Fiat referenced tokens"
+    },
+    "cash": {
+      "name": "Cashlike",
+      "description": "Cash reserve or cash rail assets"
+    }
   }
 }