Release 03-11-2025 #227
Conversation
feat: Replace ethers with viem
Mtp bug fixes v07
Walkthrough
Major backend migration from ethers.js to viem across paymaster, routes, server, and utils; switch build/runtime to Bun; remove ERC20 TokenPaymaster module and tests; add viem helpers and a DB seed migration; add test helpers/config and a small editor setting.
Changes
Sequence Diagram(s)
sequenceDiagram
participant Client
participant Route as Route Handler
participant Paymaster as Paymaster
participant PublicClient as Viem PublicClient
participant Wallet as PrivateKeyAccount
participant Chain
rect rgb(250,235,235)
note left of Route: Old (ethers-based)
Client->>Route: POST /paymaster (userOp)
Route->>Paymaster: signV07(..., Wallet)
Paymaster->>Chain: provider.eth_call / estimateUserOperationGas
Chain-->>Paymaster: rpc responses
Paymaster->>Wallet: signer.signMessage(payload)
Wallet-->>Paymaster: signature
Paymaster-->>Route: paymasterAndData
Route-->>Client: 200 OK
end
rect rgb(235,250,235)
note left of Route: New (viem-based)
Client->>Route: POST /paymaster (userOp)
Route->>Paymaster: signV07(..., PrivateKeyAccount)
Paymaster->>PublicClient: getChainId() / readContract() / request(eth_estimateUserOperationGas)
PublicClient->>Chain: RPC calls
Chain-->>PublicClient: responses
Paymaster->>Wallet: PrivateKeyAccount.signMessage(hexPayload)
Wallet-->>Paymaster: signature
Paymaster-->>Route: paymasterAndData (Hex)
Route-->>Client: 200 OK
end
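For reference, a minimal sketch of the viem primitives the new flow relies on; the bundler URL and private key below are placeholders, and the hash being signed is illustrative rather than the paymaster's actual payload:

import { createPublicClient, http, type Hex } from "viem";
import { privateKeyToAccount } from "viem/accounts";

// Placeholders only – not real credentials or endpoints.
const bundlerUrl = "https://example-bundler.invalid";
const relayerKey: Hex = "0x0000000000000000000000000000000000000000000000000000000000000001";

// viem's PublicClient takes over the role of ethers' JsonRpcProvider...
const publicClient = createPublicClient({ transport: http(bundlerUrl) });

// ...and PrivateKeyAccount replaces ethers' Wallet for signing.
const account = privateKeyToAccount(relayerKey);

async function signExample(userOpHash: Hex) {
  const chainId = await publicClient.getChainId();
  // Mirrors the PrivateKeyAccount.signMessage(hexPayload) step in the diagram.
  const signature = await account.signMessage({ message: { raw: userOpHash } });
  return { chainId, signature };
}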
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
Areas needing extra attention:
Possibly related PRs
Suggested reviewers
Poem
Pre-merge checks and finishing touches
❌ Failed checks (1 inconclusive)
✅ Passed checks (1 passed)
✨ Finishing touches
🧪 Generate unit tests (beta)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Actionable comments posted: 12
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
backend/src/paymaster/index.ts (1)
1396-1443: Return cached Coingecko data when the API call fails
If the Coingecko request throws, this function currently falls through without returning anything even when cached data exists. Callers then receive undefined, leading to downstream crashes. Ensure the cached branch returns the reconstructed { ethPrice, …tokenData }.
Apply this diff:
  } catch (err) {
    log?.error(err + ' Reverting to previously cached price' + tokenAddress, 'CoingeckoError')
    const tokenData = this.coingeckoPrice.get(cacheKey)?.data;
    if (!tokenData) {
      log?.error('Price fetch error on tokenAddress: ' + tokenAddress, 'CoingeckoError')
      throw new Error(`${tokenAddress} ${ErrorMessage.COINGECKO_PRICE_NOT_FETCHED}`);
    }
+   const fallbackEthPrice = parseUnits(
+     (Number(nativePrice) / Number(tokenData.price)).toFixed(tokenData.decimals),
+     tokenData.decimals
+   );
+   return {
+     ethPrice: fallbackEthPrice,
+     ...tokenData
+   };
  }
}
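For context, a self-contained sketch of the same "serve stale on failure" pattern, built around an in-memory cache map; the names are illustrative and not the paymaster's actual fields:

type PriceEntry = { price: number; decimals: number };

const priceCache = new Map<string, PriceEntry>();

async function getPrice(
  token: string,
  fetchPrice: (token: string) => Promise<PriceEntry>
): Promise<PriceEntry> {
  try {
    const fresh = await fetchPrice(token);
    priceCache.set(token, fresh);
    return fresh;
  } catch (err) {
    const cached = priceCache.get(token);
    // The important step: return the cached entry instead of falling through to undefined.
    if (cached) return cached;
    throw new Error(`price for ${token} could not be fetched and no cached value exists`);
  }
}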
♻️ Duplicate comments (1)
backend/src/utils/crypto.ts (1)
51-54: Same unnecessary type conversion as in the decode function.
This has the same unnecessary Uint8Array.from() wrapping as lines 36-39. The same performance consideration applies here. Apply the same fix as suggested for the decode function:
  const digestsEqual = crypto.timingSafeEqual(
-   Uint8Array.from(Buffer.from(sourceDigest, 'base64')),
-   Uint8Array.from(checkDigest)
+   Buffer.from(sourceDigest, 'base64'),
+   checkDigest
  );
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (1)
backend/bun.lock is excluded by !**/*.lock
📒 Files selected for processing (27)
- .vscode/settings.json (1 hunks)
- backend/CHANGELOG.md (1 hunks)
- backend/Dockerfile (1 hunks)
- backend/indexer/ponder.config.ts (1 hunks)
- backend/package.json (3 hunks)
- backend/src/abi/VerifyingPaymasterAbiV3.ts (1 hunks)
- backend/src/constants/MultitokenPaymaster.ts (1 hunks)
- backend/src/constants/Token.ts (0 hunks)
- backend/src/paymaster/index.test.ts (6 hunks)
- backend/src/paymaster/index.ts (48 hunks)
- backend/src/paymaster/token.test.ts (0 hunks)
- backend/src/paymaster/token.ts (0 hunks)
- backend/src/repository/sponsorship-policy-repository.ts (3 hunks)
- backend/src/routes/admin-routes.ts (5 hunks)
- backend/src/routes/deposit-route.ts (1 hunks)
- backend/src/routes/metadata-routes.ts (7 hunks)
- backend/src/routes/paymaster-routes.ts (14 hunks)
- backend/src/routes/token-routes.ts (2 hunks)
- backend/src/routes/whitelist-routes.ts (18 hunks)
- backend/src/server.ts (16 hunks)
- backend/src/utils/common.ts (3 hunks)
- backend/src/utils/crypto.ts (2 hunks)
- backend/src/utils/interface.ts (1 hunks)
- backend/src/utils/monitorTokenPaymaster.ts (1 hunks)
- backend/test/helper.ts (1 hunks)
- backend/test/routes.test.ts (1 hunks)
- backend/test/tsconfig.test.json (1 hunks)
💤 Files with no reviewable changes (3)
- backend/src/constants/Token.ts
- backend/src/paymaster/token.test.ts
- backend/src/paymaster/token.ts
🧰 Additional context used
🧬 Code graph analysis (8)
backend/test/routes.test.ts (1)
backend/test/helper.ts (1)
build(28-28)
backend/src/utils/monitorTokenPaymaster.ts (1)
frontend/src/components/Dashboard.jsx (1)
chainId(92-92)
backend/src/routes/paymaster-routes.ts (4)
backend/src/constants/ChainlinkOracles.ts (1)
NativeOracles(2-34)
backend/src/models/sponsorship-policy.ts (1)
SponsorshipPolicy(4-67)backend/src/types/sponsorship-policy-dto.ts (1)
getEPVersion(44-55)
backend/src/constants/ErrorMessage.ts (1)
generateErrorMessage(63-73)
backend/src/paymaster/index.test.ts (1)
backend/src/paymaster/index.ts (1)
Paymaster(89-1452)
backend/src/routes/metadata-routes.ts (1)
backend/src/utils/common.ts (1)
getViemChainDef(14-34)
backend/src/paymaster/index.ts (2)
backend/src/constants/MultitokenPaymaster.ts (2)
TokenDecimalsAndSymbol(10-207)
OracleDecimals(209-363)
backend/src/utils/common.ts (1)
getViemChainDef(14-34)
backend/src/server.ts (3)
backend/src/utils/common.ts (1)
getViemChainDef(14-34)
backend/src/utils/monitorTokenPaymaster.ts (1)
checkDeposit(6-22)
backend/src/constants/ChainlinkOracles.ts (1)
NativeOracles(2-34)
backend/src/routes/admin-routes.ts (1)
backend/src/utils/common.ts (1)
getViemChainDef(14-34)
🪛 Checkov (3.2.334)
backend/Dockerfile
[low] 21-21: Ensure the last USER is not root
(CKV_DOCKER_8)
[low] 1-23: Ensure that HEALTHCHECK instructions have been added to container images
(CKV_DOCKER_2)
🪛 Hadolint (2.14.0)
backend/Dockerfile
[warning] 2-2: Pin versions in apk add. Instead of apk add <package> use apk add <package>=<version>
(DL3018)
[warning] 21-21: Last USER should not be root
(DL3002)
🪛 markdownlint-cli2 (0.18.1)
backend/CHANGELOG.md
2-2: Headings should be surrounded by blank lines
Expected: 1; Actual: 0; Above
(MD022, blanks-around-headings)
2-2: Headings should be surrounded by blank lines
Expected: 1; Actual: 0; Below
(MD022, blanks-around-headings)
🔇 Additional comments (9)
.vscode/settings.json (1)
8-8: Clarify the purpose of this IDE configuration setting.
The addition of "claudeCodeChat.permissions.yoloMode": false appears orthogonal to the PR's stated objectives (ethers → viem migration, MTP support for epv8, database migrations). This looks like a personal IDE or tooling preference. Consider whether this should be included in the release commit or if it was accidentally staged with unrelated changes. If it is intentional, please clarify its purpose in the PR description.
backend/src/repository/sponsorship-policy-repository.ts (3)
4-4: LGTM! Clean migration to viem.
The import change from ethers to viem's isAddress is correct and aligns with the broader migration strategy described in the PR objectives.
658-670: Address validation correctly migrated to viem.
The validation logic for addressAllowList properly uses viem's isAddress function. The condition !address || !isAddress(address) correctly handles both null/empty values and invalid address formats (a small sketch of this check follows this comment list).
672-684: Consistent validation pattern maintained.
The addressBlockList validation correctly uses the same pattern as addressAllowList, ensuring consistency across the codebase. The migration to viem's isAddress is properly implemented.
backend/test/routes.test.ts (2)
3-4: Test setup looks correct, but depends on helper.ts fixes.
The pattern of calling build() and relying on its internal beforeAll hooks is appropriate for test frameworks like Jest/Vitest. However, ensure the issues identified in backend/test/helper.ts are addressed for proper async plugin registration.
6-12: LGTM!
The test correctly validates the error response when no payload is provided. The assertion properly checks for the expected error message.
backend/src/utils/monitorTokenPaymaster.ts (2)
2-2: LGTM: Viem imports are correct.
The migration from ethers to viem imports is appropriate and includes all necessary functions for this utility.
9-9: LGTM: Contract initialization is correct.
The getContract call properly type-casts the address to viem's Address type and follows viem patterns.
backend/src/utils/interface.ts (1)
17-19: No JSON serialization issues found – bigint values are used internally only.
The getEtherscanFeeResponse interface correctly uses bigint for the viem migration. However, verification shows that while getGasFee() returns objects with bigint values, these are used internally within paymaster methods for viem transaction construction (e.g., walletClient.sendTransaction() parameters). The paymaster methods return only strings and hashes to HTTP clients, so no JSON serialization of bigint occurs in the current request/response flow. The bigint usage is correct and poses no immediate risk. The type change aligns with the viem migration strategy.
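Following up on the sponsorship-policy comments above, a tiny sketch of the allow/block-list validation pattern using viem's isAddress; the function and list names are illustrative, not the repository's actual identifiers:

import { isAddress } from "viem";

// Collect an error for every entry that is missing or not a valid EVM address.
function validateAddressList(list: (string | null | undefined)[]): string[] {
  const errors: string[] = [];
  for (const address of list) {
    if (!address || !isAddress(address)) {
      errors.push(`invalid address in list: ${String(address)}`);
    }
  }
  return errors;
}

// Example: the checksummed entry passes, the short one is rejected.
console.log(validateAddressList(["0x7b3078b9A28DF76453CDfD2bA5E75f32f0676321", "0x123"]));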
FROM oven/bun:1-alpine AS runner
WORKDIR /usr/app
ARG APP_ENV
COPY --from=builder /app/build ./build
COPY ./migrations ./build/migrations
COPY package.json ./
COPY --from=builder /app/config.json.default /usr/app/config.json
RUN npm install
USER root
ENV NODE_ENV="production"
CMD ["npm", "start"]
CMD ["bun", "start"]
Ensure runtime stage carries Bun dependencies
bun start in the runner stage will fail because we never copy the installed dependencies (node_modules and Bun’s cache in /root/.bun) out of the build stages. The generated build artifacts still import runtime packages, so the container will hit “module not found” errors as soon as it starts. Please copy the installed deps (or run a production bun install) into the runner stage.
FROM oven/bun:1-alpine AS runner
WORKDIR /usr/app
ARG APP_ENV
COPY --from=builder /app/build ./build
COPY ./migrations ./build/migrations
-COPY package.json ./
-COPY --from=builder /app/config.json.default /usr/app/config.json
+COPY package.json ./
+COPY bun.lock* ./
+COPY --from=deps /app/node_modules ./node_modules
+COPY --from=deps /root/.bun /root/.bun
+COPY --from=builder /app/config.json.default /usr/app/config.json
USER root
ENV NODE_ENV="production"
CMD ["bun", "start"]Committable suggestion skipped: line range outside the PR's diff.
🧰 Tools
🪛 Checkov (3.2.334)
[low] 21-21: Ensure the last USER is not root
(CKV_DOCKER_8)
[low] 1-23: Ensure that HEALTHCHECK instructions have been added to container images
(CKV_DOCKER_2)
🪛 Hadolint (2.14.0)
[warning] 21-21: Last USER should not be root
(DL3002)
🤖 Prompt for AI Agents
In backend/Dockerfile around lines 14 to 23, the runner stage currently only
copies build artifacts and config but not Bun/node dependencies, so `bun start`
will fail with module-not-found errors; fix it by either copying the installed
dependencies and Bun cache from the builder stage into the runner (e.g., COPY
--from=builder /app/node_modules ./node_modules and COPY --from=builder
/root/.bun /root/.bun) or by running a production install in the runner (run bun
install --production or bun install --no-dev) before CMD, and ensure file
ownership/permissions match the USER in the runner.
| "test": "bun test", | ||
| "postinstall": "node -e \"try { require('esbuild/install') } catch (e) { console.log('Using JS fallback for esbuild') }\"" | ||
| }, |
Postinstall script breaks Bun-only installs
The new Bun-based toolchain (and Bun Docker image) no longer guarantees a node binary. This postinstall will fail outright in that environment, blocking installs. Please drop the node -e … hook or replace it with a Bun-compatible implementation.
Apply this diff:
- "test": "bun test",
- "postinstall": "node -e \"try { require('esbuild/install') } catch (e) { console.log('Using JS fallback for esbuild') }\""
+ "test": "bun test"📝 Committable suggestion
🤖 Prompt for AI Agents
In backend/package.json around lines 19-21, the postinstall script calls node
directly which breaks Bun-only environments because a node binary may not exist;
replace or remove this hook — either drop the postinstall entirely, or change it
to a Bun-compatible check that first detects if node is available (e.g. via a
shell check like command -v node) and only runs the require('esbuild/install')
step when node exists, otherwise skip or use a Bun-friendly fallback so installs
do not fail in Bun-only environments.
| try { | ||
| expect(signResponse).toHaveProperty("paymasterAndData"); | ||
| expect(depositResponse).toHaveProperty("receipt"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The paymasterAndData details is not displayed in the signV06 response" | ||
| ); | ||
| } | ||
| } catch (e) { | ||
| throw new Error("An error is displayed while performing signV06 action."); | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the signV06 function with invalid bundler", async () => { | ||
| await delay(3000); | ||
| try { | ||
| await paymaster.signV06( | ||
| userOpV06, | ||
| Mock_Valid_Until, | ||
| Mock_Valid_After, | ||
| entryPointAddressV06, | ||
| paymasterAddressV06, | ||
| bundlerUrlV07, | ||
| signerV06, | ||
| true | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = | ||
| "Failed to process request to bundler. Please contact support team RawErrorMsg"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log("The bundler is invalid while using the signV06 function."); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid bundler while using the signV06 function." | ||
| "The receipt details is not displayed in the deposit response" | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the signV07 function with estimation is false", async () => { | ||
| await delay(3000); | ||
| try { | ||
| const signResponse = await paymaster.signV07( | ||
| userOpV07, | ||
| Mock_Valid_Until, | ||
| Mock_Valid_After, | ||
| entryPointAddressV07, | ||
| paymasterAddressV07, | ||
| bundlerUrlV07, | ||
| signerV07, | ||
| false | ||
| ); | ||
|
|
||
| try { | ||
| expect(signResponse).toHaveProperty("paymaster"); | ||
| expect(depositResponse).toHaveProperty("userOpResponse"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The paymaster details is not displayed in the signV07 response" | ||
| ); | ||
| } | ||
|
|
||
| try { | ||
| expect(signResponse).toHaveProperty("paymasterData"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The paymasterData details is not displayed in the signV07 response" | ||
| "The userOpResponse details is not displayed in the deposit response" | ||
| ); | ||
| } | ||
| } catch (e) { | ||
| throw new Error("An error is displayed while performing signV07 action."); | ||
| throw new Error("An error is displayed while performing deposit action."); | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the signV07 function with invalid bundler", async () => { | ||
| test("SMOKE: validate the deposit function with epv07 details", async () => { | ||
| await delay(3000); | ||
| try { | ||
| await paymaster.signV07( | ||
| userOpV07, | ||
| Mock_Valid_Until, | ||
| Mock_Valid_After, | ||
| entryPointAddressV07, | ||
| const depositResponse = await paymaster.deposit( | ||
| "100000000000000000", | ||
| paymasterAddressV07, | ||
| bundlerUrlV06, | ||
| signerV07, | ||
| true | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = | ||
| "Failed to process request to bundler. Please contact support team RawErrorMsg"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log("The bundler is invalid while using the signv07 function."); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid bundler while using the signv07 function." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test.skip("REGRESSION: validate the pimlico function with invalid custom paymaster address", async () => { | ||
| await delay(3000); | ||
| const gasToken = "USDC"; | ||
| const address = ethers.Wallet.createRandom(); // random address | ||
| try { | ||
| await paymaster.pimlico( | ||
| userOpV06, | ||
| bundlerUrlV06, | ||
| entryPointAddressV06, | ||
| address.toString() | ||
| ); // invalid custom paymaster address | ||
| throw new Error( | ||
| "The pimlico function is worked, however the customPaymasterAddress is invalid." | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = | ||
| "Please contact support team RawErrorMsg: network does not support ENS"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log( | ||
| "The customPaymasterAddress is invalid while using the pimlico function." | ||
| ); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid customPaymasterAddress while using the pimlico function." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test.skip("REGRESSION: validate the pimlico function with invalid sender address", async () => { | ||
| await delay(3000); | ||
| const gasToken = "USDC"; | ||
| const userOp = { | ||
| sender: "0x1434E767F0D878de8C053896ec6F7e5d1951eE0", // invalid sender address | ||
| nonce: "0x0", | ||
| initCode: | ||
| "0x7f6d8f107fe8551160bd5351d5f1514a6ad5d40e5fbfb9cf0000000000000000000000009ae4935fae629dd042c18df520322e0e9efad73d0000000000000000000000000000000000000000000000000000000000000000", | ||
| callData: | ||
| "0x47e1da2a000000000000000000000000000000000000000000000000000000000000006000000000000000000000000000000000000000000000000000000000000000a000000000000000000000000000000000000000000000000000000000000000e0000000000000000000000000000000000000000000000000000000000000000100000000000000000000000080a1874e1046b1cc5defdf4d3153838b72ff94ac0000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000000009184e72a000000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000200000000000000000000000000000000000000000000000000000000000000000", | ||
| callGasLimit: "0xc7ed", | ||
| verificationGasLimit: "0x4d62f", | ||
| maxFeePerGas: "0x59682f1e", | ||
| maxPriorityFeePerGas: "0x59682f00", | ||
| paymasterAndData: "0x", | ||
| preVerificationGas: "0xb1f4", | ||
| signature: | ||
| "0xfffffffffffffffffffffffffffffff0000000000000000000000000000000007aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa1c", | ||
| }; | ||
|
|
||
| try { | ||
| await paymaster.pimlico( | ||
| userOp, | ||
| bundlerUrlV06, | ||
| entryPointAddressV06, | ||
| PAYMASTER_ADDRESS[chainId][gasToken] | ||
| ); | ||
| throw new Error( | ||
| "The pimlico function is worked, however the sender address is invalid." | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = | ||
| " Please contact support team RawErrorMsg: invalid address"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log( | ||
| "The sender address is invalid while using the pimlico function." | ||
| ); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid sender address while using the pimlico function." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the whitelistAddresses function with not whitelisted address", async () => { | ||
| await delay(3000); | ||
| const wallet = ethers.Wallet.createRandom(); | ||
| const address = [wallet.address]; // not whitelisted address | ||
| try { | ||
| const whitelistAddresses = await paymaster.whitelistAddresses( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
|
|
||
| if ( | ||
| whitelistAddresses.message.includes( | ||
| "Successfully whitelisted with transaction Hash" | ||
| ) | ||
| ) { | ||
| console.log("The address is whitelisted successfully."); | ||
| } else { | ||
| throw new Error( | ||
| "The expected success message is not displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
| } catch (e: any) { | ||
| throw new Error( | ||
| "An error is displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the whitelistAddresses function with already whitelisted address", async () => { | ||
| await delay(3000); | ||
| const address = ["0x7b3078b9A28DF76453CDfD2bA5E75f32f0676321"]; // already whitelisted address | ||
| try { | ||
| await paymaster.whitelistAddresses( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
| throw new Error( | ||
| "Address is whitelisted, However it was already whitelisted." | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = "already whitelisted"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log("The address is already whitelisted."); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for already whitelisted address. " | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the whitelistAddresses function with invalid relayerKey", async () => { | ||
| await delay(3000); | ||
| const address = ["0x7b3078b9A28DF76453CDfD2bA5E75f32f0676321"]; // already whitelisted address | ||
| const relayerKey = | ||
| "0xdd45837c9d94e7cc3ed3b24be7c1951eff6ed3c6fd0baf68fc1ba8c0e51debb"; // invalid relayerKey | ||
| try { | ||
| await paymaster.whitelistAddresses( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
| throw new Error( | ||
| "Address is whitelisted, however the relayerKey is invalid." | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = | ||
| "Please try again later or contact support team RawErrorMsg: hex data is odd-length"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log( | ||
| "The relayerKey is invalid while whitelisting the address." | ||
| ); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid relayerKey while address whitelisting." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the checkWhitelistAddress function with whitelisted address", async () => { | ||
| await delay(3000); | ||
| const address = "0x7b3078b9A28DF76453CDfD2bA5E75f32f0676321"; // whitelisted address | ||
| try { | ||
| const checkWhitelistAddressResponse = | ||
| await paymaster.checkWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey | ||
| ); | ||
| if (checkWhitelistAddressResponse === true) { | ||
| console.log("The address is whitelisted."); | ||
| } else { | ||
| throw new Error( | ||
| "The address is displayed not whitelisted, however it is already whitelisted." | ||
| ); | ||
| } | ||
| } catch (e: any) { | ||
| throw new Error( | ||
| "An error is displayed while checking the address for whitelisting." | ||
| ); | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the checkWhitelistAddress function with non whitelisted address", async () => { | ||
| await delay(3000); | ||
| const address = "0x8350355c08aDAC387b443782124A30A8942BeC2e"; // non whitelisted address | ||
| try { | ||
| const checkWhitelistAddressResponse = | ||
| await paymaster.checkWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey | ||
| ); | ||
| if (checkWhitelistAddressResponse === false) { | ||
| console.log("The address is not whitelisted as expected."); | ||
| } else { | ||
| throw new Error( | ||
| "The address is displayed whitelisted, however it is not whitelisted." | ||
| ); | ||
| } | ||
| } catch (e: any) { | ||
| throw new Error( | ||
| "An error is displayed while checking the address for whitelisting." | ||
| ); | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the checkWhitelistAddress function with invalid relayerKey", async () => { | ||
| await delay(3000); | ||
| const address = "0x7b3078b9A28DF76453CDfD2bA5E75f32f0676321"; | ||
| const relayerKey = | ||
| "0xdd45837c9d94e7cc3ed3b24be7c1951eff6ed3c6fd0baf68fc1ba8c0e51debb"; // invalid relayerKey | ||
| try { | ||
| await paymaster.checkWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey | ||
| ); | ||
| throw new Error( | ||
| "The whitelist address checking is performed, however the relayerKey is invalid." | ||
| ); | ||
| } catch (e: any) { | ||
| const actualMessage = "rpcError while checking whitelist"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log( | ||
| "The relayerKey is invalid while checking the whitelist address." | ||
| ); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid relayerKey while checking the whitelist address." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the removeWhitelistAddress function with not whitelisted address", async () => { | ||
| await delay(3000); | ||
| let wallet = ethers.Wallet.createRandom(); | ||
| let address = [wallet.address]; // not whitelisted address | ||
|
|
||
| // make the address whitelisted | ||
| try { | ||
| const whitelistAddresses = await paymaster.whitelistAddresses( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
|
|
||
| if ( | ||
| whitelistAddresses.message.includes( | ||
| "Successfully whitelisted with transaction Hash" | ||
| ) | ||
| ) { | ||
| console.log("The address is whitelisted successfully."); | ||
| } else { | ||
| throw new Error( | ||
| "The expected success message is not displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
| } catch (e: any) { | ||
| throw new Error( | ||
| "An error is displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
|
|
||
| // wait for 10 second | ||
| await delay(10000); | ||
|
|
||
| // remove the address from the whitelisting | ||
| try { | ||
| const removeWhitelistAddress = await paymaster.removeWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
|
|
||
| if ( | ||
| removeWhitelistAddress.message.includes( | ||
| "Successfully removed whitelisted addresses" | ||
| ) | ||
| ) { | ||
| console.log( | ||
| "The address is removed from the whitelisting successfully." | ||
| ); | ||
| } else { | ||
| throw new Error( | ||
| "The expected success message is not displayed while performing the removeWhitelistAddress action." | ||
| ); | ||
| } | ||
| } catch (e: any) { | ||
| throw new Error( | ||
| "An error is displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the removeWhitelistAddress function with already removed from whitelisted address", async () => { | ||
| await delay(3000); | ||
| let wallet = ethers.Wallet.createRandom(); | ||
| let address = [wallet.address]; // not whitelisted address | ||
|
|
||
| // make the address whitelisted | ||
| try { | ||
| await paymaster.whitelistAddresses( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
| } catch (e: any) { | ||
| console.log("error 1:::::::::", e); | ||
| throw new Error( | ||
| "An error is displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
|
|
||
| // wait for 10 second | ||
| await delay(10000); | ||
|
|
||
| // remove the address from the whitelisting | ||
| try { | ||
| await paymaster.removeWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
| } catch (e: any) { | ||
| console.log("error 2:::::::::", e); | ||
| throw new Error( | ||
| "An error is displayed while performing the removeWhitelistAddress action." | ||
| ); | ||
| } | ||
|
|
||
| // wait for 5 second | ||
| await delay(5000); | ||
|
|
||
| // again trying for removing address from whitelisting | ||
| try { | ||
| await paymaster.removeWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
| throw new Error( | ||
| "Address is removed from the whitelisting, However it was already removed from the whitelisting." | ||
| ); | ||
| } catch (e: any) { | ||
| console.log("error 3:::::::::", e); | ||
| const actualMessage = "is not whitelisted"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log("The address is already removed from the whitelisting."); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for already removed the address from whitelisting. " | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the removeWhitelistAddress function with invalid relayerKey", async () => { | ||
| await delay(3000); | ||
| let wallet = ethers.Wallet.createRandom(); | ||
| let address = [wallet.address]; // not whitelisted address | ||
| const relayerKey = | ||
| "0xdd45837c9d94e7cc3ed3b24be7c1951eff6ed3c6fd0baf68fc1ba8c0e51debb2"; // valid relayerKey | ||
| const relayerKey_invalid = | ||
| "0xdd45837c9d94e7cc3ed3b24be7c1951eff6ed3c6fd0baf68fc1ba8c0e51debb"; // invalid relayerKey | ||
|
|
||
| // make the address whitelisted | ||
| try { | ||
| await paymaster.whitelistAddresses( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId | ||
| ); | ||
| } catch (e: any) { | ||
| console.log("error 1:::::::::", e); | ||
| throw new Error( | ||
| "An error is displayed while performing the whitelistAddress action." | ||
| ); | ||
| } | ||
|
|
||
| // wait for 10 second | ||
| await delay(10000); | ||
|
|
||
| // remove the address from the whitelisting | ||
| try { | ||
| await paymaster.removeWhitelistAddress( | ||
| address, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey_invalid, | ||
| chainId | ||
| ); | ||
| throw new Error( | ||
| "The removeWhitelistAddress action is performed with invalid relayerKey." | ||
| ); | ||
| } catch (e: any) { | ||
| console.log("error 2:::::::::", e); | ||
| const actualMessage = | ||
| "The wallet does not have enough funds or the gas price is too high at the moment."; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log( | ||
| "The relayerKey is invalid while perform the removing the address from the whitelisting." | ||
| ); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid relayerKey while perform the removing the address from the whitelisting." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the deposit function with exceeded amount", async () => { | ||
| await delay(3000); | ||
| const amount = "10000000"; // exceeded amount | ||
| try { | ||
| await paymaster.deposit( | ||
| amount, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| relayerKey, | ||
| chainId, | ||
| true | ||
| ); | ||
| throw new Error("The deposite action is performed with exceeded amount."); | ||
| } catch (e: any) { | ||
| const actualMessage = "Balance is less than the amount to be deposited"; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log("The amount is exceeded while performing the deposit."); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for exceeded amount while deposit." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the deposit function with invalid amount", async () => { | ||
| await delay(3000); | ||
| const amount = "abc"; // invalid amount | ||
| try { | ||
| await paymaster.deposit( | ||
| amount, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| bundlerUrlV07, | ||
| relayerKey, | ||
| chainId, | ||
| true | ||
| ); | ||
| throw new Error("The deposite action is performed with invalid amount."); | ||
| } catch (e: any) { | ||
| console.log("error 1:::::::::", e); | ||
| const actualMessage = | ||
| "The wallet does not have enough funds or the gas price is too high at the moment."; | ||
| const expectedMessage = e.message; | ||
| if (expectedMessage.includes(actualMessage)) { | ||
| console.log("The amount is invalid while performing the deposit."); | ||
| } else { | ||
| throw new Error( | ||
| "The respective validation is not displayed for invalid amount while deposit." | ||
| ); | ||
| } | ||
| } | ||
| }); | ||
|
|
||
| test("REGRESSION: validate the signV06 function with multiTokenMarkUp is NaN", async () => { | ||
| await delay(3000); | ||
| const paymaster1 = new Paymaster("0", "NaN"); | ||
| try { | ||
| const signResponse = await paymaster1.signV06( | ||
| userOpV06, | ||
| Mock_Valid_Until, | ||
| Mock_Valid_After, | ||
| entryPointAddressV06, | ||
| paymasterAddressV06, | ||
| bundlerUrlV06, | ||
| signerV06, | ||
| true | ||
| false | ||
| ); | ||
|
|
||
| try { | ||
| expect(signResponse).toHaveProperty("paymasterAndData"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The paymasterAndData details is not displayed in the signV06 response" | ||
| ); | ||
| } | ||
|
|
||
| try { | ||
| expect(signResponse).toHaveProperty("verificationGasLimit"); | ||
| expect(depositResponse).toHaveProperty("receipt"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The verificationGasLimit details is not displayed in the signV06 response" | ||
| "The receipt details is not displayed in the deposit response" | ||
| ); | ||
| } | ||
|
|
||
| try { | ||
| expect(signResponse).toHaveProperty("preVerificationGas"); | ||
| expect(depositResponse).toHaveProperty("userOpResponse"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The preVerificationGas details is not displayed in the signV06 response" | ||
| ); | ||
| } | ||
|
|
||
| try { | ||
| expect(signResponse).toHaveProperty("callGasLimit"); | ||
| } catch (e) { | ||
| throw new Error( | ||
| "The callGasLimit details is not displayed in the signV06 response" | ||
| "The userOpResponse details is not displayed in the deposit response" | ||
| ); | ||
| } | ||
| } catch (e) { | ||
| throw new Error("An error is displayed while performing signV06 action."); | ||
| throw new Error("An error is displayed while performing deposit action."); | ||
| } |
Align deposit test expectations with new return shape
Paymaster.deposit now returns { message: … }, so these assertions for receipt and userOpResponse will always throw. Update the tests to validate the message instead.
Apply this diff:
- try {
- expect(depositResponse).toHaveProperty("receipt");
- } catch (e) {
- throw new Error(
- "The receipt details is not displayed in the deposit response"
- );
- }
-
- try {
- expect(depositResponse).toHaveProperty("userOpResponse");
- } catch (e) {
- throw new Error(
- "The userOpResponse details is not displayed in the deposit response"
- );
- }
+ try {
+ expect(depositResponse).toHaveProperty("message");
+ } catch (e) {
+ throw new Error(
+ "The deposit response is missing the success message"
+ );
+ }
Repeat the same change in the EPV07 deposit test below.
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In backend/src/paymaster/index.test.ts around lines 253 to 302, the tests still
assert depositResponse has properties "receipt" and "userOpResponse" but
Paymaster.deposit now returns an object with a "message" field; update the
assertions to check for the new shape instead. Replace the two
expect(...toHaveProperty("receipt")) / ("userOpResponse") checks with an
assertion that depositResponse has a "message" property (e.g.,
expect(depositResponse).toHaveProperty("message")) and validate the message
contents (exact text or pattern expected) or at minimum assert it's a non-empty
string. Make the identical change in the EPV07 deposit test further down so both
tests assert the "message" field and its expected value rather than the old
receipt/userOpResponse properties.
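A minimal sketch of the reworked assertion, assuming Bun's built-in test runner (which the package.json change above switches to); the sample response object and its message text are made up for illustration:

import { expect, test } from "bun:test";

test("deposit response exposes a message", () => {
  // Shape check only; a real spec would obtain this from paymaster.deposit(...)
  // as in the tests above. The message text here is an assumed example.
  const depositResponse = { message: "Successfully deposited with transaction Hash 0xabc" };

  // New return shape is { message: ... } rather than receipt/userOpResponse.
  expect(depositResponse).toHaveProperty("message");
  expect(typeof depositResponse.message).toBe("string");
  expect(depositResponse.message.length).toBeGreaterThan(0);
});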
  const digestsEqual = crypto.timingSafeEqual(
    Buffer.from(sourceDigest, 'base64'),
    checkDigest
    Uint8Array.from(Buffer.from(sourceDigest, 'base64')),
    Uint8Array.from(checkDigest)
  );
🧹 Nitpick | 🔵 Trivial
Unnecessary type conversion creates performance overhead.
In Node.js, Buffer already extends Uint8Array, so both Buffer.from(sourceDigest, 'base64') and checkDigest are valid Uint8Array instances that can be passed directly to crypto.timingSafeEqual. Wrapping them with Uint8Array.from() creates new array instances and copies the byte values, adding unnecessary allocation overhead without functional benefit.
If this change is part of a type standardization effort for the viem migration, please clarify the rationale. Otherwise, consider reverting to avoid the extra allocations:
const digestsEqual = crypto.timingSafeEqual(
- Uint8Array.from(Buffer.from(sourceDigest, 'base64')),
- Uint8Array.from(checkDigest)
+ Buffer.from(sourceDigest, 'base64'),
+ checkDigest
);
🤖 Prompt for AI Agents
In backend/src/utils/crypto.ts around lines 36 to 39, the code is unnecessarily
wrapping Buffer and Uint8Array values with Uint8Array.from before passing them
to crypto.timingSafeEqual which causes extra allocations; remove the
Uint8Array.from(...) wrappers and pass Buffer.from(sourceDigest, 'base64') and
checkDigest directly (ensuring checkDigest is already a Buffer or Uint8Array),
and add/retain a length equality check before calling timingSafeEqual to avoid
exceptions if lengths differ.
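A small self-contained sketch of the pattern the prompt describes (direct Buffer comparison plus a length guard); the function name is illustrative:

import crypto from "node:crypto";

// Buffer already extends Uint8Array, so no Uint8Array.from() copies are needed.
function digestsMatch(sourceDigestB64: string, checkDigest: Uint8Array): boolean {
  const sourceDigest = Buffer.from(sourceDigestB64, "base64");
  // timingSafeEqual throws when the inputs have different lengths, so guard first.
  if (sourceDigest.length !== checkDigest.length) return false;
  return crypto.timingSafeEqual(sourceDigest, checkDigest);
}

console.log(digestsMatch(Buffer.from("abc").toString("base64"), Buffer.from("abc"))); // true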
const currentDeposit = await contract.getDeposit();
if (ethers.utils.parseEther(thresholdValue).gte(currentDeposit)) {
const body = { message: `Balance below threshold. Please deposit on tokenPaymasterAddress: ${paymasterAddress} chainId: ${chainId}`, currentDeposit: ethers.utils.formatEther(currentDeposit) }
const publicClient = createPublicClient({ transport: http(bundlerUrl) });
🧩 Analysis chain
Missing chain parameter in publicClient configuration.
The createPublicClient call omits the chain parameter, which is important in viem for proper chain-specific operations like address formatting, ENS resolution, and transaction formatting. The function receives chainId but doesn't use it when creating the client.
Consider adding the chain configuration. You'll need to map the chainId to a viem chain definition:
+import { getViemChainDef } from './common.js'; // or appropriate utility
- const publicClient = createPublicClient({ transport: http(bundlerUrl) });
+ const chain = getViemChainDef(chainId);
+ const publicClient = createPublicClient({ chain, transport: http(bundlerUrl) });
Run the following script to verify if a utility function for chain mapping exists:
🏁 Script executed:
#!/bin/bash
# Search for existing chain mapping utilities
rg -n "getViemChain|viemChain|chainId.*viem" --type ts -C 3Length of output: 13288
🏁 Script executed:
#!/bin/bash
# View the monitorTokenPaymaster.ts file to verify the issue
cat backend/src/utils/monitorTokenPaymaster.ts
Length of output: 1170
Missing chain parameter in publicClient configuration.
The createPublicClient call at line 8 omits the chain parameter despite chainId being passed to the function. This is inconsistent with the rest of the codebase and prevents viem from applying chain-specific logic properly. The getViemChainDef utility function already exists and should be used.
Add the chain configuration:
+import { getViemChainDef } from './common.js';
import { createPublicClient, http, getContract, parseEther, formatEther } from "viem";
export async function checkDeposit(paymasterAddress: string, bundlerUrl: string, webhookUrl: string, thresholdValue: string, chainId: number, log: FastifyBaseLogger) {
try {
- const publicClient = createPublicClient({ transport: http(bundlerUrl) });
+ const viemChain = getViemChainDef(chainId);
+ const publicClient = createPublicClient({ chain: viemChain, transport: http(bundlerUrl) });
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In backend/src/utils/monitorTokenPaymaster.ts around line 8, the
createPublicClient call omits the required chain parameter; update the client
creation to include chain: getViemChainDef(chainId) (using the existing
getViemChainDef utility and the function's chainId argument) so viem can apply
chain-specific logic when creating the publicClient with transport:
http(bundlerUrl).
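To illustrate the suggestion, a sketch of building a chain-aware client; getViemChainDef is the repo utility referenced above, while the defineChain fallback shown here is only an assumption about how such a mapping might look, not the project's actual implementation:

import { createPublicClient, http, defineChain, type Chain } from "viem";
import { mainnet, optimism } from "viem/chains";

// Hypothetical chainId -> Chain lookup standing in for getViemChainDef.
function chainFromId(chainId: number, rpcUrl: string): Chain {
  const known: Record<number, Chain> = { 1: mainnet, 10: optimism };
  return (
    known[chainId] ??
    defineChain({
      id: chainId,
      name: `chain-${chainId}`,
      nativeCurrency: { name: "Ether", symbol: "ETH", decimals: 18 },
      rpcUrls: { default: { http: [rpcUrl] } },
    })
  );
}

const bundlerUrl = "https://example-bundler.invalid"; // placeholder
const publicClient = createPublicClient({
  chain: chainFromId(10, bundlerUrl),
  transport: http(bundlerUrl),
});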
function build () {
  const app = fastify();

  // fastify-plugin ensures that all decorators
  // are exposed for testing purposes, this is
  // different from the production setup

  beforeAll(async () => {
    void app.register(fp(configPlugin));
    app.register(routes)
    await app.ready();
  });

  afterAll(() => app.close())

  return app
}
🧹 Nitpick | 🔵 Trivial
Consider adding error handling for plugin registration failures.
If plugin registration or app.ready() fails in the beforeAll hook, tests may hang or produce cryptic errors. While test frameworks typically handle uncaught errors, explicit error handling or logging would improve debuggability.
Example:
beforeAll(async () => {
+ try {
await app.register(fp(configPlugin));
await app.register(routes);
await app.ready();
+ } catch (error) {
+ console.error('Failed to initialize test app:', error);
+ throw error;
+ }
🤖 Prompt for AI Agents
In backend/test/helper.ts around lines 8 to 25, the beforeAll hook registers
plugins and calls app.ready() without error handling which can cause tests to
hang or produce unclear failures; wrap the registration and app.ready() calls in
a try/catch (or try/catch/finally) to catch any exceptions, log the error
(console.error or a test logger) and rethrow or fail the setup so the test
runner exits clearly, and ensure app.close() is called in a finally block or in
afterAll to guarantee cleanup even when registration fails.
beforeAll(async () => {
  void app.register(fp(configPlugin));
  app.register(routes)
  await app.ready();
});

afterAll(() => app.close())
Refactor plugin registration and lifecycle hooks.
Several issues with the current approach:
- Inconsistent promise handling: Line 17 uses void to discard the promise, but line 18 doesn't.
- Race condition risk: Plugin registration happens in beforeAll, but the app is returned synchronously. Tests may execute before plugins are fully registered.
- Missing await in cleanup: afterAll should be async and await app.close() for proper cleanup.
Consider this refactor:
- beforeAll(async () => {
- void app.register(fp(configPlugin));
- app.register(routes)
- await app.ready();
- });
+ beforeAll(async () => {
+ await app.register(fp(configPlugin));
+ await app.register(routes);
+ await app.ready();
+ });
- afterAll(() => app.close())
+ afterAll(async () => {
+ await app.close();
🤖 Prompt for AI Agents
In backend/test/helper.ts around lines 16 to 22, the test setup uses
inconsistent/unsafe plugin registration: it discards the promise from
app.register(fp(configPlugin)) with void, registers routes without awaiting, and
calls app.close() synchronously; change beforeAll to fully await plugin and
route registration and app.ready() (remove void and await both
app.register(fp(configPlugin)) and await app.register(routes), then await
app.ready()), and change afterAll to async and await app.close() to ensure
proper teardown.
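Putting the two suggestions together, a sketch of what the helper could look like; the plugin import paths are assumptions about this repo's layout, not verified:

import fastify, { type FastifyInstance } from "fastify";
import fp from "fastify-plugin";
import { afterAll, beforeAll } from "bun:test";
// Assumed import paths – adjust to the project's actual config plugin and routes.
import configPlugin from "../src/plugins/config";
import routes from "../src/routes";

export function build(): FastifyInstance {
  const app = fastify();

  beforeAll(async () => {
    try {
      // Await every registration so tests never run against a half-built app.
      await app.register(fp(configPlugin));
      await app.register(routes);
      await app.ready();
    } catch (error) {
      console.error("Failed to initialize test app:", error);
      throw error;
    }
  });

  // Await close() so the runner waits for teardown before reporting results.
  afterAll(async () => {
    await app.close();
  });

  return app;
}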
test('default etherspot paymaster with params', async () => {
  const userOp = {
    sender: "0x603Ef162f05dDa6e3B4717f4A951d6eF614a897f",
    nonce: "0x1d",
    initCode: "0x",
    callData: "0x47e1da2a000000000000000000000000000000000000000000000000000000000000006000000000000000000000000000000000000000000000000000000000000000a000000000000000000000000000000000000000000000000000000000000000e0000000000000000000000000000000000000000000000000000000000000000100000000000000000000000080a1874e1046b1cc5defdf4d3153838b72ff94ac0000000000000000000000000000000000000000000000000000000000000001000000000000000000000000000000000000000000000000002386f26fc10000000000000000000000000000000000000000000000000000000000000000000100000000000000000000000000000000000000000000000000000000000000200000000000000000000000000000000000000000000000000000000000000000",
    callGasLimit: "0x8342",
    verificationGasLimit: "0x16070",
    maxFeePerGas: "0x7d912eba",
    maxPriorityFeePerGas: "0x7d912eaa",
    paymasterAndData: "0x",
    preVerificationGas: "0xa96c",
    signature: "0x"
  };
  const res = await app.inject({
    url: '/',
    method: "POST",
    payload: {params: [userOp, "0x603Ef162f05dDa6e3B4717f4A951d6eF614a897f"]},
  });
  expect(res.statusCode).toEqual(200);
  expect(JSON.parse(res.payload)).toHaveProperty('paymasterAndData');
})
🧹 Nitpick | 🔵 Trivial
Consider validating the paymasterAndData value and extracting test constants.
The test correctly validates the happy path, but could be more robust:
- Weak assertion: The test only checks that paymasterAndData exists, not that it contains a valid value. It could be an empty string or "0x".
- Hardcoded values: The userOp object and addresses are hardcoded, making the test brittle and harder to maintain.
Apply these improvements:
+const TEST_SENDER_ADDRESS = "0x603Ef162f05dDa6e3B4717f4A951d6eF614a897f";
+const TEST_NONCE = "0x1d";
+// ... extract other constants
+
test('default etherspot paymaster with params', async () => {
const userOp = {
- sender: "0x603Ef162f05dDa6e3B4717f4A951d6eF614a897f",
- nonce: "0x1d",
+ sender: TEST_SENDER_ADDRESS,
+ nonce: TEST_NONCE,
// ... rest of fields
};
// ...
expect(res.statusCode).toEqual(200);
- expect(JSON.parse(res.payload)).toHaveProperty('paymasterAndData');
+ const response = JSON.parse(res.payload);
+ expect(response).toHaveProperty('paymasterAndData');
+ expect(response.paymasterAndData).not.toBe('0x');
+ expect(response.paymasterAndData.length).toBeGreaterThan(2);
})
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In backend/test/routes.test.ts around lines 14-35, strengthen the test by
validating that paymasterAndData is a non-empty hex string (e.g., exists,
startsWith '0x', length > 2 and matches a hex regexp) instead of only checking
property presence, and refactor the hardcoded userOp and address into top-level
test constants or a small fixture object so values are reused and easier to
maintain; update assertions to explicitly check the extracted paymasterAndData
against the hex validation and preserve the existing statusCode check.
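A compact sketch of the stricter check, again under Bun's test runner; the regex and placeholder value are illustrative:

import { expect, test } from "bun:test";

test("paymasterAndData is non-empty hex", () => {
  // In the real test this comes from JSON.parse(res.payload) after app.inject().
  const response = { paymasterAndData: "0x01020304" };
  expect(response).toHaveProperty("paymasterAndData");
  // 0x-prefixed with at least one byte of data.
  expect(response.paymasterAndData).toMatch(/^0x[0-9a-fA-F]{2,}$/);
});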
| "compilerOptions": { | ||
| "baseUrl": "./", | ||
| "resolvePackageJsonImports": true, | ||
| "rootDir": "./", | ||
| "module": "esnext", | ||
| "target": "es2017", | ||
| "moduleResolution": "Node", | ||
| "lib": ["esnext", "ES2022", "ES2017"], | ||
| "esModuleInterop": true, | ||
| "allowSyntheticDefaultImports": true, | ||
| "sourceMap": true, | ||
| "skipLibCheck": true, | ||
| "strict": false |
Disable strict mode for tests is anti-pattern, especially with viem migration.
Setting "strict": false undermines type safety in tests. Given the PR migrates to viem (a strongly-typed library replacing ethers.js), tests should enforce strict type checking to catch regressions and validate correct usage of viem's type system.
Apply this diff to enable strict mode:
- "strict": false
+ "strict": trueIf specific strict sub-flags need to be relaxed, disable only those (e.g., "noImplicitAny": false) rather than disabling strict wholesale.
🤖 Prompt for AI Agents
In backend/test/tsconfig.test.json around lines 2 to 14, tests currently set
"strict": false which disables all strict TypeScript checks; change this to
"strict": true so tests run with full type safety (important after the viem
migration). If any specific strict sub-flags cause nuisance errors, enable
"strict": true and selectively set only those specific flags to false (for
example "noImplicitAny": false) instead of turning off strict entirely; update
the file accordingly and run the test type checks to confirm no remaining
type-errors or to identify and relax only the necessary sub-flags.
| "baseUrl": "./", | ||
| "resolvePackageJsonImports": true, | ||
| "rootDir": "./", | ||
| "module": "esnext", |
🧹 Nitpick | 🔵 Trivial
Minor: redundant lib entry and trailing comma.
Line 9 includes both "esnext" and specific ES versions ("ES2022", "ES2017"). The "esnext" entry is redundant since you explicitly declare the versions you need. Additionally, line 17 has a trailing comma in the include array, which tsconfig's JSONC parser tolerates but which is not strictly valid JSON and may conflict with project linting rules.
Apply this diff to clean up:
- "lib": ["esnext", "ES2022", "ES2017"],
+ "lib": ["ES2022", "ES2017"],And remove the trailing comma:
"include": [
- "**/*.ts", "./*.ts",
+ "**/*.ts", "./*.ts"
]Also applies to: 9-9
🤖 Prompt for AI Agents
In backend/test/tsconfig.test.json around lines 6-17, remove the redundant
"esnext" entry from the "lib" array (keep the explicit ES versions like
ES2022/ES2017) and delete the trailing comma in the "include" array so the JSON
adheres to linting rules; update the lib array to only list the specific ES libs
and ensure the include array ends without a comma.
ch4r10t33r left a comment
Looks good
MTP Migrations
Deploying arka with
Latest commit: 3c6c89e
Status: ✅ Deploy successful!
Preview URL: https://741a4e79.arka-3qg.pages.dev
Branch Preview URL: https://develop.arka-3qg.pages.dev
Actionable comments posted: 2
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (2)
- backend/migrations/2025110100001-seed-data.cjs (1 hunks)
- backend/src/constants/MultiTokenPaymasterCronJob.ts (0 hunks)
💤 Files with no reviewable changes (1)
- backend/src/constants/MultiTokenPaymasterCronJob.ts
🔇 Additional comments (2)
backend/migrations/2025110100001-seed-data.cjs (2)
1-2: LGTM! The imports are appropriate. Using viem for address checksumming aligns with the PR's migration from ethers to viem (a short checksumming sketch follows these comments).
15-15: LGTM! The module export follows the standard Sequelize migration pattern.
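To make the checksumming behaviour concrete, here is a minimal standalone sketch; it is illustrative only and not part of the migration file:

```js
// Illustrative only: shows why the migration wraps token addresses in viem's getAddress.
const { getAddress } = require('viem');

// Any input casing is normalized to the EIP-55 checksummed form used in the seed rows.
console.log(getAddress('0x0b2c639c533813f4aa9d7837caf62653d097ff85'));
// -> '0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85' (the casing used in the seed data)

// Malformed input throws (InvalidAddressError) instead of silently seeding bad data:
// getAddress('0x1234');
```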
async function up({ context: queryInterface }) {
  await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0x16a9FA2FDa030272Ce99B29CF780dFA30361E0f3', 10, 6, NOW(), NOW(), 'EPV_08')`);
  await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0x94b008aA00579c1307B0EF2c499aD98a8ce58e58')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0xECef79E109e997bCA29c1c0897ec9d7b03647F5E', 10, 6, NOW(), NOW(), 'EPV_08')`);
  await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0x8fFfFfd4AfB6115b954Bd326cbe7B4BA576818f6', 1, 6, NOW(), NOW(), 'EPV_08')`);
  await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0xdAC17F958D2ee523a2206206994597C13D831ec7')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0x3E7d1eAB13ad0104d2750B8863b489D65364e32D', 1, 6, NOW(), NOW(), 'EPV_08')`);
}
SQL injection risk; consider using parameterized queries.
The schema name from process.env.DATABASE_SCHEMA_NAME is directly interpolated into the SQL query, creating a potential SQL injection vector if the environment variable is compromised. Additionally, using string interpolation for all values is less secure than parameterized queries.
Consider using Sequelize's bulkInsert method for cleaner and safer code:
-async function up({ context: queryInterface }) {
- await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0x16a9FA2FDa030272Ce99B29CF780dFA30361E0f3', 10, 6, NOW(), NOW(), 'EPV_08')`);
- await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0x94b008aA00579c1307B0EF2c499aD98a8ce58e58')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0xECef79E109e997bCA29c1c0897ec9d7b03647F5E', 10, 6, NOW(), NOW(), 'EPV_08')`);
- await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0x8fFfFfd4AfB6115b954Bd326cbe7B4BA576818f6', 1, 6, NOW(), NOW(), 'EPV_08')`);
- await queryInterface.sequelize.query(`INSERT INTO "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS", "CREATED_AT", "UPDATED_AT", "EP_VERSION") VALUES ('${viem.getAddress('0xdAC17F958D2ee523a2206206994597C13D831ec7')}', '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F', '0x3E7d1eAB13ad0104d2750B8863b489D65364e32D', 1, 6, NOW(), NOW(), 'EPV_08')`);
-}
+async function up({ context: queryInterface }) {
+ const tableName = 'multi_token_paymaster';
+ const schema = process.env.DATABASE_SCHEMA_NAME;
+ const now = new Date();
+
+ await queryInterface.bulkInsert(
+ { tableName, schema },
+ [
+ {
+ TOKEN_ADDRESS: viem.getAddress('0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85'),
+ PAYMASTER_ADDRESS: '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F',
+ ORACLE_ADDRESS: '0x16a9FA2FDa030272Ce99B29CF780dFA30361E0f3',
+ CHAIN_ID: 10,
+ DECIMALS: 6,
+ CREATED_AT: now,
+ UPDATED_AT: now,
+ EP_VERSION: 'EPV_08'
+ },
+ {
+ TOKEN_ADDRESS: viem.getAddress('0x94b008aA00579c1307B0EF2c499aD98a8ce58e58'),
+ PAYMASTER_ADDRESS: '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F',
+ ORACLE_ADDRESS: '0xECef79E109e997bCA29c1c0897ec9d7b03647F5E',
+ CHAIN_ID: 10,
+ DECIMALS: 6,
+ CREATED_AT: now,
+ UPDATED_AT: now,
+ EP_VERSION: 'EPV_08'
+ },
+ {
+ TOKEN_ADDRESS: viem.getAddress('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'),
+ PAYMASTER_ADDRESS: '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F',
+ ORACLE_ADDRESS: '0x8fFfFfd4AfB6115b954Bd326cbe7B4BA576818f6',
+ CHAIN_ID: 1,
+ DECIMALS: 6,
+ CREATED_AT: now,
+ UPDATED_AT: now,
+ EP_VERSION: 'EPV_08'
+ },
+ {
+ TOKEN_ADDRESS: viem.getAddress('0xdAC17F958D2ee523a2206206994597C13D831ec7'),
+ PAYMASTER_ADDRESS: '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F',
+ ORACLE_ADDRESS: '0x3E7d1eAB13ad0104d2750B8863b489D65364e32D',
+ CHAIN_ID: 1,
+ DECIMALS: 6,
+ CREATED_AT: now,
+ UPDATED_AT: now,
+ EP_VERSION: 'EPV_08'
+ }
+ ]
+ );
+}Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In backend/migrations/2025110100001-seed-data.cjs around lines 4-9, the
migration builds raw SQL with process.env.DATABASE_SCHEMA_NAME and token
addresses interpolated directly, creating SQL injection risk; replace the raw
string queries with a safe approach: validate or whitelist the
DATABASE_SCHEMA_NAME (or safely quote it using Sequelize utilities), then use
queryInterface.bulkInsert with the { tableName: 'multi_token_paymaster', schema:
<validated schema> } option (or use queryInterface.sequelize.query with
parameterized binds/replacements) and pass all column values as
parameters/objects instead of inline string interpolation so values are bound
and not concatenated.
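As a rough sketch of the approach the prompt describes (one seed row only, assuming a Postgres dialect; the assertSafeSchema helper and its regex are hypothetical, since schema identifiers cannot be bound as query parameters and must be whitelisted instead):

```js
const viem = require('viem');

// Hypothetical guard: only allow plain identifier characters in the schema name.
function assertSafeSchema(name) {
  if (!name || !/^[A-Za-z_][A-Za-z0-9_]*$/.test(name)) {
    throw new Error(`Unsafe or missing DATABASE_SCHEMA_NAME: ${name}`);
  }
  return name;
}

async function up({ context: queryInterface }) {
  const schema = assertSafeSchema(process.env.DATABASE_SCHEMA_NAME);

  // Values are bound via named replacements rather than concatenated into the SQL string.
  await queryInterface.sequelize.query(
    `INSERT INTO "${schema}".multi_token_paymaster
       ("TOKEN_ADDRESS", "PAYMASTER_ADDRESS", "ORACLE_ADDRESS", "CHAIN_ID", "DECIMALS",
        "CREATED_AT", "UPDATED_AT", "EP_VERSION")
     VALUES (:token, :paymaster, :oracle, :chainId, :decimals, NOW(), NOW(), :epVersion)`,
    {
      replacements: {
        token: viem.getAddress('0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85'),
        paymaster: '0x6Ad5796A4B5385bB3A1573C56115BF292Fb78d2F',
        oracle: '0x16a9FA2FDa030272Ce99B29CF780dFA30361E0f3',
        chainId: 10,
        decimals: 6,
        epVersion: 'EPV_08',
      },
    }
  );
}

module.exports = { up };
```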
async function down({ context: queryInterface }) {
  await queryInterface.sequelize.query(`DELETE IF EXISTS FROM "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster;`);
}
Critical: Invalid SQL syntax and unsafe rollback.
The down function has two critical issues:
1. SQL syntax error: `DELETE IF EXISTS FROM` is invalid SQL syntax. The correct syntax is `DELETE FROM`. The `IF EXISTS` clause is only valid with `DROP TABLE`, not `DELETE`.
2. Unsafe rollback: This deletes ALL records from the table, not just the 4 records inserted by this migration. This could cause data loss if other records exist in the table.
Apply this diff to fix both issues:
async function down({ context: queryInterface }) {
- await queryInterface.sequelize.query(`DELETE IF EXISTS FROM "${process.env.DATABASE_SCHEMA_NAME}".multi_token_paymaster;`);
+ const tableName = 'multi_token_paymaster';
+ const schema = process.env.DATABASE_SCHEMA_NAME;
+
+ await queryInterface.bulkDelete(
+ { tableName, schema },
+ {
+ EP_VERSION: 'EPV_08',
+ CHAIN_ID: [1, 10],
+ TOKEN_ADDRESS: [
+ viem.getAddress('0x0b2C639c533813f4Aa9D7837CAf62653d097Ff85'),
+ viem.getAddress('0x94b008aA00579c1307B0EF2c499aD98a8ce58e58'),
+ viem.getAddress('0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'),
+ viem.getAddress('0xdAC17F958D2ee523a2206206994597C13D831ec7')
+ ]
+ }
+ );
}

Committable suggestion skipped: line range outside the PR's diff.
update the docker file to include scripts
Actionable comments posted: 1
♻️ Duplicate comments (2)
backend/Dockerfile (2)
14-24: ⚠️ Critical: Runner stage missing Bun dependencies — will fail at startup [DUPLICATE]
This is the same critical issue flagged in the previous review. The runner stage copies only build artifacts but not installed dependencies (node_modules and Bun's cache). When `bun start` executes, all imports will fail with "module not found" errors. Either copy dependencies from the deps stage or run a production install in the runner:
Option A (preferred): Copy dependencies from deps stage
FROM oven/bun:1-alpine AS runner
WORKDIR /usr/app
ARG APP_ENV
COPY --from=builder /app/build ./build
COPY ./migrations ./migrations
COPY ./scripts ./scripts
COPY package.json ./
+COPY bun.lock* ./
+COPY --from=deps /app/node_modules ./node_modules
+COPY --from=deps /root/.bun /root/.bun
COPY --from=builder /app/config.json.default /usr/app/config.json
USER root
ENV NODE_ENV="production"
CMD ["bun", "start"]

Option B (alternative): Install production dependencies in runner

FROM oven/bun:1-alpine AS runner
WORKDIR /usr/app
ARG APP_ENV
COPY --from=builder /app/build ./build
COPY ./migrations ./migrations
COPY ./scripts ./scripts
COPY package.json ./
+COPY bun.lock* ./
+RUN bun install --frozen-lockfile --production
COPY --from=builder /app/config.json.default /usr/app/config.json
USER root
ENV NODE_ENV="production"
CMD ["bun", "start"]
22-22: Remove `USER root` to follow the least-privilege principle. Running containers as root increases risk if the container is compromised. Create or use a non-root user.
-USER root
+USER bun

Note: Ensure file permissions for /usr/app and its contents allow the non-root user to read/execute (a short sketch follows).
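If the non-root route is taken, one possible shape for the runner stage, assuming the bun user that ships with the oven/bun base image (the --chown flags are illustrative, not from this PR):

```dockerfile
# Give the unprivileged bun user ownership of everything it needs to read and execute.
COPY --chown=bun:bun --from=builder /app/build ./build
COPY --chown=bun:bun package.json ./
COPY --chown=bun:bun --from=builder /app/config.json.default /usr/app/config.json

# Drop root before the entrypoint runs.
USER bun
CMD ["bun", "start"]
```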
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
📒 Files selected for processing (1)
- backend/Dockerfile (1 hunks)
🧰 Additional context used
🪛 Checkov (3.2.334)
backend/Dockerfile
[low] 22-22: Ensure the last USER is not root
(CKV_DOCKER_8)
[low] 1-24: Ensure that HEALTHCHECK instructions have been added to container images
(CKV_DOCKER_2)
🪛 Hadolint (2.14.0)
backend/Dockerfile
[warning] 2-2: Pin versions in apk add. Instead of apk add <package> use apk add <package>=<version>
(DL3018)
[warning] 22-22: Last USER should not be root
(DL3002)
# Check https://github.com/nodejs/docker-node/tree/b4117f9333da4138b03a546ec926ef50a31506c3#nodealpine to understand why libc6-compat might be needed.
RUN apk add --no-cache libc6-compat
FROM oven/bun:1-alpine AS deps
RUN apk update && apk add --no-cache g++ nodejs npm musl make musl-dev python3 git py3-setuptools && rm -rf /var/cache/apk/*
Pin APK package versions to improve reproducibility and security.
Floating package versions can pull unintended updates; explicitly pinning versions ensures consistent builds.
-RUN apk update && apk add --no-cache g++ nodejs npm musl make musl-dev python3 git py3-setuptools && rm -rf /var/cache/apk/*
+RUN apk update && apk add --no-cache g++=13.2.1_p20240212-r1 nodejs=20.11.1-r0 npm=10.4.0-r0 musl=1.2.4_git20240131-r1 make=4.4.1-r2 musl-dev=1.2.4_git20240131-r1 python3=3.12.1-r1 git=2.43.0-r0 py3-setuptools=69.0.2-r0 && rm -rf /var/cache/apk/*

Consider also whether npm is still needed after full migration to Bun.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
-RUN apk update && apk add --no-cache g++ nodejs npm musl make musl-dev python3 git py3-setuptools && rm -rf /var/cache/apk/*
+RUN apk update && apk add --no-cache g++=13.2.1_p20240212-r1 nodejs=20.11.1-r0 npm=10.4.0-r0 musl=1.2.4_git20240131-r1 make=4.4.1-r2 musl-dev=1.2.4_git20240131-r1 python3=3.12.1-r1 git=2.43.0-r0 py3-setuptools=69.0.2-r0 && rm -rf /var/cache/apk/*
🧰 Tools
🪛 Hadolint (2.14.0)
[warning] 2-2: Pin versions in apk add. Instead of apk add <package> use apk add <package>=<version>
(DL3018)
🤖 Prompt for AI Agents
In backend/Dockerfile around line 2, the RUN line installs APK packages with
floating versions; update it to pin each package to a specific version (e.g.,
package=version) to ensure reproducible builds and improved security by using
known-good package versions, obtain the exact versions from your base image's
apk index (apk info -v or from the image tag) and replace each package in the
apk add list with package=VERSION; also remove npm from the install list if the
project has fully migrated to Bun to reduce image surface area and rebuild time.
Description
Types of changes
What types of changes does your code introduce?
Further comments (optional)
Summary by CodeRabbit
Chores
Bug Fixes
Tests