Conversation
Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request significantly enhances the test coverage for the `lookup` command.

Highlights
Code Review
This pull request introduces a comprehensive set of unit tests for the lookup command, covering CLI argument parsing, configuration binding, and the TimeoutError utility. The tests are well-structured and address various scenarios, which significantly improves the test coverage for this component. However, there are some areas for improvement regarding adherence to the project's testing style guide and test clarity.
```typescript
Deno.test("lookupCommand: binds config defaults when no CLI flags provided", async () => {
  const result = await runWithConfig(lookupCommand, configContext, {
    load: () => ({
      lookup: {
        timeout: 12,
        defaultFormat: "compact",
        separator: "----SEP----",
        traverse: true,
        suppressErrors: true,
        authorizedFetch: true,
        firstKnock: "rfc9421",
      },
    }),
    args: ["lookup", "https://example.com/object/1"],
  });

  assert.equal(result.command, "lookup");
  assert.deepEqual(result.urls, ["https://example.com/object/1"]);
});

Deno.test("Config lookup value constraints reject invalid timeout/format inputs deterministically", async () => {
  await assert.rejects(
    () =>
      runWithConfig(lookupCommand, configContext, {
        load: () => ({
          lookup: {
            timeout: "not-a-number",
            defaultFormat: "not-a-supported-format",
          },
        }),
        args: ["lookup", "https://example.com/object/1"],
      }),
    (err) => {
      assert.ok(err instanceof Error);
      assert.match(
        err.message,
        /(timeout|defaultFormat|lookup|schema|Invalid)/i,
      );
      return true;
    },
  );
});

Deno.test("lookup options parse: timeout/format/separator/traverse/suppress-errors via expected flags", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "--timeout",
      "1.5",
      "-s",
      "***",
      "--traverse",
      "-S",
      "-e",
      "https://example.com/object/1",
    ]));

  assert.ok(result.success);
  if (!result.success) return;

  assert.equal(result.value.timeout, 1.5);
  assert.equal(result.value.format, "expand");
  assert.equal(result.value.separator, "***");
  assert.equal(result.value.traverse, true);
  assert.equal(result.value.suppressErrors, true);
  assert.deepEqual(result.value.urls, ["https://example.com/object/1"]);
});

Deno.test("authorized fetch dependency: --first-knock requires -a/--authorized-fetch", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "--first-knock",
      "rfc9421",
      "https://example.com/object/1",
    ]));

  assert.ok(!result.success);
});

Deno.test("lookupCommand parses URL arguments and preserves order", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "https://example.com/objects/1",
      "@alice@example.com",
      "https://example.com/objects/2",
    ]));

  assert.ok(result.success);
  if (!result.success) return;

  assert.deepEqual(result.value.urls, [
    "https://example.com/objects/1",
    "@alice@example.com",
    "https://example.com/objects/2",
  ]);
});

Deno.test("CLI flags override config-bound lookup defaults", async () => {
  const result = await runWithConfig(lookupCommand, configContext, {
    load: () => ({
      lookup: {
        timeout: 30,
        defaultFormat: "raw",
        separator: "CONFIG_SEP",
        traverse: false,
        suppressErrors: false,
      },
    }),
    args: [
      "lookup",
      "--timeout",
      "2",
      "--compact",
      "--separator",
      "CLI_SEP",
      "--traverse",
      "--suppress-errors",
      "https://example.com/object/1",
    ],
  });

  assert.equal(result.timeout, 2);
  assert.equal(result.format, "compact");
  assert.equal(result.separator, "CLI_SEP");
  assert.equal(result.traverse, true);
  assert.equal(result.suppressErrors, true);
});

Deno.test("authorized fetch dependency: --tunnel-service requires -a/--authorized-fetch", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "--tunnel-service",
      "serveo.net",
      "https://example.com/object/1",
    ]));

  assert.ok(!result.success);
});

Deno.test("lookupCommand requires at least one URL", () => {
  const result = withActiveConfig({}, () => parse(lookupCommand, ["lookup"]));

  assert.ok(!result.success);
  if (result.success) return;

  assert.ok(result.error);
});

Deno.test("TimeoutError constructor sets message and name", () => {
  const error = new TimeoutError("Test timeout message");
  assert.equal(error.message, "Test timeout message");
  assert.equal(error.name, "TimeoutError");
});

Deno.test("TimeoutError is instance of Error", () => {
  const error = new TimeoutError("Test error");
  assert.ok(error instanceof Error);
  assert.ok(error instanceof TimeoutError);
});

Deno.test("TimeoutError preserves stack trace", () => {
  const error = new TimeoutError("Test timeout");
  assert.ok(error.stack);
  assert.ok(error.stack.includes("TimeoutError"));
});

Deno.test("createTimeoutSignal handles zero timeout", () => {
  const signal = createTimeoutSignal(0);
  assert.ok(signal !== undefined);
  assert.ok(signal instanceof AbortSignal);
});

Deno.test("createTimeoutSignal multiple signals are independent", async () => {
  const signal1 = createTimeoutSignal(0.05);
  const signal2 = createTimeoutSignal(0.1);

  assert.ok(signal1);
  assert.ok(signal2);
  assert.ok(!signal1.aborted);
  assert.ok(!signal2.aborted);

  await new Promise((resolve) => setTimeout(resolve, 60));

  assert.ok(signal1.aborted);
  assert.ok(!signal2.aborted);

  clearTimeoutSignal(signal2);
});

Deno.test("clearTimeoutSignal handles undefined signal gracefully", () => {
  clearTimeoutSignal(undefined);
  assert.ok(true);
});

Deno.test("clearTimeoutSignal handles signal without timer", () => {
  const controller = new AbortController();
  const signal = controller.signal;
  clearTimeoutSignal(signal);
  assert.ok(true);
});

Deno.test("lookupCommand parses multiple URLs and handles correctly", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "https://example.com/1",
      "@user@domain.com",
      "https://example.com/2",
      "@another@server.net",
    ]));

  assert.ok(result.success);
  if (!result.success) return;

  assert.equal(result.value.urls.length, 4);
  assert.equal(result.value.urls[0], "https://example.com/1");
  assert.equal(result.value.urls[1], "@user@domain.com");
  assert.equal(result.value.urls[2], "https://example.com/2");
  assert.equal(result.value.urls[3], "@another@server.net");
});

Deno.test("lookupCommand traverse and suppressErrors work together", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "-t",
      "-S",
      "https://example.com/collection/1",
    ]));

  assert.ok(result.success);
  if (!result.success) return;

  assert.equal(result.value.traverse, true);
  assert.equal(result.value.suppressErrors, true);
});

Deno.test("lookupCommand accepts custom separator", () => {
  const result = withActiveConfig({}, () =>
    parse(lookupCommand, [
      "lookup",
      "-s",
      "===SEPARATOR===",
      "https://example.com/object/1",
    ]));

  assert.ok(result.success);
  if (!result.success) return;

  assert.equal(result.value.separator, "===SEPARATOR===");
});

Deno.test("authorizedFetchOption all flags work together", () => {
  const result = withActiveConfig({}, () =>
    parse(authorizedFetchOption, [
      "-a",
      "--first-knock",
      "rfc9421",
      "--tunnel-service",
      "serveo.net",
    ]));

  assert.ok(result.success);
  if (!result.success) return;

  assert.equal(result.value.authorizedFetch, true);
  assert.equal(result.value.firstKnock, "rfc9421");
  assert.equal(result.value.tunnelService, "serveo.net");
});

Deno.test("runLookup timeout configuration creates signal correctly", () => {
  const command = {
    urls: ["https://example.com/notes/1"],
    debug: false,
    traverse: false,
    suppressErrors: false,
    authorizedFetch: false,
    firstKnock: undefined,
    tunnelService: undefined,
    timeout: 5,
    userAgent: undefined,
    format: undefined,
    separator: "----",
    output: undefined,
  };

  assert.equal(command.timeout, 5);
  const signal = createTimeoutSignal(command.timeout);
  assert.ok(signal);
  assert.ok(!signal.aborted);
  clearTimeoutSignal(signal);
});
```
The new test cases utilize `Deno.test` for defining tests. According to the repository's style guide (lines 111 and 199), tests should use `@fedify/fixture` to ensure runtime-agnosticism across Deno, Node.js, and Bun environments. Please refactor these tests to use the `test` function imported from `@fedify/fixture` for consistency and adherence to the project's testing standards.
References
- Tests should use `@fedify/fixture` for runtime-agnostic test adapters (wraps Deno, Node.js, and Bun test APIs). (link)
- Import `test` from `@fedify/fixture` for runtime-agnostic tests. (link)
- For code intended to run in both Deno and Node.js environments, prefer Node.js APIs (e.g., `node:net`) over Deno-specific APIs (e.g., `Deno.connect`) to ensure cross-runtime compatibility. This principle extends to testing frameworks to maintain runtime-agnosticism.
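The suggested change is mostly mechanical. A sketch of how one of the tests might look after the refactor, assuming `@fedify/fixture` exposes a `test` function with a `Deno.test`-compatible `(name, fn)` signature (as the references above indicate) and reusing the `parse`, `lookupCommand`, and `withActiveConfig` helpers this test file already imports:

```typescript
// Sketch only: `test` here stands in for Deno.test and runs on Deno,
// Node.js, and Bun via the fixture's runtime-agnostic adapter.
import { test } from "@fedify/fixture";

test("lookupCommand requires at least one URL", () => {
  const result = withActiveConfig({}, () => parse(lookupCommand, ["lookup"]));
  assert.ok(!result.success);
});
```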
```typescript
Deno.test("createTimeoutSignal multiple signals are independent", async () => {
  const signal1 = createTimeoutSignal(0.05);
  const signal2 = createTimeoutSignal(0.1);

  assert.ok(signal1);
  assert.ok(signal2);
  assert.ok(!signal1.aborted);
  assert.ok(!signal2.aborted);

  await new Promise((resolve) => setTimeout(resolve, 60));

  assert.ok(signal1.aborted);
  assert.ok(!signal2.aborted);

  clearTimeoutSignal(signal2);
});
```
This test relies on a fixed `setTimeout(resolve, 60)` to assert the state of `AbortSignal`s after a specific duration. Relying on precise timing in asynchronous tests can lead to flakiness, especially in different execution environments or under varying system loads. Consider using a more robust approach for asserting asynchronous behavior, such as `Promise.race` with explicit timeouts or a dedicated utility that waits for the signal to abort, to make the test more reliable.
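One way to remove the wall-clock dependency is to wait for the signal's `abort` event instead of sleeping a fixed duration. A minimal sketch; `waitForAbort` is a hypothetical helper, not part of the codebase:

```typescript
// Hypothetical helper: resolves once the signal aborts, with a generous
// safety cap so a broken signal fails the test instead of hanging it.
function waitForAbort(signal: AbortSignal, capMs = 5000): Promise<void> {
  return new Promise((resolve, reject) => {
    if (signal.aborted) return resolve();
    const cap = setTimeout(
      () => reject(new Error("signal did not abort within the cap")),
      capMs,
    );
    signal.addEventListener("abort", () => {
      clearTimeout(cap);
      resolve();
    }, { once: true });
  });
}

// In the test, instead of `await new Promise((resolve) => setTimeout(resolve, 60))`:
//   await waitForAbort(signal1);  // fires exactly when signal1 aborts
//   assert.ok(signal1.aborted);
// Note the `!signal2.aborted` assertion is still timing-sensitive; clearing
// signal2 before asserting, or widening the gap between timeouts, helps.
```

This makes the test's pass condition event-driven rather than dependent on how quickly the runtime schedules a 60 ms sleep.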
```typescript
Deno.test("runLookup timeout configuration creates signal correctly", () => {
  const command = {
    urls: ["https://example.com/notes/1"],
    debug: false,
    traverse: false,
    suppressErrors: false,
    authorizedFetch: false,
    firstKnock: undefined,
    tunnelService: undefined,
    timeout: 5,
    userAgent: undefined,
    format: undefined,
    separator: "----",
    output: undefined,
  };

  assert.equal(command.timeout, 5);
  const signal = createTimeoutSignal(command.timeout);
  assert.ok(signal);
  assert.ok(!signal.aborted);
  clearTimeoutSignal(signal);
});
```
The test named `runLookup timeout configuration creates signal correctly` does not actually invoke or test the `runLookup` function. It primarily tests the `createTimeoutSignal` function, which is already covered by other dedicated tests. This makes the test redundant and its name misleading. Please either adjust the test body to genuinely test `runLookup`'s timeout configuration or remove this test if its functionality is sufficiently covered elsewhere.
Thank you for addressing the issues from PR #596 and for disclosing your use of AI. However, this PR still does not meet our AI usage policy, which requires that all AI-generated code be fully reviewed and verified by a human before submission. The following issues make it clear that the tests were not carefully validated against the actual codebase:
We understand that AI tools can be genuinely useful for writing tests, but submitting AI output without thorough review is not acceptable here. If you would like to contribute, please take the time to understand the code you are testing, run the tests locally to confirm they exercise real behavior, and carefully review every line before opening a PR.
Summary
Add tests for fedify lookup
Related issue
#259
Changes
Added new test cases to `lookup.test.ts`
Benefits
Unit tests for `lookup.ts`
Checklist
Did you run `deno task test-all` on your machine?

Additional notes
This was done using Ogoron AI. In the previous PR, there was an error with the README file. The incorrect location and watermark were my mistake, not the AI's. Now, all tests target production code only. The tests have been edited and reviewed by me.