
spike: e2e tests with openapi client #2681

Draft · wants to merge 5 commits into base: main

Conversation

@hashicc (Collaborator) commented Feb 4, 2025

Description

Spike on using a generated OpenAPI client

Notes

  • Based the generation on the package and techniques used elsewhere in HashiCorp
  • The package used to generate from the OpenAPI spec requires Java
  • Playwright "just works" with TypeScript, so generating a TypeScript-based OpenAPI client makes sense
  • Using an api fixture allows centralized setup for all the clients with common configuration (see the sketch after this list)
  • Attributes in payloads are untyped and would need to rely on documentation and lookup
  • The generated client has decent error handling; the generated code, while a bit verbose, isn't that complicated and supports a good set of default configuration
  • The generated client uses the OpenAPI specs and can be re-generated on demand. It currently points at HEAD of the main branch on the boundary repo.
  • ❓ Should the OpenAPI clients be used directly from the fixture, or should there be a wrapper to provide niceties like specifying a name automatically with Target-${nanoid()} or arguments for untyped attributes (see above)?
    • A fixture could be created that extends and is based on the api fixture, but the api fixture could be the first step
  • ❓ Should the openapi generated client be checked into source control?
    • I think so, since the openapi schema lives outside the repo and has to be fetched
  • ❓ Is there a preference for how resources are created? cli, rest api, ui (ui clicks, Page Objects)
    • Right now we're using a combination of each and have some overlap. I think when it's the thing being tested (the "subject"), it should use the UI to create the resource where possible. Otherwise, I think using the REST API first makes sense since it has fewer layers of abstraction and might have better error messages, although I'm not sure. Most of our UIs will end up using the REST API, so it might make sense to prefer this method. That leaves the CLI as a method of last resort.
  • ❓ Right now the tooling is set to pull the swagger/openapi docs from the head of the main branch of the boundary repo. Is there any concern about this during active development? (eg: working against changes on a branch). The tooling could take a commit, but that would still rely on that commit being available on GitHub.
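
A rough sketch of the api fixture idea mentioned above (the Configuration import, env var names, and auth details are assumptions about the generated client, not necessarily what this branch does):

// fixtures/api.ts (illustrative)
import { test as base } from '@playwright/test';
import { Configuration, TargetServiceApi } from '../api-clients';

// Can add any of the generated clients here as needed
const ApiClasses = {
  target: TargetServiceApi,
};

export const test = base.extend<{ api: { target: TargetServiceApi } }>({
  api: async ({}, use) => {
    // Centralized setup: one configuration shared by every generated client.
    const config = new Configuration({
      basePath: process.env.E2E_BOUNDARY_ADDR, // assumed env var
      accessToken: process.env.E2E_BOUNDARY_TOKEN, // assumed env var
    });

    await use({
      target: new ApiClasses.target(config),
    });
  },
});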

vercel bot commented Feb 4, 2025

The latest updates on your projects. Learn more about Vercel for Git.

Name                  Status    Updated (UTC)
boundary-ui           ✅ Ready  Feb 4, 2025 6:30pm
boundary-ui-desktop   ✅ Ready  Feb 4, 2025 6:30pm

@hashicc changed the title from "Hashicc/api client" to "e2e tests with openapi client" on Feb 4, 2025
@hashicc changed the title from "e2e tests with openapi client" to "spike: e2e tests with openapi client" on Feb 4, 2025
};

const ApiClasses = {
  target: TargetServiceApi,
@hashicc (Collaborator, Author) Feb 4, 2025

Can add any of the generated clients here as needed

const target = await boundaryApi.target.targetServiceCreateTarget({
  scopeId: project.id,
  type: 'tcp',
  port: 22,
@hashicc (Collaborator, Author)

Need to look into the generator to see if the targetService prefix can be removed, because it's redundant and noisy; it'd be nicer if this read as boundaryApi.target.createTarget.

@hashicc (Collaborator, Author)

This is based on the spec's "operationId": "TargetService_CreateTarget", so it might be best to keep the verbosity so that it more clearly matches the OpenAPI spec.

@ZedLi (Collaborator) commented Feb 4, 2025

❓ Should the openapi clients be used directly from the fixture or should there be a wrapper to provide niceties like specifying a name automatically with Target-${nanoid()} or arguments for untyped attributes (see above)

When you say from the fixture, do you mean in the test directly, or in the fixture, which is already a layer abstracted from the test? I think from the POV of writing the actual tests themselves, we could totally have wrappers to at least provide some convenience, since I imagine there could be a decent amount of boiler-plate otherwise, though it might reduce the convenience of having auto-generated clients if we do that for everything (besides having the types).

❓ Should the openapi generated client be checked into source control?

I think so. I would imagine that even if the schema were checked in here and didn't need to be fetched, I would still check in the client so that the history is clear and we know what the tests were running against.

❓ Is there a preference for how resources are created? cli, rest api, ui (ui clicks, Page Objects)

Yeah, I agree with your line of thinking as well; it's similar to what I was thinking in this PR #2594. Unless it's under test, we most likely want to stick with using the HTTP client rather than the CLI, except in scenarios where we need it.

❓ Right now the tooling is set to pull the swagger/openapi docs from the head of the main branch of the boundary repo. Is there any concern about this during active development? (eg: working against changes on a branch). The tooling could take a commit, but that would still rely on that commit being available on GitHub.

I think this is fine as we're also checking in the client itself so we'd be aware of any changes that are being made to the clients.

@hashicc (Collaborator, Author) commented Feb 4, 2025

@ZedLi

When you say from the fixture do you mean in the test directly or in the fixture which is already a layer abstracted from the test? I think from the POV of writing the actual tests themselves, we could totally have wrappers to at least provide some convenience since I imagine there could be a decent amount of boiler-plate otherwise

That's right, an additional fixture that is based on the api fixture (using the word fixture still feels odd to me, but it's what Playwright calls them). Thinking ahead one step, it would be really nice for these domain wrapper fixtures to be class-based so that they could also easily track and tear down what they create, likely automatically. A target fixture based on an api fixture could look like:

class TargetSandbox {
    created = new Set();
    isDestroyed = false;

    constructor(private api) {}

    // ... nice things like filling in `default_port`
    async create(args: Args & { port?: number }) {
        const result = await this.api.createTarget({
            ...args,
            attributes: {
                default_port: args.port,
            },
        });

        this.created.add(result.id);
        return result;
    }

    async delete(id) {
        const result = await this.api.deleteTarget(id);
        this.created.delete(id);
        return result;
    }

    async destroy() {
        for (const id of this.created) {
            try {
                await this.delete(id);
            } catch (e) {
                console.error(`Unable to clean up ${id}`);
            }
        }
        this.isDestroyed = true;
    }
}

export const test = base.extend({
    targetOptions: [{ shouldDestroy: true }, { option: true }],

    target: async ({ api, targetOptions }, use) => {
        const target = new TargetSandbox(api);
        await use(target);

        if (!target.isDestroyed && targetOptions.shouldDestroy) {
            await target.destroy();
        }
    },
});
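
Used from a test, that could read something like this (purely illustrative; `project` is assumed to come from another fixture):

test('creates a target', async ({ target, project }) => {
    const created = await target.create({
        scopeId: project.id,
        type: 'tcp',
        port: 22,
    });
    // ...exercise the UI against the created target; cleanup happens in the fixture teardown...
});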

Yeah, I agree with your line of thinking as well; it's similar to what I was thinking in this PR #2594. Unless it's under test, we most likely want to stick with using the HTTP client rather than the CLI, except in scenarios where we need it.

💯 on everything you mentioned in that pull request

@ZedLi (Collaborator) left a comment

Very cool POC!! 🚀

Collaborator

I have two questions regarding this file:

  1. How come you had to use CommonJS instead of just ESM for this file?
  2. Could we have just had this entirely in the package.json script? It doesn't look like it needs to do much besides call the openapi generator cli. Or do you foresee more we'll need to add to the script?

@hashicc (Collaborator, Author)

I started off with just a yarn run and then moved it to a script. We could return it to that; I think that could make sense. I ended up copying the script from another repo to try and have a common approach with them. That's why it ended up being CJS for the spike; I didn't really want to refactor it, but you're right, it could totally be ESM. I trimmed the script down to the parts we needed, which ended up making it as simple as a package.json script.

TLDR: It could totally be a package.json script, and we could return to a script later if needed


const genClient = async function (swaggerUrl) {
  echoInfo(`Removing existing folder: api-clients`);
  await exec(`rm -rf ${apiClientsDirectory}`);
Collaborator

Could you have just used node for this? fs.rmSync(apiClientsDirectory, { recursive: true, force: true })

@hashicc (Collaborator, Author)

Definitely, that's a lot clearer
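
For illustration, an ESM version of the trimmed script with that change could look roughly like this (the CLI invocation, paths, and the typescript-fetch generator are assumptions, not necessarily what the spike actually uses):

// scripts/generate-api-client.mjs (illustrative, not the actual spike script)
import { execSync } from 'node:child_process';
import fs from 'node:fs';
import { fileURLToPath } from 'node:url';

const apiClientsDirectory = fileURLToPath(new URL('../api-clients', import.meta.url));

const genClient = (swaggerUrl) => {
  console.info(`Removing existing folder: api-clients`);
  fs.rmSync(apiClientsDirectory, { recursive: true, force: true });

  console.info(`Generating client from ${swaggerUrl}`);
  // The generator CLI requires Java, as noted in the PR description.
  execSync(
    `yarn openapi-generator-cli generate -i ${swaggerUrl} -g typescript-fetch -o ${apiClientsDirectory}`,
    { stdio: 'inherit' },
  );
};

genClient(process.argv[2]);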

@@ -86,3 +91,5 @@ export const test = baseTest.extend({
  vaultAddr: process.env.E2E_VAULT_ADDR,
  workerTagEgress: process.env.E2E_WORKER_TAG_EGRESS,
});

export const test = mergeTests(globalBaseTest, apiTest);
Collaborator

We likely want to move these out of the globalSetup as these are really just shared fixtures at this point

@hashicc (Collaborator, Author)

💯
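
A rough sketch of what pulling these out into shared fixtures could look like (file layout and names are illustrative):

// fixtures/shared.ts (illustrative)
import { test as base, mergeTests } from '@playwright/test';
import { test as apiTest } from './api';

// Environment-driven values become plain shared fixtures instead of living in globalSetup.
const envTest = base.extend<{
  vaultAddr: string | undefined;
  workerTagEgress: string | undefined;
}>({
  vaultAddr: process.env.E2E_VAULT_ADDR,
  workerTagEgress: process.env.E2E_WORKER_TAG_EGRESS,
});

export const test = mergeTests(envTest, apiTest);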
