add script to clean up e2e #366
Conversation
Walkthrough

The changes introduce new functionality for managing pipelines in the LlamaCloudIndex system through the creation of a cleanup script.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant CLI
    participant CleanService
    participant ProjectsService
    participant PipelinesService
    User->>CLI: Execute clean command
    CLI->>CleanService: initService(apiKey)
    CleanService->>ProjectsService: getProjectId(projectName)
    ProjectsService-->>CleanService: return projectId
    CleanService->>PipelinesService: searchPipelines(projectId)
    PipelinesService-->>CleanService: return pipelines
    CleanService->>PipelinesService: deletePipeline(pipelineId)
    PipelinesService-->>CleanService: return deletion status
    CleanService->>CLI: Log deletion results
    CLI->>User: Display completion message
```
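The flow above can be sketched in TypeScript. This is a hypothetical illustration with stubbed services — the real script would call the `@llamaindex/cloud` API; `getProjectId`, `searchPipelines`, and `deletePipeline` here are stand-ins, not the actual client methods.

```typescript
// Hypothetical sketch of the cleanup flow; all service calls are stubs.
type Pipeline = { id: string; name: string };

// Stub: resolve a project name to its id.
async function getProjectId(projectName: string): Promise<string> {
  return `proj-${projectName}`;
}

// Stub: list pipelines belonging to a project.
async function searchPipelines(projectId: string): Promise<Pipeline[]> {
  return [
    { id: "p1", name: "pipeline-one" },
    { id: "p2", name: "pipeline-two" },
  ];
}

// Stub: delete one pipeline, returning whether it succeeded.
async function deletePipeline(pipelineId: string): Promise<boolean> {
  return true;
}

export async function deletePipelines(projectName: string): Promise<number> {
  const projectId = await getProjectId(projectName);
  const pipelines = await searchPipelines(projectId);
  let deleted = 0;
  for (const pipeline of pipelines) {
    if (await deletePipeline(pipeline.id)) {
      console.log(`Deleted pipeline ${pipeline.name} (${pipeline.id})`);
      deleted++;
    }
  }
  return deleted;
}
```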
Actionable comments posted: 4
🧹 Outside diff range and nitpick comments (1)
e2e/clean/clean.ts (1)

14-14: Specify the type of the `request` parameter instead of using `any`

Using `any` defeats the purpose of TypeScript's static typing. Consider specifying the appropriate type for `request` to enhance type safety and avoid potential runtime errors. For example:

```diff
 client.interceptors.request.use((request: any) => {
+  // If using Axios, you might import AxiosRequestConfig:
+  // import { AxiosRequestConfig } from 'axios';
+  // and then type the request parameter accordingly:
+  // (request: AxiosRequestConfig) => {
```
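As a concrete illustration of this suggestion, here is a hedged sketch that types the parameter as a Fetch API `Request` — an assumption, since the actual client may expect a different request type (for Axios it would be `InternalAxiosRequestConfig`):

```typescript
// Assumption: the client passes a Fetch API Request to its interceptors.
// `token` is a placeholder for the value resolved in initService.
const token = "example-token";

// Typing the parameter as Request (instead of any) lets the compiler
// verify that .headers.set exists and catches misuse at build time.
function addAuthHeader(request: Request): Request {
  request.headers.set("Authorization", `Bearer ${token}`);
  return request;
}

const req = addAuthHeader(new Request("https://example.com/api/v1/pipelines"));
console.log(req.headers.get("Authorization"));
```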
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
⛔ Files ignored due to path filters (1)

- `e2e/clean/pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (3)
- e2e/clean/clean.ts (1 hunks)
- e2e/clean/package.json (1 hunks)
- package.json (1 hunks)
🧰 Additional context used
🔇 Additional comments (4)
e2e/clean/package.json (4)
1-5: LGTM: Package metadata is well-defined.

The package metadata is correctly set up:
- The scoped package name is appropriate.
- Version 0.1.0 is suitable for a new package.
- The package is correctly marked as private for internal use.
- The "type" field enables ES module syntax, which is good for modern JavaScript development.
9-12: LGTM: DevDependencies are appropriate and up-to-date.

The devDependencies are well-chosen:
- @types/node provides necessary TypeScript type definitions.
- tsx enables direct execution of TypeScript files, which is used in the "clean" script.
- The versions used are recent, and the caret ranges allow for compatible updates.
13-17: LGTM: Dependencies are appropriate. Consider checking for updates.

The dependencies align with the project's purpose:
- @llamaindex/cloud and @llamaindex/core are likely used for interacting with the LlamaCloudIndex system.
- tiktoken is probably used for tokenization tasks.

The versions use caret ranges, which is good for allowing compatible updates. However, it's worth verifying whether these are the most recent versions available.

Run the following script to check for the latest versions of these packages:

```shell
#!/bin/bash
# Description: Check for the latest versions of the dependencies

# Function to get the latest version of a package
get_latest_version() {
  npm view "$1" version
}

# Check each package
echo "Current @llamaindex/cloud: ^0.2.14, Latest: $(get_latest_version @llamaindex/cloud)"
echo "Current @llamaindex/core: ^0.2.12, Latest: $(get_latest_version @llamaindex/core)"
echo "Current tiktoken: ^1.0.17, Latest: $(get_latest_version tiktoken)"
```
6-8: LGTM: Script is defined correctly. Verify the existence of clean.ts.

The "clean" script is correctly defined to run the clean.ts file using tsx. However, let's ensure that the clean.ts file exists in the same directory as this package.json.

Run the following script to verify the existence of clean.ts:
```diff
@@ -27,6 +27,7 @@
     "e2e": "playwright test",
     "e2e:python": "playwright test e2e/shared e2e/python",
     "e2e:typescript": "playwright test e2e/shared e2e/typescript",
+    "e2e:clean": "pnpm --filter @create-llama/e2e-clean clean",
```
💡 Codebase verification

Issue: `@create-llama/e2e-clean` package not found

The `"e2e:clean"` script references the `@create-llama/e2e-clean` package, which does not exist in the repository. This will cause the script to fail when executed.

- Ensure that the `@create-llama/e2e-clean` package is added to the `packages` directory.
- Verify that the `package.json` of `@create-llama/e2e-clean` includes a `clean` script.
🔗 Analysis chain

LGTM: New script added for e2e cleanup

The addition of the `"e2e:clean"` script aligns with the PR objective of adding a script to clean up e2e. This new script uses pnpm to run the `clean` command in the `@create-llama/e2e-clean` package, which is consistent with the project's structure and tooling.

To ensure the completeness of this change, let's verify the existence of the referenced package and its `clean` script. This will confirm that the necessary components for the new `e2e:clean` script are in place.
🏁 Scripts executed

The following scripts were executed for the analysis:

```shell
#!/bin/bash
# Description: Verify the existence of @create-llama/e2e-clean package and its clean script

# Test 1: Check if the package directory exists
if fd -t d "e2e-clean$" packages; then
  echo "✅ @create-llama/e2e-clean package directory found."
else
  echo "❌ @create-llama/e2e-clean package directory not found."
fi

# Test 2: Check if the package.json exists and contains a clean script
if fd -t f "package.json$" packages/e2e-clean | xargs grep -q '"clean":'; then
  echo "✅ clean script found in @create-llama/e2e-clean/package.json"
else
  echo "❌ clean script not found in @create-llama/e2e-clean/package.json"
fi
```
Length of output: 563
```typescript
  } catch (error) {
    console.error("Error during pipeline deletion process:", error);
  }
```
Exit with a non-zero exit code when an error occurs
When an error occurs during the pipeline deletion process, the script logs the error but exits with a zero exit code. To indicate failure to the calling process or CI tools, consider exiting with a non-zero exit code.
Apply this diff to fix the issue:

```diff
 } catch (error) {
   console.error("Error during pipeline deletion process:", error);
+  process.exit(1);
 }
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
} catch (error) {
  console.error("Error during pipeline deletion process:", error);
  process.exit(1);
}
```
```typescript
  const token = apiKey ?? process.env.LLAMA_CLOUD_API_KEY;
  client.interceptors.request.use((request: any) => {
    request.headers.set("Authorization", `Bearer ${token}`);
    return request;
  });
  if (!token) {
    throw new Error(
      "API Key is required for LlamaCloudIndex. Please set the LLAMA_CLOUD_API_KEY environment variable",
    );
  }
}
```
Move token validation before setting up the interceptor

Currently, the interceptor is set up before verifying that `token` is defined, which could lead to setting an Authorization header with an undefined token, causing unintended behavior. To prevent this, validate that `token` is defined before setting up the interceptor.
Apply this diff to fix the issue:

```diff
 function initService(apiKey?: string) {
   client.setConfig({
     baseUrl: DEFAULT_BASE_URL,
     throwOnError: true,
   });
   const token = apiKey ?? process.env.LLAMA_CLOUD_API_KEY;
+  if (!token) {
+    throw new Error(
+      "API Key is required for LlamaCloudIndex. Please set the LLAMA_CLOUD_API_KEY environment variable",
+    );
+  }
   client.interceptors.request.use((request: any) => {
     request.headers.set("Authorization", `Bearer ${token}`);
     return request;
   });
-  if (!token) {
-    throw new Error(
-      "API Key is required for LlamaCloudIndex. Please set the LLAMA_CLOUD_API_KEY environment variable",
-    );
-  }
 }
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
  const token = apiKey ?? process.env.LLAMA_CLOUD_API_KEY;
  if (!token) {
    throw new Error(
      "API Key is required for LlamaCloudIndex. Please set the LLAMA_CLOUD_API_KEY environment variable",
    );
  }
  client.interceptors.request.use((request: any) => {
    request.headers.set("Authorization", `Bearer ${token}`);
    return request;
  });
}
```
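The corrected ordering amounts to a fail-fast rule: validate required configuration before wiring anything that depends on it. A minimal sketch of that rule as a standalone helper (`requireToken` is hypothetical, not part of this PR):

```typescript
// Hypothetical helper: resolve the token and fail fast if it is missing,
// before any interceptor is registered.
function requireToken(apiKey?: string): string {
  const token = apiKey ?? process.env.LLAMA_CLOUD_API_KEY;
  if (!token) {
    throw new Error(
      "API Key is required for LlamaCloudIndex. Please set the LLAMA_CLOUD_API_KEY environment variable",
    );
  }
  return token;
}
```

With this shape, `initService` can call `requireToken` first and only then register the interceptor, so an undefined token can never reach the Authorization header.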
```typescript
if (!projectName) {
  console.error("Please provide a project name as an argument.");
  process.exit(1);
}
```
🛠️ Refactor suggestion
Consider using a command-line argument parser for scalability

Directly accessing `process.argv` works for simple scripts, but as the script evolves or requires more complex command-line options, using a library like `yargs` or `commander` can enhance scalability and maintainability.

For example, you could modify your script to use `yargs`:
```diff
+import yargs from 'yargs';
+
+const argv = yargs(process.argv.slice(2))
+  .usage('Usage: $0 --project <projectName>')
+  .option('project', {
+    alias: 'p',
+    describe: 'Name of the project',
+    type: 'string',
+    demandOption: true,
+  })
+  .help()
+  .argv;
+
+const projectName = argv.project;
-
-const projectName = process.argv[2];
-
-if (!projectName) {
-  console.error("Please provide a project name as an argument.");
-  process.exit(1);
-}
-
 deletePipelines(projectName);
```
Remember to install `yargs` by adding it to your dependencies.
Committable suggestion was skipped due to low confidence.
Summary by CodeRabbit

New Features

Chores

- `@create-llama/e2e-clean` module with necessary dependencies and scripts.