Regular prints are indistinguishable from the listing (--info) printout #668
I am aware of custom listing. However, due to the nature of this being an extension, I cannot force the user to use a special lister for test discovery to work.
Hi, thank you for raising this. I think this is a perfectly valid requirement from an external tool's point of view: there should be a stable way of listing testcases. The currently implemented listers are mostly aimed at human users. You are right that a custom lister will not be generic enough, since the user cannot be forced to use it, but a new lister can easily be added to Testplan. A new lister for external tools could be more structured, dumping JSON or XML that is easier to consume than the current `::`-separated names; it could even deliver extra information that might be useful for those tools. If you have the capacity to work on this kind of lister, we are open to pull requests. If not, I will put this up for prioritization with our team, but please provide some feedback on the ideal format/information such a lister should return.
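To make the request concrete, here is a sketch of what such a machine-readable listing payload could look like. Every field name below is a hypothetical suggestion for discussion, not Testplan's actual schema:

```python
import json

# Hypothetical JSON shape for a machine-readable lister.
# All keys ("suites", "testcases", "tags", ...) are assumptions,
# not part of any existing Testplan output format.
listing = {
    "name": "Primary",
    "type": "multitest",
    "suites": [
        {
            "name": "AlphaSuite",
            "testcases": [
                {"name": "test_equality", "tags": ["unit"]},
                {"name": "test_membership", "tags": []},
            ],
        }
    ],
}

# An external tool (e.g. a VSCode extension) could consume this directly.
payload = json.dumps(listing, indent=2)
```

A structured payload like this avoids the ambiguity of line-based output entirely, since stray prints can never appear inside a serialized document.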
Yes! JSON format would be great! From my point of view, this is the interface I have to comply with: Besides the mandatory
While implementing it would be fun, the rigorous DCO and the potential documentation avalanche it could cause probably won't fit within my active employment time :) PS: I did not expect to receive such positive feedback :) Thanks
Hi, we have recorded an internal ticket for adding a reasonable interface for programmatic listing. It is not prioritized at the moment; we will keep you posted. I'm also wondering whether we could re-use the exporters to generate the data you need: dry-run the testplan and write a JSON/XML report skeleton. By the way, have you tried Testplan's interactive mode? I'm thinking that might be another thing we could use for integration with VSCode as well.
Thanks!
I haven't tried interactive mode, mainly because the 'Test Adapter' extension doesn't support starting and stopping test environments.
Just a quick question: does Testplan provide an API to programmatically get the list of tests (from a separate Python script, just importing the testplan script file) and obtain information similar to what we have been discussing? Quote from the VSCode extension author for clarification:
No, we don't have that. Testplan's test discovery can be more complicated than unittest's or pytest's; those tools can probably just scan the directories and import the modules. If we did that, we would only know what testsuites/testcases exist, but we could not construct a complete plan/multitest/testsuite/testcase structure. That said, it might be a nice feature to discover tests from the file structure and allow the user to run testsuites without having to add them to a multitest (when no environment is required). We will give this some thought.
Hi!
I'm working on integrating Testplan support into the "Python Test Explorer for VSCode" extension.
I have encountered a scenario where a global

print("Hello world")

in the main test_plan.py file is visible in the listing (--info) printout. This corrupts the whole printout, making parsing unnecessarily hard.
Example:
python test_plan.py --info pattern-full
Proposed solution