Hi ChiThu.
42 + run = TestRunner(RUNSTEPS,default_options=DEFAULT_OPTIONS)

not PEP-8 clean (missing space after the comma)
86 +artifacts.stdout_pathname is the file stored the output of the test execution. By parsing this file and append the test
87 +result to the self.results['test_result'] list. The :meth:`~lava_test.core.parsers.TestParser.analyze_test_result` help
88 +method will do the sanity check and also perform the fixup of the result.
IMHO this is not easy to understand. The "fixup" is vague: what is it about, and why is it necessary? (I know, but think about the readers.)
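A short illustration in the docs would go a long way. Something along these lines (purely illustrative; this is not the actual analyze_test_result code, and KNOWN_RESULTS is made up):

```python
# One plausible meaning of "fixup": coerce whatever the parser scraped
# out of the log into the small set of result strings LAVA understands.
KNOWN_RESULTS = {'pass', 'fail', 'skip', 'unknown'}

def fixup_result(raw_result):
    """Lower-case the parsed result and map anything unexpected to 'unknown'."""
    result = raw_result.strip().lower()
    return result if result in KNOWN_RESULTS else 'unknown'

print(fixup_result('PASS'))    # pass
print(fixup_result('BROK'))    # unknown
```

With an example like that, "sanity check and fixup" stops being hand-waving.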
115 + $ lava-test register http://abc.com/my-custom-test.json

Use example.{com,org,net} per RFC 2606
66 + class LTPParser(TestParser):
67 + def parse(self, artifacts):
Again, not PEP-8 clean
72 + with open(filename, 'r') as fd:
This is a logical bug: open does not return a file descriptor, it returns a file-like object. That object happens to have (on some platforms) a fileno() method that returns a descriptor number. Call it 'stream' instead; naming it 'fd' just confuses the reader.
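To make the distinction concrete, a quick check in plain Python (nothing lava-test specific; the temp-file path is just scaffolding for a runnable example):

```python
import tempfile

# open() returns a file-like object (a stream), not an integer descriptor;
# the descriptor number is only reachable through the object's fileno() method.
with tempfile.NamedTemporaryFile(mode='w', delete=False) as tmp:
    tmp.write("mmap001 1 PASS : ok\n")
    path = tmp.name

with open(path, 'r') as stream:               # 'stream', not 'fd'
    print(isinstance(stream, int))            # False: not a raw descriptor
    print(isinstance(stream.fileno(), int))   # True: fileno() gives the number
```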
121 +To see how the json file look like, let see the examples/stream.json. This file shows how to write the stream test in
122 +json format. Same test is written in python format: lava_test/test_definitions/stream.py.
This is also confusing. We should explain what the JSON representation does. It tells lava-test how to construct a test object out of the available test classes. While learning by example is fine without explanation it _will_ lead to confusion.
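Even a toy sketch of that mapping would help. For instance (an illustrative stand-in, not the actual lava-test loader; REGISTRY, Test and the JSON keys here are all made up):

```python
import json

# Toy illustration of the idea: the JSON names which classes to instantiate
# and with what arguments, and the loader just wires them together.
REGISTRY = {}

class TestRunner:
    def __init__(self, steps):
        self.steps = steps
REGISTRY['TestRunner'] = TestRunner

class Test:
    def __init__(self, test_id, runner):
        self.test_id = test_id
        self.runner = runner

definition = json.loads("""
{
  "test_id": "stream",
  "runner": {"class": "TestRunner", "steps": ["./stream"]}
}
""")

runner_spec = definition['runner']
runner = REGISTRY[runner_spec['class']](runner_spec['steps'])
test = Test(definition['test_id'], runner)
print(test.test_id, test.runner.steps)
```

Once the reader sees that the JSON is a recipe for building the same objects the Python definition builds by hand, the stream.json example becomes self-explanatory.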
63 +To show how to implement the new parser, let have a closer look at one of existing test - lava_test/test_definitions/ltp.py
(snip)
66 + class LTPParser(TestParser):
67 +     def parse(self, artifacts):
68 +         filename = artifacts.stdout_pathname
69 +         print "filename=%s" %filename
70 +         # PATTERN = "^(?P<test_case_id>\S+) (?P<subid>\d+) (?P<result>\w+) : (?P<message>.+)"
None of this code makes any sense to the reader. What you should have started with is a discussion on what the test parser is for. Start by showing what the test program output looks like. Highlighting the parts that we want to capture in LAVA and then constructing simple code that does that. Only then show how to wrap that code in a Parser class.
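Concretely, the walkthrough could start from a sample of LTP-style output and the plain parsing code, before any class appears (a hypothetical sketch for the docs, not the real lava-test code; SAMPLE_OUTPUT is invented, the regex is the one commented out in the patch):

```python
import re

# Hypothetical sample of LTP-style output; real logs have more noise around it.
SAMPLE_OUTPUT = """\
mmap001 1 PASS : ok
fork01 1 FAIL : exit status 1
"""

# The pattern from the patch: capture test id, sub-id, result and message.
PATTERN = r"^(?P<test_case_id>\S+) (?P<subid>\d+) (?P<result>\w+) : (?P<message>.+)"

def parse_ltp_output(text):
    """Return one result dict per line that matches the LTP pattern."""
    results = []
    for line in text.splitlines():
        match = re.match(PATTERN, line)
        if match:
            results.append(match.groupdict())
    return results

for result in parse_ltp_output(SAMPLE_OUTPUT):
    print(result['test_case_id'], result['result'])
```

With that on the page, wrapping the loop in a parse() method of a TestParser subclass is a small, obvious step instead of a wall of unexplained code.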