* [PATCH 1/3] meta-skeleton: add a ptest example
@ 2021-06-03 17:21 Adrian Freihofer
  2021-06-03 17:21 ` [PATCH 2/3] testimage: support additional reports for ptests Adrian Freihofer
                   ` (2 more replies)
  0 siblings, 3 replies; 9+ messages in thread
From: Adrian Freihofer @ 2021-06-03 17:21 UTC (permalink / raw)
  To: openembedded-core; +Cc: Adrian Freihofer

---
 .../ptest-example/files/run-ptest             | 26 +++++++++++++++++++
 .../ptest-example/ptest-example_0.1.bb        | 16 ++++++++++++
 2 files changed, 42 insertions(+)
 create mode 100644 meta-skeleton/recipes-skeleton/ptest-example/files/run-ptest
 create mode 100644 meta-skeleton/recipes-skeleton/ptest-example/ptest-example_0.1.bb

diff --git a/meta-skeleton/recipes-skeleton/ptest-example/files/run-ptest b/meta-skeleton/recipes-skeleton/ptest-example/files/run-ptest
new file mode 100644
index 0000000000..7c80306475
--- /dev/null
+++ b/meta-skeleton/recipes-skeleton/ptest-example/files/run-ptest
@@ -0,0 +1,26 @@
+#!/bin/sh
+
+# The result should be PASS, FAIL, or SKIP, and the testname can be any identifying string.
+# The ptest-runner does not evaluate stdout or stderr. The output format is a recommended convention.
+# A real ptest would call the unit-test executable instead of echo.
+echo "PASS:  dummy test passing always"
+
+# Optionally, a ptest might provide a JUnit-like XML report. Reports are collected by the ptest imagetest if
+# the TESTIMAGE_PTEST_REPORT_DIR variable is configured for the tested image.
+# Example to fetch an XML report to ${TEST_LOG_DIR}/reports-xml/ptest-example.xml:
+#   TESTIMAGE_PTEST_REPORT_DIR ?= "/tmp/ptest-xml/*.xml:reports-xml"
+# The following shell heredoc is a placeholder for something more useful such as
+# my-gtest --gtest_output="xml:/tmp/ptest-xml/"
+mkdir -p /tmp/ptest-xml
+cat << xxxEOFxxx > /tmp/ptest-xml/ptest-example.xml
+<?xml version="1.0" encoding="UTF-8"?>
+<testsuites tests="1" failures="0" disabled="0" errors="0" time="0.010" timestamp="2020-09-20T10:44:23" name="AllTests">
+  <testsuite name="ConfigurationTest" tests="1" failures="0" disabled="0" errors="0" time="0.010" timestamp="2020-09-20T10:44:23">
+    <testcase name="readConfiguration" status="run" result="completed" time="0.010" timestamp="2020-09-20T10:44:23" classname="ConfigurationTest" />
+  </testsuite>
+</testsuites>
+xxxEOFxxx
+
+# The ptest-runner evaluates the exit value of a test case: 0 means pass, 1 means fail.
+# This minimal example passes always.
+exit 0
diff --git a/meta-skeleton/recipes-skeleton/ptest-example/ptest-example_0.1.bb b/meta-skeleton/recipes-skeleton/ptest-example/ptest-example_0.1.bb
new file mode 100644
index 0000000000..02bc9bcfa2
--- /dev/null
+++ b/meta-skeleton/recipes-skeleton/ptest-example/ptest-example_0.1.bb
@@ -0,0 +1,16 @@
+SUMMARY = "A very basic ptest enabled recipe"
+DESCRIPTION = "This recipe provides a minimalistic ptest package"
+SECTION = "examples"
+LICENSE = "MIT"
+LIC_FILES_CHKSUM = "file://${COREBASE}/meta/COPYING.MIT;md5=3da9cfbcb788c80a0384361b4de20420"
+
+# A recipe is "ptest-enabled" if it inherits the ptest class
+inherit ptest
+
+# Usually a ptest contains at least two items: the actual test,
+# and a shell script (run-ptest) that starts the test.
+# For this minimized example there is just the script.
+SRC_URI = "file://run-ptest"
+
+# This minimalistic example provides nothing more than a ptest package.
+ALLOW_EMPTY_${PN} = "1"
-- 
2.31.1


^ permalink raw reply	[flat|nested] 9+ messages in thread
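[Editor's note: the PASS/FAIL/SKIP convention described in the run-ptest comments is plain line-oriented text. A minimal, hypothetical Python sketch of a consumer parsing such output (not part of this patch; all names are illustrative):

```python
import re

# Hypothetical parser for the "RESULT: testname" convention used by
# run-ptest scripts: each result line starts with PASS, FAIL or SKIP,
# followed by a colon and a free-form test name.
RESULT_RE = re.compile(r'^(PASS|FAIL|SKIP):\s*(.*)$')

def parse_ptest_output(lines):
    """Map each reported test name to its PASS/FAIL/SKIP result."""
    results = {}
    for line in lines:
        m = RESULT_RE.match(line.strip())
        if m:
            results[m.group(2)] = m.group(1)
    return results

output = [
    "PASS:  dummy test passing always",
    "some unrelated stdout noise",
    "FAIL: another-test",
]
print(parse_ptest_output(output))
```

Non-matching lines are simply ignored, mirroring the comment above that ptest-runner itself does not evaluate stdout or stderr.]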

* [PATCH 2/3] testimage: support additional reports for ptests
  2021-06-03 17:21 [PATCH 1/3] meta-skeleton: add a ptest example Adrian Freihofer
@ 2021-06-03 17:21 ` Adrian Freihofer
  2021-06-03 17:43   ` [OE-core] " Alexander Kanavin
  2021-06-03 17:21 ` [PATCH 3/3] runtime_test.py: add new testimage ptest test case Adrian Freihofer
  2021-06-03 17:46 ` [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example Alexander Kanavin
  2 siblings, 1 reply; 9+ messages in thread
From: Adrian Freihofer @ 2021-06-03 17:21 UTC (permalink / raw)
  To: openembedded-core; +Cc: Adrian Freihofer

This adds a new optional feature to the ptest run-time test. Most unit
test frameworks, such as googletest, generate JUnit-like XML reports
that can be processed by e.g. GitLab CI or Jenkins.

Example: A run-ptest script executes a googletest based unit test:
  /usr/bin/my-unittest --gtest_output="xml:/tmp/ptest-xml/"

The new variable TESTIMAGE_PTEST_REPORT_DIR allows configuring
bitbake -c testimage to fetch the reports from the target device and
store them in a subfolder of TEST_LOG_DIR. It is possible to fetch
report files from different locations on the target device into
different subfolders on the host.
---
 meta/classes/testimage.bbclass       |  5 +++++
 meta/lib/oeqa/runtime/cases/ptest.py | 24 ++++++++++++++++++++++++
 2 files changed, 29 insertions(+)

diff --git a/meta/classes/testimage.bbclass b/meta/classes/testimage.bbclass
index 43de9d4d76..d01892136f 100644
--- a/meta/classes/testimage.bbclass
+++ b/meta/classes/testimage.bbclass
@@ -47,6 +47,11 @@ TESTIMAGE_AUTO ??= "0"
 # TESTIMAGE_BOOT_PATTERNS[search_login_succeeded] = "webserver@[a-zA-Z0-9\-]+:~#"
 # The accepted flags are the following: search_reached_prompt, send_login_user, search_login_succeeded, search_cmd_finished.
 # They are prefixed with either search/send, to differentiate if the pattern is meant to be sent or searched to/from the target terminal
+# TESTIMAGE_PTEST_REPORT_DIR might be used to fetch additional reports (e.g. JUnit-like XML files) generated by ptests from the target device.
+# A ;-separated list of remote_path:host_path entries is expected. The host_path is optional and defaults to "reports".
+# For example, if some ptests (such as ptest-example.bb) create additional reports in /tmp/ptest-xml/, the following line in the image recipe
+# configures the ptest imagetest to fetch the XML reports into an "xml-reports" subfolder of TEST_LOG_DIR:
+# TESTIMAGE_PTEST_REPORT_DIR = "/tmp/ptest-xml/*.xml:xml-reports"
 
 TEST_LOG_DIR ?= "${WORKDIR}/testimage"
 
diff --git a/meta/lib/oeqa/runtime/cases/ptest.py b/meta/lib/oeqa/runtime/cases/ptest.py
index 0800f3c27f..7b3560a4b0 100644
--- a/meta/lib/oeqa/runtime/cases/ptest.py
+++ b/meta/lib/oeqa/runtime/cases/ptest.py
@@ -110,3 +110,27 @@ class PtestRunnerTest(OERuntimeTestCase):
         if failmsg:
             self.logger.warning("There were failing ptests.")
             self.fail(failmsg)
+
+        # Fetch log files e.g. JUnit like xml files from the target device
+        ptest_report_dir = self.td.get('TESTIMAGE_PTEST_REPORT_DIR', '')
+        if ptest_report_dir:
+            for test_log_dir_ptest in ptest_report_dir.split(';'):
+                src_tgt = test_log_dir_ptest.split(':')
+                if len(src_tgt) == 1:
+                    tgt_dir_abs = os.path.join(ptest_log_dir, "reports")
+                elif len(src_tgt) == 2:
+                    tgt_dir_abs = os.path.join(ptest_log_dir, src_tgt[1])
+                else:
+                    self.logger.error("Invalid TESTIMAGE_PTEST_REPORT_DIR setting")
+                    continue
+                self.copy_logs(src_tgt[0], tgt_dir_abs)
+
+    def copy_logs(self, remoteSrc, localDst):
+        self.logger.debug("Fetching from target: %s to %s" % (remoteSrc, localDst))
+        if os.path.exists(localDst):
+            from shutil import rmtree
+            rmtree(localDst)
+        os.makedirs(localDst)
+        try:
+            self.target.copyFrom(remoteSrc, localDst)
+        except AssertionError:
+            pass
-- 
2.31.1


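[Editor's note: the TESTIMAGE_PTEST_REPORT_DIR format documented in this patch (a ;-separated list of remote_path:host_path entries, host_path defaulting to "reports") can be sketched in isolation. This is a simplified, hypothetical restatement of the parsing logic added to ptest.py, not the patch itself:

```python
def parse_report_dirs(value, default_subdir="reports"):
    """Split a TESTIMAGE_PTEST_REPORT_DIR-style value into
    (remote_glob, host_subdir) pairs.

    Entries are ';'-separated; within an entry an optional ':'
    separates the remote path from the host subfolder.
    """
    pairs = []
    for entry in value.split(';'):
        parts = entry.split(':')
        if len(parts) == 1:
            # No host subfolder given: fall back to the default.
            pairs.append((parts[0], default_subdir))
        elif len(parts) == 2:
            pairs.append((parts[0], parts[1]))
        else:
            raise ValueError("Invalid TESTIMAGE_PTEST_REPORT_DIR entry: %s" % entry)
    return pairs

print(parse_report_dirs("/tmp/ptest-xml/*.xml:xml-reports;/tmp/other/*.log"))
```

The second entry has no explicit host subfolder, so it is paired with "reports", matching the default described in the testimage.bbclass comment.]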

* [PATCH 3/3] runtime_test.py: add new testimage ptest test case
  2021-06-03 17:21 [PATCH 1/3] meta-skeleton: add a ptest example Adrian Freihofer
  2021-06-03 17:21 ` [PATCH 2/3] testimage: support additional reports for ptests Adrian Freihofer
@ 2021-06-03 17:21 ` Adrian Freihofer
  2021-06-03 17:46 ` [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example Alexander Kanavin
  2 siblings, 0 replies; 9+ messages in thread
From: Adrian Freihofer @ 2021-06-03 17:21 UTC (permalink / raw)
  To: openembedded-core; +Cc: Adrian Freihofer

Add a new selftest for the ptest imagetest. The new feature
TESTIMAGE_PTEST_REPORT_DIR is verified as well.
---
 meta/lib/oeqa/selftest/cases/runtime_test.py | 44 ++++++++++++++++++++
 1 file changed, 44 insertions(+)

diff --git a/meta/lib/oeqa/selftest/cases/runtime_test.py b/meta/lib/oeqa/selftest/cases/runtime_test.py
index 84c2cb77e8..2ef100f684 100644
--- a/meta/lib/oeqa/selftest/cases/runtime_test.py
+++ b/meta/lib/oeqa/selftest/cases/runtime_test.py
@@ -238,6 +238,50 @@ class TestImage(OESelftestTestCase):
         bitbake('core-image-minimal')
         bitbake('-c testimage core-image-minimal')
 
+    def test_testimage_ptest(self):
+        """
+        Summary: Verify ptest runtime test
+        Expected: 1. Verify the ptest imagetest executes the ptest-example-ptest.
+                  2. Verify the XML test report is downloaded to the ptest log directory.
+        Product: oe-core
+        Author: Adrian Freihofer <adrian.freihofer@siemens.com>
+        """
+        report_subdir = ""
+
+        features = 'DISTRO_FEATURES_append = " ptest"\n'
+        features += 'IMAGE_CLASSES += "testimage"\n'
+        features += 'IMAGE_INSTALL_append = " ptest-example-ptest"\n'
+        features += 'IMAGE_FEATURES_append = " ssh-server-dropbear"\n'
+        features += 'TESTIMAGE_PTEST_REPORT_DIR = "/tmp/ptest-xml/*.xml'
+        if report_subdir:
+            features += ':' + report_subdir
+        features += '"\n'
+        features += 'TEST_SUITES = "ping ssh ptest"\n'
+        self.write_config(features)
+
+        self.append_bblayers_config('BBLAYERS_append = " ${TOPDIR}/../meta-skeleton"')
+
+        bitbake('core-image-minimal')
+        bitbake('-c testimage core-image-minimal')
+
+        vars = get_bb_vars(("TEST_LOG_DIR", "WORKDIR", "TOPDIR", "TESTIMAGE_PTEST_REPORT_DIR"), "core-image-minimal")
+
+        test_log_dir = vars["TEST_LOG_DIR"]
+        if not test_log_dir:
+            test_log_dir = os.path.join(vars["WORKDIR"], 'testimage')
+        if not os.path.isabs(test_log_dir):
+            test_log_dir = os.path.join(vars["TOPDIR"], test_log_dir)
+        ptest_log_dir_link = os.path.join(test_log_dir, 'ptest_log')
+
+        ptest_runner_log = os.path.join(ptest_log_dir_link, 'ptest-runner.log')
+        self.assertTrue(os.path.isfile(ptest_runner_log), "%s is not available." % ptest_runner_log)
+
+        if not report_subdir:
+            report_subdir = 'reports'
+        ptest_example_log = os.path.join(ptest_log_dir_link, report_subdir, 'ptest-example.xml')
+        self.assertTrue(os.path.isfile(ptest_example_log), "%s is not available." % ptest_example_log)
+
+
 class Postinst(OESelftestTestCase):
 
     def init_manager_loop(self, init_manager):
-- 
2.31.1


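[Editor's note: the selftest above only asserts that the fetched report file exists. A stricter check could parse the JUnit-like XML and assert on its counters; a hypothetical sketch using only the Python standard library (the report literal mirrors the one written by the example run-ptest script):

```python
import xml.etree.ElementTree as ET

# A report shaped like the one the ptest-example run-ptest script writes.
REPORT = """<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="1" failures="0" errors="0" name="AllTests">
  <testsuite name="ConfigurationTest" tests="1" failures="0">
    <testcase name="readConfiguration" status="run" classname="ConfigurationTest" />
  </testsuite>
</testsuites>"""

def summarize(xml_text):
    """Return (tests, failures) from the <testsuites> root element."""
    # Encode first: ElementTree rejects str input that carries an
    # XML encoding declaration.
    root = ET.fromstring(xml_text.encode("utf-8"))
    return int(root.get("tests", "0")), int(root.get("failures", "0"))

print(summarize(REPORT))
```

Such a check would catch a truncated or failing report, not just a missing file.]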

* Re: [OE-core] [PATCH 2/3] testimage: support additional reports for ptests
  2021-06-03 17:21 ` [PATCH 2/3] testimage: support additional reports for ptests Adrian Freihofer
@ 2021-06-03 17:43   ` Alexander Kanavin
  0 siblings, 0 replies; 9+ messages in thread
From: Alexander Kanavin @ 2021-06-03 17:43 UTC (permalink / raw)
  To: Adrian Freihofer; +Cc: OE-core, Adrian Freihofer


Why does this need to be specifically in ptest.py? It only copies some
files over from the target; you can write a separate runtime testcase that
does the same.

Alex




* Re: [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example
  2021-06-03 17:21 [PATCH 1/3] meta-skeleton: add a ptest example Adrian Freihofer
  2021-06-03 17:21 ` [PATCH 2/3] testimage: support additional reports for ptests Adrian Freihofer
  2021-06-03 17:21 ` [PATCH 3/3] runtime_test.py: add new testimage ptest test case Adrian Freihofer
@ 2021-06-03 17:46 ` Alexander Kanavin
  2021-06-03 19:53   ` Adrian Freihofer
  2 siblings, 1 reply; 9+ messages in thread
From: Alexander Kanavin @ 2021-06-03 17:46 UTC (permalink / raw)
  To: Adrian Freihofer; +Cc: OE-core, Adrian Freihofer


Rather than construct an artificial example, is there an actual existing
ptest in oe-core that can be extended to do this junit xml stuff?

Alex




* Re: [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example
  2021-06-03 17:46 ` [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example Alexander Kanavin
@ 2021-06-03 19:53   ` Adrian Freihofer
  2021-06-03 20:00     ` Alexander Kanavin
  0 siblings, 1 reply; 9+ messages in thread
From: Adrian Freihofer @ 2021-06-03 19:53 UTC (permalink / raw)
  To: Alexander Kanavin; +Cc: OE-core

Hi Alex

On Thu, 2021-06-03 at 19:46 +0200, Alexander Kanavin wrote:
> Rather than construct an artificial example, is there an actual
> existing ptest in oe-core that can be extended to do this junit xml
> stuff?
So far there is no test which creates XML reports, and I don't know if
oe-core would be interested in something like that.

In general, a minimalistic ptest example could be quite useful. I get
asked about it from time to time. However, I see your point that the
example is somewhat artificial. What would you think if I replaced the
XML with a simple "example report" instead?

Regards,
Adrian




* Re: [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example
  2021-06-03 19:53   ` Adrian Freihofer
@ 2021-06-03 20:00     ` Alexander Kanavin
  2021-06-05  8:38       ` Adrian Freihofer
  0 siblings, 1 reply; 9+ messages in thread
From: Alexander Kanavin @ 2021-06-03 20:00 UTC (permalink / raw)
  To: Adrian Freihofer; +Cc: OE-core


On Thu, 3 Jun 2021 at 21:53, Adrian Freihofer <adrian.freihofer@gmail.com>
wrote:

> On Thu, 2021-06-03 at 19:46 +0200, Alexander Kanavin wrote:
> > Rather than construct an artificial example, is there an actual
> > existing ptest in oe-core that can be extended to do this junit xml
> > stuff?
> So far there is no test which creates xml reports and I don't know if
> oe-core would have interest in something like that.
>
> In general, a minimalist ptest example could be quite useful. I get
> asked about it from time to time. However, I see your point that the
> example is somewhat artificial. What do you mean if I replace the xml
> with a simple "example report" for example?
>

What I'm getting at is that I'm not sure if this should be in oe-core if
nothing in oe-core or other public layers actually benefits from this.

A much stronger case would be to enable these reports in some existing
recipe, then extend the ptest infrastructure on the yocto autobuilder to
make good use of them.

Alex



* Re: [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example
  2021-06-03 20:00     ` Alexander Kanavin
@ 2021-06-05  8:38       ` Adrian Freihofer
  2021-06-05 19:47         ` Alexander Kanavin
  0 siblings, 1 reply; 9+ messages in thread
From: Adrian Freihofer @ 2021-06-05  8:38 UTC (permalink / raw)
  To: Alexander Kanavin; +Cc: OE-core

On Thu, 2021-06-03 at 22:00 +0200, Alexander Kanavin wrote:
> On Thu, 3 Jun 2021 at 21:53, Adrian Freihofer <
> adrian.freihofer@gmail.com> wrote:
> > On Thu, 2021-06-03 at 19:46 +0200, Alexander Kanavin wrote:
> > > Rather than construct an artificial example, is there an actual
> > > existing ptest in oe-core that can be extended to do this junit xml
> > > stuff?
> > So far there is no test which creates xml reports and I don't know if
> > oe-core would have interest in something like that.
> > 
> > In general, a minimalist ptest example could be quite useful. I get
> > asked about it from time to time. However, I see your point that the
> > example is somewhat artificial. What do you mean if I replace the xml
> > with a simple "example report" for example?
> > 
> 
> 
> What I'm getting at is that I'm not sure if this should be in oe-core
> if nothing in oe-core or other public layers actually benefits from
> this.
> 
> A much stronger case would be to enable these reports in some existing
> recipe, then extend the ptest infrastructure on the yocto autobuilder
> to make good use of them.

Right, my goal here is primarily to support application development.
But we should not consider application development a "weak case". It
is probably where companies using Yocto put most of their embedded
device development effort. Yocto should therefore provide the best
possible support.
I don't know if there are packages in Yocto core that could benefit
from using JUnit or similar reports. I also don't know how this would
then be integrated with Buildbot. However, in the industry I see mostly
test frameworks like googletest and auto-builders like Jenkins and
GitLab CI that support rendering such reports.

Regards,
Adrian




* Re: [OE-core] [PATCH 1/3] meta-skeleton: add a ptest example
  2021-06-05  8:38       ` Adrian Freihofer
@ 2021-06-05 19:47         ` Alexander Kanavin
  0 siblings, 0 replies; 9+ messages in thread
From: Alexander Kanavin @ 2021-06-05 19:47 UTC (permalink / raw)
  To: Adrian Freihofer; +Cc: OE-core


On Sat, 5 Jun 2021 at 10:38, Adrian Freihofer <adrian.freihofer@gmail.com>
wrote:

> > What I'm getting at is that I'm not sure if this should be in oe-core
> > if nothing in oe-core or other public layers actually benefits from
> > this.
> >
> > A much stronger case would be to enable these reports in some existing
> > recipe, then extend the ptest infrastructure on the yocto autobuilder
> > to make good use of them.
>
> Right, my goal here is primarily to support application development.
> But we should not consider application development as a "weak case". It
> is probably the area where most companies using Yocto put the most
> effort into embedded device development. Yocto should therefore provide
> the best possible support.
> I don't know if there are packages in Yocto core that could benefit
> from using JUnit or similar reports. I also don't know how this would
> then be integrated with Buildbot. However, in the industry I see mostly
> test frameworks like googletest and auto-builders like Jenkins and
> GitLab CI that support rendering such reports.
>

From that perspective, yes. I guess it should be RP's call to take it or
not, maybe we need a third opinion.

Alex


