* [Buildroot] [PATCH 0/4] pkg-download: check hashes before the download (branch yem/download-hash)
@ 2014-12-01 23:24 Yann E. MORIN
  2014-12-01 23:24 ` [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper Yann E. MORIN
                   ` (3 more replies)
  0 siblings, 4 replies; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-01 23:24 UTC (permalink / raw)
  To: buildroot

Hello All!

This series introduces a way to check hashes prior to doing a download.

This is required for when upstream silently updates its release tarballs
without renaming them, and the user is left with a stale locally cached
tarball that no longer matches the hashes we have for that package.

In so doing, this series:
  - moves the check for a cached file into the wrapper;
  - moves the post-download check for hashes into the wrapper;
  - adds a pre-download check for hashes in the wrapper.

Doing the pre-download checks in the Makefile, like the post-download
checks were done, made the Makefile a bit harder to read. On the other
hand, we have a download wrapper shell script, so it is easier to do
tricky stuff in there (shell syntax) than in the Makefile (make syntax
can become unreadable pretty fast).

This has a side effect of cleaning up the pkg-download.mk Makefile, too,
but that was not the goal.
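The overall behaviour the series converges on can be sketched as follows.
This is an illustrative, simplified sketch only (a single sha256 entry per
file, and a hypothetical fetch_cached function); the real logic lives in
support/download/wrapper and support/download/check-hash:

```shell
#!/bin/sh
# Simplified sketch of the pre-download hash check (illustrative only).
set -e

# fetch_cached FILE HASHFILE
# Reuse FILE if its sha256 matches the stored hash; otherwise remove it
# so that a fresh download gets attempted.
fetch_cached() {
    file="$1"; hfile="$2"
    if [ -e "${file}" ]; then
        known=$(cut -d ' ' -f 1 "${hfile}")
        got=$(sha256sum "${file}" | cut -d ' ' -f 1)
        if [ "${got}" = "${known}" ]; then
            echo "cached"
            return 0
        fi
        # Stale cached file: drop it and fall through to a download
        rm -f "${file}"
    fi
    echo "download"
}

# Demo: a matching cached file is reused, a stale one is re-fetched
d=$(mktemp -d)
printf 'payload' > "$d/pkg.tar"
sha256sum "$d/pkg.tar" > "$d/pkg.hash"
fetch_cached "$d/pkg.tar" "$d/pkg.hash"    # prints: cached
printf 'stale payload' > "$d/pkg.tar"
fetch_cached "$d/pkg.tar" "$d/pkg.hash"    # prints: download
rm -rf "$d"
```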

Regards,
Yann E. MORIN.


The following changes since commit 1af2db0f77cc98d57012f238afbceb203b9b739b:

  qt5base: add error handling to for loop (2014-12-01 23:33:14 +0100)

are available in the git repository at:

  git://git.busybox.net/~ymorin/git/buildroot yem/download-hash

for you to fetch changes up to e1cc728a60b6a250ea395ee429dba7f470b1f8a8:

  pkg-download: check hashes for locally cached files (2014-12-02 00:11:05 +0100)

----------------------------------------------------------------
Yann E. MORIN (4):
      pkg-download: check for already downloaded file in the download wrapper
      pkg-download: fix arguments to hash checking script
      pkg-download: verify the hashes from the download wrapper
      pkg-download: check hashes for locally cached files

 package/pkg-download.mk     | 36 +++++++++++-------------------------
 support/download/check-hash | 18 ++++++++++--------
 support/download/wrapper    | 22 +++++++++++++++++++++-
 3 files changed, 42 insertions(+), 34 deletions(-)

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 16+ messages in thread

* [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper
  2014-12-01 23:24 [Buildroot] [PATCH 0/4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
@ 2014-12-01 23:24 ` Yann E. MORIN
  2014-12-02  8:26   ` Thomas Petazzoni
  2014-12-01 23:24 ` [Buildroot] [PATCH 2/4] pkg-download: fix arguments to hash checking script Yann E. MORIN
                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-01 23:24 UTC (permalink / raw)
  To: buildroot

Instead of repeating the same test again and again in all our download
rules, just delegate the check for an already downloaded file to the
download wrapper.

This clears up the path for doing the hash checks on a cached file
before the download.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
---
 package/pkg-download.mk  | 8 --------
 support/download/wrapper | 5 +++++
 2 files changed, 5 insertions(+), 8 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index f3409bd..f6ceadb 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -94,7 +94,6 @@ endef
 # Messages for the type of clone used are provided to ease debugging in case of
 # problems
 define DOWNLOAD_GIT
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) support/download/wrapper git \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
 		$($(PKG)_SITE) \
@@ -114,7 +113,6 @@ endef
 
 
 define DOWNLOAD_BZR
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) support/download/wrapper bzr \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
 		$($(PKG)_SITE) \
@@ -131,7 +129,6 @@ define SHOW_EXTERNAL_DEPS_BZR
 endef
 
 define DOWNLOAD_CVS
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) support/download/wrapper cvs \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
 		$(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
@@ -150,7 +147,6 @@ define SHOW_EXTERNAL_DEPS_CVS
 endef
 
 define DOWNLOAD_SVN
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) support/download/wrapper svn \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
 		$($(PKG)_SITE) \
@@ -170,7 +166,6 @@ endef
 # Note that filepath is relative to the user's home directory, so you may want
 # to prepend the path with a slash: scp://[user@]host:/absolutepath
 define DOWNLOAD_SCP
-	test -e $(DL_DIR)/$(2) || \
 	$(EXTRA_ENV) support/download/wrapper scp \
 		$(DL_DIR)/$(2) \
 		'$(call stripurischeme,$(call qstrip,$(1)))' && \
@@ -187,7 +182,6 @@ endef
 
 
 define DOWNLOAD_HG
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) support/download/wrapper hg \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
 		$($(PKG)_SITE) \
@@ -207,7 +201,6 @@ endef
 
 
 define DOWNLOAD_WGET
-	test -e $(DL_DIR)/$(2) || \
 	$(EXTRA_ENV) support/download/wrapper wget \
 		$(DL_DIR)/$(2) \
 		'$(call qstrip,$(1))' && \
@@ -223,7 +216,6 @@ define SHOW_EXTERNAL_DEPS_WGET
 endef
 
 define DOWNLOAD_LOCALFILES
-	test -e $(DL_DIR)/$(2) || \
 	$(EXTRA_ENV) support/download/wrapper cp \
 		$(DL_DIR)/$(2) \
 		$(call stripurischeme,$(call qstrip,$(1))) && \
diff --git a/support/download/wrapper b/support/download/wrapper
index 320a37e..e3ab3a1 100755
--- a/support/download/wrapper
+++ b/support/download/wrapper
@@ -28,6 +28,11 @@ helper="${1}"
 output="${2}"
 shift 2
 
+# If the output file already exists, do not download it again
+if [ -e "${output}" ]; then
+    exit 0
+fi
+
 # tmpd is a temporary directory in which helpers may store intermediate
 # by-products of the download.
 # tmpf is the file in which the helpers should put the downloaded content.
-- 
1.9.1


* [Buildroot] [PATCH 2/4] pkg-download: fix arguments to hash checking script
  2014-12-01 23:24 [Buildroot] [PATCH 0/4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
  2014-12-01 23:24 ` [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper Yann E. MORIN
@ 2014-12-01 23:24 ` Yann E. MORIN
  2014-12-01 23:24 ` [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper Yann E. MORIN
  2014-12-01 23:24 ` [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files Yann E. MORIN
  3 siblings, 0 replies; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-01 23:24 UTC (permalink / raw)
  To: buildroot

The arguments are used correctly, but documented incorrectly.
Swap the comments to match the actual usage.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
---
 support/download/check-hash | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/support/download/check-hash b/support/download/check-hash
index 067e7a2..13e361a 100755
--- a/support/download/check-hash
+++ b/support/download/check-hash
@@ -3,8 +3,8 @@ set -e
 
 # Helper to check a file matches its known hash
 # Call it with:
-#   $1: the full path to the file to check
-#   $2: the path of the file containing all the the expected hashes
+#   $1: the path of the file containing all the the expected hashes
+#   $2: the full path to the file to check
 
 h_file="${1}"
 file="${2}"
-- 
1.9.1


* [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper
  2014-12-01 23:24 [Buildroot] [PATCH 0/4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
  2014-12-01 23:24 ` [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper Yann E. MORIN
  2014-12-01 23:24 ` [Buildroot] [PATCH 2/4] pkg-download: fix arguments to hash checking script Yann E. MORIN
@ 2014-12-01 23:24 ` Yann E. MORIN
  2014-12-02  8:29   ` Thomas Petazzoni
  2014-12-01 23:24 ` [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files Yann E. MORIN
  3 siblings, 1 reply; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-01 23:24 UTC (permalink / raw)
  To: buildroot

Instead of repeating the check in our download rules, delegate the check
of the hashes to the download wrapper.

This needs three different changes:

  - add a new argument to the download wrapper, that is the full path to
    the hash file; if the hash file does not exist, that does not change
    the current behaviour, as the existence of the hash file is done by
    the check-hash script;

  - add a third argument to the check-hash script, to be the basename of
    the file to check; this is required because we no longer check the
    final file with the finale filename, but anintermediate file with a
    temporary filename;

  - do the actual cal to the check-hash script form within the download
    wrapper.

This further paves the way to doing pre-download checks of the hashes
for the locally cached files.
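The need for the new basename argument can be sketched as follows: hash-file
entries are keyed by the final filename, while the file under test now sits
under a temporary name. This is an illustrative simplification (check_entry
is a hypothetical stand-in for support/download/check-hash):

```shell
#!/bin/sh
# Sketch of matching a hash-file entry against a temporarily-named file.
# Hash-file lines have the form: <type> <hash> <basename>

# check_entry HASHFILE FILE BASENAME
check_entry() {
    h_file="$1"; file="$2"; base="$3"
    while read t h f; do
        # Match on the final basename, not on ${file##*/}, because
        # FILE may be a temporary download (e.g. pkg.tar.gz.XXXXXX)
        [ "${f}" = "${base}" ] || continue
        if [ "$(${t}sum "${file}" | cut -d ' ' -f 1)" = "${h}" ]; then
            echo "OK"
            return 0
        fi
    done < "${h_file}"
    echo "MISS"
}

# Demo: the file is checked under its temporary name, against the
# entry recorded for its final name
d=$(mktemp -d)
printf 'data' > "$d/foo.tar.Ab12Cd"
printf 'sha256 %s foo.tar\n' \
    "$(sha256sum "$d/foo.tar.Ab12Cd" | cut -d ' ' -f 1)" > "$d/foo.hash"
check_entry "$d/foo.hash" "$d/foo.tar.Ab12Cd" foo.tar    # prints: OK
rm -rf "$d"
```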

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
---
 package/pkg-download.mk     | 28 +++++++++++-----------------
 support/download/check-hash | 14 ++++++++------
 support/download/wrapper    | 11 ++++++++++-
 3 files changed, 29 insertions(+), 24 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index f6ceadb..c10430f 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -58,17 +58,6 @@ domainseparator = $(if $(1),$(1),/)
 # github(user,package,version): returns site of GitHub repository
 github = https://github.com/$(1)/$(2)/archive/$(3)
 
-# Helper for checking a tarball's checksum
-# If the hash does not match, remove the incorrect file
-# $(1): the path to the file with the hashes
-# $(2): the full path to the file to check
-define VERIFY_HASH
-	if ! support/download/check-hash $(1) $(2) $(if $(QUIET),>/dev/null); then \
-		rm -f $(2); \
-		exit 1; \
-	fi
-endef
-
 ################################################################################
 # The DOWNLOAD_* helpers are in charge of getting a working copy
 # of the source repository for their corresponding SCM,
@@ -96,6 +85,7 @@ endef
 define DOWNLOAD_GIT
 	$(EXTRA_ENV) support/download/wrapper git \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
+		$(PKGDIR)/$($(PKG)_NAME).hash \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -115,6 +105,7 @@ endef
 define DOWNLOAD_BZR
 	$(EXTRA_ENV) support/download/wrapper bzr \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
+		$(PKGDIR)/$($(PKG)_NAME).hash \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -131,6 +122,7 @@ endef
 define DOWNLOAD_CVS
 	$(EXTRA_ENV) support/download/wrapper cvs \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
+		$(PKGDIR)/$($(PKG)_NAME).hash \
 		$(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_RAWNAME) \
@@ -149,6 +141,7 @@ endef
 define DOWNLOAD_SVN
 	$(EXTRA_ENV) support/download/wrapper svn \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
+		$(PKGDIR)/$($(PKG)_NAME).hash \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -168,8 +161,8 @@ endef
 define DOWNLOAD_SCP
 	$(EXTRA_ENV) support/download/wrapper scp \
 		$(DL_DIR)/$(2) \
-		'$(call stripurischeme,$(call qstrip,$(1)))' && \
-	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_NAME).hash,$(DL_DIR)/$(2))
+		$(PKGDIR)/$($(PKG)_NAME).hash \
+		'$(call stripurischeme,$(call qstrip,$(1)))'
 endef
 
 define SOURCE_CHECK_SCP
@@ -184,6 +177,7 @@ endef
 define DOWNLOAD_HG
 	$(EXTRA_ENV) support/download/wrapper hg \
 		$(DL_DIR)/$($(PKG)_SOURCE) \
+		$(PKGDIR)/$($(PKG)_NAME).hash \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -203,8 +197,8 @@ endef
 define DOWNLOAD_WGET
 	$(EXTRA_ENV) support/download/wrapper wget \
 		$(DL_DIR)/$(2) \
-		'$(call qstrip,$(1))' && \
-	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_NAME).hash,$(DL_DIR)/$(2))
+		$(PKGDIR)/$($(PKG)_NAME).hash \
+		'$(call qstrip,$(1))'
 endef
 
 define SOURCE_CHECK_WGET
@@ -218,8 +212,8 @@ endef
 define DOWNLOAD_LOCALFILES
 	$(EXTRA_ENV) support/download/wrapper cp \
 		$(DL_DIR)/$(2) \
-		$(call stripurischeme,$(call qstrip,$(1))) && \
-	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_NAME).hash,$(DL_DIR)/$(2))
+		$(PKGDIR)/$($(PKG)_NAME).hash \
+		$(call stripurischeme,$(call qstrip,$(1)))
 endef
 
 define SOURCE_CHECK_LOCALFILES
diff --git a/support/download/check-hash b/support/download/check-hash
index 13e361a..b41a87e 100755
--- a/support/download/check-hash
+++ b/support/download/check-hash
@@ -5,9 +5,11 @@ set -e
 # Call it with:
 #   $1: the path of the file containing all the the expected hashes
 #   $2: the full path to the file to check
+#   $3: the basename of the file to check
 
 h_file="${1}"
 file="${2}"
+base="${3}"
 
 # Does the hash-file exist?
 if [ ! -f "${h_file}" ]; then
@@ -30,7 +32,7 @@ check_one_hash() {
         sha224|sha256|sha384|sha512)    ;;
         *) # Unknown hash, exit with error
             printf "ERROR: unknown hash '%s' for '%s'\n"  \
-                   "${_h}" "${_file##*/}" >&2
+                   "${_h}" "${base}" >&2
             exit 1
             ;;
     esac
@@ -38,11 +40,11 @@ check_one_hash() {
     # Do the hashes match?
     _hash=$( ${_h}sum "${_file}" |cut -d ' ' -f 1 )
     if [ "${_hash}" = "${_known}" ]; then
-        printf "%s: OK (%s: %s)\n" "${_file##*/}" "${_h}" "${_hash}"
+        printf "%s: OK (%s: %s)\n" "${base}" "${_h}" "${_hash}"
         return 0
     fi
 
-    printf "ERROR: %s has wrong %s hash:\n" "${_file##*/}" "${_h}" >&2
+    printf "ERROR: %s has wrong %s hash:\n" "${base}" "${_h}" >&2
     printf "ERROR: expected: %s\n" "${_known}" >&2
     printf "ERROR: got     : %s\n" "${_hash}" >&2
     printf "ERROR: Incomplete download, or man-in-the-middle (MITM) attack\n" >&2
@@ -59,7 +61,7 @@ while read t h f; do
             continue
             ;;
         *)
-            if [ "${f}" = "${file##*/}" ]; then
+            if [ "${f}" = "${base}" ]; then
                 check_one_hash "${t}" "${h}" "${file}"
                 : $((nb_checks++))
             fi
@@ -69,9 +71,9 @@ done <"${h_file}"
 
 if [ ${nb_checks} -eq 0 ]; then
     if [ -n "${BR2_ENFORCE_CHECK_HASH}" ]; then
-        printf "ERROR: No hash found for %s\n" "${file}" >&2
+        printf "ERROR: No hash found for %s\n" "${base}" >&2
         exit 1
     else
-        printf "WARNING: No hash found for %s\n" "${file}" >&2
+        printf "WARNING: No hash found for %s\n" "${base}" >&2
     fi
 fi
diff --git a/support/download/wrapper b/support/download/wrapper
index e3ab3a1..2f0f016 100755
--- a/support/download/wrapper
+++ b/support/download/wrapper
@@ -8,6 +8,7 @@
 # Call it with:
 #   $1: name of the helper (eg. cvs, git, cp...)
 #   $2: full path to the file in which to save the download
+#   $3: full path to the hash file
 #   $*: additional arguments to the helper in $1
 # Environment:
 #   BUILD_DIR: the path to Buildroot's build dir
@@ -26,7 +27,8 @@ set -e
 
 helper="${1}"
 output="${2}"
-shift 2
+hfile="${3}"
+shift 3
 
 # If the output file already exists, do not download it again
 if [ -e "${output}" ]; then
@@ -60,6 +62,13 @@ fi
 # cd back to free the temp-dir, so we can remove it later
 cd "${OLDPWD}"
 
+# Check if the downloaded file is sane, and matches the stored hashes
+# for that file
+if ! support/download/check-hash "${hfile}" "${tmpf}" "${output##*/}"; then
+    rm -rf "${tmpd}"
+    exit 1
+fi
+
 # tmp_output is in the same directory as the final output, so we can
 # later move it atomically.
 tmp_output="$( mktemp "${output}.XXXXXX" )"
-- 
1.9.1


* [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files
  2014-12-01 23:24 [Buildroot] [PATCH 0/4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
                   ` (2 preceding siblings ...)
  2014-12-01 23:24 ` [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper Yann E. MORIN
@ 2014-12-01 23:24 ` Yann E. MORIN
  2014-12-02  8:31   ` Thomas Petazzoni
  3 siblings, 1 reply; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-01 23:24 UTC (permalink / raw)
  To: buildroot

In some cases, upstream just update their releases in-place, without
renaming them. When that package is updated in Buildroot, a new hash to
match the new upstream release is included in the corresponding .hash
file.

As a consequence, users who previously downloaded that package's tarball
with an older version of Buildroot, will get stuck with an old archive
for that package, and after updating their Buildroot copy, will be greeted
with a failed download, due to the local file not matching the new
hashes.

So, to avoid this situation, check the hashes prior to doing the
download. If the hashes match, consider the locally cached file genuine,
and do not download it. However, if the locally cached file does not
match the known hashes we have for it, it is promptly removed, and a
download is re-attempted.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
---
 support/download/wrapper | 10 ++++++++--
 1 file changed, 8 insertions(+), 2 deletions(-)

diff --git a/support/download/wrapper b/support/download/wrapper
index 2f0f016..161df03 100755
--- a/support/download/wrapper
+++ b/support/download/wrapper
@@ -30,9 +30,15 @@ output="${2}"
 hfile="${3}"
 shift 3
 
-# If the output file already exists, do not download it again
+# If the output file already exists, and its hashes are correct, do
+# not download it again. If it exists, but it does not match the
+# known hashes for it, remove the file and do the download.
 if [ -e "${output}" ]; then
-    exit 0
+    if support/download/check-hash "${hfile}" "${output}" "${output##*/}"; then
+        exit 0
+    fi
+    rm -f "${output}"
+    printf "Re-downloading '%s'...\n" "${output##*/}"
 fi
 
 # tmpd is a temporary directory in which helpers may store intermediate
-- 
1.9.1


* [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper
  2014-12-01 23:24 ` [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper Yann E. MORIN
@ 2014-12-02  8:26   ` Thomas Petazzoni
  2014-12-03 18:43     ` Yann E. MORIN
  0 siblings, 1 reply; 16+ messages in thread
From: Thomas Petazzoni @ 2014-12-02  8:26 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Tue,  2 Dec 2014 00:24:06 +0100, Yann E. MORIN wrote:
> Instead of repeating the same test again and again in all our download
> rules, just delegate the check for an already downloaded file to the
> download wrapper.
> 
> This clears up the path for doing the hash checks on a cached file
> before the download.
> 
> Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
> Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
> Cc: Peter Korsgaard <jacmet@uclibc.org>
> Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>

It has the consequence of doing another fork+exec of a shell script for
all files, even if they have already been downloaded. I don't know if
this is really measurable compared to the overall build time, though,
except maybe on some cases like "make source". So maybe this is OK.

Best regards,

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com


* [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper
  2014-12-01 23:24 ` [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper Yann E. MORIN
@ 2014-12-02  8:29   ` Thomas Petazzoni
  2014-12-03 18:45     ` Yann E. MORIN
  0 siblings, 1 reply; 16+ messages in thread
From: Thomas Petazzoni @ 2014-12-02  8:29 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Tue,  2 Dec 2014 00:24:08 +0100, Yann E. MORIN wrote:
> Instead of repeating the check in our download rules, delegate the check
> of the hashes to the download wrapper.
> 
> This needs three different changes:
> 
>   - add a new argument to the download wrapper, that is the full path to
>     the hash file; if the hash file does not exist, that does not change
>     the current behaviour, as the existence of the hash file is done by
>     the check-hash script;

It is really time to use getopt in the download wrapper scripts, as we
have discussed with Fabio recently regarding the fully silent build
thing.

>   - add a third argument to the check-hash script, to be the basename of
>     the file to check; this is required because we no longer check the
>     final file with the finale filename, but anintermediate file with a

finale -> final

anintermediate -> an intermediate

>     temporary filename;
> 
>   - do the actual cal to the check-hash script form within the download

cal?

form -> from.

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com


* [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files
  2014-12-01 23:24 ` [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files Yann E. MORIN
@ 2014-12-02  8:31   ` Thomas Petazzoni
  2014-12-02  9:27     ` Peter Korsgaard
  2014-12-03 18:51     ` Yann E. MORIN
  0 siblings, 2 replies; 16+ messages in thread
From: Thomas Petazzoni @ 2014-12-02  8:31 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Tue,  2 Dec 2014 00:24:09 +0100, Yann E. MORIN wrote:
> In some cases, upstream just update their releases in-place, without
> renaming them. When that package is updated in Buildroot, a new hash to
> match the new upstream release is included in the corresponding .hash
> file.
> 
> As a consequence, users who previously downloaded that package's tarball
> with an older version of Buildroot, will get stuck with an old archive
> for that package, and after updating their Buildroot copy, will be greeted
> with a failed download, due to the local file not matching the new
> hashes.
> 
> So, to avoid this situation, check the hashes prior to doing the
> download. If the hashes match, consider the locally cached file genuine,
> and do not download it. However, if the locally cached file does not
> match the known hashes we have for it, it is promptly removed, and a
> download is re-attempted.

So in essence, from now on, at each build, we are re-checking the
hashes, while previously they were checked only when the file was
downloaded. Not that great for build time, but well, again maybe the
time to check the hashes is negligible compared to the build time. And
we can assume that a big tarball, which takes a certain time to hash,
will also contain a lot of source code to build, so the time to
calculate the hash is proportional to the build time of the package. So
if you're ready to spend several minutes to build Qt, you're probably
ready to wait a few more seconds to calculate the hash of the Qt
tarball each time.

Best regards,

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com


* [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files
  2014-12-02  8:31   ` Thomas Petazzoni
@ 2014-12-02  9:27     ` Peter Korsgaard
  2014-12-02  9:30       ` Thomas Petazzoni
  2014-12-03 18:51     ` Yann E. MORIN
  1 sibling, 1 reply; 16+ messages in thread
From: Peter Korsgaard @ 2014-12-02  9:27 UTC (permalink / raw)
  To: buildroot

>>>>> "Thomas" == Thomas Petazzoni <thomas.petazzoni@free-electrons.com> writes:

Hi,

>> So, to avoid this situation, check the hashes prior to doing the
 >> download. If the hashes match, consider the locally cached file genuine,
 >> and do not download it. However, if the locally cached file does not
 >> match the known hashes we have for it, it is promptly removed, and a
 >> download is re-attempted.

 > So in essence, from now on, at each build, we are re-checking the
 > hashes, while previously they were checked only when the file was
 > downloaded. Not that great for build time, but well, again maybe the
 > time to check the hashes is negligible compared to the build time. And
 > we can assume that a big tarball, which takes a certain time to hash,
 > will also contain a lot of source code to build, so the time to
 > calculate the hash is proportional to the build time of the package. So
 > if you're ready to spend several minutes to build Qt, you're probably
 > ready to wait a few more seconds to calculate the hash of the Qt
 > tarball each time.

I doubt it adds any significant time. Probably the hash and extract
steps are largely I/O bound, so "pre-heating" the page cache by
calculating a hash on the tarball before extracting it shouldn't matter
much.

-- 
Bye, Peter Korsgaard


* [Buildroot] [PATCH 4/4] pkg-download: check hashes for locally cached files
  2014-12-02  9:27     ` Peter Korsgaard
@ 2014-12-02  9:30       ` Thomas Petazzoni
  0 siblings, 0 replies; 16+ messages in thread
From: Thomas Petazzoni @ 2014-12-02  9:30 UTC (permalink / raw)
  To: buildroot

Dear Peter Korsgaard,

On Tue, 02 Dec 2014 10:27:05 +0100, Peter Korsgaard wrote:

>  > So in essence, from now on, at each build, we are re-checking the
>  > hashes, while previously they were checked only when the file was
>  > downloaded. Not that great for build time, but well, again maybe the
>  > time to check the hashes is negligible compared to the build time. And
>  > we can assume that a big tarball, which takes a certain time to hash,
>  > will also contain a lot of source code to build, so the time to
>  > calculate the hash is proportional to the build time of the package. So
>  > if you're ready to spend several minutes to build Qt, you're probably
>  > ready to wait a few more seconds to calculate the hash of the Qt
>  > tarball each time.
> 
> I doubt it adds any significant time. Probably the hash and extract
> steps are largely I/O bound, so "pre-heating" the page cache by
> calculating a hash on the tarball before extracting it shouldn't matter
> much.

True. Anyway, my comment really wasn't rejecting the idea, just
pointing out the additional stuff we are doing.

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com


* [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper
  2014-12-02  8:26   ` Thomas Petazzoni
@ 2014-12-03 18:43     ` Yann E. MORIN
  0 siblings, 0 replies; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-03 18:43 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-02 09:26 +0100, Thomas Petazzoni spake thusly:
> On Tue,  2 Dec 2014 00:24:06 +0100, Yann E. MORIN wrote:
> > Instead of repeating the same test again and again in all our download
> > rules, just delegate the check for an already downloaded file to the
> > download wrapper.
> > 
> > This clears up the path for doing the hash checks on a cached file
> > before the download.
> > 
> > Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
> > Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
> > Cc: Peter Korsgaard <jacmet@uclibc.org>
> > Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
> 
> It has the consequence of doing another fork+exec of a shell script for
> all files, even if they have already been downloaded. I don't know if
> this is really measurable compared to the overall build time, though,
> except maybe on some cases like "make source". So maybe this is OK.

Well, if I remember correctly, make is smart enough to detect simple
commands and not spawn a shell for them, and just does the fork-exec
itself rather than calling system().

So, when counting the fork+exec you'd get:

  - simple command (program arg arg arg): make does a fork()+exec() to
    the script, the kernel sees the she-bang and the exec is a call to
    the shell to interpret the script

  - command lists (command1 || command2), make does a system(), which in
    turn does a fork()+exec() to the shell to interpret the command
    list.

So we're breaking even in that case. We even save a call to system(), so
we should be a tiny bit faster! ;-p

And anyway, a fork+exec is not that costly nowadays. On a cache-cold
system, I am afraid the I/Os will largely overshadow the fork+exec pair
anyway... ;-)

Regards,
Yann E. MORIN.

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'


* [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper
  2014-12-02  8:29   ` Thomas Petazzoni
@ 2014-12-03 18:45     ` Yann E. MORIN
  0 siblings, 0 replies; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-03 18:45 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-02 09:29 +0100, Thomas Petazzoni spake thusly:
> On Tue,  2 Dec 2014 00:24:08 +0100, Yann E. MORIN wrote:
> > Instead of repeating the check in our download rules, delegate the check
> > of the hashes to the download wrapper.
> > 
> > This needs three different changes:
> > 
> >   - add a new argument to the download wrapper, that is the full path to
> >     the hash file; if the hash file does not exist, that does not change
> >     the current behaviour, as the existence of the hash file is done by
> >     the check-hash script;
> 
> It is really time to use getopt in the download wrapper scripts, as we
> have discussed with Fabio recently regarding the fully silent build
> thing.

Yeah, I was afraid you'd say that. :-)

But I'm mostly interested in feedback about the change itself, to check
hashes on locally cached files.

If that is OK, I'll rework the series.

> >   - add a third argument to the check-hash script, to be the basename of
> >     the file to check; this is required because we no longer check the
> >     final file with the finale filename, but anintermediate file with a
> 
> finale -> final
> 
> anintermediate -> an intermediate

Yup x2.

> >     temporary filename;
> > 
> >   - do the actual cal to the check-hash script form within the download
> 
> cal?

call

> form -> from.

Yup, thanks!

Regards,
Yann E. MORIN.




* [Buildroot] [PATCH 4/4] pkg-download: check hasahes for locally cached files
  2014-12-02  8:31   ` Thomas Petazzoni
  2014-12-02  9:27     ` Peter Korsgaard
@ 2014-12-03 18:51     ` Yann E. MORIN
  2014-12-06 12:44       ` Yann E. MORIN
  1 sibling, 1 reply; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-03 18:51 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-02 09:31 +0100, Thomas Petazzoni spake thusly:
> On Tue,  2 Dec 2014 00:24:09 +0100, Yann E. MORIN wrote:
> > In some cases, upstream just update their releases in-place, without
> > renaming them. When that package is updated in Buildroot, a new hash to
> > match the new upstream release is included in the corresponding .hash
> > file.
> > 
> > As a consequence, users who previously downloaded that package's tarball
> > with an older version of Buildroot, will get stuck with an old archive
> > for that package, and after updating their Buildroot copy, will be greeted
> > with a failed download, due to the local file not matching the new
> > hashes.
> > 
> > So, to avoid this situation, check the hashes prior to doing the
> > download. If the hashes match, consider the locally cached file genuine,
> > and do not download it. However, if the locally cached file does not
> > match the known hashes we have for it, it is promptly removed, and a
> > download is re-attempted.
> 
> So in essence, from now on, at each build, we are re-checking the
> hashes, while previously they were checked only when the file was
> downloaded. Not that great for build time, but well, again maybe the
> time to check the hashes is negligible compared to the build time. And
> we can assume that a big tarball, which takes a certain time to hash,
> will also contain a lot of source code to build, so the time to
> calculate the hash is proportional to the build time of the package. So
> if you're ready to spend several minutes to build Qt, you're probably
> ready to wait a few more seconds to calculate the hash of the Qt
> tarball each time.

I just hashed a 231MiB tarball (qt-everywhere-opensource-src-4.8.6.tar.gz)
and it takes (wall-clock time, core-i5 @2.5GHz):
  - cache-cold sha1  : 1.914s
  - cache-hot  sha1  : 0.762s
  - cache-hot  sha256: 1.270s

(cache-cold sha256 is not easily done, I'd have to either umount /home
or reboot...)
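
For reference, the cache-hot numbers can be reproduced with a small timing
helper (hash_time is just an illustrative name, not a Buildroot script; the
%N format needs GNU date, so this is Linux-specific):

```shell
# Time a checksum tool on a file, reporting milliseconds.
hash_time() {
    file="$1"
    tool="$2"
    start=$(date +%s%N)                  # nanoseconds since epoch (GNU date)
    "$tool" "$file" > /dev/null
    end=$(date +%s%N)
    echo "$tool: $(( (end - start) / 1000000 )) ms"
}

# Cache-hot runs (for cache-cold, drop the page cache first, as root:
#   sync && echo 3 > /proc/sys/vm/drop_caches):
# hash_time qt-everywhere-opensource-src-4.8.6.tar.gz sha1sum
# hash_time qt-everywhere-opensource-src-4.8.6.tar.gz sha256sum
```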

Needless to say this is negligible when compared to the build time of Qt
(I do not have the numbers available for now, but I doubt it would be
less than 30 seconds on my machine).

I will add those numbers in the commit log next time I respin.

Regards,
Yann E. MORIN.



* [Buildroot] [PATCH 4/4] pkg-download: check hasahes for locally cached files
  2014-12-03 18:51     ` Yann E. MORIN
@ 2014-12-06 12:44       ` Yann E. MORIN
  2014-12-06 12:53         ` Thomas Petazzoni
  0 siblings, 1 reply; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-06 12:44 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-03 19:51 +0100, Yann E. MORIN spake thusly:
> On 2014-12-02 09:31 +0100, Thomas Petazzoni spake thusly:
> > On Tue,  2 Dec 2014 00:24:09 +0100, Yann E. MORIN wrote:
> > > In some cases, upstream just update their releases in-place, without
> > > renaming them. When that package is updated in Buildroot, a new hash to
> > > match the new upstream release is included in the corresponding .hash
> > > file.
> > > 
> > > As a consequence, users who previously downloaded that package's tarball
> > > with an older version of Buildroot, will get stuck with an old archive
> > > for that package, and after updating their Buildroot copy, will be greeted
> > > with a failed download, due to the local file not matching the new
> > > hashes.
> > > 
> > > So, to avoid this situation, check the hashes prior to doing the
> > > download. If the hashes match, consider the locally cached file genuine,
> > > and do not download it. However, if the locally cached file does not
> > > match the known hashes we have for it, it is promptly removed, and a
> > > download is re-attempted.
> > 
> > So in essence, from now on, at each build, we are re-checking the
> > hashes, while previously they were checked only when the file was
> > downloaded. Not that great for build time, but well, again maybe the
> > time to check the hashes is negligible compared to the build time. And
> > we can assume that a big tarball, which takes a certain time to hash,
> > will also contain a lot of source code to build, so the time to
> > calculate the hash is proportional to the build time of the package. So
> > if you're ready to spend several minutes to build Qt, you're probably
> > ready to wait a few more seconds to calculate the hash of the Qt
> > tarball each time.
> 
> I just hashed a 231MiB tarball (qt-everywhere-opensource-src-4.8.6.tar.gz)
> and it takes (wall-clock time, core-i5 @2.5GHz):
>   - cache-cold sha1  : 1.914s
>   - cache-hot  sha1  : 0.762s
>   - cache-hot  sha256: 1.270s
> 
> (cache-cold sha256 is not easily done, I'd have to either umount /home
> or reboot...)
> 
> Needless to say this is negligible when compared to the build time of Qt
> (I do not have the numbers available for now, but I doubt it would be
> less than 30 seconds on my machine).
> 
> I will add those numbers in the commit log next time I respin.

Please also note that this in fact does not change the current
behaviour. Even today, we are checking the hashes for locally cached
files, so this solution adds no overhead.

To understand why, here's a summary of what happens before and after
this series:

    before:                         after:

    check if file is cached         check if the file is cached
    if not cached,                  if cached,
        download the file               check the hashes
    check the hashes                    if match,
                                            stop
                                        rm cached file
                                    download file
                                    check hashes

So, as you can see, in case the file is already cached locally, and the
hashes match, we don't do much more than today, except the checks are
not done in the same order, which allows us to attempt a re-download in
case of hash mismatch.
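
The "after" column can be sketched in shell; this is an illustrative sketch
only, not the actual Buildroot wrapper: check_hashes, download_if_needed and
the ".sha256" file convention are hypothetical stand-ins for the real
check-hash script and its interface:

```shell
# Illustrative sketch of the "after" flow; names and the ".sha256"
# convention are hypothetical, not Buildroot's real scripts.
check_hashes() {
    # Succeed only if the file matches its recorded hash.
    sha256sum --status -c "${1}.sha256" 2>/dev/null
}

download_if_needed() {
    file="$1"; url="$2"
    if [ -e "${file}" ]; then
        # Locally cached: keep it only if the hashes match.
        check_hashes "${file}" && return 0
        # Stale cache (e.g. upstream re-rolled the tarball): drop it.
        rm -f "${file}"
    fi
    wget -O "${file}" "${url}" && check_hashes "${file}"
}
```

In the common case (cached file, matching hashes) this performs exactly one
hash computation, the same cost as the post-download check did before.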

Regards,
Yann E. MORIN.


^ permalink raw reply	[flat|nested] 16+ messages in thread

* [Buildroot] [PATCH 4/4] pkg-download: check hasahes for locally cached files
  2014-12-06 12:44       ` Yann E. MORIN
@ 2014-12-06 12:53         ` Thomas Petazzoni
  2014-12-07 10:47           ` Yann E. MORIN
  0 siblings, 1 reply; 16+ messages in thread
From: Thomas Petazzoni @ 2014-12-06 12:53 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Sat, 6 Dec 2014 13:44:41 +0100, Yann E. MORIN wrote:

> Please also note that this in fact does not change the current
> behaviour. Even today, we are checking the hashes for locally cached
> files, so this solution adds no overhead.
> 
> To understand why, here's a summary of what happened before and after
> this series:
> 
>     before:                         after:
> 
>     check if file is cached         check if the file is cached
>     if not cached,                  if cached,
>         download the file               check the hashes
>     check the hashes                    if match,
>                                             stop
>                                         rm cached file
>                                     download file
>                                     check hashes
> 
> So, as you can see, in case the file is already cached locally, and the
> hashes match, we don't do much more than today, except the checks are
> not done in the same order, which allows us to attempt a re-download in
> case of hash mismatch.

define DOWNLOAD_WGET
        test -e $(DL_DIR)/$(2) || \
        $(EXTRA_ENV) support/download/wrapper wget \
                $(DL_DIR)/$(2) \
                '$(call qstrip,$(1))' && \
        $(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_NAME).hash,$(DL_DIR)/$(2))
endef


Hum, the construct 

	test -e <foo> || download && check-hash

is not very easy to understand in terms of priorities, but indeed,
regardless of whether the download is executed or not, the check-hash
part is executed.

Thanks for the explanation!

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com


* [Buildroot] [PATCH 4/4] pkg-download: check hasahes for locally cached files
  2014-12-06 12:53         ` Thomas Petazzoni
@ 2014-12-07 10:47           ` Yann E. MORIN
  0 siblings, 0 replies; 16+ messages in thread
From: Yann E. MORIN @ 2014-12-07 10:47 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-06 13:53 +0100, Thomas Petazzoni spake thusly:
[--SNIP--]
> define DOWNLOAD_WGET
>         test -e $(DL_DIR)/$(2) || \
>         $(EXTRA_ENV) support/download/wrapper wget \
>                 $(DL_DIR)/$(2) \
>                 '$(call qstrip,$(1))' && \
>         $(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_NAME).hash,$(DL_DIR)/$(2))
> endef
> 
> 
> Hum, the construct 
> 
> 	test -e <foo> || download && check-hash

Well, in shell, the || and && operators have the same priority. In fact,
POSIX does not state that either one takes precedence over the other:
AND-OR lists are simply evaluated left to right. The POSIX phrasing
says nothing about relative precedence:

    AND Lists

    The control operator "&&" denotes an AND list. The format shall be:

        command1 [ && command2] ...

    First command1 shall be executed. If its exit status is zero, command2
    shall be executed, and so on, until a command has a non-zero exit status
    or there are no more commands left to execute. The commands are expanded
    only if they are executed.

    Exit Status

    The exit status of an AND list shall be the exit status of the last
    command that is executed in the list.

    OR Lists

    The control operator "||" denotes an OR List. The format shall be:

        command1 [ || command2] ...

    First, command1 shall be executed. If its exit status is non-zero,
    command2 shall be executed, and so on, until a command has a zero exit
    status or there are no more commands left to execute.

    Exit Status

    The exit status of an OR list shall be the exit status of the last
    command that is executed in the list.

So, if one wants a specific priority to be applied, one should use a
compound command, either {compound-list;} or (compound-list).

Yes, this sucks.
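
A quick demonstration of the equal precedence (plain POSIX shell, nothing
Buildroot-specific):

```shell
# && and || have equal precedence and associate left to right, so
# "a || b && c" parses as "(a || b) && c", not "a || (b && c)".

true  || echo "download" && echo "check hash"   # prints only "check hash"
false || echo "download" && echo "check hash"   # prints both lines

# With an explicit compound command, the last part is skipped
# when the first command succeeds:
true  || { echo "download" && echo "check hash"; }   # prints nothing
```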

Moving that to the download wrapper makes the Makefile code shrink
considerably and makes it easier to read, since we get rid of these
AND-OR lists, which are non-trivial to read.

(Note: I too was very surprised that || and && have the same priority,
and was bitten by it more often than not in the past; still today I
make this mistake quite often.)

Regards,
Yann E. MORIN.



end of thread, other threads:[~2014-12-07 10:47 UTC | newest]

Thread overview: 16+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2014-12-01 23:24 [Buildroot] [PATCH 0/4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
2014-12-01 23:24 ` [Buildroot] [PATCH 1/4] pkg-download: check for already downloaded file in the download wrapper Yann E. MORIN
2014-12-02  8:26   ` Thomas Petazzoni
2014-12-03 18:43     ` Yann E. MORIN
2014-12-01 23:24 ` [Buildroot] [PATCH 2/4] pkg-download: fix arguments to hash checking script Yann E. MORIN
2014-12-01 23:24 ` [Buildroot] [PATCH 3/4] pkg-download: verify the hashes from the download wrapper Yann E. MORIN
2014-12-02  8:29   ` Thomas Petazzoni
2014-12-03 18:45     ` Yann E. MORIN
2014-12-01 23:24 ` [Buildroot] [PATCH 4/4] pkg-download: check hasahes for locally cached files Yann E. MORIN
2014-12-02  8:31   ` Thomas Petazzoni
2014-12-02  9:27     ` Peter Korsgaard
2014-12-02  9:30       ` Thomas Petazzoni
2014-12-03 18:51     ` Yann E. MORIN
2014-12-06 12:44       ` Yann E. MORIN
2014-12-06 12:53         ` Thomas Petazzoni
2014-12-07 10:47           ` Yann E. MORIN
