* [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash)
@ 2014-12-11 18:24 Yann E. MORIN
  2014-12-11 18:24 ` [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper Yann E. MORIN
                   ` (4 more replies)
  0 siblings, 5 replies; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 18:24 UTC (permalink / raw)
  To: buildroot

Hello All!

This series introduces a way to check hashes prior to doing a download.

This is required when an upstream silently updates its release tarballs
without renaming them, and the user is left with a stale locally cached
tarball that no longer matches the hashes we have for that package.

In so doing, this series:
  - moves the check for a cached file into the wrapper;
  - moves the post-download check for hashes into the wrapper;
  - adds a pre-download check for hashes in the wrapper.

Doing the pre-download checks in the Makefile, like the post-download
checks were done, made the Makefile a bit harder to read. On the other
hand, we have a download wrapper shell script, so it is easier to do
tricky stuff in there (shell syntax) than in the Makefile (make syntax
can become unreadable pretty fast).

This has a side effect of cleaning up the pkg-download.mk Makefile, too,
but that was not the goal.
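
For illustration, the wrapper's behaviour at the end of the series boils
down to something like this (simplified sketch only; error handling and
the extra temporary used for the atomic rename are omitted, see the
patches for the actual code):

    # condensed view of support/download/dl-wrapper
    if [ -e "${output}" ]; then
        # pre-download check: is the locally cached file still good?
        support/download/check-hash "${hfile}" "${output}" "${output##*/}" && exit 0
        rm -f "${output}"      # stale or corrupted cache: force a re-download
    fi
    # run the actual backend (git, wget, cp...), downloading to a temp file
    "support/download/${backend}" "${tmpf}" "${@}"
    # post-download check, before the file is moved into BR2_DL_DIR
    support/download/check-hash "${hfile}" "${tmpf}" "${output##*/}"
    mv -f "${tmpf}" "${output}"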


Changes v3 -> v4:
  - remove patch about checking hashes for host-packages (already applied
    differently by Peter)
  - rebased on top of current master

Changes v2 -> v3:
  - fix checking hashes for host packages
  - fix left-over line continuations

Changes v1 -> v2:
  - add option parsing to the wrapper  (Thomas)
  - typos  (Thomas)
  - rename the wrapper to dl-wrapper so it looks better in the traces,
    and it is not confused with another wrapper


Regards,
Yann E. MORIN.


The following changes since commit 5afa3b55e8f7cf27fc4335183fd477871836ca68:

  ipset: bump to version 6.24 (2014-12-11 09:59:16 +0100)

are available in the git repository at:

  git://git.busybox.net/~ymorin/git/buildroot yem/download-hash

for you to fetch changes up to 0301f2d3c762f8815c7c6dc25ed5ea8307275305:

  pkg-download: check hashes for locally cached files (2014-12-11 18:30:15 +0100)

----------------------------------------------------------------
Yann E. MORIN (4):
      suppot/download: add option parsing to the download wrapper
      pkg-download: check for already downloaded file in the download wrapper
      pkg-download: verify the hashes from the download wrapper
      pkg-download: check hashes for locally cached files

 package/pkg-download.mk     |  78 +++++++++---------
 support/download/check-hash |  14 ++--
 support/download/dl-wrapper | 191 ++++++++++++++++++++++++++++++++++++++++++++
 support/download/wrapper    |  99 -----------------------
 4 files changed, 236 insertions(+), 146 deletions(-)
 create mode 100755 support/download/dl-wrapper
 delete mode 100755 support/download/wrapper

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

* [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper
  2014-12-11 18:24 [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
@ 2014-12-11 18:24 ` Yann E. MORIN
  2014-12-11 20:37   ` Thomas Petazzoni
  2014-12-11 18:24 ` [Buildroot] [PATCH 2/4 v4] pkg-download: check for already downloaded file in " Yann E. MORIN
                   ` (3 subsequent siblings)
  4 siblings, 1 reply; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 18:24 UTC (permalink / raw)
  To: buildroot

Instead of relying on argument ordering, use actual options in the
download wrapper.

Download backends (bzr, cp, hg...) are left as-is, because it does not
make sense to complexifies them, since they are almost very trivial
shell scripts, and adding option parsing would be really overkill.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
---
 package/pkg-download.mk     |  42 ++++++-----
 support/download/dl-wrapper | 166 ++++++++++++++++++++++++++++++++++++++++++++
 support/download/wrapper    |  99 --------------------------
 3 files changed, 192 insertions(+), 115 deletions(-)
 create mode 100755 support/download/dl-wrapper
 delete mode 100755 support/download/wrapper

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index 190b5b7..8424eca 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -18,6 +18,8 @@ export SCP := $(call qstrip,$(BR2_SCP)) $(QUIET)
 SSH := $(call qstrip,$(BR2_SSH)) $(QUIET)
 export LOCALFILES := $(call qstrip,$(BR2_LOCALFILES))
 
+DL_WRAPPER = support/download/dl-wrapper
+
 # Default spider mode is 'DOWNLOAD'. Other possible values are 'SOURCE_CHECK'
 # used by the _source-check target and 'SHOW_EXTERNAL_DEPS', used by the
 # external-deps target.
@@ -95,8 +97,9 @@ endef
 # problems
 define DOWNLOAD_GIT
 	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
-	$(EXTRA_ENV) support/download/wrapper git \
-		$(DL_DIR)/$($(PKG)_SOURCE) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b git \
+		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -115,8 +118,9 @@ endef
 
 define DOWNLOAD_BZR
 	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
-	$(EXTRA_ENV) support/download/wrapper bzr \
-		$(DL_DIR)/$($(PKG)_SOURCE) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b bzr \
+		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -132,8 +136,9 @@ endef
 
 define DOWNLOAD_CVS
 	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
-	$(EXTRA_ENV) support/download/wrapper cvs \
-		$(DL_DIR)/$($(PKG)_SOURCE) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b cvs \
+		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-- \
 		$(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_RAWNAME) \
@@ -151,8 +156,9 @@ endef
 
 define DOWNLOAD_SVN
 	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
-	$(EXTRA_ENV) support/download/wrapper svn \
-		$(DL_DIR)/$($(PKG)_SOURCE) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b svn \
+		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -171,8 +177,9 @@ endef
 # to prepend the path with a slash: scp://[user@]host:/absolutepath
 define DOWNLOAD_SCP
 	test -e $(DL_DIR)/$(2) || \
-	$(EXTRA_ENV) support/download/wrapper scp \
-		$(DL_DIR)/$(2) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b scp \
+		-o $(DL_DIR)/$(2) \
+		-- \
 		'$(call stripurischeme,$(call qstrip,$(1)))' && \
 	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_RAWNAME).hash,$(DL_DIR)/$(2))
 endef
@@ -188,8 +195,9 @@ endef
 
 define DOWNLOAD_HG
 	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
-	$(EXTRA_ENV) support/download/wrapper hg \
-		$(DL_DIR)/$($(PKG)_SOURCE) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b hg \
+		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
 		$($(PKG)_BASE_NAME)
@@ -208,8 +216,9 @@ endef
 
 define DOWNLOAD_WGET
 	test -e $(DL_DIR)/$(2) || \
-	$(EXTRA_ENV) support/download/wrapper wget \
-		$(DL_DIR)/$(2) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b wget \
+		-o $(DL_DIR)/$(2) \
+		-- \
 		'$(call qstrip,$(1))' && \
 	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_RAWNAME).hash,$(DL_DIR)/$(2))
 endef
@@ -224,8 +233,9 @@ endef
 
 define DOWNLOAD_LOCALFILES
 	test -e $(DL_DIR)/$(2) || \
-	$(EXTRA_ENV) support/download/wrapper cp \
-		$(DL_DIR)/$(2) \
+	$(EXTRA_ENV) $(DL_WRAPPER) -b cp \
+		-o $(DL_DIR)/$(2) \
+		-- \
 		$(call stripurischeme,$(call qstrip,$(1))) && \
 	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_RAWNAME).hash,$(DL_DIR)/$(2))
 endef
diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
new file mode 100755
index 0000000..b1b9158
--- /dev/null
+++ b/support/download/dl-wrapper
@@ -0,0 +1,166 @@
+#!/usr/bin/env bash
+
+# This script is a wrapper to the other download backends.
+# Its role is to ensure atomicity when saving downloaded files
+# back to BR2_DL_DIR, and not clutter BR2_DL_DIR with partial,
+# failed downloads.
+#
+# Call it with:
+#   -b BACKEND  name of the backend (eg. cvs, git, cp...)
+#   -o FILE     full path to the file in which to save the download
+#   --          everything after '--' are options for the backend
+# Environment:
+#   BUILD_DIR: the path to Buildroot's build dir
+
+# To avoid cluttering BR2_DL_DIR, we download to a trashable
+# location, namely in $(BUILD_DIR).
+# Then, we move the downloaded file to a temporary file in the
+# same directory as the final output file.
+# This allows us to finally atomically rename it to its final
+# name.
+# If anything goes wrong, we just remove all the temporaries
+# created so far.
+
+# We want to catch any unexpected failure, and exit immediately.
+set -e
+
+main() {
+    local OPT OPTARG
+    local backend output
+
+    # Parse our options; anythong after '--' is for the backend
+    # Sop we only care to -b (backend) and -o (output file)
+    while getopts :hb:o: OPT; do
+        case "${OPT}" in
+        h)  help; exit 0;;
+        b)  backend="${OPTARG}";;
+        o)  output="${OPTARG}";;
+        :)  error "option '%s' expects a mandatory argument\n" "${OPTARG}";;
+        \?) error "unknown option '%s'\n" "${OPTARG}";;
+        esac
+    done
+    # Forget our options, and keep only those for the backend
+    shift $((OPTIND-1))
+
+    if [ -z "${backend}" ]; then
+        error "no backend specified, use -b\n"
+    fi
+    if [ -z "${output}" ]; then
+        error "no output specified, use -o\n"
+    fi
+
+    # tmpd is a temporary directory in which backends may store intermediate
+    # by-products of the download.
+    # tmpf is the file in which the backends should put the downloaded content.
+    # tmpd is located in $(BUILD_DIR), so as not to clutter the (precious)
+    # $(BR2_DL_DIR)
+    # We let the backends create tmpf, so they are able to set whatever
+    # permission bits they want (although we're only really interested in
+    # the executable bit.)
+    tmpd="$( mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX" )"
+    tmpf="${tmpd}/output"
+
+    # Helpers expect to run in a directory that is *really* trashable, so
+    # they are free to create whatever files and/or sub-dirs they might need.
+    # Doing the 'cd' here rather than in all backends is easier.
+    cd "${tmpd}"
+
+    # If the backend fails, we can just remove the temporary directory to
+    # remove all the cruft it may have left behind. Then we just exit in
+    # error too.
+    if ! "${OLDPWD}/support/download/${backend}" "${tmpf}" "${@}"; then
+        rm -rf "${tmpd}"
+        exit 1
+    fi
+
+    # cd back to free the temp-dir, so we can remove it later
+    cd "${OLDPWD}"
+
+    # tmp_output is in the same directory as the final output, so we can
+    # later move it atomically.
+    tmp_output="$( mktemp "${output}.XXXXXX" )"
+
+    # 'mktemp' creates files with 'go=-rwx', so the files are not accessible
+    # to users other than the one doing the download (and root, of course).
+    # This can be problematic when a shared BR2_DL_DIR is used by different
+    # users (e.g. on a build server), where all users may write to the shared
+    # location, since other users would not be allowed to read the files
+    # another user downloaded.
+    # So, we restore the 'go' access rights to a more sensible value, while
+    # still abiding by the current user's umask. We must do that before the
+    # final 'mv', so just do it now.
+    # Some backends (cp and scp) may create executable files, so we need to
+    # carry the executable bit if needed.
+    [ -x "${tmpf}" ] && new_mode=755 || new_mode=644
+    new_mode=$( printf "%04o" $((0${new_mode} & ~0$(umask))) )
+    chmod ${new_mode} "${tmp_output}"
+
+    # We must *not* unlink tmp_output, otherwise there is a small window
+    # during which another download process may create the same tmp_output
+    # name (very, very unlikely; but not impossible.)
+    # Using 'cp' is not reliable, since 'cp' may unlink the destination file
+    # if it is unable to open it with O_WRONLY|O_TRUNC; see:
+    #   http://pubs.opengroup.org/onlinepubs/9699919799/utilities/cp.html
+    # Since the destination filesystem can be anything, it might not support
+    # O_TRUNC, so 'cp' would unlink it first.
+    # Use 'cat' and append-redirection '>>' to save to the final location,
+    # since that is the only way we can be 100% sure of the behaviour.
+    if ! cat "${tmpf}" >>"${tmp_output}"; then
+        rm -rf "${tmpd}" "${tmp_output}"
+        exit 1
+    fi
+    rm -rf "${tmpd}"
+
+    # tmp_output and output are on the same filesystem, so POSIX guarantees
+    # that 'mv' is atomic, because it then uses rename() that POSIX mandates
+    # to be atomic, see:
+    #   http://pubs.opengroup.org/onlinepubs/9699919799/functions/rename.html
+    if ! mv -f "${tmp_output}" "${output}"; then
+        rm -f "${tmp_output}"
+        exit 1
+    fi
+}
+
+help() {
+    cat <<_EOF_
+NAME
+    ${my_name} - downlaod wrapper for Buildroot
+
+SYNOPSIS
+    ${my_name} [OPTION]... -- [BACKEND OPTION]...
+
+DESCRIPTION
+    Wrapper script around different download mechanisms. Ensure that
+    concurrent downloads do not conflict, that partial downloads are
+    properly evicted without leaving temporary files, and that access
+    rights are maintained.
+
+    -h  This help text.
+
+    -b BACKEND
+        Wrap the specified BACKEND. Known backends are:
+            bzr     Bazaar
+            cp      local files via simple copy
+            cvs     Concurrent Versions System
+            git     git
+            hg      Mercurial
+            scp     remote files via Secure copy
+            svn     Subversion
+            wget    remote files via HTTP download
+
+    -o FILE
+        Store the downloaded archive in FILE.
+
+  Exit status:
+    0   if OK
+    !0  in case of error
+_EOF_
+}
+
+trace()  { local msg="${1}"; shift; printf "%s: ${msg}" "${my_name}" "${@}"; }
+warn()   { trace "${@}" >&2; }
+errorN() { local ret="${1}"; shift; warn "${@}"; exit ${ret}; }
+error()  { errorN 1 "${@}"; }
+
+my_name="${0##*/}"
+main "${@}"
diff --git a/support/download/wrapper b/support/download/wrapper
deleted file mode 100755
index 320a37e..0000000
--- a/support/download/wrapper
+++ /dev/null
@@ -1,99 +0,0 @@
-#!/usr/bin/env bash
-
-# This script is a wrapper to the other download helpers.
-# Its role is to ensure atomicity when saving downloaded files
-# back to BR2_DL_DIR, and not clutter BR2_DL_DIR with partial,
-# failed downloads.
-#
-# Call it with:
-#   $1: name of the helper (eg. cvs, git, cp...)
-#   $2: full path to the file in which to save the download
-#   $*: additional arguments to the helper in $1
-# Environment:
-#   BUILD_DIR: the path to Buildroot's build dir
-
-# To avoid cluttering BR2_DL_DIR, we download to a trashable
-# location, namely in $(BUILD_DIR).
-# Then, we move the downloaded file to a temporary file in the
-# same directory as the final output file.
-# This allows us to finally atomically rename it to its final
-# name.
-# If anything goes wrong, we just remove all the temporaries
-# created so far.
-
-# We want to catch any unexpected failure, and exit immediately.
-set -e
-
-helper="${1}"
-output="${2}"
-shift 2
-
-# tmpd is a temporary directory in which helpers may store intermediate
-# by-products of the download.
-# tmpf is the file in which the helpers should put the downloaded content.
-# tmpd is located in $(BUILD_DIR), so as not to clutter the (precious)
-# $(BR2_DL_DIR)
-# We let the helpers create tmpf, so they are able to set whatever
-# permission bits they want (although we're only really interested in
-# the executable bit.)
-tmpd="$( mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX" )"
-tmpf="${tmpd}/output"
-
-# Helpers expect to run in a directory that is *really* trashable, so
-# they are free to create whatever files and/or sub-dirs they might need.
-# Doing the 'cd' here rather than in all helpers is easier.
-cd "${tmpd}"
-
-# If the helper fails, we can just remove the temporary directory to
-# remove all the cruft it may have left behind. Then we just exit in
-# error too.
-if ! "${OLDPWD}/support/download/${helper}" "${tmpf}" "${@}"; then
-    rm -rf "${tmpd}"
-    exit 1
-fi
-
-# cd back to free the temp-dir, so we can remove it later
-cd "${OLDPWD}"
-
-# tmp_output is in the same directory as the final output, so we can
-# later move it atomically.
-tmp_output="$( mktemp "${output}.XXXXXX" )"
-
-# 'mktemp' creates files with 'go=-rwx', so the files are not accessible
-# to users other than the one doing the download (and root, of course).
-# This can be problematic when a shared BR2_DL_DIR is used by different
-# users (e.g. on a build server), where all users may write to the shared
-# location, since other users would not be allowed to read the files
-# another user downloaded.
-# So, we restore the 'go' access rights to a more sensible value, while
-# still abiding by the current user's umask. We must do that before the
-# final 'mv', so just do it now.
-# Some helpers (cp and scp) may create executable files, so we need to
-# carry the executable bit if needed.
-[ -x "${tmpf}" ] && new_mode=755 || new_mode=644
-new_mode=$( printf "%04o" $((0${new_mode} & ~0$(umask))) )
-chmod ${new_mode} "${tmp_output}"
-
-# We must *not* unlink tmp_output, otherwise there is a small window
-# during which another download process may create the same tmp_output
-# name (very, very unlikely; but not impossible.)
-# Using 'cp' is not reliable, since 'cp' may unlink the destination file
-# if it is unable to open it with O_WRONLY|O_TRUNC; see:
-#   http://pubs.opengroup.org/onlinepubs/9699919799/utilities/cp.html
-# Since the destination filesystem can be anything, it might not support
-# O_TRUNC, so 'cp' would unlink it first.
-# Use 'cat' and append-redirection '>>' to save to the final location,
-# since that is the only way we can be 100% sure of the behaviour.
-if ! cat "${tmpf}" >>"${tmp_output}"; then
-    rm -rf "${tmpd}" "${tmp_output}"
-    exit 1
-fi
-rm -rf "${tmpd}"
-# tmp_output and output are on the same filesystem, so POSIX guarantees
-# that 'mv' is atomic, because it then uses rename() that POSIX mandates
-# to be atomic, see:
-#   http://pubs.opengroup.org/onlinepubs/9699919799/functions/rename.html
-if ! mv "${tmp_output}" "${output}"; then
-    rm -f "${tmp_output}"
-    exit 1
-fi
-- 
1.9.1

* [Buildroot] [PATCH 2/4 v4] pkg-download: check for already downloaded file in the download wrapper
  2014-12-11 18:24 [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
  2014-12-11 18:24 ` [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper Yann E. MORIN
@ 2014-12-11 18:24 ` Yann E. MORIN
  2014-12-11 20:38   ` Thomas Petazzoni
  2014-12-11 18:24 ` [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from " Yann E. MORIN
                   ` (2 subsequent siblings)
  4 siblings, 1 reply; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 18:24 UTC (permalink / raw)
  To: buildroot

Instead of repeating the same test again and again in all our download
rules, just delegate the check for an already downloaded file to the
download wrapper.

This clears up the path for doing the hash checks on a cached file
before the download.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
---
 package/pkg-download.mk     | 8 --------
 support/download/dl-wrapper | 5 +++++
 2 files changed, 5 insertions(+), 8 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index 8424eca..c021e92 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -96,7 +96,6 @@ endef
 # Messages for the type of clone used are provided to ease debugging in case of
 # problems
 define DOWNLOAD_GIT
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b git \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		-- \
@@ -117,7 +116,6 @@ endef
 
 
 define DOWNLOAD_BZR
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b bzr \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		-- \
@@ -135,7 +133,6 @@ define SHOW_EXTERNAL_DEPS_BZR
 endef
 
 define DOWNLOAD_CVS
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b cvs \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		-- \
@@ -155,7 +152,6 @@ define SHOW_EXTERNAL_DEPS_CVS
 endef
 
 define DOWNLOAD_SVN
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b svn \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		-- \
@@ -176,7 +172,6 @@ endef
 # Note that filepath is relative to the user's home directory, so you may want
 # to prepend the path with a slash: scp://[user@]host:/absolutepath
 define DOWNLOAD_SCP
-	test -e $(DL_DIR)/$(2) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b scp \
 		-o $(DL_DIR)/$(2) \
 		-- \
@@ -194,7 +189,6 @@ endef
 
 
 define DOWNLOAD_HG
-	test -e $(DL_DIR)/$($(PKG)_SOURCE) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b hg \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		-- \
@@ -215,7 +209,6 @@ endef
 
 
 define DOWNLOAD_WGET
-	test -e $(DL_DIR)/$(2) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b wget \
 		-o $(DL_DIR)/$(2) \
 		-- \
@@ -232,7 +225,6 @@ define SHOW_EXTERNAL_DEPS_WGET
 endef
 
 define DOWNLOAD_LOCALFILES
-	test -e $(DL_DIR)/$(2) || \
 	$(EXTRA_ENV) $(DL_WRAPPER) -b cp \
 		-o $(DL_DIR)/$(2) \
 		-- \
diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index b1b9158..8713424 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -49,6 +49,11 @@ main() {
         error "no output specified, use -o\n"
     fi
 
+    # If the output file already exists, do not download it again
+    if [ -e "${output}" ]; then
+        exit 0
+    fi
+
     # tmpd is a temporary directory in which backends may store intermediate
     # by-products of the download.
     # tmpf is the file in which the backends should put the downloaded content.
-- 
1.9.1

* [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from the download wrapper
  2014-12-11 18:24 [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
  2014-12-11 18:24 ` [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper Yann E. MORIN
  2014-12-11 18:24 ` [Buildroot] [PATCH 2/4 v4] pkg-download: check for already downloaded file in " Yann E. MORIN
@ 2014-12-11 18:24 ` Yann E. MORIN
  2014-12-11 20:42   ` Thomas Petazzoni
  2014-12-11 18:24 ` [Buildroot] [PATCH 4/4 v4] pkg-download: check hashes for locally cached files Yann E. MORIN
  2014-12-11 20:33 ` [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Thomas Petazzoni
  4 siblings, 1 reply; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 18:24 UTC (permalink / raw)
  To: buildroot

Instead of repeating the check in our download rules, delegate the check
of the hashes to the download wrapper.

This needs three different changes:

  - add a new argument to the download wrapper, that is the full path to
    the hash file; if the hash file does not exist, that does not change
    the current behaviour, as the existence of the hash file is checked
    for in the check-hash script;

  - add a third argument to the check-hash script, to be the basename of
    the file to check; this is required because we no longer check the
    final file with the final filename, but an intermediate file with a
    temporary filename;

  - do the actual call to the check-hash script from within the download
    wrapper.

This further paves the way to doing pre-download checks of the hashes
for the locally cached files.

Note: this patch removes the check for hashes for already downloaded
files, since the wrapper script exits early. The behaviour to check
locally cached files will be restored and enhanced in the following
patch.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
---
 package/pkg-download.mk     | 28 +++++++++++-----------------
 support/download/check-hash | 14 ++++++++------
 support/download/dl-wrapper | 22 +++++++++++++++++++---
 3 files changed, 38 insertions(+), 26 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index c021e92..ba72fc1 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -60,17 +60,6 @@ domainseparator = $(if $(1),$(1),/)
 # github(user,package,version): returns site of GitHub repository
 github = https://github.com/$(1)/$(2)/archive/$(3)
 
-# Helper for checking a tarball's checksum
-# If the hash does not match, remove the incorrect file
-# $(1): the path to the file with the hashes
-# $(2): the full path to the file to check
-define VERIFY_HASH
-	if ! support/download/check-hash $(1) $(2) $(if $(QUIET),>/dev/null); then \
-		rm -f $(2); \
-		exit 1; \
-	fi
-endef
-
 ################################################################################
 # The DOWNLOAD_* helpers are in charge of getting a working copy
 # of the source repository for their corresponding SCM,
@@ -98,6 +87,7 @@ endef
 define DOWNLOAD_GIT
 	$(EXTRA_ENV) $(DL_WRAPPER) -b git \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
@@ -118,6 +108,7 @@ endef
 define DOWNLOAD_BZR
 	$(EXTRA_ENV) $(DL_WRAPPER) -b bzr \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
@@ -135,6 +126,7 @@ endef
 define DOWNLOAD_CVS
 	$(EXTRA_ENV) $(DL_WRAPPER) -b cvs \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
 		$(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
 		$($(PKG)_DL_VERSION) \
@@ -154,6 +146,7 @@ endef
 define DOWNLOAD_SVN
 	$(EXTRA_ENV) $(DL_WRAPPER) -b svn \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
@@ -174,9 +167,9 @@ endef
 define DOWNLOAD_SCP
 	$(EXTRA_ENV) $(DL_WRAPPER) -b scp \
 		-o $(DL_DIR)/$(2) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
-		'$(call stripurischeme,$(call qstrip,$(1)))' && \
-	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_RAWNAME).hash,$(DL_DIR)/$(2))
+		'$(call stripurischeme,$(call qstrip,$(1)))'
 endef
 
 define SOURCE_CHECK_SCP
@@ -191,6 +184,7 @@ endef
 define DOWNLOAD_HG
 	$(EXTRA_ENV) $(DL_WRAPPER) -b hg \
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
 		$($(PKG)_SITE) \
 		$($(PKG)_DL_VERSION) \
@@ -211,9 +205,9 @@ endef
 define DOWNLOAD_WGET
 	$(EXTRA_ENV) $(DL_WRAPPER) -b wget \
 		-o $(DL_DIR)/$(2) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
-		'$(call qstrip,$(1))' && \
-	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_RAWNAME).hash,$(DL_DIR)/$(2))
+		'$(call qstrip,$(1))'
 endef
 
 define SOURCE_CHECK_WGET
@@ -227,9 +221,9 @@ endef
 define DOWNLOAD_LOCALFILES
 	$(EXTRA_ENV) $(DL_WRAPPER) -b cp \
 		-o $(DL_DIR)/$(2) \
+		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-- \
-		$(call stripurischeme,$(call qstrip,$(1))) && \
-	$(call VERIFY_HASH,$(PKGDIR)/$($(PKG)_RAWNAME).hash,$(DL_DIR)/$(2))
+		$(call stripurischeme,$(call qstrip,$(1)))
 endef
 
 define SOURCE_CHECK_LOCALFILES
diff --git a/support/download/check-hash b/support/download/check-hash
index 13e361a..b41a87e 100755
--- a/support/download/check-hash
+++ b/support/download/check-hash
@@ -5,9 +5,11 @@ set -e
 # Call it with:
 #   $1: the path of the file containing all the the expected hashes
 #   $2: the full path to the file to check
+#   $3: the basename of the file to check
 
 h_file="${1}"
 file="${2}"
+base="${3}"
 
 # Does the hash-file exist?
 if [ ! -f "${h_file}" ]; then
@@ -30,7 +32,7 @@ check_one_hash() {
         sha224|sha256|sha384|sha512)    ;;
         *) # Unknown hash, exit with error
             printf "ERROR: unknown hash '%s' for '%s'\n"  \
-                   "${_h}" "${_file##*/}" >&2
+                   "${_h}" "${base}" >&2
             exit 1
             ;;
     esac
@@ -38,11 +40,11 @@ check_one_hash() {
     # Do the hashes match?
     _hash=$( ${_h}sum "${_file}" |cut -d ' ' -f 1 )
     if [ "${_hash}" = "${_known}" ]; then
-        printf "%s: OK (%s: %s)\n" "${_file##*/}" "${_h}" "${_hash}"
+        printf "%s: OK (%s: %s)\n" "${base}" "${_h}" "${_hash}"
         return 0
     fi
 
-    printf "ERROR: %s has wrong %s hash:\n" "${_file##*/}" "${_h}" >&2
+    printf "ERROR: %s has wrong %s hash:\n" "${base}" "${_h}" >&2
     printf "ERROR: expected: %s\n" "${_known}" >&2
     printf "ERROR: got     : %s\n" "${_hash}" >&2
     printf "ERROR: Incomplete download, or man-in-the-middle (MITM) attack\n" >&2
@@ -59,7 +61,7 @@ while read t h f; do
             continue
             ;;
         *)
-            if [ "${f}" = "${file##*/}" ]; then
+            if [ "${f}" = "${base}" ]; then
                 check_one_hash "${t}" "${h}" "${file}"
                 : $((nb_checks++))
             fi
@@ -69,9 +71,9 @@ done <"${h_file}"
 
 if [ ${nb_checks} -eq 0 ]; then
     if [ -n "${BR2_ENFORCE_CHECK_HASH}" ]; then
-        printf "ERROR: No hash found for %s\n" "${file}" >&2
+        printf "ERROR: No hash found for %s\n" "${base}" >&2
         exit 1
     else
-        printf "WARNING: No hash found for %s\n" "${file}" >&2
+        printf "WARNING: No hash found for %s\n" "${base}" >&2
     fi
 fi
diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index 8713424..fdb49db 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -8,6 +8,7 @@
 # Call it with:
 #   -b BACKEND  name of the backend (eg. cvs, git, cp...)
 #   -o FILE     full path to the file in which to save the download
+#   -H FILE     full path to the hash file
 #   --          everything after '--' are options for the backend
 # Environment:
 #   BUILD_DIR: the path to Buildroot's build dir
@@ -26,15 +27,16 @@ set -e
 
 main() {
     local OPT OPTARG
-    local backend output
+    local backend output hfile
 
     # Parse our options; anythong after '--' is for the backend
     # Sop we only care to -b (backend) and -o (output file)
-    while getopts :hb:o: OPT; do
+    while getopts :hb:o:H: OPT; do
         case "${OPT}" in
         h)  help; exit 0;;
         b)  backend="${OPTARG}";;
         o)  output="${OPTARG}";;
+        H)  hfile="${OPTARG}";;
         :)  error "option '%s' expects a mandatory argument\n" "${OPTARG}";;
         \?) error "unknown option '%s'\n" "${OPTARG}";;
         esac
@@ -48,6 +50,9 @@ main() {
     if [ -z "${output}" ]; then
         error "no output specified, use -o\n"
     fi
+    if [ -z "${hfile}" ]; then
+        error "no hash-file specified, use -H\n"
+    fi
 
     # If the output file already exists, do not download it again
     if [ -e "${output}" ]; then
@@ -81,6 +86,13 @@ main() {
     # cd back to free the temp-dir, so we can remove it later
     cd "${OLDPWD}"
 
+    # Check if the downloaded file is sane, and matches the stored hashes
+    # for that file
+    if ! support/download/check-hash "${hfile}" "${tmpf}" "${output##*/}"; then
+        rm -rf "${tmpd}"
+        exit 1
+    fi
+
     # tmp_output is in the same directory as the final output, so we can
     # later move it atomically.
     tmp_output="$( mktemp "${output}.XXXXXX" )"
@@ -147,7 +159,7 @@ DESCRIPTION
             bzr     Bazaar
             cp      local files via simple copy
             cvs     Concurrent Versions System
-            git     git
+            git     Git
             hg      Mercurial
             scp     remote files via Secure copy
             svn     Subversion
@@ -156,6 +168,10 @@ DESCRIPTION
     -o FILE
         Store the downloaded archive in FILE.
 
+    -H FILE
+        Use FILE to read hashes from, and check them against the downloaded
+        archive.
+
   Exit status:
     0   if OK
     !0  in case of error
-- 
1.9.1

* [Buildroot] [PATCH 4/4 v4] pkg-download: check hashes for locally cached files
  2014-12-11 18:24 [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
                   ` (2 preceding siblings ...)
  2014-12-11 18:24 ` [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from " Yann E. MORIN
@ 2014-12-11 18:24 ` Yann E. MORIN
  2014-12-11 20:45   ` Thomas Petazzoni
  2014-12-11 20:33 ` [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Thomas Petazzoni
  4 siblings, 1 reply; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 18:24 UTC (permalink / raw)
  To: buildroot

In some cases, upstream just update their releases in-place, without
renaming them. When that package is updated in Buildroot, a new hash to
match the new upstream release is included in the corresponding .hash
file.

As a consequence, users who previously downloaded that package's tarball
with an older version of Buildroot will get stuck with an old archive
for that package, and after updating their Buildroot copy, will be greeted
with a failed download, due to the local file not matching the new
hashes.

So, to avoid this situation, check the hashes prior to doing the
download. If the hashes match, consider the locally cached file genuine,
and do not download it. However, if the locally cached file does not
match the known hashes we have for it, it is promptly removed, and a
download is re-attempted.

Note: this does not add any overhead compared to the previous situation,
because we were already checking hashes of localy cached files. It just
changes the order in which we do the checks. For the records, here is the
overhead of hashing a 231MiB file (qt-everywhere-opensource-src-4.8.6.tar.gz)
on a core-i5 @2.5GHz:

            cache-cold  cache-hot
    sha1      1.914s      0.762s
    sha256    2.109s      1.270s

But again, this overhead already existed before this patch.

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
Cc: Peter Korsgaard <jacmet@uclibc.org>
Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>
---
 support/download/dl-wrapper | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index fdb49db..d66715f 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -56,7 +56,11 @@ main() {
 
     # If the output file already exists, do not download it again
     if [ -e "${output}" ]; then
-        exit 0
+        if support/download/check-hash "${hfile}" "${output}" "${output##*/}"; then
+            exit 0
+        fi
+        rm -f "${output}"
+        printf "Re-downloading '%s'...\n" "${output##*/}"
     fi
 
     # tmpd is a temporary directory in which backends may store intermediate
-- 
1.9.1

* [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash)
  2014-12-11 18:24 [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
                   ` (3 preceding siblings ...)
  2014-12-11 18:24 ` [Buildroot] [PATCH 4/4 v4] pkg-download: check hashes for locally cached files Yann E. MORIN
@ 2014-12-11 20:33 ` Thomas Petazzoni
  2014-12-11 20:40   ` Yann E. MORIN
  4 siblings, 1 reply; 14+ messages in thread
From: Thomas Petazzoni @ 2014-12-11 20:33 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Thu, 11 Dec 2014 19:24:40 +0100, Yann E. MORIN wrote:

> This series introduces a way to check hashes prior to doing a download.
> 
> This is required when an upstream silently updates its release tarballs
> without renaming them, and the user is left with a stale locally cached
> tarball that no longer matches the hashes we have for that package.
> 
> In so doing, this series:
>   - moves the check for a cached file into the wrapper;
>   - moves the post-download check for hashes into the wrapper;
>   - adds a pre-download check for hashes in the wrapper.
> 
> Doing the pre-download checks in the Makefile, like the post-download
> checks were done, made the Makefile a bit harder to read. On the other
> hand, we have a download wrapper shell script, so it is easier to do
> tricky stuff in there (shell syntax) than in the Makefile (make syntax
> can become unreadable pretty fast).
> 
> This has a side effect of cleaning up the pkg-download.mk Makefile, too,
> but that was not the goal.

I did a quick test, and things seem to work as expected. There is
however one corner case that gives a fairly funky behavior: when the
tarball is corrupt in $(DL_DIR) *and* when the hash doesn't match the
file that is downloaded. To test this, I poisoned the busybox tarball
in my $(DL_DIR), and also modified busybox.hash to have a hash that
doesn't match (note that I changed only the SHA1 hash, not the MD5
one). And in this case, what happens is that:

 1. Aaah, the hash is not good, let's re-download.
 2. Download happens
 3. Aaah, the hash is still not good, let's re-download
 4. Download happens
 5. Aaaah, the hash is still not good. Let's give up now.

Clearly, downloading the tarball twice is not necessary here.
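
For reference, the poisoned busybox.hash looked roughly like this
(reconstructed from the hashes in the log below; only the sha1 value was
altered, by changing its first character):

    # md5 left correct, sha1 deliberately broken
    md5   337d1a15ab1cb1d4ed423168b1eb7d7e          busybox-1.22.1.tar.bz2
    sha1  e6e96fefb6f0fb8079f27468b9bf22d8dd96108e  busybox-1.22.1.tar.bz2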

Here is the log of this test:

ERROR: busybox-1.22.1.tar.bz2 has wrong md5 hash:
ERROR: expected: 337d1a15ab1cb1d4ed423168b1eb7d7e
ERROR: got     : 5ee6a6f8269d5b391a990306f664dd4c
ERROR: Incomplete download, or man-in-the-middle (MITM) attack
Re-downloading 'busybox-1.22.1.tar.bz2'...
--2014-12-11 20:35:17--  http://www.busybox.net/downloads/busybox-1.22.1.tar.bz2
Résolution de www.busybox.net (www.busybox.net)… 140.211.167.224
Connexion à www.busybox.net (www.busybox.net)|140.211.167.224|:80… connecté.
requête HTTP transmise, en attente de la réponse… 200 OK
Taille : 2218650 (2,1M) [application/x-bzip2]
Enregistre : «/home/thomas/projets/buildroot/output/build/.busybox-1.22.1.tar.bz2.NOqWFC/output»

100%[=======================================================================================================================================================================>] 2 218 650    475KB/s   ds 5,8s   

2014-12-11 20:35:23 (372 KB/s) - «/home/thomas/projets/buildroot/output/build/.busybox-1.22.1.tar.bz2.NOqWFC/output» enregistré [2218650/2218650]

busybox-1.22.1.tar.bz2: OK (md5: 337d1a15ab1cb1d4ed423168b1eb7d7e)
ERROR: busybox-1.22.1.tar.bz2 has wrong sha1 hash:
ERROR: expected: e6e96fefb6f0fb8079f27468b9bf22d8dd96108e
ERROR: got     : d6e96fefb6f0fb8079f27468b9bf22d8dd96108e
ERROR: Incomplete download, or man-in-the-middle (MITM) attack
--2014-12-11 20:35:23--  http://sources.buildroot.net/busybox-1.22.1.tar.bz2
Résolution de sources.buildroot.net (sources.buildroot.net)… 176.9.16.109
Connexion à sources.buildroot.net (sources.buildroot.net)|176.9.16.109|:80… connecté.
requête HTTP transmise, en attente de la réponse… 200 OK
Taille : 2218650 (2,1M) [application/x-bzip2]
Enregistre : «/home/thomas/projets/buildroot/output/build/.busybox-1.22.1.tar.bz2.MIVtVV/output»

100%[=======================================================================================================================================================================>] 2 218 650    386KB/s   ds 4,0s   

2014-12-11 20:35:27 (543 KB/s) - «/home/thomas/projets/buildroot/output/build/.busybox-1.22.1.tar.bz2.MIVtVV/output» enregistré [2218650/2218650]

busybox-1.22.1.tar.bz2: OK (md5: 337d1a15ab1cb1d4ed423168b1eb7d7e)
ERROR: busybox-1.22.1.tar.bz2 has wrong sha1 hash:
ERROR: expected: e6e96fefb6f0fb8079f27468b9bf22d8dd96108e
ERROR: got     : d6e96fefb6f0fb8079f27468b9bf22d8dd96108e
ERROR: Incomplete download, or man-in-the-middle (MITM) attack
package/pkg-generic.mk:73: recipe for target '/home/thomas/projets/buildroot/output/build/busybox-1.22.1/.stamp_downloaded' failed
make: *** [/home/thomas/projets/buildroot/output/build/busybox-1.22.1/.stamp_downloaded] Error 1

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com

* [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper
  2014-12-11 18:24 ` [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper Yann E. MORIN
@ 2014-12-11 20:37   ` Thomas Petazzoni
  2014-12-11 21:03     ` Yann E. MORIN
  2014-12-11 21:26     ` Yann E. MORIN
  0 siblings, 2 replies; 14+ messages in thread
From: Thomas Petazzoni @ 2014-12-11 20:37 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

Typo in title: suppot -> support

On Thu, 11 Dec 2014 19:24:45 +0100, Yann E. MORIN wrote:
> Instead of relying on argument ordering, use actual options in the
> download wrapper.
> 
> Download backends (bzr, cp, hg...) are left as-is, because it does not
> make sense to complexifies them, since they are almost very trivial

complexifies -> complexify.

> shell scripts, and adding option parsing would be really overkill.

You could have mentioned that the commit also renames the script. I
thought it was the case in earlier versions of this patch series.


> diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
> new file mode 100755
> index 0000000..b1b9158
> --- /dev/null
> +++ b/support/download/dl-wrapper
> @@ -0,0 +1,166 @@
> +#!/usr/bin/env bash
> +
> +# This script is a wrapper to the other download backends.
> +# Its role is to ensure atomicity when saving downloaded files
> +# back to BR2_DL_DIR, and not clutter BR2_DL_DIR with partial,
> +# failed downloads.
> +#
> +# Call it with:
> +#   -b BACKEND  name of the backend (eg. cvs, git, cp...)
> +#   -o FILE     full path to the file in which to save the download
> +#   --          everything after '--' are options for the backend
> +# Environment:
> +#   BUILD_DIR: the path to Buildroot's build dir

With this in place, do we still need the -h option and the fake man
page?

> +
> +# To avoid cluttering BR2_DL_DIR, we download to a trashable
> +# location, namely in $(BUILD_DIR).
> +# Then, we move the downloaded file to a temporary file in the
> +# same directory as the final output file.
> +# This allows us to finally atomically rename it to its final
> +# name.
> +# If anything goes wrong, we just remove all the temporaries
> +# created so far.
> +
> +# We want to catch any unexpected failure, and exit immediately.
> +set -e
> +
> +main() {
> +    local OPT OPTARG
> +    local backend output
> +
> +    # Parse our options; anythong after '--' is for the backend

anything

> +    # Sop we only care to -b (backend) and -o (output file)

Sop?

Maybe:

       # We only care about -b, -h and -o

> +    while getopts :hb:o: OPT; do
> +        case "${OPT}" in
> +        h)  help; exit 0;;
> +        b)  backend="${OPTARG}";;
> +        o)  output="${OPTARG}";;
> +        :)  error "option '%s' expects a mandatory argument\n" "${OPTARG}";;
> +        \?) error "unknown option '%s'\n" "${OPTARG}";;
> +        esac
> +    done
> +    # Forget our options, and keep only those for the backend
> +    shift $((OPTIND-1))
> +
> +    if [ -z "${backend}" ]; then
> +        error "no backend specified, use -b\n"
> +    fi
> +    if [ -z "${output}" ]; then
> +        error "no output specified, use -o\n"
> +    fi
> +
> +    # tmpd is a temporary directory in which backends may store intermediate
> +    # by-products of the download.
> +    # tmpf is the file in which the backends should put the downloaded content.
> +    # tmpd is located in $(BUILD_DIR), so as not to clutter the (precious)
> +    # $(BR2_DL_DIR)
> +    # We let the backends create tmpf, so they are able to set whatever
> +    # permission bits they want (although we're only really interested in
> +    # the executable bit.)
> +    tmpd="$( mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX" )"

No space after ( and before ).
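
i.e. something like:

    tmpd="$(mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX")"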

> +    tmpf="${tmpd}/output"
> +
> +    # Helpers expect to run in a directory that is *really* trashable, so
> +    # they are free to create whatever files and/or sub-dirs they might need.
> +    # Doing the 'cd' here rather than in all backends is easier.
> +    cd "${tmpd}"
> +
> +    # If the backend fails, we can just remove the temporary directory to
> +    # remove all the cruft it may have left behind. Then we just exit in
> +    # error too.
> +    if ! "${OLDPWD}/support/download/${backend}" "${tmpf}" "${@}"; then
> +        rm -rf "${tmpd}"

How is it possible to remove ${tmpd} here, since we are cd'ed into this
directory?

> +        exit 1
> +    fi
> +
> +    # cd back to free the temp-dir, so we can remove it later
> +    cd "${OLDPWD}"
> +
> +    # tmp_output is in the same directory as the final output, so we can
> +    # later move it atomically.
> +    tmp_output="$( mktemp "${output}.XXXXXX" )"

Space issue.

> +
> +    # 'mktemp' creates files with 'go=-rwx', so the files are not accessible
> +    # to users other than the one doing the download (and root, of course).
> +    # This can be problematic when a shared BR2_DL_DIR is used by different
> +    # users (e.g. on a build server), where all users may write to the shared
> +    # location, since other users would not be allowed to read the files
> +    # another user downloaded.
> +    # So, we restore the 'go' access rights to a more sensible value, while
> +    # still abiding by the current user's umask. We must do that before the
> +    # final 'mv', so just do it now.
> +    # Some backends (cp and scp) may create executable files, so we need to
> +    # carry the executable bit if needed.
> +    [ -x "${tmpf}" ] && new_mode=755 || new_mode=644
> +    new_mode=$( printf "%04o" $((0${new_mode} & ~0$(umask))) )

Ditto.

> +    chmod ${new_mode} "${tmp_output}"
> +
> +    # We must *not* unlink tmp_output, otherwise there is a small window
> +    # during which another download process may create the same tmp_output
> +    # name (very, very unlikely; but not impossible.)
> +    # Using 'cp' is not reliable, since 'cp' may unlink the destination file
> +    # if it is unable to open it with O_WRONLY|O_TRUNC; see:
> +    #   http://pubs.opengroup.org/onlinepubs/9699919799/utilities/cp.html
> +    # Since the destination filesystem can be anything, it might not support
> +    # O_TRUNC, so 'cp' would unlink it first.
> +    # Use 'cat' and append-redirection '>>' to save to the final location,
> +    # since that is the only way we can be 100% sure of the behaviour.
> +    if ! cat "${tmpf}" >>"${tmp_output}"; then
> +        rm -rf "${tmpd}" "${tmp_output}"
> +        exit 1
> +    fi
> +    rm -rf "${tmpd}"
> +
> +    # tmp_output and output are on the same filesystem, so POSIX guarantees
> +    # that 'mv' is atomic, because it then uses rename() that POSIX mandates
> +    # to be atomic, see:
> +    #   http://pubs.opengroup.org/onlinepubs/9699919799/functions/rename.html
> +    if ! mv -f "${tmp_output}" "${output}"; then
> +        rm -f "${tmp_output}"
> +        exit 1
> +    fi
> +}
> +
> +help() {
> +    cat <<_EOF_
> +NAME
> +    ${my_name} - downlaod wrapper for Buildroot

download

> +
> +SYNOPSIS
> +    ${my_name} [OPTION]... -- [BACKEND OPTION]...
> +
> +DESCRIPTION
> +    Wrapper script around different download mechanisms. Ensure that

Ensures

> +    concurrent downloads do not conflict, that partial downloads are
> +    properly evicted without leaving temporary files, and that access
> +    rights are maintained.
> +
> +    -h  This help text.
> +
> +    -b BACKEND
> +        Wrap the specified BACKEND. Known backends are:
> +            bzr     Bazaar
> +            cp      local files via simple copy
> +            cvs     Concurrent Versions System
> +            git     git
> +            hg      Mercurial
> +            scp     remote files via Secure copy
> +            svn     Subversion
> +            wget    remote files via HTTP download
> +
> +    -o FILE
> +        Store the downloaded archive in FILE.
> +
> +  Exit status:
> +    0   if OK
> +    !0  in case of error
> +_EOF_
> +}
> +
> +trace()  { local msg="${1}"; shift; printf "%s: ${msg}" "${my_name}" "${@}"; }
> +warn()   { trace "${@}" >&2; }
> +errorN() { local ret="${1}"; shift; warn "${@}"; exit ${ret}; }
> +error()  { errorN 1 "${@}"; }

Thanks,

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com

* [Buildroot] [PATCH 2/4 v4] pkg-download: check for already downloaded file in the download wrapper
  2014-12-11 18:24 ` [Buildroot] [PATCH 2/4 v4] pkg-download: check for already downloaded file in " Yann E. MORIN
@ 2014-12-11 20:38   ` Thomas Petazzoni
  0 siblings, 0 replies; 14+ messages in thread
From: Thomas Petazzoni @ 2014-12-11 20:38 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Thu, 11 Dec 2014 19:24:46 +0100, Yann E. MORIN wrote:
> Instead of repeating the same test again and again in all our download
> rules, just delegate the check for an already downloaded file to the
> download wrapper.
> 
> This clears up the path for doing the hash checks on a cached file
> before the download.
> 
> Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
> Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
> Cc: Peter Korsgaard <jacmet@uclibc.org>
> Cc: Gustavo Zacarias <gustavo@zacarias.com.ar>

Reviewed-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com

* [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash)
  2014-12-11 20:33 ` [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Thomas Petazzoni
@ 2014-12-11 20:40   ` Yann E. MORIN
  0 siblings, 0 replies; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 20:40 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-11 21:33 +0100, Thomas Petazzoni spake thusly:
> Dear Yann E. MORIN,
> 
> On Thu, 11 Dec 2014 19:24:40 +0100, Yann E. MORIN wrote:
> 
> > This series introduces a way to check hashes prior to doing a download.
> > 
> > This is required when an upstream silently updates its release tarballs
> > without renaming them, and the user is left with a stale locally cached
> > tarball that no longer matches the hashes we have for that package.
> > 
> > In so doing, this series:
> >   - moves the check for a cached file into the wrapper;
> >   - moves the post-download check for hashes into the wrapper;
> >   - adds a pre-download check for hashes in the wrapper.
> > 
> > Doing the pre-download checks in the Makefile, like the post-download
> > checks were done, made the Makefile a bit harder to read. On the other
> > hand, we have a download wrapper shell script, so it is easier to do
> > tricky stuff in there (shell syntax) than in the Makefile (make syntax
> > can become unreadable pretty fast).
> > 
> > This has a side effect of cleaning up the pkg-download.mk Makefile, too,
> > but that was not the goal.
> 
> I did a quick test, and things seem to work as expected. There is
> however one corner case that gives a fairly funky behavior: when the
> tarball is corrupt in $(DL_DIR) *and* when the hash doesn't match the
> file that is downloaded. To test this, I poisoned the busybox tarball
> in my $(DL_DIR), and also modified busybox.hash to have a hash that
> doesn't match (note that I changed only the SHA1 hash, not the MD5
> one). And in this case, what happens is that:
> 
>  1. Aaah, the hash is not good, let's re-download.
>  2. Download happens
>  3. Aaah, the hash is still not good, let's re-download
>  4. Download happens
>  5. Aaaah, the hash is still not good. Let's give up now.
> 
> Clearly, downloading the tarball twice is not necessary here.

Yes, this is expected. The first download is from upstream, the second
download is from the mirror: if the download from upstream fails, we
download it from the mirror, and we consider an incorrect hash to be
a failed download.

And for the record, that's the current behaviour without this patch.
Check out master, remove your local busybox tarball, tweak the hash, and
run make busybox-source: it should do the download twice, once from
upstream, and a second time from the mirror.
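
To illustrate (a rough sketch only, not the literal code from
pkg-download.mk, and with a hypothetical 'foo' package):

    dl() { # one attempt: download, then let the wrapper check the hashes
        support/download/dl-wrapper -b wget \
            -o "${DL_DIR}/foo-1.0.tar.gz" \
            -H package/foo/foo.hash \
            -- "${1}/foo-1.0.tar.gz"
    }
    dl "http://foo.example.com/releases" || \
    dl "${BR2_BACKUP_SITE}" || \
    exit 1   # a hash mismatch counts as a failed download, hence the mirror retry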

Regards,
Yann E. MORIN.

> Here is the log of this test:
> 
> ERROR: busybox-1.22.1.tar.bz2 has wrong md5 hash:
> ERROR: expected: 337d1a15ab1cb1d4ed423168b1eb7d7e
> ERROR: got     : 5ee6a6f8269d5b391a990306f664dd4c
> ERROR: Incomplete download, or man-in-the-middle (MITM) attack
> Re-downloading 'busybox-1.22.1.tar.bz2'...
> --2014-12-11 20:35:17--  http://www.busybox.net/downloads/busybox-1.22.1.tar.bz2

First attempt from upstream...

[--SNIP--]
> ERROR: Incomplete download, or man-in-the-middle (MITM) attack
> --2014-12-11 20:35:23--  http://sources.buildroot.net/busybox-1.22.1.tar.bz2

... second attempt from the mirror. ;-)

> ERROR: Incomplete download, or man-in-the-middle (MITM) attack

That one is normal, since you tweaked the hash. ;-)

Regards,
Yann E. MORIN.

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

* [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from the download wrapper
  2014-12-11 18:24 ` [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from " Yann E. MORIN
@ 2014-12-11 20:42   ` Thomas Petazzoni
  2014-12-11 21:12     ` Yann E. MORIN
  0 siblings, 1 reply; 14+ messages in thread
From: Thomas Petazzoni @ 2014-12-11 20:42 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Thu, 11 Dec 2014 19:24:47 +0100, Yann E. MORIN wrote:

> diff --git a/support/download/check-hash b/support/download/check-hash
> index 13e361a..b41a87e 100755
> --- a/support/download/check-hash
> +++ b/support/download/check-hash
> @@ -5,9 +5,11 @@ set -e
>  # Call it with:
>  #   $1: the path of the file containing all the the expected hashes
>  #   $2: the full path to the file to check
> +#   $3: the basename of the file to check
>  
>  h_file="${1}"
>  file="${2}"
> +base="${3}"

I was very confused here by $2 and $3. If you have "the full path to
the file to check", why would you need "the basename of the file to
check" as argument. I believe this should be clarified:

 $2: the full path to the temporary file that was downloaded, and that
     should be checked

 $3: the name of the real file we are downloading. Used only for
     messages, as we are in fact checking a temporary file, and waiting
     for this temporary file to be fully downloaded and checked before
     renaming it to the real file name.

Of course, feel free to reword this in a better way.

> -            if [ "${f}" = "${file##*/}" ]; then
> +            if [ "${f}" = "${base}" ]; then

Hum, so it is not only used for display. So you need to reword the
above.

>      # tmp_output is in the same directory as the final output, so we can
>      # later move it atomically.
>      tmp_output="$( mktemp "${output}.XXXXXX" )"
> @@ -147,7 +159,7 @@ DESCRIPTION
>              bzr     Bazaar
>              cp      local files via simple copy
>              cvs     Concurrent Versions System
> -            git     git
> +            git     Git

Should be part of a previous patch.

>              hg      Mercurial
>              scp     remote files via Secure copy

Maybe you want to capitalize "Local files..." and "Remote files..." as
well, for consistency.

Thanks,

Thomas
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com

^ permalink raw reply	[flat|nested] 14+ messages in thread

* [Buildroot] [PATCH 4/4 v4] pkg-download: check hashes for locally cached files
  2014-12-11 18:24 ` [Buildroot] [PATCH 4/4 v4] pkg-download: check hashes for locally cached files Yann E. MORIN
@ 2014-12-11 20:45   ` Thomas Petazzoni
  0 siblings, 0 replies; 14+ messages in thread
From: Thomas Petazzoni @ 2014-12-11 20:45 UTC (permalink / raw)
  To: buildroot

Dear Yann E. MORIN,

On Thu, 11 Dec 2014 19:24:48 +0100, Yann E. MORIN wrote:
> In some cases, upstream just update their releases in-place, without
> renaming them. When that package is updated in Buildroot, a new hash to
> match the new upstream release is included in the corresponding .hash
> file.

We could also mention the case of SourceForge sending HTML crap from
time to time, and things like that.

> Note: this does not add any overhead compared to the previous situation,
> because we were already checking hashes of localy cached files. It just

locally.

Other than that:

Reviewed-by: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
-- 
Thomas Petazzoni, CTO, Free Electrons
Embedded Linux, Kernel and Android engineering
http://free-electrons.com

^ permalink raw reply	[flat|nested] 14+ messages in thread

* [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper
  2014-12-11 20:37   ` Thomas Petazzoni
@ 2014-12-11 21:03     ` Yann E. MORIN
  2014-12-11 21:26     ` Yann E. MORIN
  1 sibling, 0 replies; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 21:03 UTC (permalink / raw)
  To: buildroot

On 2014-12-11 21:37 +0100, Thomas Petazzoni spake thusly:
> Typo in title: suppot -> support

Suppot de Satan! :-)

> On Thu, 11 Dec 2014 19:24:45 +0100, Yann E. MORIN wrote:
> > Instead of relying on argument ordering, use actual options in the
> > download wrapper.
> > 
> > Download backends (bzr, cp, hg...) are left as-is, because it does not
> > make sense to complexifies them, since they are almost very trivial
> 
> complexifies -> complexify.

Da.

> > shell scripts, and adding option parsing would be really overkill.
> 
> You could have mentioned that the commit also renames the script. I
> thought it was the case in earlier versions of this patch series.

So did I. Will add it on the next respin...

> > diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
> > new file mode 100755
> > index 0000000..b1b9158
> > --- /dev/null
> > +++ b/support/download/dl-wrapper
> > @@ -0,0 +1,166 @@
> > +#!/usr/bin/env bash
> > +
> > +# This script is a wrapper to the other download backends.
> > +# Its role is to ensure atomicity when saving downloaded files
> > +# back to BR2_DL_DIR, and not clutter BR2_DL_DIR with partial,
> > +# failed downloads.
> > +#
> > +# Call it with:
> > +#   -b BACKEND  name of the backend (eg. cvs, git, cp...)
> > +#   -o FILE     full path to the file in which to save the download
> > +#   --          everything after '--' are options for the backend
> > +# Environment:
> > +#   BUILD_DIR: the path to Buildroot's build dir
> 
> With this in place, do we still need the -h option and the fake man
> page?

Or get rid of it and keep the help text? ;-)
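
For context, the calling convention described in that header comment
boils down to an invocation along these lines (purely illustrative; the
actual caller is pkg-download.mk and the exact arguments may differ):

    support/download/dl-wrapper -b wget -o dl/busybox-1.22.1.tar.bz2 -- \
        http://www.busybox.net/downloads/busybox-1.22.1.tar.bz2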

[--SNIP--]
> > +main() {
> > +    local OPT OPTARG
> > +    local backend output
> > +
> > +    # Parse our options; anythong after '--' is for the backend
> 
> anything

Ack.

> > +    # Sop we only care to -b (backend) and -o (output file)
> 
> Sop?

s/p//

> Maybe:
> 
>        # We only care about -b, -h and -o

Or just remove it, it's a carry-over from the previous script...
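
Either way, the parsing is essentially the classic getopts loop; here is
a condensed sketch of what is being discussed (not the actual dl-wrapper
code; error() is a hypothetical helper that prints a message and exits
non-zero):

    main() {
        local OPT OPTARG
        local backend output

        # We only care about -h, -b (backend) and -o (output file);
        # everything after '--' is handed to the backend verbatim.
        while getopts :hb:o: OPT; do
            case "${OPT}" in
            h)  help; exit 0;;
            b)  backend="${OPTARG}";;
            o)  output="${OPTARG}";;
            :)  error "option '-%s' expects a mandatory argument\n" "${OPTARG}";;
            \?) error "unknown option '-%s'\n" "${OPTARG}";;
            esac
        done
        # Drop our own options; what remains is for the backend.
        shift $((OPTIND-1))
        # ... then call support/download/${backend} with "${@}" ...
    }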

[--SNIP--]
> > +    tmpd="$( mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX" )"
> 
> No space after ( and before ).

Ack for it and the following ones.

[--SNIP--]
> > +    if ! "${OLDPWD}/support/download/${backend}" "${tmpf}" "${@}"; then
> > +        rm -rf "${tmpd}"
> 
> How is it possible to remove ${tmpd} here, since we are cd'ed into this
> directory?

Well, nothing prevents the removal of a directory that still has an open
fd to it, just as nothing prevents the removal of a file that still has
an open fd to it. The directory is marked "deleted" in the VFS, so it can
no longer be looked up by name, and no new entries can be created in it...

    mkdir -p ~/foo/bar
    cd ~/foo/bar
    rm -rf ~/foo/bar
    echo buz >baz

And see what happens! ;-)

[--SNIP--]
> > +help() {
> > +    cat <<_EOF_
> > +NAME
> > +    ${my_name} - downlaod wrapper for Buildroot
> 
> download

Ack.

> > +
> > +SYNOPSIS
> > +    ${my_name} [OPTION]... -- [BACKEND OPTION]...
> > +
> > +DESCRIPTION
> > +    Wrapper script around different download mechanisms. Ensure that
> 
> Ensures

Ack.

Thanks for the review! :-)

Regards,
Yann E. MORIN.

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 14+ messages in thread

* [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from the download wrapper
  2014-12-11 20:42   ` Thomas Petazzoni
@ 2014-12-11 21:12     ` Yann E. MORIN
  0 siblings, 0 replies; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 21:12 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-11 21:42 +0100, Thomas Petazzoni spake thusly:
> Dear Yann E. MORIN,
> 
> On Thu, 11 Dec 2014 19:24:47 +0100, Yann E. MORIN wrote:
> 
> > diff --git a/support/download/check-hash b/support/download/check-hash
> > index 13e361a..b41a87e 100755
> > --- a/support/download/check-hash
> > +++ b/support/download/check-hash
> > @@ -5,9 +5,11 @@ set -e
> >  # Call it with:
> >  #   $1: the path of the file containing all the the expected hashes
> >  #   $2: the full path to the file to check
> > +#   $3: the basename of the file to check
> >  
> >  h_file="${1}"
> >  file="${2}"
> > +base="${3}"
> 
> I was very confused here by $2 and $3. If you have "the full path to
> the file to check", why would you need "the basename of the file to
> check" as argument. I believe this should be clarified:
> 
>  $2: the full path to the temporary file that was downloaded, and that
>      should be checked
> 
>  $3: the name of the real file we are downloading. Used only for
>      messages, as we are in fact checking a temporary file, and waiting
>      for this temporary file to be fully downloaded and checked before
>      renaming it to the real file name.
> 
> Of course, feel free to reword this in a better way.

Ah yes, this is slightly tricky.

The problem is that the .hash files contain the names of the local
tarballs, while the file we check is a temporary file with an arbitrary
name.

So we have to know the name, with full path, of the actual file we are
checking; we get it in ${2}. And we need to know what hash to check it
against, so we need the final name it will be saved as; we get it via
${3}.

But your wording is pretty much OK, except the part about "just
messages", since we really need it, as you later saw...
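
Roughly, the matching in check-hash boils down to something like this (a
condensed sketch, not the exact code; check_one is a hypothetical helper
that verifies a single hash):

    h_file="${1}"   # .hash file, entries like: md5 <hash> <tarball name>
    file="${2}"     # full path to the temporary file just downloaded
    base="${3}"     # final name of the tarball, as listed in the .hash file

    while read t h f; do
        case "${t}" in
        md5|sha1|sha256) ;;  # only consider lines that look like hash entries
        *) continue;;
        esac
        if [ "${f}" = "${base}" ]; then
            check_one "${t}" "${h}" "${file}"
        fi
    done <"${h_file}"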

> > -            if [ "${f}" = "${file##*/}" ]; then
> > +            if [ "${f}" = "${base}" ]; then
> 
> Hum, so it is not only used for display. So you need to reword the
> above.

Nope. Yup. ;-)

> >      # tmp_output is in the same directory as the final output, so we can
> >      # later move it atomically.
> >      tmp_output="$( mktemp "${output}.XXXXXX" )"
> > @@ -147,7 +159,7 @@ DESCRIPTION
> >              bzr     Bazaar
> >              cp      local files via simple copy
> >              cvs     Concurrent Versions System
> > -            git     git
> > +            git     Git
> 
> Should be part of a previous patch.

Yup, my mistake, I did not amend the correct patch.

> >              hg      Mercurial
> >              scp     remote files via Secure copy
> 
> Maybe you want to capitalize "Local files..." and "Remote files..." as
> well, for consistency.

Well, I did capitalise the names as they appear on their respective
homepages.

Git is written "Git", cvs is "Concurrent Versions System", and so on...

I should rename "remote files via Secure copy" to "Secure copy of
remote files", and so on...

I'll rework all that.
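
For illustration, the list could end up looking something like this
(exact wording still to be decided):

    bzr     Bazaar
    cp      Local files, via simple copy
    cvs     Concurrent Versions System
    git     Git
    hg      Mercurial
    scp     Secure copy of remote files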

Thanks! :-)

Regards,
Yann E. MORIN.

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 14+ messages in thread

* [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper
  2014-12-11 20:37   ` Thomas Petazzoni
  2014-12-11 21:03     ` Yann E. MORIN
@ 2014-12-11 21:26     ` Yann E. MORIN
  1 sibling, 0 replies; 14+ messages in thread
From: Yann E. MORIN @ 2014-12-11 21:26 UTC (permalink / raw)
  To: buildroot

Thomas, All,

On 2014-12-11 21:37 +0100, Thomas Petazzoni spake thusly:
> On Thu, 11 Dec 2014 19:24:45 +0100, Yann E. MORIN wrote:
[--SNIP--]
> > shell scripts, and adding option parsing would be really overkill.
> 
> You could have mentioned that the commit also renames the script. I
> thought it was the case in earlier versions of this patch series.

Well, it's in the changelog in the cover-letter...

I'll add it in the commit log of the patch too.

Regards,
Yann E. MORIN.

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 14+ messages in thread

end of thread, other threads:[~2014-12-11 21:26 UTC | newest]

Thread overview: 14+ messages (download: mbox.gz / follow: Atom feed)
2014-12-11 18:24 [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Yann E. MORIN
2014-12-11 18:24 ` [Buildroot] [PATCH 1/4 v4] suppot/download: add option parsing to the download wrapper Yann E. MORIN
2014-12-11 20:37   ` Thomas Petazzoni
2014-12-11 21:03     ` Yann E. MORIN
2014-12-11 21:26     ` Yann E. MORIN
2014-12-11 18:24 ` [Buildroot] [PATCH 2/4 v4] pkg-download: check for already downloaded file in " Yann E. MORIN
2014-12-11 20:38   ` Thomas Petazzoni
2014-12-11 18:24 ` [Buildroot] [PATCH 3/4 v4] pkg-download: verify the hashes from " Yann E. MORIN
2014-12-11 20:42   ` Thomas Petazzoni
2014-12-11 21:12     ` Yann E. MORIN
2014-12-11 18:24 ` [Buildroot] [PATCH 4/4 v4] pkg-download: check hashes for locally cached files Yann E. MORIN
2014-12-11 20:45   ` Thomas Petazzoni
2014-12-11 20:33 ` [Buildroot] [PATCH 0/4 v4] pkg-download: check hashes before the download (branch yem/download-hash) Thomas Petazzoni
2014-12-11 20:40   ` Yann E. MORIN
