* [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options
From: Maxime Hadjinlian @ 2018-03-31 14:23 UTC
  To: buildroot

From: "Yann E. MORIN" <yann.morin.1998@free.fr>

Currently, all download helpers accept the local output file, the remote
locations, the changesets, and so on, as positional arguments.

This was all well and nice when that was all we needed.

But then we added an option to quiesce their verbosity, and that was
shoehorned with a trivial getopts, still keeping all the existing
positional arguments as... positional arguments.

Adding yet more options while keeping positional arguments will not be
very easy, even if we do not envision any new option in the foreseeable
future (but 640K ought to be enough for everyone, remember? ;-) ).

Change all helpers to accept a set of generic options (-q for quiet and
-o for the output file) as well as helper-specific options (like -r for
the repository, -c for a changeset...).

Maxime:
Changed -R to -r for recurse (only for the git backend)
Changed -r to -u for URI (for all backends)
Changed -R to -c for cset (for the CVS and SVN backends)
Added the export of BR_BACKEND_DL_GETOPTS so all the backend wrappers can
use the same options easily.
Now all the backends use the same common options.
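
As an illustration (not part of the patch; the package name, URL and
changeset below are made up), a backend that used to be invoked as
".../git [-q] [-r] OUT_FILE REPO_URL CSET BASENAME" is now invoked with
options only, roughly:

    support/download/git -q \
        -o dl/foo-1.2.3.tar.gz \
        -u https://example.com/foo.git \
        -c v1.2.3 \
        -n foo-1.2.3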

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
---
v1 -> v2:
   - Fix comment in bzr backend
   - Remove 'R' from BR_BACKEND_DL_GETOPTS
   - check-hash shouldn't have had its options changed since it's not a
   download backend (Arnout)
---
 package/pkg-download.mk     | 38 +++++++++++++++++++-------------------
 support/download/bzr        | 25 ++++++++++++++-----------
 support/download/cp         | 17 +++++++++--------
 support/download/cvs        | 34 +++++++++++++++++++---------------
 support/download/dl-wrapper |  7 ++++++-
 support/download/git        | 33 +++++++++++++++++----------------
 support/download/hg         | 25 ++++++++++++++-----------
 support/download/scp        | 19 ++++++++++---------
 support/download/svn        | 25 ++++++++++++++-----------
 support/download/wget       | 17 +++++++++--------
 10 files changed, 131 insertions(+), 109 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index 3712b9ccc6..ce069b9926 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -77,9 +77,9 @@ define DOWNLOAD_GIT
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		$(QUIET) \
 		-- \
-		$($(PKG)_SITE) \
-		$($(PKG)_DL_VERSION) \
-		$($(PKG)_RAW_BASE_NAME) \
+		-u $($(PKG)_SITE) \
+		-c $($(PKG)_DL_VERSION) \
+		-n $($(PKG)_RAW_BASE_NAME) \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -88,9 +88,9 @@ define DOWNLOAD_BZR
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		$(QUIET) \
 		-- \
-		$($(PKG)_SITE) \
-		$($(PKG)_DL_VERSION) \
-		$($(PKG)_RAW_BASE_NAME) \
+		-u $($(PKG)_SITE) \
+		-c $($(PKG)_DL_VERSION) \
+		-n $($(PKG)_RAW_BASE_NAME) \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -99,10 +99,10 @@ define DOWNLOAD_CVS
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		$(QUIET) \
 		-- \
-		$(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
-		$($(PKG)_DL_VERSION) \
-		$($(PKG)_RAWNAME) \
-		$($(PKG)_RAW_BASE_NAME) \
+		-u $(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
+		-c $($(PKG)_DL_VERSION) \
+		-N $($(PKG)_RAWNAME) \
+		-n $($(PKG)_RAW_BASE_NAME) \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -111,9 +111,9 @@ define DOWNLOAD_SVN
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		$(QUIET) \
 		-- \
-		$($(PKG)_SITE) \
-		$($(PKG)_DL_VERSION) \
-		$($(PKG)_RAW_BASE_NAME) \
+		-u $($(PKG)_SITE) \
+		-c $($(PKG)_DL_VERSION) \
+		-n $($(PKG)_RAW_BASE_NAME) \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -126,7 +126,7 @@ define DOWNLOAD_SCP
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		$(QUIET) \
 		-- \
-		'$(call stripurischeme,$(call qstrip,$(1)))' \
+		-u '$(call stripurischeme,$(call qstrip,$(1)))' \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -135,9 +135,9 @@ define DOWNLOAD_HG
 		-o $(DL_DIR)/$($(PKG)_SOURCE) \
 		$(QUIET) \
 		-- \
-		$($(PKG)_SITE) \
-		$($(PKG)_DL_VERSION) \
-		$($(PKG)_RAW_BASE_NAME) \
+		-u $($(PKG)_SITE) \
+		-c $($(PKG)_DL_VERSION) \
+		-n $($(PKG)_RAW_BASE_NAME) \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -147,7 +147,7 @@ define DOWNLOAD_WGET
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		$(QUIET) \
 		-- \
-		'$(call qstrip,$(1))' \
+		-u '$(call qstrip,$(1))' \
 		$($(PKG)_DL_OPTS)
 endef
 
@@ -157,7 +157,7 @@ define DOWNLOAD_LOCALFILES
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		$(QUIET) \
 		-- \
-		$(call stripurischeme,$(call qstrip,$(1))) \
+		-u $(call stripurischeme,$(call qstrip,$(1))) \
 		$($(PKG)_DL_OPTS)
 endef
 
diff --git a/support/download/bzr b/support/download/bzr
index 75b7b415c1..5289a421cd 100755
--- a/support/download/bzr
+++ b/support/download/bzr
@@ -5,28 +5,31 @@ set -e
 
 # Download helper for bzr, to be called from the download wrapper script
 #
-# Call it as:
-#   .../bzr [-q] OUT_FILE REPO_URL REV BASENAME
+# Options:
+#   -q          Be quiet
+#   -o FILE     Generate archive in FILE.
+#   -u URI      Clone from repository at URI.
+#   -c CSET     Use changeset (or revision) CSET.
+#   -n NAME     Use basename NAME.
 #
 # Environment:
 #   BZR      : the bzr command to call
 
 
 verbose=
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q;;
+    o)  output="${OPTARG}";;
+    u)  uri="${OPTARG}";;
+    c)  cset="${OPTARG}";;
+    n)  basename="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-repo="${2}"
-rev="${3}"
-basename="${4}"
-
-shift 4 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
@@ -51,5 +54,5 @@ if [ ${bzr_version} -ge ${bzr_min_version} ]; then
 fi
 
 _bzr export ${verbose} --root="'${basename}/'" --format=tgz \
-    ${timestamp_opt} - "${@}" "'${repo}'" -r "'${rev}'" \
+    ${timestamp_opt} - "${@}" "'${uri}'" -r "'${cset}'" \
     >"${output}"
diff --git a/support/download/cp b/support/download/cp
index 0ee1f3ba82..52fe2de83d 100755
--- a/support/download/cp
+++ b/support/download/cp
@@ -5,8 +5,10 @@ set -e
 
 # Download helper for cp, to be called from the download wrapper script
 #
-# Call it as:
-#   .../cp [-q] OUT_FILE SRC_FILE
+# Options:
+#   -q          Be quiet.
+#   -o FILE     Copy to file FILE.
+#   -u FILE     Copy from file FILE.
 #
 # Environment:
 #   LOCALFILES: the cp command to call
@@ -17,18 +19,17 @@ set -e
 # Make 'cp' verbose by default, so it behaves a bit like the others.
 verbose=-v
 
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=;;
+    o)  output="${OPTARG}";;
+    u)  source="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-source="${2}"
-
-shift 2 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
diff --git a/support/download/cvs b/support/download/cvs
index 50050ab1c9..69d5c71f28 100755
--- a/support/download/cvs
+++ b/support/download/cvs
@@ -5,28 +5,32 @@ set -e
 
 # Download helper for cvs, to be called from the download wrapper script
 #
-# Call it as:
-#   .../cvs [-q] OUT_FILE CVS_URL REV PKG_NAME BASENAME
+# Options:
+#   -q          Be quiet
+#   -o FILE     Generate archive in FILE.
+#   -u URI      Checkout from repository at URI.
+#   -c REV      Use revision REV.
+#   -N RAWNAME  Use rawname (aka module name) RAWNAME.
+#   -n NAME     Use basename NAME.
 #
 # Environment:
 #   CVS      : the cvs command to call
 
 verbose=
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-Q;;
+    o)  output="${OPTARG}";;
+    u)  uri="${OPTARG}";;
+    c)  rev="${OPTARG}";;
+    N)  rawname="${OPTARG}";;
+    n)  basename="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-repo="${2}"
-rev="${3}"
-rawname="${4}"
-basename="${5}"
-
-shift 5 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
@@ -42,14 +46,14 @@ else
     select="-r"
 fi
 
-# The absence of an initial : on ${repo} means access method undefined
-if [[ ! "${repo}" =~ ^: ]]; then
+# The absence of an initial : on ${uri} means access method undefined
+if [[ ! "${uri}" =~ ^: ]]; then
    # defaults to anonymous pserver
-   repo=":pserver:anonymous@${repo}"
+   uri=":pserver:anonymous@${uri}"
 fi
 
 export TZ=UTC
-_cvs ${verbose} -z3 -d"'${repo}'" \
+_cvs ${verbose} -z3 -d"'${uri}'" \
      co "${@}" -d "'${basename}'" ${select} "'${rev}'" -P "'${rawname}'"
 
 tar czf "${output}" "${basename}"
diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index f944b71db5..510e7ef852 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -19,6 +19,8 @@
 # We want to catch any unexpected failure, and exit immediately.
 set -e
 
+export BR_BACKEND_DL_GETOPTS=":hc:o:n:N:H:ru:q"
+
 main() {
     local OPT OPTARG
     local backend output hfile recurse quiet
@@ -83,7 +85,10 @@ main() {
     # If the backend fails, we can just remove the temporary directory to
     # remove all the cruft it may have left behind. Then we just exit in
     # error too.
-    if ! "${OLDPWD}/support/download/${backend}" ${quiet} ${recurse} "${tmpf}" "${@}"; then
+    if ! "${OLDPWD}/support/download/${backend}" \
+            ${quiet} ${recurse} \
+            -o "${tmpf}" "${@}"
+    then
         rm -rf "${tmpd}"
         exit 1
     fi
diff --git a/support/download/git b/support/download/git
index f590ff6494..58a2c6ad9d 100755
--- a/support/download/git
+++ b/support/download/git
@@ -5,32 +5,33 @@ set -e
 
 # Download helper for git, to be called from the download wrapper script
 #
-# Call it as:
-#   .../git [-q] [-r] OUT_FILE REPO_URL CSET BASENAME
-#
-#   -q  Be quiet.
-#   -r  Clone and archive sub-modules.
+# Options:
+#   -q          Be quiet.
+#   -r          Clone and archive sub-modules.
+#   -o FILE     Generate archive in FILE.
+#   -u URI      Clone from repository at URI.
+#   -c CSET     Use changeset CSET.
+#   -n NAME     Use basename NAME.
 #
 # Environment:
 #   GIT      : the git command to call
 
 verbose=
 recurse=0
-while getopts :qr OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q; exec >/dev/null;;
     r)  recurse=1;;
+    o)  output="${OPTARG}";;
+    u)  uri="${OPTARG}";;
+    c)  cset="${OPTARG}";;
+    n)  basename="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
-
-output="${1}"
-repo="${2}"
-cset="${3}"
-basename="${4}"
 
-shift 4 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
@@ -46,9 +47,9 @@ _git() {
 # Messages for the type of clone used are provided to ease debugging in case of
 # problems
 git_done=0
-if [ -n "$(_git ls-remote "'${repo}'" "'${cset}'" 2>&1)" ]; then
+if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
     printf "Doing shallow clone\n"
-    if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${repo}'" "'${basename}'"; then
+    if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${basename}'"; then
         git_done=1
     else
         printf "Shallow clone failed, falling back to doing a full clone\n"
@@ -56,7 +57,7 @@ if [ -n "$(_git ls-remote "'${repo}'" "'${cset}'" 2>&1)" ]; then
 fi
 if [ ${git_done} -eq 0 ]; then
     printf "Doing full clone\n"
-    _git clone ${verbose} "${@}" "'${repo}'" "'${basename}'"
+    _git clone ${verbose} "${@}" "'${uri}'" "'${basename}'"
 fi
 
 pushd "${basename}" >/dev/null
diff --git a/support/download/hg b/support/download/hg
index 3af01690b3..efb515fca5 100755
--- a/support/download/hg
+++ b/support/download/hg
@@ -5,27 +5,30 @@ set -e
 
 # Download helper for hg, to be called from the download wrapper script
 #
-# Call it as:
-#   .../hg [-q] OUT_FILE REPO_URL CSET BASENAME
+# Options:
+#   -q          Be quiet.
+#   -o FILE     Generate archive in FILE.
+#   -u URI      Clone from repository at URI.
+#   -c CSET     Use changeset (or revision) CSET.
+#   -n NAME     Use basename NAME.
 #
 # Environment:
 #   HG       : the hg command to call
 
 verbose=
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q;;
+    o)  output="${OPTARG}";;
+    u)  uri="${OPTARG}";;
+    c)  cset="${OPTARG}";;
+    n)  basename="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-repo="${2}"
-cset="${3}"
-basename="${4}"
-
-shift 4 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
@@ -33,7 +36,7 @@ _hg() {
     eval ${HG} "${@}"
 }
 
-_hg clone ${verbose} "${@}" --noupdate "'${repo}'" "'${basename}'"
+_hg clone ${verbose} "${@}" --noupdate "'${uri}'" "'${basename}'"
 
 _hg archive ${verbose} --repository "'${basename}'" --type tgz \
             --prefix "'${basename}'" --rev "'${cset}'" \
diff --git a/support/download/scp b/support/download/scp
index 825fd41c64..8ecf2f4b22 100755
--- a/support/download/scp
+++ b/support/download/scp
@@ -5,25 +5,26 @@ set -e
 
 # Download helper for scp, to be called from the download wrapper script
 #
-# Call it as:
-#   .../scp [-q] OUT_FILE SRC_URL
+# Options:
+#   -q          Be quiet.
+#   -o FILE     Copy to local file FILE.
+#   -u FILE     Copy from remote file FILE.
 #
 # Environment:
 #   SCP       : the scp command to call
 
 verbose=
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q;;
+    o)  output="${OPTARG}";;
+    u)  uri="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-url="${2}"
-
-shift 2 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
@@ -31,4 +32,4 @@ _scp() {
     eval ${SCP} "${@}"
 }
 
-_scp ${verbose} "${@}" "'${url}'" "'${output}'"
+_scp ${verbose} "${@}" "'${uri}'" "'${output}'"
diff --git a/support/download/svn b/support/download/svn
index 77abf3d02d..542b25c0a2 100755
--- a/support/download/svn
+++ b/support/download/svn
@@ -5,27 +5,30 @@ set -e
 
 # Download helper for svn, to be called from the download wrapper script
 #
-# Call it as:
-#   .../svn [-q] OUT_FILE REPO_URL REV BASNAME
+# Options:
+#   -q          Be quiet.
+#   -o FILE     Generate archive in FILE.
+#   -u URI      Checkout from repository at URI.
+#   -c REV      Use revision REV.
+#   -n NAME     Use basename NAME.
 #
 # Environment:
 #   SVN      : the svn command to call
 
 verbose=
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q;;
+    o)  output="${OPTARG}";;
+    u)  uri="${OPTARG}";;
+    c)  rev="${OPTARG}";;
+    n)  basename="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-repo="${2}"
-rev="${3}"
-basename="${4}"
-
-shift 4 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
@@ -33,6 +36,6 @@ _svn() {
     eval ${SVN} "${@}"
 }
 
-_svn export ${verbose} "${@}" "'${repo}@${rev}'" "'${basename}'"
+_svn export ${verbose} "${@}" "'${uri}@${rev}'" "'${basename}'"
 
 tar czf "${output}" "${basename}"
diff --git a/support/download/wget b/support/download/wget
index 768de904c3..fece6663ca 100755
--- a/support/download/wget
+++ b/support/download/wget
@@ -5,25 +5,26 @@ set -e
 
 # Download helper for wget, to be called from the download wrapper script
 #
-# Call it as:
-#   .../wget [-q] OUT_FILE URL
+# Options:
+#   -q          Be quiet.
+#   -o FILE     Save into file FILE.
+#   -u URL      Download file at URL.
 #
 # Environment:
 #   WGET     : the wget command to call
 
 verbose=
-while getopts :q OPT; do
+while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q;;
+    o)  output="${OPTARG}";;
+    u)  url="${OPTARG}";;
+    :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
 done
-shift $((OPTIND-1))
 
-output="${1}"
-url="${2}"
-
-shift 2 # Get rid of our options
+shift $((OPTIND-1)) # Get rid of our options
 
 # Caller needs to single-quote its arguments to prevent them from
 # being expanded a second time (in case there are spaces in them)
-- 
2.16.2

* [Buildroot] [v3 02/13] download: put most of the infra in dl-wrapper
From: Maxime Hadjinlian @ 2018-03-31 14:23 UTC
  To: buildroot

The goal here is to simplify the infrastructure by putting most of the
code in the dl-wrapper as it's easier to implement and to read.

Most of the functions were already common; this patch finalizes it by
making pkg-download.mk pass all the needed parameters to the dl-wrapper,
which in turn passes everything to every backend.

The backend will then cherry-pick what it needs from these arguments
and act accordingly.

This eases the transition to the addition of a sub-directory per package
in the DL_DIR and, later on, a git cache.
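
As a rough sketch (the package name, hash file and URLs below are
hypothetical), the DOWNLOAD macro now expands to a single dl-wrapper call
carrying all the parameters and the ordered list of candidate URIs:

    support/download/dl-wrapper \
        -c 1.2.3 \
        -f foo-1.2.3.tar.gz \
        -H package/foo/foo.hash \
        -n foo-1.2.3 \
        -N foo \
        -o dl/foo-1.2.3.tar.gz \
        -u 'http|urlencode+http://primary.example.com' \
        -u 'wget+http://downloads.example.com/foo/' \
        --

Each backend then picks the options it knows about (thanks to the shared
BR_BACKEND_DL_GETOPTS string) and silently ignores the others.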

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Tested-by: Luca Ceresoli <luca@lucaceresoli.net>
Reviewed-by: Luca Ceresoli <luca@lucaceresoli.net>
---
v1 -> v2:
    - Rename cp backend to file (Arnout)
    - Don't use BR_BACKEND_DL_GETOPTS for dl-wrapper (Arnout)
    - Add "urlencode" to the scheme passed to the dl-wrapper, to support the
    fact that we need to urlencode the filename when using the PRIMARY and
    BACKUP mirrors (some files are named toto.c?v=1.0) (Arnout)
    - Fix uristripscheme replaced by bash'ism (Arnout)
    - Add the hash check into the loop; exit with error only if all the
    download+check attempts failed (Arnout)
---
 missing-hash.py               | 145 ++++++++++++++++++++++++++++++++++++
 package/pkg-download.mk       | 166 ++++++++----------------------------------
 support/download/cvs          |   2 +-
 support/download/dl-wrapper   | 108 ++++++++++++++++++---------
 support/download/{cp => file} |   4 +-
 support/download/wget         |  10 ++-
 6 files changed, 258 insertions(+), 177 deletions(-)
 create mode 100755 missing-hash.py
 rename support/download/{cp => file} (90%)

diff --git a/missing-hash.py b/missing-hash.py
new file mode 100755
index 0000000000..5c8b3435a5
--- /dev/null
+++ b/missing-hash.py
@@ -0,0 +1,145 @@
+#!/usr/bin/env python
+# -*- coding: utf-8 -*-
+
+import fnmatch
+import distutils
+import time
+import ftplib
+import glob
+import logging
+import os
+import re
+import subprocess
+import sys
+import urllib2
+import sysconfig
+
+ERR_PROVIDER = ['exception list', 'website not reachable', 'alioth.debian.org']
+
+EXCLUDED_PKGS = [
+        "boot/common.mk",
+        "linux/linux-ext-fbtft.mk",
+        "linux/linux-ext-xenomai.mk",
+        "linux/linux-ext-rtai.mk",
+        "package/efl/efl.mk",
+        "package/freescale-imx/freescale-imx.mk",
+        "package/gcc/gcc.mk",
+        "package/gstreamer/gstreamer.mk",
+        "package/gstreamer1/gstreamer1.mk",
+        "package/gtk2-themes/gtk2-themes.mk",
+        "package/matchbox/matchbox.mk",
+        "package/opengl/opengl.mk",
+        "package/qt5/qt5.mk",
+        "package/x11r7/x11r7.mk"
+]
+
+class Package(object):
+
+    def __init__(self, package_mk_path):
+        self.mk_path = package_mk_path
+        self.name = os.path.basename(os.path.splitext(package_mk_path)[0])
+        self.mk_name = self.name.upper().replace('-', '_')
+        self.infra = 'unknown'
+        self.infra_host = False
+        self.last_version = None
+        self.hash = False
+        self.provider = None
+        self.source = None
+        self.site = None
+        self.version = None
+
+        data = sysconfig._parse_makefile(package_mk_path)
+        for k in ["SITE", "SOURCE", "VERSION", "LICENSE_FILES", "LICENSE"]:
+            k_name = "%s_%s" % (self.mk_name, k)
+            if k_name in data.keys():
+                value = None if data[k_name] == "" else data[k_name]
+                setattr(self, k.lower(), value)
+
+        if "package/qt5/" in self.mk_path:
+                data = sysconfig._parse_makefile("package/qt5/qt5.mk")
+                self.version = data["QT5_VERSION"]
+
+        if "package/efl/" in self.mk_path:
+                data = sysconfig._parse_makefile("package/efl/efl.mk")
+                self.version = data["EFL_VERSION"]
+
+        with open(package_mk_path) as f:
+            # Everything we could not obtain through the parsing of the mk
+            # files will get obtained here.
+            for line in f.readlines():
+                if "%s_VERSION" % self.mk_name in line and\
+                   self.version is None:
+                        if "$" in line:
+                                continue
+                        self.version = line[line.rindex('=')+1:].strip()
+
+                if "-package)" not in line:
+                    continue
+                self.infra = line[line.rindex('(')+1:-2]
+                if "host" in self.infra:
+                    self.infra_host = True
+                self.infra = self.infra[:self.infra.rindex('-')]
+
+        if "$" in str(self.version):
+                self.version = None
+
+        self.hash_file = "%s.hash" % os.path.splitext(package_mk_path)[0]
+        if os.path.exists(self.hash_file):
+            self.hash = True
+
+        self.provider = self.get_provider()
+
+    def get_provider(self):
+        if self.site is None:
+            return None
+
+        if "github" in self.site:
+            return "github"
+        elif "sourceforge" in self.site:
+            return "sourceforge"
+
+if __name__ == '__main__':
+    matches = []
+    for dir in ["boot", "linux", "package"]:
+        for root, _, filenames in os.walk(dir):
+            for filename in fnmatch.filter(filenames, '*.mk'):
+                path = os.path.join(root, filename)
+                if os.path.dirname(path) in dir:
+                    continue
+                matches.append(path)
+
+    print "#!/bin/sh"
+
+    matches.sort()
+    packages = []
+    count = 0
+    for mk_path in matches:
+
+        if mk_path in EXCLUDED_PKGS:
+            continue
+
+        pkg = Package(mk_path)
+
+        if pkg is None:
+            continue
+
+        if pkg.hash is False:
+            if pkg.site is not None and "github" not in pkg.site:
+                if len(str(pkg.version)) >= 40:
+                    continue
+                print "make %s-source" % pkg.name
+                print "my_file=$(find dl/ -type f)"
+                print "touch %s" % pkg.hash_file
+                print "echo '# Locally computed' >> %s" % pkg.hash_file
+                print "output=$(sha256sum \"$my_file\")"
+                print "sha256=$(echo $output | awk '{print $1}')"
+                print "filename=$(echo $output | awk '{print $2}' | cut -d'/' -f2)"
+                print "echo \"sha256 $sha256 $filename\" >> %s" % pkg.hash_file
+                print "git add %s" % pkg.hash_file
+                print "git commit -s -m \"package/%s: add hash file\"" % pkg.name
+                print "make %s-dirclean" % pkg.name
+                print "rm -Rf dl"
+                print ""
+                count += 1
+
+    print count
diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index ce069b9926..14ea4ff361 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -42,6 +42,8 @@ DL_DIR := $(shell mkdir -p $(DL_DIR) && cd $(DL_DIR) >/dev/null && pwd)
 #
 # geturischeme: http
 geturischeme = $(firstword $(subst ://, ,$(call qstrip,$(1))))
+# getschemeplusuri: git|parameter+http://example.com
+getschemeplusuri = $(call geturischeme,$(1))$(if $(2),\|$(2))+$(1)
 # stripurischeme: www.example.com/dir/file
 stripurischeme = $(lastword $(subst ://, ,$(call qstrip,$(1))))
 # domain: www.example.com
@@ -61,152 +63,42 @@ github = https://github.com/$(1)/$(2)/archive/$(3)
 export BR_NO_CHECK_HASH_FOR =
 
 ################################################################################
-# The DOWNLOAD_* helpers are in charge of getting a working copy
-# of the source repository for their corresponding SCM,
-# checking out the requested version / commit / tag, and create an
-# archive out of it. DOWNLOAD_SCP uses scp to obtain a remote file with
-# ssh authentication. DOWNLOAD_WGET is the normal wget-based download
-# mechanism.
+# DOWNLOAD -- Download helper. Will call DL_WRAPPER which will try to download
+# source from:
+# 1) BR2_PRIMARY_SITE if enabled
+# 2) Download site, unless BR2_PRIMARY_SITE_ONLY is set
+# 3) BR2_BACKUP_SITE if enabled, unless BR2_PRIMARY_SITE_ONLY is set
+#
+# Argument 1 is the source location
 #
 ################################################################################
 
-define DOWNLOAD_GIT
-	$(EXTRA_ENV) $(DL_WRAPPER) -b git \
-		-o $(DL_DIR)/$($(PKG)_SOURCE) \
-		$(if $($(PKG)_GIT_SUBMODULES),-r) \
-		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
-		$(QUIET) \
-		-- \
-		-u $($(PKG)_SITE) \
-		-c $($(PKG)_DL_VERSION) \
-		-n $($(PKG)_RAW_BASE_NAME) \
-		$($(PKG)_DL_OPTS)
-endef
-
-define DOWNLOAD_BZR
-	$(EXTRA_ENV) $(DL_WRAPPER) -b bzr \
-		-o $(DL_DIR)/$($(PKG)_SOURCE) \
-		$(QUIET) \
-		-- \
-		-u $($(PKG)_SITE) \
-		-c $($(PKG)_DL_VERSION) \
-		-n $($(PKG)_RAW_BASE_NAME) \
-		$($(PKG)_DL_OPTS)
-endef
+ifneq ($(call qstrip,$(BR2_PRIMARY_SITE)),)
+DOWNLOAD_URIS += \
+	-u $(call getschemeplusuri,$(BR2_PRIMARY_SITE),urlencode)
+endif
 
-define DOWNLOAD_CVS
-	$(EXTRA_ENV) $(DL_WRAPPER) -b cvs \
-		-o $(DL_DIR)/$($(PKG)_SOURCE) \
-		$(QUIET) \
-		-- \
-		-u $(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
-		-c $($(PKG)_DL_VERSION) \
-		-N $($(PKG)_RAWNAME) \
-		-n $($(PKG)_RAW_BASE_NAME) \
-		$($(PKG)_DL_OPTS)
-endef
+ifeq ($(BR2_PRIMARY_SITE_ONLY),)
+DOWNLOAD_URIS += \
+	-u $($(PKG)_SITE_METHOD)+$(dir $(1))
+ifneq ($(call qstrip,$(BR2_BACKUP_SITE)),)
+DOWNLOAD_URIS += \
+	-u $(call getschemeplusuri,$(BR2_BACKUP_SITE),urlencode)
+endif
+endif
 
-define DOWNLOAD_SVN
-	$(EXTRA_ENV) $(DL_WRAPPER) -b svn \
-		-o $(DL_DIR)/$($(PKG)_SOURCE) \
-		$(QUIET) \
-		-- \
-		-u $($(PKG)_SITE) \
+define DOWNLOAD
+	$(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),BR_NO_CHECK_HASH_FOR=$(notdir $(1));) \
+	$(EXTRA_ENV) $(DL_WRAPPER) \
 		-c $($(PKG)_DL_VERSION) \
-		-n $($(PKG)_RAW_BASE_NAME) \
-		$($(PKG)_DL_OPTS)
-endef
-
-# SCP URIs should be of the form scp://[user@]host:filepath
-# Note that filepath is relative to the user's home directory, so you may want
-# to prepend the path with a slash: scp://[user@]host:/absolutepath
-define DOWNLOAD_SCP
-	$(EXTRA_ENV) $(DL_WRAPPER) -b scp \
-		-o $(DL_DIR)/$(2) \
+		-f $(notdir $(1)) \
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
-		$(QUIET) \
-		-- \
-		-u '$(call stripurischeme,$(call qstrip,$(1)))' \
-		$($(PKG)_DL_OPTS)
-endef
-
-define DOWNLOAD_HG
-	$(EXTRA_ENV) $(DL_WRAPPER) -b hg \
-		-o $(DL_DIR)/$($(PKG)_SOURCE) \
-		$(QUIET) \
-		-- \
-		-u $($(PKG)_SITE) \
-		-c $($(PKG)_DL_VERSION) \
 		-n $($(PKG)_RAW_BASE_NAME) \
-		$($(PKG)_DL_OPTS)
-endef
-
-define DOWNLOAD_WGET
-	$(EXTRA_ENV) $(DL_WRAPPER) -b wget \
-		-o $(DL_DIR)/$(2) \
-		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
-		$(QUIET) \
-		-- \
-		-u '$(call qstrip,$(1))' \
-		$($(PKG)_DL_OPTS)
-endef
-
-define DOWNLOAD_LOCALFILES
-	$(EXTRA_ENV) $(DL_WRAPPER) -b cp \
-		-o $(DL_DIR)/$(2) \
-		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
+		-N $($(PKG)_RAWNAME) \
+		-o $(DL_DIR)/$(notdir $(1)) \
+		$(if $($(PKG)_GIT_SUBMODULES),-r) \
+		$(DOWNLOAD_URIS) \
 		$(QUIET) \
 		-- \
-		-u $(call stripurischeme,$(call qstrip,$(1))) \
 		$($(PKG)_DL_OPTS)
 endef
-
-################################################################################
-# DOWNLOAD -- Download helper. Will try to download source from:
-# 1) BR2_PRIMARY_SITE if enabled
-# 2) Download site, unless BR2_PRIMARY_SITE_ONLY is set
-# 3) BR2_BACKUP_SITE if enabled, unless BR2_PRIMARY_SITE_ONLY is set
-#
-# Argument 1 is the source location
-#
-# E.G. use like this:
-# $(call DOWNLOAD,$(FOO_SITE))
-#
-# For PRIMARY and BACKUP site, any ? in the URL is replaced by %3F. A ? in
-# the URL is used to separate query arguments, but the PRIMARY and BACKUP
-# sites serve just plain files.
-################################################################################
-
-define DOWNLOAD
-	$(call DOWNLOAD_INNER,$(1),$(notdir $(1)),DOWNLOAD)
-endef
-
-define DOWNLOAD_INNER
-	$(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),export BR_NO_CHECK_HASH_FOR=$(2);) \
-	if test -n "$(call qstrip,$(BR2_PRIMARY_SITE))" ; then \
-		case "$(call geturischeme,$(BR2_PRIMARY_SITE))" in \
-			file) $(call $(3)_LOCALFILES,$(BR2_PRIMARY_SITE)/$(2),$(2)) && exit ;; \
-			scp) $(call $(3)_SCP,$(BR2_PRIMARY_SITE)/$(2),$(2)) && exit ;; \
-			*) $(call $(3)_WGET,$(BR2_PRIMARY_SITE)/$(subst ?,%3F,$(2)),$(2)) && exit ;; \
-		esac ; \
-	fi ; \
-	if test "$(BR2_PRIMARY_SITE_ONLY)" = "y" ; then \
-		exit 1 ; \
-	fi ; \
-	if test -n "$(1)" ; then \
-		case "$($(PKG)_SITE_METHOD)" in \
-			git) $($(3)_GIT) && exit ;; \
-			svn) $($(3)_SVN) && exit ;; \
-			cvs) $($(3)_CVS) && exit ;; \
-			bzr) $($(3)_BZR) && exit ;; \
-			file) $($(3)_LOCALFILES) && exit ;; \
-			scp) $($(3)_SCP) && exit ;; \
-			hg) $($(3)_HG) && exit ;; \
-			*) $(call $(3)_WGET,$(1),$(2)) && exit ;; \
-		esac ; \
-	fi ; \
-	if test -n "$(call qstrip,$(BR2_BACKUP_SITE))" ; then \
-		$(call $(3)_WGET,$(BR2_BACKUP_SITE)/$(subst ?,%3F,$(2)),$(2)) && exit ; \
-	fi ; \
-	exit 1
-endef
diff --git a/support/download/cvs b/support/download/cvs
index 69d5c71f28..3f77b849e4 100755
--- a/support/download/cvs
+++ b/support/download/cvs
@@ -21,7 +21,7 @@ while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-Q;;
     o)  output="${OPTARG}";;
-    u)  uri="${OPTARG}";;
+    u)  uri="${OPTARG#*://}";;
     c)  rev="${OPTARG}";;
     N)  rawname="${OPTARG}";;
     n)  basename="${OPTARG}";;
diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index 510e7ef852..67e9742767 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -19,31 +19,34 @@
 # We want to catch any unexpected failure, and exit immediately.
 set -e
 
-export BR_BACKEND_DL_GETOPTS=":hc:o:n:N:H:ru:q"
+export BR_BACKEND_DL_GETOPTS=":hc:o:n:N:H:ru:qf:e"
 
 main() {
     local OPT OPTARG
     local backend output hfile recurse quiet
+    local -a uris
 
     # Parse our options; anything after '--' is for the backend
-    while getopts :hb:o:H:rq OPT; do
+    while getopts ":hc:o:n:N:H:rf:u:q" OPT; do
         case "${OPT}" in
         h)  help; exit 0;;
-        b)  backend="${OPTARG}";;
+        c)  cset="${OPTARG}";;
         o)  output="${OPTARG}";;
+        n)  raw_base_name="${OPTARG}";;
+        N)  base_name="${OPTARG}";;
         H)  hfile="${OPTARG}";;
         r)  recurse="-r";;
+        f)  filename="${OPTARG}";;
+        u)  uris+=( "${OPTARG}" );;
         q)  quiet="-q";;
         :)  error "option '%s' expects a mandatory argument\n" "${OPTARG}";;
         \?) error "unknown option '%s'\n" "${OPTARG}";;
         esac
     done
+
     # Forget our options, and keep only those for the backend
     shift $((OPTIND-1))
 
-    if [ -z "${backend}" ]; then
-        error "no backend specified, use -b\n"
-    fi
     if [ -z "${output}" ]; then
         error "no output specified, use -o\n"
     fi
@@ -77,28 +80,64 @@ main() {
     tmpd="$(mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX")"
     tmpf="${tmpd}/output"
 
-    # Helpers expect to run in a directory that is *really* trashable, so
-    # they are free to create whatever files and/or sub-dirs they might need.
-    # Doing the 'cd' here rather than in all backends is easier.
-    cd "${tmpd}"
-
-    # If the backend fails, we can just remove the temporary directory to
-    # remove all the cruft it may have left behind. Then we just exit in
-    # error too.
-    if ! "${OLDPWD}/support/download/${backend}" \
-            ${quiet} ${recurse} \
-            -o "${tmpf}" "${@}"
-    then
-        rm -rf "${tmpd}"
-        exit 1
-    fi
+    # Look through all the URIs that we were given to download the package
+    # source
+    download_and_check=0
+    for uri in "${uris[@]}"; do
+        backend=${uri%+*}
+        case "${backend}" in
+            git|svn|cvs|bzr|file|scp|hg) ;;
+            *) backend="wget" ;;
+        esac
+        uri=${uri#*+}
+
+        urlencode=${backend#*|}
+        # urlencode must be "urlencode"
+        [ "${urlencode}" != "urlencode" ] && urlencode=""
+
+        # Helpers expect to run in a directory that is *really* trashable, so
+        # they are free to create whatever files and/or sub-dirs they might need.
+        # Doing the 'cd' here rather than in all backends is easier.
+        cd "${tmpd}"
+
+        # If the backend fails, we can just remove the content of the temporary
+        # directory to remove all the cruft it may have left behind, and try
+        # the next URI until one succeeds. Once we run out of URIs to try, we
+        # need to clean up and exit.
+        if ! "${OLDPWD}/support/download/${backend}" \
+                $([ -n "${urlencode}" ] && printf %s '-e') \
+                -c "${cset}" \
+                -n "${raw_base_name}" \
+                -N "${raw_name}" \
+                -f "${filename}" \
+                -u "${uri}" \
+                -o "${tmpf}" \
+                ${quiet} ${recurse} "${@}"
+        then
+            rm -rf "${tmpd:?}/*"
+            # cd back to keep path coherence
+            cd "${OLDPWD}"
+            continue
+        fi
 
-    # cd back to free the temp-dir, so we can remove it later
-    cd "${OLDPWD}"
+        # cd back to free the temp-dir, so we can remove it later
+        cd "${OLDPWD}"
 
-    # Check if the downloaded file is sane, and matches the stored hashes
-    # for that file
-    if ! support/download/check-hash ${quiet} "${hfile}" "${tmpf}" "${output##*/}"; then
+        # Check if the downloaded file is sane, and matches the stored hashes
+        # for that file
+        if ! support/download/check-hash ${quiet} "${hfile}" "${tmpf}" "${output##*/}"; then
+            rm -rf "${tmpd:?}/*"
+            # cd back to keep path coherence
+            cd "${OLDPWD}"
+            continue
+        fi
+        download_and_check=1
+        break
+    done
+
+    # We tried every URI possible, none seems to work or to check against the
+    # available hash. *ABORT MISSION*
+    if [ "${download_and_check}" -eq 0 ]; then
         rm -rf "${tmpd}"
         exit 1
     fi
@@ -164,16 +203,13 @@ DESCRIPTION
 
     -h  This help text.
 
-    -b BACKEND
-        Wrap the specified BACKEND. Known backends are:
-            bzr     Bazaar
-            cp      Local files
-            cvs     Concurrent Versions System
-            git     Git
-            hg      Mercurial
-            scp     Secure copy
-            svn     Subversion
-            wget    HTTP download
+    -u URIs
+        The URI to get the file from; it must respect the format given in
+        the example below.
+        You may give as many '-u URI' options as you want; the script will
+        stop at the first successful download.
+
+        Example: backend+URI; git+http://example.com or http+http://example.com
 
     -o FILE
         Store the downloaded archive in FILE.
diff --git a/support/download/cp b/support/download/file
similarity index 90%
rename from support/download/cp
rename to support/download/file
index 52fe2de83d..a3e616a181 100755
--- a/support/download/cp
+++ b/support/download/file
@@ -3,7 +3,7 @@
 # We want to catch any unexpected failure, and exit immediately
 set -e
 
-# Download helper for cp, to be called from the download wrapper script
+# Download helper for file, to be called from the download wrapper script
 #
 # Options:
 #   -q          Be quiet.
@@ -23,7 +23,7 @@ while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=;;
     o)  output="${OPTARG}";;
-    u)  source="${OPTARG}";;
+    u)  source="${OPTARG#*://}";;
     :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
diff --git a/support/download/wget b/support/download/wget
index fece6663ca..c69e6071aa 100755
--- a/support/download/wget
+++ b/support/download/wget
@@ -8,7 +8,9 @@ set -e
 # Options:
 #   -q          Be quiet.
 #   -o FILE     Save into file FILE.
+#   -f FILENAME The filename of the tarball to get at URL
 #   -u URL      Download file at URL.
+#   -e ENCODE   Tell wget to urlencode the filename passed to it
 #
 # Environment:
 #   WGET     : the wget command to call
@@ -18,7 +20,9 @@ while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
     case "${OPT}" in
     q)  verbose=-q;;
     o)  output="${OPTARG}";;
+    f)  filename="${OPTARG}";;
     u)  url="${OPTARG}";;
+    e)  encode="-e";;
     :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
     \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
     esac
@@ -32,4 +36,8 @@ _wget() {
     eval ${WGET} "${@}"
 }
 
-_wget ${verbose} "${@}" -O "'${output}'" "'${url}'"
+# Replace every '?' with '%3F' in the filename; only for the PRIMARY and BACKUP
+# mirror
+[ -n "${encode}" ] && filename=${filename//\?/%3F}
+
+_wget ${verbose} "${@}" -O "'${output}'" "'${url}/${filename}'"
-- 
2.16.2

* [Buildroot] [v3 03/13] packages: use new $($PKG)_DL_DIR) variable
From: Maxime Hadjinlian @ 2018-03-31 14:23 UTC
  To: buildroot

Instead of DL_DIR, packages should now use $($(PKG)_DL_DIR) to ease the
transition to a new directory structure for DL_DIR.

This commit has been generated with the following script:

for i in $(find . -iname "*.mk"); do
	if ! grep -q "\$(DL_DIR)" ${i}; then
		continue
	fi
	pkg_name="$(basename $(dirname ${i}))"
	[ "${pkg_name}" = "package" ] && continue
	raw_pkg_name=$(echo ${pkg_name} | tr [a-z] [A-Z] | tr '-' '_')
	pkg_dl_dir="${raw_pkg_name}_DL_DIR"
	sed -i "s/\$(DL_DIR)/\$($pkg_dl_dir)/" ${i}
done
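
The net effect on each .mk file is a mechanical substitution of this kind
(hypothetical package "foo", mirroring the real hunks below):

    before: $(UNZIP) -d $(@D) $(DL_DIR)/$(FOO_SOURCE)
    after:  $(UNZIP) -d $(@D) $(FOO_DL_DIR)/$(FOO_SOURCE)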

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 boot/at91bootstrap/at91bootstrap.mk                                 | 2 +-
 package/amd-catalyst/amd-catalyst.mk                                | 2 +-
 package/android-tools/android-tools.mk                              | 2 +-
 package/angularjs/angularjs.mk                                      | 2 +-
 package/bootstrap/bootstrap.mk                                      | 4 +++-
 package/cache-calibrator/cache-calibrator.mk                        | 2 +-
 package/cargo/cargo.mk                                              | 4 ++--
 package/cracklib/cracklib.mk                                        | 2 +-
 package/cryptopp/cryptopp.mk                                        | 2 +-
 package/devmem2/devmem2.mk                                          | 2 +-
 package/dhrystone/dhrystone.mk                                      | 2 +-
 package/doom-wad/doom-wad.mk                                        | 2 +-
 package/espeak/espeak.mk                                            | 2 +-
 package/fan-ctrl/fan-ctrl.mk                                        | 2 +-
 package/freescale-imx/firmware-imx/firmware-imx.mk                  | 2 +-
 package/freescale-imx/gpu-amd-bin-mx51/gpu-amd-bin-mx51.mk          | 2 +-
 package/freescale-imx/imx-codec/imx-codec.mk                        | 2 +-
 package/freescale-imx/imx-gpu-g2d/imx-gpu-g2d.mk                    | 2 +-
 package/freescale-imx/imx-gpu-viv/imx-gpu-viv.mk                    | 2 +-
 package/freescale-imx/imx-parser/imx-parser.mk                      | 2 +-
 package/freescale-imx/imx-vpu/imx-vpu.mk                            | 2 +-
 package/freescale-imx/imx-vpuwrap/imx-vpuwrap.mk                    | 2 +-
 package/freescale-imx/libz160/libz160.mk                            | 2 +-
 package/gcc/gcc.mk                                                  | 2 +-
 package/irrlicht/irrlicht.mk                                        | 2 +-
 package/jquery-mobile/jquery-mobile.mk                              | 2 +-
 package/jquery-sparkline/jquery-sparkline.mk                        | 2 +-
 package/jquery-ui-themes/jquery-ui-themes.mk                        | 2 +-
 package/jquery-ui/jquery-ui.mk                                      | 2 +-
 package/jquery-validation/jquery-validation.mk                      | 2 +-
 package/jquery/jquery.mk                                            | 2 +-
 package/kodi/kodi.mk                                                | 6 +++---
 package/libb64/libb64.mk                                            | 2 +-
 package/libfreeimage/libfreeimage.mk                                | 2 +-
 package/libjson/libjson.mk                                          | 2 +-
 package/libsoil/libsoil.mk                                          | 2 +-
 package/lsof/lsof.mk                                                | 2 +-
 package/musl-compat-headers/musl-compat-headers.mk                  | 2 +-
 package/nmon/nmon.mk                                                | 2 +-
 package/nvidia-driver/nvidia-driver.mk                              | 2 +-
 .../nvidia-tegra23/nvidia-tegra23-codecs/nvidia-tegra23-codecs.mk   | 2 +-
 package/openobex/openobex.mk                                        | 6 ++++++
 package/opentyrian-data/opentyrian-data.mk                          | 2 +-
 package/perl/perl.mk                                                | 2 +-
 package/python-keyring/python-keyring.mk                            | 1 -
 package/python-pytz/python-pytz.mk                                  | 2 +-
 package/python-simplegeneric/python-simplegeneric.mk                | 2 +-
 package/rapidxml/rapidxml.mk                                        | 2 +-
 package/rpi-wifi-firmware/rpi-wifi-firmware.mk                      | 2 +-
 package/rust-bin/rust-bin.mk                                        | 2 +-
 package/sam-ba/sam-ba.mk                                            | 2 +-
 package/spidev_test/spidev_test.mk                                  | 2 +-
 package/tar/tar.mk                                                  | 2 +-
 package/tesseract-ocr/tesseract-ocr.mk                              | 2 +-
 package/ti-cgt-pru/ti-cgt-pru.mk                                    | 4 ++--
 package/ti-gfx/ti-gfx.mk                                            | 4 ++--
 package/ts4900-fpga/ts4900-fpga.mk                                  | 2 +-
 package/unscd/unscd.mk                                              | 2 +-
 package/urg/urg.mk                                                  | 4 +++-
 package/waf/waf.mk                                                  | 2 +-
 package/whetstone/whetstone.mk                                      | 2 +-
 package/wilc1000-firmware/wilc1000-firmware.mk                      | 2 +-
 package/zynq-boot-bin/zynq-boot-bin.mk                              | 2 +-
 63 files changed, 76 insertions(+), 67 deletions(-)

diff --git a/boot/at91bootstrap/at91bootstrap.mk b/boot/at91bootstrap/at91bootstrap.mk
index f6550588d3..c68b56b6ec 100644
--- a/boot/at91bootstrap/at91bootstrap.mk
+++ b/boot/at91bootstrap/at91bootstrap.mk
@@ -17,7 +17,7 @@ AT91BOOTSTRAP_INSTALL_IMAGES = YES
 AT91BOOTSTRAP_INSTALL_TARGET = NO
 
 define AT91BOOTSTRAP_EXTRACT_CMDS
-	$(UNZIP) -d $(BUILD_DIR) $(DL_DIR)/$(AT91BOOTSTRAP_SOURCE)
+	$(UNZIP) -d $(BUILD_DIR) $(AT91BOOTSTRAP_DL_DIR)/$(AT91BOOTSTRAP_SOURCE)
 	mv $(BUILD_DIR)/Bootstrap-v$(AT91BOOTSTRAP_VERSION)/* $(@D)
 	rmdir $(BUILD_DIR)/Bootstrap-v$(AT91BOOTSTRAP_VERSION)
 endef
diff --git a/package/amd-catalyst/amd-catalyst.mk b/package/amd-catalyst/amd-catalyst.mk
index b9396e11fa..d864095c31 100644
--- a/package/amd-catalyst/amd-catalyst.mk
+++ b/package/amd-catalyst/amd-catalyst.mk
@@ -17,7 +17,7 @@ AMD_CATALYST_ARCH_DIR = $(@D)/arch/x86$(AMD_CATALYST_SUFFIX)
 AMD_CATALYST_LIB_SUFFIX = $(if $(BR2_x86_64),64)
 
 define AMD_CATALYST_EXTRACT_CMDS
-	unzip -q $(DL_DIR)/$(AMD_CATALYST_SOURCE) -d $(@D)
+	unzip -q $(AMD_CATALYST_DL_DIR)/$(AMD_CATALYST_SOURCE) -d $(@D)
 	$(SHELL) $(@D)/AMD-Catalyst-$(AMD_CATALYST_VERSION)-Linux-installer-$(AMD_CATALYST_VERBOSE_VER)-x86.x86_64.run --extract $(@D)
 endef
 
diff --git a/package/android-tools/android-tools.mk b/package/android-tools/android-tools.mk
index f6c6913a68..11e0a15a7f 100644
--- a/package/android-tools/android-tools.mk
+++ b/package/android-tools/android-tools.mk
@@ -15,7 +15,7 @@ ANDROID_TOOLS_LICENSE_FILES = debian/copyright
 # Extract the Debian tarball inside the sources
 define ANDROID_TOOLS_DEBIAN_EXTRACT
 	$(call suitable-extractor,$(notdir $(ANDROID_TOOLS_EXTRA_DOWNLOADS))) \
-		$(DL_DIR)/$(notdir $(ANDROID_TOOLS_EXTRA_DOWNLOADS)) | \
+		$(ANDROID_TOOLS_DL_DIR)/$(notdir $(ANDROID_TOOLS_EXTRA_DOWNLOADS)) | \
 		$(TAR) -C $(@D) $(TAR_OPTIONS) -
 endef
 
diff --git a/package/angularjs/angularjs.mk b/package/angularjs/angularjs.mk
index 60702a26d2..a941bc3828 100644
--- a/package/angularjs/angularjs.mk
+++ b/package/angularjs/angularjs.mk
@@ -12,7 +12,7 @@ ANGULARJS_LICENSE = MIT
 ANGULARJS_LICENSE_FILES = angular.js
 
 define ANGULARJS_EXTRACT_CMDS
-	unzip $(DL_DIR)/$(ANGULARJS_SOURCE) -d $(@D)
+	unzip $(ANGULARJS_DL_DIR)/$(ANGULARJS_SOURCE) -d $(@D)
 	mv $(@D)/angular-$(ANGULARJS_VERSION)/* $(@D)
 	rmdir $(@D)/angular-$(ANGULARJS_VERSION)
 endef
diff --git a/package/bootstrap/bootstrap.mk b/package/bootstrap/bootstrap.mk
index 61d6c826fc..50cdc322e4 100644
--- a/package/bootstrap/bootstrap.mk
+++ b/package/bootstrap/bootstrap.mk
@@ -11,7 +11,9 @@ BOOTSTRAP_LICENSE = MIT
 BOOTSTRAP_LICENSE_FILES = css/bootstrap.css
 
 define BOOTSTRAP_EXTRACT_CMDS
-	$(UNZIP) $(DL_DIR)/$(BOOTSTRAP_SOURCE) -d $(@D)
+	$(UNZIP) $(BOOTSTRAP_DL_DIR)/$(BOOTSTRAP_SOURCE) -d $(@D)
+	mv $(@D)/bootstrap-$(BOOTSTRAP_VERSION)-dist/* $(@D)/
+	rmdir $(@D)/bootstrap-$(BOOTSTRAP_VERSION)-dist
 endef
 
 define BOOTSTRAP_INSTALL_TARGET_CMDS
diff --git a/package/cache-calibrator/cache-calibrator.mk b/package/cache-calibrator/cache-calibrator.mk
index 6f12d119ef..203732e4f7 100644
--- a/package/cache-calibrator/cache-calibrator.mk
+++ b/package/cache-calibrator/cache-calibrator.mk
@@ -10,7 +10,7 @@ CACHE_CALIBRATOR_LICENSE = Cache calibrator license
 CACHE_CALIBRATOR_LICENSE_FILES = calibrator.c.license
 
 define CACHE_CALIBRATOR_EXTRACT_CMDS
-	cp $(DL_DIR)/$(CACHE_CALIBRATOR_SOURCE) $(@D)
+	cp $(CACHE_CALIBRATOR_DL_DIR)/$(CACHE_CALIBRATOR_SOURCE) $(@D)
 endef
 
 define CACHE_CALIBRATOR_EXTRACT_LICENSE
diff --git a/package/cargo/cargo.mk b/package/cargo/cargo.mk
index 3fd088d727..36b2718e90 100644
--- a/package/cargo/cargo.mk
+++ b/package/cargo/cargo.mk
@@ -37,7 +37,7 @@ HOST_CARGO_HOME = $(HOST_DIR)/share/cargo
 define HOST_CARGO_EXTRACT_DEPS
 	@mkdir -p $(@D)/vendor
 	$(call suitable-extractor,$(CARGO_DEPS_SOURCE)) \
-		$(DL_DIR)/$(CARGO_DEPS_SOURCE) | \
+		$(CARGO_DL_DIR)/$(CARGO_DEPS_SOURCE) | \
 		$(TAR) --strip-components=1 -C $(@D)/vendor $(TAR_OPTIONS) -
 endef
 
@@ -46,7 +46,7 @@ HOST_CARGO_POST_EXTRACT_HOOKS += HOST_CARGO_EXTRACT_DEPS
 define HOST_CARGO_EXTRACT_INSTALLER
 	@mkdir -p $(@D)/src/rust-installer
 	$(call suitable-extractor,$(CARGO_INSTALLER_SOURCE)) \
-		$(DL_DIR)/$(CARGO_INSTALLER_SOURCE) | \
+		$(CARGO_DL_DIR)/$(CARGO_INSTALLER_SOURCE) | \
 		$(TAR) --strip-components=1 -C $(@D)/src/rust-installer $(TAR_OPTIONS) -
 endef
 
diff --git a/package/cracklib/cracklib.mk b/package/cracklib/cracklib.mk
index 2fa3d3b4ea..aeee60d1e7 100644
--- a/package/cracklib/cracklib.mk
+++ b/package/cracklib/cracklib.mk
@@ -29,7 +29,7 @@ HOST_CRACKLIB_CONF_OPTS += --without-python --without-zlib
 
 ifeq ($(BR2_PACKAGE_CRACKLIB_FULL_DICT),y)
 CRACKLIB_EXTRA_DOWNLOADS = cracklib-words-$(CRACKLIB_VERSION).gz
-CRACKLIB_DICT_SOURCE = $(DL_DIR)/cracklib-words-$(CRACKLIB_VERSION).gz
+CRACKLIB_DICT_SOURCE = $(CRACKLIB_DL_DIR)/cracklib-words-$(CRACKLIB_VERSION).gz
 else
 CRACKLIB_DICT_SOURCE = $(@D)/dicts/cracklib-small
 endif
diff --git a/package/cryptopp/cryptopp.mk b/package/cryptopp/cryptopp.mk
index d7aabf04b8..6da39b09f2 100644
--- a/package/cryptopp/cryptopp.mk
+++ b/package/cryptopp/cryptopp.mk
@@ -12,7 +12,7 @@ CRYPTOPP_LICENSE_FILES = License.txt
 CRYPTOPP_INSTALL_STAGING = YES
 
 define HOST_CRYPTOPP_EXTRACT_CMDS
-	$(UNZIP) $(DL_DIR)/$(CRYPTOPP_SOURCE) -d $(@D)
+	$(UNZIP) $(CRYPTOPP_DL_DIR)/$(CRYPTOPP_SOURCE) -d $(@D)
 endef
 
 HOST_CRYPTOPP_MAKE_OPTS = \
diff --git a/package/devmem2/devmem2.mk b/package/devmem2/devmem2.mk
index eb92db6098..044e6fa596 100644
--- a/package/devmem2/devmem2.mk
+++ b/package/devmem2/devmem2.mk
@@ -11,7 +11,7 @@ DEVMEM2_LICENSE = GPL-2.0+
 DEVMEM2_LICENSE_FILES = devmem2.c.license
 
 define DEVMEM2_EXTRACT_CMDS
-	cp $(DL_DIR)/$($(PKG)_SOURCE) $(@D)/
+	cp $(DEVMEM2_DL_DIR)/$($(PKG)_SOURCE) $(@D)/
 endef
 
 define DEVMEM2_EXTRACT_LICENSE
diff --git a/package/dhrystone/dhrystone.mk b/package/dhrystone/dhrystone.mk
index c0bca9895d..d54c8d94e8 100644
--- a/package/dhrystone/dhrystone.mk
+++ b/package/dhrystone/dhrystone.mk
@@ -9,7 +9,7 @@ DHRYSTONE_SOURCE = dhry-c
 DHRYSTONE_SITE = http://www.netlib.org/benchmark
 
 define DHRYSTONE_EXTRACT_CMDS
-	(cd $(@D) && $(SHELL) $(DL_DIR)/$($(PKG)_SOURCE))
+	(cd $(@D) && $(SHELL) $(DHRYSTONE_DL_DIR)/$($(PKG)_SOURCE))
 	$(Q)cp package/dhrystone/Makefile $(@D)/
 endef
 
diff --git a/package/doom-wad/doom-wad.mk b/package/doom-wad/doom-wad.mk
index d3ac731bd5..f348fc2207 100644
--- a/package/doom-wad/doom-wad.mk
+++ b/package/doom-wad/doom-wad.mk
@@ -11,7 +11,7 @@ DOOM_WAD_SOURCE = doom$(subst .,,$(DOOM_WAD_VERSION))s.zip
 DOOM_WAD_SITE = http://www.jbserver.com/downloads/games/doom/misc/shareware
 
 define DOOM_WAD_EXTRACT_CMDS
-	$(UNZIP) -p $(DL_DIR)/$($(PKG)_SOURCE) 'DOOMS_19.[12]' > \
+	$(UNZIP) -p $(DOOM_WAD_DL_DIR)/$($(PKG)_SOURCE) 'DOOMS_19.[12]' > \
 		$(@D)/doom-$(DOOM_WAD_VERSION).zip
 	$(UNZIP) -d $(@D) $(@D)/doom-$(DOOM_WAD_VERSION).zip DOOM1.WAD
 endef
diff --git a/package/espeak/espeak.mk b/package/espeak/espeak.mk
index 226f17fc05..aa0b5b0716 100644
--- a/package/espeak/espeak.mk
+++ b/package/espeak/espeak.mk
@@ -21,7 +21,7 @@ ESPEAK_DEPENDENCIES = pulseaudio
 endif
 
 define ESPEAK_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(ESPEAK_SOURCE)
+	$(UNZIP) -d $(@D) $(ESPEAK_DL_DIR)/$(ESPEAK_SOURCE)
 	mv $(@D)/espeak-$(ESPEAK_VERSION)-source/* $(@D)
 	$(RM) -r $(@D)/espeak-$(ESPEAK_VERSION)-source
 endef
diff --git a/package/fan-ctrl/fan-ctrl.mk b/package/fan-ctrl/fan-ctrl.mk
index 53533a8a17..81a52f1e9f 100644
--- a/package/fan-ctrl/fan-ctrl.mk
+++ b/package/fan-ctrl/fan-ctrl.mk
@@ -12,7 +12,7 @@ FAN_CTRL_LICENSE = GPL-2.0+
 FAN_CTRL_LICENSE_FILES = fan-ctrl.c
 
 define FAN_CTRL_EXTRACT_CMDS
-	cp $(DL_DIR)/$(FAN_CTRL_SOURCE) $(@D)/fan-ctrl.c
+	cp $(FAN_CTRL_DL_DIR)/$(FAN_CTRL_SOURCE) $(@D)/fan-ctrl.c
 endef
 
 define FAN_CTRL_BUILD_CMDS
diff --git a/package/freescale-imx/firmware-imx/firmware-imx.mk b/package/freescale-imx/firmware-imx/firmware-imx.mk
index 630afcaa37..d5387b38da 100644
--- a/package/freescale-imx/firmware-imx/firmware-imx.mk
+++ b/package/freescale-imx/firmware-imx/firmware-imx.mk
@@ -15,7 +15,7 @@ FIRMWARE_IMX_REDISTRIBUTE = NO
 FIRMWARE_IMX_BLOBS = sdma vpu
 
 define FIRMWARE_IMX_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(FIRMWARE_IMX_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(FIRMWARE_IMX_DL_DIR)/$(FIRMWARE_IMX_SOURCE))
 endef
 
 define FIRMWARE_IMX_INSTALL_TARGET_CMDS
diff --git a/package/freescale-imx/gpu-amd-bin-mx51/gpu-amd-bin-mx51.mk b/package/freescale-imx/gpu-amd-bin-mx51/gpu-amd-bin-mx51.mk
index 44a6b4c7d2..5586073712 100644
--- a/package/freescale-imx/gpu-amd-bin-mx51/gpu-amd-bin-mx51.mk
+++ b/package/freescale-imx/gpu-amd-bin-mx51/gpu-amd-bin-mx51.mk
@@ -23,7 +23,7 @@ GPU_AMD_BIN_MX51_LICENSE_FILES = EULA
 GPU_AMD_BIN_MX51_REDISTRIBUTE = NO
 
 define GPU_AMD_BIN_MX51_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(GPU_AMD_BIN_MX51_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(GPU_AMD_BIN_MX51_DL_DIR)/$(GPU_AMD_BIN_MX51_SOURCE))
 endef
 
 # Upstream headers need to be compiled with -D_LINUX. It is more convenient
diff --git a/package/freescale-imx/imx-codec/imx-codec.mk b/package/freescale-imx/imx-codec/imx-codec.mk
index ea4d90110f..784c1fa4a9 100644
--- a/package/freescale-imx/imx-codec/imx-codec.mk
+++ b/package/freescale-imx/imx-codec/imx-codec.mk
@@ -26,7 +26,7 @@ IMX_CODEC_CONF_OPTS += --enable-vpu
 endif
 
 define IMX_CODEC_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(IMX_CODEC_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(IMX_CODEC_DL_DIR)/$(IMX_CODEC_SOURCE))
 endef
 
 # FIXME The Makefile installs both the arm9 and arm11 versions of the
diff --git a/package/freescale-imx/imx-gpu-g2d/imx-gpu-g2d.mk b/package/freescale-imx/imx-gpu-g2d/imx-gpu-g2d.mk
index 0c92b826c4..aafd70dc9d 100644
--- a/package/freescale-imx/imx-gpu-g2d/imx-gpu-g2d.mk
+++ b/package/freescale-imx/imx-gpu-g2d/imx-gpu-g2d.mk
@@ -15,7 +15,7 @@ IMX_GPU_G2D_LICENSE_FILES = EULA COPYING
 IMX_GPU_G2D_REDISTRIBUTE = NO
 
 define IMX_GPU_G2D_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(IMX_GPU_G2D_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(IMX_GPU_G2D_DL_DIR)/$(IMX_GPU_G2D_SOURCE))
 endef
 
 define IMX_GPU_G2D_INSTALL_STAGING_CMDS
diff --git a/package/freescale-imx/imx-gpu-viv/imx-gpu-viv.mk b/package/freescale-imx/imx-gpu-viv/imx-gpu-viv.mk
index a9f8b9d9bd..0dc2072984 100644
--- a/package/freescale-imx/imx-gpu-viv/imx-gpu-viv.mk
+++ b/package/freescale-imx/imx-gpu-viv/imx-gpu-viv.mk
@@ -26,7 +26,7 @@ IMX_GPU_VIV_DEPENDENCIES += xlib_libXdamage xlib_libXext xlib_libXfixes
 endif
 
 define IMX_GPU_VIV_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(IMX_GPU_VIV_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(IMX_GPU_VIV_DL_DIR)/$(IMX_GPU_VIV_SOURCE))
 endef
 
 # Instead of building, we fix up the inconsistencies that exist
diff --git a/package/freescale-imx/imx-parser/imx-parser.mk b/package/freescale-imx/imx-parser/imx-parser.mk
index d79a4444c8..11536e5024 100644
--- a/package/freescale-imx/imx-parser/imx-parser.mk
+++ b/package/freescale-imx/imx-parser/imx-parser.mk
@@ -24,7 +24,7 @@ IMX_PARSER_CONF_OPTS += --enable-fsw
 endif
 
 define IMX_PARSER_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(IMX_PARSER_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(IMX_PARSER_DL_DIR)/$(IMX_PARSER_SOURCE))
 endef
 
 # The Makefile installs several versions of the libraries, but we only
diff --git a/package/freescale-imx/imx-vpu/imx-vpu.mk b/package/freescale-imx/imx-vpu/imx-vpu.mk
index e3a1ee661d..2bcfe53eba 100644
--- a/package/freescale-imx/imx-vpu/imx-vpu.mk
+++ b/package/freescale-imx/imx-vpu/imx-vpu.mk
@@ -21,7 +21,7 @@ IMX_VPU_LICENSE_FILES = EULA COPYING
 IMX_VPU_REDISTRIBUTE = NO
 
 define IMX_VPU_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(IMX_VPU_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(IMX_VPU_DL_DIR)/$(IMX_VPU_SOURCE))
 endef
 
 define IMX_VPU_BUILD_CMDS
diff --git a/package/freescale-imx/imx-vpuwrap/imx-vpuwrap.mk b/package/freescale-imx/imx-vpuwrap/imx-vpuwrap.mk
index 7cbf7784c9..edba87279f 100644
--- a/package/freescale-imx/imx-vpuwrap/imx-vpuwrap.mk
+++ b/package/freescale-imx/imx-vpuwrap/imx-vpuwrap.mk
@@ -15,7 +15,7 @@ IMX_VPUWRAP_LICENSE_FILES = EULA COPYING
 IMX_VPUWRAP_REDISTRIBUTE = NO
 
 define IMX_VPUWRAP_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(IMX_VPUWRAP_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(IMX_VPUWRAP_DL_DIR)/$(IMX_VPUWRAP_SOURCE))
 endef
 
 $(eval $(autotools-package))
diff --git a/package/freescale-imx/libz160/libz160.mk b/package/freescale-imx/libz160/libz160.mk
index af43b1859d..5b1feeb441 100644
--- a/package/freescale-imx/libz160/libz160.mk
+++ b/package/freescale-imx/libz160/libz160.mk
@@ -15,7 +15,7 @@ LIBZ160_LICENSE_FILES = EULA
 LIBZ160_REDISTRIBUTE = NO
 
 define LIBZ160_EXTRACT_CMDS
-	$(call FREESCALE_IMX_EXTRACT_HELPER,$(DL_DIR)/$(LIBZ160_SOURCE))
+	$(call FREESCALE_IMX_EXTRACT_HELPER,$(LIBZ160_DL_DIR)/$(LIBZ160_SOURCE))
 endef
 
 define LIBZ160_INSTALL_STAGING_CMDS
diff --git a/package/gcc/gcc.mk b/package/gcc/gcc.mk
index 27fc1e987c..f42f36a967 100644
--- a/package/gcc/gcc.mk
+++ b/package/gcc/gcc.mk
@@ -316,7 +316,7 @@ HOST_GCC_COMMON_MAKE_OPTS = \
 	gcc_cv_libc_provides_ssp=$(if $(BR2_TOOLCHAIN_HAS_SSP),yes,no)
 
 ifeq ($(BR2_CCACHE),y)
-HOST_GCC_COMMON_CCACHE_HASH_FILES += $(DL_DIR)/$(GCC_SOURCE)
+HOST_GCC_COMMON_CCACHE_HASH_FILES += $(GCC_DL_DIR)/$(GCC_SOURCE)
 
 # Cfr. PATCH_BASE_DIRS in .stamp_patched, but we catch both versioned
 # and unversioned patches unconditionally. Moreover, to facilitate the
diff --git a/package/irrlicht/irrlicht.mk b/package/irrlicht/irrlicht.mk
index ccd1045ca0..a1e190d98d 100644
--- a/package/irrlicht/irrlicht.mk
+++ b/package/irrlicht/irrlicht.mk
@@ -25,7 +25,7 @@ IRRLICHT_SUBDIR = source/Irrlicht
 IRRLICHT_DEPENDENCIES = libgl xlib_libXxf86vm
 
 define IRRLICHT_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(IRRLICHT_SOURCE)
+	$(UNZIP) -d $(@D) $(IRRLICHT_DL_DIR)/$(IRRLICHT_SOURCE)
 	mv $(@D)/irrlicht-$(IRRLICHT_VERSION)/* $(@D)
 	$(RM) -r $(@D)/irrlicht-$(IRRLICHT_VERSION)
 endef
diff --git a/package/jquery-mobile/jquery-mobile.mk b/package/jquery-mobile/jquery-mobile.mk
index 1067952904..1171e60d0a 100644
--- a/package/jquery-mobile/jquery-mobile.mk
+++ b/package/jquery-mobile/jquery-mobile.mk
@@ -10,7 +10,7 @@ JQUERY_MOBILE_SOURCE = jquery.mobile-$(JQUERY_MOBILE_VERSION).zip
 JQUERY_MOBILE_LICENSE = MIT
 
 define JQUERY_MOBILE_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(JQUERY_MOBILE_SOURCE)
+	$(UNZIP) -d $(@D) $(JQUERY_MOBILE_DL_DIR)/$(JQUERY_MOBILE_SOURCE)
 endef
 
 JQUERY_MOBILE_INSTALLED_FILES = \
diff --git a/package/jquery-sparkline/jquery-sparkline.mk b/package/jquery-sparkline/jquery-sparkline.mk
index 165c2fa99c..683d84f4aa 100644
--- a/package/jquery-sparkline/jquery-sparkline.mk
+++ b/package/jquery-sparkline/jquery-sparkline.mk
@@ -10,7 +10,7 @@ JQUERY_SPARKLINE_SOURCE = jquery.sparkline.min.js
 JQUERY_SPARKLINE_LICENSE = BSD-3-Clause
 
 define JQUERY_SPARKLINE_EXTRACT_CMDS
-	cp $(DL_DIR)/$(JQUERY_SPARKLINE_SOURCE) $(@D)
+	cp $(JQUERY_SPARKLINE_DL_DIR)/$(JQUERY_SPARKLINE_SOURCE) $(@D)
 endef
 
 define JQUERY_SPARKLINE_INSTALL_TARGET_CMDS
diff --git a/package/jquery-ui-themes/jquery-ui-themes.mk b/package/jquery-ui-themes/jquery-ui-themes.mk
index cd3dab6d87..5108389a77 100644
--- a/package/jquery-ui-themes/jquery-ui-themes.mk
+++ b/package/jquery-ui-themes/jquery-ui-themes.mk
@@ -12,7 +12,7 @@ JQUERY_UI_THEMES_LICENSE_FILES = MIT-LICENSE.txt
 JQUERY_UI_THEMES_DEPENDENCIES = jquery-ui
 
 define JQUERY_UI_THEMES_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(JQUERY_UI_THEMES_SOURCE)
+	$(UNZIP) -d $(@D) $(JQUERY_UI_THEMES_DL_DIR)/$(JQUERY_UI_THEMES_SOURCE)
 	mv $(@D)/jquery-ui-themes-$(JQUERY_UI_THEMES_VERSION)/* $(@D)
 	$(RM) -r $(@D)/jquery-ui-themes-$(JQUERY_UI_THEMES_VERSION)
 endef
diff --git a/package/jquery-ui/jquery-ui.mk b/package/jquery-ui/jquery-ui.mk
index d829d6241b..9284e683f1 100644
--- a/package/jquery-ui/jquery-ui.mk
+++ b/package/jquery-ui/jquery-ui.mk
@@ -14,7 +14,7 @@ JQUERY_UI_LICENSE = MIT
 JQUERY_UI_LICENSE_FILES = MIT-LICENSE.txt
 
 define JQUERY_UI_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(JQUERY_UI_SOURCE)
+	$(UNZIP) -d $(@D) $(JQUERY_UI_DL_DIR)/$(JQUERY_UI_SOURCE)
 	mv $(@D)/jquery-ui-$(JQUERY_UI_VERSION)/* $(@D)
 	$(RM) -r $(@D)/jquery-ui-$(JQUERY_UI_VERSION)
 endef
diff --git a/package/jquery-validation/jquery-validation.mk b/package/jquery-validation/jquery-validation.mk
index 10d2a92afc..a8c2897983 100644
--- a/package/jquery-validation/jquery-validation.mk
+++ b/package/jquery-validation/jquery-validation.mk
@@ -11,7 +11,7 @@ JQUERY_VALIDATION_LICENSE = MIT
 JQUERY_VALIDATION_LICENSE_FILES = README.md
 
 define JQUERY_VALIDATION_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(JQUERY_VALIDATION_SOURCE)
+	$(UNZIP) -d $(@D) $(JQUERY_VALIDATION_DL_DIR)/$(JQUERY_VALIDATION_SOURCE)
 endef
 
 define JQUERY_VALIDATION_INSTALL_TARGET_CMDS
diff --git a/package/jquery/jquery.mk b/package/jquery/jquery.mk
index 363ee09633..f75eded8b3 100644
--- a/package/jquery/jquery.mk
+++ b/package/jquery/jquery.mk
@@ -10,7 +10,7 @@ JQUERY_SOURCE = jquery-$(JQUERY_VERSION).min.js
 JQUERY_LICENSE = MIT
 
 define JQUERY_EXTRACT_CMDS
-	cp $(DL_DIR)/$(JQUERY_SOURCE) $(@D)
+	cp $(JQUERY_DL_DIR)/$(JQUERY_SOURCE) $(@D)
 endef
 
 define JQUERY_INSTALL_TARGET_CMDS
diff --git a/package/kodi/kodi.mk b/package/kodi/kodi.mk
index 93f4dd4b31..739850d670 100644
--- a/package/kodi/kodi.mk
+++ b/package/kodi/kodi.mk
@@ -69,9 +69,9 @@ KODI_CONF_OPTS += \
 	-DDEPENDS_PATH=$(@D) \
 	-DWITH_FFMPEG=$(STAGING_DIR)/usr \
 	-DWITH_TEXTUREPACKER=$(HOST_DIR)/bin/TexturePacker \
-	-DLIBDVDCSS_URL=$(DL_DIR)/$(KODI_LIBDVDCSS_VERSION).tar.gz \
-	-DLIBDVDNAV_URL=$(DL_DIR)/$(KODI_LIBDVDNAV_VERSION).tar.gz \
-	-DLIBDVDREAD_URL=$(DL_DIR)/$(KODI_LIBDVDREAD_VERSION).tar.gz
+	-DLIBDVDCSS_URL=$(KODI_DL_DIR)/$(KODI_LIBDVDCSS_VERSION).tar.gz \
+	-DLIBDVDNAV_URL=$(KODI_DL_DIR)/$(KODI_LIBDVDNAV_VERSION).tar.gz \
+	-DLIBDVDREAD_URL=$(KODI_DL_DIR)/$(KODI_LIBDVDREAD_VERSION).tar.gz
 
 ifeq ($(BR2_ENABLE_LOCALE),)
 KODI_DEPENDENCIES += libiconv
diff --git a/package/libb64/libb64.mk b/package/libb64/libb64.mk
index 4dea9593a0..ed6d3cf4b4 100644
--- a/package/libb64/libb64.mk
+++ b/package/libb64/libb64.mk
@@ -14,7 +14,7 @@ LIBB64_INSTALL_STAGING = YES
 LIBB64_INSTALL_TARGET = NO
 
 define LIBB64_EXTRACT_CMDS
-	unzip $(DL_DIR)/$(LIBB64_SOURCE) -d $(BUILD_DIR)
+	unzip $(LIBB64_DL_DIR)/$(LIBB64_SOURCE) -d $(BUILD_DIR)
 endef
 
 define LIBB64_BUILD_CMDS
diff --git a/package/libfreeimage/libfreeimage.mk b/package/libfreeimage/libfreeimage.mk
index 0ca23933a6..e0aa1f0ae8 100644
--- a/package/libfreeimage/libfreeimage.mk
+++ b/package/libfreeimage/libfreeimage.mk
@@ -12,7 +12,7 @@ LIBFREEIMAGE_LICENSE_FILES = license-gplv2.txt license-gplv3.txt license-fi.txt
 LIBFREEIMAGE_INSTALL_STAGING = YES
 
 define LIBFREEIMAGE_EXTRACT_CMDS
-	$(UNZIP) $(DL_DIR)/$(LIBFREEIMAGE_SOURCE) -d $(@D)
+	$(UNZIP) $(LIBFREEIMAGE_DL_DIR)/$(LIBFREEIMAGE_SOURCE) -d $(@D)
 	mv $(@D)/FreeImage/* $(@D)
 	rmdir $(@D)/FreeImage
 endef
diff --git a/package/libjson/libjson.mk b/package/libjson/libjson.mk
index 74224ba657..d04ddc40f3 100644
--- a/package/libjson/libjson.mk
+++ b/package/libjson/libjson.mk
@@ -23,7 +23,7 @@ endif
 LIBJSON_MAKE_OPTS += BUILD_TYPE= CXXFLAGS="$(LIBJSON_CXXFLAGS)"
 
 define LIBJSON_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(LIBJSON_SOURCE)
+	$(UNZIP) -d $(@D) $(LIBJSON_DL_DIR)/$(LIBJSON_SOURCE)
 	mv $(@D)/libjson/* $(@D)
 	$(RM) -r $(@D)/libjson
 	$(SED) '/ldconfig/d' $(@D)/makefile
diff --git a/package/libsoil/libsoil.mk b/package/libsoil/libsoil.mk
index efa67d9eee..2945edd8dd 100644
--- a/package/libsoil/libsoil.mk
+++ b/package/libsoil/libsoil.mk
@@ -14,7 +14,7 @@ LIBSOIL_LICENSE_FILES = src/stb_image_aug.c src/image_helper.c
 LIBSOIL_MAKEFILE = ../projects/makefile/alternate_Makefile.txt
 
 define LIBSOIL_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(LIBSOIL_SOURCE)
+	$(UNZIP) -d $(@D) $(LIBSOIL_DL_DIR)/$(LIBSOIL_SOURCE)
 	mv $(@D)/Simple\ OpenGL\ Image\ Library/* $(@D)
 endef
 
diff --git a/package/lsof/lsof.mk b/package/lsof/lsof.mk
index e5cd4bce6e..0dc8e2de2f 100644
--- a/package/lsof/lsof.mk
+++ b/package/lsof/lsof.mk
@@ -32,7 +32,7 @@ endif
 
 # The .tar.bz2 contains another .tar, which contains the source code.
 define LSOF_EXTRACT_CMDS
-	$(call suitable-extractor,$(LSOF_SOURCE)) $(DL_DIR)/$(LSOF_SOURCE) | \
+	$(call suitable-extractor,$(LSOF_SOURCE)) $(LSOF_DL_DIR)/$(LSOF_SOURCE) | \
 		$(TAR) -O $(TAR_OPTIONS) - lsof_$(LSOF_VERSION)/lsof_$(LSOF_VERSION)_src.tar | \
 	$(TAR) --strip-components=1 -C $(LSOF_DIR) $(TAR_OPTIONS) -
 endef
diff --git a/package/musl-compat-headers/musl-compat-headers.mk b/package/musl-compat-headers/musl-compat-headers.mk
index 91f5074ef6..1cfa894879 100644
--- a/package/musl-compat-headers/musl-compat-headers.mk
+++ b/package/musl-compat-headers/musl-compat-headers.mk
@@ -20,7 +20,7 @@ MUSL_COMPAT_HEADERS_INSTALL_STAGING = YES
 
 # Copying both headers so legal-info finds them (they are _LICENSE_FILES)
 define MUSL_COMPAT_HEADERS_EXTRACT_CMDS
-	$(INSTALL) -m 0644 -D $(DL_DIR)/$(notdir $(MUSL_COMPAT_HEADERS_QUEUE_H)) $(@D)/queue.h
+	$(INSTALL) -m 0644 -D $(MUSL_COMPAT_HEADERS_DL_DIR)/$(notdir $(MUSL_COMPAT_HEADERS_QUEUE_H)) $(@D)/queue.h
 	$(INSTALL) -m 0644 -D $(MUSL_COMPAT_HEADERS_PKGDIR)/cdefs.h $(@D)/cdefs.h
 endef
 
diff --git a/package/nmon/nmon.mk b/package/nmon/nmon.mk
index f561d6dcc0..f7edd66a26 100644
--- a/package/nmon/nmon.mk
+++ b/package/nmon/nmon.mk
@@ -13,7 +13,7 @@ NMON_DEPENDENCIES = ncurses
 NMON_CFLAGS = $(TARGET_CFLAGS) -D JFS -D GETUSER -D LARGEMEM -D DEBIAN
 
 define NMON_EXTRACT_CMDS
-	cp $(DL_DIR)/$(NMON_SOURCE) $(@D)
+	cp $(NMON_DL_DIR)/$(NMON_SOURCE) $(@D)
 endef
 
 define NMON_BUILD_CMDS
diff --git a/package/nvidia-driver/nvidia-driver.mk b/package/nvidia-driver/nvidia-driver.mk
index e56661059f..404fa881ec 100644
--- a/package/nvidia-driver/nvidia-driver.mk
+++ b/package/nvidia-driver/nvidia-driver.mk
@@ -144,7 +144,7 @@ endif # BR2_PACKAGE_NVIDIA_DRIVER_MODULE == y
 # virtually everywhere, and it is fine enough to provide useful options.
 # Except it can't extract into an existing (even empty) directory.
 define NVIDIA_DRIVER_EXTRACT_CMDS
-	$(SHELL) $(DL_DIR)/$(NVIDIA_DRIVER_SOURCE) --extract-only --target \
+	$(SHELL) $(NVIDIA_DRIVER_DL_DIR)/$(NVIDIA_DRIVER_SOURCE) --extract-only --target \
 		$(@D)/tmp-extract
 	chmod u+w -R $(@D)
 	mv $(@D)/tmp-extract/* $(@D)/tmp-extract/.manifest $(@D)
diff --git a/package/nvidia-tegra23/nvidia-tegra23-codecs/nvidia-tegra23-codecs.mk b/package/nvidia-tegra23/nvidia-tegra23-codecs/nvidia-tegra23-codecs.mk
index 5514643416..2885021c58 100644
--- a/package/nvidia-tegra23/nvidia-tegra23-codecs/nvidia-tegra23-codecs.mk
+++ b/package/nvidia-tegra23/nvidia-tegra23-codecs/nvidia-tegra23-codecs.mk
@@ -15,7 +15,7 @@ NVIDIA_TEGRA23_CODECS_REDISTRIBUTE = NO
 define NVIDIA_TEGRA23_CODECS_EXTRACT_CMDS
 	$(INSTALL) -d $(@D)
 	$(call suitable-extractor,$(NVIDIA_TEGRA23_CODECS_SOURCE)) \
-		$(DL_DIR)/$(NVIDIA_TEGRA23_CODECS_SOURCE) | \
+		$(NVIDIA_TEGRA23_CODECS_DL_DIR)/$(NVIDIA_TEGRA23_CODECS_SOURCE) | \
 	$(TAR) --strip-components=0 -C $(@D) $(TAR_OPTIONS) -
 	$(INSTALL) -d $(@D)/restricted_codecs
 	$(call suitable-extractor,$(@D)/restricted_codecs.tbz2) \
diff --git a/package/openobex/openobex.mk b/package/openobex/openobex.mk
index ea41f65a80..c19780110c 100644
--- a/package/openobex/openobex.mk
+++ b/package/openobex/openobex.mk
@@ -19,6 +19,12 @@ ifeq ($(BR2_PACKAGE_BLUEZ_UTILS),y)
 OPENOBEX_DEPENDENCIES += bluez_utils
 endif
 
+define OPENOBEX_EXTRACT_CMDS
+	$(UNZIP) -d $(@D) $(OPENOBEX_DL_DIR)/$(OPENOBEX_SOURCE)
+	mv $(@D)/openobex-$(OPENOBEX_VERSION)-Source/* $(@D)
+	$(RM) -r $(@D)/openobex-$(OPENOBEX_VERSION)-Source
+endef
+
 ifeq ($(BR2_PACKAGE_BLUEZ5_UTILS),y)
 OPENOBEX_DEPENDENCIES += bluez5_utils
 endif
diff --git a/package/opentyrian-data/opentyrian-data.mk b/package/opentyrian-data/opentyrian-data.mk
index 4b879df1c3..9e38b61bfa 100644
--- a/package/opentyrian-data/opentyrian-data.mk
+++ b/package/opentyrian-data/opentyrian-data.mk
@@ -10,7 +10,7 @@ OPENTYRIAN_DATA_SOURCE = tyrian21.zip
 OPENTYRIAN_DATA_LICENSE = Freeware
 
 define OPENTYRIAN_DATA_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(OPENTYRIAN_DATA_SOURCE)
+	$(UNZIP) -d $(@D) $(OPENTYRIAN_DATA_DL_DIR)/$(OPENTYRIAN_DATA_SOURCE)
 endef
 
 define OPENTYRIAN_DATA_INSTALL_TARGET_CMDS
diff --git a/package/perl/perl.mk b/package/perl/perl.mk
index 58bf3eb6ae..c47367b9da 100644
--- a/package/perl/perl.mk
+++ b/package/perl/perl.mk
@@ -24,7 +24,7 @@ PERL_EXTRA_DOWNLOADS = $(PERL_CROSS_SITE)/$(PERL_CROSS_SOURCE)
 # as a separate package. Instead, it is downloaded and extracted
 # together with perl
 define PERL_CROSS_EXTRACT
-	$(call suitable-extractor,$(PERL_CROSS_SOURCE)) $(DL_DIR)/$(PERL_CROSS_SOURCE) | \
+	$(call suitable-extractor,$(PERL_CROSS_SOURCE)) $(PERL_DL_DIR)/$(PERL_CROSS_SOURCE) | \
 	$(TAR) --strip-components=1 -C $(@D) $(TAR_OPTIONS) -
 endef
 PERL_POST_EXTRACT_HOOKS += PERL_CROSS_EXTRACT
diff --git a/package/python-keyring/python-keyring.mk b/package/python-keyring/python-keyring.mk
index 27db2802ed..8ac18ae052 100644
--- a/package/python-keyring/python-keyring.mk
+++ b/package/python-keyring/python-keyring.mk
@@ -6,7 +6,6 @@
 
 PYTHON_KEYRING_VERSION = 10.5.0
 PYTHON_KEYRING_SOURCE = keyring-$(PYTHON_KEYRING_VERSION).tar.gz
-PYTHON_KEYRING_SITE = https://pypi.python.org/packages/42/2e/51bd1739fe335095a2174db3f2f230346762e7e572471059540146a521f6
 PYTHON_KEYRING_SETUP_TYPE = setuptools
 PYTHON_KEYRING_LICENSE = MIT
 PYTHON_KEYRING_LICENSE_FILES = LICENSE
diff --git a/package/python-pytz/python-pytz.mk b/package/python-pytz/python-pytz.mk
index 6e130a6b2f..1d3603389b 100644
--- a/package/python-pytz/python-pytz.mk
+++ b/package/python-pytz/python-pytz.mk
@@ -12,7 +12,7 @@ PYTHON_PYTZ_LICENSE = MIT
 PYTHON_PYTZ_LICENSE_FILES = LICENSE.txt
 
 define PYTHON_PYTZ_EXTRACT_CMDS
-	unzip $(DL_DIR)/$(PYTHON_PYTZ_SOURCE) -d $(@D)
+	unzip $(PYTHON_PYTZ_DL_DIR)/$(PYTHON_PYTZ_SOURCE) -d $(@D)
 	mv $(@D)/pytz-$(PYTHON_PYTZ_VERSION)/* $(@D)
 	rmdir $(@D)/pytz-$(PYTHON_PYTZ_VERSION)
 endef
diff --git a/package/python-simplegeneric/python-simplegeneric.mk b/package/python-simplegeneric/python-simplegeneric.mk
index cc84320428..d3bfa69a7e 100644
--- a/package/python-simplegeneric/python-simplegeneric.mk
+++ b/package/python-simplegeneric/python-simplegeneric.mk
@@ -16,7 +16,7 @@ PYTHON_SIMPLEGENERIC_LICENSE = ZPL-2.1
 PYTHON_SIMPLEGENERIC_SETUP_TYPE = setuptools
 
 define PYTHON_SIMPLEGENERIC_EXTRACT_CMDS
-	unzip $(DL_DIR)/$(PYTHON_SIMPLEGENERIC_SOURCE) -d $(@D)
+	unzip $(PYTHON_SIMPLEGENERIC_DL_DIR)/$(PYTHON_SIMPLEGENERIC_SOURCE) -d $(@D)
 	mv $(@D)/simplegeneric-$(PYTHON_SIMPLEGENERIC_VERSION)/* $(@D)
 	rmdir $(@D)/simplegeneric-$(PYTHON_SIMPLEGENERIC_VERSION)
 endef
diff --git a/package/rapidxml/rapidxml.mk b/package/rapidxml/rapidxml.mk
index 2bec8fe1f2..9d034d8205 100644
--- a/package/rapidxml/rapidxml.mk
+++ b/package/rapidxml/rapidxml.mk
@@ -15,7 +15,7 @@ RAPIDXML_INSTALL_TARGET = NO
 RAPIDXML_INSTALL_STAGING = YES
 
 define RAPIDXML_EXTRACT_CMDS
-	$(UNZIP) -d $(@D) $(DL_DIR)/$(RAPIDXML_SOURCE)
+	$(UNZIP) -d $(@D) $(RAPIDXML_DL_DIR)/$(RAPIDXML_SOURCE)
 	mv $(@D)/rapidxml-$(RAPIDXML_VERSION)/* $(@D)/
 	rmdir $(@D)/rapidxml-$(RAPIDXML_VERSION)
 endef
diff --git a/package/rpi-wifi-firmware/rpi-wifi-firmware.mk b/package/rpi-wifi-firmware/rpi-wifi-firmware.mk
index 6c855a8e14..83473d5bb9 100644
--- a/package/rpi-wifi-firmware/rpi-wifi-firmware.mk
+++ b/package/rpi-wifi-firmware/rpi-wifi-firmware.mk
@@ -12,7 +12,7 @@ RPI_WIFI_FIRMWARE_SITE = https://raw.githubusercontent.com/RPi-Distro/firmware-n
 RPI_WIFI_FIRMWARE_LICENSE = PROPRIETARY
 
 define RPI_WIFI_FIRMWARE_EXTRACT_CMDS
-	cp $(DL_DIR)/$($(PKG)_SOURCE) $(@D)/
+	cp $(RPI_WIFI_FIRMWARE_DL_DIR)/$($(PKG)_SOURCE) $(@D)/
 endef
 
 define RPI_WIFI_FIRMWARE_INSTALL_TARGET_CMDS
diff --git a/package/rust-bin/rust-bin.mk b/package/rust-bin/rust-bin.mk
index cd5844b115..759d468304 100644
--- a/package/rust-bin/rust-bin.mk
+++ b/package/rust-bin/rust-bin.mk
@@ -22,7 +22,7 @@ HOST_RUST_BIN_LIBSTD_HOST_PREFIX = rust-std-$(RUST_BIN_VERSION)-$(RUSTC_HOST_NAM
 define HOST_RUST_BIN_LIBSTD_EXTRACT
 	mkdir -p $(@D)/std
 	$(foreach f,$(HOST_RUST_BIN_EXTRA_DOWNLOADS), \
-		$(call suitable-extractor,$(f)) $(DL_DIR)/$(f) | \
+		$(call suitable-extractor,$(f)) $(RUST_BIN_DL_DIR)/$(f) | \
 			$(TAR) -C $(@D)/std $(TAR_OPTIONS) -
 	)
 	cd $(@D)/rustc/lib/rustlib; \
diff --git a/package/sam-ba/sam-ba.mk b/package/sam-ba/sam-ba.mk
index a7941459c7..acc03a3b6b 100644
--- a/package/sam-ba/sam-ba.mk
+++ b/package/sam-ba/sam-ba.mk
@@ -13,7 +13,7 @@ SAM_BA_LICENSE_FILES = doc/license.txt tcl_lib/boards.tcl \
 		applets/sam4c/libraries/libchip_sam4c/include/sam4c/sam4c32e-1.h
 
 define HOST_SAM_BA_EXTRACT_CMDS
-	$(UNZIP) -d $(BUILD_DIR) $(DL_DIR)/$(SAM_BA_SOURCE)
+	$(UNZIP) -d $(BUILD_DIR) $(SAM_BA_DL_DIR)/$(SAM_BA_SOURCE)
 	mv $(BUILD_DIR)/sam-ba_cdc_linux/* $(@D)
 	rmdir $(BUILD_DIR)/sam-ba_cdc_linux/
 endef
diff --git a/package/spidev_test/spidev_test.mk b/package/spidev_test/spidev_test.mk
index 1d657803b4..bf8170cd2c 100644
--- a/package/spidev_test/spidev_test.mk
+++ b/package/spidev_test/spidev_test.mk
@@ -32,7 +32,7 @@ endef
 SPIDEV_TEST_POST_PATCH_HOOKS += SPIDEV_ADD_LINUX_IOCTL
 
 define SPIDEV_TEST_EXTRACT_CMDS
-	cp $(DL_DIR)/$(SPIDEV_TEST_SOURCE) $(@D)/spidev_test.c
+	cp $(SPIDEV_TEST_DL_DIR)/$(SPIDEV_TEST_SOURCE) $(@D)/spidev_test.c
 endef
 
 define SPIDEV_TEST_BUILD_CMDS
diff --git a/package/tar/tar.mk b/package/tar/tar.mk
index 92b6e9eaa5..9942e77737 100644
--- a/package/tar/tar.mk
+++ b/package/tar/tar.mk
@@ -40,7 +40,7 @@ HOST_TAR_SOURCE = tar-$(TAR_VERSION).cpio.gz
 define HOST_TAR_EXTRACT_CMDS
 	mkdir -p $(@D)
 	cd $(@D) && \
-		$(call suitable-extractor,$(HOST_TAR_SOURCE)) $(DL_DIR)/$(HOST_TAR_SOURCE) | cpio -i --preserve-modification-time
+		$(call suitable-extractor,$(HOST_TAR_SOURCE)) $(TAR_DL_DIR)/$(HOST_TAR_SOURCE) | cpio -i --preserve-modification-time
 	mv $(@D)/tar-$(TAR_VERSION)/* $(@D)
 	rmdir $(@D)/tar-$(TAR_VERSION)
 endef
diff --git a/package/tesseract-ocr/tesseract-ocr.mk b/package/tesseract-ocr/tesseract-ocr.mk
index 9e315b239a..919a4360ed 100644
--- a/package/tesseract-ocr/tesseract-ocr.mk
+++ b/package/tesseract-ocr/tesseract-ocr.mk
@@ -62,7 +62,7 @@ TESSERACT_OCR_PRE_CONFIGURE_HOOKS += TESSERACT_OCR_PRECONFIGURE
 # Language data files installation
 define TESSERACT_OCR_INSTALL_LANG_DATA
 	$(foreach langfile,$(TESSERACT_OCR_DATA_FILES), \
-		$(INSTALL) -D -m 0644 $(DL_DIR)/$(langfile) \
+		$(INSTALL) -D -m 0644 $(TESSERACT_OCR_DL_DIR)/$(langfile) \
 			$(TARGET_DIR)/usr/share/tessdata/$(langfile)
 	)
 endef
diff --git a/package/ti-cgt-pru/ti-cgt-pru.mk b/package/ti-cgt-pru/ti-cgt-pru.mk
index 0b9a64b3c9..df3b97bef5 100644
--- a/package/ti-cgt-pru/ti-cgt-pru.mk
+++ b/package/ti-cgt-pru/ti-cgt-pru.mk
@@ -13,8 +13,8 @@ TI_CGT_PRU_LICENSE_FILES = PRU_Code_Generation_Tools_2.2.x_manifest.html \
 	pru_rts_2_2_0_82167478-F8C9-49b2-82BD-12F8550770F9.spdx
 
 define HOST_TI_CGT_PRU_EXTRACT_CMDS
-	chmod +x $(DL_DIR)/$(TI_CGT_PRU_SOURCE)
-	$(DL_DIR)/$(TI_CGT_PRU_SOURCE) --prefix $(@D) --mode unattended
+	chmod +x $(TI_CGT_PRU_DL_DIR)/$(TI_CGT_PRU_SOURCE)
+	$(TI_CGT_PRU_DL_DIR)/$(TI_CGT_PRU_SOURCE) --prefix $(@D) --mode unattended
 	mv $(@D)/ti-cgt-pru_$(TI_CGT_PRU_VERSION)/* $(@D)
 	rmdir $(@D)/ti-cgt-pru_$(TI_CGT_PRU_VERSION)/
 endef
diff --git a/package/ti-gfx/ti-gfx.mk b/package/ti-gfx/ti-gfx.mk
index 428878a08b..9fad553780 100644
--- a/package/ti-gfx/ti-gfx.mk
+++ b/package/ti-gfx/ti-gfx.mk
@@ -94,8 +94,8 @@ TI_GFX_HDR_DIRS = OGLES2/EGL OGLES2/EWS OGLES2/GLES2 OGLES2/KHR \
 	OGLES/GLES bufferclass_ti/ pvr2d/ wsegl/
 
 define TI_GFX_EXTRACT_CMDS
-	chmod +x $(DL_DIR)/$(TI_GFX_SOURCE)
-	printf "Y\nY\n qY\n\n" | $(DL_DIR)/$(TI_GFX_SOURCE) \
+	chmod +x $(TI_GFX_DL_DIR)/$(TI_GFX_SOURCE)
+	printf "Y\nY\n qY\n\n" | $(TI_GFX_DL_DIR)/$(TI_GFX_SOURCE) \
 		--prefix $(@D) \
 		--mode console
 endef
diff --git a/package/ts4900-fpga/ts4900-fpga.mk b/package/ts4900-fpga/ts4900-fpga.mk
index ed951b8c8f..7bb62a3984 100644
--- a/package/ts4900-fpga/ts4900-fpga.mk
+++ b/package/ts4900-fpga/ts4900-fpga.mk
@@ -11,7 +11,7 @@ TS4900_FPGA_SITE = ftp://ftp.embeddedarm.com/ts-socket-macrocontrollers/ts-4900-
 # https://github.com/embeddedarm/meta-ts/blob/f31860f1204b64f765a5380d3b93a2cf18234f90/recipes-extras/ts4900-fpga/ts4900-fpga.bb#L6
 
 define TS4900_FPGA_EXTRACT_CMDS
-	cp $(DL_DIR)/$(TS4900_FPGA_SOURCE) $(@D)
+	cp $(TS4900_FPGA_DL_DIR)/$(TS4900_FPGA_SOURCE) $(@D)
 endef
 
 define TS4900_FPGA_INSTALL_TARGET_CMDS
diff --git a/package/unscd/unscd.mk b/package/unscd/unscd.mk
index f0eb5d6ad6..36f2de271b 100644
--- a/package/unscd/unscd.mk
+++ b/package/unscd/unscd.mk
@@ -11,7 +11,7 @@ UNSCD_LICENSE = GPL-2.0
 UNSCD_LICENSE_FILES = $(UNSCD_SOURCE)
 
 define UNSCD_EXTRACT_CMDS
-	cp $(DL_DIR)/$($(PKG)_SOURCE) $(@D)/
+	cp $(UNSCD_DL_DIR)/$($(PKG)_SOURCE) $(@D)/
 endef
 
 define UNSCD_BUILD_CMDS
diff --git a/package/urg/urg.mk b/package/urg/urg.mk
index 966627fe36..1346079893 100644
--- a/package/urg/urg.mk
+++ b/package/urg/urg.mk
@@ -25,7 +25,9 @@ endif
 URG_CONFIG_SCRIPTS = c_urg-config urg-config
 
 define URG_EXTRACT_CMDS
-	$(UNZIP) -d $(BUILD_DIR) $(DL_DIR)/$(URG_SOURCE)
+	$(UNZIP) -d $(BUILD_DIR) $(URG_DL_DIR)/$(URG_SOURCE)
+	test -d $(URG_DIR) || \
+		mv $(BUILD_DIR)/$(subst .zip,,$(URG_SOURCE)) $(URG_DIR)
 endef
 
 $(eval $(autotools-package))
diff --git a/package/waf/waf.mk b/package/waf/waf.mk
index e7ac891b39..639c26a3b0 100644
--- a/package/waf/waf.mk
+++ b/package/waf/waf.mk
@@ -9,7 +9,7 @@ WAF_SOURCE = waf-$(WAF_VERSION)
 WAF_SITE = https://waf.io/
 
 define HOST_WAF_EXTRACT_CMDS
-	$(INSTALL) -D -m 0755 $(DL_DIR)/waf-$(WAF_VERSION) $(@D)/waf
+	$(INSTALL) -D -m 0755 $(WAF_DL_DIR)/waf-$(WAF_VERSION) $(@D)/waf
 endef
 
 define HOST_WAF_INSTALL_CMDS
diff --git a/package/whetstone/whetstone.mk b/package/whetstone/whetstone.mk
index d9b45638c5..3b6ec3419b 100644
--- a/package/whetstone/whetstone.mk
+++ b/package/whetstone/whetstone.mk
@@ -9,7 +9,7 @@ WHETSTONE_SOURCE = whetstone.c
 WHETSTONE_SITE = http://www.netlib.org/benchmark
 
 define WHETSTONE_EXTRACT_CMDS
-	cp $(DL_DIR)/$($(PKG)_SOURCE) $(@D)/
+	cp $(WHETSTONE_DL_DIR)/$($(PKG)_SOURCE) $(@D)/
 endef
 
 define WHETSTONE_BUILD_CMDS
diff --git a/package/wilc1000-firmware/wilc1000-firmware.mk b/package/wilc1000-firmware/wilc1000-firmware.mk
index 5dcf19045b..6f504d67a0 100644
--- a/package/wilc1000-firmware/wilc1000-firmware.mk
+++ b/package/wilc1000-firmware/wilc1000-firmware.mk
@@ -11,7 +11,7 @@ WILC1000_FIRMWARE_SOURCE = v$(WILC1000_FIRMWARE_VERSION)_Firmware.zip
 WILC1000_FIRMWARE_LICENSE = PROPRIETARY
 
 define WILC1000_FIRMWARE_EXTRACT_CMDS
-	$(UNZIP) -d $(BUILD_DIR) $(DL_DIR)/$(WILC1000_FIRMWARE_SOURCE)
+	$(UNZIP) -d $(BUILD_DIR) $(WILC1000_FIRMWARE_DL_DIR)/$(WILC1000_FIRMWARE_SOURCE)
 	mv $(BUILD_DIR)/wireless-firmware-$(WILC1000_FIRMWARE_VERSION)_Firmware/* $(@D)
 	rmdir $(BUILD_DIR)/wireless-firmware-$(WILC1000_FIRMWARE_VERSION)_Firmware
 endef
diff --git a/package/zynq-boot-bin/zynq-boot-bin.mk b/package/zynq-boot-bin/zynq-boot-bin.mk
index 3648c63654..c523e04f45 100644
--- a/package/zynq-boot-bin/zynq-boot-bin.mk
+++ b/package/zynq-boot-bin/zynq-boot-bin.mk
@@ -12,7 +12,7 @@ ZYNQ_BOOT_BIN_LICENSE = GPL-3.0+
 HOST_ZYNQ_BOOT_BIN_DEPENDENCIES = host-python
 
 define HOST_ZYNQ_BOOT_BIN_EXTRACT_CMDS
-	cp $(DL_DIR)/$(ZYNQ_BOOT_BIN_SOURCE) $(@D)
+	cp $(ZYNQ_BOOT_BIN_DL_DIR)/$(ZYNQ_BOOT_BIN_SOURCE) $(@D)
 endef
 
 define HOST_ZYNQ_BOOT_BIN_INSTALL_CMDS
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 04/13] arc/xtensa: store the eXtensa overlay in the per-package DL_DIR
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
  2018-03-31 14:23 ` [Buildroot] [v3 02/13] download: put most of the infra in dl-wrapper Maxime Hadjinlian
  2018-03-31 14:23 ` [Buildroot] [v3 03/13] packages: use new $($(PKG)_DL_DIR) variable Maxime Hadjinlian
@ 2018-03-31 14:23 ` Maxime Hadjinlian
  2018-03-31 14:23 ` [Buildroot] [v3 05/13] pkg-{download, generic}: use new $($(PKG)_DL_DIR) Maxime Hadjinlian
                   ` (8 subsequent siblings)
  11 siblings, 0 replies; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:23 UTC (permalink / raw)
  To: buildroot

From: "Yann E. MORIN" <yann.morin.1998@free.fr>

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 arch/arch.mk.xtensa | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/arch/arch.mk.xtensa b/arch/arch.mk.xtensa
index 2b6cd26d03..fd410f6bfa 100644
--- a/arch/arch.mk.xtensa
+++ b/arch/arch.mk.xtensa
@@ -12,7 +12,7 @@
 BR_ARCH_XTENSA_OVERLAY_FILE = $(call qstrip,$(BR2_XTENSA_OVERLAY_FILE))
 ifneq ($(filter http://% https://% ftp://% scp://%,$(BR_ARCH_XTENSA_OVERLAY_FILE)),)
 ARCH_XTENSA_OVERLAY_URL = $(BR_ARCH_XTENSA_OVERLAY_FILE)
-ARCH_XTENSA_OVERLAY_FILE = $(DL_DIR)/$(notdir $(BR_ARCH_XTENSA_OVERLAY_FILE))
+ARCH_XTENSA_OVERLAY_FILE = $($(PKG)_DL_DIR)/$(notdir $(BR_ARCH_XTENSA_OVERLAY_FILE))
 # Do not check that file, we can't know its hash
 BR_NO_CHECK_HASH_FOR += $(notdir $(ARCH_XTENSA_OVERLAY_URL))
 else
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 05/13] pkg-{download, generic}: use new $($(PKG)_DL_DIR)
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (2 preceding siblings ...)
  2018-03-31 14:23 ` [Buildroot] [v3 04/13] arc/xtensa: store the eXtensa overlay in the per-package DL_DIR Maxime Hadjinlian
@ 2018-03-31 14:23 ` Maxime Hadjinlian
  2018-04-01 18:20   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 06/13] support/download: make sure the download folder is created Maxime Hadjinlian
                   ` (7 subsequent siblings)
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:23 UTC (permalink / raw)
  To: buildroot

Let the infrastructure use the new variable $(PKG)_DL_DIR

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 package/pkg-download.mk | 2 +-
 package/pkg-generic.mk  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index 14ea4ff361..0717943bb0 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -95,7 +95,7 @@ define DOWNLOAD
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
 		-n $($(PKG)_RAW_BASE_NAME) \
 		-N $($(PKG)_RAWNAME) \
-		-o $(DL_DIR)/$(notdir $(1)) \
+		-o $($(PKG)_DL_DIR)/$(notdir $(1)) \
 		$(if $($(PKG)_GIT_SUBMODULES),-r) \
 		$(DOWNLOAD_URIS) \
 		$(QUIET) \
diff --git a/package/pkg-generic.mk b/package/pkg-generic.mk
index 6d82f7027e..7587324368 100644
--- a/package/pkg-generic.mk
+++ b/package/pkg-generic.mk
@@ -609,7 +609,7 @@ $(2)_TARGET_DIRCLEAN =		$$($(2)_DIR)/.stamp_dircleaned
 
 # default extract command
 $(2)_EXTRACT_CMDS ?= \
-	$$(if $$($(2)_SOURCE),$$(INFLATE$$(suffix $$($(2)_SOURCE))) $$(DL_DIR)/$$($(2)_SOURCE) | \
+	$$(if $$($(2)_SOURCE),$$(INFLATE$$(suffix $$($(2)_SOURCE))) $$($(2)_DL_DIR)/$$($(2)_SOURCE) | \
 	$$(TAR) --strip-components=$$($(2)_STRIP_COMPONENTS) \
 		-C $$($(2)_DIR) \
 		$$(foreach x,$$($(2)_EXCLUDES),--exclude='$$(x)' ) \
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 06/13] support/download: make sure the download folder is created
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (3 preceding siblings ...)
  2018-03-31 14:23 ` [Buildroot] [v3 05/13] pkg-{download, generic}: use new $($(PKG)_DL_DIR) Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 18:18   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 07/13] pkg-generic: add a subdirectory to the DL_DIR Maxime Hadjinlian
                   ` (6 subsequent siblings)
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

At the moment, this means making sure that BR2_DL_DIR is created; in
the future, it will make sure that BR2_DL_DIR/PKG_NAME/ is created.

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 support/download/dl-wrapper | 1 +
 1 file changed, 1 insertion(+)

diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index 67e9742767..49caf3717b 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -50,6 +50,7 @@ main() {
     if [ -z "${output}" ]; then
         error "no output specified, use -o\n"
     fi
+    mkdir -p "$(dirname "${output}")"
 
     # If the output file already exists and:
     # - there's no .hash file: do not download it again and exit promptly
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 07/13] pkg-generic: add a subdirectory to the DL_DIR
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (4 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 06/13] support/download: make sure the download folder is created Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 18:17   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 08/13] pkg-download: support new subdir for mirrors Maxime Hadjinlian
                   ` (5 subsequent siblings)
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

With all the previous changes, we are now ready to add a subdirectory to
the DL_DIR.
The structure will now be DL_DIR/PKG_NAME/{FILE1,FILE2} (see the sketch
below).

This is needed for multiple reasons:
    - Avoid patches with names like SHA1.patch lying flat in DL_DIR,
    which makes it hard to know to which packages they apply
    - Avoid possible collisions if two releases have the same name
    (e.g. v01.tar)
    - Allow the possibility to handle a git cache per package in the
    newly created subdirectory.
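
As a sketch only (hypothetical package names and versions, not taken from
this series), the download tree would end up looking like:

    dl/
    |-- busybox/
    |   `-- busybox-1.28.1.tar.bz2
    `-- linux/
        |-- linux-4.16.tar.xz
        `-- 0001-some-local-patch.patch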

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 package/pkg-generic.mk | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/package/pkg-generic.mk b/package/pkg-generic.mk
index 7587324368..b1342228c8 100644
--- a/package/pkg-generic.mk
+++ b/package/pkg-generic.mk
@@ -428,7 +428,7 @@ endif
 
 $(2)_BASE_NAME	= $$(if $$($(2)_VERSION),$(1)-$$($(2)_VERSION),$(1))
 $(2)_RAW_BASE_NAME = $$(if $$($(2)_VERSION),$$($(2)_RAWNAME)-$$($(2)_VERSION),$$($(2)_RAWNAME))
-$(2)_DL_DIR	=  $$(DL_DIR)
+$(2)_DL_DIR 	=  $$(DL_DIR)/$$($(2)_RAWNAME)
 $(2)_DIR	=  $$(BUILD_DIR)/$$($(2)_BASE_NAME)
 
 ifndef $(2)_SUBDIR
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 08/13] pkg-download: support new subdir for mirrors
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (5 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 07/13] pkg-generic: add a subdirectory to the DL_DIR Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 14:42   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 09/13] pkg-generic: introduce _SAME_SOURCE_AS Maxime Hadjinlian
                   ` (4 subsequent siblings)
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

Since we introduced a subdirectory per package in the DL_DIR, we need to
support it in the PRIMARY and BACKUP mirrors as they evolve to the new
tree structure.

We first check the new URI (with the subdirectory) and, in case of
failure, fall back to the one without it.
By checking both URIs, we ensure that old mirrors remain usable.
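
For example (hypothetical mirror and package, assuming BR2_PRIMARY_SITE is
set to http://mirror.example.com and a package foo at version 1.0), the
wrapper will now try, in order:

    http://mirror.example.com/foo/foo-1.0.tar.gz   (new layout, with subdir)
    http://mirror.example.com/foo-1.0.tar.gz       (legacy flat layout)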

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 package/pkg-download.mk | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index 0717943bb0..23f3403098 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -75,6 +75,7 @@ export BR_NO_CHECK_HASH_FOR =
 
 ifneq ($(call qstrip,$(BR2_PRIMARY_SITE)),)
 DOWNLOAD_URIS += \
+	-u $(call getschemeplusuri,$(BR2_PRIMARY_SITE)/$(notdir $($(PKG)_DL_DIR)),urlencode) \
 	-u $(call getschemeplusuri,$(BR2_PRIMARY_SITE),urlencode)
 endif
 
@@ -83,6 +84,7 @@ DOWNLOAD_URIS += \
 	-u $($(PKG)_SITE_METHOD)+$(dir $(1))
 ifneq ($(call qstrip,$(BR2_BACKUP_SITE)),)
 DOWNLOAD_URIS += \
+	-u $(call getschemeplusuri,$(BR2_BACKUP_SITE)/$(notdir $($(PKG)_DL_DIR)),urlencode) \
 	-u $(call getschemeplusuri,$(BR2_BACKUP_SITE),urlencode)
 endif
 endif
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 09/13] pkg-generic: introduce _SAME_SOURCE_AS
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (6 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 08/13] pkg-download: support new subdir for mirrors Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 14:26   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 10/13] package: share downloaded files for big packages Maxime Hadjinlian
                   ` (3 subsequent siblings)
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

This per-package variable can be used to specify that a package shares
the same sources as another package.

The use case here is, for example, linux-headers and linux, which share
the same sources (because they are the same upstream project), so we do
not want to download the kernel twice, nor store it multiple times.

Make will automatically try to help by introducing leading and trailing
spaces when replacing a line-continuation '\', so we need to call
$(strip) on the variable.
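
A minimal sketch of how a package might use it (hypothetical 'bar'
package reusing the sources of a 'foo' package, not part of this series):

    BAR_VERSION = $(FOO_VERSION)
    BAR_SOURCE = $(FOO_SOURCE)
    BAR_SITE = $(FOO_SITE)
    # bar's downloads land in foo's subdirectory instead of dl/bar/
    BAR_SAME_SOURCE_AS = foo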

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 package/pkg-generic.mk | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/package/pkg-generic.mk b/package/pkg-generic.mk
index b1342228c8..f3829e9912 100644
--- a/package/pkg-generic.mk
+++ b/package/pkg-generic.mk
@@ -428,7 +428,9 @@ endif
 
 $(2)_BASE_NAME	= $$(if $$($(2)_VERSION),$(1)-$$($(2)_VERSION),$(1))
 $(2)_RAW_BASE_NAME = $$(if $$($(2)_VERSION),$$($(2)_RAWNAME)-$$($(2)_VERSION),$$($(2)_RAWNAME))
-$(2)_DL_DIR 	=  $$(DL_DIR)/$$($(2)_RAWNAME)
+$(2)_DL_DIR	= $$(strip $$(if $$($(2)_SAME_SOURCE_AS), \
+        $$($$(call UPPERCASE,$$($(2)_SAME_SOURCE_AS))_DL_DIR), \
+        $$(DL_DIR)/$$($(2)_RAWNAME)))
 $(2)_DIR	=  $$(BUILD_DIR)/$$($(2)_BASE_NAME)
 
 ifndef $(2)_SUBDIR
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 10/13] package: share downloaded files for big packages
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (7 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 09/13] pkg-generic: introduce _SAME_SOURCE_AS Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 14:18   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 11/13] help/manual: update help about the new $(LIBFOO_DL_DIR) Maxime Hadjinlian
                   ` (2 subsequent siblings)
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

From: "Yann E. MORIN" <yann.morin.1998@free.fr>

Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>
---
 package/gcc/gcc-initial/gcc-initial.mk   | 5 +++++
 package/linux-headers/linux-headers.mk   | 3 +++
 package/mesa3d-headers/mesa3d-headers.mk | 1 +
 3 files changed, 9 insertions(+)

diff --git a/package/gcc/gcc-initial/gcc-initial.mk b/package/gcc/gcc-initial/gcc-initial.mk
index c476b2faeb..07da50afa8 100644
--- a/package/gcc/gcc-initial/gcc-initial.mk
+++ b/package/gcc/gcc-initial/gcc-initial.mk
@@ -8,6 +8,11 @@ GCC_INITIAL_VERSION = $(GCC_VERSION)
 GCC_INITIAL_SITE = $(GCC_SITE)
 GCC_INITIAL_SOURCE = $(GCC_SOURCE)
 
+# We do not have a 'gcc' package per-se; we only have two incarnations,
+# gcc-initial and gcc-final. gcc-initial is just an internal step that
+# users should not care about, while gcc-final is the one they shall see.
+HOST_GCC_INITIAL_SAME_SOURCE_AS = host-gcc-final
+
 HOST_GCC_INITIAL_DEPENDENCIES = $(HOST_GCC_COMMON_DEPENDENCIES)
 
 HOST_GCC_INITIAL_EXCLUDES = $(HOST_GCC_EXCLUDES)
diff --git a/package/linux-headers/linux-headers.mk b/package/linux-headers/linux-headers.mk
index f1e3790608..6f4071c61a 100644
--- a/package/linux-headers/linux-headers.mk
+++ b/package/linux-headers/linux-headers.mk
@@ -82,6 +82,9 @@ endif
 
 endif # ! BR2_KERNEL_HEADERS_AS_KERNEL
 
+# linux-headers really is the same as the linux package
+LINUX_HEADERS_SAME_SOURCE_AS = linux
+
 LINUX_HEADERS_LICENSE = GPL-2.0
 LINUX_HEADERS_LICENSE_FILES = COPYING
 
diff --git a/package/mesa3d-headers/mesa3d-headers.mk b/package/mesa3d-headers/mesa3d-headers.mk
index b48d965d8a..39a25270db 100644
--- a/package/mesa3d-headers/mesa3d-headers.mk
+++ b/package/mesa3d-headers/mesa3d-headers.mk
@@ -15,6 +15,7 @@ endif
 MESA3D_HEADERS_VERSION = 18.0.0
 MESA3D_HEADERS_SOURCE = mesa-$(MESA3D_HEADERS_VERSION).tar.xz
 MESA3D_HEADERS_SITE = https://mesa.freedesktop.org/archive
+MESA3D_HEADERS_SAME_SOURCE_AS = mesa3d
 MESA3D_HEADERS_LICENSE = MIT, SGI, Khronos
 MESA3D_HEADERS_LICENSE_FILES = docs/license.html
 
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 11/13] help/manual: update help about the new $(LIBFOO_DL_DIR)
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (8 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 10/13] package: share downloaded files for big packages Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 14:15   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 12/13] download: add flock call before dl-wrapper Maxime Hadjinlian
  2018-03-31 14:24 ` [Buildroot] [v3 13/13] download: git: introduce cache feature Maxime Hadjinlian
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 Config.in                               | 3 +++
 docs/manual/adding-packages-generic.txt | 6 +++---
 2 files changed, 6 insertions(+), 3 deletions(-)

diff --git a/Config.in b/Config.in
index a74b24ae45..55b6e5e2a1 100644
--- a/Config.in
+++ b/Config.in
@@ -200,6 +200,9 @@ config BR2_DL_DIR
 	  If the Linux shell environment has defined the BR2_DL_DIR
 	  environment variable, then this overrides this configuration
 	  item.
+	  The directory is organized with a subdirectory for each package.
+	  Each package has its own $(LIBFOO_DL_DIR) variable that can be used
+	  to find the correct path.
 
 	  The default is $(TOPDIR)/dl
 
diff --git a/docs/manual/adding-packages-generic.txt b/docs/manual/adding-packages-generic.txt
index 9d1428ad40..a7578a782d 100644
--- a/docs/manual/adding-packages-generic.txt
+++ b/docs/manual/adding-packages-generic.txt
@@ -263,7 +263,7 @@ information is (assuming the package name is +libfoo+) :
   the file using this URL. Otherwise, Buildroot will assume the file
   to be downloaded is located at +LIBFOO_SITE+. Buildroot will not do
   anything with those additional files, except download them: it will
-  be up to the package recipe to use them from +$(DL_DIR)+.
+  be up to the package recipe to use them from +$(LIBFOO_DL_DIR)+.
 
 * +LIBFOO_SITE_METHOD+ determines the method used to fetch or copy the
   package source code. In many cases, Buildroot guesses the method
@@ -554,8 +554,8 @@ In the action definitions, you can use the following variables:
 * +$(@D)+, which contains the directory in which the package source
   code has been uncompressed.
 
-* +$(DL_DIR)+ contains the path to the directory where all the downloads made
-  by Buildroot are stored.
+* +$(LIBFOO_DL_DIR)+ contains the path to the directory where all the downloads
+  made by Buildroot are stored.
 
 * +$(TARGET_CC)+, +$(TARGET_LD)+, etc. to get the target
   cross-compilation utilities
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 12/13] download: add flock call before dl-wrapper
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (9 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 11/13] help/manual: update help about the new $(LIBFOO_DL_DIR) Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 14:09   ` Yann E. MORIN
  2018-03-31 14:24 ` [Buildroot] [v3 13/13] download: git: introduce cache feature Maxime Hadjinlian
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

In order to introduce the cache mechanisms, we need to hold a lock on
$(LIBFOO_DL_DIR); otherwise, parallel downloads (e.g. a DL_DIR shared
between two Buildroot instances) would not be possible.

To make sure the directory exists, the mkdir call has been removed from
the dl-wrapper and put in the infrastructure.
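
With this change, the command run by the DOWNLOAD macro looks roughly
like this (sketch only, for a hypothetical package foo):

    mkdir -p $(FOO_DL_DIR)/
    flock $(FOO_DL_DIR)/ support/download/dl-wrapper -c <version> \
        -o $(FOO_DL_DIR)/<file> ...

so two Buildroot instances sharing the same BR2_DL_DIR serialize their
downloads of a given package instead of racing on the same files.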

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
 package/pkg-download.mk     | 4 +++-
 support/download/dl-wrapper | 1 -
 2 files changed, 3 insertions(+), 2 deletions(-)

diff --git a/package/pkg-download.mk b/package/pkg-download.mk
index 23f3403098..dff52007e6 100644
--- a/package/pkg-download.mk
+++ b/package/pkg-download.mk
@@ -19,6 +19,7 @@ SSH := $(call qstrip,$(BR2_SSH))
 export LOCALFILES := $(call qstrip,$(BR2_LOCALFILES))
 
 DL_WRAPPER = support/download/dl-wrapper
+FLOCK = flock $($(PKG)_DL_DIR)/
 
 # DL_DIR may have been set already from the environment
 ifeq ($(origin DL_DIR),undefined)
@@ -91,7 +92,8 @@ endif
 
 define DOWNLOAD
 	$(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),BR_NO_CHECK_HASH_FOR=$(notdir $(1));) \
-	$(EXTRA_ENV) $(DL_WRAPPER) \
+	$(Q)mkdir -p $($(PKG)_DL_DIR)/
+	$(EXTRA_ENV) $(FLOCK) $(DL_WRAPPER) \
 		-c $($(PKG)_DL_VERSION) \
 		-f $(notdir $(1)) \
 		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
index 49caf3717b..67e9742767 100755
--- a/support/download/dl-wrapper
+++ b/support/download/dl-wrapper
@@ -50,7 +50,6 @@ main() {
     if [ -z "${output}" ]; then
         error "no output specified, use -o\n"
     fi
-    mkdir -p "$(dirname "${output}")"
 
     # If the output file already exists and:
     # - there's no .hash file: do not download it again and exit promptly
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 13/13] download: git: introduce cache feature
  2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
                   ` (10 preceding siblings ...)
  2018-03-31 14:24 ` [Buildroot] [v3 12/13] download: add flock call before dl-wrapper Maxime Hadjinlian
@ 2018-03-31 14:24 ` Maxime Hadjinlian
  2018-04-01 12:57   ` Yann E. MORIN
  11 siblings, 1 reply; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 14:24 UTC (permalink / raw)
  To: buildroot

Now we keep the git clone that we download and generate our tarball
from there.
The main goal here is that if you change the version of a package (say
Linux), instead of cloning all over again, you will simply 'git fetch'
the missing objects from the repository, then generate the tarball
again.

This should speed up the 'source' part of the build significantly.

The drawback is that the DL_DIR will grow much larger; but time is more
important than disk space nowadays.
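
A sketch of the resulting layout (hypothetical versions), assuming
BR2_DL_DIR=dl and the linux package:

    dl/linux/git/                  <- persistent clone, reused across versions
    dl/linux/linux-4.15.tar.gz     <- tarball generated from that clone
    dl/linux/linux-4.16.tar.gz     <- a version bump only needs a 'git fetch'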

Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
---
v1 -> v2:
   - Fix bad regex in the 'transform' option of tar (found by Peter
   Seiderer)
v2 -> v3:
   - Change git fetch origin to use the uri of the package instead of
   the name of the default remote 'origin' (Thomas Petazzoni)
---
 support/download/git | 70 ++++++++++++++++++++++++++++++----------------------
 1 file changed, 40 insertions(+), 30 deletions(-)

diff --git a/support/download/git b/support/download/git
index 58a2c6ad9d..301f7e792a 100755
--- a/support/download/git
+++ b/support/download/git
@@ -39,28 +39,34 @@ _git() {
     eval ${GIT} "${@}"
 }
 
-# Try a shallow clone, since it is faster than a full clone - but that only
-# works if the version is a ref (tag or branch). Before trying to do a shallow
-# clone we check if ${cset} is in the list provided by git ls-remote. If not
-# we fall back on a full clone.
-#
-# Messages for the type of clone used are provided to ease debugging in case of
-# problems
-git_done=0
-if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
-    printf "Doing shallow clone\n"
-    if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${basename}'"; then
-        git_done=1
-    else
-        printf "Shallow clone failed, falling back to doing a full clone\n"
+# We want to check if a cache of the git clone of this repo already exists.
+git_cache="${BR2_DL_DIR}/${basename%%-*}/git"
+
+# If the cache directory already exists, don't try to clone.
+if [ ! -d "${git_cache}" ]; then
+    # Try a shallow clone, since it is faster than a full clone - but that
+    # only works if the version is a ref (tag or branch). Before trying to do a
+    # shallow clone we check if ${cset} is in the list provided by git
+    # ls-remote. If not we fall back on a full clone.
+    #
+    # Messages for the type of clone used are provided to ease debugging in
+    # case of problems
+    git_done=0
+    if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
+        printf "Doing shallow clone\n"
+        if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${git_cache}'"; then
+            git_done=1
+        else
+            printf "Shallow clone failed, falling back to doing a full clone\n"
+        fi
+    fi
+    if [ ${git_done} -eq 0 ]; then
+        printf "Doing full clone\n"
+        _git clone ${verbose} "${@}" "'${uri}'" "'${git_cache}'"
     fi
-fi
-if [ ${git_done} -eq 0 ]; then
-    printf "Doing full clone\n"
-    _git clone ${verbose} "${@}" "'${uri}'" "'${basename}'"
 fi
 
-pushd "${basename}" >/dev/null
+pushd "${git_cache}" >/dev/null
 
 # Try to get the special refs exposed by some forges (pull-requests for
 # github, changes for gerrit...). There is no easy way to know whether
@@ -69,7 +75,7 @@ pushd "${basename}" >/dev/null
 # below, if there is an issue anyway. Since most of the cset we're gonna
 # have to clone are not such special refs, consign the output to oblivion
 # so as not to alarm unsuspecting users, but still trace it as a warning.
-if ! _git fetch origin "'${cset}:${cset}'" >/dev/null 2>&1; then
+if ! _git fetch "'${uri}'" "'${cset}:${cset}'" >/dev/null 2>&1; then
     printf "Could not fetch special ref '%s'; assuming it is not special.\n" "${cset}"
 fi
 
@@ -86,20 +92,24 @@ if [ ${recurse} -eq 1 ]; then
     _git submodule update --init --recursive
 fi
 
-# We do not want the .git dir; we keep other .git files, in case they
-# are the only files in their directory.
+# Generate the archive, sort with the C locale so that it is reproducible
+# We do not want the .git dir; we keep other .git
+# files, in case they are the only files in their directory.
 # The .git dir would generate non reproducible tarballs as it depends on
 # the state of the remote server. It also would generate large tarballs
 # (gigabytes for some linux trees) when a full clone took place.
-rm -rf .git
+find . -not -type d \
+	-and -not -path "./.git/*" >"${BR2_DL_DIR}/${basename}.list"
+LC_ALL=C sort <"${BR2_DL_DIR}/${basename}.list" >"${BR2_DL_DIR}/${basename}.list.sorted"
 
-popd >/dev/null
-
-# Generate the archive, sort with the C locale so that it is reproducible
-find "${basename}" -not -type d >"${basename}.list"
-LC_ALL=C sort <"${basename}.list" >"${basename}.list.sorted"
 # Create GNU-format tarballs, since that's the format of the tarballs on
 # sources.buildroot.org and used in the *.hash files
-tar cf - --numeric-owner --owner=0 --group=0 --mtime="${date}" --format=gnu \
-         -T "${basename}.list.sorted" >"${output}.tar"
+tar cf - --transform="s/^\.$/${basename}/" \
+	--numeric-owner --owner=0 --group=0 --mtime="${date}" --format=gnu \
+         -T "${BR2_DL_DIR}/${basename}.list.sorted" >"${output}.tar"
 gzip -6 -n <"${output}.tar" >"${output}"
+
+rm -f "${BR2_DL_DIR}/${basename}.list"
+rm -f "${BR2_DL_DIR}/${basename}.list.sorted"
+
+popd >/dev/null
-- 
2.16.2

^ permalink raw reply related	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 02/13] download: put most of the infra in dl-wrapper
  2018-03-31 14:23 ` [Buildroot] [v3 02/13] download: put most of the infra in dl-wrapper Maxime Hadjinlian
@ 2018-03-31 17:02   ` Maxime Hadjinlian
  0 siblings, 0 replies; 26+ messages in thread
From: Maxime Hadjinlian @ 2018-03-31 17:02 UTC (permalink / raw)
  To: buildroot

Don't look at the "missing-hash.py" stuff, that's a tool that was
sitting in my tree and should not have been sent. Sorry for that.

On Sat, Mar 31, 2018 at 4:23 PM, Maxime Hadjinlian
<maxime.hadjinlian@gmail.com> wrote:
> The goal here is to simplify the infrastructure by putting most of the
> code in the dl-wrapper as it's easier to implement and to read.
>
> Most of the functions were common already, this patch finalizes it by
> making the pkg-download.mk pass all the parameters needed to the
> dl-wrapper which in turns will pass everything to every backend.
>
> The backend will then cherry-pick what it needs from these arguments
> and act accordingly.
>
> It eases the transition to the addition of a sub directory per package
> in the DL_DIR, and later on, a git cache.
>
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> Tested-by: Luca Ceresoli <luca@lucaceresoli.net>
> Reviewed-by: Luca Ceresoli <luca@lucaceresoli.net>
> ---
> v1 -> v2:
>     - Rename cp backend to file (Arnout)
>     - Don't use BR_BACKEND_DL_GETOPTS for dl-wrapper (Arnout)
>     - Add "urlencode" to scheme passed to the dl-wrapper to support the
>     fact that we need to urlencode the filename when using PRIMARY and
>     BACKUP mirror (some files are named toto.c?v=1.0) (Arnout)
>     - Fix uristripscheme replaced by bash'ism (Arnout)
>     - Add the check hash into the loop, exit with error only if all the
>     download+check failed. (Arnout)
> ---
>  missing-hash.py               | 145 ++++++++++++++++++++++++++++++++++++
>  package/pkg-download.mk       | 166 ++++++++----------------------------------
>  support/download/cvs          |   2 +-
>  support/download/dl-wrapper   | 108 ++++++++++++++++++---------
>  support/download/{cp => file} |   4 +-
>  support/download/wget         |  10 ++-
>  6 files changed, 258 insertions(+), 177 deletions(-)
>  create mode 100755 missing-hash.py
>  rename support/download/{cp => file} (90%)
>
> diff --git a/missing-hash.py b/missing-hash.py
> new file mode 100755
> index 0000000000..5c8b3435a5
> --- /dev/null
> +++ b/missing-hash.py
> @@ -0,0 +1,145 @@
> +#!/usr/bin/env python
> +# -*- coding: utf-8 -*-
> +
> +import fnmatch
> +import distutils
> +import time
> +import ftplib
> +import glob
> +import logging
> +import os
> +import re
> +import subprocess
> +import sys
> +import urllib2
> +import sysconfig
> +
> +ERR_PROVIDER = ['exception list', 'website not reachable', 'alioth.debian.org']
> +
> +EXCLUDED_PKGS = [
> +        "boot/common.mk",
> +        "linux/linux-ext-fbtft.mk",
> +        "linux/linux-ext-xenomai.mk",
> +        "linux/linux-ext-rtai.mk",
> +        "package/efl/efl.mk",
> +        "package/freescale-imx/freescale-imx.mk",
> +        "package/gcc/gcc.mk",
> +        "package/gstreamer/gstreamer.mk",
> +        "package/gstreamer1/gstreamer1.mk",
> +        "package/gtk2-themes/gtk2-themes.mk",
> +        "package/matchbox/matchbox.mk",
> +        "package/opengl/opengl.mk",
> +        "package/qt5/qt5.mk",
> +        "package/x11r7/x11r7.mk"
> +]
> +
> +class Package(object):
> +
> +    def __init__(self, package_mk_path):
> +        self.mk_path = package_mk_path
> +        self.name = os.path.basename(os.path.splitext(package_mk_path)[0])
> +        self.mk_name = self.name.upper().replace('-', '_')
> +        self.infra = 'unknown'
> +        self.infra_host = False
> +        self.last_version = None
> +        self.hash = False
> +        self.provider = None
> +        self.source = None
> +        self.site = None
> +        self.version = None
> +
> +        data = sysconfig._parse_makefile(package_mk_path)
> +        for k in ["SITE", "SOURCE", "VERSION", "LICENSE_FILES", "LICENSE"]:
> +            k_name = "%s_%s" % (self.mk_name, k)
> +            if k_name in data.keys():
> +                value = None if data[k_name] == "" else data[k_name]
> +                setattr(self, k.lower(), value)
> +
> +        if "package/qt5/" in self.mk_path:
> +                data = sysconfig._parse_makefile("package/qt5/qt5.mk")
> +                self.version = data["QT5_VERSION"]
> +
> +        if "package/efl/" in self.mk_path:
> +                data = sysconfig._parse_makefile("package/efl/efl.mk")
> +                self.version = data["EFL_VERSION"]
> +
> +        with open(package_mk_path) as f:
> +            # Everything we could not obtain through the parsing of the mk
> +            # files will get obtained here.
> +            for line in f.readlines():
> +                if "%s_VERSION" % self.mk_name in line and\
> +                   self.version is None:
> +                        if "$" in line:
> +                                continue
> +                        self.version = line[line.rindex('=')+1:].strip()
> +
> +                if "-package)" not in line:
> +                    continue
> +                self.infra = line[line.rindex('(')+1:-2]
> +                if "host" in self.infra:
> +                    self.infra_host = True
> +                self.infra = self.infra[:self.infra.rindex('-')]
> +
> +        if "$" in str(self.version):
> +                self.version = None
> +
> +        self.hash_file = "%s.hash" % os.path.splitext(package_mk_path)[0]
> +        if os.path.exists(self.hash_file):
> +            self.hash = True
> +
> +        self.provider = self.get_provider()
> +
> +    def get_provider(self):
> +        if self.site is None:
> +            return None
> +
> +        if "github" in self.site:
> +            return "github"
> +        elif "sourceforge" in self.site:
> +            return "sourceforge"
> +
> +if __name__ == '__main__':
> +    matches = []
> +    for dir in ["boot", "linux", "package"]:
> +        for root, _, filenames in os.walk(dir):
> +            for filename in fnmatch.filter(filenames, '*.mk'):
> +                path = os.path.join(root, filename)
> +                if os.path.dirname(path) in dir:
> +                    continue
> +                matches.append(path)
> +
> +    print "#!/bin/sh"
> +
> +    matches.sort()
> +    packages = []
> +    count = 0
> +    for mk_path in matches:
> +
> +        if mk_path in EXCLUDED_PKGS:
> +            continue
> +
> +        pkg = Package(mk_path)
> +
> +        if pkg is None:
> +            continue
> +
> +        if pkg.hash is False:
> +            if pkg.site is not None and "github" not in pkg.site:
> +                if len(str(pkg.version)) >= 40:
> +                    continue
> +                print "make %s-source" % pkg.name
> +                print "my_file=$(find dl/ -type f)"
> +                print "touch %s" % pkg.hash_file
> +                print "echo '# Locally computed' >> %s" % pkg.hash_file
> +                print "output=$(sha256sum \"$my_file\")"
> +                print "sha256=$(echo $output | awk '{print $1}')"
> +                print "filename=$(echo $output | awk '{print $2}' | cut -d'/' -f2)"
> +                print "echo \"sha256 $sha256 $filename\" >> %s" % pkg.hash_file
> +                print "git add %s" % pkg.hash_file
> +                print "git commit -s -m \"package/%s: add hash file\"" % pkg.name
> +                print "make %s-dirclean" % pkg.name
> +                print "rm -Rf dl"
> +                print ""
> +                count += 1
> +
> +    print count
> diff --git a/package/pkg-download.mk b/package/pkg-download.mk
> index ce069b9926..14ea4ff361 100644
> --- a/package/pkg-download.mk
> +++ b/package/pkg-download.mk
> @@ -42,6 +42,8 @@ DL_DIR := $(shell mkdir -p $(DL_DIR) && cd $(DL_DIR) >/dev/null && pwd)
>  #
>  # geturischeme: http
>  geturischeme = $(firstword $(subst ://, ,$(call qstrip,$(1))))
> +# getschemeplusuri: git|parameter+http://example.com
> +getschemeplusuri = $(call geturischeme,$(1))$(if $(2),\|$(2))+$(1)
>  # stripurischeme: www.example.com/dir/file
>  stripurischeme = $(lastword $(subst ://, ,$(call qstrip,$(1))))
>  # domain: www.example.com
> @@ -61,152 +63,42 @@ github = https://github.com/$(1)/$(2)/archive/$(3)
>  export BR_NO_CHECK_HASH_FOR =
>
>  ################################################################################
> -# The DOWNLOAD_* helpers are in charge of getting a working copy
> -# of the source repository for their corresponding SCM,
> -# checking out the requested version / commit / tag, and create an
> -# archive out of it. DOWNLOAD_SCP uses scp to obtain a remote file with
> -# ssh authentication. DOWNLOAD_WGET is the normal wget-based download
> -# mechanism.
> +# DOWNLOAD -- Download helper. Will call DL_WRAPPER which will try to download
> +# source from:
> +# 1) BR2_PRIMARY_SITE if enabled
> +# 2) Download site, unless BR2_PRIMARY_SITE_ONLY is set
> +# 3) BR2_BACKUP_SITE if enabled, unless BR2_PRIMARY_SITE_ONLY is set
> +#
> +# Argument 1 is the source location
>  #
>  ################################################################################
>
> -define DOWNLOAD_GIT
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b git \
> -               -o $(DL_DIR)/$($(PKG)_SOURCE) \
> -               $(if $($(PKG)_GIT_SUBMODULES),-r) \
> -               -H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
> -               $(QUIET) \
> -               -- \
> -               -u $($(PKG)_SITE) \
> -               -c $($(PKG)_DL_VERSION) \
> -               -n $($(PKG)_RAW_BASE_NAME) \
> -               $($(PKG)_DL_OPTS)
> -endef
> -
> -define DOWNLOAD_BZR
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b bzr \
> -               -o $(DL_DIR)/$($(PKG)_SOURCE) \
> -               $(QUIET) \
> -               -- \
> -               -u $($(PKG)_SITE) \
> -               -c $($(PKG)_DL_VERSION) \
> -               -n $($(PKG)_RAW_BASE_NAME) \
> -               $($(PKG)_DL_OPTS)
> -endef
> +ifneq ($(call qstrip,$(BR2_PRIMARY_SITE)),)
> +DOWNLOAD_URIS += \
> +       -u $(call getschemeplusuri,$(BR2_PRIMARY_SITE),urlencode)
> +endif
>
> -define DOWNLOAD_CVS
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b cvs \
> -               -o $(DL_DIR)/$($(PKG)_SOURCE) \
> -               $(QUIET) \
> -               -- \
> -               -u $(call stripurischeme,$(call qstrip,$($(PKG)_SITE))) \
> -               -c $($(PKG)_DL_VERSION) \
> -               -N $($(PKG)_RAWNAME) \
> -               -n $($(PKG)_RAW_BASE_NAME) \
> -               $($(PKG)_DL_OPTS)
> -endef
> +ifeq ($(BR2_PRIMARY_SITE_ONLY),)
> +DOWNLOAD_URIS += \
> +       -u $($(PKG)_SITE_METHOD)+$(dir $(1))
> +ifneq ($(call qstrip,$(BR2_BACKUP_SITE)),)
> +DOWNLOAD_URIS += \
> +       -u $(call getschemeplusuri,$(BR2_BACKUP_SITE),urlencode)
> +endif
> +endif
>
> -define DOWNLOAD_SVN
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b svn \
> -               -o $(DL_DIR)/$($(PKG)_SOURCE) \
> -               $(QUIET) \
> -               -- \
> -               -u $($(PKG)_SITE) \
> +define DOWNLOAD
> +       $(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),BR_NO_CHECK_HASH_FOR=$(notdir $(1));) \
> +       $(EXTRA_ENV) $(DL_WRAPPER) \
>                 -c $($(PKG)_DL_VERSION) \
> -               -n $($(PKG)_RAW_BASE_NAME) \
> -               $($(PKG)_DL_OPTS)
> -endef
> -
> -# SCP URIs should be of the form scp://[user@]host:filepath
> -# Note that filepath is relative to the user's home directory, so you may want
> -# to prepend the path with a slash: scp://[user@]host:/absolutepath
> -define DOWNLOAD_SCP
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b scp \
> -               -o $(DL_DIR)/$(2) \
> +               -f $(notdir $(1)) \
>                 -H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
> -               $(QUIET) \
> -               -- \
> -               -u '$(call stripurischeme,$(call qstrip,$(1)))' \
> -               $($(PKG)_DL_OPTS)
> -endef
> -
> -define DOWNLOAD_HG
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b hg \
> -               -o $(DL_DIR)/$($(PKG)_SOURCE) \
> -               $(QUIET) \
> -               -- \
> -               -u $($(PKG)_SITE) \
> -               -c $($(PKG)_DL_VERSION) \
>                 -n $($(PKG)_RAW_BASE_NAME) \
> -               $($(PKG)_DL_OPTS)
> -endef
> -
> -define DOWNLOAD_WGET
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b wget \
> -               -o $(DL_DIR)/$(2) \
> -               -H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
> -               $(QUIET) \
> -               -- \
> -               -u '$(call qstrip,$(1))' \
> -               $($(PKG)_DL_OPTS)
> -endef
> -
> -define DOWNLOAD_LOCALFILES
> -       $(EXTRA_ENV) $(DL_WRAPPER) -b cp \
> -               -o $(DL_DIR)/$(2) \
> -               -H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
> +               -N $($(PKG)_RAWNAME) \
> +               -o $(DL_DIR)/$(notdir $(1)) \
> +               $(if $($(PKG)_GIT_SUBMODULES),-r) \
> +               $(DOWNLOAD_URIS) \
>                 $(QUIET) \
>                 -- \
> -               -u $(call stripurischeme,$(call qstrip,$(1))) \
>                 $($(PKG)_DL_OPTS)
>  endef
> -
> -################################################################################
> -# DOWNLOAD -- Download helper. Will try to download source from:
> -# 1) BR2_PRIMARY_SITE if enabled
> -# 2) Download site, unless BR2_PRIMARY_SITE_ONLY is set
> -# 3) BR2_BACKUP_SITE if enabled, unless BR2_PRIMARY_SITE_ONLY is set
> -#
> -# Argument 1 is the source location
> -#
> -# E.G. use like this:
> -# $(call DOWNLOAD,$(FOO_SITE))
> -#
> -# For PRIMARY and BACKUP site, any ? in the URL is replaced by %3F. A ? in
> -# the URL is used to separate query arguments, but the PRIMARY and BACKUP
> -# sites serve just plain files.
> -################################################################################
> -
> -define DOWNLOAD
> -       $(call DOWNLOAD_INNER,$(1),$(notdir $(1)),DOWNLOAD)
> -endef
> -
> -define DOWNLOAD_INNER
> -       $(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),export BR_NO_CHECK_HASH_FOR=$(2);) \
> -       if test -n "$(call qstrip,$(BR2_PRIMARY_SITE))" ; then \
> -               case "$(call geturischeme,$(BR2_PRIMARY_SITE))" in \
> -                       file) $(call $(3)_LOCALFILES,$(BR2_PRIMARY_SITE)/$(2),$(2)) && exit ;; \
> -                       scp) $(call $(3)_SCP,$(BR2_PRIMARY_SITE)/$(2),$(2)) && exit ;; \
> -                       *) $(call $(3)_WGET,$(BR2_PRIMARY_SITE)/$(subst ?,%3F,$(2)),$(2)) && exit ;; \
> -               esac ; \
> -       fi ; \
> -       if test "$(BR2_PRIMARY_SITE_ONLY)" = "y" ; then \
> -               exit 1 ; \
> -       fi ; \
> -       if test -n "$(1)" ; then \
> -               case "$($(PKG)_SITE_METHOD)" in \
> -                       git) $($(3)_GIT) && exit ;; \
> -                       svn) $($(3)_SVN) && exit ;; \
> -                       cvs) $($(3)_CVS) && exit ;; \
> -                       bzr) $($(3)_BZR) && exit ;; \
> -                       file) $($(3)_LOCALFILES) && exit ;; \
> -                       scp) $($(3)_SCP) && exit ;; \
> -                       hg) $($(3)_HG) && exit ;; \
> -                       *) $(call $(3)_WGET,$(1),$(2)) && exit ;; \
> -               esac ; \
> -       fi ; \
> -       if test -n "$(call qstrip,$(BR2_BACKUP_SITE))" ; then \
> -               $(call $(3)_WGET,$(BR2_BACKUP_SITE)/$(subst ?,%3F,$(2)),$(2)) && exit ; \
> -       fi ; \
> -       exit 1
> -endef
> diff --git a/support/download/cvs b/support/download/cvs
> index 69d5c71f28..3f77b849e4 100755
> --- a/support/download/cvs
> +++ b/support/download/cvs
> @@ -21,7 +21,7 @@ while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
>      case "${OPT}" in
>      q)  verbose=-Q;;
>      o)  output="${OPTARG}";;
> -    u)  uri="${OPTARG}";;
> +    u)  uri="${OPTARG#*://}";;
>      c)  rev="${OPTARG}";;
>      N)  rawname="${OPTARG}";;
>      n)  basename="${OPTARG}";;
> diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
> index 510e7ef852..67e9742767 100755
> --- a/support/download/dl-wrapper
> +++ b/support/download/dl-wrapper
> @@ -19,31 +19,34 @@
>  # We want to catch any unexpected failure, and exit immediately.
>  set -e
>
> -export BR_BACKEND_DL_GETOPTS=":hc:o:n:N:H:ru:q"
> +export BR_BACKEND_DL_GETOPTS=":hc:o:n:N:H:ru:qf:e"
>
>  main() {
>      local OPT OPTARG
>      local backend output hfile recurse quiet
> +    local -a uris
>
>      # Parse our options; anything after '--' is for the backend
> -    while getopts :hb:o:H:rq OPT; do
> +    while getopts ":hc:o:n:N:H:rf:u:q" OPT; do
>          case "${OPT}" in
>          h)  help; exit 0;;
> -        b)  backend="${OPTARG}";;
> +        c)  cset="${OPTARG}";;
>          o)  output="${OPTARG}";;
> +        n)  raw_base_name="${OPTARG}";;
> +        N)  raw_name="${OPTARG}";;
>          H)  hfile="${OPTARG}";;
>          r)  recurse="-r";;
> +        f)  filename="${OPTARG}";;
> +        u)  uris+=( "${OPTARG}" );;
>          q)  quiet="-q";;
>          :)  error "option '%s' expects a mandatory argument\n" "${OPTARG}";;
>          \?) error "unknown option '%s'\n" "${OPTARG}";;
>          esac
>      done
> +
>      # Forget our options, and keep only those for the backend
>      shift $((OPTIND-1))
>
> -    if [ -z "${backend}" ]; then
> -        error "no backend specified, use -b\n"
> -    fi
>      if [ -z "${output}" ]; then
>          error "no output specified, use -o\n"
>      fi
> @@ -77,28 +80,64 @@ main() {
>      tmpd="$(mktemp -d "${BUILD_DIR}/.${output##*/}.XXXXXX")"
>      tmpf="${tmpd}/output"
>
> -    # Helpers expect to run in a directory that is *really* trashable, so
> -    # they are free to create whatever files and/or sub-dirs they might need.
> -    # Doing the 'cd' here rather than in all backends is easier.
> -    cd "${tmpd}"
> -
> -    # If the backend fails, we can just remove the temporary directory to
> -    # remove all the cruft it may have left behind. Then we just exit in
> -    # error too.
> -    if ! "${OLDPWD}/support/download/${backend}" \
> -            ${quiet} ${recurse} \
> -            -o "${tmpf}" "${@}"
> -    then
> -        rm -rf "${tmpd}"
> -        exit 1
> -    fi
> +    # Look through all the URIs that we were given to download the package
> +    # source
> +    download_and_check=0
> +    for uri in "${uris[@]}"; do
> +        backend=${uri%+*}
> +        case "${backend}" in
> +            git|svn|cvs|bzr|file|scp|hg) ;;
> +            *) backend="wget" ;;
> +        esac
> +        uri=${uri#*+}
> +
> +        urlencode=${backend#*|}
> +        # urlencode must be "urlencode"
> +        [ "${urlencode}" != "urlencode" ] && urlencode=""
> +
> +        # Helpers expect to run in a directory that is *really* trashable, so
> +        # they are free to create whatever files and/or sub-dirs they might need.
> +        # Doing the 'cd' here rather than in all backends is easier.
> +        cd "${tmpd}"
> +
> +        # If the backend fails, we can just remove the content of the temporary
> +        # directory to get rid of all the cruft it may have left behind, and try
> +        # the next URI until one succeeds. Once out of URIs to try, we need to
> +        # clean up and exit.
> +        if ! "${OLDPWD}/support/download/${backend}" \
> +                $([ -n "${urlencode}" ] && printf %s '-e') \
> +                -c "${cset}" \
> +                -n "${raw_base_name}" \
> +                -N "${raw_name}" \
> +                -f "${filename}" \
> +                -u "${uri}" \
> +                -o "${tmpf}" \
> +                ${quiet} ${recurse} "${@}"
> +        then
> +            rm -rf "${tmpd:?}/"*
> +            # cd back to keep path coherence
> +            cd "${OLDPWD}"
> +            continue
> +        fi
>
> -    # cd back to free the temp-dir, so we can remove it later
> -    cd "${OLDPWD}"
> +        # cd back to free the temp-dir, so we can remove it later
> +        cd "${OLDPWD}"
>
> -    # Check if the downloaded file is sane, and matches the stored hashes
> -    # for that file
> -    if ! support/download/check-hash ${quiet} "${hfile}" "${tmpf}" "${output##*/}"; then
> +        # Check if the downloaded file is sane, and matches the stored hashes
> +        # for that file
> +        if ! support/download/check-hash ${quiet} "${hfile}" "${tmpf}" "${output##*/}"; then
> +            rm -rf "${tmpd:?}/"*
> +            # cd back to keep path coherence
> +            cd "${OLDPWD}"
> +            continue
> +        fi
> +        download_and_check=1
> +        break
> +    done
> +
> +    # We tried every possible URI; none worked or matched the available
> +    # hash. *ABORT MISSION*
> +    if [ "${download_and_check}" -eq 0 ]; then
>          rm -rf "${tmpd}"
>          exit 1
>      fi
> @@ -164,16 +203,13 @@ DESCRIPTION
>
>      -h  This help text.
>
> -    -b BACKEND
> -        Wrap the specified BACKEND. Known backends are:
> -            bzr     Bazaar
> -            cp      Local files
> -            cvs     Concurrent Versions System
> -            git     Git
> -            hg      Mercurial
> -            scp     Secure copy
> -            svn     Subversion
> -            wget    HTTP download
> +    -u URIs
> +        The URI to get the file from; the URI must respect the format given
> +        in the example.
> +        You may give as many '-u URI' options as you want; the script will
> +        stop at the first successful download.
> +
> +        Example: backend+URI; git+http://example.com or http+http://example.com
>
>      -o FILE
>          Store the downloaded archive in FILE.
> diff --git a/support/download/cp b/support/download/file
> similarity index 90%
> rename from support/download/cp
> rename to support/download/file
> index 52fe2de83d..a3e616a181 100755
> --- a/support/download/cp
> +++ b/support/download/file
> @@ -3,7 +3,7 @@
>  # We want to catch any unexpected failure, and exit immediately
>  set -e
>
> -# Download helper for cp, to be called from the download wrapper script
> +# Download helper for file, to be called from the download wrapper script
>  #
>  # Options:
>  #   -q          Be quiet.
> @@ -23,7 +23,7 @@ while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
>      case "${OPT}" in
>      q)  verbose=;;
>      o)  output="${OPTARG}";;
> -    u)  source="${OPTARG}";;
> +    u)  source="${OPTARG#*://}";;
>      :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
>      \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
>      esac
> diff --git a/support/download/wget b/support/download/wget
> index fece6663ca..c69e6071aa 100755
> --- a/support/download/wget
> +++ b/support/download/wget
> @@ -8,7 +8,9 @@ set -e
>  # Options:
>  #   -q          Be quiet.
>  #   -o FILE     Save into file FILE.
> +#   -f FILENAME The filename of the tarball to get at URL
>  #   -u URL      Download file at URL.
> +#   -e ENCODE   Tell wget to urlencode the filename passed to it
>  #
>  # Environment:
>  #   WGET     : the wget command to call
> @@ -18,7 +20,9 @@ while getopts "${BR_BACKEND_DL_GETOPTS}" OPT; do
>      case "${OPT}" in
>      q)  verbose=-q;;
>      o)  output="${OPTARG}";;
> +    f)  filename="${OPTARG}";;
>      u)  url="${OPTARG}";;
> +    e)  encode="-e";;
>      :)  printf "option '%s' expects a mandatory argument\n" "${OPTARG}"; exit 1;;
>      \?) printf "unknown option '%s'\n" "${OPTARG}" >&2; exit 1;;
>      esac
> @@ -32,4 +36,8 @@ _wget() {
>      eval ${WGET} "${@}"
>  }
>
> -_wget ${verbose} "${@}" -O "'${output}'" "'${url}'"
> +# Replace every '?' with '%3F' in the filename; only for the PRIMARY and BACKUP
> +# mirror
> +[ -n "${encode}" ] && filename=${filename//\?/%3F}
> +
> +_wget ${verbose} "${@}" -O "'${output}'" "'${url}/${filename}'"
> --
> 2.16.2
>

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 13/13] download: git: introduce cache feature
  2018-03-31 14:24 ` [Buildroot] [v3 13/13] download: git: introduce cache feature Maxime Hadjinlian
@ 2018-04-01 12:57   ` Yann E. MORIN
  2018-04-01 14:58     ` Arnout Vandecappelle
  0 siblings, 1 reply; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 12:57 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> Now we keep the git clone that we download and generate our tarball
> from there.
> The main goal here is that if you change the version of a package (say
> Linux), instead of cloning all over again, you will simply 'git fetch'
> the missing objects from the repo, then generate the tarball again.
> 
> This should speed up the 'source' part of the build significantly.
> 
> The drawback is that the DL_DIR will grow much larger; but time is more
> important than disk space nowadays.
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> ---
> v1 -> v2:
>    - Fix bad regex in the 'transform' option of tar (found by Peter
>    Seiderer)
> v2 -> v3:
>    - Change git fetch origin to use the uri of the package instead of
>    the name of the default remote 'origin' (Thomas Petazzoni)
> ---
>  support/download/git | 70 ++++++++++++++++++++++++++++++----------------------
>  1 file changed, 40 insertions(+), 30 deletions(-)
> 
> diff --git a/support/download/git b/support/download/git
> index 58a2c6ad9d..301f7e792a 100755
> --- a/support/download/git
> +++ b/support/download/git
> @@ -39,28 +39,34 @@ _git() {
>      eval ${GIT} "${@}"
>  }
>  
> -# Try a shallow clone, since it is faster than a full clone - but that only
> -# works if the version is a ref (tag or branch). Before trying to do a shallow
> -# clone we check if ${cset} is in the list provided by git ls-remote. If not
> -# we fall back on a full clone.
> -#
> -# Messages for the type of clone used are provided to ease debugging in case of
> -# problems
> -git_done=0
> -if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
> -    printf "Doing shallow clone\n"
> -    if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${basename}'"; then
> -        git_done=1
> -    else
> -        printf "Shallow clone failed, falling back to doing a full clone\n"
> +# We want to check if a cache of the git clone of this repo already exists.
> +git_cache="${BR2_DL_DIR}/${basename%%-*}/git"
> +
> +# If the cache directory already exists, don't try to clone.
> +if [ ! -d "${git_cache}" ]; then
> +    # Try a shallow clone, since it is faster than a full clone - but that
> +    # only works if the version is a ref (tag or branch). Before trying to do a
> +    # shallow clone we check if ${cset} is in the list provided by git
> +    # ls-remote. If not we fall back on a full clone.
> +    #
> +    # Messages for the type of clone used are provided to ease debugging in
> +    # case of problems
> +    git_done=0
> +    if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
> +        printf "Doing shallow clone\n"
> +        if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${git_cache}'"; then
> +            git_done=1
> +        else
> +            printf "Shallow clone failed, falling back to doing a full clone\n"
> +        fi
> +    fi
> +    if [ ${git_done} -eq 0 ]; then
> +        printf "Doing full clone\n"
> +        _git clone ${verbose} "${@}" "'${uri}'" "'${git_cache}'"
>      fi
> -fi
> -if [ ${git_done} -eq 0 ]; then
> -    printf "Doing full clone\n"
> -    _git clone ${verbose} "${@}" "'${uri}'" "'${basename}'"
>  fi
>  
> -pushd "${basename}" >/dev/null
> +pushd "${git_cache}" >/dev/null
>  
>  # Try to get the special refs exposed by some forges (pull-requests for
>  # github, changes for gerrit...). There is no easy way to know whether
> @@ -69,7 +75,7 @@ pushd "${basename}" >/dev/null
>  # below, if there is an issue anyway. Since most of the cset we're gonna
>  # have to clone are not such special refs, consign the output to oblivion
>  # so as not to alarm unsuspecting users, but still trace it as a warning.
> -if ! _git fetch origin "'${cset}:${cset}'" >/dev/null 2>&1; then
> +if ! _git fetch "'${uri}'" "'${cset}:${cset}'" >/dev/null 2>&1; then

This does not work with some servers, which refuse to directly serve
sha1s. E.g. github, when passed a sha1 as cset, will return:

    error: Server does not allow request for unadvertised object [SHA1]

It means we cannot explicitly request a sha1 from such servers.

The solution to that is to add, a little above this if-block, an
explicit fetch from the new remote.

Now, if we try to fetch from that remote, we don't get the new refs
either:

    $ git fetch https://some-server/some/repo
    $ git show [SHA1]
    fatal: bad object [SHA1]

The further tweak that we need is to redirect the original 'origin' to
the new remote first:

    $ git remote set-url origin https://some-server/some/repo
    $ git fetch -t
    $ git show [SHA1]
    [changeset is displayed]

So, here's the further patch I applied:

    diff --git a/support/download/git b/support/download/git
    index 301f7e792a..db74d67536 100755
    --- a/support/download/git
    +++ b/support/download/git
    @@ -68,6 +68,13 @@ fi
     
     pushd "${git_cache}" >/dev/null
     
    +printf "Fetching new and additional references...\n"
    +
    +# Redirect the tree to the current remote, so that we can fetch
    +# the required reference, whatever it is (tag, branch, sha1...)
    +_git remote set-url origin "'${uri}'"
    +_git fetch origin -t
    +
     # Try to get the special refs exposed by some forges (pull-requests for
     # github, changes for gerrit...). There is no easy way to know whether
     # the cset the user passed us is such a special ref or a tag or a sha1
    @@ -75,7 +82,7 @@ pushd "${git_cache}" >/dev/null
     # below, if there is an issue anyway. Since most of the cset we're gonna
     # have to clone are not such special refs, consign the output to oblivion
     # so as not to alarm unsuspecting users, but still trace it as a warning.
    -if ! _git fetch "'${uri}'" "'${cset}:${cset}'" >/dev/null 2>&1; then
    +if ! _git fetch origin "'${cset}:${cset}'" >/dev/null 2>&1; then
         printf "Could not fetch special ref '%s'; assuming it is not special.\n" "${cset}"
     fi
     

Let's see tomorrow if you grab it in your tree, or if I respin an
updated series from my own tree.

Otherwise: happy, much faster! :-)

Regards,
Yann E. MORIN.

>      printf "Could not fetch special ref '%s'; assuming it is not special.\n" "${cset}"
>  fi
>  
> @@ -86,20 +92,24 @@ if [ ${recurse} -eq 1 ]; then
>      _git submodule update --init --recursive
>  fi
>  
> -# We do not want the .git dir; we keep other .git files, in case they
> -# are the only files in their directory.
> +# Generate the archive, sort with the C locale so that it is reproducible
> +# We do not want the .git dir; we keep other .git
> +# files, in case they are the only files in their directory.
>  # The .git dir would generate non reproducible tarballs as it depends on
>  # the state of the remote server. It also would generate large tarballs
>  # (gigabytes for some linux trees) when a full clone took place.
> -rm -rf .git
> +find . -not -type d \
> +	-and -not -path "./.git/*" >"${BR2_DL_DIR}/${basename}.list"
> +LC_ALL=C sort <"${BR2_DL_DIR}/${basename}.list" >"${BR2_DL_DIR}/${basename}.list.sorted"
>  
> -popd >/dev/null
> -
> -# Generate the archive, sort with the C locale so that it is reproducible
> -find "${basename}" -not -type d >"${basename}.list"
> -LC_ALL=C sort <"${basename}.list" >"${basename}.list.sorted"
>  # Create GNU-format tarballs, since that's the format of the tarballs on
>  # sources.buildroot.org and used in the *.hash files
> -tar cf - --numeric-owner --owner=0 --group=0 --mtime="${date}" --format=gnu \
> -         -T "${basename}.list.sorted" >"${output}.tar"
> +tar cf - --transform="s/^\.$/${basename}/" \
> +	--numeric-owner --owner=0 --group=0 --mtime="${date}" --format=gnu \
> +         -T "${BR2_DL_DIR}/${basename}.list.sorted" >"${output}.tar"
>  gzip -6 -n <"${output}.tar" >"${output}"
> +
> +rm -f "${BR2_DL_DIR}/${basename}.list"
> +rm -f "${BR2_DL_DIR}/${basename}.list.sorted"
> +
> +popd >/dev/null
> -- 
> 2.16.2
> 

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 12/13] download: add flock call before dl-wrapper
  2018-03-31 14:24 ` [Buildroot] [v3 12/13] download: add flock call before dl-wrapper Maxime Hadjinlian
@ 2018-04-01 14:09   ` Yann E. MORIN
  2018-04-01 17:53     ` Yann E. MORIN
  0 siblings, 1 reply; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 14:09 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> In order to introduce the cache mechanisms, we need to have a lock on
> the $(LIBFOO_DL_DIR), otherwise it would be impossible to do parallel
> download (a shared DL_DIR for two buildroot instances).
> 
> To make sure the directory exists, the mkdir call has been removed from
> the dl-wrapper and put in the infrastructure.
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> ---
>  package/pkg-download.mk     | 4 +++-
>  support/download/dl-wrapper | 1 -
>  2 files changed, 3 insertions(+), 2 deletions(-)
> 
> diff --git a/package/pkg-download.mk b/package/pkg-download.mk
> index 23f3403098..dff52007e6 100644
> --- a/package/pkg-download.mk
> +++ b/package/pkg-download.mk
> @@ -19,6 +19,7 @@ SSH := $(call qstrip,$(BR2_SSH))
>  export LOCALFILES := $(call qstrip,$(BR2_LOCALFILES))
>  
>  DL_WRAPPER = support/download/dl-wrapper
> +FLOCK = flock $($(PKG)_DL_DIR)/

This means that two downloads running in two concurrent Buildroot
builds, doing a wget of different tarballs from the same package
(e.g. foo-1.tar and foo-2.tar), will not be able to complete in parallel,
and will be sequential, while we currently can do that, and it *is* safe
to do so.

We have a few objects we can flock, in a generic manner:

  - the .stamp_downloaded stamp file: this would allow concurrent
    Buildroot builds of wget downloads, but would not protect against
    concurrent git clones;

  - the $($(PKG)_DL_DIR): would protect against concurrent git clones,
    but forbids concurrent wget of the same package;

And a third and a fourth further option:

  - $($(PKG)_DL_DIR)/git: which would be the best of both world, but
    would need to be handled by the dl-wrapper itself when it calls
    the git backend (or hg, svn, cvs... when we later support those).

  - push the flock even further down into each individual backend,
    which would each be responsible for flocking only the individual
    commands that require locking.

For the fourth solution, we may be able to shoe-horn the flock call into
the internal _XXX wrappers (e.g. the _git() function of the git
backend).
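
A completely untested sketch of what I mean for the git backend, just to
illustrate (the lock file location is made up and up for bikeshedding):

    _git() {
        (
            # Take an exclusive lock for the duration of this one git command
            flock 9
            eval ${GIT} "${@}"
        ) 9>"${BR2_DL_DIR}/${basename%%-*}/.git.lock"
    }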

TBH, I believe that what you propose is Good Enough (TM), because the
third, optimised solution is just getting a bit more complex, as there
is no resource (file or directory) that we can flock, besides the
package's own download dir. The fourth is pushing things too fine-grained
to be entirely reliable...

So, I'm happy that we go with your proposed patch. But maybe expand the
commit log to explain a bit that restriction.

Regards,
Yann E. MORIN.

>  # DL_DIR may have been set already from the environment
>  ifeq ($(origin DL_DIR),undefined)
> @@ -91,7 +92,8 @@ endif
>  
>  define DOWNLOAD
>  	$(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),BR_NO_CHECK_HASH_FOR=$(notdir $(1));) \
> -	$(EXTRA_ENV) $(DL_WRAPPER) \
> +	$(Q)mkdir -p $($(PKG)_DL_DIR)/
> +	$(EXTRA_ENV) $(FLOCK) $(DL_WRAPPER) \
>  		-c $($(PKG)_DL_VERSION) \
>  		-f $(notdir $(1)) \
>  		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
> diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
> index 49caf3717b..67e9742767 100755
> --- a/support/download/dl-wrapper
> +++ b/support/download/dl-wrapper
> @@ -50,7 +50,6 @@ main() {
>      if [ -z "${output}" ]; then
>          error "no output specified, use -o\n"
>      fi
> -    mkdir -p "$(dirname "${output}")"
>  
>      # If the output file already exists and:
>      # - there's no .hash file: do not download it again and exit promptly
> -- 
> 2.16.2
> 

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 11/13] help/manual: update help about the new $(LIBFOO_DL_DIR)
  2018-03-31 14:24 ` [Buildroot] [v3 11/13] help/manual: update help about the new $(LIBFOO_DL_DIR) Maxime Hadjinlian
@ 2018-04-01 14:15   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 14:15 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> ---
>  Config.in                               | 3 +++
>  docs/manual/adding-packages-generic.txt | 6 +++---
>  2 files changed, 6 insertions(+), 3 deletions(-)
> 
> diff --git a/Config.in b/Config.in
> index a74b24ae45..55b6e5e2a1 100644
> --- a/Config.in
> +++ b/Config.in
> @@ -200,6 +200,9 @@ config BR2_DL_DIR
>  	  If the Linux shell environment has defined the BR2_DL_DIR
>  	  environment variable, then this overrides this configuration
>  	  item.
> +	  The directory is organized with a subdirectory for each package.
> +	  Each package has its own $(LIBFOO_DL_DIR) variable that can be used
> +	  to find the correct path.
>  
>  	  The default is $(TOPDIR)/dl
>  
> diff --git a/docs/manual/adding-packages-generic.txt b/docs/manual/adding-packages-generic.txt
> index 9d1428ad40..a7578a782d 100644
> --- a/docs/manual/adding-packages-generic.txt
> +++ b/docs/manual/adding-packages-generic.txt
> @@ -263,7 +263,7 @@ information is (assuming the package name is +libfoo+) :
>    the file using this URL. Otherwise, Buildroot will assume the file
>    to be downloaded is located at +LIBFOO_SITE+. Buildroot will not do
>    anything with those additional files, except download them: it will
> -  be up to the package recipe to use them from +$(DL_DIR)+.
> +  be up to the package recipe to use them from +$(LIBFOO_DL_DIR)+.
>  
>  * +LIBFOO_SITE_METHOD+ determines the method used to fetch or copy the
>    package source code. In many cases, Buildroot guesses the method
> @@ -554,8 +554,8 @@ In the action definitions, you can use the following variables:
>  * +$(@D)+, which contains the directory in which the package source
>    code has been uncompressed.
>  
> -* +$(DL_DIR)+ contains the path to the directory where all the downloads made
> -  by Buildroot are stored.
> +* +$(LIBFOO_DL_DIR)+ contains the path to the directory where all the downloads
> +  made by Buildroot are stored.

... where all the downloads made by Buildroot for +libfoo+ are stored.

Regards,
Yann E. MORIN.

>  * +$(TARGET_CC)+, +$(TARGET_LD)+, etc. to get the target
>    cross-compilation utilities
> -- 
> 2.16.2
> 

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 10/13] package: share downloaded files for big packages
  2018-03-31 14:24 ` [Buildroot] [v3 10/13] package: share downloaded files for big packages Maxime Hadjinlian
@ 2018-04-01 14:18   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 14:18 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> From: "Yann E. MORIN" <yann.morin.1998@free.fr>
> 
> Signed-off-by: "Yann E. MORIN" <yann.morin.1998@free.fr>
> Cc: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> Cc: Thomas Petazzoni <thomas.petazzoni@free-electrons.com>

You also modified the patch, and you carried it, so you should also have
added your own SoB line. ;-)

I would also note that I only changed those three packages, just as an
example and a first trial of the _SAME_SOURCE_AS feature. There might be
other packages that share the same sources, and we'd need to convert
them if the _SAME_SOURCE_AS feature is accepted.

Regards,
Yann E. MORIN.

> ---
>  package/gcc/gcc-initial/gcc-initial.mk   | 5 +++++
>  package/linux-headers/linux-headers.mk   | 3 +++
>  package/mesa3d-headers/mesa3d-headers.mk | 1 +
>  3 files changed, 9 insertions(+)
> 
> diff --git a/package/gcc/gcc-initial/gcc-initial.mk b/package/gcc/gcc-initial/gcc-initial.mk
> index c476b2faeb..07da50afa8 100644
> --- a/package/gcc/gcc-initial/gcc-initial.mk
> +++ b/package/gcc/gcc-initial/gcc-initial.mk
> @@ -8,6 +8,11 @@ GCC_INITIAL_VERSION = $(GCC_VERSION)
>  GCC_INITIAL_SITE = $(GCC_SITE)
>  GCC_INITIAL_SOURCE = $(GCC_SOURCE)
>  
> +# We do not have a 'gcc' package per-se; we only have two incarnations,
> +# gcc-initial and gcc-final. gcc-initial is just an internal step that
> +# users should not care about, while gcc-final is the one they shall see.
> +HOST_GCC_INITIAL_SAME_SOURCE_AS = host-gcc-final
> +
>  HOST_GCC_INITIAL_DEPENDENCIES = $(HOST_GCC_COMMON_DEPENDENCIES)
>  
>  HOST_GCC_INITIAL_EXCLUDES = $(HOST_GCC_EXCLUDES)
> diff --git a/package/linux-headers/linux-headers.mk b/package/linux-headers/linux-headers.mk
> index f1e3790608..6f4071c61a 100644
> --- a/package/linux-headers/linux-headers.mk
> +++ b/package/linux-headers/linux-headers.mk
> @@ -82,6 +82,9 @@ endif
>  
>  endif # ! BR2_KERNEL_HEADERS_AS_KERNEL
>  
> +# linux-headers really is the same as the linux package
> +LINUX_HEADERS_SAME_SOURCE_AS = linux
> +
>  LINUX_HEADERS_LICENSE = GPL-2.0
>  LINUX_HEADERS_LICENSE_FILES = COPYING
>  
> diff --git a/package/mesa3d-headers/mesa3d-headers.mk b/package/mesa3d-headers/mesa3d-headers.mk
> index b48d965d8a..39a25270db 100644
> --- a/package/mesa3d-headers/mesa3d-headers.mk
> +++ b/package/mesa3d-headers/mesa3d-headers.mk
> @@ -15,6 +15,7 @@ endif
>  MESA3D_HEADERS_VERSION = 18.0.0
>  MESA3D_HEADERS_SOURCE = mesa-$(MESA3D_HEADERS_VERSION).tar.xz
>  MESA3D_HEADERS_SITE = https://mesa.freedesktop.org/archive
> +MESA3D_HEADERS_SAME_SOURCE_AS = mesa3d
>  MESA3D_HEADERS_LICENSE = MIT, SGI, Khronos
>  MESA3D_HEADERS_LICENSE_FILES = docs/license.html
>  
> -- 
> 2.16.2
> 

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 09/13] pkg-generic: introduce _SAME_SOURCE_AS
  2018-03-31 14:24 ` [Buildroot] [v3 09/13] pkg-generic: introduce _SAME_SOURCE_AS Maxime Hadjinlian
@ 2018-04-01 14:26   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 14:26 UTC (permalink / raw)
  To: buildroot

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> This per package variable can be used to specify that a package shares
> the same sources as another package.
> 
> The use case here is for example linux-headers and linux, which share
> the same sources (because they are the same upstream project), so we
> don't want to download the kernel twice, nor store it multiple times
> either.
> 
> Make will automatically try to help by introducing leading and trailing
> spaces when replacing a line-continuation '\', so we need to call
> $(strip) on the variable.
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> ---
>  package/pkg-generic.mk | 4 +++-
>  1 file changed, 3 insertions(+), 1 deletion(-)
> 
> diff --git a/package/pkg-generic.mk b/package/pkg-generic.mk
> index b1342228c8..f3829e9912 100644
> --- a/package/pkg-generic.mk
> +++ b/package/pkg-generic.mk
> @@ -428,7 +428,9 @@ endif
>  
>  $(2)_BASE_NAME	= $$(if $$($(2)_VERSION),$(1)-$$($(2)_VERSION),$(1))
>  $(2)_RAW_BASE_NAME = $$(if $$($(2)_VERSION),$$($(2)_RAWNAME)-$$($(2)_VERSION),$$($(2)_RAWNAME))
> -$(2)_DL_DIR 	=  $$(DL_DIR)/$$($(2)_RAWNAME)
> +$(2)_DL_DIR	= $$(strip $$(if $$($(2)_SAME_SOURCE_AS), \
> +        $$($$(call UPPERCASE,$$($(2)_SAME_SOURCE_AS))_DL_DIR), \
> +        $$(DL_DIR)/$$($(2)_RAWNAME)))

Indentation with TABs, please.

May I suggest an alternate indentation (spaces here for easy display):

$(2)_DL_DIR = $$(strip $$(if $$($(2)_SAME_SOURCE_AS), \
                                $$($$(call UPPERCASE,$$($(2)_SAME_SOURCE_AS))_DL_DIR), \
                                $$(DL_DIR)/$$($(2)_RAWNAME)) \
                )

Regards,
Yann E. MORIN.

>  $(2)_DIR	=  $$(BUILD_DIR)/$$($(2)_BASE_NAME)
>  
>  ifndef $(2)_SUBDIR
> -- 
> 2.16.2
> 

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 08/13] pkg-download: support new subdir for mirrors
  2018-03-31 14:24 ` [Buildroot] [v3 08/13] pkg-download: support new subdir for mirrors Maxime Hadjinlian
@ 2018-04-01 14:42   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 14:42 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> Since we introduced subdirectories in the DL_DIR, we need to support them
> in the PRIMARY and BACKUP mirrors as they evolve to the new tree
> structure.
> 
> We first check the new URI (with the subdir), and in case of failure, we
> check without it.
> By checking both URIs, we ensure that old mirrors are usable.
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>

Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>

Regards,
Yann E. MORIN.

> ---
>  package/pkg-download.mk | 2 ++
>  1 file changed, 2 insertions(+)
> 
> diff --git a/package/pkg-download.mk b/package/pkg-download.mk
> index 0717943bb0..23f3403098 100644
> --- a/package/pkg-download.mk
> +++ b/package/pkg-download.mk
> @@ -75,6 +75,7 @@ export BR_NO_CHECK_HASH_FOR =
>  
>  ifneq ($(call qstrip,$(BR2_PRIMARY_SITE)),)
>  DOWNLOAD_URIS += \
> +	-u $(call getschemeplusuri,$(BR2_PRIMARY_SITE)/$(notdir $($(PKG)_DL_DIR)),urlencode) \
>  	-u $(call getschemeplusuri,$(BR2_PRIMARY_SITE),urlencode)
>  endif
>  
> @@ -83,6 +84,7 @@ DOWNLOAD_URIS += \
>  	-u $($(PKG)_SITE_METHOD)+$(dir $(1))
>  ifneq ($(call qstrip,$(BR2_BACKUP_SITE)),)
>  DOWNLOAD_URIS += \
> +	-u $(call getschemeplusuri,$(BR2_BACKUP_SITE)/$(notdir $($(PKG)_DL_DIR)),urlencode) \
>  	-u $(call getschemeplusuri,$(BR2_BACKUP_SITE),urlencode)
>  endif
>  endif
> -- 
> 2.16.2
> 

-- 
.-----------------.--------------------.------------------.--------------------.
|  Yann E. MORIN  | Real-Time Embedded | /"\ ASCII RIBBON | Erics' conspiracy: |
| +33 662 376 056 | Software  Designer | \ / CAMPAIGN     |  ___               |
| +33 223 225 172 `------------.-------:  X  AGAINST      |  \e/  There is no  |
| http://ymorin.is-a-geek.org/ | _/*\_ | / \ HTML MAIL    |   v   conspiracy.  |
'------------------------------^-------^------------------^--------------------'

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 13/13] download: git: introduce cache feature
  2018-04-01 12:57   ` Yann E. MORIN
@ 2018-04-01 14:58     ` Arnout Vandecappelle
  2018-04-01 18:13       ` Yann E. MORIN
  0 siblings, 1 reply; 26+ messages in thread
From: Arnout Vandecappelle @ 2018-04-01 14:58 UTC (permalink / raw)
  To: buildroot



On 01-04-18 14:57, Yann E. MORIN wrote:
> Maxime, All,
> 
> On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
>> Now we keep the git clone that we download and generate our tarball
>> from there.
>> The main goal here is that if you change the version of a package (say
>> Linux), instead of cloning all over again, you will simply 'git fetch'
>> the missing objects from the repo, then generate the tarball again.
>>
>> This should speed up the 'source' part of the build significantly.
>>
>> The drawback is that the DL_DIR will grow much larger; but time is more
>> important than disk space nowadays.
>>
>> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
>> ---
>> v1 -> v2:
>>    - Fix bad regex in the 'transform' option of tar (found by Peter
>>    Seiderer)
>> v2 -> v3:
>>    - Change git fetch origin to use the uri of the package instead of
>>    the name of the default remote 'origin' (Thomas Petazzoni)
>> ---
>>  support/download/git | 70 ++++++++++++++++++++++++++++++----------------------
>>  1 file changed, 40 insertions(+), 30 deletions(-)
>>
>> diff --git a/support/download/git b/support/download/git
>> index 58a2c6ad9d..301f7e792a 100755
>> --- a/support/download/git
>> +++ b/support/download/git
>> @@ -39,28 +39,34 @@ _git() {
>>      eval ${GIT} "${@}"
>>  }
>>  
>> -# Try a shallow clone, since it is faster than a full clone - but that only
>> -# works if the version is a ref (tag or branch). Before trying to do a shallow
>> -# clone we check if ${cset} is in the list provided by git ls-remote. If not
>> -# we fall back on a full clone.
>> -#
>> -# Messages for the type of clone used are provided to ease debugging in case of
>> -# problems
>> -git_done=0
>> -if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
>> -    printf "Doing shallow clone\n"
>> -    if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${basename}'"; then
>> -        git_done=1
>> -    else
>> -        printf "Shallow clone failed, falling back to doing a full clone\n"
>> +# We want to check if a cache of the git clone of this repo already exists.
>> +git_cache="${BR2_DL_DIR}/${basename%%-*}/git"
>> +
>> +# If the cache directory already exists, don't try to clone.
>> +if [ ! -d "${git_cache}" ]; then
>> +    # Try a shallow clone, since it is faster than a full clone - but that
>> +    # only works if the version is a ref (tag or branch). Before trying to do a
>> +    # shallow clone we check if ${cset} is in the list provided by git
>> +    # ls-remote. If not we fall back on a full clone.
>> +    #
>> +    # Messages for the type of clone used are provided to ease debugging in
>> +    # case of problems
>> +    git_done=0
>> +    if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
>> +        printf "Doing shallow clone\n"
>> +        if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${git_cache}'"; then
>> +            git_done=1
>> +        else
>> +            printf "Shallow clone failed, falling back to doing a full clone\n"
>> +        fi
>> +    fi
>> +    if [ ${git_done} -eq 0 ]; then
>> +        printf "Doing full clone\n"
>> +        _git clone ${verbose} "${@}" "'${uri}'" "'${git_cache}'"

 Can't we get rid of all the code above, and instead initialize an empty repository:

_git init "'${git_cache}'"
cd "'${git_cache}'"
_git remote set-url origin "'${uri}'"

 then do a fetch, and make it support shallow as well:

if [ -n "$(_git ls-remote origin "'${cset}'" 2>&1)" ]; then
  if _git fetch "${@}" --depth 1 origin "'${cset}'"; then
    git_done=1
  fi
fi
if [ ${git_done} -eq 0 ]; then
 _git fetch origin -t
fi

 Would this work?

>>      fi
>> -fi
>> -if [ ${git_done} -eq 0 ]; then
>> -    printf "Doing full clone\n"
>> -    _git clone ${verbose} "${@}" "'${uri}'" "'${basename}'"
>>  fi
>>  
>> -pushd "${basename}" >/dev/null
>> +pushd "${git_cache}" >/dev/null
>>  
>>  # Try to get the special refs exposed by some forges (pull-requests for
>>  # github, changes for gerrit...). There is no easy way to know whether
>> @@ -69,7 +75,7 @@ pushd "${basename}" >/dev/null
>>  # below, if there is an issue anyway. Since most of the cset we're gonna
>>  # have to clone are not such special refs, consign the output to oblivion
>>  # so as not to alarm unsuspecting users, but still trace it as a warning.
>> -if ! _git fetch origin "'${cset}:${cset}'" >/dev/null 2>&1; then
>> +if ! _git fetch "'${uri}'" "'${cset}:${cset}'" >/dev/null 2>&1; then
> 
> This does not work with some servers, which refuse to directly serve
> sha1s. E.g. github, when passed a sha1 as cset, will return:
> 
>     error: Server does not allow request for unadvertised object [SHA1]
> 
> It means we cannot explicitly request a sha1 from such servers.
> 
> The solution to that is to add, a little above this if-block, an
> explicit fetch from the new remote.
> 
> Now, if we try to fetch from that remote, we don't get the new refs
> either:
> 
>     $ git fetch https://some-server/some/repo
>     $ git show [SHA1]
>     fatal: bad object [SHA1]
> 
> The further tweak that we need is to redirect the original 'origin' to
> the new remote first:
> 
>     $ git remote set-url origin https://some-server/some/repo

 I don't particularly like how the git cache will have the last used URL as its
'origin', but I can't find a better approach.

 Regards,
 Arnout

>     $ git fetch -t
>     $ git show [SHA1]
>     [changeset is displayed]
> 
> So, here's the further patch I applied:
> 
>     diff --git a/support/download/git b/support/download/git
>     index 301f7e792a..db74d67536 100755
>     --- a/support/download/git
>     +++ b/support/download/git
>     @@ -68,6 +68,13 @@ fi
>      
>      pushd "${git_cache}" >/dev/null
>      
>     +printf "Fetching new and additional references...\n"
>     +
>     +# Redirect the tree to the current remote, so that we can fetch
>     +# the required reference, whatever it is (tag, branch, sha1...)
>     +_git remote set-url origin "'${uri}'"
>     +_git fetch origin -t
>     +
>      # Try to get the special refs exposed by some forges (pull-requests for
>      # github, changes for gerrit...). There is no easy way to know whether
>      # the cset the user passed us is such a special ref or a tag or a sha1
>     @@ -75,7 +82,7 @@ pushd "${git_cache}" >/dev/null
>      # below, if there is an issue anyway. Since most of the cset we're gonna
>      # have to clone are not such special refs, consign the output to oblivion
>      # so as not to alarm unsuspecting users, but still trace it as a warning.
>     -if ! _git fetch "'${uri}'" "'${cset}:${cset}'" >/dev/null 2>&1; then
>     +if ! _git fetch origin "'${cset}:${cset}'" >/dev/null 2>&1; then
>          printf "Could not fetch special ref '%s'; assuming it is not special.\n" "${cset}"
>      fi
>      
> 
> Let's see tomorrow if you grab it in your tree, or if I respin an
> updated series from my own tree.
> 
> Otherwise: happy, much faster! :-)
> 
> Regards,
> Yann E. MORIN.
> 
>>      printf "Could not fetch special ref '%s'; assuming it is not special.\n" "${cset}"
>>  fi
>>  
>> @@ -86,20 +92,24 @@ if [ ${recurse} -eq 1 ]; then
>>      _git submodule update --init --recursive
>>  fi
>>  
>> -# We do not want the .git dir; we keep other .git files, in case they
>> -# are the only files in their directory.
>> +# Generate the archive, sort with the C locale so that it is reproducible
>> +# We do not want the .git dir; we keep other .git
>> +# files, in case they are the only files in their directory.
>>  # The .git dir would generate non reproducible tarballs as it depends on
>>  # the state of the remote server. It also would generate large tarballs
>>  # (gigabytes for some linux trees) when a full clone took place.
>> -rm -rf .git
>> +find . -not -type d \
>> +	-and -not -path "./.git/*" >"${BR2_DL_DIR}/${basename}.list"
>> +LC_ALL=C sort <"${BR2_DL_DIR}/${basename}.list" >"${BR2_DL_DIR}/${basename}.list.sorted"
>>  
>> -popd >/dev/null
>> -
>> -# Generate the archive, sort with the C locale so that it is reproducible
>> -find "${basename}" -not -type d >"${basename}.list"
>> -LC_ALL=C sort <"${basename}.list" >"${basename}.list.sorted"
>>  # Create GNU-format tarballs, since that's the format of the tarballs on
>>  # sources.buildroot.org and used in the *.hash files
>> -tar cf - --numeric-owner --owner=0 --group=0 --mtime="${date}" --format=gnu \
>> -         -T "${basename}.list.sorted" >"${output}.tar"
>> +tar cf - --transform="s/^\.$/${basename}/" \
>> +	--numeric-owner --owner=0 --group=0 --mtime="${date}" --format=gnu \
>> +         -T "${BR2_DL_DIR}/${basename}.list.sorted" >"${output}.tar"
>>  gzip -6 -n <"${output}.tar" >"${output}"
>> +
>> +rm -f "${BR2_DL_DIR}/${basename}.list"
>> +rm -f "${BR2_DL_DIR}/${basename}.list.sorted"
>> +
>> +popd >/dev/null
>> -- 
>> 2.16.2
>>
> 

-- 
Arnout Vandecappelle                          arnout at mind be
Senior Embedded Software Architect            +32-16-286500
Essensium/Mind                                http://www.mind.be
G.Geenslaan 9, 3001 Leuven, Belgium           BE 872 984 063 RPR Leuven
LinkedIn profile: http://www.linkedin.com/in/arnoutvandecappelle
GPG fingerprint:  7493 020B C7E3 8618 8DEC 222C 82EB F404 F9AC 0DDF

^ permalink raw reply	[flat|nested] 26+ messages in thread

* [Buildroot] [v3 12/13] download: add flock call before dl-wrapper
  2018-04-01 14:09   ` Yann E. MORIN
@ 2018-04-01 17:53     ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 17:53 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-04-01 16:09 +0200, Yann E. MORIN spake thusly:
> On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> > In order to introduce the cache mechanisms, we need to have a lock on
> > the $(LIBFOO_DL_DIR), otherwise it would be impossible to do parallel
> > download (a shared DL_DIR for two buildroot instances).
> > 
> > To make sure the directory exists, the mkdir call has been removed from
> > the dl-wrapper and put in the infrastructure.
> > 
> > Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> > ---
> >  package/pkg-download.mk     | 4 +++-
> >  support/download/dl-wrapper | 1 -
> >  2 files changed, 3 insertions(+), 2 deletions(-)
> > 
> > diff --git a/package/pkg-download.mk b/package/pkg-download.mk
> > index 23f3403098..dff52007e6 100644
> > --- a/package/pkg-download.mk
> > +++ b/package/pkg-download.mk
> > @@ -19,6 +19,7 @@ SSH := $(call qstrip,$(BR2_SSH))
> >  export LOCALFILES := $(call qstrip,$(BR2_LOCALFILES))
> >  
> >  DL_WRAPPER = support/download/dl-wrapper
> > +FLOCK = flock $($(PKG)_DL_DIR)/
> 
> This means that two downloads running in two concurrent Buildroot
> builds, doing a wget of different tarballs from the same package
> (e.g. foo-1.tar and foo-2.tar), will not be able to complete in parallel,
> and will be sequential, while we currently can do that, and it *is* safe
> to do so.
> 
> We have a few objects we can flock, in a generic manner:
> 
>   - the .stamp_downloaded stamp file: this would allow concurrent
>     Buildroot builds of wget downloads, but would not protect against
>     concurrent git clones;
> 
>   - the $($(PKG)_DL_DIR): would protect against concurrent git clones,
>     but forbids concurrent wget of the same package;
> 
> And a third and a fourth further option:
> 
>   - the $($(PKG)_DL_DIR)/git: would be the best of both worlds, but
>     would need to be handled by the dl-wrapper itself when it calls
>     the git backend (or hg, svn, cvs... when we later support those).

After a live discussion, we've in fact decided to go that route later,
because it is not so complex to do after all:

  - in pkg-generic.mk: based on the _SITE_METHOD, set:
        $(PKG)_SITE_LOCK = YES

  - in pkg-download.mk, in macro DOWNLOAD, when calling dl-wrapper,
    add something like:
        $(if $($(PKG)_SITE_LOCK),-L $($(PKG)_DL_DIR))

  - in dl-wrapper, accept a new option, -L, taking as argument the resource
    to lock, in which case dl-wrapper would take the lock when calling the
    backend; otherwise, it would call the backend directly (see the sketch
    below).
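
For the dl-wrapper side, here is a completely untested sketch of what I
have in mind (option letter and variable name up for bikeshedding):

    # add 'L:' to the getopts string, then, in the option parsing:
    L)  flock_dir="${OPTARG}";;

and then build the command used to call the backend, and use it in place
of the bare "${OLDPWD}/support/download/${backend}" in the existing
'if ! ...' block:

    backend_cmd="${OLDPWD}/support/download/${backend}"
    if [ -n "${flock_dir}" ]; then
        # Serialise only when the infra asked for it (git, hg, svn, cvs...)
        backend_cmd="flock ${flock_dir}/ ${backend_cmd}"
    fi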

But as also said live, this can be an improvement for *after* this
series is applied.

Regards,
Yann E. MORIN.

>   - push the flock even further down into each individual backend,
>     which would each be responsible for flocking only the individual
>     commands that require locking.
> 
> For the fourth solution, we may be able to shoe-horn the flock call into
> the internal _XXX wrappers (e.g. the _git() function of the git
> backend).
> 
> TBH, I believe that what you propose is Good Enough (TM), because the
> third, optimised solution is just getting a bit more complex, as there
> is no resource (file or directory) that we can flock, besides the
> package's own download dir. The fourth is pushing things too fine-grained
> to be entirely reliable...
> 
> So, I'm happy that we go with your proposed patch. But maybe expand the
> commit log to explain a bit that restriction.
> 
> Regards,
> Yann E. MORIN.
> 
> >  # DL_DIR may have been set already from the environment
> >  ifeq ($(origin DL_DIR),undefined)
> > @@ -91,7 +92,8 @@ endif
> >  
> >  define DOWNLOAD
> >  	$(Q)$(if $(filter bzr cvs hg svn,$($(PKG)_SITE_METHOD)),BR_NO_CHECK_HASH_FOR=$(notdir $(1));) \
> > -	$(EXTRA_ENV) $(DL_WRAPPER) \
> > +	$(Q)mkdir -p $($(PKG)_DL_DIR)/
> > +	$(EXTRA_ENV) $(FLOCK) $(DL_WRAPPER) \
> >  		-c $($(PKG)_DL_VERSION) \
> >  		-f $(notdir $(1)) \
> >  		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
> > diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
> > index 49caf3717b..67e9742767 100755
> > --- a/support/download/dl-wrapper
> > +++ b/support/download/dl-wrapper
> > @@ -50,7 +50,6 @@ main() {
> >      if [ -z "${output}" ]; then
> >          error "no output specified, use -o\n"
> >      fi
> > -    mkdir -p "$(dirname "${output}")"
> >  
> >      # If the output file already exists and:
> >      # - there's no .hash file: do not download it again and exit promptly
> > -- 
> > 2.16.2
> > 
> 



* [Buildroot] [v3 13/13] download: git: introduce cache feature
  2018-04-01 14:58     ` Arnout Vandecappelle
@ 2018-04-01 18:13       ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 18:13 UTC (permalink / raw)
  To: buildroot

Arnout, All,

On 2018-04-01 16:58 +0200, Arnout Vandecappelle spake thusly:
> On 01-04-18 14:57, Yann E. MORIN wrote:
> > On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
[--SNIP--]
> >> +# If the cache directory already exists, don't try to clone.
> >> +if [ ! -d "${git_cache}" ]; then
> >> +    # Try a shallow clone, since it is faster than a full clone - but that
> >> +    # only works if the version is a ref (tag or branch). Before trying to
> >> +    # do a shallow clone, we check if ${cset} is in the list provided by
> >> +    # git ls-remote. If not, we fall back on a full clone.
> >> +    #
> >> +    # Messages for the type of clone used are provided to ease debugging in
> >> +    # case of problems
> >> +    git_done=0
> >> +    if [ -n "$(_git ls-remote "'${uri}'" "'${cset}'" 2>&1)" ]; then
> >> +        printf "Doing shallow clone\n"
> >> +        if _git clone ${verbose} "${@}" --depth 1 -b "'${cset}'" "'${uri}'" "'${git_cache}'"; then
> >> +            git_done=1
> >> +        else
> >> +            printf "Shallow clone failed, falling back to doing a full clone\n"
> >> +        fi
> >> +    fi
> >> +    if [ ${git_done} -eq 0 ]; then
> >> +        printf "Doing full clone\n"
> >> +        _git clone ${verbose} "${@}" "'${uri}'" "'${git_cache}'"
> 
>  Can't we get rid of all the code above, and instead initialize an empty repository:
> 
> _git init "'${git_cache}'"
> cd "${git_cache}"
> _git remote add origin "'${uri}'"
> 
>  then do a fetch, and make it support shallow as well:
> 
> git_done=0
> if [ -n "$(_git ls-remote origin "'${cset}'" 2>&1)" ]; then
>   if _git fetch "${@}" --depth 1 origin "'${cset}'"; then
>     git_done=1
>   fi
> fi
> if [ ${git_done} -eq 0 ]; then
>  _git fetch origin -t
> fi
> 
>  Would this work?

As seen IRL, this would work with "extra tweaks left as an exercise to
the reader". ;-)

Regards,
Yann E. MORIN.



* [Buildroot] [v3 07/13] pkg-generic: add a subdirectory to the DL_DIR
  2018-03-31 14:24 ` [Buildroot] [v3 07/13] pkg-generic: add a subdirectory to the DL_DIR Maxime Hadjinlian
@ 2018-04-01 18:17   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 18:17 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> With all the previous changes, we are now ready to add a subdirectory
> to the DL_DIR.
> The structure will now be DL_DIR/PKG_NAME/{FILE1,FILE2}.
> 
> This is needed for multiple reasons:
>     - Avoid patches with names like SHA1.patch lying flat in DL_DIR,
>     which makes it hard to know which packages they apply to
>     - Avoid possible collisions if two releases have the same name
>     (e.g. v01.tar)
>     - Allow handling a per-package git cache in the newly created
>     subdirectory.
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>

Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>

Regards,
Yann E. MORIN.

> ---
>  package/pkg-generic.mk | 2 +-
>  1 file changed, 1 insertion(+), 1 deletion(-)
> 
> diff --git a/package/pkg-generic.mk b/package/pkg-generic.mk
> index 7587324368..b1342228c8 100644
> --- a/package/pkg-generic.mk
> +++ b/package/pkg-generic.mk
> @@ -428,7 +428,7 @@ endif
>  
>  $(2)_BASE_NAME	= $$(if $$($(2)_VERSION),$(1)-$$($(2)_VERSION),$(1))
>  $(2)_RAW_BASE_NAME = $$(if $$($(2)_VERSION),$$($(2)_RAWNAME)-$$($(2)_VERSION),$$($(2)_RAWNAME))
> -$(2)_DL_DIR	=  $$(DL_DIR)
> +$(2)_DL_DIR 	=  $$(DL_DIR)/$$($(2)_RAWNAME)
>  $(2)_DIR	=  $$(BUILD_DIR)/$$($(2)_BASE_NAME)
>  
>  ifndef $(2)_SUBDIR
> -- 
> 2.16.2
> 
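
For illustration, the resulting download tree would look something like
this (the package and file names below are made up):

    $(DL_DIR)/
    |-- foo/
    |   |-- foo-1.2.3.tar.gz
    |   `-- 0001-some-fix.patch
    `-- bar/
        `-- bar-20180401.tar.xz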



* [Buildroot] [v3 06/13] support/download: make sure the download folder is created
  2018-03-31 14:24 ` [Buildroot] [v3 06/13] support/download: make sure the download folder is created Maxime Hadjinlian
@ 2018-04-01 18:18   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 18:18 UTC (permalink / raw)
  To: buildroot

Maxime, All,

On 2018-03-31 16:24 +0200, Maxime Hadjinlian spake thusly:
> At the moment, this means we make sure that BR2_DL_DIR is created; in
> the future, it will make sure that BR2_DL_DIR/PKG_NAME/ is created.
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>
> ---
>  support/download/dl-wrapper | 1 +
>  1 file changed, 1 insertion(+)
> 
> diff --git a/support/download/dl-wrapper b/support/download/dl-wrapper
> index 67e9742767..49caf3717b 100755
> --- a/support/download/dl-wrapper
> +++ b/support/download/dl-wrapper
> @@ -50,6 +50,7 @@ main() {
>      if [ -z "${output}" ]; then
>          error "no output specified, use -o\n"
>      fi
> +    mkdir -p "$(dirname "${output}")"

In a future patch, when adding flock, you move that mkdir out of the
wrapper, up into the DOWNLOAD macro. Why don't you do that right now
instead?

Regards,
Yann E. MORIN.

>      # If the output file already exists and:
>      # - there's no .hash file: do not download it again and exit promptly
> -- 
> 2.16.2
> 



* [Buildroot] [v3 05/13] pkg-{download, generic}: use new $($(PKG)_DL_DIR)
  2018-03-31 14:23 ` [Buildroot] [v3 05/13] pkg-{download, generic}: use new $($(PKG)_DL_DIR) Maxime Hadjinlian
@ 2018-04-01 18:20   ` Yann E. MORIN
  0 siblings, 0 replies; 26+ messages in thread
From: Yann E. MORIN @ 2018-04-01 18:20 UTC (permalink / raw)
  To: buildroot

Maxime, all,

On 2018-03-31 16:23 +0200, Maxime Hadjinlian spake thusly:
> Let the infrastructure use the new variable $(PKG)_DL_DIR
> 
> Signed-off-by: Maxime Hadjinlian <maxime.hadjinlian@gmail.com>

Reviewed-by: "Yann E. MORIN" <yann.morin.1998@free.fr>

Regards,
Yann E. MORIN.

> ---
>  package/pkg-download.mk | 2 +-
>  package/pkg-generic.mk  | 2 +-
>  2 files changed, 2 insertions(+), 2 deletions(-)
> 
> diff --git a/package/pkg-download.mk b/package/pkg-download.mk
> index 14ea4ff361..0717943bb0 100644
> --- a/package/pkg-download.mk
> +++ b/package/pkg-download.mk
> @@ -95,7 +95,7 @@ define DOWNLOAD
>  		-H $(PKGDIR)/$($(PKG)_RAWNAME).hash \
>  		-n $($(PKG)_RAW_BASE_NAME) \
>  		-N $($(PKG)_RAWNAME) \
> -		-o $(DL_DIR)/$(notdir $(1)) \
> +		-o $($(PKG)_DL_DIR)/$(notdir $(1)) \
>  		$(if $($(PKG)_GIT_SUBMODULES),-r) \
>  		$(DOWNLOAD_URIS) \
>  		$(QUIET) \
> diff --git a/package/pkg-generic.mk b/package/pkg-generic.mk
> index 6d82f7027e..7587324368 100644
> --- a/package/pkg-generic.mk
> +++ b/package/pkg-generic.mk
> @@ -609,7 +609,7 @@ $(2)_TARGET_DIRCLEAN =		$$($(2)_DIR)/.stamp_dircleaned
>  
>  # default extract command
>  $(2)_EXTRACT_CMDS ?= \
> -	$$(if $$($(2)_SOURCE),$$(INFLATE$$(suffix $$($(2)_SOURCE))) $$(DL_DIR)/$$($(2)_SOURCE) | \
> +	$$(if $$($(2)_SOURCE),$$(INFLATE$$(suffix $$($(2)_SOURCE))) $$($(2)_DL_DIR)/$$($(2)_SOURCE) | \
>  	$$(TAR) --strip-components=$$($(2)_STRIP_COMPONENTS) \
>  		-C $$($(2)_DIR) \
>  		$$(foreach x,$$($(2)_EXCLUDES),--exclude='$$(x)' ) \
> -- 
> 2.16.2
> 



Thread overview: 26+ messages
2018-03-31 14:23 [Buildroot] [v3 01/13] core/pkg-download: change all helpers to use common options Maxime Hadjinlian
2018-03-31 14:23 ` [Buildroot] [v3 02/13] download: put most of the infra in dl-wrapper Maxime Hadjinlian
2018-03-31 17:02   ` Maxime Hadjinlian
2018-03-31 14:23 ` [Buildroot] [v3 03/13] packages: use new $($PKG)_DL_DIR) variable Maxime Hadjinlian
2018-03-31 14:23 ` [Buildroot] [v3 04/13] arc/xtensa: store the eXtensa overlay in the per-package DL_DIR Maxime Hadjinlian
2018-03-31 14:23 ` [Buildroot] [v3 05/13] pkg-{download, generic}: use new $($(PKG)_DL_DIR) Maxime Hadjinlian
2018-04-01 18:20   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 06/13] support/download: make sure the download folder is created Maxime Hadjinlian
2018-04-01 18:18   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 07/13] pkg-generic: add a subdirectory to the DL_DIR Maxime Hadjinlian
2018-04-01 18:17   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 08/13] pkg-download: support new subdir for mirrors Maxime Hadjinlian
2018-04-01 14:42   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 09/13] pkg-generic: introduce _SAME_SOURCE_AS Maxime Hadjinlian
2018-04-01 14:26   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 10/13] package: share downloaded files for big packages Maxime Hadjinlian
2018-04-01 14:18   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 11/13] help/manual: update help about the new $(LIBFOO_DL_DIR) Maxime Hadjinlian
2018-04-01 14:15   ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 12/13] download: add flock call before dl-wrapper Maxime Hadjinlian
2018-04-01 14:09   ` Yann E. MORIN
2018-04-01 17:53     ` Yann E. MORIN
2018-03-31 14:24 ` [Buildroot] [v3 13/13] download: git: introduce cache feature Maxime Hadjinlian
2018-04-01 12:57   ` Yann E. MORIN
2018-04-01 14:58     ` Arnout Vandecappelle
2018-04-01 18:13       ` Yann E. MORIN
