* [RFC][PATCH 0/6] NPM refactoring
@ 2019-10-22  9:03 Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 1/6] npm.bbclass: refactor the npm class Jean-Marie LEMETAYER
                   ` (7 more replies)
  0 siblings, 8 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

The current NPM support has several issues:
 - The NPM fetcher downloads the dependency tree, but the other fetchers do
   not. The 'subdir' parameter was used to work around this issue.
 - There are multiple issues with package names (uppercase letters, exotic
   characters, scoped packages), even when they only appear in the
   dependencies.
 - The lockdown file generation has issues when a package depends on
   multiple versions of the same package (all versions get the same
   checksum).

This patchset refactors the NPM support in Yocto:
 - As the npm dependency management algorithm is hard to handle, the new
   NPM fetcher downloads only the package source and not the dependencies,
   just like the other fetchers (patch submitted on the bitbake-devel list).
 - The NPM class handles the dependencies using npm itself (not manually).
 - The NPM recipe creation is simplified to avoid issues.
 - The lockdown file is no longer used, as it is redundant with the latest
   shrinkwrap file format.

This patchset may remove some features (lockdown file, license management for
dependencies) but fixes the majority of the NPM issues. All of the following
issues from bugzilla.yoctoproject.org are resolved by this patchset:
#10237, #10760, #11028, #11728, #11902, #12534

The fetcher and recipetool are now aware of a 'latest' keyword for the version,
which allows building the latest version available on the registry. This
feature fixes two issues: #10515, #11029

Moreover, issue #13415 should also be fixed, but I cannot test it.

I have tested the recipe creation and builds using a self-made example
designed to trigger build failures:
 - https://github.com/savoirfairelinux/node-server-example
 - https://npmjs.com/package/@savoirfairelinux/node-server-example

The test steps are as follows:
  $ source poky/oe-init-build-env
  $ bitbake-layers add-layer ../meta-openembedded/meta-oe
  $ devtool add "npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=latest"
  $ devtool build savoirfairelinux-node-server-example
  $ bitbake-layers create-layer ../meta-test
  $ bitbake-layers add-layer ../meta-test
  $ devtool finish savoirfairelinux-node-server-example ../meta-test
  $ echo IMAGE_INSTALL_append = '" savoirfairelinux-node-server-example"' >> conf/local.conf
  $ bitbake core-image-minimal

Also, the 'devtool add' URL can be any of the following:
 - npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=latest
   This URL uses the new npm fetcher and requests the latest version available
   on the registry.
 - npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=1.0.0
   This URL uses the new npm fetcher and requests a fixed version.
 - git://github.com/savoirfairelinux/node-server-example.git;protocol=https
   This URL uses the git fetcher. Since the dependencies are managed by the
   NPM class, any fetcher can be used.
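
For reference, such an npm URL decomposes into registry, package name and
version roughly like this (a plain-shell sketch, not the actual fetcher code;
the recipe creation handler similarly rewrites 'npm://' to 'http://' to obtain
the registry):

```shell
# Decompose an npm:// fetch URL into its parts (illustrative sketch only).
url="npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=latest"
registry="http://${url#npm://}"; registry="${registry%%;*}"
params="${url#*;}"
name="${params#name=}"; name="${name%%;*}"
version="${params##*version=}"
echo "$registry $name $version"
```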

Once this patchset is merged, I plan to update the NPM wiki:
  https://wiki.yoctoproject.org/wiki/TipsAndTricks/NPM

This patchset is also the groundwork for the full Angular support in Yocto
that I am preparing. Angular applications have a huge dependency tree, which
makes them ideal for testing the NPM support.

Jean-Marie LEMETAYER (6):
  npm.bbclass: refactor the npm class
  devtool: update command line options for npm
  recipetool/create_npm.py: refactor the npm recipe creation handler
  devtool/standard.py: update the append file for the npm recipes
  recipetool/create.py: replace 'latest' keyword for npm
  recipetool/create.py: remove the 'noverify' url parameter

 meta/classes/npm.bbclass             | 210 +++++++++----
 scripts/lib/devtool/standard.py      |  22 +-
 scripts/lib/recipetool/create.py     |  16 +-
 scripts/lib/recipetool/create_npm.py | 444 ++++++++++-----------------
 4 files changed, 331 insertions(+), 361 deletions(-)

--
2.20.1



^ permalink raw reply	[flat|nested] 36+ messages in thread

* [RFC][PATCH 1/6] npm.bbclass: refactor the npm class
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
@ 2019-10-22  9:03 ` Jean-Marie LEMETAYER
  2019-10-22 11:35   ` Alexander Kanavin
  2019-10-24 11:22   ` Stefan Herbrechtsmeier
  2019-10-22  9:03 ` [RFC][PATCH 2/6] devtool: update command line options for npm Jean-Marie LEMETAYER
                   ` (6 subsequent siblings)
  7 siblings, 2 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

Many issues were related to badly handled npm dependencies: package
names, installation directories, etc. In fact npm uses an install
algorithm [1] which is hard to reproduce / anticipate. Moreover some
npm packages use scopes [2], which adds more complexity.

The simplest solution is to let npm do its job. Assuming the fetcher
only gets the sources of the package, the class now runs
'npm install' to create a build directory. The build directory is then
selectively copied to the destination.
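
Under the hood, the class assembles the install command roughly as follows
(a sketch that only prints the final command line; the directory, arch and
tarball name are placeholder values, while the real class uses ${WORKDIR},
${B}, ${NPM_ARCH} and the output of 'npm pack'):

```shell
# Assemble the 'npm install' arguments the way npm_do_compile does.
# Placeholder values; nothing is actually installed here.
B=/tmp/work/build
NPM_ARCH=arm
NPM_INSTALL_DEV=0

NPM_INSTALL_ARGS="--loglevel silly --prefix=$B --global"
if [ "$NPM_INSTALL_DEV" != 1 ]; then
    NPM_INSTALL_ARGS="$NPM_INSTALL_ARGS --production"
fi
NPM_INSTALL_GYP_ARGS="--arch=$NPM_ARCH --target_arch=$NPM_ARCH --release"

echo "npm install $NPM_INSTALL_GYP_ARGS $NPM_INSTALL_ARGS package-1.0.0.tgz"
```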

1: https://docs.npmjs.com/cli/install#algorithm
2: https://docs.npmjs.com/about-scopes

Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
---
 meta/classes/npm.bbclass | 210 ++++++++++++++++++++++++++-------------
 1 file changed, 143 insertions(+), 67 deletions(-)

diff --git a/meta/classes/npm.bbclass b/meta/classes/npm.bbclass
index 4b1f0a39f0..fc671e7223 100644
--- a/meta/classes/npm.bbclass
+++ b/meta/classes/npm.bbclass
@@ -1,19 +1,44 @@
+# Copyright (C) 2019 Savoir-Faire Linux
+#
+# This bbclass builds and installs an npm package to the target. The package
+# source files should be fetched in the calling recipe using the SRC_URI
+# variable. The ${S} variable should be updated depending on your fetcher.
+#
+# Usage:
+#  SRC_URI = "..."
+#  inherit npm
+#
+# Optional variables:
+#  NPM_SHRINKWRAP:
+#       Provide a shrinkwrap file [1]. If available, a shrinkwrap file in the
+#       sources takes priority over the one provided. A shrinkwrap file is
+#       mandatory in order to ensure build reproducibility.
+#       1: https://docs.npmjs.com/files/shrinkwrap.json
+#
+#  NPM_INSTALL_DEV:
+#       Set to 1 to also install devDependencies.
+#
+#  NPM_REGISTRY:
+#       Use the specified registry.
+#
+#  NPM_ARCH:
+#       Override the auto generated npm architecture.
+#
+#  NPM_INSTALL_EXTRA_ARGS:
+#       Add extra arguments to the 'npm install' execution.
+#       Use it at your own risk.
+
 DEPENDS_prepend = "nodejs-native "
 RDEPENDS_${PN}_prepend = "nodejs "
-S = "${WORKDIR}/npmpkg"
 
-def node_pkgname(d):
-    bpn = d.getVar('BPN')
-    if bpn.startswith("node-"):
-        return bpn[5:]
-    return bpn
+NPM_SHRINKWRAP ?= "${THISDIR}/${BPN}/npm-shrinkwrap.json"
 
-NPMPN ?= "${@node_pkgname(d)}"
+NPM_INSTALL_DEV ?= "0"
 
-NPM_INSTALLDIR = "${libdir}/node_modules/${NPMPN}"
+NPM_REGISTRY ?= "https://registry.npmjs.org"
 
 # function maps arch names to npm arch names
-def npm_oe_arch_map(target_arch, d):
+def npm_oe_arch_map(target_arch):
     import re
     if   re.match('p(pc|owerpc)(|64)', target_arch): return 'ppc'
     elif re.match('i.86$', target_arch): return 'ia32'
@@ -21,74 +46,125 @@ def npm_oe_arch_map(target_arch, d):
     elif re.match('arm64$', target_arch): return 'arm'
     return target_arch
 
-NPM_ARCH ?= "${@npm_oe_arch_map(d.getVar('TARGET_ARCH'), d)}"
-NPM_INSTALL_DEV ?= "0"
+NPM_ARCH ?= "${@npm_oe_arch_map(d.getVar('TARGET_ARCH'))}"
+
+NPM_INSTALL_EXTRA_ARGS ?= ""
+
+B = "${WORKDIR}/build"
+
+npm_install_shrinkwrap() {
+    # This function ensures that there is a shrinkwrap file in the specified
+    # directory. A shrinkwrap file is mandatory to have reproducible builds.
+    # If the shrinkwrap file is not already included in the sources,
+    # the recipe can provide one by using the NPM_SHRINKWRAP option.
+    # This function returns the filename of the installed file (if any).
+    if [ -f ${1}/npm-shrinkwrap.json ]
+    then
+        bbnote "Using the npm-shrinkwrap.json provided in the sources"
+    elif [ -f ${NPM_SHRINKWRAP} ]
+    then
+        install -m 644 ${NPM_SHRINKWRAP} ${1}
+        echo ${1}/npm-shrinkwrap.json
+    else
+        bbfatal "No mandatory NPM_SHRINKWRAP file found"
+    fi
+}
 
 npm_do_compile() {
-	# Copy in any additionally fetched modules
-	if [ -d ${WORKDIR}/node_modules ] ; then
-		cp -a ${WORKDIR}/node_modules ${S}/
-	fi
-	# changing the home directory to the working directory, the .npmrc will
-	# be created in this directory
-	export HOME=${WORKDIR}
-	if [  "${NPM_INSTALL_DEV}" = "1" ]; then
-		npm config set dev true
-	else
-		npm config set dev false
-	fi
-	npm set cache ${WORKDIR}/npm_cache
-	# clear cache before every build
-	npm cache clear --force
-	# Install pkg into ${S} without going to the registry
-	if [  "${NPM_INSTALL_DEV}" = "1" ]; then
-		npm --arch=${NPM_ARCH} --target_arch=${NPM_ARCH} --no-registry install
-	else
-		npm --arch=${NPM_ARCH} --target_arch=${NPM_ARCH} --production --no-registry install
-	fi
+    # This function executes the 'npm install' command which builds and
+    # installs all the dependencies needed by the package. All the files are
+    # installed in the build directory ${B} without any filtering. A
+    # combination of 'npm pack' and 'npm install' is used to ensure that the
+    # files in ${B} are actual copies instead of symbolic links (which is the
+    # default npm behavior).
+
+    # By default, the npm command uses a cache located in '~/.npm'. In
+    # order to force the next npm commands to disable caching, the npm cache
+    # needs to be cleared. To avoid altering the local cache, the npm config
+    # is updated to use another cache directory. HOME is updated as well to
+    # avoid modifying the local '~/.npmrc' file.
+    HOME=${WORKDIR}
+    npm config set cache ${WORKDIR}/npm_cache
+    npm cache clear --force
+
+    # First ensure that there is a shrinkwrap file in the sources.
+    local NPM_SHRINKWRAP_INSTALLED=$(npm_install_shrinkwrap ${S})
+
+    # Then create a tarball from an npm package whose sources must be in ${S}.
+    local NPM_PACK_FILE=$(cd ${WORKDIR} && npm pack ${S}/)
+
+    # Finally install and build the tarball package in ${B}.
+    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --loglevel silly"
+    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --prefix=${B}"
+    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --global"
+
+    if [ "${NPM_INSTALL_DEV}" != 1 ]
+    then
+        local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --production"
+    fi
+
+    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --arch=${NPM_ARCH}"
+    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --target_arch=${NPM_ARCH}"
+    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --release"
+
+    cd ${WORKDIR} && npm install \
+        ${NPM_INSTALL_EXTRA_ARGS} \
+        ${NPM_INSTALL_GYP_ARGS} \
+        ${NPM_INSTALL_ARGS} \
+        ${NPM_PACK_FILE}
+
+    # Clean source tree.
+    rm -f ${NPM_SHRINKWRAP_INSTALLED}
 }
 
 npm_do_install() {
-	# changing the home directory to the working directory, the .npmrc will
-	# be created in this directory
-	export HOME=${WORKDIR}
-	mkdir -p ${D}${libdir}/node_modules
-	local NPM_PACKFILE=$(npm pack .)
-	npm install --prefix ${D}${prefix} -g --arch=${NPM_ARCH} --target_arch=${NPM_ARCH} --production --no-registry ${NPM_PACKFILE}
-	ln -fs node_modules ${D}${libdir}/node
-	find ${D}${NPM_INSTALLDIR} -type f \( -name "*.a" -o -name "*.d" -o -name "*.o" \) -delete
-	if [ -d ${D}${prefix}/etc ] ; then
-		# This will be empty
-		rmdir ${D}${prefix}/etc
-	fi
-}
+    # This function creates the destination directory from the files
+    # pre-installed in the ${B} directory.
+
+    # Copy the entire lib and bin directories from ${B} to ${D}.
+    install -d ${D}/${libdir}
+    cp --no-preserve=ownership --recursive ${B}/lib/. ${D}/${libdir}
+
+    if [ -d "${B}/bin" ]
+    then
+        install -d ${D}/${bindir}
+        cp --no-preserve=ownership --recursive ${B}/bin/. ${D}/${bindir}
+    fi
+
+    # If the package (or its dependencies) uses node-gyp to build native addons,
+    # object files, static libraries or other temporary files can be hidden in
+    # the lib directory. To reduce the package size and to avoid QA issues
+    # (staticdev with static library files) these files must be removed.
+
+    # Remove any node-gyp directory in ${D} to remove temporary build files.
+    for GYP_D_FILE in $(find ${D} -regex ".*/build/Release/[^/]*.node")
+    do
+        local GYP_D_DIR=${GYP_D_FILE%/Release/*}
+
+        rm --recursive --force ${GYP_D_DIR}
+    done
+
+    # Copy only the node-gyp release files from ${B} to ${D}.
+    for GYP_B_FILE in $(find ${B} -regex ".*/build/Release/[^/]*.node")
+    do
+        local GYP_D_FILE=${D}/${prefix}/${GYP_B_FILE#${B}}
+
+        install -d ${GYP_D_FILE%/*}
+        install -m 755 ${GYP_B_FILE} ${GYP_D_FILE}
+    done
+
+    # Remove the shrinkwrap file which does not need to be packed.
+    rm -f ${D}/${libdir}/node_modules/*/npm-shrinkwrap.json
+    rm -f ${D}/${libdir}/node_modules/@*/*/npm-shrinkwrap.json
 
-python populate_packages_prepend () {
-    instdir = d.expand('${D}${NPM_INSTALLDIR}')
-    extrapackages = oe.package.npm_split_package_dirs(instdir)
-    pkgnames = extrapackages.keys()
-    d.prependVar('PACKAGES', '%s ' % ' '.join(pkgnames))
-    for pkgname in pkgnames:
-        pkgrelpath, pdata = extrapackages[pkgname]
-        pkgpath = '${NPM_INSTALLDIR}/' + pkgrelpath
-        # package names can't have underscores but npm packages sometimes use them
-        oe_pkg_name = pkgname.replace('_', '-')
-        expanded_pkgname = d.expand(oe_pkg_name)
-        d.setVar('FILES_%s' % expanded_pkgname, pkgpath)
-        if pdata:
-            version = pdata.get('version', None)
-            if version:
-                d.setVar('PKGV_%s' % expanded_pkgname, version)
-            description = pdata.get('description', None)
-            if description:
-                d.setVar('SUMMARY_%s' % expanded_pkgname, description.replace(u"\u2018", "'").replace(u"\u2019", "'"))
-    d.appendVar('RDEPENDS_%s' % d.getVar('PN'), ' %s' % ' '.join(pkgnames).replace('_', '-'))
+    # node(1) uses /usr/lib/node as the default include directory and npm(1)
+    # uses /usr/lib/node_modules as the install directory. Let's make both happy.
+    ln -fs node_modules ${D}/${libdir}/node
 }
 
 FILES_${PN} += " \
     ${bindir} \
-    ${libdir}/node \
-    ${NPM_INSTALLDIR} \
+    ${libdir} \
 "
 
 EXPORT_FUNCTIONS do_compile do_install
-- 
2.20.1




* [RFC][PATCH 2/6] devtool: update command line options for npm
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 1/6] npm.bbclass: refactor the npm class Jean-Marie LEMETAYER
@ 2019-10-22  9:03 ` Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 3/6] recipetool/create_npm.py: refactor the npm recipe creation handler Jean-Marie LEMETAYER
                   ` (5 subsequent siblings)
  7 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

This commit renames the '--fetch-dev' option to '--npm-dev', which is
easier to understand.

It also adds the '--npm-registry' option to allow creating an npm recipe
with a non-default npm registry (e.g. if the SRC_URI is using git://).
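
For illustration, the option forwarding added in devtool's add() can be
sketched like this (the npm_dev/npm_registry variables are hypothetical
stand-ins for the parsed arguments; this is not the actual devtool code):

```shell
# Build the extra recipetool options the way 'devtool add' does for npm.
npm_dev=1
npm_registry="http://registry.npmjs.org"

extracmdopts=""
if [ -n "$npm_dev" ]; then
    extracmdopts="$extracmdopts --npm-dev"
fi
if [ -n "$npm_registry" ]; then
    extracmdopts="$extracmdopts --npm-registry \"$npm_registry\""
fi
echo "recipetool create$extracmdopts"
```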

Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
---
 scripts/lib/devtool/standard.py  |  9 ++++++---
 scripts/lib/recipetool/create.py | 12 +++++++-----
 2 files changed, 13 insertions(+), 8 deletions(-)

diff --git a/scripts/lib/devtool/standard.py b/scripts/lib/devtool/standard.py
index 1646971a91..fedf8b8262 100644
--- a/scripts/lib/devtool/standard.py
+++ b/scripts/lib/devtool/standard.py
@@ -145,8 +145,10 @@ def add(args, config, basepath, workspace):
         extracmdopts += ' --src-subdir "%s"' % args.src_subdir
     if args.autorev:
         extracmdopts += ' -a'
-    if args.fetch_dev:
-        extracmdopts += ' --fetch-dev'
+    if args.npm_dev:
+        extracmdopts += ' --npm-dev'
+    if args.npm_registry:
+        extracmdopts += ' --npm-registry "%s"' % args.npm_registry
     if args.mirrors:
         extracmdopts += ' --mirrors'
     if args.srcrev:
@@ -2187,7 +2189,8 @@ def register_commands(subparsers, context):
     group.add_argument('--same-dir', '-s', help='Build in same directory as source', action="store_true")
     group.add_argument('--no-same-dir', help='Force build in a separate build directory', action="store_true")
     parser_add.add_argument('--fetch', '-f', help='Fetch the specified URI and extract it to create the source tree (deprecated - pass as positional argument instead)', metavar='URI')
-    parser_add.add_argument('--fetch-dev', help='For npm, also fetch devDependencies', action="store_true")
+    parser_add.add_argument('--npm-dev', help='For npm, also fetch devDependencies', action="store_true")
+    parser_add.add_argument('--npm-registry', help='For npm, use the specified registry', type=str)
     parser_add.add_argument('--version', '-V', help='Version to use within recipe (PV)')
     parser_add.add_argument('--no-git', '-g', help='If fetching source, do not set up source tree as a git repository', action="store_true")
     group = parser_add.add_mutually_exclusive_group()
diff --git a/scripts/lib/recipetool/create.py b/scripts/lib/recipetool/create.py
index 1fb6b55530..932dc3f374 100644
--- a/scripts/lib/recipetool/create.py
+++ b/scripts/lib/recipetool/create.py
@@ -716,10 +716,11 @@ def create_recipe(args):
         lines_after.append('INSANE_SKIP_${PN} += "already-stripped"')
         lines_after.append('')
 
-    if args.fetch_dev:
-        extravalues['fetchdev'] = True
-    else:
-        extravalues['fetchdev'] = None
+    if args.npm_dev:
+        extravalues['NPM_INSTALL_DEV'] = 1
+
+    if args.npm_registry:
+        extravalues['NPM_REGISTRY'] = args.npm_registry
 
     # Find all plugins that want to register handlers
     logger.debug('Loading recipe handlers')
@@ -1315,7 +1316,8 @@ def register_commands(subparsers):
     group.add_argument('-S', '--srcrev', help='Source revision to fetch if fetching from an SCM such as git (default latest)')
     parser_create.add_argument('-B', '--srcbranch', help='Branch in source repository if fetching from an SCM such as git (default master)')
     parser_create.add_argument('--keep-temp', action="store_true", help='Keep temporary directory (for debugging)')
-    parser_create.add_argument('--fetch-dev', action="store_true", help='For npm, also fetch devDependencies')
+    parser_create.add_argument('--npm-dev', action="store_true", help='For npm, also fetch devDependencies')
+    parser_create.add_argument('--npm-registry', help='For npm, use the specified registry', type=str)
     parser_create.add_argument('--devtool', action="store_true", help=argparse.SUPPRESS)
     parser_create.add_argument('--mirrors', action="store_true", help='Enable PREMIRRORS and MIRRORS for source tree fetching (disabled by default).')
     parser_create.set_defaults(func=create_recipe)
-- 
2.20.1




* [RFC][PATCH 3/6] recipetool/create_npm.py: refactor the npm recipe creation handler
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 1/6] npm.bbclass: refactor the npm class Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 2/6] devtool: update command line options for npm Jean-Marie LEMETAYER
@ 2019-10-22  9:03 ` Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 4/6] devtool/standard.py: update the append file for the npm recipes Jean-Marie LEMETAYER
                   ` (4 subsequent siblings)
  7 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

This commit refactors the npm recipe creation handler to use the new npm
behavior. The process is kept as simple as possible and only generates
the shrinkwrap file.

To avoid naming issues, the recipe name is now derived from the npm
package name instead of being mapped directly.
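
The derivation can be sketched in shell as below (lowercase, scope separator
to hyphen, drop anything outside [a-z-]; the real handler additionally strips
leading/trailing hyphens, and the package name is the example from the cover
letter):

```shell
# Derive an OE recipe name from an npm package name (sketch of the
# mapping implemented by the handler in this patch).
npm_name="@savoirfairelinux/node-server-example"
recipe_name=$(printf '%s' "$npm_name" | tr 'A-Z' 'a-z' | tr '/' '-' | tr -cd 'a-z-')
echo "$recipe_name"
```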

Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
---
 scripts/lib/recipetool/create_npm.py | 444 ++++++++++-----------------
 1 file changed, 168 insertions(+), 276 deletions(-)

diff --git a/scripts/lib/recipetool/create_npm.py b/scripts/lib/recipetool/create_npm.py
index 39429ebad3..530d56e260 100644
--- a/scripts/lib/recipetool/create_npm.py
+++ b/scripts/lib/recipetool/create_npm.py
@@ -1,321 +1,213 @@
-# Recipe creation tool - node.js NPM module support plugin
-#
 # Copyright (C) 2016 Intel Corporation
+# Copyright (C) 2019 Savoir-Faire Linux
 #
 # SPDX-License-Identifier: GPL-2.0-only
 #
+"""
+    Recipe creation tool - npm module support plugin
+"""
 
+import json
+import logging
 import os
+import re
+import shutil
 import sys
-import logging
-import subprocess
 import tempfile
-import shutil
-import json
-from recipetool.create import RecipeHandler, split_pkg_licenses, handle_license_vars
+import bb
+from bb.fetch2 import runfetchcmd
+from recipetool.create import RecipeHandler
 
 logger = logging.getLogger('recipetool')
 
-
 tinfoil = None
 
 def tinfoil_init(instance):
+    """
+        Initialize tinfoil.
+    """
+
     global tinfoil
     tinfoil = instance
 
-
 class NpmRecipeHandler(RecipeHandler):
-    lockdownpath = None
+    """
+        Class to handle the npm recipe creation
+    """
+
+    @staticmethod
+    def _ensure_npm(d):
+        """
+            Check that the 'npm' command is available; build nodejs-native
+            and add it to the PATH if needed.
+        """
 
-    def _ensure_npm(self, fixed_setup=False):
         if not tinfoil.recipes_parsed:
             tinfoil.parse_recipes()
         try:
             rd = tinfoil.parse_recipe('nodejs-native')
         except bb.providers.NoProvider:
-            if fixed_setup:
-                msg = 'nodejs-native is required for npm but is not available within this SDK'
-            else:
-                msg = 'nodejs-native is required for npm but is not available - you will likely need to add a layer that provides nodejs'
-            logger.error(msg)
-            return None
+            logger.error("Nothing provides 'nodejs-native' which is required for the build")
+            logger.info("You will likely need to add a layer that provides nodejs")
+            sys.exit(14)
+
         bindir = rd.getVar('STAGING_BINDIR_NATIVE')
         npmpath = os.path.join(bindir, 'npm')
         if not os.path.exists(npmpath):
             tinfoil.build_targets('nodejs-native', 'addto_recipe_sysroot')
             if not os.path.exists(npmpath):
-                logger.error('npm required to process specified source, but nodejs-native did not seem to populate it')
-                return None
-        return bindir
-
-    def _handle_license(self, data):
-        '''
-        Handle the license value from an npm package.json file
-        '''
-        license = None
-        if 'license' in data:
-            license = data['license']
-            if isinstance(license, dict):
-                license = license.get('type', None)
-            if license:
-                if 'OR' in license:
-                    license = license.replace('OR', '|')
-                    license = license.replace('AND', '&')
-                    license = license.replace(' ', '_')
-                    if not license[0] == '(':
-                        license = '(' + license + ')'
-                else:
-                    license = license.replace('AND', '&')
-                    if license[0] == '(':
-                        license = license[1:]
-                    if license[-1] == ')':
-                        license = license[:-1]
-                license = license.replace('MIT/X11', 'MIT')
-                license = license.replace('Public Domain', 'PD')
-                license = license.replace('SEE LICENSE IN EULA',
-                                          'SEE-LICENSE-IN-EULA')
-        return license
-
-    def _shrinkwrap(self, srctree, localfilesdir, extravalues, lines_before, d):
-        try:
-            runenv = dict(os.environ, PATH=d.getVar('PATH'))
-            bb.process.run('npm shrinkwrap', cwd=srctree, stderr=subprocess.STDOUT, env=runenv, shell=True)
-        except bb.process.ExecutionError as e:
-            logger.warning('npm shrinkwrap failed:\n%s' % e.stdout)
-            return
+                logger.error("Failed to add 'npm' to sysroot")
+                sys.exit(14)
 
-        tmpfile = os.path.join(localfilesdir, 'npm-shrinkwrap.json')
-        shutil.move(os.path.join(srctree, 'npm-shrinkwrap.json'), tmpfile)
-        extravalues.setdefault('extrafiles', {})
-        extravalues['extrafiles']['npm-shrinkwrap.json'] = tmpfile
-        lines_before.append('NPM_SHRINKWRAP := "${THISDIR}/${PN}/npm-shrinkwrap.json"')
-
-    def _lockdown(self, srctree, localfilesdir, extravalues, lines_before, d):
-        runenv = dict(os.environ, PATH=d.getVar('PATH'))
-        if not NpmRecipeHandler.lockdownpath:
-            NpmRecipeHandler.lockdownpath = tempfile.mkdtemp('recipetool-npm-lockdown')
-            bb.process.run('npm install lockdown --prefix %s' % NpmRecipeHandler.lockdownpath,
-                           cwd=srctree, stderr=subprocess.STDOUT, env=runenv, shell=True)
-        relockbin = os.path.join(NpmRecipeHandler.lockdownpath, 'node_modules', 'lockdown', 'relock.js')
-        if not os.path.exists(relockbin):
-            logger.warning('Could not find relock.js within lockdown directory; skipping lockdown')
-            return
-        try:
-            bb.process.run('node %s' % relockbin, cwd=srctree, stderr=subprocess.STDOUT, env=runenv, shell=True)
-        except bb.process.ExecutionError as e:
-            logger.warning('lockdown-relock failed:\n%s' % e.stdout)
-            return
+        d.prependVar("PATH", "{}:".format(bindir))
+
+    @staticmethod
+    def _run_npm_install(d, srctree, development, registry):
+        """
+            Run the 'npm install' command without building the sources (if any).
+            This is only needed to generate the npm-shrinkwrap.json file.
+        """
+
+        cmd = "npm install"
+        cmd += " --ignore-scripts"
+
+        if development is None:
+            cmd += " --production"
+
+        if registry is not None:
+            cmd += " --registry {}".format(registry)
+
+        runfetchcmd(cmd, d, workdir=srctree)
+
+    @staticmethod
+    def _run_npm_shrinkwrap(d, srctree, development):
+        """
+            Run the 'npm shrinkwrap' command.
+        """
+
+        cmd = "npm shrinkwrap"
+
+        if development is not None:
+            cmd += " --development"
+
+        runfetchcmd(cmd, d, workdir=srctree)
+
+    def _generate_shrinkwrap(self, srctree, lines_before, lines_after, extravalues):
+        """
+            Check and generate the npm-shrinkwrap.json file if needed.
+        """
+
+        # Are we using the '--npm-dev' option?
+        development = extravalues.get("NPM_INSTALL_DEV")
+
+        # Are we using the '--npm-registry' option?
+        registry_option = extravalues.get("NPM_REGISTRY")
 
-        tmpfile = os.path.join(localfilesdir, 'lockdown.json')
-        shutil.move(os.path.join(srctree, 'lockdown.json'), tmpfile)
-        extravalues.setdefault('extrafiles', {})
-        extravalues['extrafiles']['lockdown.json'] = tmpfile
-        lines_before.append('NPM_LOCKDOWN := "${THISDIR}/${PN}/lockdown.json"')
+        # Get the registry from the fetch url if using 'npm://registry.url'
+        registry_fetch = None
 
-    def _handle_dependencies(self, d, deps, optdeps, devdeps, lines_before, srctree):
-        import scriptutils
-        # If this isn't a single module we need to get the dependencies
-        # and add them to SRC_URI
         def varfunc(varname, origvalue, op, newlines):
-            if varname == 'SRC_URI':
-                if not origvalue.startswith('npm://'):
-                    src_uri = origvalue.split()
-                    deplist = {}
-                    for dep, depver in optdeps.items():
-                        depdata = self.get_npm_data(dep, depver, d)
-                        if self.check_npm_optional_dependency(depdata):
-                            deplist[dep] = depdata
-                    for dep, depver in devdeps.items():
-                        depdata = self.get_npm_data(dep, depver, d)
-                        if self.check_npm_optional_dependency(depdata):
-                            deplist[dep] = depdata
-                    for dep, depver in deps.items():
-                        depdata = self.get_npm_data(dep, depver, d)
-                        deplist[dep] = depdata
-
-                    extra_urls = []
-                    for dep, depdata in deplist.items():
-                        version = depdata.get('version', None)
-                        if version:
-                            url = 'npm://registry.npmjs.org;name=%s;version=%s;subdir=node_modules/%s' % (dep, version, dep)
-                            extra_urls.append(url)
-                    if extra_urls:
-                        scriptutils.fetch_url(tinfoil, ' '.join(extra_urls), None, srctree, logger)
-                        src_uri.extend(extra_urls)
-                        return src_uri, None, -1, True
+            if varname == "SRC_URI":
+                if origvalue.startswith("npm://"):
+                    nonlocal registry_fetch
+                    registry_fetch = origvalue.replace("npm://", "http://", 1).split(";")[0]
             return origvalue, None, 0, True
-        updated, newlines = bb.utils.edit_metadata(lines_before, ['SRC_URI'], varfunc)
-        if updated:
-            del lines_before[:]
-            for line in newlines:
-                # Hack to avoid newlines that edit_metadata inserts
-                if line.endswith('\n'):
-                    line = line[:-1]
-                lines_before.append(line)
-        return updated
+
+        bb.utils.edit_metadata(lines_before, ["SRC_URI"], varfunc)
+
+        # Compute the proper registry value
+        if registry_fetch is not None:
+            registry = registry_fetch
+
+            if registry_option is not None:
+                logger.warning("The npm registry is specified multiple times")
+                logger.info("Using registry from the fetch url: '{}'".format(registry))
+                extravalues.pop("NPM_REGISTRY", None)
+
+        elif registry_option is not None:
+            registry = registry_option
+
+        else:
+            registry = None
+
+        # Initialize the npm environment
+        d = bb.data.createCopy(tinfoil.config_data)
+        self._ensure_npm(d)
+
+        # Check if a shrinkwrap file is already in the source
+        if os.path.exists(os.path.join(srctree, "npm-shrinkwrap.json")):
+            logger.info("Using the npm-shrinkwrap.json provided in the sources")
+            return
+
+        # Generate the 'npm-shrinkwrap.json' file
+        self._run_npm_install(d, srctree, development, registry)
+        self._run_npm_shrinkwrap(d, srctree, development)
+
+        # Save the shrinkwrap file in a temporary location
+        tmpdir = tempfile.mkdtemp(prefix="recipetool-npm")
+        tmpfile = os.path.join(tmpdir, "npm-shrinkwrap.json")
+        shutil.move(os.path.join(srctree, "npm-shrinkwrap.json"), tmpfile)
+
+        # Add the shrinkwrap file as 'extrafiles'
+        extravalues.setdefault("extrafiles", {})
+        extravalues["extrafiles"]["npm-shrinkwrap.json"] = tmpfile
+
+        # Add a line in the recipe to handle the shrinkwrap file
+        lines_after.append("NPM_SHRINKWRAP = \"${THISDIR}/${BPN}/npm-shrinkwrap.json\"")
+
+        # Remove the 'node_modules' directory generated by 'npm install'
+        bb.utils.remove(os.path.join(srctree, "node_modules"), recurse=True)
+
+    @staticmethod
+    def _recipe_name_from_npm(name):
+        """
+            Generate a recipe name based on the npm package name.
+        """
+
+        name = name.lower()
+        name = re.sub(r"/", "-", name)
+        name = re.sub(r"[^a-z\-]", "", name)
+        name = name.strip("-")
+        return name
 
     def process(self, srctree, classes, lines_before, lines_after, handled, extravalues):
-        import bb.utils
-        import oe.package
-        from collections import OrderedDict
+        """
+            Handle the npm recipe creation
+        """
 
         if 'buildsystem' in handled:
             return False
 
-        def read_package_json(fn):
-            with open(fn, 'r', errors='surrogateescape') as f:
-                return json.loads(f.read())
+        files = RecipeHandler.checkfiles(srctree, ["package.json"])
 
-        files = RecipeHandler.checkfiles(srctree, ['package.json'])
-        if files:
-            d = bb.data.createCopy(tinfoil.config_data)
-            npm_bindir = self._ensure_npm()
-            if not npm_bindir:
-                sys.exit(14)
-            d.prependVar('PATH', '%s:' % npm_bindir)
-
-            data = read_package_json(files[0])
-            if 'name' in data and 'version' in data:
-                extravalues['PN'] = data['name']
-                extravalues['PV'] = data['version']
-                classes.append('npm')
-                handled.append('buildsystem')
-                if 'description' in data:
-                    extravalues['SUMMARY'] = data['description']
-                if 'homepage' in data:
-                    extravalues['HOMEPAGE'] = data['homepage']
-
-                fetchdev = extravalues['fetchdev'] or None
-                deps, optdeps, devdeps = self.get_npm_package_dependencies(data, fetchdev)
-                self._handle_dependencies(d, deps, optdeps, devdeps, lines_before, srctree)
-
-                # Shrinkwrap
-                localfilesdir = tempfile.mkdtemp(prefix='recipetool-npm')
-                self._shrinkwrap(srctree, localfilesdir, extravalues, lines_before, d)
-
-                # Lockdown
-                self._lockdown(srctree, localfilesdir, extravalues, lines_before, d)
-
-                # Split each npm module out to is own package
-                npmpackages = oe.package.npm_split_package_dirs(srctree)
-                licvalues = None
-                for item in handled:
-                    if isinstance(item, tuple):
-                        if item[0] == 'license':
-                            licvalues = item[1]
-                            break
-                if not licvalues:
-                    licvalues = handle_license_vars(srctree, lines_before, handled, extravalues, d)
-                if licvalues:
-                    # Augment the license list with information we have in the packages
-                    licenses = {}
-                    license = self._handle_license(data)
-                    if license:
-                        licenses['${PN}'] = license
-                    for pkgname, pkgitem in npmpackages.items():
-                        _, pdata = pkgitem
-                        license = self._handle_license(pdata)
-                        if license:
-                            licenses[pkgname] = license
-                    # Now write out the package-specific license values
-                    # We need to strip out the json data dicts for this since split_pkg_licenses
-                    # isn't expecting it
-                    packages = OrderedDict((x,y[0]) for x,y in npmpackages.items())
-                    packages['${PN}'] = ''
-                    pkglicenses = split_pkg_licenses(licvalues, packages, lines_after, licenses)
-                    all_licenses = list(set([item.replace('_', ' ') for pkglicense in pkglicenses.values() for item in pkglicense]))
-                    if '&' in all_licenses:
-                        all_licenses.remove('&')
-                    extravalues['LICENSE'] = ' & '.join(all_licenses)
-
-                # Need to move S setting after inherit npm
-                for i, line in enumerate(lines_before):
-                    if line.startswith('S ='):
-                        lines_before.pop(i)
-                        lines_after.insert(0, '# Must be set after inherit npm since that itself sets S')
-                        lines_after.insert(1, line)
-                        break
-
-                return True
-
-        return False
-
-    # FIXME this is duplicated from lib/bb/fetch2/npm.py
-    def _parse_view(self, output):
-        '''
-        Parse the output of npm view --json; the last JSON result
-        is assumed to be the one that we're interested in.
-        '''
-        pdata = None
-        outdeps = {}
-        datalines = []
-        bracelevel = 0
-        for line in output.splitlines():
-            if bracelevel:
-                datalines.append(line)
-            elif '{' in line:
-                datalines = []
-                datalines.append(line)
-            bracelevel = bracelevel + line.count('{') - line.count('}')
-        if datalines:
-            pdata = json.loads('\n'.join(datalines))
-        return pdata
-
-    # FIXME this is effectively duplicated from lib/bb/fetch2/npm.py
-    # (split out from _getdependencies())
-    def get_npm_data(self, pkg, version, d):
-        import bb.fetch2
-        pkgfullname = pkg
-        if version != '*' and not '/' in version:
-            pkgfullname += "@'%s'" % version
-        logger.debug(2, "Calling getdeps on %s" % pkg)
-        runenv = dict(os.environ, PATH=d.getVar('PATH'))
-        fetchcmd = "npm view %s --json" % pkgfullname
-        output, _ = bb.process.run(fetchcmd, stderr=subprocess.STDOUT, env=runenv, shell=True)
-        data = self._parse_view(output)
-        return data
-
-    # FIXME this is effectively duplicated from lib/bb/fetch2/npm.py
-    # (split out from _getdependencies())
-    def get_npm_package_dependencies(self, pdata, fetchdev):
-        dependencies = pdata.get('dependencies', {})
-        optionalDependencies = pdata.get('optionalDependencies', {})
-        dependencies.update(optionalDependencies)
-        if fetchdev:
-            devDependencies = pdata.get('devDependencies', {})
-            dependencies.update(devDependencies)
-        else:
-            devDependencies = {}
-        depsfound = {}
-        optdepsfound = {}
-        devdepsfound = {}
-        for dep in dependencies:
-            if dep in optionalDependencies:
-                optdepsfound[dep] = dependencies[dep]
-            elif dep in devDependencies:
-                devdepsfound[dep] = dependencies[dep]
-            else:
-                depsfound[dep] = dependencies[dep]
-        return depsfound, optdepsfound, devdepsfound
-
-    # FIXME this is effectively duplicated from lib/bb/fetch2/npm.py
-    # (split out from _getdependencies())
-    def check_npm_optional_dependency(self, pdata):
-        pkg_os = pdata.get('os', None)
-        if pkg_os:
-            if not isinstance(pkg_os, list):
-                pkg_os = [pkg_os]
-            blacklist = False
-            for item in pkg_os:
-                if item.startswith('!'):
-                    blacklist = True
-                    break
-            if (not blacklist and 'linux' not in pkg_os) or '!linux' in pkg_os:
-                pkg = pdata.get('name', 'Unnamed package')
-                logger.debug(2, "Skipping %s since it's incompatible with Linux" % pkg)
-                return False
-        return True
+        if not files:
+            return False
+
+        with open(files[0], "r", errors="surrogateescape") as f:
+            data = json.load(f)
+
+        if "name" not in data or "version" not in data:
+            return False
 
+        self._generate_shrinkwrap(srctree, lines_before, lines_after, extravalues)
+
+        extravalues["PN"] = self._recipe_name_from_npm(data["name"])
+        extravalues["PV"] = data["version"]
+
+        if "description" in data:
+            extravalues["SUMMARY"] = data["description"]
+
+        if "homepage" in data:
+            extravalues["HOMEPAGE"] = data["homepage"]
+
+        classes.append("npm")
+        handled.append("buildsystem")
+
+        return True
 
 def register_recipe_handlers(handlers):
+    """
+        Register the npm handler
+    """
+
     handlers.append((NpmRecipeHandler(), 60))
-- 
2.20.1


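The `_recipe_name_from_npm` helper in patch 3 sanitizes an npm package name into a recipe name. A minimal standalone sketch of its behavior (the function body mirrors the patch; the example package names are hypothetical):

```python
import re

def recipe_name_from_npm(name):
    # Lowercase, turn the scope separator "/" into "-", drop anything
    # outside [a-z-], then trim leading/trailing dashes (mirrors the patch).
    name = name.lower()
    name = re.sub(r"/", "-", name)
    name = re.sub(r"[^a-z\-]", "", name)
    return name.strip("-")

print(recipe_name_from_npm("@angular/core"))  # -> angular-core
print(recipe_name_from_npm("node-fetch"))     # -> node-fetch
```

Note that digits are stripped as well ("Base64" becomes "base"), since the character class only allows lowercase letters and dashes.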

^ permalink raw reply related	[flat|nested] 36+ messages in thread

* [RFC][PATCH 4/6] devtool/standard.py: update the append file for the npm recipes
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
                   ` (2 preceding siblings ...)
  2019-10-22  9:03 ` [RFC][PATCH 3/6] recipetool/create_npm.py: refactor the npm recipe creation handler Jean-Marie LEMETAYER
@ 2019-10-22  9:03 ` Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 5/6] recipetool/create.py: replace 'latest' keyword for npm Jean-Marie LEMETAYER
                   ` (3 subsequent siblings)
  7 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

When creating a recipe using devtool, a workspace is created to store
the new recipe, the recipe sources and some append files. These append
files are used by devtool to build the recipe using externalsrc (to use
the sources which are in the workspace). They can also perform some
additional actions according to the class of the recipe.

This commit updates the append file for the npm recipes. The
devtool / externalsrc files are now removed from the npm build directory
instead of the install directory.

Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
---
 scripts/lib/devtool/standard.py | 13 +++++--------
 1 file changed, 5 insertions(+), 8 deletions(-)

diff --git a/scripts/lib/devtool/standard.py b/scripts/lib/devtool/standard.py
index fedf8b8262..28c079c5f6 100644
--- a/scripts/lib/devtool/standard.py
+++ b/scripts/lib/devtool/standard.py
@@ -262,14 +262,11 @@ def add(args, config, basepath, workspace):
                 f.write('}\n')
 
             if bb.data.inherits_class('npm', rd):
-                f.write('do_install_append() {\n')
-                f.write('    # Remove files added to source dir by devtool/externalsrc\n')
-                f.write('    rm -f ${NPM_INSTALLDIR}/singletask.lock\n')
-                f.write('    rm -rf ${NPM_INSTALLDIR}/.git\n')
-                f.write('    rm -rf ${NPM_INSTALLDIR}/oe-local-files\n')
-                f.write('    for symlink in ${EXTERNALSRC_SYMLINKS} ; do\n')
-                f.write('        rm -f ${NPM_INSTALLDIR}/${symlink%%:*}\n')
-                f.write('    done\n')
+                f.write('do_compile_append() {\n')
+                f.write('    rm -rf ${B}/lib/node_modules/*/.git\n')
+                f.write('    rm -rf ${B}/lib/node_modules/@*/*/.git\n')
+                f.write('    rm -f ${B}/lib/node_modules/*/singletask.lock\n')
+                f.write('    rm -f ${B}/lib/node_modules/@*/*/singletask.lock\n')
                 f.write('}\n')
 
         # Check if the new layer provides recipes whose priorities have been
-- 
2.20.1

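The new do_compile_append uses two glob patterns because scoped packages (`@scope/name`) live one directory level deeper than unscoped ones, so a single `*/.git` pattern would miss them. A small sketch of this, using a fabricated `node_modules` layout:

```python
import glob
import os
import tempfile

# Hypothetical ${B}/lib/node_modules layout after 'npm install': one
# unscoped package and one scoped package, each with a leftover .git dir.
b = tempfile.mkdtemp()
for pkg in ("lodash", "@angular/core"):
    os.makedirs(os.path.join(b, "lib/node_modules", pkg, ".git"))

# Unscoped pattern: matches lodash/.git but NOT @angular/core/.git,
# because '*' does not cross the extra '@scope/' directory level.
unscoped = glob.glob(os.path.join(b, "lib/node_modules/*/.git"))
# Scoped pattern: one level deeper, under the @scope directory.
scoped = glob.glob(os.path.join(b, "lib/node_modules/@*/*/.git"))

print(len(unscoped), len(scoped))  # -> 1 1
```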



* [RFC][PATCH 5/6] recipetool/create.py: replace 'latest' keyword for npm
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
                   ` (3 preceding siblings ...)
  2019-10-22  9:03 ` [RFC][PATCH 4/6] devtool/standard.py: update the append file for the npm recipes Jean-Marie LEMETAYER
@ 2019-10-22  9:03 ` Jean-Marie LEMETAYER
  2019-10-22  9:03 ` [RFC][PATCH 6/6] recipetool/create.py: remove the 'noverify' url parameter Jean-Marie LEMETAYER
                   ` (2 subsequent siblings)
  7 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

The new npm fetcher allows the 'latest' keyword to be used to download
the latest version available on the registry. But the keyword must be
replaced as soon as the version is determined, so that the generated
recipe is stable.

Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
---
 scripts/lib/recipetool/create.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/scripts/lib/recipetool/create.py b/scripts/lib/recipetool/create.py
index 932dc3f374..e49e73ed9b 100644
--- a/scripts/lib/recipetool/create.py
+++ b/scripts/lib/recipetool/create.py
@@ -833,6 +833,8 @@ def create_recipe(args):
         elif line.startswith('SRC_URI = '):
             if realpv and not pv_srcpv:
                 line = line.replace(realpv, '${PV}')
+            if scheme == 'npm':
+                line = line.replace('version=latest', 'version=${PV}')
         elif line.startswith('PV = '):
             if realpv:
                 # Replace the first part of the PV value
-- 
2.20.1

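The substitution above can be sketched as a plain string replacement applied while post-processing the generated recipe lines (the sample SRC_URI value is hypothetical):

```python
# Once the real version is known, pin 'latest' to ${PV} so the generated
# recipe does not change the next time it is regenerated.
scheme = "npm"
line = 'SRC_URI = "npm://registry.npmjs.org;package=mypkg;version=latest"'
if line.startswith('SRC_URI = ') and scheme == 'npm':
    line = line.replace('version=latest', 'version=${PV}')
print(line)  # -> SRC_URI = "npm://registry.npmjs.org;package=mypkg;version=${PV}"
```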



* [RFC][PATCH 6/6] recipetool/create.py: remove the 'noverify' url parameter
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
                   ` (4 preceding siblings ...)
  2019-10-22  9:03 ` [RFC][PATCH 5/6] recipetool/create.py: replace 'latest' keyword for npm Jean-Marie LEMETAYER
@ 2019-10-22  9:03 ` Jean-Marie LEMETAYER
  2019-10-22 11:22 ` [RFC][PATCH 0/6] NPM refactoring Richard Purdie
  2019-10-25  8:01 ` André Draszik
  7 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-22  9:03 UTC (permalink / raw)
  To: openembedded-core; +Cc: brendan.le.foll, paul.eggleton, rennes

This commit removes the 'noverify' parameter which was added to the URL
to fix warnings with the shrinkwrap / lockdown file generation. It is no
longer needed with the new npm fetcher.

Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
---
 scripts/lib/recipetool/create.py | 2 --
 1 file changed, 2 deletions(-)

diff --git a/scripts/lib/recipetool/create.py b/scripts/lib/recipetool/create.py
index e49e73ed9b..b4ab1117b8 100644
--- a/scripts/lib/recipetool/create.py
+++ b/scripts/lib/recipetool/create.py
@@ -477,8 +477,6 @@ def create_recipe(args):
             storeTagName = params['tag']
             params['nobranch'] = '1'
             del params['tag']
-        if scheme == 'npm':
-            params['noverify'] = '1'
         fetchuri = bb.fetch2.encodeurl((scheme, network, path, user, passwd, params))
 
         tmpparent = tinfoil.config_data.getVar('BASE_WORKDIR')
-- 
2.20.1




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
                   ` (5 preceding siblings ...)
  2019-10-22  9:03 ` [RFC][PATCH 6/6] recipetool/create.py: remove the 'noverify' url parameter Jean-Marie LEMETAYER
@ 2019-10-22 11:22 ` Richard Purdie
  2019-10-23 13:17   ` Jean-Marie LEMETAYER
  2019-10-24 12:01   ` Stefan Herbrechtsmeier
  2019-10-25  8:01 ` André Draszik
  7 siblings, 2 replies; 36+ messages in thread
From: Richard Purdie @ 2019-10-22 11:22 UTC (permalink / raw)
  To: Jean-Marie LEMETAYER, openembedded-core
  Cc: brendan.le.foll, paul.eggleton, rennes

On Tue, 2019-10-22 at 11:03 +0200, Jean-Marie LEMETAYER wrote:
> The current NPM support has several issues:
>  - The current NPM fetcher downloads the whole dependency tree, unlike the
>    other fetchers. The 'subdir' parameter was used to work around this issue.
>  - There are multiple issues with package names (uppercase letters, exotic
>    characters, scoped packages), even inside the dependencies.
>  - The lockdown file generation has issues when a package depends on
>    multiple versions of the same package (all versions get the same checksum).
> 
> This patchset refactors the NPM support in Yocto:
>  - As the NPM algorithm for dependency management is hard to handle, the new
>    NPM fetcher downloads only the package source (and not the dependencies),
>    like the other fetchers (patch submitted on the bitbake-devel list).
>  - The NPM class handles the dependencies using NPM (and not manually).
>  - The NPM recipe creation is simplified to avoid issues.
>  - The lockdown file is no longer used as it is no longer relevant compared
>    to the latest shrinkwrap file format.
> 
> This patchset may remove some features (lockdown file, license management for
> dependencies) but fixes the majority of the NPM issues. All of these issues
> from bugzilla.yoctoproject.org are resolved by this patchset:
> #10237, #10760, #11028, #11728, #11902, #12534

One key requirement which many of our users have from the fetcher is
that it's deterministic and allows for "offline" builds.

What this means is that should I have a populated DL_DIR, the build
should not need to touch the network. Also, only do_fetch tasks would
make network accesses.

What is the situation for npm after these changes with regard to this?

Cheers,

Richard




* Re: [RFC][PATCH 1/6] npm.bbclass: refactor the npm class
  2019-10-22  9:03 ` [RFC][PATCH 1/6] npm.bbclass: refactor the npm class Jean-Marie LEMETAYER
@ 2019-10-22 11:35   ` Alexander Kanavin
  2019-10-23 13:17     ` Jean-Marie LEMETAYER
  2019-10-24 11:22   ` Stefan Herbrechtsmeier
  1 sibling, 1 reply; 36+ messages in thread
From: Alexander Kanavin @ 2019-10-22 11:35 UTC (permalink / raw)
  To: Jean-Marie LEMETAYER
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core


On Tue, 22 Oct 2019 at 11:12, Jean-Marie LEMETAYER <
jean-marie.lemetayer@savoirfairelinux.com> wrote:

> The simplest solution is to let npm do its job. Assuming the fetcher
> only gets the sources of the package, the class will now run
> 'npm install' to create a build directory. The build directory is then
> selectively copied to the destination.
>

I agree that npm dependency handling is a mess, and we should indeed let npm
do the job.

However, 'npm install' pulls various things from the network; this can only
happen during do_fetch(), so that offline builds can continue to work.
Whatever has been downloaded needs to go to DL_DIR.

It also needs to be reproducible (there should be a guarantee that it
always pulls the same set of sources, verified through a checksum of some
kind); I guess that can only be achieved through an upstream-provided
shrinkwrap?

Alex



* Re: [RFC][PATCH 1/6] npm.bbclass: refactor the npm class
  2019-10-22 11:35   ` Alexander Kanavin
@ 2019-10-23 13:17     ` Jean-Marie LEMETAYER
  0 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-23 13:17 UTC (permalink / raw)
  To: Alexander Kanavin; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

On Oct 22, 2019, at 1:35 PM, Alexander Kanavin alex.kanavin@gmail.com wrote:
> On Tue, 22 Oct 2019 at 11:12, Jean-Marie LEMETAYER <
> jean-marie.lemetayer@savoirfairelinux.com> wrote:
> 
>> The simplest solution is to let npm do its job. Assuming the fetcher
>> only gets the sources of the package, the class will now run
>> 'npm install' to create a build directory. The build directory is then
>> selectively copied to the destination.
>>
> 
> I agree that npm dependency handling is a mess, and we should indeed let npm
> do the job.
> 
> However, 'npm install' pulls various things from the network; this can only
> happen during do_fetch(), so that offline builds can continue to work.
> Whatever has been downloaded needs to go to DL_DIR.

You are right, I am working on a v2 to fix this point.

> It also needs to be reproducible (there should be a guarantee that it
> always pulls the same set of sources, verified through a checksum of some
> kind); I guess that can only be achieved through an upstream-provided
> shrinkwrap?

Yes, with current npm versions the "npm-shrinkwrap.json" [1] or the
"package-lock.json" [2] are both used to describe the dependency tree and
verify the integrity of the dependencies. This is what makes the build
reproducible. One of these files is mandatory to build the recipe (in the
source files or provided by the NPM_SHRINKWRAP variable).
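The integrity guarantee works through Subresource Integrity strings stored per dependency in the shrinkwrap / lock file. A minimal sketch of the verification idea (the shrinkwrap entry and tarball bytes below are fabricated for illustration):

```python
import base64
import hashlib

def verify_integrity(integrity, data):
    # An SRI string is "<algo>-<base64 digest>"; recompute the digest of
    # the downloaded bytes and compare it with the pinned value.
    algo, _, expected = integrity.partition("-")
    digest = base64.b64encode(getattr(hashlib, algo)(data).digest()).decode()
    return digest == expected

tarball = b"fake tarball bytes"
entry = {
    "version": "1.3.0",
    "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
    "integrity": "sha512-"
    + base64.b64encode(hashlib.sha512(tarball).digest()).decode(),
}

print(verify_integrity(entry["integrity"], tarball))      # -> True
print(verify_integrity(entry["integrity"], b"tampered"))  # -> False
```

This is why one of these files is enough to make the build reproducible: every dependency tarball is pinned by both version and content hash.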

For information, the current 'node' version packaged in meta-oe is '10.16.3',
which includes 'npm' version '6.9.0' [3].

1: https://docs.npmjs.com/files/shrinkwrap.json
2: https://docs.npmjs.com/files/package-lock.json
3: https://nodejs.org/en/download/releases/

Regards,
Jean-Marie



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-22 11:22 ` [RFC][PATCH 0/6] NPM refactoring Richard Purdie
@ 2019-10-23 13:17   ` Jean-Marie LEMETAYER
  2019-10-24 12:01   ` Stefan Herbrechtsmeier
  1 sibling, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-23 13:17 UTC (permalink / raw)
  To: Richard Purdie; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

On Oct 22, 2019, at 1:22 PM, Richard Purdie richard.purdie@linuxfoundation.org wrote:
> On Tue, 2019-10-22 at 11:03 +0200, Jean-Marie LEMETAYER wrote:
>> The current NPM support has several issues:
>>  - The current NPM fetcher downloads the whole dependency tree, unlike the
>>    other fetchers. The 'subdir' parameter was used to work around this issue.
>>  - There are multiple issues with package names (uppercase letters, exotic
>>    characters, scoped packages), even inside the dependencies.
>>  - The lockdown file generation has issues when a package depends on
>>    multiple versions of the same package (all versions get the same checksum).
>> 
>> This patchset refactors the NPM support in Yocto:
>>  - As the NPM algorithm for dependency management is hard to handle, the new
>>    NPM fetcher downloads only the package source (and not the dependencies),
>>    like the other fetchers (patch submitted on the bitbake-devel list).
>>  - The NPM class handles the dependencies using NPM (and not manually).
>>  - The NPM recipe creation is simplified to avoid issues.
>>  - The lockdown file is no longer used as it is no longer relevant compared
>>    to the latest shrinkwrap file format.
>> 
>> This patchset may remove some features (lockdown file, license management for
>> dependencies) but fixes the majority of the NPM issues. All of these issues
>> from bugzilla.yoctoproject.org are resolved by this patchset:
>> #10237, #10760, #11028, #11728, #11902, #12534
> 
> One key requirement which many of our users have from the fetcher is
> that its deterministic and allows for "offline" builds.
> 
> What this means is that should I have a populated DL_DIR, the build
> should not need to touch the network. Also, only do_fetch tasks would
> make network accesses.
> 
> What is the situation for npm after these changes with regard to this?

This is a good point. With this patchset the build is working but some
network accesses are done during the do_compile task (the 'npm install'
command). This needs to be fixed.

Other people have reported some issues with this patchset which will be
fixed in the v2:
 - Use do_fetch only to access the network.
 - Respect the existing MIRROR, BB_NO_NETWORK, BB_ALLOWED_NETWORKS settings.
 - Add some tests for the fetcher.

Regards,
Jean-Marie



* Re: [RFC][PATCH 1/6] npm.bbclass: refactor the npm class
  2019-10-22  9:03 ` [RFC][PATCH 1/6] npm.bbclass: refactor the npm class Jean-Marie LEMETAYER
  2019-10-22 11:35   ` Alexander Kanavin
@ 2019-10-24 11:22   ` Stefan Herbrechtsmeier
  2019-10-24 15:13     ` Jean-Marie LEMETAYER
  1 sibling, 1 reply; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 11:22 UTC (permalink / raw)
  To: Jean-Marie LEMETAYER, openembedded-core
  Cc: brendan.le.foll, paul.eggleton, rennes

Hi Jean-Marie,

On 22.10.19 at 11:03, Jean-Marie LEMETAYER wrote:
> Many issues were related to npm dependencies being badly handled: package
> names, installation directories, ... In fact npm uses an install
> algorithm [1] which is hard to reproduce / anticipate.

Why do you think it is hard to reproduce?

> Moreover some
> npm packages use scopes [2] which adds more complexity.

The additional complexity is limited.

> 
> The simplest solution is to let npm do its job. Assuming the fetcher
> only gets the sources of the package, the class will now run
> 'npm install' to create a build directory. The build directory is then
> selectively copied to the destination.

You use a full-blown package manager with totally different design goals
and no understanding of embedded / restricted systems.

> 
> 1: https://docs.npmjs.com/cli/install#algorithm
> 2: https://docs.npmjs.com/about-scopes
> 
> Signed-off-by: Jean-Marie LEMETAYER <jean-marie.lemetayer@savoirfairelinux.com>
> ---
>   meta/classes/npm.bbclass | 210 ++++++++++++++++++++++++++-------------
>   1 file changed, 143 insertions(+), 67 deletions(-)
> 
> diff --git a/meta/classes/npm.bbclass b/meta/classes/npm.bbclass
> index 4b1f0a39f0..fc671e7223 100644
> --- a/meta/classes/npm.bbclass
> +++ b/meta/classes/npm.bbclass
> @@ -1,19 +1,44 @@
> +# Copyright (C) 2019 Savoir-Faire Linux
> +#
> +# This bbclass builds and installs an npm package to the target. The package
> +# source files should be fetched in the calling recipe by using the SRC_URI
> +# variable. The ${S} variable should be updated depending on your fetcher.
> +#
> +# Usage:
> +#  SRC_URI = "..."
> +#  inherit npm
> +#
> +# Optional variables:
> +#  NPM_SHRINKWRAP:
> +#       Provide a shrinkwrap file [1]. If available a shrinkwrap file in the
> +#       sources has priority over the one provided. A shrinkwrap file is
> +#       mandatory in order to ensure build reproducibility.
> +#       1: https://docs.npmjs.com/files/shrinkwrap.json
> +#
> +#  NPM_INSTALL_DEV:
> +#       Set to 1 to also install devDependencies.
> +#
> +#  NPM_REGISTRY:
> +#       Use the specified registry.
> +#
> +#  NPM_ARCH:
> +#       Override the auto generated npm architecture.
> +#
> +#  NPM_INSTALL_EXTRA_ARGS:
> +#       Add extra arguments to the 'npm install' execution.
> +#       Use it at your own risk.
> +
>   DEPENDS_prepend = "nodejs-native "
>   RDEPENDS_${PN}_prepend = "nodejs "
> -S = "${WORKDIR}/npmpkg"
>   
> -def node_pkgname(d):
> -    bpn = d.getVar('BPN')
> -    if bpn.startswith("node-"):
> -        return bpn[5:]
> -    return bpn
> +NPM_SHRINKWRAP ?= "${THISDIR}/${BPN}/npm-shrinkwrap.json"
>   
> -NPMPN ?= "${@node_pkgname(d)}"
> +NPM_INSTALL_DEV ?= "0"
>   
> -NPM_INSTALLDIR = "${libdir}/node_modules/${NPMPN}"
> +NPM_REGISTRY ?= "https://registry.npmjs.org"
>   
>   # function maps arch names to npm arch names
> -def npm_oe_arch_map(target_arch, d):
> +def npm_oe_arch_map(target_arch):
>       import re
>       if   re.match('p(pc|owerpc)(|64)', target_arch): return 'ppc'
>       elif re.match('i.86$', target_arch): return 'ia32'
> @@ -21,74 +46,125 @@ def npm_oe_arch_map(target_arch, d):
>       elif re.match('arm64$', target_arch): return 'arm'
>       return target_arch
>   
> -NPM_ARCH ?= "${@npm_oe_arch_map(d.getVar('TARGET_ARCH'), d)}"
> -NPM_INSTALL_DEV ?= "0"
> +NPM_ARCH ?= "${@npm_oe_arch_map(d.getVar('TARGET_ARCH'))}"
> +
> +NPM_INSTALL_EXTRA_ARGS ?= ""
> +
> +B = "${WORKDIR}/build"
> +
> +npm_install_shrinkwrap() {
> +    # This function ensures that there is a shrinkwrap file in the specified
> +    # directory. A shrinkwrap file is mandatory to have reproducible builds.
> +    # If the shrinkwrap file is not already included in the sources,
> +    # the recipe can provide one by using the NPM_SHRINKWRAP option.
> +    # This function returns the filename of the installed file (if any).
> +    if [ -f ${1}/npm-shrinkwrap.json ]
> +    then
> +        bbnote "Using the npm-shrinkwrap.json provided in the sources"
> +    elif [ -f ${NPM_SHRINKWRAP} ]
> +    then
> +        install -m 644 ${NPM_SHRINKWRAP} ${1}
> +        echo ${1}/npm-shrinkwrap.json
> +    else
> +        bbfatal "No mandatory NPM_SHRINKWRAP file found"
> +    fi
> +}
>   
>   npm_do_compile() {
> -	# Copy in any additionally fetched modules
> -	if [ -d ${WORKDIR}/node_modules ] ; then
> -		cp -a ${WORKDIR}/node_modules ${S}/
> -	fi
> -	# changing the home directory to the working directory, the .npmrc will
> -	# be created in this directory
> -	export HOME=${WORKDIR}
> -	if [  "${NPM_INSTALL_DEV}" = "1" ]; then
> -		npm config set dev true
> -	else
> -		npm config set dev false
> -	fi
> -	npm set cache ${WORKDIR}/npm_cache
> -	# clear cache before every build
> -	npm cache clear --force
> -	# Install pkg into ${S} without going to the registry
> -	if [  "${NPM_INSTALL_DEV}" = "1" ]; then
> -		npm --arch=${NPM_ARCH} --target_arch=${NPM_ARCH} --no-registry install
> -	else
> -		npm --arch=${NPM_ARCH} --target_arch=${NPM_ARCH} --production --no-registry install
> -	fi
> +    # This function executes the 'npm install' command which builds and
> +    # installs every dependency needed for the package. All the files are
> +    # installed in a build directory ${B} without filtering anything. To do so,
> +    # a combination of 'npm pack' and 'npm install' is used to ensure that the
> +    # files in ${B} are actual copies instead of symbolic links (which is the
> +    # default npm behavior).
> +
> +    # The npm command uses by default a cache located in '~/.npm'. In
> +    # order to force the next npm commands to disable caching, the npm cache
> +    # needs to be cleared. To avoid altering the local cache, the npm config
> +    # needs to be updated to use another cache directory. HOME needs to be
> +    # updated as well to avoid modifying the local '~/.npmrc' file.
> +    HOME=${WORKDIR}
> +    npm config set cache ${WORKDIR}/npm_cache
> +    npm cache clear --force
> +
> +    # First ensure that there is a shrinkwrap file in the sources.
> +    local NPM_SHRINKWRAP_INSTALLED=$(npm_install_shrinkwrap ${S})
> +
> +    # Then create a tarball from a npm package whose sources must be in ${S}.
> +    local NPM_PACK_FILE=$(cd ${WORKDIR} && npm pack ${S}/)
> +
> +    # Finally install and build the tarball package in ${B}.
> +    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --loglevel silly"
> +    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --prefix=${B}"
> +    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --global"
> +
> +    if [ "${NPM_INSTALL_DEV}" != 1 ]
> +    then
> +        local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --production"
> +    fi
> +
> +    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --arch=${NPM_ARCH}"
> +    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --target_arch=${NPM_ARCH}"
> +    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --release"
> +
> +    cd ${WORKDIR} && npm install \

Why don't you use "npm ci"?

> +        ${NPM_INSTALL_EXTRA_ARGS} \
> +        ${NPM_INSTALL_GYP_ARGS} \
> +        ${NPM_INSTALL_ARGS} \
> +        ${NPM_PACK_FILE}
> +
> +    # Clean source tree.
> +    rm -f ${NPM_SHRINKWRAP_INSTALLED}
>   }
>   
>   npm_do_install() {
> -	# changing the home directory to the working directory, the .npmrc will
> -	# be created in this directory
> -	export HOME=${WORKDIR}
> -	mkdir -p ${D}${libdir}/node_modules
> -	local NPM_PACKFILE=$(npm pack .)
> -	npm install --prefix ${D}${prefix} -g --arch=${NPM_ARCH} --target_arch=${NPM_ARCH} --production --no-registry ${NPM_PACKFILE}
> -	ln -fs node_modules ${D}${libdir}/node
> -	find ${D}${NPM_INSTALLDIR} -type f \( -name "*.a" -o -name "*.d" -o -name "*.o" \) -delete
> -	if [ -d ${D}${prefix}/etc ] ; then
> -		# This will be empty
> -		rmdir ${D}${prefix}/etc
> -	fi
> -}
> +    # This function creates the destination directory from the pre installed
> +    # files in the ${B} directory.
> +
> +    # Copy the entire lib and bin directories from ${B} to ${D}.
> +    install -d ${D}/${libdir}
> +    cp --no-preserve=ownership --recursive ${B}/lib/. ${D}/${libdir}
> +
> +    if [ -d "${B}/bin" ]
> +    then
> +        install -d ${D}/${bindir}
> +        cp --no-preserve=ownership --recursive ${B}/bin/. ${D}/${bindir}
> +    fi
> +
> +    # If the package (or its dependencies) uses node-gyp to build native addons,
> +    # object files, static libraries or other temporary files can be hidden in
> +    # the lib directory. To reduce the package size and to avoid QA issues
> +    # (staticdev with static library files) these files must be removed.
> +
> +    # Remove any node-gyp directory in ${D} to remove temporary build files.
> +    for GYP_D_FILE in $(find ${D} -regex ".*/build/Release/[^/]*.node")
> +    do
> +        local GYP_D_DIR=${GYP_D_FILE%/Release/*}
> +
> +        rm --recursive --force ${GYP_D_DIR}
> +    done
> +
> +    # Copy only the node-gyp release files from ${B} to ${D}.
> +    for GYP_B_FILE in $(find ${B} -regex ".*/build/Release/[^/]*.node")
> +    do
> +        local GYP_D_FILE=${D}/${prefix}/${GYP_B_FILE#${B}}
> +
> +        install -d ${GYP_D_FILE%/*}
> +        install -m 755 ${GYP_B_FILE} ${GYP_D_FILE}
> +    done
> +
> +    # Remove the shrinkwrap file which does not need to be packed.
> +    rm -f ${D}/${libdir}/node_modules/*/npm-shrinkwrap.json
> +    rm -f ${D}/${libdir}/node_modules/@*/*/npm-shrinkwrap.json
>   
> -python populate_packages_prepend () {
> -    instdir = d.expand('${D}${NPM_INSTALLDIR}')
> -    extrapackages = oe.package.npm_split_package_dirs(instdir)
> -    pkgnames = extrapackages.keys()
> -    d.prependVar('PACKAGES', '%s ' % ' '.join(pkgnames))
> -    for pkgname in pkgnames:
> -        pkgrelpath, pdata = extrapackages[pkgname]
> -        pkgpath = '${NPM_INSTALLDIR}/' + pkgrelpath
> -        # package names can't have underscores but npm packages sometimes use them
> -        oe_pkg_name = pkgname.replace('_', '-')
> -        expanded_pkgname = d.expand(oe_pkg_name)
> -        d.setVar('FILES_%s' % expanded_pkgname, pkgpath)
> -        if pdata:
> -            version = pdata.get('version', None)
> -            if version:
> -                d.setVar('PKGV_%s' % expanded_pkgname, version)
> -            description = pdata.get('description', None)
> -            if description:
> -                d.setVar('SUMMARY_%s' % expanded_pkgname, description.replace(u"\u2018", "'").replace(u"\u2019", "'"))
> -    d.appendVar('RDEPENDS_%s' % d.getVar('PN'), ' %s' % ' '.join(pkgnames).replace('_', '-'))
> +    # node(1) is using /usr/lib/node as default include directory and npm(1) is
> +    # using /usr/lib/node_modules as install directory. Let's make both happy.
> +    ln -fs node_modules ${D}/${libdir}/node
>   }
>   
>   FILES_${PN} += " \
>       ${bindir} \
> -    ${libdir}/node \
> -    ${NPM_INSTALLDIR} \
> +    ${libdir} \
>   "
>   
>   EXPORT_FUNCTIONS do_compile do_install
> 

Regards
   Stefan


^ permalink raw reply	[flat|nested] 36+ messages in thread

* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-22 11:22 ` [RFC][PATCH 0/6] NPM refactoring Richard Purdie
  2019-10-23 13:17   ` Jean-Marie LEMETAYER
@ 2019-10-24 12:01   ` Stefan Herbrechtsmeier
  2019-10-24 12:12     ` Alexander Kanavin
  2019-10-24 15:13     ` Jean-Marie LEMETAYER
  1 sibling, 2 replies; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 12:01 UTC (permalink / raw)
  To: Richard Purdie, Jean-Marie LEMETAYER, openembedded-core
  Cc: brendan.le.foll, paul.eggleton, rennes

Hi Jean-Marie,

Am 22.10.19 um 13:22 schrieb Richard Purdie:
> On Tue, 2019-10-22 at 11:03 +0200, Jean-Marie LEMETAYER wrote:
>> The current NPM support have several issues:
>>   - The current NPM fetcher downloads the dependency tree but not the other
>>     fetchers. The 'subdir' parameter was used to fix this issue.
>>   - They are multiple issues with package names (uppercase, exotic characters,
>>     scoped packages) even if they are inside the dependencies.
>>   - The lockdown file generation have issues. When a package depends on
>>     multiple version of the same package (all versions have the same checksum).
>>
>> This patchset refactors the NPM support in Yocto:
>>   - As the NPM algorithm for dependency management is hard to handle, the new
>>     NPM fetcher downloads only the package source (and not the dependencies,
>>     like the other fetchers) (patch submitted in the bitbake-devel list).

What makes the new fetcher different from the simple wget fetcher?

>>   - The NPM class handles the dependencies using NPM (and not manually).

Is this really an improvement? NPM will do the cross compilation during 
fetch, load additional archives (not packages) from the internet, and 
won't reuse dependencies.

>>   - The NPM recipe creation is simplified to avoid issues.

This creates new, non-obvious issues. How would you handle prebuilt binaries?

>>   - The lockdown file is no more used as it is no longer relevant compared to the
>>     latest shrinkwrap file format.
>>
>> This patchset may remove some features (lockdown file, license management for
>> dependencies)

Do you really remove the license management of the dependencies? I think 
license management is a main feature of OE.

> but fixes the majority of the NPM issues. All of these issues
>> from the bugzilla.yoctoproject.org are resolved by this patchset:
>> #10237, #10760, #11028, #11728, #11902, #12534
> 
> One key requirement which many of our users have from the fetcher is
> that its deterministic and allows for "offline" builds.

I think this is impossible with npm because every dependency could run a 
script and download additional files (e.g. prebuild).

> What this means is that should I have a populated DL_DIR, the build
> should not need to touch the network. Also, only do_fetch tasks would
> make network accesses.

@Richard: What is your opinion about the per-recipe dependency handling? 
Typically OE uses one recipe per project. The NPM-based solution handles a 
project and all its dependencies via one recipe.

@Jean-Marie: Do you know PNPM? It uses a different node_modules layout 
which allows dependencies to be reused.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:01   ` Stefan Herbrechtsmeier
@ 2019-10-24 12:12     ` Alexander Kanavin
  2019-10-24 12:40       ` Stefan Herbrechtsmeier
                         ` (2 more replies)
  2019-10-24 15:13     ` Jean-Marie LEMETAYER
  1 sibling, 3 replies; 36+ messages in thread
From: Alexander Kanavin @ 2019-10-24 12:12 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core


On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
stefan@herbrechtsmeier.net> wrote:

>
> @Richard: What is your opinion about the per recipe dependency?
> Typically OE use one recipe per project. The NPM based solution handle a
> project and all dependencies via one recipe.
>

I don't think it's at all realistic to stick to 'one recipe per
component' in the node.js world. A typical 'npm install' can pull down
hundreds or even over a thousand dependencies; it's not feasible to have a
recipe for each.

I very much welcome a solution that uses 'npm install' in a way that
preserves offline builds and the integrity/reproducibility of downloads.
License management should also be handled by npm, and if it isn't, then we
need to work with upstream to address it.

Alex



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:12     ` Alexander Kanavin
@ 2019-10-24 12:40       ` Stefan Herbrechtsmeier
  2019-10-24 12:45         ` Alexander Kanavin
  2019-10-24 15:13         ` Jean-Marie LEMETAYER
  2019-10-24 13:36       ` richard.purdie
  2019-10-24 15:37       ` Adrian Bunk
  2 siblings, 2 replies; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 12:40 UTC (permalink / raw)
  To: Alexander Kanavin
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core

Am 24.10.19 um 14:12 schrieb Alexander Kanavin:
> On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier 
> <stefan@herbrechtsmeier.net <mailto:stefan@herbrechtsmeier.net>> wrote:
> 
> 
>     @Richard: What is your opinion about the per recipe dependency?
>     Typically OE use one recipe per project. The NPM based solution
>     handle a
>     project and all dependencies via one recipe.
> 
> 
> I don't think it's at all realistic to stick to the 'one recipe per 
> component' in node.js world. A typical 'npm install' can pull down 
> hundreds, or over a thousand dependencies, it's not feasible to have a 
> recipe for each.

Do you have an example package?

> 
> I very much welcome a solution that uses 'npm install' in a way that 
> preserves offline builds, and integrity/reproducibility of downloads.

First we should use 'npm ci' instead of 'npm install'.

How would you handle prebuilt binaries? Would you disable them?

How would you handle native packages (ex. angular-cli)?

How would you patch dependencies?

How would you remove unneeded files (e.g. documentation, examples, source 
code)?

> License management should be also handled by npm, and if it isn't, then 
> we need to work with the upstream to address it.

To my knowledge npm doesn't check license files. It only reads the license 
field in package.json.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:40       ` Stefan Herbrechtsmeier
@ 2019-10-24 12:45         ` Alexander Kanavin
  2019-10-24 13:52           ` Stefan Herbrechtsmeier
  2019-10-24 15:13         ` Jean-Marie LEMETAYER
  1 sibling, 1 reply; 36+ messages in thread
From: Alexander Kanavin @ 2019-10-24 12:45 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core


On Thu, 24 Oct 2019 at 14:40, Stefan Herbrechtsmeier <
stefan@herbrechtsmeier.net> wrote:

> > I don't think it's at all realistic to stick to the 'one recipe per
> > component' in node.js world. A typical 'npm install' can pull down
> > hundreds, or over a thousand dependencies, it's not feasible to have a
> > recipe for each.
>
> Do you have an example package?
>

Yes:
http://lists.openembedded.org/pipermail/openembedded-architecture/2017-March/001270.html

I have also written an 'essay' on generally dealing with this kind of thing:
http://lists.openembedded.org/pipermail/openembedded-architecture/2017-March/001269.html

Alex



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:12     ` Alexander Kanavin
  2019-10-24 12:40       ` Stefan Herbrechtsmeier
@ 2019-10-24 13:36       ` richard.purdie
  2019-10-24 15:20         ` Jean-Marie LEMETAYER
  2019-10-24 15:37       ` Adrian Bunk
  2 siblings, 1 reply; 36+ messages in thread
From: richard.purdie @ 2019-10-24 13:36 UTC (permalink / raw)
  To: Alexander Kanavin, Stefan Herbrechtsmeier
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core

On Thu, 2019-10-24 at 14:12 +0200, Alexander Kanavin wrote:
> On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
> stefan@herbrechtsmeier.net> wrote:
> > @Richard: What is your opinion about the per recipe dependency? 
> > Typically OE use one recipe per project. The NPM based solution
> > handle a 
> > project and all dependencies via one recipe.
> 
> I don't think it's at all realistic to stick to the 'one recipe per
> component' in node.js world. A typical 'npm install' can pull down
> hundreds, or over a thousand dependencies, it's not feasible to have
> a recipe for each.
> 
> I very much welcome a solution that uses 'npm install' in a way that
> preserves offline builds, and integrity/reproducibility of downloads.
> License management should be also handled by npm, and if it isn't,
> then we need to work with the upstream to address it.

I understand, however keep in mind that the way this patch series has been
going, it could end up simply forcing all processing into the do_fetch
task.

We need determinism from the build in that building this today should
give the same result as a build run in X years' time, assuming the same
host OS and so on, even if DL_DIR isn't populated. The state of the
internet should not change that.

I worry about the amount of magic "npm install" has going on which
would mean we couldn't achieve this.

Cheers,

Richard




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:45         ` Alexander Kanavin
@ 2019-10-24 13:52           ` Stefan Herbrechtsmeier
  2019-10-24 14:22             ` Alexander Kanavin
  0 siblings, 1 reply; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 13:52 UTC (permalink / raw)
  To: Alexander Kanavin
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core

Am 24.10.19 um 14:45 schrieb Alexander Kanavin:
> On Thu, 24 Oct 2019 at 14:40, Stefan Herbrechtsmeier 
> <stefan@herbrechtsmeier.net <mailto:stefan@herbrechtsmeier.net>> wrote:
> 
>      > I don't think it's at all realistic to stick to the 'one recipe per
>      > component' in node.js world. A typical 'npm install' can pull down
>      > hundreds, or over a thousand dependencies, it's not feasible to
>     have a
>      > recipe for each.
> 
>     Do you have an example package?
> 
> 
> Yes:
> http://lists.openembedded.org/pipermail/openembedded-architecture/2017-March/001270.html

Do you have an up-to-date example? The mean.io project is dead.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 13:52           ` Stefan Herbrechtsmeier
@ 2019-10-24 14:22             ` Alexander Kanavin
  2019-10-24 17:44               ` Stefan Herbrechtsmeier
  0 siblings, 1 reply; 36+ messages in thread
From: Alexander Kanavin @ 2019-10-24 14:22 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core


On Thu, 24 Oct 2019 at 15:52, Stefan Herbrechtsmeier <
stefan@herbrechtsmeier.net> wrote:

>
> > Yes:
> >
> http://lists.openembedded.org/pipermail/openembedded-architecture/2017-March/001270.html
>
> Do you have an up-to-date example? The mean.io project is dead.
>

http://meanjs.org/ I guess. The package-lock.json in their tarball is 600K.

Alex



* Re: [RFC][PATCH 1/6] npm.bbclass: refactor the npm class
  2019-10-24 11:22   ` Stefan Herbrechtsmeier
@ 2019-10-24 15:13     ` Jean-Marie LEMETAYER
  0 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-24 15:13 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

Hi Stefan,

On Oct 24, 2019, at 1:22 PM, Stefan Herbrechtsmeier stefan@herbrechtsmeier.net wrote:
> Am 22.10.19 um 11:03 schrieb Jean-Marie LEMETAYER:
>> Many issues were related to npm dependencies badly handled: package
>> names, installation directories, ... In fact npm is using an install
>> algorithm [1] which is hard to reproduce / anticipate.
> 
> Why do you think it is hard to reproduce?

The previous npm support tried to manage dependencies manually, but it could easily fail. My goal with this refactoring is to actually be able to _use_ npm recipes.

I remember that you are working on an npm refactoring too, but using one recipe per package and handling dependencies using RDEPENDS. To me this will generate a lot of work to create the recipes. For instance a basic angular application gathers more than a thousand dependencies and would thus lead to thousands of recipes. The work needed to create and maintain all those recipes is huge.

>> Moreover some
>> npm packages use scopes [2] which adds more complexity.
> 
> The addition complexity is limited.

You are right, but this case was not handled in the previous npm support and is now supported.

>> The simplest solution is to let npm do its job. Assuming the fetcher
>> only get the sources of the package, the class will now run
>> 'npm install' to create a build directory. The build directory is then
>> copied wisely to the destination.
> 
> You use a full-blown package manager with totally different design goals
> and no understanding of embedded / restricted systems.

This patchset saves more space on the target than the current implementation, as all the build artifacts are removed. So I believe that it is suitable for restricted embedded systems.

>> +    # Finally install and build the tarball package in ${B}.
>> +    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --loglevel silly"
>> +    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --prefix=${B}"
>> +    local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --global"
>> +
>> +    if [ "${NPM_INSTALL_DEV}" != 1 ]
>> +    then
>> +        local NPM_INSTALL_ARGS="${NPM_INSTALL_ARGS} --production"
>> +    fi
>> +
>> +    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --arch=${NPM_ARCH}"
>> +    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --target_arch=${NPM_ARCH}"
>> +    local NPM_INSTALL_GYP_ARGS="${NPM_INSTALL_GYP_ARGS} --release"
>> +
>> +    cd ${WORKDIR} && npm install \
> 
> Why don't you use "npm ci"?

Thanks for the tip, I did not know about the 'npm ci' command. Sadly it fails to handle '--global' installations. The 'npm pack && npm install' trick is much more efficient, as explained in this commit:
  https://git.yoctoproject.org/cgit/cgit.cgi/poky/commit/?id=d38e1e2c2ea4646b34ea6282d3d7620df5b0374b

Regards,
Jean-Marie



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:01   ` Stefan Herbrechtsmeier
  2019-10-24 12:12     ` Alexander Kanavin
@ 2019-10-24 15:13     ` Jean-Marie LEMETAYER
  2019-10-24 16:18       ` Stefan Herbrechtsmeier
  1 sibling, 1 reply; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-24 15:13 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

Hi Stefan,

On Oct 24, 2019, at 2:01 PM, Stefan Herbrechtsmeier stefan@herbrechtsmeier.net wrote:
> Am 22.10.19 um 13:22 schrieb Richard Purdie:
>> On Tue, 2019-10-22 at 11:03 +0200, Jean-Marie LEMETAYER wrote:
>>> The current NPM support have several issues:
>>>   - The current NPM fetcher downloads the dependency tree but not the other
>>>     fetchers. The 'subdir' parameter was used to fix this issue.
>>>   - They are multiple issues with package names (uppercase, exotic characters,
>>>     scoped packages) even if they are inside the dependencies.
>>>   - The lockdown file generation have issues. When a package depends on
>>>     multiple version of the same package (all versions have the same checksum).
>>>
>>> This patchset refactors the NPM support in Yocto:
>>>   - As the NPM algorithm for dependency management is hard to handle, the new
>>>     NPM fetcher downloads only the package source (and not the dependencies,
>>>     like the other fetchers) (patch submitted in the bitbake-devel list).
> 
> What make the new fetcher different from the simple wget fetcher?

The fetcher asks the registry for the tarball URL and its integrity. Then it uses wget to download the tarball and checks the integrity given by the registry. Finally it unpacks the tarball according to the tarball format (the root directory is called 'package' and must be renamed).
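As an offline illustration of these steps (the names and paths below are invented for the example, not the actual fetcher code; npm tarballs always unpack into a top-level 'package/' directory):

```shell
# Sketch only -- not the actual bitbake fetcher. The registry metadata
# lookup (tarball URL and integrity) is shown as comments; the unpack and
# rename step is simulated with a locally fabricated tarball so the
# example needs no network.
set -e
workdir=$(mktemp -d)

# Real fetcher: query the registry for the tarball location, e.g.
#   npm view express@4.17.1 dist.tarball    # -> URL to download with wget
#   npm view express@4.17.1 dist.integrity  # -> expected integrity hash

# Fake the "downloaded" tarball: npm tarballs have a 'package/' root.
mkdir -p "$workdir/package"
echo '{"name":"demo","version":"1.0.0"}' > "$workdir/package/package.json"
tar -czf "$workdir/demo-1.0.0.tgz" -C "$workdir" package

# Unpack while dropping the 'package/' root, i.e. the rename step.
mkdir -p "$workdir/npm"
tar -xzf "$workdir/demo-1.0.0.tgz" -C "$workdir/npm" --strip-components=1

result="missing"
test -f "$workdir/npm/package.json" && result="unpacked"
echo "$result"
rm -rf "$workdir"
```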

>>>   - The NPM class handles the dependencies using NPM (and not manually).
> 
> Is this really an improvement? NPM will do the cross compile during
> fetch, loads additionally archives (not packages) from the internet and
> doesn't reuse dependencies.

In the next version of this patchset the dependencies are cached in an npm_cache directory stored in the DL_DIR during the do_fetch_append task (using 'npm cache --cache=${DL_DIR}/npm_cache'). The 'npm install' doesn't require any internet access because it is run with the '--offline' option.
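Roughly, the intended two-stage flow looks like this (the variable names and the package spec are placeholders; '--cache' and '--offline' are npm's documented options -- the commands are only assembled and echoed here so the sketch stays self-contained):

```shell
# Sketch of the fetch/install split described above; nothing is actually
# installed, the commands are just built and printed.
DL_DIR=${DL_DIR:-/tmp/downloads}
NPM_CACHE="$DL_DIR/npm_cache"

# do_fetch: network access allowed, populate the cache in DL_DIR.
fetch_cmd="npm cache add --cache=$NPM_CACHE my-package@1.0.0"
echo "$fetch_cmd"

# do_compile/do_install: no network, resolve everything from the cache.
install_cmd="npm install --offline --cache=$NPM_CACHE my-package@1.0.0"
echo "$install_cmd"
```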

>>>   - The NPM recipe creation is simplified to avoid issues.
> 
> We create new not obvious issues. How you would handle prebuild binaries?

I have no use case for prebuilt binaries. After checking some information, it seems that they are handled the same way as node-gyp (by using scripts in the package.json). A patchset to support that is welcome.

>>>   - The lockdown file is no more used as it is no longer relevant compared to the
>>>     latest shrinkwrap file format.
>>>
>>> This patchset may remove some features (lockdown file, license management for
>>> dependencies)
> 
> You really remove the license management of the dependencies? I think a
> main feature of OE is the license management.

The previous npm support was creating subpackages to handle the license management of the dependencies. This is a good idea but it fails with packages with exotic names. Some work will be needed to cleanly remake it.

>> but fixes the majority of the NPM issues. All of these issues
>>> from the bugzilla.yoctoproject.org are resolved by this patchset:
>>> #10237, #10760, #11028, #11728, #11902, #12534
>> 
>> One key requirement which many of our users have from the fetcher is
>> that its deterministic and allows for "offline" builds.
> 
> I think this is impossible with npm because every dependency could run a
> script and download additional files (ex. prebuild).

I have no use case for prebuilt binaries. Can you give me an example of a package using prebuilt binaries? Do you know about the '--offline' and '--ignore-scripts' options? A patchset to support that is welcome.

>> What this means is that should I have a populated DL_DIR, the build
>> should not need to touch the network. Also, only do_fetch tasks would
>> make network accesses.
> 
> @Richard: What is your opinion about the per recipe dependency?
> Typically OE use one recipe per project. The NPM based solution handle a
> project and all dependencies via one recipe.
> 
> @Jean-Marie: Do you know PNPM? They use a different node_modules layout
> which allows the reuse of dependencies.

Thanks for the link. What is its value in our scope?

Regards,
Jean-Marie



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:40       ` Stefan Herbrechtsmeier
  2019-10-24 12:45         ` Alexander Kanavin
@ 2019-10-24 15:13         ` Jean-Marie LEMETAYER
  2019-10-24 17:03           ` Stefan Herbrechtsmeier
  1 sibling, 1 reply; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-24 15:13 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

Hi Stefan,

On Oct 24, 2019, at 2:40 PM, Stefan Herbrechtsmeier stefan@herbrechtsmeier.net wrote:
> First we should use 'npm ci' instead of 'npm install'.

Thanks for the tip, I did not know about the 'npm ci' command. Sadly it fails to handle '--global' installations. The 'npm pack && npm install' trick is much more efficient, as explained in this commit:
  https://git.yoctoproject.org/cgit/cgit.cgi/poky/commit/?id=d38e1e2c2ea4646b34ea6282d3d7620df5b0374b

> How would you handle prebuild binaries? Would you disable prebuild binaries?

I have no use case for prebuilt binaries. Can you give me an example of a package using prebuilt binaries?

> How would you handle native packages (ex. angular-cli)?

Like this (this is a part of the next patchset to support angular):

  $ cat meta-oe/recipes-devtools/angular-cli/angular-cli_8.3.12.bb 
  SUMMARY = "CLI tool for Angular"
  HOMEPAGE = "https://github.com/angular/angular-cli"
  LICENSE = "MIT"
  LIC_FILES_CHKSUM = "file://LICENSE;md5=dc2a37e472c366af2a7b8bd0f2ba5af4"
  SRC_URI = "npm://registry.npmjs.org;name=@angular/cli;version=${PV}"
  S = "${WORKDIR}/npm"
  inherit npm
  NPM_SHRINKWRAP = "${THISDIR}/${BPN}/npm-shrinkwrap.json"
  BBCLASSEXTEND = "native"

> How would you patch dependencies?

I am not sure that it is managed in the current implementation. I believe that the interest is limited.

> How would you remove unneeded files (ex. documentation, examples, source
> code)?

I don't. Only the build files are removed (when using addons). It is not possible to know which files should be kept or deleted.

>> License management should be also handled by npm, and if it isn't, then
>> we need to work with the upstream to address it.
> 
> To my knowledge npm don't check license files. It only reads the license
> in the package.json.

You are right, that's also a reason why the license management needs a rework.

Regards,
Jean-Marie



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 13:36       ` richard.purdie
@ 2019-10-24 15:20         ` Jean-Marie LEMETAYER
  0 siblings, 0 replies; 36+ messages in thread
From: Jean-Marie LEMETAYER @ 2019-10-24 15:20 UTC (permalink / raw)
  To: Richard Purdie; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

Hi Richard,

On Oct 24, 2019, at 3:36 PM, Richard Purdie richard.purdie@linuxfoundation.org wrote:
> On Thu, 2019-10-24 at 14:12 +0200, Alexander Kanavin wrote:
>> On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
>> stefan@herbrechtsmeier.net> wrote:
>> > @Richard: What is your opinion about the per recipe dependency?
>> > Typically OE use one recipe per project. The NPM based solution
>> > handle a
>> > project and all dependencies via one recipe.
>> 
>> I don't think it's at all realistic to stick to the 'one recipe per
>> component' in node.js world. A typical 'npm install' can pull down
>> hundreds, or over a thousand dependencies, it's not feasible to have
>> a recipe for each.
>> 
>> I very much welcome a solution that uses 'npm install' in a way that
>> preserves offline builds, and integrity/reproducibility of downloads.
>> License management should be also handled by npm, and if it isn't,
>> then we need to work with the upstream to address it.
> 
> I understand however keep in mind the way this patch series has been
> going, it could end up simply forceing all processing into the do_fetch
> task.
> 
> We need determinism form the build in that building this today should
> give the same result as a build run in X years time, assuming the same
> host OS and so on, even if DL_DIR isn't populated. The state of the
> internet should not change that.
> 
> I worry about the amount of magic "npm install" has going on which
> would mean we couldn't achieve this.

I have almost finished a new version of this patchset. All network accesses are now done during do_fetch and verified using the check_network_access function.

The npm-shrinkwrap.json file ensures that the generated tree will be reproducible. It describes the wanted dependency tree and provides an integrity check for each package. When using this file the build is totally predictable. Of course this file is mandatory.
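For reference, each entry in an npm-shrinkwrap.json (same format as package-lock.json) pins an exact version, the resolved tarball URL and an integrity hash. A truncated, illustrative fragment (the application name and the hash are placeholders):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "lockfileVersion": 1,
  "dependencies": {
    "lodash": {
      "version": "4.17.15",
      "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz",
      "integrity": "sha512-...base64-hash-truncated..."
    }
  }
}
```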

Regards,
Jean-Marie





* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 12:12     ` Alexander Kanavin
  2019-10-24 12:40       ` Stefan Herbrechtsmeier
  2019-10-24 13:36       ` richard.purdie
@ 2019-10-24 15:37       ` Adrian Bunk
  2019-10-24 15:59         ` Richard Purdie
  2 siblings, 1 reply; 36+ messages in thread
From: Adrian Bunk @ 2019-10-24 15:37 UTC (permalink / raw)
  To: Alexander Kanavin
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core

On Thu, Oct 24, 2019 at 02:12:43PM +0200, Alexander Kanavin wrote:
> On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
> stefan@herbrechtsmeier.net> wrote:
> 
> > @Richard: What is your opinion about the per recipe dependency?
> > Typically OE use one recipe per project. The NPM based solution handle a
> > project and all dependencies via one recipe.
> 
> I don't think it's at all realistic to stick to the 'one recipe per
> component' in node.js world. A typical 'npm install' can pull down
> hundreds, or over a thousand dependencies, it's not feasible to have a
> recipe for each.

Debian has one recipe per component for the perl/python/node/go/rust/haskell
ecosystems, with ~1k recipes each.

> I very much welcome a solution that uses 'npm install' in a way that
> preserves offline builds, and integrity/reproducibility of downloads.
> License management should be also handled by npm, and if it isn't, then we
> need to work with the upstream to address it.

How will CVE checking and security support work in such a setup?

Last time I looked at Rust I was wondering whether a vendored copy
of the OpenSSL sources was being used.

If git-lfs-native might run during fetch, it would also be good
if relevant CVEs in the Go libraries it uses get fixed.

> Alex

cu
Adrian

-- 

       "Is there not promise of rain?" Ling Tan asked suddenly out
        of the darkness. There had been need of rain for many days.
       "Only a promise," Lao Er said.
                                       Pearl S. Buck - Dragon Seed




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 15:37       ` Adrian Bunk
@ 2019-10-24 15:59         ` Richard Purdie
  2019-10-25  8:35           ` Stefan Herbrechtsmeier
  0 siblings, 1 reply; 36+ messages in thread
From: Richard Purdie @ 2019-10-24 15:59 UTC (permalink / raw)
  To: Adrian Bunk, Alexander Kanavin
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core

On Thu, 2019-10-24 at 18:37 +0300, Adrian Bunk wrote:
> On Thu, Oct 24, 2019 at 02:12:43PM +0200, Alexander Kanavin wrote:
> > On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
> > stefan@herbrechtsmeier.net> wrote:
> > 
> > > @Richard: What is your opinion about the per recipe dependency?
> > > Typically OE use one recipe per project. The NPM based solution
> > > handle a
> > > project and all dependencies via one recipe.
> > 
> > I don't think it's at all realistic to stick to the 'one recipe per
> > component' in node.js world. A typical 'npm install' can pull down
> > hundreds, or over a thousand dependencies, it's not feasible to
> > have a
> > recipe for each.
> 
> Debian has for the perl/python/node/go/rust/haskell ecosystems
> one recipe per component, with ~ 1k recipes each.

I think we'll have to end up having a smaller number of recipes which
generate multiple packages. That gives a reasonable parsing time at the
expense of having to pre-generate some of the recipe, a bit like the
core perl and python recipes work today. The exact split will depend on
the ecosystems and the "blocks" people tend to build in, as it's a
compromise between building too much and parsing time.

Cheers,

Richard




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 15:13     ` Jean-Marie LEMETAYER
@ 2019-10-24 16:18       ` Stefan Herbrechtsmeier
  0 siblings, 0 replies; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 16:18 UTC (permalink / raw)
  To: Jean-Marie LEMETAYER; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

Am 24.10.19 um 17:13 schrieb Jean-Marie LEMETAYER:
> Hi Stefan,
> 
> On Oct 24, 2019, at 2:01 PM, Stefan Herbrechtsmeier stefan@herbrechtsmeier.net wrote:
>> Am 22.10.19 um 13:22 schrieb Richard Purdie:
>>> On Tue, 2019-10-22 at 11:03 +0200, Jean-Marie LEMETAYER wrote:
>>>> The current NPM support have several issues:
>>>>    - The current NPM fetcher downloads the dependency tree but not the other
>>>>      fetchers. The 'subdir' parameter was used to fix this issue.
>>>>    - They are multiple issues with package names (uppercase, exotic characters,
>>>>      scoped packages) even if they are inside the dependencies.
>>>>    - The lockdown file generation have issues. When a package depends on
>>>>      multiple version of the same package (all versions have the same checksum).
>>>>
>>>> This patchset refactors the NPM support in Yocto:
>>>>    - As the NPM algorithm for dependency management is hard to handle, the new
>>>>      NPM fetcher downloads only the package source (and not the dependencies,
>>>>      like the other fetchers) (patch submitted in the bitbake-devel list).
>>
>> What makes the new fetcher different from the simple wget fetcher?
> 
> The fetcher asks the registry for the tarball URL and its integrity. Then it uses wget to download the tarball and checks the integrity given by the registry. Finally it unpacks the tarball according to the tarball format (the root directory is called 'package' and must be renamed).
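The integrity value npm registries return is a Subresource-Integrity-style string ('<algo>-<base64 digest>'). A minimal sketch of the verification step described above, assuming that format (illustration only, not the actual fetcher code):

```python
import base64
import hashlib

def check_integrity(tarball_bytes, integrity):
    # 'integrity' is an SRI string such as "sha512-<base64 digest>",
    # as returned in the registry metadata's 'dist' section.
    algo, expected = integrity.split("-", 1)
    actual = base64.b64encode(hashlib.new(algo, tarball_bytes).digest()).decode()
    return actual == expected
```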

Does this mean you don't add the SRC_URI checksum to the recipe?

> 
>>>>    - The NPM class handles the dependencies using NPM (and not manually).
>>
>> Is this really an improvement? NPM will do the cross-compile during
>> fetch, load additional archives (not packages) from the internet, and
>> not reuse dependencies.
> 
> In the next version of this patchset the dependencies are cached in a npm_cache directory stored in the DL_DIR during the do_fetch_append task (using 'npm cache --cache=${DL_DIR}/npm_cache'). The 'npm install' does not require any internet access because it is run with the '--offline' option.
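For illustration, the flow described above amounts to something like the following recipe fragment (a hypothetical sketch; task bodies are assumptions, not the actual patchset code):

```bitbake
# Prime a shared npm cache in DL_DIR while network access is still allowed...
do_fetch_append() {
    npm cache add "${S}" --cache=${DL_DIR}/npm_cache
}

# ...then resolve every dependency from that cache, never from the network.
do_compile() {
    npm install --offline --cache=${DL_DIR}/npm_cache
}
```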

Does the npm cache work together with a download proxy?

> 
>>>>    - The NPM recipe creation is simplified to avoid issues.
>>
>> We create new, non-obvious issues. How would you handle prebuilt binaries?
> 
> I have no case of using prebuilt binaries. After checking some information, it seems that they are handled the same way as node-gyp (by using scripts in the package.json). A patchset to support that is welcome.

The problem is that prebuilt binaries are only one example, and you only
detect the issue if you remove the internet connection.


>>>>    - The lockdown file is no more used as it is no longer relevant compared to the
>>>>      latest shrinkwrap file format.
>>>>
>>>> This patchset may remove some features (lockdown file, license management for
>>>> dependencies)
>>
>> You really remove the license management of the dependencies? I think
>> license management is a main feature of OE.
> 
> The previous npm support created subpackages to handle the license management of the dependencies. This is a good idea but it fails with packages with exotic names. Some work will be needed to reimplement it cleanly.

The license handling was independent of the subpackages. You need to parse
all the package.json files and dependency folders.

> 
>>> but fixes the majority of the NPM issues. All of these issues
>>>> from the bugzilla.yoctoproject.org are resolved by this patchset:
>>>> #10237, #10760, #11028, #11728, #11902, #12534
>>>
>>> One key requirement which many of our users have from the fetcher is
>>> that its deterministic and allows for "offline" builds.
>>
>> I think this is impossible with npm because every dependency could run a
>> script and download additional files (e.g. via prebuild).
> 
> I have no case of using prebuilt binaries. Can you give me an example of a package using prebuild?

You could use the pouchdb package as an example and test it with an ARM 
soft float machine.

> Do you know about the '--offline' and the '--ignore-scripts' options? A patchset to support that is welcome.

Can you point me to the '--offline' option?

The '--ignore-scripts' option disables all scripts and breaks binary
packages.

> 
>>> What this means is that should I have a populated DL_DIR, the build
>>> should not need to touch the network. Also, only do_fetch tasks would
>>> make network accesses.
>>
>> @Richard: What is your opinion about the per recipe dependency?
>> Typically OE use one recipe per project. The NPM based solution handle a
>> project and all dependencies via one recipe.
>>
>> @Jean-Marie: Do you know PNPM? They use a different node_modules layout
>> which allows the reuse of dependencies.
> 
> Thanks for the link. What is the value in our scope?

They use a different node_modules layout which avoids duplicates. Maybe
it is more interesting for my prototype.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 15:13         ` Jean-Marie LEMETAYER
@ 2019-10-24 17:03           ` Stefan Herbrechtsmeier
  0 siblings, 0 replies; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 17:03 UTC (permalink / raw)
  To: Jean-Marie LEMETAYER; +Cc: paul eggleton, Savoir-faire Linux Rennes, OE-core

Hi Jean-Marie,

Am 24.10.19 um 17:13 schrieb Jean-Marie LEMETAYER:
> Hi Stefan,
> 
> On Oct 24, 2019, at 2:40 PM, Stefan Herbrechtsmeier stefan@herbrechtsmeier.net wrote:
>> First we should use 'npm ci' instead of 'npm install'.
> 
> Thanks for the tip, I did not know about the 'npm ci' command. Sadly it fails to handle '--global' installation. The 'npm pack && npm install' trick is much more efficient, as explained in this commit:
>    https://git.yoctoproject.org/cgit/cgit.cgi/poky/commit/?id=d38e1e2c2ea4646b34ea6282d3d7620df5b0374b

Okay.

> 
>> How would you handle prebuilt binaries? Would you disable prebuilt binaries?
> 
> I have no case of using prebuilt binaries. Can you give me an example of a package using prebuild?

The pouchdb package has some dependencies with prebuilt binaries.

> 
>> How would you handle native packages (e.g. angular-cli)?
> 
> Like this (this is a part of the next patchset to support Angular):
> 
>    $ cat meta-oe/recipes-devtools/angular-cli/angular-cli_8.3.12.bb
>    SUMMARY = "CLI tool for Angular"
>    HOMEPAGE = "https://github.com/angular/angular-cli"
>    LICENSE = "MIT"
>    LIC_FILES_CHKSUM = "file://LICENSE;md5=dc2a37e472c366af2a7b8bd0f2ba5af4"
>    SRC_URI = "npm://registry.npmjs.org;name=@angular/cli;version=${PV}"
>    S = "${WORKDIR}/npm"
>    inherit npm
>    NPM_SHRINKWRAP = "${THISDIR}/${BPN}/npm-shrinkwrap.json"
>    BBCLASSEXTEND = "native"

You don't set the SRC_URI checksum? How do you ensure that the server
didn't change the package?

How do you use this package and how do you handle other dependencies? How
do you build an Angular application?

>> How would you patch dependencies?
> 
> I am not sure that it is handled in the current implementation. I believe that the interest is limited.

This is needed if you use packages with C code.

Regards
   Stefan




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 14:22             ` Alexander Kanavin
@ 2019-10-24 17:44               ` Stefan Herbrechtsmeier
  2019-10-24 17:58                 ` Alexander Kanavin
  0 siblings, 1 reply; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-24 17:44 UTC (permalink / raw)
  To: Alexander Kanavin
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core

Am 24.10.19 um 16:22 schrieb Alexander Kanavin:
> On Thu, 24 Oct 2019 at 15:52, Stefan Herbrechtsmeier 
> <stefan@herbrechtsmeier.net <mailto:stefan@herbrechtsmeier.net>> wrote:
> 
> 
>      > Yes:
>      >
>     http://lists.openembedded.org/pipermail/openembedded-architecture/2017-March/001270.html
> 
Do you have an up-to-date example? The mean.io project is dead.
> 
> 
> http://meanjs.org/ I guess.

The project isn't really up-to-date and the "npm install" failed.

> The package-lock.json in their tarball is 600K.

The project uses two major versions and seven different versions of the
debug package, with 30 installations of it. Furthermore, the dependencies
include build tools which should not be installed on the device.

The "@angular/cli" (242 packages) and "node-red" (324 packages) share 106 packages.
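Duplication like the 30 installed copies of debug can be counted mechanically from a lockfile. A sketch assuming a v1-style package-lock.json with nested 'dependencies' maps:

```python
import json
from collections import Counter

def count_installs(lock_json_text):
    # Each nested 'dependencies' entry is one installed copy of a package.
    def walk(deps, counter):
        for name, info in deps.items():
            counter[name] += 1
            walk(info.get("dependencies", {}), counter)
    counter = Counter()
    walk(json.loads(lock_json_text).get("dependencies", {}), counter)
    return counter
```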

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 17:44               ` Stefan Herbrechtsmeier
@ 2019-10-24 17:58                 ` Alexander Kanavin
  2019-10-25  8:58                   ` Stefan Herbrechtsmeier
  0 siblings, 1 reply; 36+ messages in thread
From: Alexander Kanavin @ 2019-10-24 17:58 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core


On Thu, 24 Oct 2019 at 19:45, Stefan Herbrechtsmeier <
stefan@herbrechtsmeier.net> wrote:

> > The package-lock.json in their tarball is 600K.
>
> The project use two major version and seven different versions with 30
> installations of debug. Furthermore the dependencies include build tools
> which should not be installed on the device.
>
> The "@angular/cli" (242) and "node-red" (324) package share 106 packages.
>

I have to ask: what point are you trying to make?

Here's a related lwn article describing a similar problem faced by opensuse:
https://lwn.net/Articles/712318/

""Ruby dependency hell has nothing on JavaScript dependency hell," he said.
A "hello world" application based on one JavaScript framework has 759
JavaScript dependencies; this framework is described as "a lightweight
alternative to Angular2". There is no way he is going to package all 759
dependencies for this thing; the current distribution package-management
approach just isn't going to work here."

Alex



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
                   ` (6 preceding siblings ...)
  2019-10-22 11:22 ` [RFC][PATCH 0/6] NPM refactoring Richard Purdie
@ 2019-10-25  8:01 ` André Draszik
  2019-10-25  9:10   ` Stefan Herbrechtsmeier
  7 siblings, 1 reply; 36+ messages in thread
From: André Draszik @ 2019-10-25  8:01 UTC (permalink / raw)
  To: openembedded-core

Hi,

This has been an interesting discussion so far.

I'd like to throw in something else...

A couple of years back I wrote a little Python script to automatically
generate all the required dependency recipes given an npm project's
package.json.

The generated recipes worked very well, including cross-compilation using
node-gyp.

At least at the time, this was all reasonably straightforward, avoided *any*
use of npm, and gave me all the benefits of reproducibility, Yocto caching,
license management, correct cross-compilation for native code, etc. Also, the
generated Yocto packages contained none of the test or other items that
npm packages usually contain and that don't need to be copied onto the target
for image builds. All this by simply inspecting the package.json.
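The package.json inspection can be sketched as follows; the 'nodejs-' recipe prefix and the normalization rules are assumptions for illustration (the thread notes that scoped and uppercase names need such normalization):

```python
import json
import re

def npm_name_to_recipe(name):
    # Normalize an npm name (scoped, uppercase, exotic characters)
    # into a Yocto-safe package name.
    name = name.replace("@", "").replace("/", "-").lower()
    return re.sub(r"[^a-z0-9.+-]", "-", name)

def rdepends_from_package_json(text):
    # Derive runtime-dependency entries from 'dependencies'.
    meta = json.loads(text)
    return sorted("nodejs-" + npm_name_to_recipe(n)
                  for n in meta.get("dependencies", {}))
```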

This approach also minimised maintenance cost, as all recipes were auto-generated.

The only downside was that bitbake became a bit slow, as the number of
tasks went to about 15000.

I can dig this out and share on Monday. This script could live in the
scripts/ subdirectory, allowing people to create recipes on demand for
projects they care about.


Cheers,
Andre'


On Tue, 2019-10-22 at 11:03 +0200, Jean-Marie LEMETAYER wrote:
> The current NPM support has several issues:
>  - The current NPM fetcher downloads the dependency tree, unlike the other
>    fetchers. The 'subdir' parameter was used to work around this issue.
>  - There are multiple issues with package names (uppercase, exotic characters,
>    scoped packages), even when they only appear in the dependencies.
>  - The lockdown file generation has issues when a package depends on
>    multiple versions of the same package (all versions get the same checksum).
> 
> This patchset refactors the NPM support in Yocto:
>  - As the NPM algorithm for dependency management is hard to handle, the new
>    NPM fetcher downloads only the package source (and not the dependencies),
>    like the other fetchers (patch submitted on the bitbake-devel list).
>  - The NPM class handles the dependencies using NPM (and not manually).
>  - The NPM recipe creation is simplified to avoid issues.
>  - The lockdown file is no longer used, as it is no longer relevant compared
>    to the latest shrinkwrap file format.
> 
> This patchset may remove some features (lockdown file, license management for
> dependencies) but fixes the majority of the NPM issues. All of these issues
> from the bugzilla.yoctoproject.org are resolved by this patchset:
> #10237, #10760, #11028, #11728, #11902, #12534
> 
> The fetcher and recipetool are now aware of a 'latest' keyword for the
> version, which allows building the latest version available on the registry.
> This feature fixes two issues: #10515, #11029
> 
> Moreover, issue #13415 should also be fixed, but I cannot test it.
> 
> I have tested the recipe creation and builds using a self made example to
> generate build failures:
>  - https://github.com/savoirfairelinux/node-server-example
>  - https://npmjs.com/package/@savoirfairelinux/node-server-example
> 
> The test steps are these ones:
>   $ source poky/oe-init-build-env
>   $ bitbake-layers add-layer ../meta-openembedded/meta-oe
>   $ devtool add "npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=latest"
>   $ devtool build savoirfairelinux-node-server-example
>   $ bitbake-layers create-layer ../meta-test
>   $ bitbake-layers add-layer ../meta-test
>   $ devtool finish savoirfairelinux-node-server-example ../meta-test
>   $ echo IMAGE_INSTALL_append = '" savoirfairelinux-node-server-example"' >> conf/local.conf
>   $ bitbake core-image-minimal
> 
> Also the 'devtool add' url could be one of these:
>  - npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=latest
>    This url uses the new npm fetcher and requests the latest version available on
>    the registry.
>  - npm://registry.npmjs.org;name=@savoirfairelinux/node-server-example;version=1.0.0
>    This url uses the new npm fetcher and requests a fixed version.
>  - git://github.com/savoirfairelinux/node-server-example.git;protocol=https
>    This url uses the git fetcher. As the dependencies are managed by the NPM
>    class, any fetcher can be used.
> 
> Once this patchset is merged, I plan to update the NPM wiki:
>   https://wiki.yoctoproject.org/wiki/TipsAndTricks/NPM
> 
> This patchset is also the base work for the full Angular support in Yocto that
> I am preparing. These applications have a huge dependency tree, which is ideal
> for testing the NPM support.
> 
> Jean-Marie LEMETAYER (6):
>   npm.bbclass: refactor the npm class
>   devtool: update command line options for npm
>   recipetool/create_npm.py: refactor the npm recipe creation handler
>   devtool/standard.py: update the append file for the npm recipes
>   recipetool/create.py: replace 'latest' keyword for npm
>   recipetool/create.py: remove the 'noverify' url parameter
> 
>  meta/classes/npm.bbclass             | 210 +++++++++----
>  scripts/lib/devtool/standard.py      |  22 +-
>  scripts/lib/recipetool/create.py     |  16 +-
>  scripts/lib/recipetool/create_npm.py | 444 ++++++++++-----------------
>  4 files changed, 331 insertions(+), 361 deletions(-)
> 
> --
> 2.20.1
> 




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 15:59         ` Richard Purdie
@ 2019-10-25  8:35           ` Stefan Herbrechtsmeier
  2019-10-25 11:08             ` Adrian Bunk
  0 siblings, 1 reply; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-25  8:35 UTC (permalink / raw)
  To: Richard Purdie, Adrian Bunk, Alexander Kanavin
  Cc: brendan.le.foll, Paul Eggleton (paul.eggleton@linux.intel.com),
	rennes, OE-core

Am 24.10.19 um 17:59 schrieb Richard Purdie:
> On Thu, 2019-10-24 at 18:37 +0300, Adrian Bunk wrote:
>> On Thu, Oct 24, 2019 at 02:12:43PM +0200, Alexander Kanavin wrote:
>>> On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
>>> stefan@herbrechtsmeier.net> wrote:
>>>
>>>> @Richard: What is your opinion about the per recipe dependency?
>>>> Typically OE use one recipe per project. The NPM based solution
>>>> handle a
>>>> project and all dependencies via one recipe.
>>>
>>> I don't think it's at all realistic to stick to the 'one recipe per
>>> component' in node.js world. A typical 'npm install' can pull down
>>> hundreds, or over a thousand dependencies, it's not feasible to
>>> have a
>>> recipe for each.
>>
>> Debian has for the perl/python/node/go/rust/haskell ecosystems
>> one recipe per component, with ~ 1k recipes each.
> 
> I think we'll have to end up having a smaller number of recipes which
> generate multiple packages. That gives a reasonable parsing time at the
> expense of having to pre-generate some of the recipe, a bit like the
> core perl and python recipes work today. The exact split will depend on
> the ecosystems and the "blocks" people tend to build in as its a
> compromise between building too much and parsing time.

How should this work? Node.js projects consist of multiple independent
packages and every package defines its own dependencies. The dependencies
are defined with a version range which often references a single version.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-24 17:58                 ` Alexander Kanavin
@ 2019-10-25  8:58                   ` Stefan Herbrechtsmeier
  0 siblings, 0 replies; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-25  8:58 UTC (permalink / raw)
  To: Alexander Kanavin
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core

Am 24.10.19 um 19:58 schrieb Alexander Kanavin:
> On Thu, 24 Oct 2019 at 19:45, Stefan Herbrechtsmeier 
> <stefan@herbrechtsmeier.net <mailto:stefan@herbrechtsmeier.net>> wrote:
> 
>      > The package-lock.json in their tarball is 600K.
> 
>     The project use two major version and seven different versions with 30
>     installations of debug. Furthermore the dependencies include build
>     tools
>     which should not be installed on the device.
> 
>     The "@angular/cli" (242) and "node-red" (324) package share 106
>     packages.
> 
> 
> I have to ask: what point are you trying to make?

The two packages, with different audiences, share between 44% and 33% of
their packages. I think this is true for other packages too.

> Here's a related lwn article describing a similar problem faced by opensuse:
> https://lwn.net/Articles/712318/

Unfortunately they also have no solution.

> ""Ruby dependency hell has nothing on JavaScript dependency hell," he 
> said. A "hello world" application based on one JavaScript framework has 
> 759 JavaScript dependencies; this framework is described as "a 
> lightweight alternative to Angular2".

Unfortunately they don't name the package nor give details about the
dependencies. I assume they really count the dependencies with all their
duplications and multiple minor or patch versions.


> There is no way he is going to 
> package all 759 dependencies for this thing; the current distribution 
> package-management approach just isn't going to work here."

What is the "current distribution package-management approach"? The manual
creation of build configurations for every package, or the automatic
generation of recipes?

What is the alternative which fulfills the requirements of an embedded
distribution?

My prototype automatically creates recipes from the packages. It only
needs manual work if a package has no license file or a circular
dependency.

The main problem is the pinning of specific versions inside the
package.json, and the major version bump when support for a very old
Node.js version is removed.

The prototype assumes compatibility inside a major version and creates a
recipe per major version, always using the latest version within each major
version. The recipe count could be further optimized in a manual step which
checks whether there is really a breaking change between major versions.
The number of these recipes is low compared to the total count of recipes,
but the packages are often shared between different packages.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-25  8:01 ` André Draszik
@ 2019-10-25  9:10   ` Stefan Herbrechtsmeier
  2019-10-29 10:52     ` André Draszik
  0 siblings, 1 reply; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-25  9:10 UTC (permalink / raw)
  To: André Draszik, openembedded-core

Hi Andre,

Am 25.10.19 um 10:01 schrieb André Draszik:
> Hi,
> 
> This has been an interesting discussion so far.
> 
> I'd like to throw in something else...
> 
> A couple years back I wrote a little python script to automatically
> generate all the required dependency recipes given an npm project's
> package.json

This is similar to my prototype, but I try to reuse recipetool and this
makes the recipe creation very slow.

Do you reuse code from OE to generate the license information?

> The generated recipes worked very well, including cross-compilation using
> node-gyp.

Do you use any bbclass? I have created multiple bbclasses to minimize the
code inside the recipes.

> At least at the time this was all reasonably straight forward, avoided *any* use
> of npm, gave me all the benefits of reproducibility, yocto caching, license
> management, correct cross-compilation for native code, etc. Also, the
> generated yocto packages contained none of the test- or other items that
> npm packages usually contain and don't need to be copied onto the target
> for image builds. All this by simply inspecting the package.json

This is the reason I go this way too.

> This approach also minimised maintenance cost, as all recipes were auto-generated.
> 
> The only downside was that bitbake became a bit slow, as the number of
> tasks went to about 15000.

Do you create a recipe per package version?

> I can dig this out and share on Monday. This script could live in the
> scripts/ subdirectory, allowing people to create recipes on demand for
> projects they care about.

Would be nice to see your code.

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-25  8:35           ` Stefan Herbrechtsmeier
@ 2019-10-25 11:08             ` Adrian Bunk
  2019-10-27  9:58               ` Stefan Herbrechtsmeier
  0 siblings, 1 reply; 36+ messages in thread
From: Adrian Bunk @ 2019-10-25 11:08 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core

On Fri, Oct 25, 2019 at 10:35:20AM +0200, Stefan Herbrechtsmeier wrote:
> Am 24.10.19 um 17:59 schrieb Richard Purdie:
> > On Thu, 2019-10-24 at 18:37 +0300, Adrian Bunk wrote:
> > > On Thu, Oct 24, 2019 at 02:12:43PM +0200, Alexander Kanavin wrote:
> > > > On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
> > > > stefan@herbrechtsmeier.net> wrote:
> > > > 
> > > > > @Richard: What is your opinion about the per recipe dependency?
> > > > > Typically OE use one recipe per project. The NPM based solution
> > > > > handle a
> > > > > project and all dependencies via one recipe.
> > > > 
> > > > I don't think it's at all realistic to stick to the 'one recipe per
> > > > component' in node.js world. A typical 'npm install' can pull down
> > > > hundreds, or over a thousand dependencies, it's not feasible to
> > > > have a
> > > > recipe for each.
> > > 
> > > Debian has for the perl/python/node/go/rust/haskell ecosystems
> > > one recipe per component, with ~ 1k recipes each.
> > 
> > I think we'll have to end up having a smaller number of recipes which
> > generate multiple packages. That gives a reasonable parsing time at the
> > expense of having to pre-generate some of the recipe, a bit like the
> > core perl and python recipes work today. The exact split will depend on
> > the ecosystems and the "blocks" people tend to build in as its a
> > compromise between building too much and parsing time.
> 
> How should this work? Node.js consist of multiple independent packages and
> every package define its own dependencies. Thereby the dependencies is
> defined with a version range which often reference a single version.

Yes, it is a problem that backwards compatibility is frequently broken
in Node.js packages, and just using npm might result in 10 different
versions of the same package being installed and used.

This is a huge mess that either has to be sorted out when packaging
(as is done in Debian), or security support becomes basically impossible
for something very exposed on the internet.

> Regards
>   Stefan

cu
Adrian

-- 

       "Is there not promise of rain?" Ling Tan asked suddenly out
        of the darkness. There had been need of rain for many days.
       "Only a promise," Lao Er said.
                                       Pearl S. Buck - Dragon Seed




* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-25 11:08             ` Adrian Bunk
@ 2019-10-27  9:58               ` Stefan Herbrechtsmeier
  0 siblings, 0 replies; 36+ messages in thread
From: Stefan Herbrechtsmeier @ 2019-10-27  9:58 UTC (permalink / raw)
  To: Adrian Bunk
  Cc: Paul Eggleton (paul.eggleton@linux.intel.com), rennes, OE-core

Am 25.10.19 um 13:08 schrieb Adrian Bunk:
> On Fri, Oct 25, 2019 at 10:35:20AM +0200, Stefan Herbrechtsmeier wrote:
>> Am 24.10.19 um 17:59 schrieb Richard Purdie:
>>> On Thu, 2019-10-24 at 18:37 +0300, Adrian Bunk wrote:
>>>> On Thu, Oct 24, 2019 at 02:12:43PM +0200, Alexander Kanavin wrote:
>>>>> On Thu, 24 Oct 2019 at 14:02, Stefan Herbrechtsmeier <
>>>>> stefan@herbrechtsmeier.net> wrote:
>>>>>
>>>>>> @Richard: What is your opinion about the per recipe dependency?
>>>>>> Typically OE use one recipe per project. The NPM based solution
>>>>>> handle a
>>>>>> project and all dependencies via one recipe.
>>>>>
>>>>> I don't think it's at all realistic to stick to the 'one recipe per
>>>>> component' in node.js world. A typical 'npm install' can pull down
>>>>> hundreds, or over a thousand dependencies, it's not feasible to
>>>>> have a
>>>>> recipe for each.
>>>>
>>>> Debian has for the perl/python/node/go/rust/haskell ecosystems
>>>> one recipe per component, with ~ 1k recipes each.
>>>
>>> I think we'll have to end up having a smaller number of recipes which
>>> generate multiple packages. That gives a reasonable parsing time at the
>>> expense of having to pre-generate some of the recipe, a bit like the
>>> core perl and python recipes work today. The exact split will depend on
>>> the ecosystems and the "blocks" people tend to build in as its a
>>> compromise between building too much and parsing time.
>>
>> How should this work? Node.js consist of multiple independent packages and
>> every package define its own dependencies. Thereby the dependencies is
>> defined with a version range which often reference a single version.
> 
> Yes, it is a problem that backwards compatibility is frequently broken
> in Node.js packages and just using npm might result in 10 different
> versions of the same package installed and used.

Therefore the distribution could add a compatibility identifier
(interface version) to the packages and translate the version range of
the dependencies into a compatibility identifier. The default
compatibility identifier is the major version of the package. This
allows the distribution to ship one package per compatibility
identifier. The distribution could provide multiple compatibility
identifiers via a single package, or introduce additional packages /
compatibility identifiers based on the major, minor and maybe patch
version of the package.

This information could and should be shared between different 
distributions to automate the creation of distribution packages from npm 
packages.
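The translation described above can be sketched like this (hedged: it handles only the common caret/tilde/exact range forms and uses the major version as the default identifier, as proposed):

```python
import re

def compat_id(version_range):
    # Map an npm version range to a compatibility identifier; by
    # default this is the major version the range references.
    match = re.match(r"[\^~]?(\d+)\.", version_range)
    if not match:
        raise ValueError("unsupported range: %r" % version_range)
    return int(match.group(1))
```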

> This is a huge mess that either has to be sorted out when packaging
> (as is done in Debian), or you end up with security support basically
> impossible for something very exposed on the internet.

Ack

Regards
   Stefan



* Re: [RFC][PATCH 0/6] NPM refactoring
  2019-10-25  9:10   ` Stefan Herbrechtsmeier
@ 2019-10-29 10:52     ` André Draszik
  0 siblings, 0 replies; 36+ messages in thread
From: André Draszik @ 2019-10-29 10:52 UTC (permalink / raw)
  To: Stefan Herbrechtsmeier, openembedded-core


Hi,

On Fri, 2019-10-25 at 11:10 +0200, Stefan Herbrechtsmeier wrote:
> Hi Andre,
> 
> Am 25.10.19 um 10:01 schrieb André Draszik:
> > Hi,
> > 
> > This has been an interesting discussion so far.
> > 
> > I'd like to throw in something else...
> > 
> > A couple years back I wrote a little python script to automatically
> > generate all the required dependency recipes given an npm project's
> > package.json
> 
> This is similar to my prototype but I try to reuse the recipetool and 
> this makes the recipe creation very slow.

Attached is the latest snapshot of scripts and support classes
that I had used back then...

nodejs.bbclass does the main work of handling the recipes.
It supports usage of the same npm package with different
versions by adding appropriate symlinks in the file system.
Also, it extracts the runtime depends from package.json (and
adds runtime provides) so as to get proper dependency trees.

nodegyp.bbclass is for recipes that use node-gyp (or node-pre-gyp).
This needs node-gyp and gyp (as build tools). I didn't 

create-recipe-from-json.py is the actual script; point it at a
package.json and an optional shrinkwrap. It does the other part of the
deduplication by keeping track of the dependency tree and adding the
appropriate symlink information to the recipes, as understood by
nodejs.bbclass.

At the time I had a problem with runtime depends for native packages;
RDEPENDS for native packages wasn't implemented in OE back then.
I've commented out the "DEPENDS_append_class-native" handling now,
as this should work without tricks these days.

node-pre-gyp support wasn't fully implemented in the create-recipe
script. npm modules using node-gyp or node-pre-gyp did work fine and
can be cross-compiled without problem, though.

Some npm modules require extra handling, e.g. patches to be applied.
This is done around line 205, where an extra 'require' line is added
to the generated recipe. I've also added nodejs-node-gyp.inc as an
example of what that could look like.

I didn't use recipetool, either because I didn't know about it back then,
or it wasn't available at the time.

It's very fast though. Most time is spent downloading the packages for
analysis...


> Do you reuse code from OE to generate the license information?


I had to look it up: this used 'licensee' (https://github.com/licensee/licensee).


> The generated recipes worked very well, including cross-compilation using
> > node-gyp.
> 
> Do you use any bbclass? I have create multiple bbclasses to minimize the 
> code inside the recipes.

All included in this mail.

> 
> > At least at the time this was all reasonably straight forward, avoided *any* use
> > of npm, gave me all the benefits of reproducibility, yocto caching, license
> > management, correct cross-compilation for native code, etc. Also, the
> > generated yocto packages contained none of the test- or other items that
> > npm packages usually contain and don't need to be copied onto the target
> > for image builds. All this by simply inspecting the package.json
> 
> This is the reason I go this way too.
> 
> > This approach also minimised maintenance cost, as all recipes were auto-generated.
> > 
> > The only downside was that bitbake became a bit slow, as the number of
> > tasks went to about 15000.
> 
> Do you create a recipe per package version?

Yes, but it is clever in the sense that deduplication is applied, including looking
for matching version numbers as per the version specification.
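The deduplication idea can be sketched in a few lines (the helper names and
the caret-only matching are my invention here; the attached script's
npm_version_is_compatible() handles the full npm range syntax):

```python
# Sketch (invented helpers): reuse an already-generated recipe version
# when it satisfies the requested npm caret range, instead of emitting
# a duplicate recipe for a compatible version.

def caret_ok(candidate, spec):
    """True if candidate satisfies a ^MAJOR.MINOR.PATCH spec (major > 0 case)."""
    want = [int(x) for x in spec.lstrip('^').split('.')]
    have = [int(x) for x in candidate.split('.')]
    # caret keeps the major version fixed and allows anything >= the spec
    return have[0] == want[0] and have >= want

def dedup(existing, spec):
    """Return the best existing version for spec, or None to create a new recipe."""
    matches = [v for v in existing if caret_ok(v, spec)]
    if not matches:
        return None
    return max(matches, key=lambda v: [int(x) for x in v.split('.')])
```

For example, dedup(["1.2.3", "1.4.0", "2.0.0"], "^1.2.0") picks "1.4.0"
rather than generating another 1.x recipe.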

> 
> > I can dig this out and share on Monday. This script could live in the
> > scripts/ subdirectory, allowing people to create recipes on demand for
> > projects they care about.
> 
> Would be nice to see your code.
> 
> Regards
>    Stefan

Cheers,
Andre'


[-- Attachment #2: gyp.bbclass --]
[-- Type: text/plain, Size: 303 bytes --]

DEPENDS += "gyp-native"

inherit pythonnative

EXTRA_OEMAKE = "\
    CC.host="${BUILD_CC}" \
    CFLAGS.host="${BUILD_CFLAGS}" \
    CXX.host="${BUILD_CXX}" \
    CXXFLAGS.host="${BUILD_CXXFLAGS}" \
    LINK.host="${BUILD_CXX}" \
    LDFLAGS.host="${BUILD_LDFLAGS}" \
    \
    LINK.target="${CXX}" \
"

[-- Attachment #3: nodegyp.bbclass --]
[-- Type: text/plain, Size: 1386 bytes --]

DEPENDS += "nodejs-node-gyp1.0.3-native nodejs-native"

inherit gyp nodejs

export NODE_GYP_NODEJS_INCLUDE_DIR = "${PKG_CONFIG_SYSROOT_DIR}${NODEJS_INCLUDE_DIR}"

nodegyp_do_configure() {
        node-gyp --arch ${TARGET_ARCH} --verbose configure ${EXTRA_OECONF}
}

python __anonymous () {
    # Ensure we run this usually noexec task (due to nodejs.bbclass)
    d.delVarFlag("do_compile", "noexec")
}
nodegyp_do_compile() {
        node-gyp --arch ${TARGET_ARCH} --verbose build ${EXTRA_OEMAKE}
}
nodegyp_shell_do_install_helper() {
    cd ${D}${NODE_MODULE_DIR}
    # (node-)gyp creates multiple copies of the object files, we
    # only keep one!
    # node-expat: build/Release/node_expat.node (patch needed)
    # sqlite3: lib/node_sqlite3.node (no build/ needed!)
    if [ -d build ] ; then
        # dtrace-provider builds nothing for !MacOS
        if ls build/Release/*.node 2> /dev/null > /dev/null ; then
            tmpdir=$(mktemp -d --tmpdir=./)
            mv build/Release/*.node $tmpdir
            rm -rf build
            mkdir -p build/Release
            mv $tmpdir/*.node build/Release
            rm -rf $tmpdir
        else
            rm -rf build
        fi
    fi
}
python nodegyp_do_install() {
    bb.build.exec_func("nodejs_do_install", d)
    bb.build.exec_func("nodegyp_shell_do_install_helper", d)
}

EXPORT_FUNCTIONS do_configure do_compile do_install

[-- Attachment #4: nodejs.bbclass --]
[-- Type: text/plain, Size: 21038 bytes --]

# While nodejs(-native) is just a runtime dependency (RDEPENDS)
# we still need to add it to DEPENDS because:
# - the RDEPENDS is added to the target packages late, after
#   OE has determined which packages it needs to build. Hence
#   it wouldn't build nodejs(-native)
# - RDEPENDS are completely ignored for -native packages, so this
#   is our way to make sure that nodejs-native ends up in the
#   sysroot for native packages
DEPENDS += "nodejs"

NODE_MODULE = "${@('${BPN}'.replace('nodejs-', '', 1) if '${BPN}'.startswith('nodejs-') else '${BPN}').replace('${PV}', '', 1)}"
SRC_URI = "http://registry.npmjs.org/${NODE_MODULE}/-/${NODE_MODULE}-${PV}.tgz;subdir=${BP}"

NODEJS_SITELIB_DIR = "${libdir}/node_modules"
NODEJS_INCLUDE_DIR = "${includedir}/node"

export NODE_LIBDIR = "${PKG_CONFIG_SYSROOT_DIR}${NODEJS_SITELIB_DIR}"
NODE_MODULE_DIR = "${NODEJS_SITELIB_DIR}/${NODE_MODULE}${PV}"

# Tarballs store the contents of the module inside a 'package' directory
# This conflicts with OE's 'package' directory, which is dynamically
# created during the packaging process. So we have to make sure to extract
# the tarball into a different directory.
S = "${WORKDIR}/${BP}/package"

python nodejs_do_unpack() {
    import shutil
    bb.build.exec_func('base_do_unpack', d)
    # some node modules (ejs!) have an unusual directory structure,
    # i.e. they don't use package/ but ${NODE_MODULE}-v${PV}
    for path in [ "${BP}/${NODE_MODULE}-v${PV}", "${BP}/${NODE_MODULE}" ]:
        if os.path.exists(path):
            shutil.rmtree("${S}", True)
            shutil.move(path, "${S}")
            break
    # Remove bundled node_modules if there are any...
    shutil.rmtree("${S}/node_modules", True)
}

# Some node modules have Makefiles which don't build but run tests
# instead. This fails in OE, as Makefiles are expected to build if
# they exist. So let's just prevent automatic building.
do_compile[noexec] = "1"

python nodejs_do_install() {
    import shutil

    # copy everything
    shutil.copytree("${B}", "${D}${NODE_MODULE_DIR}", symlinks=True)

    # remove unneeded dirs
    os.chdir("${D}${NODE_MODULE_DIR}")
    # these are created by OE
    shutil.rmtree(".pc", True)
    shutil.rmtree("patches", True)
    # these are usually not needed
    shutil.rmtree("test", True)
    shutil.rmtree("tests", True)
    shutil.rmtree("example", True)
    shutil.rmtree("examples", True)
}

python nodejs_do_install_append_class-native() {
    import json

    if '${NODE_MODULE}' != 'gulp' and '${NODE_MODULE}' != 'node-gyp':
        return

    os.chdir("${D}${NODE_MODULE_DIR}")

    # install bin files
    bindir_files = {}
    metadata_p = json.load(open('package.json'))
    # which files do we need to install into bin/
    if 'bin' in metadata_p:
        # this can be a dict, or a simple string, https://docs.npmjs.com/files/package.json
        if isinstance(metadata_p['bin'], basestring):
            bindir_files[metadata_p['name']] = os.path.normpath(metadata_p['bin'])
        elif isinstance(metadata_p['bin'], dict):
            for link, target in metadata_p['bin'].items():
                bindir_files[link] = os.path.normpath(target)
        else:
            raise ValueError('Unsupported entry for "bin": ' + str(type(metadata_p['bin'])))
    if bindir_files:
        d_bindir = "${D}${bindir}"
        if not os.path.exists(d_bindir):
            os.mkdir(d_bindir)
        for link, target in bindir_files.items():
            os.chmod(target, 0o755)
            os.symlink(os.path.join("${NODE_MODULE_DIR}", target), os.path.join(d_bindir, link))
}

NODEJS_SKIP_MODULES_SYMLINK ?= "0"
def nodejs_do_install_symlinks2(d, do_devdepends=False):
    """
    Symlink a node module's dependencies into the node_modules directory so node.js
    can find them
    """

    import json
    import os
    import shutil

    # inspired from fedora's nodejs-packaging-fedora-7.tar.xz::nodejs-symlink-deps
    sitelib = d.getVar('NODEJS_SITELIB_DIR', True)

    def symlink(source, dest):
        try:
            os.symlink(source, dest)
        except OSError:
            if os.path.isdir(dest):
                shutil.rmtree(dest)
            else:
                os.unlink(dest)
            os.symlink(source, dest)


    def symlink_deps(deps):
        s_sitelib = os.path.join(d.getVar('STAGING_DIR_NATIVE', True), sitelib.lstrip("/"))

        def symlink_dep_from_rdepends(rdepends, dep):
            target = d.getVarFlag("NODEJS_SYMLINKS", dep)
            if not target:
                # fatal for now...
                bb.fatal("Dependency {0} doesn't exist in NODEJS_SYMLINKS!".format(dep))
            for key in rdepends.keys():
                t = key.replace('nodejs-', '', 1).replace('-native', '', 1)
                if t.startswith(dep + '-'):
                    target = t
                    del rdepends[key]
                    break
            if not target:
                bb.fatal("Symlink to dependency {0} couldn't be resolved!".format(dep))
            if not do_devdepends:
                target = os.path.join("..", "..", target)
            else:
                target = os.path.join(s_sitelib, target)
            symlink(target, dep)

        def sanity_check_symlink(dep):
            # Symlink to dep should have been created due to RDEPENDS_${PN}
            # parsing
            if not os.path.lexists(dep):
                bb.fatal("The package {0} depends on {1}, but the symlink "
                         "wasn't created yet. Error in RDEPENDS_${{PN}}."
                         .format(d.getVar('PN', True), dep))
            if do_devdepends:
                if not os.path.exists(dep):
                    bb.fatal("The package nodejs-{0} has not been "
                             "installed into staging ({1}) even though "
                             "{2} needs it!"
                             .format(dep, os.path.readlink(dep), d.getVar('PN', True)))

        rdepends = {}
        me = d.getVar("PN", True)
        # -native packages have an empty PACKAGES
        #packages = d.getVar("PACKAGES", True).split() or [ d.getVar("PN", True) ]
        # actually, we only want the main package. As that is the place where
        # all dependencies should be stated!
        packages = [ d.getVar("PN", True) ]
        for pkg in packages:
            rdepends.update(bb.utils.explode_dep_versions2(d.getVar('RDEPENDS_' + pkg, True) or ""))
        for key in rdepends.keys():
            if not key.startswith('nodejs-'):
                del rdepends[key]

        if isinstance(deps, dict):
            for dep, ver in deps.iteritems():
                symlink_dep_from_rdepends(rdepends, dep)
            for dep, ver in deps.iteritems():
                sanity_check_symlink(dep)
        elif isinstance(deps, list):
            for dep in deps:
                symlink_dep_from_rdepends(rdepends, dep)
            for dep in deps:
                sanity_check_symlink(dep)
        elif isinstance(deps, basestring):
            symlink_dep_from_rdepends(rdepends, deps)
            sanity_check_symlink(deps)
        else:
            bb.fatal("Invalid package.json while building {0}: dependencies not a valid type".format(d.getVar('PN', True)))


    if not do_devdepends:
        d_sitelib = os.path.join(d.getVar('D', True), sitelib.lstrip('/'))
        modules = [os.path.join(d_sitelib, module) for module in os.listdir(d_sitelib)]
    else:
        modules = [d.getVar('S', True)]

    for path in modules:
        os.chdir(path)
        metadata = json.load(open('package.json'))

        # Normally, we would want to error out if the install process created this
        # directory, but it could also still be around from a previous run, so let's
        # just delete it.
        shutil.rmtree('node_modules', True)
        if (not do_devdepends and ('dependencies' in metadata or 'optionalDependencies' in metadata)) or (do_devdepends and 'devDependencies' in metadata):
            os.mkdir('node_modules')
            oldwd = os.getcwd()
            os.chdir('node_modules')

            if not do_devdepends:
                if 'dependencies' in metadata:
                    symlink_deps(metadata['dependencies'])
                if 'optionalDependencies' in metadata:
                    symlink_deps(metadata['optionalDependencies'])
            else:
                if 'devDependencies' in metadata:
                    symlink_deps(metadata['devDependencies'])

            os.chdir(oldwd)
            if not os.listdir('node_modules'):
                os.rmdir('node_modules')


python nodejs_do_install_symlinks() {
    if d.getVar("NODEJS_SKIP_MODULES_SYMLINK", True) == "0":
        nodejs_do_install_symlinks2(d, False)
}
nodejs_do_install_symlinks[vardeps] += "NODEJS_SYMLINKS NODEJS_SKIP_MODULES_SYMLINK NODEJS_SITELIB_DIR"
addtask install_symlinks after do_install before do_populate_sysroot do_package

python nodejs_do_link_devdepends() {
    nodejs_do_install_symlinks2(d, True)
}
do_link_devdepends[noexec] = "1"
do_link_devdepends[deptask] = "do_populate_sysroot"
nodejs_do_link_devdepends[vardeps] += "NODEJS_SYMLINKS NODEJS_SITELIB_DIR"
addtask link_devdepends after do_patch before do_compile

python nodejs_extract_runtime_provides() {
    """
    Extract RPROVIDES from package.json.

    See 'man npm-json' for details.
    """

    import json
    import os

    # inspired from package.bbclass
    sitelib = d.getVar('NODEJS_SITELIB_DIR', True)
    pkgdest = d.getVar('PKGDEST', True)

    packages = d.getVar("PACKAGES").split()
    for pkg in packages:

        # it's highly unlikely that we'll ever get more than one file but we
        # handle this in a generic way nevertheless
        files = []

        p_sitelib = os.path.join(pkgdest, pkg, sitelib.lstrip("/"))

        for file in pkgfiles[pkg]:
            # only process files in the sitelib, and ignore files local to the
            # current package
            if file.find(p_sitelib) == 0:
                if os.path.basename(file) == 'package.json':
                    files.append(file)

        for file in files:
            fh = open(file)
            metadata = json.load(fh)
            fh.close()

            # inspired from fedora's nodejs-packaging-fedora-7.tar.xz::nodejs.prov
            if 'name' in metadata and not ('private' in metadata and metadata['private']):
                pname = 'virtual-npm-' + metadata['name']

                ver = []
                # bitbake does not understand versioned RPROVIDES, see e.g. here
                # http://lists.openembedded.org/pipermail/openembedded-commits/2014-July/162238.html
                # (pkgconfig: Drop version from RPROVIDES)
                #if 'version' in metadata:
                #    # metadata is unicode, but OE expects string in these vars
                #    # due to package.bbclass::write_if_exists()
                #    ver = metadata['version'].encode('utf8')

                # metadata is unicode, but OE expects string in these vars
                # due to package.bbclass::write_if_exists()
                pname = pname.encode('utf8')

                # normally, what we have added as default via RPROVIDES_${PN} = ... at
                # the end of the file should match NODE_MODULE, so normally, there is
                # nothing to do here. Just in case, we have some code to be failsafe,
                # though.
                rprovides = bb.utils.explode_dep_versions2(d.getVar('RPROVIDES_' + pkg, True) or "")
                if pname not in rprovides:
                    rprovides[pname] = ver
                    d.setVar('RPROVIDES_' + pkg, bb.utils.join_deps(rprovides, commasep=False))
}


python nodejs_extract_runtime_depends() {
    """
    Extract RDEPENDS from package.json.

    See `man npm-json` for details.
    """

    import json
    import os

    # inspired from fedora's nodejs-packaging-fedora-7.tar.xz::nodejs.req
    # write out the node.js interpreter dependency
    def convert_dep(req, operator, version):
        """Converts one of the two possibly listed versions into an OE dependency"""

        deps = []

        if not version or version == '*':
            # any version will do
            deps.append(req)
        elif operator in ['>', '<', '<=', '>=', '=']:
            # any prefix other than ~ and ^ makes things dead simple
            deps.append(' '.join([req, '(' + operator, version + ')']))
        else:
            # here be dragons...
            # split the dotted portions into a list (handling trailing dots properly)
            parts = [part if part else 'x' for part in version.split('.')]
            parts = [int(part) if part != 'x' and '-' not in part
                     else part for part in parts]

            if len(parts) == 1 or parts[1] == 'x':
                # 1 or 1.x or 1.x.x or ~1 or ^1
                if parts[0] != 0:
                    deps.append('{0} (>= {1})'.format(req, parts[0]))
                deps.append('{0} (< {1})'.format(req, parts[0]+1))
            elif len(parts) == 3 or operator != '~':
                # 1.2.3 or 1.2.3-4 or 1.2.x or ~1.2.3 or ^1.2.3 or 1.2
                if len(parts) == 2 or parts[2] == 'x':
                    # 1.2.x or 1.2
                    deps.append('{0} (>= {1}.{2})'.format(req, parts[0], parts[1]))
                    deps.append('{0} (< {1}.{2})'.format(req, parts[0], parts[1]+1))
                elif operator == '~' or (operator == '^' and parts[0] == 0 and parts[1] > 0):
                    # ~1.2.3 or ^0.1.2 (zero is special with the caret operator)
                    deps.append('{0} (>= {1})'.format(req, version))
                    deps.append('{0} (< {1}.{2})'.format(req, parts[0], parts[1]+1))
                elif operator == '^' and parts[0:2] != [0, 0]:
                    # ^1.2.3
                    deps.append('{0} (>= {1})'.format(req, version))
                    deps.append('{0} (< {1})'.format(req, parts[0]+1))
                else:
                    # 1.2.3 or 1.2.3-4 or ^0.0.3
                    # This should normally be:
                    # deps.append('{0} (= {1})'.format(req, version))
                    # but we add PR and assume that it will always be r0 for
                    # all recipes. Otherwise dependencies can not be resolved
                    # as the package version would not match (due to the
                    # -${PR} suffix)
                    deps.append('{0} (= {1}-r0)'.format(req, version))
            elif operator == '~':
                # ~1.2
                deps.append('{0} (>= {1})'.format(req, version))
                deps.append('{0} (< {1})'.format(req, parts[0]+1))
            elif operator == '^':
                # ^1.2
                deps.append('{0} (>= {1})'.format(req, version))
                deps.append('{0} (< {1})'.format(req, parts[0]+1))

        return deps

    def process_dep(req, version):
        """Converts an individual npm dependency into OE dependencies"""

        import re

        deps = []

        # there's no way we can do anything like an OR dependency
        if '||' in version:
            bb.warn("The {0} package has an OR (||) dependency on {1}: {2}\n".format(d.getVar('PN', True), req, version) +
                    "Please manually include a versioned RDEPENDS_$PN = \"{0} (<version>)\" in the ".format(req) +
                    "{0} recipe if necessary".format(d.getVar('FILE', True)))
            deps.append(req)
        elif ' - ' in version:
            gt, lt = version.split(' - ')
            deps.append(req + ' (>= ' + gt + ')')
            deps.append(req + ' (<= ' + lt + ')')
        else:
            RE_VERSION = re.compile(r'\s*v?([<>=~^]{0,2})\s*([0-9][0-9\.\-]*)\s*')
            m = re.match(RE_VERSION, version)
            if m:
                deps += convert_dep(req, m.group(1), m.group(2))

                # There could be up to two versions here (e.g.">1.0 <3.1")
                if len(version) > m.end():
                    m = re.match(RE_VERSION, version[m.end():])
                    if m:
                        deps += convert_dep(req, m.group(1), m.group(2))
            else:
                deps.append(req)

        return deps

    sitelib = d.getVar('NODEJS_SITELIB_DIR', True)
    pkgdest = d.getVar('PKGDEST', True)

    # inspired from package.bbclass
    packages = d.getVar("PACKAGES").split()
    for pkg in packages:
        deps = []
        recommends = []

        # it's highly unlikely that we'll ever get more than one file but we
        # handle this in a generic way nevertheless
        files = []

        p_sitelib = os.path.join(pkgdest, pkg, sitelib.lstrip("/"))

        for file in pkgfiles[pkg]:
            # only process files in the sitelib, and ignore files local to the
            # current package
            if file.find(p_sitelib) == 0:
                if os.path.basename(file) == 'package.json':
                    files.append(file)

        for file in files:
            fh = open(file)
            metadata = json.load(fh)
            fh.close()

            req = 'nodejs'
            if 'engines' in metadata and isinstance(metadata['engines'], dict) \
                                                and 'node' in metadata['engines']:
                deps += process_dep(req, metadata['engines']['node'])
            else:
                deps.append(req)

            if 'dependencies' in metadata:
                if isinstance(metadata['dependencies'], dict):
                    for name, version in metadata['dependencies'].iteritems():
                        req = 'virtual-npm-' + name
                        deps += process_dep(req, version)
                elif isinstance(metadata['dependencies'], list):
                    for name in metadata['dependencies']:
                        req = 'virtual-npm-' + name
                        deps.append(req)
                elif isinstance(metadata['dependencies'], basestring):
                    req = 'virtual-npm-' + metadata['dependencies']
                    deps.append(req)
                else:
                    raise TypeError('invalid package.json: dependencies not a valid type')

            if 'optionalDependencies' in metadata:
                if isinstance(metadata['optionalDependencies'], dict):
                    for name, version in metadata['optionalDependencies'].iteritems():
                        req = 'virtual-npm-' + name
                        recommends += process_dep(req, version)
                elif isinstance(metadata['optionalDependencies'], list):
                    for name in metadata['optionalDependencies']:
                        req = 'virtual-npm-' + name
                        recommends.append(req)
                elif isinstance(metadata['optionalDependencies'], basestring):
                    req = 'virtual-npm-' + metadata['optionalDependencies']
                    recommends.append(req)
                else:
                    raise TypeError('invalid package.json: optionalDependencies not a valid type')

        rdepends = bb.utils.explode_dep_versions2(d.getVar('RDEPENDS_' + pkg, True) or "")
        # remove all nodejs-xxx rdependencies originating in the recipe itself
        # they are not useful, as we only want RDEPENDS on the virtual-npm-xxx
        # packages, as that allows us to have multiple versions installed in
        # one image.
        # Maybe we should switch all recipes to using DEPENDS to avoid the need
        # for this?
        for key in rdepends.keys():
            if key.startswith('nodejs-'):
                del rdepends[key]
        # add new rdependencies which we just figured out
        for dep in deps:
            # metadata is unicode, but OE expects string in these vars
            # due to package.bbclass::write_if_exists()
            dep = dep.encode('utf8')
            if dep not in rdepends:
                rdepends[dep] = []
        d.setVar('RDEPENDS_' + pkg, bb.utils.join_deps(rdepends, commasep=False))

        rrecommends = bb.utils.explode_dep_versions2(d.getVar('RRECOMMENDS_' + pkg, True) or "")
        # add new rrecommends which we just figured out
        for recommend in recommends:
            # metadata is unicode, but OE expects string in these vars
            # due to package.bbclass::write_if_exists()
            recommend = recommend.encode('utf8')
            if recommend not in rrecommends:
                rrecommends[recommend] = []
        d.setVar('RRECOMMENDS_' + pkg, bb.utils.join_deps(rrecommends, commasep=False))
}

PACKAGEFUNCS =+ "nodejs_extract_runtime_provides nodejs_extract_runtime_depends"

EXPORT_FUNCTIONS do_unpack do_install do_install_symlinks do_link_devdepends

FILES_${PN} += "${NODE_MODULE_DIR}"
DOTDEBUG-dbg += "${NODE_MODULE_DIR}/*/*/.debug"
DOTDEBUG-dbg += "${NODE_MODULE_DIR}/*/.debug"
DOTDEBUG-dbg += "${NODE_MODULE_DIR}/.debug"

RPROVIDES_${PN} = "virtual-npm-${NODE_MODULE}"

[-- Attachment #5: create-recipe-from-json.py --]
[-- Type: text/x-python3, Size: 28922 bytes --]

#!/usr/bin/env python3

# TODO:
# - check for node-pre-gyp usage

import argparse
import os
import json
import urllib.request
import io
import tarfile
import hashlib
from distutils.version import LooseVersion
import shutil
import tempfile
import subprocess


global args


def required_length(nmin,nmax):
    class RequiredLength(argparse.Action):
        def __call__(self, parser, args, values, option_string=None):
            if not nmin<=len(values)<=nmax:
                msg='argument "{f}" requires between {nmin} and {nmax} arguments'.format(
                    f=self.dest,nmin=nmin,nmax=nmax)
                raise argparse.ArgumentTypeError(msg)
            setattr(args, self.dest, values)
    return RequiredLength


class Package:
    """ Package class """
    def __init__(self, name, version):
        self.name = name
        self.set_version(version)
        self.summary = None
        self.homepage = None
        self.license = None
        self.license_file = None
        self.license_file_md5sum = None
        self.md5sum = None
        self.sha256sum = None
        self.using_gyp = False
        self.deps = []
        self.details_skipped = False
        self.parent = None
        self.metadata_s = None
        self.is_devdepend = False
        self.is_circular = False

    def to_JSON(self):
        def members_to_dump(obj):
            # prevent endless recursion by avoiding the 'parent' member
            return {attr: value for attr, value in obj.__dict__.items()
                    if not attr.startswith('__') and attr != 'parent' and attr != 'metadata_s'}

        return json.dumps(self, default=members_to_dump, sort_keys=False, indent=4, separators=(',', ': '))

    def debugprint(self, indent=''):
        extra = ''
        if self.using_gyp:
            extra = extra + ' gyp'
        if self.details_skipped:
            extra = extra + ' skipped'
        if self.is_circular:
            extra = extra + ' circular'
        if extra:
            extra = ' (' + extra.lstrip() + ')'
        print('{0}{1} {2}{3}'.format(indent, self.name, self.version, extra))

        for p in self.deps:
            p.debugprint(indent + '  ')

    def set_version(self, version):
        if isinstance(version, str):
            self.version = version
        else:
            raise TypeError('Unsupported version: ' + str(version))
        self.src_uri = 'https://registry.npmjs.org/{0}/-/{0}-{1}.tgz'.format(self.name, self.version)

    def update_version_and_uri_from_shrinkwrap(self, value):
        # order is important here, set_version() will set a
        # default URL, but 'resolved' could point to somewhere
        # else.
        if 'version' in value:
            self.set_version(value['version'])
        if 'resolved' in value:
            self.src_uri = value['resolved']

    def get_latest_compatible(self):
        # get latest compatible version based on self.version
        import re

        print('get_latest_compatible: ' + self.name + ' before: self.version is ' + self.version)
        try:
            info = urllib.request.urlopen('file:///tmp/npm-cache/modinfo/{0}'.format(self.name)).read()
        except urllib.error.URLError:
            if not os.path.exists('/tmp/npm-cache/modinfo'):
                os.makedirs('/tmp/npm-cache/modinfo')
            # use real URL and save for later re-use
            info = urllib.request.urlopen('https://registry.npmjs.org/{0}'.format(self.name)).read()
            with open('/tmp/npm-cache/modinfo/{0}'.format(self.name), 'w+b') as f:
                f.write(info)
        pinfo = json.loads(info.decode('utf-8'))

        best = None

        # check if the latest is compatible
        latest = pinfo['dist-tags']['latest']
        if npm_version_is_compatible(latest, self.version, self.name) == True:
            best = latest

        if not best:
            for k, v in pinfo['versions'].items():
                if npm_version_is_compatible(k, self.version, self.name) == True:
                    print('  ' + k + ' is compatible! ')
                    if not best:
                        print('No best yet, using ' + k)
                        best = k
                    elif LooseVersion(k) > LooseVersion(best):
                        print('Updating best to ' + k)
                        best = k
                    else:
                        print('leaving best untouched (' + best + ')')

        if not best:
            parent = self.parent
            parent.add_dep(self)
            while parent.parent:
                parent = parent.parent
            parent.debugprint()
            raise ValueError('No best version found for package {0}\n{1}'.format(self.name, parent.to_JSON()))

        print('best is ' + best)
        self.set_version(best)
        self.src_uri = pinfo['versions'][self.version]['dist']['tarball']
        print(self.name + ' after: self.version is ' + self.version + ' and URL is ' + self.src_uri)

    def sanitise_homepage(self):
        if self.homepage:
            if self.homepage.startswith('git@github.com:'):
                self.homepage = self.homepage.replace('git@github.com:', 'https://github.com/', 1)
            elif self.homepage.startswith('git://github.com'):
                self.homepage = self.homepage.replace('git://github.com', 'https://github.com', 1)
            elif not self.homepage.startswith(('git://', 'https://', 'http://')):
                self.homepage = 'https://github.com/' + self.homepage

    def add_dep(self, dep):
        self.deps.append(dep)

    def write_bb_recipe(self, outdir):
        import re

        if self.details_skipped:
            return
        depends = ''
        rdepends = ''
        depends_class_native = ''
        symlinks = ''
        for p in sorted(self.deps, key=lambda pkg: pkg.name):
            newdep = ' nodejs-{0}{1}'.format(p.name.replace('_', '-'), p.version)
            if p.is_devdepend:
                depends = depends + newdep + '-native'
            else:
                if not p.details_skipped:
                    # prevent recursive depends inside DEPENDS, i.e. only add if not
                    # existing already
                    depends_class_native = depends_class_native + newdep
                else:
                    # for testing, add stuff back in
                    depends_class_native = depends_class_native + newdep
                rdepends = rdepends + newdep
            symlinks = symlinks + '\nNODEJS_SYMLINKS[{0}] = "{0}{1}"'.format(p.name, p.version)
        if depends:
            depends = '\nDEPENDS = "{0}"\n'.format(depends.lstrip())
        if rdepends:
            if rdepends == depends_class_native:
                # in this case, RDEPENDS will automatically be translated to append -native
                # by bitbake ...
                depends_class_native = ' ${RDEPENDS_${PN}}'
            else:
                # ... whereas here we need to do this manually
                depends_class_native = re.sub(r'(\S+)', r'\1-native', depends_class_native)
            #if depends_class_native:
            #    depends_class_native = 'DEPENDS_append_class-native = "{0}"\n'.format(depends_class_native)
            rdepends = '\nRDEPENDS_${{PN}} = "{0}"\n{1}'.format(rdepends.lstrip(), depends_class_native)
        if symlinks:
            symlinks = symlinks + '\n'

        lic_files_chksum = ''
        if self.license_file:
            lic_files_chksum = 'LIC_FILES_CHKSUM = "file://{0};md5={1}"\n'.format(self.license_file, self.license_file_md5sum)

        arch = 'nodegyp' if self.using_gyp else 'allarch'

        pn = self.name.replace('_', '-')
        module_name = ''
        if pn != self.name:
            module_name = 'NODE_MODULE = "{0}"\n'.format(self.name)

        # FIXME: some packages need patches etc., so we hard-code that here and expect an
        # .inc file to be available with the necessary extra bits.
        require = ''
        if self.name == 'node-gyp':
            require = '\nrequire nodejs-{0}.inc\n'.format(self.name)

        srcuri = ''
        if self.src_uri != 'https://registry.npmjs.org/{0}/-/{0}-{1}.tgz'.format(self.name, self.version) and \
           self.src_uri != 'http://registry.npmjs.org/{0}/-/{0}-{1}.tgz'.format(self.name, self.version):
            # esprima-fb has a different filename scheme. Let's be generic, though
            srcuri = '\nSRC_URI = "{0};subdir=${{BP}}"\n'.format(self.src_uri)

        sfx = 0
        fname = '{0}/nodejs-{1}{2}_{2}.bb'.format(outdir, pn, self.version)
        while os.path.exists(fname):
            sfx = sfx + 1
            fname = '{0}/nodejs-{1}{2}_{2}.bb.{3}'.format(outdir, pn, self.version, sfx)
        with open(fname, 'wt', encoding='utf-8') as f:
            f.write('''SUMMARY = "{summary}"
HOMEPAGE = "{hp}"
LICENSE = "{lic}"
{licsum}{depends}
SRC_URI[md5sum] = "{md5}"
SRC_URI[sha256sum] = "{sha256}"

inherit nodejs {arch}
{modname}{require}{srcuri}{rdepends}{symlinks}
BBCLASSEXTEND = "native"
'''.format(summary=self.summary, hp=self.homepage,
           lic=self.license if self.license else 'CLOSED',
           licsum=lic_files_chksum, depends=depends,
           md5=self.md5sum if self.md5sum else 'FIXME',
           sha256=self.sha256sum if self.sha256sum else 'FIXME',
           arch=arch, modname=module_name, require=require,
           srcuri=srcuri, rdepends=rdepends, symlinks=symlinks))
        for p in self.deps:
            p.write_bb_recipe(outdir)


def hashfile(afile, hasher, blocksize=65536):
    afile.seek(0)
    buf = afile.read(blocksize)
    while len(buf) > 0:
        hasher.update(buf)
        buf = afile.read(blocksize)
    afile.seek(0)
    return hasher.hexdigest()
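
# The seek(0) calls above are what let the caller run the same in-memory
# tarball through md5 and then sha256. A minimal self-contained sketch of
# that usage (the buffer contents are made up for illustration):
#
# ```python
# import hashlib
# import io
#
# def hashfile(afile, hasher, blocksize=65536):
#     # hash a seekable file object in chunks, rewinding before and after
#     # so the same buffer can be hashed again with another algorithm
#     afile.seek(0)
#     for chunk in iter(lambda: afile.read(blocksize), b''):
#         hasher.update(chunk)
#     afile.seek(0)
#     return hasher.hexdigest()
#
# fo = io.BytesIO(b'example tarball contents')
# md5sum = hashfile(fo, hashlib.md5())
# sha256sum = hashfile(fo, hashlib.sha256())
# ```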


def npm_version_is_compatible(version, req_version_spec, pname):
    import re

    print(pname + ': req: ' + req_version_spec + ' against ' + version)

    def compare_dep(version, operator, req_version_spec):
        print('  {0} {1}{2}?'.format(version, operator, req_version_spec))
        if not req_version_spec or req_version_spec == '*':
            # any version will do
            print('    yes (any)')
            return True
        elif operator in ['>', '<', '<=', '>=', '=']:
            # any prefix other than ~ and ^ makes things dead simple
            if operator == '>'    and LooseVersion(version) >  LooseVersion(req_version_spec):
                print('    yes (>)')
                return True
            elif operator == '<'  and LooseVersion(version) <  LooseVersion(req_version_spec):
                print('    yes (<)')
                return True
            elif operator == '<=' and LooseVersion(version) <= LooseVersion(req_version_spec):
                print('    yes (<=)')
                return True
            elif operator == '>=' and LooseVersion(version) >= LooseVersion(req_version_spec):
                print('    yes (>=)')
                return True
            elif operator == '='  and LooseVersion(version) == LooseVersion(req_version_spec):
                print('    yes (=)')
                return True
        else:
            # here be dragons...
            # split the dotted portions into a list (handling trailing dots properly)
            parts = [part if part else 'x' for part in req_version_spec.split('.')]
            parts = [int(part) if part != 'x' and '-' not in part else part
                     for part in parts]

            if len(parts) == 1 or parts[1] == 'x':
                # 1 or 1.x or 1.x.x or ~1 or ^1
                if LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)) and (parts[0] == 0 or LooseVersion(version) >= LooseVersion('{0}'.format(parts[0]))):
                    print('    yes (1 or 1.x or 1.x.x or ~1 or ^1)')
                    return True
            elif len(parts) == 3 or operator != '~':
                # 1.2.3 or 1.2.3-4 or 1.2.x or ~1.2.3 or ^1.2.3 or 1.2
                if len(parts) == 2 or parts[2] == 'x':
                    # 1.2.x or 1.2
                    if LooseVersion(version) >= LooseVersion('{0}.{1}'.format(parts[0], parts[1])) and LooseVersion(version) < LooseVersion('{0}.{1}'.format(parts[0], parts[1]+1)):
                        print('    yes (1.2.x or 1.2)')
                        return True
                elif operator == '~' or (operator == '^' and parts[0] == 0 and parts[1] > 0):
                    # ~1.2.3 or ^0.1.2 (zero is special with the caret operator)
                    if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}.{1}'.format(parts[0], parts[1]+1)):
                        print('    yes (~1.2.3 or ^0.1.2)')
                        return True
                elif operator == '^' and parts[0:2] != [0, 0]:
                    # ^1.2.3
                    if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)):
                        print('    yes (^1.2.3)')
                        return True
                else:
                    # 1.2.3 or 1.2.3-4 or ^0.0.3
                    if LooseVersion(version) == LooseVersion(req_version_spec):
                        print('    yes (1.2.3 or ^0.0.3)')
                        return True
            elif operator == '~':
                # ~1.2
                if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)):
                    print('    yes (~1.2)')
                    return True
            elif operator == '^':
                # ^1.2
                if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)):
                    print('    yes (^1.2)')
                    return True
        print('    no')
        return False

    # if there is an OR (||) dependency, we have to iterate through all of
    # them. Let's start from the end, assuming the most recent version is
    # specified at the end.
    for req_version in req_version_spec.split('||')[::-1]:
        req_version = req_version.strip()
        if ' - ' in req_version:
            gt, lt = req_version.split(' - ')
            if LooseVersion(version) >= LooseVersion(gt) and LooseVersion(version) <= LooseVersion(lt):
                return True
        else:
            RE_VERSION = re.compile(r'\s*v?([<>=~^]{0,2})\s*([0-9][0-9\.\-]*)\s*')
            if pname == 'esprima-fb':
                # astw 1.1.0 depends on esprima-fb 3001.1.0-dev-harmony-fb
                # not sure if we should allow characters in version numbers in general
                RE_VERSION = re.compile(r'\s*v?([<>=~^]{0,2})\s*([0-9][0-9a-z\.\-]*)\s*')
            m = re.match(RE_VERSION, req_version)
            if m:
                maybe_ok = compare_dep(version, m.group(1), m.group(2))
                # There could be up to two versions here (e.g. ">1.0 <3.1")
                if maybe_ok and len(req_version) > m.end():
                    m = re.match(RE_VERSION, req_version[m.end():])
                    if m:
                        print('  {0} so far: {1}'.format(pname, maybe_ok))
                        maybe_ok = compare_dep(version, m.group(1), m.group(2))
                if maybe_ok:
                    return maybe_ok
            else:
                return True
    return False
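
# For reference, the caret rule that compare_dep() special-cases for a zero
# major can be stated compactly. This is an illustrative standalone sketch,
# not the script's own code, and it only handles plain x.y.z specs:
#
# ```python
# def caret_upper_bound(spec):
#     # exclusive upper bound for ^spec: bump the leftmost non-zero component,
#     # since npm treats 0.x minors (and 0.0.x patches) as breaking
#     major, minor, patch = (int(x) for x in spec.split('.'))
#     if major > 0:
#         return (major + 1, 0, 0)
#     if minor > 0:
#         return (0, minor + 1, 0)
#     return (0, 0, patch + 1)
#
# def caret_ok(candidate, spec):
#     c = tuple(int(x) for x in candidate.split('.'))
#     s = tuple(int(x) for x in spec.split('.'))
#     return s <= c < caret_upper_bound(spec)
# ```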


def process_pkg(p, metadata_p, do_devdepends=False, do_devdepends_2nd=False):
    def get_license(p, metadata_p, key):
        if key in metadata_p:
            if isinstance(metadata_p[key], list):
                if isinstance(metadata_p[key][0], dict):
                    p.license = metadata_p[key][0]['type']
                    return True
                # FIXME?
                p.license = metadata_p[key][0]
                return True
            elif isinstance(metadata_p[key], dict):
                p.license = metadata_p[key]['type']
                return True
            elif isinstance(metadata_p[key], str):
                # FIXME?
                p.license = metadata_p[key]
                return True
        return False


    def safemembers(members):
        resolved_path = lambda x: os.path.realpath(os.path.abspath(x))

        def badpath(path, base):
            # joinpath will ignore base if path is absolute
            return not resolved_path(os.path.join(base, path)).startswith(base)

        def badlink(info, base):
            # Links are interpreted relative to the directory containing the link
            tip = resolved_path(os.path.join(base, os.path.dirname(info.name)))
            return badpath(info.linkname, base=tip)

        base = resolved_path('.')

        for finfo in members:
            if badpath(finfo.name, base):
                print('{0} is blocked: illegal path'.format(finfo.name))
            elif finfo.issym() and badlink(finfo, base):
                print('{0} is blocked: symlink to {1}'.format(finfo.name, finfo.linkname))
            elif finfo.islnk() and badlink(finfo, base):
                print('{0} is blocked: hardlink to {1}'.format(finfo.name, finfo.linkname))
            else:
                yield finfo


    def package_already_available(pkg):
        parent = pkg.parent
        while parent:
            #print('  -> checking parent {0} deps:'.format(parent.name))
            for parent_dep in parent.deps:
                #print('    -> checking parent {0} dep: {1} {2}'.format(parent.name, parent_dep.name, parent_dep.version))
                if not parent_dep.details_skipped and parent_dep.name == pkg.name and npm_version_is_compatible(parent_dep.version, pkg.version, pkg.name):
                    print('      -> found as {0}'.format(parent_dep.version))
                    return parent_dep.version
            parent = parent.parent
        return None


    def have_circular(pkg):
        parent = pkg.parent
        while parent:
            if parent.name == pkg.name:
                print('      -> circular found as {0}'.format(parent.version))
                return True
            parent = parent.parent
        return False


    p.summary = metadata_p['description'] if 'description' in metadata_p else None
    if 'repository' in metadata_p:
        repo = metadata_p['repository']
        if isinstance(repo, dict):
            p.homepage = repo['url']
        elif isinstance(repo, str):
            p.homepage = repo
        else:
            raise TypeError('Unsupported "repository" scheme: {0}'.format(metadata_p['repository']))

    metadata_keys = ['dependencies', 'optionalDependencies']
    if do_devdepends:
        metadata_keys.append('devDependencies')
    for metadata_key in metadata_keys:
        if metadata_key in metadata_p:
            is_devdepends = (metadata_key == 'devDependencies')
            metadata_depends = {}
            if isinstance(metadata_p[metadata_key], list):
                # this sucks... Let's convert the deps list to
                # a dict with any version
                print('converting list to dict!')
                metadata_depends = { name:'*' for name in metadata_p[metadata_key] }
            else:
                metadata_depends = metadata_p[metadata_key]
            for name, value in metadata_depends.items():
                dep_p = Package(name, value)
                dep_p.parent = p
                dep_p.is_devdepend = is_devdepends

                if name == 'node-pre-gyp':
                    raise ValueError('node-pre-gyp is required but sucks, hence unsupported at the moment')

                if have_circular(dep_p):
                    dep_p.is_circular = True

                value = None
                if p.metadata_s:
                    if 'dependencies' in p.metadata_s:
                        for n, v in p.metadata_s['dependencies'].items():
                            if n == dep_p.name:
                                dep_p.update_version_and_uri_from_shrinkwrap(v)
                                dep_p.metadata_s = v
                                break
                if not p.metadata_s or not dep_p.metadata_s:
                    if p.metadata_s:
                        # We have a shrinkwrap.json, but no entry for this
                        # particular package. NPM has probably decided that
                        # it doesn't need this particular dependency, as it
                        # has been satisfied higher up the tree already.
                        print('No info in shrinkwrap about {0} {1}, should exist higher up the tree'.format(dep_p.name, dep_p.version))
                    else:
                        # no shrinkwrap - we have to figure out the latest
                        # compatible version and URL for it
                        # firstly, we'll see if it exists up the tree already, though
                        print('no shrinkwrap, check if {0} {1} exists higher up'.format(dep_p.name, dep_p.version))
                    available_version = package_already_available(dep_p)
                    if available_version is not None:
                        dep_p.details_skipped = True
                        dep_p.version = available_version
                        print('  -> yes as {0}'.format(dep_p.version))
                    elif not p.metadata_s or dep_p.is_devdepend:
                        # We have no shrinkwrap.json, or
                        # we have a shrinkwrap.json, but no entry for this
                        # particular package. We also couldn't find this
                        # particular package higher up in the tree. Since
                        # it's a devdepend, things are OK, as a shrinkwrap does
                        # not necessarily contain devdepends (--save-dev).
                        # We now have to figure out the latest compatible
                        # version and URL for it
                        print('  -> no, should update URI')
                        dep_p.get_latest_compatible()
                    else:
                        # Something must be wrong
                        raise ValueError('{0} {1} does not exist higher up the tree even though shrinkwrap suggests it'.format(dep_p.name, dep_p.version))
                p.add_dep(dep_p)

    for dep_p in p.deps:
        if dep_p.details_skipped:
            print('Skipping parsing of depends of {0} {1} as it exists higher up the tree'.format(dep_p.name, dep_p.version))
            continue

        # download
        print('Get ' + dep_p.src_uri)

        # try from local cache first
        try:
            fo = io.BytesIO(urllib.request.urlopen('file:///tmp/npm-cache/{0}-{1}.tgz'.format(dep_p.name, dep_p.version)).read())
        except urllib.error.URLError:
            if not os.path.exists('/tmp/npm-cache'):
                os.mkdir('/tmp/npm-cache')
            # use real URL and save for later re-use
            fo = io.BytesIO(urllib.request.urlopen(dep_p.src_uri).read())
            with open('/tmp/npm-cache/{0}-{1}.tgz'.format(dep_p.name, dep_p.version), 'w+b') as f:
                f.write(fo.read())

        dep_p.md5sum = hashfile(fo, hashlib.md5())
        dep_p.sha256sum = hashfile(fo, hashlib.sha256())

        metadata_dep_p = None
        with tarfile.open(fileobj=fo) as t_f:
            # some packages have unusual directory names (ejs-2.2.4.tgz)
            for pkgdir in ['package', '{0}-v{1}'.format(dep_p.name, dep_p.version), '{0}'.format(dep_p.name)]:
                old_wd = os.getcwd()
                try:
                    fnull = None
                    tmpdir = None
                    metadata_dep_p = json.loads(t_f.extractfile('{0}/package.json'.format(pkgdir)).read().decode('utf-8'))
                    # figure out if *.gyp exists in the package's root dir
                    names = t_f.getnames()
                    for name in names:
                        if os.path.dirname(name) == pkgdir and os.path.splitext(name)[1] == ".gyp":
                            dep_p.using_gyp = True
                            break

                    # extract archive in a safe way:
                    # http://stackoverflow.com/questions/10060069/safely-extract-zip-or-tar-using-python
                    tmpdir = tempfile.mkdtemp(prefix='tmp.create-json.{0}-{1}.'.format(dep_p.name, dep_p.version))
                    t_f.extractall(path=tmpdir, members=safemembers(t_f))
                    # use licensee to determine license https://github.com/benbalter/licensee
                    # licensee only works in git directories, so let's create one...
                    os.chdir('{0}/{1}'.format(tmpdir, pkgdir))
                    fnull = open(os.devnull, 'wb')
                    subprocess.call(['git', 'init'], stdout=fnull)
                    subprocess.call(['git', 'add', '-A', '.'], stdout=fnull)
                    # allow empty just in case this is a git repository already
                    subprocess.call(['git', 'commit', '-m', 'foo', '--allow-empty'], stdout=fnull)
                    licensee = subprocess.check_output('licensee', stderr=fnull, universal_newlines=True)
                    for line in iter(licensee.splitlines()):
                        if line.startswith('License file: '):
                            a, b = line.split(':')
                            dep_p.license_file = b.strip()
                            with open(dep_p.license_file, 'rb') as f:
                                dep_p.license_file_md5sum = hashfile(f, hashlib.md5())
                        elif line.startswith('License: '):
                            a, b = line.split(':')
                            dep_p.license = b.strip()
                            if dep_p.license == 'MIT License':
                                dep_p.license = 'MIT'
                            elif dep_p.license == 'Apache License 2.0':
                                dep_p.license = 'Apache-2.0'
                            elif dep_p.license == 'BSD 3-clause "New" or "Revised" License':
                                dep_p.license = 'BSD-3-Clause'
                            elif dep_p.license == 'BSD 2-clause "Simplified" License':
                                dep_p.license = 'BSD-2-Clause'
                            elif dep_p.license == 'ISC License':
                                dep_p.license = 'ISC'
                            elif dep_p.license == 'no license':
                                dep_p.license = None
                    break
                except KeyError:
                    # pkgdir not found, try next
                    pass
                except subprocess.CalledProcessError:
                    pass
                except FileNotFoundError:
                    # licensee not available
                    pass
                finally:
                    if fnull:
                        fnull.close()
                    os.chdir(old_wd)
                    if tmpdir and os.path.exists(tmpdir):
                        shutil.rmtree(tmpdir)

        process_pkg(dep_p, metadata_dep_p, do_devdepends_2nd)

#    if not p.license:
#        if not get_license(p, metadata_p, 'licenses'):
#            get_license(p, metadata_p, 'license')

    p.sanitise_homepage()
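
# The safemembers() filter above guards tar extraction against path
# traversal; its core check can be sketched standalone (is_within is a
# hypothetical helper name, same idea as badpath() with the result
# inverted):
#
# ```python
# import os
#
# def is_within(base, member_path):
#     # a tar member is safe only if its fully resolved path stays under
#     # base; os.path.join ignores base when member_path is absolute, and
#     # realpath collapses '..' components, so both escape routes are caught
#     base = os.path.realpath(base)
#     target = os.path.realpath(os.path.join(base, member_path))
#     return target == base or target.startswith(base + os.sep)
# ```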


def process_json_from_file(packagejson, shrinkwrap, outdir, do_devdepends, do_devdepends_2nd):
    with open(packagejson) as fh:
        metadata_p = json.load(fh)

    metadata_s = None
    if shrinkwrap:
        with open(shrinkwrap) as fh:
            metadata_s = json.load(fh)

    p = Package(metadata_p['name'], metadata_p['version'])
    p.metadata_s = metadata_s
    process_pkg(p, metadata_p, do_devdepends, do_devdepends_2nd)

    print('JSON dump')
    print(p.to_JSON())
    print('manual dump')
    p.debugprint()

    p.write_bb_recipe(outdir)


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('-o', '--output', default=os.getcwd(), help='Directory to write content to. Defaults to $PWD.')
    parser.add_argument('-s', '--shrinkwrap', help='npm-shrinkwrap.json to go with package.json')
    parser.add_argument('-d', '--debug', help='Enable debug.', action='count')
    parser.add_argument('-x', '--devdepends', help='Parse devdepends of main package', action='store_true')
    parser.add_argument('-x2', '--devdepends2', help='Parse devdepends of 2nd package', action='store_true')
    parser.add_argument('packagejson', help='package.json to process')
    args = parser.parse_args()

    args.output = os.path.normpath(args.output)
    os.makedirs(args.output, exist_ok=True)

    process_json_from_file(args.packagejson, args.shrinkwrap,
                           args.output,
                           args.devdepends, args.devdepends2)


if __name__ == '__main__': 
    main()


[-- Attachment #6: nodejs-node-gyp.inc --]
[-- Type: text/plain, Size: 879 bytes --]

DESCRIPTION = "\
 node-gyp is a cross-platform command-line tool written in Node.js\
 for compiling native addon modules for Node.js.\
 .\
 It features:\
  * Easy to use, consistent interface\
  * Same commands to build a module on every platform\
  * Support of multiple target versions of Node.js\
 .\
 node-gyp replaces the node-waf program, which was deprecated in\
 Node.js 0.8 and removed in Node.js 0.10."

SRC_URI_append = "\
    file://0001-configure-use-system-headers-for-default-nodedir.patch \
    file://0002-addon.gypi-remove-bogus-include-paths.patch \
    file://0003-configure.js-compat-with-our-old-version-of-gyp.patch \
"

python do_unpack_append() {
    import shutil
    # remove bundled gyp
    shutil.rmtree('${S}/gyp/', True)
}

python do_install_append() {
    os.mkdir("${D}${NODE_MODULE_DIR}/gyp")
    os.symlink("${bindir}/gyp", "${D}${NODE_MODULE_DIR}/gyp/gyp")
}


Thread overview: 36+ messages
2019-10-22  9:03 [RFC][PATCH 0/6] NPM refactoring Jean-Marie LEMETAYER
2019-10-22  9:03 ` [RFC][PATCH 1/6] npm.bbclass: refactor the npm class Jean-Marie LEMETAYER
2019-10-22 11:35   ` Alexander Kanavin
2019-10-23 13:17     ` Jean-Marie LEMETAYER
2019-10-24 11:22   ` Stefan Herbrechtsmeier
2019-10-24 15:13     ` Jean-Marie LEMETAYER
2019-10-22  9:03 ` [RFC][PATCH 2/6] devtool: update command line options for npm Jean-Marie LEMETAYER
2019-10-22  9:03 ` [RFC][PATCH 3/6] recipetool/create_npm.py: refactor the npm recipe creation handler Jean-Marie LEMETAYER
2019-10-22  9:03 ` [RFC][PATCH 4/6] devtool/standard.py: update the append file for the npm recipes Jean-Marie LEMETAYER
2019-10-22  9:03 ` [RFC][PATCH 5/6] recipetool/create.py: replace 'latest' keyword for npm Jean-Marie LEMETAYER
2019-10-22  9:03 ` [RFC][PATCH 6/6] recipetool/create.py: remove the 'noverify' url parameter Jean-Marie LEMETAYER
2019-10-22 11:22 ` [RFC][PATCH 0/6] NPM refactoring Richard Purdie
2019-10-23 13:17   ` Jean-Marie LEMETAYER
2019-10-24 12:01   ` Stefan Herbrechtsmeier
2019-10-24 12:12     ` Alexander Kanavin
2019-10-24 12:40       ` Stefan Herbrechtsmeier
2019-10-24 12:45         ` Alexander Kanavin
2019-10-24 13:52           ` Stefan Herbrechtsmeier
2019-10-24 14:22             ` Alexander Kanavin
2019-10-24 17:44               ` Stefan Herbrechtsmeier
2019-10-24 17:58                 ` Alexander Kanavin
2019-10-25  8:58                   ` Stefan Herbrechtsmeier
2019-10-24 15:13         ` Jean-Marie LEMETAYER
2019-10-24 17:03           ` Stefan Herbrechtsmeier
2019-10-24 13:36       ` richard.purdie
2019-10-24 15:20         ` Jean-Marie LEMETAYER
2019-10-24 15:37       ` Adrian Bunk
2019-10-24 15:59         ` Richard Purdie
2019-10-25  8:35           ` Stefan Herbrechtsmeier
2019-10-25 11:08             ` Adrian Bunk
2019-10-27  9:58               ` Stefan Herbrechtsmeier
2019-10-24 15:13     ` Jean-Marie LEMETAYER
2019-10-24 16:18       ` Stefan Herbrechtsmeier
2019-10-25  8:01 ` André Draszik
2019-10-25  9:10   ` Stefan Herbrechtsmeier
2019-10-29 10:52     ` André Draszik
