* Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
@ 2011-07-01 12:48 Holger Hans Peter Freyther
  2011-07-01 13:05 ` Koen Kooi
                   ` (2 more replies)
  0 siblings, 3 replies; 8+ messages in thread
From: Holger Hans Peter Freyther @ 2011-07-01 12:48 UTC (permalink / raw)
  To: poky

Hi,

I would like to share some more bits of my poky/yocto experience. We
have started to move our software from the desktop to the device and
we had some minor issues and wanted to use/install gdb. So far we are
based on a -minimal image.

So the next step was to include the variable for the package manager in
our custom image and then the fun starts.

a) I CTRL+C'ed the build once and later I noticed that my poky repo had
its git origin changed. I think bitbake should check the result of the
os.chdir... I will prepare a patch for that...

b) libzypp didn't find gettext... (general ramblings against CMake and not
providing information like autoconf), so somehow I had a mix of gettext
0.17 and 0.18 (no idea where I got a temp 0.18 build from, maybe due to
a rebase that I aborted or such).

$ bitbake -cclean virtual/gettext... fixed that for me.

For my colleague it somehow ended up not finding udev... but I have no
other information about it, the issue fixed 'itself' by having another
reason to rebuild udev.

c) then we found that RPM does not work on jffs2, or more precisely that
Berkeley DB opens the db files with MAP_READ|MAP_WRITE which is not
supported by jffs2. Googling reveals this[1], which didn't fix the problem
for me. I now have:

$ cat /etc/rpm/macros
%__dbi_txn      create lock log txn auto_commit nommap private

The key here is 'private'; the NO_MMAP is only for read-only data (a nice
red herring). I found an explanation for private here[2]; I have also
tested this with 'shared' (which maps to DB_SYSTEM_MEM).
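
For reference, a tiny probe along the following lines should show whether a
given mount point accepts a writable shared mapping at all, which is the
operation Berkeley DB trips over here (an untested sketch; the path is just
an example):

import mmap, os, sys

path = sys.argv[1] if len(sys.argv) > 1 else "/var/lib/rpm/.mmap-probe"
fd = os.open(path, os.O_CREAT | os.O_RDWR, 0o600)
try:
    os.ftruncate(fd, 4096)
    try:
        # This is what jffs2 refuses: a shared, writable mapping.
        m = mmap.mmap(fd, 4096, flags=mmap.MAP_SHARED,
                      prot=mmap.PROT_READ | mmap.PROT_WRITE)
        m.close()
        print("writable shared mmap: ok")
    except mmap.error as e:
        print("writable shared mmap refused: %s" % e)
finally:
    os.close(fd)
    os.unlink(path)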

Now I wonder how to fix that: should we add a /etc/rpm/macros when jffs2
is in the filesystem list? Which mode should one use by default? For my
target audience private is an acceptable option. Using ubifs is not something
I want to try right now.

d) we also had a very weird issue of packages-split/libgudev.lock being
there and packaging failing; it might be due to the OE tree moving and
bblayers.conf pointing to another tree.

e) gdb 7.2 on ARM (but should apply to others too)
Could not find platform independent libraries <prefix>
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
'import site' failed; use -v for traceback

Traceback (most recent call last):
  File "<string>", line 32, in <module>
ImportError: No module named os.path
...
>b
>mmap
Function "mmap" not defined.
Make breakpoint pending on future shared library load? (y or [n]) \n
...[answered N; input not from terminal]


I assume that GDB needs
1.) an RDEPENDS on libthread_db
2.) besides libpython it needs some minimal python support, I am not
    sure if libpython should have this RDEPENDS
3.) either there is a GLIBC with isatty... or I am missing some kernel
    option... or something weird in libreadline/ncurses (e.g. missing
    terminfo) but it would be nice to be able to answer questions..

f) The size of zypper
libzypp is a _huge_ library; did anyone look into shrinking it? Maybe
by making zypper link statically to it? Did anyone look into the alternatives
like SMART?

While looking into the size I also found that:
	- udev drags in libgobject/libgthread due to libgudev
	- udev installs keymaps even when I have no keyboard in my
	  machine flags...
	- oprofile installs the data for all platforms; it would be nice
	  if we could only install the files that are needed by the
	  given platform.


Okay, this is a lot, but it all started from the 'I would like to install
GDB' task. Feel free to start new top level threads for the separate
issues or point me to the bugzilla and I can create tickets.

holger


[1] http://elinux.org/RPM_jffs2_issue
[2] http://www.mathematik.uni-ulm.de/help/BerkeleyDB/ref/env/region.html




^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
  2011-07-01 12:48 Installing GDB or the saga of getting RPM/Zypp to run on JFFS2 Holger Hans Peter Freyther
@ 2011-07-01 13:05 ` Koen Kooi
  2011-07-01 13:44 ` Mark Hatle
  2011-07-01 14:18   ` Holger Hans Peter Freyther
  2 siblings, 0 replies; 8+ messages in thread
From: Koen Kooi @ 2011-07-01 13:05 UTC (permalink / raw)
  To: Holger Hans Peter Freyther; +Cc: poky


On 1 Jul 2011, at 14:48, Holger Hans Peter Freyther wrote:

> While looking into the size I also found that:
> 	- udev drags in libgobject/libgthread due to libgudev
> 	- udev installs keymaps even when I have no keyboard in my
> 	  machine flags...

The udev in meta-oe should have the above 2 issues addressed and is a lot faster as well.

^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
  2011-07-01 12:48 Installing GDB or the saga of getting RPM/Zypp to run on JFFS2 Holger Hans Peter Freyther
  2011-07-01 13:05 ` Koen Kooi
@ 2011-07-01 13:44 ` Mark Hatle
  2011-07-02 21:09   ` Holger Hans Peter Freyther
  2011-07-03 12:28   ` Holger Hans Peter Freyther
  2011-07-01 14:18   ` Holger Hans Peter Freyther
  2 siblings, 2 replies; 8+ messages in thread
From: Mark Hatle @ 2011-07-01 13:44 UTC (permalink / raw)
  To: poky

On 7/1/11 7:48 AM, Holger Hans Peter Freyther wrote:
> Hi,
> 
> I would like to share some more bits of my poky/yocto experience. We
> have started to move our software from the desktop to the device and
> we had some minor issues and wanted to use/install gdb. So far we are
> based on a -minimal image.
> 
> So the next step was to include the variable for the package manager in
> our custom image and then the fun starts.
> 
> a) I CTRL+C'ed the build once and later I noticed that my poky repo had
> its git origin changed. I think bitbake should check the result of the
> os.chdir... I will prepare a patch for that...
> 
> b) libzypp didn't find gettext... (general ramblings against CMake and not
> providing information like autoconf), so somehow I had a mix of gettext
> 0.17 and 0.18 (no idea where I got a temp 0.18 build from, maybe due to
> a rebase that I aborted or such).
> 
> $ bitbake -cclean virtual/gettext... fixed that for me.
> 
> For my colleague it somehow ended up not finding udev... but I have no
> other information about it, the issue fixed 'itself' by having another
> reason to rebuild udev.
> 
> c) then we found that RPM does not work on jffs2, or more precisely that
> Berkeley DB opens the db files with MAP_READ|MAP_WRITE which is not
> supported by jffs2. Googling reveals this[1], which didn't fix the problem
> for me. I now have:
> 
> $ cat /etc/rpm/macros
> %__dbi_txn      create lock log txn auto_commit nommap private

There is also a database configuration file:

/usr/lib/rpm/DB_CONFIG

This contains database specific settings such as the memory pool and similar.

Note, I've never tried RPM on a jffs2 filesystem before.  I hadn't expected
someone to use it there either.. ;)  ipk is likely more appropriate simply due
to cached file sizes and such.

See bug http://bugzilla.yoctoproject.org/show_bug.cgi?id=1174

RPM, as currently configured, ends up gobbling up a lot of disk space.  We're
working on a solution to this (changing the values in DB_CONFIG).. but it hasn't
yet been implemented.
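
For illustration, the kind of Berkeley DB settings that file carries look
roughly like this (example directives with made-up values, not the ones
being worked on):

set_cachesize 0 262144 1
set_mp_mmapsize 262144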

> The key here is 'private'; the NO_MMAP is only for read-only data (a nice
> red herring). I found an explanation for private here[2]; I have also
> tested this with 'shared' (which maps to DB_SYSTEM_MEM).
> 
> Now I wonder how to fix that: should we add a /etc/rpm/macros when jffs2
> is in the filesystem list? Which mode should one use by default? For my
> target audience private is an acceptable option. Using ubifs is not something
> I want to try right now.

macro files are being loaded from:

%{_usrlibrpm}/macros:%{_usrlibrpm}/poky/macros:%{_usrlibrpm}/poky/%{_target}/macros:%{_etcrpm}/macros.*:%{_etcrpm}/macros:%{_etcrpm}/%{_target}/macros:~/.oerpmmacros

The key above is the /etc/rpm/macros.*

I would recommend a new file be generated called /etc/rpm/macros.jffs2 that
changes the setting as appropriate for that filesystem.  (How it's placed into
the filesystem I'm not sure.  I think it all comes down to detecting we're
building a jffs2 filesystem and doing it there.  Perhaps in the rootfs_rpm.bbclass?)
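
Something along these lines might do it (a rough, untested sketch; the
function and the file it writes are made up, the variables are the usual
oe-core ones as far as I recall):

python rootfs_rpm_jffs2_macros () {
    # Only drop the macro override into the image when jffs2 is a target.
    fstypes = (d.getVar('IMAGE_FSTYPES', True) or "").split()
    if 'jffs2' not in fstypes:
        return
    etcrpm = os.path.join(d.getVar('IMAGE_ROOTFS', True), 'etc/rpm')
    bb.utils.mkdirhier(etcrpm)
    f = open(os.path.join(etcrpm, 'macros.jffs2'), 'w')
    f.write('%__dbi_txn create lock log txn auto_commit nommap private\n')
    f.close()
}

ROOTFS_POSTPROCESS_COMMAND += "rootfs_rpm_jffs2_macros; "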

> 
> d) we also had a very weird issue of packages-split/libgudev.lock being
> there and packaging failing; it might be due to the OE tree moving and
> bblayers.conf pointing to another tree.
> 
> e) gdb 7.2 on ARM (but should apply to others too)
> Could not find platform independent libraries <prefix>
> Could not find platform dependent libraries <exec_prefix>
> Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
> 'import site' failed; use -v for traceback
> 
> Traceback (most recent call last):
>   File "<string>", line 32, in <module>
> ImportError: No module named os.path
> ...
>> b
>> mmap
> Function "mmap" not defined.
> Make breakpoint pending on future shared library load? (y or [n]) \n
> ...[answered N; input not from terminal]
> 
> 
> I assume that GDB needs
> 1.) an RDEPENDS on libthread_db

libthread_db is only needed if you are debugging more than one thread.

> 2.) besides libpython it needs some minimal python support, I am not
>     sure if libpython should have this RDEPENDS

GDB includes some python scripts to make it easier to walk various structures
and similar.  I don't remember offhand where these get placed, but you should
see in your FS something of the format gdb...py; if you remove these it'll no
longer load the python helpers.
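
For reference, those helpers are plain GDB python scripts; a minimal one looks
roughly like this (the 'point' struct and file name are made up to show the
mechanism, this is not one of the scripts that actually gets shipped):

# some-library-gdb.py, auto-loaded by GDB from next to the library
import gdb

class PointPrinter(object):
    """Render a hypothetical 'struct point' as point(x, y)."""
    def __init__(self, val):
        self.val = val
    def to_string(self):
        return "point(%s, %s)" % (self.val['x'], self.val['y'])

def lookup_point(val):
    # Return a printer for 'struct point' values, None for everything else.
    if val.type.strip_typedefs().tag == 'point':
        return PointPrinter(val)
    return None

gdb.pretty_printers.append(lookup_point)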

> 3.) either there is a GLIBC with isatty... or I am missing some kernel
>     option... or something weird in libreadline/ncurses (e.g. missing
>     terminfo) but it would be nice to be able to answer questions..
> 
> f) The size of zypper
> libzypp is a _huge_ library; did anyone look into shrinking it? Maybe
> by making zypper link statically to it? Did anyone look into the alternatives
> like SMART?

zypper is somewhat of a pig.  Again this wasn't really designed for really small
systems.

I've used smart in the past and it's worth looking into... but it's been a lack
of time.  (Plus smart, at least, used to require python.. which itself is a pig
if you don't already need python.)

> While looking into the size I also found that:
> 	- udev drags in libgobject/libgthread due to libgudev
> 	- udev installs keymaps even when I have no keyboard in my
> 	  machine flags...

These should definitely be fixed.

> 	- oprofile installs the data for all platforms; it would be nice
> 	  if we could only install the files that are needed by the
> 	  given platform.
> 
> 
> Okay, this is a lot, but it all started from the 'I would like to install
> GDB' task. Feel free to start new top level threads for the separate
> issues or point me to the bugzilla and I can create tickets.

Most of my debugging so far with GDB on the target has been on ext3 and "large"
filesystems.  Alternatively I've run gdbserver on fairly small systems -- no RPM
or .debug on the target.. the .debug stuff was on the host side.
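
For the record, that remote setup is roughly (the binary name, sysroot path
and address are just examples):

target:  gdbserver :2345 /usr/bin/myapp
host:    arm-poky-linux-gnueabi-gdb /path/to/myapp
         (gdb) set sysroot /path/to/rootfs
         (gdb) target remote 192.168.7.2:2345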

--Mark

> holger
> 
> 
> [1] http://elinux.org/RPM_jffs2_issue
> [2] http://www.mathematik.uni-ulm.de/help/BerkeleyDB/ref/env/region.html
> 
> 
> _______________________________________________
> poky mailing list
> poky@yoctoproject.org
> https://lists.yoctoproject.org/listinfo/poky



^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: [poky] Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
  2011-07-01 12:48 Installing GDB or the saga of getting RPM/Zypp to run on JFFS2 Holger Hans Peter Freyther
@ 2011-07-01 14:18   ` Holger Hans Peter Freyther
  2011-07-01 13:44 ` Mark Hatle
  2011-07-01 14:18   ` Holger Hans Peter Freyther
  2 siblings, 0 replies; 8+ messages in thread
From: Holger Hans Peter Freyther @ 2011-07-01 14:18 UTC (permalink / raw)
  To: poky, bitbake-devel

[-- Attachment #1: Type: text/plain, Size: 360 bytes --]

On 07/01/2011 02:48 PM, Holger Hans Peter Freyther wrote:
> Hi,

> 
> a) I CTRL+C'ed the build once and later I noticed that my poky repo had
> its git origin changed. I think bitbake should check the result of the
> os.chdir... I will prepare a patch for that...

Okay, I am doing an oe-core build using this patch right now. Any comments?

holger

[-- Attachment #2: 0001-misc-Replace-os.chdir-with-bb.utils.checked_chdir.patch --]
[-- Type: text/x-patch, Size: 30426 bytes --]

From 89e38923bdc01f2ef3f5ea81e5fd435e416b2ba6 Mon Sep 17 00:00:00 2001
From: Holger Hans Peter Freyther <zecke@selfish.org>
Date: Fri, 1 Jul 2011 22:12:52 +0800
Subject: [PATCH] misc: Replace os.chdir with bb.utils.checked_chdir

The Python os.chdir does not return a status; in some
rare conditions it seems possible that a keyboard interrupt
can interrupt the chdir and execution continues from
the wrong directory. Introduce bb.utils.checked_chdir
to verify the chdir operation.

The chdir users in bitbake are a bit sloppy; it is possible
that someone is using a relative directory, a trailing
slash or double slashes (e.g. license-destdir//).
---
 lib/bb/build.py           |    4 ++--
 lib/bb/cache.py           |    4 ++--
 lib/bb/fetch/bzr.py       |    6 +++---
 lib/bb/fetch/cvs.py       |   10 +++++-----
 lib/bb/fetch/git.py       |   18 +++++++++---------
 lib/bb/fetch/hg.py        |    8 ++++----
 lib/bb/fetch/osc.py       |    6 +++---
 lib/bb/fetch/perforce.py  |    2 +-
 lib/bb/fetch/repo.py      |    4 ++--
 lib/bb/fetch/svk.py       |    4 ++--
 lib/bb/fetch/svn.py       |    6 +++---
 lib/bb/fetch2/__init__.py |    6 +++---
 lib/bb/fetch2/bzr.py      |    6 +++---
 lib/bb/fetch2/cvs.py      |   10 +++++-----
 lib/bb/fetch2/git.py      |   14 +++++++-------
 lib/bb/fetch2/hg.py       |    8 ++++----
 lib/bb/fetch2/osc.py      |    6 +++---
 lib/bb/fetch2/perforce.py |    2 +-
 lib/bb/fetch2/repo.py     |    4 ++--
 lib/bb/fetch2/svk.py      |    4 ++--
 lib/bb/fetch2/svn.py      |    6 +++---
 lib/bb/pysh/builtin.py    |    4 ++--
 lib/bb/pysh/interp.py     |    4 ++--
 lib/bb/utils.py           |   17 +++++++++++++++++
 24 files changed, 90 insertions(+), 73 deletions(-)

diff --git a/lib/bb/build.py b/lib/bb/build.py
index 1c73ae2..0980e7a 100644
--- a/lib/bb/build.py
+++ b/lib/bb/build.py
@@ -191,7 +191,7 @@ def exec_func_python(func, d, runfile, cwd=None):
             olddir = os.getcwd()
         except OSError:
             olddir = None
-        os.chdir(cwd)
+        bb.utils.checked_chdir(cwd)
 
     try:
         comp = utils.better_compile(code, func, bbfile)
@@ -204,7 +204,7 @@ def exec_func_python(func, d, runfile, cwd=None):
     finally:
         if cwd and olddir:
             try:
-                os.chdir(olddir)
+                bb.utils.checked_chdir(olddir)
             except OSError:
                 pass
 
diff --git a/lib/bb/cache.py b/lib/bb/cache.py
index 99d7395..81d24ab 100644
--- a/lib/bb/cache.py
+++ b/lib/bb/cache.py
@@ -657,11 +657,11 @@ class Cache(object):
                 data.setVar('__BBAPPEND', " ".join(appends), bb_data)
             bb_data = parse.handle(bbfile, bb_data)
             if chdir_back:
-                os.chdir(oldpath)
+                bb.utils.checked_chdir(oldpath)
             return bb_data
         except:
             if chdir_back:
-                os.chdir(oldpath)
+                bb.utils.checked_chdir(oldpath)
             raise
 
 
diff --git a/lib/bb/fetch/bzr.py b/lib/bb/fetch/bzr.py
index 85a9294..9774c65 100644
--- a/lib/bb/fetch/bzr.py
+++ b/lib/bb/fetch/bzr.py
@@ -88,18 +88,18 @@ class Bzr(Fetch):
         if os.access(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir), '.bzr'), os.R_OK):
             bzrcmd = self._buildbzrcommand(ud, d, "update")
             logger.debug(1, "BZR Update %s", loc)
-            os.chdir(os.path.join (ud.pkgdir, os.path.basename(ud.path)))
+            bb.utils.checked_chdir(os.path.join (ud.pkgdir, os.path.basename(ud.path)))
             runfetchcmd(bzrcmd, d)
         else:
             bb.utils.remove(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir)), True)
             bzrcmd = self._buildbzrcommand(ud, d, "fetch")
             logger.debug(1, "BZR Checkout %s", loc)
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", bzrcmd)
             runfetchcmd(bzrcmd, d)
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
 
         scmdata = ud.parm.get("scmdata", "")
         if scmdata == "keep":
diff --git a/lib/bb/fetch/cvs.py b/lib/bb/fetch/cvs.py
index 64450af..9f9f2fd 100644
--- a/lib/bb/fetch/cvs.py
+++ b/lib/bb/fetch/cvs.py
@@ -132,13 +132,13 @@ class Cvs(Fetch):
         if os.access(os.path.join(moddir, 'CVS'), os.R_OK):
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(moddir)
+            bb.utils.checked_chdir(moddir)
             myret = os.system(cvsupdatecmd)
         else:
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(pkgdir)
-            os.chdir(pkgdir)
+            bb.utils.checked_chdir(pkgdir)
             logger.debug(1, "Running %s", cvscmd)
             myret = os.system(cvscmd)
 
@@ -157,11 +157,11 @@ class Cvs(Fetch):
 
         # tar them up to a defined filename
         if 'fullpath' in ud.parm:
-            os.chdir(pkgdir)
+            bb.utils.checked_chdir(pkgdir)
             myret = os.system("tar %s -czf %s %s" % (tar_flags, ud.localpath, localdir))
         else:
-            os.chdir(moddir)
-            os.chdir('..')
+            bb.utils.checked_chdir(moddir)
+            bb.utils.checked_chdir('..')
             myret = os.system("tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(moddir)))
 
         if myret != 0:
diff --git a/lib/bb/fetch/git.py b/lib/bb/fetch/git.py
index 49c1cfe..0c1f170 100644
--- a/lib/bb/fetch/git.py
+++ b/lib/bb/fetch/git.py
@@ -132,14 +132,14 @@ class Git(Fetch):
         # If the checkout doesn't exist and the mirror tarball does, extract it
         if not os.path.exists(ud.clonedir) and os.path.exists(repofile):
             bb.utils.mkdirhier(ud.clonedir)
-            os.chdir(ud.clonedir)
+            bb.utils.checked_chdir(ud.clonedir)
             runfetchcmd("tar -xzf %s" % (repofile), d)
 
         # If the repo still doesn't exist, fallback to cloning it
         if not os.path.exists(ud.clonedir):
             runfetchcmd("%s clone -n %s://%s%s%s %s" % (ud.basecmd, ud.proto, username, ud.host, ud.path, ud.clonedir), d)
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         # Update the checkout if needed
         if not self._contains_ref(ud.tag, d) or 'fullclone' in ud.parm:
             # Remove all but the .git directory
@@ -153,7 +153,7 @@ class Git(Fetch):
             runfetchcmd("%s pack-redundant --all | xargs -r rm" % ud.basecmd, d)
 
         # Generate a mirror tarball if needed
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         mirror_tarballs = data.getVar("BB_GENERATE_MIRROR_TARBALLS", d, True)
         if mirror_tarballs != "0" or 'fullclone' in ud.parm:
             logger.info("Creating tarball of git repository")
@@ -185,19 +185,19 @@ class Git(Fetch):
         scmdata = ud.parm.get("scmdata", "")
         if scmdata == "keep":
             runfetchcmd("%s clone -n %s %s" % (ud.basecmd, ud.clonedir, coprefix), d)
-            os.chdir(coprefix)
+            bb.utils.checked_chdir(coprefix)
             runfetchcmd("%s checkout -q -f %s%s" % (ud.basecmd, ud.tag, readpathspec), d)
         else:
             bb.utils.mkdirhier(codir)
-            os.chdir(ud.clonedir)
+            bb.utils.checked_chdir(ud.clonedir)
             runfetchcmd("%s read-tree %s%s" % (ud.basecmd, ud.tag, readpathspec), d)
             runfetchcmd("%s checkout-index -q -f --prefix=%s -a" % (ud.basecmd, coprefix), d)
 
-        os.chdir(codir)
+        bb.utils.checked_chdir(codir)
         logger.info("Creating tarball of git checkout")
         runfetchcmd("tar -czf %s %s" % (ud.localpath, os.path.join(".", "*") ), d)
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         bb.utils.prunedir(codir)
 
     def supports_srcrev(self):
@@ -327,12 +327,12 @@ class Git(Fetch):
                 return None
 
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         if not self._contains_ref(rev, d):
             self.go(None, ud, d)
 
         output = runfetchcmd("%s rev-list %s -- 2> /dev/null | wc -l" % (ud.basecmd, rev), d, quiet=True)
-        os.chdir(cwd)
+        bb.utils.checked_chdir(cwd)
 
         buildindex = "%s" % output.split()[0]
         logger.debug(1, "GIT repository for %s in %s is returning %s revisions in rev-list before %s", url, ud.clonedir, buildindex, rev)
diff --git a/lib/bb/fetch/hg.py b/lib/bb/fetch/hg.py
index 2b3aec5..84a94b5 100644
--- a/lib/bb/fetch/hg.py
+++ b/lib/bb/fetch/hg.py
@@ -123,7 +123,7 @@ class Hg(Fetch):
             updatecmd = self._buildhgcommand(ud, d, "pull")
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", updatecmd)
             runfetchcmd(updatecmd, d)
 
@@ -132,14 +132,14 @@ class Hg(Fetch):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", fetchcmd)
             runfetchcmd(fetchcmd, d)
 
         # Even when we clone (fetch), we still need to update as hg's clone
         # won't checkout the specified revision if its on a branch
         updatecmd = self._buildhgcommand(ud, d, "update")
-        os.chdir(ud.moddir)
+        bb.utils.checked_chdir(ud.moddir)
         logger.debug(1, "Running %s", updatecmd)
         runfetchcmd(updatecmd, d)
 
@@ -149,7 +149,7 @@ class Hg(Fetch):
         else:
             tar_flags = "--exclude '.hg' --exclude '.hgrags'"
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
         try:
             runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d)
         except:
diff --git a/lib/bb/fetch/osc.py b/lib/bb/fetch/osc.py
index 32237b9..3a69846 100644
--- a/lib/bb/fetch/osc.py
+++ b/lib/bb/fetch/osc.py
@@ -91,7 +91,7 @@ class Osc(Fetch):
             oscupdatecmd = self._buildosccommand(ud, d, "update")
             logger.info("Update "+ loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", oscupdatecmd)
             runfetchcmd(oscupdatecmd, d)
         else:
@@ -99,11 +99,11 @@ class Osc(Fetch):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", oscfetchcmd)
             runfetchcmd(oscfetchcmd, d)
 
-        os.chdir(os.path.join(ud.pkgdir + ud.path))
+        bb.utils.checked_chdir(os.path.join(ud.pkgdir + ud.path))
         # tar them up to a defined filename
         try:
             runfetchcmd("tar -czf %s %s" % (ud.localpath, ud.module), d)
diff --git a/lib/bb/fetch/perforce.py b/lib/bb/fetch/perforce.py
index e933d27..b1e79ce 100644
--- a/lib/bb/fetch/perforce.py
+++ b/lib/bb/fetch/perforce.py
@@ -168,7 +168,7 @@ class Perforce(Fetch):
             cset = Perforce.getcset(d, depot, host, user, pswd, parm)
             depot = "%s@%s" % (depot, cset)
 
-        os.chdir(tmpfile)
+        bb.utils.checked_chdir(tmpfile)
         logger.info("Fetch " + loc)
         logger.info("%s%s files %s", p4cmd, p4opt, depot)
         p4file = os.popen("%s%s files %s" % (p4cmd, p4opt, depot))
diff --git a/lib/bb/fetch/repo.py b/lib/bb/fetch/repo.py
index 512fffb..cda36bf 100644
--- a/lib/bb/fetch/repo.py
+++ b/lib/bb/fetch/repo.py
@@ -72,12 +72,12 @@ class Repo(Fetch):
             username = ""
 
         bb.utils.mkdirhier(os.path.join(codir, "repo"))
-        os.chdir(os.path.join(codir, "repo"))
+        bb.utils.checked_chdir(os.path.join(codir, "repo"))
         if not os.path.exists(os.path.join(codir, "repo", ".repo")):
             runfetchcmd("repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), d)
 
         runfetchcmd("repo sync", d)
-        os.chdir(codir)
+        bb.utils.checked_chdir(codir)
 
         scmdata = ud.parm.get("scmdata", "")
         if scmdata == "keep":
diff --git a/lib/bb/fetch/svk.py b/lib/bb/fetch/svk.py
index dc818d2..d67ae16 100644
--- a/lib/bb/fetch/svk.py
+++ b/lib/bb/fetch/svk.py
@@ -80,7 +80,7 @@ class Svk(Fetch):
             raise FetchError(ud.module)
 
         # check out sources there
-        os.chdir(tmpfile)
+        bb.utils.checked_chdir(tmpfile)
         logger.info("Fetch " + loc)
         logger.debug(1, "Running %s", svkcmd)
         myret = os.system(svkcmd)
@@ -91,7 +91,7 @@ class Svk(Fetch):
                 pass
             raise FetchError(ud.module)
 
-        os.chdir(os.path.join(tmpfile, os.path.dirname(ud.module)))
+        bb.utils.checked_chdir(os.path.join(tmpfile, os.path.dirname(ud.module)))
         # tar them up to a defined filename
         myret = os.system("tar -czf %s %s" % (ud.localpath, os.path.basename(ud.module)))
         if myret != 0:
diff --git a/lib/bb/fetch/svn.py b/lib/bb/fetch/svn.py
index 6c2a118..25c13a4 100644
--- a/lib/bb/fetch/svn.py
+++ b/lib/bb/fetch/svn.py
@@ -139,7 +139,7 @@ class Svn(Fetch):
             svnupdatecmd = self._buildsvncommand(ud, d, "update")
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", svnupdatecmd)
             runfetchcmd(svnupdatecmd, d)
         else:
@@ -147,7 +147,7 @@ class Svn(Fetch):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", svnfetchcmd)
             runfetchcmd(svnfetchcmd, d)
 
@@ -157,7 +157,7 @@ class Svn(Fetch):
         else:
             tar_flags = "--exclude '.svn'"
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
         # tar them up to a defined filename
         try:
             runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d)
diff --git a/lib/bb/fetch2/__init__.py b/lib/bb/fetch2/__init__.py
index e9a64c5..e325f6e 100644
--- a/lib/bb/fetch2/__init__.py
+++ b/lib/bb/fetch2/__init__.py
@@ -739,17 +739,17 @@ class FetchMethod(object):
 
         # Change to subdir before executing command
         save_cwd = os.getcwd();
-        os.chdir(rootdir)
+        bb.utils.checked_chdir(rootdir)
         if 'subdir' in urldata.parm:
             newdir = ("%s/%s" % (rootdir, urldata.parm.get('subdir')))
             bb.utils.mkdirhier(newdir)
-            os.chdir(newdir)
+            bb.utils.checked_chdir(newdir)
 
         cmd = "PATH=\"%s\" %s" % (bb.data.getVar('PATH', data, True), cmd)
         bb.note("Unpacking %s to %s/" % (file, os.getcwd()))
         ret = subprocess.call(cmd, preexec_fn=subprocess_setup, shell=True)
 
-        os.chdir(save_cwd)
+        bb.utils.checked_chdir(save_cwd)
 
         if ret != 0:
             raise UnpackError("Unpack command %s failed with return value %s" % (cmd, ret), urldata.url)
diff --git a/lib/bb/fetch2/bzr.py b/lib/bb/fetch2/bzr.py
index 0d10eb4..404d004 100644
--- a/lib/bb/fetch2/bzr.py
+++ b/lib/bb/fetch2/bzr.py
@@ -88,7 +88,7 @@ class Bzr(FetchMethod):
             bzrcmd = self._buildbzrcommand(ud, d, "update")
             logger.debug(1, "BZR Update %s", loc)
             bb.fetch2.check_network_access(d, bzrcmd, ud.url)
-            os.chdir(os.path.join (ud.pkgdir, os.path.basename(ud.path)))
+            bb.utils.checked_chdir(os.path.join (ud.pkgdir, os.path.basename(ud.path)))
             runfetchcmd(bzrcmd, d)
         else:
             bb.utils.remove(os.path.join(ud.pkgdir, os.path.basename(ud.pkgdir)), True)
@@ -96,11 +96,11 @@ class Bzr(FetchMethod):
             bb.fetch2.check_network_access(d, bzrcmd, ud.url)
             logger.debug(1, "BZR Checkout %s", loc)
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", bzrcmd)
             runfetchcmd(bzrcmd, d)
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
 
         scmdata = ud.parm.get("scmdata", "")
         if scmdata == "keep":
diff --git a/lib/bb/fetch2/cvs.py b/lib/bb/fetch2/cvs.py
index a111467..0710b32 100644
--- a/lib/bb/fetch2/cvs.py
+++ b/lib/bb/fetch2/cvs.py
@@ -134,13 +134,13 @@ class Cvs(FetchMethod):
             logger.info("Update " + loc)
             bb.fetch2.check_network_access(d, cvsupdatecmd, ud.url)
             # update sources there
-            os.chdir(moddir)
+            bb.utils.checked_chdir(moddir)
             cmd = cvsupdatecmd
         else:
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(pkgdir)
-            os.chdir(pkgdir)
+            bb.utils.checked_chdir(pkgdir)
             logger.debug(1, "Running %s", cvscmd)
             bb.fetch2.check_network_access(d, cvscmd, ud.url)
             cmd = cvscmd
@@ -158,11 +158,11 @@ class Cvs(FetchMethod):
 
         # tar them up to a defined filename
         if 'fullpath' in ud.parm:
-            os.chdir(pkgdir)
+            bb.utils.checked_chdir(pkgdir)
             cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, localdir)
         else:
-            os.chdir(moddir)
-            os.chdir('..')
+            bb.utils.checked_chdir(moddir)
+            bb.utils.checked_chdir('..')
             cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(moddir))
 
         runfetchcmd(cmd, d, cleanup = [ud.localpath])
diff --git a/lib/bb/fetch2/git.py b/lib/bb/fetch2/git.py
index 534c87d..d4f841b 100644
--- a/lib/bb/fetch2/git.py
+++ b/lib/bb/fetch2/git.py
@@ -135,7 +135,7 @@ class Git(FetchMethod):
     def need_update(self, u, ud, d):
         if not os.path.exists(ud.clonedir):
             return True
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         for name in ud.names:
             if not self._contains_ref(ud.revisions[name], d):
                 return True
@@ -165,7 +165,7 @@ class Git(FetchMethod):
         # If the checkout doesn't exist and the mirror tarball does, extract it
         if not os.path.exists(ud.clonedir) and os.path.exists(ud.fullmirror):
             bb.utils.mkdirhier(ud.clonedir)
-            os.chdir(ud.clonedir)
+            bb.utils.checked_chdir(ud.clonedir)
             runfetchcmd("tar -xzf %s" % (ud.fullmirror), d)
 
         # If the repo still doesn't exist, fallback to cloning it
@@ -175,7 +175,7 @@ class Git(FetchMethod):
             bb.fetch2.check_network_access(d, clone_cmd)
             runfetchcmd(clone_cmd, d)
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         # Update the checkout if needed
         needupdate = False
         for name in ud.names:
@@ -199,7 +199,7 @@ class Git(FetchMethod):
     def build_mirror_data(self, url, ud, d):
         # Generate a mirror tarball if needed
         if ud.write_tarballs and (ud.repochanged or not os.path.exists(ud.fullmirror)):
-            os.chdir(ud.clonedir)
+            bb.utils.checked_chdir(ud.clonedir)
             logger.info("Creating tarball of git repository")
             runfetchcmd("tar -czf %s %s" % (ud.fullmirror, os.path.join(".") ), d)
 
@@ -219,7 +219,7 @@ class Git(FetchMethod):
 
         runfetchcmd("git clone -s -n %s %s" % (ud.clonedir, destdir), d)
         if not ud.nocheckout:
-            os.chdir(destdir)
+            bb.utils.checked_chdir(destdir)
             if subdir != "":
                 runfetchcmd("%s read-tree %s%s" % (ud.basecmd, ud.revisions[ud.names[0]], readpathspec), d)
                 runfetchcmd("%s checkout-index -q -f -a" % ud.basecmd, d)
@@ -289,12 +289,12 @@ class Git(FetchMethod):
                 return None
 
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         if not self._contains_ref(rev, d):
             self.download(None, ud, d)
 
         output = runfetchcmd("%s rev-list %s -- 2> /dev/null | wc -l" % (ud.basecmd, rev), d, quiet=True)
-        os.chdir(cwd)
+        bb.utils.checked_chdir(cwd)
 
         buildindex = "%s" % output.split()[0]
         logger.debug(1, "GIT repository for %s in %s is returning %s revisions in rev-list before %s", url, ud.clonedir, buildindex, rev)
diff --git a/lib/bb/fetch2/hg.py b/lib/bb/fetch2/hg.py
index 793831a..f8a3b1e 100644
--- a/lib/bb/fetch2/hg.py
+++ b/lib/bb/fetch2/hg.py
@@ -124,7 +124,7 @@ class Hg(FetchMethod):
             updatecmd = self._buildhgcommand(ud, d, "pull")
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", updatecmd)
             bb.fetch2.check_network_access(d, updatecmd, ud.url)
             runfetchcmd(updatecmd, d)
@@ -134,7 +134,7 @@ class Hg(FetchMethod):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", fetchcmd)
             bb.fetch2.check_network_access(d, fetchcmd, ud.url)
             runfetchcmd(fetchcmd, d)
@@ -142,7 +142,7 @@ class Hg(FetchMethod):
         # Even when we clone (fetch), we still need to update as hg's clone
         # won't checkout the specified revision if its on a branch
         updatecmd = self._buildhgcommand(ud, d, "update")
-        os.chdir(ud.moddir)
+        bb.utils.checked_chdir(ud.moddir)
         logger.debug(1, "Running %s", updatecmd)
         runfetchcmd(updatecmd, d)
 
@@ -152,7 +152,7 @@ class Hg(FetchMethod):
         else:
             tar_flags = "--exclude '.hg' --exclude '.hgrags'"
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
         runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d, cleanup = [ud.localpath])
 
     def supports_srcrev(self):
diff --git a/lib/bb/fetch2/osc.py b/lib/bb/fetch2/osc.py
index a16a53e..1e22ca9 100644
--- a/lib/bb/fetch2/osc.py
+++ b/lib/bb/fetch2/osc.py
@@ -88,7 +88,7 @@ class Osc(FetchMethod):
             oscupdatecmd = self._buildosccommand(ud, d, "update")
             logger.info("Update "+ loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", oscupdatecmd)
             bb.fetch2.check_network_access(d, oscupdatecmd, ud.url)
             runfetchcmd(oscupdatecmd, d)
@@ -97,12 +97,12 @@ class Osc(FetchMethod):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", oscfetchcmd)
             bb.fetch2.check_network_access(d, oscfetchcmd, ud.url)
             runfetchcmd(oscfetchcmd, d)
 
-        os.chdir(os.path.join(ud.pkgdir + ud.path))
+        bb.utils.checked_chdir(os.path.join(ud.pkgdir + ud.path))
         # tar them up to a defined filename
         runfetchcmd("tar -czf %s %s" % (ud.localpath, ud.module), d, cleanup = [ud.localpath])
 
diff --git a/lib/bb/fetch2/perforce.py b/lib/bb/fetch2/perforce.py
index cbdc848..bb13594 100644
--- a/lib/bb/fetch2/perforce.py
+++ b/lib/bb/fetch2/perforce.py
@@ -165,7 +165,7 @@ class Perforce(FetchMethod):
             cset = Perforce.getcset(d, depot, host, user, pswd, parm)
             depot = "%s@%s" % (depot, cset)
 
-        os.chdir(tmpfile)
+        bb.utils.checked_chdir(tmpfile)
         logger.info("Fetch " + loc)
         logger.info("%s%s files %s", p4cmd, p4opt, depot)
         p4file = os.popen("%s%s files %s" % (p4cmd, p4opt, depot))
diff --git a/lib/bb/fetch2/repo.py b/lib/bb/fetch2/repo.py
index 8300da8..e267917 100644
--- a/lib/bb/fetch2/repo.py
+++ b/lib/bb/fetch2/repo.py
@@ -70,14 +70,14 @@ class Repo(FetchMethod):
             username = ""
 
         bb.utils.mkdirhier(os.path.join(codir, "repo"))
-        os.chdir(os.path.join(codir, "repo"))
+        bb.utils.checked_chdir(os.path.join(codir, "repo"))
         if not os.path.exists(os.path.join(codir, "repo", ".repo")):
             bb.fetch2.check_network_access(d, "repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), ud.url)
             runfetchcmd("repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), d)
 
         bb.fetch2.check_network_access(d, "repo sync %s" % ud.url, ud.url)
         runfetchcmd("repo sync", d)
-        os.chdir(codir)
+        bb.utils.checked_chdir(codir)
 
         scmdata = ud.parm.get("scmdata", "")
         if scmdata == "keep":
diff --git a/lib/bb/fetch2/svk.py b/lib/bb/fetch2/svk.py
index 9d34abf..a2fb864 100644
--- a/lib/bb/fetch2/svk.py
+++ b/lib/bb/fetch2/svk.py
@@ -84,12 +84,12 @@ class Svk(FetchMethod):
             raise FetchError("Fetch: unable to create temporary directory.. make sure 'mktemp' is in the PATH.", loc)
 
         # check out sources there
-        os.chdir(tmpfile)
+        bb.utils.checked_chdir(tmpfile)
         logger.info("Fetch " + loc)
         logger.debug(1, "Running %s", svkcmd)
         runfetchcmd(svkcmd, d, cleanup = [tmpfile])
 
-        os.chdir(os.path.join(tmpfile, os.path.dirname(ud.module)))
+        bb.utils.checked_chdir(os.path.join(tmpfile, os.path.dirname(ud.module)))
         # tar them up to a defined filename
         runfetchcmd("tar -czf %s %s" % (ud.localpath, os.path.basename(ud.module)), d, cleanup = [ud.localpath])
 
diff --git a/lib/bb/fetch2/svn.py b/lib/bb/fetch2/svn.py
index 59d7ccb..0bc5203 100644
--- a/lib/bb/fetch2/svn.py
+++ b/lib/bb/fetch2/svn.py
@@ -116,7 +116,7 @@ class Svn(FetchMethod):
             svnupdatecmd = self._buildsvncommand(ud, d, "update")
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", svnupdatecmd)
             bb.fetch2.check_network_access(d, svnupdatecmd, ud.url)
             runfetchcmd(svnupdatecmd, d)
@@ -125,7 +125,7 @@ class Svn(FetchMethod):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", svnfetchcmd)
             bb.fetch2.check_network_access(d, svnfetchcmd, ud.url)
             runfetchcmd(svnfetchcmd, d)
@@ -136,7 +136,7 @@ class Svn(FetchMethod):
         else:
             tar_flags = "--exclude '.svn'"
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
         # tar them up to a defined filename
         runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d, cleanup = [ud.localpath])
 
diff --git a/lib/bb/pysh/builtin.py b/lib/bb/pysh/builtin.py
index b748e4a..693becf 100644
--- a/lib/bb/pysh/builtin.py
+++ b/lib/bb/pysh/builtin.py
@@ -581,11 +581,11 @@ def utility_sort(name, args, interp, env, stdin, stdout, stderr, debugflags):
     # Load all files lines
     curdir = os.getcwd()
     try:
-        os.chdir(env['PWD'])
+        bb.utils.checked_chdir(env['PWD'])
         for path in args:
             alllines += sort(path)
     finally:
-        os.chdir(curdir)
+        bb.utils.checked_chdir(curdir)
             
     alllines.sort()
     for line in alllines:
diff --git a/lib/bb/pysh/interp.py b/lib/bb/pysh/interp.py
index 25d8c92..da3aa21 100644
--- a/lib/bb/pysh/interp.py
+++ b/lib/bb/pysh/interp.py
@@ -1261,11 +1261,11 @@ class Interpreter:
             
         def pwd_glob(pattern):
             cwd = os.getcwd()
-            os.chdir(self._env['PWD'])
+            bb.utils.checked_chdir(self._env['PWD'])
             try:
                 return glob.glob(pattern) 
             finally:
-                os.chdir(cwd)    
+                bb.utils.checked_chdir(cwd)
             
         #TODO: check working directory issues here wrt relative patterns
         try:
diff --git a/lib/bb/utils.py b/lib/bb/utils.py
index 82e5dc4..99d3d8a 100644
--- a/lib/bb/utils.py
+++ b/lib/bb/utils.py
@@ -856,3 +856,20 @@ def to_boolean(string, default=None):
         return False
     else:
         raise ValueError("Invalid value for to_boolean: %s" % string)
+
+def checked_chdir(name):
+    """I change the directory and verify that it was changed, if
+    not I am raising an exception."""
+
+    # We are a bit sloppy with '/' or using a relative dir like
+    # .. in the fetchers. We resolve the new path here, remove a
+    # trailing slash and see if things match now.
+    old_dir = os.getcwd()
+    new_dir = os.path.join(old_dir, name)
+    new_dir = new_dir.replace('//', '/')
+    if new_dir[-1] == '/':
+        new_dir = new_dir[:-1]
+    os.chdir(new_dir)
+
+    if os.getcwd() != new_dir:
+        raise OSError('Failed to change to directory: \'%s\'. You are in \'%s\', resolved \'%s\'' % (new_dir, os.getcwd(), new_dir))
-- 
1.7.4.1


^ permalink raw reply related	[flat|nested] 8+ messages in thread

             bb.utils.mkdirhier(pkgdir)
-            os.chdir(pkgdir)
+            bb.utils.checked_chdir(pkgdir)
             logger.debug(1, "Running %s", cvscmd)
             bb.fetch2.check_network_access(d, cvscmd, ud.url)
             cmd = cvscmd
@@ -158,11 +158,11 @@ class Cvs(FetchMethod):
 
         # tar them up to a defined filename
         if 'fullpath' in ud.parm:
-            os.chdir(pkgdir)
+            bb.utils.checked_chdir(pkgdir)
             cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, localdir)
         else:
-            os.chdir(moddir)
-            os.chdir('..')
+            bb.utils.checked_chdir(moddir)
+            bb.utils.checked_chdir('..')
             cmd = "tar %s -czf %s %s" % (tar_flags, ud.localpath, os.path.basename(moddir))
 
         runfetchcmd(cmd, d, cleanup = [ud.localpath])
diff --git a/lib/bb/fetch2/git.py b/lib/bb/fetch2/git.py
index 534c87d..d4f841b 100644
--- a/lib/bb/fetch2/git.py
+++ b/lib/bb/fetch2/git.py
@@ -135,7 +135,7 @@ class Git(FetchMethod):
     def need_update(self, u, ud, d):
         if not os.path.exists(ud.clonedir):
             return True
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         for name in ud.names:
             if not self._contains_ref(ud.revisions[name], d):
                 return True
@@ -165,7 +165,7 @@ class Git(FetchMethod):
         # If the checkout doesn't exist and the mirror tarball does, extract it
         if not os.path.exists(ud.clonedir) and os.path.exists(ud.fullmirror):
             bb.utils.mkdirhier(ud.clonedir)
-            os.chdir(ud.clonedir)
+            bb.utils.checked_chdir(ud.clonedir)
             runfetchcmd("tar -xzf %s" % (ud.fullmirror), d)
 
         # If the repo still doesn't exist, fallback to cloning it
@@ -175,7 +175,7 @@ class Git(FetchMethod):
             bb.fetch2.check_network_access(d, clone_cmd)
             runfetchcmd(clone_cmd, d)
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         # Update the checkout if needed
         needupdate = False
         for name in ud.names:
@@ -199,7 +199,7 @@ class Git(FetchMethod):
     def build_mirror_data(self, url, ud, d):
         # Generate a mirror tarball if needed
         if ud.write_tarballs and (ud.repochanged or not os.path.exists(ud.fullmirror)):
-            os.chdir(ud.clonedir)
+            bb.utils.checked_chdir(ud.clonedir)
             logger.info("Creating tarball of git repository")
             runfetchcmd("tar -czf %s %s" % (ud.fullmirror, os.path.join(".") ), d)
 
@@ -219,7 +219,7 @@ class Git(FetchMethod):
 
         runfetchcmd("git clone -s -n %s %s" % (ud.clonedir, destdir), d)
         if not ud.nocheckout:
-            os.chdir(destdir)
+            bb.utils.checked_chdir(destdir)
             if subdir != "":
                 runfetchcmd("%s read-tree %s%s" % (ud.basecmd, ud.revisions[ud.names[0]], readpathspec), d)
                 runfetchcmd("%s checkout-index -q -f -a" % ud.basecmd, d)
@@ -289,12 +289,12 @@ class Git(FetchMethod):
                 return None
 
 
-        os.chdir(ud.clonedir)
+        bb.utils.checked_chdir(ud.clonedir)
         if not self._contains_ref(rev, d):
             self.download(None, ud, d)
 
         output = runfetchcmd("%s rev-list %s -- 2> /dev/null | wc -l" % (ud.basecmd, rev), d, quiet=True)
-        os.chdir(cwd)
+        bb.utils.checked_chdir(cwd)
 
         buildindex = "%s" % output.split()[0]
         logger.debug(1, "GIT repository for %s in %s is returning %s revisions in rev-list before %s", url, ud.clonedir, buildindex, rev)
diff --git a/lib/bb/fetch2/hg.py b/lib/bb/fetch2/hg.py
index 793831a..f8a3b1e 100644
--- a/lib/bb/fetch2/hg.py
+++ b/lib/bb/fetch2/hg.py
@@ -124,7 +124,7 @@ class Hg(FetchMethod):
             updatecmd = self._buildhgcommand(ud, d, "pull")
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", updatecmd)
             bb.fetch2.check_network_access(d, updatecmd, ud.url)
             runfetchcmd(updatecmd, d)
@@ -134,7 +134,7 @@ class Hg(FetchMethod):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", fetchcmd)
             bb.fetch2.check_network_access(d, fetchcmd, ud.url)
             runfetchcmd(fetchcmd, d)
@@ -142,7 +142,7 @@ class Hg(FetchMethod):
         # Even when we clone (fetch), we still need to update as hg's clone
         # won't checkout the specified revision if its on a branch
         updatecmd = self._buildhgcommand(ud, d, "update")
-        os.chdir(ud.moddir)
+        bb.utils.checked_chdir(ud.moddir)
         logger.debug(1, "Running %s", updatecmd)
         runfetchcmd(updatecmd, d)
 
@@ -152,7 +152,7 @@ class Hg(FetchMethod):
         else:
             tar_flags = "--exclude '.hg' --exclude '.hgrags'"
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
         runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d, cleanup = [ud.localpath])
 
     def supports_srcrev(self):
diff --git a/lib/bb/fetch2/osc.py b/lib/bb/fetch2/osc.py
index a16a53e..1e22ca9 100644
--- a/lib/bb/fetch2/osc.py
+++ b/lib/bb/fetch2/osc.py
@@ -88,7 +88,7 @@ class Osc(FetchMethod):
             oscupdatecmd = self._buildosccommand(ud, d, "update")
             logger.info("Update "+ loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", oscupdatecmd)
             bb.fetch2.check_network_access(d, oscupdatecmd, ud.url)
             runfetchcmd(oscupdatecmd, d)
@@ -97,12 +97,12 @@ class Osc(FetchMethod):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", oscfetchcmd)
             bb.fetch2.check_network_access(d, oscfetchcmd, ud.url)
             runfetchcmd(oscfetchcmd, d)
 
-        os.chdir(os.path.join(ud.pkgdir + ud.path))
+        bb.utils.checked_chdir(os.path.join(ud.pkgdir + ud.path))
         # tar them up to a defined filename
         runfetchcmd("tar -czf %s %s" % (ud.localpath, ud.module), d, cleanup = [ud.localpath])
 
diff --git a/lib/bb/fetch2/perforce.py b/lib/bb/fetch2/perforce.py
index cbdc848..bb13594 100644
--- a/lib/bb/fetch2/perforce.py
+++ b/lib/bb/fetch2/perforce.py
@@ -165,7 +165,7 @@ class Perforce(FetchMethod):
             cset = Perforce.getcset(d, depot, host, user, pswd, parm)
             depot = "%s@%s" % (depot, cset)
 
-        os.chdir(tmpfile)
+        bb.utils.checked_chdir(tmpfile)
         logger.info("Fetch " + loc)
         logger.info("%s%s files %s", p4cmd, p4opt, depot)
         p4file = os.popen("%s%s files %s" % (p4cmd, p4opt, depot))
diff --git a/lib/bb/fetch2/repo.py b/lib/bb/fetch2/repo.py
index 8300da8..e267917 100644
--- a/lib/bb/fetch2/repo.py
+++ b/lib/bb/fetch2/repo.py
@@ -70,14 +70,14 @@ class Repo(FetchMethod):
             username = ""
 
         bb.utils.mkdirhier(os.path.join(codir, "repo"))
-        os.chdir(os.path.join(codir, "repo"))
+        bb.utils.checked_chdir(os.path.join(codir, "repo"))
         if not os.path.exists(os.path.join(codir, "repo", ".repo")):
             bb.fetch2.check_network_access(d, "repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), ud.url)
             runfetchcmd("repo init -m %s -b %s -u %s://%s%s%s" % (ud.manifest, ud.branch, ud.proto, username, ud.host, ud.path), d)
 
         bb.fetch2.check_network_access(d, "repo sync %s" % ud.url, ud.url)
         runfetchcmd("repo sync", d)
-        os.chdir(codir)
+        bb.utils.checked_chdir(codir)
 
         scmdata = ud.parm.get("scmdata", "")
         if scmdata == "keep":
diff --git a/lib/bb/fetch2/svk.py b/lib/bb/fetch2/svk.py
index 9d34abf..a2fb864 100644
--- a/lib/bb/fetch2/svk.py
+++ b/lib/bb/fetch2/svk.py
@@ -84,12 +84,12 @@ class Svk(FetchMethod):
             raise FetchError("Fetch: unable to create temporary directory.. make sure 'mktemp' is in the PATH.", loc)
 
         # check out sources there
-        os.chdir(tmpfile)
+        bb.utils.checked_chdir(tmpfile)
         logger.info("Fetch " + loc)
         logger.debug(1, "Running %s", svkcmd)
         runfetchcmd(svkcmd, d, cleanup = [tmpfile])
 
-        os.chdir(os.path.join(tmpfile, os.path.dirname(ud.module)))
+        bb.utils.checked_chdir(os.path.join(tmpfile, os.path.dirname(ud.module)))
         # tar them up to a defined filename
         runfetchcmd("tar -czf %s %s" % (ud.localpath, os.path.basename(ud.module)), d, cleanup = [ud.localpath])
 
diff --git a/lib/bb/fetch2/svn.py b/lib/bb/fetch2/svn.py
index 59d7ccb..0bc5203 100644
--- a/lib/bb/fetch2/svn.py
+++ b/lib/bb/fetch2/svn.py
@@ -116,7 +116,7 @@ class Svn(FetchMethod):
             svnupdatecmd = self._buildsvncommand(ud, d, "update")
             logger.info("Update " + loc)
             # update sources there
-            os.chdir(ud.moddir)
+            bb.utils.checked_chdir(ud.moddir)
             logger.debug(1, "Running %s", svnupdatecmd)
             bb.fetch2.check_network_access(d, svnupdatecmd, ud.url)
             runfetchcmd(svnupdatecmd, d)
@@ -125,7 +125,7 @@ class Svn(FetchMethod):
             logger.info("Fetch " + loc)
             # check out sources there
             bb.utils.mkdirhier(ud.pkgdir)
-            os.chdir(ud.pkgdir)
+            bb.utils.checked_chdir(ud.pkgdir)
             logger.debug(1, "Running %s", svnfetchcmd)
             bb.fetch2.check_network_access(d, svnfetchcmd, ud.url)
             runfetchcmd(svnfetchcmd, d)
@@ -136,7 +136,7 @@ class Svn(FetchMethod):
         else:
             tar_flags = "--exclude '.svn'"
 
-        os.chdir(ud.pkgdir)
+        bb.utils.checked_chdir(ud.pkgdir)
         # tar them up to a defined filename
         runfetchcmd("tar %s -czf %s %s" % (tar_flags, ud.localpath, ud.module), d, cleanup = [ud.localpath])
 
diff --git a/lib/bb/pysh/builtin.py b/lib/bb/pysh/builtin.py
index b748e4a..693becf 100644
--- a/lib/bb/pysh/builtin.py
+++ b/lib/bb/pysh/builtin.py
@@ -581,11 +581,11 @@ def utility_sort(name, args, interp, env, stdin, stdout, stderr, debugflags):
     # Load all files lines
     curdir = os.getcwd()
     try:
-        os.chdir(env['PWD'])
+        bb.utils.checked_chdir(env['PWD'])
         for path in args:
             alllines += sort(path)
     finally:
-        os.chdir(curdir)
+        bb.utils.checked_chdir(curdir)
             
     alllines.sort()
     for line in alllines:
diff --git a/lib/bb/pysh/interp.py b/lib/bb/pysh/interp.py
index 25d8c92..da3aa21 100644
--- a/lib/bb/pysh/interp.py
+++ b/lib/bb/pysh/interp.py
@@ -1261,11 +1261,11 @@ class Interpreter:
             
         def pwd_glob(pattern):
             cwd = os.getcwd()
-            os.chdir(self._env['PWD'])
+            bb.utils.checked_chdir(self._env['PWD'])
             try:
                 return glob.glob(pattern) 
             finally:
-                os.chdir(cwd)    
+                bb.utils.checked_chdir(cwd)
             
         #TODO: check working directory issues here wrt relative patterns
         try:
diff --git a/lib/bb/utils.py b/lib/bb/utils.py
index 82e5dc4..99d3d8a 100644
--- a/lib/bb/utils.py
+++ b/lib/bb/utils.py
@@ -856,3 +856,20 @@ def to_boolean(string, default=None):
         return False
     else:
         raise ValueError("Invalid value for to_boolean: %s" % string)
+
+def checked_chdir(name):
+    """Change the working directory and verify that the change took
+    effect, raising an OSError if it did not."""
+
+    # The fetchers are a bit sloppy and pass paths with doubled or
+    # trailing slashes and relative parts like '..', so normalize the
+    # requested path before changing into it.
+    old_dir = os.getcwd()
+    new_dir = os.path.normpath(os.path.join(old_dir, name))
+    os.chdir(new_dir)
+
+    # os.getcwd() returns a symlink-free path, hence the realpath()
+    # on our side of the comparison.
+    if os.getcwd() != os.path.realpath(new_dir):
+        raise OSError('Failed to change to directory \'%s\': now in \'%s\', '
+                      'expected \'%s\'' % (name, os.getcwd(), new_dir))
-- 
1.7.4.1


^ permalink raw reply related	[flat|nested] 8+ messages in thread

* Re: Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
  2011-07-01 13:44 ` Mark Hatle
@ 2011-07-02 21:09   ` Holger Hans Peter Freyther
  2011-07-03 12:28   ` Holger Hans Peter Freyther
  1 sibling, 0 replies; 8+ messages in thread
From: Holger Hans Peter Freyther @ 2011-07-02 21:09 UTC (permalink / raw)
  To: poky

[-- Attachment #1: Type: text/plain, Size: 1639 bytes --]

On 07/01/2011 03:44 PM, Mark Hatle wrote:
>> f) The size of zypper
>> > libzypp is a _huge_ library, did anyone look into shrinking it? maybe
>> > by making zypper link statically to it? Did anyone look into the alternatives
>> > like SMART?
> zypper is somewhat of a pig.  Again this wasn't really designed for really small
> systems.
> 
> I've used smart in the past, it's worth looking into... but it's been a lack of
> time.  (Plus smart, at least used to, require python.. which itself is a pig, if
> you don't already need python.)
> 


Hi again,

I looked around a bit and found:
	- zypper
	- SMART
	- yum
	- apt-rpm

zypper/libzypp:
	- I tried to produce a static libzypp and hoped the linker would
	  strip it down, but I ended up with unresolved symbols in zypper and
	  didn't feel like continuing (it would need -Wl,--start-group/--end-group
	  juggling, see the sketch after this list).

SMART:
	- I assume that the python installation it needs will bring it into
	  the same ballpark as libzypp/zypper, size-wise.

yum:
	- It is already slow on my laptop, so it will not fly on a 300 MHz device.


apt-rpm:
	- I stumbled over a port of apt-rpm to rpm 5.x (rpm is such a mess).
	  It compiles after patching out a json dependency (jsoncpp, a library
	  built with SCons, which mainly highlighted our broken scons support).
	- Even with parts of the package that we do not need, the installed
	  size is < 2MB.
	- genbasedir requires our RPMs to be laid out as RPMS.all/,
	  RPMS.armv5te/ ... and it is segfaulting.
	- I have attached my current bb file (against my company's bblayer).
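
A rough sketch of the kind of link line I had in mind for the static zypper
experiment (the library list is made up for illustration; the relevant bit is
the -Wl,--start-group/-Wl,--end-group pair, which makes ld rescan the archives
in the group until no further undefined symbols can be resolved):

$ ${CXX} ${LDFLAGS} -o zypper zypper.o \
      -Wl,--start-group libzypp.a -lcurl -lxml2 -lboost_program_options \
      -Wl,--end-group   # library list is illustrative, not the real dependency set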


Does anyone want to jump in and try to get apt-rpm to replace zypper? The
net saving will probably be around 5-6MB...


[-- Attachment #2: apt_rpm --]
[-- Type: text/plain, Size: 3910 bytes --]

diff --git a/meta-sysmocom/recipes-system/apt-rpm/apt-rpm_0.5.bb b/meta-sysmocom/recipes-system/apt-rpm/apt-rpm_0.5.bb
new file mode 100644
index 0000000..1681104
--- /dev/null
+++ b/meta-sysmocom/recipes-system/apt-rpm/apt-rpm_0.5.bb
@@ -0,0 +1,22 @@
+DESCRIPTION = "apt-rpm for rpm based distros"
+HOMEPAGE = "http://gitorious.org/rpm5distro/apt-rpm"
+SRC_URI = "git://gitorious.org/rpm5distro/apt-rpm.git;protocol=git \
+	   file://hacks.patch"
+SRCREV = "2a6bd7a847e9a7ad269c637a1b673dafb71d26dd"
+LICENSE = "GPL"
+LIC_FILES_CHKSUM = "file://COPYING.GPL;md5=0636e73ff0215e8d672dc4c32c317bb3"
+
+S = "${WORKDIR}/git"
+
+DEPENDS = "rpm"
+DEPENDS_virtclass-native = "rpm-native"
+PR = "r1"
+
+EXTRA_OECONF += " --disable-scripts"
+
+FILES_${PN} += "${libdir}/apt/methods/*"
+FILES_${PN}-dbg += "${libdir}/apt/methods/.debug"
+
+inherit autotools gettext
+
+BBCLASSEXTEND = "native"
diff --git a/meta-sysmocom/recipes-system/apt-rpm/files/hacks.patch b/meta-sysmocom/recipes-system/apt-rpm/files/hacks.patch
new file mode 100644
index 0000000..c652984
--- /dev/null
+++ b/meta-sysmocom/recipes-system/apt-rpm/files/hacks.patch
@@ -0,0 +1,118 @@
+diff --git a/Makefile.am b/Makefile.am
+index f4efa1f..540abbe 100644
+--- a/Makefile.am
++++ b/Makefile.am
+@@ -2,7 +2,7 @@ SUBDIRS =
+ if WITH_LUAEXT
+ SUBDIRS += luaext
+ endif
+-SUBDIRS += apt-pkg methods cmdline tools doc po
++SUBDIRS += apt-pkg methods cmdline tools doc
+ SUBDIRS += test
+ 
+ ACLOCAL_AMFLAGS = -I m4 -I buildlib
+diff --git a/apt-pkg/Makefile.am b/apt-pkg/Makefile.am
+index 670ccb4..58ea9d2 100644
+--- a/apt-pkg/Makefile.am
++++ b/apt-pkg/Makefile.am
+@@ -6,7 +6,6 @@ pkgconfigdir = $(libdir)/pkgconfig
+ pkgconfig_DATA = libapt-pkg.pc
+ 
+ libapt_pkg_la_LIBADD = @RPM_LIBS@
+-libapt_pkg_la_LIBADD += -ljsoncpp
+ libapt_pkg_la_LDFLAGS = -version-info 4:0:0
+ 
+ AM_CPPFLAGS = -DLIBDIR=\"$(libdir)\" -DPKGDATADIR=\"$(pkgdatadir)\"
+diff --git a/apt-pkg/contrib/dudf.cc b/apt-pkg/contrib/dudf.cc
+index 9272670..af48504 100644
+--- a/apt-pkg/contrib/dudf.cc
++++ b/apt-pkg/contrib/dudf.cc
+@@ -28,7 +28,7 @@
+ #include <libxml/parser.h>
+ #include <libxml/tree.h>
+ 
+-#include <jsoncpp/json.h>
++//#include <jsoncpp/json.h>
+ 
+ #include <tr1/cinttypes>
+ 
+@@ -106,6 +106,7 @@ GlobalDudf::GlobalDudf()
+ 
+ }
+ 
++#if 0
+ Json::Value getTagArray(Header hdr, rpmTag tagname)
+ {
+ 	HE_t tagdata = (HE_t)memset(alloca(sizeof(*tagdata)), 0, sizeof(*tagdata));
+@@ -177,6 +178,7 @@ Json::Value getTagString(Header hdr, rpmTag tag)
+ 	return val;
+     }
+ }
++#endif
+ 
+ bool isDepFlag(rpmuint64_t value)
+ {
+@@ -190,6 +192,7 @@ struct StringComparator {
+ };
+ 
+ 
++#if 0
+ /*
+  * Generic getter for versioned dependency style info
+  * It should work for requires, provides, conflicts and obsoletes
+@@ -268,6 +271,7 @@ Json::Value getTagRequires(Header hdr, rpmTag tag, rpmTag flag_tag, rpmTag versi
+ 	return *arr;
+ 
+ }
++#endif
+ 
+ /**
+  * Dump the RPM installation status into a JSON object to be embedded in the DUDF
+@@ -276,6 +280,7 @@ Json::Value getTagRequires(Header hdr, rpmTag tag, rpmTag flag_tag, rpmTag versi
+ string GlobalDudf::dumpRPMDb()
+ {
+ 	stringstream rpmDbDump;
++#if 0
+ 	//TODO: We should be reusing the RPM db connection that apt already has, maybe
+ 	// through RPMDBHandler class
+ 	rpmReadConfigFiles(NULL, NULL);
+@@ -328,6 +333,7 @@ string GlobalDudf::dumpRPMDb()
+ 		rpmDbDump << fast.write(*array_of_pkgs) << endl;
+ 	}
+ 
++#endif
+ 	return rpmDbDump.str();
+ 
+ }
+diff --git a/configure.ac b/configure.ac
+index 6da87a5..8318505 100644
+--- a/configure.ac
++++ b/configure.ac
+@@ -296,7 +296,6 @@ AC_CONFIG_FILES([
+ 	  tools/Makefile
+ 	  doc/Makefile
+ 	  test/Makefile
+-	  po/Makefile.in
+ 	  ])
+ AC_CONFIG_LINKS([include/apti18n.h:buildlib/gettext.h])
+ 

^ permalink raw reply related	[flat|nested] 8+ messages in thread

* Re: Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
  2011-07-01 13:44 ` Mark Hatle
  2011-07-02 21:09   ` Holger Hans Peter Freyther
@ 2011-07-03 12:28   ` Holger Hans Peter Freyther
  2011-07-12 19:55     ` Mark Hatle
  1 sibling, 1 reply; 8+ messages in thread
From: Holger Hans Peter Freyther @ 2011-07-03 12:28 UTC (permalink / raw)
  To: poky

On 07/01/2011 03:44 PM, Mark Hatle wrote:

> 
> macro files are being loaded from:
> 
> %{_usrlibrpm}/macros:%{_usrlibrpm}/poky/macros:%{_usrlibrpm}/poky/%{_target}/macros:%{_etcrpm}/macros.*:%{_etcrpm}/macros:%{_etcrpm}/%{_target}/macros:~/.oerpmmacros
> 
> The key above is the /etc/rpm/macros.*
> 
> I would recommend a new file be generated called /etc/rpm/macros.jffs2 that
> changes the setting as appropriate for that filesystem.  (How it's placed into
> the filesystem I'm not sure.  I think it all comes down to detecting we're
> building a jffs2 filesystem and doing it there.  Perhaps in the rootfs_rpm.bbclass?)

What do you think about something like the change below? Alternatively one
could write a postinst script that checks at run time whether the root
filesystem is jffs2 and then creates the config file, roughly as sketched below.
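
For the postinst variant, a rough sketch of what I mean (plain /bin/sh run on
the device, writing the same macro line as the image-time change below):

#!/bin/sh
# rough sketch: create the jffs2 workaround macro file only when the
# root filesystem really is jffs2 and the file is not already there
fstype="$(awk '$2 == "/" { fs = $3 } END { print fs }' /proc/mounts)"
if [ "$fstype" = "jffs2" ] && [ ! -e /etc/rpm/macros.jffs2 ]; then
        mkdir -p /etc/rpm
        echo "%__dbi_txn      create lock log txn auto_commit nommap private" \
                > /etc/rpm/macros.jffs2
fi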


diff --git a/meta/classes/rootfs_rpm.bbclass b/meta/classes/rootfs_rpm.bbclass
index 3a11858..70459e5 100644
--- a/meta/classes/rootfs_rpm.bbclass
+++ b/meta/classes/rootfs_rpm.bbclass
@@ -139,6 +139,14 @@ EOF
        install -d ${IMAGE_ROOTFS}/${sysconfdir}
        echo ${BUILDNAME} > ${IMAGE_ROOTFS}/${sysconfdir}/version

+
+       # check if a jffs2 image is being built; it needs a workaround
+       # because jffs2 lacks support for shared read/write mmap().
+       (echo "${IMAGE_FSTYPES}" | grep "jffs2" > /dev/null)
+       if [ $? = 0 ]; then
+               echo "%__dbi_txn      create lock log txn auto_commit nommap private" > ${IMAGE_ROOTFS}/etc/rpm/macros.jffs2
+       fi
+
        ${RPM_POSTPROCESS_COMMANDS}
        ${ROOTFS_POSTPROCESS_COMMAND}

@@ -164,6 +172,7 @@ EOF
 }

 remove_packaging_data_files() {
+       rm -rf ${IMAGE_ROOTFS}/etc/rpm/macros.jffs2
        rm -rf ${IMAGE_ROOTFS}${rpmlibdir}
        rm -rf ${IMAGE_ROOTFS}${opkglibdir}
 }
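
One can verify on the device that rpm really picks the extra macro file up,
e.g.:

$ rpm --eval '%__dbi_txn'

which should print the value from /etc/rpm/macros.jffs2 (or the built-in
default if the file is not being read).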


^ permalink raw reply related	[flat|nested] 8+ messages in thread

* Re: Installing GDB or the saga of getting RPM/Zypp to run on JFFS2
  2011-07-03 12:28   ` Holger Hans Peter Freyther
@ 2011-07-12 19:55     ` Mark Hatle
  0 siblings, 0 replies; 8+ messages in thread
From: Mark Hatle @ 2011-07-12 19:55 UTC (permalink / raw)
  To: poky

On 7/3/11 7:28 AM, Holger Hans Peter Freyther wrote:
> On 07/01/2011 03:44 PM, Mark Hatle wrote:
> 
>>
>> macro files are being loaded from:
>>
>> %{_usrlibrpm}/macros:%{_usrlibrpm}/poky/macros:%{_usrlibrpm}/poky/%{_target}/macros:%{_etcrpm}/macros.*:%{_etcrpm}/macros:%{_etcrpm}/%{_target}/macros:~/.oerpmmacros
>>
>> The key above is the /etc/rpm/macros.*
>>
>> I would recommend a new file be generated called /etc/rpm/macros.jffs2 that
>> changes the setting as appropriate for that filesystem.  (How it's placed into
>> the filesystem I'm not sure.  I think it all comes down to detecting we're
>> building a jffs2 filesystem and doing it there.  Perhaps in the rootfs_rpm.bbclass?)
> 
> What do you think about something like the change below? Alternatively one
> could write a post-inst script that checks if one is on jffs2 and then creates
> the config file.

Sorry for the late reply.  I'm back from vacation now.

The below is fine with me.  I suggest it be submitted as a patch to oe-core.
I'll be happy to ack it.

--Mark

> 
> diff --git a/meta/classes/rootfs_rpm.bbclass b/meta/classes/rootfs_rpm.bbclass
> index 3a11858..70459e5 100644
> --- a/meta/classes/rootfs_rpm.bbclass
> +++ b/meta/classes/rootfs_rpm.bbclass
> @@ -139,6 +139,14 @@ EOF
>         install -d ${IMAGE_ROOTFS}/${sysconfdir}
>         echo ${BUILDNAME} > ${IMAGE_ROOTFS}/${sysconfdir}/version
> 
> +
> +       # check if a jffs2 image is being built; it needs a workaround
> +       # because jffs2 lacks support for shared read/write mmap().
> +       (echo "${IMAGE_FSTYPES}" | grep "jffs2" > /dev/null)
> +       if [ $? = 0 ]; then
> +               echo "%__dbi_txn      create lock log txn auto_commit nommap private" > ${IMAGE_ROOTFS}/etc/rpm/macros.jffs2
> +       fi
> +
>         ${RPM_POSTPROCESS_COMMANDS}
>         ${ROOTFS_POSTPROCESS_COMMAND}
> 
> @@ -164,6 +172,7 @@ EOF
>  }
> 
>  remove_packaging_data_files() {
> +       rm -rf ${IMAGE_ROOTFS}/etc/rpm/macros.jffs2
>         rm -rf ${IMAGE_ROOTFS}${rpmlibdir}
>         rm -rf ${IMAGE_ROOTFS}${opkglibdir}
>  }
> _______________________________________________
> poky mailing list
> poky@yoctoproject.org
> https://lists.yoctoproject.org/listinfo/poky



^ permalink raw reply	[flat|nested] 8+ messages in thread

end of thread, other threads:[~2011-07-12 19:55 UTC | newest]

Thread overview: 8+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2011-07-01 12:48 Installing GDB or the saga of getting RPM/Zypp to run on JFFS2 Holger Hans Peter Freyther
2011-07-01 13:05 ` Koen Kooi
2011-07-01 13:44 ` Mark Hatle
2011-07-02 21:09   ` Holger Hans Peter Freyther
2011-07-03 12:28   ` Holger Hans Peter Freyther
2011-07-12 19:55     ` Mark Hatle
2011-07-01 14:18 ` [poky] " Holger Hans Peter Freyther
2011-07-01 14:18   ` Holger Hans Peter Freyther
