* Large pack causes git clone failures ... what to do?
@ 2010-08-31  7:16 Geoff Russell
  2010-08-31 18:02 ` Shawn O. Pearce
  0 siblings, 1 reply; 7+ messages in thread
From: Geoff Russell @ 2010-08-31  7:16 UTC (permalink / raw)
  To: git

Hi,

I did a "git gc" on a repository and ended up with a 4GB pack ... now I
can't clone the repository and get the following:


remote: fatal: Out of memory? mmap failed: Cannot allocate memory
remote: aborting due to possible repository corruption on the remote side.
fatal: early EOF
error: git upload-pack: git-pack-objects died with error.
fatal: git upload-pack: aborting due to possible repository corruption
on the remote side.
fatal: index-pack failed

How do I deal with this?   I'm running git version 1.6.2.3

I've looked at "git repack --max-pack-size", but while that created new packs it
didn't delete the old monster. If I run gc, how do I tell it about the
max pack size? It doesn't seem to support this argument.
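
Would setting pack.packSizeLimit in the config be the right way to get gc
to split things up? That's just a guess on my part that gc's repack would
honour it, e.g.:

  git config pack.packSizeLimit 500m   # guessed value, assuming gc honours it
  git gc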

Cheers,
Geoff

--
6 Fifth Ave,
St Morris, S.A. 5068
Australia
Ph: 041 8805 184 / 08 8332 5069
http://perfidy.com.au


* Re: Large pack causes git clone failures ... what to do?
  2010-08-31  7:16 Large pack causes git clone failures ... what to do? Geoff Russell
@ 2010-08-31 18:02 ` Shawn O. Pearce
  2010-08-31 22:03   ` Geoff Russell
  0 siblings, 1 reply; 7+ messages in thread
From: Shawn O. Pearce @ 2010-08-31 18:02 UTC (permalink / raw)
  To: Geoff Russell; +Cc: git

Geoff Russell <geoffrey.russell@gmail.com> wrote:
> I did a "git gc" on a repository and ended up with a 4GB pack ... now I
> can't clone the repository and get the following:
> 
> remote: fatal: Out of memory? mmap failed: Cannot allocate memory
> remote: aborting due to possible repository corruption on the remote side.
> fatal: early EOF
> error: git upload-pack: git-pack-objects died with error.
> fatal: git upload-pack: aborting due to possible repository corruption
> on the remote side.
> fatal: index-pack failed
> 
> How do I deal with this?   I'm running git version 1.6.2.3

Are you on a 32 bit Linux system?  Or 64 bit?  Git should be auto
selecting a window size that would allow it to mmap slices of that 4GB pack.
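
If it isn't picking sane values on a 32 bit box, you could try forcing a
smaller mmap window on the server; these are rough, untested guesses at
values:

  git config core.packedGitWindowSize 32m
  git config core.packedGitLimit 256m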

> I've looked at "git repack --max-pack-size", but while that
> created new packs it didn't delete the old monster.

You really needed to run:

  git repack --max-pack-size=.. -a -d

The -d flag tells it to remove the old packs once the new packs
are ready, and the -a flag tells it to reconsider every object
in the repository, rather than just those that are loose.

But if you can't clone it, you probably can't repack it.  Clone works
by creating a pack file on the server, just like repack does.
Except it sends the pack out to the network stream instead of to
local disk.

-- 
Shawn.


* Re: Large pack causes git clone failures ... what to do?
  2010-08-31 18:02 ` Shawn O. Pearce
@ 2010-08-31 22:03   ` Geoff Russell
  2010-09-01  1:53     ` Geoff Russell
  0 siblings, 1 reply; 7+ messages in thread
From: Geoff Russell @ 2010-08-31 22:03 UTC (permalink / raw)
  To: Shawn O. Pearce; +Cc: git

Thanks Shawn,

On Wed, Sep 1, 2010 at 3:32 AM, Shawn O. Pearce <spearce@spearce.org> wrote:
> Geoff Russell <geoffrey.russell@gmail.com> wrote:
>> I did a "git gc" on a repository and ended up with a 4GB pack ... now I
>> can't clone the repository and get the following:
>> ...
>
> Are you on a 32 bit Linux system?  Or 64 bit?  Git should be auto
> selecting a window size that would allow it to mmap slices of that 4GB pack.

32 bit.

>
>> I've looked at "git repack --max-pack-size", but while that
>> created new packs it didn't delete the old monster.
>
> You really needed to run:
>
>  git repack --max-pack-size=.. -a -d
>
> The -d flag tells it to remove the old packs once the new packs
> are ready, and the -a flag tells it to reconsider every object
> in the repository, rather than just those that are loose.

Ok, will try.

>
> But if you can't clone it, you probably can't repack it.  Clone works

The cloning fails at different points in the process and the server is normally
under some load, so perhaps load is a factor.

> by creating a pack file on the server, just like repack does.
> Except it sends the pack out to the network stream instead of to
> local disk.

Does a clone from a client take note of pack.packSizeLimit if I set it
on the server, or does it use the client's value?

Cheers and many thanks; annoying problems like this always happen at really
inconvenient times :)

Geoff.

>
> --
> Shawn.
>


* Re: Large pack causes git clone failures ... what to do?
  2010-08-31 22:03   ` Geoff Russell
@ 2010-09-01  1:53     ` Geoff Russell
  2010-09-01  3:02       ` Geoff Russell
  2010-09-01 14:38       ` Shawn O. Pearce
  0 siblings, 2 replies; 7+ messages in thread
From: Geoff Russell @ 2010-09-01  1:53 UTC (permalink / raw)
  To: Shawn O. Pearce; +Cc: git

On Wed, Sep 1, 2010 at 7:33 AM, Geoff Russell
<geoffrey.russell@gmail.com> wrote:
> Thanks Shawn,
>
>...
>> You really needed to run:
>>
>>  git repack --max-pack-size=.. -a -d
>>
>> The -d flag tells it to remove the old packs once the new packs
>> are ready, and the -a flag tells it to reconsider every object
>> in the repository, rather than just those that are loose.
>
> Ok, will try.

The repack failed with a "fatal: Out of memory, malloc failed"; perhaps I
just need to try a machine with more memory!
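
Before hunting for a bigger machine I might try throttling the repack's
memory use on this one. These are just guesses at the relevant knobs and
values:

  git config pack.threads 1            # single delta-compression thread
  git config pack.windowMemory 256m    # cap memory used for delta windows
  git config pack.deltaCacheSize 128m  # cap the delta cache
  git repack --max-pack-size=500m -a -d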

I'm still interested in whether a clone from a client takes note of the
pack.packSizeLimit if I set it on the server, or whether it uses the
client value.

Cheers,
Geoff


* Re: Large pack causes git clone failures ... what to do?
  2010-09-01  1:53     ` Geoff Russell
@ 2010-09-01  3:02       ` Geoff Russell
  2010-09-01 14:38       ` Shawn O. Pearce
  1 sibling, 0 replies; 7+ messages in thread
From: Geoff Russell @ 2010-09-01  3:02 UTC (permalink / raw)
  To: Shawn O. Pearce; +Cc: git

On Wed, Sep 1, 2010 at 11:23 AM, Geoff Russell
<geoffrey.russell@gmail.com> wrote:
> On Wed, Sep 1, 2010 at 7:33 AM, Geoff Russell
> <geoffrey.russell@gmail.com> wrote:
>> Thanks Shawn,
>>
>>...
>>> You really needed to run:
>>>
>>>  git repack --max-pack-size=.. -a -d
>>>
>>> The -d flag tells it to remove the old packs once the new packs
>>> are ready, and the -a flag tells it to reconsider every object
>>> in the repository, rather than just those that are loose.
>>
>> Ok, will try.
>
> The repack failed with a "fatal: Out of memory, malloc failed"; perhaps I
> just need to try a machine with more memory!

Ok, I rsynced the directory to a machine with 12GB of memory and ran the
repack (git version 1.7.2.2). The repack worked (and quickly) but left
a "bad" sha1 file behind:

$ git repack --max-pack-size=100M -a -d
Counting objects: 517563, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (154217/154217), done.
Writing objects: 100% (517563/517563), done.
Total 517563 (delta 353081), reused 465715 (delta 335261)
Removing duplicate objects: 100% (256/256), done.

$ git fsck
bad sha1 file: ./objects/5b/.fd25f132c21493b661978fc9362f673ea6e58b.cwxzjT
dangling commit c7a4ecaa1732869f9bfa21d948cb8714fd303713

I removed the bad file on the presumption that it was a leftover temporary
file, reran the fsck, and all looked okay.
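
I'll probably give it a fuller check as well before trusting it, something
along these lines:

  git fsck --full
  git count-objects -v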

Cheers,
Geoff.


* Re: Large pack causes git clone failures ... what to do?
  2010-09-01  1:53     ` Geoff Russell
  2010-09-01  3:02       ` Geoff Russell
@ 2010-09-01 14:38       ` Shawn O. Pearce
  2010-09-06  0:34         ` Geoff Russell
  1 sibling, 1 reply; 7+ messages in thread
From: Shawn O. Pearce @ 2010-09-01 14:38 UTC (permalink / raw)
  To: Geoff Russell; +Cc: git

Geoff Russell <geoffrey.russell@gmail.com> wrote:
> 
> I'm still interested in whether a clone from a client takes note of the
> pack.packSizeLimit if I set it on the server, or whether it uses the
> client value.

Neither.

A clone doesn't split its pack.  It stores the entire project
as a single pack file.  If your filesystem cannot do that, the
clone fails.
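
If the single pack is a problem on the client side, it can always be split
up locally once the clone finishes; roughly (the URL is just a placeholder):

  git clone git://example.com/project.git
  cd project
  git repack --max-pack-size=100M -a -d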

-- 
Shawn.


* Re: Large pack causes git clone failures ... what to do?
  2010-09-01 14:38       ` Shawn O. Pearce
@ 2010-09-06  0:34         ` Geoff Russell
  0 siblings, 0 replies; 7+ messages in thread
From: Geoff Russell @ 2010-09-06  0:34 UTC (permalink / raw)
  To: Shawn O. Pearce, git

On Thu, Sep 2, 2010 at 12:08 AM, Shawn O. Pearce <spearce@spearce.org> wrote:
> Geoff Russell <geoffrey.russell@gmail.com> wrote:
>>
>> I'm still interested in whether a clone from a client takes note of the
>> pack.packSizeLimit if I set it on the server, or whether it uses the
>> client value.
>
> Neither.
>
> A clone doesn't split its pack.  It stores the entire project
> as a single pack file.  If your filesystem cannot do that, the
> clone fails.

I've moved the "master" repository to a faster machine with plenty of
memory and all the problems have gone away.  I was making wrong
guesses about the cause. A fresh clone gives a huge pack, but no problems,
and everything runs much better.
Thanks for your help.

Geoff.

