Large pack causes git clone failures ... what to do?

Large pack causes git clone failures ... what to do?

Geoff Russell-3
Hi,

I did a "git gc" on a repository and ended up with a 4GB pack ... now I
can't clone the repository and get the following:


remote: fatal: Out of memory? mmap failed: Cannot allocate memory
remote: aborting due to possible repository corruption on the remote side.
fatal: early EOF
error: git upload-pack: git-pack-objects died with error.
fatal: git upload-pack: aborting due to possible repository corruption
on the remote side.
fatal: index-pack failed

How do I deal with this?  I'm running git version 1.6.2.3.

I've looked at "git repack --max-pack-size", but while that created new
packs it didn't delete the old monster. If I run gc, how do I tell it
about the max pack size? It doesn't seem to support this argument.
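
Presumably the config equivalent is pack.packSizeLimit, which I gather
pack-objects (and hence gc) reads, though I haven't confirmed that on
1.6.2.3; the 100m below is just an example value:

$ git config pack.packSizeLimit 100m
$ git gc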

Cheers,
Geoff

--
6 Fifth Ave,
St Morris, S.A. 5068
Australia
Ph: 041 8805 184 / 08 8332 5069
http://perfidy.com.au

Re: Large pack causes git clone failures ... what to do?

Shawn Pearce
Geoff Russell <[hidden email]> wrote:

> I did a "git gc" on a repository and ended up with a 4GB pack ... now I
> can't clone the repository and get the following:
>
> remote: fatal: Out of memory? mmap failed: Cannot allocate memory
> remote: aborting due to possible repository corruption on the remote side.
> fatal: early EOF
> error: git upload-pack: git-pack-objects died with error.
> fatal: git upload-pack: aborting due to possible repository corruption
> on the remote side.
> fatal: index-pack failed
>
> How do I deal with this?   I'm running git version 1.6.2.3

Are you on a 32-bit Linux system, or 64-bit?  Git should be
auto-selecting a window size that would allow it to mmap slices of
that 4GB pack.
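
If it's 32-bit, it might be worth shrinking the mmap window and total
map limit on the server; the knobs should be core.packedGitWindowSize
and core.packedGitLimit, though the values here are only a guess:

  git config core.packedGitWindowSize 32m
  git config core.packedGitLimit 128m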

> I've looked at "git repack --max-pack-size", but while that
> created new packs it didn't delete the old monster.

You really needed to run:

  git repack --max-pack-size=.. -a -d

The -d flag tells it to remove the old packs once the new packs
are ready, and the -a flag tells it to reconsider every object
in the repository, rather than just those that are loose.
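
For example (the 100M here is an arbitrary choice), followed by a
quick check that the old monster pack really went away:

  git repack --max-pack-size=100M -a -d
  git count-objects -v
  ls objects/pack/   # or .git/objects/pack/ in a non-bare repository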

But if you can't clone it, you probably can't repack it.  Clone works
by creating a pack file on the server, just like repack does.
Except it sends the pack out to the network stream instead of to
local disk.
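
If the server keeps dying while building that pack, one low-tech
escape hatch is to copy the repository at the file level to a machine
with more memory and repack it there; the paths below are placeholders:

  rsync -a user@server:/path/to/repo.git/ repo.git/
  cd repo.git
  git repack --max-pack-size=100M -a -d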

--
Shawn.

Re: Large pack causes git clone failures ... what to do?

Geoff Russell-3
Thanks Shawn,

On Wed, Sep 1, 2010 at 3:32 AM, Shawn O. Pearce <[hidden email]> wrote:
> Geoff Russell <[hidden email]> wrote:
>> I did a "git gc" on a repository and ended up with a 4GB pack ... now I
>> can't clone the repository and get the following:
>> ...
>
> Are you on a 32-bit Linux system, or 64-bit?  Git should be
> auto-selecting a window size that would allow it to mmap slices of
> that 4GB pack.

32-bit.

>
>> I've looked at "git repack --max-pack-size", but while that
>> created new packs it didn't delete the old monster.
>
> You really needed to run:
>
>  git repack --max-pack-size=.. -a -d
>
> The -d flag tells it to remove the old packs once the new packs
> are ready, and the -a flag tells it to reconsider every object
> in the repository, rather than just those that are loose.

Ok, will try.

>
> But if you can't clone it, you probably can't repack it.  Clone works

The cloning fails at different points in the process and the server is normally
under some load, so perhaps load is a factor.

> by creating a pack file on the server, just like repack does.
> Except it sends the pack out to the network stream instead of to
> local disk.

Does a clone from a client take note of pack.packSizeLimit if I set it
on the server, or does it use the client value?

Cheers and many thanks; annoying problems like this always happen at
really inconvenient times :)

Geoff.

>
> --
> Shawn.
>

Re: Large pack causes git clone failures ... what to do?

Geoff Russell-3
On Wed, Sep 1, 2010 at 7:33 AM, Geoff Russell
<[hidden email]> wrote:

> Thanks Shawn,
>
>...
>> You really needed to run:
>>
>>  git repack --max-pack-size=.. -a -d
>>
>> The -d flag tells it to remove the old packs once the new packs
>> are ready, and the -a flag tells it to reconsider every object
>> in the repository, rather than just those that are loose.
>
> Ok, will try.

The repack failed with "fatal: Out of memory, malloc failed"; perhaps I
just need to try a machine with more memory!
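
Before that I might try capping the delta search memory, which I
understand pack-objects reads from the pack.* config; the values here
are just guesses:

$ git config pack.windowMemory 256m
$ git config pack.deltaCacheSize 128m
$ git config pack.threads 1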

I'm still interested in whether a clone from a client takes note of
pack.packSizeLimit if I set it on the server, or whether it uses the
client value.

Cheers,
Geoff

Re: Large pack causes git clone failures ... what to do?

Geoff Russell-3
On Wed, Sep 1, 2010 at 11:23 AM, Geoff Russell
<[hidden email]> wrote:

> On Wed, Sep 1, 2010 at 7:33 AM, Geoff Russell
> <[hidden email]> wrote:
>> Thanks Shawn,
>>
>>...
>>> You really needed to run:
>>>
>>>  git repack --max-pack-size=.. -a -d
>>>
>>> The -d flag tells it to remove the old packs once the new packs
>>> are ready, and the -a flag tells it to reconsider every object
>>> in the repository, rather than just those that are loose.
>>
>> Ok, will try.
>
> The repack failed with a "fatal: Out of memory, malloc failed", perhaps I
> just need to try a machine with more memory!

Ok, I rsynced the directory to a machine with 12GB of memory and ran
the repack (git version 1.7.2.2). The repack worked (and quickly) but
left a "bad" sha1 file behind:

$ git repack --max-pack-size=100M -a -d
Counting objects: 517563, done.
Delta compression using up to 8 threads.
Compressing objects: 100% (154217/154217), done.
Writing objects: 100% (517563/517563), done.
Total 517563 (delta 353081), reused 465715 (delta 335261)
Removing duplicate objects: 100% (256/256), done.

$ git fsck
bad sha1 file: ./objects/5b/.fd25f132c21493b661978fc9362f673ea6e58b.cwxzjT
dangling commit c7a4ecaa1732869f9bfa21d948cb8714fd303713

I removed the bad file on the presumption that it was a leftover
temporary file, reran the fsck, and all looked okay.
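
The .cwxzjT suffix looks like a temporary file that never got renamed
into place, which is why I assumed it was safe to delete. Presumably a
fuller check, plus pruning of unreachable loose objects such as that
dangling commit, would be something like:

$ git fsck --full
$ git prune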

Cheers,
Geoff.

Re: Large pack causes git clone failures ... what to do?

Shawn Pearce
In reply to this post by Geoff Russell-3
Geoff Russell <[hidden email]> wrote:
>
> I'm still interested in whether a clone from a client takes note of
> pack.packSizeLimit if I set it on the server, or whether it uses the
> client value.

Neither.

A clone doesn't split its pack.  It stores the entire project
as a single pack file.  If your filesystem cannot do that, the
clone fails.
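
If you want the split packs on the client side, about the only thing
you can do is repack locally after the clone completes; the URL below
is a placeholder:

  git clone git://server/repo.git
  cd repo
  git repack --max-pack-size=100M -a -d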

--
Shawn.

Re: Large pack causes git clone failures ... what to do?

Geoff Russell-3
On Thu, Sep 2, 2010 at 12:08 AM, Shawn O. Pearce <[hidden email]> wrote:

> Geoff Russell <[hidden email]> wrote:
>>
>> I'm still interested in whether a clone from a client takes note of
>> pack.packSizeLimit if I set it on the server, or whether it uses the
>> client value.
>
> Neither.
>
> A clone doesn't split its pack.  It stores the entire project
> as a single pack file.  If your filesystem cannot do that, the
> clone fails.

I've moved the "master" repository to a faster machine with plenty of
memory and all the problems have gone away.  I was making wrong guesses
about the cause.  A fresh clone gives a huge pack, but no problems, and
everything runs much better.

Thanks for your help.

Geoff.