
Repack: use one thread, but allow deltas #1178

Merged
derrickstolee merged 1 commit into microsoft:master from
derrickstolee:packfile-maintenance-deltas
May 20, 2019

Conversation

@derrickstolee
Contributor

When running 'git multi-pack-index repack', we are setting two
config options intended to speed up the underlying 'git pack-objects'
command:

pack.delta=0    (default is 50)
pack.window=0   (default is 10)

These were inserted to prevent the delta calculations from taking over
a user's processor during a background operation. When packing the
from-loose packs, this can become an expensive operation.

However, this came with a significant downside, due to my
misunderstanding of how these options work. When repacking the (already
nicely-packed) prefetch packs, these options force deltified trees
to become un-deltified. This means the resulting pack can be larger
than the given batch size.

To prevent losing these good deltas, drop these config options and
instead use pack.threads=1 to prevent multiple threads from taking
over the machine. In combination with the recent lower-priority git
processes, this should keep the background repack from disrupting
users, but will also keep our pack directory small.
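In terms of the configuration passed down to 'git pack-objects', the change described above amounts to the following (a sketch based on this description; option names and defaults are as quoted in this PR):

```
# Before: disable delta search entirely, which also expands
# existing deltas when repacking already-packed objects
pack.delta=0    # default is 50
pack.window=0   # default is 10

# After: keep the default delta search, but single-threaded,
# so a background repack stays off the user's other cores
pack.threads=1
```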

In my testing, I used the Windows repository and ran the packfile
maintenance step with a batch size of "100m" instead of "2g". This
allowed me to run it with my real data, which was currently in a
state where "2g" would do nothing.

Before: 588m pack, repack took 50s
After: 80m pack, repack took 28s*

The fact that the repack sped up is possibly related to writing
less data to disk. I would expect this to slow down in some cases.

This expansion of deltas explains why users running the packfile
maintenance step directly have a higher-than-expected steady-state.
We are not optimally repacking the data.
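For reference, the corrected invocation can be exercised end to end with a small script (a sketch, not the real maintenance code: it uses a throwaway repository, and the `100m` batch size is the test value mentioned above, not the production `2g` default):

```shell
#!/bin/sh
# Sketch: keep delta compression enabled, but restrict
# 'git pack-objects' to a single thread so a background
# repack does not take over the machine.
set -e

# Throwaway repository for illustration (assumption: not the real layout).
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"

echo hello >file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm init

# Put all objects into a pack-file and cover it with a multi-pack-index.
git repack -adq
git multi-pack-index write

# The fix: drop pack.delta=0/pack.window=0 and use pack.threads=1 instead,
# so existing deltas are preserved while CPU use stays bounded.
git -c pack.threads=1 multi-pack-index repack --batch-size=100m
```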


Signed-off-by: Derrick Stolee <[email protected]>
@derrickstolee derrickstolee merged commit 2ae2527 into microsoft:master May 20, 2019
derrickstolee added a commit that referenced this pull request May 20, 2019
…mputations

This includes two changes in/against master:

#1157: Run git operations at a lower priority
#1178: Compute deltas while running packfile maintenance.
@jrbriggs jrbriggs modified the milestones: M152, M153 May 23, 2019

3 participants