[yocto] Safely cleaning 'downloads'

Martin Jansa martin.jansa at gmail.com
Thu Oct 1 10:12:33 PDT 2015


On Thu, Oct 01, 2015 at 09:54:51AM -0700, Christopher Larson wrote:
> On Thu, Oct 1, 2015 at 9:49 AM, Gary Thomas <gary at mlbassoc.com> wrote:
> 
> > On 2015-10-01 10:38, Smith, Virgil wrote:
> >
> >> The following is roughly the procedure I follow and that works for me.
> >> Maybe someone could chime in with how some of this should be trimmed based
> >> on yocto/bitbake intent/design.
> >> Even so I'd probably stick with this level of extremism because without a
> >> known good backup of your downloads(sources) you may be incapable of
> >> (tweaking and) rebuilding your products if anything happens to your build
> >> server.
> >>
> >> The only reason I've seen that simply deleting the downloads folder
> >> causes problems is that external servers/content go away, rewrite their
> >> git history, or replace files with non-identical contents.
> >>
> >>
> >> Warning: The following does not maintain PR server information, so
> >> automatic upgrading of your own packages could break.  If you rely on
> >> this, work out how to extract that information (and back it up
> >> regularly).
> >>
> >> 1. rename/move your current downloads folder and create a new one.
> >> 2. for all of your product build configurations empty out the following
> >> folders
> >> 2.1 cache
> >> 2.2 state-cache
> >> 2.3 tmp
> >> 3. build (bitbake) all your product images with all appropriate
> >> configuration variances
> >> 4. run the following command to extract the unexpanded sources from
> >> downloads
> >> find -H downloads -maxdepth 1 \
> >>       -not -type d   -and   -not -name "*.done" \
> >>       -exec cp -L {} sources-tmp \;
> >>
> >> You now have everything you *currently* need for a sources mirror in the
> >> sources-tmp folder.
> >>
> >> 5. move sources-tmp to wherever/whatever backs your SOURCE_MIRROR_URL.
> >> 6. Check those contents into some form of revision control (even if that
> >> is just a manual set of backup folders/media).
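> >>
> >> For reference, a minimal local.conf sketch for steps 5/6 (the mirror
> >> path is only a placeholder, point it at whatever backs your mirror):
> >>
> >>   # use the local mirror before hitting upstream servers
> >>   INHERIT += "own-mirrors"
> >>   SOURCE_MIRROR_URL = "file:///path/to/your/sources-mirror"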
> >>
> >>
> >> Yes, this costs time and space; you just have to decide how much your
> >> images, and being able to reproduce them (with or without 'small'
> >> changes), are worth.
> >>
> >
> > I'm already doing more or less this same sequence.  I use these commands to
> > stage the downloaded files to my mirror (/work/misc/Poky/sources for
> > historical reasons)
> >   ( cd downloads;
> >     find . -maxdepth 1 -type f | grep -v '.lock$' | grep -v '.done$' >/tmp/files.$$;
> >     rsync -auv --files-from=/tmp/files.$$ . /work/misc/Poky/sources
> >   )
> >
> > This works very well (I've been doing it for many years).  The issue I'm
> > trying to work on now is that my script leaves 'downloads' possibly full
> > of files, especially if there are new versions that have just been
> > downloaded.  This is especially noticeable for the tarballs of GIT trees -
> > there are a number that I need/use that are measured in gigabytes (e.g.
> > the RaspberryPi board firmware is 4194568645 bytes as of 2015-07-20!)
> > Once I've saved these to my mirror(s), I'd like to be able to purge them
> > from the local download directory in my builds.  As mentioned, I've found
> > that just wiping that in a build tree tends to break things quite badly.
> > Of course I can always start over with a new build tree, but that also
> > defeats the purpose of incremental builds.
> 
> 
> I'd think something like this would get the job done:
> 
> 1. Do a build of all your supported machines and configurations with
> BB_GENERATE_MIRROR_TARBALLS=1 to ensure you have current, not out of date
> scm tarballs.
> 
> 2. Set up builds of all your supported machines and configurations, using a
> new DL_DIR, with PREMIRRORS pointing to the old DL_DIR.
> 
> 3. Either clean up the old DL_DIR by access time (removing anything not
> accessed since you kicked off the builds), or resolve the symlinks in the
> new DL_DIR and remove the old one.
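> 
> Roughly (an untested sketch, paths are placeholders): step 1 only needs
> BB_GENERATE_MIRROR_TARBALLS = "1" in local.conf; for step 2 the second
> pass would then carry something like:
> 
>   DL_DIR = "${TOPDIR}/downloads-new"
>   PREMIRRORS_prepend = "\
>       git://.*/.*     file:///path/to/old/downloads/ \n \
>       ftp://.*/.*     file:///path/to/old/downloads/ \n \
>       http://.*/.*    file:///path/to/old/downloads/ \n \
>       https://.*/.*   file:///path/to/old/downloads/ \n"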

I'm doing the same, but make sure not to re-use sstate in the 2nd build,
otherwise many components will be restored from sstate without their
sources ever being downloaded.

It's easier to use the fetchall task instead of an actual build in the 2nd
step.
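
For example (the image name is just a placeholder):

  bitbake -c fetchall your-image

only runs the fetch tasks for the image and everything it depends on,
without building anything or re-populating tmp and sstate-cache.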

Similarly, when doing the same thing to clean sstate-cache (if you don't
trust sstate-cache-management.sh), you can end up with the 2nd build
completely built from sstate while the sstate archives for many
intermediate dependencies were never accessed - you can build almost the
whole image just by reusing the packagedata sstate archives for all
included packages, but once you modify one of them you'll need the
do_populate_sysroot archives for all of its dependencies.
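
If you do trust the script, something along these lines (option names from
memory, check the script's --help; paths assume a default build directory)
keeps only the sstate archives referenced by the stamps of a completed
build:

  ./scripts/sstate-cache-management.sh \
      --cache-dir=sstate-cache \
      --stamps-dir=tmp/stamps \
      -y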

Regards,

-- 
Martin 'JaMa' Jansa     jabber: Martin.Jansa at gmail.com