
Re: [PATCH v3] coccicheck: process every source file at once
On Fri, Oct 05, 2018 at 08:50:50PM +0200, SZEDER Gábor wrote:

> On Fri, Oct 05, 2018 at 12:59:01PM -0400, Jeff King wrote:
> > On Fri, Oct 05, 2018 at 04:53:35PM +0000, Keller, Jacob E wrote:
> > 
> > > > Are we OK with saying 1.3-1.8GB is necessary to run coccicheck? That
> > > > doesn't feel like an exorbitant request for a developer-only tool these
> > > > days, but I have noticed some people on the list tend to have lousier
> > > > machines than I do. ;)
> > > > 
> > > > -Peff
> > > 
> > > It's probably not worth trying to make this more complicated and scale
> > > up how many files we do at once based on the amount of available
> > > memory on the system...
> > 
> > Yeah, that sounds too complicated. At most I'd give a Makefile knob to
> > say "spatch in batches of $(N)". But I'd prefer to avoid even that
> > complexity if we can.
> 
> But perhaps one more if-else, e.g.:
> 
>   if test -n "$(COCCICHECK_ALL_AT_ONCE)"; then \
>       <all at once from Jacob>
>   else
>       <old for loop>
>   fi
> 
> would be an acceptable compromise?  Dunno.

That's OK, too, assuming people would actually want to use it. I'm also
OK shipping this (with the "make -j" fix you suggested) and seeing if
anybody actually complains. I assume there are only a handful of people
running coccicheck in the first place.

-Peff
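
(For readers following along: the "batches of $(N)" knob discussed above could be driven by something as simple as xargs. The sketch below is illustrative only, not git's actual Makefile recipe; SPATCH_BATCH_SIZE, the file list, and the use of echo as a stand-in for spatch are all assumptions.)

```shell
# Hypothetical sketch of batching source files through spatch.
# SPATCH_BATCH_SIZE is an assumed knob name; echo stands in for the
# real spatch invocation so the grouping is easy to see.
SPATCH_BATCH_SIZE=2
printf '%s\n' a.c b.c c.c d.c e.c |
xargs -n "$SPATCH_BATCH_SIZE" echo spatch --sp-file check.cocci
```

With a batch size of 2, the five files are passed to the command in three invocations (two files, two files, one file), which bounds peak memory per spatch process while still amortizing startup cost across files.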