
Tweak and regress-test 'command -x' (re: 66e1d446)

Turns out the assumption I was operating on, that Linux and macOS
align arguments on 32- or 64-bit boundaries, is incorrect -- they
just need some extra bytes per argument. So we can use a bit more
of the arguments buffer on these systems than I thought.

src/cmd/ksh93/features/externs:
- Change the feature test to simply detect the number of extra bytes
  per argument needed. On *BSD and commercial Unices, ARG_EXTRA_BYTES
  shows as zero; on Linux and macOS (64-bit), this yields 8. On
  Linux (32-bit), this yields 4.

src/cmd/ksh93/sh/path.c: path_xargs():
- Do not try to calculate alignment, just add ARG_EXTRA_BYTES to
  each argument.
- Also add this when subtracting the length of environment
  variables and leading and trailing static command arguments.

src/cmd/ksh93/tests/path.sh:
- Test command -v/-V with -x.
- Add a robust regression test for command -x.

src/cmd/ksh93/data/builtins.c, src/cmd/ksh93/sh.1:
- Tweak docs. Glob patterns also expand to multiple words.
Martijn Dekker 2021-02-01 00:28:18 +00:00
parent f37098f177
commit 6a0e9a1a75
6 changed files with 103 additions and 44 deletions


@@ -164,23 +164,20 @@ static pid_t path_xargs(Shell_t *shp,const char *path, char *argv[],char *const
 		return((pid_t)-1);
 	size = shp->gd->lim.arg_max-2048;
 	for(ev=envp; cp= *ev; ev++)
-		size -= strlen(cp)+1;
+		size -= strlen(cp) + 1 + ARG_EXTRA_BYTES;
 	for(av=argv; (cp= *av) && av< &argv[shp->xargmin]; av++)
-		size -= strlen(cp)+1;
+		size -= strlen(cp) + 1 + ARG_EXTRA_BYTES;
 	for(av=avlast; cp= *av; av++,nlast++)
-		size -= strlen(cp)+1;
+		size -= strlen(cp) + 1 + ARG_EXTRA_BYTES;
 	av = &argv[shp->xargmin];
 	if(!spawn)
 		job_clear();
 	shp->exitval = 0;
 	while(av<avlast)
 	{
-		/* for each argument, account for terminating zero and possible alignment */
+		/* for each argument, account for terminating zero and possible extra bytes */
 		for(xv=av,left=size; left>0 && av<avlast;)
-		{
-			n = strlen(*av++) + 1 + ARG_ALIGN_BYTES;
-			left -= n + (ARG_ALIGN_BYTES ? n % ARG_ALIGN_BYTES : 0);
-		}
+			left -= strlen(*av++) + 1 + ARG_EXTRA_BYTES;
 		/* leave at least two for last */
 		if(left<0 && (avlast-av)<2)
 			av--;