3 Pipeline Macro Library

 (require shell/pipeline-macro) package: shell-pipeline

3.1 shell/pipeline-macro stability

This library is not entirely stable.

The base set of pipeline operators is likely to change, and there are some names I want to review before a stable release.

3.2 shell/pipeline-macro guide

This module is a macro DSL wrapper around the Mixed Unix-style and Racket Object Pipelines library. It is designed for running pipelines of external processes (which pass each other ports) and Racket functions (which pass each other objects). It does this with a very flat syntax and user-definable pipeline operators, which provide a lot of convenient sugar for making pipelines shorter. It is particularly tailored for use in a line-based syntax, like that of the Rash language.

Here are some quick examples:

;; Pipe the output of ls to grep.
(run-pipeline =unix-pipe= ls -l =unix-pipe= grep foo)
 
;; To save on space, let's assume % is bound to =unix-pipe=
(run-pipeline % ls -l % grep foo)

We can also pipeline objects. Object pipelines are made of functions instead of process specifications.

;; This will return 2
(run-pipeline =object-pipe= list 1 2 3 =object-pipe= second)
 
;; To save on space, let's assume %> is bound to =object-pipe=
(run-pipeline %> list 1 2 3 %> second)

We can mix the two:

;; Capitalized ls output.  =object-pipe= automatically converts ports to strings.
(run-pipeline % ls -l %> string-upcase)

I am really running out of steam for documenting right now... TODO - write a good guide.

3.3 shell/pipeline-macro reference

3.3.1 Running Pipelines

syntax

(run-pipeline pipeline-flag ...
              pipeline-member-spec ...
              pipeline-flag ...)
 
pipeline-member-spec = pipe-operator pipe-operator-arg ...
     
pipeline-flag = &bg
  | &pipeline-ret
  | &in file-expression
  | &< file-name
  | &out file-expression
  | &> file-name
  | &>> file-name
  | &>! file-name
  | &err file-expression
  | &strict
  | &permissive
  | &lazy
  | &lazy-timeout timeout-expression
  | &env env-expression
Run a pipeline. This is a macro wrapper for shell/mixed-pipeline/run-pipeline that uses pipeline operator macros to specify the pipeline to be run, so you should read the documentation for that function as well.

The pipeline flags affect the options passed to shell/mixed-pipeline/run-pipeline and are documented separately.

The pipeline-member-specs are transformed according to the pipeline operators given. If the first non-flag argument to run-pipeline is not a pipeline operator, then a default is put in its place as determined by default-pipeline-starter. The full names of pipeline operators are conventionally identifiers surrounded with = signs.

At the time of writing I’m not really sure what to write here, so have an example:

(run-pipeline =object-pipe= list 1 2 3
              =map= + 1 current-pipeline-argument
              =map= + 1)

This returns

(list 3 4 5)

Notice that current-pipeline-argument is placed automatically for the second =map= operator. It could have been left off of the first one as well, but I wanted to show what identifier is sneakily added.

If we instead run
(run-pipeline =object-pipe= list 1 2 3
              =map= + 1 current-pipeline-argument
              =map= + 1
              &bg)
we will get a pipeline object back. Conceptually it is still running when it is returned, though in this case it’s likely finished by the time we can inspect it. We can use pipeline?, pipeline-success?, pipeline-return, etc on it.

3.3.2 Pipeline Flags

pipeline-flag

(&bg)

pipeline-flag

(&pipeline-ret)

pipeline-flag

(&in port-expression)

pipeline-flag

(&< file-name)

pipeline-flag

(&out port/reader-expression)

pipeline-flag

(&> file-name)

pipeline-flag

(&>! file-name)

pipeline-flag

(&>> file-name)

pipeline-flag

(&err port-expression)

pipeline-flag

(&strict)

pipeline-flag

(&permissive)

pipeline-flag

(&lazy)

pipeline-flag

(&lazy-timeout timeout-expression)

These identifiers are all errors if used outside of run-pipeline. They are used in place of #:keyword arguments so that they don't conflict with pipeline operators that accept keywords.

&<, &>, &>>, and &>! each take a file name and cause (respectively) input redirection from the given file, output redirection to the given file erroring if the file exists, output redirection appending to the given file, and output redirection truncating the given file. &in, &out, and &err take an argument suitable to be passed to #:in, #:out, and #:err of shell/mixed-pipeline/run-pipeline.

&bg and &pipeline-ret toggle #:bg and #:return-pipeline-object, and &strict, &permissive, and &lazy set the #:strictness argument.
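As a small sketch combining a few flags (the file names here are hypothetical; the redirection flags take them as identifiers):

```racket
;; Read grep's input from input.txt, append its output to log.txt,
;; and use strict success checking (&strict sets #:strictness).
(run-pipeline &< input.txt &>> log.txt &strict
              =unix-pipe= grep foo)
```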

3.3.3 Pipeline Operators

The core module only provides a few simple pipeline operators. There are many more in the demo/ directory in the source repository. Most of them are hastily written experiments, but some good ones should eventually be standardized.

pipeline-operator

(=composite-pipe= (pipe-op arg ...) ...+)

Produces a composite pipeline member spec made from the given pipeline operators. This is really more for use when defining new pipeline operators than for use in a pipeline itself.

pipeline-operator

(=basic-unix-pipe= options ... args ...+)

Produces a unix-pipeline-member-spec with the given arguments as the process/function argument list. The arguments are not quoted. Any argument that produces a list will be spliced into the argument list.

Options all take an argument, must precede any arguments, and are as follows:

#:as - This is sugar for adding on an object pipeline member afterward that parses the output somehow. This should be given either #f (no transformation), a port reading function (eg. port->string), or one of a pre-set list of symbols: 'string, 'trim, 'lines, or 'words.

#:e> - Accepts a file name (as an identifier), redirects the error stream to that file. Produces an error if the file exists.

#:e>! - Accepts a file name (as an identifier), redirects the error stream to that file. Truncates the file if it exists.

#:e>> - Accepts a file name (as an identifier), redirects the error stream to that file. Appends to the file if it exists.

#:err - Takes an expression to produce an error redirection value suitable for unix-pipeline-member-spec.

#:success - Takes an expression suitable for the #:success argument of unix-pipeline-member-spec.

TODO - env modification
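A brief sketch of using the #:as option. Since =basic-unix-pipe= does not quote its arguments, the command and flag are quoted explicitly here:

```racket
;; Run `ls -l`, then parse its output into a list of lines.
(run-pipeline =basic-unix-pipe= #:as 'lines 'ls '-l)
```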

pipeline-operator

(=quoting-basic-unix-pipe= options ... args ...+)

Like =basic-unix-pipe=, except that it quotes all of its arguments that are identifiers. All non-identifier arguments (notably parenthesized forms) are not quoted, and thus you can unquote by using parentheses.

(define x "/etc")
(define-syntax id (syntax-parser [(_ x) #'x]))
 
;; I find I really don't mind this as a means of unquoting here.
(run-pipeline =quoting-basic-unix-pipe= ls (id x))

pipeline-operator

(=unix-pipe= arg ...+)

This is the pipe that does more or less what you expect. It does tilde expansion (~ -> $HOME). It does globbing. When you have $identifiers-with-dollar-signs they are expanded into variable references. When $DOLLAR_IDENTIFIERS_ARE_CAPITALIZED they are expanded to environment variable lookups.

After all that expansion, it passes through to =quoting-basic-unix-pipe=.

However, if the first argument is a pipeline alias defined with define-pipeline-alias or define-simple-pipeline-alias, then the operator from that alias is swapped in instead, skipping everything else that this operator would normally do.

(run-pipeline =unix-pipe= echo $HOME/*.rkt)
(define-simple-pipeline-alias d 'ls '--color=auto)
(define dfdir 'dotfiles)
(run-pipeline =unix-pipe= d $HOME/$dfdir)

Usually \| is used instead.

pipeline-operator

(\| arg ...+)

Alias for =unix-pipe=.

Note that the backslash is required in the normal Racket reader because | is normally treated specially. In the Rash reader, you can get this by typing just |.

pipeline-operator

(=basic-object-pipe/expression= e)

The simplest object pipe. e is simply the body of a lambda. When used as a pipeline starter, the lambda accepts no arguments. Otherwise it is a single-argument function, and current-pipeline-argument is used to refer to its argument.
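A sketch of the starter and joint positions with /expression pipes:

```racket
;; Starter: a zero-argument lambda body.
;; Joint: the body may refer to current-pipeline-argument,
;; which holds the previous member's result.
(run-pipeline =basic-object-pipe/expression= (list 1 2 3)
              =basic-object-pipe/expression= (length current-pipeline-argument))
```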

pipeline-operator

(=basic-object-pipe/form= arg ...+)

Creates an object pipe where (arg ...) is the body of a function.

As with other object pipes, when used as a pipeline starter it generates a lambda with no arguments, and as a pipeline joint it generates a lambda with one argument, current-pipeline-argument.

pipeline-operator

(=basic-object-pipe= arg ...+)

Like =basic-object-pipe/form=, except that when not used as a pipeline starter, if the current-pipeline-argument is not used within the arguments, it is appended as the last argument.

To discover whether current-pipeline-argument is used, each argument is local-expanded. So (arg ...) must be equivalent to a function application form and not a macro invocation form.
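So these two pipelines are equivalent; the second merely writes the appended argument explicitly:

```racket
(run-pipeline =basic-object-pipe= list 1 2 3
              =basic-object-pipe= length)
(run-pipeline =basic-object-pipe= list 1 2 3
              =basic-object-pipe= length current-pipeline-argument)
```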

Usually \|> is used instead.

pipeline-operator

(\|> arg ...+)

Alias for =basic-object-pipe=.

Note that the backslash is required in the normal Racket reader because | is normally treated specially. In the Rash reader, you can get this by typing just |>.

pipeline-operator

(=object-pipe/expression= e)

Like =basic-object-pipe/expression=, but when it receives a port as an argument, it converts it to a string.

pipeline-operator

(=object-pipe/form= arg ...+)

Like =basic-object-pipe/form=, but when it receives a port as an argument, it converts it to a string.

pipeline-operator

(=object-pipe= arg ...+)

Like =basic-object-pipe=, but when it receives a port as an argument, it converts it to a string.

Usually \|>> is used instead.

pipeline-operator

(\|>> arg ...+)

Alias for =object-pipe=.

Note that the backslash is required in the normal Racket reader because | is normally treated specially. In the Rash reader, you can get this by typing just |>>.

pipeline-operator

default-pipeline-starter

Syntax parameter determining which pipeline operator is inserted when a run-pipeline form doesn’t explicitly start with one.

I’ve written various other pipeline operators that are a little more exciting and that are currently in the demo directory of the repository. I’ll eventually polish them up and put them somewhere stable. They include things like unix pipes that automatically glob things, unix pipes that have lexically scoped alias resolution, =filter=, =map=, =for/stream=, =for/list/unix-arg=, =for/list/unix-input=...

3.3.4 Defining Pipeline Operators

syntax

(define-pipeline-operator name start-or-joint ...)

 
start-or-joint = #:start transformer
  | #:joint transformer
Define a pipeline operator. Pipeline operators can act differently when they are in the starting position of a pipeline or later (joint position). Specifically, when an operator creates an object-pipeline-member-spec, it needs to have a function that accepts 0 arguments when in the start position and 1 argument in others.

If a transformer function is not specified for one of the options, a default implementation (that generates an error) is used.

The transformer will receive a syntax object corresponding to (name-of-pipe argument ...), so it will likely want to ignore its first argument like most macros do. But sometimes it may be useful to recur.

Example uses are in the demo directory in the repository.
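As a hedged sketch of the shape such a definition takes (the =double= operator and its expansion into =object-pipe= are illustrative assumptions, not taken from the library's docs):

```racket
(require (for-syntax racket/base syntax/parse))

;; A sketch: an operator usable only in joint position that doubles
;; the (numeric) value flowing through the pipeline, by expanding
;; into an =object-pipe= member.
(define-pipeline-operator =double=
  #:joint (syntax-parser
            [(_) #'(=object-pipe= * 2 current-pipeline-argument)]))

;; Hypothetical use:
;; (run-pipeline =object-pipe= + 1 2 =double=)
```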

syntax

(define-pipeline-alias name transformer)

Defines an alias macro recognized by =unix-pipe= and maybe others.

transformer must be a syntax transformer function, and must return a syntax object that starts with a pipeline operator.

;; Unix `find` has to take the directory first, but we want
;; to always add the -type f flag at the end.
(define-pipeline-alias find-f
  (syntax-parser
    [(_ arg ...) #'(=unix-pipe= find arg ... -type f)]))
 
;; these are equivalent
(run-pipeline =unix-pipe= find-f ".")
(run-pipeline =unix-pipe= find "." -type f)

syntax

(define-simple-pipeline-alias name cmd arg ...)

Simple sugar for define-pipeline-alias. It defines an alias transformer that uses =unix-pipe=.

(define-simple-pipeline-alias d 'ls '--color=auto)
;; these are equivalent
(run-pipeline =unix-pipe= d -l $HOME)
(run-pipeline =unix-pipe= 'ls '--color=auto -l $HOME)

syntax

current-pipeline-argument

The name of the implicit argument for object pipes. The default is an error, and pipe operators that accept it must set it up using expand-pipeline-arguments or syntax-parameterize.

Usually _ is used instead.

syntax

_

Alias for current-pipeline-argument. It’s an underscore, if you’re having trouble telling which of the many horizontal line characters it is since it’s all alone right there in that bar.
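For example, using _ to put the pipeline value somewhere other than the last argument position:

```racket
;; Without the explicit _, the list would be appended as the *last*
;; argument to append; here _ places it first instead.
(run-pipeline =object-pipe= list 1 2 3
              =object-pipe= append _ (list 4 5))
```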

syntax

expand-pipeline-arguments

TODO - document this.

3.3.5 Inspecting Pipelines

procedure

pipeline? : procedure?

procedure

pipeline-success? : procedure?

procedure

pipeline-wait : procedure?

procedure

pipeline-return : procedure?

procedure

pipeline-start-ms : procedure?

procedure

pipeline-end-ms : procedure?

These predicates and accessors work on the pipeline objects returned when using &bg or &pipeline-ret.

3.4 Demo stuff reference

These things are documented in the Rash documentation, but I’m adding these definitions to not have broken links...

You can get these with

(require rash/demo/setup)

pipeline-operator

(=map= arg ...)

pipeline-operator

(=filter= arg ...)