bash: any way to reuse the last output?

Bill Rugolsky Jr. brugolsky at telemetry-investments.com
Fri Jan 23 13:37:00 UTC 2004


On Fri, Jan 23, 2004 at 11:01:10AM -0200, Alexandre Oliva wrote:
> Now, if you really want to be able to take the output of some random
> program you ran and feed that into another program, there are ways
> to do that.  Start GNU screen, or a shell within GNU Emacs, or just
> use some terminal program with a configurable scrollback buffer and,
> whenever you need the output of some other program, you can scroll
> back and cut&paste the output into the input of another program.  No
> need to build such abilities into the shell.

expect(1) is another possibility.  As has been explained in this thread,
something needs to "tee" the output to a file.  When you run a command
and it generates output to, e.g., /dev/pts/N, the shell is not involved.
(It's usually sleeping in the wait4() system call until the command pipeline
terminates.)  Piping the output through another process makes stdio
switch from line to block buffering, which is not what you want.  You
need pty terminal handling, so the command still thinks it is talking
to a terminal.
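As a rough illustration of the pty point (my sketch, not part of the
thread): script(1) from util-linux allocates a pty for the command, so
the command keeps line-buffered, terminal-style output while everything
that appears on screen is also saved to a file.  The filename here is
just an example.

```shell
# Run a command under a pty with script(1); -q suppresses the
# start/stop banner, -c runs the given command instead of an
# interactive shell.  The command's terminal output lands in the file.
script -q -c 'echo hello from a pty' /tmp/last-output.0

# The file now holds exactly what appeared on the terminal:
grep 'hello from a pty' /tmp/last-output.0
```

Contrast with a plain pipe, where the command sees a non-tty on stdout
and stdio typically switches to block buffering.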

One could use expect's "interact" command to save the output of each
command in a ring of files (say 0-9) in a spool directory.  As Alexandre
says, the stdout/stderr thing is a problem, and one probably needs *3*
files, stdout, stderr, and stdout+stderr.  A set of simple shell aliases
can be used to refer to the Nth previous output for that session.
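The alias side of that could look something like this (a sketch under
assumptions: that some capture layer -- expect, script(1), whatever --
already writes each command's output to out.N in a spool directory,
with out.0 the most recent; the names SPOOL and lastout are mine, not
anything standard):

```shell
# Spool directory maintained by the hypothetical capture layer,
# holding a ring of files out.0 .. out.9 (out.0 = most recent).
SPOOL=${SPOOL:-$HOME/.outspool}

# "lastout" prints the most recent command's output;
# "lastout 2" prints the output from two commands back.
lastout() { cat "$SPOOL/out.${1:-0}"; }
```

Then "lastout | grep error" feeds the previous output into another
pipeline, which is the reuse the original poster was after.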

Doing a poor job of this requires only a few lines of expect(1) script,
but you'd want your script to recognize curses(3) programs and avoid
writing editing sessions (e.g., vi) to a file.  Not to mention output
from background tasks, etc.
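For concreteness, the "few lines of expect(1)" doing the poor job might
be no more than this (my sketch: one combined log, no stdout/stderr
split, no ring of files, and no attempt to skip curses sessions):

```shell
# Spawn a shell under expect, logging the whole session to one file,
# then hand the terminal over to the user with "interact".
expect -c '
    log_file -noappend /tmp/session.out
    spawn $env(SHELL)
    interact
'
```

Everything after that -- detecting full-screen programs, separating
streams, rotating files -- is where the real work is.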

Regards,

	Bill Rugolsky


More information about the fedora-list mailing list