
Working in node, I have a bash script that acts as glue for my build and deployment process. It calls npm, gulp, the AWS CLI, and docker, as well as a number of standard command-line tools (sed, grep, export, etc.). Many of those tools call other processes as well (babel and typescript, for example).

When I run this script manually, I see a ton of output in the console. When I automate the same script and try to output the results to a log file, a lot of what I saw in the console never appears.

The command I'm currently running looks like this:

NODE_ENV=sandbox ./bapc.sh >> /var/log/bapc.log 2>&1

How can I call the script so that everything winds up in the log file?
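For ordinary child processes, the `>> file 2>&1` shape above does capture both streams, including output from grandchildren, because they inherit the redirected file descriptors. A minimal sketch (using a hypothetical `/tmp/fake-bapc.sh` as a stand-in for the real script) demonstrates the expected behavior; tools that go missing from the log are usually doing something else, such as detecting that stdout is not a terminal and suppressing output.

```shell
# Hypothetical stand-in for bapc.sh: writes one line to stdout
# and one line to stderr.
cat > /tmp/fake-bapc.sh <<'EOF'
#!/bin/sh
echo "stdout line"
echo "stderr line" >&2
EOF
chmod +x /tmp/fake-bapc.sh

# Same redirection shape as the question: append stdout to the log,
# then point stderr at the same place. Both lines should land in the log.
NODE_ENV=sandbox /tmp/fake-bapc.sh >> /tmp/bapc.log 2>&1
```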

  • Have you seen and tried the various methods in http://askubuntu.com/questions/420981/how-do-i-save-terminal-output-to-a-file/731237 ? However, the 2>&1 should take care of redirecting STDERR to STDOUT already. Are you talking about static output which you still see in the console after the command exited when you scroll up, or are you maybe talking about dynamic output, like interactive progress bars or status lines that change throughout the execution time and are no longer visible in the console after the command exited? – Byte Commander Apr 10 '17 at 20:16
  • For the most part, the dynamic output is handled and not concerning. The situation I'm facing is that my script will call gulp, which will call babel, which will call a typescript transpiler. I see output from all four programs in the console, but only from my script and maybe gulp in the log. – Yes - that Jake. Apr 10 '17 at 20:20
  • What if you run your command in a subshell by wrapping it in braces like ( NODE_ENV=sandbox ./bapc.sh ) >> /var/log/bapc.log 2>&1 and write the output redirection outside? – Byte Commander Apr 10 '17 at 20:28
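The subshell variant suggested in the last comment can be sketched like this (again with a hypothetical stand-in for bapc.sh). The parentheses group the command so the redirections written outside them apply to the subshell as a whole, and every child inherits the redirected descriptors:

```shell
# Hypothetical stand-in for bapc.sh.
cat > /tmp/fake-bapc.sh <<'EOF'
#!/bin/sh
echo "stdout from child"
echo "stderr from child" >&2
EOF
chmod +x /tmp/fake-bapc.sh

# Run in a subshell; the redirections outside the parentheses apply to
# everything the subshell and its children write to stdout/stderr.
( NODE_ENV=sandbox /tmp/fake-bapc.sh ) >> /tmp/bapc-subshell.log 2>&1
```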

0 Answers