# Bash
Prefer to write quick scripts in Bash? We've got you covered.
You can run any Bash script in a Pipedream step within your workflows.
WARNING
Bash steps are available in a limited alpha release.
You can still run arbitrary Bash scripts, including sharing data between steps as well as accessing environment variables.
However, you can't connect accounts, return HTTP responses, or take advantage of other features available in the Node.js environment at this time. If you have any questions, find bugs, or have feedback, please contact support.
## Adding a Bash code step
- Click the + icon to add a new step
- Click Custom Code
- In the new step, select the `bash` runtime in the language dropdown
## Logging and debugging
When it comes to debugging Bash scripts, `echo` is your friend.
Your `echo` statements will print their output in the workflow step results:

```bash
MESSAGE='Hello world'

# The message will now be available in the "Result > Logs" area in the workflow step
echo $MESSAGE
```
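If you want to separate routine logs from error output, you can also write messages to stderr; as noted in the exceptions section below, stderr output shows up in the step's results too. A minimal sketch (the variable and messages are just for illustration):

```bash
STATUS='ok'

# Routine logs go to stdout
echo "status: $STATUS"

# Warnings and errors can be written to stderr instead
echo "something looks wrong" 1>&2
```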
## Available binaries
Bash steps come with many common and useful binaries preinstalled and available in `$PATH` for you to use out of the box. These binaries include, but aren't limited to:
- `curl` for making HTTP requests
- `jq` for manipulating and viewing JSON data
- `git` for interacting with remote repositories

Unfortunately, it is not possible to install packages from a package manager like `apt` or `yum`.
If you need a package pre-installed in your Bash steps, just ask us.
Otherwise, you can use the `/tmp` directory to download and install software from source.
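For example, if the tool you need ships as a single statically linked binary, one approach is to download it into `/tmp` and run it from there. This is only a sketch; the URL and binary name below are placeholders, not a real package:

```bash
# Download a (hypothetical) statically linked binary into /tmp
curl --silent --location https://example.com/releases/mytool-linux-amd64 --output /tmp/mytool

# Make it executable, then run it directly from /tmp
chmod +x /tmp/mytool
/tmp/mytool --version
```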
## Making an HTTP request
`curl` is already preinstalled in Bash steps. We recommend using it to make HTTP requests in your code, whether you're sending data to or requesting data from APIs or webpages.
### Making a GET request
You can use `curl` to perform `GET` requests from websites or APIs directly.

```bash
# Get the current weather in San Francisco
WEATHER=`curl --silent https://wttr.in/San\ Francisco\?format=3`
echo $WEATHER

# Produces:
# San Francisco: 🌫 +48°F
```
TIP
Use the `--silent` flag with `curl` to suppress the extra diagnostic information `curl` produces when making requests.
That way you only have to worry about the body of the response, which you can inspect with tools like `echo` or `jq`.
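For JSON APIs, it's often handy to pipe the response straight into `jq` to pretty-print it or pull out a single field. A small sketch using the PokéAPI endpoint referenced later in this doc (`.name` is just one key present in that response):

```bash
# Fetch JSON from an API and pretty-print it with jq
curl --silent https://pokeapi.co/api/v2/pokemon/charizard | jq .

# Or extract a single field (here, the "name" key) from the response
curl --silent https://pokeapi.co/api/v2/pokemon/charizard | jq .name
```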
### Making a POST request
`curl` can also make `POST` requests. The `-X` flag allows you to specify the HTTP method you'd like to use, and the `-d` flag passes data in the `POST` request.

```bash
curl --silent -X POST https://postman-echo.com/post -d 'name=Bulbasaur&id=1'

# To store the API response in a variable, interpolate the response into a string and store it in a variable
RESPONSE=`curl --silent -X POST https://postman-echo.com/post -d 'name=Bulbasaur&id=1'`

# Now the response is stored as a variable
echo $RESPONSE
```
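If the API you're calling expects JSON rather than form-encoded data, you can set the `Content-Type` header with `-H` and pass a JSON string to `-d`. A sketch against the same postman-echo test endpoint:

```bash
# POST a JSON body by setting the Content-Type header explicitly
curl --silent -X POST https://postman-echo.com/post \
  -H "Content-Type: application/json" \
  -d '{"name": "Bulbasaur", "id": 1}'
```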
### Using API key authentication
Some APIs require you to authenticate with a secret API key.
`curl` has an `-H` flag you can use to pass your API key as a bearer token in a request header.
For example, here's how to retrieve mentions from the Twitter API:

```bash
# Define the "Authorization" header to include your Twitter API key
curl --silent -H "Authorization: Bearer <your api key here>" https://api.twitter.com/2/users/@pipedream/mentions
```
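When you're debugging authentication, it can also help to check the HTTP status code of the response. `curl`'s `--write-out` option can print it after the request completes; the endpoint below is just the same example API:

```bash
# Print only the HTTP status code of an authenticated request
STATUS_CODE=$(curl --silent --output /dev/null --write-out "%{http_code}" -H "Authorization: Bearer <your api key here>" https://api.twitter.com/2/users/@pipedream/mentions)
echo $STATUS_CODE
```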
## Sharing data between steps
A step can accept data from other steps in the same workflow, or pass data downstream to others.
### Using data from another step
In Bash steps, data from the initial workflow trigger and other steps are available in the file whose path is stored in the `$PIPEDREAM_STEPS` environment variable.
In this example, we'll pretend this data is coming into our HTTP trigger via a POST request:

```json
{
  "id": 1,
  "name": "Bulbasaur",
  "type": "plant"
}
```

In our Bash script, we can access this data by reading the `$PIPEDREAM_STEPS` file. Specifically, the data from the POST request into our workflow is available in the `trigger` object:

```bash
cat $PIPEDREAM_STEPS | jq .trigger.event

# Results in { id: 1, name: "Bulbasaur", type: "plant" }
```
TIP
The period (`.`) in front of `trigger.event` in the example is not a typo. It defines the starting point from which `jq` traverses down the JSON.
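Data exported by earlier steps lives in the same file, alongside the `trigger` object. A hypothetical sketch, assuming an earlier step named `get_pokemon` exported a key called `data` (your actual step and key names will differ):

```bash
# Pretty-print the whole shared-steps file to see what's available
cat $PIPEDREAM_STEPS | jq .

# Pull out a key exported by a (hypothetical) earlier step named "get_pokemon"
cat $PIPEDREAM_STEPS | jq .get_pokemon.data
```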
### Sending data downstream to other steps
To share data for future steps to use downstream, append it to the `$PIPEDREAM_EXPORTS` file:

```bash
# Retrieve the data from an API and store it in a variable
DATA=`curl --silent https://pokeapi.co/api/v2/pokemon/charizard`

# Write data to $PIPEDREAM_EXPORTS to share with steps downstream
EXPORT="key:json=${DATA}"
echo $EXPORT >> $PIPEDREAM_EXPORTS
```
WARNING
Not all data types can be stored in the `$PIPEDREAM_EXPORTS` data shared between workflow steps.
For the best experience, we recommend only exporting strings that can be serialized to JSON from Bash steps.
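In a later Bash step, that exported value should then be readable from `$PIPEDREAM_STEPS`, as described above. A sketch, assuming the exporting step was named `get_charizard` and exported the key `key` (adjust both names to match your workflow):

```bash
# In a downstream Bash step, read the value exported above
# (assumes the exporting step was named "get_charizard" and exported the key "key")
cat $PIPEDREAM_STEPS | jq .get_charizard.key
```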
## Using environment variables
You can leverage any environment variables defined in your Pipedream account in a Bash step. This is useful for keeping your secrets out of code, and it lets you swap API keys without having to update each step individually.
To access one, just prefix the environment variable name with a `$`:

```bash
echo $POKEDEX_API_KEY
```

Or, for an even more useful example, use a stored environment variable to make an authenticated API request:

```bash
curl --silent -H "Authorization: Bearer $TWITTER_API_KEY" https://api.twitter.com/2/users/@pipedream/mentions
```
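If a step depends on an environment variable being set, standard Bash parameter expansion can fail fast with a clear message when it's missing. A small sketch (`POKEDEX_API_KEY` is just the example name used above):

```bash
# Exit with an error if POKEDEX_API_KEY is unset or empty
: "${POKEDEX_API_KEY:?POKEDEX_API_KEY is not set; add it to your Pipedream environment variables}"

# Safe to use the variable from here on
echo "API key is set (length: ${#POKEDEX_API_KEY})"
```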
## Raising exceptions
You may need to stop your step immediately. You can use the `exit` builtin available in Bash to quit the step prematurely:

```bash
echo "Exiting now!" 1>&2
exit 1
```
WARNING
Using `exit` to quit a Bash step early won't stop the execution of the rest of the workflow.
Exiting a Bash step only applies to that particular step in the workflow.
The example above exits the step and writes the error message to `stderr`, which will appear in the results of the step in the workflow.
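A common pattern is to exit early when a request fails. `curl`'s `--fail` flag makes it return a non-zero exit code for HTTP errors, which you can check before continuing. A sketch using the same weather endpoint from earlier:

```bash
# --fail makes curl exit non-zero on HTTP errors (4xx/5xx)
WEATHER=$(curl --silent --fail https://wttr.in/San\ Francisco\?format=3)

# $? holds the exit code of the last command
if [ $? -ne 0 ]; then
  echo "Weather request failed" 1>&2
  exit 1
fi

echo $WEATHER
```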
## File storage
If you need to download and store files, you can place them in the `/tmp` directory.

### Writing a file to /tmp
For example, to download a file to `/tmp` using `curl`:

```bash
# Download the current weather in Cleveland in PNG format
curl --silent https://wttr.in/Cleveland.png --output /tmp/weather.png

# Output the contents of /tmp to confirm the file is there
ls /tmp
```
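A file written to `/tmp` can be read back later in the same step before you use it. A small sketch continuing the example above:

```bash
# Confirm the downloaded file exists and print its size in bytes
if [ -f /tmp/weather.png ]; then
  wc -c < /tmp/weather.png
else
  echo "Download failed: /tmp/weather.png not found" 1>&2
fi
```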
WARNING
The `/tmp` directory does not have unlimited storage. Please refer to the disk limits for details.