Loading Data Into Bash Variables

Go away or I will replace you with a very small shell script. For those unfamiliar with Unix and Linux environments, bash is the command line shell that is standard on many distributions. These examples grew out of challenges attempting to automate EC2 processes. These basic principles of course can be applied more generally as needed. My goal is to simply provide the options I need most often in a single place. As I continue to automate routine tasks, I might just be able to replace myself, or others, with a series of scripts!

The Gaps

The Solutions

Loading data from a URL

With Amazon EC2, many startup values are available via an HTTP request to the internal address instance-data.ec2.internal. If you want to know more about these values, the Developer Guide is a good resource.

MY_INSTANCE_ID=$(wget -q -O - http://instance-data.ec2.internal/latest/meta-data/instance-id)

This grabs the instance id and stores it in a variable by capturing the output of wget, run with the quiet (-q) flag and the output file set to standard output (-O -); that second dash is what sends the data to standard output, so don’t forget it! Now anywhere in our script that we want the instance id, for string comparison, logging or whatever, we have it!
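
For instance, here is a minimal sketch of the kind of string comparison mentioned above; the instance id in the test is made up:

if [ "$MY_INSTANCE_ID" = "i-1a2b3c4d" ]; then
	echo "This is the primary instance, enabling extra logging"
fi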

Loading data from a file

What if the data we want to load is in a file on the disk? This method is not good for processing giant Apache access logs, but with smaller text files it will work just fine.

FILE_DATA=( $( /bin/cat file_data.txt ) )
for I in $(/usr/bin/seq 0 $((${#FILE_DATA[@]} - 1)))
do
	echo $I ${FILE_DATA[$I]}
done

What’s going on? The file contents are loaded into a bash array called FILE_DATA. The script then loops over each element of the array with a for loop and, within the loop, prints the current index followed by the element we loaded. This is roughly equivalent to running cat -n file_data.txt from the shell directly, but obviously gives us the flexibility to do further processing with the strings contained in the variable. One caveat: with the default IFS, the file is split on whitespace, so each array element is a word rather than a whole line.
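
If you really do want one array element per line rather than per word, a quick sketch using mapfile (available in bash 4 and later) looks like this:

mapfile -t FILE_LINES < file_data.txt
for I in "${!FILE_LINES[@]}"
do
	echo $I "${FILE_LINES[$I]}"
done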

Loading data from the user

Obviously this is not ideal for a process that runs from a cron job. However, if a script is being run interactively by a user, they may need to tweak something about the way it runs that can’t be detected automatically. In this case, you’ll want the user to key the data directly into your script.

read -p "Enter Something: " VARIABLE

This example uses read with the optional prompt (-p) flag, which causes the text in the quotes to be displayed on the user’s standard output, i.e. their terminal window, before waiting for input. Whatever the user types is stored in VARIABLE.
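
One handy variation is to fall back to a default when the user just presses Enter; a small sketch (the region value is only an example):

read -p "Enter an EC2 region [us-east-1]: " MY_REGION
MY_REGION=${MY_REGION:-us-east-1}
echo "Using region: $MY_REGION"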

Loading data from the command line

A step further is to let the user pass in data on the command line at run time. This, of course, can also be automated if needed. The following example leverages getopts to parse the parameters the script was called with.

OPT_A=0 ; OPT_B="default"   # example default values; these make both flags optional
while getopts ":ab:" OPTION
do
	case $OPTION in
	a ) OPT_A=1 ;;
	b ) OPT_B=$OPTARG ;;
	esac
done
shift $(($OPTIND - 1))
echo $OPT_A $OPT_B

The example script takes two different parameters: a flag “-a”, and a flag “-b” which expects data. In the example, default values are provided for each option, which has the effect of making all flags optional. Using the flag -a would likely toggle a specific behavior within your script, perhaps loading a specific configuration file instead of the default one. If you want to collect data for a flag, you simply add a colon “:” after that flag (‘a’ in this example) in the option string passed to getopts. You would then update the case statement to reflect your expectation of data being present in $OPTARG. See the modified script below for clarification.

OPT_A="default_a" ; OPT_B="default_b"   # example default values; again both flags stay optional
while getopts ":a:b:" OPTION
do
	case $OPTION in
	a ) OPT_A=$OPTARG ;;
	b ) OPT_B=$OPTARG ;;
	esac
done
shift $(($OPTIND - 1))
echo $OPT_A $OPT_B
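
For example, if the modified script were saved as test_getopts.sh (a name used here just for illustration), a couple of runs might look like this:

./test_getopts.sh -a foo -b bar     # echoes: foo bar
./test_getopts.sh -b bar            # echoes: default_a bar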

But wait… there’s more!

There’s also a simple way to pass data in: bash automatically stores the command line arguments in the positional parameters $1, $2, $3, $4, and so on.

echo $2 $1

The script above, when run as “./test_script hello world”, will output “world hello”. This method can be handy for scripting quick tasks that you often run with the same series of parameters, for example adding the flags “-la” to “ls” as demonstrated below.

ls -la $1
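
If the wrapper should forward every argument rather than just the first, quoting "$@" passes them all through intact; a minimal sketch (the file name is hypothetical):

#!/bin/bash
# lsla.sh -- passes every argument straight through to ls -la
ls -la "$@"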

Script Configuration

So now that we can get different bits of data from all these different sources, what if all my scripts leverage the same data? Can’t I just have it as a single configuration file that I edit once? YES! This next example does just that. While it doesn’t technically load data into a variable, it does allow you to encapsulate your code, including a file full of variable assignments, into logical chunks. In my case, I was looking to avoid editing multiple scripts when configuration changes were needed.

First I created my configuration file, my_script.cfg, in the same directory where I run my example script below.

# Comments are allowed
OPT_1="value_1"      # example values; the script below expects OPT_1, OPT_2 and OPT_3
OPT_2="value_2"
OPT_3="value_3"

Now the script that uses the configuration file above.

OPT_1="default_1" ; OPT_2="default_2" ; OPT_3="default_3"   # example default values
if [ -f my_script.cfg ]; then
	. my_script.cfg
fi
echo $OPT_1 $OPT_2 $OPT_3

Dissecting the script, you’ll see that I first set some default values. Next the code checks for the existence of the configuration file and, if it is found, includes (sources) it. It’s important to note that sourcing the file actually allows you to run code within the configuration file, not just assign variables. An EC2 instance might, for example, place all of the calls to instance-data.ec2.internal for metadata into a configuration file that is simply included in scripts that use that information.
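
A sketch of what such a shared metadata configuration file could look like; the file and variable names here are my own invention, while instance-id, instance-type and local-ipv4 are standard metadata paths:

# ec2_metadata.cfg -- source this from any script that needs instance metadata
METADATA_URL="http://instance-data.ec2.internal/latest/meta-data"
MY_INSTANCE_ID=$(wget -q -O - ${METADATA_URL}/instance-id)
MY_INSTANCE_TYPE=$(wget -q -O - ${METADATA_URL}/instance-type)
MY_LOCAL_IP=$(wget -q -O - ${METADATA_URL}/local-ipv4)

Any script that starts with “. ec2_metadata.cfg” then has those values available without repeating the wget calls.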

That’s it! Hope you find this resource helpful!

And for anyone looking to put those around you on alert, buy the t-shirt from Think Geek.


7 Responses to Loading Data Into Bash Variables

  1. Raj says:

    How to read a variable value in my korn shell script from .xml file.?
    suppose .xml format is like this:

    I would like to read a variable value max_threads from .xml

    any one knows, help me on this.

  2. Raj says:

    How to get the CPU_IDLE time using the UNIX ‘sar’ command in korn shell?

    If any one knows, pls answer.

  3. Raj says:

    I have a requirement like:
    In .ksh (korn shell) need to store all sleep jobs into array and add them to oracle table(temp).?

    any one knows, help on this.

  4. Dov says:

    In your example loading data from a file, you have

    echo $I $FILE_DATA[$i]

    but I think it needs to be

    echo $I ${FILE_DATA[$I]}

  5. Thanks, I was looking for the latest solution, simply

    . /etc/myconf.cnf

  6. Ricki says:

    Re this: Loading data from a file

    A much simpler approach I think is:
    FILE_DATA=`cat file_data.txt`
    echo "$FILE_DATA"

    (the quotes around $FILE_DATA are critical to preserve newlines)

  7. Joe says:

    Please note that when you use “source” or “.” in a script, everything in the “sourced” file is executed as bash code at the same level of your shell (variable assignments will be available in the calling script). This is super cool. It’s the “everything” part you have to watch out for. If someone has access to the file being “sourced”, they can put arbitrary bash code in it and it will be executed by your script with the same access as your script has. This can be a very big security risk (and even worse if your script runs with elevated permissions such as those granted by sudo.)