Nailing automation with Bash: Core concepts and implementations

Manas Harsh


Hi homies, I hope you all are doing great and learning new things daily. Recently I posted a Bash one-liner on Twitter that solves some common automation problems, and I got a lot of questions about how and where to use Bash. Many people found it difficult to understand, and that's completely fine; there is nothing to worry about. That's why I decided to write this blog, in the hope that it helps some people out there understand the basic concepts of Bash automation.

So, what is bash scripting? Bash is a command language interpreter. It is widely available on various operating systems and is the default command interpreter on most GNU/Linux systems. The name is an acronym for 'Bourne-Again SHell'. The shell is a macro processor which allows for interactive or non-interactive command execution. Scripting is writing a program for the shell to execute, and a shell script is a file or program that the shell will execute. That is the basic idea of what we are doing here. If you want more in-depth information, I highly recommend this blog: Click here :)
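If you have never written one, here is a minimal sketch of what a shell script looks like (the file name hello.sh and the default value are just for illustration):

#!/bin/bash
# hello.sh: a minimal shell script; make it executable with: chmod +x hello.sh
target="${1:-example.com}"      # first command-line argument, defaulting to example.com
echo "Running recon against $target"

Running ./hello.sh target.com would print the same line with your own domain substituted in.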

Well, this blog is not meant to define how bash works, nor to cover the whole programming language. It is focused entirely on Bash automation for bounties and related work. Before we dig deeper, let's look at some commands we need to keep in mind:

  • Echo: echo outputs the strings passed to it as arguments.
  • Grep: Grep is used to search for a string of characters in a specified file. The text search pattern is called a regular expression. When it finds a match, it prints the line with the result.
  • Sed: Sed performs basic text transformations on an input stream (a file or input from a pipeline) in a single pass through the stream, so it is very efficient. However, it is sed’s ability to filter text in a pipeline that particularly distinguishes it from other types of editor.
  • AWK: The awk command searches files for text matching a pattern. When a line matches, awk performs a specific action on that line.
  • Xargs: xargs builds and executes commands from standard input. Some commands, like grep, happily read from a pipe, but others only accept their input as command-line arguments; that is where xargs comes into the picture.

So these are some of the commands we will be using in this article. There are many more, like sort, tee, uniq and cat, but we can do most of our work with these. I will also include some one-liners that chain together tools from our daily workflow.
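To see how these commands fit together before we get to real recon, here is a hedged example that assumes a file called urls.txt with one URL per line (the file name and patterns are purely illustrative):

# keep URLs, force https, pull out the hostnames, de-duplicate, then act on each one
grep "http" urls.txt | sed 's#^http://#https://#' | awk -F/ '{print $3}' | sort -u | xargs -I{} echo "host: {}"

The same pattern (filter with grep, rewrite with sed, slice with awk, fan out with xargs) covers a surprising amount of day-to-day recon glue.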

Well, a question arises: where can we use Bash automation? The best place is recon. You can save a lot of time by automating things and chaining tools together so that you don't have to run each one separately. A good example of a simple bash script is below:
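As a rough sketch (the tools and flags below are assumptions and may differ between versions), such a script could look like this:

#!/bin/bash
# simple-recon.sh: chain a few tools so one command does the whole pass
# usage: ./simple-recon.sh target.com
target="$1"
subfinder -d "$target" -silent -o subs.txt     # enumerate subdomains
httpx -l subs.txt -silent -o alive.txt         # keep only the hosts that respond
nuclei -l alive.txt -o findings.txt            # run nuclei templates against the live hosts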

This is just to show how multiple tools can be merged to get the desired output in one place. Nothing fancy is applied here, but it will definitely save you some time.

Another example: you can write a simple one-liner and create an alias for it. An alias is a (usually short) name that the shell translates into another (usually longer) name or command. Aliases allow you to define new commands by substituting a string for the first token of a simple command. For example, suppose you have this simple one-liner:

echo "target.com" | nuclei -t /tools/nuclei-templates -o output.txt

We can make an alias of this one-liner, and next time we only need to type the alias and the whole pipeline will run against the target. If you want to learn more about aliases, here is an article that will help: click here :) For a simple start, open your .bashrc file and add an alias like this:

alias domains='subfinder -d target.com -o output.txt'

After saving this in .bashrc (and reloading it with source ~/.bashrc), typing domains in the terminal gives the same output as running subfinder -d target.com -o output.txt. That is the basic concept of an alias. There is much more to explore, and I leave it to you to sharpen your skills. You can run tools written in Python, Go, or any other language through an alias as well; just spell out the directories and the exact syntax, and you are ready to go.
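One caveat: an alias like the one above always scans the hard-coded target.com. If you want to pass the domain at run time, a small function in .bashrc (sketched here with assumed file names) works better than an alias:

# in ~/.bashrc: a function can take arguments, an alias cannot
domains() {
  subfinder -d "$1" -o "$1-subs.txt"    # $1 is the domain you pass on the command line
}
# usage after reloading .bashrc: domains target.com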

Another question I got a lot is how to create one-liners with Bash and use them in daily automation. Well, it depends on what you want the one-liner to do. Suppose you want to grab all the URLs and filter for a particular string; this simple one-liner will help:

cat target.txt | grep "?url=" | sort -u | tee output.txt

Here, it will grab all the URLs containing the ?url= parameter. After that, we get a uniquely sorted output file to work with further. If you don't know about piping (|): it passes the output of one command to the next for further processing, so the last command in the chain produces the output we actually want. The tee command is normally used to split the output of a program so that it can be both displayed and saved to a file.

So, those were some basic uses of Bash. Let's look at some of the other commands we discussed earlier. Take this example:

sed 's/ab/abc/g' file.txt

This is a very basic example of the sed command. What we are doing here is replacing every ab with abc in file.txt. If you think about it, this is very handy when building one-liners for brute forcers. The s in sed stands for substitute and g stands for global (replace every match on the line, not just the first). Once you understand this, we can write a small one-liner as a building block for subdomain brute-forcing:

echo "target.com" | sed 's#^#http://#'

Here, ^ matches the beginning of the line, so the substitution inserts http:// in front of the hostname. The # characters are just an alternative delimiter so we don't have to escape the slashes in http://.
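Building on that, here is a hedged sketch of how sed can turn a plain wordlist into candidate subdomain URLs for brute-forcing (wordlist.txt and target.com are placeholders):

# append the apex domain to every word, then prepend a scheme
sed -e 's#$#.target.com#' -e 's#^#http://#' wordlist.txt | sort -u > candidates.txt

The resulting candidates.txt can then be fed to a probing tool such as httpx to see which hosts actually resolve and respond.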

You can use awk as well. Since I have already mentioned what awk does, let's look at a simple one-liner that uses it:

cat file.txt | awk '{print NR,length($0);}'

What awk does here is print the line number and the number of characters on each line. Now you can start to figure out what awk can do for you when automating your recon work. These are some of the commands you can use to level up your automation.
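For a more recon-flavoured example (urls.txt and the 80-character threshold are arbitrary assumptions), awk's pattern-action model lets you filter without an explicit print:

# keep only long URLs, which are more likely to carry interesting parameters
cat urls.txt | awk 'length($0) > 80'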

Now, here are some of the examples of one-liners that could be very handy:

cat domains.txt | gau | cut -d "/" -f 4,5 | sed 's/?.*//' | sort -u

This one-liner takes all the domains from domains.txt, passes them through gau and builds a big list of paths; later on you can pick out the useful ones. The cut command extracts the path segments (fields 4 and 5 when splitting on /) and sends them to sed, which strips any query string, and sort -u keeps only the unique entries.

This one-liner chains tools that are already available, using pipes to send the output of one tool to the next:

echo "domain.com" | waybackurls | httpx -silent -timeout 2 -threads 100 | gf redirect | anew

What this small one-liner does: it sends domain.com to waybackurls, which returns all the URLs the Wayback Machine has for it. Then httpx probes those URLs to see which ones actually respond. The live URLs are piped into gf with the redirect pattern to flag candidates for open-redirect testing. Finally, anew de-duplicates the results (and, when given a filename, appends only the lines that are new to that file).

Now comes the best part. You can create a recon.sh script containing all of the one-liners mentioned here, and once it runs you will save a lot of time. You can also create an alias for each of them, or wire things up any way you like. The main idea is: write some one-liners, put them in a script, and run them in one go. It's not rocket science. That said, to use Bash well I would highly recommend actually learning it. If you want to, this course from Udemy can help, and the best part is that it's absolutely free: Bash scripting. You can check out JavaTpoint's Bash tutorial as well.
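As a hedged sketch, assuming the tools from the earlier examples are installed and the file names are up to you, recon.sh could be as simple as:

#!/bin/bash
# recon.sh: run the one-liners from this post in one go
# usage: ./recon.sh target.com
target="$1"

subfinder -d "$target" -silent -o subs.txt                          # subdomain enumeration
echo "$target" | waybackurls | sort -u > urls.txt                   # historical URLs
grep "?url=" urls.txt | sort -u | tee redirect-params.txt           # possible redirect parameters
cat urls.txt | httpx -silent -timeout 2 -threads 100 > alive.txt    # probe which URLs respond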

I hope this blog teaches you some of the basic concepts of Bash. I leave it to you to dig deeper; everything can't be covered in one blog post and I don't want to make this a huge one either. This was just to give you the basic idea of how to use Bash in automation. Also, sed and awk are huge topics in themselves with a lot of uses; awk is really a whole language you can use on its own. You have all the time in the world to hit Google and explore. To be honest, learning Bash and applying it to hacking is super fun, and I bet you will start loving it once you learn some.

That will be it for this blog, and I hope you liked it. If you have any doubts, questions, or suggestions, feel free to shoot me a DM on Twitter :) I am always happy to assist/learn.

Take care, happy hacking!

Adios❤

Twitter: @manasH4rsh

LinkedIn: https://www.linkedin.com/in/manasharsh/

