LP UNIT II Question Bank


1. Define grep.

Write a grep command to display the lines which do not match the given pattern.
grep command in Unix/Linux


The grep filter searches a file for a particular pattern of characters and displays all lines that contain
that pattern. The pattern searched for in the file is referred to as the regular expression (grep
stands for globally search for a regular expression and print).
Syntax:
grep [options] pattern [files]
Options Description
-c : Print only a count of the lines that match the pattern.
-h : Display the matched lines, but do not display the filenames.
-i : Ignore case when matching.
-l : Display a list of matching filenames only.
-n : Display the matched lines along with their line numbers.
-v : Print all the lines that do not match the pattern.
-e exp : Specify an expression with this option; it can be used multiple times.
-f file : Take patterns from a file, one per line.
-E : Treat the pattern as an extended regular expression (ERE).
-w : Match whole words only.
-o : Print only the matched parts of a matching line,
with each such part on a separate output line.
Sample Commands
Consider the file below as input.
$cat > geekfile.txt
unix is great os. unix is opensource. unix is free os.
learn operating system.
Unix linux which one you choose.
uNix is easy to learn.unix is a multiuser os.Learn unix .unix is a powerful.
1. Case-insensitive search : The -i option enables searching for a string case-insensitively in the given
file. It matches words like “UNIX”, “Unix”, and “unix”.
$grep -i "UNix" geekfile.txt
Output:

unix is great os. unix is opensource. unix is free os.


Unix linux which one you choose.
uNix is easy to learn.unix is a multiuser os.Learn unix .unix is a powerful.
2. Displaying the count of matches : We can find the number of lines that match the
given string/pattern.
$grep -c "unix" geekfile.txt
Output:
2
3. Displaying the file names that match the pattern : We can display just the files that contain the
given string/pattern.
$grep -l "unix" *

or

$grep -l "unix" f1.txt f2.txt f3.xt f4.txt


Output:
geekfile.txt
4. Checking for whole words in a file : By default, grep matches the given string/pattern even if
it is found as a substring in a file. The -w option makes grep match only whole words.
$ grep -w "unix" geekfile.txt
Output:
unix is great os. unix is opensource. unix is free os.
uNix is easy to learn.unix is a multiuser os.Learn unix .unix is a powerful.
5. Displaying only the matched pattern : By default, grep displays the entire line that contains the
matched string. We can make grep display only the matched string by using the -o option.
$ grep -o "unix" geekfile.txt
Output:
unix
unix
unix
unix
unix
unix

6. Showing line numbers while displaying the output using grep -n : To show the line number of each
matching line in the file.
$ grep -n "unix" geekfile.txt
Output:

1:unix is great os. unix is opensource. unix is free os.


4:uNix is easy to learn.unix is a multiuser os.Learn unix .unix is a powerful.
7. Inverting the pattern match : You can display the lines that do not match the specified
search string/pattern using the -v option.
$ grep -v "unix" geekfile.txt
Output:
learn operating system.
Unix linux which one you choose.
8. Matching the lines that start with a string : The ^ regular expression pattern specifies the start of
a line. This can be used in grep to match the lines which start with the given string or pattern.
$ grep "^unix" geekfile.txt
Output:
unix is great os. unix is opensource. unix is free os.
9. Matching the lines that end with a string : The $ regular expression pattern specifies the end of
a line. This can be used in grep to match the lines which end with the given string or pattern.
$ grep "os$" geekfile.txt
10. Specifying an expression with the -e option (can be used multiple times) :
$grep -e "Agarwal" -e "Aggarwal" -e "Agrawal" geekfile.txt
11. The -f file option takes patterns from a file, one per line.
$cat pattern.txt

Agarwal
Aggarwal
Agrawal
$grep -f pattern.txt geekfile.txt
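
12. The -E option listed above enables extended regular expressions (ERE). As a sketch, the three
surnames used in examples 10 and 11 can be combined into a single alternation pattern:

$grep -E "Agarwal|Aggarwal|Agrawal" geekfile.txt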

2. Describe I/O redirection operations and built-in variables in the shell.

When we type something into our terminal program, we’ll often see output. For
example:
$ echo hello
hello

As we can see, echo hello is a command that means “output hello”. But where does
that output really go?

Standard output
Every Unix-based operating system has a concept of “a default place for output to
go”. Since that phrase is a mouthful, everyone calls it “standard output”, or “stdout”,
pronounced standard out. Your shell (probably bash or zsh) is constantly watching
that default output place. When your shell sees new output there, it prints it out on
the screen so that you, the human, can see it. Otherwise echo hello would send “hello”
to that default place and it would stay there forever.
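
Standard output can also be redirected into a file instead of the screen. A small sketch
(out.txt is just an illustrative filename): the > operator overwrites the target file and >> appends to it:

$ echo hello > out.txt     # stdout goes into out.txt instead of the screen
$ echo again >> out.txt    # >> appends rather than overwriting
$ cat out.txt
hello
again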

Standard input
Standard input (“stdin”, pronounced standard in) is the default place where
commands listen for information. For example, if you type cat with no arguments, it
listens for input on stdin, outputting what you type to stdout, until you send it an
EOF character (CTRL+d):
$ cat
hello there
hello there
say it again
say it again
[ctrl+d]

As you can see, with standard input, you can send a string to a command directly.
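
Just as > redirects standard output, the < operator redirects a file into a command's standard
input. A small sketch, reusing the illustrative out.txt file from above:

$ wc -l < out.txt     # wc reads the file contents on stdin
2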


Pipes
Pipes connect the standard output of one command to the standard input
of another. You do this by separating the two commands with the pipe
symbol (|). Here’s an example:
$ echo "hello there"
hello there
$ echo "hello there" | sed "s/hello/hi/"
hi there

echo "hello there" prints hello there to stdout. But when we pipe it to sed "s/hello/hi/",
sed takes that output as its input and replaces “hello” with “hi”, then prints
that result to stdout. Your shell only sees the final result after it’s been
processed by sed, and prints that result to the screen.

Hey, if sed sends its result to standard out, can we pipe sed to another sed?
Yep!
$ echo "hello there" | sed "s/hello/hi/" | sed "s/there/robots/"
hi robots

Above, we’ve connected echo to sed, then connected that to another sed.
Pipes are great for taking output of one command and transforming it using
other commands like jq. They’re a key part of the Unix philosophy of “small
sharp tools”: since commands can be chained together with pipes, each
command only needs to do one thing and then hand it off to another
command.
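
In addition to redirection and pipes, the shell maintains a number of built-in variables that
commands and scripts can read. A minimal sketch (the script name vars.sh and its arguments are
only illustrative):

#!/bin/bash
# vars.sh - print a few built-in shell variables
echo "Script name : $0"      # name the script was invoked with
echo "Arg count   : $#"      # number of positional arguments
echo "All args    : $@"      # all positional arguments
echo "Shell PID   : $$"      # process ID of this shell
false                        # a command that always fails
echo "Last status : $?"      # exit status of the last command (1 here)

Running ./vars.sh one two would report the script name, an argument count of 2, the arguments
one two, the shell's process ID, and an exit status of 1 from the preceding false command.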
3. Write about here documents.
Here documents

To create a here document use the following syntax:

command <<HERE
text1
text2
testN
$varName
HERE

This type of redirection tells the shell to read input from the current source until a line containing only the
delimiter word (HERE) is seen. The HERE word itself is not subject to variable name or parameter expansion, arithmetic
expansion, pathname expansion, or command substitution. All of the lines read up to that point are then used as the
standard input for the command. Input processed in this manner is commonly called a here document. If you do
not want variable name or parameter expansion, arithmetic expansion, pathname expansion, or command
substitution performed on the body, quote HERE in single quotes:

command <<'HERE'
text1
text2
testN
$varName
HERE
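
The difference is easy to see with a small sketch (the variable varName and its value are
illustrative): the body of an unquoted here document is expanded, while a quoted delimiter passes
the text through literally.

varName="apple"

cat <<HERE
Value: $varName
HERE
# prints: Value: apple

cat <<'HERE'
Value: $varName
HERE
# prints: Value: $varName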

Example
Use the here document feature to pass constant text to a command. For example, the following command counts the
words of its input:

echo 'This is a test.' | wc -w

Sample outputs:

4

But how do you count several lines at a time? Use a here document as follows:

wc -w <<EOF
> This is a test.
> Apple juice.
> 100% fruit juice and no added sugar, colour or preservative.
> EOF
Sample outputs:

16

The << operator reads the shell input typed after the wc command at the PS2 prompt (>) up to a line that is identical to
the word EOF.

HERE document and mail command


For example, write an email using the mail command. Create a shell script called tapebackup1.sh:

#!/bin/bash
# run tar command and dump data to tape
tar -cvf /dev/st0 /www /home 2>/dev/null

# Okay find out if tar was a success or a failure


[ $? -eq 0 ] && status="Success!" || status="Failed!!!"

# write an email to admin


mail -s 'Backup status' vivek@nixcraft.co.in<<END_OF_EMAIL

The backup job finished.

End date: $(date)


Hostname : $(hostname)
Status : $status

END_OF_EMAIL

Save and close the file. Run it as follows:

chmod +x tapebackup1.sh
./tapebackup1.sh

Sample outputs:

Subject: Backup status
From: root <root@www-03.nixcraft.net.in>
Date: 12:57 Am
To: vivek@nixcraft.co.in

The backup job finished.

End date: Thu Sep 17 14:27:35 CDT 2009


Hostname : txvip1.simplyguide.org
Status : Success

4. Write briefly on the case control structure in bash, with examples.
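
The case statement matches a word against a series of glob patterns and runs the commands
associated with the first pattern that matches; each branch ends with ;; and the construct is
closed with esac. A minimal sketch (the prompt text and patterns are illustrative):

#!/bin/bash
read -p "Enter y or n: " answer
case $answer in
    [Yy]*) echo "You said yes" ;;
    [Nn]*) echo "You said no" ;;
    *)     echo "Unrecognised answer" ;;
esac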


In addition to simple lists of commands, shell scripts can contain all the familiar control
structures, such as conditionals and loops:

Name : Syntax : Example

if : if CONDITION; then COMMANDS...; fi : if [ $test ]; then echo "test is true"; fi

for : for VARIABLE in LIST...; do COMMANDS...; done : for i in 1 2 3; do echo $i iteration; done

[ EXPRESSION ] checks whether EXPRESSION is true or false. See man test for a full
explanation of valid expressions.

[ -e FILENAME ] : True iff the given file exists.
[ $hippos -eq 5 ] : True iff the variable hippos contains the number 5.
[ $hippos = "five" ] : True iff the variable hippos contains the string "five".
[ ! -x $filename ] : True iff the file whose name is given in the variable filename is not executable.
[ -e FILE1 -o -d FILE2 ] : True when either FILE1 exists or FILE2 is a directory.
[ -n $compiler -a $objfile -ot $cppfile ] : True when $compiler contains a string and the file named by $objfile is older than the one named by $cppfile.

See man test for further help with conditionals.


As may be expected, the shell language contains many special features for working with
filenames. It is thus very easy to create control structures based on the simple examples
above that perform complex operations on files.
For example, it is trivial to write a for loop that will rename files to fit a certain pattern.
Since the shell itself is evaluating the loop, the LIST can be a filename wildcard pattern just like
those given to any other command.
This script adds the prefix "unix-" to all files in the current directory whose names end with
".txt", and moves them to the "text/" subdirectory:
#!/bin/bash

for file in *.txt; do
    mv $file text/unix-$file
done

But suppose we want to make sure no files get overwritten. This can be done with an if
statement that tests for the existence of a file. The next script does the same as the above,
but also preserves any files that would have been lost by moving them to the "backup/"
subdirectory:
#!/bin/bash

for file in *.txt; do
    if [ -e text/unix-$file ]; then
        mv text/unix-$file backup/unix-$file
    fi

    mv $file text/unix-$file
done

It's often a good idea to test scripts before running them on important data. Two useful
commands for this are echo and touch. Placing echo before a command displays what
would have been run instead of actually running it, which is helpful for checking things
like filename modification. touch FILENAME... will create empty dummy files suitable for
manipulation by scripts.
The #!/bin/bash line is called a shebang. It's a special indicator that tells the operating
system which shell should be used to interpret the script, in this case bash. The shebang is
always the first line of a script and is composed of the characters #! followed by the full
pathname of the interpreter's executable.
Writing Procedures
Bash procedures, or "shell functions", provide a way to group a set of commands for a
certain task together so that they can be conveniently invoked by one command. They can be
thought of like aliases that expand to a script instead of a single command.
Like aliases, procedures must be declared before use. The declaration is of the form:

procedure_name () {
    BODY...
}

where BODY is the commands to be executed, one per line.
The "()" are used only to indicate that a procedure is being declared. They are not present
when calling it. To execute a procedure, simply use procedure_name args....
NOTE: like aliases or variables, procedures are not available outside the scope in which
they were declared. If a procedure is declared within a subshell (for example, a separate
script beginning with a shebang), invoking it from the calling shell will give an error!
For example, given the script exampleProcedure.sh:

#!/bin/bash

exampleProcedure () {
    echo This is exampleProcedure
}

the following test.sh uses exampleProcedure to output This is exampleProcedure:

#!/bin/bash

# Get it defined in the current shell scope, using "."
# or "source", NOT "./"
. exampleProcedure.sh

# Call it by name
exampleProcedure


Quoting
Quoting text causes Bash to interpret it differently. For example, a filename containing
spaces must be enclosed in single or double quotes, or Bash will pass it as multiple separate
command-line tokens.
There are three different forms of quoting, each for a different purpose.
Single quotes (')
Every character inside these quotes is interpreted completely literally. No variable
substitutions are performed.
Double quotes (")
Many special characters, such as spaces, lose their special meaning. However, some are still
interpreted; notably, variable substitution with $ is performed.
Backquotes (`)
Unlike the other two, this form does not prevent interpretation of the quoted text. Instead,
the text inside is interpreted as a command line, and its output is substituted into the
original command line.
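
A small sketch of the three forms (the variable, the command, and the day shown in the output
are illustrative):

$ name="World"
$ echo 'Hello $name'          # single quotes: everything is literal
Hello $name
$ echo "Hello $name"          # double quotes: $name is substituted
Hello World
$ echo "Today is `date +%A`"  # backquotes: the output of date replaces the quoted text
Today is Monday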

5. Explain, with an example, the process of creation and execution of a shell script.

A new process is created because an existing process makes an exact copy of itself. This
child process has the same environment as its parent, only the process ID number is
different. This procedure is called forking.

After the forking process, the address space of the child process is overwritten with the new
process data. This is done through an exec call to the system.

The fork-and-exec mechanism thus switches an old command with a new, while the
environment in which the new program is executed remains the same, including
configuration of input and output devices, environment variables and priority. This
mechanism is used to create all UNIX processes, so it also applies to the Linux operating
system. Even the first process, init, with process ID 1, is forked during the boot procedure
in the so-called bootstrapping procedure.

This scheme illustrates the fork-and-exec mechanism. The process ID changes after the fork
procedure:

Figure 4-1. Fork-and-exec mechanism
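
As a concrete illustration (the filename and the PID values shown are illustrative), creating and
executing a small script exercises this mechanism: the interactive shell forks a copy of itself,
and the child execs the interpreter named on the script's shebang line:

$ cat > hello.sh <<'EOF'
#!/bin/bash
echo "Hello from PID $$"
EOF
$ chmod +x hello.sh      # make the script executable
$ ./hello.sh             # the shell forks; the child execs /bin/bash to run the script
Hello from PID 4211
$ echo $$                # the parent shell keeps its own, different PID
4187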


There are a couple of cases in which init becomes the parent of a process, while the process
was not started by init, as we already saw in the pstree example. Many programs, for
instance, daemonize their child processes, so they can keep on running when the parent
stops or is being stopped. A window manager is a typical example; it starts
an xterm process that generates a shell that accepts commands. The window manager then
denies any further responsibility and passes the child process to init. Using this mechanism,
it is possible to change window managers without interrupting running applications.

Every now and then things go wrong, even in good families. In an exceptional case, a
process might finish while the parent does not wait for the completion of this process. Such
an unburied process is called a zombie process.
