How to run .sh file [duplicate]


This question already has an answer here:

How to run scripts without typing the full path?

You can create an alias of the full command by running the following in the terminal:

alias viber=/home/nazar/Software/Viber/

Now you can run the script by just typing viber.

Note that this alias will only work in the current shell session. To make it permanent, save it in ~/.bash_aliases (or ~/.bashrc):

$ echo 'alias viber=/home/nazar/Software/Viber/' >> ~/.bash_aliases
$ source ~/.bash_aliases

The first command saves the alias permanently in ~/.bash_aliases, the preferred file for aliases; it creates the file if it does not already exist. The second command makes the alias take effect in the current shell session.

An alternative method is to create a symbolic link to the executable script in the /usr/local/bin or /usr/bin directory (given that they are in your PATH environment...
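A minimal sketch of the symlink route (the executable name `Viber` inside /home/nazar/Software/Viber/ is an assumption, as is using ~/bin rather than /usr/local/bin to avoid needing sudo):

```shell
# Link the (assumed) Viber executable into a directory on your PATH.
# ~/bin avoids needing root; /usr/local/bin would require sudo.
mkdir -p ~/bin
ln -sf /home/nazar/Software/Viber/Viber ~/bin/viber
# Provided ~/bin is in PATH, you can now run it simply as: viber
```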


How to run no more than one unique instance of a script or command. Prevent duplicate cron jobs running.


When you set up a cron job, sometimes you need to make sure that only one instance is running at a time. This is also useful when you want to be sure that a script that can take longer than expected does not get executed again before the previous run has finished.

This can be done with another caller shell script that detects a running instance before executing it again (e.g.: using pidof), or you can use programs written specifically to handle this situation (e.g.: flock or run-once).
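As a sketch of the flock approach (the lock-file path below is an assumption, not from the original): flock takes an exclusive lock on a file and runs the command only if the lock is free, so a second invocation bails out immediately instead of running concurrently.

```shell
#!/bin/sh
# Run the work under an exclusive, non-blocking lock using flock(1)
# from util-linux. /tmp/hello.lock is an assumed lock-file name.
flock -n /tmp/hello.lock -c 'echo "Hello"' \
    || echo "Another instance is already running" >&2
```

With -n, flock fails at once when the lock is held; drop -n to make the second caller wait instead.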

In this tutorial we will use different approaches to keep just one running instance of an example script that simply prints "Hello" in an infinite while loop.

#!/bin/sh
while true
do
    echo "Hello"
    sleep 2
done

And make it executable:

$ chmod +x
$ ./
Hello
Hello
...

Script approach

We make a...

Hi. I know that if I want to use .sh files, I need to make them executable and then execute them.

But my question is as follows: I don't know why, but when I click on any .sh file, a program named "Wine Windows Program Loader" launches and I see Wine's icon bouncing near my cursor for a few seconds. Then it disappears. I know Wine shouldn't be used to open .sh files, but I don't know why this happens. I can, for example, use KWrite to open such a file and even make it the default program, but what about when I want to execute it? I mean, what command should I type in the "Open with..." window?

I prefer to use Yakuake, so it would be great if you could tell me the command for Yakuake, not...


Hi all, today we're going to learn how to find and remove duplicate files on your Linux PC or server. Here are several tools; you can use any of them according to your needs and comfort.

Whether you’re using Linux on your desktop or a server, there are good tools that will scan your system for duplicate files and help you remove them to free up space. Solid graphical and command-line interfaces are both available. Duplicate files are an unnecessary waste of disk space. After all, if you really need the same file in two different locations you could always set up a symbolic link or hard link, storing the data in only one location on disk.
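For instance, a duplicate can be replaced with a hard link so the data occupies disk space only once (the file names below are hypothetical):

```shell
# Two copies of the same data, then deduplicate with a hard link.
dir=$(mktemp -d)
echo "same contents" > "$dir/a.txt"
cp "$dir/a.txt" "$dir/b.txt"     # b.txt duplicates a.txt on disk
ln -f "$dir/a.txt" "$dir/b.txt"  # now both names point at one inode
ls -li "$dir"                    # identical inode numbers confirm it
rm -rf "$dir"
```

Use `ln -sf` instead for a symbolic link, which survives across filesystems but breaks if the original is deleted.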

1) FSlint

FSlint is available in the binary repositories of various Linux distributions, including Ubuntu, Debian, Fedora, and Red Hat. Just fire up your package manager and install the "fslint" package. This utility provides a convenient graphical interface by default, and it also includes command-line versions of its various...


How to remove / delete duplicate records / lines from a file?

Let us consider a file with the following content. The duplicate record is 'Linux', with 2 entries:

$ cat file
Unix
Linux
Solaris
AIX
Linux

1. Using the sort command with uniq:

$ sort file | uniq
AIX
Linux
Solaris
Unix

The uniq command retains only unique records from a file. In other words, uniq removes duplicates. However, the uniq command needs a sorted file as input.

2. Using only the sort command, without uniq:

$ sort -u file
AIX
Linux
Solaris
Unix

sort with the -u option removes all the duplicate records, and hence uniq is not needed at all.

Without changing the order of contents:
The above 2 methods change the order of the file: the unique records may not be in the order in which they appear in the file. The below 2 methods will print the file without duplicates, in the same order in which the records were present in the file.

3. Using awk:
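The text is cut off here, but the usual awk idiom for this task is sketched below: it prints a line only the first time it is seen, using an associative array as a seen-set, so the original order is preserved.

```shell
# Print each record only on its first occurrence, preserving order.
printf 'Unix\nLinux\nSolaris\nAIX\nLinux\n' | awk '!seen[$0]++'
# Output:
# Unix
# Linux
# Solaris
# AIX
```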


Here’s a quick script to show duplicate files on Linux. It should cope with arbitrary spaces in file names, and to save time and CPU resources, it will checksum only files of the same size.

Usage: Save the script to or whatever, then run it with no arguments. A list of duplicated files is output.

#!/bin/bash
#
# Quick script to list duplicate files under the current directory
# v 1.2
#
echo Running find... >&2
find . -type f -size +0 -printf "%-25s %p\n" | sort -n | uniq -D -w 25 > /tmp/dupsizes.$$
echo Calculating $(wc -l < /tmp/dupsizes.$$) checksums... >&2
cat /tmp/dupsizes.$$ | sed 's/^\w* *\(.*\)/md5sum "\1"/' | sh | sort | uniq -w32 --all-repeated=separate > /tmp/dups.$$
echo Found $(grep -c . /tmp/dups.$$) duplicated files
while read md5 filename
do
    if [[ ! -z "$filename" ]]; then
        ls -l "$filename"
    else
        echo
    fi
done < /tmp/dups.$$

Any duplicate files that the script finds are printed in "ls -l" format, grouped by duplicate sets.

bash-4.2$ ./ | more
Running find...

Stevey asked the Answer Line forum for advice on finding and removing duplicate files.

A hard drive is like a family garage--junk expands to fill available space. An SSD behaves very much the same way, but with less space.

A good duplicate file finder will help you reduce your digital junk levels. It can search for files with the same name, the same size, and/or the exact same contents. It helps you examine each file and decide which one to keep. It can ignore small files, so you can concentrate on the more wasteful redundancy.

I'm going to recommend two such programs, both free for non-commercial use.


All things considered, I recommend Digital Volcano's Duplicate Cleaner Free. The attractive, three-tab interface allows the program to provide plenty of options without overwhelming you.

When preparing Duplicate Cleaner for scanning your drive, you...

Helpful Administrative Scripts

These scripts are distributed under standard GNU General Public License terms. You are free to use and distribute them, provided you preserve the attribution comments in them. They are not that sophisticated, but if you find them useful or have suggestions for improvement, please email unix @ the-welters . com and let me know what you think. Most of these are written in Bourne shell. While I think Perl is a much better language for larger or more complex scripts, most of these are so simple that Bourne shell suits them just fine.

cdpinfo - display Cisco CDP packet info via tcpdump or snoop
createVgCloneScript - duplicate a volume group, volume, and filesystem structure
dimmslots - display the DIMM slot arrangement on an IBM Power System
lvinfo - a script to help document Logical Volume Manager information on an AIX system
wackVG - delete the contents of a volume group, including all files and volumes
pstree - this script analyzes "ps" command...


I’ve never been much of a TV watcher. I tend to get my news online and/or, wherever possible, from the original source. I’m also a licensed amateur radio operator (ham). Needless to say, I like all things radio.

There are many satellites orbiting overhead, some of which are weather satellites. Many of these require specialized equipment to receive, and their signals are generally encrypted so the owner of the satellite can sell the data to news stations, weather channels, etc. However, a few are not encrypted and require just a small amount of effort and some minimal hardware to receive.

These satellites tend to fall into two main categories: APT (Automatic Picture Transmission) and LRPT (Low Rate Picture Transmission). APT is an analog signal and is generally the easiest to get started with. LRPT is digital and has always been a little trickier, but with just a little extra effort, you can receive both with the same equipment.

This guide is designed for people who...
