yum -y install gcc make gperf genisoimage flex bison ncurses ncurses-devel pcre-devel augeas-devel augeas readline-devel
checking for cpio... cpio
checking for gperf... no
configure: error: gperf must be installed
configure: error: Package requirements (augeas >= 1.2.0) were not met:
Requested 'augeas >= 1.2.0' but version of augeas is 1.0.0
yum remove augeas augeas-libs augeas-devel
wget http://downl........
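Once the newer augeas is built and installed (the download above is presumably a newer source tarball), you can confirm that pkg-config now reports a version satisfying the >= 1.2.0 requirement, so configure will pass:
pkg-config --modversion augeas   # should print 1.2.0 or higher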
systemd is the service manager for CentOS and other modern Linux distributions (including Debian/Mint/Ubuntu); it allows you to enable services, stop them, restart them, check their status and even reboot your system.
The key commands or arguments you will use with systemctl are the following (usage examples follow the list):
Unit Commands:
list-units [PATTERN...] List loaded units
........
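For example, everyday usage with a hypothetical service looks like this:
systemctl status sshd    # check whether the service is running
systemctl restart sshd   # stop and start it again
systemctl enable sshd    # start it automatically at boot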
I am using a GTX 1060, but replace the driver download with the correct/current version for your particular card by visiting: http://www.nvidia.com/Download/index.aspx?lang=en-us
yum install automake curl openssl-devel libcurl-devel gcc gcc-c++
yum -y install kernel-devel-`uname -r`
yum -y install unzip
#the........
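The comment above is cut off, but the usual remaining steps are a sketch like this (the installer file name is a placeholder; use the one you downloaded from the NVIDIA link above):
chmod +x NVIDIA-Linux-x86_64-XXX.XX.run
./NVIDIA-Linux-x86_64-XXX.XX.run   # run this outside of X, e.g. from a text console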
There are usually two reasons for this.
#1 The most common is that you need to enable the -r (recursive) flag with zip to make it recurse into directories.
So the solution is to use -r
zip -r somefile.zip yourfiles
#2 If you are scripting with bash based on the output of ls without the full path, or the full path is missing for some other reason, zip looks for the files in the current directory, so this will always fail.........
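A minimal sketch of case #2 (the paths are hypothetical): give zip the full path, or cd into the right directory first:
# Works from anywhere because the full path is supplied
zip -r /tmp/somefile.zip /home/user/yourfiles
# Equivalent: change into the directory so relative names resolve
cd /home/user && zip -r /tmp/somefile.zip yourfiles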
whois in Linux is incredibly out of date and does not seem to recognize most new TLDs, but there is a quick and easy tip/hack/tweak for this.
Examples of new TLDs are .review, .site and .club.
whois somesite.club
No whois server is known for this kind of object.
bash to the rescue
Now, I did try to apply this in .bashrc but DO NOT! I think the * wil........
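The excerpt is cut off, but a minimal sketch of the kind of bash helper it describes (my assumption: most new gTLD registries run their whois service at whois.nic.<tld>) would be:
# Hypothetical wrapper: extract the TLD and ask whois.nic.<tld> directly
nwhois() {
  local tld="${1##*.}"
  whois -h "whois.nic.$tld" "$1"
}
nwhois somesite.club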
service named status
rndc: connect failed: 127.0.0.1#953: connection refused
named (pid 10557) is running...
This issue is normally caused by a permissions problem: named doesn't have permission to read rndc.key.
Check /var/log/messages:
Jan 4 17:06:22 storagebox named[10753]: none:0: open: /etc/rndc.key: permission denied
Jan 4 17:06:22 storagebox named[10........
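That log line confirms it. The usual remedy on CentOS is a hedged sketch like this (the post is cut off; root:named and mode 640 are the stock permissions for the key):
chown root:named /etc/rndc.key
chmod 640 /etc/rndc.key
service named restart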
This can save a lot of time. Otherwise grep will go through an entire directory recursively, searching every type of file; but what if you are sure you only need to search txt or php files?
grep -r -i --include="*.php" "what you are searching for" /the/path/to/search........
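GNU grep also accepts --include more than once, so you can cover both file types mentioned above in one pass (the search path is a placeholder):
grep -r -i --include="*.txt" --include="*.php" "what you are searching for" /the/path/to/search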
wget http://ftp.gnu.org/gnu/bash/bash-4.3.tar.gz
tar xzvf bash-4.3.tar.gz
cd bash-4.3/
wget --no-directories --level 1 --recursive http://ftp.gnu.org/gnu/bash/bash-4.3-patches/
for patch in `ls bash43-* | grep -v '\.sig$'`; do
echo applying "$patch"
patch -p0 < "$patch"
done
./configure && make && make install
#it will install to /usr/bin/bash but if your bash is somewhere else you need to overwrite the old one.........
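Either way, it's worth confirming which bash the system actually uses and that it now reports the expected patch level (the version numbers here are illustrative):
which bash
bash --version   # e.g. GNU bash, version 4.3.30(1)-release after 30 patches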
c1: warnings being treated as errors
stonith_signal.h:34: error: 'stonith_signal_set_simple_handler' defined but not used
gmake[3]: *** [apcmaster.lo] Error 1
gmake[3]: Leaving directory `/root/rpmbuild/BUILD/heartbeat-1.2.4/lib/plugins/stonith'
gmake[2]: *** [all-recursive] Error 1
gmake[2]: Leaving directory `/root/rpmbuild/BUILD/heartbeat-1.2.4/lib/plugins'
gmake[1]: *** [all-recursive] Error 1
gmake[1]: Leaving directory `/root/rpmbuild/BUI........
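The excerpt ends before the fix, but a common workaround for this class of failure (my assumption, not necessarily the author's solution) is to rebuild with warnings no longer treated as errors, i.e. without -Werror in the compiler flags:
# Hypothetical workaround: configure with CFLAGS that omit -Werror
./configure CFLAGS="-O2 -Wall"
make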
Have you ever found a website/page with several, or perhaps dozens, hundreds or thousands of files that you need downloaded, but don't have the time to do it manually?
wget's recursive mode, invoked with -r, does exactly that, but with some quirks to be warned about.
If you're doing it from a standard web-based directory structure, you will notice there is still a link to .. and wget will follow that.
E.g. let's say you have files in http://serverip/documen........
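A sketch of where this is heading (assuming the usual cure for the .. quirk, wget's --no-parent switch; the URL is a placeholder):
# -r recurse, -np never ascend to the parent directory, -nH skip the hostname directory
wget -r -np -nH http://serverip/documents/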
Normally if you're in a certain directory you could do:
find *.txt, and it will work as expected, but it won't recurse through child directories. Here's the correct way to do it:
find . -type f -name "*.txt"
(Quote the pattern so the shell doesn't expand the glob before find sees it.)
The "-type f" is optional; it restricts matches to regular files, but we could have specified d for directories, etc...
The above command will work recursively as you'd expect. In that way I find "find" to be un........
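Two related invocations worth knowing (standard find options, not from the excerpt above):
find . -type f -iname "*.txt"                              # case-insensitive match
find . -type f -name "*.txt" -exec grep -l "term" {} +     # search inside the matches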
I don't have a solution other than to use rsync. I used diff on about 1.7TB of data, which includes hundreds of thousands if not millions of small files, to ensure nothing was missing or corrupt.
diff didn't even get past the first large directory without spitting that error out.
Keep in mind I used "diff -r" because that means recursive; otherwise it wouldn't compare all files and subdirectories, which would be a false way of doing it.........
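A sketch of how rsync can do the same comparison (the paths are placeholders; -c compares by checksum and -n makes it a dry run, so differing files are listed but nothing is copied):
rsync -acnv /path/to/original/ /path/to/copy/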
Install the "Editors" and "Net" groups that will give you rsync, ssh, ssh-keygen and cron.
The trickiest thing that I keep forgetting about each time is you have to run "cron-config" which adds the cron service to Windows, and without doing that obviously no cron jobs will be run thus making automatic backups impossible.
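After running cron-config you can verify the service was actually installed (cygrunsrv is Cygwin's standard service manager):
cygrunsrv -Q cron   # query the state of the cron service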
Warning about rsync/cygwin and using the -a archive switch.
It's a good thing I caught this because it doesn't work ri........
Not sure what rsync switches/options to use?
rsync -PDrphogtl
The short version would be:
rsync -Pha
I think these are really common-sense options to use and probably should be the default.
Explanation of rsync switches
P = display the progress
D = a hybrid of --specials and --devices, so all special and device files will be copied as well.
r = recursive (otherwise rsync won't copy files deeper than........
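Putting the switches together, a usage sketch (host and paths are hypothetical placeholders):
rsync -Pha /source/dir/ user@backuphost:/destination/dir/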
This will give you the basic info needed to browse and connect to Samba shares from the command line. From the GUI of Gnome or KDE etc., it is pretty standard and straightforward. However, I've found very few guides on how to do it from the command line, and if you're like me, a nerd who prefers the command line for its simplicity and for remote use, this is the way to go.
First get a list of all the Samba/SMB shares on the target.
smbclient -L hostname........
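Then (my assumption for the truncated continuation) connect to one of the listed shares; the share name and username are placeholders:
smbclient //hostname/sharename -U username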