And then send an email?
-
You could use lynx, curl, wget. Check return code and use mail to send email. Put it in cron.
Or use an existing monitoring package, such as Nagios.
xeon : Or Zenoss as a monitoring package.
Joe H. : nagios, big brother and other monitoring packages keep track of state -- so they won't mail you *every* time it's not up (everyone loves the 100 messages of '(x) service is down!' in the morning). Some can be configured so that a single failure isn't reason to start alerting, and there's a system of escalation, so it might start with email, move to paging, then contact more people if the service stays down.
J.Zimmerman : To really be useful, the Zenoss ZenPack to monitor a website requires a license. We are running Zenoss Core and are really missing that functionality. It was easier to monitor a website with Nagios, and easier to build a custom plugin for Nagios. A simple notification shell script using bash/wget/grep/mailx should be adequate. If not, look at Nagios.
Jim B : If you are going to go the route of using monitoring software, I'd recommend Argent Guardian. It's not free, but unlike the free options you'll have it up and running in a few hours with whatever escalation schedule and notifications you like.
From alex -
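The curl/mail/cron approach suggested above can be sketched roughly as follows. This is a minimal sketch, not a definitive script: the URL and email address are placeholders, and it assumes curl and mail are installed.

```shell
#!/bin/sh
# check_site URL -> exit 0 if the page fetches cleanly, nonzero otherwise.
# curl -f turns HTTP errors (4xx/5xx) into a nonzero exit status,
# -sS hides the progress bar but keeps error messages, and
# -o /dev/null discards the body, since only reachability matters here.
check_site() {
    curl -fsS -o /dev/null --max-time 10 "$1"
}

# Placeholder URL and address; substitute your own.
URL="http://www.example.com/"
ADMIN="admin@example.com"

if check_site "$URL"; then
    echo "up"
else
    echo "down"
    # From cron you would mail instead of (or as well as) echoing:
    # echo "$URL appears to be down" | mail -s "website down" "$ADMIN"
fi
```

Dropped into cron (e.g. `*/5 * * * * /usr/local/bin/check-site.sh`), even the bare echo becomes an email, since cron mails any job output to the crontab owner.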
You can write a simple Perl program that runs in a loop at 5-minute intervals.

#!/usr/bin/perl
use LWP::Simple qw(get);

for (;;) {
    sleep 300;
    if ( get("http://URL GOES HERE") ) {
        print 'ok';
    } else {
        print 'error';
    }
}

OR
#!/usr/bin/perl
use Net::Telnet;

$connection = Net::Telnet->new(
    Timeout => 5,
    Host    => "www.example.com",
    Port    => 80,
    Errmode => sub { &error; },
);

sub error {
    print "Connection Failed!\n";
}

Evan Carroll : Telnetting to port 80 doesn't establish that the website is up, it only establishes that something is accepting connections on port 80.
djangofan : No, but you can add to the script to get the HTTP headers from the telnet request...
From djangofan -
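As Evan Carroll's comment points out, a bare TCP connect only proves that something is listening. A sketch that also looks at the HTTP status code, using curl rather than telnet (the URL is a placeholder):

```shell
#!/bin/sh
# -s silences progress output, -o /dev/null discards the body, and
# -w '%{http_code}' prints just the numeric HTTP status code.
# When no HTTP response comes back at all, curl prints 000.
status=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' "http://www.example.com/")

case "$status" in
    2??) echo "up ($status)" ;;
    *)   echo "down or unhealthy ($status)" ;;
esac
```

This distinguishes "something answered on port 80" from "the web server returned a successful response."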
If it's important enough to send email, you may want to be able to customize your monitoring and alerts.
I would recommend trying one of the following free packages:
Zenoss - VMWare appliance
Nagios - VMWare appliance
Munin
From hurfdurf -
I put the following in my crontab; every 15 minutes it checks my web site, and if it's down I get an email.
*/15 * * * * if ! wget -q --spider http://www.rochesterflyingclub.com/ >/dev/null 2>&1; then echo "Rochester Flying Club site is down" ; fi

From Paul Tomblin -
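To borrow Joe H.'s point about a single failure not being reason to alert, the cron job can re-check once before complaining. A rough sketch along the same wget --spider lines (the URL is a placeholder):

```shell
#!/bin/sh
# Placeholder URL; substitute your own site.
URL="http://www.example.com/"

# --spider fetches headers only; -q suppresses wget's normal output.
check() { wget -q --spider --timeout=10 "$URL"; }

if ! check; then
    sleep 5          # brief pause, then re-check, to ride out a blip
    if ! check; then
        echo "$URL is down"   # cron mails job output to the crontab owner
    fi
fi
```

Run it from cron as before; the email only arrives when two checks in a row fail.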
Well, if you don't want to do it yourself, you can try a free online tool to do that:
It not only checks whether your site is available, but also whether it has been changed, blacklisted, had its IPs changed, etc.
From sucuri -
http://www.pingdom.com/ is free for a single web site.
Uptime and response time with graphs, notifications, iPhone app, etc.
From astrostl -
Could do this with a bash one-liner:

#!/usr/bin/bash
while sleep $((60*5)); do
    wget -nv http://not.com/exist.html 2>&1 | grep -C1 ERROR && mailx -s "WEBSITE ERROR" "user@host.com"
done

Nagios is a great suggestion if you find yourself needing more information /like/ this (graphs, long-term logs, easy reports) - I'd find it total overkill to get this job done, though.
David Mackintosh : Lot of line breaks in that one-liner.
Evan Carroll : the top is a loop; the one-liner is: `wget -nv http://not.com/exist.html 2>&1 | grep -C1 ERROR && mailx -s "WEBSITE ERROR" "user@host.com"`. Of course, you could fold the loop too, but that would be cheating. The point is, such a tiny task is easily done.
From Evan Carroll -
Nagios http://nagios.org/ is what you need. It's a piece of software that monitors things. It can do what you want and much, much more.
From Rory McCann -
We use nagios (actually Groundwork) to do this exact thing. You can use the basic http and https checks to see if the web server responds to an IP or a domain name. If you want to make sure that the website is running and displaying valid content you need to go a little further. We use the http_content plugin to check that specific content is displaying.
From Craig
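The content check Craig describes can be approximated in shell as well: fetch the page and grep for a string that should always be present. This is a sketch, not the http_content plugin itself; the URL and marker string are placeholders.

```shell
#!/bin/sh
# Placeholder URL and marker string; pick a phrase that only appears
# when the site is serving real content, not an error page.
URL="http://www.example.com/"
MARKER="Example Domain"

# curl -f fails on HTTP errors, so an error page never reaches grep.
if curl -fsS --max-time 10 "$URL" | grep -q "$MARKER"; then
    echo "content check passed"
else
    echo "content check FAILED"
fi
```

This catches the case where the server answers 200 but serves a blank or broken page, which a plain reachability check would miss.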