What do malicious log entries look like?

A good server admin scrubs her logs constantly. Whether you do this automatically or by hand, there is one sure sign of malicious behavior: the ol’ “they forgot to delete that file” trick.

When you set up something like phpMyAdmin, or anything else that ships with a setup or install script, those scripts live in very predictable locations. Attackers exploit the fact that, after setup, some people forget to delete these scripts or restrict their permissions.

Going through my logs this morning, I found this:

- - [25/Jan/2014:09:41:29 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:09:41:29 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:10:07:39 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:10:07:41 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:10:07:41 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:10:07:43 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:12:52:48 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:12:52:48 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:12:52:48 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
- - [25/Jan/2014:12:52:49 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 301 178 "-" "-"

See what they’re trying to do there? They have a pre-loaded list of URLs to probe, all of which are default install locations for tools like phpMyAdmin. This batch of “hackers” tried to exploit a vulnerability that simply isn’t there, and they ran into my blackhole, which now blocks them from the network entirely.

Be sure to look for patterns like this when you’re scrubbing your logs. If the same IP keeps trying to access something you know isn’t there, varying the path or simply hitting it over and over again, chances are they have bad intentions.
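As a sketch of that kind of pattern-hunting, the pipeline below counts probe requests per client IP in a combined-format Apache log. The sample file, its path, and the documentation-range IPs are all invented for illustration (the excerpts above have the IPs stripped); point the grep at your real access log instead.

```shell
# Build a tiny sample log; in practice you would read the live
# Apache access log. IPs here are invented for illustration.
cat > /tmp/sample_access.log <<'EOF'
198.51.100.7 - - [25/Jan/2014:09:41:29 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
198.51.100.7 - - [25/Jan/2014:09:41:29 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 301 178 "-" "-"
203.0.113.9 - - [25/Jan/2014:10:07:39 -0800] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Count requests for known probe paths per client IP (field 1 of a
# combined log), busiest offenders first.
grep -E 'setup\.php|axa\.php' /tmp/sample_access.log \
  | awk '{ print $1 }' | sort | uniq -c | sort -rn
```

Any IP that shows up repeatedly against paths you know don’t exist is a candidate for the blackhole.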

29 thoughts on “What do malicious log entries look like?”

  1. Same activity on my website!

    Almost every half hour a different IP address, but the same requests!

    And every time the same HTTP 404!

    1. If they’re getting a 404 error, that usually means they’re looking for content that doesn’t exist. As in my example, this is most often bots probing for a loosely secured box. Do you need help filtering these out?

      One thing I would recommend is to also check your SSH traffic. If people are trying to break in on port 80, there will surely be people trying to break in via SSH. Ensure that you have disabled root login, enabled fail2ban, and are actively checking your logs.
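      For reference, a minimal sketch of those sshd settings (the username is a placeholder; edit /etc/ssh/sshd_config and reload sshd afterwards):

```
# /etc/ssh/sshd_config (fragment)
PermitRootLogin no     # refuse direct root logins
AllowUsers alice       # only whitelisted accounts may connect
```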

  2. I got fed up with all these idiots using my bandwidth, so I knocked up a bit of Perl:

    #!/usr/bin/perl -w

    use File::Tail;

    my $file;
    my $line;
    my $drop = " -j DROP";
    my $prev = "";

    # Follow the access log as it grows (path is a placeholder).
    $file = File::Tail->new(name => "/PATH/TO/APACHE/access_log",
                            maxinterval => 1, interval => 1);
    while (defined($line = $file->read)) {
        if ($line =~ /php/i) {
            chomp $line;
            $line =~ s/ .*//;    # keep only the first field: the client IP
            if ($line ne $prev) {
                `echo "$line" >> /home/USER/logfile`;
                `/usr/sbin/iptables -I INPUT -s $line $drop`;
                $prev = $line;   # avoid duplicate rules for the same IP
            }
        }
    }

    and run it in a ‘screen’ session. The iptables rules are not persistent, so on each reboot you start again – but I don’t reboot much.

    The last 24 hours or so got this:


    1. Great example. Have you considered using ipset along with iptables? At my company we have to block anywhere between 5 and 50 thousand IPs each day, and ipset lookups are far faster than individual iptables rules. Your method does work, though. I would go one step further and cron this to run each day or week (right now we do twice-daily log scrubs because of how “popular” our servers are).
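      As a sketch of that combination (the set name ‘blackhole’ and the sample IP are arbitrary, and these commands need root, so treat this as a configuration fragment rather than a tested script):

```
# One-time setup: a hash-of-IPs set matched by a single iptables rule.
ipset create blackhole hash:ip
iptables -I INPUT -m set --match-set blackhole src -j DROP

# Banning an address is then a constant-time set insert, however many
# thousands of entries the set already holds:
ipset add blackhole 198.51.100.7 -exist
```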

        1. Of course. However, fail2ban and similar services are generally considered ‘corrective controls’ in the security world. Employing your own log-parsing strategies and bash scripts to analyze and prevent future malicious traffic is a detective and preventive control, and strengthens your server’s security in the long run.

          As with any corrective control (banning *after* the log entry was made), there are many ways around it. Here’s one example of a type of attack that can get around fail2ban.

      1. Thanks for the heads-up on ipsets. New code:

        #!/usr/bin/perl -w
        # Nick – 30/01/2014

        my $line;
        my $log = "/PATH/TO/APACHE/access_log";

        # Use the system tail, which responds faster than File::Tail.
        open(LOG, "/usr/bin/tail -F -n 1 $log |") || die "ERROR: could not open log file.\n";

        while (defined($line = <LOG>)) {
            if ( ($line =~ /php/i) || ($line =~ /xml/i)
              || ($line =~ /\/manager\/html/i) || ($line =~ /w00tw00t/i)
              || ($line =~ /\/HNAP1\//i) || ($line =~ /CONNECT/i) ) {
                chomp $line;
                $line =~ s/ .*//;               # keep only the client IP
                `ipset add bh $line -exist`;    # no error if already present
                `ipset save bh -f /home/USER/FILE`;
            }
        }

        and in my iptables rules:

        ipset restore -f /home/USER/FILE
        $IPT -A INPUT -m set --match-set bh src -j DROP

        Works a treat – I also changed to using the system ‘tail’, as it responds more quickly than Perl’s File::Tail module for some reason.

        Lastly, another observation: since I have been running this, ‘php’ scans have dropped from 2 or 3 every 2 hours to NONE since 06:53 this morning (13 hours ago) – so I wonder if the bot network detects the drops and pulls my IP out of the bot network control centres? I dunno.


        1. Nick, I noticed a huge drop in the PHP scans on my end, too. It may be that they automatically remove URLs that drop traffic so they’re not wasting resources. If I were going to write a scraper, that’s how I would do it, too.

          – Jesse

  3. First off, I forgot to mention I do not run any PHP (nor have it installed), so I can just bin any of these requests immediately. Blocked as soon as I get hit, no questions asked.

    Secondly, I forgot to say that Perl’s File::Tail takes about a second to ‘catch up’, so I may get 3 or 4 hits while an ‘idiot’ is scanning my server before the iptables rule is written (hence why I test in the code whether it’s a duplicate). Perhaps I should look at what these people are doing to scan access_log (although the whole code is way over the top for me to consider – just detect it and DROP):


    And as to your question: I get very few real visitor hits compared to these idiots, so to keep it easy, just that little bit of code works great. Even though I have been running it only a little over 36 hours, I have already seen a large drop in my server stats (webalizer) :)


    1. Nick, are you on a Linux server? Are you using fail2ban? If this sort of malicious behavior exists on port 80, I guarantee you have some shady log entries for your SSH, FTP, mail, and other ports. In your /etc/ssh/sshd_config you can also set `AllowUsers you@your_ip` to ensure that SSH access can only be granted to you (make sure you have a backdoor into your system, though, or you’ll lock yourself out completely if your IP address changes!).

      If you’re on Linux there are tons of quick commands you can run to get a list of all the IPs behaving maliciously. Lots of simple copy-and-paste log analytics.
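      For example, a one-liner that ranks client IPs by request count. The sample entries below stand in for a real access log (IPs invented; the excerpts in this thread have them stripped):

```shell
# Fake a small access log; point this at the real one in practice.
cat > /tmp/access.log <<'EOF'
203.0.113.9 - - [27/Jan/2014:18:03:38 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 302 -
203.0.113.9 - - [27/Jan/2014:18:03:39 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 302 -
198.51.100.4 - - [27/Jan/2014:18:05:00 -0800] "GET / HTTP/1.1" 200 1042
EOF

# Tally hits per client IP and list the busiest first.
awk '{ count[$1]++ } END { for (ip in count) print count[ip], ip }' /tmp/access.log | sort -rn
```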

      – Jesse

  4. Yes, of course it’s GNU/Linux – I haven’t used anything else since 2002 or so.

    Don’t worry about SSH*, I have all that tied down, and the mail server is locked down too (the best thing is a blacklist file that updates often).

    It was just the PHP scans that bothered me, because of the bandwidth they use. I saw one the other day that took about 5 minutes to run and hit every combination of PHP file (and other stuff) you can think of.

    It would be interesting to see what happens at the other end when these people run their ‘kiddie scripts’ and it bombs out in the first second – hopefully they are running MS stuff and get a BSOD.


  5. Hello Folks,

    My Magento site is down again in January. My host says they will fix it, but it’s already been 16 hours and we still have errors on the site:

    Fatal error: session_start() [function.session-start]: Failed to initialize storage module: user (path: /home/xxxxx/public_html/var/session) in /home/xxxxx/public_html/app/code/core/Mage/Core/Model/Session/Abstract/Varien.php on line 123

    I just reviewed my raw access log and found this:

    - - [27/Jan/2014:18:03:38 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:03:39 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:03:40 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:03:41 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:24:28 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:24:32 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:24:32 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:18:24:34 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:21:55:28 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:21:55:30 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:21:55:32 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:21:55:33 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:22:23:00 -0800] "GET /phpTest/zologize/axa.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:22:23:01 -0800] "GET /phpMyAdmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:22:23:02 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 302 - "-" "-"
    - - [27/Jan/2014:22:23:03 -0800] "GET /myadmin/scripts/setup.php HTTP/1.1" 302 - "-" "-"

    Is it possible that the error listed above happened because they found the black hole in our system?

    Sorry, but I am in the middle of learning all this stuff ;-)

    Thanks for any help in advance

    Regards, Arthur

    1. Hi Arthur,

      Sounds like you need a new host! My company DashingWP only does WordPress, so if you’re ever looking for a WP-exclusive environment please get a hold of me.

      As for your problems, I don’t see anything in the logs that indicates access to your system. All of those requests are getting a 302 redirect. If anything were returning a 200, that would be cause for concern (it would mean the file was found and served). I would definitely get those IPs banned and gone.

      As for the failed session_start, I’ve seen this happen when the folder it refers to (/var/session) is not writable. Have you checked that?
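      A quick way to check that (the paths reuse the `xxxxx` placeholder from the error message, and the web-server user ‘www-data’ is an assumption that varies by host):

```
# Show the owner and permissions of the session directory:
ls -ld /home/xxxxx/public_html/var/session
# If the web server cannot write there, hand it ownership:
chown -R www-data:www-data /home/xxxxx/public_html/var/session
chmod -R 770 /home/xxxxx/public_html/var/session
```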

  6. Arthur,

    What you are seeing in the logs are typical bot scans for PHP vulnerabilities – they have nothing to do with why your host is down.

    I have now enhanced my code a bit.

    The perl script now logs IP addresses to a file… and in my IPTABLES script I added:

    while read line; do
        $IPT -A INPUT -s $line -j DROP
    done < /home/USER/logfile

    so that now, not only does the IP get DROPped within a second, but on any reboot or firewall restart the ban is reinstated – effectively permanent.

    As to the fail2ban comment: I looked at several things like this, and for my needs a lot of them are way over the top, requiring more stuff to be watched than necessary.

    IPs caught to date, after about 38 hours:


  7. I’m new to log checking. I have entries exactly like the ones everyone else has posted. These showed up just today:

    /phpTest/zologize/axa.php
    /phpTest/zologize/axa.php
    /phpTest/zologize/axa.php
    /phpTest/zologize/axa.php

    We have no PHP installed at all on our server. What is the best way for me to approach this using IIS 6? What kind of checks should I add to our server? Is there software that is good for handling this?

    Thank you

    1. I think the default place for access logs is


      Otherwise, check under IIS Manager: select the computer in the left pane, then in the middle pane open “Logging” in the IIS area. There you will see the default location for all sites (though this can be overridden per site).

      You could also look into


      Which will contain similar log files that represent only errors.

  8. Interesting article. I have many entries like
    We don’t use PHP. We have a website used by students, and it is likely one of them is seeing whether he/she can do something interesting :) while others are accessing our site normally. The issue is that all of them may come via the same IP (they are on a LAN and reach the net through a proxy). Any ideas on how to handle that?

  9. Hi,
    almost the same log entries here.
    In my case I have used fail2ban + Better WP Security, and I’m getting a lot of attempts from a lot of IPs.
    Maybe they will calm down.

  10. Quick update on this: since I implemented the above script that immediately drops the IPs involved in these scans, I have seen my HTTP port-80 traffic drop by at least 50% over the last month and a half :)


      1. Just for an example here: I updated the kernel on my server last week and forgot to set the script running… results from daily hits (hint: look at the 5th of April):

        Day  Hits         Files        Pages        Visits       Sites        KBytes
          1   45  3.21%    39  4.22%    40  4.26%    23  3.70%    21  4.98%   105  4.05%
          2   59  4.21%    44  4.76%    47  5.01%    29  4.66%    28  6.64%   109  4.22%
          3   90  6.42%    81  8.77%    75  7.99%    41  6.59%    40  9.48%   112  4.32%
          4   92  6.57%    83  8.98%    81  8.63%    39  6.27%    38  9.00%   270 10.42%
          5  301 21.48%    42  4.55%    48  5.11%    21  3.38%    19  4.50%   124  4.78%

  11. OK, they have bad intentions. But is it really of any use trying to block them? I mean, if the things they are looking for don’t exist or are properly secured, they will just keep generating 404s, and looking at the speed they change IP addresses, it seems to me like tilting at windmills.

  12. Both of these ideas should work well on regular exposed web servers. However, my data comes through a load balancer first, which means that if I block by source address, I will be blocking my own load balancer!

    Additionally, if they come in as HTTPS traffic, the URL is encrypted in transit, so I won’t be able to inspect the URI stem prior to arrival at the server.
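    One common workaround, assuming the balancer injects an `X-Forwarded-For` header and your log format appends it as a final quoted field (both assumptions; adjust the field index for your format), is to ban on that value instead of the TCP source address:

```shell
# A log line whose last quoted field is the X-Forwarded-For value
# (addresses invented for illustration).
line='10.0.0.5 - - [25/Jan/2014:09:41:29 -0800] "GET /pma/scripts/setup.php HTTP/1.1" 301 178 "-" "-" "203.0.113.50"'

# Split on double quotes and take the last non-empty field: the
# real client IP as reported by the load balancer.
echo "$line" | awk -F'"' '{ print $(NF-1) }'   # prints 203.0.113.50
```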

  13. To whom it may concern, this might also be useful. I’ve found a way to keep out those pesky no-referrer, no-user-agent bots:

    SetEnvIfNoCase User-Agent ^$ keep_out
    Deny from env=keep_out

    It should return a 403 Forbidden.

  14. Sorry for my n00bness, but why not block at the webserver level?
    In my case it’s a home-hosted Debian/nginx server behind a cheap router/firewall.
    Does it really save any bandwidth to block at the firewall level rather than at the webserver?
    I’m tempted to use the script quoted above that uses ipsets…

