Access log analyzer

LogAlerts helps analyze specific crawling problems of search engine bots on websites. It serves specialized purposes and is intended for users with above-average expertise in search engine optimization.

Key features and functionalities in brief:

  • analyzes raw access logs to extract useful information about visits;
  • identifies how search engines crawl pages;
  • clearly shows generated redirects, which makes it very useful for analyzing redirected domains;
  • gives a sense of site visits that carry no Google Analytics tracking code;
  • helps optimize the crawl budget.

Basic concept

The access log itself is a list of all requests for individual files and pages generated on a site by users and bots. These data are usually unprocessed, and aggregating them requires an external program.

By analyzing the raw access logs that the server keeps for us, we can extract a great deal of useful information, most of which cannot be obtained in any other way.

LogAlerts renders this data in a readable way, so specific analyses can be performed to solve problems that are difficult to detect by other methods.

Log file format

The format of the file itself depends directly on the hosting configuration. One of the most commonly used formats is the Combined Log Format. The template is as follows:

"%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\""

With actual data, a log line may look like this:

127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "" "Mozilla/4.08 [en] (Win98; I ;Nav)"
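Such a line can be split into its fields with a regular expression. The following is a minimal sketch in Python; the pattern and group names are our own, not part of LogAlerts:

```python
import re

# Pattern for the Apache Combined Log Format:
# host ident user [time] "request" status size "referer" "user-agent"
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) (?P<ident>\S+) (?P<user>\S+) '
    r'\[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"" "Mozilla/4.08 [en] (Win98; I ;Nav)"')

match = LOG_PATTERN.match(line)
if match:
    entry = match.groupdict()
    print(entry["status"])   # "200"
    print(entry["request"])  # "GET /apache_pb.gif HTTP/1.0"
```

Each named group corresponds to one `%` directive in the template above, so the same pattern works for any server that logs in this format.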

As the examples show, each request is recorded in a fixed format that clearly distinguishes the following data: the type of request, such as GET or POST; the IP address of the client; the server response code; and the User-Agent, which indicates whether the request comes from a bot. LogAlerts aggregates this data and visualizes it in an easy-to-read way.
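The kind of aggregation described above can be sketched as follows. This is an illustration only, not LogAlerts' actual method; the bot-detection heuristic and the sample entries are assumptions:

```python
from collections import Counter

# Illustrative substring list; real crawler detection is more involved.
BOT_HINTS = ("googlebot", "bingbot", "yandex", "crawler", "spider")

def is_bot(user_agent: str) -> bool:
    """Heuristic: treat a request as a bot if its User-Agent contains a known hint."""
    ua = user_agent.lower()
    return any(hint in ua for hint in BOT_HINTS)

def summarize(entries):
    """Count response codes and split hits into bots vs. visitors."""
    status_counts = Counter(e["status"] for e in entries)
    bots = sum(1 for e in entries if is_bot(e["agent"]))
    return status_counts, bots, len(entries) - bots

# Hypothetical parsed entries (status code + User-Agent per request).
entries = [
    {"status": "200", "agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    {"status": "301", "agent": "Mozilla/5.0 (Windows NT 10.0)"},
    {"status": "200", "agent": "Mozilla/5.0 (Macintosh)"},
]
statuses, bots, visitors = summarize(entries)
print(statuses)         # Counter({'200': 2, '301': 1})
print(bots, visitors)   # 1 2
```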

Interpretation of reports

When uploading the *.log file for analysis, you can select the "Only crawlers" checkbox. The report will then cover only requests from search engine bots and similar crawlers. This is useful if you want to study robot-only behavior without including regular users.
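An "Only crawlers" filter can be approximated by matching User-Agent substrings in the raw log lines. A minimal sketch, assuming a hypothetical hint list (real crawler detection also verifies IPs and reverse DNS):

```python
# Keep only log lines whose User-Agent suggests a crawler.
# The substring list is illustrative, not LogAlerts' actual rule set.
CRAWLER_HINTS = ("bot", "crawler", "spider")

def only_crawlers(lines):
    return [ln for ln in lines if any(h in ln.lower() for h in CRAWLER_HINTS)]

lines = [
    '66.249.66.1 - - [10/Oct/2000:13:55:36 -0700] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [10/Oct/2000:13:56:01 -0700] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(only_crawlers(lines))  # keeps only the Googlebot line
```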

After successfully uploading the file, you can download the report locally in HTML format or open it directly in a browser. You can also share the report link with third parties so they can analyze the data and interpret the information.

Download demo file for quick test

For convenience, we provide an example log file so you can test the tool without preparing your own log:
Demo file here