We all know that someone trying to do SEO without comprehensive stats showing him the performance of his website is really flying blind... well, not blind, but without radar. And without radar, you're dead in a dogfight.
I have had AWStats installed for some time now, and can say I like how easy it is to use. I wish that I could track the path of an IP through the site, but I can't. I wish it differentiated between Googlebot and Freshbot, but it doesn't.
It's free, and I like that. It's well maintained and stable... I love that.
I've heard some good words about Webalizer, and I'm considering trying it out to see what else is out there and explore its capabilities.
So what are you using and why?
We use AWStats too, but recently I have seen the light. AWStats is OK for basic reporting, but if you want to look at what your users are doing and actually use the statistics effectively, then try this product:
It is $495 but worth every cent. I can't believe how powerful it is. I have literally made well over $10k in a few days just implementing improvements I learned from the intelligence this product provides on just one site.
Basically, if you want to get serious about improving the user experience and forcing everyone who visits your site to buy, buy, buy!, then this is a great help. The people making it are great about instant messenger support, and they know their stuff.
All I have to say is WOW! I have always despised how most stats programs worked and displayed their info. I have to say this is the most innovative thing I've seen in the stats market so far. I can finally understand my visitors' actions by seeing them in front of me. Hats off to ClickTracks.
That is incredible but how does it work?
It uses remote or downloaded log files just like the rest of the stats programs. The difference is in its presentation (and maybe tracking abilities).
I did find one MAJOR flaw: it does NOT track spiders. I understand the main function of this program is to track visitors' actions on a site, but without tracking spiders it's not worth the $500. Imagine if we could visually track spiders the way it lets us track visitors. Now that would rock.
From their help section:
Many of the requests to your web server are from 'robots' or 'spiders'. These are programs that browse the web automatically, for example to compile indexes for search engines.
Although these requests are important for your site's visibility on search engines, they're not relevant for visitor behavior analysis, and so ClickTracks automatically removes sessions that appear to be generated from robots and spiders. It does this in two ways:
If the session begins by requesting 'robots.txt', this and all subsequent pages from this session are ignored.
If the user agent contains any string from the file 'UserAgentsToIgnore' (using partial matching) the session is ignored.
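The two filtering rules quoted above are straightforward to sketch in Python. This is only an illustration of the logic as described, not ClickTracks' actual implementation; the session dictionary shape and the example patterns are assumptions:

```python
# Sketch of the two spider-filtering rules quoted above.
# A "session" here is assumed to be a dict with the requested
# paths (in order) and the user-agent string.

def load_ignore_patterns(path="UserAgentsToIgnore"):
    """One substring pattern per line, e.g. 'Googlebot' or 'Slurp'."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def is_robot_session(session, ignore_patterns):
    # Rule 1: a session that begins by requesting robots.txt is a robot.
    if session["requests"] and session["requests"][0].endswith("robots.txt"):
        return True
    # Rule 2: partial (substring) match of the user agent
    # against the ignore list.
    ua = session["user_agent"]
    return any(pat in ua for pat in ignore_patterns)
```

Sessions flagged by either rule would simply be dropped before any visitor-behavior analysis.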
Okay, I see that if you have two of the same links on one page (pointing to the same internal page), it cannot differentiate between them (I didn't see how it could without uniquely tagging the links). But still, after trying it on one site, I am really impressed. It showed me some behavior I wasn't expecting.
The real power of ClickTracks is in the tagging feature. It takes some time to get to grips with all the features available, but the essence of tagging is that you can say: show me all the visitors who went to page A, entered through Google and bought on my site, compared against the percentage of users who entered through Google but went to page B.
The tagging really comes into its own when you turn on logging of cookies at the end of your Apache LogFormat directive. Then ClickTracks can help you track groups that are tagged with cookies, so it could be: show me all the users who returned to the site within 7 days of their first visit and then purchased after viewing page C, compared with all the visitors who purchased after returning after 30 days and viewing page D.
As far as I am concerned, this is what the web is all about: the power to know your users is finally within reach. Imagine an offline marketing agency that could get data like this about consumer trends; they would be wetting their knickers with excitement. It is like your whole customer base is now a focus group on the web.
I must remind myself to keep the next revolution under my hat, or at least put an affiliate code in :). But in all seriousness, I gain a lot from being back here, and I would really urge everyone to try this software out. If you have a server log, then you should be using this product [clicktracks.com]. Even if your site is only small and $500 seems like a lot, compared to the amount you can learn by using this software with your site, it is nothing. It is the first piece of software I have personally bought above $250, and if I couldn't afford it I think I would have taken out a loan to buy it. I really think it is that good.
It's getting late here, so I'm off to bed, but if you haven't tried it yet, do it now before you forget, and post back here with your comments. If anyone does anything really cool with it, let me know; I am currently looking for ideas on how to best use it to improve all our sites.
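For anyone wanting to try the cookie logging mentioned above, the usual way is to extend Apache's combined log format with the `%{Cookie}i` field. A minimal sketch (the format nickname `combined_cookies` and log path are placeholders):

```apache
# Extend the standard combined format with the Cookie request header
# so the analysis tool can tie sessions to returning visitors.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\" \"%{Cookie}i\"" combined_cookies
CustomLog logs/access_log combined_cookies
```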
I think the graphics are cool, but I'm not blown away.
You can do 'basically' the same thing with FastStats.
Granted, I may try the free download just to see if it helps with conversions, but the same info is available with the previously mentioned program, just not with the cool graphics of your own website.
I'm just skeptical that this can help with conversions, but I'll try the free trial and report back on how it compares with my other program.
I've been wrong before...
Welp .. hrmm
Does anyone know of ones you can install directly on the server? My log files amount to about 10 gigs a day, and it's just inconvenient to download them every day, etc.
I'm using AWStats, but I am thinking of plunking down for Urchin... anyone tried that?
Plus, if ClickTracks doesn't track search engines... blah!
Sounds like something one of my own stats scripts does (though not to their degree of configurability). It tells me which visitors viewed, say, both the pricing page and the order form, where they came from, what page they came in on, what keywords were used, and so on.
I have a question which may not be 100% on topic, but seeing this post made me think about this.
The same script filters out known bots before it does anything else. Next it checks how many users viewed more than one page, and it uses that list to generate the remainder of the stats.
As to my question: the ratio of the number of visitors (after filtering out known bots) to those who view more than one page is about 60%.
Is this number low? I've run the script for some clients who have high-traffic sites and (give or take 1-2%) the results are the same. Are they visitors who come in but don't find exactly what they were after and leave, or is the bot-blocking routine not strict enough?
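For reference, the ratio being asked about is easy to pin down precisely. A minimal sketch, assuming the input is a list of pages-viewed counts per visitor with known bots already filtered out:

```python
def multi_page_ratio(page_counts):
    """page_counts: pages viewed per visitor, bots already removed.
    Returns the fraction of visitors who viewed more than one page."""
    if not page_counts:
        return 0.0
    multi = sum(1 for n in page_counts if n > 1)
    return multi / len(page_counts)
```

A 60% figure from this would mean 40% of (non-bot) visitors bounced after a single page.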
Demo of clicktracks looks impressive.
Can you run two different log programs at the same time?
That is, can I use another one for tracking the spiders?
I demoed this (ClickTracks) and it told me nothing I didn't already know through Analog and similar.
The tagging is a real pain; a mining approach would be much more suitable.
ClickTracks looks very impressive! Shame it can't be server-installed; we also have big log files.
We have been using NetTracker for the last year, also with great results. It is installed on the server and generates results daily, or every time you click update.
Although it takes a while getting used to, it has a lot of good info, such as paths through the site, IP and visitor tracking through the site, spider views, and entries and exits.
It costs the same as ClickTracks and has been well worth the investment, especially with PPC bids.
Be warned: it is time-consuming going through the stats, though.
ClickTracks is interesting. Yes, all the info it gives can be obtained from other packages, but the unusual presentation of the data gave me pause for thought on a few occasions during the hour or so I've played with it. I think anything that can give you a new perspective on your visitors is potentially useful.
I have tried it and I like it very much. However, compared to AWStats it's not _that_ much better. It shows entry and exit pages and can to some degree be hooked up with your payment stats (I tried this but never managed to get it to work).
Personally, I find that AWStats together with my own scripts works just fine. I set the referer as a cookie the first time users come to my site, and if they pay, I log the referer. Together with a bunch of other tweaks, and looking at AWStats, I think I'm doing OK.
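The referer-cookie trick described above boils down to two small decisions, sketched here framework-agnostically (the cookie name `first_referer` and the fallback values are my own placeholders, not the poster's actual script):

```python
def first_referer_cookie(cookies, referer_header):
    """On a visitor's first request, return the cookie value to set
    (their original referer); on later visits, return None."""
    if "first_referer" in cookies:
        return None
    return referer_header or "direct"

def attribute_sale(cookies):
    """At payment time, report which referer to credit for the sale."""
    return cookies.get("first_referer", "unknown")
```

The point is that the referer is captured once, on the first visit, so a sale days later is still credited to the original source rather than whatever page the buyer arrived from that day.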
I'm checking out clicktracks right now and I may start using it from now on.
I've been using NetTracker for about 45 days now and love it. Very nice layout as well as customizability. Allows me to automate some tasks like log downloads and emailing reports out to clients. It does cost $$ though but I've found it to be very useful for what I need.
Like lorax we use NetTracker.
Where it scores is that it can be installed on a server, and you can have it effectively working in real time if your logs rotate that way. It is also fantastic at tracking your marketing (not out of the box, but then what software does all these things out of the box?). It is useful because it can be browser-based, so if you have clients who want access to their data, it is good for that.
The current version 6 is a big step up from older versions, so anyone who looked at it before and dismissed it should look again. You can drill into everything, so the steps might be:
1. You want to see the visitors from today
2. You want to see the Yahoo visitors
3. You want to see the .co.uk Yahoo visitors
4. You want to see the ones that visited using the keyword "widget"
5. You want to see the path that those that typed "widget" took
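The drill-down steps above amount to chaining successively narrower filters over visitor sessions. A minimal sketch of that idea (the session fields here are assumptions for illustration, not NetTracker's data model):

```python
def drill_down(sessions):
    """Chain the five narrowing steps over a list of session dicts.
    Assumes sessions are already limited to today (step 1)."""
    # 2. Narrow to Yahoo visitors.
    yahoo = [s for s in sessions if "yahoo." in s["referrer"]]
    # 3. Narrow to the .co.uk Yahoo property.
    yahoo_uk = [s for s in yahoo if "yahoo.co.uk" in s["referrer"]]
    # 4. Narrow to visitors who searched for "widget".
    widget = [s for s in yahoo_uk if "widget" in s["keywords"]]
    # 5. Return the paths those visitors took through the site.
    return [s["path"] for s in widget]
```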
I think there is a 15 day full trial.
"but I am thinking of plunking down for Urchin ... anyone try that?"
We use Urchin on a few hundred sites. It's the best I've seen at handling large log files. With Urchin 4 you can also switch on the UTM, which uses both your log files and cookie tracking to eliminate the AOL caching issues.
I have been using Analog for a while.
I manage about 40 sites and have no time to download logs every week (or day) and create reports. I want to configure my stats package and leave it to get on with regular reporting. Analog is good, but there is a lot it doesn't do, so I'm interested in finding a better one.
I need a stats package that:
- Runs on the server
- Automatically creates reports in a web page (so my clients can view updates of their stats without pestering me).
- Shows keyphrases (not just keywords)
Analog does this but it doesn't:
- show paths through site.
- provide detailed spider info.
And while we're on this subject:
Does anyone know of a way of monitoring numbers of people who click the email link in a site? i.e. those who make contact.
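One common approach (an assumption on my part, not something mentioned earlier in the thread): point the visible link at a small redirect script on your own server instead of a bare mailto:, so every click shows up as a hit in the access log, then count those hits. The URL `/contact/email-redirect` below is a placeholder:

```python
def count_email_clicks(log_lines, link_path="/contact/email-redirect"):
    """Count requests to the tracked email-link URL in access-log lines.
    Assumes the mailto link is routed through a redirect script living
    at link_path, so each click produces one GET request in the log."""
    return sum(1 for line in log_lines if f'"GET {link_path}' in line)
```

The redirect script itself just issues a 302 to the real mailto: (or contact page), so visitors notice nothing.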
We've been using Traceworks.com for some time on some projects.
Though it has its limitations, being based on JS, it's still very accurate and has that nice ROI tracking feature. It tracks SEs, PPC, banners, whatever. A bit expensive, but very good support.
I have been impressed overall with ClickTracks. It is nice to quickly see the percentage of people who click on a particular link. I have been playing with a few different link formats on a site of mine, and it is nice to quickly see the effect of a change. There isn't any information that I couldn't get using FastStats, but I like the way ClickTracks presents it. For general traffic checking I still prefer FastStats, though.