
mneylon

Administrator
Staff member
Can anyone recommend a Perl or PHP script for generating sitemaps compatible with Google and Yahoo?
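
Not a recommendation for a specific script, but for reference, a minimal sketch of what any such script has to produce: the sitemaps.org XML format that both Google and Yahoo accept. The URL list and output path below are placeholders, not any real site.

<?php
// Minimal sketch of a sitemaps.org-compatible generator in PHP -- the
// protocol both Google and Yahoo accept. The URL list and output path are
// placeholders; a real script would pull the URLs from a database or a crawl.
$urls = array(
    'http://www.example.com/',
    'http://www.example.com/about',
    'http://www.example.com/contact',
);

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

foreach ($urls as $url) {
    $xml .= "  <url>\n";
    $xml .= '    <loc>' . htmlspecialchars($url, ENT_QUOTES) . "</loc>\n";
    $xml .= '    <lastmod>' . date('Y-m-d') . "</lastmod>\n";
    $xml .= "    <changefreq>weekly</changefreq>\n";
    $xml .= "  </url>\n";
}

$xml .= "</urlset>\n";

// Write to a static file so the web server can serve it directly.
file_put_contents('sitemap.xml', $xml);
?>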
 

Forbairt

Teaching / Designing / Developing
Cheers for that. I keep meaning to write my own :) I tend to use the xml-sitemap generator site most of the time.

How would it handle something like your DVDs site, though?

:eek: :D
 

Forbairt

Teaching / Designing / Developing
I'd assume it'd time out :D

Is it set up with a cron job that writes to a file, or is it generated in real time? (If it's real time, for my uses it'll be getting hacked.) (Maybe I should just read the instructions.)
 

mneylon

Administrator
Staff member
It seems to do it in real time. I don't see how that would mean it's being hacked; badly thought out, maybe.
 

Forbairt

Teaching / Designing / Developing
Sorry ... I should have said: I'll be hacking it to write to a file with a cron job :)
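
Roughly what that looks like, assuming the generator's output can be captured from a wrapper script and written out on a schedule (the paths and the generate_sitemap_xml() function name below are placeholders, not the actual script's API):

<?php
// build_sitemap.php -- regenerate the sitemap on a schedule instead of on
// every request, then let the web server serve the static file.
//
// Example crontab entry (nightly at 3am; paths are placeholders):
//   0 3 * * * /usr/bin/php /home/site/scripts/build_sitemap.php
//
// generate_sitemap_xml() is a stand-in for however the generator actually
// builds its output; it is not the real script's API.
require '/home/site/scripts/sitemap_generator.php';

$xml = generate_sitemap_xml();
file_put_contents('/home/site/htdocs/sitemap.xml', $xml);
?>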
 

paul

Ninja
Handy script.

I've found that it takes a little while for your sitemap to get noticed, i.e. G will download it, but it takes a bit of time (1-2 months) before pages start showing as indexed in the console. Or maybe that's just my experience.
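
One thing that can speed up that first fetch, assuming the ping endpoint Google documented for Sitemaps at the time: notify Google whenever the file is regenerated. A sketch:

<?php
// Ping Google after (re)writing the sitemap so it knows to re-fetch it.
// The endpoint is the one documented for Google Sitemaps at the time;
// the sitemap URL is a placeholder. Requires allow_url_fopen to be enabled.
$sitemap = urlencode('http://www.example.com/sitemap.xml');
$result  = file_get_contents('http://www.google.com/ping?sitemap=' . $sitemap);

echo ($result === false) ? "Ping failed\n" : "Pinged Google\n";
?>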
 

Forbairt

Teaching / Designing / Developing
It's been a day plus for me ... I think I'll test with a new site shortly though :eek:
 

paul

Ninja
Hmmm, maybe it's the sheer number of pages I have, then ...
 

Forbairt

Teaching / Designing / Developing
That could be it ... the few sites I have range from 10 to 400 URLs, so nothing that major.
 

paul

Ninja
Just about at the 5k mark of 88k pages now. It's interesting to see the traffic grow on a site this large.
 

RedCardinal

New Member
paul said:
Just about at the 5k mark of 88k pages now. It's interesting to see the traffic grow on a site this large.

The problem is that big sites need lots of PageRank to get well indexed. While it might tell you you've got 5k pages indexed, it doesn't tell you where, so you tend to find most pages in the supplemental index, where they don't have much chance of pulling in organic traffic. Interested to hear how long they take to index you, though.

Sure, if you have any problems you could always ask nicely to use the Rapid Inclusion System recently developed by a crack team of Irish devs...
 

paul

Ninja
RedCardinal said:
The problem is that big sites need lots of PageRank to get well indexed. While it might tell you you've got 5k pages indexed, it doesn't tell you where, so you tend to find most pages in the supplemental index, where they don't have much chance of pulling in organic traffic. Interested to hear how long they take to index you, though.
I'm currently running this as one of my ongoing experiments: to see how much, using sitemaps, a good hierarchy, breadcrumbs and all that lark, I can squeeze a few more pages into the index. It doesn't have any strong links; I think there are only one or two PR2s.
RedCardinal said:
Sure, if you have any problems you could always ask nicely to use the Rapid Inclusion System recently developed by a crack team of Irish devs...
For the next experiment I'll make sure I do that ;)
 