rusmuscat's Content

There have been 4 items by rusmuscat (Search limited from 13-November 18)



#2206 Mapping A Database-driven Website

Posted by rusmuscat on 08 April 2008 - 05:44 PM in Sitemap Automator

Thank you. I just saw this message. For some reason I didn't get an email notification when your reply was posted.

Actually, it's creating a LOT more pages than we have products. As I indicated earlier, we have about 1100 actual products, but it generated over 7200 page URLs. I will play around with the filters and let you know if there are any other questions.

Vlad

Is it not making a page for each of your products? The way your website is set up, you have way more pages than you suspect (as you can see from your sitemap).
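A minimal sketch of why this happens (the .lasso URL patterns below are invented for illustration, not taken from your actual site): on a database-driven site, the same product record is usually reachable through several distinct query-string URLs, and a link crawler counts each one as a separate page.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical crawled URLs: distinct query strings that all render
# the same product record.
crawled = [
    "http://example.com/detail.lasso?id=123",
    "http://example.com/detail.lasso?id=123&sort=title",
    "http://example.com/search.lasso?cat=cd&id=123",
    "http://example.com/detail.lasso?id=456",
]

def canonical_key(url):
    """Reduce a URL to the product id it displays, ignoring
    presentation parameters such as sort order."""
    query = parse_qs(urlparse(url).query)
    return query.get("id", [url])[0]

unique = {canonical_key(u) for u in crawled}
print(f"{len(crawled)} crawled URLs -> {len(unique)} products")
# 4 crawled URLs -> 2 products
```

Filtering out the redundant query-string variants is what collapses the URL count back toward the real product total, which is where filters come in.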

Filters are pretty simple to create. Click the Add filters button. In the window that appears, click the plus button. Now select your action (in this case Do not Add), and then set your conditions, similar to how you would set a condition for a smart playlist in iTunes or a smart folder in the Finder.
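Conceptually, each filter is just an action plus a list of conditions that must all match. A rough sketch of that in Python terms (the condition shown is a hypothetical example, not a built-in):

```python
# One filter modeled as an action plus conditions that must all
# match, much like a smart-playlist rule set. The condition below
# ("URL contains search.lasso") is a hypothetical example.
def make_filter(action, conditions):
    def matches(url):
        return all(cond(url) for cond in conditions)
    return action, matches

action, matches = make_filter("Do not Add",
                              [lambda u: "search.lasso" in u])

url = "http://example.com/search.lasso?cat=cd&sort=title"
if action == "Do not Add" and matches(url):
    print("leaving", url, "out of the sitemap")
```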

That is the only manual we provide. If you need something clarified please let me know.




#2191 Mapping A Database-driven Website

Posted by rusmuscat on 03 April 2008 - 06:42 PM in Sitemap Automator

Thank you, Paul,

When R_SA was scanning the website, I was watching the databases and their behavior, and they were going wild with searches and finds. So I kinda assumed that the program was looking into the databases. I guess I'm not clear on the process by which all the various queries are generated by the program -- perhaps it has to do with the Lasso code it's seeing on the various pages.

As to what I'm trying to accomplish, perhaps it would be easier to explain if you would visit our website, www.musicarussica.com. We have 3 main categories of products -- sheet music (about 800 titles), books and "Monuments" collections (about two dozen titles) and CDs (about 300 currently active titles). Thus, a thorough and useful sitemap would contain a page for each of the products that we sell -- altogether about 1100 pages, plus the main navigation pages that correspond to the product and service categories we offer.

This is what I'm hoping R_SA will enable us to do. The question is how?
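For concreteness, the end result I'm after is a standard sitemaps.org XML file with roughly 1100 <url> entries, one per product page. A minimal sketch of that output (the two product URLs below are placeholders, not our real paths):

```python
# Placeholder product URLs; in practice the ~1100 URLs would come
# from the Sitemap Automator scan after filtering.
product_urls = [
    "http://www.musicarussica.com/sheetmusic.lasso?id=1",
    "http://www.musicarussica.com/cd.lasso?id=2",
]

entries = "\n".join(
    f"  <url>\n    <loc>{u}</loc>\n  </url>" for u in product_urls
)
print('<?xml version="1.0" encoding="UTF-8"?>\n'
      '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
      f"{entries}\n"
      "</urlset>")
```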

BTW, *is* there another manual somewhere, or is the 19-page document I have -- it?

Thanks.

Vlad

It doesn't touch your database. It just scans your links. So however you access your database to build your site, that is what RAGE Sitemap Automator will find.
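In other words, it behaves like an ordinary breadth-first link crawler over HTTP; it never queries FileMaker or Lasso directly. A minimal sketch of that kind of crawl (this is the general technique, not our actual implementation):

```python
from collections import deque

def crawl(start_url, get_links, max_pages=10_000):
    """Breadth-first link crawl: fetch a page, collect its links,
    repeat. The crawler only ever sees URLs and HTML, never the
    database behind them."""
    seen = {start_url}
    queue = deque([start_url])
    pages = []
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        pages.append(url)
        for link in get_links(url):  # get_links would fetch and parse HTML
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return pages

# Toy site: the crawl stops once every reachable URL has been seen.
site = {"/": ["/a", "/b"], "/a": ["/b"], "/b": []}
print(crawl("/", lambda u: site[u]))  # ['/', '/a', '/b']
```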

Since it has no access to your database, you can't tell it not to look at certain fields. Though if you explain more about what the issue is, I can help you create filters so that certain pages are not added or scanned for additional links.

What more would you like to know about filters? What are you trying to accomplish?




#2189 Mapping A Database-driven Website

Posted by rusmuscat on 02 April 2008 - 07:27 PM in Sitemap Automator

Well, the good news is that it finished! Took about 16 hours, and found over 7100 pages. Very impressive! I wish that, for the purposes of filtering, I understood a little more about how Sitemap Automator interacts with the databases. Can you tell it not to look at certain fields?

When you say "the manual", are you referring to the 19-page "Welcome to RAGE Google Sitemap Automator.pdf" document, or something else? The "Welcome" doc is not very detailed on the filtering process.

Thanks.

Vlad

Hi Vlad,

You should probably stop it manually. It will keep searching as it finds new pages, but obviously pages linked so deep within your site are probably not very important anyway.

You can use Filters to not add or scan certain pages. You could use them to say "Don't add pages in my products folder" or "Don't add pages that contain xyz". They are actually pretty powerful.
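A sketch of how the two kinds of rule behave (the predicate names below are hypothetical): a "not add" filter keeps a page out of the sitemap, while a "not scan" filter also stops the crawler from following its links.

```python
# Two hypothetical filter predicates, mirroring the examples above.
def should_add(url):
    # "Don't add pages in my products folder"
    return "/products/" not in url

def should_scan(url):
    # "Don't scan pages that contain xyz"
    return "xyz" not in url

for url in ["/products/cd1.lasso", "/catalog.lasso", "/page-xyz.lasso"]:
    print(url,
          "| added" if should_add(url) else "| not added",
          "| scanned" if should_scan(url) else "| not scanned")
```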

The manual has full details on them.




#2186 Mapping A Database-driven Website

Posted by rusmuscat on 02 April 2008 - 02:58 PM in Sitemap Automator

Hello, I'm a new user of Sitemap Automator. I'm attempting to create a sitemap for a database-driven website (FileMaker Pro via Lasso). We have a fairly complex site that dynamically generates pages for approximately 700 titles of sheet music and about 300 CDs that we sell, based on various searches. There are potentially quite a few possible links, and when I added '.lasso' to the types of pages to index, Sitemap Automator started doing an impressive job of looking inside the databases at all the links: so far, it tells me it has visited approximately 165,000 links and added about 6700 web pages.

Now, obviously, not all those pages are important for a sitemap. I have several questions.
1. Sitemap Automator has been going for about 8-10 hours. Will it eventually stop? Or is this a process like "approaching infinity," where it finds fewer and fewer links but never fully stops trying?
2. Is there a way I can limit the process and tell it to focus only on those pages that are important for a sitemap? Manually examining the 6700+ URLs it has listed thus far is not a particularly easy task.

Thanks for any insights.

Vlad