Archive for the 'RAGE Sitemap Automator' Category

New Step by Step Video Tutorial to Get Your Website in Search Engines

Tuesday, September 1st, 2009

We have published an updated step by step video tutorial for getting your website listed in all major search engines with Sitemap Automator. The tutorial was created courtesy of AppTorial Video Tutorials.

Whether you are using WebDesign, iWeb, Dreamweaver, RapidWeaver, Sandvox, or any other web development tool of your choice, follow the quick and easy step by step video tutorial to learn how to get your website listed in all major search engines with just a few minutes of work.

Sitemap Automator is the easiest way to create, publish and notify search engines of your industry standard XML Sitemap. XML Sitemaps are used to let search engines know about all pages on your website, how important each page is, and how often each page is updated.
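Sitemap Automator generates this file for you, but as a sketch, a minimal sitemap.xml looks something like the following (the domain, dates, and values here are illustrative only):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page on your website -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2009-08-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2009-06-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The `<priority>` and `<changefreq>` values are hints to search engines about each page's relative importance and how often it changes, matching the description above.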

XML Sitemap Automator adds Support for Bing, Secure HTTPS Websites

Monday, August 17th, 2009

Sitemap Automator, the most popular XML Sitemap generator for Mac OS X, has been updated to version 2.1.

This free update for registered users includes complete support for Microsoft’s new Bing search engine, support for HTTPS secure websites, and fixes various issues with scanning some relative URLs.

Download the new Sitemap Automator update today.

Google Confirms XML Sitemaps Helps Find New & Changed Content Faster

Monday, June 15th, 2009

Google has confirmed that using XML Sitemaps will:

help search engines find new and changed content faster.

This comes from their Webmaster Tools Blog and a study conducted on the effectiveness of XML Sitemaps.

Two important points can be taken directly from this study:

We conduct this test over a dataset consisting of over five billion URLs that were seen by both systems. According to the most recent statistics at the time of the writing, 78% of these URLs were seen by Sitemaps first, compared to 22% that were seen through Discovery first.

And more importantly;

The main observation from this study is that for an archival domain, Discovery is 63% “efficient” and Sitemaps is 99% efficient in crawling the domain.

Since creating an XML Sitemap is so quick and easy, there is no reason not to have one.

Google Webmaster Tools Gets New Features & Updates

Thursday, June 11th, 2009

Google Webmaster Tools gets an all new interface and new features.

Some of the highlights of the new interface and new features include:

  1. An all new dashboard that shows your most important data including top search queries, top incoming links, and any errors found on your website which may affect how Google and other search engines crawl your website.
  2. Up to 100 top search queries for your website and up to 100 top clicked search queries.
  3. The ability to track XML Sitemaps submitted by other users of your site.
  4. You can now subscribe to messages sent to your Webmaster Tools account so you get them right in your email. You no longer have to log in to your account to check your messages.

All in all, this new update offers some much needed new features. We especially like the addition of showing up to 100 search queries. If you haven’t already done so, create your XML Sitemap today and get access to all of this important information about your website.

Is your website load time preventing Google from indexing your site?

Thursday, February 12th, 2009

Nobody likes visiting a slow loading website. Websites that take a long time to load (more than 5 seconds) are losing out on potential visitors. Studies have shown that visitors will wait no more than 8 seconds on average for a page to load. But did you know that search engines won’t wait for pages to load either?

Google in particular does not like slow loading websites. In fact, if you are using AdWords to promote your site, the speed of your website’s load time is used as a Quality Factor to help determine your cost-per-click and your ad position.

Even more important, however, is that your website’s load time will affect how many of your pages Google will index. Take a look at the following graphs taken from Google Webmaster Tools. The bottom graph shows the time spent downloading a page, the middle graph shows the number of kilobytes downloaded per day, and the top graph shows the number of pages crawled per day.

How slow loading websites affect Google bot

You can easily see that as the time spent downloading pages went up, Google indexed FEWER pages. There is a clear inverse relationship between download time and number of pages indexed.

How do you check your own site?

You can check this for your own site by creating an XML Sitemap file and checking the Crawl Stats in your Google Webmaster Tools account.

How do you fix this?

Fortunately there is a very easy way to fix this: reduce the size of your website, which you can easily do with the WebCrusher website optimizer. WebCrusher will instantly optimize your site by removing unneeded website code and compressing image files. Then it can automatically publish your site directly to your web server, whether you are using an actual web host or MobileMe/iDisk to host your site.

In just 5 minutes you can have a faster website that loads as quickly as possible for your visitors and ensures Google and other search engines can properly scan and index all of your webpage files.

You also want to make sure that you are using a fast web hosting service to host your site. If you are a MobileMe/iDisk user it may be time to change to a fast and reliable web hosting service.

RAGE Software Updates Search Engine Optimization Software Suite for Mac OS X at Macworld Expo 2009

Monday, January 5th, 2009

We are pleased to announce major updates to our search engine optimization software suite for Mac OS X. The updates include new versions of a number of our products including Sitemap Automator, WebCrusher and SERank.

Sitemap Automator
The only way to get your website recognized by all major search engines, Sitemap Automator 2.0 now lets you easily manage an unlimited number of websites. In addition, it can easily show you potential search engine crawling errors on your website, supports robots.txt files, and offers improved support for iWeb built websites.

This 2.0 update is the first paid upgrade since the initial release back in 2005. If you purchased before May 1, 2008 you can purchase an update for just $19.95 USD. If you purchased after May 1, 2008 then you can get your free upgrade serial number from http://www.ragesw.com/getupdate.php

WebCrusher
Optimize your website so it loads as fast as possible for your visitors. At the click of a button, WebCrusher can take any website and intelligently remove unneeded code and compress images, making the website load as quickly as possible in any web browser. This free update to version 1.4 includes enhanced CSS optimizations and improved iWeb optimizations.

SERank
See exactly where your website ranks in over 60 search engines for your most popular keywords. Discover who ranks higher than you and how you can rise above them to get the number one spot on all major search engines. Version 1.9 adds enhanced reporting features, new exporting options compatible with Microsoft Excel and Apple Numbers, and important bug fixes.

If you will be attending Macworld Expo 2009, please stop by our booth at #1744 and we would be happy to talk with you about our software and these updates.

Video Tutorial: Submit XML Sitemaps to Google for iWeb and MobileMe

Monday, January 5th, 2009

We have gotten a lot of questions regarding how to properly create an XML sitemap using Sitemap Automator and get your iWeb based website into all major search engines. The process usually takes just a few minutes, but can be confusing. This is why we have put together a short video tutorial on how to easily create an XML Sitemap and get your iWeb based website in search engines today.

This step by step video tutorial is still useful even if you do not use iWeb.

We also provide detailed, written instructions for getting your iWeb based website into all major search engines if you require more help.

RAGE Sitemap Automator Released With Full iWeb Support

Tuesday, April 29th, 2008

Yesterday we released an update to RAGE Sitemap Automator, the essential tool for getting your website listed in search engines.

The latest update adds full support for iWeb based websites. It can now properly scan and find all links on an iWeb created website as well as automatically publish your XML Sitemap file on your iDisk in the correct location. It also adds some other fixes and enhancements.

If you have an iWeb web site, make sure you have read our SEO for iWeb blog post so you can learn how to rank your iWeb site high in search engines.

Note: We released version 1.9.6 today, which fixes a small bug that produced incompatible sitemap files. So if you downloaded version 1.9.5, please download the free 1.9.6 update.

SEO For iWeb: How to get your iWeb Websites into Google & Other Major Search Engines

Saturday, April 12th, 2008

Update: The SEO for iWeb Walkthrough Video Tutorial has been released. It walks you through the entire process of optimizing your iWeb website for search engines, explaining everything step by step.

We always get questions from iWeb users asking how they can improve their website rankings. We also get comments saying that RAGE Sitemap Automator doesn’t find all web pages on an iWeb site. Well, there are a few reasons for this that we will discuss here.

UPDATE: As of version 2.0, Sitemap Automator properly scans iWeb based websites, making it one of the only tools able to do this properly. However, the following tips are still essential for success with search engines. See our step by step video guide on creating an XML Sitemap file.

iWeb websites are not made to be search engine friendly. In fact, almost all iWeb based websites that we see are really hurting their search engine rankings. If you follow these few simple instructions, you will see some significant improvements.

1) iWeb Page Titles

As of iWeb 08 (and now iWeb 09), most built in templates have a large header caption at the top of the page. Your website’s title tag will actually reflect what you enter here. Many users simply keep this as the default caption, not utilizing the most important on-page optimization you can use for search engines.

The trick is to give your page a title that includes both the keywords you want to rank for in search engines and an accurate description of your website content. Your web page title appears at the very top of your web browser and in a search engine’s results page. Search engines use your title tag to get an idea of what they will find on your website.

Update: With iWeb SEO Tool you can now edit your web page titles, meta tags and alternative image text after you publish your site. You no longer have to worry about how iWeb gets your title tag.

For templates without these header captions, or if you remove the caption, iWeb will use your page file name as its title. Below is a screen shot of your iWeb Inspector window which lets you edit the file name of your selected web page. Give it a good title using the advice we provided above.
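For illustration, the advice above comes down to what ends up in the published page's title tag. The business and keywords below are made up for the example:

```html
<head>
  <!-- A vague default title tells search engines very little: -->
  <!-- <title>Welcome</title> -->

  <!-- A descriptive, keyword-rich title works much better: -->
  <title>Handmade Leather Wallets and Bags | Example Leather Goods</title>
</head>
```

The second title both contains the keywords you want to rank for and accurately describes the page content, which is exactly what search engines look for.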

Add Custom Title Tags To Your iWeb Website

2) Navigation bars

One of the biggest problems with iWeb is the way it creates your navigation bars. Instead of using standard HTML which search engines can use to correctly find all files on your website, it uses Javascript which makes it extremely hard for search engines to scan and index your website properly.

Fortunately there is a way you can work around this problem. Select your main page (or the first page that contains your navigation bar) and open the Inspector window. Click the Page Inspector tab (second tab) and deselect the option ‘Display Navigation Bar’ as shown in the following screen shot:

Disable the Default Navigation Bar in iWeb

Now you’re going to create your own navigation bar with proper links to each of your pages:

  1. Create a new Text Box field and place it at the top of your page, moving all your other content down. To quickly move all content down, go to Edit – Select All, then hold the shift key as you drag everything down, which helps ensure you don’t accidentally move the content off center.
  2. In the new Text Box at the top of your page, add captions for each of your pages, separated by tabs or spaces, so that they look like a proper navigation bar.
  3. Select each caption and go to the ‘Link Inspector’ tab in the Inspector window. Select ‘Enable as a hyperlink’ and choose ‘One of My Pages’ from the ‘Link To’ drop down menu.
  4. Lastly, select the page you want to link to from the ‘Page’ drop down menu.
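The text links you create this way publish as plain HTML anchors, which every search engine can follow. In the published page, a crawler-friendly navigation bar boils down to something like this (the page names are just examples):

```html
<!-- Plain HTML links: search engines can follow these directly,
     unlike the Javascript navigation iWeb generates by default -->
<div class="navbar">
  <a href="index.html">Home</a>
  <a href="about.html">About</a>
  <a href="products.html">Products</a>
  <a href="contact.html">Contact</a>
</div>
```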

Proper navigation bars in iWeb

Although you should do this on each of your pages for best results, doing it on your main page alone is still extremely helpful for search engines.

3) The Right Content

One of the biggest issues I see with iWeb created websites is users choosing non-standard web fonts for their website. Just to provide some background information, there are a number of fonts that are considered safe for use on the web. These are fonts that are guaranteed to be installed on a user’s computer no matter what operating system or web browser they use. If a font is not installed on a user’s computer and you use it on your website, it will not display properly for them. iWeb works around this issue by turning your text into pictures if you use a non-standard font. This is why your webpage always looks the same no matter where you view it. Unfortunately, search engines cannot ‘read’ text that is turned into pictures, and this will severely impact your potential search engine rankings.

You must stick to the standard web fonts, which are listed below for you. This ensures that your website has the best possible chance of ranking high for the keywords you are targeting.

Web safe fonts include:

  • Arial
  • Courier New
  • Georgia
  • Times New Roman
  • Verdana
  • Trebuchet MS
  • Helvetica

*Note: Some of the above fonts may not always be installed on a person’s computer, but they will be replaced with a very similar looking alternative if they cannot be found. That is why they have all been included in the above list.
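When you stick to a web safe font, the published HTML keeps your words as real, searchable text with fallback fonts, rather than baking them into an image. A sketch of what that looks like (the wording is just an example):

```html
<!-- Text styled with web safe fonts stays real, searchable text -->
<p style="font-family: Verdana, Arial, Helvetica, sans-serif;">
  Welcome to our store. Search engines can read this sentence.
</p>
<!-- With a non-standard font, iWeb would export this paragraph as a
     picture instead, which search engines cannot read. -->
```

Listing several fonts separated by commas tells the browser to try each in order, which is why a missing font gets replaced by a similar looking alternative.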

You want to make sure that your website content contains the keywords you want to rank high for in search engines. It’s not good enough to simply “be in search engines”; you want to appear when a potential customer types in one of your keywords. Search engines will not know what your webpage is about unless you include the proper keywords in your web page content.

4) iWeb Landing Pages

Lastly, something I see very often is users making a so called “landing page” their home page. This is the type of page that may simply show your company logo with a “Click here to enter” link. Basically, anything that requires a user to take one more step in order to see your website is never a good thing.

This applies to search engines as well. Your home page is considered your most important page by default so make sure you are taking full advantage of it. Link to other important pages directly from your home page and make sure it includes keyword rich content.

In the next post I will go over some iWeb misconceptions as well as some search engine misconceptions that can typically affect iWeb users.

Remember, search engines will not simply choose your site out of the billions out there and rank it at the top of their search index unless you give them good reason to. Getting high in search engines for the keywords your customers are searching for can be extremely profitable and will take some time to achieve. Don’t expect immediate results, and keep learning about the strategies you can employ to get high rankings.

Download our Free Mac SEO Guide to learn how you can get higher rankings with your iWeb websites.

Quickly Submit Your Web Site To MSN | MSN Webmaster Tools Go Live

Tuesday, November 20th, 2007

A few days ago MSN’s Webmaster Tools officially came out of beta. Like Google’s Webmaster Tools, MSN’s tools let you quickly submit the sitemap.xml file that you have created with RAGE Google Sitemap Automator. Once submitted, you are provided with search engine feedback including:

  • How many of your web pages are indexed in MSN’s search
  • The date MSN’s search spiders last crawled your web site
  • Which, if any, of your web pages could not be properly indexed
  • Your web site’s most popular incoming links

Although you won’t find as many resources and tools that are provided by Google’s Webmaster Tools, you will get some interesting information about how your web site is performing in MSN’s search.

If you already have your sitemap.xml file published on your web site, all you have to do is create your free Windows Live account and follow the simple steps to submit your sitemap.xml file. The next update to RAGE Google Sitemap Automator will feature an all new way to ensure that your sitemap.xml files are recognized by all major search engines including MSN.