Your website may already be ranking for keywords you don't even know about, and with a single change to your site you can instantly boost the number of visitors you get from search engines.
So you have submitted your XML Sitemap file to your Google Webmaster Tools account. Now, log in to your account to see which search terms your website is appearing for AND which search terms people are actually clicking on to visit your site.
The trick is to find relevant search terms that appear often in search results but that customers are not clicking. By simply changing your title tag and meta description tag, you can dramatically increase the number of visitors who visit your website.
Watch this step-by-step video tutorial to learn exactly how to do this:
Make sure you have created and submitted your XML Sitemap file. If you think you’ve missed a step and feel a little lost, we have our step by step walkthrough SEO videos which explain everything you need to know about optimizing your website for the best possible search engine rankings.
We have just released a new update to Sitemap Automator, the easiest way to get your website listed in all major search engines.
This update will now show you any broken links on your website and where they can be found on your site. Just double click a broken link and Sitemap Automator will take you to the page that contains that link.
Broken links can negatively affect both your search engine rankings and your visitors' experience. Ensuring all your links work correctly is the starting point of a professional website.
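Conceptually, a broken-link checker like the one in Sitemap Automator does two things: collect every link on a page, then request each one and flag those that fail. Here is a minimal sketch in Python; all names are illustrative, and this is not Sitemap Automator's actual implementation.

```python
# Minimal sketch of a broken-link checker: collect every href on a
# page, request each URL, and flag the ones that error out.
# Names here are illustrative only.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.error import URLError, HTTPError

class LinkCollector(HTMLParser):
    """Gathers the href of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(html):
    """Return the links on a page that respond with an error (or not at all)."""
    collector = LinkCollector()
    collector.feed(html)
    broken = []
    for url in collector.links:
        try:
            urlopen(url, timeout=10)
        except (URLError, HTTPError, ValueError):
            broken.append(url)
    return broken
```

A real tool would also record which page each broken link was found on, which is what lets Sitemap Automator jump you straight to the offending page.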
One thing many webmasters don't realize is how vulnerable their website becomes when they start using social networking scripts such as forums and blogs to help build communities around their site. Although integrating these scripts is a very good idea, doing so can potentially give your visitors access to sensitive parts of your website that they would not normally have.
Since many blogs and forums are open source, malicious users have full access to the source code and can find vulnerabilities, or take advantage of how a system works, in order to gain access to protected pages on your site, spam your site, or use your site to funnel your visitors to theirs. In particular, keep an eye out for:
New versions of your installed blogs, forums and other software
Spammy or abused user-generated content
Abused forum pages or egregious amounts of comment spam
This is very important to keep track of because, if your site has been hacked, Google will either remove your website completely from its search index or warn users every time they click your site's link in the search results. This can dramatically damage your reputation and prevent users from visiting your site.
A Google Webmaster Tools account is free to set up. You should submit your XML Sitemap file in order to get your website listed in Google and help Google’s crawler access the important pages on your website.
A question that often comes up is why a website is not yet in search engines, or, if it has been added, why it isn't showing up for the keywords its owner is targeting. These questions can be answered by looking at the data provided by Google Webmaster Tools. By creating an XML Sitemap file and submitting it to Google Webmaster Tools, you can quickly and easily learn the answers to these questions.
Some of the questions covered by the article include:
Why hasn’t Google indexed my website yet?
Does my site appear relevant to Google for my targeted terms or niche?
Why am I not getting clicks from Google?
Has my website been hacked?
Does my website load fast compared to other websites?
Pay special attention to the last question: how fast your website loads. Many sources have already confirmed that site speed will become an important search engine ranking factor in early 2010, which is why Google provides a site speed analysis test in Google Webmaster Tools.
Of course, you can already be ahead of the game with WebCrusher, which will instantly speed up your website so it loads as fast as possible in all web browsers.
Your meta description tag is a short description used to describe each page on your website. Although most major search engines do not use it to determine where your website ranks in their search results, they do use it in their search results to give searchers an idea of what your page is all about.
A good meta description tag can entice searchers to visit your website, over some of the other listed results. Basically it’s used to quickly grab the attention of a searcher so that they click your link in the search results.
For years, if you left out a meta description tag, or your meta description tag did not properly describe your page, Google would attempt to find a more relevant snippet in your webpage content. Of course, this causes a problem: you no longer have control over what's displayed in Google's search results. Leaving it up to a computer (Google's indexing software) means you never know whether the snippet will entice people to click your link, or even whether it will make sense.
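To control this snippet yourself, give each page its own description in the document head. The wording and store name below are made-up examples:

```html
<head>
  <title>Handmade Leather Wallets | Example Store</title>
  <meta name="description"
        content="Browse our collection of handmade leather wallets,
                 crafted from full-grain leather and shipped free worldwide.">
</head>
```

Keep the description short, unique to the page, and written for the searcher rather than the search engine.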
Recently, Google has started doing the same thing with title tags. The title tag is what you see at the very top of your web browser when you visit a webpage, and it is probably the most important on-page optimization you can make on each of your webpages. If you leave your title tag out, or if it does not accurately describe your webpage, Google will attempt to change it to something more relevant. Of course, this causes the same issue as with the meta description tag.
A good title tag has the following characteristics:
It accurately describes what is on the corresponding webpage
It is between 60 and 120 characters long
It includes 2-3 keywords or keyword phrases that you want to rank well for
It’s different for every one of your pages
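A title tag that follows these guidelines might look like this (the store name and keywords are made up for illustration):

```html
<title>Handmade Leather Wallets &amp; Belts - Free Shipping | Example Leather Co.</title>
```

It describes the page, falls within the length guideline above, works in a couple of keyword phrases, and can be varied for every other page on the site.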
You can watch Matt Cutts, head of Google’s web spam team, talk about this change below.
Google recently updated Google Webmaster Tools with an all-new keyword tool. Just like before, you get a list of the most prevalent keywords on your site, except now you get additional information such as the significance of each keyword and the top 10 pages where it can be found.
Just click the Keywords link under “Your web site on the web” in your Google Webmaster Tools account. You will see a list of keywords just as before, except now a new significance column is also displayed. Click any of your keywords to see how many times they occur on your entire site and the top ten pages they occur on.
One great use of this feature is to help find out if your website has been hacked, and if so, where the affected pages are.
Usually when a spammer hacks your website they will load your site up with random keywords related to what they are trying to sell. Then they will link to the hacked pages from other pages on other websites they have hacked. With this new keyword tool you can not only discover whether or not your site has been hacked (loaded with spam keywords), but you can now see exactly which pages have been affected.
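At its core, this kind of detection is just a word count across your pages: tally the keywords on each page and flag any page containing terms you would never use. A toy version in Python, with an entirely made-up watch list and page names:

```python
# Toy illustration of spotting spam keywords: count word frequencies
# per page and flag pages containing terms you never use on your site.
# The watch list and page contents are made-up examples.
import re
from collections import Counter

SPAM_TERMS = {"casino", "viagra", "payday"}  # hypothetical watch list

def keyword_counts(page_text):
    """Count how many times each word occurs in a page's text."""
    words = re.findall(r"[a-z]+", page_text.lower())
    return Counter(words)

def flag_spam_pages(pages):
    """Return the names of pages whose text contains any watched term.

    `pages` maps page name -> plain text content.
    """
    flagged = []
    for name, text in pages.items():
        counts = keyword_counts(text)
        if any(counts[term] for term in SPAM_TERMS):
            flagged.append(name)
    return flagged
```

The Keywords Tool gives you the same signal without any code: an unfamiliar keyword with high significance, occurring on pages you don't recognize, is a strong hint that those pages were injected.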
The method above is commonly used on sites with community-driven resources such as forums, blogs, or other social networking scripts. Always use this new Keywords Tool to make sure it has not happened to your site. If it has, fix it ASAP, because it may get you banned from Google and other search engines.
Now, with Sitemap Automator 2.2, you can create XML Sitemaps for your very large websites. Sitemap Automator can now create XML Sitemap Index files, meaning you can submit more than 50,000 links to search engines.
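A Sitemap Index is simply a small XML file that points to several ordinary sitemap files, each of which may hold up to 50,000 URLs. Per the sitemaps.org protocol it looks like this (the domain and filenames are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
    <lastmod>2010-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

Search engines fetch the index first, then crawl each listed sitemap in turn.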
This free update also includes improved support for MobileMe users with the ability to publish your XML Sitemaps to your root directory. By publishing to your root directory you will be able to see all the stats in your Google Webmaster Tools account. Also, by publishing your XML Sitemap files to your root directory on your MobileMe account, they will no longer get overwritten when making changes to your iWeb site and republishing it.
Sitemap Automator helps users get their website listed in all major search engines by creating specially formatted sitemap files, called XML Sitemaps, and submitting them to Google, Yahoo, Bing and Ask.com.
Last Thursday, Google changed the way verification works for your website. Instead of just publishing a blank file to your web server, you now need to download a special XML file that contains important information about your account.
In version 2.1 of Sitemap Automator, you only have the option of pasting a file name in the ‘Publish Verification File’ command under the File menu. We posted a new beta version a few days ago which will now let you publish the actual verification file that Google provides.
The new beta of Sitemap Automator also includes enhanced support for MobileMe including publishing to the root directory so you no longer need to re-publish your XML Sitemap file each time you publish your iWeb site.
The beta version of Sitemap Automator can be downloaded from Trybeta. Just create a free beta testers account on the website.
If you find problems with this version, please let us know as soon as possible.
Whether you are using WebDesign, iWeb, Dreamweaver, RapidWeaver, Sandvox, or any other web development tool of your choice, follow the quick and easy step by step video tutorial to learn how to get your website listed in all major search engines with just a few minutes of work.
Sitemap Automator is the easiest way to create, publish and notify search engines of your industry standard XML Sitemap. XML Sitemaps are used to let search engines know about all pages on your website, how important each page is, and how often each page is updated.
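For reference, an XML Sitemap entry records each page's address along with optional hints about how often it changes and how important it is. A minimal example following the sitemaps.org protocol (the URL and values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Tools like Sitemap Automator generate one `<url>` entry per page and handle the formatting and submission for you.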