Some Ways You Can Control Your Real Estate Website

Duane Forrester over at Bing wrote another great article entitled "9 Things You Need to Control," all about optimizing your real estate website.  These suggestions are a must for anyone with a website.

By understanding these items you can improve your site’s visibility to the search engines and hopefully boost your position in the SERPs (Search Engine Results Pages).

Social sharing integration

This almost goes without saying, but I still see so many websites that aren’t engaging socially with their visitors.  Social integration for the REALTOR® is a MUST.  Real estate is so personal, and a big part of your success is all about the conversation.  Since social media is all about connecting and conversation, the two are natural partners.  At the very least, get buttons embedded into your pages so your visitors can share content with their friends and followers.  AddThis is a great social plugin you can use, and it is simple to install.
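
If your site lets you paste raw HTML, embedding AddThis is usually just a copy-and-paste job.  The snippet below is only a rough sketch of what that embed looks like; the real code, including your publisher ID, comes from your AddThis account, so the ID and the exact button classes here are placeholders.

    <!-- AddThis sharing toolbox (publisher ID and button classes are placeholders) -->
    <div class="addthis_toolbox addthis_default_style">
      <a class="addthis_button_facebook"></a>
      <a class="addthis_button_twitter"></a>
      <a class="addthis_button_email"></a>
      <a class="addthis_button_compact"></a>
    </div>
    <script type="text/javascript"
            src="http://s7.addthis.com/js/250/addthis_widget.js#pubid=YOUR-PUBLISHER-ID"></script>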

Title, Description, Alt Tags, etc.

This is perhaps the easiest item on the list, yet the most often forgotten.  A good title, a good meta description, and good alt tags are vital to your website’s SERP success.

A quick note on the meta description.  While writing a unique description for each page won’t necessarily vault you to the top of the search engines, it can make the difference when a searcher scans the results and likes your description better than the others.  It is better to have your own words appear in the search results than random text the search engines pull from the page because your meta description was low quality or empty.
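
For reference, here is where all three of these live in a page’s HTML.  The domain, wording, and file names below are just placeholders; the point is the structure.

    <head>
      <!-- The title becomes the clickable headline in search results -->
      <title>Homes for Sale in Springfield | Jane Smith, REALTOR®</title>
      <!-- The meta description often becomes the snippet under that headline -->
      <meta name="description"
            content="Search Springfield homes for sale and get local advice from a Springfield REALTOR®.">
    </head>

    <!-- Alt text describes an image to the search engines (and to screen readers) -->
    <img src="123-main-st.jpg" alt="Front exterior of the home at 123 Main Street, Springfield">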

Content

This might seem obvious, but it is a huge issue for many agents: you just built a new website and then never change any of the information.  I talked about how to build good content in a previous post, but I still see websites with little effort put into building good, original, regularly updated content.  Take the time to write keyword-rich content and become the expert in your area.  Your reward will be a rise in the search engine rankings.

Sitemap.xml

The sitemap.xml is a file that search spiders use to find all the pages on your website.  It is especially important when your site has a lot of pages or when they are hard to crawl.  If you do not have a sitemap, talk to your web designer about getting one.
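
For the curious, a sitemap is just an XML file that lists your URLs in the standard sitemaps.org format.  A minimal example, with placeholder addresses and dates, looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-06-01</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.example.com/listings/123-main-st</loc>
        <lastmod>2012-06-15</lastmod>
      </url>
    </urlset>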

Verification access

This is pretty straightforward.  You need to be able to place a verification code on your site to use Bing’s or Google’s webmaster tools.  The verification process, among other things, authorizes Google and Bing to read your sitemap.  Verification can be done by embedding a tag in the code of your web page or by adding a record to the DNS for the website.  No matter which option you use, you need access to make it happen.  If you do not, then you cannot take advantage of the many things Google and Bing have to offer your website.
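
When you sign up for webmaster tools, each engine hands you a small snippet to paste into the head of your home page (or a record to add to your DNS).  The meta tags look roughly like this; the content values below are made up:

    <!-- Google Webmaster Tools verification (placeholder value) -->
    <meta name="google-site-verification" content="AbC123placeholder" />
    <!-- Bing Webmaster Tools verification (placeholder value) -->
    <meta name="msvalidate.01" content="DeF456placeholder" />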

UX improvements

If you don’t have a website your visitors love, you’re missing an opportunity.  Get cracking on a user experience review and see where you’re bleeding users.  By staying in tune with what users like and dislike about your website, you can make the changes needed to field a UX-winning site.  And if you keep your visitors happy, they may share your site with friends more often, netting you more links.  Visitors are also more likely to come back in the future if they like the site and find it easy to get what they’re after.

It might seem like a small thing to focus on, but the user experience directly influences the happiness of your visitors.  If you don’t have input on UX improvements, you need to push for it.  This is an important aspect of optimizing any website.

Rel=canonical

The rel=canonical tag is a useful tool when you have multiple URLs that all point to the same content.  With it you can tell the search engines which URL is the most important to you, consolidating the duplicates into one location and boosting the rank of the page you deem most important.

The hard part is finding all the instances and getting the code onto the pages.  This is usually a web designer’s job, but once it is done it will give the search engines a much clearer picture when they crawl your website.
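
The tag itself is a single line in the head of each duplicate page, pointing at the version you want the engines to credit.  The URLs below are placeholders:

    <!-- Placed on http://www.example.com/listings?sort=price and other variants -->
    <link rel="canonical" href="http://www.example.com/listings" />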

Robots.txt

I bet many people are not even sure what the robots.txt file is, or that they even have one, but having one set up properly is vital.

Search spiders use this file to understand how to interact with your website.  The robots.txt file tells the search engines which parts of the site they may and may not crawl, which pages or directories to skip, and, for engines that support it, how quickly to crawl.  This is something your web designer usually sets up, but it is a good idea to make sure you have one and that it is set up correctly.
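
The file lives at the root of your site, e.g. http://www.example.com/robots.txt.  A simple example, with placeholder paths, might look like this:

    # Apply to every crawler
    User-agent: *
    # Keep crawlers out of the admin area
    Disallow: /admin/
    # Ask crawlers that honor it (Bing does) to pace their requests
    Crawl-delay: 10
    # Point crawlers at the sitemap
    Sitemap: http://www.example.com/sitemap.xml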

The items above are in no particular order, and they aren’t meant to cover every single thing you need to control, but they should at least get you started down the right path.  Once you get these items under control, you can then think more about your website in terms of its SEO (Search Engine Optimization) value.
