Saturday, January 07, 2006

Strategies for Search Engine Success in 2006

As you know, every year is rocked by a plethora of changes in the search engine marketing world. The acquisition of smaller companies by the Big 3 changes the marketing landscape as we know it every month, and with every update to the index, we hold our breath and hope that we come out better (or at least the same) in the end. So when it comes to the new year, there are many things we should look out for to stay on top of the rankings.

1. Quality Content: I say this often and I cannot overemphasize it enough: content is KING! Search engine spiders crawl the net to find what? Content! Your site has information (hopefully) that you want the spiders to see and include in their index. By creating and publishing quality content, you give the search engines more reason to return. You are feeding them what they want. In 2006, you should be finding creative ways to get your content noticed and viewed, as well as creative ways to publish fresh content on a regular basis. Two very good ways to do this are message boards (hosted on your site) and blogs (which enable you to publish more frequently).

2. Don't Overextend Your Link Exchange Structure: Backlinks have long been a popular way to increase your rankings quickly in the search engines. The tradition holds: find a PR7 website, trade backlinks, and you'll be indexed in Google within 24 hours. That strategy still works and is beneficial for new websites.

But in my opinion, the days of tremendous link-swapping are coming to an end. Many websites have been founded solely to let you exchange links with other websites. This has caused a massive influx of webmasters who want to exchange a ton of links in the hope that it will help them in the search engines.

But what really matters when it comes to links is the number of quality one-way backlinks that direct users to your website. You want the balance of links to be in your favor; that is what leads to success.

Also, there has been talk of search engines taking notice of these "link farms" and penalizing those who take part in them. So if you do take part in link exchanges, please be moderate with respect to the number of exchanges you join.

3. RSS and XML: Two technologies began to take center stage in 2005, the older of which is a markup language that has been around for several years: XML. XML is short for Extensible Markup Language and, like HTML, descends from SGML. The main difference is your ability to create descriptive tags for your own data.

This has led to the advent of RSS, or Really Simple Syndication. RSS is a way for you to publish your data to an XML file hosted on your site. Users subscribe to your RSS feed via the XML file, and whenever you make a change to it they are notified. It has become a major technology used by news agencies and bloggers alike as a simple method of publishing information across a wide variety of platforms.
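As an illustration, a minimal RSS 2.0 feed file might look something like this (all titles, URLs, and dates here are placeholders, not a real feed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <!-- The channel describes your site as a whole -->
    <title>Example Site News</title>
    <link>http://www.example.com/</link>
    <description>Fresh content from an example site</description>
    <!-- Each item is one new article or post -->
    <item>
      <title>New Article Posted</title>
      <link>http://www.example.com/articles/new-article.html</link>
      <description>A short summary of the new article.</description>
      <pubDate>Sat, 07 Jan 2006 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Add a new `<item>` each time you publish, and subscribers' feed readers pick up the change automatically.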

XML has also proved useful with the Google Sitemaps program, newly released in 2005. The optional tags available in the XML sitemap allow you to describe the individual pages of your site, including the dates they were last modified. There are some small things you need to pay attention to when creating one: namely, you have to follow the Google XML schema, and you have to be diligent about tracking and fixing errors in the file. But used correctly, it is a great way to help Google index pages of your website that would otherwise be hidden behind JavaScript or Flash.
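A bare-bones sitemap file, as a sketch (the URL and date are placeholders; check Google's own schema documentation for the current namespace and the full list of optional tags):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <!-- loc is the only required tag per URL -->
    <loc>http://www.example.com/</loc>
    <!-- Optional tags that describe the page for the crawler -->
    <lastmod>2006-01-07</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- Repeat a <url> block for each page on your site -->
</urlset>
```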

4. Stay Away from Flash and JavaScript for the Time Being: Flash and JavaScript are very powerful tools for creating dynamic, eye-catching websites. The most prominent problem with the two technologies is that the spiders can't index through them (at least not yet). This limits the search engines' ability to index portions of your site. Many have speculated that the Big 3 are working on solving this problem, but for the time being, avoid or limit your use of these technologies.

5. Avoid Unethical SEO: There are a lot of programs out there that claim to help you achieve maximum linkback ratios in a very short amount of time. Some of them are good; some are bad. In fact, some of them will waste your effort posting trivial comments on blogs or trying to maximize your link exchanges. In my opinion, you should seek success in SEM the right, ethical way. Seek out honest web companies to exchange a moderate number of links with. Post only relevant comments to forums and blogs, because that behavior leads to lasting linkbacks. Also, don't try to manipulate your website to make it appear to have a higher PR than it really does. Google sees that one!

6. Last, but Not Least, Articles: There is some controversy over whether it is right to post articles for free use in directories. In my opinion, you are providing a much-needed service to webmasters, and I don't see this one as a potential loss for 2006. Information is valuable, and websites that need content (especially fresh content) depend on what you do to make their efforts a success. So it is natural for your website's rankings to benefit through backlinks from those articles. It's a win-win situation.

One other thought on this subject. Right now, the search engines can punish websites for having duplicate content, and that is an argument many will propose. But the search engines will usually only punish you if the HTML format of an entire site is duplicated, not a couple of articles. So posting articles is safe for now.

But be cautious. Many lucrative methods of ethical SEO can be turned into a problem when too many people attempt to abuse the technology.

So that's it. Short, but informative. SEO is both an art and a technology that we have to use correctly for the right kind of success. Who knows what the year ahead may bring, but by playing your cards right, you can achieve success and avoid any pitfalls that may come.

About the Author: John Wooton Author and Creator, The SEO Journal Blog, http://seojournal05.blogspot.com/.

Web Page Optimization

We all want to have the most attractive website, one that leaves a visitor wide-eyed and completely dazzled. Usually an extremely attractive design involves lots of graphical elements, which increases the overall page size and causes the page to download slowly to the browser. This article provides some useful tips on how to keep your website design attractive while still downloading quickly.

As the average internet bandwidth per computer rises, more and more webmasters allow themselves to develop complex websites laden with heavy graphic elements. In extreme cases you can find websites that take as long as several minutes to load in your browser. Of course, no user will wait that long for a website to load; they will move on to the next website in their search results.

So why are webmasters still developing slow-loading, bloated websites? Primarily due to a lack of knowledge of simple graphic optimization techniques that would allow them to maintain an attractive website while keeping the page size small.

How many of you are aware that a box with rounded corners can be achieved using CSS code only, without the need for any graphic image? Well, it is possible! Before those of you familiar with CSS object that it cannot be done in every browser and that a relatively high level of coding is required, I say that the most common optimization mistakes web designers make have simple solutions.
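As a sketch of the idea (the class name is hypothetical, and as noted, browser support varies: border-radius is still a CSS3 draft property, while -moz-border-radius is a Mozilla-only extension):

```css
/* Rounded-corner box with no images at all.
   Browsers that do not understand the radius properties
   simply fall back to square corners, which degrades gracefully. */
.rounded-box {
  background-color: #e8eef4;
  border: 1px solid #8899aa;
  padding: 10px;
  -moz-border-radius: 8px;  /* Mozilla/Firefox extension */
  border-radius: 8px;       /* CSS3 draft property */
}
```

Compare that handful of bytes to four corner GIFs plus the extra markup to position them.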

Never limit the web designer by placing restrictions that impact the final outcome. You might claim that what a web designer can do with graphic software is impossible to implement in code. I disagree. When the design is finished and you are ready to slice it into small images for use in the HTML code, your creativity is being tested. Everything you do at this stage will affect the total page size. If your design contains rounded shapes that overlap each other, or areas with color gradients, then you must slice it carefully so the outcome is a small file size.

Let's look at what efficient slicing means:

1. Do not make large slices that contain lots of different colors. Use a small number of slices where each slice contains a limited number of colors.

2. Do not make a large slice that repeats the same graphic structure. Slice a small portion of it and repeat it in your code. This is a very common mistake webmasters and programmers make when dealing with a gradient color background.

3. Do not use the JPEG file format all the time. In some cases a GIF will be much smaller. A rule of thumb: a slice with a high number of colors will usually be smaller as a JPEG than as a GIF, and the opposite is also true. Check each option separately. Every 1KB you shave from an image file eventually adds up to a significant reduction in page size.

4. If you have text on a solid color background, do not slice it at all. Use code to create the background instead. Remember that you can define both the font style and background color of the area using CSS.
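Points 2 and 4 above can be sketched in CSS (the file path and class name here are hypothetical):

```css
/* Point 2: tile a 1px-wide slice of the gradient horizontally
   instead of saving one huge background image. */
body {
  background-image: url("images/gradient-slice.gif");
  background-repeat: repeat-x;
  background-color: #ffffff; /* color the area below the gradient */
}

/* Point 4: text on a solid color needs no image slice at all;
   define the background and font in CSS instead. */
.header-banner {
  background-color: #336699;
  color: #ffffff;
  font: bold 18px Arial, sans-serif;
  padding: 8px;
}
```

The gradient slice might be only a few hundred bytes, versus tens of kilobytes for the full-width image it replaces.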

Advanced Techniques

Graphically optimizing a website is more than just knowing how to do image optimization. There are some advanced techniques that require a high level of programming. CSS2 has much more to offer than CSS does; although not all browsers have adopted the standard yet, you should be ready for when they do. JavaScript also gives you a set of options for creating some cool effects without overloading the page with Flash. Using a limited tool like JavaScript, compared to an advanced application like Flash, to create the desired effects can be difficult. However, think about the outcome: for a one-time effort you can differentiate your website from others. You will have an attractive, professional-looking website that loads quickly.

Back to the Future

As PDAs, smart mobile phones and mini laptops with wireless internet connections are increasingly used for browsing, publishing fast-loading web pages will enhance the experience not only for those on high-bandwidth connections but will also make browsing user friendly (or may I say, bandwidth friendly) for wireless clients.

For those who insist that web design optimization is not necessary because everyone will have high-bandwidth connections eventually, I agree up to a point. However, software companies keep creating applications that use more bandwidth precisely because they know it is available. Get used to writing well-optimized web pages, because this cat-and-mouse game will never end, and it is better to learn the rules of the game than to be bitten.