Archive for December, 2006
Friday, December 29th, 2006
Content rich means different things for different individuals, because what one person finds useful, another may not. Content rich is all about providing information that is considered valuable to your target audience. Information that visitors might find useful could consist of product or industry facts, statistics, reviews, tutorials, or educational information related to a specific industry.
When creating a content rich website, do not be afraid to think outside of the box. Unique ideas will generally garner more attention than the mundane and more common content concepts. While some of the unique content that has garnered the most attention over the years, such as the Subservient Chicken and JibJab, may not be appropriate for a business website, there are still lots of “out of the box” things that you can do.
Here are some ideas on how you can build content for your website that will attract website visitors.
Calendar of Events
If your website appeals to a specific audience, manage and maintain a calendar of events. The events should relate to a specific region or topic.
Ex. Hawaii Local Events – http://calendar.gohawaii.com/ (regional) or ex. Librarian Events – http://www.infotoday.com/calendar.shtml (topic specific events)
Sponsorships and Contests
Conducting a contest is a great way to generate interest and incoming links: everyone wants to win, and in order to garner votes many competitors will tell their audience about the contest and voting options. (more…)
Thursday, December 28th, 2006
Indexing has always been considered a highly targeted science. Enter a search query into Google and the pages that are displayed are generally optimized for that exact word or term. However, in its continual battle to serve the most relevant and natural pages with genuinely useful information, Google has injected latent semantic indexing (LSI) into its algorithms.
What is LSI?
LSI is a unique indexing method that potentially takes Google search one step closer to becoming human in its way of thinking. If we were to manually search through web pages to find information related to a given search term we would be likely to generate our own results based on the theme of the site, rather than whether a word exists or doesn’t exist on the page.
Why Search Engines Might Adopt Latent Semantic Indexing
The extremely rigid form of “keyword indexing” also meant that black hat SEO techniques were easier to implement. Search engines could be manipulated into ranking a site highly by using a set formula. Originally, cramming a page with a particular keyword or set of keywords meant a site would rank highly for that search term. The succeeding set of algorithms ensured that your link profile played a more important part than your keyword density. Reciprocal linking soon followed, once again making it possible to manipulate the search engine spiders by exchanging links with tens, hundreds, or thousands of websites. (more…)
Thursday, December 28th, 2006
Very interesting article by Grant Crowell over at SearchEngineWatch.com.
Here are some of the more interesting snippets for those looking to capitalize on working their images into their search engine optimization efforts.
…image search optimization offers the following advantages:
- Free product promotion. “It’s another avenue of search marketing without having to pay for the click.”
- More optimization opportunities than regular search alone. Smith added that social image sharing sites have more contextual clues that search engines can use for their ranking criteria. “There’s a lot more signals involved than regular web pages.”
- Less competition. “Image search right now is a widely underused area for retailers. Some spaces have very few retailers or no major retailers at all.”
Evans attests that features natural to image search (easier optimization, free inclusion, and less competition from major retailers) create special advantages of image search optimization for niche markets and smaller retailers.
“This is one case where smaller retailers without large content management systems can hold an advantage,” said Evans. “Smaller retailers have direct control over picture descriptions, picture names and content that is directly around the pictures and on the page. Content Management Systems have a lot more constraints on content and files names and therefore it is a lot more difficult to optimize for image search,” she said. (more…)
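The on-page signals Evans describes (descriptive picture names and surrounding content) can be sketched as a small helper. This is an illustrative sketch only; the `slugify` helper and the sample product name are assumptions, not anything from the article.

```python
import re

def slugify(text):
    """Turn a product name into a descriptive, hyphenated file name."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def image_tag(product_name, ext="jpg"):
    """Build an <img> tag with a descriptive file name and alt text,
    two of the contextual clues image search engines can draw on."""
    return '<img src="%s.%s" alt="%s">' % (slugify(product_name), ext, product_name)

print(image_tag("Red Ceramic Coffee Mug"))
# → <img src="red-ceramic-coffee-mug.jpg" alt="Red Ceramic Coffee Mug">
```

A smaller retailer hand-editing pages could apply the same naming pattern manually; the point is simply that the file name, alt text, and nearby copy should all describe the image.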
Monday, December 18th, 2006
Thursday, December 14th, 2006
Great video from WebProNews about the benefits of SEO for online retailers.
Thursday, December 7th, 2006
With so many strategies in the SEO world, RSS feeds can often be overlooked. This article on RSS feeds can help you understand how effective RSS can be for the SEO and Internet marketing of your site.
Top 6 Strategies To Put Your RSS Feed Promotion On Steroids
As I wrote in one of my previous articles, the orange revolution has begun. RSS is here and it has taken the internet by storm. The number of webmasters becoming a part of this online revolution is growing at a fast pace.
The reason is that RSS feeds have proven to increase traffic in a big way, and creating your own RSS feed is a sure step towards tapping into a new source of traffic.
However, just publishing your own RSS feed and waiting for miracles to happen won’t work. In order to get those much-wanted subscribers, RSS feeds need to be promoted and marketed in the right way. With a good dose of quality traffic directed towards your feed, long term subscriptions will start pouring in.
The following 6 ways will not only help you get the word out about your RSS feed but also help you retain a fresh flow of targeted visitors to your website.
Strategy #1: Create an individual webpage for your RSS feed
The most important part of your promotion should be creating a webpage dedicated to your RSS feed. Since RSS is still new to most online users, you can give your readers an introduction and explain how beneficial it is to subscribe to your RSS feed.
You can also have a small FAQ section about RSS here, answering any questions your readers might have. The key is to give your potential subscribers more than one option to add your feed. Different subscription options like “My Yahoo!”, “My MSN”, and “Newsgator” allow your visitors to add and subscribe to your feed at their own convenience. (more…)
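Behind any dedicated feed page there ultimately has to be a valid feed document. Here is a minimal sketch of an RSS 2.0 feed built with Python’s standard library; the site name, URL, and post titles are made-up placeholders, not anything from this article.

```python
from xml.etree import ElementTree as ET

def build_rss(site_title, site_url, items):
    """Build a minimal RSS 2.0 feed; each item is a (title, link) pair."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = "Latest posts from %s" % site_title
    for title, link in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = link
    return ET.tostring(rss, encoding="unicode")

# Hypothetical site and post, for illustration only.
feed = build_rss("Example SEO Blog", "http://www.example.com/",
                 [("Top 6 RSS Promotion Strategies",
                   "http://www.example.com/rss-strategies")])
print(feed)
```

Readers such as My Yahoo! or Newsgator subscribe to the URL where a document like this is served, which is why the dedicated page only needs to link to it with the familiar orange button.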
Wednesday, December 6th, 2006
At Big Oak SEO we have been employing deep linking for the past year and finding a tool to help with a task like deep linking is a nice surprise. Read on to see how you can begin your deep linking SEO campaign with some help.
Sometimes one of the hardest things to do for any website is build links to deep content.
These are the links that generally will help the site perform best overall in the search engines. This is because while it may take hundreds or thousands of links to help a home page rank highly, it may only take a few to help a deep page rank.
In this article I give you some tips on building those deep links.
I’ve had a tool on my computer since it first came out that I’ve always loved. But it wasn’t until today that I realized it could do much more for me.
The tool is Blinkx Desktop Indexer. (more…)
Wednesday, December 6th, 2006
Linking to other web sites can hurt your rankings if you link to the wrong web sites. Links to web spammers or “bad neighborhoods” on the web can have a negative effect on your rankings on Google and other search engines.
A Google official has recently commented on bad neighborhood links and how they affect your web site rankings on Google.
What are bad neighborhood links?
Google doesn’t like the following type of web sites:
- free for all links pages (FFA)
- link farms (automated linking schemes with lots of unrelated links)
- known web spammers
Linking to that type of site can have a negative effect on your Google rankings.
Is there an official Google statement about bad neighborhood links?
In a discussion in a webmaster forum, Google’s Adam Lasnik has recently clarified what Google looks for in regards to bad neighborhood penalties:
- There is no relation between outbound links and Google’s supplemental index
“It’s unlikely that your outbound linking is causing your pages to be listed in the supplemental, rather than main index.”
- Google looks for bad neighborhood linking patterns
“Also, be assured that we’re not looking to penalize folks for a ‘bad’ link here and there. Rather, our algorithms are tuned to look for patterns of ‘egregious’ linking behavior – both on individual sites and in the aggregate.”
- You should check the links on your web site
“It’s certainly in your users’ interest that you regularly audit outgoing links on your site (especially prominent ones) to ensure that you’re not losing folks’ trust by sending them to inappropriate places or 404 pages.
Sure, it’s great to keep Google happy, but it’s usually more important (long term) to have your users be return visitors.”
What does this mean to your web site?
Google looks for linking patterns. That means that it may not hurt your site if you link to a bad neighborhood by mistake.
Many links to 404 error pages might cause ranking problems. It’s a good idea to check the links on your web site every now and then.
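Lasnik’s advice to audit outgoing links can be partly automated. The sketch below extracts outbound links from a page using Python’s standard-library HTML parser; actually probing each URL for a 404 (e.g. with `urllib.request`) is left as a commented step, since sensible timeouts and crawl politeness depend on your site. The sample page and domain are hypothetical.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def outbound_links(page_html, own_domain):
    """Return the links that point away from your own domain."""
    parser = LinkExtractor()
    parser.feed(page_html)
    return [link for link in parser.links
            if link.startswith("http") and own_domain not in link]

# Hypothetical page markup, for illustration only.
page = '<a href="/about">About</a> <a href="http://www.example.org/partner">Partner</a>'
print(outbound_links(page, "mysite.com"))
# → ['http://www.example.org/partner']

# To then audit each outbound URL for 404s, you could request it, e.g.:
#   urllib.request.urlopen(url, timeout=10)  # raises HTTPError on a 404
```

Running something like this over your pages every now and then surfaces both broken links and links to sites you may no longer want to vouch for.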
Tuesday, December 5th, 2006
There are two camps in the world of directory-to-directory reciprocal linking, and each has a distinct approach to the work.
One camp treats linking as a branding function of the business, as well as a way to provide a genuine resource directory for their site visitors. It’s what I call “traditional” reciprocal linking, because that’s all there was to it in the pre-Google days. There were no “games” to play. Many practitioners of this method of linking have been doing grassroots web marketing work since the mid-1990s, and they understand it at their core. Others have learned from these masters, usually by following their successful examples. Their strategy is not very complex, but their approach and methods are very thorough, and they’ve enjoyed some phenomenal search results for a long time. So why change what has always worked? The answer is, they don’t.
The second camp comprises people who are absolutely certain that, with the right “tweaking” of their websites, they can rocket to the top of the search engines. Branding is not the goal. Search engine placement is the overriding concern, which causes them to pursue all manner of complex strategies and theories in hopes of “gaming” the search engines. Having read much of this complex theory over many years, and reviewed it against hundreds of real world examples, I find that little of it holds any water. Beyond that, it adds layers of complexity, cost, and work to a process that needs no more. (more…)