Archive for the ‘SEO Strategies’ Category
Wednesday, November 20th, 2013 |
This list was compiled by Eric Ward so you know it has to be good.
1. Down With Toolbar PageRank And Up With… What?
by Julie Joyce
There are quality alternative metrics to PageRank, such as the various ones used by Majestic, Ahrefs, Open Site Explorer and Link Research Tools. Let’s look at how these metrics are calculated, taking it from the source. After that, we’ll see how the numbers stack up across a few different sites.
2. 315 Businesses Boost Rankings by Optimizing Their Google+ Local Pages [Study]
by Jon Schepke
Ripe opportunity still exists to boost local visibility with some focused SEO efforts. Consider the five primary optimization techniques employed by the businesses in this study and the impact the effort delivered on local visibility.
3. How Google+ Shut Me Down for Impersonating Myself
by Ken Mueller
Despite having a YouTube channel (owned by Google) under my own name for years, and a Gmail account, along with other Google properties, I’ve been shut down by Google+ because my “profile impersonates someone.” So, I have my picture and my name, and I’m impersonating someone? Myself perhaps?
4. Are all comments with links spam?
Video by Matt Cutts
5. How Google Might Use the Context of Links to Identify Link Spam
By Bill Slawski
In 2004, Google filed for a patent that describes how the search engine might pay more attention to the context of a link, such as the words that surround the link, to better understand the context of those links. In the example of unnatural links from the Webmaster Central blog post, we see clearly how links in an example post might be created in a way where the context of those links makes little sense.
6. Building High Quality Backlinks with Dofollow Relationship
By Michael Martinez
Where Larry and Sergey were wrong was in treating links as “votes”. Links have never been “votes”. But the aggregate data analysis really doesn’t need to use that metaphor. Or, rather, I should say it’s a neutral metaphor at the aggregate level — it doesn’t matter if you say “links are votes” or “links are NOT votes” when you look at the SET OF LINKS. Linking behavior always reveals a pattern.
Monday, February 25th, 2013 |
The SEO industry has grown four-fold in the past three years, with business web sites and blogs tirelessly working to increase their content's visibility in internet searches. This competition gave rise to several SEO techniques for increasing the rankings given by the search engines. These techniques mainly concentrate on the keywords users are most likely to search for, and some SEOs took to cleverly stuffing their content with those keywords for their own gain. The end result was that many worthy web pages lacked visibility.
To counteract this, the search engines introduced a new concept into their search algorithms called Latent Semantic Indexing (LSI), also referred to as latent semantic analysis. It looks not only at the specific keywords being searched for, but also at their semantic relationship with the rest of the page's content and with other common words related to them. The prominent characteristic of this method is its capacity to dig out the conceptual content of a web page by forming associations among terms occurring in related contexts. For example, if you search for 'ppi claims', you will get results like 'Reclaim PPI', 'Mis-sold PPI' and 'ppi claims – no win, no fee', all of which are related to the keyword.
It improved organic search by looking for relevance instead of the traditional exact matching of keywords against the text being searched for. This method also put a check on backlink and keyword spammers and many other black-hat SEO techniques, reinforcing that content is the ultimate key to better rankings. The introduction of this process has done a world of good not only for web site owners but also for internet users, as it increased the probability of finding the exact information they are looking for.
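The core idea behind LSI can be sketched with a toy example. Note that this is only an illustration of the underlying technique, not how any search engine actually implements it: the documents, terms and matrix below are invented. A singular value decomposition of a term-document matrix projects documents into a low-dimensional "concept" space, so pages about related topics end up close together even when their exact keywords differ.

```python
import numpy as np

# Toy term-document count matrix: three invented documents about PPI
# claims and one about baking.
# Terms (rows): ppi, claims, reclaim, mis-sold, fee, cake, recipe
# Docs (cols):  d0 "reclaim ppi claims", d1 "mis-sold ppi",
#               d2 "ppi claims fee",     d3 "cake recipe"
A = np.array([
    [1, 1, 1, 0],  # ppi
    [1, 0, 1, 0],  # claims
    [1, 0, 0, 0],  # reclaim
    [0, 1, 0, 0],  # mis-sold
    [0, 0, 1, 0],  # fee
    [0, 0, 0, 1],  # cake
    [0, 0, 0, 1],  # recipe
], dtype=float)

# Decompose and keep only the top k "concepts".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # each row: a document in concept space

def cos(a, b):
    """Cosine similarity between two concept-space vectors."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
```

In concept space, d0 and d1 sit close together because they share the "PPI" concept even though their exact keyword sets differ, while the baking document stays far away. This is the relevance-over-exact-match behavior described above.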
LSI revolutionized search because the algorithm assesses pages much as a human does, essentially based on relevance. Here is some advice on how to prepare your web site for LSI.
- Use the webmaster tools provided by the different search engines to scrutinize the indexing of your web pages and make any necessary changes.
- Use relevant keywords in all parts of the content: titles, meta tags, web links, image titles, etc.
- Avoid stuffing your content with keywords, or the search engines will flag it as spam.
- Consult an SEO professional if you cannot do it yourself. A small investment can pay off well.
- Continually research which keywords are searched for most, because search trends vary with the time of year, geographic location, etc.
To stay ahead in the race of ‘Internet Marketing’, continually adapting your web site for latent semantic indexing is one of the most important SEO techniques.
About the Author:
This guest post is contributed by Zara, a guest blogger from the UK. She has written many articles on SEO, technology and finance. Apart from blogging, she does research on ppi claims. Catch her @financeport
Friday, January 20th, 2012 |
I skate to where the puck is going to be, not where it has been. – Wayne Gretzky
Blog commenting, article spinning, article directory submission, profile links: such methods are still in the search engine optimization tool box for many internet marketers because they still work here and there. What these site owners don’t want to come to terms with is the fact that their success has a very short life span. Sooner or later, Google engineers will update their algorithm and their rankings will evaporate in seconds.
Rather than spending months trying to trick Google into ranking your content based on hundreds or thousands of subpar quality links, spend months on generating links that will last for decades. Realize that although those subpar quality links will get you a quick payoff in the short term, you are wasting a massive amount of your time because, sooner or later, Google will devalue those links.
Even short term success isn’t guaranteed. If you’re getting traffic from high ranking keywords, inevitably you’ll have people looking at what other type of content you have on the web. If people find thousands of spun articles that link directly to your site, written in language only Jabba the Hutt would understand, you can be sure that they aren’t going to buy anything associated with your site’s brand. It’s pretty obvious when a site is looking to provide useful information and when it is looking to cash in by gaming Google. Eventually, your site won’t be visited by a harmless Googler, but by a Google engineer who will manually inspect your site and determine its fate in seconds.
Yes, it takes a lot of time to write quality guest posts that are posted on established blogs. Yes, it takes a lot of time to seek out blog owners who are open to guest posts in the first place. Yes, it takes a lot of time to create content which people would naturally want to link to. However, all of those time consuming methods will give you more value in the long run because Google will value those methods for many years! Aside from guest posting and creating quality content, there are many other white hat methods that you can invest in…
- Infographics, which can be used as link bait.
- Youtube videos, which can give your site traffic and links, if you have interesting enough content.
- Message boards, which are hard to set up, but can be a source of sustained traffic and links that grows indefinitely.
- Press releases that may find their way on big news sites and provide you with traffic and links.
Too many SEOs fail to think of their work as the long term investment it really is. Rather than trying to emulate the quick results of PPC with questionable linking practices, focus on creating value.
Nickolay Lamm is an internet marketing specialist who manages InventHelp and InventHelp Scam.
Wednesday, July 20th, 2011 |
Article writing is the single most effective tactic for driving traffic to any website. Unfortunately, too many webmasters place greater importance on quantity than on quality. For every webmaster who focuses on writing quality content, there are at least a few who simply do keyword research and pump out as much content as possible to hit those keywords.
In the earlier days of Google, this was an effective strategy. However, things have changed and it is now more effective to spend time writing a few quality articles than it is to build a content farm out of mediocre articles. Low quality articles litter the internet, frustrate visitors and make the lives of legitimate website owners that much more difficult.
High quality articles are not written with the sole intention of hitting keywords. High quality articles are written with the reader in mind first. The best articles are those that are unique, well-written and backed by research. Visitors to your website can tell instantly whether or not the content is worth spending the time to read.
Here are four reasons why quality trumps quantity in the realm of article writing:
Quality Articles Provide Value to the Reader
By “value,” I’m referring to articles that readers find useful, interesting, entertaining or thought-provoking. These articles reward website visitors for spending the time to read what you’ve written. This generates good will among your visitors and encourages repeat website visits. Every visitor that bookmarks your website is one less visitor that you have to earn through Google.
High quality articles also generate more natural links to your website. When people stumble upon well-written content, they are more likely to share that content with other people. This generates links on social media sites such as Facebook and on third party websites. Those links bring more direct traffic and help your site rank higher in search engines.
Let’s look at an example:
A 500 word article on the topic of wireless internet is way too short to fully explore such a wide-open subject. There’s nothing inherently wrong with short articles, but a 500 word article on “wireless internet” is highly unlikely to provide any real value to the reader. A better approach would be to drill down to a more specific topic and then thoroughly cover that topic.
Quality Articles Generate Trust
If you consistently provide your readers with useful information, your readers will come to trust you. Not only does this trust earn repeat visitors and more backlinks, but it also makes your readers more likely to listen to your recommendations. People who trust you are more likely to buy your products and click on your affiliate links.
But remember: trust is not earned in a day. Trust is earned over time by posting high quality articles again and again. No matter how many articles you publish every day, it takes time to develop a relationship with your visitors. That is why it is so important for you to always write high quality articles, even if it means a slower rate of production.
Quality Articles Last Longer
High quality articles last longer because they remain useful to your readers. A thorough “how-to” guide of some sort can last for years if the subject remains relevant. Once again, these articles encourage repeat visits to your website.
As an added bonus, high quality articles stick around longer in search engine results. When people share high quality articles over a long course of time, those articles slowly generate new links and maintain high rankings in search engines. Low quality articles eventually fall off the top page of Google and collect dust in the vast archives of the internet.
Google No Longer Rewards Content Farms
Google has actively stepped up its efforts to weed out obvious content farms from its search engine results. The recent Panda update was designed specifically for this reason. Google has always stated that its goal is to provide the most relevant and high quality search results possible for its users. Websites that simply pump out low quality content all day long do not provide searchers with the information they want.
The Panda update goes to show how important quality content is. Hate ‘em or love ‘em, you have to admit Google has a good handle on what kind of search results people like to see. If Google places greater importance on quality than on quantity, you can bet the people that use Google do as well. Besides, common sense tells us that people would rather find a few snippets of useful information than a truckload of filler text.
by James Wilson
Monday, May 10th, 2010 |
Good news travels fast; bad news travels faster. A recent spurt in rampant, and sometimes baseless, negative online publicity has affected the brand image of many companies. In 2005, a single post by a blogger criticizing Dell’s support services pulled the company’s reputation down a couple of notches. The corporate world is waking up to the necessity of playing an active role in online reputation management.
Online Reputation Management involves not only analyzing all that is being written about your brand online but also repairing any damage found and constructing a positive image. A successful ORM campaign should involve public relations and search engine marketing. You have to ensure that good things are being said about you on various websites and these websites get top search engine rankings.
How to monitor/track your online reputation
Regular monitoring of online media will help you keep tabs on your public perception. It will also alert you in cases of copyright violations, competitor smear campaigns, domain squatting, etc.
There are many online tools you can use to keep one step ahead and take immediate action. Google Alerts is the most popular monitoring tool; it will track and inform you whenever your brand name comes up in news, feeds, videos, blogs and web results. There are blog-specific search tools like Technorati and Feedster. Twitter Search and Social Mention can also help you catch the buzz about you on social networking sites.
How to repair your online reputation
With the growth of user-generated media like blogs, Tweets and Yelps, the chances of creating negative publicity have also increased. The first step towards tackling negative comments is to create your real presence in popular consumer-generated websites. Responding to your critics on these sites will build trust around your brand. In cases of inaccurate projections, you can request the comment authors to pull down their posts by giving substantial evidence.
Press releases can be posted on popular press release submission sites. Expert articles pertaining to your industry can be submitted to reputed sites with back-link to your website.
You can also buy domains containing your brand name (for example, if your company is called Blush, then blush.com, blush.net or blushsucks.com are some domain names you can book) to prevent people with malicious intent from misusing them against you.
Not to keep picking on Dell (I’m using a Dell computer right now), but here is a negative site that could have been snagged to prevent bad publicity: http://www.ihatedell.net/.
Sometimes, repair can be a long-drawn exhaustive process. Companies often use search engine optimization techniques to push down negative websites and increase the visibility of websites with positive content.
How to improve your online reputation
An important step in improving online reputation is creation of positive sentiments through various content platforms. This means you have to fully use your online assets. Start by optimizing your corporate website so that it gets top search engine rankings.
Leverage your relationship with your partners to include information about your company on their websites. Set up and maintain blogs that highlight your products, customer testimonials, services and company-related news. This idea involves grabbing as many of the search engine results as possible on the first page of search results. Using high ranking partners will help tremendously.
A proactive online reputation management initiative goes a long way in helping people and companies. It’s one of the best tools to gather useful feedback from customers. In cases of complaints, it gives companies a chance to take early action and prevents build up of a sudden crisis. Above all, what makes ORM a critical business strategy is the role it can play to influence a rapidly growing force called the online media.
Monday, April 19th, 2010 |
Infographics, or information graphics, have been around for as long as man has been able to draw. The earliest cave paintings are a form of infographic as they pictorially depict the life and activities of our very distant ancestors. Thousands of years later, we still readily understand them. The infographic underwent significant development in the 20th Century and an infographic, rather than written or spoken language, has been used in our first communication effort with extraterrestrials!
Infographics are widely used in our society, in mathematics, mapmaking, signage, news media, education, travel, medicine, politics and even religion. No aspect of our lives is untouched by the application of infographics.
So why are they so popular?
Infographics convey knowledge and advice, even mandatory orders, in a form which the human brain readily recognizes and associates with the information behind the representation. This is known as visualization.
Before man learned to read and write, he drew. Modern written language is itself derived from the development of drawings which became standardized into symbols and in turn, into recognizable letters and numerals we now recognize. Hieroglyphics from ancient Egypt are a good example of an intermediate written language which revolves around symbology and formed the basis for the development of vowels and consonants.
Graphical representation is far more accessible and understandable to people, whether or not they understand the language of the designer. The reason people absorb so much more information from infographics than from text is explained by how our brains formed over time. During man’s early development, we had no language, never mind the ability to read and write. Man primarily looked at the world around him; sight was the primary sense, with smell, sound, touch and taste distant also-rans. Our brains are “hard wired” to “read” visuals as their default operating system: what we can visualize is our primary mechanism for taking in information. A baby must learn to speak and must be taught to read and write, but has no trouble drawing as soon as it can hold a crayon.
As the saying goes, a picture paints a thousand words, which is why, possibly, the most important infographic is currently aboard the Pioneer 10 spacecraft. Pioneer 10 was launched in 1972 and is currently journeying through outer space – the first vessel to leave the solar system. It contains the Pioneer Plaque (http://en.wikipedia.org/wiki/Pioneer_plaque); a pictorial representation of humankind, our planet and solar system and where we are located. This is a powerful testament to the universal understanding provided by infographics which are not restricted by language barriers.
Visual language is universal for those who can see; imagine your car journey without graphical road signs, for instance. Graphical images can be very quickly assimilated by the human brain, and render a meaning which is clear without the need to read text. In part, modern infographics are so readily understandable because we have been educated in the basic grammar of visual language. We know, for instance, that a bar running through a left-pointing arrow means “Don’t turn left.” Possibly the most important development in road signs has been the stick figure drawings that represent people (originating from the Munich Olympics in 1972).
Newspapers have probably done more to lay the basis for our understanding and appreciation of infographics than any other medium. In the 1970’s, British newspapers started to develop a series of charts and graphical representations to convey information in an understandable format to readers. This was rapidly picked up by USA Today when it launched in 1982, and spread to other mainstream media publications such as Time magazine.
Infographics have not been without their critics. Newspaper critics and traditionalists deride the “chart junk” which populates papers and the media. They argue that infographics demean the information being conveyed. At the same time, the idea that infographics are artistic has also received derisory comments from the art world. The idea that an infographic is where “art meets science”, is not widely accepted in the journalistic or art world, but nevertheless, the reading public clearly appreciates the graphical, and sometimes comical, representation of information.
What of the future? A notable exception to the long list of infographic applications is television. Television has only recently embraced the infographic for transmitting frequently complex and large volumes of data in a visual fashion. Perhaps this is because television is itself a visual medium relaying information in real time, i.e. without the need for a fast data burst to our brains. This does lead to the question: how much more powerful could a televised infographic be in relaying information to people? The televisual infographic is under development at this time, but how successful it will be we shall have to discover for ourselves as they start being broadcast on our screens.
Friday, March 19th, 2010 |
Press release frequency is a topic that doesn’t get nearly enough attention. It seems like every post I read on the subject of press releases has to do with how to write them, how to optimize them, and where to publish them. And while all of these things are important, the frequency of your press release distribution is just as vital to the success of your PR campaign as any other factor.
So, what’s the ideal press release frequency?
I recommend sending out a press release at least once a month. Now, I know what you’re saying: “But what if I don’t have big news to announce every month?” My response: You may not have “big news” every month, but you always have an interesting story to tell. You just have to know where to look for it and which angle to attack it from.
Now, why do I say you should distribute a new press release at least once a month? There are a few reasons.
- It’s the leaky faucet approach to PR—PR magnets (those who seem to always get PR) rely on this approach. It is based on the premise that if you drop a series of newsworthy press releases consistently over a long period of time, the media will eventually cover your company. The idea is that you always need to remain in the awareness of the media, and by publishing newsworthy press releases regularly they’re constantly reminded of you. Sooner or later, your press release will be in the right place at the right time.
- It’s important for SEO—Online press release distribution is an often overlooked, yet vitally important, tool that can help any SEO campaign. By optimizing your press releases for targeted keywords, your press releases can grab top rankings in Google and other search engines. The search engines just love press releases. In fact, I’ve had experiences where press releases stayed on the first page of Google for the targeted keyword for more than a year. And when you add in the backlinks that you can include in your press release, it’s clear just how powerful press release distribution is for SEO.
- It helps educate your audience—One of the biggest challenges of any PR campaign is educating your target audience on who you are, what you do, and what makes your company different. By distributing press releases on a monthly basis, you can build brand awareness, and over time, your target audience will get to know your company. Ideally, this will lead to more leads and sales.
How often do you publish a new press release? Why? Share your thoughts by leaving a comment.
This article is written by Mickie Kennedy, founder of eReleases, the online leader in affordable press release distribution. Download a free copy of the PR Checklist – a 24 point list of Press Release Dos and Don’ts here: http://www.ereleases.com/prchecklist.html
Wednesday, March 17th, 2010 |
Link building is something a lot of people struggle to do effectively. The problem isn’t always knowing how to build links; it’s sticking to a couple of tactics and ensuring they yield results before moving on to the next one. In this post I am going to discuss one tactic you can implement straight away using Google Alerts & RSS Feeds.
1. Building Ideas
One of the biggest mistakes people make is treating link building as a numbers game. They build a bunch of links and never think about them again. You should treat every piece of content as a sales piece for the site you are promoting. The content should be topical and relevant to the industry you are in. It should be themed around popular subjects. To get ideas for your content, build your own RSS Feeds as follows:
a. Twitter Search
You can enter keywords in search.twitter.com and build a social feed for them. If you use RT “keyword”, it will tell you what people are retweeting. The feed link is available at the top right-hand corner.
b. Digg / Delicious / PopURLS
These 3 sites are not only great sources of information, but can be used to highlight popular content around your target keyword. All of them allow you to search on a particular keyword and sign up to that RSS feed. Again this will allow you to quickly scan through content and see what is being marked as popular.
2. Stalking Article Writers
Once you have decided on your content from step one, do some investigating into where this kind of content gets picked up. Go to Ezine Articles and find a similar article. Click into it and check right down at the bottom for “Most Published EzineArticles in the <Selected Category>”. Select a couple of those article titles and punch into Google [intitle:”<Article Title>”]. This will build you a list of sites (link targets) that accept the kind of content you are going to write, and also identify popular writers in your market. For each writer you deem the most popular (you guessed it), sign up to their RSS Feed on Ezine.
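The query-building part of this step can be scripted in a few lines. A minimal sketch follows; the article titles are made-up placeholders, not real "Most Published" entries:

```python
from urllib.parse import quote_plus

# Hypothetical article titles pulled from the "Most Published" list.
titles = [
    "Five Ways To Improve Your Golf Swing",
    "Choosing The Right Golf Clubs",
]

# Build the [intitle:"<Article Title>"] operator queries and the
# corresponding Google search URLs.
queries = ['intitle:"%s"' % t for t in titles]
urls = ["https://www.google.com/search?q=" + quote_plus(q) for q in queries]
```

Running each URL by hand (or feeding the queries into your rank-checking tool of choice) gives you the list of candidate link targets described above.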
3. Tracking Your Links
You should now have produced a batch of content that has already been picked up by 3rd party sites in your market. What most people do wrong at this point is seed the content and then forget about it. This is where Google Alerts come in. Create an alert for every piece of content you seed out; simply track the article title (in quotes). Within your Google Alerts, set these as “Feed” (the default is “Email”) and pull them into a folder named for the keyword you are targeting.
Now you have a bunch of great articles out in the wild being picked up by 3rd party sites. Each time an article is picked up, review the site and offer more unique content if it’s worth getting a better link from them.
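The review loop in step 3 can be sketched as a small script that parses an alert feed and flags which of your seeded titles have been picked up. The feed XML below is a hypothetical stand-in for a real Google Alerts RSS feed, using only the standard library:

```python
import xml.etree.ElementTree as ET

# Titles of the articles you seeded out (tracked in quotes in your alerts).
TRACKED_TITLES = {"10 Link Building Mistakes", "SEO For Small Business"}

# Hypothetical alert feed content; a real feed would be fetched from
# the Google Alerts "Feed" URL.
feed_xml = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <item>
    <title>10 Link Building Mistakes</title>
    <link>http://example.com/republished-1</link>
  </item>
  <item>
    <title>Unrelated Industry News</title>
    <link>http://example.com/other</link>
  </item>
</channel></rss>"""

def picked_up(xml_text, tracked):
    """Return {title: link} for tracked articles found in the feed."""
    root = ET.fromstring(xml_text)
    found = {}
    for item in root.iter("item"):
        title = item.findtext("title", "")
        if title in tracked:
            found[title] = item.findtext("link", "")
    return found

hits = picked_up(feed_xml, TRACKED_TITLES)
```

Each hit is a 3rd party site worth reviewing, and potentially worth offering more unique content to in exchange for a better link.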
This is just one easy tactic you can implement straight away using RSS Feeds + Google Alerts. There are literally dozens of tactics like this.
Searchbrat.com offers custom link building services to increase your site’s visibility and ROI. Check out the full range of SEO Services on offer.
Thursday, March 11th, 2010 |
The less competitive and more specific ‘longtail keywords’ are the epitome of opportunity. There is an almost endless amount of longtail traffic out there, and sites optimised in the right way can capture a great deal of it. But is it worth spending time creating lots of content and optimising it to pull in longtail traffic?
Firstly, websites don’t necessarily need to be all that powerful to rank for longtail keywords. This means that if you have a new or weak site and you cannot compete for the top terms yet, you can always tap into longtail search at some level. It is very difficult indeed to rank well for a whole host of generic terms, whilst there isn’t really anything stopping you ranking for many thousands of longtail terms. This post shows that in order to pull in more longtail traffic, 50% of the work you need to do is onsite work, compared to only 5% onsite work for the top level keywords. With this in mind, if you are not proficient in link-building but can look after your onsite optimisation and copy, you can still perform well under your own steam, rather than having to outsource offsite work. The most important thing to say about longtail search terms, though, is that they convert much better. As mentioned in part 1, longer keyword searches perform better than short ones, and so even though traffic might be lower with longtail, sales can still be higher.
Longtail search terms can be something of an unknown quantity when it comes to predicting just how much traffic they will provide. To a large extent we know that generic keywords will provide at least some traffic if we rank well for them, but there is no guarantee that longtail search terms will do the same. In order to get anywhere with longtail search, you need good amounts of unique copy on your site. You often play a law-of-averages game with longtail – the more content you produce, the greater the chance someone will search for longtail terms found within it. Not everyone has the time or ability to produce large volumes of content, though, and it can seem a risky investment of resources if there is no guarantee of traffic. Lastly, longtail search habits tend to change more frequently than the large generic terms. For example, a certain range or style of dolls house might be popular one month and then receive no searches the next, but people will always search for the generic term “dolls houses”. This means that you might spend lots of time optimising for keywords that your research shows people are using, only to find they are redundant before your pages even get crawled.
So what is the answer then?
Annoyingly, this really depends on many factors specific to your site. For example, how powerful is the site? What are you selling? How competitive is the market? And so on. What I have found from experience, however, is that a happy medium is often best. By all means go after the top terms if you think your site has a chance of ranking, but at the same time, make sure your site is positioned to capture as many longtail terms as possible. What I can tell you, though, is the worst thing you can do: blindly throw all your efforts into one or the other month after month, without considering where your best ROI might come from. Unfortunately, I see many SEOs do this very thing all the time.
Duncan is a search and online marketing specialist in the UK. He is also passionate about travel and blogs for an Oceania cruises company.
Wednesday, March 10th, 2010 |
There is a tendency in SEO to go blazing after the most competitive “glamour keywords” in an attempt to get them ranking high in the SERPS. This is certainly not a fool’s pursuit as there are benefits to ranking for such terms. However, more SEOs these days are waking up to the potential power of longtail search terms and some are even finding they give a much better ROI. So which should you be going after, the head or tail of the search term beast?
Top Generic Terms
The competitiveness of top level terms within each niche varies. Trying to rank for “fishing equipment”, for example, is likely to be a lot harder than trying to rank for “tree surgery equipment”. However, as the SERPS become more competitive each day, it can require a lot of time and effort to reach the first page in even the smallest markets. So should we really be investing our blood, sweat and tears in trying to rank for a few measly keywords?
First of all, the most generic keywords tend to have the highest search volumes. Therefore, if you can get into a good position in the SERPS, you’re pretty much guaranteed to get some traffic from them. Also, by going after the top level terms and building links using these terms in the anchor text, you’re likely to pull in a number of the longer-tail keywords at the same time. For example, if you do a lot of work on the term “car insurance” and you see movement up the rankings, you’re likely to see some boost for terms such as “car insurance quotes” or “buy car insurance”. Additionally, ranking for top level terms often helps brand awareness and credibility. When most people see a site ranking highly in the SERPS for competitive terms, they are more likely to assume that site has quality and is trustworthy…if only they knew!
As mentioned earlier, trying to rank for competitive generic terms often requires a great deal of time and effort, and can be a little like trying to climb a mountain without actually knowing how high it goes. Whilst SEOs can make informed guesses about just what it will take to move up the next slot in the SERPS, no one can really know for sure, and so a term you are plugging away at for months might not even budge an inch. Also, by narrowing your focus to such specific terms, it is very easy to ignore a whole load of terms on the next level down that can also provide good traffic in their own right. Lastly, the more generic (often single-word) terms do not convert as well as longer-tail terms. Ignoring brand terms, this report shows that conversion rate increases with the number of words in the search query, all the way up to four-word phrases.
Tomorrow we’ll post part II of this article.
Duncan is an SEO engineer from England. He is also passionate about travel and blogs for a river cruises agent.