Is Placing Content Higher up the Page an SEO Myth?

The somewhat misleadingly titled article “16 Elements You Must Include in Your Site Design”:http://www.webhostdir.com/news/articles/shownews.asp?id=16417 has been proving popular on Delicious lately.
Guideline #5 recommends that you:
bq. Place content near the top of the page. Search engines spider the content that comes first in your source code.

Nothing too earth-shattering there: the advice to consider SEO when building web pages has been given many times before.
However, through a coincidence of timing, search engine optimization expert Jill Whalen covered this very topic in the latest issue of her @High Rankings Advisor@ newsletter but provided “quite different advice”:http://www.highrankings.com/advisor/lowbodycopy/.
Asked by a reader whether body copy appearing two-thirds of the way down a page’s source code negatively impacts search engine rankings, she replied:
bq. This is an old SEO myth. It actually makes no difference where in the source code the copy of the page shows up.
That’s a pretty definitive statement about a widely held SEO belief. However, given Jill’s expertise in this area, I am inclined to believe her.
What do others think – do you agree or disagree?

9 thoughts to “Is Placing Content Higher up the Page an SEO Myth?”

  1. Of course it doesn’t matter where in the source code the content is, since search engine spiders run their algorithms on the source code as a whole, and the algorithm doesn’t include some sort of counter that checks how far into the code the content is, only what type of content it is (a link, the title of the web page, etc.).

  2. I don’t think it makes a difference. I’m not an expert, but I don’t see how the spiders could work out how far down the page your content is.

  3. I think it stems from the belief that search engines don’t spider the *whole* page, just, say, the top 2k of characters. This would make sense: programmatically speaking, the page would be read into a buffer, and that buffer must have a theoretical limit on how big it can be (see the sketch after the comments for what such a cut-off might look like).

  4. I think this is ridiculous. People in SEO seem to forget at times that the search engines are in the business of providing relevant results, not making it harder for people to trick their way to the top.
    If almost every page’s content is two-thirds of the way down the page, then relevance would be lost by excluding or devaluing everything beyond the first third.

  5. Thinking about it a little further, it’s possible that some search engines try to identify the overall topic of a page by looking at headings and key phrases at the top, or maybe they once did?

  6. All – it does seem counterintuitive for search engines to have a cut-off after a certain number of characters, as this stands to make their results less accurate (a case of cutting off your nose to spite your face).
    On the other hand, as “Richard points out”:https://smileycat.com/miaow/archives/000261.php#comment-12842, I can see how they might need to draw the line at some point.
    However, surely this would be a pretty large number, no? Which to all intents and purposes means that your body copy can be as far down in your source code as you like and you won’t be penalised. Myth debunked!
    Of course, if you build your sites using CSS for presentation, this issue doesn’t really present itself anyway.

  7. I would highly doubt that search engines weight content less heavily if it appears beyond a certain arbitrary distance down the page. However, they do assign greater importance to content within heading tags.

  8. The search engines index at least the first 100k of any HTML file. Some are indexing even more these days, but the 100k number has been known for quite some time.

  9. It doesn’t harm you to place content high in your pages. In fact, if Google’s human reviewers decide to visit your page, you won’t be thrown out, since they only read the first few lines of a page; if they don’t see any relevance, they’ll strip your page of its ranking in the Google index.

Comments are closed.
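To make the cut-off idea debated in the comments concrete, here is a minimal sketch (in Python, purely for illustration) of how a crawler could impose a hard limit on how much of a page it reads. The @fetch_truncated@ helper and the @MAX_BYTES@ figure are hypothetical assumptions for this sketch; nothing here reflects how any real search engine actually works.

bc.. import urllib.request

# Hypothetical cut-off. The 100k figure is the one quoted in the
# comments above, not a published search engine specification.
MAX_BYTES = 100 * 1024

def fetch_truncated(url, limit=MAX_BYTES):
    """Fetch a page but keep only the first `limit` bytes.

    If an indexer really worked this way, body copy past the limit
    would never be seen at all, which is the behaviour the "keep
    content high in the source" advice assumes.
    """
    with urllib.request.urlopen(url) as response:
        return response.read(limit)

html = fetch_truncated("http://example.com/")
print(len(html))  # anything beyond MAX_BYTES is simply discarded

p. Even if some engine did work this way, 100k is far more markup than most pages contain, so in practice body copy low in the source code would still fall well inside any such buffer.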