Thursday, November 8, 2007

Mads Gorm Larsen: Web 2.0

Marketing through social networks like MySpace

For a while now, I have been intrigued by the possibilities of positioning a homepage/company/brand through Web 2.0 applications and social networks like MySpace and Facebook.

So this week I started creating profiles for my two homepages:

http://www.officeguide.dk
http://www.vistaguide.dk

I started by building a MySpace account and then moved on to Facebook, where I also created a personal profile, which of course links to both homepages as well.

Over the next couple of weeks I will be doing some testing to measure just how much traffic these social networks can generate. The results will be posted on this blog, so stay tuned.
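One simple way to measure this kind of referral traffic is to count referring domains in the web server's access log. Here is a minimal sketch of the idea; the sample log lines and the Apache "combined" log format are assumptions for illustration, not my actual setup:

```python
from collections import Counter
from urllib.parse import urlparse

# Made-up sample lines in Apache "combined" log format.
SAMPLE_LOG = [
    '1.2.3.4 - - [08/Nov/2007:10:00:00 +0100] "GET / HTTP/1.1" 200 512 "http://www.myspace.com/officeguide" "Mozilla/5.0"',
    '1.2.3.5 - - [08/Nov/2007:10:01:00 +0100] "GET / HTTP/1.1" 200 512 "http://www.facebook.com/profile.php?id=123" "Mozilla/5.0"',
    '1.2.3.6 - - [08/Nov/2007:10:02:00 +0100] "GET / HTTP/1.1" 200 512 "http://www.google.com/search?q=office+guide" "Mozilla/5.0"',
]

def referrer_domains(log_lines):
    """Count visits per referring domain."""
    counts = Counter()
    for line in log_lines:
        # In the combined format, the referrer is the second-to-last
        # quoted field on the line (the last one is the user agent).
        parts = line.split('"')
        if len(parts) < 7:
            continue
        referrer = parts[-4]
        domain = urlparse(referrer).netloc
        if domain:
            counts[domain] += 1
    return counts

counts = referrer_domains(SAMPLE_LOG)
print(counts.most_common())
```

Running the same counter over a real access log, before and after the profiles went live, would give a rough picture of how many visitors actually arrive via MySpace and Facebook.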

Tuesday, November 6, 2007

Google’s patent specification revealed.

A couple of days ago I stumbled upon an article that tried to elaborate on a patent Google recently filed. The article stated that Google measures content changes to try to determine how stable or fresh a web page actually is.


But of course this doesn’t mean that you should constantly change the content of your websites: changing it too often will make the webpage appear shallow, and Google tries to distinguish between real and superfluous content changes.
Some of the factors Google possibly uses to record changes in web pages are:

the frequency of changes
the amount of changes (substantial or shallow changes)
the change in keyword density
the number of new web pages that link to a web page
the changes in anchor texts (the text that is used to link to a web page)
the number of links to low trust web sites (for example too many affiliate links on one web page)
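Keyword density, for instance, is simply the share of a page's words that a given keyword accounts for, so a change in it is easy to quantify. A toy sketch (this naive word-count definition is my simplification for illustration; nobody outside Google knows the formula it actually uses):

```python
def keyword_density(text, keyword):
    """Fraction of words in `text` that equal `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Two hypothetical versions of the same page.
old_page = "office tips and tricks for the modern office"
new_page = "office office office tips for the office worker office"

change = keyword_density(new_page, "office") - keyword_density(old_page, "office")
print(f"density change: {change:+.2f}")
```

A sudden large positive change like this is exactly the kind of signal that could make an edit look like keyword stuffing rather than a genuine content update.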


Section 0128 of the patent filing explains why one shouldn’t change the focus of too many pages at once:


"A significant change over time in the set of topics associated with a document may indicate that the document has changed owners and previous document indicators, such as score, anchor text, etc., are no longer reliable.


Similarly, a spike in the number of topics could indicate spam. For example, if a particular document is associated with a set of one or more topics over what may be considered a 'stable' period of time and then a (sudden) spike occurs in the number of topics associated with the document, this may be an indication that the document has been taken over as a 'doorway' document.


Another indication may include the disappearance of the original topics associated with the document. If one or more of these situations are detected, then [Google] may reduce the relative score of such documents and/or the links, anchor text, or other data associated the document."

Read the whole patent application here

Tuesday, October 23, 2007

Google filters new sites

I recently received an e-mail about the controversy around Google PageRank and how long it takes for Google to list a new page. I thought it was quite interesting, so here is a summary of the e-mail.

Apparently Yahoo and MSN find and list new pages approximately three weeks after launch. Google, however, can take about 12 weeks to categorize and list a new page. Earlier speculation that Google simply blocks out new pages with some kind of filter appears to be true.

Trying to figure out how Google lists pages is quite a science, and it is very hard to actually prove how its search algorithm functions in reality. We can only speculate and test up to a certain point; this, however, is also what makes SEO (Search Engine Optimization) such an interesting "science".

The timeline illustrates the birth of a webpage and how long it takes for the search engines' crawlers to crawl and list the site.