From Wikipedia’s own website: “The greater the number and quality of Wikipedia articles, the greater the number of people will link to us, and therefore the higher the rankings (and numbers of listings) we’ll have on Google. Hence, on Wikipedia ‘the rich (will) get richer’; or ‘if we build it, they will come,’ and in greater and greater numbers.”
In March 2000, Nupedia was launched around articles written by experts and reviewed under a formal process. In January 2001, a feeder project built on a wiki, with the goal of producing a publicly editable encyclopedia, gave birth to Wikipedia. By the end of 2001, with roughly 20,000 articles, Wikipedia had gained serious ground with search engines and quickly overshadowed all but three websites in terms of SERP visibility.
Indeed, we already knew that collaborative writing can create vast amounts of information. However, Wikipedia’s organic search success is due to more than just content and built-in quality control processes.
1. Platform and automation: Highly search-engine-optimized pages, navigation, technical readability, and linking structure are all vital. Two hundred and fifty million internally optimized links help Wikipedia’s SEO efforts tremendously. Search-optimized internal links across related, high-quality pages make a world of difference, but manual coding is not a realistic way to build deep links consistently over time.
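To make the automation idea concrete, here is a minimal sketch, in Python, of how a centrally maintained keyword-to-URL map could inject internal deep links into page HTML at publish time. The keyword map, URLs, and sample page are hypothetical, and this is not how any particular platform implements it.

```python
import re

# Hypothetical keyword-to-URL map, maintained centrally for the whole network of sites.
INTERNAL_LINKS = {
    "link equity": "https://www.example.com/glossary/link-equity",
    "internal linking": "https://www.example.com/guides/internal-linking",
}

def add_internal_links(html_body: str, max_links_per_keyword: int = 1) -> str:
    """Turn the first occurrence of each mapped keyword into an internal deep link."""
    for keyword, url in INTERNAL_LINKS.items():
        pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
        # Keep the original casing of the matched text; naive version that does not
        # check whether the keyword already sits inside an existing <a> tag.
        html_body = pattern.sub(
            lambda match: f'<a href="{url}">{match.group(0)}</a>',
            html_body,
            count=max_links_per_keyword,
        )
    return html_body

if __name__ == "__main__":
    page = "<p>Good internal linking preserves link equity across related pages.</p>"
    print(add_internal_links(page))
```

The point is not this particular script but the principle: linking rules live in one place and are applied mechanically across every page, instead of relying on writers to remember them.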
You also need additional flexibility because your websites and lead-generation mini-sites are not geared towards being know-it-all encyclopedias. The good news is that you can augment your clout with search engines by segmenting your content across a number of domains (links spread across root domains are a sign of quality). You need a platform that scales your efforts across multiple domains.
Wikipedia uses MediaWiki. What do you use?
Yes, we know WordPress is a great blogging platform, but, no, it is not an option for scaling search engine visibility across a vast number of website properties.
2. Quality is key. Content should at least equal that of the sites you wish to beat in the SERPs. But this is the only point on this list that requires ongoing thinking on your part, in the absence of hundreds of thousands of Wikipedians. Interns or offshore writers might be good sources to cost-effectively meet the quality threshold.
3. Volume matters. Thirteen million articles filled with original, relevant content are bound to yield a good level of visibility: Wikipedia counts 165 million inbound links.
But what is less well known is that every page starts with a small nominal amount of PageRank. As a result, the more pages you have, the more PageRank you create for yourself, and that is PageRank you can pass around throughout your own network of pages and websites. In short, you can keep most of it to yourself, as the toy calculation below suggests. That’s where the next point comes in.
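As a rough illustration of how internal links keep that value circulating inside your own network, here is a toy PageRank calculation on two hypothetical link graphs: a “closed” site that links only to its own pages, and a “leaky” one that also links out. The graphs are made up, and the damping factor of 0.85 is simply the commonly cited default.

```python
# Toy PageRank iteration. "closed" keeps every link internal; "leaky" sends
# followed links to an external page, so part of the rank drains away.

def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        new_rank = {}
        for page in graph:
            incoming = sum(
                rank[src] / len(links)
                for src, links in graph.items()
                if page in links
            )
            new_rank[page] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

closed = {"home": ["about", "blog"], "about": ["home"], "blog": ["home", "about"]}
leaky = {"home": ["about", "ext"], "about": ["ext"], "blog": ["home"], "ext": []}

print(pagerank(closed))  # rank stays distributed among your own pages
print(pagerank(leaky))   # a chunk of the rank accumulates on the external page
```

Run it and the “ext” page in the leaky graph ends up holding rank that, in the closed graph, would have stayed on your own pages: that is the behavior the next point is about managing.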
4. Manage link equity. Wikipedia works as a vortex that sucks in inbound link equity (a.k.a. Google Juice) from outside the network (see the opening statement) and never sends it back, thanks to the systematic implementation of the infamous rel=”nofollow” attribute. Follow links within your own corpus of websites, follow contextual outbound links to authoritative websites, and apply nofollow to the rest. You can automate most of this too.
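As a sketch of what that automation could look like, the snippet below uses BeautifulSoup to mark every outbound link as nofollow unless its host belongs to your own network or to a hypothetical allow list of authoritative domains. The domain names are placeholders, and a production rule set would also need to handle subdomains and relative URLs.

```python
from urllib.parse import urlparse

from bs4 import BeautifulSoup  # third-party package: beautifulsoup4

# Hypothetical allow list of authoritative domains whose outbound links stay followed.
FOLLOWED_DOMAINS = {"wikipedia.org", "w3.org"}
# Placeholder list of your own network of sites; internal links always stay followed.
OWN_DOMAINS = {"example.com", "example-minisite.com"}

def apply_link_equity_rules(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        host = urlparse(anchor["href"]).netloc.removeprefix("www.")
        if not host or host in OWN_DOMAINS or host in FOLLOWED_DOMAINS:
            continue  # internal or authoritative links keep passing equity
        anchor["rel"] = "nofollow"  # everything else gets nofollow
    return str(soup)

if __name__ == "__main__":
    sample = '<p><a href="https://www.randomsite.net/page">ref</a></p>'
    print(apply_link_equity_rules(sample))
```

Applied at render or publish time across every site in your network, a rule like this turns link-equity management into a default behavior rather than an editorial chore.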
Pick a technical framework built to scale, so that you only have to manage exceptions to the rules. Then write whatever amount of quality content you can muster, and augment volume over time. For this, I would obviously not recommend anything other than SEO Samba, the first multi-site SEO execution platform, or SEO Software as a Service. Success breeds more success.