History of the Hyperlink
- 1965: The term “hyperlink” was coined by Ted Nelson and his assistant Calvin Curtin at the start of Project Xanadu. Nelson had been inspired by “As We May Think”, a popular essay by Vannevar Bush. In this essay, Bush described a microfilm-based machine in which one could link any two pages of information into a “trail” of related information, and then scroll back and forth among pages.
- 1964 through 1980: Nelson transposed Bush’s concept of automated cross-referencing into the computer context, made it applicable to specific text strings rather than whole pages, generalized it from a local desk-sized machine to a theoretical worldwide computer network, and advocated the creation of such a network.
- 1966: Working independently, a team led by Douglas Engelbart was the first to implement the hyperlink concept for scrolling within a single document.
- 1968: With NLS, Engelbart’s team extended links to connect paragraphs across separate documents.
- 1987: HyperCard, a database program for the Apple Macintosh, was released, allowing hyperlinking between various types of pages within a document.
- August 6, 1991: A team of CERN engineers led by Sir Tim Berners-Lee launched the first website, providing the first tangible demonstration of the World Wide Web’s potential.
- 1991 – 1998: Website owners bought links purely for traffic, structuring their purchases much like a media buy; manipulating search engine rankings was not yet a motive.
- 1995: Larry Page and Sergey Brin meet at Stanford. According to some accounts, they disagree about almost everything during this first meeting.
- 1998: Google sets up workspace in Susan Wojcicki’s garage at 232 Santa Margarita, Menlo Park.
- 1998 – today: As Google gained popularity, it became the focus of the SEO community. SEOs began to learn the value of links as they pertained to rankings, and how links were used changed forever. They focused on PageRank (a link analysis algorithm, named after Larry Page, that assigns a numerical weight to each document in a hyperlinked set in order to “measure” its relative importance within that set) to try to increase a page’s PageRank, and ultimately its rank within the search engines.
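The idea behind PageRank can be sketched in a few lines: each page splits its score evenly among the pages it links to, and the process is iterated until the scores settle. This is a simplified illustration with a made-up three-page graph, not Google’s production algorithm.

```python
# Minimal PageRank sketch via power iteration.
# The graph, damping factor, and iteration count are illustrative assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
scores = pagerank(graph)
# "home" ends up with the highest weight, since every other page links to it.
```

Because every page votes with its own score, a link from an already well-linked page is worth more than one from an obscure page, which is exactly the property SEOs began chasing.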
Types of Links
A reciprocal link is a mutual link between two objects, most commonly between two websites, intended to ensure mutual traffic. Links like these were used in some early link-manipulation schemes to inflate PageRank. They are easy to spot: they usually sit on pages called “links” and contain a jumble of mostly off-topic links that look like spam.
Resource Links are a category of links, which can be either one-way or two-way, and are usually referenced as “Resources” or “Information” in navbars, but sometimes, especially in the early, less compartmentalized years of the Web, are simply called “links”.
Editorially Given Links
One-way, editorially given links are among the most valuable links in the eyes of the search engines. Located within the main body of content, they are also called inline links. Provided they are genuinely editorial and not purchased, they carry high value and a low risk of being devalued by the search engines, in part because the text surrounding the link gives it context.
Footer or Sidebar (Blog Roll) Links
These are exactly what they sound like. They are links placed in the footer and/or sidebar of a site. In most cases, these links are devalued by the search engines since they are usually ‘run-of-site’ links (meaning they are on most, if not all, pages on the website) and are a common place to put paid links.
Forum Signature Links
These links are built by participating in forum communities that allow outbound hyperlinks in a member’s signature. Done well, this type of linking can be valuable for driving traffic, and in some cases it can add traditional link metrics.
Blog Comment Links
Most of the time, links left as comments on a blog get the rel="nofollow" attribute added to them. Most black hat SEOs see little value in these types of links because they add no direct SEO value; however, that mentality is shortsighted. If a comment is well thought out and contributes to the ongoing discussion, its click-through rate can feed the user-metrics module of a search engine’s ranking algorithm and thus has the opportunity to influence rankings.
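From a crawler’s perspective, nofollow is just a token in the link’s rel attribute. A rough sketch of how a crawler might separate followed from nofollowed links, using only the Python standard library (the HTML snippet and class name are illustrative):

```python
# Sketch: classify links as followed vs. nofollowed by inspecting rel="...".
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        # rel can hold multiple space-separated tokens, e.g. "nofollow ugc".
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel_tokens:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

page_html = '''
<p>Great post! See <a href="https://example.com/spam" rel="nofollow">my site</a>.</p>
<p>Related reading: <a href="https://example.com/editorial">an editorial link</a>.</p>
'''
parser = LinkParser()
parser.feed(page_html)
# Comment links typically carry rel="nofollow"; editorial in-content links do not.
```

A search engine that honors the attribute would simply skip the `nofollowed` list when passing ranking signals, which is why comment links rarely move rankings directly.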
Website directories are lists of links to websites, which are sorted into categories with the goal of making these sites easier for users to find. Website owners can submit their site to many of these directories. Some directories accept payment for listing in their directory, while others are free.
Social and Profile Links
This includes links within social media profiles and links shared on sites such as Twitter or Facebook. Links shared via social media are almost all nofollowed, but they carry value for PR and for improving rankings for logged-in Google users. Links on Twitter have also been shown to increase indexation rates, which suggests that Google can and will follow nofollowed links to discover new content, even though those links pass no ranking metrics.
3 Benefits of External Links
With the emergence of Google, links became a primary signal that search engines used to determine rankings within the search results. Google is believed to determine rankings based on two broad factors: relevancy to a search term and popularity. Many have therefore described an external link as a “vote of popularity” for a site, making links a loosely structured voting system for pages. The weight of each vote is believed to depend on factors such as:
- Relevancy of sites linking in
- Relevancy of pages linking in
- Quality of sites linking in
- Quality of web page linking in
- Link profile diversity
- Anchor text diversity
- Different IP addresses of linking sites
- Different TLDs
- Diversity of link placements
- Authority Link Per Inbound Link
- Backlinks from bad neighborhoods
- Social media links ratio
- Backlinks trends and patterns
- Citations in Wikipedia and DMOZ
- Backlinks from social bookmarking sites
Indexing and Crawl Rate
Crawl rate and indexation are among the most valuable indirect benefits of increasing the number of links to a website. As the total number of quality links to a site increases, so do the entry points that Google and the other search engines have for accessing and crawling it. Quality websites (which themselves have quality links pointing to them) get their pages spidered more often, so when they link to your site, the search engines can follow those links and discover new content on your pages sooner.
For large sites that target long-tail keywords, such as model numbers or terms with low search volume and low competition but high conversion rates, indexation in Google’s primary index is key to increasing sales. Deep links into the site can deliver the small boost of link metrics these long-tail pages need to stay in the primary index.
Crawl Rate and Depth
Crawl rate, and how deep Google will crawl your site, is partially based on your link graph and the number of deep links your site receives.
Googlebot will crawl a site starting with the most important pages (which have the most links pointing to them), down to the pages that have the least value. At some point Google will determine that a page does not have enough “importance” for them to waste their time crawling or indexing the content. – Matt Cutts
The key to getting Google to crawl your site as completely as possible is to develop high value deep links from external websites.
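The prioritization Cutts describes can be illustrated with a toy crawl-budget model: rank every known page by inbound link count and crawl from the top until the budget runs out. This is an assumption-laden illustration, not Google’s actual algorithm; the site graph and budget are made up.

```python
# Toy crawl-budget sketch: crawl pages in descending order of inbound links
# and stop when the budget runs out. Illustrative only.
from collections import Counter

def crawl_order(links, budget):
    """links: dict page -> outlinks. Returns up to `budget` pages,
    most-linked-to first."""
    inbound = Counter()
    for page, outlinks in links.items():
        inbound[page] += 0              # ensure every page appears, even with 0 inlinks
        for target in outlinks:
            inbound[target] += 1
    ranked = sorted(inbound, key=lambda p: -inbound[p])
    return ranked[:budget]

site = {
    "home": ["category", "deep-page"],
    "category": ["deep-page", "orphan-ish"],
    "deep-page": [],
    "orphan-ish": [],
}
# With a budget of 2, only the most-linked pages get crawled; an external
# deep link pointing at "orphan-ish" would lift it into the crawled set.
print(crawl_order(site, budget=2))
```

In this model, an external deep link is exactly what moves a weakly linked page above the cutoff, which is the argument for building deep links rather than only homepage links.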
Traffic and Branding
Before Google, the big push for getting links was based primarily on driving traffic, and links were treated much like a traditional media buy. Sites were analyzed on how much traffic they received, and higher-traffic sites could command higher link prices. Today this mentality is less common among link builders and falls more to traditional media-buying teams, who use banners and images to drive traffic and branding. Although, IMO, this should still be a focus for link builders, since user metrics are becoming more important as search engines try to determine the value of a website.