Because so few ordinary users (38% according to Pew Research Center) realized that many of the highest placed "results" on search engine results pages (SERPs) were ads, the search engine optimization industry began to distinguish between ads and natural results. The perspective among general users was that all results were, in fact, "results." So the qualifier "organic" was invented to distinguish non-ad search results from ads.
In an ideal world, I really wish that online content had some sort of gauge or rating system, like books or movies or journalism, that rewarded content for being well-written, well-researched, or groundbreaking. It’s too easy to fool Google into thinking you have “good” content. As a writer turned content marketer, it’s painful to see what Google sometimes rewards as “good” content.

SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, which includes spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[49] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[50]


In our research with what we have done for ourselves and our clients, there is a definite correlation between content greater than 1,000 words and better rankings. In fact, we are finding amazing ranking jumps when you have content over 3,000 words, about 12 original images (images not found anywhere else online), 1 H1 (not keyword stuffed), 12 sub-headlines (H2), 12 relevant internal links, 6 relevant external links and 1 bullet list. I know it sounds like a lot of work and a Big Mac recipe, but this does work.
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I find that Googlebot still often crawls these pages. How can I quickly get Google to completely remove these pages? I have also removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls them.
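For reference, the rule described in the question would look roughly like the sketch below (the /*.html pattern is taken from the question; the comments state widely documented crawler behavior, not a guaranteed removal method). Note that a Disallow rule only blocks crawling; it does not by itself remove URLs that are already indexed.

```
# robots.txt — illustrative sketch only
# Blocks crawling of any URL whose path contains ".html"
User-agent: *
Disallow: /*.html

# A crawl block does not remove already-indexed URLs; deleted pages
# typically need to return 404/410, or carry a noindex directive
# (which requires them to remain crawlable), before they drop out.
```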
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, adding content, and modifying its HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3] In 2015, it was reported that Google was developing and promoting mobile search as a key feature within future products. In response, many brands began to take a different approach to their Internet marketing strategies.[4]

How much does it cost to bring in a visitor? Some web traffic is free, but many online stores rely on paid traffic — such as PPC or affiliates — to support and grow their business. Cost of Acquiring Customers (CAC) and Cost Per Acquisition (CPA) are arguably the two most important ecommerce metrics. When balanced with AOV (average order value) and CLV (customer lifetime value), a business can assess and adjust its ad spend as necessary.
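As a rough illustration of how these metrics interact, here is a minimal sketch in JavaScript, using entirely hypothetical numbers, of the back-of-the-envelope check the paragraph describes: weighing what a customer costs to acquire against what that customer is worth.

```js
// Hypothetical figures for illustration only.
const adSpend = 5000;              // total paid-traffic spend for the period
const newCustomers = 200;          // customers acquired from that spend
const orders = 250;                // total orders placed
const revenue = 15000;             // total revenue from those orders
const lifetimeOrdersPerCustomer = 3; // assumed average lifetime order count

const cac = adSpend / newCustomers;        // Cost of Acquiring Customers: 25
const aov = revenue / orders;              // Average Order Value: 60
const clv = aov * lifetimeOrdersPerCustomer; // crude Customer Lifetime Value: 180

// Simple sanity check: ad spend is only sustainable if CLV comfortably exceeds CAC.
console.log({ cac, aov, clv, sustainable: clv > cac });
```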
“Sharability” – Not every single piece of content on your site will be linked to and shared hundreds of times. But in the same way you want to be careful of not rolling out large quantities of pages that have thin content, you want to consider who would be likely to share and link to new pages you’re creating on your site before you roll them out. Having large quantities of pages that aren’t likely to be shared or linked to doesn’t position those pages to rank well in search results, and doesn’t help to create a good picture of your site as a whole for search engines, either.

When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
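As a concrete illustration (the URL and anchor text are hypothetical), a user-added comment link marked with the nofollow attribute would look like this:

```html
<!-- A comment link marked nofollow so it does not pass the page's reputation -->
<a href="https://example.com/some-commenter-site" rel="nofollow">my site</a>
```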


Excellent post Brian. I think the point about writing content that appeals to influencers is spot on. Could you recommend some good, manual strategies through which I can spot influencers in boring niches (B2B) where influencers are not really talking much online? Is it a good idea to rely on newspaper articles to get a feel for what a particular industry is talking about? Would love to hear your thoughts on that.
Loved this article. I was in stuck mode. Am relatively new to blogging. Started online business and website about 3 months ago. Am learning a lot, but had an ah ha moment about traffic – targeted traffic and found this article. Thanks for the great tips. I hope to implement all of them, and realize how much more there is to each one of the steps you outline. Great stuff!

Everyone wants to rank for those broad two or three word key phrases because they tend to have high search volumes. The problem with these broad key phrases is they are highly competitive. So competitive that you may not stand a chance of ranking for them unless you devote months of your time to it. Instead of spending your time going after something that may not even be attainable, go after the low-hanging fruit of long-tail key phrases.
For our client: We rolled out a successful implementation of rel="author" for the three in-house content writers the company had. The client had over 300 articles written by these content writers over the years, and it was possible to implement rel="author" for all the aged articles. I advise anyone who has a large section of content to do so, as it will only benefit the website. We were also in the process of rolling out further schema markup for the site's course content, as it can only benefit CTR.
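For readers unfamiliar with the markup, a rel="author" byline is a small HTML addition. The sketch below uses a hypothetical author page; the exact implementation for the client may have differed.

```html
<!-- Byline linking the article to its author's page via rel="author" -->
<p>Written by <a href="https://example.com/authors/jane-doe" rel="author">Jane Doe</a></p>
```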
Thanks Brian. I’ve had an “a-ha” moment thanks to you! Great advice. I knew that backlinks would improve the organic SEO rankings of our client-targeted landing pages, but I never knew it was through getting influencers to link to blog posts. I always just assumed it was great content that users wanted to share with others. It was driving me mad why people love my content but never share enough. Now I know!
I would like to thank Ross for this AMAZING post. There are too many internet marketers out there struggling to get traffic. How many people out there with mind-blowing websites that the world NEEDS that will never get enough traffic to get their ideas out to the public? How many people stuck at 9 to 5’s struggling to make money online only because they just CAN’T GET TRAFFIC? This is an extremely thoughtful post. The world needs more people who would create an article like this that could help the struggling moms out there trying to make money online.
While SEOs can provide clients with valuable services, some unethical SEOs have given the industry a black eye by using overly aggressive marketing efforts and attempting to manipulate search engine results in unfair ways. Practices that violate our guidelines may result in a negative adjustment of your site's presence in Google, or even the removal of your site from our index.

Here’s the thing: web traffic is basically the test of how much attention and noise your website is making throughout the whole virtual world. Thus, if you can generate a lot of website traffic, then what this basically means is that you’re getting a lot of attention (hopefully the positive kind, of course). But if you’re not generating that much traffic, what could you be doing wrong?
In regards to the “Read more” button on mobile: isn’t the page loaded asynchronously in the code (which Google’s bots look at), meaning that the whole page is in FACT already loaded in the background, just not in the frontend, meaning Google can read the content without being stopped by the button? Making it only a UI thing. How sure are you on the statement that mobile-first will have an issue with this?
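To make the question concrete, here is a minimal, hypothetical sketch of the “Read more” pattern being described: the full text is already present in the HTML that is served, and the button only toggles its visibility, so a crawler parsing the source can see the content even though a user initially does not.

```html
<!-- Full content is in the served HTML; the button only changes CSS visibility -->
<div id="full-text" style="display: none;">
  ...rest of the article, already in the DOM...
</div>
<button onclick="document.getElementById('full-text').style.display = 'block';">
  Read more
</button>
```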

When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".


In short, press request alerts are requests for sources of information from journalists. Let's say you're a journalist putting together an article on wearable technology for The Guardian. Perhaps you need a quote from an industry expert or some products that you can feature within your article? Well, all you need to do is send out a request to a press service and you can wait for someone to get back to you.
This example illustrates why marketing metrics such as web traffic cannot be viewed in a vacuum. Two contrasting websites can end up with the same outcome because both fail to capitalize on what they do well. Focusing only on the one metric where a site excels fails to acknowledge the area for improvement. By studying the whole picture and optimizing areas of subpar performance, ecommerce stores give their customers the best possible experience while maximizing revenue.
Click fraud is an increasing problem for any business running digital advertising campaigns on Google AdWords or Bing Ads, with an estimated 25% of ad clicks being fraudulent in 2017. That's a quarter of your digital advertising spend being lost to clicks by bots, click farms and competitors. Reduce your paid traffic costs by up to 25% by detecting click fraud and reporting it to your ad network.
Thanks Brian for your article. I am in the healthy living niche. I want to team up with bloggers in my own niche where we can share material; it makes sense to me. But I have my own unique message and that is what I have been devoted to! Dah! I see now that my focus should be on what is popular among my peers and add to this. I think I’m finally getting the picture! I am specifically into FOOD MEDICINE, so perhaps I should start writing about the dangers of a gluten-free diet! Not for everyone!
Nice post. I was wondering if all the content in your strategy was written in the blog of the site, or if you added content to some other specific parts of the sites. I don't believe 100% in the strategy of removing links. If Google just penalized you based on your inbound links, it would be so easy to attack your competitors just by buying dirty link packages targeting their sites.

8. Technical SEO. Technical SEO is one of the most intimidating portions of the SEO knowledge base, but it’s an essential one. Don’t let the name scare you; the most technical elements of SEO can be learned even if you don’t have any programming or website development experience. For example, you can easily learn how to update and replace your site’s robots.txt file, and with the help of an online template, you should be able to put together your sitemap efficiently.
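As an example of how approachable these files are, a minimal robots.txt and a minimal sitemap entry might look like the sketch below (example.com and the listed URL are placeholders, not a template for any particular site).

```
# robots.txt — minimal sketch: allow all crawling and point to the sitemap
User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml — minimal sketch with a single URL entry -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
</urlset>
```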
In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, the usage of nofollow led to the evaporation of PageRank. In order to avoid the above, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[31]
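To illustrate the distinction the paragraph draws (a simplified, hypothetical sketch, not a recommendation), a nofollowed link and a JavaScript-based replacement might look like this:

```html
<!-- A standard nofollowed link: crawlers see the target but pass no PageRank -->
<a href="https://example.com/terms" rel="nofollow">Terms</a>

<!-- A JavaScript-driven alternative: no crawlable href, so the link is
     effectively invisible to link-graph calculations -->
<span class="pseudo-link" onclick="window.location.href='https://example.com/terms'">Terms</span>
```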