Moreover: if you don’t have to, don’t change your URLs. Even if your URLs aren’t “pretty,” if you don’t feel they’re negatively impacting users or your business in general, don’t change them to be more keyword-focused for “better SEO.” If you do have to change your URL structure, make sure to use the proper (301 permanent) type of redirect. Forgetting this is a common mistake businesses make when they redesign their websites.
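As an illustration, a 301 can be declared at the server level. A minimal sketch for an Apache .htaccess file, with hypothetical paths and domain:

```apache
# Permanently redirect one moved page (paths are examples)
Redirect 301 /old-page/ https://www.example.com/new-page/

# Or, with mod_rewrite enabled, redirect a whole renamed directory
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

A 302 (temporary) redirect would not pass the old URL's accumulated link equity the same way, which is why the permanent type matters here.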
In 2007, Google announced a campaign against paid links that transfer PageRank.[29] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, to prevent SEO service providers from using nofollow for PageRank sculpting.[30] As a result of this change, the PageRank that would previously have been sculpted through nofollowed links simply evaporated. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript.[31]
Websites produce traffic rankings and statistics based on those people who access the sites while using their toolbars and other means of online measurement. The difficulty with this is that it does not capture the complete traffic picture for a site. Large sites usually hire the services of companies such as Nielsen NetRatings or Quantcast, but their reports are available only by subscription.
SEMrush has a relatively new feature that lets you quickly see the highest-trafficked pages for a given domain. It’s a bit buried, so it’s easy to miss, but it’s a no-brainer shortcut for quickly unveiling the topics with massive traffic. Unfortunately it doesn’t immediately show traffic or traffic cost, but one extra step will solve that for you.
If you've never been on Product Hunt before, it's like a daily Reddit feed for new products. Products get submitted to the community and they're voted on. Each day, products are stacked in descending order based on how many votes they've received. Ranking at the top of the daily list can send thousands of conversion-focused visitors to your site, as the creator of Nomad List found out.
Guest blogging is a legitimate way to create a press mention (and link) on a credible website. It’s a powerful but sensitive tactic that starts with empathy for the needs of the editor and their target audience. When used properly, a high-value article is published in your name on another site and potentially links back to the in-depth article on your website.
Generating organic traffic is far from easy, since you have to consistently update your blog with new and unique content. The less frequently you update your blog, the lower your articles sit in Google’s index. I would also suggest that your domain name contain a targeted keyword. If someone searches a set of keywords that are not only a topic title on your site but also part of your domain name, you have a really good chance of being displayed in Google’s top ten.
This topic seems actually quite controversial. Google answered the question by what could be taken as a denial. But their answer was kind of open to interpretations. And on the other hand, there are studies (one of them from Moz) that showed linking out has an impact. So, how can you be so assertive? Is it something that comes out from your own experiments?

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
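As an illustration, structured data is commonly embedded as schema.org markup in JSON-LD form. A minimal sketch for a hypothetical product page (all names and values here are invented):

```html
<!-- JSON-LD structured data describing a hypothetical product -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "A sample product used to illustrate structured data.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```

Markup like this is what makes search engines eligible to show rich results (prices, ratings, availability) alongside your listing.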
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[60] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[61] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[62] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.

But of course, in a world that runs on efficiency and efficacy, it’s hard to demand all of that from you. This is where the option to buy website traffic comes in. When you choose to buy, you’re basically passing on the responsibility to someone else. You’re entrusting another company with the task of making sure that your website does generate the necessary amount of website traffic.


The actual content of your page itself is, of course, very important. Different types of pages will have different “jobs” – your cornerstone content asset that you want lots of folks to link to needs to be very different than your support content that you want to make sure your users find and get an answer from quickly. That said, Google has been increasingly favoring certain types of content, and as you build out any of the pages on your site, there are a few things to keep in mind:


Yesterday I was redoing our process for ideas and Alltop was a part of it. I have always known it was a bit spammy (some of my grey sites are featured), but now it seems way too bad. You have places like the New York Times next to some random AdSense blog. Guy Kawasaki really needs to start giving some sort of influence ranking, or at least culling the total crap ones.
Like the hundreds of people already, I thought this was an amazing post. You have a great way of breaking things down into ways that the average reader will be able to understand and make actionable. I think this is a great resource for our readers, so I included it in my monthly roundup of the best SEO, social media, and content marketing articles. https://www.northcutt.com/blog/2014/02/january-resource-round-up-the-best-of-seo-social-media-and-content-marketing/

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[63] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[64] As of 2006, Google had an 85–90% market share in Germany.[65] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[65] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[66] That market share is achieved in a number of countries.


You know, the plus side of that: it’s free, so you’re not risking anything. The downside to organic traffic, or free traffic, is that we see it done poorly so many times. You see the people who are spamming YouTube comments, spamming Facebook Groups, trying to get their business or their affiliate links out there. But this is not the way to go about it. There’s a right way and a wrong way, and this is the wrong way. When you’re just spamming your links everywhere, don’t expect any sales anytime soon.

Hi there, I’m interested in trying your trick on Wikipedia, but I’m not sure how I should do that, because I read some posts saying: “Please note that Wikipedia hates spam, so don’t spam them; if you do, they can block your IP and/or website URL. Check their blocking policy, and if they blacklist you, you can be sure that Google may know about it.”

Thanks for this timely article. If I understand it correctly, are you saying that we would be better off looking at market data in our niche and making an article of that for influencers to share, rather than actionable tips that the target clients would be interested in? Shouldn’t there be a double strategy: articles for the influencers to share and articles for the users to enjoy?

Though a long break is never suggested, there are times that money can be shifted and put towards other resources for a short time. A good example would be an online retailer. In the couple of weeks leading up to the Christmas holidays, you are unlikely to get more organic placement than you already have. Besides, the window of opportunity for shipping gifts to arrive before Christmas is ending, and you are heading into a slow season.
Great post Ross, but I have a question on scaling the work that goes into producing the KOB score: how do you recommend going about getting the Moz difficulty score? Do you do it manually and then VLOOKUP everything, or some other way? My current Moz membership allows 750 keyword-difficulty searches a day, so this can be a limiting factor in this research. Would you agree?
The other form is just social media. So, this is just social. This is using platforms such as Facebook, you know, Facebook Groups, Instagram, YouTube, using social media to promote your business, your links, or whatever. So, to me, the quality of the traffic that you get from social really depends on the platform. Honestly, I’d say something like Twitter would not get you very high quality traffic. In fact, traffic from Twitter, in my experience, is usually very low quality, which means low conversions, and no sales, basically. So, you know, this isn’t a hard-and-fast rule, but something like YouTube, on the other hand, is very high converting. So, basically, you’ve got social, you’ve got SEO, and this is organic traffic. It’s free.

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[46]
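As a sketch of the pattern described above, a robots.txt at the domain root might look like this (the paths are examples, not recommendations for any specific site):

```text
# https://www.example.com/robots.txt (illustrative paths)
User-agent: *
Disallow: /cart/      # shopping-cart and checkout pages
Disallow: /search     # internal search-result pages
```

Note that robots.txt only controls crawling; a page that is blocked here but linked from elsewhere can still appear in an index, which is why the robots meta tag exists as a separate, page-level exclusion mechanism.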
I’ve always been one to create great content, but now I see it may not necessarily be the right content. Can Share Triggers work for all niches, including things like plumbing companies, computer repair, maybe even handymen that have a website for their business? I would say I’m getting an estimated half the views a month that I should. Hopefully some of these strategies will help.
Hi SEO 4 Attorneys, it could be anything. Is this for your site or for a client's site? It could be an attempt at negative SEO from a competitor. The thing is, people may try to push hundreds of spammy links to a site in hopes of knocking it down. At the end of the day, my best advice is to monitor your link profile on a weekly basis. Try to remove negative links where possible; if you can't remove them, then opt for the disavow tool as a last resort.

What kind of advice would you give if your site is growing but seems to be attracting the wrong kind of traffic? My visitor numbers are going up, but all other indicators, such as bounce rate, time on page, and pages per visit, seem to be developing in the wrong direction. Not sure if that’s to be expected or if there is something that I should be doing to counter that development?
How you mark up your images can impact not only the way that search engines perceive your page, but also how much search traffic from image search your site generates. An alt attribute is an HTML element that allows you to provide alternative information for an image if a user can’t view it. Your images may break over time (files get deleted, users have difficulty connecting to your site, etc.) so having a useful description of the image can be helpful from an overall usability perspective. This also gives you another opportunity – outside of your content – to help search engines understand what your page is about.
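A minimal sketch of the markup described above, with an invented filename and description:

```html
<!-- A descriptive alt attribute; filename and text are examples only -->
<img src="/images/golden-retriever-puppy.jpg"
     alt="Golden retriever puppy playing fetch in a park">
```

If the file fails to load, the alt text is displayed (or read aloud by a screen reader) in place of the image, and it gives search engines a plain-text signal about the image's subject.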

One common scam is the creation of "shadow" domains that funnel users to a site by using deceptive redirects. These shadow domains often will be owned by the SEO who claims to be working on a client's behalf. However, if the relationship sours, the SEO may point the domain to a different site, or even to a competitor's domain. If that happens, the client has paid to develop a competing site owned entirely by the SEO.


Hi Brian! I enjoy reading your posts and use as much info as I possibly can. I build and sell storage sheds and cabins. The problem I have is that there are no top bloggers in my market or wikipedia articles with deadlinks that have to do with my market. 95% of my traffic and sales are generated via Facebook paid advertising. Would love to get more organic traffic and would be interested in your thoughts concerning this.
Earlier in the comment stream, there was a brief discussion about page load time/website speed and its effect on page ranking. I have tried to find unbiased information about which hosting company to use when starting a blog or a small WordPress sites, keeping in mind the importance of speed. This endeavor has been harder than expected as most hosting review sites have some kind of affiliate relationship with the hosting companies they review.

An 11th point to me would be to look at your social media properties and work out how you can use them to assist your SEO strategy. I mean, running competitions via social channels to drive SEO benefit to your main site is great, as is redoing your YouTube videos to assist the main site, and working on your content-sharing strategy from these social sites back to the main site.
Well, yes and no. Sure, you can get hit with an algorithm change or penalty that destroys all your traffic. However, if you have good people who know what they are doing, this is not likely to happen, and if it does, it is easy (in most cases) to get your visits back. Panda and Penguin are another story, but if you get hit by those it is typically not accidental.

The thing about SEO in 2018 is that Google changes its algorithms more than once a day! Reports say that the company changes its algorithms up to 600 times a year. While the majority of those updates consist of smaller changes, among them is the occasional, major update like Hummingbird or Panda that can really wreak havoc with your traffic and search rankings.
This post and the Skyscraper Technique changed my mind about how I approach SEO. I’m not a marketing expert and I haven’t ranked sites that monetize really well; I’m just a guy trying to get some projects moving, and I’m not even in the marketing business. I just wanted to say that the way you write makes the information accessible, even if, like me, you’re not a native English speaker.
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.

To prevent some users from linking to one version of a URL and others linking to a different version (which could split the reputation of that content between the URLs), focus on using and referring to one URL in the structure and internal linking of your pages. If you do find that people are accessing the same content through multiple URLs, setting up a 301 redirect from the non-preferred URLs to the dominant URL is a good solution. You may also use the rel="canonical" link element if you cannot redirect.
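As a sketch of the non-redirect option, each duplicate URL's page can declare the preferred version in its head (the URL here is an example):

```html
<!-- In the <head> of every duplicate variant, point at the preferred URL -->
<link rel="canonical" href="https://www.example.com/dresses/green-dress">
```

Search engines treat this as a strong hint to consolidate ranking signals onto the canonical URL, though unlike a 301 redirect it does not change what users see in their address bar.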

Web traffic is the amount of data sent and received by visitors to a website. This does not necessarily include the traffic generated by bots. Since the mid-1990s, web traffic has been the largest portion of Internet traffic.[1] It is determined by the number of visitors and the number of pages they visit. Sites monitor the incoming and outgoing traffic to see which parts or pages of their site are popular and whether there are any apparent trends, such as one specific page being viewed mostly by people in a particular country. There are many ways to monitor this traffic, and the gathered data is used to help structure sites, highlight security problems, or indicate a potential lack of bandwidth.
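A minimal sketch of this kind of monitoring: counting page views per path from access-log entries while excluding known bots. The log sample, paths, and bot list are all invented for illustration.

```python
from collections import Counter

# Tiny made-up access-log sample: (path, user_agent) pairs
LOG_LINES = [
    ("/home", "Mozilla/5.0"),
    ("/pricing", "Mozilla/5.0"),
    ("/home", "Googlebot/2.1"),   # bot traffic, excluded below
    ("/home", "Mozilla/5.0"),
]

def page_views(lines):
    """Count views per path, ignoring requests from known bots."""
    bots = ("Googlebot", "bingbot")
    return Counter(
        path for path, agent in lines
        if not any(bot in agent for bot in bots)
    )

views = page_views(LOG_LINES)
print(views.most_common(1))  # prints [('/home', 2)]
```

Real log-analysis tools apply the same idea at scale, typically with much larger bot lists and per-country or per-referrer breakdowns layered on top.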


For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I’ve found that Google’s robot often still crawls these pages. How can I quickly get Google to completely remove them? I have also removed these URLs from Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.
Since heading tags typically make the text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Using multiple heading sizes in order creates a hierarchical structure for your content, making it easier for users to navigate through your document.
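A minimal sketch of that hierarchy, with invented topic names (indentation is only to make the nesting visible):

```html
<!-- Headings used in descending order to form a hierarchy -->
<h1>Web Traffic Basics</h1>
  <h2>Organic Traffic</h2>
    <h3>Measuring Organic Visits</h3>
  <h2>Paid Traffic</h2>
```

The convention is one h1 for the page's main topic, with h2 and h3 tags nesting subtopics beneath it rather than being chosen for their visual size alone.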
The term “organic traffic” is used for referring to the visitors that land on your website as a result of unpaid (“organic”) search results. Organic traffic is the opposite of paid traffic, which defines the visits generated by paid ads. Visitors who are considered organic find your website after using a search engine like Google or Bing, so they are not “referred” by any other website.