For many, SEO is still seen as a bit of a dark art, with people huddled in smoky rooms trying to game Google and get their site ahead of the competition at all costs.
If only our daily working life resembled a Hollywood film that closely, but alas, it doesn’t. Instead, it basically involves ensuring our client sites have great content that fully answers users’ needs – useful, relevant, original and set up in a way that the search engines can easily access.
But plenty of blogs still exist perpetuating the same tired old tactics that either worked 15 years ago but haven’t been relevant for more than a decade, or that never really did anything except get your site blacklisted.
In the old days, we’d have to explain the risks of the so-called ‘black hat’ techniques. But more recently we’ve needed to allay the fears of clients who’ve read a series of SEO blogs and are panicking that their site is not fit for purpose when it comes to ranking.
So we thought it would be useful to highlight some of the myths that are out there and explain the truth behind them.
Submitting a site to search engines – search engines crawl billions of URLs every day, spreading out from site to site based on who links to who. This allows the spiders to work methodically through sites, cataloguing them as they go. By all means, feel free to submit your site to the search engines – it will likely help Google find new content faster and easier, which will mean your shiny new post at least has the chance to rank. But your site is capable of being indexed without doing it – and Google themselves say (on the very screen where you submit your URL) that it doesn’t impact on whether or when you will be indexed.
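If you’re curious what that ‘spreading out from site to site’ looks like in practice, here’s a minimal sketch of the discovery step, using only Python’s standard library. The URL is a placeholder – point it at a page you control.

```python
# A toy illustration of how a crawler discovers pages: fetch one URL,
# pull out every link, and queue the new ones -- no submission required.
# 'https://example.com' is a hypothetical starting point.
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href of every anchor tag we encounter
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

start_url = "https://example.com"  # placeholder URL
html = urlopen(start_url).read().decode("utf-8", errors="replace")

parser = LinkExtractor()
parser.feed(html)

# Each link found here would become the next page in the crawl queue.
for link in parser.links:
    print(link)
```

A real crawler layers politeness rules, robots.txt checks and deduplication on top of this, but the principle is the same: links, not submissions, are how new pages get found.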
PDFs are no good for SEO – there was a time when Google struggled to rank PDFs as it was unable to get into the files and read the content. That time, however, was loooong ago. As long as the PDF is set up correctly, Google is more than capable of reading and indexing it. If it’s not set up correctly, it won’t rank – but guess what, neither will a standard web page.
Typically, we wouldn’t recommend clients rely on a PDF alone for important content, but instead have an HTML landing page with the option to download the PDF if needed. However, we usually say this for user experience rather than SEO reasons, and there are undoubtedly certain bits of content that work much better in PDF format. If that’s the case, use it. It’s more than capable of holding its own in the search results.
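If you want a quick sanity check on a PDF before worrying about its rankings, a rough sketch like this covers the HTTP basics. The URL is hypothetical, and note that ‘set up correctly’ also covers things this doesn’t test, like the PDF containing real text rather than scanned images.

```python
# Check that a PDF is reachable and served with the right Content-Type --
# two of the basics a search engine needs before it can index the file.
# The URL is a placeholder; point it at a PDF on your own site.
from urllib.error import HTTPError
from urllib.request import Request, urlopen

pdf_url = "https://example.com/whitepaper.pdf"  # hypothetical URL

try:
    with urlopen(Request(pdf_url, method="HEAD")) as resp:
        content_type = resp.headers.get("Content-Type", "")
        if "application/pdf" in content_type:
            print("Served as application/pdf -- fine at the HTTP level.")
        else:
            print(f"Unexpected Content-Type: {content_type}")
except HTTPError as err:
    print(f"Request failed with status {err.code} -- fix this first.")
```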
You can only rank well if you also do Google Ads (formerly AdWords) – this one has been around for as long as Google has offered paid results, and is probably driven more by conspiracy theories than any hard proof. And that’s because there is no hard proof. The two departments are separate, and sites that don’t do any paid search consistently rank at the top ahead of companies who do pay for ads. Again, there are various reasons why SEO and PPC should be used in combination, but these are marketing reasons rather than SEO ones.
Fresh content is vital – a few years ago, Google introduced the concept of Query Deserves Freshness (or QDF) to its search results. The idea is a simple one – for some searches, newer content is better than older content because it will have new ideas and up-to-date information, providing a better user experience.
The problem is that QDF is simply not relevant to every search query. Some content is evergreen: the information it contains doesn’t change, and likely never will. What do you do in this case? Simple. Leave it alone. Google is after rewarding new quality content, not just new content, so don’t waste time making barely noticeable tweaks to headings and intro paragraphs to try and rank higher.
Instead, make the content great to begin with, and then think what other great content you can be putting on your site (or how you can update the content that DOES need freshening up).
Social media feeds mean fresh content – one of the ways you can add fresh, interesting content to your site is to ensure your company blog is kept up to date with useful articles that address the topics of the day.
One way that you CAN’T add fresh content to the site is with a social media feed. While it shows users that you’re nice and active on Facebook, Twitter, LinkedIn or Instagram, the search engine spiders don’t actually ‘see’ any of that content. Instead, they just see the code for whatever widget you’re using to call your social stuff in, and that’s pretty much it.
Even if Google could see your social content, it would be unlikely to reward a site simply for having an active Facebook account – it rewards the best user experience, not the best use of social.
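You can check this for yourself: fetch the raw HTML of your page – roughly what a non-rendering crawler sees – and search it for a phrase from one of your recent posts. Both the URL and the phrase below are placeholders; substitute your own.

```python
# Fetch the raw HTML source -- before any JavaScript widget runs -- and
# look for text from a recent social post. If the feed is injected by a
# script, the post text won't appear in the source at all.
# Both values below are placeholders.
from urllib.request import urlopen

page_url = "https://example.com"           # hypothetical page with a feed
post_phrase = "our latest product launch"  # text visible in the rendered feed

source = urlopen(page_url).read().decode("utf-8", errors="replace")

if post_phrase.lower() in source.lower():
    print("The post text is in the raw HTML -- crawlable.")
else:
    print("The post text is NOT in the raw HTML -- only the widget code is.")
```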
Meta description is necessary for ranking (or, meta description is not needed at all) – meta tags are snippets of code in a page’s head that the user won’t see on your site, but that the search engines will see when crawling it. In the old days, these were used to decide where a site should rank. To an extent a few of these – like the ‘Title’ tag – still hold some weight. Others, like the meta description, don’t help with rankings whatsoever – and haven’t done so for years.
Does that mean all the blogs telling you not to bother with them are correct? Not quite. Meta descriptions won’t boost your SEO, but they will help your click-through rate. That’s because Google usually uses the meta description for the text that sits with your result in the search results.
It’s your opportunity to make a sales pitch for your site, so make it count – just don’t expect it to enhance your search position.
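For the more hands-on, here’s a rough standard-library sketch that pulls the title and meta description from a page and flags the common problems. The URL is a placeholder, and the 160-character figure is an approximation of where Google tends to truncate snippets.

```python
# Extract the <title> text and meta description from a page, then flag
# a missing or over-long description. The URL is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page_url = "https://example.com"  # hypothetical URL
checker = MetaChecker()
checker.feed(urlopen(page_url).read().decode("utf-8", errors="replace"))

print(f"Title: {checker.title!r}")
print(f"Description: {checker.description!r}")

# Google truncates snippets at roughly 155-160 characters.
if checker.description is None:
    print("No meta description -- Google will pick its own snippet.")
elif len(checker.description) > 160:
    print("Description may be truncated in the results page.")
```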
You shouldn’t link to other sites – this is an odd one as it’s tricky to work out where it came from, what the thought process was, or why it’s still going strong. It possibly stems from the belief that sending someone to another site risks losing that potential customer, but that wouldn’t really explain why it would hamper SEO.
It more likely comes from a fear that Google will look at your links and think you’re engaging in some sort of link exchange. That’s where site A links to site B, which then links to site C, which links back to site A. But Google is smart enough to tell the difference between unrelated sites linking to each other just to help each other’s rankings and links built on a genuine relationship. If the relationship is genuine, don’t be afraid to provide some link love to others – after all, why should other sites send visitors your way and provide important link juice if you’re unwilling to do the same?
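To see how easy that pattern is to spot, here’s a toy model of a link exchange – the site names and link graph are entirely invented. If a few lines of Python can find the loop, a search engine certainly can.

```python
# A toy model of the link-exchange pattern described above:
# site A -> site B -> site C -> back to site A. All data is invented.
links = {
    "site-a.com": ["site-b.com"],
    "site-b.com": ["site-c.com"],
    "site-c.com": ["site-a.com"],  # closes the loop
    "honest-blog.com": ["site-a.com"],
}

def find_cycle(graph, start):
    """Follow outbound links from `start`; report if we loop back to it."""
    path, current = [start], graph.get(start, [])
    while current:
        nxt = current[0]            # follow the first outbound link
        if nxt == start:
            return path + [start]   # we're back where we began
        if nxt in path:
            return None             # a loop, but not through `start`
        path.append(nxt)
        current = graph.get(nxt, [])
    return None

print(find_cycle(links, "site-a.com"))
# ['site-a.com', 'site-b.com', 'site-c.com', 'site-a.com']
print(find_cycle(links, "honest-blog.com"))
# None -- a genuine one-way link raises no flags
```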
Google looks at your Google Analytics data to decide your rankings – you may be looking at ‘poor’ stats on your site – high bounce rate, perhaps, or low time on site – and going into panic mode that Google will scan your Google Analytics account to find a stick to beat you with.
It’s a valid concern – after all, part of Google’s decision-making on how much you pay for Ads clicks is your Quality Score, and part of that looks at how well people respond to your page when they go there from an advert. But just because that’s how Ads works doesn’t mean it’s how the rest of Google works.
As we’ve said before, the two departments are separate, and learnings from one can’t necessarily be transferred to the other. Google is able to use that as a metric for Ads because it needs you to have Ads code on your site in order for it to function. But there are hundreds of thousands of sites that don’t use Google Analytics, so relying on Analytics data would render part of the algorithm unusable on a huge percentage of websites.
On top of that, the algorithm would have no way of deciding which stats were good and which weren’t. As we’ve covered in our piece on UX metrics, a high bounce rate in itself is no indication of poor quality – a good landing page will have a high bounce rate because you’ve given the user all the info they needed, they took the required action (e.g. filling out a form), and then left your site. Equally, a well-laid-out contact page will have a high bounce rate and low time on site because it quickly fulfilled the user need. If Google penalised based on stats, it would run the risk of wrongly scoring sites that work well.
So don’t worry, Google isn’t trawling Analytics and penalising you. You should, of course, be worried about poorly performing pages – but for CRO and UX reasons rather than SEO ones.
Google has a ‘sandbox’ for new sites / older domains have more authority – these myths go hand in hand, and are both untrue for pretty much the same reason. It was long thought that new sites went into a ‘sandbox’ that stopped them being ranked for a period of time, a sort of ‘new site’ penalty that people felt was there to stop spammy sites popping up left, right and centre.
Similarly, older sites were believed to have more ‘authority’ because they’re seen as established, trusted and worthy – their age alone was supposedly a ranking factor that put them ahead of newer upstarts.
Trouble is, neither is true. There are many reasons why a new site may not be ranked straight away – Google hasn’t got round to finding it yet, it has no links to help it rank, Google has found it but is still deciding where it should fit into the rankings, and so on. These are, by and large, all brought about because the site is new, but that doesn’t mean there’s any form of new site penalty or sandbox.
Similarly, sites that have been around for a while have had longer to put in place the best practices that often lead to a good ranking. They’ll have good quality, relevant links and excellent content, and will probably have ironed out the technical issues that can harm SEO. They have had longer to do things well, so it makes sense that they would rank ahead of a site that hasn’t had that time – but it doesn’t mean their age is being used as a ranking factor.
For a start, plenty of people have bought domains that have sat unused for a decade, so how would that fit into the algorithm? Alongside that, for every old site that ranks well there are countless more old sites that appear nowhere. And for every new site struggling to get noticed there’s one that appears almost instantly. Focus on giving your users what they want and worry less about the age of your website – it’s not a factor, but even if it were, you wouldn’t be able to change it, so don’t stress about it!
You’ll be penalised for duplicate content – this myth stems from a misunderstanding of the difference between a penalty and Google’s algorithm simply doing its job. A penalty is a manual thing, applied by a human employee at Google. There are several reasons for these, but duplicate content isn’t among them.
The algorithm, however, will use multiple factors to decide if you should rank, and where. So if you’re not appearing, it’s more likely because the algorithm has looked at your site and the site with the original content, and simply decided the original has more reason to rank than you do.
But while you won’t be ‘penalised’ for duplicate content, it is worth asking yourself why you have that content on your site at all – why take someone else’s content when you can create your own, better content and set yourself apart?
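If you suspect pages on your own site share too much copy, a rough similarity check like this can flag the worst offenders – the URLs are placeholders, and the tag-stripping is deliberately crude.

```python
# Compare the rough visible text of two pages and report how similar they
# are, using only the standard library. The URLs are placeholders.
import re
from difflib import SequenceMatcher
from urllib.request import urlopen

def page_text(url):
    """Fetch a page and crudely strip tags, leaving roughly the text."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    return re.sub(r"<[^>]+>", " ", html)

a = page_text("https://example.com/page-one")  # hypothetical URL
b = page_text("https://example.com/page-two")  # hypothetical URL

ratio = SequenceMatcher(None, a, b).ratio()
print(f"Similarity: {ratio:.0%}")
if ratio > 0.8:  # rough threshold, not a Google number
    print("Close to duplicates -- consider merging or rewriting one.")
```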
So what should you do for SEO?
Simple – provide good quality content, answer users’ needs, encourage other sites to link to you (by offering them content, guest posts, interviews and more) and ensure your site is as easy as possible for the search engine spiders to read and index. Need help with that? Drop us a line and we can see where it fits in with an integrated digital marketing campaign.